Dataset columns (type and observed min/max, as reported by the dataset viewer):

| Column | Type | Min | Max |
|---|---|---|---|
| id | string (length) | 11 | 95 |
| author | string (length) | 3 | 36 |
| task_category | string (16 classes) | | |
| tags | sequence (length) | 1 | 4.05k |
| created_time | int64 | 1.65k | 1.74k |
| last_modified | int64 | 1.62k | 1.74k |
| downloads | int64 | 0 | 15.6M |
| likes | int64 | 0 | 4.86k |
| README | string (length) | 246 | 1.01M |
| matched_task | sequence (length) | 1 | 8 |
| matched_bigbio_names | sequence (length) | 1 | 8 |
| is_bionlp | string (3 classes) | | |
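For orientation, here is a minimal sketch of how rows with this schema could be loaded and filtered with the `datasets` library. The repository id below is a placeholder, since the actual dataset path is not given in this export:

```python
from datasets import load_dataset

# Placeholder repository id: substitute the real dataset path.
ds = load_dataset("your-org/model-card-dump", split="train")

# Each row carries a model id, its tags, and the full README text.
# Keep rows not explicitly flagged as Non_BioNLP (values also include "TBD").
bionlp_rows = ds.filter(lambda row: row["is_bionlp"] != "Non_BioNLP")
print(f"{len(bionlp_rows)} of {len(ds)} model cards are flagged as BioNLP-related")
```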
id: croissantllm/base_150k
author: croissantllm
task_category: text2text-generation
tags: [ "transformers", "pytorch", "llama", "text-generation", "legal", "code", "text-generation-inference", "art", "text2text-generation", "fr", "en", "dataset:cerebras/SlimPajama-627B", "dataset:uonlp/CulturaX", "dataset:pg19", "dataset:bigcode/starcoderdata", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
created_time: 1,704
last_modified: 1,706
downloads: 6
likes: 0

README:
---
datasets:
- cerebras/SlimPajama-627B
- uonlp/CulturaX
- pg19
- bigcode/starcoderdata
language:
- fr
- en
license: mit
pipeline_tag: text2text-generation
tags:
- legal
- code
- text-generation-inference
- art
---

# CroissantLLM - Base (150k steps)

This model is part of the CroissantLLM initiative, and corresponds to the checkpoint after 150k steps (2.36T tokens).

To play with the final model, we recommend using the Chat version: https://huggingface.co/croissantllm/CroissantLLMChat-v0.1.

## Abstract

We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware. To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources. To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives. This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models.

## Citation

Our work can be cited as:

```
Coming soon
```

## Usage

This model is a base model, that is, it is not finetuned for chat and works best with few-shot prompting strategies.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "croissantllm/base_150k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

inputs = tokenizer("I am so tired I could sleep right now. -> Je suis si fatigué que je pourrais m'endormir maintenant. He is heading to the market. -> Il va au marché. We are running on the beach. ->", return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60, temperature=0.5)
print(tokenizer.decode(tokens[0]))

# remove bos token
inputs = tokenizer("Capitales: France -> Paris, Italie -> Rome, Allemagne -> Berlin, Espagne ->", return_tensors="pt", add_special_tokens=True).to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60)
print(tokenizer.decode(tokens[0]))
```
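Since the card recommends the Chat version for end use, here is a minimal sketch of querying it instead. It assumes the Chat repository ships a chat template in its tokenizer config; the generation settings are illustrative, not the authors':

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

chat_model_name = "croissantllm/CroissantLLMChat-v0.1"
tokenizer = AutoTokenizer.from_pretrained(chat_model_name)
model = AutoModelForCausalLM.from_pretrained(chat_model_name, torch_dtype=torch.float16, device_map="auto")

# Build the prompt from the chat template stored with the tokenizer.
messages = [{"role": "user", "content": "Quelle est la capitale de l'Espagne ?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.95, temperature=0.7)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```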
[ "TRANSLATION" ]
[ "CRAFT" ]
Non_BioNLP
id: Shashwat13333/bge-base-en-v1.5_v1
author: Shashwat13333
task_category: sentence-similarity
tags: [ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:150", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss", "en", "arxiv:1908.10084", "arxiv:2205.13147", "arxiv:1705.00652", "base_model:BAAI/bge-base-en-v1.5", "base_model:finetune:BAAI/bge-base-en-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
created_time: 1,738
last_modified: 1,738
downloads: 13
likes: 0

README:
--- base_model: BAAI/bge-base-en-v1.5 language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:150 - loss:MatryoshkaLoss - loss:MultipleNegativesRankingLoss widget: - source_sentence: Do you provide support 24/7? sentences: - 'How can we get started with your DevOps solutions? Getting started is easy. Contact us through our website. We''ll schedule a consultation to discuss your needs, evaluate your current infrastructure, and propose a customized DevOps solution designed to achieve your goals.' - 'This is our Portfolio Introducing the world of Housing Finance& Banking Firm. Corporate Website with 10 regional languages in India with analytics and user personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage the Builder Requests, approve/deny Properties, manage visits and appointments, manage leads, etc. Introducing the world of Global Automotive Brand.We have implemented a Multi Locale Multilingual Omnichannel platform for Royal Enfield. The platform supports public websites, customer portals, internal portals, business applications for over 35+ different locations all over the world. Developed Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops & Data Governance Managing cloud provisioning and modernization alongside automated infrastructure, event-driven microservices, containerization, DevOps, cybersecurity, and 24x7 monitoring support ensures efficient, secure, and responsive IT operations.' - 'We are a New breed of innovative digital transformation agency, redefining storytelling for an always-on world. With roots dating back to 2017, we started as a pocket size team of enthusiasts with a goal of helping traditional businesses transform and create dynamic, digital cultures through disruptive strategies and agile deployment of innovative solutions.' - source_sentence: What services do you offer for AI adoption? sentences: - 'In what ways can machine learning optimize our operations? Machine learning algorithms can analyze operational data to identify inefficiencies, predict maintenance needs, optimize supply chains, and automate repetitive tasks, significantly improving operational efficiency and reducing costs.' - "At Techchefz Digital, we specialize in guiding companies through the complexities\ \ of adopting and integrating Artificial Intelligence and Machine Learning technologies.\ \ Our consultancy services are designed to enhance your operational efficiency\ \ and decision-making capabilities across all sectors. With a global network of\ \ AI/ML experts and a commitment to excellence, we are your partners in transforming\ \ innovative possibilities into real-world achievements. 
\ \ \ \ \n DATA INTELLIGENCE PLATFORMS we\ \ specialize in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\"" - "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize\ \ in comprehensive website audits that provide valuable insights and recommendations\ \ to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital\ \ roadmap that transform your digital enterprise and produce a return on investment,\ \ basis our discovery framework, brainstorming sessions & current state analysis.\n\ \nPlatform Selection\nHelping you select the optimal digital experience, commerce,\ \ cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying\ \ next-gen scalable and agile enterprise digital platforms, along with multi-platform\ \ integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer\ \ your product with help of our enterprise frameworks\nInfrastructure\nSpecialize\ \ in multi-cloud infrastructure helping you put forward the right cloud infrastructure\ \ and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical\ \ applications, data, and IT workloads, along with Application maintenance and\ \ operations.\nTeam Augmentation\nHelp you scale up and augment your existing\ \ team to solve your hiring challenges with our easy to deploy staff augmentation\ \ offerings.\"" - source_sentence: What challenges did the company face in its early days? sentences: - 'How do we do Custom Development ? We follow below process to develop custom web or mobile Application on Agile Methodology, breaking requirements in pieces and developing and shipping them with considering utmost quality: Requirements Analysis We begin by understanding the client&#39;s needs and objectives for the website. Identify key features, functionality, and any specific design preferences. Project Planning Then create a detailed project plan outlining the scope, timeline, and milestones. Define the technology stack and development tools suitable for the project. User Experience Design Then comes the stage of Developing wireframes or prototypes to visualize the website&#39;s structure and layout. We create a custom design that aligns with the brand identity and user experience goals. Development After getting Sign-off on Design from Client, we break the requirements into Sprints on Agile Methodology, and start developing them.' - 'After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal decision to depart from the corporate ladder in December 2016. Fueled by a clear vision to revolutionize the digital landscape, Mayank set out to leverage the best technology ingredients, crafting custom applications and digital ecosystems tailored to clients'' specific needs, limitations, and budgets. However, this solo journey was not without its challenges. Mayank had to initiate the revenue engine by offering corporate trainings and conducting online batches for tech training across the USA. He also undertook small projects and subcontracted modules of larger projects for clients in the US, UK, and India. It was only after this initial groundwork that Mayank was able to hire a group of interns, whom he meticulously trained and groomed to prepare them for handling Enterprise Level Applications. This journey reflects Mayank''s resilience, determination, and entrepreneurial spirit in building TechChefz Digital from the ground up. 
With a passion for innovation and a relentless drive for excellence, Mayank has steered TechChefz Digital through strategic partnerships, groundbreaking projects, and exponential growth. His leadership has been instrumental in shaping TechChefz Digital into a leading force in the digital transformation arena, inspiring a culture of innovation and excellence that continues to propel the company forward.' - 'Our Solutions Strategy & Digital Transformation Innovate via digital transformation, modernize tech, craft product strategies, enhance customer experiences, optimize data analytics, transition to cloud for growth and efficiency Product Engineering & Custom Development Providing product development, enterprise web and mobile development, microservices integrations, quality engineering, and application support services to drive innovation and enhance operational efficiency.' - source_sentence: What kind of data do you leverage for AI solutions? sentences: - 'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions for Complex Problems and delieverd a comprehensive Website Development, Production Support & Managed Services, we optimized customer journeys, integrate analytics, CRM, ERP, and third-party applications, and implement cutting-edge technologies for enhanced performance and efficiency and achievied 200% Reduction in operational time & effort managing content & experience, 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion & Retention' - 'Why do we need Microservices ? Instead of building a monolithic application where all functionalities are tightly integrated, microservices break down the system into modular and loosely coupled services. Scalability Flexibility and Agility Resilience and Fault Isolation Technology Diversity Continuous Delivery' - Our AI/ML services pave the way for transformative change across industries, embodying a client-focused approach that integrates seamlessly with human-centric innovation. Our collaborative teams are dedicated to fostering growth, leveraging data, and harnessing the predictive power of artificial intelligence to forge the next wave of software excellence. We don't just deliver AI; we deliver the future. - source_sentence: What do you guys do for digital strategy? sentences: - " What we do\n\nDigital Strategy\nCreating digital frameworks that transform\ \ your digital enterprise and produce a return on investment.\n\nPlatform Selection\n\ Helping you select the optimal digital experience, commerce, cloud and marketing\ \ platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable\ \ and agile enterprise digital platforms, along with multi-platform integrations.\n\ \nProduct Builds\nHelp you ideate, strategize, and engineer your product with\ \ help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and\ \ augment your existing team to solve your hiring challenges with our easy to\ \ deploy staff augmentation offerings .\nManaged Services\nOperate and monitor\ \ your business-critical applications, data, and IT workloads, along with Application\ \ maintenance and operations\n" - "Introducing the world of\nGlobal Hospitality Firm\n\nIn this project, We focused\ \ on strategizing CX, diverse platform dev, travel booking, indemnity journeys,\ \ digital community, and managed services enhance travel experience and operational\ \ efficiency. 
\nStrategizing & defining the Customer Experience across business\ \ units and respective products / services,\nPlatform Development and Integrations\ \ across different tech stacks - Drupal, Magento, MERN, Microservices, Canvas\ \ LMS, OKTA SSO, AWS based Cloud Infrastructure, Build Automation\nTravel Packages\ \ Booking Platform with payments, subscriptions, real time booking, etc\nIndemnity\ \ & Self-Service Journeys\n\nAnd we achieved, 100% Improvement in Marketing Content,\ \ Real Time Prices & Inventories delivery. 80% Increase in Customer Retention,175%\ \ Increase in Partner & Vendor Operational Efficiency" - 'Introducing the world of General Insurance Firm In this project, we implemented Digital Solution and Implementation with Headless Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the following features: PWA & AMP based Web Pages Page Speed Optimization Reusable and scalable React JS / Next JS Templates and Components Headless Drupal CMS with Content & Experience management, approval workflows, etc for seamless collaboration between the business and marketing teams Minimalistic Buy and Renewal Journeys for various products, with API integrations and adherence to data compliances We achieved 250% Reduction in Operational Time and Effort in managing the Content & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during buy and renewal journeys, 300% Reduction in bounce rate on policy landing and campaign pages' model-index: - name: BGE base Financial Matryoshka results: - task: type: information-retrieval name: Information Retrieval dataset: name: dim 768 type: dim_768 metrics: - type: cosine_accuracy@1 value: 0.18666666666666668 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.5866666666666667 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.68 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.8 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.18666666666666668 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.19555555555555554 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.13599999999999998 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.07999999999999997 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.18666666666666668 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.5866666666666667 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.68 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.8 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.48942651032647805 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.38962962962962955 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.398026376123124 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 512 type: dim_512 metrics: - type: cosine_accuracy@1 value: 0.24 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.5733333333333334 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.6533333333333333 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.8 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.24 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.1911111111111111 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.13066666666666663 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.07999999999999997 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.24 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.5733333333333334 name: Cosine Recall@3 - type: 
cosine_recall@5 value: 0.6533333333333333 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.8 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.4991793077336057 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.4047195767195766 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.4124023465759078 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 256 type: dim_256 metrics: - type: cosine_accuracy@1 value: 0.21333333333333335 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.5466666666666666 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.6266666666666667 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.7466666666666667 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.21333333333333335 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.1822222222222222 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.12533333333333332 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.07466666666666665 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.21333333333333335 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.5466666666666666 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.6266666666666667 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.7466666666666667 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.4717065825983648 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.38359259259259254 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.39417579048787715 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 128 type: dim_128 metrics: - type: cosine_accuracy@1 value: 0.21333333333333335 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.52 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.5733333333333334 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.7066666666666667 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.21333333333333335 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.1733333333333333 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.11466666666666667 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.07066666666666666 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.21333333333333335 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.52 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.5733333333333334 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.7066666666666667 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.44415760022208445 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.36086772486772484 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.37364447853598953 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 64 type: dim_64 metrics: - type: cosine_accuracy@1 value: 0.14666666666666667 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.4 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.5066666666666667 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.6133333333333333 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.14666666666666667 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.13333333333333333 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.10133333333333334 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.06133333333333333 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.14666666666666667 name: 
Cosine Recall@1 - type: cosine_recall@3 value: 0.4 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.5066666666666667 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.6133333333333333 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.3595031317594935 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.27981481481481474 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.29776557642203677 name: Cosine Map@100 --- # BGE base Financial Matryoshka This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v1") # Run inference sentences = [ 'What do you guys do for digital strategy?', ' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n', 'Introducing the world of General Insurance Firm\nIn this project, we implemented Digital Solution and Implementation with Headless Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the following features:\nPWA & AMP based Web Pages\nPage Speed Optimization\nReusable and scalable React JS / Next JS Templates and Components\nHeadless Drupal CMS with Content & Experience management, approval workflows, etc for seamless collaboration between the business and marketing teams\nMinimalistic Buy and Renewal Journeys for various products, with API integrations and adherence to data compliances\n\nWe achieved 250% Reduction in Operational Time and Effort in managing the Content & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during buy and renewal journeys, 300% Reduction in bounce rate on policy landing and campaign pages', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 | |:--------------------|:-----------|:-----------|:-----------|:-----------|:-----------| | cosine_accuracy@1 | 0.1867 | 0.24 | 0.2133 | 0.2133 | 0.1467 | | cosine_accuracy@3 | 0.5867 | 0.5733 | 0.5467 | 0.52 | 0.4 | | cosine_accuracy@5 | 0.68 | 0.6533 | 0.6267 | 0.5733 | 0.5067 | | cosine_accuracy@10 | 0.8 | 0.8 | 0.7467 | 0.7067 | 0.6133 | | cosine_precision@1 | 0.1867 | 0.24 | 0.2133 | 0.2133 | 0.1467 | | cosine_precision@3 | 0.1956 | 0.1911 | 0.1822 | 0.1733 | 0.1333 | | cosine_precision@5 | 0.136 | 0.1307 | 0.1253 | 0.1147 | 0.1013 | | cosine_precision@10 | 0.08 | 0.08 | 0.0747 | 0.0707 | 0.0613 | | cosine_recall@1 | 0.1867 | 0.24 | 0.2133 | 0.2133 | 0.1467 | | cosine_recall@3 | 0.5867 | 0.5733 | 0.5467 | 0.52 | 0.4 | | cosine_recall@5 | 0.68 | 0.6533 | 0.6267 | 0.5733 | 0.5067 | | cosine_recall@10 | 0.8 | 0.8 | 0.7467 | 0.7067 | 0.6133 | | **cosine_ndcg@10** | **0.4894** | **0.4992** | **0.4717** | **0.4442** | **0.3595** | | cosine_mrr@10 | 0.3896 | 0.4047 | 0.3836 | 0.3609 | 0.2798 | | cosine_map@100 | 0.398 | 0.4124 | 0.3942 | 0.3736 | 0.2978 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 150 training samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 150 samples: | | anchor | positive | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 12.15 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 126.17 tokens</li><li>max: 378 tokens</li></ul> | * Samples: | anchor | positive | |:--------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>Is it hard to move old systems to the cloud?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> | | <code>What benefits does marketing automation offer for time management?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. 
It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> | | <code>do you track customer behavior?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> | * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters: ```json { "loss": "MultipleNegativesRankingLoss", "matryoshka_dims": [ 768, 512, 256, 128, 64 ], "matryoshka_weights": [ 1, 1, 1, 1, 1 ], "n_dims_per_step": -1 } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: epoch - `gradient_accumulation_steps`: 4 - `learning_rate`: 1e-05 - `weight_decay`: 0.01 - `num_train_epochs`: 4 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `fp16`: True - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `push_to_hub`: True - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v1 - `push_to_hub_model_id`: bge-base-en-v1.5_v1 - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: epoch - `prediction_loss_only`: True - `per_device_train_batch_size`: 8 - `per_device_eval_batch_size`: 8 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 4 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 1e-05 - `weight_decay`: 0.01 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - `optim_args`: None - `adafactor`: False - 
`group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: True - `resume_from_checkpoint`: None - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v1 - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: bge-base-en-v1.5_v1 - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 | |:----------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:| | 0.2105 | 1 | 22.6183 | - | - | - | - | - | | 0.8421 | 4 | - | 0.4602 | 0.4392 | 0.4498 | 0.4162 | 0.3698 | | 1.2105 | 5 | 20.549 | - | - | - | - | - | | 1.8421 | 8 | - | 0.5047 | 0.4304 | 0.4538 | 0.4202 | 0.3458 | | 2.4211 | 10 | 17.664 | - | - | - | - | - | | **2.8421** | **12** | **-** | **0.482** | **0.4618** | **0.4658** | **0.4537** | **0.3496** | | 3.6316 | 15 | 14.6735 | - | - | - | - | - | | 3.8421 | 16 | - | 0.4894 | 0.4992 | 0.4717 | 0.4442 | 0.3595 | * The bold row denotes the saved checkpoint. 
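For reference, here is a minimal sketch of how the `MatryoshkaLoss` configuration listed under Training Details maps onto sentence-transformers code. The base model and dimensions are taken from this card; the trainer wiring around it is omitted and left as an assumption:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Wrap the in-batch-negatives loss so it is applied at every Matryoshka dimension,
# mirroring the matryoshka_dims / matryoshka_weights parameters shown above.
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)
```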
### Framework Versions - Python: 3.11.11 - Sentence Transformers: 3.4.1 - Transformers: 4.48.2 - PyTorch: 2.5.1+cu124 - Accelerate: 1.3.0 - Datasets: 3.2.0 - Tokenizers: 0.21.0 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MatryoshkaLoss ```bibtex @misc{kusupati2024matryoshka, title={Matryoshka Representation Learning}, author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi}, year={2024}, eprint={2205.13147}, archivePrefix={arXiv}, primaryClass={cs.LG} } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
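The per-dimension evaluation above suggests the embeddings can be truncated with modest quality loss. A minimal sketch of retrieving truncated embeddings, assuming a sentence-transformers version (>= 2.7) that supports the `truncate_dim` argument:

```python
from sentence_transformers import SentenceTransformer

# Truncate output embeddings to 256 dims, one of the Matryoshka sizes used in training.
model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v1", truncate_dim=256)

embeddings = model.encode([
    "What do you guys do for digital strategy?",
    "Do you provide support 24/7?",
])
print(embeddings.shape)  # (2, 256)
```

Smaller dimensions trade some nDCG@10 (0.4717 at 256 vs. 0.4894 at 768 in the table above) for lower storage and faster similarity search.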
[ "TEXT_CLASSIFICATION" ]
[ "CRAFT" ]
TBD
id: RichardErkhov/BSC-LT_-_salamandra-2b-instruct-4bits
author: RichardErkhov
task_category: null
tags: [ "safetensors", "llama", "arxiv:2403.14009", "arxiv:2403.20266", "arxiv:2101.00027", "arxiv:2207.00220", "arxiv:1810.06694", "arxiv:1911.05507", "arxiv:1906.03741", "arxiv:2406.17557", "arxiv:2402.06619", "arxiv:1803.09010", "4-bit", "bitsandbytes", "region:us" ]
created_time: 1,729
last_modified: 1,729
downloads: 4
likes: 0

README:
---
{}
---

Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)

salamandra-2b-instruct - bnb 4bits
- Model creator: https://huggingface.co/BSC-LT/
- Original model: https://huggingface.co/BSC-LT/salamandra-2b-instruct/

Original model description:
---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
language:
- bg
- ca
- code
- cs
- cy
- da
- de
- el
- en
- es
- et
- eu
- fi
- fr
- ga
- gl
- hr
- hu
- it
- lt
- lv
- mt
- nl
- nn
- "no"
- oc
- pl
- pt
- ro
- ru
- sh
- sk
- sl
- sr
- sv
- uk
---

![](./images/salamandra_header.png)

# Salamandra Model Card

Salamandra is a highly multilingual model pre-trained from scratch that comes in three different sizes — 2B, 7B and 40B parameters — with their respective base and instruction-tuned variants. This model card corresponds to the 2B instructed version.

To visit the model cards of other Salamandra versions, please refer to the [Model Index](#model-index).

The entire Salamandra family is released under a permissive [Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0). Along with the open weights, all training scripts and configuration files are made publicly available in [this GitHub repository](https://github.com/langtech-bsc/salamandra).

> [!WARNING]
> **DISCLAIMER:** This model is a first proof-of-concept designed to demonstrate the instruction-following capabilities of recently released base models.
> It has been optimized to engage in conversation but has *NOT* been aligned through RLHF to filter or avoid sensitive topics.
> As a result, it may generate harmful or inappropriate content.
> The team is actively working to enhance its performance through further instruction and alignment with RL techniques.

---

## Model Details

### Description

Transformer-based decoder-only language model that has been pre-trained from scratch on 7.8 trillion tokens of highly curated data. The pre-training corpus contains text in 35 European languages and code.

### Hyperparameters

The full list of hyperparameters for each model can be found [here](https://github.com/langtech-bsc/salamandra/tree/main/configs).

### Architecture

|                         |               |
|-------------------------|:--------------|
| Total Parameters        | 2,253,490,176 |
| Embedding Parameters    | 524,288,000   |
| Layers                  | 24            |
| Hidden size             | 2,048         |
| Attention heads         | 16            |
| Context length          | 8,192         |
| Vocabulary size         | 256,000       |
| Precision               | bfloat16      |
| Embedding type          | RoPE          |
| Activation Function     | SwiGLU        |
| Layer normalization     | RMS Norm      |
| Flash attention         | ✅            |
| Grouped Query Attention | ❌            |
| Num. query groups       | N/A           |

---

## Intended Use

### Direct Use

The models are intended for both research and commercial use in any of the languages included in the training data. The base models are intended either for language generation or to be further fine-tuned for specific use-cases. The instruction-tuned variants can be used as general-purpose assistants, as long as the user is fully aware of the model's limitations.

### Out-of-scope Use

The model is not intended for malicious activities, such as harming others or violating human rights. Any downstream application must comply with current laws and regulations. Irresponsible usage in production environments without proper risk assessment and mitigation is also discouraged.
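Note that this repository holds the bitsandbytes 4-bit quantization rather than the original checkpoint, while the usage snippet further down loads the full-precision model. A minimal sketch of loading the quantized weights directly (assuming `bitsandbytes` and a CUDA device are available; the serialized quantization config is picked up automatically):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/BSC-LT_-_salamandra-2b-instruct-4bits"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The 4-bit bitsandbytes quantization config is stored alongside the weights,
# so no explicit BitsAndBytesConfig is needed here.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

print(model.get_memory_footprint())  # rough check of the quantized footprint
```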
---

## Hardware and Software

### Training Framework

Pre-training was conducted using NVIDIA's [NeMo Framework](https://docs.nvidia.com/nemo-framework/index.html), which leverages PyTorch Lightning for efficient model training in highly distributed settings.

The instruction-tuned versions were produced with [FastChat](https://github.com/lm-sys/FastChat).

### Compute Infrastructure

All models were trained on [MareNostrum 5](https://www.bsc.es/ca/marenostrum/marenostrum-5), a pre-exascale EuroHPC supercomputer hosted and operated by Barcelona Supercomputing Center.

The accelerated partition is composed of 1,120 nodes with the following specifications:
- 4x Nvidia Hopper GPUs with 64 GB HBM2 memory
- 2x Intel Sapphire Rapids 8460Y+ at 2.3 GHz and 32c each (64 cores)
- 4x NDR200 (BW per node 800 Gb/s)
- 512 GB of main memory (DDR5)
- 460 GB on NVMe storage

| Model | Nodes | GPUs |
|:---:|:---:|:---:|
| 2B | 64 | 256 |
| 7B | 128 | 512 |
| 40B | 256 / 512 | 1,024 / 2,048 |

---

## How to use

The instruction-following models use the commonly adopted ChatML template:

```jinja
{%- if not date_string is defined %}{%- set date_string = "2024-09-30" %}{%- endif %}{{ "<|im_start|>system\nsystem_message\nToday Date: "+ date_string +"<|im_end|>\n" }}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}
```

Where `system_message` is used to guide the model during generation and `date_string` can be set to allow the model to respond with the current date.

The exact same chat template should be used for an enhanced conversational experience. The easiest way to apply it is by using the tokenizer's built-in functions, as shown in the following snippet.

```python
from datetime import datetime
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "BSC-LT/salamandra-2b-instruct"

text = "At what temperature does water boil?"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.bfloat16
)

message = [{"role": "user", "content": text}]
date_string = datetime.today().strftime('%Y-%m-%d')

prompt = tokenizer.apply_chat_template(
    message,
    tokenize=False,
    add_generation_prompt=True,
    date_string=date_string
)

inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt")
outputs = model.generate(input_ids=inputs.to(model.device), max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Using this template, each turn is preceded by a `<|im_start|>` delimiter and the role of the entity (either `user`, for content supplied by the user, or `assistant` for LLM responses), and finished with the `<|im_end|>` token.

---

## Data

### Pretraining Data

The training corpus consists of 2.4 trillion tokens, including 35 European languages and 92 programming languages. It amounts to a total of 33TB of pre-processed text. Languages were sampled manually by giving x2 oversampling to Spain's co-official languages (Spanish, Catalan, Galician and Basque), code was undersampled by half, and the rest of the languages were kept as is, resulting in the following distribution:

![lang distrib](./images/corpus_languages.png)

This highly multilingual corpus is predominantly composed of data from Colossal OSCAR, which contributes a significant 66.06% of the total tokens. Following this, Starcoder provides 11.91%, and Spanish Crawling adds 3.34%.
The next largest sources are French FR at 3.12% and Proof Pile at 1.98%. Other notable contributions include Macocu, Pile of Law, and Eurlex, each contributing around 1.5% to 1.3%. These major sources collectively form the bulk of the corpus, ensuring a rich and diverse dataset for training the language model. The remaining 10% comes from smaller sources in various languages. Feel free to click the expand button below to see the full list of sources. <details> <summary>Data Sources</summary> | Dataset | Language | Source | |-----------------------------------------------|---------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------| | Parlamint corpus | at, bg, cz, dk, ee, es, es-ga, fi, fr, gb, gr, hr, hu, it, lv, nl, no, pl, pt, rs, se, si | Erjavec et al., 2021 | | Bulgarian National Corpus | bg | [Link](http://old.dcl.bas.bg/dataset/BulNC.7z) | | Crawl of Bulgarian news websites | bg | [Link](http://old.dcl.bas.bg/dataset/Bulgarian_news.7z) | | Colossal OSCAR 1.0 | bg, ca, cs, cy, da, de, el, en, es, et, eu, fi, fr, ga, gl, hr, hu, it, lt, lv, mt, nl, nn, no, oc, pl, pt, ro, ru, sh, sk, sl, sr, sv, uk | Brack et al., 2024 | | Wikimedia dumps | bg, ca, cs, da, de, el, en, es, et, eu, fi, fr, ga, gl, hr, hu, it, lt, lv, mt, nl, nn, no, pl, pt, ro, sh, sk, sl, sr, uk | [Link](https://dumps.wikimedia.org/) | | OpenSubtitlesv2016 | bg, ca, cs, da, de, el, en, es, et, eu, fi, fr, gl, hr, it, lt, lv, nl, no, pl, pt, ro, sk, sl, sr, sv, uk | Lison & Tiedemann, 2016 | | MaCoCu web corpus | bg, ca, el, hr, mt, sl, sr, uk | Bañón et al., 2022 | | EurLEX-Resources | bg, cs, da, de, el, en, es, et, fi, fr, ga, hr, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv | [Link](https://huggingface.co/datasets/joelniklaus/eurlex_resources) | | MC4-Legal | bg, cs, da, de, el, en, es, et, fi, fr, ga, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv | [Link](https://huggingface.co/datasets/joelito/legal-mc4) | | CURLICAT Corpus | bg, hr, hu, pl, ro, sk, sl | Váradi et al., 2022 | | CATalog | ca | Palomar-Giner et al., 2024 | | Spanish Crawling | ca, es, eu, gl | Relevant Spanish websites crawling | | Starcoder | code | Li et al., 2023 | | SYN v9: large corpus of written Czech | cs | Křen et al., 2021 | | Welsh-GOV | cy | Crawling from [Link](https://www.llyw.cymru) | | DaNewsroom | da | Varab & Schluter, 2020 | | Danish GigaWord | da | Strømberg-Derczynski et al., 2021 | | DK-CLARIN Reference Corpus of General Danish | da | [Link](https://korpus.dsl.dk/clarin/) | | The Danish Parliament Corpus 2009 - 2017, v1 | da | Hansen, 2018 | | DeWaC | de | [Link](https://docs.sslmit.unibo.it/doku.php?id=corpora:dewac) | | Open Legal Data - German court decisions and laws | de | Ostendorff et al., 2020 | | Greek Legal Code | el | Papaloukas et al., 2021 | | Greek Web Corpus | el | Outsios et al., 2018 | | Auxiliary Mathematics Problems and Solutions (AMPS) dataset | en | Hendrycks et al., 2021 | | BIGPATENT | en | Sharma et al., 2019 | | FineWeb-Edu (350BT subset) | en | Penedo et al., 2024 | | peS2o | en | Soldaini & Lo, 2023 | | PG-19 | en | Rae et al., 2019 | | Pile of Law (selected subsets) | en | Henderson* et al., 2022 | | proof-pile | en | [Link](https://huggingface.co/datasets/hoskinson-center/proof-pile) | | RedPajama-Data T1 (StackExchange subset) | en | Computer, 2023 | | The Pile (PhilPapers subset) | en | Gao et al., 2021 | | Biomedical | es | Internally 
generated scientific dataset: Dialnet, Scielo, CSIC, TDX, BSC, UCM | | HPLTDatasets v1 - Spanish | es | de Gibert et al., 2024 | | Legal | es | Internally generated legal dataset: BOE, BORME, Senado, Congreso, Spanish court orders, DOGC | | Scientific | es | Internally generated scientific dataset: Wikipedia LS, Pubmed, MeSpEn, patents, clinical cases, medical crawler | | Spanish Legal Domain Corpora | es | Gutiérrez-Fandiño et al., 2021 | | Estonian National Corpus 2021 | et | Koppel & Kallas, 2022 | | Estonian Reference Corpus | et | [Link](https://www.cl.ut.ee/korpused/segakorpus/) | | EusCrawl (w/o Wikipedia or NC-licenses) | eu | Artetxe et al., 2022 | | Latxa Corpus v1.1 | eu | Etxaniz et al., 2024 [Link](https://huggingface.co/datasets/HiTZ/latxa-corpus-v1.1) | | Aya Dataset (w/o Evaluation Suite) | eu, hr, nl, fi, ka, hu, lt, nn, ro, sk, lv, cy, bg, cs, en, fr, de, ga, mt, pl, ru, sl, sv, ca, da, et, gl, el, it, no, pt, sr, es, uk | Singh et al., 2024 | | Yle Finnish News Archive | fi | [Link](http://urn.fi/urn:nbn:fi:lb-2021050401) | | CaBeRnet: a New French Balanced Reference Corpus | fr | Popa-Fabre et al., 2020 | | French Public Domain Books | fr | [Link](https://huggingface.co/datasets/PleIAs/French-PD-Books) | | French Public Domain Newspapers | fr | [Link](https://huggingface.co/datasets/PleIAs/French-PD-Newspapers) | | Irish Universal Dependencies | ga | [Link](https://universaldependencies.org/ga/index.html) | | The Gaois bilingual corpus of English-Irish legislation (Irish legislation) | ga | [Link](https://portulanclarin.net/repository/browse/the-gaois-bilingual-corpus-of-english-irish-legislation-processed/daeac17c9e3511ea9b7f02420a000407b83de243dc0b469aab41084386c5b80f/) | | CorpusNÓS | gl | de-Dios-Flores et al., 2024 | | Croatian web corpus hrWaC 2.1 | hr | Ljubešić & Klubička, 2014 | | ITWaC | it | [Link](https://docs.sslmit.unibo.it/doku.php?id=corpora:itwac) | | Corpus of State-related content from the Latvian Web (Processed) | lv | [Link](https://catalog.elra.info/en-us/repository/browse/ELRA-W0169/) | | Korpus Malti | mt | Micallef et al., 2022 | | SoNaR Corpus NC 1.2 | nl | [Link](https://taalmaterialen.ivdnt.org/download/tstc-sonar-corpus/) | | Norwegian Colossal Corpus | nn, no | Kummervold et al., 2021 | | Occitan Corpus | oc | Provided by [IEA](https://www.institutestudisaranesi.cat/) | | NKJP-PodkorpusMilionowy-1.2 (National Corpus of Polish) | pl | Lewandowska-Tomaszczyk et al., 2013 | | Polish Parliamentary Corpus / Korpus Dyskursu Parlamentarnego | pl | Ogrodniczuk, 2018 | | Brazilian Portuguese Web as Corpus | pt | Wagner Filho et al., 2018 | | ParlamentoPT | pt | Rodrigues et al., 2023 | | MARCELL Romanian legislative subcorpus v2 | ro | [Link](https://elrc-share.eu/reposMARCELL%20Romanian%20legislative%20subcorpus%20v2itory/browse/marcell-romanian-legislative-subcorpus-v2/2da548428b9d11eb9c1a00155d026706ce94a6b59ffc4b0e9fb5cd9cebe6889e/) | | Korpus slovenských právnych predpisov v1.9 | sk | [Link](https://www.juls.savba.sk/data/marcell/legal-sk-20220322-1.9.ver.xz) | | od-justice 2.0 | sk | [Link](https://www.juls.savba.sk/data/od-justice/od-justice-2.0.ver.xz) | | Corpus of academic Slovene KAS 2.0 | sl | Žagar et al., 2022 | | slWaC web corpus | sl | Erjavec et al., 2015 | | SrpKorSubset (news, legal, academic, conversation, literary) | sr | [Link](http://www.korpus.matf.bg.ac.rs/) | | The Swedish Culturomics Gigaword Corpus | sv | Rødven-Eide, 2016 | | Corpus of laws and legal acts of Ukraine | uk | [Link](https://lang.org.ua/en/corpora/#anchor7) 
| <details> <summary>References</summary> - Abadji, J., Suárez, P. J. O., Romary, L., & Sagot, B. (2021). Ungoliant: An optimized pipeline for the generation of a very large-scale multilingual web corpus (H. Lüngen, M. Kupietz, P. Bański, A. Barbaresi, S. Clematide, & I. Pisetta, Eds.; pp. 1–9). Leibniz-Institut für Deutsche Sprache. [Link](https://doi.org/10.14618/ids-pub-10468) - Artetxe, M., Aldabe, I., Agerri, R., Perez-de-Viñaspre, O., & Soroa, A. (2022). Does Corpus Quality Really Matter for Low-Resource Languages? - Bañón, M., Esplà-Gomis, M., Forcada, M. L., García-Romero, C., Kuzman, T., Ljubešić, N., van Noord, R., Sempere, L. P., Ramírez-Sánchez, G., Rupnik, P., Suchomel, V., Toral, A., van der Werff, T., & Zaragoza, J. (2022). MaCoCu: Massive collection and curation of monolingual and bilingual data: Focus on under-resourced languages. Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, 303–304. [Link](https://aclanthology.org/2022.eamt-1.41) - Brack, M., Ostendorff, M., Suarez, P. O., Saiz, J. J., Castilla, I. L., Palomar-Giner, J., Shvets, A., Schramowski, P., Rehm, G., Villegas, M., & Kersting, K. (2024). Community OSCAR: A Community Effort for Multilingual Web Data. [Link](https://occiglot.eu/papers/Community_Oscar.pdf) - Computer, T. (2023). RedPajama: An Open Source Recipe to Reproduce LLaMA training dataset [Computer software]. [Link](https://github.com/togethercomputer/RedPajama-Data) - de Gibert, O., Nail, G., Arefyev, N., Bañón, M., van der Linde, J., Ji, S., Zaragoza-Bernabeu, J., Aulamo, M., Ramírez-Sánchez, G., Kutuzov, A., Pyysalo, S., Oepen, S., & Tiedemann, J. (2024). A New Massive Multilingual Dataset for High-Performance Language Technologies (arXiv:2403.14009). arXiv. [Link](http://arxiv.org/abs/2403.14009) - Dodge, J., Sap, M., Marasović, A., Agnew, W., Ilharco, G., Groeneveld, D., Mitchell, M., & Gardner, M. (2021). Documenting Large Webtext Corpora: A Case Study on the Colossal Clean Crawled Corpus. In M.-F. Moens, X. Huang, L. Specia, & S. W. Yih (Eds.), Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (pp. 1286–1305). Association for Computational Linguistics. [Link](https://doi.org/10.18653/v1/2021.emnlp-main.98) - Erjavec, T., Ljubešić, N., & Logar, N. (2015). The slWaC corpus of the Slovene web. Informatica (Slovenia), 39, 35–42. - Erjavec, T., Ogrodniczuk, M., Osenova, P., Ljubešić, N., Simov, K., Grigorova, V., Rudolf, M., Pančur, A., Kopp, M., Barkarson, S., Steingrímsson, S. hór, van der Pol, H., Depoorter, G., de Does, J., Jongejan, B., Haltrup Hansen, D., Navarretta, C., Calzada Pérez, M., de Macedo, L. D., … Rayson, P. (2021). Linguistically annotated multilingual comparable corpora of parliamentary debates ParlaMint.ana 2.1. [Link](http://hdl.handle.net/11356/1431) - Etxaniz, J., Sainz, O., Perez, N., Aldabe, I., Rigau, G., Agirre, E., Ormazabal, A., Artetxe, M., & Soroa, A. (2024). Latxa: An Open Language Model and Evaluation Suite for Basque. [Link] (https://arxiv.org/abs/2403.20266) - Gao, L., Biderman, S., Black, S., Golding, L., Hoppe, T., Foster, C., Phang, J., He, H., Thite, A., Nabeshima, N., Presser, S., & Leahy, C. (2021). The Pile: An 800GB Dataset of Diverse Text for Language Modeling. CoRR, abs/2101.00027. [Link](https://arxiv.org/abs/2101.00027) - Gutiérrez-Fandiño, A., Armengol-Estapé, J., Gonzalez-Agirre, A., & Villegas, M. (2021). Spanish Legalese Language Model and Corpora. - Hansen, D. H. (2018). The Danish Parliament Corpus 2009—2017, v1. 
[Link](http://hdl.handle.net/20.500.12115/8)
- Henderson*, P., Krass*, M. S., Zheng, L., Guha, N., Manning, C. D., Jurafsky, D., & Ho, D. E. (2022). Pile of Law: Learning Responsible Data Filtering from the Law and a 256GB Open-Source Legal Dataset. arXiv. [Link](https://arxiv.org/abs/2207.00220)
- Hendrycks, D., Burns, C., Kadavath, S., Arora, A., Basart, S., Tang, E., Song, D., & Steinhardt, J. (2021). Measuring Mathematical Problem Solving With the MATH Dataset. NeurIPS.
- Jansen, T., Tong, Y., Zevallos, V., & Suarez, P. O. (2022). Perplexed by Quality: A Perplexity-based Method for Adult and Harmful Content Detection in Multilingual Heterogeneous Web Data.
- Koppel, K., & Kallas, J. (2022). Eesti keele ühendkorpuste sari 2013–2021: Mahukaim eestikeelsete digitekstide kogu. Eesti Rakenduslingvistika Ühingu Aastaraamat Estonian Papers in Applied Linguistics, 18, 207–228. [Link](https://doi.org/10.5128/erya18.12)
- Křen, M., Cvrček, V., Henyš, J., Hnátková, M., Jelínek, T., Kocek, J., Kováříková, D., Křivan, J., Milička, J., Petkevič, V., Procházka, P., Skoumalová, H., Šindlerová, J., & Škrabal, M. (2021). SYN v9: Large corpus of written Czech. [Link](http://hdl.handle.net/11234/1-4635)
- Kreutzer, J., Caswell, I., Wang, L., Wahab, A., van Esch, D., Ulzii-Orshikh, N., Tapo, A., Subramani, N., Sokolov, A., Sikasote, C., Setyawan, M., Sarin, S., Samb, S., Sagot, B., Rivera, C., Rios, A., Papadimitriou, I., Osei, S., Suarez, P. O., … Adeyemi, M. (2022). Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets. Transactions of the Association for Computational Linguistics, 10, 50–72. [Link](https://doi.org/10.1162/tacl_a_00447)
- Kummervold, P. E., De la Rosa, J., Wetjen, F., & Brygfjeld, S. A. (2021). Operationalizing a National Digital Library: The Case for a Norwegian Transformer Model. In S. Dobnik & L. Øvrelid (Eds.), Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa) (pp. 20–29). Linköping University Electronic Press, Sweden. [Link](https://aclanthology.org/2021.nodalida-main.3)
- Lewandowska-Tomaszczyk, B., Górski, R., Łaziński, M., & Przepiórkowski, A. (2013). The National Corpus of Polish (NKJP). Language use and data analysis. 309–319.
- Li, R., Allal, L. B., Zi, Y., Muennighoff, N., Kocetkov, D., Mou, C., Marone, M., Akiki, C., Li, J., Chim, J., Liu, Q., Zheltonozhskii, E., Zhuo, T. Y., Wang, T., Dehaene, O., Davaadorj, M., Lamy-Poirier, J., Monteiro, J., Shliazhko, O., … Vries, H. de. (2023). StarCoder: May the source be with you!
- Lison, P., & Tiedemann, J. (2016). OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In N. Calzolari, K. Choukri, T. Declerck, S. Goggi, M. Grobelnik, B. Maegaard, J. Mariani, H. Mazo, A. Moreno, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC’16) (pp. 923–929). European Language Resources Association (ELRA). [Link](https://aclanthology.org/L16-1147)
- Ljubešić, N., & Klubička, F. (2014). Bs,hr,srWaC - Web Corpora of Bosnian, Croatian and Serbian. In F. Bildhauer & R. Schäfer (Eds.), Proceedings of the 9th Web as Corpus Workshop (WaC-9) (pp. 29–35). Association for Computational Linguistics. [Link](https://doi.org/10.3115/v1/W14-0405)
- Micallef, K., Gatt, A., Tanti, M., van der Plas, L., & Borg, C. (2022). Pre-training Data Quality and Quantity for a Low-Resource Language: New Corpus and BERT Models for Maltese.
Proceedings of the Third Workshop on Deep Learning for Low-Resource Natural Language Processing, 90–101. [Link](https://doi.org/10.18653/v1/2022.deeplo-1.10)
- Ogrodniczuk, M. (2018). Polish Parliamentary Corpus. [Link](https://api.semanticscholar.org/CorpusID:235134113)
- Ostendorff, M., Blume, T., & Ostendorff, S. (2020). Towards an Open Platform for Legal Information. Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2020, 385–388. [Link](https://doi.org/10.1145/3383583.3398616)
- Ostendorff, M., Suarez, P. O., Lage, L. F., & Rehm, G. (2024). LLM-Datasets: An Open Framework for Pretraining Datasets of Large Language Models. First Conference on Language Modeling. [Link](https://openreview.net/forum?id=5RdIMlGLXL)
- Outsios, S., Skianis, K., Meladianos, P., Xypolopoulos, C., & Vazirgiannis, M. (2018). Word Embeddings from Large-Scale Greek Web content. arXiv Preprint arXiv:1810.06694.
- Palomar-Giner, J., Saiz, J. J., Espuña, F., Mina, M., Da Dalt, S., Llop, J., Ostendorff, M., Ortiz Suarez, P., Rehm, G., Gonzalez-Agirre, A., & Villegas, M. (2024). A CURATEd CATalog: Rethinking the Extraction of Pretraining Corpora for Mid-Resourced Languages. In N. Calzolari, M.-Y. Kan, V. Hoste, A. Lenci, S. Sakti, & N. Xue (Eds.), Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024) (pp. 335–349). ELRA and ICCL. [Link](https://aclanthology.org/2024.lrec-main.31)
- Papaloukas, C., Chalkidis, I., Athinaios, K., Pantazi, D.-A., & Koubarakis, M. (2021). Multi-granular Legal Topic Classification on Greek Legislation. Proceedings of the Natural Legal Language Processing Workshop 2021, 63–75. [Link](https://doi.org/10.48550/arXiv.2109.15298)
- Popa-Fabre, M., Ortiz Suárez, P. J., Sagot, B., & de la Clergerie, É. (2020). French Contextualized Word-Embeddings with a sip of CaBeRnet: A New French Balanced Reference Corpus. Proceedings of the 8th Workshop on Challenges in the Management of Large Corpora, 15–23. [Link](https://aclanthology.org/2020.cmlc-1.3)
- Rae, J. W., Potapenko, A., Jayakumar, S. M., Hillier, C., & Lillicrap, T. P. (2019). Compressive Transformers for Long-Range Sequence Modelling. arXiv Preprint. [Link](https://arxiv.org/abs/1911.05507)
- Rodrigues, J., Gomes, L., Silva, J., Branco, A., Santos, R., Cardoso, H. L., & Osório, T. (2023). Advancing Neural Encoding of Portuguese with Transformer Albertina PT-\*.
- Rødven-Eide, S. (2016). The Swedish Culturomics Gigaword Corpus [Dataset]. Språkbanken Text. [Link](https://doi.org/10.23695/3WMV-1Z09)
- Sharma, E., Li, C., & Wang, L. (2019). BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization. CoRR, abs/1906.03741. [Link](http://arxiv.org/abs/1906.03741)
- Soldaini, L., & Lo, K. (2023). peS2o (Pretraining Efficiently on S2ORC) Dataset. Allen Institute for AI.
- Strømberg-Derczynski, L., Ciosici, M., Baglini, R., Christiansen, M. H., Dalsgaard, J. A., Fusaroli, R., Henrichsen, P. J., Hvingelby, R., Kirkedal, A., Kjeldsen, A. S., Ladefoged, C., Nielsen, F. Å., Madsen, J., Petersen, M. L., Rystrøm, J. H., & Varab, D. (2021). The Danish Gigaword Corpus. Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa), 413–421. [Link](https://aclanthology.org/2021.nodalida-main.46)
- Subramani, N., Luccioni, S., Dodge, J., & Mitchell, M. (2023). Detecting Personal Information in Training Corpora: An Analysis. 208–220.
[Link](https://doi.org/10.18653/v1/2023.trustnlp-1.18)
- Varab, D., & Schluter, N. (2020). DaNewsroom: A Large-scale Danish Summarisation Dataset. Proceedings of The 12th Language Resources and Evaluation Conference, 6731–6739. [Link](https://www.aclweb.org/anthology/2020.lrec-1.831)
- Váradi, T., Nyéki, B., Koeva, S., Tadić, M., Štefanec, V., Ogrodniczuk, M., Nitoń, B., Pezik, P., Barbu Mititelu, V., Irimia, E., Mitrofan, M., Tufiș, D., Garabík, R., Krek, S., & Repar, A. (2022). Introducing the CURLICAT Corpora: Seven-language Domain Specific Annotated Corpora from Curated Sources. In N. Calzolari, F. Béchet, P. Blache, K. Choukri, C. Cieri, T. Declerck, S. Goggi, H. Isahara, B. Maegaard, J. Mariani, H. Mazo, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Thirteenth Language Resources and Evaluation Conference (pp. 100–108). European Language Resources Association. [Link](https://aclanthology.org/2022.lrec-1.11)
- Wagner Filho, J. A., Wilkens, R., Idiart, M., & Villavicencio, A. (2018). The BrWaC corpus: A new open resource for Brazilian Portuguese. Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018).
- Žagar, A., Kavaš, M., Robnik-Šikonja, M., Erjavec, T., Fišer, D., Ljubešić, N., Ferme, M., Borovič, M., Boškovič, B., Ojsteršek, M., & Hrovat, G. (2022). Corpus of academic Slovene KAS 2.0. [Link](http://hdl.handle.net/11356/1448)
- Parrish, A., Chen, A., Nangia, N., Padmakumar, V., Phang, J., Thompson, J., Htut, P. M., & Bowman, S. (2022). BBQ: A hand-built bias benchmark for question answering. Findings of the Association for Computational Linguistics: ACL 2022, 2086–2105. Association for Computational Linguistics.
- Sheng, E., Chang, K.-W., Natarajan, P., & Peng, N. (2019). The Woman Worked as a Babysitter: On Biases in Language Generation. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3407–3412. Association for Computational Linguistics.
- Clark, P., Cowhey, I., Etzioni, O., Khot, T., Sabharwal, A., Schoenick, C., & Tafjord, O. (2018). Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge. arXiv:1803.05457v1.
- Socher, R., Perelygin, A., Wu, J., Chuang, J., Manning, C. D., Ng, A., & Potts, C. (2013). Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 1631–1642. Association for Computational Linguistics.
- Penedo, G., Kydlíček, H., Allal, L. B., Lozhkov, A., Mitchell, M., Raffel, C., Von Werra, L., & Wolf, T. (2024). The FineWeb Datasets: Decanting the Web for the Finest Text Data at Scale (arXiv:2406.17557). arXiv. [Link](http://arxiv.org/abs/2406.17557)
- Singh, S., Vargus, F., Dsouza, D., Karlsson, B. F., Mahendiran, A., Ko, W.-Y., Shandilya, H., Patel, J., Mataciunas, D., OMahony, L., Zhang, M., Hettiarachchi, R., Wilson, J., Machado, M., Moura, L. S., Krzemiński, D., Fadaei, H., Ergün, I., Okoh, I., … Hooker, S. (2024). Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning (arXiv:2402.06619). arXiv.
[Link](http://arxiv.org/abs/2402.06619)

</details>

</details>

The model was trained for 3 epochs, with two final rounds of 0.3B higher-quality tokens each, meaning that the total number of tokens seen during pre-training amounts to roughly 7.8 trillion tokens.

We provide an extensive Datasheet section following the best practices defined by [(Gebru et al., 2021)](https://arxiv.org/pdf/1803.09010).

<details>
<summary>Datasheet</summary>

#### Motivation

**For what purpose was the dataset created? Was there a specific task in mind? Was there a specific gap that needed to be filled? Please provide a description.**

The purpose of creating this dataset is to pre-train the Salamandra family of multilingual models with high performance in a large number of European languages (35) and code (including 92 different programming languages). In addition, we especially aim to represent the co-official languages of Spain: Spanish, Catalan, Galician, and Basque. This is the reason why we carry out an oversampling of these languages.

We detected that there is a great lack of massive multilingual data, especially in minority languages (Ostendorff & Rehm, 2023), so part of our efforts in the creation of this pre-training dataset have resulted in the contribution to large projects such as the Community OSCAR (Brack et al., 2024), which includes 151 languages and 40T words, or CATalog (Palomar-Giner et al., 2024), the largest open dataset in Catalan in the world.

**Who created the dataset (e.g., which team, research group) and on behalf of which entity (e.g., company, institution, organization)?**

The dataset has been created by the Language Technologies unit (LangTech) of the Barcelona Supercomputing Center - Centro Nacional de Supercomputación (BSC-CNS), which aims to advance the field of natural language processing through cutting-edge research and development and the use of HPC. In particular, it was created by the unit's data team, the main contributors being Javier Saiz, Ferran Espuña, and Jorge Palomar.

However, the creation of the dataset would not have been possible without the collaboration of a large number of collaborators, partners, and public institutions, which can be found in detail in the acknowledgements.

**Who funded the creation of the dataset? If there is an associated grant, please provide the name of the grantor and the grant name and number.**

This work/research has been promoted and financed by the Government of Catalonia through the [Aina project](https://projecteaina.cat/).

#### Composition

**What do the instances that comprise the dataset represent (e.g., documents, photos, people, countries)? Are there multiple types of instances (e.g., movies, users, and ratings; people and interactions between them; nodes and edges)? Please provide a description.**

The dataset consists entirely of text documents in various languages. Specifically, data was mainly sourced from the following databases and repositories:

- **Common Crawl:** Repository that holds website data and is run by the Common Crawl non-profit organization. It is updated monthly and is distributed under the CC0 1.0 public domain license.
- **GitHub:** Community platform that allows developers to create, store, manage, and share their code. Repositories are crawled and then distributed with their original licenses, which may vary from permissive to non-commercial licenses.
- **Wikimedia:** Database that holds the collection of databases managed by the Wikimedia Foundation, including Wikipedia, Wikibooks, Wikinews, Wikiquote, Wikisource, and Wikivoyage. It is updated monthly and is distributed under the Creative Commons Attribution-ShareAlike License 4.0.
- **EurLex:** Repository that holds the collection of legal documents from the European Union, available in all of the EU’s 24 official languages and run by the Publications Office of the European Union. It is updated daily and is distributed under the Creative Commons Attribution 4.0 International license.
- **Other repositories:** Specific repositories were crawled under permission for domain-specific corpora, which include academic, legal, and newspaper repositories.

We provide a complete list of dataset sources at the end of this section.

**How many instances are there in total (of each type, if appropriate)?**

The dataset contains a diverse range of instances across multiple languages, with notable adjustments for certain languages. English represents the largest portion, accounting for 39.08% of the total data. Spanish was upsampled by a factor of 2, bringing its share to 16.59%, while Catalan (1.84%), Basque (0.26%), and Galician (0.36%) were also upsampled by 2. On the other hand, code-related data was downsampled by half, making up 6.42% of the total. Other prominent languages include French (6.59%), Russian (5.39%), German (4.25%), and Hungarian (3.93%), with several additional languages contributing between 1% and 2%, and smaller portions represented by a variety of others.

**Does the dataset contain all possible instances or is it a sample (not necessarily random) of instances from a larger set? If the dataset is a sample, then what is the larger set? Is the sample representative of the larger set (e.g., geographic coverage)? If so, please describe how this representativeness was validated/verified. If it is not representative of the larger set, please describe why not (e.g., to cover a more diverse range of instances, because instances were withheld or unavailable).**

The dataset is a sample from multiple sources, with different weights based on the primary language of the content: Spanish, Catalan, Basque, and Galician content was upsampled by a factor of two, while programming languages were downsampled by a factor of half. Other sources were sampled in proportion to their occurrence.

**What data does each instance consist of? “Raw” data (e.g., unprocessed text or images) or features? In either case, please provide a description.**

Each instance consists of a text document processed for deduplication, language identification, and source-specific filtering. Some documents required optical character recognition (OCR) to extract text from non-text formats such as PDFs.

**Is there a label or target associated with each instance? If so, please provide a description.**

Each instance is labeled with a unique identifier, the primary language of the content, and the URL for web-sourced instances. Additional labels were automatically assigned to detect specific types of content (harmful or toxic content) and to assign preliminary indicators of undesired qualities (very short documents, high density of symbols, etc.), which were used for filtering instances.

**Is any information missing from individual instances? If so, please provide a description, explaining why this information is missing (e.g., because it was unavailable).
This does not include intentionally removed information, but might include, e.g., redacted text.**

No significant information is missing from the instances.

**Are relationships between individual instances made explicit (e.g., users’ movie ratings, social network links)? If so, please describe how these relationships are made explicit.**

Instances are related through shared metadata, such as source and language identifiers.

**Are there recommended data splits (e.g., training, development/validation, testing)? If so, please provide a description of these splits, explaining the rationale behind them.**

The dataset is split randomly into training, validation, and test sets.

**Are there any errors, sources of noise, or redundancies in the dataset? If so, please provide a description.**

Despite removing duplicated instances within each source, redundancy remains at the paragraph and sentence levels, particularly in web-sourced instances where SEO techniques and templates contribute to repeated textual patterns. Some instances may also be duplicated across sources due to format variations.

**Is the dataset self-contained, or does it link to or otherwise rely on external resources (e.g., websites, tweets, other datasets)? If it links to or relies on external resources, a) are there guarantees that they will exist, and remain constant, over time; b) are there official archival versions of the complete dataset (i.e., including the external resources as they existed at the time the dataset was created); c) are there any restrictions (e.g., licenses, fees) associated with any of the external resources that might apply to a dataset consumer? Please provide descriptions of all external resources and any restrictions associated with them, as well as links or other access points, as appropriate.**

The dataset is self-contained and does not rely on external resources.

**Does the dataset contain data that might be considered confidential (e.g., data that is protected by legal privilege or by doctor–patient confidentiality, data that includes the content of individuals’ non-public communications)? If so, please provide a description.**

The dataset does not contain confidential data.

**Does the dataset contain data that, if viewed directly, might be offensive, insulting, threatening, or might otherwise cause anxiety? If so, please describe why. If the dataset does not relate to people, you may skip the remaining questions in this section.**

The dataset includes web-crawled content, which may overrepresent pornographic material across languages (Kreutzer et al., 2022). Although pre-processing techniques were applied to mitigate offensive content, the heterogeneity and scale of web-sourced data make exhaustive filtering challenging: it is next to impossible to identify all adult content without resorting to excessive filtering, which may in turn negatively affect certain demographic groups (Dodge et al., 2021).

**Does the dataset identify any subpopulations (e.g., by age, gender)? If so, please describe how these subpopulations are identified and provide a description of their respective distributions within the dataset.**

The dataset does not explicitly identify any subpopulations.

**Is it possible to identify individuals (i.e., one or more natural persons), either directly or indirectly (i.e., in combination with other data) from the dataset?
If so, please describe how.**

Web-sourced instances in the dataset may contain personally identifiable information (PII) that is publicly available on the Web, such as names, IP addresses, email addresses, and phone numbers. While it would be possible to indirectly identify individuals through the combination of multiple data points, the nature and scale of web data makes it difficult to parse such information. In any case, efforts are made to filter or anonymize sensitive data during pre-processing, but some identifiable information may remain in the dataset.

**Does the dataset contain data that might be considered sensitive in any way? If so, please provide a description.**

Given that the dataset includes web-sourced content and other publicly available documents, instances may inadvertently reveal financial information, health-related details, or forms of government identification, such as social security numbers (Subramani et al., 2023), especially if the content originates from less-regulated sources or user-generated platforms.

#### Collection Process

**How was the data collected?**

This dataset is constituted by combining several sources, whose acquisition methods can be classified into three groups:

- Web-sourced datasets with some preprocessing available under permissive license (e.g., Common Crawl).
- Domain-specific or language-specific raw crawls (e.g., Spanish Crawling).
- Manually curated data obtained through collaborators, data providers (by means of legal assignment agreements) or open source projects (e.g., CATalog).

**What mechanisms or procedures were used to collect the data? How were these mechanisms or procedures validated?**

According to the three groups previously defined, these are the mechanisms used in each of them:

- Open direct download. Validation: data integrity tests.
- Ad-hoc scrapers or crawlers. Validation: software unit and data integrity tests.
- Direct download via FTP, SFTP, API or S3. Validation: data integrity tests.

**If the dataset is a sample from a larger set, what was the sampling strategy?**

The sampling strategy was to use the whole dataset resulting from the filtering explained in the ‘preprocessing/cleaning/labelling’ section, with the particularity that an upsampling of 2 (i.e., twice the probability of sampling a document) was performed for the co-official languages of Spain (Spanish, Catalan, Galician, Basque), and a downsampling of 1/2 was applied for code (half the probability of sampling a code document, evenly distributed among all programming languages).

**Who was involved in the data collection process and how were they compensated?**

This data is generally extracted, filtered and sampled by automated processes. The code required to run these processes has been developed entirely by members of the LangTech data team, or otherwise obtained from open-source software. Furthermore, there has been no monetary consideration for acquiring data from suppliers.

**Over what timeframe was the data collected? Does this timeframe match the creation timeframe of the data associated with the instances? If not, please describe the timeframe in which the data associated with the instances was created.**

Data were acquired and processed from April 2023 to April 2024. However, as mentioned, much data has been obtained from open projects such as Common Crawl, which contains data from 2014, so it is the end date (04/2024) rather than the start date that is important.

**Were any ethical review processes conducted?
If so, please provide a description of these review processes, including the outcomes, as well as a link or other access point to any supporting documentation.**

No particular ethical review process has been carried out as the data is mostly open and not particularly sensitive. However, we have an internal evaluation team and a bias team to monitor ethical issues. In addition, we work closely with the ‘Observatori d'Ètica en Intel·ligència Artificial’ (OEIAC) and the ‘Agencia Española de Supervisión de la Inteligencia Artificial’ (AESIA) to audit the processes we carry out from an ethical and legal point of view, respectively.

#### Preprocessing

**Was any preprocessing/cleaning/labeling of the data done? If so, please provide a description. If not, you may skip the remaining questions in this section.**

Instances of text documents were not altered, but web-sourced documents were filtered based on specific criteria along two dimensions:

- Quality: documents with a score lower than 0.8, based on undesired qualities such as a low number of lines, very short sentences, the presence of long footers and headers, and a high percentage of punctuation, obtained through CURATE (Palomar-Giner et al., 2024), were filtered out.
- Harmful or adult content: documents originating from Colossal OSCAR were filtered using LLM-Datasets (Ostendorff et al., 2024) based on the perplexity from a language model (‘harmful_pp’ field) provided by the Ungoliant pipeline (Abadji et al., 2021).

**Was the “raw” data saved in addition to the preprocessed/cleaned/labeled data? If so, please provide a link or other access point to the “raw” data.**

The original raw data was not kept.

**Is the software that was used to preprocess/clean/label the data available? If so, please provide a link or other access point.**

Yes, the preprocessing and filtering software is open-sourced. The [CURATE](https://github.com/langtech-bsc/CURATE) pipeline was used for Spanish Crawling and CATalog, and the [Ungoliant](https://github.com/oscar-project/ungoliant) pipeline was used for the OSCAR project.

#### Uses

**Has the dataset been used for any tasks already? If so, please provide a description.**

Pre-train the Salamandra model family.

**What (other) tasks could the dataset be used for?**

The data can be used primarily to pre-train other language models, which can then be used for a wide range of use cases. The dataset could also be used for other tasks such as fine-tuning language models, cross-lingual NLP tasks, machine translation, domain-specific text generation, and language-specific data analysis.

**Is there anything about the composition of the dataset or the way it was collected and preprocessed/cleaned/labeled that might impact future uses? Is there anything a dataset consumer could do to mitigate these risks or harms?**

Web-crawled content is over-represented with standard language varieties, impacting language model performance for minority languages. Language diversity in data is crucial to avoid bias, especially in encoding non-standard dialects, preventing the exclusion of demographic groups. Moreover, despite legal uncertainties in web-scraped data, we prioritize permissive licenses and privacy protection measures, acknowledging the challenges posed by personally identifiable information (PII) within large-scale datasets. Our ongoing efforts aim to address privacy concerns and contribute to a more inclusive linguistic dataset.
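To make the two filtering dimensions from the Preprocessing section above concrete, here is a minimal sketch of document-level filtering. The `quality_score` and `harmful_pp` field names and the 0.8 quality threshold follow this datasheet; the harmful-perplexity cut-off and its direction are illustrative assumptions only.

```python
# Minimal sketch of the threshold-based filtering described in the
# Preprocessing section. Only the 0.8 quality cut-off is documented here;
# HARMFUL_PP_THRESHOLD and the "lower perplexity = more harmful" direction
# are assumptions for illustration.

QUALITY_THRESHOLD = 0.8        # documents scoring below this are dropped
HARMFUL_PP_THRESHOLD = 1000.0  # hypothetical value, for illustration only

def keep_document(doc: dict) -> bool:
    """Return True if a document passes both filters."""
    if doc.get("quality_score", 0.0) < QUALITY_THRESHOLD:
        return False  # low quality: short lines, boilerplate, heavy punctuation
    if doc.get("harmful_pp", float("inf")) < HARMFUL_PP_THRESHOLD:
        return False  # low perplexity under the harmful-content LM (assumed flag)
    return True

docs = [
    {"text": "...", "quality_score": 0.93, "harmful_pp": 4_500.0},
    {"text": "...", "quality_score": 0.55, "harmful_pp": 4_500.0},
]
print([keep_document(d) for d in docs])  # [True, False]
```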
**Are there tasks for which the dataset should not be used?**

-

#### Distribution

**Will the dataset be distributed to third parties outside of the entity on behalf of which the dataset was created? If so, please provide a description.**

The dataset will not be released or distributed to third parties. Any related question to distribution is omitted in this section.

#### Maintenance

**Who will be supporting/hosting/maintaining the dataset?**

The dataset will be hosted by the Language Technologies unit (LangTech) of the Barcelona Supercomputing Center (BSC). The team will ensure regular updates and monitor the dataset for any issues related to content integrity, legal compliance, and bias for the sources they are responsible for.

**How can the owner/curator/manager of the dataset be contacted?**

The data owner may be contacted with the email address [email protected].

**Will the dataset be updated?**

The dataset will not be updated.

**If the dataset relates to people, are there applicable limits on the retention of the data associated with the instances? If so, please describe these limits and explain how they will be enforced.**

The dataset does not keep sensitive data that could allow direct identification of individuals, apart from the data that is publicly available in web-sourced content. Due to the sheer volume and diversity of web data, it is not feasible to notify individuals or manage data retention on an individual basis. However, efforts are made to mitigate the risks associated with sensitive information through pre-processing and filtering to remove identifiable or harmful content. Despite these measures, vigilance is maintained to address potential privacy and ethical issues.

**Will older versions of the dataset continue to be supported/hosted/maintained? If so, please describe how. If not, please describe how its obsolescence will be communicated to dataset consumers.**

Since the dataset will not be updated, only the final version will be kept.

**If others want to extend/augment/build on/contribute to the dataset, is there a mechanism for them to do so?**

The dataset does not allow for external contributions.

</details>

### Finetuning Data

This instruction-tuned variant has been trained with a mixture of 276k English, Spanish, and Catalan multi-turn instructions gathered from open datasets:

| Dataset | ca | en | es |
|-----------------------|:------:|:------:|:------:|
| alpaca-cleaned | - | 50,000 | - |
| aya-dataset | - | 3,944 | 3,854 |
| CoQCat | 4,797 | - | - |
| databricks-dolly-15k | - | 15,011 | - |
| dolly-3k-ca | 3,232 | - | - |
| flores-instr | 1,994 | 1,994 | 3,988 |
| MentorCA | 7,122 | - | - |
| MentorES | - | - | 7,122 |
| no-robots | - | 9,499 | - |
| oasst-ca | 2,518 | - | - |
| oasst2 | 750 | 31,086 | 15,438 |
| open-orca | - | 50,000 | - |
| RagMultilingual | 16,043 | 14,997 | 11,263 |
| tower-blocks | - | 19,895 | 2,000 |
| **Total** | **36,456** | **196,426** | **43,665** |

---

## Evaluation

### Gold-standard benchmarks

Evaluation is done using the Language Model Evaluation Harness (Gao et al., 2024).
We evaluate on a set of tasks taken from [SpanishBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/spanish_bench), [CatalanBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/catalan_bench), [BasqueBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/basque_bench) and [GalicianBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/galician_bench). These benchmarks include both new and existing tasks and datasets. Given that this is an instructed model, we use the LM Evaluation Harness's native `chat-template` feature in the setup. In the tables below, we include the results on a selection of evaluation datasets that represent the model's performance across a variety of tasks within these benchmarks.

We only use tasks that are either human generated, human translated, or with a strong human-in-the-loop (i.e., machine translation followed by professional revision or machine generation followed by human revision and annotation). This is the reason behind the varying number of tasks reported across languages. As more tasks that fulfill these requirements are published, we will update the presented results. We also intend to expand the evaluation to other languages, as long as the datasets meet our quality standards.

During the implementation of the evaluation we observed a series of issues worth considering when replicating and interpreting the results presented. These issues include variances of ≈1.5% in performance on some tasks, depending on the version of the `transformers` library used and on the use (or lack of use) of tensor parallelism when loading a model. When implementing existing tasks, we carry out a comprehensive quality evaluation of the dataset, the Harness task itself, and what kind of input models see during evaluation. Our implementation (see links above) addresses multiple existing problems such as errors in datasets and prompts, and lack of pre-processing. All this means that results will vary if using other Harness implementations, and may slightly vary depending on the replication setup.

It should be noted that these results are subject to all the drawbacks of every current gold-standard evaluation, and that the figures do not fully represent the model's capabilities and potential. We thus advise caution when reading and interpreting the results.

A full list of results compared to other baselines, a discussion of the model's performance across tasks and its implications, and details regarding problem-solving with task implementation will soon be available in the technical report.

All results reported below are in a 0-shot setting.
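For reference, a roughly equivalent 0-shot run can be launched through the Harness's Python API. This is a sketch rather than our exact replication setup: the `apply_chat_template` argument assumes a recent Harness release (v0.4+), and the model and task names are illustrative examples drawn from the links and tables in this card.

```python
# Illustrative sketch of a 0-shot Harness run with the chat template applied.
# Assumes lm-evaluation-harness v0.4+; not the exact setup behind the tables.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=BSC-LT/salamandra-2b-instruct,dtype=bfloat16",
    tasks=["xstorycloze_es", "paws_ca", "xstorycloze_eu"],  # example benchmark tasks
    num_fewshot=0,
    apply_chat_template=True,  # the native chat-template feature mentioned above
)
print(results["results"])
```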
#### Spanish

<table><thead> <tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead> <tbody> <tr> <td>Commonsense Reasoning</td> <td>xstorycloze_es</td> <td>acc</td> <td>62.34</td> </tr> <tr> <td rowspan="2">NLI</td> <td>wnli_es</td> <td>acc</td> <td>47.89</td> </tr> <tr> <td>xnli_es</td> <td>acc</td> <td>47.03</td> </tr> <tr> <td>Paraphrasing</td> <td>paws_es</td> <td>acc</td> <td>55.5</td> </tr> <tr> <td>QA</td> <td>xquad_es</td> <td>acc</td> <td>42.21</td> </tr> <tr> <td>Translation</td> <td>flores_es</td> <td>bleu</td> <td>20.27</td> </tr> </tbody> </table>

#### Catalan

<table><thead> <tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead> <tbody> <tr> <td rowspan="2">Commonsense Reasoning</td> <td>copa_ca</td> <td>acc</td> <td>70.4</td> </tr> <tr> <td>xstorycloze_ca</td> <td>acc</td> <td>63.07</td> </tr> <tr> <td rowspan="2">NLI</td> <td>wnli_ca</td> <td>acc</td> <td>52.11</td> </tr> <tr> <td>xnli_ca</td> <td>acc</td> <td>51.69</td> </tr> <tr> <td rowspan="2">Paraphrasing</td> <td>parafraseja</td> <td>acc</td> <td>61.88</td> </tr> <tr> <td>paws_ca</td> <td>acc</td> <td>57.7</td> </tr> <tr> <td rowspan="5">QA</td> <td>arc_ca_easy</td> <td>acc</td> <td>51.94</td> </tr> <tr> <td>arc_ca_challenge</td> <td>acc</td> <td>29.52</td> </tr> <tr> <td>openbookqa_ca</td> <td>acc</td> <td>26.4</td> </tr> <tr> <td>piqa_ca</td> <td>acc</td> <td>62.89</td> </tr> <tr> <td>siqa_ca</td> <td>acc</td> <td>42.63</td> </tr> <tr> <td>Translation</td> <td>flores_ca</td> <td>bleu</td> <td>24.48</td> </tr> </tbody></table>

#### Basque

<table><thead> <tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead> <tbody> <tr> <td rowspan="2">Commonsense Reasoning</td> <td>xcopa_eu</td> <td>acc</td> <td>53.6</td> </tr> <tr> <td>xstorycloze_eu</td> <td>acc</td> <td>56.39</td> </tr> <tr> <td rowspan="2">NLI</td> <td>wnli_eu</td> <td>acc</td> <td>45.07</td> </tr> <tr> <td>xnli_eu</td> <td>acc</td> <td>39.44</td> </tr> <tr> <td rowspan="3">QA</td> <td>eus_exams</td> <td>acc</td> <td>25.35</td> </tr> <tr> <td>eus_proficiency</td> <td>acc</td> <td>26.37</td> </tr> <tr> <td>eus_trivia</td> <td>acc</td> <td>26.24</td> </tr> <tr> <td>Reading Comprehension</td> <td>eus_reading</td> <td>acc</td> <td>24.72</td> </tr> <tr> <td>Translation</td> <td>flores_eu</td> <td>bleu</td> <td>9.67</td> </tr> </tbody></table>

#### Galician

<table><thead> <tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead> <tbody> <tr> <td rowspan="2">Paraphrasing</td> <td>parafrases_gl</td> <td>acc</td> <td>50.00</td> </tr> <tr> <td>paws_gl</td> <td>acc</td> <td>52.20</td> </tr> <tr> <td>QA</td> <td>openbookqa_gl</td> <td>acc</td> <td>33.2</td> </tr> <tr> <td>Translation</td> <td>flores_gl</td> <td>bleu</td> <td>22.39</td> </tr> </tbody> </table>

### LLM-as-a-judge

We use [Prometheus-2 8x7B](https://huggingface.co/prometheus-eval/prometheus-8x7b-v2.0) as a judge to evaluate the responses of the model. Tasks are created from existing multilingual evaluation datasets covering the same categories as the ones measured in our gold-standard benchmarks. We randomly select a subset of 250 instances per language from the `test` set of each source dataset. To evaluate the responses of our model, we use task-specific criteria developed in-house for the _LLM-judge_ to use. Each criterion is measured either as a 5-point Likert scale or as a binary task, depending on the idiosyncrasy of the task and criterion.
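The aggregation behind the scores reported below (a per-criterion average, plus a robustness value computed from the three prompt paraphrases described in the next paragraph) can be sketched as follows. The helper is illustrative, not from our codebase, and the exact variance estimator is an assumption.

```python
# Sketch of the score aggregation for the LLM-judge results below: each
# instance is judged under three prompt paraphrases; we report the mean score
# and a robustness value (mean per-instance variance; closer to 0 = more
# robust). Population variance is assumed; the actual estimator may differ.
from statistics import mean, pvariance

def aggregate(scores_per_instance: list[list[float]]) -> tuple[float, float]:
    """scores_per_instance: one list of 3 judge scores per source instance."""
    avg = mean(s for inst in scores_per_instance for s in inst)
    robustness = mean(pvariance(inst) for inst in scores_per_instance)
    return avg, robustness

# Two instances, each judged under three prompt styles (5-point Likert):
avg, rob = aggregate([[4, 4, 3], [2, 4, 5]])
print(f"{avg:.2f}/{rob:.2f}")  # reported below in "average/robustness" form
```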
Prompts for each task are created in various ways to score the model's robustness in addition to these criteria. This is done by presenting the same source instance within three different prompts. We then calculate the variance between the scores assigned by the _LLM-judge_ to our model's responses to the three prompt styles and average it across all instances. Prompts are human translated to all languages measured. We do not provide the _LLM-judge_ with a reference answer.

The _judge_ prompt we use during evaluation is the same one used to fine-tune the Prometheus-2 family. We keep the _judge_ prompt and the criteria used to present the _LLM-judge_ with the task prompts and model responses in English for evaluation across languages. The _judge_ prompt used is:

```python
"You are a fair judge assistant tasked with providing clear, objective feedback based on specific criteria, ensuring each assessment reflects the absolute standards set for performance.

###Task Description:
An instruction (might include an Input inside it), a response to evaluate, and a score rubric representing a evaluation criteria are given.
1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general.
2. After writing a feedback, write a score that is an integer between {a} and {b}. You should refer to the score rubric.
3. The output format should look as follows: \"Feedback: (write a feedback for criteria) [RESULT] (an integer number between {a} and {b})\"
4. Please do not generate any other opening, closing, and explanations.

###The instruction to evaluate:
{input}

###Response to evaluate:
{prediction}

###Score Rubrics:
{criteria}

###Feedback:"
```

As an example, prompts for the Math task in English are based on instances from [MGSM](https://huggingface.co/datasets/juletxara/mgsm), and each instance is presented within these prompts:

```python
"en": [
    ("I need help with this math problem: \"", "\" Give me the answer step by step and also the final result separately."),
    ("Can you please help me answer this? \"", "\" Explain the answer and give me the final result as well. Thanks."),
    ("Help me with this problem: \"", "\" I need the answer explained and the final result separately.")
]
```

This task is then evaluated by the _LLM-judge_ using two criteria, reasoning capability (5-point Likert) and mathematical correctness (binary):

```python
reasoning_capability_criteria = {
    "reasoning_capability": """
[Does the model's answer demonstrate reasoning capability?]
Score 1: The answer demonstrates poor reasoning, with illogical arguments or conclusions that do not follow from the provided information.
Score 2: The answer shows weak reasoning, with some logical connections but also contains significant flaws or gaps in the argumentation.
Score 3: The answer demonstrates adequate reasoning, with generally logical arguments, but may have minor flaws or a lack of depth in the reasoning process.
Score 4: The answer shows strong reasoning, with well-structured arguments and conclusions that logically follow from the information provided.
Score 5: The answer demonstrates exceptional reasoning, with clear, coherent, and insightful arguments that are logically sound and well-supported by the information provided."""
}

mathematical_correctness_binary_criteria = {
    "mathematical_correctness_binary": """
[Is the model's answer mathematically correct?]
Score 0: The answer contains mathematical errors that render the solution incorrect or unreliable.
Score 1: The answer is mathematically correct, with accurate calculations and appropriate use of mathematical concepts."""
}
```

#### Multilingual results

Here, we present results for seven categories of tasks in Spanish, Catalan, Basque, Galician, and English. Results are presented for each task, criterion and language. Criteria with a `(B)` after their name are binary criteria (i.e., numbers go from 0 to 1, where 1 is best). The rest of the criteria are measured using a 5-point Likert scale, where 5 is best. The first number of the pair of numbers separated by `/` shows the average score for the criterion (and language). The second number of each pair is the robustness score, where numbers closer to 0 mean that the model generates similar responses when comparing the three prompt varieties for a single instance.

Further details on all tasks and criteria, a full list of results compared to other baselines, a discussion of the model's performance across tasks and its implications, and details regarding problem-solving with task implementation will soon be available in the technical report.

| **Category** | **Dataset** | **Metric** | **es** | **ca** | **gl** | **eu** | **en** |
|---------|---------|-----------|-------|-------|-------|-------|-------|
| **Commonsense Reasoning** | **XStoryCloze** | Ending Coherence (1 to 5) | 2.36/0.66 | 2.49/0.76 | 2.45/0.68 | 2.30/0.67 | 3.06/0.77 |
| **Paraphrasing** | **PAWS** | Paraphrase Completeness (0/1) | 0.60/0.15 | 0.54/0.17 | 0.64/0.14 | ----/---- | 0.79/0.11 |
| | | Paraphrase Generation (1 to 5) | 2.89/1.46 | 2.71/1.70 | 2.80/1.21 | ----/---- | 3.64/0.80 |
| | | Paraphrase Grammatical Correctness (0/1) | 0.74/0.13 | 0.68/0.15 | 0.78/0.10 | ----/---- | 0.89/0.07 |
| **Reading Comprehension** | **Belebele** | Passage Comprehension (1 to 5) | 3.05/0.60 | 2.81/0.66 | 2.74/0.78 | 2.52/0.46 | 3.11/0.71 |
| | | Answer Relevance (0/1) | 0.74/0.09 | 0.66/0.11 | 0.65/0.12 | 0.59/0.12 | 0.75/0.09 |
| **Extreme Summarization** | **XLSum & caBreu & summarization_gl** | Extreme Summarization Informativeness (1 to 5) | 3.07/0.39 | 3.33/0.43 | 3.11/0.36 | ----/---- | 3.06/0.35 |
| | | Extreme Summarization Conciseness (1 to 5) | 2.92/0.42 | 2.67/0.54 | 2.93/0.39 | ----/---- | 3.13/0.31 |
| **Mathematics** | **mgsm** | Reasoning Capability (1 to 5) | 1.89/0.47 | 1.91/0.45 | 1.97/0.43 | 2.17/0.44 | 2.16/0.56 |
| | | Mathematical Correctness (0/1) | 0.24/0.10 | 0.28/0.11 | 0.27/0.11 | 0.44/0.13 | 0.27/0.10 |
| **Translation from Language** | **FLoRes** | Translation Fluency (1 to 5) | 3.74/0.15 | 3.69/0.22 | ----/---- | ----/---- | 3.69/0.18 |
| | | Translation Accuracy (1 to 5) | 4.01/0.24 | 3.98/0.31 | ----/---- | ----/---- | 3.98/0.25 |
| **Translation to Language** | **FLoRes** | Translation Fluency (1 to 5) | 3.75/0.14 | 3.69/0.17 | ----/---- | ----/---- | 4.09/0.16 |
| | | Translation Accuracy (1 to 5) | 4.08/0.22 | 3.98/0.24 | ----/---- | ----/---- | 4.47/0.18 |

---

## Ethical Considerations and Limitations

We examine the presence of undesired societal and cognitive biases present in this model using different benchmarks. For societal biases, we test performance using the BBQ dataset (Parrish et al., 2022) in the original English and the Regard dataset (Sheng et al., 2019). While the model achieves moderate accuracies (between 0.5 and 0.6 depending on the social group) in disambiguated settings, it performs very poorly in ambiguous settings.
Taken together, these results suggest the pervasiveness of social biases that may have an effect on task performance.

Our cognitive bias analysis focuses on positional effects in 0-shot settings, and majority class bias in few-shot settings. For positional effects, we leverage the ARC Multiple Choice Question dataset (Clark et al., 2018). We observe significant but relatively weak primacy effects, whereby the model shows a preference for answers towards the beginning of the list of provided answers. We measure majority class effects in few-shot settings using SST-2 (Socher et al., 2013). We again detect significant effects, with a small effect size. This suggests that the model is relatively robust against the examined cognitive biases.

We highlight that our analyses of these biases are by no means exhaustive and are limited by the relative scarcity of adequate resources in all languages present in the training data. We aim to gradually extend and expand our analyses in future work. These results can be expected from a model that has undergone only a preliminary instruction tuning.

These tests are performed in order to show the biases the model may contain. We urge developers to take them into account and perform safety testing and tuning tailored to their specific applications of the model.

---

## Additional information

### Author

The Language Technologies Unit from Barcelona Supercomputing Center.

### Contact

For further information, please send an email to <[email protected]>.

### Copyright

Copyright(c) 2024 by Language Technologies Unit, Barcelona Supercomputing Center.

### Funding

This work has been promoted and financed by the Government of Catalonia through the [Aina Project](https://projecteaina.cat/).

This work is funded by the _Ministerio para la Transformación Digital y de la Función Pública_ - Funded by EU – NextGenerationEU within the framework of the [ILENIA Project](https://proyectoilenia.es/) with reference 2022/TL22/00215337.

### Acknowledgements

This project has benefited from the contributions of numerous teams and institutions, mainly through data contributions, knowledge transfer or technical support.

In Catalonia, many institutions have been involved in the project. Our thanks to Òmnium Cultural, Parlament de Catalunya, Institut d'Estudis Aranesos, Racó Català, Vilaweb, ACN, Nació Digital, El món and Aquí Berguedà.

At the national level, we are especially grateful to our ILENIA project partners: CENID, HiTZ and CiTIUS for their participation. We also extend our genuine gratitude to the Spanish Senate and Congress, Fundación Dialnet, Fundación Elcano and the ‘Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (SIANI)’ of the University of Las Palmas de Gran Canaria.

At the international level, we thank the Welsh government, DFKI, the Occiglot project, especially Malte Ostendorff, and The Common Crawl Foundation, especially Pedro Ortiz, for their collaboration.

We would also like to give special thanks to the NVIDIA team, with whom we have met regularly, especially to Ignacio Sarasua, Adam Henryk Grzywaczewski, Oleg Sudakov, Sergio Perez, Miguel Martinez, Felipes Soares and Meriem Bendris. Their constant support has been especially appreciated throughout the entire process. Their valuable efforts have been instrumental in the development of this work.

### Disclaimer

Be aware that the model may contain biases or other unintended distortions.
When third parties deploy systems or provide services based on this model, or use the model themselves, they bear the responsibility for mitigating any associated risks and ensuring compliance with applicable regulations, including those governing the use of Artificial Intelligence.

The Barcelona Supercomputing Center, as the owner and creator of the model, shall not be held liable for any outcomes resulting from third-party use.

### Citation

Technical report and paper coming soon.

### License

[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

## Model Index

| Model | Base | Instruct |
|:---:|:---:|:---:|
| 2B | [Link](https://huggingface.co/BSC-LT/salamandra-2b) | [Link](https://huggingface.co/BSC-LT/salamandra-2b-instruct) |
| 7B | [Link](https://huggingface.co/BSC-LT/salamandra-7b) | [Link](https://huggingface.co/BSC-LT/salamandra-7b-instruct) |
| 40B | WiP | WiP |
[ "QUESTION_ANSWERING", "TRANSLATION", "SUMMARIZATION", "PARAPHRASING" ]
[ "BEAR", "SCIELO" ]
Non_BioNLP
michaelfeil/ct2fast-pythia-160m
michaelfeil
null
[ "transformers", "ctranslate2", "int8", "float16", "pytorch", "causal-lm", "pythia", "en", "dataset:the_pile", "arxiv:2101.00027", "arxiv:2201.07311", "license:apache-2.0", "endpoints_compatible", "region:us" ]
1,682
1,684
12
1
---
datasets:
- the_pile
language:
- en
license: apache-2.0
tags:
- ctranslate2
- int8
- float16
- pytorch
- causal-lm
- pythia
---

# Fast-Inference with Ctranslate2

Speedup inference while reducing memory by 2x-4x using int8 inference in C++ on CPU or GPU.

quantized version of [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m)

```bash
pip install hf-hub-ctranslate2>=2.0.6
```

Converted on 2023-05-19 using

```
ct2-transformers-converter --model EleutherAI/pythia-160m --output_dir /home/michael/tmp-ct2fast-pythia-160m --force --copy_files tokenizer.json README.md tokenizer_config.json special_tokens_map.json .gitattributes --quantization float16
```

Checkpoint compatible to [ctranslate2>=3.13.0](https://github.com/OpenNMT/CTranslate2) and [hf-hub-ctranslate2>=2.0.6](https://github.com/michaelfeil/hf-hub-ctranslate2)

- `compute_type=int8_float16` for `device="cuda"`
- `compute_type=int8` for `device="cpu"`

```python
from hf_hub_ctranslate2 import TranslatorCT2fromHfHub, GeneratorCT2fromHfHub
from transformers import AutoTokenizer

model_name = "michaelfeil/ct2fast-pythia-160m"
# use either TranslatorCT2fromHfHub or GeneratorCT2fromHfHub here, depending on model.
model = GeneratorCT2fromHfHub(
    # load in int8 on CUDA
    model_name_or_path=model_name,
    device="cuda",
    compute_type="int8_float16",
    tokenizer=AutoTokenizer.from_pretrained("EleutherAI/pythia-160m")
)
outputs = model.generate(
    text=["How do you call a fast Flan-ingo?", "User: How are you doing? Bot:"],
)
print(outputs)
```

# Licence and other remarks:

This is just a quantized version. Licence conditions are intended to be identical to the original huggingface repo.

# Original description

The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research. It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. We also provide 154 intermediate checkpoints per model, hosted on Hugging Face as branches.

The Pythia model suite was deliberately designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites.

<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>

Previously, we released an early version of the Pythia suite to the public. However, we decided to retrain the model suite to address a few hyperparameter discrepancies. This model card <a href="#changelog">lists the changes</a>; see appendix B in the Pythia paper for further discussion. We found no difference in benchmark performance between the two Pythia versions. The old models are [still available](https://huggingface.co/models?other=pythia_v0), but we suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**

Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts.
</details>
<br>

# Pythia-160M

## Model Details

- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).

<figure>

| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |

<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. “Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption>
</figure>

## Uses and Limitations

### Intended Use

The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. We also provide 154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints `step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to `step143000`. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model.

You may also further fine-tune and adapt Pythia-160M for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-160M as a basis for your fine-tuned model, please conduct your own risk and bias assessment.

### Out-of-scope use

The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case.

Pythia models are English-language only, and are not suitable for translation or generating text in other languages.

Pythia-160M has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means Pythia-160M will **not** respond to a given prompt the way a product like ChatGPT does.
This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions.

### Limitations and biases

The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text. Never rely on Pythia-160M to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-160M may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting it to other people. Please inform your audience that the text was generated by Pythia-160M.

### Quickstart

Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```

Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia).

## Training

### Training data

[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-160M.

### Training procedure

All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training, from `step1000` to `step143000` (which is the same as `main`). In addition, we also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
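To make the checkpoint spacing concrete, a quick back-of-the-envelope check (an illustrative helper, not from the Pythia codebase) using the figures above:

```python
# Map a checkpoint step to the approximate number of tokens seen so far,
# assuming the quoted batch size of 2,097,152 tokens per optimizer step.
TOKENS_PER_STEP = 2_097_152  # 2M-token batches

def tokens_seen(step: int) -> int:
    """Tokens consumed after `step` optimizer steps."""
    return step * TOKENS_PER_STEP

# Checkpoints are saved every 1,000 steps, i.e. every ~2.1B tokens:
assert tokens_seen(1_000) == 2_097_152_000
# The final checkpoint (`step143000`, identical to `main`) has seen:
assert tokens_seen(143_000) == 299_892_736_000  # ~300B tokens, <1 Pile epoch
```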
All *Pythia* models were trained for 143,000 steps at a batch size of 2M (2,097,152 tokens); this checks out against the total above, since 143,000 × 2,097,152 = 299,892,736,000 tokens.<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).

## Evaluations

All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM.

<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>

<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>

<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>

<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>

<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>

## Changelog

This section compares differences between previously released [Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current models. See Appendix B of the Pythia paper for further discussion of these changes and the motivation behind them. We found that retraining Pythia had no impact on benchmark performance.

- All model sizes are now trained with a uniform batch size of 2M tokens. Previously, the models of size 160M, 410M, and 1.4B parameters were trained with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all models of size 2.8B parameters or smaller had a learning rate (LR) schedule which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and 12B models both used an LR schedule which decayed to a minimum LR of 0. In the retrained runs, all models were trained with the LR decaying to a minimum of 0.1× their maximum LR.

### Naming convention and parameter count

*Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em"> | current Pythia suffix | old suffix | total params | non-embedding params | | --------------------: | ---------: | -------------: | -------------------: | | 70M | 19M | 70,426,624 | 18,915,328 | | 160M | 125M | 162,322,944 | 85,056,000 | | 410M | 350M | 405,334,016 | 302,311,424 | | 1B | 800M | 1,011,781,632 | 805,736,448 | | 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 | | 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 | | 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 | | 12B | 13B | 11,846,072,320 | 11,327,027,200 | </figure>
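As a minimal sketch of how the checkpoint branches described above can be used in practice (for example, to watch a behavior change over training), the snippet below loads a few revisions of Pythia-160M and greedily completes the same prompt at each step. The particular steps chosen here are illustrative, not prescribed by the suite:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# A mix of log-spaced early branches and evenly spaced later ones;
# every name below is a real revision hosted on the Hugging Face Hub.
revisions = ["step1", "step512", "step1000", "step143000"]

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-160m")
inputs = tokenizer("The capital of France is", return_tensors="pt")

for revision in revisions:
    # Each branch holds the full model weights at that point in training.
    model = GPTNeoXForCausalLM.from_pretrained(
        "EleutherAI/pythia-160m", revision=revision
    )
    tokens = model.generate(**inputs, max_new_tokens=8, do_sample=False)
    print(revision, tokenizer.decode(tokens[0]))
```

Because all checkpoints share one tokenizer, only the model weights need to be fetched per revision.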
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
mav23/gpt-neox-20b-GGUF
mav23
null
[ "gguf", "pytorch", "causal-lm", "en", "dataset:EleutherAI/pile", "arxiv:2204.06745", "arxiv:2101.00027", "arxiv:2201.07311", "arxiv:2104.09864", "license:apache-2.0", "endpoints_compatible", "region:us" ]
1,728
1,729
278
0
---
datasets:
- EleutherAI/pile
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
---

GPT-NeoX-20B is a 20 billion parameter autoregressive language model trained on [the Pile](https://pile.eleuther.ai/) using the [GPT-NeoX library](https://github.com/EleutherAI/gpt-neox). Its architecture intentionally resembles that of GPT-3, and is almost identical to that of [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B). Its training dataset contains a multitude of English-language texts, reflecting the general-purpose nature of this model. See the [accompanying paper](https://arxiv.org/abs/2204.06745) for details about model architecture (including how it differs from GPT-3), training procedure, and additional evaluations.

### Model details

- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [GPT-NeoX-20B: An Open-Source Autoregressive Language Model](https://arxiv.org/abs/2204.06745). For details about the training dataset, see [the Pile paper](https://arxiv.org/abs/2101.00027), and [its data sheet](https://arxiv.org/abs/2201.07311).
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing GPT-NeoX-20B documentation before asking about the model on Discord. For general correspondence: [[email protected]](mailto:[email protected]).

<figure style="width:30em">

| Hyperparameter | Value |
| ---------------------- | ----------- |
| n<sub>parameters</sub> | 20554567680 |
| n<sub>layers</sub> | 44 |
| d<sub>model</sub> | 6144 |
| n<sub>heads</sub> | 64 |
| d<sub>head</sub> | 96 |
| n<sub>vocab</sub> | 50257 |
| Sequence Length | 2048 |
| Learning Rate | 0.97 x 10<sup>-4</sup> |
| Positional Encoding | [Rotary Position Embedding (RoPE)](https://arxiv.org/abs/2104.09864) |
</figure>

### Uses and limitations

#### Intended use

GPT-NeoX-20B was developed primarily for research purposes. It learns an inner representation of the English language that can be used to extract features useful for downstream tasks.

In addition to scientific uses, you may also further fine-tune and adapt GPT-NeoX-20B for deployment, as long as your use is in accordance with the Apache 2.0 license. This model works with the [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained GPT-NeoX-20B as a basis for your fine-tuned model, please note that you need to conduct your own risk and bias assessment.

#### Out-of-scope use

GPT-NeoX-20B is **not** intended for deployment as-is. It is not a product and cannot be used for human-facing interactions without supervision.

GPT-NeoX-20B has not been fine-tuned for downstream tasks for which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means GPT-NeoX-20B will likely **not** respond to a given prompt the way products such as ChatGPT do. This is because, unlike GPT-NeoX-20B, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “understand” human instructions and dialogue.

This model is English-language only, and thus cannot be used for translation or generating text in other languages.

#### Limitations and biases

The core functionality of GPT-NeoX-20B is to take a string of text and predict the next token. Remember that the statistically most likely next token need not result in the most “accurate” text.
Never rely on GPT-NeoX-20B to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. GPT-NeoX-20B may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

We recommend curating the outputs of this model before presenting them to a human reader. Please inform your audience that you are using artificially generated text.

#### How to use

If you simply want to try out some prompts, check out [this playground](https://20b.eleuther.ai/).

GPT-NeoX-20B can be loaded using the `AutoModelForCausalLM` functionality:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Both the tokenizer and the model weights are fetched from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neox-20b")
```

### Training

#### Training dataset

The Pile is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).

The Pile was **not** deduplicated before being used to train GPT-NeoX-20B.

#### Training procedure

GPT-NeoX-20B was trained with a batch size of approximately 3.15M tokens (1538 sequences of 2048 tokens each), for a total of 150,000 steps. Tensor parallelism and pipeline parallelism were used to distribute the model across GPUs. Additional details about the training procedure are in [Section 3 of the accompanying paper](https://arxiv.org/abs/2204.06745).

### Evaluations

<figure style="width:55em">

| Model | OpenAI’s LAMBADA | SciQ | PIQA | TriviaQA | ARC (Challenge) |
| ------------- | :--------------: | :-----------: | :-----------: | :-----------: | :-------------: |
| GPT-J-6B | 0.683 ± 0.006 | 0.910 ± 0.009 | 0.752 ± 0.010 | 0.170 ± 0.004 | 0.340 ± 0.014 |
| FairSeq 6.7B | 0.673 ± 0.007 | 0.895 ± 0.010 | 0.762 ± 0.010 | 0.221 ± 0.004 | 0.329 ± 0.014 |
| GPT-3 Curie | 0.693 ± 0.006 | 0.918 ± 0.009 | 0.767 ± 0.010 | 0.196 ± 0.004 | 0.334 ± 0.014 |
| FairSeq 13B | 0.709 ± 0.006 | 0.910 ± 0.009 | 0.769 ± 0.010 | 0.270 ± 0.004 | 0.345 ± 0.014 |
| GPT-NeoX-20B | 0.720 ± 0.006 | 0.928 ± 0.008 | 0.779 ± 0.010 | 0.259 ± 0.004 | 0.380 ± 0.014 |
| GPT-3 DaVinci | 0.752 ± 0.006 | 0.949 ± 0.007 | 0.791 ± 0.009 | 0.409 ± 0.005 | 0.435 ± 0.014 |

<figcaption>Zero-shot performance on selected natural language tasks.</figcaption>
</figure>

This is a heavily abridged version of the evaluation results.
Appendix D of the [GPT-NeoX-20B paper](https://arxiv.org/abs/2204.06745) compares more model sizes and contains additional evaluations, including: zero- and five-shot natural language tasks, zero- and five-shot Basic Arithmetic and MATH, and zero-shot Hendrycks tasks.

### BibTeX

To cite the GPT-NeoX-20B paper:

```
@misc{https://doi.org/10.48550/arxiv.2204.06745,
  doi = {10.48550/ARXIV.2204.06745},
  url = {https://arxiv.org/abs/2204.06745},
  author = {Black, Sid and Biderman, Stella and Hallahan, Eric and Anthony, Quentin and Gao, Leo and Golding, Laurence and He, Horace and Leahy, Connor and McDonell, Kyle and Phang, Jason and Pieler, Michael and Prashanth, USVSN Sai and Purohit, Shivanshu and Reynolds, Laria and Tow, Jonathan and Wang, Ben and Weinbach, Samuel},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {GPT-NeoX-20B: An Open-Source Autoregressive Language Model},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```

# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neox-20b)

| Metric | Value |
|-----------------------|---------------------------|
| Avg. | 36.02 |
| ARC (25-shot) | 45.73 |
| HellaSwag (10-shot) | 73.45 |
| MMLU (5-shot) | 25.0 |
| TruthfulQA (0-shot) | 31.61 |
| Winogrande (5-shot) | 68.9 |
| GSM8K (5-shot) | 2.43 |
| DROP (3-shot) | 5.04 |
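To complement the loading snippet in “How to use” above, here is a minimal generation sketch. The sampling settings are illustrative rather than recommended values, and `device_map="auto"` assumes the `accelerate` package is installed and that enough GPU/CPU memory is available for the 20B weights (roughly 40 GB in float16):

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
# Half precision halves the memory footprint relative to float32;
# device_map="auto" spreads layers across the available devices.
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neox-20b", torch_dtype=torch.float16, device_map="auto"
)

inputs = tokenizer(
    "GPT-NeoX-20B is a language model that", return_tensors="pt"
).to(model.device)
tokens = model.generate(
    **inputs, max_new_tokens=40, do_sample=True, top_p=0.95, temperature=0.8
)
print(tokenizer.decode(tokens[0]))
```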
[ "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
davanstrien/query-to-dataset-viewer-descriptions
davanstrien
sentence-similarity
[ "sentence-transformers", "tensorboard", "safetensors", "new", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:1141", "loss:CachedMultipleNegativesRankingLoss", "custom_code", "en", "dataset:davanstrien/query-to-dataset-viewer-descriptions", "arxiv:1908.10084", "arxiv:2101.06983", "base_model:Alibaba-NLP/gte-base-en-v1.5", "base_model:finetune:Alibaba-NLP/gte-base-en-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,726
1,726
13
1
--- base_model: Alibaba-NLP/gte-base-en-v1.5 datasets: - davanstrien/query-to-dataset-viewer-descriptions language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy - dot_accuracy - manhattan_accuracy - euclidean_accuracy - max_accuracy pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:1141 - loss:CachedMultipleNegativesRankingLoss widget: - source_sentence: 'USER_QUERY: mental health conversation dataset' sentences: - "NEGATIVE: DATASET_NAME: \"llmModeluser/Therapy_sessions_dataset\"\nFEATURES:\ \ {'conversations': {'dtype': 'string', '_type': 'Value'}, 'id': {'dtype': 'string',\ \ '_type': 'Value'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\":\ \ {\n \"conversations\": \"[{'from': 'human', 'value': \\\"I've been feeling\ \ so sad and overwhelmed lately. Work has become such a massive source of stress\ \ for me.\\\"}\\n {'from': 'gpt', 'value': \\\"Hey there, I'm here to listen and\ \ support you. It sounds like work has been really challenging lately. Can you\ \ tell me more about what's been going on?\\\"}\\n {'from': 'human', 'value':\ \ \\\"I recently got a promotion at work, which I thought would be exciting. But\ \ the added responsibilities and pressure have just taken a toll on my mental\ \ health. It's been a really moving experience for me.\\\"}\\n {'from': 'gpt',\ \ 'value': \\\"I can understand how it can be overwhelming when we're faced with\ \ higher expectations. It's okay to acknowledge your emotions and allow yourself\ \ to feel sad in this situation. It's an important part of the healing process.\ \ What specific challenges have you been facing at work?\\\"}\\n {'from': 'human',\ \ 'value': \\\"Well, the workload has increased significantly, and I find it hard\ \ to maintain a work-life balance. I've been staying late at the office, and it\ \ feels like I'm constantly under a pile of never-ending tasks. It's just so hard\ \ to keep up, and it's impacting my overall well-being.\\\"}\\n {'from': 'gpt',\ \ 'value': \\\"It sounds like you're dealing with a lot of pressure to perform\ \ and succeed. Remember, it's crucial to take care of yourself, both mentally\ \ and physically. A healthy work-life balance is essential. Have you tried any\ \ strategies to cope with the added workload?\\\"}\\n {'from': 'human', 'value':\ \ \\\"I've been trying to prioritize my tasks and delegate whenever possible.\ \ I've also started practicing meditation during my breaks to help manage stress.\ \ But sometimes, it feels like no matter what I do, I can't catch a break. It's\ \ been a constant struggle.\\\"}\\n {'from': 'gpt', 'value': \\\"It's great to\ \ hear that you're already implementing some helpful strategies. Remember, progress\ \ takes time, and it's okay to have setbacks. In addition to what you're already\ \ doing, I encourage you to also communicate with your supervisor or team about\ \ your workload and discuss possible solutions together.\\\"}\\n {'from': 'human',\ \ 'value': \\\"You're right. I haven't really opened up about my struggles to\ \ my coworkers or supervisor. I guess I've been afraid of appearing weak or incapable.\ \ How can I approach this discussion without feeling vulnerable?\\\"}\\n {'from':\ \ 'gpt', 'value': \\\"It's completely normal to feel that way, but remember, asking\ \ for support is a strength, not a weakness. 
Start by scheduling a conversation\ \ with your supervisor or a trusted colleague in a private and comfortable setting.\ \ Be honest about your challenges and express your willingness to find solutions\ \ together. Remember, you're not alone in this.\\\"}\\n {'from': 'human', 'value':\ \ \\\"Thank you for your understanding and guidance. I appreciate the reminder\ \ that I don't have to face this alone. I'll gather my courage and initiate that\ \ conversation soon. I need to prioritize my well-being and find a healthier balance.\\\ \"}\\n {'from': 'gpt', 'value': \\\"You're very welcome! I'm here to support you\ \ every step of the way. Taking care of yourself should always be a priority.\ \ Remember to be kind to yourself and celebrate your progress, no matter how small\ \ it may seem. You've got this!\\\"}]\",\n \"id\": \"identity_0\"\n },\n\ \ \"truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n \"row\": {\n\ \ \"conversations\": \"[{'from': 'human', 'value': \\\"Hi, I'm feeling really\ \ scared about something and I need some advice. It's about friendship.\\\"}\\\ n {'from': 'gpt', 'value': \\\"Hello there! I'm here to listen and offer support.\ \ Tell me more about what's going on with your friendship. What specifically is\ \ making you feel scared?\\\"}\\n {'from': 'human', 'value': \\\"Well, lately,\ \ I've been feeling like my friends don't value me as much as I value them. It's\ \ like I'm always the one reaching out and making plans, but they never initiate\ \ anything. It's making me doubt if our friendship is genuine.\\\"}\\n {'from':\ \ 'gpt', 'value': \\\"I understand how disheartening that can be. It's important\ \ to feel valued and appreciated in any relationship. Have you talked to your\ \ friends about how you're feeling?\\\"}\\n {'from': 'human', 'value': \\\"No,\ \ I haven't. I'm afraid that they'll think I'm being needy or that they'll get\ \ defensive. I don't want to ruin our friendship by bringing it up.\\\"}\\n {'from':\ \ 'gpt', 'value': \\\"It's normal to feel hesitant about having difficult conversations,\ \ but addressing your concerns is a crucial step towards building stronger connections.\ \ Before jumping to conclusions, it's important to communicate openly and honestly.\ \ You deserve to express your feelings.\\\"}\\n {'from': 'human', 'value': \\\"\ You're right, I should talk to them. But how do I approach the conversation without\ \ sounding accusatory or angry?\\\"}\\n {'from': 'gpt', 'value': 'That\\\\'s a\ \ great question! It can be helpful to use \\\"I\\\" statements to express how\ \ you feel, rather than placing blame on others. For example, you can say, \\\"\ I\\\\'ve been feeling a bit overlooked lately, and I wanted to talk about our\ \ friendship to ensure we\\\\'re on the same page.\\\"'}\\n {'from': 'human',\ \ 'value': \\\"I see, so it's important to frame it as a dialogue rather than\ \ confrontational. That makes sense.\\\"}\\n {'from': 'gpt', 'value': 'Absolutely!\ \ Opening up a conversation can lead to better understanding and growth. Remember,\ \ your feelings are valid, and a healthy friendship should provide a safe space\ \ to express them. If your friends truly care about you, they will listen and\ \ try to understand your perspective.'}\\n {'from': 'human', 'value': \\\"It's\ \ reassuring to hear that. I'll try to gather the courage to talk to them. But\ \ what if they don't respond well? 
What if they dismiss my concerns?\\\"}\\n {'from':\ \ 'gpt', 'value': \\\"It's natural to worry about how others might react, but\ \ remember, you cannot control their responses. What you can control is how you\ \ handle the situation. If they dismiss your concerns, it might be an indication\ \ that they are not as invested in the friendship as you would like. In that case,\ \ it's essential to focus on surrounding yourself with people who appreciate and\ \ support you.\\\"}\\n {'from': 'human', 'value': \\\"That's true. I shouldn't\ \ hold onto toxic relationships. It's just hard to let go sometimes, you know?\\\ \"}\\n {'from': 'gpt', 'value': \\\"Letting go can be challenging, especially\ \ when we have invested time and energy into a friendship. It's essential to prioritize\ \ your well-being and surround yourself with those who uplift you. Remember, you\ \ deserve friendships that nourish your self-esteem and bring positivity into\ \ your life.\\\"}\\n {'from': 'human', 'value': 'Thank you for reminding me of\ \ my worth. I appreciate your advice and support.'}\\n {'from': 'gpt', 'value':\ \ \\\"You're very welcome! Building healthy friendships is a journey, and you\ \ have taken the first step by recognizing the need for change. Remember to be\ \ gentle with yourself as you navigate through this process. You're doing great!\\\ \"}\\n {'from': 'human', 'value': 'I will, thank you. I feel much better after\ \ talking to you.'}\\n {'from': 'gpt', 'value': \\\"I'm glad to hear that. Remember,\ \ I'm here for you whenever you need someone to talk to. Don't hesitate to reach\ \ out. You've got this!\\\"}]\",\n \"id\": \"identity_1\"\n },\n \"\ truncated_cells\": []\n }\n]" - "HUB_DATASET_PREVIEW: DATASET_NAME: \"emozilla/dolma-v1_7-arxiv\"\nFEATURES: {'text':\ \ {'dtype': 'string', '_type': 'Value'}, 'id': {'dtype': 'string', '_type': 'Value'},\ \ 'metadata': {'file_path': {'dtype': 'string', '_type': 'Value'}}}\nDATA SAMPLE:\n\ [\n {\n \"row_idx\": 0,\n \"row\": {\n \"text\": \"\\\\section{Introduction}\\\ nLet $G$ be a simple undirected graph with the \\\\textit{vertex set} $V(G)$ and\ \ the \\\\textit{edge set} $E(G)$. A vertex with degree one is called a \\\\textit{pendant\ \ vertex}. The distance between the vertices $u$ and $v$ in graph $G$ is denoted\ \ by $d_G(u,v)$. A cycle $C$ is called \\\\textit{chordless} if $C$ has no \\\\\ textit{cycle chord} (that is an edge not in the edge set of $C$ whose endpoints\ \ lie on the vertices of $C$).\\nThe \\\\textit{Induced subgraph} on vertex set\ \ $S$ is denoted by $\\\\langle S\\\\rangle$. A path that starts in $v$ and ends\ \ in $u$ is denoted by $\\\\stackrel\\\\frown{v u}$.\\nA \\\\textit{traceable}\ \ graph is a graph that possesses a Hamiltonian path.\\nIn a graph $G$, we say\ \ that a cycle $C$ is \\\\textit{formed by the path} $Q$ if $ | E(C) \\\\setminus\ \ E(Q) | = 1 $. So every vertex of $C$ belongs to $V(Q)$.\\n\\nIn 2011 the following\ \ conjecture was proposed:\\n\\\\begin{conjecture}(Hoffmann-Ostenhof \\\\cite{hoffman})\\\ nLet $G$ be a connected cubic graph. Then $G$ has a decomposition into a spanning\ \ tree, a matching and a family of cycles.\\n\\n\\\\end{conjecture}\\nConjecture\ \ \\\\theconjecture$\\\\,$ also appears in Problem 516 \\\\cite{cameron}. There\ \ are a few partial results known for Conjecture \\\\theconjecture. 
Kostochka\ \ \\\\cite{kostocha} noticed that the Petersen graph, the prisms over cycles,\ \ and many other graphs have a decomposition desired in Conjecture \\\\theconjecture.\ \ Ozeki and Ye \\\\cite{ozeki} proved that the conjecture holds for 3-connected\ \ cubic plane graphs. Furthermore, it was proved by Bachstein \\\\cite{bachstein}\ \ that Conjecture \\\\theconjecture$\\\\,$ is true for every 3-connected cubic\ \ graph embedded in torus or Klein-bottle. Akbari, Jensen and Siggers \\\\cite[Theorem\ \ 9]{akbari} showed that Conjecture \\\\theconjecture$\\\\,$ is true for Hamiltonian\ \ cubic graphs.\\n\\nIn this paper, we show that Conjecture \\\\theconjecture$\\\ \\,$ holds for traceable cubic graphs.\\n\\\\section{Results}\\nBefore proving\ \ the main result, we need the following lemma.\\n\\\\begin{lemma}\\n\\\\label{lemma:1}\\\ nLet $G$ be a cubic graph. Suppose that $V(G)$ can be partitioned into a tree\ \ $T$ and finitely many cycles such that there is no edge between any pair of\ \ cycles (not necessarily distinct cycles), and every pendant vertex of $T$ is\ \ adjacent to at least one vertex of a cycle. Then, Conjecture \\\\theconjecture$\\\ \\,$ holds for $G$.\\n\\\\end{lemma}\\n\\\\begin{proof}\\nBy assumption, every\ \ vertex of each cycle in the partition is adjacent to exactly one vertex of $T$.\ \ Call the set of all edges with one endpoint in a cycle and another endpoint\ \ in $T$ by $Q$.\\nClearly, the induced subgraph on $E(T) \\\\cup Q$ is a spanning\ \ tree of $G$. We call it $T'$. Note that every edge between a pendant vertex\ \ of $T$ and the union of cycles in the partition is also contained in $T'$. Thus,\ \ every pendant vertex of $T'$ is contained in a cycle of the partition. Now,\ \ consider the graph $H = G \\\\setminus E(T')$. For every $v \\\\in V(T)$, $d_H(v)\ \ \\\\leq 1$. So Conjecture \\\\theconjecture$\\\\,$ holds for $G$. \\\\vspace{1em}\\\ n\\\\end{proof}\\n\\n\\n\\\\noindent\\\\textbf{Remark 1.}\\n\\\\label{remark:1}\\\ nLet $C$ be a cycle formed by the path $Q$. Then clearly there exists a chordless\ \ cycle formed by $Q$.\\n\\nNow, we are in a position to prove the main result.\\\ n\\n\\\\begin{theorem}\\nConjecture \\\\theconjecture$\\\\,$ holds for traceable\ \ cubic graphs.\\n\\\\end{theorem}\\n\\\\begin{proof}\\nLet $G$ be a traceable\ \ cubic graph and $P : v_1, \\\\dots, v_n$ be a Hamiltonian path in $G$. By \\\ \\cite[Theorem 9]{akbari}, Conjecture A holds for $v_1 v_n \\\\in E(G)$. Thus\ \ we can assume that $v_1 v_n \\\\notin E(G)$. Let $v_1 v_j, v_1 v_{j'}, v_i\ \ v_n, v_{i'} v_n \\\\in E(G)\\\\setminus E(P)$ and $j' < j < n$, $1 < i < i'$.\ \ Two cases can occur:\\n\\\\begin{enumerate}[leftmargin=0pt,label=]\\n\\\\item\\\ n\\\\textbf{Case 1.}\\nAssume that $i < j$. Consider the following graph in Figure\ \ \\\\ref{fig:overlapping} in which the thick edges denote the path $P$. Call\ \ the three paths between $v_j$ and $v_i$, from the left to the right, by $P_1$,\ \ $P_2$ and $P_3$, respectively (note that $P_1$ contains the edge $e'$ and $P_3$\ \ contains the edge $e$).\\n\\n\\\\begin{figure}[H]\\n \\\\begin{center}\\n \ \ \\\\includegraphics[width=40mm]{engImages/overlapping.pdf}\\n \\\\caption{Paths\ \ $P_1$, $P_2$ and $P_3$}\\n \\\\label{fig:overlapping}\\n \\\\end{center}\\\ n\\\\end{figure}\\n\\n\\nIf $P_2$ has order $2$, then $G$ is Hamiltonian and so\ \ by \\\\cite[Theorem 9]{akbari} Conjecture \\\\theconjecture$\\\\,$ holds. Thus\ \ we can assume that $P_1$, $P_2$ and $P_3$ have order at least $3$. 
Now, consider\ \ the following subcases:\\\\\\\\\\n\\n\\\\begin{enumerate}[leftmargin=0pt,label=]\\\ n\\\\label{case:1}\\n\\\\item \\\\textbf{Subcase 1.} There is no edge between\ \ $V(P_r)$ and $V(P_s)$ for $1 \\\\leq r < s \\\\leq 3$. Since every vertex of\ \ $P_i$ has degree 3 for every $i$, by \\\\hyperref[remark:1]{Remark 1}$\\\\,$\ \ there are two chordless cycles $C_1$ and $C_2$ formed by $P_1$ and $P_2$, respectively.\\\ nDefine a tree $T$ with the edge set\\n$$ E\\\\Big(\\\\langle V(G) \\\\setminus\ \ \\\\big(V(C_1) \\\\cup V(C_2)\\\\big) \\\\rangle\\\\Big) \\\\bigcap \\\\big(\\\ \\bigcup_{i=1}^3 E(P_i)\\\\big).$$\\nNow, apply \\\\hyperref[lemma:1]{Lemma 1}\ \ $\\\\,$for the partition $\\\\{T, C_1, C_2\\\\}$.\\\\\\\\\\n\\n\\\\item \\\\\ textbf{Subcase 2.}\\n\\\\label{case:edge}\\nThere exists at least one edge between\ \ some $P_r$ and $P_s$, $r<s$. With no loss of generality, assume that $r=1$ and\ \ $s=2$. Suppose that $ab \\\\in E(G)$, where $a \\\\in V(P_1)$, $b \\\\in V(P_2)$\ \ and $d_{P_1}(v_j, a) + d_{P_2}(v_j, b)$ is minimum.\\n\\n\\\\begin{figure}[H]\\\ n \\\\begin{center}\\n \\\\includegraphics[width=40mm]{engImages/ab.pdf}\\\ n \\\\caption{The edge $ab$ between $P_1$ and $P_2$}\\n \\\\label{fig:ab}\\\ n \\\\end{center}\\n\\\\end{figure}\\n\\nThree cases occur: \\\\\\\\\\n\\n(a)\ \ There is no chordless cycle formed by either of the paths $\\\\stackrel\\\\\ frown{v_j a}$ or $\\\\stackrel\\\\frown{v_j b}$. Let $C$ be the chordless cycle\ \ $\\\\stackrel\\\\frown{v_j a}\\\\stackrel\\\\frown{ b v_j}$. Define $T$ with\ \ the edge set\\n$$ E\\\\Big(\\\\langle V(G) \\\\setminus V(C)\\\\rangle\\\\Big)\ \ \\\\bigcap \\\\big(\\\\bigcup_{i=1}^3 E(P_i)\\\\big).$$\\nNow, apply \\\\hyperref[lemma:1]{Lemma\ \ 1} $\\\\,$for the partition $\\\\{T,C\\\\}$.\\t\\\\\\\\\\n\\n(b) There are two\ \ chordless cycles, say $C_1$ and $C_2$, respectively formed by the paths $\\\\\ stackrel\\\\frown{v_j a}$ and $\\\\stackrel\\\\frown{v_j b}$. Now, consider the\ \ partition $C_1$, $C_2$ and the tree induced on the following edges,\\n$$E\\\\\ Big(\\\\langle V(G) \\\\setminus \\\\big(V(C_1) \\\\cup V(C_2)\\\\big) \\\\rangle\\\ \\Big) \\\\; \\\\bigcap \\\\; E\\\\Big(\\\\bigcup_{i=1}^3 P_i\\\\Big),$$\\nand\ \ apply \\\\hyperref[lemma:1]{Lemma 1}.\\\\\\\\\\n\\n(c) With no loss of generality,\ \ there exists a chordless cycle formed by the path $\\\\stackrel\\\\frown{v_j\ \ a}$ and there is no chordless cycle formed by the path $\\\\stackrel\\\\frown{v_j\ \ b}$.\\nFirst, suppose that for every chordless cycle $C_t$ on $\\\\stackrel\\\ \\frown{v_j a}$, at least one of the vertices of $C_t$ is adjacent to a vertex\ \ in $V(G) \\\\setminus V(P_1)$.\\nWe call one of the edges with one end in $C_t$\ \ and other endpoint in $V(G) \\\\setminus V(P_1)$ by $e_t$. Let $v_j=w_0, w_1,\ \ \\\\dots, w_l=a$ be all vertices of the path $\\\\stackrel\\\\frown{v_j a}$\ \ in $P_1$. Choose the shortest path $w_0 w_{i_1} w_{i_2} \\\\dots w_l$ such that\ \ $0 < i_1 < i_2 < \\\\dots < l$.\\nDefine a tree $T$ whose edge set is the thin\ \ edges in Figure \\\\ref{fig:deltaCycle}.\\\\\\\\\\nCall the cycle $w_0 w_{i_1}\ \ \\\\dots w_l \\\\stackrel\\\\frown{b w_0}$ by $C'$. Now, by removing $C'$, $q$\ \ vertex disjoint paths $Q_1, \\\\dots, Q_q$ which are contained in $\\\\stackrel\\\ \\frown{v_j a}$ remain. Note that there exists a path of order $2$ in $C'$ which\ \ by adding this path to $Q_i$ we find a cycle $C_{t_i}$, for some $i$. 
Hence\ \ there exists an edge $e_{t_i}$ connecting $Q_i$ to $V(G) \\\\setminus V(P_1)$.\ \ Now, we define a tree $T$ whose the edge set is,\\n$$\\\\quad\\\\quad\\\\quad\ \ \\\\bigg( E\\\\Big(\\\\langle V(G) \\\\setminus V(C') \\\\rangle \\\\Big)\\\\\ ; \\\\bigcap \\\\; \\\\Big(\\\\bigcup_{i=1}^3 E(P_i)\\\\Big) \\\\bigg) \\\\bigcup\ \ \\\\Big(\\\\big\\\\{e_{t_i} \\\\mid 1 \\\\leq i \\\\leq q \\\\big\\\\} \\\\\ Big).$$\\nApply \\\\hyperref[lemma:1]{Lemma 1} $\\\\,$for the partition $\\\\\ {T,C'\\\\}$.\\\\\\\\\\n\\n\\\\begin{figure}[H]\\n \\\\begin{center}\\n \\\\\ includegraphics[width=40mm]{engImages/deltaCycle.pdf}\\n \\\\caption{The cycle\ \ $C'$ and the tree $T$}\\n \\\\label{fig:deltaCycle}\\n \\\\end{center}\\\ n\\\\end{figure}\\n\\nNext, assume that there exists a cycle $C_1$ formed by $\\\ \\stackrel\\\\frown{v_j a}$ such that none of the vertices of $C_1$ is adjacent\ \ to $V(G) \\\\setminus V(P_1)$. Choose the smallest cycle with this property.\ \ Obviously, this cycle is chordless. Now, three cases can be considered:\\\\\\\ \\\\n\\n\\\\begin{enumerate}[leftmargin=5pt,label=(\\\\roman*)]\\n\\\\item There\ \ exists a cycle $C_2$ formed by $P_2$ or $P_3$. Define the partition $C_1$, $C_2$\ \ and a tree with the following edge set,\\n$$E\\\\Big(\\\\langle V(G) \\\\setminus\ \ \\\\big(V(C_1) \\\\cup V(C_2)\\\\big)\\\\rangle \\\\Big) \\\\bigcap \\\\Big(\ \ \\\\bigcup_{i=1}^3 E(P_i) \\\\Big),$$\\nand apply \\\\hyperref[lemma:1]{Lemma\ \ 1}.\\\\\\\\\\n\\n\\\\item There is no chordless cycle formed by $P_2$ and by\ \ $P_3$, and there is at least one edge between $V(P_2)$ and $V(P_3)$. Let $ab\ \ \\\\in E(G)$, $a \\\\in V(P_2)$ and $b \\\\in V(P_3)$ and moreover $d_{P_2}(v_j,\ \ a) + d_{P_3}(v_j,b)$ is minimum. Notice that the cycle $\\\\stackrel\\\\frown{v_j\ \ a} \\\\stackrel\\\\frown{b v_j}$ is chordless. Let us call this cycle by $C_2$.\ \ Now, define the partition $C_2$ and a tree with the following edge set,\\n$$E\\\ \\Big(\\\\langle V(G) \\\\setminus V(C_2)\\\\rangle \\\\Big) \\\\bigcap \\\\Big(\ \ \\\\bigcup_{i=1}^3 E(P_i) \\\\Big),$$\\nand apply \\\\hyperref[lemma:1]{Lemma\ \ 1}.\\\\\\\\\\n\\n\\\\item There is no chordless cycle formed by $P_2$ and by\ \ $P_3$, and there is no edge between $V(P_2)$ and $V(P_3)$. Let $C_2$ be the\ \ cycle consisting of two paths $P_2$ and $P_3$. Define the partition $C_2$ and\ \ a tree with the following edge set,\\n$$E\\\\Big(\\\\langle V(G) \\\\setminus\ \ V(C_2)\\\\rangle \\\\Big) \\\\bigcap \\\\Big( \\\\bigcup_{i=1}^3 E(P_i) \\\\\ Big),$$\\nand apply \\\\hyperref[lemma:1]{Lemma 1}.\\n\\n\\\\end{enumerate}\\\ n\\n\\n\\\\end{enumerate}\\n\\n\\\\vspace{5mm}\\n\\\\item\\n\\\\textbf{Case 2.}\\\ n\\\\label{case:2}\\nAssume that $j < i$ for all Hamiltonian paths. Among all\ \ Hamiltonian paths consider the path such that $i'-j'$ is maximum. Now, three\ \ cases can be considered:\\\\\\\\\\n\\n\\\\begin{enumerate}[leftmargin=0pt,label=]\\\ n\\\\item \\\\textbf{Subcase 1.} There is no $s < j'$ and $t > i'$ such that $v_s\ \ v_t \\\\in E(G)$. By \\\\hyperref[remark:1]{Remark 1} $\\\\,$ there are two\ \ chordless cycles $C_1$ and $C_2$, respectively formed by the paths $v_1 v_{j'}$\ \ and $v_{i'} v_n$. 
By assumption there is no edge $xy$, where $x \\\\in V(C_1)$\ \ and $y \\\\in V(C_2)$.\\nDefine a tree $T$ with the edge set:\\n$$ E\\\\Big(\\\ \\langle V(G) \\\\setminus \\\\big(V(C_1) \\\\cup V(C_2)\\\\big) \\\\rangle \\\ \\Big) \\\\bigcap \\\\Big( E(P) \\\\cup \\\\{v_{i'}v_n, v_{j'}v_1\\\\} \\\\Big).$$\\\ nNow, apply \\\\hyperref[lemma:1]{Lemma 1} $\\\\,$for the partition $\\\\{T, C_1,\ \ C_2\\\\}$.\\\\\\\\\\n\\n\\\\item \\\\textbf{Subcase 2.}\\n\\\\label{subcase:22}\ \ There are at least four indices $s, s' < j$ and $t, t' > i$ such that $v_s v_t,\ \ v_{s'} v_{t'} \\\\in E(G)$. Choose four indices $g, h < j$ and $e, f > i$ such\ \ that $v_h v_e, v_g v_f \\\\in E(G)$ and $|g-h| + |e-f|$ is minimum.\\n\\n\\\\\ begin{figure}[H]\\n \\\\begin{center}\\n \\\\includegraphics[width=90mm]{engImages/case2-subcase2.pdf}\\\ n \\\\caption{Two edges $v_h v_e$ and $v_g v_f$}\\n \\\\label{fig:non-overlapping}\\\ n \\\\end{center}\\n\\\\end{figure}\\n\\nThree cases can be considered:\\\\\\\ \\\\n\\n\\\\begin{enumerate}[leftmargin=0pt,label=(\\\\alph*)]\\n\\\\item There\ \ is no chordless cycle formed by $\\\\stackrel\\\\frown{v_g v_h}$ and by $\\\\\ stackrel\\\\frown{v_e v_f}$.\\n\\nConsider the cycle $\\\\stackrel\\\\frown{v_g\ \ v_h} \\\\stackrel\\\\frown{v_e v_f}v_g$ and call it $C$. Now, define a tree\ \ $T$ with the edge set,\\n$$\\\\,\\\\,\\\\,E\\\\Big(\\\\langle V(G) \\\\setminus\ \ V(C)\\\\rangle \\\\Big) \\\\bigcap \\\\Big( E(P) \\\\cup \\\\{v_1v_{j}, v_{i}v_n\\\ \\} \\\\Big),$$\\napply \\\\hyperref[lemma:1]{Lemma 1} $\\\\,$for the partition\ \ $\\\\{T, C\\\\}$.\\\\\\\\\\n\\n\\\\item With no loss of generality, there exists\ \ a chordless cycle formed by $\\\\stackrel\\\\frown{v_e v_f}$ and there is no\ \ chordless cycle formed by the path $\\\\stackrel\\\\frown{v_g v_h}$. First suppose\ \ that there is a chordless cycle $C_1$ formed by $\\\\stackrel\\\\frown{v_e v_f}$\ \ such that there is no edge between $V(C_1)$ and $\\\\{v_1, \\\\dots, v_j\\\\\ }$. By \\\\hyperref[remark:1]{Remark 1} $,$ there exists a chordless cycle $C_2$\ \ formed by $\\\\stackrel\\\\frown{v_1 v_j}$. By assumption there is no edge between\ \ $V(C_1)$ and $V(C_2)$. Now, define a tree $T$ with the edge set,\\n\\n$$\\\\\ quad\\\\quad\\\\quad\\\\quad E\\\\Big(\\\\langle V(G) \\\\setminus \\\\big(V(C_1)\ \ \\\\cup V(C_2)\\\\big)\\\\rangle \\\\Big) \\\\bigcap \\\\Big( E(P) \\\\cup \\\ \\{v_1v_{j}, v_{i}v_n\\\\} \\\\Big),$$\\n\\nand apply \\\\hyperref[lemma:1]{Lemma\ \ 1} $\\\\,$for the partition $\\\\{T, C_1, C_2\\\\}$.\\n\\n$\\\\;$ Next assume\ \ that for every cycle $C_r$ formed by $\\\\stackrel\\\\frown{v_e v_f}$, there\ \ are two vertices $x_r \\\\in V(C_r)$ and $y_r \\\\in \\\\{v_1, \\\\dots, v_j\\\ \\}$ such that $x_r y_r \\\\in E(G)$. Let $v_e=w_0, w_1, \\\\dots, w_l=v_f$ be\ \ all vertices of the path $\\\\stackrel\\\\frown{v_e v_f}$ in $P$. Choose the\ \ shortest path $w_0 w_{i_1} w_{i_2} \\\\dots w_l$ such that $0 < i_1 < i_2 <\ \ \\\\dots < l$. Consider the cycle $w_0 w_{i_1} \\\\dots w_l \\\\stackrel\\\\\ frown{v_g v_h}$ and call it $C$. Now, by removing $C$, $q$ vertex disjoint paths\ \ $Q_1, \\\\dots, Q_q$ which are contained in $\\\\stackrel\\\\frown{v_e v_f}$\ \ remain. Note that there exists a path of order $2$ in $C$ which by adding this\ \ path to $Q_i$ we find a cycle $C_{r_i}$, for some $i$. Hence there exists an\ \ edge $x_{r_i} y_{r_i}$ connecting $Q_i$ to $V(G) \\\\setminus V(\\\\stackrel\\\ \\frown{v_e v_f})$. 
We define a tree $T$ whose edge set is the edges,\\n$$\\\\\ quad\\\\quad\\\\quad\\\\quad\\\\quad\\\\quad E\\\\Big(\\\\langle V(G) \\\\setminus\ \ V(C)\\\\rangle \\\\Big) \\\\bigcap \\\\Big( E(P) \\\\cup \\\\{v_1v_{j}, v_{i}v_n\\\ \\} \\\\cup \\\\big\\\\{x_{r_i} y_{r_i} \\\\mid 1 \\\\leq i \\\\leq q\\\\big\\\ \\} \\\\Big),$$\\nthen apply \\\\hyperref[lemma:1]{Lemma 1} $\\\\,$ on the partition\ \ $\\\\{T, C\\\\}$.\\\\\\\\\\n\\\\begin{figure}[H]\\n \\\\begin{center}\\n \ \ \\\\includegraphics[width=90mm]{engImages/deltaNonOverlapping.pdf}\\n \\\ \\caption{The tree $T$ and the shortest path $w_0 w_{i_1}\\\\dots w_l$}\\n \ \ \\\\label{fig:delta-non-overlapping}\\n \\\\end{center}\\n\\\\end{figure}\\\ n\\n\\\\item There are at least two chordless cycles, say $C_1$ and $C_2$ formed\ \ by the paths $\\\\stackrel\\\\frown{v_g v_h}$ and $\\\\stackrel\\\\frown{v_e\ \ v_f}$, respectively. Since $|g-h| + |e-f|$ is minimum, there is no edge $xy\ \ \\\\in E(G)$ with $x \\\\in V(C_1)$ and $y \\\\in V(C_2)$. Now, define a tree\ \ $T$ with the edge set,\\n$$\\\\quad\\\\quad\\\\quad\\\\quad E\\\\Big( \\\\langle\ \ V(G) \\\\setminus \\\\big(V(C_1) \\\\cup V(C_2)\\\\big) \\\\rangle \\\\Big)\ \ \\\\bigcap \\\\Big( E(P) \\\\cup \\\\{v_1 v_{j}, v_{i}v_n\\\\} \\\\Big),$$\\\ nand apply \\\\hyperref[lemma:1]{Lemma 1} $\\\\,$for the partition $\\\\{T, C_1,\ \ C_2\\\\}$.\\\\\\\\\\n\\\\end{enumerate}\\n\\n\\\\item \\\\textbf{Subcase 3.}\ \ There exist exactly two indices $s,t$, $s < j' < i' < t$ such that $v_s v_t\ \ \\\\in E(G)$ and there are no two other indices $s', t'$ such that $s' < j <\ \ i < t'$ and $v_{s'} v_{t'} \\\\in E(G)$. We can assume that there is no cycle\ \ formed by $\\\\stackrel\\\\frown{v_{s+1} v_j}$ or $\\\\stackrel\\\\frown{v_i\ \ v_{t-1}}$, to see this by symmetry consider a cycle $C$ formed by $\\\\stackrel\\\ \\frown{v_{s+1} v_j}$. By \\\\hyperref[remark:1]{Remark 1} $\\\\,$ there exist\ \ chordless cycles $C_1$ formed by $\\\\stackrel\\\\frown{v_{s+1} v_j}$ and $C_2$\ \ formed by $\\\\stackrel\\\\frown{v_{i} v_n}$. By assumption $v_s v_t$ is the\ \ only edge such that $s < j$ and $t > i \\\\;$. Therefore, there is no edge\ \ between $V(C_1)$ and $V(C_2)$. Now, let $T$ be a tree defined by the edge set,\\\ n$$ E\\\\Big(\\\\langle V(G) \\\\setminus \\\\big(V(C_1) \\\\cup V(C_2)\\\\big)\\\ \\rangle \\\\Big) \\\\bigcap \\\\Big( E(P) \\\\cup \\\\{v_1v_{j}, v_{i}v_n\\\\\ } \\\\Big),$$\\nand apply \\\\hyperref[lemma:1]{Lemma 1} $\\\\,$for the partition\ \ \\\\{$T$, $C_1$, $C_2$\\\\}.\\\\\\\\\\n\\n$\\\\quad$Furthermore, we can also\ \ assume that either $s \\\\neq j'-1$ or $t \\\\neq i'+1$, otherwise we have\ \ the Hamiltonian cycle $\\\\stackrel\\\\frown{v_1 v_s} \\\\stackrel\\\\frown{v_t\ \ v_n} \\\\stackrel\\\\frown{v_{i'} v_{j'}} v_1$ and by \\\\cite[Theorem 9]{akbari}\ \ Conjecture \\\\theconjecture$\\\\,$ holds.\\n\\n$\\\\quad$By symmetry, suppose\ \ that $s \\\\neq j'-1$. Let $v_k$ be the vertex adjacent to $v_{j'-1}$, and $k\ \ \\\\notin \\\\{j'-2, j'\\\\}$. It can be shown that $k > j'-1$, since otherwise\ \ by considering the Hamiltonian path $P': \\\\; \\\\stackrel\\\\frown{ v_{k+1}\ \ v_{j'-1}}\\\\stackrel\\\\frown{v_k v_1} \\\\stackrel\\\\frown{v_{j'} v_n}$,\ \ the new $i'-j'$ is greater than the old one and this contradicts our assumption\ \ about $P$ in the \\\\hyperref[case:2]{Case 2}.\\n\\n$\\\\quad$We know that $j'\ \ < k < i$. Moreover, the fact that $\\\\stackrel\\\\frown{v_{s+1} v_j}$ does\ \ not form a cycle contradicts the case that $j' < k \\\\le j$. 
So $j < k < i$.\ \ Consider two cycles $C_1$ and $C_2$, respectively with the vertices $v_1 \\\\\ stackrel\\\\frown{v_{j'} v_{j}} v_1$ and $v_n \\\\stackrel\\\\frown{v_{i'} v_{i}}\ \ v_n$. The cycles $C_1$ and $C_2$ are chordless, otherwise there exist cycles\ \ formed by the paths $\\\\stackrel\\\\frown{v_{s+1} v_j}$ or $\\\\stackrel\\\\\ frown{v_i v_{t-1}}$. Now, define a tree $T$ with the edge set\\n$$ E\\\\Big(\\\ \\langle V(G) \\\\setminus \\\\big(V(C_1) \\\\cup V(C_2)\\\\big)\\\\rangle \\\\\ Big) \\\\bigcap \\\\Big( E(P) \\\\cup \\\\{v_s v_t, v_k v_{j'-1}\\\\} \\\\Big),$$\\\ nand apply \\\\hyperref[lemma:1]{Lemma 1} $\\\\,$for the partition \\\\{$T$, $C_1$,\ \ $C_2$\\\\}.\\n\\\\end{enumerate}\\n\\\\end{enumerate}\\n\\\\end{proof}\\n\\\ n\\\\noindent\\\\textbf{Remark 2.}\\n\\\\label{remark:2}\\nIndeed, in the proof\ \ of the previous theorem we showed a stronger result, that is, for every traceable\ \ cubic graph there is a decomposition with at most two cycles.\\n\\n\",\n \ \ \"id\": \"b7c40b41b7eedaa408f87d154284a1aba126589c\",\n \"metadata\"\ : {\n \"file_path\": \"/home/ubuntu/dolma-v1_7/arxiv-0000.json.gz\"\n \ \ }\n },\n \"truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n\ \ \"row\": {\n \"text\": \"\\\\section{Principle of nano strain-amplifier}\\\ r\\n\\r\\n\\\\begin{figure*}[t!]\\r\\n\\t\\\\centering\\r\\n\\t\\\\includegraphics[width=5.4in]{Fig1}\\\ r\\n\\t\\t\\\\vspace{-0.5em}\\r\\n\\t\\\\caption{Schematic sketches of nanowire\ \ strain sensors. (a)(b) Conventional non-released and released NW structure;\ \ \\r\\n\\t\\t(c)(d) The proposed nano strain-amplifier and its simplified physical\ \ model.}\\r\\n\\t\\\\label{fig:fig1}\\r\\n\\t\\t\\\\vspace{-1em}\\r\\n\\\\end{figure*}\\\ r\\nFigure \\\\ref{fig:fig1}(a) and 1(b) show the concept of the conventional\ \ structures of piezoresistive sensors. The piezoresistive elements are either\ \ released from, or kept on, the substrate. The sensitivity ($S$) of the sensors\ \ is defined based on the ratio of the relative resistance change ($\\\\Delta\ \ R/R$) of the sensing element and the strain applied to the substrate ($\\\\\ varepsilon_{sub}$):\\r\\n\\\\begin{equation}\\r\\nS = (\\\\Delta R/R)/\\\\varepsilon_{sub}\\\ r\\n\\\\label{eq:sensitivity}\\r\\n\\\\end{equation}\\r\\nIn addition, the relative\ \ resistance change $\\\\Delta R/R$ can be calculated from the gauge factor ($GF$)\ \ of the material used to make the piezoresistive elements: $\\\\Delta R/R = GF\ \ \\\\varepsilon_{ind}$, where $\\\\varepsilon_{ind}$ is the strain induced into\ \ the piezoresistor. In most of the conventional strain gauges as shown in Fig.\ \ \\\\ref{fig:fig1} (a,b), the thickness of the sensing layer is typically below\ \ a few hundred nanometers, which is much smaller than that of the substrate.\ \ Therefore, the strain induced into the piezoresistive elements is approximately\ \ the same as that of the substrate ($\\\\varepsilon_{ind} \\\\approx \\\\varepsilon_{sub}$).\ \ Consequently, to improve the sensitivity of strain sensors (e.g. enlarging $\\\ \\Delta R/R$), electrical approaches which can enlarge the gauge factor ($GF$)\ \ are required. Nevertheless, as aforementioned, the existence of the large gauge\ \ factor in nanowires due to quantum confinement or surface state, is still considered\ \ as controversial. \\n\\r\\nIt is also evident from Eq. 
\\\\ref{eq:sensitivity}\ \ that the sensitivity of strain sensors can also be improved using a mechanical\ \ approach, which enlarges the strain induced into the piezoresistive element.\ \ Figure \\\\ref{fig:fig1}(c) shows our proposed nano strain-amplifier structure,\ \ in which the piezoresistive nanowires are locally fabricated at the centre of\ \ a released bridge. The key idea of this structure is that, under a certain strain\ \ applied to the substrate, a large strain will be concentrated at the locally\ \ fabricated SiC nanowires. The working principle of the nano strain-amplifier\ \ is similar to that of the well-known dogbone structure, which is widely used\ \ to characterize the tensile strength of materials \\\\cite{dogbone1,dogbone2}.\ \ That is, when a stress is applied to the dogbone-shape of a certain material,\ \ a crack, if generated, will occur at the middle part of the dogbone. The large\ \ strain concentrated at the narrow area located at the centre part with respect\ \ to the wider areas located at outer region, causes the crack. Qualitative and\ \ quantitative explanations of the nano strain-amplifier are presented as follows.\ \ \\r\\n\\r\\nFor the sake of simplicity, the released micro frame and nanowire\ \ (single wire or array) of the nano strain-amplifier can be considered as solid\ \ springs, Fig. \\\\ref{fig:fig1}(d). The stiffness of these springs are proportional\ \ to their width ($w$) and inversely proportional to their length (l): $K \\\\\ propto w/l$. Consequently, the model of the released nanowire and micro frames\ \ can be simplified as a series of springs, where the springs with higher stiffness\ \ correspond to the micro frame, and the single spring with lower stiffness corresponds\ \ to the nanowire. It is well-known in classical physics that, for serially connected\ \ springs, a larger strain will be concentrated in the low--stiffness string,\ \ while a smaller strain will be induced in the high--stiffness string \\\\cite{Springbook}.\ \ The following analysis quantitatively explained the amplification of the strain.\\\ t\\r\\n\\r\\n\\\\begin{figure}[b!]\\r\\n\\t\\\\centering\\r\\n\\t\\\\includegraphics[width=3in]{Fig2}\\\ r\\n\\t\\\\vspace{-1em}\\r\\n\\t\\\\caption{Finite element analysis of the strain\ \ induced in to the nanowire array utilizing nano strain-amplifier.}\\r\\n\\t\\\ \\label{fig:fig2}\\r\\n\\\\end{figure}\\r\\nWhen a tensile mechanical strain ($\\\ \\varepsilon_{sub}$) is applied to the substrate, the released structure will\ \ also be elongated. Since the stiffness of the released frame is much smaller\ \ than that of the substrate, it is safe to assume that the released structure\ \ will follows the elongation of the substrate. The displacement of the released\ \ structure $\\\\Delta L$ is:\\r\\n\\\\begin{equation}\\r\\n\\\\Delta L = \\\\\ Delta L_m + \\\\Delta L_n = L_m \\\\varepsilon_m + L_n \\\\varepsilon_n\\r\\n\\\ \\label{eq:displacement}\\r\\n\\\\end{equation} \\r\\nwhere $L_m$, $L_n$ are the\ \ length; $\\\\Delta L_m$, $\\\\Delta L_n$ are the displacement; and $\\\\varepsilon_m$,\ \ $\\\\varepsilon_n$ are the strains induced into the micro spring and nano spring,\ \ respectively. The subscripts m and n stand for the micro frames and nanowires,\ \ respectively. 
Furthermore, due to the equilibrium of the stressing force ($F$)\ \ along the series of springs, the following relationship is established: $F=\ \ K_m\\\\Delta L_m = K_n \\\\Delta L_n$, where $K_m$, $K_n$ are the stiffness\ \ of the released micro frames and nanowires, respectively. Consequently the relationship\ \ between the displacement of the micro frame (higher stiffness) and nanowires\ \ (lower stiffness) is:\\r\\n\\\\begin{equation}\\r\\n\\\\frac{\\\\Delta L_m}{\\\ \\Delta L_n}=\\\\frac{K_n}{K_m}=\\\\frac{L_mw_n}{L_nw_m}\\r\\n\\\\label{eq:euili}\\\ r\\n\\\\end{equation}\\r\\nSubstituting Eqn. \\\\ref{eq:euili} into Eqn. \\\\\ ref{eq:displacement}, the strain induced into the locally fabricated nanowires\ \ is:\\r\\n\\\\begin{equation}\\r\\n\\\\varepsilon_n = \\\\frac{\\\\Delta L_n}{L_n}\ \ = \\\\frac{1}{1-\\\\frac{w_m-w_n}{w_m}\\\\frac{L_m}{L}}\\\\varepsilon_{sub}\\\ r\\n\\\\label{eq:strainamp}\\r\\n\\\\end{equation} \\r\\n\\r\\nEquation \\\\ref{eq:strainamp}\ \ indicates that increasing the ratio of $w_m/w_n$ and $L_m/L_n$ significantly\ \ amplifies the strain induced into the nanowire from the strain applied to the\ \ substrate. This model is also applicable to the case of nanowire arrays, in\ \ which $w_n$ is the total width of all nanowires in the array.\\n\\r\\nThe theoretical\ \ model is then verified using the finite element analysis (FEA). In the FEA simulation,\ \ we compare the strain induced into (i) non released nanowires, (ii) the conventionally\ \ released nanowires, and (iii) our nano strain-amplifier structure, using COMSOL\ \ Multiphysics \\\\texttrademark. In our nano strain amplifying structure, the\ \ width of the released frame was set to be 8 $\\\\mu$m, while the width of each\ \ nanowire in the array (3 wires) was set to be 370 nm. The nanowires array structure\ \ was selected as it can enhance the electrical conductance of the SiC nanowires\ \ resistor which makes the subsequent experimental demonstration easier. The ratio\ \ between the length of nanowires and micro bridge was set to be 1: 20. With this\ \ geometrical dimensions, strain induced into nanowires array $\\\\varepsilon_n$\ \ was numerically calculated to be approximately 6 times larger than $\\\\varepsilon_{sub}$,\ \ Eqn. \\\\ref{eq:strainamp}. The simulation results show that for all structure,\ \ the elongation of non-released and released nanowires follow that of the substrate.\ \ In addition, strain was almost completely transferred into conventional released\ \ and non-released structures. Furthermore, the ratio of the strain induced in\ \ to the locally fabricated nanowires was estimated to be 5.9 times larger than\ \ that of the substrate, Fig. \\\\ref{fig:fig2}. These results are in solid agreement\ \ with the theoretical analysis presented above. For a nanowire array with an\ \ average width of 470 nm, the amplified gain of strain was found to be 4.5. \ \ \\t\\r\\n\\r\\nBased on the theoretical analysis, we conducted the following\ \ experiments to demonstrate the high sensitivity of SiC nanowire strain sensors\ \ using the nano strain-amplifier. A thin 3C-SiC film with its thickness of 300\ \ nm was epitaxially grown on a 150 mm diameter Si wafer using low pressure chemical\ \ vapour deposition \\\\cite{SiC_growth}. The film was \\\\emph{in situ} doped\ \ using Al dopants. 
The carrier concentration of the p-type 3C-SiC was found to\ \ be $5 \\\\times 10^{18}$ cm$^{-3}$, using a hot probe technique \\\\cite{philip}.\ \ The details of the characteristics of the grown film can be found elsewhere\ \ \\\\cite{Phan_JMC}. Subsequently, I-shape p-type SiC resistors with aluminum\ \ electrodes deposited on the surface were patterned using inductive coupled plasma\ \ (ICP) etching. As the piezoresistance of p-type 3C-SiC depends on crystallographic\ \ orientation, all SiC resistors of the present work were aligned along [110]\ \ direction to maximize the piezoresistive effect. Next, the micro scale SiC resistors\ \ were then released from the Si substrate using dry etching (XeF$_2$). Finally,\ \ SiC nanowire arrays were formed at the centre of the released bridge using focused\ \ ion beam (FIB). Two types of nanowire array were fabricated with three nanowires\ \ for each array. The average width of each nanowire in each type were 380 nm\ \ and 470 nm, respectively. Figure \\\\ref{fig:fig3} shows the SEM images of the\ \ fabricated samples, including the conventional released structure, non-released\ \ nanowires, and the nano strain-amplifier. \\r\\n\\r\\n\\\\begin{figure}[t!]\\\ r\\n\\t\\\\centering\\r\\n\\t\\\\includegraphics[width=3in]{Fig3}\\r\\n\\t\\\\\ caption{SEM image of SiC strain sensors. (a) Released SiC micro bridge used for\ \ the subsequent fabrication of the nano strain-amplifier; (b) SEM of a micro\ \ SiC resistor where the SiC nanowires array were formed using FIB; (c) SEM of\ \ non-released SiC nanowires; (d) SEM of locally fabricated SiC nanowires released\ \ from the Si substrate (nano strain-amplifier).}\\r\\n\\t\\\\label{fig:fig3}\\\ r\\n\\t\\\\vspace{-1em}\\r\\n\\\\end{figure}\\r\\nThe current voltage (I-V) curves\ \ of all fabricated samples were characterized using a HP 4145 \\\\texttrademark\ \ ~parameter analyzer. The linear relationship between the applied voltage and\ \ measured current, indicated that Al made a good Ohmic contact with the highly\ \ doped SiC resistance, Fig. \\\\ref{fig:IV}. Additionally, the electrical conductivity\ \ of both nanowires and micro frame estimated from the I-V curve and the dimensions\ \ of the resistors shows almost the same value. This indicated that the FIB process\ \ did not cause a significant surface damage to the fabricated nanowires. \\\ r\\n\\t\\r\\n\\\\begin{figure}[b!]\\r\\n\\t\\\\centering\\r\\n\\t\\\\includegraphics[width=3in]{Fig4}\\\ r\\n\\t\\t\\\\vspace{-1.5em}\\r\\n\\t\\\\caption{Current voltage curves of the\ \ fabricated SiC resistors.}\\r\\n\\t\\\\label{fig:IV}\\r\\n\\n\\\\end{figure}\\\ r\\n\\r\\nThe bending experiment was used to characterize the piezoresistive effect\ \ in micro size SiC resistors and locally fabricated SiC nanowire array. In this\ \ experiment one end of the Si cantilever (with a thickness of 625 $\\\\mu$m,\ \ and a width of 7 mm) was fixed while the other end was deflected by applying\ \ different forces. The distance from the fabricated nanowires to the free end\ \ of the Si cantilever was approximately 45 mm. The strain induced into the Si\ \ substrate is $\\\\varepsilon_\\\\text{sub} = Mt/2EI$, where $M$ is the applied\ \ bending moment; and $t$, $E$ and $I$ are the thickness, Young's modulus and\ \ the moment of inertia of the Si cantilever, respectively. 
The response of the\ \ SiC resistance to applied strain was then measured using a multimeter (Agilent\ \ \\\\texttrademark 34401 A).\\n\\r\\n\\\\begin{figure}[h!]\\r\\n\\t\\\\centering\\\ r\\n\\t\\\\includegraphics[width=3in]{Fig5.eps}\\r\\n\\t\\t\\\\vspace{-1.5em}\\\ r\\n\\t\\\\caption{Experimental results. (a) A comparision between the relative\ \ resistance change in the nano strain-amplifiers, non released nanowires and\ \ released micro frames; (b) The repeatability of the SiC nanowires strain sensors\ \ utilizing the proposed structure.}\\r\\n\\t\\\\label{fig:DRR}\\r\\n\\t\\t\\\ t\\\\vspace{-1em}\\r\\n\\\\end{figure}\\t\\r\\nThe relative resistance change\ \ ($\\\\Delta R/R$) of the micro and nano SiC resistors was plotted against the\ \ strain induced into the Si substrate $\\\\varepsilon_{sub}$, Fig. \\\\ref{fig:DRR}(a).\ \ For all fabricated samples, the relative resistance change shows a good linear\ \ relationship with the applied strain ($\\\\varepsilon_{sub}$). In addition,\ \ with the same applied strain to the Si substrate, the resistance change of the\ \ SiC nanowires using the nano strain-amplifier was much larger than that of the\ \ the SiC micro resistor and the conventional non-released SiC nanowires. In addition,\ \ reducing the width of the SiC nanowires also resulted in the increase of the\ \ sensitivity. The magnitude of the piezoresistive effect in the nano strain-amplifier\ \ as well as conventional structures were then quantitatively evaluated based\ \ on the effective gauge factor ($GF_{eff}$), which is defined as the ratio of\ \ the relative resistance change to the applied strain to the substrate: $GF_{eff}\ \ = (\\\\Delta R/R)/\\\\varepsilon_{sub}$. Accordingly, the effective gauge factor\ \ of the released micro SiC was found to be 28, while that of the non-released\ \ SiC nanowires was 35. From the data shown in Fig. \\\\ref{fig:DRR}, the effective\ \ gauge factor of the 380 nm and 470 nm SiC nanowires in the nano strain-amplifier\ \ were calculated as 150 and 124, respectively. Thus for nanowire arrays with\ \ average widths of 380 nm and 470 nm, the sensitivity of the nano strain-amplifier\ \ was 5.4 times and 4.6 times larger than the bulk SiC, respectively. These results\ \ were consistent with analytical and numerical models presented above. The relative\ \ resistance change of the nano strain-amplifier also showed excellent linearity\ \ with the applied strain, with a linear regression of above 99\\\\%. \\r\\n\\\ r\\nThe resistance change of the nano strain-amplifier can also be converted into\ \ voltage signals using a Wheatstone bridge, Fig. \\\\ref{fig:DRR}(b). The output\ \ voltage of the nano strain-amplifier increases with increasing tensile strains\ \ from 0 ppm to 180 ppm, and returned to the initial value when the strain was\ \ completely removed, confirming a good repeatability after several strain induced\ \ cycles. The linearity of the relative resistance change, and the repeatability\ \ indicate that the proposed structure is promising for strain sensing applications.\\\ r\\n \\r\\nIn conclusion, this work presents a novel mechanical approach to\ \ obtain highly sensitive piezoresistance in nanowires based on a nano strain-amplifier.\ \ The key factor of the nano strain-amplifier lies on nanowires locally fabricated\ \ on a released micro structure. 
Experimental studies were conducted on SiC nanowires,\ \ confirming that by utilizing our nano strain-amplifier, the sensitivity of SiC\ \ nanowires was 5.4 times larger than that of conventional structures. This result\ \ indicated that the nano strain-amplifier is an excellent platform for ultra\ \ sensitive strain sensing applications. \\r\\n\\r\\n\\r\\n\",\n \"id\"\ : \"1b77ae9f541b19668cc96624c7ec0f83945284e2\",\n \"metadata\": {\n \ \ \"file_path\": \"/home/ubuntu/dolma-v1_7/arxiv-0000.json.gz\"\n }\n \ \ },\n \"truncated_cells\": []\n }\n]" - "HUB_DATASET_PREVIEW: DATASET_NAME: \"KelvinTichana2/mentalhealthcurated\"\nFEATURES:\ \ {'Human': {'dtype': 'string', '_type': 'Value'}, 'Assistant': {'dtype': 'string',\ \ '_type': 'Value'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\":\ \ {\n \"Human\": \"hello, hey, hi, good day, greetings, what's up?, how is\ \ it going\",\n \"Assistant\": \"Hello! How are you today!, Hey! What's up,\ \ Hey, How are you feeling today\"\n },\n \"truncated_cells\": []\n },\n\ \ {\n \"row_idx\": 1,\n \"row\": {\n \"Human\": \"cya, see you later,\ \ goodbye, Have a good day, bye, I am leaving\",\n \"Assistant\": \"Talk\ \ to you later!, Bye!, Goodbye!\"\n },\n \"truncated_cells\": []\n }\n]" - source_sentence: 'USER_QUERY: named entity recognition dataset conll2003' sentences: - "NEGATIVE: DATASET_NAME: \"whoisjones/litset\"\nFEATURES: {'id': {'dtype': 'int64',\ \ '_type': 'Value'}, 'tokens': {'feature': {'dtype': 'string', '_type': 'Value'},\ \ '_type': 'Sequence'}, 'ner_tags': {'feature': {'dtype': 'int64', '_type': 'Value'},\ \ '_type': 'Sequence'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\"\ : {\n \"id\": 1,\n \"tokens\": [\n \"A\",\n \"few\",\n\ \ \"examples\",\n \"of\",\n \"autistic\",\n \"symptoms\"\ ,\n \"and\",\n \"treatments\",\n \"were\",\n \"described\"\ ,\n \"long\",\n \"before\",\n \"autism\",\n \"was\"\ ,\n \"named\",\n \".\"\n ],\n \"ner_tags\": [\n \ \ 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n\ \ 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n \ \ 0,\n 0,\n 0\n ]\n },\n \"truncated_cells\": []\n\ \ },\n {\n \"row_idx\": 1,\n \"row\": {\n \"id\": 2,\n \"tokens\"\ : [\n \"The\",\n \"Table\",\n \"Talk\",\n \"of\",\n\ \ \"Martin\",\n \"Luther\",\n \",\",\n \"compiled\"\ ,\n \"by\",\n \"his\",\n \"notetaker\",\n \",\",\n\ \ \"Mathesius\",\n \",\",\n \"contains\",\n \"the\"\ ,\n \"story\",\n \"of\",\n \"a\",\n \"12\",\n \ \ \"year\",\n \"old\",\n \"boy\",\n \"who\",\n \ \ \"may\",\n \"have\",\n \"been\",\n \"severely\",\n \ \ \"autistic\",\n \".\"\n ],\n \"ner_tags\": [\n 0,\n\ \ 717291,\n 717291,\n 0,\n 578735,\n 578735,\n\ \ 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n \ \ 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n\ \ 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n \ \ 0,\n 0,\n 0,\n 0,\n 0\n ]\n },\n \"\ truncated_cells\": []\n }\n]" - "HUB_DATASET_PREVIEW: DATASET_NAME: \"ZhongshengWang/alpaca-booksum\"\nFEATURES:\ \ {'input': {'dtype': 'string', '_type': 'Value'}, 'output': {'dtype': 'string',\ \ '_type': 'Value'}, 'instruction': {'dtype': 'string', '_type': 'Value'}}\nDATA\ \ SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\": {\n \"instruction\"\ : \"Please complete the task of abstracting and extracting text content from different\ \ domains, where input is the content of the article and output is the result\ \ of the summary.\",\n \"input\": \"\\n \\\"Mine ear is open, and my heart\ \ prepared:\\n The worst is worldly loss thou canst unfold:\\n Say, is my kingdom\ \ lost?\\\"\\n\\n SHAKESPEARE.\\n\\n\\nIt was a feature peculiar to the colonial\ \ wars of North America, that\\nthe toils and dangers of 
the wilderness were to\ \ be encountered before\\nthe adverse hosts could meet. A wide and apparently\ \ an impervious\\nboundary of forests severed the possessions of the hostile provinces\ \ of\\nFrance and England. The hardy colonist, and the trained European who\\\ nfought at his side, frequently expended months in struggling against the\\nrapids\ \ of the streams, or in effecting the rugged passes of the\\nmountains, in quest\ \ of an opportunity to exhibit their courage in a more\\nmartial conflict. But,\ \ emulating the patience and self-denial of the\\npractised native warriors, they\ \ learned to overcome every difficulty;\\nand it would seem that, in time, there\ \ was no recess of the woods so\\ndark, nor any secret place so lovely, that it\ \ might claim exemption from\\nthe inroads of those who had pledged their blood\ \ to satiate their\\nvengeance, or to uphold the cold and selfish policy of the\ \ distant\\nmonarchs of Europe.\\n\\nPerhaps no district throughout the wide extent\ \ of the intermediate\\nfrontiers can furnish a livelier picture of the cruelty\ \ and fierceness\\nof the savage warfare of those periods than the country which\ \ lies\\nbetween the head waters of the Hudson and the adjacent lakes.\\n\\nThe\ \ facilities which nature had there offered to the march of the\\ncombatants were\ \ too obvious to be neglected. The lengthened sheet of the\\nChamplain stretched\ \ from the frontiers of Canada, deep within the\\nborders of the neighboring province\ \ of New York, forming a natural\\npassage across half the distance that the French\ \ were compelled to\\nmaster in order to strike their enemies. Near its southern\ \ termination,\\nit received the contributions of another lake, whose waters were\ \ so\\nlimpid as to have been exclusively selected by the Jesuit missionaries\\\ nto perform the typical purification of baptism, and to obtain for it the\\ntitle\ \ of lake \\\"du Saint Sacrement.\\\" The less zealous English thought\\nthey\ \ conferred a sufficient honor on its unsullied fountains, when they\\nbestowed\ \ the name of their reigning prince, the second of the house of\\nHanover. The\ \ two united to rob the untutored possessors of its wooded\\nscenery of their\ \ native right to perpetuate its original appellation of\\n\\\"Horican.\\\"[1]\\\ n\\nWinding its way among countless islands, and imbedded in mountains, the\\\ n\\\"holy lake\\\" extended a dozen leagues still farther to the south. With\\\ nthe high plain that there interposed itself to the further passage of\\nthe water,\ \ commenced a portage of as many miles, which conducted the\\nadventurer to the\ \ banks of the Hudson, at a point where, with the usual\\nobstructions of the\ \ rapids, or rifts, as they were then termed in the\\nlanguage of the country,\ \ the river became navigable to the tide.\\n\\nWhile, in the pursuit of their\ \ daring plans of annoyance, the restless\\nenterprise of the French even attempted\ \ the distant and difficult gorges\\nof the Alleghany, it may easily be imagined\ \ that their proverbial\\nacuteness would not overlook the natural advantages\ \ of the district we\\nhave just described. It became, emphatically, the bloody\ \ arena, in which\\nmost of the battles for the mastery of the colonies were contested.\\\ nForts were erected at the different points that commanded the facilities\\nof\ \ the route, and were taken and retaken, razed and rebuilt, as victory\\nalighted\ \ on the hostile banners. 
While the husbandman shrank back from\\nthe dangerous\ \ passes, within the safer boundaries of the more ancient\\nsettlements, armies\ \ larger than those that had often disposed of the\\nsceptres of the mother countries,\ \ were seen to bury themselves in these\\nforests, whence they rarely returned\ \ but in skeleton bands, that were\\nhaggard with care, or dejected by defeat.\ \ Though the arts of peace were\\nunknown to this fatal region, its forests were\ \ alive with men; its\\nshades and glens rang with the sounds of martial music,\ \ and the echoes\\nof its mountains threw back the laugh, or repeated the wanton\ \ cry, of\\nmany a gallant and reckless youth, as he hurried by them, in the\\\ nnoontide of his spirits, to slumber in a long night of forgetfulness.\\n\\nIt\ \ was in this scene of strife and bloodshed that the incidents we shall\\nattempt\ \ to relate occurred, during the third year of the war which\\nEngland and France\ \ last waged for the possession of a country that\\nneither was destined to retain.\\\ n\\nThe imbecility of her military leaders abroad, and the fatal want of\\nenergy\ \ in her councils at home, had lowered the character of Great\\nBritain from the\ \ proud elevation on which it had been placed, by the\\ntalents and enterprise\ \ of her former warriors and statesmen. No longer\\ndreaded by her enemies, her\ \ servants were fast losing the confidence of\\nself-respect. In this mortifying\ \ abasement, the colonists, though\\ninnocent of her imbecility, and too humble\ \ to be the agents of her\\nblunders, were but the natural participators.\\n\\\ nThey had recently seen a chosen army from that country, which,\\nreverencing\ \ as a mother, they had blindly believed invincible--an army\\nled by a chief\ \ who had been selected from a crowd of trained warriors,\\nfor his rare military\ \ endowments, disgracefully routed by a handful of\\nFrench and Indians, and only\ \ saved from annihilation by the coolness and\\nspirit of a Virginian boy, whose\ \ riper fame has since diffused itself,\\nwith the steady influence of moral truth,\ \ to the uttermost confines of\\nChristendom.[2] A wide frontier had been laid\ \ naked by this unexpected\\ndisaster, and more substantial evils were preceded\ \ by a thousand\\nfanciful and imaginary dangers. The alarmed colonists believed\ \ that the\\nyells of the savages mingled with every fitful gust of wind that\ \ issued\\nfrom the interminable forests of the west. The terrific character of\\\ ntheir merciless enemies increased immeasurably the natural horrors of\\nwarfare.\ \ Numberless recent massacres were still vivid in their\\nrecollections; nor was\ \ there any ear in the provinces so deaf as not to\\nhave drunk in with avidity\ \ the narrative of some fearful tale of\\nmidnight murder, in which the natives\ \ of the forests were the principal\\nand barbarous actors. As the credulous and\ \ excited traveller related the\\nhazardous chances of the wilderness, the blood\ \ of the timid curdled\\nwith terror, and mothers cast anxious glances even at\ \ those children\\nwhich slumbered within the security of the largest towns. In\ \ short, the\\nmagnifying influence of fear began to set at naught the calculations\ \ of\\nreason, and to render those who should have remembered their manhood,\\\ nthe slaves of the basest of passions. 
Even the most confident and the\\nstoutest\ \ hearts began to think the issue of the contest was becoming\\ndoubtful; and\ \ that abject class was hourly increasing in numbers, who\\nthought they foresaw\ \ all the possessions of the English crown in America\\nsubdued by their Christian\ \ foes, or laid waste by the inroads of their\\nrelentless allies.\\n\\nWhen,\ \ therefore, intelligence was received at the fort, which covered\\nthe southern\ \ termination of the portage between the Hudson and the\\nlakes, that Montcalm\ \ had been seen moving up the Champlain, with an army\\n\\\"numerous as the leaves\ \ on the trees,\\\" its truth was admitted with more\\nof the craven reluctance\ \ of fear than with the stern joy that a warrior\\nshould feel, in finding an\ \ enemy within reach of his blow. The news had\\nbeen brought, towards the decline\ \ of a day in midsummer, by an Indian\\nrunner, who also bore an urgent request\ \ from Munro, the commander of a\\nwork on the shore of the \\\"holy lake,\\\"\ \ for a speedy and powerful\\nreinforcement. It has already been mentioned that\ \ the distance between\\nthese two posts was less than five leagues. The rude\ \ path, which\\noriginally formed their line of communication, had been widened\ \ for the\\npassage of wagons; so that the distance which had been travelled by\ \ the\\nson of the forest in two hours, might easily be effected by a detachment\\\ nof troops, with their necessary baggage, between the rising and setting\\nof\ \ a summer sun. The loyal servants of the British crown had given to\\none of\ \ these forest fastnesses the name of William Henry, and to the\\nother that of\ \ Fort Edward; calling each after a favorite prince of the\\nreigning family.\ \ The veteran Scotchman just named held the first, with a\\nregiment of regulars\ \ and a few provincials; a force really by far too\\nsmall to make head against\ \ the formidable power that Montcalm was\\nleading to the foot of his earthen\ \ mounds. At the latter, however, lay\\nGeneral Webb, who commanded the armies\ \ of the king in the northern\\nprovinces, with a body of more than five thousand\ \ men. By uniting the\\nseveral detachments of his command, this officer might\ \ have arrayed\\nnearly double that number of combatants against the enterprising\\\ nFrenchman, who had ventured so far from his reinforcements, with an army\\nbut\ \ little superior in numbers.\\n\\nBut under the influence of their degraded fortunes,\ \ both officers and\\nmen appeared better disposed to await the approach of their\ \ formidable\\nantagonists, within their works, than to resist the progress of\ \ their\\nmarch, by emulating the successful example of the French at Fort du\\\ nQuesne, and striking a blow on their advance.\\n\\nAfter the first surprise of\ \ the intelligence had a little abated, a\\nrumor was spread through the entrenched\ \ camp, which stretched along the\\nmargin of the Hudson, forming a chain of outworks\ \ to the body of the\\nfort itself, that a chosen detachment of fifteen hundred\ \ men was to\\ndepart, with the dawn, for William Henry, the post at the northern\\\ nextremity of the portage. That which at first was only rumor, soon\\nbecame certainty,\ \ as orders passed from the quarters of the\\ncommander-in-chief to the several\ \ corps he had selected for this\\nservice, to prepare for their speedy departure.\ \ All doubt as to the\\nintention of Webb now vanished, and an hour or two of\ \ hurried footsteps\\nand anxious faces succeeded. 
The novice in the military\ \ art flew from\\npoint to point, retarding his own preparations by the excess\ \ of his\\nviolent and somewhat distempered zeal; while the more practised veteran\\\ nmade his arrangements with a deliberation that scorned every appearance\\nof\ \ haste; though his sober lineaments and anxious eye sufficiently\\nbetrayed that\ \ he had no very strong professional relish for the as yet\\nuntried and dreaded\ \ warfare of the wilderness. At length the sun set in\\na flood of glory, behind\ \ the distant western hills, and as darkness drew\\nits veil around the secluded\ \ spot the sounds of preparation diminished;\\nthe last light finally disappeared\ \ from the log cabin of some officer;\\nthe trees cast their deeper shadows over\ \ the mounds and the rippling\\nstream, and a silence soon pervaded the camp,\ \ as deep as that which\\nreigned in the vast forest by which it was environed.\\\ n\\nAccording to the orders of the preceding night, the heavy sleep of the\\narmy\ \ was broken by the rolling of the warning drums, whose rattling\\nechoes were\ \ heard issuing, on the damp morning air, out of every vista\\nof the woods, just\ \ as day began to draw the shaggy outlines of some tall\\npines of the vicinity,\ \ on the opening brightness of a soft and cloudless\\neastern sky. In an instant\ \ the whole camp was in motion; the meanest\\nsoldier arousing from his lair to\ \ witness the departure of his\\ncomrades, and to share in the excitement and\ \ incidents of the hour. The\\nsimple array of the chosen band was soon completed.\ \ While the regular\\nand trained hirelings of the king marched with haughtiness\ \ to the right\\nof the line, the less pretending colonists took their humbler\ \ position\\non its left, with a docility that long practice had rendered easy.\ \ The\\nscouts departed; strong guards preceded and followed the lumbering\\nvehicles\ \ that bore the baggage; and before the gray light of the morning\\nwas mellowed\ \ by the rays of the sun, the main body of the combatants\\nwheeled into column,\ \ and left the encampment with a show of high\\nmilitary bearing, that served\ \ to drown the slumbering apprehensions of\\nmany a novice, who was now about\ \ to make his first essay in arms. While\\nin view of their admiring comrades,\ \ the same proud front and ordered\\narray was observed, until the notes of their\ \ fifes growing fainter in\\ndistance, the forest at length appeared to swallow\ \ up the living mass\\nwhich had slowly entered its bosom.\\n\\nThe deepest sounds\ \ of the retiring and invisible column had ceased to be\\nborne on the breeze\ \ to the listeners, and the latest straggler had\\nalready disappeared in pursuit;\ \ but there still remained the signs of\\nanother departure, before a log cabin\ \ of unusual size and\\naccommodations, in front of which those sentinels paced\ \ their rounds,\\nwho were known to guard the person of the English general. At\ \ this spot\\nwere gathered some half dozen horses, caparisoned in a manner which\\\ nshowed that two, at least, were destined to bear the persons of females,\\nof\ \ a rank that it was not usual to meet so far in the wilds of the\\ncountry. A\ \ third wore the trappings and arms of an officer of the staff;\\nwhile the rest,\ \ from the plainness of the housings, and the travelling\\nmails with which they\ \ were encumbered, were evidently fitted for the\\nreception of as many menials,\ \ who were, seemingly, already awaiting the\\npleasure of those they served. 
At\ \ a respectful distance from this\\nunusual show were gathered divers groups of\ \ curious idlers; some\\nadmiring the blood and bone of the high-mettled military\ \ charger, and\\nothers gazing at the preparations, with dull wonder of vulgar\ \ curiosity.\\nThere was one man, however, who, by his countenance and actions,\ \ formed\\na marked exception to those who composed the latter class of spectators,\\\ nbeing neither idle, nor seemingly very ignorant.\\n\\nThe person of this individual\ \ was to the last degree ungainly, without\\nbeing in any particular manner deformed.\ \ He had all the bones and joints\\nof other men, without any of their proportions.\ \ Erect, his stature\\nsurpassed that of his fellows; seated, he appeared reduced\ \ within the\\nordinary limits of the race. The same contrariety in his members\ \ seemed\\nto exist throughout the whole man. His head was large; his shoulders\\\ nnarrow; his arms long and dangling; while his hands were small, if not\\ndelicate.\ \ His legs and thighs were thin, nearly to emaciation, but of\\nextraordinary\ \ length; and his knees would have been considered\\ntremendous, had they not\ \ been outdone by the broader foundations on\\nwhich this false superstructure\ \ of the blended human orders was so\\nprofanely reared. The ill-assorted and\ \ injudicious attire of the\\nindividual only served to render his awkwardness\ \ more conspicuous. A\\nsky-blue coat, with short and broad skirts and low cape,\ \ exposed a long\\nthin neck, and longer and thinner legs, to the worst animadversions\ \ of\\nthe evil disposed. His nether garment was of yellow nankeen, closely\\\ nfitted to the shape, and tied at his bunches of knees by large knots of\\nwhite\ \ ribbon, a good deal sullied by use. Clouded cotton stockings, and\\nshoes, on\ \ one of the latter of which was a plated spur, completed the\\ncostume of the\ \ lower extremity of this figure, no curve or angle of\\nwhich was concealed,\ \ but, on the other hand, studiously exhibited,\\nthrough the vanity or simplicity\ \ of its owner. From beneath the flap of\\nan enormous pocket of a soiled vest\ \ of embossed silk, heavily ornamented\\nwith tarnished silver lace, projected\ \ an instrument, which, from being\\nseen in such martial company, might have\ \ been easily mistaken for some\\nmischievous and unknown implement of war. Small\ \ as it was, this uncommon\\nengine had excited the curiosity of most of the Europeans\ \ in the camp,\\nthough several of the provincials were seen to handle it, not\ \ only\\nwithout fear, but with the utmost familiarity. 
A large, civil cocked\\\ nhat, like those worn by clergymen within the last thirty years,\\nsurmounted\ \ the whole, furnishing dignity to a good-natured and somewhat\\nvacant countenance,\ \ that apparently needed such artificial aid, to\\nsupport the gravity of some\ \ high and extraordinary trust.\\n\\nWhile the common herd stood aloof, in deference\ \ to the quarters of Webb,\\nthe figure we have described stalked in the centre\ \ of the domestics,\\nfreely expressing his censures or commendations on the merits\ \ of the\\nhorses, as by chance they displeased or satisfied his judgment.\\n\\\ n\\\"This beast, I rather conclude, friend, is not of home raising, but is\\nfrom\ \ foreign lands, or perhaps from the little island itself over the\\nblue water?\\\ \" he said, in a voice as remarkable for the softness and\\nsweetness of its tones,\ \ as was his person for its rare proportions: \\\"I\\nmay speak of these things,\ \ and be no braggart; for I have been down at\\nboth havens; that which is situate\ \ at the mouth of Thames, and is named\\nafter the capital of Old England, and\ \ that which is called 'Haven,' with\\nthe addition of the word 'New'; and have\ \ seen the snows and brigantines\\ncollecting their droves, like the gathering\ \ to the ark, being outward\\nbound to the Island of Jamaica, for the purpose\ \ of barter and traffic in\\nfour-footed animals; but never before have I beheld\ \ a beast which\\nverified the true Scripture war-horse like this: 'He paweth\ \ in the\\nvalley, and rejoiceth in his strength: he goeth on to meet the armed\\\ nmen. He saith among the trumpets, Ha, ha; and he smelleth the battle\\nafar off,\ \ the thunder of the captains, and the shouting.' It would seem\\nthat the stock\ \ of the horse of Israel has descended to our own time;\\nwould it not, friend?\\\ \"\\n\\nReceiving no reply to this extraordinary appeal, which in truth, as it\\\ nwas delivered with the vigor of full and sonorous tones, merited some\\nsort\ \ of notice, he who had thus sung forth the language of the Holy Book\\nturned\ \ to the silent figure to whom he had unwittingly addressed\\nhimself, and found\ \ a new and more powerful subject of admiration in the\\nobject that encountered\ \ his gaze. His eyes fell on the still, upright,\\nand rigid form of the \\\"\ Indian runner,\\\" who had borne to the camp the\\nunwelcome tidings of the preceding\ \ evening. Although in a state of\\nperfect repose, and apparently disregarding,\ \ with characteristic\\nstoicism, the excitement and bustle around him, there\ \ was a sullen\\nfierceness mingled with the quiet of the savage, that was likely\ \ to\\narrest the attention of much more experienced eyes than those which now\\\ nscanned him, in unconcealed amazement. The native bore both the tomahawk\\nand\ \ knife of his tribe; and yet his appearance was not altogether that\\nof a warrior.\ \ On the contrary, there was an air of neglect about his\\nperson, like that which\ \ might have proceeded from great and recent\\nexertion, which he had not yet\ \ found leisure to repair. The colors of\\nthe war-paint had blended in dark confusion\ \ about his fierce\\ncountenance, and rendered his swarthy lineaments still more\ \ savage and\\nrepulsive than if art had attempted an effect which had been thus\\\ nproduced by chance. 
His eye, alone, which glistened like a fiery star\\namid\ \ lowering clouds, was to be seen in its state of native wildness.\\nFor a single\ \ instant, his searching and yet wary glance met the\\nwondering look of the other,\ \ and then changing its direction, partly in\\ncunning, and partly in disdain,\ \ it remained fixed, as if penetrating the\\ndistant air.\\n\\nIt is impossible\ \ to say what unlooked-for remark this short and silent\\ncommunication, between\ \ two such singular men, might have elicited from\\nthe white man, had not his\ \ active curiosity been again drawn to other\\nobjects. A general movement among\ \ the domestics, and a low sound of\\ngentle voices, announced the approach of\ \ those whose presence alone was\\nwanted to enable the cavalcade to move. The\ \ simple admirer of the\\nwar-horse instantly fell back to a low, gaunt, switch-tailed\ \ mare, that\\nwas unconsciously gleaning the faded herbage of the camp nigh by;\ \ where,\\nleaning with one elbow on the blanket that concealed an apology for\ \ a\\nsaddle, he became a spectator of the departure, while a foal was quietly\\\ nmaking its morning repast, on the opposite side of the same animal.\\n\\nA young\ \ man, in the dress of an officer, conducted to their steeds two\\nfemales, who,\ \ as it was apparent by their dresses, were prepared to\\nencounter the fatigues\ \ of a journey in the woods. One, and she was the\\nmost juvenile in her appearance,\ \ though both were young, permitted\\nglimpses of her dazzling complexion, fair\ \ golden hair, and bright blue\\neyes, to be caught, as she artlessly suffered\ \ the morning air to blow\\naside the green veil which descended low from her\ \ beaver. The flush\\nwhich still lingered above the pines in the western sky\ \ was not more\\nbright nor delicate than the bloom on her cheek; nor was the\ \ opening day\\nmore cheering than the animated smile which she bestowed on the\ \ youth,\\nas he assisted her into the saddle. The other, who appeared to share\\\ nequally in the attentions of the young officer, concealed her charms\\nfrom the\ \ gaze of the soldiery, with a care that seemed better fitted to\\nthe experience\ \ of four or five additional years. It could be seen,\\nhowever, that her person,\ \ though moulded with the same exquisite\\nproportions, of which none of the graces\ \ were lost by the travelling\\ndress she wore, was rather fuller and more mature\ \ than that of her\\ncompanion.\\n\\nNo sooner were these females seated, than\ \ their attendant sprang lightly\\ninto the saddle of the war-horse, when the\ \ whole three bowed to Webb,\\nwho, in courtesy, awaited their parting on the\ \ threshold of his cabin,\\nand turning their horses' heads, they proceeded at\ \ a slow amble,\\nfollowed by their train, towards the northern entrance of the\\\ nencampment. As they traversed that short distance, not a voice was\\nheard amongst\ \ them; but a slight exclamation proceeded from the younger\\nof the females,\ \ as the Indian runner glided by her, unexpectedly, and\\nled the way along the\ \ military road in her front. Though this sudden and\\nstartling movement of the\ \ Indian produced no sound from the other, in\\nthe surprise her veil also was\ \ allowed to open its folds, and betrayed\\nan indescribable look of pity, admiration,\ \ and horror, as her dark eye\\nfollowed the easy motions of the savage. The tresses\ \ of this lady were\\nshining and black, like the plumage of the raven. 
Her complexion\ \ was not\\nbrown, but it rather appeared charged with the color of the rich blood,\\\ nthat seemed ready to burst its bounds. And yet there was neither\\ncoarseness\ \ nor want of shadowing in a countenance that was exquisitely\\nregular and dignified,\ \ and surpassingly beautiful. She smiled, as if in\\npity at her own momentary\ \ forgetfulness, discovering by the act a row of\\nteeth that would have shamed\ \ the purest ivory; when, replacing the veil,\\nshe bowed her face, and rode in\ \ silence, like one whose thoughts were\\nabstracted from the scene around her.\\\ n\\n\\n\\n\\n \\\"Sola, sola, wo, ha, ho, sola!\\\"\\n\\n SHAKESPEARE.\\n\\\ n\\nWhile one of the lovely beings we have so cursorily presented to the\\nreader\ \ was thus lost in thought, the other quickly recovered from the\\nalarm which\ \ induced the exclamation, and, laughing at her own weakness,\\nshe inquired of\ \ the youth who rode by her side,--\\n\\n\\\"Are such spectres frequent in the\ \ woods, Heyward; or is this sight an\\nespecial entertainment on our behalf?\ \ If the latter, gratitude must\\nclose our mouths; but if the former, both Cora\ \ and I shall have need to\\ndraw largely on that stock of hereditary courage\ \ which we boast, even\\nbefore we are made to encounter the redoubtable Montcalm.\\\ \"\\n\\n\\\"Yon Indian is a 'runner' of the army; and, after the fashion of his\\\ npeople, he may be accounted a hero,\\\" returned the officer. \\\"He has\\nvolunteered\ \ to guide us to the lake, by a path but little known, sooner\\nthan if we followed\ \ the tardy movements of the column: and, by\\nconsequence, more agreeably.\\\"\ \\n\\n\\\"I like him not,\\\" said the lady, shuddering, partly in assumed, yet\ \ more\\nin real terror. \\\"You know him, Duncan, or you would not trust yourself\\\ nso freely to his keeping?\\\"\\n\\n\\\"Say, rather, Alice, that I would not trust\ \ you. I do know him, or he\\nwould not have my confidence, and least of all at\ \ this moment. He is\\nsaid to be a Canadian, too; and yet he served with our\ \ friends the\\nMohawks, who, as you know, are one of the six allied nations.[3]\ \ He was\\nbrought among us, as I have heard, by some strange accident in which\\\ nyour father was interested, and in which the savage was rigidly dealt\\nby--but\ \ I forget the idle tale; it is enough, that he is now our\\nfriend.\\\"\\n\\\ n\\\"If he has been my father's enemy, I like him still less!\\\" exclaimed the\\\ nnow really anxious girl. \\\"Will you not speak to him, Major Heyward, that\\\ nI may hear his tones? Foolish though it may be, you have often heard me\\navow\ \ my faith in the tones of the human voice!\\\"\\n\\n\\\"It would be in vain;\ \ and answered, most probably, by an ejaculation.\\nThough he may understand it,\ \ he affects, like most of his people, to be\\nignorant of the English; and least\ \ of all will he condescend to speak\\nit, now that war demands the utmost exercise\ \ of his dignity. 
But he\\nstops; the private path by which we are to journey\ \ is, doubtless, at\\nhand.\\\"\\n\\nThe conjecture of Major Heyward was true.\ \ When they reached the spot\\nwhere the Indian stood, pointing into the thicket\ \ that fringed the\\nmilitary road, a narrow and blind path, which might, with\ \ some little\\ninconvenience, receive one person at a time, became visible.\\\ n\\n\\\"Here, then, lies our way,\\\" said the young man, in a low voice.\\n\\\ \"Manifest no distrust, or you may invite the danger you appear to\\napprehend.\\\ \"\\n\\n\\\"Cora, what think you?\\\" asked the reluctant fair one. \\\"If we\ \ journey\\nwith the troops, though we may find their presence irksome, shall\ \ we not\\nfeel better assurance of our safety?\\\"\\n\\n\\\"Being little accustomed\ \ to the practices of the savages, Alice, you\\nmistake the place of real danger,\\\ \" said Heyward. \\\"If enemies have\\nreached the portage at all, a thing by\ \ no means probable, as our scouts\\nare abroad, they will surely be found skirting\ \ the column where scalps\\nabound the most. The route of the detachment is known,\ \ while ours,\\nhaving been determined within the hour, must still be secret.\\\ \"\\n\\n\\\"Should we distrust the man because his manners are not our manners,\ \ and\\nthat his skin is dark?\\\" coldly asked Cora.\\n\\nAlice hesitated no\ \ longer; but giving her Narragansett[4] a smart cut\\nof the whip, she was the\ \ first to dash aside the slight branches of the\\nbushes, and to follow the runner\ \ along the dark and tangled pathway. The\\nyoung man regarded the last speaker\ \ in open admiration, and even\\npermitted her fairer though certainly not more\ \ beautiful companion to\\nproceed unattended, while he sedulously opened the\ \ way himself for the\\npassage of her who has been called Cora. It would seem\ \ that the\\ndomestics had been previously instructed; for, instead of penetrating\\\ nthe thicket, they followed the route of the column; a measure which\\nHeyward\ \ stated had been dictated by the sagacity of their guide, in\\norder to diminish\ \ the marks of their trail, if, haply, the Canadian\\nsavages should be lurking\ \ so far in advance of their army. For many\\nminutes the intricacy of the route\ \ admitted of no further dialogue;\\nafter which they emerged from the broad border\ \ of underbrush which grew\\nalong the line of the highway, and entered under\ \ the high but dark\\narches of the forest. Here their progress was less interrupted,\ \ and the\\ninstant the guide perceived that the females could command their steeds,\\\ nhe moved on, at a pace between a trot and a walk, and at a rate which\\nkept\ \ the sure-footed and peculiar animals they rode, at a fast yet easy\\namble.\ \ The youth had turned to speak to the dark-eyed Cora, when the\\ndistant sound\ \ of horses' hoofs, clattering over the roots of the broken\\nway in his rear,\ \ caused him to check his charger; and, as his companions\\ndrew their reins at\ \ the same instant, the whole party came to a halt, in\\norder to obtain an explanation\ \ of the unlooked-for interruption.\\n\\nIn a few moments a colt was seen gliding,\ \ like a fallow-deer, among the\\nstraight trunks of the pines; and, in another\ \ instant, the person of the\\nungainly man described in the preceding chapter,\ \ came into view, with as\\nmuch rapidity as he could excite his meagre beast\ \ to endure without\\ncoming to an open rupture. Until now this personage had\ \ escaped the\\nobservation of the travellers. 
If he possessed the power to arrest\ \ any\\nwandering eye when exhibiting the glories of his altitude on foot, his\\\ nequestrian graces were still more likely to attract attention.\\nNotwithstanding\ \ a constant application of his one armed heel to the\\nflanks of the mare, the\ \ most confirmed gait that he could establish was\\na Canterbury gallop with the\ \ hind legs, in which those more forward\\nassisted for doubtful moments, though\ \ generally content to maintain a\\nloping trot. Perhaps the rapidity of the changes\ \ from one of these paces\\nto the other created an optical illusion, which might\ \ thus magnify the\\npowers of the beast; for it is certain that Heyward, who\ \ possessed a\\ntrue eye for the merits of a horse, was unable, with his utmost\\\ ningenuity, to decide by what sort of movement his pursuer worked his\\nsinuous\ \ way on his footsteps with such persevering hardihood.\\n\\nThe industry and\ \ movements of the rider were not less remarkable than\\nthose of the ridden.\ \ At each change in the evolutions of the latter, the\\nformer raised his tall\ \ person in the stirrups; producing, in this\\nmanner, by the undue elongation\ \ of his legs, such sudden growths and\\ndiminishings of the stature, as baffled\ \ every conjecture that might be\\nmade as to his dimensions. If to this be added\ \ the fact that, in\\nconsequence of the ex parte application of the spur, one\ \ side of the\\nmare appeared to journey faster than the other; and that the aggrieved\\\ nflank was resolutely indicated by unremitted flourishes of a bushy tail,\\nwe\ \ finish the picture of both horse and man.\\n\\nThe frown which had gathered\ \ around the handsome, open, and manly brow\\nof Heyward, gradually relaxed, and\ \ his lips curled into a slight smile,\\nas he regarded the stranger. Alice made\ \ no very powerful effort to\\ncontrol her merriment; and even the dark, thoughtful\ \ eye of Cora lighted\\nwith a humor that, it would seem, the habit, rather than\ \ the nature of\\nits mistress repressed.\\n\\n\\\"Seek you any here?\\\" demanded\ \ Heyward, when the other had arrived\\nsufficiently nigh to abate his speed;\ \ \\\"I trust you are no messenger of\\nevil tidings?\\\"\\n\\n\\\"Even so,\\\"\ \ replied the stranger, making diligent use of his triangular\\ncastor, to produce\ \ a circulation in the close air of the woods, and\\nleaving his hearers in doubt\ \ to which of the young man's questions he\\nresponded; when, however, he had\ \ cooled his face, and recovered his\\nbreath, he continued, \\\"I hear you are\ \ riding to William Henry; as I am\\njourneying thitherward myself, I concluded\ \ good company would seem\\nconsistent to the wishes of both parties.\\\"\\n\\\ n\\\"You appear to possess the privilege of a casting vote,\\\" returned\\nHeyward;\ \ \\\"we are three, whilst you have consulted no one but yourself.\\\"\\n\\n\\\ \"Even so. The first point to be obtained is to know one's own mind. Once\\nsure\ \ of that, and where women are concerned, it is not easy, the next\\nis, to act\ \ up to the decision. 
I have endeavored to do both, and here I\\nam.\\\"\\n\\\ n\\\"If you journey to the lake, you have mistaken your route,\\\" said\\nHeyward,\ \ haughtily; \\\"the highway thither is at least half a mile behind\\nyou.\\\"\ \\n\\n\\\"Even so,\\\" returned the stranger, nothing daunted by this cold\\nreception;\ \ \\\"I have tarried at 'Edward' a week, and I should be dumb not\\nto have inquired\ \ the road I was to journey; and if dumb there would be\\nan end to my calling.\\\ \" After simpering in a small way, like one whose\\nmodesty prohibited a more\ \ open expression of his admiration of a\\nwitticism that was perfectly unintelligible\ \ to his hearers, he\\ncontinued: \\\"It is not prudent for any one of my profession\ \ to be too\\nfamiliar with those he is to instruct; for which reason I follow\ \ not the\\nline of the army; besides which, I conclude that a gentleman of your\\\ ncharacter has the best judgment in matters of wayfaring; I have\\ntherefore decided\ \ to join company, in order that the ride may be made\\nagreeable, and partake\ \ of social communion.\\\"\\n\\n\\\"A most arbitrary, if not a hasty decision!\\\ \" exclaimed Heyward,\\nundecided whether to give vent to his growing anger, or\ \ to laugh in the\\nother's face. \\\"But you speak of instruction, and of a profession;\ \ are\\nyou an adjunct to the provincial corps, as a master of the noble science\\\ nof defence and offence; or, perhaps, you are one who draws lines and\\nangles,\ \ under the pretence of expounding the mathematics?\\\"\\n\\nThe stranger regarded\ \ his interrogator a moment, in wonder; and then,\\nlosing every mark of self-satisfaction\ \ in an expression of solemn\\nhumility, he answered:--\\n\\n\\\"Of offence, I\ \ hope there is none, to either party: of defence, I make\\nnone--by God's good\ \ mercy, having committed no palpable sin since last\\nentreating his pardoning\ \ grace. I understand not your allusions about\\nlines and angles; and I leave\ \ expounding to those who have been called\\nand set apart for that holy office.\ \ I lay claim to no higher gift than a\\nsmall insight into the glorious art of\ \ petitioning and thanksgiving, as\\npractised in psalmody.\\\"\\n\\n\\\"The man\ \ is, most manifestly, a disciple of Apollo,\\\" cried the amused\\nAlice, \\\"\ and I take him under my own especial protection. Nay, throw\\naside that frown,\ \ Heyward, and in pity to my longing ears, suffer him to\\njourney in our train.\ \ Besides,\\\" she added, in a low and hurried voice,\\ncasting a glance at the\ \ distant Cora, who slowly followed the footsteps\\nof their silent but sullen\ \ guide, \\\"it may be a friend added to our\\nstrength, in time of need.\\\"\\\ n\\n\\\"Think you, Alice, that I would trust those I love by this secret path,\\\ ndid I imagine such need could happen?\\\"\\n\\n\\\"Nay, nay, I think not of it\ \ now; but this strange man amuses me; and if\\nhe 'hath music in his soul,' let\ \ us not churlishly reject his company.\\\"\\nShe pointed persuasively along the\ \ path with her riding-whip, while\\ntheir eyes met in a look which the young\ \ man lingered a moment to\\nprolong; then yielding to her gentle influence, he\ \ clapped his spurs\\ninto his charger, and in a few bounds was again at the side\ \ of Cora.\\n\\n\\\"I am glad to encounter thee, friend,\\\" continued the maiden,\ \ waving her\\nhand to the stranger to proceed, as she urged her Narragansett\ \ to renew\\nits amble. 
\\\"Partial relatives have almost persuaded me that I\ \ am not\\nentirely worthless in a duet myself; and we may enliven our wayfaring\ \ by\\nindulging in our favorite pursuit. It might be of signal advantage to\\\ none, ignorant as I, to hear the opinions and experience of a master in\\nthe\ \ art.\\\"\\n\\n\\\"It is refreshing both to the spirits and to the body to indulge\ \ in\\npsalmody, in befitting seasons,\\\" returned the master of song,\\nunhesitatingly\ \ complying with her intimation to follow; \\\"and nothing\\nwould relieve the\ \ mind more than such a consoling communion. But four\\nparts are altogether necessary\ \ to the perfection of melody. You have all\\nthe manifestations of a soft and\ \ rich treble; I can, by especial aid,\\ncarry a full tenor to the highest letter;\ \ but we lack counter and bass!\\nYon officer of the king, who hesitated to admit\ \ me to his company, might\\nfill the latter, if one may judge from the intonations\ \ of his voice in\\ncommon dialogue.\\\"\\n\\n\\\"Judge not too rashly from hasty\ \ and deceptive appearances,\\\" said the\\nlady, smiling; \\\"though Major Heyward\ \ can assume such deep notes on\\noccasion, believe me, his natural tones are\ \ better fitted for a mellow\\ntenor than the bass you heard.\\\"\\n\\n\\\"Is\ \ he, then, much practised in the art of psalmody?\\\" demanded her\\nsimple companion.\\\ n\\nAlice felt disposed to laugh, though she succeeded in suppressing her\\nmerriment,\ \ ere she answered,--\\n\\n\\\"I apprehend that he is rather addicted to profane\ \ song. The chances of\\na soldier's life are but little fitted for the encouragement\ \ of more\\nsober inclinations.\\\"\\n\\n\\\"Man's voice is given to him, like\ \ his other talents, to be used, and\\nnot to be abused. None can say they have\ \ ever known me neglect my gifts!\\nI am thankful that, though my boyhood may\ \ be said to have been set\\napart, like the youth of the royal David, for the\ \ purposes of music, no\\nsyllable of rude verse has ever profaned my lips.\\\"\ \\n\\n\\\"You have, then, limited your efforts to sacred song?\\\"\\n\\n\\\"Even\ \ so. As the psalms of David exceed all other language, so does the\\npsalmody\ \ that has been fitted to them by the divines and sages of the\\nland, surpass\ \ all vain poetry. Happily, I may say that I utter nothing\\nbut the thoughts\ \ and the wishes of the King of Israel himself; for\\nthough the times may call\ \ for some slight changes, yet does this version\\nwhich we use in the colonies\ \ of New England, so much exceed all other\\nversions, that, by its richness,\ \ its exactness, and its spiritual\\nsimplicity, it approacheth, as near as may\ \ be, to the great work of the\\ninspired writer. I never abide in any place,\ \ sleeping or waking, without\\nan example of this gifted work. 
'Tis the six-and-twentieth\ \ edition,\\npromulgated at Boston, Anno Domini 1744; and is entitled, _The Psalms,\\\ nHymns, and Spiritual Songs of the Old and New Testaments; faithfully\\ntranslated\ \ into English Metre, for the Use, Edification, and Comfort of\\nthe Saints, in\ \ Public and Private, especially in New England_.\\\"\\n\\nDuring this eulogium\ \ on the rare production of his native poets, the\\nstranger had drawn the book\ \ from his pocket, and, fitting a pair of\\niron-rimmed spectacles to his nose,\ \ opened the volume with a care and\\nveneration suited to its sacred purposes.\ \ Then, without circumlocution\\nor apology, first pronouncing the word \\\"Standish,\\\ \" and placing the\\nunknown engine, already described, to his mouth, from which\ \ he drew a\\nhigh, shrill sound, that was followed by an octave below, from his\ \ own\\nvoice, he commenced singing the following words, in full, sweet, and\\\ nmelodious tones, that set the music, the poetry, and even the uneasy\\nmotion\ \ of his ill-trained beast at defiance:--\\n\\n \\\"How good it is, O see,\\\ n And how it pleaseth well,\\n Together, e'en in unity,\\n For brethren\ \ so to dwell.\\n It's like the choice ointment,\\n From the head to the beard\ \ did go:\\n Down Aaron's beard, that downward went,\\n His garment's skirts\ \ unto.\\\"\\n\\nThe delivery of these skilful rhymes was accompanied, on the\ \ part of the\\nstranger, by a regular rise and fall of his right hand, which\\\ nterminated at the descent, by suffering the fingers to dwell a moment on\\nthe\ \ leaves of the little volume; and on the ascent, by such a flourish\\nof the\ \ member as none but the initiated may ever hope to imitate. It\\nwould seem that\ \ long practice had rendered this manual accompaniment\\nnecessary; for it did\ \ not cease until the preposition which the poet had\\nselected for the close\ \ of his verse, had been duly delivered like a word\\nof two syllables.\\n\\nSuch\ \ an innovation on the silence and retirement of the forest could not\\nfail to\ \ enlist the ears of those who journeyed at so short a distance in\\nadvance.\ \ The Indian muttered a few words in broken English to Heyward,\\nwho, in his\ \ turn, spoke to the stranger; at once interrupting, and, for\\nthe time, closing\ \ his musical efforts.\\n\\n\\\"Though we are not in danger, common prudence would\ \ teach us to journey\\nthrough this wilderness in as quiet a manner as possible.\ \ You will,\\nthen, pardon me, Alice, should I diminish your enjoyments, by requesting\\\ nthis gentleman to postpone his chant until a safer opportunity.\\\"\\n\\n\\\"\ You will diminish them, indeed,\\\" returned the arch girl, \\\"for never did\\\ nI hear a more unworthy conjunction of execution and language, than that\\nto\ \ which I have been listening; and I was far gone in a learned inquiry\\ninto\ \ the causes of such an unfitness between sound and sense, when you\\nbroke the\ \ charm of my musings by that bass of yours, Duncan!\\\"\\n\\n\\\"I know not what\ \ you call my bass,\\\" said Heyward, piqued at her remark,\\n\\\"but I know that\ \ your safety, and that of Cora, is far dearer to me than\\ncould be any orchestra\ \ of Handel's music.\\\" He paused and turned his head\\nquickly towards a thicket,\ \ and then bent his eyes suspiciously on their\\nguide, who continued his steady\ \ pace, in undisturbed gravity. 
The young\\nman smiled to himself, for he believed\ \ he had mistaken some shining\\nberry of the woods for the glistening eyeballs\ \ of a prowling savage, and\\nhe rode forward, continuing the conversation which\ \ had been interrupted\\nby the passing thought.\\n\\nMajor Heyward was mistaken\ \ only in suffering his youthful and generous\\npride to suppress his active watchfulness.\ \ The cavalcade had not long\\npassed, before the branches of the bushes that\ \ formed the thicket were\\ncautiously moved asunder, and a human visage, as fiercely\ \ wild as savage\\nart and unbridled passions could make it, peered out on the\ \ retiring\\nfootsteps of the travellers. A gleam of exultation shot across the\\\ ndarkly painted lineaments of the inhabitant of the forest, as he traced\\nthe\ \ route of his intended victims, who rode unconsciously onward; the\\nlight and\ \ graceful forms of the females waving among the trees, in the\\ncurvatures of\ \ their path, followed at each bend by the manly figure of\\nHeyward, until, finally,\ \ the shapeless person of the singing-master was\\nconcealed behind the numberless\ \ trunks of trees, that rose, in dark\\nlines, in the intermediate space.\\n\\\ n\\n\\n\",\n \"output\": \"Before any characters appear, the time and geography\ \ are made clear. Though it is the last war that England and France waged for\ \ a country that neither would retain, the wilderness between the forces still\ \ has to be overcome first. Thus it is in 1757, in the New York area between the\ \ head waters of the Hudson River and Lake George to the north. Because only two\ \ years earlier General Braddock was disgracefully routed by a handful of French\ \ and Indians, the frontier is now exposed to real and imaginary savage disasters\ \ as well as to the horrors of warfare. Fear has replaced reason. Near dusk of\ \ a day in July, an Indian runner named Magua arrives at Fort Edward on the upper\ \ Hudson. He has come from Fort William Henry at the southern tip of Lake George\ \ with the news that the French General Montcalm is moving south with a very large\ \ army and that Munro, commander of Fort William Henry, is in urgent need of plentiful\ \ reinforcements from General Webb. Early the next morning, a limited detachment\ \ of fifteen hundred regulars and colonists departs as if swallowed by the forest.\ \ Shortly afterwards, Major Duncan Heyward and Alice and Cora Munro, guided by\ \ Magua on foot, take by horseback a secret route toward William Henry for the\ \ girls to join their father. Blonde Alice is doubtful about Magua, covered with\ \ war paint and showing a sullen fierceness; but dark-haired Cora is stoically\ \ common sense about him, even though Heyward mentions that their father had once\ \ had to deal rigidly with the Indian. As the small party pushes on, they are\ \ overtaken by David Gamut, a tall, ungainly psalmodist ridiculously dressed and\ \ carrying a pitch pipe while riding a mare followed by its young colt. He desires\ \ to join them, and after some banter between him and Alice, he pulls out the\ \ twenty-sixth edition of The Bay Psalm Book, sounds his pipe, and renders a song\ \ \\\"in full, sweet, and melodious tones.\\\" At a muttered comment from Magua,\ \ Heyward insists upon silence for safety. Then he glances about them and, satisfied\ \ that he has seen only shining berries, smiles to himself as they move on. But\ \ he is wrong. 
The branches move and a man peers exultingly after them as they\ \ disappear among the dark lines of trees.\"\n },\n \"truncated_cells\"\ : []\n },\n {\n \"row_idx\": 1,\n \"row\": {\n \"instruction\": \"\ Please complete the task of abstracting and extracting text content from different\ \ domains, where input is the content of the article and output is the result\ \ of the summary.\",\n \"input\": \"\\n \\\"Before these fields were shorn\ \ and tilled,\\n Full to the brim our rivers flowed;\\n The melody of waters\ \ filled\\n The fresh and boundless wood;\\n And torrents dashed, and rivulets\ \ played,\\n And fountains spouted in the shade.\\\"\\n\\n BRYANT.\\n\\n\\\ nLeaving the unsuspecting Heyward and his confiding companions to\\npenetrate\ \ still deeper into a forest that contained such treacherous\\ninmates, we must\ \ use an author's privilege, and shift the scene a few\\nmiles to the westward\ \ of the place where we have last seen them.\\n\\nOn that day, two men were lingering\ \ on the banks of a small but rapid\\nstream, within an hour's journey of the\ \ encampment of Webb, like those\\nwho awaited the appearance of an absent person,\ \ or the approach of some\\nexpected event. The vast canopy of woods spread itself\ \ to the margin of\\nthe river overhanging the water, and shadowing its dark current\ \ with a\\ndeeper hue. The rays of the sun were beginning to grow less fierce,\ \ and\\nthe intense heat of the day was lessened, as the cooler vapors of the\\\ nsprings and fountains rose above their leafy beds, and rested in the\\natmosphere.\ \ Still that breathing silence, which marks the drowsy\\nsultriness of an American\ \ landscape in July, pervaded the secluded spot,\\ninterrupted only by the low\ \ voices of the men, the occasional and lazy\\ntap of a woodpecker, the discordant\ \ cry of some gaudy jay, or a swelling\\non the ear, from the dull roar of a distant\ \ waterfall.\\n\\nThese feeble and broken sounds were, however, too familiar to\ \ the\\nforesters, to draw their attention from the more interesting matter of\\\ ntheir dialogue. While one of these loiterers showed the red skin and\\nwild accoutrements\ \ of a native of the woods, the other exhibited,\\nthrough the mask of his rude\ \ and nearly savage equipments, the brighter,\\nthough sunburnt and long-faded\ \ complexion of one who might claim descent\\nfrom a European parentage. The former\ \ was seated on the end of a mossy\\nlog, in a posture that permitted him to heighten\ \ the effect of his\\nearnest language, by the calm but expressive gestures of\ \ an Indian\\nengaged in debate. His body, which was nearly naked, presented a\\\ nterrific emblem of death, drawn in intermingled colors of white and\\nblack.\ \ His closely shaved head, on which no other hair than the well\\nknown and chivalrous\ \ scalping tuft[5] was preserved, was without\\nornament of any kind, with the\ \ exception of a solitary eagle's plume,\\nthat crossed his crown, and depended\ \ over the left shoulder. A tomahawk\\nand scalping-knife, of English manufacture,\ \ were in his girdle; while a\\nshort military rifle, of that sort with which\ \ the policy of the whites\\narmed their savage allies, lay carelessly across\ \ his bare and sinewy\\nknee. 
The expanded chest, full formed limbs, and grave\ \ countenance of\\nthis warrior, would denote that he had reached the vigor of\ \ his days,\\nthough no symptoms of decay appeared to have yet weakened his manhood.\\\ n\\nThe frame of the white man, judging by such parts as were not concealed\\\ nby his clothes, was like that of one who had known hardships and\\nexertion from\ \ his earliest youth. His person, though muscular, was\\nrather attenuated than\ \ full; but every nerve and muscle appeared strung\\nand indurated by unremitted\ \ exposure and toil. He wore a hunting-shirt\\nof forest green, fringed with faded\ \ yellow[6], and a summer cap of skins\\nwhich had been shorn of their fur. He\ \ also bore a knife in a girdle of\\nwampum, like that which confined the scanty\ \ garments of the Indian, but\\nno tomahawk. His moccasins were ornamented after\ \ the gay fashion of the\\nnatives, while the only part of his under-dress which\ \ appeared below the\\nhunting-frock, was a pair of buckskin leggings, that laced\ \ at the sides,\\nand which were gartered above the knees with the sinews of a\ \ deer. A\\npouch and horn completed his personal accoutrements, though a rifle\ \ of\\ngreat length[7], which the theory of the more ingenious whites had\\ntaught\ \ them was the most dangerous of all fire-arms, leaned against a\\nneighboring\ \ sapling. The eye of the hunter, or scout, whichever he might\\nbe, was small,\ \ quick, keen, and restless, roving while he spoke, on\\nevery side of him, as\ \ if in quest of game, or distrusting the sudden\\napproach of some lurking enemy.\ \ Notwithstanding the symptoms of habitual\\nsuspicion, his countenance was not\ \ only without guile, but at the moment\\nat which he is introduced, it was charged\ \ with an expression of sturdy\\nhonesty.\\n\\n\\\"Even your traditions make the\ \ case in my favor, Chingachgook,\\\" he said,\\nspeaking in the tongue which\ \ was known to all the natives who formerly\\ninhabited the country between the\ \ Hudson and the Potomac, and of which\\nwe shall give a free translation for\ \ the benefit of the reader;\\nendeavoring, at the same time, to preserve some\ \ of the peculiarities,\\nboth of the individual and of the language. \\\"Your\ \ fathers came from the\\nsetting sun, crossed the big river,[8] fought the people\ \ of the country,\\nand took the land; and mine came from the red sky of the morning,\ \ over\\nthe salt lake, and did their work much after the fashion that had been\\\ nset them by yours; then let God judge the matter between us, and friends\\nspare\ \ their words!\\\"\\n\\n\\\"My fathers fought with the naked redmen!\\\" returned\ \ the Indian sternly,\\nin the same language. \\\"Is there no difference, Hawkeye,\ \ between the\\nstone-headed arrow of the warrior, and the leaden bullet with\ \ which you\\nkill?\\\"\\n\\n\\\"There is reason in an Indian, though nature has\ \ made him with a red\\nskin!\\\" said the white man, shaking his head like one\ \ on whom such an\\nappeal to his justice was not thrown away. 
For a moment he\ \ appeared to\\nbe conscious of having the worst of the argument, then, rallying\ \ again,\\nhe answered the objection of his antagonist in the best manner his\\\ nlimited information would allow: \\\"I am no scholar, and I care not who\\nknows\ \ it; but judging from what I have seen, at deer chases and squirrel\\nhunts,\ \ of the sparks below, I should think a rifle in the hands of their\\ngrandfathers\ \ was not so dangerous as a hickory bow and a good flint-head\\nmight be, if drawn\ \ with Indian judgment, and sent by an Indian eye.\\\"\\n\\n\\\"You have the story\ \ told by your fathers,\\\" returned the other, coldly\\nwaving his hand. \\\"\ What say your old men? do they tell the young\\nwarriors, that the pale-faces\ \ met the redmen, painted for war and armed\\nwith the stone hatchet and wooden\ \ gun?\\\"\\n\\n\\\"I am not a prejudiced man, nor one who vaunts himself on his\ \ natural\\nprivileges, though the worst enemy I have on earth, and he is an\\\ nIroquois, daren't deny that I am genuine white,\\\" the scout replied,\\nsurveying,\ \ with secret satisfaction, the faded color of his bony and\\nsinewy hand; \\\"\ and I am willing to own that my people have many ways, of\\nwhich, as an honest\ \ man, I can't approve. It is one of their customs to\\nwrite in books what they\ \ have done and seen, instead of telling them in\\ntheir villages, where the lie\ \ can be given to the face of a cowardly\\nboaster, and the brave soldier can\ \ call on his comrades to witness for\\nthe truth of his words. In consequence\ \ of this bad fashion, a man who is\\ntoo conscientious to misspend his days among\ \ the women, in learning the\\nnames of black marks, may never hear of the deeds\ \ of his fathers, nor\\nfeel a pride in striving to outdo them. For myself, I\ \ conclude the\\nBumppos could shoot, for I have a natural turn with a rifle,\ \ which must\\nhave been handed down from generation to generation, as, our holy\\\ ncommandments tell us, all good and evil gifts are bestowed; though I\\nshould\ \ be loth to answer for other people in such a matter. 
But every\\nstory has its\ \ two sides; so I ask you, Chingachgook, what passed,\\naccording to the traditions\ \ of the redmen, when our fathers first met?\\\"\\n\\nA silence of a minute succeeded,\ \ during which the Indian sat mute; then,\\nfull of the dignity of his office,\ \ he commenced his brief tale, with a\\nsolemnity that served to heighten its\ \ appearance of truth.\\n\\n\\\"Listen, Hawkeye, and your ear shall drink no lie.\ \ 'Tis what my fathers\\nhave said, and what the Mohicans have done.\\\" He hesitated\ \ a single\\ninstant, and bending a cautious glance toward his companion, he\\\ ncontinued, in a manner that was divided between interrogation and\\nassertion,\ \ \\\"Does not this stream at our feet run towards the summer,\\nuntil its waters\ \ grow salt, and the current flows upward?\\\"\\n\\n\\\"It can't be denied that\ \ your traditions tell you true in both these\\nmatters,\\\" said the white man;\ \ \\\"for I have been there, and have seen\\nthem; though, why water, which is\ \ so sweet in the shade, should become\\nbitter in the sun, is an alteration for\ \ which I have never been able to\\naccount.\\\"\\n\\n\\\"And the current!\\\"\ \ demanded the Indian, who expected his reply with that\\nsort of interest that\ \ a man feels in the confirmation of testimony, at\\nwhich he marvels even while\ \ he respects it; \\\"the fathers of Chingachgook\\nhave not lied!\\\"\\n\\n\\\ \"The Holy Bible is not more true, and that is the truest thing in\\nnature. They\ \ call this up-stream current the tide, which is a thing soon\\nexplained, and\ \ clear enough. Six hours the waters run in, and six hours\\nthey run out, and\ \ the reason is this: when there is higher water in the\\nsea than in the river,\ \ they run in, until the river gets to be highest,\\nand then it runs out again.\\\ \"\\n\\n\\\"The waters in the woods, and on the great lakes, run downward until\\\ nthey lie like my hand,\\\" said the Indian, stretching the limb\\nhorizontally\ \ before him, \\\"and then they run no more.\\\"\\n\\n\\\"No honest man will deny\ \ it,\\\" said the scout, a little nettled at the\\nimplied distrust of his explanation\ \ of the mystery of the tides; \\\"and I\\ngrant that it is true on the small\ \ scale, and where the land is level.\\nBut everything depends on what scale you\ \ look at things. Now, on the\\nsmall scale, the 'arth is level; but on the large\ \ scale it is round. In\\nthis manner, pools and ponds, and even the great fresh-water\ \ lake, may\\nbe stagnant, as you and I both know they are, having seen them;\ \ but when\\nyou come to spread water over a great tract, like the sea, where\ \ the\\nearth is round, how in reason can the water be quiet? You might as well\\\ nexpect the river to lie still on the brink of those black rocks a mile\\nabove\ \ us, though your own ears tell you that it is tumbling over them at\\nthis very\ \ moment!\\\"\\n\\nIf unsatisfied by the philosophy of his companion, the Indian\ \ was far\\ntoo dignified to betray his unbelief. He listened like one who was\\\ nconvinced, and resumed his narrative in his former solemn manner.\\n\\n\\\"We\ \ came from the place where the sun is hid at night, over great plains\\nwhere\ \ the buffaloes live, until we reached the big river. There we\\nfought the Alligewi,\ \ till the ground was red with their blood. From the\\nbanks of the big river\ \ to the shores of the salt lake, there was none to\\nmeet us. The Maquas followed\ \ at a distance. 
We said the country should\\nbe ours from the place where the\ \ water runs up no longer on this stream,\\nto a river twenty suns' journey toward\ \ the summer. The land we had taken\\nlike warriors, we kept like men. We drove\ \ the Maquas into the woods with\\nthe bears. They only tasted salt at the licks;\ \ they drew no fish from\\nthe great lake; we threw them the bones.\\\"\\n\\n\\\ \"All this I have heard and believe,\\\" said the white man, observing that\\\ nthe Indian paused: \\\"but it was long before the English came into the\\ncountry.\\\ \"\\n\\n\\\"A pine grew then where this chestnut now stands. The first pale-faces\\\ nwho came among us spoke no English. They came in a large canoe, when my\\nfathers\ \ had buried the tomahawk with the redmen around them. Then,\\nHawkeye,\\\" he\ \ continued, betraying his deep emotion only by permitting\\nhis voice to fall\ \ to those low, guttural tones, which rendered his\\nlanguage, as spoken at times,\ \ so very musical; \\\"then, Hawkeye, we were\\none people, and we were happy.\ \ The salt lake gave us its fish, the wood\\nits deer, and the air its birds.\ \ We took wives who bore us children; we\\nworshipped the Great Spirit; and we\ \ kept the Maquas beyond the sound of\\nour songs of triumph!\\\"\\n\\n\\\"Know\ \ you anything of your own family at that time?\\\" demanded the white.\\n\\\"\ But you are a just man, for an Indian! and, as I suppose you hold their\\ngifts,\ \ your fathers must have been brave warriors, and wise men at the\\ncouncil fire.\\\ \"\\n\\n\\\"My tribe is the grandfather of nations, but I am an unmixed man. The\\\ nblood of chiefs is in my veins, where it must stay forever. The Dutch\\nlanded,\ \ and gave my people the fire-water; they drank until the heavens\\nand the earth\ \ seemed to meet, and they foolishly thought they had found\\nthe Great Spirit.\ \ Then they parted with their land. Foot by foot, they\\nwere driven back from\ \ the shores, until I, that am a chief and a\\nsagamore, have never seen the sun\ \ shine but through the trees, and have\\nnever visited the graves of, my fathers!\\\ \"\\n\\n\\\"Graves bring solemn feelings over the mind,\\\" returned the scout,\ \ a good\\ndeal touched at the calm suffering of his companion; \\\"and they often\ \ aid\\na man in his good intentions; though, for myself, I expect to leave my\\\ nown bones unburied, to bleach in the woods, or to be torn asunder by the\\nwolves.\ \ But where are to be found those of your race who came to their\\nkin in the\ \ Delaware country, so many summers since?\\\"\\n\\n\\\"Where are the blossoms\ \ of those summers!--fallen, one by one: so all of\\nmy family departed, each\ \ in his turn, to the land of spirits. I am on\\nthe hill-top, and must go down\ \ into the valley; and when Uncas follows\\nin my footsteps, there will no longer\ \ be any of the blood of the\\nsagamores, for my boy is the last of the Mohicans.\\\ \"\\n\\n\\\"Uncas is here!\\\" said another voice, in the same soft, guttural\ \ tones,\\nnear his elbow; \\\"who speaks to Uncas?\\\"\\n\\nThe white man loosened\ \ his knife in his leathern sheath, and made an\\ninvoluntary movement of the\ \ hand towards his rifle, at this sudden\\ninterruption; but the Indian sat composed,\ \ and without turning his head\\nat the unexpected sounds.\\n\\nAt the next instant,\ \ a youthful warrior passed between them, with a\\nnoiseless step, and seated\ \ himself on the bank of the rapid stream. 
No\\nexclamation of surprise escaped\ \ the father, nor was any question asked,\\nor reply given, for several minutes;\ \ each appearing to await the moment\\nwhen he might speak, without betraying\ \ womanish curiosity or childish\\nimpatience. The white man seemed to take counsel\ \ from their customs,\\nand, relinquishing his grasp of the rifle, he also remained\ \ silent and\\nreserved. At length Chingachgook turned his eyes slowly towards\ \ his son,\\nand demanded,--\\n\\n\\\"Do the Maquas dare to leave the print of\ \ their moccasins in these\\nwoods?\\\"\\n\\n\\\"I have been on their trail,\\\ \" replied the young Indian, \\\"and know that\\nthey number as many as the fingers\ \ of my two hands; but they lie hid,\\nlike cowards.\\\"\\n\\n\\\"The thieves\ \ are outlying for scalps and plunder!\\\" said the white man,\\nwhom we shall\ \ call Hawkeye, after the manner of his companions. \\\"That\\nbushy Frenchman,\ \ Montcalm, will send his spies into our very camp, but\\nhe will know what road\ \ we travel!\\\"\\n\\n\\\"Tis enough!\\\" returned the father, glancing his eye\ \ towards the setting\\nsun; \\\"they shall be driven like deer from their bushes.\ \ Hawkeye, let us\\neat to-night, and show the Maquas that we are men to-morrow.\\\ \"\\n\\n\\\"I am as ready to do the one as the other; but to fight the Iroquois\\\ n'tis necessary to find the skulkers; and to eat, 'tis necessary to get\\nthe\ \ game--talk of the devil and he will come; there is a pair of the\\nbiggest antlers\ \ I have seen this season, moving the bushes below the\\nhill! Now, Uncas,\\\"\ \ he continued in a half whisper, and laughing with a\\nkind of inward sound,\ \ like one who had learnt to be watchful, \\\"I will\\nbet my charger three times\ \ full of powder, against a foot of wampum,\\nthat I take him atwixt the eyes,\ \ and nearer to the right than to the\\nleft.\\\"\\n\\n\\\"It cannot be!\\\" said\ \ the young Indian, springing to his feet with\\nyouthful eagerness; \\\"all but\ \ the tips of his horns are hid!\\\"\\n\\n\\\"He's a boy!\\\" said the white man,\ \ shaking his head while he spoke, and\\naddressing the father. \\\"Does he think\ \ when a hunter sees a part of the\\ncreatur', he can't tell where the rest of\ \ him should be!\\\"\\n\\n[Illustration: _Copyright by Charles Scribner's Sons_\\\ n\\nUNCAS SLAYS A DEER\\n\\n_Avoiding the horns of the infuriated animal, Uncas\ \ darted to his side,\\nand passed his knife across the throat_]\\n\\nAdjusting\ \ his rifle, he was about to make an exhibition of that skill,\\non which he so\ \ much valued himself, when the warrior struck up the piece\\nwith his hand, saying--\\\ n\\n\\\"Hawkeye! will you fight the Maquas?\\\"\\n\\n\\\"These Indians know the\ \ nature of the woods, as it might be by\\ninstinct!\\\" returned the scout, dropping\ \ his rifle, and turning away like\\na man who was convinced of his error. \\\"\ I must leave the buck to your\\narrow, Uncas, or we may kill a deer for them thieves,\ \ the Iroquois, to\\neat.\\\"\\n\\nThe instant the father seconded this intimation\ \ by an expressive gesture\\nof the hand, Uncas threw himself on the ground, and\ \ approached the\\nanimal with wary movements. When within a few yards of the\ \ cover, he\\nfitted an arrow to his bow with the utmost care, while the antlers\\\ nmoved, as if their owner snuffed an enemy in the tainted air. 
In another\\nmoment\ \ the twang of the cord was heard, a white streak was seen glancing\\ninto the\ \ bushes, and the wounded buck plunged from the cover, to the\\nvery feet of his\ \ hidden enemy. Avoiding the horns of the infuriated\\nanimal, Uncas darted to\ \ his side, and passed his knife across the\\nthroat, when bounding to the edge\ \ of the river it fell, dyeing the\\nwaters with its blood.\\n\\n\\\"'Twas done\ \ with Indian skill,\\\" said the scout, laughing inwardly, but\\nwith vast satisfaction;\ \ \\\"and 'twas a pretty sight to behold! Though an\\narrow is a near shot, and\ \ needs a knife to finish the work.\\\"\\n\\n\\\"Hugh!\\\" ejaculated his companion,\ \ turning quickly, like a hound who\\nscented game.\\n\\n\\\"By the Lord, there\ \ is a drove of them!\\\" exclaimed the scout, whose eyes\\nbegan to glisten with\ \ the ardor of his usual occupation; \\\"if they come\\nwithin range of a bullet\ \ I will drop one, though the whole Six Nations\\nshould be lurking within sound!\ \ What do you hear, Chingachgook? for to\\nmy ears the woods are dumb.\\\"\\n\\\ n\\\"There is but one deer, and he is dead,\\\" said the Indian, bending his\\\ nbody till his ear nearly touched the earth. \\\"I hear the sounds of feet!\\\"\ \\n\\n\\\"Perhaps the wolves have driven the buck to shelter, and are following\\\ non his trail.\\\"\\n\\n\\\"No. The horses of white men are coming!\\\" returned\ \ the other, raising\\nhimself with dignity, and resuming his seat on the log\ \ with his former\\ncomposure. \\\"Hawkeye, they are your brothers; speak to them.\\\ \"\\n\\n\\\"That will I, and in English that the king needn't be ashamed to\\\ nanswer,\\\" returned the hunter, speaking in the language of which he\\nboasted;\ \ \\\"but I see nothing, nor do I hear the sounds of man or beast;\\n'tis strange\ \ that an Indian should understand white sounds better than a\\nman who, his very\ \ enemies will own, has no cross in his blood, although\\nhe may have lived with\ \ the redskins long enough to be suspected! Ha!\\nthere goes something like the\ \ cracking of a dry stick, too--now I hear\\nthe bushes move--yes, yes, there\ \ is a trampling that I mistook for the\\nfalls--and--but here they come themselves;\ \ God keep them from the\\nIroquois!\\\"\\n\\n\\n\\n\",\n \"output\": \"\ In another part of the forest by the river a few miles to the west, Hawkeye and\ \ Chingachgook appear to be waiting for someone as they talk with low voices.\ \ It is now afternoon. The Indian and the scout are attired according to their\ \ forest habits: Chingachgook with his semi-nude, war-painted body and scalping\ \ tuft of hair, his tomahawk, scalping knife, and short rifle; Hawkeye with his\ \ hunting shirt, skin cap, buckskin leggings, knife, pouch and horn, and long\ \ rifle. They discuss their respective forefathers, and Chingachgook relates the\ \ slow demise of his tribe of Mohicans so that only he and his son Uncas now remain.\ \ At the mention of his name, Uncas, a youthful warrior dressed much like Hawkeye,\ \ appears and says that he has been on the trail of the Maquas, another name for\ \ the Mengwe or Iroquois, their natural enemies. The antlers of a deer are seen\ \ in the distance, and Hawkeye is about to shoot the animal for food when the\ \ warrior warns him that a shot will warn the enemy. 
Just as Uncas kills it with\ \ an arrow, they hear the sounds of feet which Chingachgook recognizes as the\ \ horses of white men.\"\n },\n \"truncated_cells\": []\n }\n]" - "HUB_DATASET_PREVIEW: DATASET_NAME: \"tner/conll2003\"\nFEATURES: {'tokens': {'feature':\ \ {'dtype': 'string', '_type': 'Value'}, '_type': 'Sequence'}, 'tags': {'feature':\ \ {'dtype': 'int32', '_type': 'Value'}, '_type': 'Sequence'}}\nDATA SAMPLE:\n\ [\n {\n \"row_idx\": 0,\n \"row\": {\n \"tokens\": [\n \"EU\"\ ,\n \"rejects\",\n \"German\",\n \"call\",\n \"to\"\ ,\n \"boycott\",\n \"British\",\n \"lamb\",\n \".\"\ \n ],\n \"tags\": [\n 1,\n 0,\n 2,\n 0,\n\ \ 0,\n 0,\n 2,\n 0,\n 0\n ]\n },\n\ \ \"truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n \"row\": {\n\ \ \"tokens\": [\n \"Peter\",\n \"Blackburn\"\n ],\n \ \ \"tags\": [\n 3,\n 4\n ]\n },\n \"truncated_cells\"\ : []\n }\n]" - source_sentence: 'USER_QUERY: text to sql dataset' sentences: - "HUB_DATASET_PREVIEW: DATASET_NAME: \"InfiniFlow/text2sql\"\nFEATURES: {'text':\ \ {'dtype': 'string', '_type': 'Value'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\"\ : 0,\n \"row\": {\n \"text\": \"\"\n },\n \"truncated_cells\": []\n\ \ },\n {\n \"row_idx\": 1,\n \"row\": {\n \"text\": \"### \\u7528\\\ u6237\\u8868\\uff08users\\uff09\"\n },\n \"truncated_cells\": []\n }\n]" - "HUB_DATASET_PREVIEW: DATASET_NAME: \"andresgtn/celeb-identities\"\nFEATURES:\ \ {'image': {'_type': 'Image'}, 'label': {'names': ['Brad_Pitt', 'Donald_Trump',\ \ 'Emma_Stone', 'Jessica_Alba', 'Johnny_Depp', 'Julia_Roberts'], '_type': 'ClassLabel'}}\n\ DATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\": {\n \"image\": {\n\ \ \"src\": \"https://datasets-server.huggingface.co/assets/andresgtn/celeb-identities/--/default/train/0/image/image.jpg?Expires=1726591617&Signature=q--3qiojJDrvuSWWGLgJWrS4c6npTB56vauiMf7TqH8cYxEoUIBTGWHpn2d38sz9duxhXFgmmlkGIk042lszNSMnkVK9Y3vQJI9-FhXjRpZzWlS-PY-e-7ly7fssmssEy0NSinNQ-z8hg2fhs1T1N1iHH9-vyr1B1QWqJYVcBw7ccoDAEr6nlTzQeHKqyEoXEachNGgABSHqIErpBz4aaCP-af~jUAiojqDluy55H4d8mZ6xfs9dgt5WCLTJ0mDkSRiHdDKQ4RA12R6JAk0zOj19Ldhp6wXXrf9jFuSvzKEU7ElwE8qN~MSak9sTM81ngizmx42Y~Fgx270MlRQPLQ__&Key-Pair-Id=K3EI6M078Z3AC3\"\ ,\n \"height\": 240,\n \"width\": 165\n },\n \"label\"\ : 0\n },\n \"truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n \ \ \"row\": {\n \"image\": {\n \"src\": \"https://datasets-server.huggingface.co/assets/andresgtn/celeb-identities/--/default/train/1/image/image.jpg?Expires=1726591617&Signature=sJR70D9XoXJtYCKDloI2SvXrMHeapO5og240B4WuNMO8Mr-q3-9ZunPQX22-fa0QkVRdy9R4NQoAto34KGwJGfn3sDZL-YBQboROs1OMwuYBhtNh1~1SgBKKhuhww-QQce9Z7DD4MwGy8j1HCdLOJmkvFiBbd-B~w6kdTOBbekCJPJmrr1zGz~cXkg7zzpnKpBcScK8XA0Y9ESNkKVl~4Q~RTl839vo93NqlKoWW2gmVCM0d5BFn3~mZm9HHWj1bOPerssRcYLSwwC1iOB5fmK-Y6e~fRWMnrnq94N3O20S-uYher6Q7wtssANteZGCKIJVBULAb3oRU0o~NN1UhsQ__&Key-Pair-Id=K3EI6M078Z3AC3\"\ ,\n \"height\": 240,\n \"width\": 165\n },\n \"label\"\ : 0\n },\n \"truncated_cells\": []\n }\n]" - "NEGATIVE: DATASET_NAME: \"lamini/spider_text_to_sql\"\nFEATURES: {'input': {'dtype':\ \ 'string', '_type': 'Value'}, 'output': {'dtype': 'string', '_type': 'Value'}}\n\ DATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\": {\n \"input\": \"\ [INST] Here is a database schema:\\ndepartment :\\nDepartment_ID [ INT ] primary_key\\\ nName [ TEXT ]\\nCreation [ TEXT ]\\nRanking [ INT ]\\nBudget_in_Billions [ INT\ \ ]\\nNum_Employees [ INT ]\\n\\nhead :\\nhead_ID [ INT ] primary_key\\nname [\ \ TEXT ]\\nborn_state [ TEXT ]\\nage [ INT ]\\n\\nmanagement :\\ndepartment_ID\ \ [ INT ] primary_key management.department_ID = 
department.Department_ID\\nhead_ID\ \ [ INT ] management.head_ID = head.head_ID\\ntemporary_acting [ TEXT ]\\n\\nPlease\ \ write me a SQL statement that answers the following question: How many heads\ \ of the departments are older than 56 ? [/INST]\",\n \"output\": \"SELECT\ \ count(*) FROM head WHERE age > 56;\"\n },\n \"truncated_cells\": []\n\ \ },\n {\n \"row_idx\": 1,\n \"row\": {\n \"input\": \"[INST] Here\ \ is a database schema:\\ndepartment :\\nDepartment_ID [ INT ] primary_key\\nName\ \ [ TEXT ]\\nCreation [ TEXT ]\\nRanking [ INT ]\\nBudget_in_Billions [ INT ]\\\ nNum_Employees [ INT ]\\n\\nhead :\\nhead_ID [ INT ] primary_key\\nname [ TEXT\ \ ]\\nborn_state [ TEXT ]\\nage [ INT ]\\n\\nmanagement :\\ndepartment_ID [ INT\ \ ] primary_key management.department_ID = department.Department_ID\\nhead_ID\ \ [ INT ] management.head_ID = head.head_ID\\ntemporary_acting [ TEXT ]\\n\\nPlease\ \ write me a SQL statement that answers the following question: List the name,\ \ born state and age of the heads of departments ordered by age. [/INST]\",\n\ \ \"output\": \"SELECT name , born_state , age FROM head ORDER BY age;\"\ \n },\n \"truncated_cells\": []\n }\n]" - source_sentence: 'USER_QUERY: multimodal conversation dataset' sentences: - "HUB_DATASET_PREVIEW: DATASET_NAME: \"BUAADreamer/llava-en-zh-2k\"\nFEATURES:\ \ {'messages': [{'role': {'dtype': 'string', '_type': 'Value'}, 'content': {'dtype':\ \ 'string', '_type': 'Value'}}], 'images': {'feature': {'_type': 'Image'}, '_type':\ \ 'Sequence'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\": {\n \ \ \"messages\": [\n {\n \"role\": \"user\",\n \"content\"\ : \"<image>How many baseball players are visible in the image?\"\n },\n\ \ {\n \"role\": \"assistant\",\n \"content\": \"There\ \ are three baseball players visible in the image.\"\n },\n {\n\ \ \"role\": \"user\",\n \"content\": \"What are the players\ \ holding in their hands?\"\n },\n {\n \"role\": \"assistant\"\ ,\n \"content\": \"The players are holding baseball bats in their hands.\"\ \n },\n {\n \"role\": \"user\",\n \"content\"\ : \"Are the players in a dugout?\"\n },\n {\n \"role\"\ : \"assistant\",\n \"content\": \"Yes, the three baseball players are\ \ standing in the dugout.\"\n },\n {\n \"role\": \"user\"\ ,\n \"content\": \"Is the image in color or black and white?\"\n \ \ },\n {\n \"role\": \"assistant\",\n \"content\"\ : \"The image is an old black and white photo of the three baseball players.\"\ \n },\n {\n \"role\": \"user\",\n \"content\"\ : \"Do the players belong to a specific baseball team?\"\n },\n \ \ {\n \"role\": \"assistant\",\n \"content\": \"Yes, the players\ \ belong to the Boston Red Sox baseball team.\"\n }\n ],\n \"\ images\": [\n {\n \"src\": \"https://datasets-server.huggingface.co/assets/BUAADreamer/llava-en-zh-2k/--/fba994c834822bddd3cd79e929c33135f4289d2b/--/en/train/0/images/image-1d100e9.jpg?Expires=1726591851&Signature=QHGD147HyWamORfSjz0QoG51Ru86g3STPBNDAEOLK7NTq8Y~b4vVt3u~XF9njlRWwNnVF7AQ8-l9f2pCWxggPnZw1wZEfBAC5Q1oOW2CwT-gMYME~I-9qeJrbQtkszer9U0-H5rkECK0DVgWKKIagyjjJBmSSLz2QAb3BlV5aEMGEbJj-4G1X2xzoVaMIfg~ogGSb6~2PAKoLoKMPUZIxrm~2LoPEGVJeXr7JIa8-ISVukshW8ZEDQII~eXfOBeEsTd1Mk5BpJz26dH3W7X2rdhq54KeQ8H9OBt~HlU0XbXigFABeMymb-GBH-fDNHUDmb9Zt3io5hBMOd7I64DLyQ__&Key-Pair-Id=K3EI6M078Z3AC3\"\ ,\n \"height\": 308,\n \"width\": 384\n }\n ]\n\ \ },\n \"truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n \"row\"\ : {\n \"messages\": [\n {\n \"role\": \"user\",\n \ \ \"content\": \"What is the man wearing in the image?<image>\"\n },\n\ \ {\n \"role\": 
\"assistant\",\n \"content\": \"The man\ \ in the image is wearing chaps, a cowboy hat, and is dressed like a cowboy.\"\ \n },\n {\n \"role\": \"user\",\n \"content\"\ : \"What is the cowboy doing in the image?\"\n },\n {\n \ \ \"role\": \"assistant\",\n \"content\": \"The cowboy is riding on\ \ a horse in a grassy field and pointing into the distance, likely directing attention\ \ towards a specific location or object.\"\n },\n {\n \"\ role\": \"user\",\n \"content\": \"What kind of landscape surrounds the\ \ cowboy and his horse?\"\n },\n {\n \"role\": \"assistant\"\ ,\n \"content\": \"The cowboy and his horse are surrounded by a grassy\ \ field, which creates a natural and open setting for the scene.\"\n },\n\ \ {\n \"role\": \"user\",\n \"content\": \"What is the\ \ significance of the cowboy attire and its relation to horse riding?\"\n \ \ },\n {\n \"role\": \"assistant\",\n \"content\"\ : \"Cowboy attire is deeply rooted in the history, culture, and traditions of\ \ the American West. It is both functional and symbolic, representing the hardworking\ \ and rugged lifestyle of cowboys who spent much of their time working with livestock\ \ and riding horses.\\n\\nThe key elements of cowboy attire, like the cowboy hat\ \ and chaps, serve specific purposes:\\n1. Cowboy Hat: A wide-brimmed hat made\ \ of stiffened felt or straw is designed to protect the cowboy from the sun, wind,\ \ and rain. The hat's distinctive high crown allows for air circulation and provides\ \ a touch of style.\\n2. Chaps: These are sturdy leather leg coverings worn over\ \ pants to protect the cowboy's legs from rough terrain, thorny bushes, and other\ \ elements while riding or working with livestock. Chaps provide an additional\ \ layer of protection and durability, particularly while riding through rugged\ \ landscapes and herding cattle.\\n\\nIn summary, the cowboy attire seen in the\ \ image is not just a fashion statement but is deeply connected to the history\ \ and traditions of the American West. 
The functional pieces of clothing, like\ \ the cowboy hat and chaps, are designed to protect and support the cowboy during\ \ horse riding and working with livestock.\"\n }\n ],\n \"images\"\ : [\n {\n \"src\": \"https://datasets-server.huggingface.co/assets/BUAADreamer/llava-en-zh-2k/--/fba994c834822bddd3cd79e929c33135f4289d2b/--/en/train/1/images/image-1d100e9.jpg?Expires=1726591851&Signature=WyNDGZXVbzPOU9iOQSDPFt1MizgmdT-KqdVAG8nIVSK0Gg8OO-qmhKxgIVjyWMHnWyNbW5svuMoukPMyv9hiHMsNh0YmzdjMR9Gwb6mRvsisEAdaLl71Q053MYxEqkZWCB6PbXG5yEazHL4RHvDphsUEhZS-0Yk8Kzx0HHc12HNaJfiO4fO4IPkY3eLw5xLgNoKIcvvO9TDo0JEbc1ej6YkxGUdqXyVrG2Y4zYnhrCM0drgKVzq24cQ9YZ78HW5f-EsXsftbj0ZzEg4SKcuVgrqaKG8SJ~i0aV-OtkXiTCWxW16D4hfsmpXZShZAHesa1EOGprkYdtQG4Kfte12maQ__&Key-Pair-Id=K3EI6M078Z3AC3\"\ ,\n \"height\": 288,\n \"width\": 384\n }\n ]\n\ \ },\n \"truncated_cells\": []\n }\n]" - "HUB_DATASET_PREVIEW: DATASET_NAME: \"suolyer/eprstmt\"\nFEATURES: {'input': {'dtype':\ \ 'string', '_type': 'Value'}, 'output': {'dtype': 'string', '_type': 'Value'},\ \ 'choice': {'feature': {'dtype': 'string', '_type': 'Value'}, '_type': 'Sequence'},\ \ 'label': {'dtype': 'int64', '_type': 'Value'}, 'id': {'dtype': 'int64', '_type':\ \ 'Value'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\": {\n \ \ \"input\": \"\\u7ed9\\u51fa\\u5546\\u54c1\\u7684\\u8bc4\\u8bba\\u6587\\u672c\\\ u53ca\\u5176\\u6781\\u6027\\uff08\\u6b63\\u9762\\u6216\\u8d1f\\u9762\\uff09\\\ u3002\\u5982\\u679c\\u7ed9\\u5b9a\\u7684\\u53e5\\u5b50\\u53ca\\u5176\\u6781\\\ u6027\\u5339\\u914d\\uff0c\\u5219\\u751f\\u6210\\u7b54\\u6848\\u201c\\u6b63\\\ u9762\\u201d\\uff0c\\u5426\\u5219\\u751f\\u6210\\u7b54\\u6848\\u201c\\u8d1f\\\ u9762\\u201d\\u3002\\u5475\\u5475\\u4e86 \\u8fd9\\u7269\\u6d41\\u901f\\u5ea6\\\ u4e5f\\u662f\\u6ca1\\u8c01\\u4e86 \\u540c\\u57ce\\u7f51\\u8d2d\\u7adf\\u7136\\\ u4e09\\u5929\\u4e86\\u8fd8\\u4e0d\\u5230\",\n \"output\": \"\\u8d1f\\u9762\"\ ,\n \"choice\": [\n \"\\u8d1f\\u9762\",\n \"\\u6b63\\u9762\"\ \n ],\n \"label\": 0,\n \"id\": 0\n },\n \"truncated_cells\"\ : []\n },\n {\n \"row_idx\": 1,\n \"row\": {\n \"input\": \"\\u7ed9\\\ u51fa\\u5546\\u54c1\\u7684\\u8bc4\\u8bba\\u6587\\u672c\\u53ca\\u5176\\u6781\\\ u6027\\uff08\\u6b63\\u9762\\u6216\\u8d1f\\u9762\\uff09\\u3002\\u5982\\u679c\\\ u7ed9\\u5b9a\\u7684\\u53e5\\u5b50\\u53ca\\u5176\\u6781\\u6027\\u5339\\u914d\\\ uff0c\\u5219\\u751f\\u6210\\u7b54\\u6848\\u201c\\u6b63\\u9762\\u201d\\uff0c\\\ u5426\\u5219\\u751f\\u6210\\u7b54\\u6848\\u201c\\u8d1f\\u9762\\u201d\\u3002\\\ u8fd8\\u4e0d\\u9519\\uff0c\\u7b49\\u8bd5\\u7528\\u4e00\\u6bb5\\u65f6\\u95f4\\\ u518d\\u8bf4\",\n \"output\": \"\\u6b63\\u9762\",\n \"choice\": [\n\ \ \"\\u8d1f\\u9762\",\n \"\\u6b63\\u9762\"\n ],\n \"label\"\ : 1,\n \"id\": 0\n },\n \"truncated_cells\": []\n }\n]" - "NEGATIVE: DATASET_NAME: \"passing2961/photochat_plus\"\nFEATURES: {'photo_description':\ \ {'dtype': 'string', '_type': 'Value'}, 'trigger_sentences': {'feature': {'dtype':\ \ 'string', '_type': 'Value'}, '_type': 'Sequence'}, 'dialogue_id': {'dtype':\ \ 'int64', '_type': 'Value'}, 'photo_url': {'dtype': 'string', '_type': 'Value'},\ \ 'dialogue': [{'message': {'dtype': 'string', '_type': 'Value'}, 'share_photo':\ \ {'dtype': 'bool', '_type': 'Value'}, 'user_id': {'dtype': 'int64', '_type':\ \ 'Value'}}], 'image_descriptions': {'feature': {'dtype': 'string', '_type': 'Value'},\ \ '_type': 'Sequence'}, 'intents': {'feature': {'dtype': 'string', '_type': 'Value'},\ \ '_type': 'Sequence'}, 'salient_information': {'feature': {'dtype': 'string',\ \ '_type': 'Value'}, '_type': 
'Sequence'}, 'photo_id': {'dtype': 'string', '_type':\ \ 'Value'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\": {\n \ \ \"photo_description\": \"The photo has your brother Kannon. Objects in the photo:\ \ Man\",\n \"trigger_sentences\": [\n \"How is Kannon doing?\"\n \ \ ],\n \"dialogue_id\": 500,\n \"photo_url\": \"https://farm6.staticflickr.com/151/369716968_bde7e83418_o.jpg\"\ ,\n \"dialogue\": [\n {\n \"message\": \"Hello, how have\ \ you been, dear friend?\",\n \"share_photo\": false,\n \"user_id\"\ : 1\n },\n {\n \"message\": \"Great!\",\n \"share_photo\"\ : false,\n \"user_id\": 0\n },\n {\n \"message\"\ : \"Thanks for asking\",\n \"share_photo\": false,\n \"user_id\"\ : 0\n },\n {\n \"message\": \"And how have you been?\"\ ,\n \"share_photo\": false,\n \"user_id\": 0\n },\n \ \ {\n \"message\": \"It seems like we haven't talked in forever\"\ ,\n \"share_photo\": false,\n \"user_id\": 0\n },\n \ \ {\n \"message\": \"I have been doing well, keeping busy, spent\ \ a lot of time outdoors. What have you been up to?\",\n \"share_photo\"\ : false,\n \"user_id\": 1\n },\n {\n \"message\"\ : \"Last night my brother Kannon did a poetry reading\",\n \"share_photo\"\ : false,\n \"user_id\": 0\n },\n {\n \"message\"\ : \"Really? How did it go? You know how much I love poetry.\",\n \"share_photo\"\ : false,\n \"user_id\": 1\n },\n {\n \"message\"\ : \"It went really well\",\n \"share_photo\": false,\n \"user_id\"\ : 0\n },\n {\n \"message\": \"Do you remember my brother\ \ Kannon?\",\n \"share_photo\": false,\n \"user_id\": 0\n \ \ },\n {\n \"message\": \"Absolutely! How could I forget,\ \ he left quite an impression\",\n \"share_photo\": false,\n \ \ \"user_id\": 1\n },\n {\n \"message\": \"How is Kannon\ \ doing?\",\n \"share_photo\": false,\n \"user_id\": 1\n \ \ },\n {\n \"message\": \"\",\n \"share_photo\":\ \ true,\n \"user_id\": 0\n },\n {\n \"message\"\ : \"Great\",\n \"share_photo\": false,\n \"user_id\": 0\n \ \ },\n {\n \"message\": \"Here is a photo from last night\"\ ,\n \"share_photo\": false,\n \"user_id\": 0\n },\n \ \ {\n \"message\": \"Wow, he seems so confident in that pic! Wish\ \ that I could have been there.\",\n \"share_photo\": false,\n \ \ \"user_id\": 1\n }\n ],\n \"image_descriptions\": [\n \ \ \"A photo of Kannon\",\n \"A picture of Kannon.\",\n \"a\ \ photo of recent situation\"\n ],\n \"intents\": [\n \"Information\ \ Dissemination\",\n \"Social Bonding\"\n ],\n \"salient_information\"\ : [\n \"poetry\",\n \"How is Kannon doing?\",\n \"Kannon\ \ doing\"\n ],\n \"photo_id\": \"train/19e8f436d4b2fc25\"\n },\n\ \ \"truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n \"row\": {\n\ \ \"photo_description\": \"The photo has your uncle Kieran. Objects in the\ \ photo: Clothing, Man\",\n \"trigger_sentences\": [\n \"guess what\ \ new animal he got?\",\n \"He's always had goats and chickens, but guess\ \ what new animal he got?\"\n ],\n \"dialogue_id\": 501,\n \"photo_url\"\ : \"https://farm8.staticflickr.com/53/189664134_f70fc8947a_o.jpg\",\n \"\ dialogue\": [\n {\n \"message\": \"Hey! 
You remember my uncle\ \ who owns the hobby farm, right?\",\n \"share_photo\": false,\n \ \ \"user_id\": 0\n },\n {\n \"message\": \"Yeah i\ \ do\",\n \"share_photo\": false,\n \"user_id\": 1\n \ \ },\n {\n \"message\": \"Uncle Keiran?\",\n \"share_photo\"\ : false,\n \"user_id\": 0\n },\n {\n \"message\"\ : \"How about him?\",\n \"share_photo\": false,\n \"user_id\"\ : 1\n },\n {\n \"message\": \"He's always had goats and\ \ chickens, but guess what new animal he got?\",\n \"share_photo\": false,\n\ \ \"user_id\": 0\n },\n {\n \"message\": \"Dog?\"\ ,\n \"share_photo\": false,\n \"user_id\": 1\n },\n \ \ {\n \"message\": \"Nope, a wild hog!\",\n \"share_photo\"\ : false,\n \"user_id\": 0\n },\n {\n \"message\"\ : \"And not the motorcycle kind ;)\",\n \"share_photo\": false,\n \ \ \"user_id\": 0\n },\n {\n \"message\": \"\",\n\ \ \"share_photo\": true,\n \"user_id\": 0\n },\n \ \ {\n \"message\": \"Wow\",\n \"share_photo\": false,\n \ \ \"user_id\": 1\n }\n ],\n \"image_descriptions\": [\n\ \ \"A photo of the hog's appearance.\",\n \"a photo of wild hog\"\ ,\n \"An image of the new wild hog\"\n ],\n \"intents\": [\n\ \ \"Social Bonding\",\n \"Visual Clarification\"\n ],\n \ \ \"salient_information\": [\n \"hog\",\n \"not the motorcycle\ \ kind\",\n \"wild hog\",\n \"a wild hog\"\n ],\n \"photo_id\"\ : \"train/07d688f5e2142b87\"\n },\n \"truncated_cells\": []\n }\n]" - source_sentence: 'USER_QUERY: kotlin code dataset' sentences: - "HUB_DATASET_PREVIEW: DATASET_NAME: \"DucHaiten/Classic-Anime\"\nFEATURES: {'image':\ \ {'_type': 'Image'}}\nDATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\"\ : {\n \"image\": {\n \"src\": \"https://datasets-server.huggingface.co/assets/DucHaiten/Classic-Anime/--/8b5b48b361fc115087d3e909f5756f83691dd215/--/default/train/0/image/image.jpg?Expires=1726591575&Signature=s8HUsrjKzPR82e4Z2ivQvcFiaQPhhRtKOhAeOAQv2J667GZW65fWMTXre6-aFpEQUB4m01SIA~Dqn~pDM07eXZhMTWg53y-bg-2ZzqdROTWriUSHNMCF~O1LO9PLJ29Hv6NrHuiCWYZGiB62Xz3442Xp4JbkdoyWH~GjuJuxfF~knZ7TiUvcxv5eBqXFTHYkl4x1isTsv25xhRIfOac0u0zsVG8lO228oYDeSYVqkWyZobB6udMtYo8K4YebHWWaNPKrblmoTW3fbBzllbwxHoH2afSEui~Gy0CHeAerrnlAH7c9f4bG5e~qGx6IgNQSH-hZHXFaEmmIkcLNPd8NCA__&Key-Pair-Id=K3EI6M078Z3AC3\"\ ,\n \"height\": 1080,\n \"width\": 1920\n }\n },\n \"\ truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n \"row\": {\n \"\ image\": {\n \"src\": \"https://datasets-server.huggingface.co/assets/DucHaiten/Classic-Anime/--/8b5b48b361fc115087d3e909f5756f83691dd215/--/default/train/1/image/image.jpg?Expires=1726591575&Signature=I~oedYR11PUVCBujVEn--etHf8sNa8JSR0GaRvZx5qDXXSKIpcPOb3haWO3vtiIVuE-FOxMl-9G4HIQ4v4EbvQDUBimqZytYMD5h86vGxLJYcp9BOeeK6gVjw0b6YGA5z6UmzuJ6Zq4K5GYNjG6C9PjFnr0nFDPAys69Um4z~toHQiPM37S3ilBO9UOk1eKmRge75~-ZEkfOPAsk7PG1Eny2qoLaz7ADmjF-Sm-fXqcBhjLpzhHvMqfq~4Grvq7SY2CUVM-amU0a5Jz6Hul62WhPbtYm8rLqkSVFsj8FK5Mk1UG2PscSUjoMEPVPL6d8T9htkeC8Yj1axBnkHKJXww__&Key-Pair-Id=K3EI6M078Z3AC3\"\ ,\n \"height\": 1080,\n \"width\": 1440\n }\n },\n \"\ truncated_cells\": []\n }\n]" - "NEGATIVE: DATASET_NAME: \"vikp/starcoder_cleaned\"\nFEATURES: {'code': {'dtype':\ \ 'string', '_type': 'Value'}, 'repo_path': {'dtype': 'string', '_type': 'Value'}}\n\ DATA SAMPLE:\n[\n {\n \"row_idx\": 0,\n \"row\": {\n \"code\": \"\ # ---\\n# jupyter:\\n# jupytext:\\n# text_representation:\\n# extension:\ \ .py\\n# format_name: light\\n# format_version: '1.5'\\n# jupytext_version:\ \ 1.14.4\\n# kernelspec:\\n# display_name: Python 3\\n# language: python\\\ n# name: python3\\n# ---\\n\\n# # 09 Strain Gage\\n#\\n# This is one of the\ \ most commonly used sensor. 
It is used in many transducers. Its fundamental\ \ operating principle is fairly easy to understand and it will be the purpose\ \ of this lecture. \\n#\\n# A strain gage is essentially a thin wire that is wrapped\ \ on film of plastic. \\n# <img src=\\\"img/StrainGage.png\\\" width=\\\"200\\\ \">\\n# The strain gage is then mounted (glued) on the part for which the strain\ \ must be measured. \\n# <img src=\\\"img/Strain_gauge_2.jpg\\\" width=\\\"200\\\ \">\\n#\\n# ## Stress, Strain\\n# When a beam is under axial load, the axial stress,\ \ $\\\\sigma_a$, is defined as:\\n# \\\\begin{align*}\\n# \\\\sigma_a = \\\\frac{F}{A}\\\ n# \\\\end{align*}\\n# with $F$ the axial load, and $A$ the cross sectional area\ \ of the beam under axial load.\\n#\\n# <img src=\\\"img/BeamUnderStrain.png\\\ \" width=\\\"200\\\">\\n#\\n# Under the load, the beam of length $L$ will extend\ \ by $dL$, giving rise to the definition of strain, $\\\\epsilon_a$:\\n# \\\\\ begin{align*}\\n# \\\\epsilon_a = \\\\frac{dL}{L}\\n# \\\\end{align*}\\n# The\ \ beam will also contract laterally: the cross sectional area is reduced by $dA$.\ \ This results in a transverval strain $\\\\epsilon_t$. The transversal and\ \ axial strains are related by the Poisson's ratio:\\n# \\\\begin{align*}\\n#\ \ \\\\nu = - \\\\frac{\\\\epsilon_t }{\\\\epsilon_a}\\n# \\\\end{align*}\\n# For\ \ a metal the Poission's ratio is typically $\\\\nu = 0.3$, for an incompressible\ \ material, such as rubber (or water), $\\\\nu = 0.5$.\\n#\\n# Within the elastic\ \ limit, the axial stress and axial strain are related through Hooke's law by\ \ the Young's modulus, $E$:\\n# \\\\begin{align*}\\n# \\\\sigma_a = E \\\\epsilon_a\\\ n# \\\\end{align*}\\n#\\n# <img src=\\\"img/ElasticRegime.png\\\" width=\\\"200\\\ \">\\n\\n# ## Resistance of a wire\\n#\\n# The electrical resistance of a wire\ \ $R$ is related to its physical properties (the electrical resistiviy, $\\\\\ rho$ in $\\\\Omega$/m) and its geometry: length $L$ and cross sectional area $A$.\\\ n#\\n# \\\\begin{align*}\\n# R = \\\\frac{\\\\rho L}{A}\\n# \\\\end{align*}\\\ n#\\n# Mathematically, the change in wire dimension will result inchange in its\ \ electrical resistance. This can be derived from first principle:\\n# \\\\begin{align}\\\ n# \\\\frac{dR}{R} = \\\\frac{d\\\\rho}{\\\\rho} + \\\\frac{dL}{L} - \\\\frac{dA}{A}\\\ n# \\\\end{align}\\n# If the wire has a square cross section, then:\\n# \\\\begin{align*}\\\ n# A & = L'^2 \\\\\\\\\\n# \\\\frac{dA}{A} & = \\\\frac{d(L'^2)}{L'^2} = \\\\\ frac{2L'dL'}{L'^2} = 2 \\\\frac{dL'}{L'}\\n# \\\\end{align*}\\n# We have related\ \ the change in cross sectional area to the transversal strain.\\n# \\\\begin{align*}\\\ n# \\\\epsilon_t = \\\\frac{dL'}{L'}\\n# \\\\end{align*}\\n# Using the Poisson's\ \ ratio, we can relate then relate the change in cross-sectional area ($dA/A$)\ \ to axial strain $\\\\epsilon_a = dL/L$.\\n# \\\\begin{align*}\\n# \\\\epsilon_t\ \ &= - \\\\nu \\\\epsilon_a \\\\\\\\\\n# \\\\frac{dL'}{L'} &= - \\\\nu \\\\frac{dL}{L}\ \ \\\\; \\\\text{or}\\\\\\\\\\n# \\\\frac{dA}{A} & = 2\\\\frac{dL'}{L'} = -2 \\\ \\nu \\\\frac{dL}{L}\\n# \\\\end{align*}\\n# Finally we can substitute express\ \ $dA/A$ in eq. 
for $dR/R$ and relate change in resistance to change of wire geometry,\ \ remembering that for a metal $\\\\nu =0.3$:\\n# \\\\begin{align}\\n# \\\\frac{dR}{R}\ \ & = \\\\frac{d\\\\rho}{\\\\rho} + \\\\frac{dL}{L} - \\\\frac{dA}{A} \\\\\\\\\ \\n# & = \\\\frac{d\\\\rho}{\\\\rho} + \\\\frac{dL}{L} - (-2\\\\nu \\\\frac{dL}{L})\ \ \\\\\\\\\\n# & = \\\\frac{d\\\\rho}{\\\\rho} + 1.6 \\\\frac{dL}{L} = \\\\frac{d\\\ \\rho}{\\\\rho} + 1.6 \\\\epsilon_a\\n# \\\\end{align}\\n# It also happens that\ \ for most metals, the resistivity increases with axial strain. In general, one\ \ can then related the change in resistance to axial strain by defining the strain\ \ gage factor:\\n# \\\\begin{align}\\n# S = 1.6 + \\\\frac{d\\\\rho}{\\\\rho}\\\ \\cdot \\\\frac{1}{\\\\epsilon_a}\\n# \\\\end{align}\\n# and finally, we have:\\\ n# \\\\begin{align*}\\n# \\\\frac{dR}{R} = S \\\\epsilon_a\\n# \\\\end{align*}\\\ n# $S$ is materials dependent and is typically equal to 2.0 for most commercially\ \ availabe strain gages. It is dimensionless.\\n#\\n# Strain gages are made of\ \ thin wire that is wraped in several loops, effectively increasing the length\ \ of the wire and therefore the sensitivity of the sensor.\\n#\\n# _Question:\\\ n#\\n# Explain why a longer wire is necessary to increase the sensitivity of the\ \ sensor_.\\n#\\n# Most commercially available strain gages have a nominal resistance\ \ (resistance under no load, $R_{ini}$) of 120 or 350 $\\\\Omega$.\\n#\\n# Within\ \ the elastic regime, strain is typically within the range $10^{-6} - 10^{-3}$,\ \ in fact strain is expressed in unit of microstrain, with a 1 microstrain = $10^{-6}$.\ \ Therefore, changes in resistances will be of the same order. If one were to\ \ measure resistances, we will need a dynamic range of 120 dB, whih is typically\ \ very expensive. Instead, one uses the Wheatstone bridge to transform the change\ \ in resistance to a voltage, which is easier to measure and does not require\ \ such a large dynamic range.\\n\\n# ## Wheatstone bridge:\\n# <img src=\\\"img/WheatstoneBridge.png\\\ \" width=\\\"200\\\">\\n#\\n# The output voltage is related to the difference\ \ in resistances in the bridge:\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} =\ \ \\\\frac{R_1R_3-R_2R_4}{(R_1+R_4)(R_2+R_3)}\\n# \\\\end{align*}\\n#\\n# If the\ \ bridge is balanced, then $V_o = 0$, it implies: $R_1/R_2 = R_4/R_3$.\\n#\\n#\ \ In practice, finding a set of resistors that balances the bridge is challenging,\ \ and a potentiometer is used as one of the resistances to do minor adjustement\ \ to balance the bridge. If one did not do the adjustement (ie if we did not\ \ zero the bridge) then all the measurement will have an offset or bias that could\ \ be removed in a post-processing phase, as long as the bias stayed constant.\\\ n#\\n# If each resistance $R_i$ is made to vary slightly around its initial value,\ \ ie $R_i = R_{i,ini} + dR_i$. For simplicity, we will assume that the initial\ \ value of the four resistances are equal, ie $R_{1,ini} = R_{2,ini} = R_{3,ini}\ \ = R_{4,ini} = R_{ini}$. 
This implies that the bridge was initially balanced,\ \ then the output voltage would be:\\n#\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s}\ \ = \\\\frac{1}{4} \\\\left( \\\\frac{dR_1}{R_{ini}} - \\\\frac{dR_2}{R_{ini}}\ \ + \\\\frac{dR_3}{R_{ini}} - \\\\frac{dR_4}{R_{ini}} \\\\right)\\n# \\\\end{align*}\\\ n#\\n# Note here that the changes in $R_1$ and $R_3$ have a positive effect on\ \ $V_o$, while the changes in $R_2$ and $R_4$ have a negative effect on $V_o$.\ \ In practice, this means that is a beam is a in tension, then a strain gage\ \ mounted on the branch 1 or 3 of the Wheatstone bridge will produce a positive\ \ voltage, while a strain gage mounted on branch 2 or 4 will produce a negative\ \ voltage. One takes advantage of this to increase sensitivity to measure strain.\\\ n#\\n# ### Quarter bridge\\n# One uses only one quarter of the bridge, ie strain\ \ gages are only mounted on one branch of the bridge.\\n#\\n# \\\\begin{align*}\\\ n# \\\\frac{V_o}{V_s} = \\\\pm \\\\frac{1}{4} \\\\epsilon_a S\\n# \\\\end{align*}\\\ n# Sensitivity, $G$:\\n# \\\\begin{align*}\\n# G = \\\\frac{V_o}{\\\\epsilon_a}\ \ = \\\\pm \\\\frac{1}{4}S V_s\\n# \\\\end{align*}\\n#\\n#\\n# ### Half bridge\\\ n# One uses half of the bridge, ie strain gages are mounted on two branches of\ \ the bridge.\\n#\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} = \\\\pm \\\\frac{1}{2}\ \ \\\\epsilon_a S\\n# \\\\end{align*}\\n#\\n# ### Full bridge\\n#\\n# One uses\ \ of the branches of the bridge, ie strain gages are mounted on each branch.\\\ n#\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} = \\\\pm \\\\epsilon_a S\\n# \\\ \\end{align*}\\n#\\n# Therefore, as we increase the order of bridge, the sensitivity\ \ of the instrument increases. However, one should be carefull how we mount the\ \ strain gages as to not cancel out their measurement.\\n\\n# _Exercise_\\n#\\\ n# 1- Wheatstone bridge\\n#\\n# <img src=\\\"img/WheatstoneBridge.png\\\" width=\\\ \"200\\\">\\n#\\n# > How important is it to know \\\\& match the resistances of\ \ the resistors you employ to create your bridge?\\n# > How would you do that\ \ practically?\\n# > Assume $R_1=120\\\\,\\\\Omega$, $R_2=120\\\\,\\\\Omega$,\ \ $R_3=120\\\\,\\\\Omega$, $R_4=110\\\\,\\\\Omega$, $V_s=5.00\\\\,\\\\text{V}$.\ \ What is $V_\\\\circ$?\\n\\nVs = 5.00\\nVo = (120**2-120*110)/(230*240) * Vs\\\ nprint('Vo = ',Vo, ' V')\\n\\n# typical range in strain a strain gauge can measure\\\ n# 1 -1000 micro-Strain\\nAxialStrain = 1000*10**(-6) # axial strain\\nStrainGageFactor\ \ = 2\\nR_ini = 120 # Ohm\\nR_1 = R_ini+R_ini*StrainGageFactor*AxialStrain\\nprint(R_1)\\\ nVo = (120**2-120*(R_1))/((120+R_1)*240) * Vs\\nprint('Vo = ', Vo, ' V')\\n\\\ n# > How important is it to know \\\\& match the resistances of the resistors\ \ you employ to create your bridge?\\n# > How would you do that practically?\\\ n# > Assume $R_1= R_2 =R_3=120\\\\,\\\\Omega$, $R_4=120.01\\\\,\\\\Omega$, $V_s=5.00\\\ \\,\\\\text{V}$. What is $V_\\\\circ$?\\n\\nVs = 5.00\\nVo = (120**2-120*120.01)/(240.01*240)\ \ * Vs\\nprint(Vo)\\n\\n# 2- Strain gage 1:\\n#\\n# One measures the strain on\ \ a bridge steel beam. The modulus of elasticity is $E=190$ GPa. Only one strain\ \ gage is mounted on the bottom of the beam; the strain gage factor is $S=2.02$.\\\ n#\\n# > a) What kind of electronic circuit will you use? 
Draw a sketch of it.\\\ n#\\n# > b) Assume all your resistors including the unloaded strain gage are balanced\ \ and measure $120\\\\,\\\\Omega$, and that the strain gage is at location $R_2$.\ \ The supply voltage is $5.00\\\\,\\\\text{VDC}$. Will $V_\\\\circ$ be positive\ \ or negative when a downward load is added?\\n\\n# In practice, we cannot have\ \ all resistances = 120 $\\\\Omega$. at zero load, the bridge will be unbalanced\ \ (show $V_o \\\\neq 0$). How could we balance our bridge?\\n#\\n# Use a potentiometer\ \ to balance bridge, for the load cell, we ''zero'' the instrument.\\n#\\n# Other\ \ option to zero-out our instrument? Take data at zero-load, record the voltage,\ \ $V_{o,noload}$. Substract $V_{o,noload}$ to my data.\\n\\n# > c) For a loading\ \ in which $V_\\\\circ = -1.25\\\\,\\\\text{mV}$, calculate the strain $\\\\epsilon_a$\ \ in units of microstrain.\\n\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} & =\ \ - \\\\frac{1}{4} \\\\epsilon_a S\\\\\\\\\\n# \\\\epsilon_a & = -\\\\frac{4}{S}\ \ \\\\frac{V_o}{V_s}\\n# \\\\end{align*}\\n\\nS = 2.02\\nVo = -0.00125\\nVs =\ \ 5\\neps_a = -1*(4/S)*(Vo/Vs)\\nprint(eps_a)\\n\\n# > d) Calculate the axial\ \ stress (in MPa) in the beam under this load.\\n\\n\\n\\n# > e) You now want\ \ more sensitivity in your measurement, you install a second strain gage on to\\\ n\\n# p of the beam. Which resistor should you use for this second active strain\ \ gage?\\n#\\n# > f) With this new setup and the same applied load than previously,\ \ what should be the output voltage?\\n\\n# 3- Strain Gage with Long Lead Wires\ \ \\n#\\n# <img src=\\\"img/StrainGageLongWires.png\\\" width=\\\"360\\\">\\n#\\\ n# A quarter bridge strain gage Wheatstone bridge circuit is constructed with\ \ $120\\\\,\\\\Omega$ resistors and a $120\\\\,\\\\Omega$ strain gage. For this\ \ practical application, the strain gage is located very far away form the DAQ\ \ station and the lead wires to the strain gage are $10\\\\,\\\\text{m}$ long\ \ and the lead wire have a resistance of $0.080\\\\,\\\\Omega/\\\\text{m}$. The\ \ lead wire resistance can lead to problems since $R_{lead}$ changes with temperature.\\\ n#\\n# > Design a modified circuit that will cancel out the effect of the lead\ \ wires.\\n\\n# ## Homework\\n#\\n\",\n \"repo_path\": \"Lectures/09_StrainGage.ipynb\"\ \n },\n \"truncated_cells\": []\n },\n {\n \"row_idx\": 1,\n \"\ row\": {\n \"code\": \"# ---\\n# jupyter:\\n# jupytext:\\n# split_at_heading:\ \ true\\n# text_representation:\\n# extension: .py\\n# format_name:\ \ light\\n# format_version: '1.5'\\n# jupytext_version: 1.14.4\\n#\ \ kernelspec:\\n# display_name: Python 3\\n# language: python\\n# \ \ name: python3\\n# ---\\n\\n#export\\nfrom fastai.basics import *\\nfrom fastai.tabular.core\ \ import *\\nfrom fastai.tabular.model import *\\n\\nfrom fastai.tabular.data\ \ import *\\n\\n#hide\\nfrom nbdev.showdoc import *\\n\\n\\n# +\\n#default_exp\ \ tabular.learner\\n# -\\n\\n# # Tabular learner\\n#\\n# > The function to immediately\ \ get a `Learner` ready to train for tabular data\\n\\n# The main function you\ \ probably want to use in this module is `tabular_learner`. 
It will automatically\ \ create a `TabulaModel` suitable for your data and infer the irght loss function.\ \ See the [tabular tutorial](http://docs.fast.ai/tutorial.tabular) for an example\ \ of use in context.\\n\\n# ## Main functions\\n\\n#export\\n@log_args(but_as=Learner.__init__)\\\ nclass TabularLearner(Learner):\\n \\\"`Learner` for tabular data\\\"\\n \ \ def predict(self, row):\\n tst_to = self.dls.valid_ds.new(pd.DataFrame(row).T)\\\ n tst_to.process()\\n tst_to.conts = tst_to.conts.astype(np.float32)\\\ n dl = self.dls.valid.new(tst_to)\\n inp,preds,_,dec_preds = self.get_preds(dl=dl,\ \ with_input=True, with_decoded=True)\\n i = getattr(self.dls, 'n_inp',\ \ -1)\\n b = (*tuplify(inp),*tuplify(dec_preds))\\n full_dec = self.dls.decode((*tuplify(inp),*tuplify(dec_preds)))\\\ n return full_dec,dec_preds[0],preds[0]\\n\\n\\nshow_doc(TabularLearner,\ \ title_level=3)\\n\\n\\n# It works exactly as a normal `Learner`, the only difference\ \ is that it implements a `predict` method specific to work on a row of data.\\\ n\\n#export\\n@log_args(to_return=True, but_as=Learner.__init__)\\n@delegates(Learner.__init__)\\\ ndef tabular_learner(dls, layers=None, emb_szs=None, config=None, n_out=None,\ \ y_range=None, **kwargs):\\n \\\"Get a `Learner` using `dls`, with `metrics`,\ \ including a `TabularModel` created using the remaining params.\\\"\\n if\ \ config is None: config = tabular_config()\\n if layers is None: layers =\ \ [200,100]\\n to = dls.train_ds\\n emb_szs = get_emb_sz(dls.train_ds, {}\ \ if emb_szs is None else emb_szs)\\n if n_out is None: n_out = get_c(dls)\\\ n assert n_out, \\\"`n_out` is not defined, and could not be infered from data,\ \ set `dls.c` or pass `n_out`\\\"\\n if y_range is None and 'y_range' in config:\ \ y_range = config.pop('y_range')\\n model = TabularModel(emb_szs, len(dls.cont_names),\ \ n_out, layers, y_range=y_range, **config)\\n return TabularLearner(dls, model,\ \ **kwargs)\\n\\n\\n# If your data was built with fastai, you probably won't need\ \ to pass anything to `emb_szs` unless you want to change the default of the library\ \ (produced by `get_emb_sz`), same for `n_out` which should be automatically inferred.\ \ `layers` will default to `[200,100]` and is passed to `TabularModel` along with\ \ the `config`.\\n#\\n# Use `tabular_config` to create a `config` and cusotmize\ \ the model used. 
There is just easy access to `y_range` because this argument\ \ is often used.\\n#\\n# All the other arguments are passed to `Learner`.\\n\\\ npath = untar_data(URLs.ADULT_SAMPLE)\\ndf = pd.read_csv(path/'adult.csv')\\ncat_names\ \ = ['workclass', 'education', 'marital-status', 'occupation', 'relationship',\ \ 'race']\\ncont_names = ['age', 'fnlwgt', 'education-num']\\nprocs = [Categorify,\ \ FillMissing, Normalize]\\ndls = TabularDataLoaders.from_df(df, path, procs=procs,\ \ cat_names=cat_names, cont_names=cont_names, \\n \ \ y_names=\\\"salary\\\", valid_idx=list(range(800,1000)), bs=64)\\nlearn\ \ = tabular_learner(dls)\\n\\n#hide\\ntst = learn.predict(df.iloc[0])\\n\\n# +\\\ n#hide\\n#test y_range is passed\\nlearn = tabular_learner(dls, y_range=(0,32))\\\ nassert isinstance(learn.model.layers[-1], SigmoidRange)\\ntest_eq(learn.model.layers[-1].low,\ \ 0)\\ntest_eq(learn.model.layers[-1].high, 32)\\n\\nlearn = tabular_learner(dls,\ \ config = tabular_config(y_range=(0,32)))\\nassert isinstance(learn.model.layers[-1],\ \ SigmoidRange)\\ntest_eq(learn.model.layers[-1].low, 0)\\ntest_eq(learn.model.layers[-1].high,\ \ 32)\\n\\n\\n# -\\n\\n#export\\n@typedispatch\\ndef show_results(x:Tabular, y:Tabular,\ \ samples, outs, ctxs=None, max_n=10, **kwargs):\\n df = x.all_cols[:max_n]\\\ n for n in x.y_names: df[n+'_pred'] = y[n][:max_n].values\\n display_df(df)\\\ n\\n\\n# ## Export -\\n\\n#hide\\nfrom nbdev.export import notebook2script\\nnotebook2script()\\\ n\\n\\n\",\n \"repo_path\": \"nbs/43_tabular.learner.ipynb\"\n },\n \ \ \"truncated_cells\": []\n }\n]" - "HUB_DATASET_PREVIEW: DATASET_NAME: \"mvasiliniuc/iva-kotlin-codeint\"\nFEATURES:\ \ {'repo_name': {'dtype': 'string', '_type': 'Value'}, 'path': {'dtype': 'string',\ \ '_type': 'Value'}, 'copies': {'dtype': 'string', '_type': 'Value'}, 'size':\ \ {'dtype': 'string', '_type': 'Value'}, 'content': {'dtype': 'string', '_type':\ \ 'Value'}, 'license': {'dtype': 'string', '_type': 'Value'}}\nDATA SAMPLE:\n\ [\n {\n \"row_idx\": 0,\n \"row\": {\n \"repo_name\": \"Cognifide/gradle-aem-plugin\"\ ,\n \"path\": \"src/main/kotlin/com/cognifide/gradle/aem/instance/tasks/InstanceReload.kt\"\ ,\n \"copies\": \"1\",\n \"size\": \"1052\",\n \"content\": \"\ package com.cognifide.gradle.aem.instance.tasks\\n\\nimport com.cognifide.gradle.aem.common.instance.action.AwaitUpAction\\\ nimport com.cognifide.gradle.aem.common.instance.action.ReloadAction\\nimport\ \ com.cognifide.gradle.aem.common.instance.names\\nimport com.cognifide.gradle.aem.common.tasks.Instance\\\ nimport org.gradle.api.tasks.TaskAction\\n\\nopen class InstanceReload : Instance()\ \ {\\n\\n private var reloadOptions: ReloadAction.() -> Unit = {}\\n\\n \ \ fun reload(options: ReloadAction.() -> Unit) {\\n this.reloadOptions\ \ = options\\n }\\n\\n private var awaitUpOptions: AwaitUpAction.() -> Unit\ \ = {}\\n\\n fun awaitUp(options: AwaitUpAction.() -> Unit) {\\n this.awaitUpOptions\ \ = options\\n }\\n\\n @TaskAction\\n fun reload() {\\n instanceManager.awaitReloaded(anyInstances,\ \ reloadOptions, awaitUpOptions)\\n common.notifier.lifecycle(\\\"Instance(s)\ \ reloaded\\\", \\\"Which: ${anyInstances.names}\\\")\\n }\\n\\n init {\\\ n description = \\\"Reloads all AEM instance(s).\\\"\\n }\\n\\n companion\ \ object {\\n const val NAME = \\\"instanceReload\\\"\\n }\\n}\\n\"\ ,\n \"license\": \"apache-2.0\"\n },\n \"truncated_cells\": []\n },\n\ \ {\n \"row_idx\": 1,\n \"row\": {\n \"repo_name\": \"80998062/Fank\"\ ,\n \"path\": 
\"presentation/src/main/java/com/sinyuk/fanfou/ui/status/StatusView.kt\"\ ,\n \"copies\": \"1\",\n \"size\": \"8490\",\n \"content\": \"\ /*\\n *\\n * * Apache License\\n * *\\n * * Copyright [2017] Sinyuk\\n * *\\\ n * * Licensed under the Apache License, Version 2.0 (the \\\"License\\\");\\\ n * * you may not use this file except in compliance with the License.\\n * \ \ * You may obtain a copy of the License at\\n * *\\n * * http://www.apache.org/licenses/LICENSE-2.0\\\ n * *\\n * * Unless required by applicable law or agreed to in writing, software\\\ n * * distributed under the License is distributed on an \\\"AS IS\\\" BASIS,\\\ n * * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\\\ n * * See the License for the specific language governing permissions and\\n\ \ * * limitations under the License.\\n *\\n */\\n\\npackage com.sinyuk.fanfou.ui.status\\\ n\\nimport android.os.Build\\nimport android.os.Bundle\\nimport android.support.v4.app.Fragment\\\ nimport android.support.v4.app.FragmentPagerAdapter\\nimport android.text.Editable\\\ nimport android.text.TextWatcher\\nimport android.view.View\\nimport android.view.ViewTreeObserver\\\ nimport cn.dreamtobe.kpswitch.util.KeyboardUtil\\nimport com.linkedin.android.spyglass.suggestions.SuggestionsResult\\\ nimport com.linkedin.android.spyglass.suggestions.interfaces.Suggestible\\nimport\ \ com.linkedin.android.spyglass.suggestions.interfaces.SuggestionsResultListener\\\ nimport com.linkedin.android.spyglass.suggestions.interfaces.SuggestionsVisibilityManager\\\ nimport com.linkedin.android.spyglass.tokenization.QueryToken\\nimport com.linkedin.android.spyglass.tokenization.impl.WordTokenizer\\\ nimport com.linkedin.android.spyglass.tokenization.impl.WordTokenizerConfig\\\ nimport com.linkedin.android.spyglass.tokenization.interfaces.QueryTokenReceiver\\\ nimport com.sinyuk.fanfou.R\\nimport com.sinyuk.fanfou.base.AbstractActivity\\\ nimport com.sinyuk.fanfou.base.AbstractFragment\\nimport com.sinyuk.fanfou.di.Injectable\\\ nimport com.sinyuk.fanfou.domain.DO.Player\\nimport com.sinyuk.fanfou.domain.DO.Status\\\ nimport com.sinyuk.fanfou.domain.STATUS_LIMIT\\nimport com.sinyuk.fanfou.domain.StatusCreation\\\ nimport com.sinyuk.fanfou.domain.TIMELINE_CONTEXT\\nimport com.sinyuk.fanfou.ui.editor.EditorView\\\ nimport com.sinyuk.fanfou.ui.editor.MentionListView\\nimport com.sinyuk.fanfou.ui.timeline.TimelineView\\\ nimport com.sinyuk.fanfou.util.obtainViewModelFromActivity\\nimport com.sinyuk.fanfou.viewmodel.FanfouViewModelFactory\\\ nimport com.sinyuk.fanfou.viewmodel.PlayerViewModel\\nimport kotlinx.android.synthetic.main.status_view.*\\\ nimport kotlinx.android.synthetic.main.status_view_footer.*\\nimport kotlinx.android.synthetic.main.status_view_reply_actionbar.*\\\ nimport javax.inject.Inject\\n\\n\\n/**\\n * Created by sinyuk on 2018/1/12.\\\ n *\\n */\\nclass StatusView : AbstractFragment(), Injectable, QueryTokenReceiver,\ \ SuggestionsResultListener, SuggestionsVisibilityManager {\\n\\n companion\ \ object {\\n fun newInstance(status: Status, photoExtra: Bundle? 
= null)\ \ = StatusView().apply {\\n arguments = Bundle().apply {\\n \ \ putParcelable(\\\"status\\\", status)\\n putBundle(\\\ \"photoExtra\\\", photoExtra)\\n }\\n }\\n }\\n\\n override\ \ fun layoutId() = R.layout.status_view\\n\\n @Inject\\n lateinit var factory:\ \ FanfouViewModelFactory\\n\\n private val playerViewModel by lazy { obtainViewModelFromActivity(factory,\ \ PlayerViewModel::class.java) }\\n\\n override fun onEnterAnimationEnd(savedInstanceState:\ \ Bundle?) {\\n super.onEnterAnimationEnd(savedInstanceState)\\n \ \ navBack.setOnClickListener { onBackPressedSupport() }\\n setupEditor()\\\ n setupKeyboard()\\n onTextChanged(0)\\n setupViewPager()\\\ n\\n val status = arguments!!.getParcelable<Status>(\\\"status\\\")\\n\ \ fullscreenButton.setOnClickListener {\\n (activity as AbstractActivity).start(EditorView.newInstance(status.id,\\\ n replyEt.mentionsText,\\n StatusCreation.REPOST_STATUS))\\\ n replyEt.text = null\\n }\\n }\\n\\n private fun setupViewPager()\ \ {\\n val status = arguments!!.getParcelable<Status>(\\\"status\\\")\\\ n val bundle = arguments!!.getBundle(\\\"photoExtra\\\")\\n val\ \ fragments: List<Fragment> = if (findChildFragment(TimelineView::class.java)\ \ == null) {\\n val mentionView = MentionListView()\\n mentionView.onItemClickListener\ \ = onSuggestionSelectListener\\n mutableListOf(TimelineView.contextTimeline(TIMELINE_CONTEXT,\ \ status, bundle), mentionView)\\n } else {\\n mutableListOf(findChildFragment(TimelineView::class.java),\ \ MentionListView())\\n }\\n\\n viewPager.setPagingEnabled(false)\\\ n viewPager.offscreenPageLimit = 1\\n viewPager.adapter = object\ \ : FragmentPagerAdapter(childFragmentManager) {\\n override fun getItem(position:\ \ Int) = fragments[position]\\n\\n override fun getCount() = fragments.size\\\ n }\\n }\\n\\n private var keyboardListener: ViewTreeObserver.OnGlobalLayoutListener?\ \ = null\\n\\n private fun setupKeyboard() {\\n keyboardListener = KeyboardUtil.attach(activity,\ \ panelRoot, {\\n // TODO: how comes the Exception: panelRootContainer\ \ must not be null\\n panelRootContainer?.visibility =\\n \ \ if (it) {\\n if (replyEt.requestFocus()) replyEt.setSelection(replyEt.text.length)\\\ n View.VISIBLE\\n } else {\\n \ \ replyEt.clearFocus()\\n View.GONE\\\ n }\\n })\\n }\\n\\n private val config = WordTokenizerConfig.Builder()\\\ n .setExplicitChars(\\\"@\\\")\\n .setThreshold(3)\\n \ \ .setMaxNumKeywords(5)\\n .setWordBreakChars(\\\" \\\").build()\\\ n\\n private fun setupEditor() {\\n replyEt.tokenizer = WordTokenizer(config)\\\ n replyEt.setAvoidPrefixOnTap(true)\\n replyEt.setQueryTokenReceiver(this)\\\ n replyEt.setSuggestionsVisibilityManager(this)\\n replyEt.setAvoidPrefixOnTap(true)\\\ n\\n replyCommitButton.setOnClickListener { }\\n\\n if (Build.VERSION.SDK_INT\ \ >= Build.VERSION_CODES.O)\\n textCountProgress.min = 0\\n \ \ textCountProgress.max = STATUS_LIMIT\\n replyEt.addTextChangedListener(object\ \ : TextWatcher {\\n override fun afterTextChanged(s: Editable?) 
{\\\ n onTextChanged(s?.length ?: 0)\\n }\\n\\n \ \ override fun beforeTextChanged(s: CharSequence?, start: Int, count: Int, after:\ \ Int) {\\n\\n }\\n\\n override fun onTextChanged(s: CharSequence?,\ \ start: Int, before: Int, count: Int) {\\n\\n }\\n })\\n \ \ }\\n\\n\\n /**\\n * @param count \\u5b57\\u6570\\n */\\n private\ \ fun onTextChanged(count: Int) {\\n textCountProgress.progress = count\\\ n replyCommitButton.isEnabled = count in 1..STATUS_LIMIT\\n }\\n\\n\\\ n private val onSuggestionSelectListener = object : MentionListView.OnItemClickListener\ \ {\\n override fun onItemClick(position: Int, item: Suggestible) {\\n\ \ (item as Player).let {\\n replyEt.insertMention(it)\\\ n displaySuggestions(false)\\n playerViewModel.updateMentionedAt(it)\ \ //\\n onTextChanged(replyEt.text.length)\\n replyEt.requestFocus()\\\ n replyEt.setSelection(replyEt.text.length)\\n }\\n\ \ }\\n }\\n\\n @Suppress(\\\"PrivatePropertyName\\\")\\n private\ \ val BUCKET = \\\"player-mentioned\\\"\\n\\n override fun onQueryReceived(queryToken:\ \ QueryToken): MutableList<String> {\\n val data = playerViewModel.filter(queryToken.keywords)\\\ n onReceiveSuggestionsResult(SuggestionsResult(queryToken, data), BUCKET)\\\ n return arrayOf(BUCKET).toMutableList()\\n }\\n\\n override fun\ \ onReceiveSuggestionsResult(result: SuggestionsResult, bucket: String) {\\n \ \ val data = result.suggestions\\n if (data?.isEmpty() != false)\ \ return\\n displaySuggestions(true)\\n findChildFragment(MentionListView::class.java).setData(data)\\\ n }\\n\\n override fun displaySuggestions(display: Boolean) {\\n \ \ viewPager.setCurrentItem(if (display) 1 else 0, true)\\n }\\n\\n override\ \ fun isDisplayingSuggestions() = viewPager.currentItem == 1\\n\\n override\ \ fun onBackPressedSupport(): Boolean {\\n when {\\n panelRootContainer.visibility\ \ == View.VISIBLE -> KeyboardUtil.hideKeyboard(panelRootContainer)\\n \ \ isDisplayingSuggestions -> displaySuggestions(false)\\n else ->\ \ pop()\\n }\\n return true\\n\\n }\\n\\n override fun onDestroy()\ \ {\\n keyboardListener?.let { KeyboardUtil.detach(activity, it) }\\n \ \ activity?.currentFocus?.let { KeyboardUtil.hideKeyboard(it) }\\n \ \ super.onDestroy()\\n }\\n\\n}\",\n \"license\": \"mit\"\n },\n\ \ \"truncated_cells\": []\n }\n]" model-index: - name: Alibaba-NLP/gte-base-en-v1.5 trained on query-to-dataset-viewer-descriptions results: - task: type: triplet name: Triplet dataset: name: Unknown type: unknown metrics: - type: cosine_accuracy value: 1.0 name: Cosine Accuracy - type: dot_accuracy value: 0.0 name: Dot Accuracy - type: manhattan_accuracy value: 1.0 name: Manhattan Accuracy - type: euclidean_accuracy value: 1.0 name: Euclidean Accuracy - type: max_accuracy value: 1.0 name: Max Accuracy --- # Alibaba-NLP/gte-base-en-v1.5 trained on query-to-dataset-viewer-descriptions This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) on the [query-to-dataset-viewer-descriptions](https://huggingface.co/datasets/davanstrien/query-to-dataset-viewer-descriptions) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
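As a quick preview of the retrieval use case the model was trained for (matching user queries to dataset-viewer previews), a minimal sketch is shown below. The model id mirrors the placeholder used in the Usage section further down, and the two candidate strings are illustrative stand-ins rather than real previews.

```python
from sentence_transformers import SentenceTransformer

# Placeholder id, matching the usage example below; replace with the
# full Hub repo id if loading remotely.
model = SentenceTransformer("query-to-dataset-viewer-descriptions")

# The training data prefixes queries with "USER_QUERY:" and dataset
# previews with "HUB_DATASET_PREVIEW:", so inputs should follow the
# same convention.
query = "USER_QUERY: kotlin code dataset"
candidates = [  # illustrative stand-ins for real dataset-viewer previews
    'HUB_DATASET_PREVIEW: DATASET_NAME: "mvasiliniuc/iva-kotlin-codeint" ...',
    'HUB_DATASET_PREVIEW: DATASET_NAME: "vikp/starcoder_cleaned" ...',
]

query_embedding = model.encode(query)
candidate_embeddings = model.encode(candidates)

# Cosine similarity (the model's configured similarity function).
scores = model.similarity(query_embedding, candidate_embeddings)  # shape [1, 2]
best = scores.argmax().item()
print(f"Best match: {candidates[best][:60]}...")
```

The `USER_QUERY:` and `HUB_DATASET_PREVIEW:` prefixes follow the convention visible in the training samples quoted in the Usage section.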
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) <!-- at revision a8e4f3e0ee719c75bc30d12b8eae0f8440502718 --> - **Maximum Sequence Length:** 8192 tokens - **Output Dimensionality:** 768 tokens - **Similarity Function:** Cosine Similarity - **Training Dataset:** - [query-to-dataset-viewer-descriptions](https://huggingface.co/datasets/davanstrien/query-to-dataset-viewer-descriptions) - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("query-to-dataset-viewer-descriptions") # Run inference sentences = [ 'USER_QUERY: kotlin code dataset', 'HUB_DATASET_PREVIEW: DATASET_NAME: "mvasiliniuc/iva-kotlin-codeint"\nFEATURES: {\'repo_name\': {\'dtype\': \'string\', \'_type\': \'Value\'}, \'path\': {\'dtype\': \'string\', \'_type\': \'Value\'}, \'copies\': {\'dtype\': \'string\', \'_type\': \'Value\'}, \'size\': {\'dtype\': \'string\', \'_type\': \'Value\'}, \'content\': {\'dtype\': \'string\', \'_type\': \'Value\'}, \'license\': {\'dtype\': \'string\', \'_type\': \'Value\'}}\nDATA SAMPLE:\n[\n {\n "row_idx": 0,\n "row": {\n "repo_name": "Cognifide/gradle-aem-plugin",\n "path": "src/main/kotlin/com/cognifide/gradle/aem/instance/tasks/InstanceReload.kt",\n "copies": "1",\n "size": "1052",\n "content": "package com.cognifide.gradle.aem.instance.tasks\\n\\nimport com.cognifide.gradle.aem.common.instance.action.AwaitUpAction\\nimport com.cognifide.gradle.aem.common.instance.action.ReloadAction\\nimport com.cognifide.gradle.aem.common.instance.names\\nimport com.cognifide.gradle.aem.common.tasks.Instance\\nimport org.gradle.api.tasks.TaskAction\\n\\nopen class InstanceReload : Instance() {\\n\\n private var reloadOptions: ReloadAction.() -> Unit = {}\\n\\n fun reload(options: ReloadAction.() -> Unit) {\\n this.reloadOptions = options\\n }\\n\\n private var awaitUpOptions: AwaitUpAction.() -> Unit = {}\\n\\n fun awaitUp(options: AwaitUpAction.() -> Unit) {\\n this.awaitUpOptions = options\\n }\\n\\n @TaskAction\\n fun reload() {\\n instanceManager.awaitReloaded(anyInstances, reloadOptions, awaitUpOptions)\\n common.notifier.lifecycle(\\"Instance(s) reloaded\\", \\"Which: ${anyInstances.names}\\")\\n }\\n\\n init {\\n description = \\"Reloads all AEM instance(s).\\"\\n }\\n\\n companion object {\\n const val NAME = \\"instanceReload\\"\\n }\\n}\\n",\n "license": "apache-2.0"\n },\n "truncated_cells": []\n },\n {\n 
"row_idx": 1,\n "row": {\n "repo_name": "80998062/Fank",\n "path": "presentation/src/main/java/com/sinyuk/fanfou/ui/status/StatusView.kt",\n "copies": "1",\n "size": "8490",\n "content": "/*\\n *\\n * * Apache License\\n * *\\n * * Copyright [2017] Sinyuk\\n * *\\n * * Licensed under the Apache License, Version 2.0 (the \\"License\\");\\n * * you may not use this file except in compliance with the License.\\n * * You may obtain a copy of the License at\\n * *\\n * * http://www.apache.org/licenses/LICENSE-2.0\\n * *\\n * * Unless required by applicable law or agreed to in writing, software\\n * * distributed under the License is distributed on an \\"AS IS\\" BASIS,\\n * * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\\n * * See the License for the specific language governing permissions and\\n * * limitations under the License.\\n *\\n */\\n\\npackage com.sinyuk.fanfou.ui.status\\n\\nimport android.os.Build\\nimport android.os.Bundle\\nimport android.support.v4.app.Fragment\\nimport android.support.v4.app.FragmentPagerAdapter\\nimport android.text.Editable\\nimport android.text.TextWatcher\\nimport android.view.View\\nimport android.view.ViewTreeObserver\\nimport cn.dreamtobe.kpswitch.util.KeyboardUtil\\nimport com.linkedin.android.spyglass.suggestions.SuggestionsResult\\nimport com.linkedin.android.spyglass.suggestions.interfaces.Suggestible\\nimport com.linkedin.android.spyglass.suggestions.interfaces.SuggestionsResultListener\\nimport com.linkedin.android.spyglass.suggestions.interfaces.SuggestionsVisibilityManager\\nimport com.linkedin.android.spyglass.tokenization.QueryToken\\nimport com.linkedin.android.spyglass.tokenization.impl.WordTokenizer\\nimport com.linkedin.android.spyglass.tokenization.impl.WordTokenizerConfig\\nimport com.linkedin.android.spyglass.tokenization.interfaces.QueryTokenReceiver\\nimport com.sinyuk.fanfou.R\\nimport com.sinyuk.fanfou.base.AbstractActivity\\nimport com.sinyuk.fanfou.base.AbstractFragment\\nimport com.sinyuk.fanfou.di.Injectable\\nimport com.sinyuk.fanfou.domain.DO.Player\\nimport com.sinyuk.fanfou.domain.DO.Status\\nimport com.sinyuk.fanfou.domain.STATUS_LIMIT\\nimport com.sinyuk.fanfou.domain.StatusCreation\\nimport com.sinyuk.fanfou.domain.TIMELINE_CONTEXT\\nimport com.sinyuk.fanfou.ui.editor.EditorView\\nimport com.sinyuk.fanfou.ui.editor.MentionListView\\nimport com.sinyuk.fanfou.ui.timeline.TimelineView\\nimport com.sinyuk.fanfou.util.obtainViewModelFromActivity\\nimport com.sinyuk.fanfou.viewmodel.FanfouViewModelFactory\\nimport com.sinyuk.fanfou.viewmodel.PlayerViewModel\\nimport kotlinx.android.synthetic.main.status_view.*\\nimport kotlinx.android.synthetic.main.status_view_footer.*\\nimport kotlinx.android.synthetic.main.status_view_reply_actionbar.*\\nimport javax.inject.Inject\\n\\n\\n/**\\n * Created by sinyuk on 2018/1/12.\\n *\\n */\\nclass StatusView : AbstractFragment(), Injectable, QueryTokenReceiver, SuggestionsResultListener, SuggestionsVisibilityManager {\\n\\n companion object {\\n fun newInstance(status: Status, photoExtra: Bundle? = null) = StatusView().apply {\\n arguments = Bundle().apply {\\n putParcelable(\\"status\\", status)\\n putBundle(\\"photoExtra\\", photoExtra)\\n }\\n }\\n }\\n\\n override fun layoutId() = R.layout.status_view\\n\\n @Inject\\n lateinit var factory: FanfouViewModelFactory\\n\\n private val playerViewModel by lazy { obtainViewModelFromActivity(factory, PlayerViewModel::class.java) }\\n\\n override fun onEnterAnimationEnd(savedInstanceState: Bundle?) 
{\\n super.onEnterAnimationEnd(savedInstanceState)\\n navBack.setOnClickListener { onBackPressedSupport() }\\n setupEditor()\\n setupKeyboard()\\n onTextChanged(0)\\n setupViewPager()\\n\\n val status = arguments!!.getParcelable<Status>(\\"status\\")\\n fullscreenButton.setOnClickListener {\\n (activity as AbstractActivity).start(EditorView.newInstance(status.id,\\n replyEt.mentionsText,\\n StatusCreation.REPOST_STATUS))\\n replyEt.text = null\\n }\\n }\\n\\n private fun setupViewPager() {\\n val status = arguments!!.getParcelable<Status>(\\"status\\")\\n val bundle = arguments!!.getBundle(\\"photoExtra\\")\\n val fragments: List<Fragment> = if (findChildFragment(TimelineView::class.java) == null) {\\n val mentionView = MentionListView()\\n mentionView.onItemClickListener = onSuggestionSelectListener\\n mutableListOf(TimelineView.contextTimeline(TIMELINE_CONTEXT, status, bundle), mentionView)\\n } else {\\n mutableListOf(findChildFragment(TimelineView::class.java), MentionListView())\\n }\\n\\n viewPager.setPagingEnabled(false)\\n viewPager.offscreenPageLimit = 1\\n viewPager.adapter = object : FragmentPagerAdapter(childFragmentManager) {\\n override fun getItem(position: Int) = fragments[position]\\n\\n override fun getCount() = fragments.size\\n }\\n }\\n\\n private var keyboardListener: ViewTreeObserver.OnGlobalLayoutListener? = null\\n\\n private fun setupKeyboard() {\\n keyboardListener = KeyboardUtil.attach(activity, panelRoot, {\\n // TODO: how comes the Exception: panelRootContainer must not be null\\n panelRootContainer?.visibility =\\n if (it) {\\n if (replyEt.requestFocus()) replyEt.setSelection(replyEt.text.length)\\n View.VISIBLE\\n } else {\\n replyEt.clearFocus()\\n View.GONE\\n }\\n })\\n }\\n\\n private val config = WordTokenizerConfig.Builder()\\n .setExplicitChars(\\"@\\")\\n .setThreshold(3)\\n .setMaxNumKeywords(5)\\n .setWordBreakChars(\\" \\").build()\\n\\n private fun setupEditor() {\\n replyEt.tokenizer = WordTokenizer(config)\\n replyEt.setAvoidPrefixOnTap(true)\\n replyEt.setQueryTokenReceiver(this)\\n replyEt.setSuggestionsVisibilityManager(this)\\n replyEt.setAvoidPrefixOnTap(true)\\n\\n replyCommitButton.setOnClickListener { }\\n\\n if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O)\\n textCountProgress.min = 0\\n textCountProgress.max = STATUS_LIMIT\\n replyEt.addTextChangedListener(object : TextWatcher {\\n override fun afterTextChanged(s: Editable?) 
{\\n onTextChanged(s?.length ?: 0)\\n }\\n\\n override fun beforeTextChanged(s: CharSequence?, start: Int, count: Int, after: Int) {\\n\\n }\\n\\n override fun onTextChanged(s: CharSequence?, start: Int, before: Int, count: Int) {\\n\\n }\\n })\\n }\\n\\n\\n /**\\n * @param count \\u5b57\\u6570\\n */\\n private fun onTextChanged(count: Int) {\\n textCountProgress.progress = count\\n replyCommitButton.isEnabled = count in 1..STATUS_LIMIT\\n }\\n\\n\\n private val onSuggestionSelectListener = object : MentionListView.OnItemClickListener {\\n override fun onItemClick(position: Int, item: Suggestible) {\\n (item as Player).let {\\n replyEt.insertMention(it)\\n displaySuggestions(false)\\n playerViewModel.updateMentionedAt(it) //\\n onTextChanged(replyEt.text.length)\\n replyEt.requestFocus()\\n replyEt.setSelection(replyEt.text.length)\\n }\\n }\\n }\\n\\n @Suppress(\\"PrivatePropertyName\\")\\n private val BUCKET = \\"player-mentioned\\"\\n\\n override fun onQueryReceived(queryToken: QueryToken): MutableList<String> {\\n val data = playerViewModel.filter(queryToken.keywords)\\n onReceiveSuggestionsResult(SuggestionsResult(queryToken, data), BUCKET)\\n return arrayOf(BUCKET).toMutableList()\\n }\\n\\n override fun onReceiveSuggestionsResult(result: SuggestionsResult, bucket: String) {\\n val data = result.suggestions\\n if (data?.isEmpty() != false) return\\n displaySuggestions(true)\\n findChildFragment(MentionListView::class.java).setData(data)\\n }\\n\\n override fun displaySuggestions(display: Boolean) {\\n viewPager.setCurrentItem(if (display) 1 else 0, true)\\n }\\n\\n override fun isDisplayingSuggestions() = viewPager.currentItem == 1\\n\\n override fun onBackPressedSupport(): Boolean {\\n when {\\n panelRootContainer.visibility == View.VISIBLE -> KeyboardUtil.hideKeyboard(panelRootContainer)\\n isDisplayingSuggestions -> displaySuggestions(false)\\n else -> pop()\\n }\\n return true\\n\\n }\\n\\n override fun onDestroy() {\\n keyboardListener?.let { KeyboardUtil.detach(activity, it) }\\n activity?.currentFocus?.let { KeyboardUtil.hideKeyboard(it) }\\n super.onDestroy()\\n }\\n\\n}",\n "license": "mit"\n },\n "truncated_cells": []\n }\n]', 'NEGATIVE: DATASET_NAME: "vikp/starcoder_cleaned"\nFEATURES: {\'code\': {\'dtype\': \'string\', \'_type\': \'Value\'}, \'repo_path\': {\'dtype\': \'string\', \'_type\': \'Value\'}}\nDATA SAMPLE:\n[\n {\n "row_idx": 0,\n "row": {\n "code": "# ---\\n# jupyter:\\n# jupytext:\\n# text_representation:\\n# extension: .py\\n# format_name: light\\n# format_version: \'1.5\'\\n# jupytext_version: 1.14.4\\n# kernelspec:\\n# display_name: Python 3\\n# language: python\\n# name: python3\\n# ---\\n\\n# # 09 Strain Gage\\n#\\n# This is one of the most commonly used sensor. It is used in many transducers. Its fundamental operating principle is fairly easy to understand and it will be the purpose of this lecture. \\n#\\n# A strain gage is essentially a thin wire that is wrapped on film of plastic. \\n# <img src=\\"img/StrainGage.png\\" width=\\"200\\">\\n# The strain gage is then mounted (glued) on the part for which the strain must be measured. 
\\n# <img src=\\"img/Strain_gauge_2.jpg\\" width=\\"200\\">\\n#\\n# ## Stress, Strain\\n# When a beam is under axial load, the axial stress, $\\\\sigma_a$, is defined as:\\n# \\\\begin{align*}\\n# \\\\sigma_a = \\\\frac{F}{A}\\n# \\\\end{align*}\\n# with $F$ the axial load, and $A$ the cross sectional area of the beam under axial load.\\n#\\n# <img src=\\"img/BeamUnderStrain.png\\" width=\\"200\\">\\n#\\n# Under the load, the beam of length $L$ will extend by $dL$, giving rise to the definition of strain, $\\\\epsilon_a$:\\n# \\\\begin{align*}\\n# \\\\epsilon_a = \\\\frac{dL}{L}\\n# \\\\end{align*}\\n# The beam will also contract laterally: the cross sectional area is reduced by $dA$. This results in a transverval strain $\\\\epsilon_t$. The transversal and axial strains are related by the Poisson\'s ratio:\\n# \\\\begin{align*}\\n# \\\\nu = - \\\\frac{\\\\epsilon_t }{\\\\epsilon_a}\\n# \\\\end{align*}\\n# For a metal the Poission\'s ratio is typically $\\\\nu = 0.3$, for an incompressible material, such as rubber (or water), $\\\\nu = 0.5$.\\n#\\n# Within the elastic limit, the axial stress and axial strain are related through Hooke\'s law by the Young\'s modulus, $E$:\\n# \\\\begin{align*}\\n# \\\\sigma_a = E \\\\epsilon_a\\n# \\\\end{align*}\\n#\\n# <img src=\\"img/ElasticRegime.png\\" width=\\"200\\">\\n\\n# ## Resistance of a wire\\n#\\n# The electrical resistance of a wire $R$ is related to its physical properties (the electrical resistiviy, $\\\\rho$ in $\\\\Omega$/m) and its geometry: length $L$ and cross sectional area $A$.\\n#\\n# \\\\begin{align*}\\n# R = \\\\frac{\\\\rho L}{A}\\n# \\\\end{align*}\\n#\\n# Mathematically, the change in wire dimension will result inchange in its electrical resistance. This can be derived from first principle:\\n# \\\\begin{align}\\n# \\\\frac{dR}{R} = \\\\frac{d\\\\rho}{\\\\rho} + \\\\frac{dL}{L} - \\\\frac{dA}{A}\\n# \\\\end{align}\\n# If the wire has a square cross section, then:\\n# \\\\begin{align*}\\n# A & = L\'^2 \\\\\\\\\\n# \\\\frac{dA}{A} & = \\\\frac{d(L\'^2)}{L\'^2} = \\\\frac{2L\'dL\'}{L\'^2} = 2 \\\\frac{dL\'}{L\'}\\n# \\\\end{align*}\\n# We have related the change in cross sectional area to the transversal strain.\\n# \\\\begin{align*}\\n# \\\\epsilon_t = \\\\frac{dL\'}{L\'}\\n# \\\\end{align*}\\n# Using the Poisson\'s ratio, we can relate then relate the change in cross-sectional area ($dA/A$) to axial strain $\\\\epsilon_a = dL/L$.\\n# \\\\begin{align*}\\n# \\\\epsilon_t &= - \\\\nu \\\\epsilon_a \\\\\\\\\\n# \\\\frac{dL\'}{L\'} &= - \\\\nu \\\\frac{dL}{L} \\\\; \\\\text{or}\\\\\\\\\\n# \\\\frac{dA}{A} & = 2\\\\frac{dL\'}{L\'} = -2 \\\\nu \\\\frac{dL}{L}\\n# \\\\end{align*}\\n# Finally we can substitute express $dA/A$ in eq. for $dR/R$ and relate change in resistance to change of wire geometry, remembering that for a metal $\\\\nu =0.3$:\\n# \\\\begin{align}\\n# \\\\frac{dR}{R} & = \\\\frac{d\\\\rho}{\\\\rho} + \\\\frac{dL}{L} - \\\\frac{dA}{A} \\\\\\\\\\n# & = \\\\frac{d\\\\rho}{\\\\rho} + \\\\frac{dL}{L} - (-2\\\\nu \\\\frac{dL}{L}) \\\\\\\\\\n# & = \\\\frac{d\\\\rho}{\\\\rho} + 1.6 \\\\frac{dL}{L} = \\\\frac{d\\\\rho}{\\\\rho} + 1.6 \\\\epsilon_a\\n# \\\\end{align}\\n# It also happens that for most metals, the resistivity increases with axial strain. 
In general, one can then related the change in resistance to axial strain by defining the strain gage factor:\\n# \\\\begin{align}\\n# S = 1.6 + \\\\frac{d\\\\rho}{\\\\rho}\\\\cdot \\\\frac{1}{\\\\epsilon_a}\\n# \\\\end{align}\\n# and finally, we have:\\n# \\\\begin{align*}\\n# \\\\frac{dR}{R} = S \\\\epsilon_a\\n# \\\\end{align*}\\n# $S$ is materials dependent and is typically equal to 2.0 for most commercially availabe strain gages. It is dimensionless.\\n#\\n# Strain gages are made of thin wire that is wraped in several loops, effectively increasing the length of the wire and therefore the sensitivity of the sensor.\\n#\\n# _Question:\\n#\\n# Explain why a longer wire is necessary to increase the sensitivity of the sensor_.\\n#\\n# Most commercially available strain gages have a nominal resistance (resistance under no load, $R_{ini}$) of 120 or 350 $\\\\Omega$.\\n#\\n# Within the elastic regime, strain is typically within the range $10^{-6} - 10^{-3}$, in fact strain is expressed in unit of microstrain, with a 1 microstrain = $10^{-6}$. Therefore, changes in resistances will be of the same order. If one were to measure resistances, we will need a dynamic range of 120 dB, whih is typically very expensive. Instead, one uses the Wheatstone bridge to transform the change in resistance to a voltage, which is easier to measure and does not require such a large dynamic range.\\n\\n# ## Wheatstone bridge:\\n# <img src=\\"img/WheatstoneBridge.png\\" width=\\"200\\">\\n#\\n# The output voltage is related to the difference in resistances in the bridge:\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} = \\\\frac{R_1R_3-R_2R_4}{(R_1+R_4)(R_2+R_3)}\\n# \\\\end{align*}\\n#\\n# If the bridge is balanced, then $V_o = 0$, it implies: $R_1/R_2 = R_4/R_3$.\\n#\\n# In practice, finding a set of resistors that balances the bridge is challenging, and a potentiometer is used as one of the resistances to do minor adjustement to balance the bridge. If one did not do the adjustement (ie if we did not zero the bridge) then all the measurement will have an offset or bias that could be removed in a post-processing phase, as long as the bias stayed constant.\\n#\\n# If each resistance $R_i$ is made to vary slightly around its initial value, ie $R_i = R_{i,ini} + dR_i$. For simplicity, we will assume that the initial value of the four resistances are equal, ie $R_{1,ini} = R_{2,ini} = R_{3,ini} = R_{4,ini} = R_{ini}$. This implies that the bridge was initially balanced, then the output voltage would be:\\n#\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} = \\\\frac{1}{4} \\\\left( \\\\frac{dR_1}{R_{ini}} - \\\\frac{dR_2}{R_{ini}} + \\\\frac{dR_3}{R_{ini}} - \\\\frac{dR_4}{R_{ini}} \\\\right)\\n# \\\\end{align*}\\n#\\n# Note here that the changes in $R_1$ and $R_3$ have a positive effect on $V_o$, while the changes in $R_2$ and $R_4$ have a negative effect on $V_o$. In practice, this means that is a beam is a in tension, then a strain gage mounted on the branch 1 or 3 of the Wheatstone bridge will produce a positive voltage, while a strain gage mounted on branch 2 or 4 will produce a negative voltage. 
One takes advantage of this to increase sensitivity to measure strain.\\n#\\n# ### Quarter bridge\\n# One uses only one quarter of the bridge, ie strain gages are only mounted on one branch of the bridge.\\n#\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} = \\\\pm \\\\frac{1}{4} \\\\epsilon_a S\\n# \\\\end{align*}\\n# Sensitivity, $G$:\\n# \\\\begin{align*}\\n# G = \\\\frac{V_o}{\\\\epsilon_a} = \\\\pm \\\\frac{1}{4}S V_s\\n# \\\\end{align*}\\n#\\n#\\n# ### Half bridge\\n# One uses half of the bridge, ie strain gages are mounted on two branches of the bridge.\\n#\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} = \\\\pm \\\\frac{1}{2} \\\\epsilon_a S\\n# \\\\end{align*}\\n#\\n# ### Full bridge\\n#\\n# One uses of the branches of the bridge, ie strain gages are mounted on each branch.\\n#\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} = \\\\pm \\\\epsilon_a S\\n# \\\\end{align*}\\n#\\n# Therefore, as we increase the order of bridge, the sensitivity of the instrument increases. However, one should be carefull how we mount the strain gages as to not cancel out their measurement.\\n\\n# _Exercise_\\n#\\n# 1- Wheatstone bridge\\n#\\n# <img src=\\"img/WheatstoneBridge.png\\" width=\\"200\\">\\n#\\n# > How important is it to know \\\\& match the resistances of the resistors you employ to create your bridge?\\n# > How would you do that practically?\\n# > Assume $R_1=120\\\\,\\\\Omega$, $R_2=120\\\\,\\\\Omega$, $R_3=120\\\\,\\\\Omega$, $R_4=110\\\\,\\\\Omega$, $V_s=5.00\\\\,\\\\text{V}$. What is $V_\\\\circ$?\\n\\nVs = 5.00\\nVo = (120**2-120*110)/(230*240) * Vs\\nprint(\'Vo = \',Vo, \' V\')\\n\\n# typical range in strain a strain gauge can measure\\n# 1 -1000 micro-Strain\\nAxialStrain = 1000*10**(-6) # axial strain\\nStrainGageFactor = 2\\nR_ini = 120 # Ohm\\nR_1 = R_ini+R_ini*StrainGageFactor*AxialStrain\\nprint(R_1)\\nVo = (120**2-120*(R_1))/((120+R_1)*240) * Vs\\nprint(\'Vo = \', Vo, \' V\')\\n\\n# > How important is it to know \\\\& match the resistances of the resistors you employ to create your bridge?\\n# > How would you do that practically?\\n# > Assume $R_1= R_2 =R_3=120\\\\,\\\\Omega$, $R_4=120.01\\\\,\\\\Omega$, $V_s=5.00\\\\,\\\\text{V}$. What is $V_\\\\circ$?\\n\\nVs = 5.00\\nVo = (120**2-120*120.01)/(240.01*240) * Vs\\nprint(Vo)\\n\\n# 2- Strain gage 1:\\n#\\n# One measures the strain on a bridge steel beam. The modulus of elasticity is $E=190$ GPa. Only one strain gage is mounted on the bottom of the beam; the strain gage factor is $S=2.02$.\\n#\\n# > a) What kind of electronic circuit will you use? Draw a sketch of it.\\n#\\n# > b) Assume all your resistors including the unloaded strain gage are balanced and measure $120\\\\,\\\\Omega$, and that the strain gage is at location $R_2$. The supply voltage is $5.00\\\\,\\\\text{VDC}$. Will $V_\\\\circ$ be positive or negative when a downward load is added?\\n\\n# In practice, we cannot have all resistances = 120 $\\\\Omega$. at zero load, the bridge will be unbalanced (show $V_o \\\\neq 0$). How could we balance our bridge?\\n#\\n# Use a potentiometer to balance bridge, for the load cell, we \'\'zero\'\' the instrument.\\n#\\n# Other option to zero-out our instrument? Take data at zero-load, record the voltage, $V_{o,noload}$. 
Substract $V_{o,noload}$ to my data.\\n\\n# > c) For a loading in which $V_\\\\circ = -1.25\\\\,\\\\text{mV}$, calculate the strain $\\\\epsilon_a$ in units of microstrain.\\n\\n# \\\\begin{align*}\\n# \\\\frac{V_o}{V_s} & = - \\\\frac{1}{4} \\\\epsilon_a S\\\\\\\\\\n# \\\\epsilon_a & = -\\\\frac{4}{S} \\\\frac{V_o}{V_s}\\n# \\\\end{align*}\\n\\nS = 2.02\\nVo = -0.00125\\nVs = 5\\neps_a = -1*(4/S)*(Vo/Vs)\\nprint(eps_a)\\n\\n# > d) Calculate the axial stress (in MPa) in the beam under this load.\\n\\n\\n\\n# > e) You now want more sensitivity in your measurement, you install a second strain gage on to\\n\\n# p of the beam. Which resistor should you use for this second active strain gage?\\n#\\n# > f) With this new setup and the same applied load than previously, what should be the output voltage?\\n\\n# 3- Strain Gage with Long Lead Wires \\n#\\n# <img src=\\"img/StrainGageLongWires.png\\" width=\\"360\\">\\n#\\n# A quarter bridge strain gage Wheatstone bridge circuit is constructed with $120\\\\,\\\\Omega$ resistors and a $120\\\\,\\\\Omega$ strain gage. For this practical application, the strain gage is located very far away form the DAQ station and the lead wires to the strain gage are $10\\\\,\\\\text{m}$ long and the lead wire have a resistance of $0.080\\\\,\\\\Omega/\\\\text{m}$. The lead wire resistance can lead to problems since $R_{lead}$ changes with temperature.\\n#\\n# > Design a modified circuit that will cancel out the effect of the lead wires.\\n\\n# ## Homework\\n#\\n",\n "repo_path": "Lectures/09_StrainGage.ipynb"\n },\n "truncated_cells": []\n },\n {\n "row_idx": 1,\n "row": {\n "code": "# ---\\n# jupyter:\\n# jupytext:\\n# split_at_heading: true\\n# text_representation:\\n# extension: .py\\n# format_name: light\\n# format_version: \'1.5\'\\n# jupytext_version: 1.14.4\\n# kernelspec:\\n# display_name: Python 3\\n# language: python\\n# name: python3\\n# ---\\n\\n#export\\nfrom fastai.basics import *\\nfrom fastai.tabular.core import *\\nfrom fastai.tabular.model import *\\n\\nfrom fastai.tabular.data import *\\n\\n#hide\\nfrom nbdev.showdoc import *\\n\\n\\n# +\\n#default_exp tabular.learner\\n# -\\n\\n# # Tabular learner\\n#\\n# > The function to immediately get a `Learner` ready to train for tabular data\\n\\n# The main function you probably want to use in this module is `tabular_learner`. It will automatically create a `TabulaModel` suitable for your data and infer the irght loss function. 
See the [tabular tutorial](http://docs.fast.ai/tutorial.tabular) for an example of use in context.\\n\\n# ## Main functions\\n\\n#export\\n@log_args(but_as=Learner.__init__)\\nclass TabularLearner(Learner):\\n \\"`Learner` for tabular data\\"\\n def predict(self, row):\\n tst_to = self.dls.valid_ds.new(pd.DataFrame(row).T)\\n tst_to.process()\\n tst_to.conts = tst_to.conts.astype(np.float32)\\n dl = self.dls.valid.new(tst_to)\\n inp,preds,_,dec_preds = self.get_preds(dl=dl, with_input=True, with_decoded=True)\\n i = getattr(self.dls, \'n_inp\', -1)\\n b = (*tuplify(inp),*tuplify(dec_preds))\\n full_dec = self.dls.decode((*tuplify(inp),*tuplify(dec_preds)))\\n return full_dec,dec_preds[0],preds[0]\\n\\n\\nshow_doc(TabularLearner, title_level=3)\\n\\n\\n# It works exactly as a normal `Learner`, the only difference is that it implements a `predict` method specific to work on a row of data.\\n\\n#export\\n@log_args(to_return=True, but_as=Learner.__init__)\\n@delegates(Learner.__init__)\\ndef tabular_learner(dls, layers=None, emb_szs=None, config=None, n_out=None, y_range=None, **kwargs):\\n \\"Get a `Learner` using `dls`, with `metrics`, including a `TabularModel` created using the remaining params.\\"\\n if config is None: config = tabular_config()\\n if layers is None: layers = [200,100]\\n to = dls.train_ds\\n emb_szs = get_emb_sz(dls.train_ds, {} if emb_szs is None else emb_szs)\\n if n_out is None: n_out = get_c(dls)\\n assert n_out, \\"`n_out` is not defined, and could not be infered from data, set `dls.c` or pass `n_out`\\"\\n if y_range is None and \'y_range\' in config: y_range = config.pop(\'y_range\')\\n model = TabularModel(emb_szs, len(dls.cont_names), n_out, layers, y_range=y_range, **config)\\n return TabularLearner(dls, model, **kwargs)\\n\\n\\n# If your data was built with fastai, you probably won\'t need to pass anything to `emb_szs` unless you want to change the default of the library (produced by `get_emb_sz`), same for `n_out` which should be automatically inferred. `layers` will default to `[200,100]` and is passed to `TabularModel` along with the `config`.\\n#\\n# Use `tabular_config` to create a `config` and cusotmize the model used. 
There is just easy access to `y_range` because this argument is often used.\\n#\\n# All the other arguments are passed to `Learner`.\\n\\npath = untar_data(URLs.ADULT_SAMPLE)\\ndf = pd.read_csv(path/\'adult.csv\')\\ncat_names = [\'workclass\', \'education\', \'marital-status\', \'occupation\', \'relationship\', \'race\']\\ncont_names = [\'age\', \'fnlwgt\', \'education-num\']\\nprocs = [Categorify, FillMissing, Normalize]\\ndls = TabularDataLoaders.from_df(df, path, procs=procs, cat_names=cat_names, cont_names=cont_names, \\n y_names=\\"salary\\", valid_idx=list(range(800,1000)), bs=64)\\nlearn = tabular_learner(dls)\\n\\n#hide\\ntst = learn.predict(df.iloc[0])\\n\\n# +\\n#hide\\n#test y_range is passed\\nlearn = tabular_learner(dls, y_range=(0,32))\\nassert isinstance(learn.model.layers[-1], SigmoidRange)\\ntest_eq(learn.model.layers[-1].low, 0)\\ntest_eq(learn.model.layers[-1].high, 32)\\n\\nlearn = tabular_learner(dls, config = tabular_config(y_range=(0,32)))\\nassert isinstance(learn.model.layers[-1], SigmoidRange)\\ntest_eq(learn.model.layers[-1].low, 0)\\ntest_eq(learn.model.layers[-1].high, 32)\\n\\n\\n# -\\n\\n#export\\n@typedispatch\\ndef show_results(x:Tabular, y:Tabular, samples, outs, ctxs=None, max_n=10, **kwargs):\\n df = x.all_cols[:max_n]\\n for n in x.y_names: df[n+\'_pred\'] = y[n][:max_n].values\\n display_df(df)\\n\\n\\n# ## Export -\\n\\n#hide\\nfrom nbdev.export import notebook2script\\nnotebook2script()\\n\\n\\n",\n "repo_path": "nbs/43_tabular.learner.ipynb"\n },\n "truncated_cells": []\n }\n]', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Triplet * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) | Metric | Value | |:-------------------|:--------| | cosine_accuracy | 1.0 | | dot_accuracy | 0.0 | | manhattan_accuracy | 1.0 | | euclidean_accuracy | 1.0 | | **max_accuracy** | **1.0** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### query-to-dataset-viewer-descriptions

* Dataset: [query-to-dataset-viewer-descriptions](https://huggingface.co/datasets/davanstrien/query-to-dataset-viewer-descriptions)
* Size: 1,141 training samples
* Columns: <code>query</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | query                                                                              | positive                                                                                | negative                                                                                |
  |:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                                  | string                                                                                  |
  | details | <ul><li>min: 9 tokens</li><li>mean: 11.72 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 40 tokens</li><li>mean: 2018.88 tokens</li><li>max: 8192 tokens</li></ul> | <ul><li>min: 41 tokens</li><li>mean: 2125.25 tokens</li><li>max: 8192 tokens</li></ul> |
* Samples:
  | query | positive | negative |
  |:------|:---------|:---------|
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|
| <code>USER_QUERY: LLM paper dataset</code> | <code>HUB_DATASET_PREVIEW: DATASET_NAME: "MarkrAI/AutoRAG-evaluation-2024-LLM-paper-v1"<br>FEATURES: {'doc_id': {'dtype': 'string', '_type': 'Value'}, 'contents': {'dtype': 'string', '_type': 'Value'}, 'metadata': {'creation_datetime': {'dtype': 'string', '_type': 'Value'}, 'file_name': {'dtype': 'string', '_type': 'Value'}, 'file_path': {'dtype': 'string', '_type': 'Value'}, 'file_size': {'dtype': 'int64', '_type': 'Value'}, 'file_type': {'dtype': 'null', '_type': 'Value'}, 'last_accessed_datetime': {'dtype': 'string', '_type': 'Value'}, 'last_modified_datetime': {'dtype': 'string', '_type': 'Value'}}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "doc_id": "6f86094c-47fe-43de-a77a-e8c34c69c997",<br> "contents": "# Rag-Driver: Generalisable Driving Explanations With Retrieval-Augmented In-Context Learning In Multi-Modal Large Language Model\n\nJianhao Yuan1, Shuyang Sun1, Daniel Omeiza1, Bo Zhao2, Paul Newman1, Lars Kunze1, Matthew Gadd1\n1 University of Oxford 2 Beijing Academy of Artificial Intelligence\n{jianhaoyuan,kevinsun,daniel,pnewman,lars,mattgadd}@robots.ox.ac.uk \nAbstract\u2014Robots powered by 'blackbox' models need to provide\nhuman-understandable explanations which we can trust. Hence,\nexplainability plays a critical role in trustworthy autonomous\ndecision-making to foster transparency and acceptance among\nend users, especially in complex autonomous driving. Recent\nadvancements in Multi-Modal Large Language models (MLLMs)\nhave shown promising potential in enhancing the explainability\nas a driving agent by producing control predictions along with\nnatural language explanations. However, severe data scarcity\ndue to expensive annotation costs and significant domain gaps\nbetween different datasets makes the development of a robust and\ngeneralisable system an extremely challenging task. Moreover, the\nprohibitively expensive training requirements of MLLM and the\nunsolved problem of catastrophic forgetting further limit their\ngeneralisability post-deployment. To address these challenges, we\npresent RAG-Driver, a novel retrieval-augmented multi-modal\nlarge language model that leverages in-context learning for high-\nperformance, explainable, and generalisable autonomous driving.\nBy grounding in retrieved expert demonstration, we empirically\nvalidate that RAG-Driver achieves state-of-the-art performance in\nproducing driving action explanations, justifications, and control\nsignal prediction. More importantly, it exhibits exceptional zero-\nshot generalisation capabilities to unseen environments without \nfurther training endeavours1.\nIndex Terms\u2014Autonomous driving, multi-modal language\nmodel, end-to-end driving, domain generalisation",<br> "metadata": {<br> "creation_datetime": "2024-03-04",<br> "file_name": "2402.10828v1.md",<br> "file_path": "paper_data/2402.10828v1.md",<br> "file_size": 64885,<br> "file_type": null,<br> "last_accessed_datetime": "2024-03-04",<br> "last_modified_datetime": "2024-02-22"<br> }<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "doc_id": "cf485ad0-8ec4-4a63-a0c6-5d7eb499c0c8",<br> "contents": "# Rag-Driver: Generalisable Driving Explanations With Retrieval-Augmented In-Context Learning In Multi-Modal Large Language Model\n## I. Introduction\n\nDriven by the emerging development of deep learning, autonomous driving has observed a paradigm shift from rulesbased decision systems [66, 21] to data-driven learning-based approaches [28, 6, 36]. However, this comes at the cost of transparency in decision-making, especially for end-to-end autonomous driving systems which are considered black-box in nature [13]. Thus, in addition to precision in action control, explanation provision is key in ensuring trustworthy decisionmaking to reconcile the system's decisions with end-user expectations to foster confidence and acceptance [79, 8, 57] in dynamic driving environments. \nTraditional approaches have mainly relied on attention visualisation [5, 7, 55] as a proxy to rationalise the decisions of the black-box systems or auxiliary intermediate tasks such as semantic segmentation [25, 32], object detection [16, 31], and affordance prediction [68, 45] provide meaningful intermediate representation for decision-making. However, these methods do not engage end-users in the dialogue as they are onedirectional and not readily comprehensible by the general users for the purpose of fostering trust and confidence. An alternative promising approach is the integration of natural language explanations [38, 33, 54], in particular through Multi-Modal Large Language Models (MLLMs) [1, 70]. These models, pretrained on extensive web-scale datasets, demonstrate remarkable reasoning capacity, enabling the transformation of complex vehicular decision-making processes into more understandable narrative formats, thereby offering a new layer of explainability to conventional systems. \nWhile several early attempts have demonstrated the potential of MLLMs as general explainable driving agents [78, 76, 51], these methods fall short of human-level understanding. One of the limitations is their failure to generalise to unseen environments. A primary obstacle is the lack of high-quality annotated data [56], coupled with the significant domain shift across various datasets [23], which hinders the models' generalisation capacity to novel environments outside of the training data distribution. Another critical challenge is the prohibitively expensive training requirement and the unsolved problem of catastrophic forgetting [39], which make re-training or finetuning impractical solutions due to the immense computational demands and severe performance degradation. Consequently, this further limits the models' generalisability after deployment, as they struggle to effectively utilise new data in constantly evolving environments and driving scenarios. \nTo address these challenges, we introduce *RAG-Driver*, a novel retrieval-augment",<br> "metadata": {<br> "creation_datetime": "2024-03-04",<br> "file_name": "2402.10828v1.md",<br> "file_path": "paper_data/2402.10828v1.md",<br> "file_size": 64885,<br> "file_type": null,<br> "last_accessed_datetime": "2024-03-04",<br> "last_modified_datetime": "2024-02-22"<br> }<br> },<br> "truncated_cells": []<br> }<br>]</code> | <code>NEGATIVE: DATASET_NAME: "emozilla/dolma-v1_7-arxiv"<br>FEATURES: {'text': {'dtype': 'string', '_type': 'Value'}, 'id': {'dtype': 'string', '_type': 'Value'}, 'metadata': {'file_path': {'dtype': 'string', '_type': 'Value'}}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "text": "\\section{Introduction}\nLet $G$ be a simple undirected graph with the \\textit{vertex set} $V(G)$ and the \\textit{edge set} $E(G)$. A vertex with degree one is called a \\textit{pendant vertex}. The distance between the vertices $u$ and $v$ in graph $G$ is denoted by $d_G(u,v)$. A cycle $C$ is called \\textit{chordless} if $C$ has no \\textit{cycle chord} (that is an edge not in the edge set of $C$ whose endpoints lie on the vertices of $C$).\nThe \\textit{Induced subgraph} on vertex set $S$ is denoted by $\\langle S\\rangle$. A path that starts in $v$ and ends in $u$ is denoted by $\\stackrel\\frown{v u}$.\nA \\textit{traceable} graph is a graph that possesses a Hamiltonian path.\nIn a graph $G$, we say that a cycle $C$ is \\textit{formed by the path} $Q$ if $ | E(C) \\setminus E(Q) | = 1 $. So every vertex of $C$ belongs to $V(Q)$.\n\nIn 2011 the following conjecture was proposed:\n\\begin{conjecture}(Hoffmann-Ostenhof \\cite{hoffman})\nLet $G$ be a connected cubic graph. Then $G$ has a decomposition into a spanning tree, a matching and a family of cycles.\n\n\\end{conjecture}\nConjecture \\theconjecture$\\,$ also appears in Problem 516 \\cite{cameron}. There are a few partial results known for Conjecture \\theconjecture. Kostochka \\cite{kostocha} noticed that the Petersen graph, the prisms over cycles, and many other graphs have a decomposition desired in Conjecture \\theconjecture. Ozeki and Ye \\cite{ozeki} proved that the conjecture holds for 3-connected cubic plane graphs. Furthermore, it was proved by Bachstein \\cite{bachstein} that Conjecture \\theconjecture$\\,$ is true for every 3-connected cubic graph embedded in torus or Klein-bottle. Akbari, Jensen and Siggers \\cite[Theorem 9]{akbari} showed that Conjecture \\theconjecture$\\,$ is true for Hamiltonian cubic graphs.\n\nIn this paper, we show that Conjecture \\theconjecture$\\,$ holds for traceable cubic graphs.\n\\section{Results}\nBefore proving the main result, we need the following lemma.\n\\begin{lemma}\n\\label{lemma:1}\nLet $G$ be a cubic graph. Suppose that $V(G)$ can be partitioned into a tree $T$ and finitely many cycles such that there is no edge between any pair of cycles (not necessarily distinct cycles), and every pendant vertex of $T$ is adjacent to at least one vertex of a cycle. Then, Conjecture \\theconjecture$\\,$ holds for $G$.\n\\end{lemma}\n\\begin{proof}\nBy assumption, every vertex of each cycle in the partition is adjacent to exactly one vertex of $T$. Call the set of all edges with one endpoint in a cycle and another endpoint in $T$ by $Q$.\nClearly, the induced subgraph on $E(T) \\cup Q$ is a spanning tree of $G$. We call it $T'$. Note that every edge between a pendant vertex of $T$ and the union of cycles in the partition is also contained in $T'$. Thus, every pendant vertex of $T'$ is contained in a cycle of the partition. Now, consider the graph $H = G \\setminus E(T')$. For every $v \\in V(T)$, $d_H(v) \\leq 1$. So Conjecture \\theconjecture$\\,$ holds for $G$. \\vspace{1em}\n\\end{proof}\n\n\n\\noindent\\textbf{Remark 1.}\n\\label{remark:1}\nLet $C$ be a cycle formed by the path $Q$. Then clearly there exists a chordless cycle formed by $Q$.\n\nNow, we are in a position to prove the main result.\n\n\\begin{theorem}\nConjecture \\theconjecture$\\,$ holds for traceable cubic graphs.\n\\end{theorem}\n\\begin{proof}\nLet $G$ be a traceable cubic graph and $P : v_1, \\dots, v_n$ be a Hamiltonian path in $G$. By \\cite[Theorem 9]{akbari}, Conjecture A holds for $v_1 v_n \\in E(G)$. Thus we can assume that $v_1 v_n \\notin E(G)$. Let $v_1 v_j, v_1 v_{j'}, v_i v_n, v_{i'} v_n \\in E(G)\\setminus E(P)$ and $j' < j < n$, $1 < i < i'$. Two cases can occur:\n\\begin{enumerate}[leftmargin=0pt,label=]\n\\item\n\\textbf{Case 1.}\nAssume that $i < j$. Consider the following graph in Figure \\ref{fig:overlapping} in which the thick edges denote the path $P$. Call the three paths between $v_j$ and $v_i$, from the left to the right, by $P_1$, $P_2$ and $P_3$, respectively (note that $P_1$ contains the edge $e'$ and $P_3$ contains the edge $e$).\n\n\\begin{figure}[H]\n \\begin{center}\n \\includegraphics[width=40mm]{engImages/overlapping.pdf}\n \\caption{Paths $P_1$, $P_2$ and $P_3$}\n \\label{fig:overlapping}\n \\end{center}\n\\end{figure}\n\n\nIf $P_2$ has order $2$, then $G$ is Hamiltonian and so by \\cite[Theorem 9]{akbari} Conjecture \\theconjecture$\\,$ holds. Thus we can assume that $P_1$, $P_2$ and $P_3$ have order at least $3$. Now, consider the following subcases:\\\\\n\n\\begin{enumerate}[leftmargin=0pt,label=]\n\\label{case:1}\n\\item \\textbf{Subcase 1.} There is no edge between $V(P_r)$ and $V(P_s)$ for $1 \\leq r < s \\leq 3$. Since every vertex of $P_i$ has degree 3 for every $i$, by \\hyperref[remark:1]{Remark 1}$\\,$ there are two chordless cycles $C_1$ and $C_2$ formed by $P_1$ and $P_2$, respectively.\nDefine a tree $T$ with the edge set\n$$ E\\Big(\\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big) \\rangle\\Big) \\bigcap \\big(\\bigcup_{i=1}^3 E(P_i)\\big).$$\nNow, apply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition $\\{T, C_1, C_2\\}$.\\\\\n\n\\item \\textbf{Subcase 2.}\n\\label{case:edge}\nThere exists at least one edge between some $P_r$ and $P_s$, $r<s$. With no loss of generality, assume that $r=1$ and $s=2$. Suppose that $ab \\in E(G)$, where $a \\in V(P_1)$, $b \\in V(P_2)$ and $d_{P_1}(v_j, a) + d_{P_2}(v_j, b)$ is minimum.\n\n\\begin{figure}[H]\n \\begin{center}\n \\includegraphics[width=40mm]{engImages/ab.pdf}\n \\caption{The edge $ab$ between $P_1$ and $P_2$}\n \\label{fig:ab}\n \\end{center}\n\\end{figure}\n\nThree cases occur: \\\\\n\n(a) There is no chordless cycle formed by either of the paths $\\stackrel\\frown{v_j a}$ or $\\stackrel\\frown{v_j b}$. Let $C$ be the chordless cycle $\\stackrel\\frown{v_j a}\\stackrel\\frown{ b v_j}$. Define $T$ with the edge set\n$$ E\\Big(\\langle V(G) \\setminus V(C)\\rangle\\Big) \\bigcap \\big(\\bigcup_{i=1}^3 E(P_i)\\big).$$\nNow, apply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition $\\{T,C\\}$.\t\\\\\n\n(b) There are two chordless cycles, say $C_1$ and $C_2$, respectively formed by the paths $\\stackrel\\frown{v_j a}$ and $\\stackrel\\frown{v_j b}$. Now, consider the partition $C_1$, $C_2$ and the tree induced on the following edges,\n$$E\\Big(\\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big) \\rangle\\Big) \\; \\bigcap \\; E\\Big(\\bigcup_{i=1}^3 P_i\\Big),$$\nand apply \\hyperref[lemma:1]{Lemma 1}.\\\\\n\n(c) With no loss of generality, there exists a chordless cycle formed by the path $\\stackrel\\frown{v_j a}$ and there is no chordless cycle formed by the path $\\stackrel\\frown{v_j b}$.\nFirst, suppose that for every chordless cycle $C_t$ on $\\stackrel\\frown{v_j a}$, at least one of the vertices of $C_t$ is adjacent to a vertex in $V(G) \\setminus V(P_1)$.\nWe call one of the edges with one end in $C_t$ and other endpoint in $V(G) \\setminus V(P_1)$ by $e_t$. Let $v_j=w_0, w_1, \\dots, w_l=a$ be all vertices of the path $\\stackrel\\frown{v_j a}$ in $P_1$. Choose the shortest path $w_0 w_{i_1} w_{i_2} \\dots w_l$ such that $0 < i_1 < i_2 < \\dots < l$.\nDefine a tree $T$ whose edge set is the thin edges in Figure \\ref{fig:deltaCycle}.\\\\\nCall the cycle $w_0 w_{i_1} \\dots w_l \\stackrel\\frown{b w_0}$ by $C'$. Now, by removing $C'$, $q$ vertex disjoint paths $Q_1, \\dots, Q_q$ which are contained in $\\stackrel\\frown{v_j a}$ remain. Note that there exists a path of order $2$ in $C'$ which by adding this path to $Q_i$ we find a cycle $C_{t_i}$, for some $i$. Hence there exists an edge $e_{t_i}$ connecting $Q_i$ to $V(G) \\setminus V(P_1)$. Now, we define a tree $T$ whose the edge set is,\n$$\\quad\\quad\\quad \\bigg( E\\Big(\\langle V(G) \\setminus V(C') \\rangle \\Big)\\; \\bigcap \\; \\Big(\\bigcup_{i=1}^3 E(P_i)\\Big) \\bigg) \\bigcup \\Big(\\big\\{e_{t_i} \\mid 1 \\leq i \\leq q \\big\\} \\Big).$$\nApply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition $\\{T,C'\\}$.\\\\\n\n\\begin{figure}[H]\n \\begin{center}\n \\includegraphics[width=40mm]{engImages/deltaCycle.pdf}\n \\caption{The cycle $C'$ and the tree $T$}\n \\label{fig:deltaCycle}\n \\end{center}\n\\end{figure}\n\nNext, assume that there exists a cycle $C_1$ formed by $\\stackrel\\frown{v_j a}$ such that none of the vertices of $C_1$ is adjacent to $V(G) \\setminus V(P_1)$. Choose the smallest cycle with this property. Obviously, this cycle is chordless. Now, three cases can be considered:\\\\\n\n\\begin{enumerate}[leftmargin=5pt,label=(\\roman*)]\n\\item There exists a cycle $C_2$ formed by $P_2$ or $P_3$. Define the partition $C_1$, $C_2$ and a tree with the following edge set,\n$$E\\Big(\\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big)\\rangle \\Big) \\bigcap \\Big( \\bigcup_{i=1}^3 E(P_i) \\Big),$$\nand apply \\hyperref[lemma:1]{Lemma 1}.\\\\\n\n\\item There is no chordless cycle formed by $P_2$ and by $P_3$, and there is at least one edge between $V(P_2)$ and $V(P_3)$. Let $ab \\in E(G)$, $a \\in V(P_2)$ and $b \\in V(P_3)$ and moreover $d_{P_2}(v_j, a) + d_{P_3}(v_j,b)$ is minimum. Notice that the cycle $\\stackrel\\frown{v_j a} \\stackrel\\frown{b v_j}$ is chordless. Let us call this cycle by $C_2$. Now, define the partition $C_2$ and a tree with the following edge set,\n$$E\\Big(\\langle V(G) \\setminus V(C_2)\\rangle \\Big) \\bigcap \\Big( \\bigcup_{i=1}^3 E(P_i) \\Big),$$\nand apply \\hyperref[lemma:1]{Lemma 1}.\\\\\n\n\\item There is no chordless cycle formed by $P_2$ and by $P_3$, and there is no edge between $V(P_2)$ and $V(P_3)$. Let $C_2$ be the cycle consisting of two paths $P_2$ and $P_3$. Define the partition $C_2$ and a tree with the following edge set,\n$$E\\Big(\\langle V(G) \\setminus V(C_2)\\rangle \\Big) \\bigcap \\Big( \\bigcup_{i=1}^3 E(P_i) \\Big),$$\nand apply \\hyperref[lemma:1]{Lemma 1}.\n\n\\end{enumerate}\n\n\n\\end{enumerate}\n\n\\vspace{5mm}\n\\item\n\\textbf{Case 2.}\n\\label{case:2}\nAssume that $j < i$ for all Hamiltonian paths. Among all Hamiltonian paths consider the path such that $i'-j'$ is maximum. Now, three cases can be considered:\\\\\n\n\\begin{enumerate}[leftmargin=0pt,label=]\n\\item \\textbf{Subcase 1.} There is no $s < j'$ and $t > i'$ such that $v_s v_t \\in E(G)$. By \\hyperref[remark:1]{Remark 1} $\\,$ there are two chordless cycles $C_1$ and $C_2$, respectively formed by the paths $v_1 v_{j'}$ and $v_{i'} v_n$. By assumption there is no edge $xy$, where $x \\in V(C_1)$ and $y \\in V(C_2)$.\nDefine a tree $T$ with the edge set:\n$$ E\\Big(\\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big) \\rangle \\Big) \\bigcap \\Big( E(P) \\cup \\{v_{i'}v_n, v_{j'}v_1\\} \\Big).$$\nNow, apply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition $\\{T, C_1, C_2\\}$.\\\\\n\n\\item \\textbf{Subcase 2.}\n\\label{subcase:22} There are at least four indices $s, s' < j$ and $t, t' > i$ such that $v_s v_t, v_{s'} v_{t'} \\in E(G)$. Choose four indices $g, h < j$ and $e, f > i$ such that $v_h v_e, v_g v_f \\in E(G)$ and $|g-h| + |e-f|$ is minimum.\n\n\\begin{figure}[H]\n \\begin{center}\n \\includegraphics[width=90mm]{engImages/case2-subcase2.pdf}\n \\caption{Two edges $v_h v_e$ and $v_g v_f$}\n \\label{fig:non-overlapping}\n \\end{center}\n\\end{figure}\n\nThree cases can be considered:\\\\\n\n\\begin{enumerate}[leftmargin=0pt,label=(\\alph*)]\n\\item There is no chordless cycle formed by $\\stackrel\\frown{v_g v_h}$ and by $\\stackrel\\frown{v_e v_f}$.\n\nConsider the cycle $\\stackrel\\frown{v_g v_h} \\stackrel\\frown{v_e v_f}v_g$ and call it $C$. Now, define a tree $T$ with the edge set,\n$$\\,\\,\\,E\\Big(\\langle V(G) \\setminus V(C)\\rangle \\Big) \\bigcap \\Big( E(P) \\cup \\{v_1v_{j}, v_{i}v_n\\} \\Big),$$\napply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition $\\{T, C\\}$.\\\\\n\n\\item With no loss of generality, there exists a chordless cycle formed by $\\stackrel\\frown{v_e v_f}$ and there is no chordless cycle formed by the path $\\stackrel\\frown{v_g v_h}$. First suppose that there is a chordless cycle $C_1$ formed by $\\stackrel\\frown{v_e v_f}$ such that there is no edge between $V(C_1)$ and $\\{v_1, \\dots, v_j\\}$. By \\hyperref[remark:1]{Remark 1} $,$ there exists a chordless cycle $C_2$ formed by $\\stackrel\\frown{v_1 v_j}$. By assumption there is no edge between $V(C_1)$ and $V(C_2)$. Now, define a tree $T$ with the edge set,\n\n$$\\quad\\quad\\quad\\quad E\\Big(\\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big)\\rangle \\Big) \\bigcap \\Big( E(P) \\cup \\{v_1v_{j}, v_{i}v_n\\} \\Big),$$\n\nand apply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition $\\{T, C_1, C_2\\}$.\n\n$\\;$ Next assume that for every cycle $C_r$ formed by $\\stackrel\\frown{v_e v_f}$, there are two vertices $x_r \\in V(C_r)$ and $y_r \\in \\{v_1, \\dots, v_j\\}$ such that $x_r y_r \\in E(G)$. Let $v_e=w_0, w_1, \\dots, w_l=v_f$ be all vertices of the path $\\stackrel\\frown{v_e v_f}$ in $P$. Choose the shortest path $w_0 w_{i_1} w_{i_2} \\dots w_l$ such that $0 < i_1 < i_2 < \\dots < l$. Consider the cycle $w_0 w_{i_1} \\dots w_l \\stackrel\\frown{v_g v_h}$ and call it $C$. Now, by removing $C$, $q$ vertex disjoint paths $Q_1, \\dots, Q_q$ which are contained in $\\stackrel\\frown{v_e v_f}$ remain. Note that there exists a path of order $2$ in $C$ which by adding this path to $Q_i$ we find a cycle $C_{r_i}$, for some $i$. Hence there exists an edge $x_{r_i} y_{r_i}$ connecting $Q_i$ to $V(G) \\setminus V(\\stackrel\\frown{v_e v_f})$. We define a tree $T$ whose edge set is the edges,\n$$\\quad\\quad\\quad\\quad\\quad\\quad E\\Big(\\langle V(G) \\setminus V(C)\\rangle \\Big) \\bigcap \\Big( E(P) \\cup \\{v_1v_{j}, v_{i}v_n\\} \\cup \\big\\{x_{r_i} y_{r_i} \\mid 1 \\leq i \\leq q\\big\\} \\Big),$$\nthen apply \\hyperref[lemma:1]{Lemma 1} $\\,$ on the partition $\\{T, C\\}$.\\\\\n\\begin{figure}[H]\n \\begin{center}\n \\includegraphics[width=90mm]{engImages/deltaNonOverlapping.pdf}\n \\caption{The tree $T$ and the shortest path $w_0 w_{i_1}\\dots w_l$}\n \\label{fig:delta-non-overlapping}\n \\end{center}\n\\end{figure}\n\n\\item There are at least two chordless cycles, say $C_1$ and $C_2$ formed by the paths $\\stackrel\\frown{v_g v_h}$ and $\\stackrel\\frown{v_e v_f}$, respectively. Since $|g-h| + |e-f|$ is minimum, there is no edge $xy \\in E(G)$ with $x \\in V(C_1)$ and $y \\in V(C_2)$. Now, define a tree $T$ with the edge set,\n$$\\quad\\quad\\quad\\quad E\\Big( \\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big) \\rangle \\Big) \\bigcap \\Big( E(P) \\cup \\{v_1 v_{j}, v_{i}v_n\\} \\Big),$$\nand apply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition $\\{T, C_1, C_2\\}$.\\\\\n\\end{enumerate}\n\n\\item \\textbf{Subcase 3.} There exist exactly two indices $s,t$, $s < j' < i' < t$ such that $v_s v_t \\in E(G)$ and there are no two other indices $s', t'$ such that $s' < j < i < t'$ and $v_{s'} v_{t'} \\in E(G)$. We can assume that there is no cycle formed by $\\stackrel\\frown{v_{s+1} v_j}$ or $\\stackrel\\frown{v_i v_{t-1}}$, to see this by symmetry consider a cycle $C$ formed by $\\stackrel\\frown{v_{s+1} v_j}$. By \\hyperref[remark:1]{Remark 1} $\\,$ there exist chordless cycles $C_1$ formed by $\\stackrel\\frown{v_{s+1} v_j}$ and $C_2$ formed by $\\stackrel\\frown{v_{i} v_n}$. By assumption $v_s v_t$ is the only edge such that $s < j$ and $t > i \\;$. Therefore, there is no edge between $V(C_1)$ and $V(C_2)$. Now, let $T$ be a tree defined by the edge set,\n$$ E\\Big(\\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big)\\rangle \\Big) \\bigcap \\Big( E(P) \\cup \\{v_1v_{j}, v_{i}v_n\\} \\Big),$$\nand apply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition \\{$T$, $C_1$, $C_2$\\}.\\\\\n\n$\\quad$Furthermore, we can also assume that either $s \\neq j'-1$ or $t \\neq i'+1$, otherwise we have the Hamiltonian cycle $\\stackrel\\frown{v_1 v_s} \\stackrel\\frown{v_t v_n} \\stackrel\\frown{v_{i'} v_{j'}} v_1$ and by \\cite[Theorem 9]{akbari} Conjecture \\theconjecture$\\,$ holds.\n\n$\\quad$By symmetry, suppose that $s \\neq j'-1$. Let $v_k$ be the vertex adjacent to $v_{j'-1}$, and $k \\notin \\{j'-2, j'\\}$. It can be shown that $k > j'-1$, since otherwise by considering the Hamiltonian path $P': \\; \\stackrel\\frown{ v_{k+1} v_{j'-1}}\\stackrel\\frown{v_k v_1} \\stackrel\\frown{v_{j'} v_n}$, the new $i'-j'$ is greater than the old one and this contradicts our assumption about $P$ in the \\hyperref[case:2]{Case 2}.\n\n$\\quad$We know that $j' < k < i$. Moreover, the fact that $\\stackrel\\frown{v_{s+1} v_j}$ does not form a cycle contradicts the case that $j' < k \\le j$. So $j < k < i$. Consider two cycles $C_1$ and $C_2$, respectively with the vertices $v_1 \\stackrel\\frown{v_{j'} v_{j}} v_1$ and $v_n \\stackrel\\frown{v_{i'} v_{i}} v_n$. The cycles $C_1$ and $C_2$ are chordless, otherwise there exist cycles formed by the paths $\\stackrel\\frown{v_{s+1} v_j}$ or $\\stackrel\\frown{v_i v_{t-1}}$. Now, define a tree $T$ with the edge set\n$$ E\\Big(\\langle V(G) \\setminus \\big(V(C_1) \\cup V(C_2)\\big)\\rangle \\Big) \\bigcap \\Big( E(P) \\cup \\{v_s v_t, v_k v_{j'-1}\\} \\Big),$$\nand apply \\hyperref[lemma:1]{Lemma 1} $\\,$for the partition \\{$T$, $C_1$, $C_2$\\}.\n\\end{enumerate}\n\\end{enumerate}\n\\end{proof}\n\n\\noindent\\textbf{Remark 2.}\n\\label{remark:2}\nIndeed, in the proof of the previous theorem we showed a stronger result, that is, for every traceable cubic graph there is a decomposition with at most two cycles.\n\n",<br> "id": "b7c40b41b7eedaa408f87d154284a1aba126589c",<br> "metadata": {<br> "file_path": "/home/ubuntu/dolma-v1_7/arxiv-0000.json.gz"<br> }<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "text": "\\section{Principle of nano strain-amplifier}\r\n\r\n\\begin{figure*}[t!]\r\n\t\\centering\r\n\t\\includegraphics[width=5.4in]{Fig1}\r\n\t\t\\vspace{-0.5em}\r\n\t\\caption{Schematic sketches of nanowire strain sensors. (a)(b) Conventional non-released and released NW structure; \r\n\t\t(c)(d) The proposed nano strain-amplifier and its simplified physical model.}\r\n\t\\label{fig:fig1}\r\n\t\t\\vspace{-1em}\r\n\\end{figure*}\r\nFigure \\ref{fig:fig1}(a) and 1(b) show the concept of the conventional structures of piezoresistive sensors. The piezoresistive elements are either released from, or kept on, the substrate. The sensitivity ($S$) of the sensors is defined based on the ratio of the relative resistance change ($\\Delta R/R$) of the sensing element and the strain applied to the substrate ($\\varepsilon_{sub}$):\r\n\\begin{equation}\r\nS = (\\Delta R/R)/\\varepsilon_{sub}\r\n\\label{eq:sensitivity}\r\n\\end{equation}\r\nIn addition, the relative resistance change $\\Delta R/R$ can be calculated from the gauge factor ($GF$) of the material used to make the piezoresistive elements: $\\Delta R/R = GF \\varepsilon_{ind}$, where $\\varepsilon_{ind}$ is the strain induced into the piezoresistor. In most of the conventional strain gauges as shown in Fig. \\ref{fig:fig1} (a,b), the thickness of the sensing layer is typically below a few hundred nanometers, which is much smaller than that of the substrate. Therefore, the strain induced into the piezoresistive elements is approximately the same as that of the substrate ($\\varepsilon_{ind} \\approx \\varepsilon_{sub}$). Consequently, to improve the sensitivity of strain sensors (e.g. enlarging $\\Delta R/R$), electrical approaches which can enlarge the gauge factor ($GF$) are required. Nevertheless, as aforementioned, the existence of the large gauge factor in nanowires due to quantum confinement or surface state, is still considered as controversial. \n\r\nIt is also evident from Eq. \\ref{eq:sensitivity} that the sensitivity of strain sensors can also be improved using a mechanical approach, which enlarges the strain induced into the piezoresistive element. Figure \\ref{fig:fig1}(c) shows our proposed nano strain-amplifier structure, in which the piezoresistive nanowires are locally fabricated at the centre of a released bridge. The key idea of this structure is that, under a certain strain applied to the substrate, a large strain will be concentrated at the locally fabricated SiC nanowires. The working principle of the nano strain-amplifier is similar to that of the well-known dogbone structure, which is widely used to characterize the tensile strength of materials \\cite{dogbone1,dogbone2}. That is, when a stress is applied to the dogbone-shape of a certain material, a crack, if generated, will occur at the middle part of the dogbone. The large strain concentrated at the narrow area located at the centre part with respect to the wider areas located at outer region, causes the crack. Qualitative and quantitative explanations of the nano strain-amplifier are presented as follows. \r\n\r\nFor the sake of simplicity, the released micro frame and nanowire (single wire or array) of the nano strain-amplifier can be considered as solid springs, Fig. \\ref{fig:fig1}(d). The stiffness of these springs are proportional to their width ($w$) and inversely proportional to their length (l): $K \\propto w/l$. Consequently, the model of the released nanowire and micro frames can be simplified as a series of springs, where the springs with higher stiffness correspond to the micro frame, and the single spring with lower stiffness corresponds to the nanowire. It is well-known in classical physics that, for serially connected springs, a larger strain will be concentrated in the low--stiffness string, while a smaller strain will be induced in the high--stiffness string \\cite{Springbook}. The following analysis quantitatively explained the amplification of the strain.\t\r\n\r\n\\begin{figure}[b!]\r\n\t\\centering\r\n\t\\includegraphics[width=3in]{Fig2}\r\n\t\\vspace{-1em}\r\n\t\\caption{Finite element analysis of the strain induced in to the nanowire array utilizing nano strain-amplifier.}\r\n\t\\label{fig:fig2}\r\n\\end{figure}\r\nWhen a tensile mechanical strain ($\\varepsilon_{sub}$) is applied to the substrate, the released structure will also be elongated. Since the stiffness of the released frame is much smaller than that of the substrate, it is safe to assume that the released structure will follows the elongation of the substrate. The displacement of the released structure $\\Delta L$ is:\r\n\\begin{equation}\r\n\\Delta L = \\Delta L_m + \\Delta L_n = L_m \\varepsilon_m + L_n \\varepsilon_n\r\n\\label{eq:displacement}\r\n\\end{equation} \r\nwhere $L_m$, $L_n$ are the length; $\\Delta L_m$, $\\Delta L_n$ are the displacement; and $\\varepsilon_m$, $\\varepsilon_n$ are the strains induced into the micro spring and nano spring, respectively. The subscripts m and n stand for the micro frames and nanowires, respectively. Furthermore, due to the equilibrium of the stressing force ($F$) along the series of springs, the following relationship is established: $F= K_m\\Delta L_m = K_n \\Delta L_n$, where $K_m$, $K_n$ are the stiffness of the released micro frames and nanowires, respectively. Consequently the relationship between the displacement of the micro frame (higher stiffness) and nanowires (lower stiffness) is:\r\n\\begin{equation}\r\n\\frac{\\Delta L_m}{\\Delta L_n}=\\frac{K_n}{K_m}=\\frac{L_mw_n}{L_nw_m}\r\n\\label{eq:euili}\r\n\\end{equation}\r\nSubstituting Eqn. \\ref{eq:euili} into Eqn. \\ref{eq:displacement}, the strain induced into the locally fabricated nanowires is:\r\n\\begin{equation}\r\n\\varepsilon_n = \\frac{\\Delta L_n}{L_n} = \\frac{1}{1-\\frac{w_m-w_n}{w_m}\\frac{L_m}{L}}\\varepsilon_{sub}\r\n\\label{eq:strainamp}\r\n\\end{equation} \r\n\r\nEquation \\ref{eq:strainamp} indicates that increasing the ratio of $w_m/w_n$ and $L_m/L_n$ significantly amplifies the strain induced into the nanowire from the strain applied to the substrate. This model is also applicable to the case of nanowire arrays, in which $w_n$ is the total width of all nanowires in the array.\n\r\nThe theoretical model is then verified using the finite element analysis (FEA). In the FEA simulation, we compare the strain induced into (i) non released nanowires, (ii) the conventionally released nanowires, and (iii) our nano strain-amplifier structure, using COMSOL Multiphysics \\texttrademark. In our nano strain amplifying structure, the width of the released frame was set to be 8 $\\mu$m, while the width of each nanowire in the array (3 wires) was set to be 370 nm. The nanowires array structure was selected as it can enhance the electrical conductance of the SiC nanowires resistor which makes the subsequent experimental demonstration easier. The ratio between the length of nanowires and micro bridge was set to be 1: 20. With this geometrical dimensions, strain induced into nanowires array $\\varepsilon_n$ was numerically calculated to be approximately 6 times larger than $\\varepsilon_{sub}$, Eqn. \\ref{eq:strainamp}. The simulation results show that for all structure, the elongation of non-released and released nanowires follow that of the substrate. In addition, strain was almost completely transferred into conventional released and non-released structures. Furthermore, the ratio of the strain induced in to the locally fabricated nanowires was estimated to be 5.9 times larger than that of the substrate, Fig. \\ref{fig:fig2}. These results are in solid agreement with the theoretical analysis presented above. For a nanowire array with an average width of 470 nm, the amplified gain of strain was found to be 4.5. \t\r\n\r\nBased on the theoretical analysis, we conducted the following experiments to demonstrate the high sensitivity of SiC nanowire strain sensors using the nano strain-amplifier. A thin 3C-SiC film with its thickness of 300 nm was epitaxially grown on a 150 mm diameter Si wafer using low pressure chemical vapour deposition \\cite{SiC_growth}. The film was \\emph{in situ} doped using Al dopants. The carrier concentration of the p-type 3C-SiC was found to be $5 \\times 10^{18}$ cm$^{-3}$, using a hot probe technique \\cite{philip}. The details of the characteristics of the grown film can be found elsewhere \\cite{Phan_JMC}. Subsequently, I-shape p-type SiC resistors with aluminum electrodes deposited on the surface were patterned using inductive coupled plasma (ICP) etching. As the piezoresistance of p-type 3C-SiC depends on crystallographic orientation, all SiC resistors of the present work were aligned along [110] direction to maximize the piezoresistive effect. Next, the micro scale SiC resistors were then released from the Si substrate using dry etching (XeF$_2$). Finally, SiC nanowire arrays were formed at the centre of the released bridge using focused ion beam (FIB). Two types of nanowire array were fabricated with three nanowires for each array. The average width of each nanowire in each type were 380 nm and 470 nm, respectively. Figure \\ref{fig:fig3} shows the SEM images of the fabricated samples, including the conventional released structure, non-released nanowires, and the nano strain-amplifier. \r\n\r\n\\begin{figure}[t!]\r\n\t\\centering\r\n\t\\includegraphics[width=3in]{Fig3}\r\n\t\\caption{SEM image of SiC strain sensors. (a) Released SiC micro bridge used for the subsequent fabrication of the nano strain-amplifier; (b) SEM of a micro SiC resistor where the SiC nanowires array were formed using FIB; (c) SEM of non-released SiC nanowires; (d) SEM of locally fabricated SiC nanowires released from the Si substrate (nano strain-amplifier).}\r\n\t\\label{fig:fig3}\r\n\t\\vspace{-1em}\r\n\\end{figure}\r\nThe current voltage (I-V) curves of all fabricated samples were characterized using a HP 4145 \\texttrademark ~parameter analyzer. The linear relationship between the applied voltage and measured current, indicated that Al made a good Ohmic contact with the highly doped SiC resistance, Fig. \\ref{fig:IV}. Additionally, the electrical conductivity of both nanowires and micro frame estimated from the I-V curve and the dimensions of the resistors shows almost the same value. This indicated that the FIB process did not cause a significant surface damage to the fabricated nanowires. \r\n\t\r\n\\begin{figure}[b!]\r\n\t\\centering\r\n\t\\includegraphics[width=3in]{Fig4}\r\n\t\t\\vspace{-1.5em}\r\n\t\\caption{Current voltage curves of the fabricated SiC resistors.}\r\n\t\\label{fig:IV}\r\n\n\\end{figure}\r\n\r\nThe bending experiment was used to characterize the piezoresistive effect in micro size SiC resistors and locally fabricated SiC nanowire array. In this experiment one end of the Si cantilever (with a thickness of 625 $\\mu$m, and a width of 7 mm) was fixed while the other end was deflected by applying different forces. The distance from the fabricated nanowires to the free end of the Si cantilever was approximately 45 mm. The strain induced into the Si substrate is $\\varepsilon_\\text{sub} = Mt/2EI$, where $M$ is the applied bending moment; and $t$, $E$ and $I$ are the thickness, Young's modulus and the moment of inertia of the Si cantilever, respectively. The response of the SiC resistance to applied strain was then measured using a multimeter (Agilent \\texttrademark 34401 A).\n\r\n\\begin{figure}[h!]\r\n\t\\centering\r\n\t\\includegraphics[width=3in]{Fig5.eps}\r\n\t\t\\vspace{-1.5em}\r\n\t\\caption{Experimental results. (a) A comparision between the relative resistance change in the nano strain-amplifiers, non released nanowires and released micro frames; (b) The repeatability of the SiC nanowires strain sensors utilizing the proposed structure.}\r\n\t\\label{fig:DRR}\r\n\t\t\t\\vspace{-1em}\r\n\\end{figure}\t\r\nThe relative resistance change ($\\Delta R/R$) of the micro and nano SiC resistors was plotted against the strain induced into the Si substrate $\\varepsilon_{sub}$, Fig. \\ref{fig:DRR}(a). For all fabricated samples, the relative resistance change shows a good linear relationship with the applied strain ($\\varepsilon_{sub}$). In addition, with the same applied strain to the Si substrate, the resistance change of the SiC nanowires using the nano strain-amplifier was much larger than that of the the SiC micro resistor and the conventional non-released SiC nanowires. In addition, reducing the width of the SiC nanowires also resulted in the increase of the sensitivity. The magnitude of the piezoresistive effect in the nano strain-amplifier as well as conventional structures were then quantitatively evaluated based on the effective gauge factor ($GF_{eff}$), which is defined as the ratio of the relative resistance change to the applied strain to the substrate: $GF_{eff} = (\\Delta R/R)/\\varepsilon_{sub}$. Accordingly, the effective gauge factor of the released micro SiC was found to be 28, while that of the non-released SiC nanowires was 35. From the data shown in Fig. \\ref{fig:DRR}, the effective gauge factor of the 380 nm and 470 nm SiC nanowires in the nano strain-amplifier were calculated as 150 and 124, respectively. Thus for nanowire arrays with average widths of 380 nm and 470 nm, the sensitivity of the nano strain-amplifier was 5.4 times and 4.6 times larger than the bulk SiC, respectively. These results were consistent with analytical and numerical models presented above. The relative resistance change of the nano strain-amplifier also showed excellent linearity with the applied strain, with a linear regression of above 99\\%. \r\n\r\nThe resistance change of the nano strain-amplifier can also be converted into voltage signals using a Wheatstone bridge, Fig. \\ref{fig:DRR}(b). The output voltage of the nano strain-amplifier increases with increasing tensile strains from 0 ppm to 180 ppm, and returned to the initial value when the strain was completely removed, confirming a good repeatability after several strain induced cycles. The linearity of the relative resistance change, and the repeatability indicate that the proposed structure is promising for strain sensing applications.\r\n \r\nIn conclusion, this work presents a novel mechanical approach to obtain highly sensitive piezoresistance in nanowires based on a nano strain-amplifier. The key factor of the nano strain-amplifier lies on nanowires locally fabricated on a released micro structure. Experimental studies were conducted on SiC nanowires, confirming that by utilizing our nano strain-amplifier, the sensitivity of SiC nanowires was 5.4 times larger than that of conventional structures. This result indicated that the nano strain-amplifier is an excellent platform for ultra sensitive strain sensing applications. \r\n\r\n\r\n",<br> "id": "1b77ae9f541b19668cc96624c7ec0f83945284e2",<br> "metadata": {<br> "file_path": "/home/ubuntu/dolma-v1_7/arxiv-0000.json.gz"<br> }<br> },<br> "truncated_cells": []<br> }<br>]</code> |
| <code>USER_QUERY: code vulnerability dataset</code> | <code>HUB_DATASET_PREVIEW: DATASET_NAME: "benjis/bigvul"<br>FEATURES: {'CVE ID': {'dtype': 'string', '_type': 'Value'}, 'CVE Page': {'dtype': 'string', '_type': 'Value'}, 'CWE ID': {'dtype': 'string', '_type': 'Value'}, 'codeLink': {'dtype': 'string', '_type': 'Value'}, 'commit_id': {'dtype': 'string', '_type': 'Value'}, 'commit_message': {'dtype': 'string', '_type': 'Value'}, 'func_after': {'dtype': 'string', '_type': 'Value'}, 'func_before': {'dtype': 'string', '_type': 'Value'}, 'lang': {'dtype': 'string', '_type': 'Value'}, 'project': {'dtype': 'string', '_type': 'Value'}, 'vul': {'dtype': 'int8', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "CVE ID": "CVE-2017-7586",<br> "CVE Page": "https://www.cvedetails.com/cve/CVE-2017-7586/",<br> "CWE ID": "CWE-119",<br> "codeLink": "https://github.com/erikd/libsndfile/commit/708e996c87c5fae77b104ccfeb8f6db784c32074",<br> "commit_id": "708e996c87c5fae77b104ccfeb8f6db784c32074",<br> "commit_message": "src/ : Move to a variable length header buffer\n\nPreviously, the `psf->header` buffer was a fixed length specified by\n`SF_HEADER_LEN` which was set to `12292`. This was problematic for\ntwo reasons; this value was un-necessarily large for the majority\nof files and too small for some others.\n\nNow the size of the header buffer starts at 256 bytes and grows as\nnecessary up to a maximum of 100k.",<br> "func_after": "psf_get_date_str (char *str, int maxlen)\n{\ttime_t\t\tcurrent ;\n\tstruct tm\ttimedata, *tmptr ;\n\n\ttime (&current) ;\n\n#if defined (HAVE_GMTIME_R)\n\t/* If the re-entrant version is available, use it. */\n\ttmptr = gmtime_r (&current, &timedata) ;\n#elif defined (HAVE_GMTIME)\n\t/* Otherwise use the standard one and copy the data to local storage. */\n\ttmptr = gmtime (&current) ;\n\tmemcpy (&timedata, tmptr, sizeof (timedata)) ;\n#else\n\ttmptr = NULL ;\n#endif\n\n\tif (tmptr)\n\t\tsnprintf (str, maxlen, \"%4d-%02d-%02d %02d:%02d:%02d UTC\",\n\t\t\t1900 + timedata.tm_year, timedata.tm_mon, timedata.tm_mday,\n\t\t\ttimedata.tm_hour, timedata.tm_min, timedata.tm_sec) ;\n\telse\n\t\tsnprintf (str, maxlen, \"Unknown date\") ;\n\n\treturn ;\n} /* psf_get_date_str */\n",<br> "func_before": "psf_get_date_str (char *str, int maxlen)\n{\ttime_t\t\tcurrent ;\n\tstruct tm\ttimedata, *tmptr ;\n\n\ttime (&current) ;\n\n#if defined (HAVE_GMTIME_R)\n\t/* If the re-entrant version is available, use it. */\n\ttmptr = gmtime_r (&current, &timedata) ;\n#elif defined (HAVE_GMTIME)\n\t/* Otherwise use the standard one and copy the data to local storage. */\n\ttmptr = gmtime (&current) ;\n\tmemcpy (&timedata, tmptr, sizeof (timedata)) ;\n#else\n\ttmptr = NULL ;\n#endif\n\n\tif (tmptr)\n\t\tsnprintf (str, maxlen, \"%4d-%02d-%02d %02d:%02d:%02d UTC\",\n\t\t\t1900 + timedata.tm_year, timedata.tm_mon, timedata.tm_mday,\n\t\t\ttimedata.tm_hour, timedata.tm_min, timedata.tm_sec) ;\n\telse\n\t\tsnprintf (str, maxlen, \"Unknown date\") ;\n\n\treturn ;\n} /* psf_get_date_str */\n",<br> "lang": "C",<br> "project": "libsndfile",<br> "vul": 0<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "CVE ID": "CVE-2018-18352",<br> "CVE Page": "https://www.cvedetails.com/cve/CVE-2018-18352/",<br> "CWE ID": "CWE-732",<br> "codeLink": "https://github.com/chromium/chromium/commit/a9cbaa7a40e2b2723cfc2f266c42f4980038a949",<br> "commit_id": "a9cbaa7a40e2b2723cfc2f266c42f4980038a949",<br> "commit_message": "Simplify \"WouldTaintOrigin\" concept in media/blink\n\nCurrently WebMediaPlayer has three predicates:\n - DidGetOpaqueResponseFromServiceWorker\n - HasSingleSecurityOrigin\n - DidPassCORSAccessCheck\n. These are used to determine whether the response body is available\nfor scripts. They are known to be confusing, and actually\nMediaElementAudioSourceHandler::WouldTaintOrigin misuses them.\n\nThis CL merges the three predicates to one, WouldTaintOrigin, to remove\nthe confusion. Now the \"response type\" concept is available and we\ndon't need a custom CORS check, so this CL removes\nBaseAudioContext::WouldTaintOrigin. This CL also renames\nURLData::has_opaque_data_ and its (direct and indirect) data accessors\nto match the spec.\n\nBug: 849942, 875153\nChange-Id: I6acf50169d7445c4ff614e80ac606f79ee577d2a\nReviewed-on: https://chromium-review.googlesource.com/c/1238098\nReviewed-by: Fredrik Hubinette <[email protected]>\nReviewed-by: Kinuko Yasuda <[email protected]>\nReviewed-by: Raymond Toy <[email protected]>\nCommit-Queue: Yutaka Hirano <[email protected]>\nCr-Commit-Position: refs/heads/master@{#598258}",<br> "func_after": "void MultibufferDataSource::CreateResourceLoader(int64_t first_byte_position,\n int64_t last_byte_position) {\n DCHECK(render_task_runner_->BelongsToCurrentThread());\n\n SetReader(new MultiBufferReader(\n url_data()->multibuffer(), first_byte_position, last_byte_position,\n base::Bind(&MultibufferDataSource::ProgressCallback, weak_ptr_)));\n reader_->SetIsClientAudioElement(is_client_audio_element_);\n UpdateBufferSizes();\n}\n",<br> "func_before": "void MultibufferDataSource::CreateResourceLoader(int64_t first_byte_position,\n int64_t last_byte_position) {\n DCHECK(render_task_runner_->BelongsToCurrentThread());\n\n SetReader(new MultiBufferReader(\n url_data()->multibuffer(), first_byte_position, last_byte_position,\n base::Bind(&MultibufferDataSource::ProgressCallback, weak_ptr_)));\n reader_->SetIsClientAudioElement(is_client_audio_element_);\n UpdateBufferSizes();\n}\n",<br> "lang": "C",<br> "project": "Chrome",<br> "vul": 0<br> },<br> "truncated_cells": []<br> }<br>]</code> | <code>NEGATIVE: DATASET_NAME: "sfakhoury/NL2Fix"<br>FEATURES: {'defects4j_project': {'dtype': 'string', '_type': 'Value'}, 'defects4j_bug_id': {'dtype': 'string', '_type': 'Value'}, 'file_path': {'dtype': 'string', '_type': 'Value'}, 'bug_start_line': {'dtype': 'string', '_type': 'Value'}, 'bug_end_line': {'dtype': 'string', '_type': 'Value'}, 'issue_title': {'dtype': 'string', '_type': 'Value'}, 'issue_description': {'dtype': 'string', '_type': 'Value'}, 'original_src': {'dtype': 'string', '_type': 'Value'}, 'original_src_wo_comments': {'dtype': 'string', '_type': 'Value'}, 'fixed_src': {'dtype': 'string', '_type': 'Value'}, 'fixed_src_wo_comments': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "defects4j_project": "Math",<br> "defects4j_bug_id": "19",<br> "file_path": "src/main/java/org/apache/commons/math3/optimization/direct/CMAESOptimizer.java",<br> "bug_start_line": "504",<br> "bug_end_line": "561",<br> "issue_title": "Wide bounds to CMAESOptimizer result in NaN parameters passed to fitness function",<br> "issue_description": "If you give large values as lower/upper bounds (for example -Double.MAX_VALUE as a lower bound), the optimizer can call the fitness function with parameters set to NaN. My guess is this is due to FitnessFunction.encode/decode generating NaN when normalizing/denormalizing parameters. For example, if the difference between the lower and upper bound is greater than Double.MAX_VALUE, encode could divide infinity by infinity.",<br> "original_src": "private void checkParameters() {\n final double[] init = getStartPoint();\n final double[] lB = getLowerBound();\n final double[] uB = getUpperBound();\n\n // Checks whether there is at least one finite bound value.\n boolean hasFiniteBounds = false;\n for (int i = 0; i < lB.length; i++) {\n if (!Double.isInfinite(lB[i]) ||\n !Double.isInfinite(uB[i])) {\n hasFiniteBounds = true;\n break;\n }\n }\n // Checks whether there is at least one infinite bound value.\n boolean hasInfiniteBounds = false;\n if (hasFiniteBounds) {\n for (int i = 0; i < lB.length; i++) {\n if (Double.isInfinite(lB[i]) ||\n Double.isInfinite(uB[i])) {\n hasInfiniteBounds = true;\n break;\n }\n }\n\n if (hasInfiniteBounds) {\n // If there is at least one finite bound, none can be infinite,\n // because mixed cases are not supported by the current code.\n throw new MathUnsupportedOperationException();\n } else {\n // Convert API to internal handling of boundaries.\n boundaries = new double[2][];\n boundaries[0] = lB;\n boundaries[1] = uB;\n\n // Abort early if the normalization will overflow (cf. \"encode\" method).\n }\n } else {\n // Convert API to internal handling of boundaries.\n boundaries = null;\n }\n\n if (inputSigma != null) {\n if (inputSigma.length != init.length) {\n throw new DimensionMismatchException(inputSigma.length, init.length);\n }\n for (int i = 0; i < init.length; i++) {\n if (inputSigma[i] < 0) {\n throw new NotPositiveException(inputSigma[i]);\n }\n if (boundaries != null) {\n if (inputSigma[i] > boundaries[1][i] - boundaries[0][i]) {\n throw new OutOfRangeException(inputSigma[i], 0, boundaries[1][i] - boundaries[0][i]);\n }\n }\n }\n }\n }",<br> "original_src_wo_comments": "private void checkParameters ( ) { final double [ ] init = getStartPoint ( ) ; final double [ ] lB = getLowerBound ( ) ; final double [ ] uB = getUpperBound ( ) ; boolean hasFiniteBounds = false ; for ( int i = 0 ; i < lB . length ; i ++ ) { if ( ! Double . isInfinite ( lB [ i ] ) || ! Double . isInfinite ( uB [ i ] ) ) { hasFiniteBounds = true ; break ; } } boolean hasInfiniteBounds = false ; if ( hasFiniteBounds ) { for ( int i = 0 ; i < lB . length ; i ++ ) { if ( Double . isInfinite ( lB [ i ] ) || Double . isInfinite ( uB [ i ] ) ) { hasInfiniteBounds = true ; break ; } } if ( hasInfiniteBounds ) { throw new MathUnsupportedOperationException ( ) ; } else { boundaries = new double [ 2 ] [ ] ; boundaries [ 0 ] = lB ; boundaries [ 1 ] = uB ; } } else { boundaries = null ; } if ( inputSigma != null ) { if ( inputSigma . length != init . length ) { throw new DimensionMismatchException ( inputSigma . length , init . length ) ; } for ( int i = 0 ; i < init . 
length ; i ++ ) { if ( inputSigma [ i ] < 0 ) { throw new NotPositiveException ( inputSigma [ i ] ) ; } if ( boundaries != null ) { if ( inputSigma [ i ] > boundaries [ 1 ] [ i ] - boundaries [ 0 ] [ i ] ) { throw new OutOfRangeException ( inputSigma [ i ] , 0 , boundaries [ 1 ] [ i ] - boundaries [ 0 ] [ i ] ) ; } } } } }",<br> "fixed_src": "private void checkParameters() {\n final double[] init = getStartPoint();\n final double[] lB = getLowerBound();\n final double[] uB = getUpperBound();\n\n // Checks whether there is at least one finite bound value.\n boolean hasFiniteBounds = false;\n for (int i = 0; i < lB.length; i++) {\n if (!Double.isInfinite(lB[i]) ||\n !Double.isInfinite(uB[i])) {\n hasFiniteBounds = true;\n break;\n }\n }\n // Checks whether there is at least one infinite bound value.\n boolean hasInfiniteBounds = false;\n if (hasFiniteBounds) {\n for (int i = 0; i < lB.length; i++) {\n if (Double.isInfinite(lB[i]) ||\n Double.isInfinite(uB[i])) {\n hasInfiniteBounds = true;\n break;\n }\n }\n\n if (hasInfiniteBounds) {\n // If there is at least one finite bound, none can be infinite,\n // because mixed cases are not supported by the current code.\n throw new MathUnsupportedOperationException();\n } else {\n // Convert API to internal handling of boundaries.\n boundaries = new double[2][];\n boundaries[0] = lB;\n boundaries[1] = uB;\n\n // Abort early if the normalization will overflow (cf. \"encode\" method).\n for (int i = 0; i < lB.length; i++) {\n if (Double.isInfinite(boundaries[1][i] - boundaries[0][i])) {\n final double max = Double.MAX_VALUE + boundaries[0][i];\n final NumberIsTooLargeException e\n = new NumberIsTooLargeException(boundaries[1][i],\n max,\n true);\n e.getContext().addMessage(LocalizedFormats.OVERFLOW);\n e.getContext().addMessage(LocalizedFormats.INDEX, i);\n\n throw e;\n }\n }\n }\n } else {\n // Convert API to internal handling of boundaries.\n boundaries = null;\n }\n\n if (inputSigma != null) {\n if (inputSigma.length != init.length) {\n throw new DimensionMismatchException(inputSigma.length, init.length);\n }\n for (int i = 0; i < init.length; i++) {\n if (inputSigma[i] < 0) {\n throw new NotPositiveException(inputSigma[i]);\n }\n if (boundaries != null) {\n if (inputSigma[i] > boundaries[1][i] - boundaries[0][i]) {\n throw new OutOfRangeException(inputSigma[i], 0, boundaries[1][i] - boundaries[0][i]);\n }\n }\n }\n }\n }",<br> "fixed_src_wo_comments": "private void checkParameters ( ) { final double [ ] init = getStartPoint ( ) ; final double [ ] lB = getLowerBound ( ) ; final double [ ] uB = getUpperBound ( ) ; boolean hasFiniteBounds = false ; for ( int i = 0 ; i < lB . length ; i ++ ) { if ( ! Double . isInfinite ( lB [ i ] ) || ! Double . isInfinite ( uB [ i ] ) ) { hasFiniteBounds = true ; break ; } } boolean hasInfiniteBounds = false ; if ( hasFiniteBounds ) { for ( int i = 0 ; i < lB . length ; i ++ ) { if ( Double . isInfinite ( lB [ i ] ) || Double . isInfinite ( uB [ i ] ) ) { hasInfiniteBounds = true ; break ; } } if ( hasInfiniteBounds ) { throw new MathUnsupportedOperationException ( ) ; } else { boundaries = new double [ 2 ] [ ] ; boundaries [ 0 ] = lB ; boundaries [ 1 ] = uB ; for ( int i = 0 ; i < lB . length ; i ++ ) { if ( Double . isInfinite ( boundaries [ 1 ] [ i ] - boundaries [ 0 ] [ i ] ) ) { final double max = Double . MAX_VALUE + boundaries [ 0 ] [ i ] ; final NumberIsTooLargeException e = new NumberIsTooLargeException ( boundaries [ 1 ] [ i ] , max , true ) ; e . getContext ( ) . addMessage ( LocalizedFormats . 
OVERFLOW ) ; e . getContext ( ) . addMessage ( LocalizedFormats . INDEX , i ) ; throw e ; } } } } else { boundaries = null ; } if ( inputSigma != null ) { if ( inputSigma . length != init . length ) { throw new DimensionMismatchException ( inputSigma . length , init . length ) ; } for ( int i = 0 ; i < init . length ; i ++ ) { if ( inputSigma [ i ] < 0 ) { throw new NotPositiveException ( inputSigma [ i ] ) ; } if ( boundaries != null ) { if ( inputSigma [ i ] > boundaries [ 1 ] [ i ] - boundaries [ 0 ] [ i ] ) { throw new OutOfRangeException ( inputSigma [ i ] , 0 , boundaries [ 1 ] [ i ] - boundaries [ 0 ] [ i ] ) ; } } } } }"<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "defects4j_project": "Compress",<br> "defects4j_bug_id": "16",<br> "file_path": "src/main/java/org/apache/commons/compress/archivers/ArchiveStreamFactory.java",<br> "bug_start_line": "197",<br> "bug_end_line": "258",<br> "issue_title": "Too relaxed tar detection in ArchiveStreamFactory",<br> "issue_description": "The relaxed tar detection logic added in COMPRESS-117 unfortunately matches also some non-tar files like a [test AIFF file|https://svn.apache.org/repos/asf/tika/trunk/tika-parsers/src/test/resources/test-documents/testAIFF.aif] that Apache Tika uses. It would be good to improve the detection heuristics to still match files like the one in COMPRESS-117 but avoid false positives like the AIFF file in Tika.",<br> "original_src": "public ArchiveInputStream createArchiveInputStream(final InputStream in)\n throws ArchiveException {\n if (in == null) {\n throw new IllegalArgumentException(\"Stream must not be null.\");\n }\n\n if (!in.markSupported()) {\n throw new IllegalArgumentException(\"Mark is not supported.\");\n }\n\n final byte[] signature = new byte[12];\n in.mark(signature.length);\n try {\n int signatureLength = in.read(signature);\n in.reset();\n if (ZipArchiveInputStream.matches(signature, signatureLength)) {\n return new ZipArchiveInputStream(in);\n } else if (JarArchiveInputStream.matches(signature, signatureLength)) {\n return new JarArchiveInputStream(in);\n } else if (ArArchiveInputStream.matches(signature, signatureLength)) {\n return new ArArchiveInputStream(in);\n } else if (CpioArchiveInputStream.matches(signature, signatureLength)) {\n return new CpioArchiveInputStream(in);\n }\n\n // Dump needs a bigger buffer to check the signature;\n final byte[] dumpsig = new byte[32];\n in.mark(dumpsig.length);\n signatureLength = in.read(dumpsig);\n in.reset();\n if (DumpArchiveInputStream.matches(dumpsig, signatureLength)) {\n return new DumpArchiveInputStream(in);\n }\n\n // Tar needs an even bigger buffer to check the signature; read the first block\n final byte[] tarheader = new byte[512];\n in.mark(tarheader.length);\n signatureLength = in.read(tarheader);\n in.reset();\n if (TarArchiveInputStream.matches(tarheader, signatureLength)) {\n return new TarArchiveInputStream(in);\n }\n // COMPRESS-117 - improve auto-recognition\n if (signatureLength >= 512) {\n try {\n TarArchiveInputStream tais = new TarArchiveInputStream(new ByteArrayInputStream(tarheader));\n // COMPRESS-191 - verify the header checksum\n tais.getNextEntry();\n return new TarArchiveInputStream(in);\n } catch (Exception e) { // NOPMD\n // can generate IllegalArgumentException as well\n // as IOException\n // autodetection, simply not a TAR\n // ignored\n }\n }\n } catch (IOException e) {\n throw new ArchiveException(\"Could not use reset and mark operations.\", e);\n }\n\n throw new 
ArchiveException(\"No Archiver found for the stream signature\");\n }",<br> "original_src_wo_comments": "public ArchiveInputStream createArchiveInputStream ( final InputStream in ) throws ArchiveException { if ( in == null ) { throw new IllegalArgumentException ( \"Stream must not be null.\" ) ; } if ( ! in . markSupported ( ) ) { throw new IllegalArgumentException ( \"Mark is not supported.\" ) ; } final byte [ ] signature = new byte [ 12 ] ; in . mark ( signature . length ) ; try { int signatureLength = in . read ( signature ) ; in . reset ( ) ; if ( ZipArchiveInputStream . matches ( signature , signatureLength ) ) { return new ZipArchiveInputStream ( in ) ; } else if ( JarArchiveInputStream . matches ( signature , signatureLength ) ) { return new JarArchiveInputStream ( in ) ; } else if ( ArArchiveInputStream . matches ( signature , signatureLength ) ) { return new ArArchiveInputStream ( in ) ; } else if ( CpioArchiveInputStream . matches ( signature , signatureLength ) ) { return new CpioArchiveInputStream ( in ) ; } final byte [ ] dumpsig = new byte [ 32 ] ; in . mark ( dumpsig . length ) ; signatureLength = in . read ( dumpsig ) ; in . reset ( ) ; if ( DumpArchiveInputStream . matches ( dumpsig , signatureLength ) ) { return new DumpArchiveInputStream ( in ) ; } final byte [ ] tarheader = new byte [ 512 ] ; in . mark ( tarheader . length ) ; signatureLength = in . read ( tarheader ) ; in . reset ( ) ; if ( TarArchiveInputStream . matches ( tarheader , signatureLength ) ) { return new TarArchiveInputStream ( in ) ; } if ( signatureLength >= 512 ) { try { TarArchiveInputStream tais = new TarArchiveInputStream ( new ByteArrayInputStream ( tarheader ) ) ; tais . getNextEntry ( ) ; return new TarArchiveInputStream ( in ) ; } catch ( Exception e ) { } } } catch ( IOException e ) { throw new ArchiveException ( \"Could not use reset and mark operations.\" , e ) ; } throw new ArchiveException ( \"No Archiver found for the stream signature\" ) ; }",<br> "fixed_src": "public ArchiveInputStream createArchiveInputStream(final InputStream in)\n throws ArchiveException {\n if (in == null) {\n throw new IllegalArgumentException(\"Stream must not be null.\");\n }\n\n if (!in.markSupported()) {\n throw new IllegalArgumentException(\"Mark is not supported.\");\n }\n\n final byte[] signature = new byte[12];\n in.mark(signature.length);\n try {\n int signatureLength = in.read(signature);\n in.reset();\n if (ZipArchiveInputStream.matches(signature, signatureLength)) {\n return new ZipArchiveInputStream(in);\n } else if (JarArchiveInputStream.matches(signature, signatureLength)) {\n return new JarArchiveInputStream(in);\n } else if (ArArchiveInputStream.matches(signature, signatureLength)) {\n return new ArArchiveInputStream(in);\n } else if (CpioArchiveInputStream.matches(signature, signatureLength)) {\n return new CpioArchiveInputStream(in);\n }\n\n // Dump needs a bigger buffer to check the signature;\n final byte[] dumpsig = new byte[32];\n in.mark(dumpsig.length);\n signatureLength = in.read(dumpsig);\n in.reset();\n if (DumpArchiveInputStream.matches(dumpsig, signatureLength)) {\n return new DumpArchiveInputStream(in);\n }\n\n // Tar needs an even bigger buffer to check the signature; read the first block\n final byte[] tarheader = new byte[512];\n in.mark(tarheader.length);\n signatureLength = in.read(tarheader);\n in.reset();\n if (TarArchiveInputStream.matches(tarheader, signatureLength)) {\n return new TarArchiveInputStream(in);\n }\n // COMPRESS-117 - improve auto-recognition\n if 
(signatureLength >= 512) {\n try {\n TarArchiveInputStream tais = new TarArchiveInputStream(new ByteArrayInputStream(tarheader));\n // COMPRESS-191 - verify the header checksum\n if (tais.getNextTarEntry().isCheckSumOK()) {\n return new TarArchiveInputStream(in);\n }\n } catch (Exception e) { // NOPMD\n // can generate IllegalArgumentException as well\n // as IOException\n // autodetection, simply not a TAR\n // ignored\n }\n }\n } catch (IOException e) {\n throw new ArchiveException(\"Could not use reset and mark operations.\", e);\n }\n\n throw new ArchiveException(\"No Archiver found for the stream signature\");\n }",<br> "fixed_src_wo_comments": "public ArchiveInputStream createArchiveInputStream ( final InputStream in ) throws ArchiveException { if ( in == null ) { throw new IllegalArgumentException ( \"Stream must not be null.\" ) ; } if ( ! in . markSupported ( ) ) { throw new IllegalArgumentException ( \"Mark is not supported.\" ) ; } final byte [ ] signature = new byte [ 12 ] ; in . mark ( signature . length ) ; try { int signatureLength = in . read ( signature ) ; in . reset ( ) ; if ( ZipArchiveInputStream . matches ( signature , signatureLength ) ) { return new ZipArchiveInputStream ( in ) ; } else if ( JarArchiveInputStream . matches ( signature , signatureLength ) ) { return new JarArchiveInputStream ( in ) ; } else if ( ArArchiveInputStream . matches ( signature , signatureLength ) ) { return new ArArchiveInputStream ( in ) ; } else if ( CpioArchiveInputStream . matches ( signature , signatureLength ) ) { return new CpioArchiveInputStream ( in ) ; } final byte [ ] dumpsig = new byte [ 32 ] ; in . mark ( dumpsig . length ) ; signatureLength = in . read ( dumpsig ) ; in . reset ( ) ; if ( DumpArchiveInputStream . matches ( dumpsig , signatureLength ) ) { return new DumpArchiveInputStream ( in ) ; } final byte [ ] tarheader = new byte [ 512 ] ; in . mark ( tarheader . length ) ; signatureLength = in . read ( tarheader ) ; in . reset ( ) ; if ( TarArchiveInputStream . matches ( tarheader , signatureLength ) ) { return new TarArchiveInputStream ( in ) ; } if ( signatureLength >= 512 ) { try { TarArchiveInputStream tais = new TarArchiveInputStream ( new ByteArrayInputStream ( tarheader ) ) ; if ( tais . getNextTarEntry ( ) . isCheckSumOK ( ) ) { return new TarArchiveInputStream ( in ) ; } } catch ( Exception e ) { } } } catch ( IOException e ) { throw new ArchiveException ( \"Could not use reset and mark operations.\" , e ) ; } throw new ArchiveException ( \"No Archiver found for the stream signature\" ) ; }"<br> },<br> "truncated_cells": []<br> }<br>]</code> | | <code>USER_QUERY: english korean translation dataset</code> | <code>HUB_DATASET_PREVIEW: DATASET_NAME: "yoonjae22/Aihub_translate"<br>FEATURES: {'instruction': {'dtype': 'string', '_type': 'Value'}, 'output': {'dtype': 'string', '_type': 'Value'}, 'text': {'dtype': 'string', '_type': 'Value'}, 'input': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "input": "Bible Coloring' is a coloring application that allows you to experience beautiful stories in the Bible.",<br> "output": "'Bible Coloring'\uc740 \uc131\uacbd\uc758 \uc544\ub984\ub2e4\uc6b4 \uc774\uc57c\uae30\ub97c \uccb4\ud5d8 \ud560 \uc218 \uc788\ub294 \uceec\ub7ec\ub9c1 \uc571\uc785\ub2c8\ub2e4.",<br> "instruction": "Please translate the English sentence into Korean.",<br> "text": "Below is an instruction that describes a task. 
Write a response that appropriately completes the request.\n\n### Instruction:\nBible Coloring' is a coloring application that allows you to experience beautiful stories in the Bible.\n\n###Response:\n'Bible Coloring'\uc740 \uc131\uacbd\uc758 \uc544\ub984\ub2e4\uc6b4 \uc774\uc57c\uae30\ub97c \uccb4\ud5d8 \ud560 \uc218 \uc788\ub294 \uceec\ub7ec\ub9c1 \uc571\uc785\ub2c8\ub2e4."<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "input": "Do you work at a City bank?",<br> "output": "\uc528\ud2f0\uc740\ud589\uc5d0\uc11c \uc77c\ud558\uc138\uc694?",<br> "instruction": "Please translate the English sentence into Korean.",<br> "text": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\nDo you work at a City bank?\n\n###Response:\n\uc528\ud2f0\uc740\ud589\uc5d0\uc11c \uc77c\ud558\uc138\uc694?"<br> },<br> "truncated_cells": []<br> }<br>]</code> | <code>NEGATIVE: DATASET_NAME: "werty1248/EnKo-Translation-LongTextOnly-dedup"<br>FEATURES: {'english': {'dtype': 'string', '_type': 'Value'}, 'korean': {'dtype': 'string', '_type': 'Value'}, 'from': {'dtype': 'string', '_type': 'Value'}, 'category': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "english": "ROOFTOP GREENING STRUCTURETo provide a structure firmly and easily installing a house cultivation arch-like aggregate in a rooftop greening structure. This rooftop greening structure includes pressingly fixing each of support stands 24 in each of support stand line groups 24A to a rooftop slab surface through a greening support layer 6, using a fastener which pierces into the greening support layer 6, and steps over between each of the support stands 24 and the rooftop slab surface 2, and installing a holding member 36 for holding a house cultivation arch-like aggregate 50 each on the upper end surface of each of the support stands 24 in each of the support stand line groups 24A. As a result of this, the support stand 24 which has stiffness higher than the greening support layer 6, and is firmly fixed to the rooftop slab surface 2 through the greening support layer 6 is used for holding the end part of the arch-like aggregate 50. The holding member 36 for holding the end part of the arch-like aggregate 50 is installed on the upper end surface of the support stand 24 so as to suppress the holding member 36 from burying in soil and increase the visibility.In a rooftop greening structure in which a greening support layer is formed by laying a plurality of greening support panels on the rooftop floor and soil is arranged on the greening support layer, a pair of support stands are placed on the greening support layer. The rows are arranged so as to be separated from each other, and each of the support rows is arranged upright so as to form a row with a plurality of supports having higher rigidity than the greening support layer. Each support pedestal in each support pedestal row group is configured through the greening support layer by using a fastener that penetrates the greening support layer and straddles between each support pedestal and the rooftop floor surface. It is characterized in that it is pressed and fixed to the rooftop floor surface, and the upper end surface of each support stand in each support stand row group is provided with a holding portion for holding an arch-shaped aggregate for house cultivation. 
Rooftop greening structure.",<br> "korean": "\uc625\uc0c1 \ub179\ud654 \uad6c\uc870\uc625\uc0c1 \ub179\ud654 \uad6c\uc870\uc5d0 \uc788\uc5b4\uc11c \ud558\uc6b0\uc2a4 \uc7ac\ubc30\uc6a9 \uc544\uce58\ud615 \uace8\uc7ac\ub97c \uacac\uace0\ud558\uace0 \uc6a9\uc774\ud558\uac8c \uace0\uc815\ud558\ub294 \uad6c\uc870\ub97c \uc81c\uacf5\ud55c\ub2e4. \uac01 \uc9c0\uc9c0\ub300\ub82c\uad70 24 A\uc758 \uac01 \uc9c0\uc9c0\ub300 24\ub97c \ub179\ud654 \uc9c0\uc6d0\uce35 6\uc744 \uad00\ud1b5\ud574 \uac01 \uc9c0\uc9c0\ub300 24\uc640 \uc625\uc0c1 \uc2ac\ub798\ube0c\uba74 2 \uc0ac\uc774\ub97c \ub118\ub294 \uace0\uc815\uad6c\ub97c \uc774\uc6a9\ud568\uc73c\ub85c\uc368, \ub179\ud654 \uc9c0\uc6d0\uce35 6\uc744 \ud1b5\ud574 \uc0c1\uae30 \uc625\uc0c1 \uc2ac\ub798\ube0c\uba74\uc5d0 \uac00\uc555 \uace0\uc815\ud558\uace0 \uadf8 \uac01 \uc9c0\uc9c0\ub300\ub82c\uad70 24 A\uc758 \uac01 \uc9c0\uc9c0\ub300 24\uc758 \uc0c1\ub2e8\uba74\uc5d0 \ud558\uc6b0\uc2a4 \uc7ac\ubc30\uc6a9 \uc544\uce58\ud615 \uace8\uc7ac 50\uc744 \uc9c0\uc9c0\ud558\uae30 \uc704\ud55c \uc9c0\uc9c0 \ubd80\uc7ac 36\uc744 \uac01\uac01 \ub9c8\ub828\ud55c\ub2e4. \uc774\uac83\uc5d0 \uc758\ud574 \uc544\uce58\ud615 \uace8\uc7ac 50\uc758 \ub2e8\ubd80\ub97c \uc9c0\uc9c0\ud558\ub294 \uac83\uc73c\ub85c\uc11c \ub179\ud654 \uc9c0\uc6d0\uce35 6\ubcf4\ub2e4 \uac15\uc131\uc774 \ub192\uace0 \uc625\uc0c1 \uc2ac\ub798\ube0c\uba74 2\uc5d0 \ub179\ud654 \uc9c0\uc6d0\uce35 6\uc744 \ud1b5\ud574 \uc81c\ub300\ub85c \uace0\uc815\ub41c \uc9c0\uc9c0\ub300 24\uac00 \uc774\uc6a9\ub418\ub3c4\ub85d \ud55c\ub2e4. \ub610\ud55c \uc544\uce58\ud615 \uace8\uc7ac 50\uc758 \ub2e8\ubd80\ub97c \uc9c0\uc9c0\ud558\ub294 \uc9c0\uc9c0 \ubd80\uc7ac 36\uc744 \uc9c0\uc9c0\ub300 24\uc758 \uc0c1\ub2e8\uba74\uc5d0 \ub9c8\ub828\ud568\uc73c\ub85c\uc368, \ud1a0\uc591\uc5d0 \ud30c\ubb3b\ud788\ub294 \uac83\uc744 \uc5b5\uc81c\ud558\uace0 \uadf8 \uc9c0\uc9c0 \ubd80\uc7ac 36\uc758 \uc2dc\uc778\uc131\uc744 \ud5a5\uc0c1\uc2dc\ud0a8\ub2e4.\uc625\uc0c1 \ubc14\ub2e5\uba74\uc0c1\uc5d0 \ubcf5\uc218\uc758 \ub179\ud654 \uc9c0\uc6d0 \ud328\ub110\uc744 \ubd80\uc124\ud568\uc73c\ub85c\uc368 \ub179\ud654 \uc9c0\uc6d0\uce35\uc774 \ud615\uc131\ub418\uace0 \uc0c1\uae30 \ub179\ud654 \uc9c0\uc6d0\uce35\uc0c1\uc5d0 \ud1a0\uc591\uc774 \ubc30\uc124\ub418\ub294 \uc625\uc0c1 \ub179\ud654 \uad6c\uc870\uc5d0 \uc788\uc5b4\uc11c \uc0c1\uae30 \ub179\ud654 \uc9c0\uc6d0\uce35\uc0c1\uc5d0 \ud55c \uc30d\uc758 \uc9c0\uc9c0\ub300\ub82c\uad70\uc774 \uc11c\ub85c \uc774\uaca9\ub41c \uc0c1\ud0dc\ub97c \uac00\uc9c0\uace0 \ubc30\uce58\ub418\uace0 \uc0c1\uae30 \uac01 \uc9c0\uc9c0\ub300\ub82c\uad70\uc774 \uc0c1\uae30 \ub179\ud654 \uc9c0\uc6d0\uce35\ubcf4\ub2e4 \uac15\uc131\uc774 \ud5a5\uc0c1\ub41c \ubcf5\uc218\uc758 \uc9c0\uc9c0\ub300\ub97c \uac04\uaca9\uc744 \ub450\uba74\uc11c \uc5f4\uc744 \uc774\ub8e8\ub3c4\ub85d \uc785\uc124 \ubc30\uce58\ud568\uc73c\ub85c\uc368 \uad6c\uc131\ub418\uace0 \uc0c1\uae30 \uac01 \uc9c0\uc9c0\ub300\ub82c\uad70\uc758 \uac01 \uc9c0\uc9c0\ub300\uac00 \uc0c1\uae30 \ub179\ud654 \uc9c0\uc6d0\uce35\uc744 \uad00\ud1b5\ud574 \uc0c1\uae30 \uac01 \uc9c0\uc9c0\ub300\uc640 \uc0c1\uae30 \uc625\uc0c1 \ubc14\ub2e5\uba74 \uc0ac\uc774\ub97c \ub118\ub294 \uace0\uc815\uad6c\ub97c \uc774\uc6a9\ud568\uc73c\ub85c\uc368, \uc0c1\uae30 \ub179\ud654 \uc9c0\uc6d0\uce35\uc744 \ud1b5\ud574 \uc0c1\uae30 \uc625\uc0c1 \ubc14\ub2e5\uba74\uc5d0 \uac00\uc555 \uace0\uc815\ub418\uc5b4 \uc0c1\uae30 \uac01 \uc9c0\uc9c0\ub300\ub82c\uad70\uc758 \uac01 \uc9c0\uc9c0\ub300\uc758 \uc0c1\ub2e8\uba74\uc5d0\ub294 \ud558\uc6b0\uc2a4 \uc7ac\ubc30\uc6a9 \uc544\uce58\ud615 
\uace8\uc7ac\ub97c \uc9c0\uc9c0\ud558\uae30 \uc704\ud55c \uc9c0\uc9c0\ubd80\uac00 \uac01\uac01 \uad6c\ube44\ub418\uc5b4 \uc788\ub294, \uac83\uc744 \ud2b9\uc9d5\uc73c\ub85c \ud558\ub294 \uc625\uc0c1 \ub179\ud654 \uad6c\uc870.",<br> "from": "nayohan/aihub-en-ko-translation-12m",<br> "category": "full"<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "english": "Native chicken breeding methodThe invention discloses a native chicken breeding method, which includes steps that the shield degree of a breeding grove is 60-65%; a native chicken activity field with area of 5-8 mu is encircled bya 1.8-2.2m high nylon mesh; a ventilating and warming device is arranged in a henhouse; feed and water are delivered at 8: 00-15: 00 in every morning, and native chicken are put in grove activity field at 15:00-17: 00 in the afternoon; music is displayed at 17: 00-18: 30, and feed is delivered at outside of the henhouse to domesticize the native chickens , and then chickens are returned to the henhouse; the henhouse is cleaned at intervals of 12-15 days; the henhouse is sterilized by an automatic sterilizing system during the stocking period in the afternoon at intervals of 3-5 days. The native chicken breeding method can well consider about the stocking time, thus the stocking instinct of the native chickens is well guaranteed, the food intake of the native chickens is increased throughthe reasonable captive time; the meat growth is accelerated, the breeding cycle is shortened, and the meat quality of the native chickens is ensured.A kind of 1. cultural method of chicken, it is characterised in that\uff1ait the described method comprises the following steps\uff1a\uff081\uff09selection cultivation ground\uff1aselection away from livestock and poultry transaction place, slaughtering field, chemical plant, garbage disposal plant, avoid air, dust, water source, germ and the cultivation of the woods of noise pollution, the moon degree of covering of the woods is 60~65%, with 1.8~2.2 meters of high nylon net circle area is 5~8 mu of chicken playground, and vegetable seeds is broadcasted sowing in forest land\uff1b\uff082\uff09build chicken house\uff1athe wind sheltering in woods ground on the sunny side, hen house is built in the chicken playground centre position that physical features is high and dry, draining blowdown condition is good, and ventilation heating is set in hen house equipment, hen house is interior to set automatic sterilizing system\uff1b\uff083\uff09select kind\uff1aselect it is resistance to it is extensive, action flexibly, the pure native that power of looking for food is strong, premunition is strong\uff1b\uff084\uff09dietary management\uff1aevery mu of forest land puts 260~280 in a suitable place to breed, every morning 8:00~15:feed and water are launched in stable breeding when 00, afternoon 15:00~17:it is put into forest land playground when 00 to put in a suitable place to breed, 17:00~18:dispensing feed outside music colony house is played when 30 to enter row domestication makes chicken return to colony house, and day temperature is maintained at 20~23 degrees celsius in circle, and nocturnal temperature is maintained at 20~23 degrees celsius\uff1b \uff085\uff09disinfectant management\uff1ato being cleaned in hen house, colony house is started certainly during chicken is put in a suitable place to breed afternoon within every 3~5 days within every 12~15 days dynamic disinfection system is sterilized, and lime powder for every 2~3 months to the main 
passageway in woods forest land.",<br> "korean": "\ud1a0\uc885\ub2ed \uc0ac\uc721\ubc29\ubc95\uc774 \ubc1c\uba85\ud488\uc740 \uc0ac\uc721\uc7a5\uc758 \ubc29\ud328\ub3c4\uac00 60~65%\uc778 \ud1a0\uc885\ub2ed \uc0ac\uc721\ubc95\uc744 \uacf5\uac1c\ud558\uace0 \uc788\uc73c\uba70, \uba74\uc801\uc774 5~8m\uc778 \ud1a0\uc885\ub2ed \ud65c\ub3d9\uc7a5\uc744 1.8~2.2m \ub192\uc774\uc758 \ub098\uc77c\ub860 \uba54\uc2dc\ub85c \ub458\ub7ec\uc2f8\uace0 \uc788\uc73c\uba70, \ub2ed\uc7a5\uc5d0 \ud658\uae30 \ubc0f \ub09c\ubc29 \uc7a5\uce58\uac00 \ubc30\uce58\ub418\uc5b4 \uc788\uc73c\uba70, \ub9e4\uc77c \uc544\uce68 8\uc2dc~15\ubd84\uc5d0 \uc0ac\ub8cc\uc640 \ubb3c\uc774 \uc804\ub2ec\ub418\uace0 \uc788\ub2e4. \uadf8\ub9ac\uace0 \ud1a0\uc885\ub2ed\uc740 \uc624\ud6c4 15:00-17:00\uc5d0 \uc232 \ud65c\ub3d9\uc7a5\uc5d0 \ud22c\uc785\ub418\uace0, 17: 00-18:30\uc5d0\ub294 \uc74c\uc545\uc774 \uc5f0\uc8fc\ub418\uba70, \ubaa8\uc774\ub294 \ub2ed\uc7a5 \ubc16\uc5d0\uc11c \ubc30\ub2ec\uc744 \ubc1b\uc544 \ud1a0\uc885\ub2ed\uc744 \uae38\ub4e4\uc774\uace0, \ub2ed\uc7a5\uc740 12-15\uc77c \uac04\uaca9\uc73c\ub85c \ub2ed\uc7a5\uc73c\ub85c \ub3cc\ub824\ubcf4\ub0b8\ub2e4; \ub2ed\uc7a5\uc740 \uc790\ub3d9\uc18c\ub3c5\ub41c\ub2e4.c \uc624\ud6c4\uc758 \ubcf4\uad00 \uae30\uac04 \ub3d9\uc548 3~5\uc77c \uac04\uaca9\uc73c\ub85c \uba78\uade0 \uc2dc\uc2a4\ud15c. \ud1a0\uc885\ub2ed \uc0ac\uc721\ubc95\uc740 \uc0ac\uc721 \uc2dc\uac04\uc744 \uc798 \uace0\ub824\ud560 \uc218 \uc788\uae30 \ub54c\ubb38\uc5d0 \ud1a0\uc885\ub2ed\uc758 \uc0ac\uc721 \ubcf8\ub2a5\uc774 \uc798 \ubcf4\uc7a5\ub418\uace0, \ud1a0\uc885\ub2ed\uc758 \uba39\uc774 \uc12d\ucde8\uac00 \uc801\uc808\ud55c \ud3ec\ud68d \uc2dc\uac04\uc744 \ud1b5\ud574 \uc99d\uac00\ud55c\ub2e4; \uc721\uc2dd \uc131\uc7a5\uc774 \uac00\uc18d\ud654\ub418\uace0, \ubc88\uc2dd \uc8fc\uae30\uac00 \uc9e7\uc544\uc9c0\uba70, \ud1a0\uc885\ub2ed\uc758 \uc721\uc9c8\ub3c4 e\uc774\ub2e4.\ub204\uc5d0\uc288\uc5b4\ub2ed\uc758 \uc77c\uc885\uc73c\ub85c, \ubb18\uc0ac\ub41c \ubc29\ubc95\uc740 \ub2e4\uc74c\uacfc \uac19\uc740 \ub2e8\uacc4\ub85c \uad6c\uc131\ub41c\ub2e4: \uff091select\uc120\uc815\uc7ac\ubc30\uc7a5: \uac00\ucd95\uacfc \uac00\uae08\ub958 \uac70\ub798\uc7a5\uc18c\ub85c\ubd80\ud130\uc758 \uc120\ud0dd, \ub3c4\ucd95\uc7a5, \ud654\ud559\uacf5\uc7a5, \uc4f0\ub808\uae30 \ucc98\ub9ac\uc7a5, \uacf5\uae30, \uba3c\uc9c0, \uc218\uc6d0, \uc138\uade0, \uadf8\ub9ac\uace0 \uc232\uc758 \ubb34\uade0 \uc7ac\ubc30\uc774\uc138\uc624\uc5fc, \uc232\uc758 \ub2ec\uc758 \ub36e\uc784\ub3c4\ub294 60~65%\uc774\uace0, \ub192\uc740 \ub098\uc77c\ub860 \uadf8\ubb3c\ub9dd \uba74\uc801 1.8~2.2m\ub294 \ub2ed \ub180\uc774\ud130\uc758 5~8mu\uc774\uba70, \uc232 \uc18d\uc5d0 \ucc44\uc18c \uc528\uc557\uc744 \ubfcc\ub9ac\ub294 \uac83\uc744 \ubc29\uc1a1\ud55c\ub2e4. 
\uc2e0\uccb4\uc801 \ud2b9\uc9d5\uc774 \ub192\uace0 \uac74\uc870\ud558\uba70 \ubc30\uc218 \ube14\ub85c\uc6b0\ub2e4\uc6b4 \uc870\uac74\uc774 \uc88b\ub2e4, \uadf8\ub9ac\uace0 \ud658\uae30 \ub09c\ubc29\uc740 \ub2ed\uc9d1 \uc7a5\ube44\uc5d0 \uc124\uc815\ub41c\ub2e4, \ub2ed\uc9d1\uc740 \uc790\ub3d9 \uc0b4\uade0 \uc2dc\uc2a4\ud15c\uc744 \uc124\uc815\ud558\uae30 \uc704\ud55c \ub0b4\ubd80\uc774\ub2e4;33selectselect cind;select codelt it's \uad11\ubc94\uc704\ud558\uace0, \uc720\uc5f0\ud558\uac8c \uc791\uc6a9\ud558\uba70, \uc74c\uc2dd\uc744 \ucc3e\ub294 \ud798\uc774 \uac15\ud55c \uc21c\uc218\ud55c \ud1a0\uc885, \uc608\uac10\uc774 \uac15\ud558\ub2e4;select4aary \uad00\ub9ac:\uc784\uc57c\uc758 \ubaa8\ub4e0 \ubba4\ub294 260~280\ubc88\uc2dd\uc744 \ud558\uae30\uc5d0 \uc801\ud569\ud55c \uc7a5\uc18c\uc5d0 \ubc30\uce58\ud558\uace0, \ub9e4\uc77c \uc544\uce68 8:00~15:\uc0ac\ub8cc\uc640 \ubb3c\uc740 00\ubc88\uc2dd\uc744 \ud560 \ub54c \uc548\uc815\uc801\uc778 \ubc88\uc2dd\uc9c0\ub85c \ud22c\uc785\ud558\uace0, 17:00~18:\uc74c\uc545\uc9d1 \uc678\ubd80\uc758 \uc0ac\ub8cc\ub4e4\uc774 30\ubc88 \uc904\uc5d0 \ub4e4\uc5b4\uc11c\uba74 \uc7ac\uc0dd\ub429\ub2c8\ub2e4.\ub2ed\uc758 \uad70\uc9d1 \ubcf5\uadc0\ub294 \uc544\uc774\ub514\ucf00\uc774\uc158\uc73c\ub85c, \ub0ae \uae30\uc628\uc740 \uc6d0\uc8fc 20~23\ub3c4, \uc57c\ud589\uc131 \uc628\ub3c4\ub294 20~23\ub3c4\ub97c \uc720\uc9c0\ud558\uba70, \u30105\u3011\uc911\uc694\ud55c \uad00\ub9ac:\ub2ed\uc9d1 \uccad\uc18c\ub294 \ubc18\ub4dc\uc2dc \uc2dc\uc791\ud558\uba70, \ub2ed\uc740 3\ub144\ub9c8\ub2e4 \uc624\ud6c4\ub9c8\ub2e4 \ubc88\uc2dd\ud558\uae30\uc5d0 \uc801\ud569\ud55c \uc7a5\uc18c\uc5d0 \ub454\ub2e4.12~15\uc77c \uc774\ub0b4\uc5d0\ub294 5\uc77c \uc774\ub0b4 \ub3d9\uc801\uc18c\ub3c5\uc2dc\uc2a4\ud15c\uc774 \uba78\uade0 \ucc98\ub9ac\ub418\uba70, \uc232\uc18d\uc758 \uc8fc\ud1b5\ub85c\ub85c 2~3\uac1c\uc6d4\ub9c8\ub2e4 \ub77c\uc784\ud30c\uc6b0\ub354\uac00 \ud22c\uc785\ub41c\ub2e4.",<br> "from": "nayohan/aihub-en-ko-translation-12m",<br> "category": "full"<br> },<br> "truncated_cells": []<br> }<br>]</code> | * Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### query-to-dataset-viewer-descriptions * Dataset: [query-to-dataset-viewer-descriptions](https://huggingface.co/datasets/davanstrien/query-to-dataset-viewer-descriptions) at [eb9d1be](https://huggingface.co/datasets/davanstrien/query-to-dataset-viewer-descriptions/tree/eb9d1becf412659d97049b5895ad5521f8015383) * Size: 1,433 evaluation samples * Columns: <code>query</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | query | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 11.75 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 46 tokens</li><li>mean: 2188.82 tokens</li><li>max: 8192 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 1769.64 tokens</li><li>max: 8192 tokens</li></ul> | * Samples: | query | positive | negative | 
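The loss configuration above maps directly onto the `sentence_transformers` API. Below is a minimal sketch of how this loss can be instantiated with the listed parameters (`scale=20.0`, `similarity_fct=cos_sim`); the base checkpoint name, the `mini_batch_size` value, the split name, and the trainer invocation are illustrative assumptions, not taken from this card.

```python
# Minimal sketch (not the exact training script): instantiating
# CachedMultipleNegativesRankingLoss with the parameters listed above.
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, util
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss

# Illustrative base checkpoint; the card does not restate it here.
model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# (query, positive, negative) triplets, as in the sample rows shown above.
# The "train" split name is an assumption.
train_dataset = load_dataset("davanstrien/query-to-dataset-viewer-descriptions", split="train")

# scale and similarity_fct come from the card; mini_batch_size is an
# assumption. The cached (GradCache) variant embeds the batch in
# mini-batches, so a large effective batch size -- and therefore many
# in-batch negatives -- fits in roughly constant memory.
loss = CachedMultipleNegativesRankingLoss(
    model,
    scale=20.0,
    similarity_fct=util.cos_sim,
    mini_batch_size=32,
)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```

With this loss, every other positive and negative in the batch also acts as an in-batch negative for a given query, which is why larger effective batch sizes generally improve retrieval quality.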
|:---|:---|:---|
| <code>USER_QUERY: multimodal conversation dataset</code> | <code>HUB_DATASET_PREVIEW: DATASET_NAME: "BUAADreamer/llava-en-zh-2k"<br>FEATURES: {'messages': [{'role': {'dtype': 'string', '_type': 'Value'}, 'content': {'dtype': 'string', '_type': 'Value'}}], 'images': {'feature': {'_type': 'Image'}, '_type': 'Sequence'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "messages": [<br> {<br> "role": "user",<br> "content": "<image>How many baseball players are visible in the image?"<br> },<br> {<br> "role": "assistant",<br> "content": "There are three baseball players visible in the image."<br> },<br> {<br> "role": "user",<br> "content": "What are the players holding in their hands?"<br> },<br> {<br> "role": "assistant",<br> "content": "The players are holding baseball bats in their hands."<br> },<br> {<br> "role": "user",<br> "content": "Are the players in a dugout?"<br> },<br> {<br> "role": "assistant",<br> "content": "Yes, the three baseball players are standing in the dugout."<br> },<br> {<br> "role": "user",<br> "content": "Is the image in color or black and white?"<br> },<br> {<br> "role": 
"assistant",<br> "content": "The image is an old black and white photo of the three baseball players."<br> },<br> {<br> "role": "user",<br> "content": "Do the players belong to a specific baseball team?"<br> },<br> {<br> "role": "assistant",<br> "content": "Yes, the players belong to the Boston Red Sox baseball team."<br> }<br> ],<br> "images": [<br> {<br> "src": "https://datasets-server.huggingface.co/assets/BUAADreamer/llava-en-zh-2k/--/fba994c834822bddd3cd79e929c33135f4289d2b/--/en/train/0/images/image-1d100e9.jpg?Expires=1726591851&Signature=QHGD147HyWamORfSjz0QoG51Ru86g3STPBNDAEOLK7NTq8Y~b4vVt3u~XF9njlRWwNnVF7AQ8-l9f2pCWxggPnZw1wZEfBAC5Q1oOW2CwT-gMYME~I-9qeJrbQtkszer9U0-H5rkECK0DVgWKKIagyjjJBmSSLz2QAb3BlV5aEMGEbJj-4G1X2xzoVaMIfg~ogGSb6~2PAKoLoKMPUZIxrm~2LoPEGVJeXr7JIa8-ISVukshW8ZEDQII~eXfOBeEsTd1Mk5BpJz26dH3W7X2rdhq54KeQ8H9OBt~HlU0XbXigFABeMymb-GBH-fDNHUDmb9Zt3io5hBMOd7I64DLyQ__&Key-Pair-Id=K3EI6M078Z3AC3",<br> "height": 308,<br> "width": 384<br> }<br> ]<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "messages": [<br> {<br> "role": "user",<br> "content": "What is the man wearing in the image?<image>"<br> },<br> {<br> "role": "assistant",<br> "content": "The man in the image is wearing chaps, a cowboy hat, and is dressed like a cowboy."<br> },<br> {<br> "role": "user",<br> "content": "What is the cowboy doing in the image?"<br> },<br> {<br> "role": "assistant",<br> "content": "The cowboy is riding on a horse in a grassy field and pointing into the distance, likely directing attention towards a specific location or object."<br> },<br> {<br> "role": "user",<br> "content": "What kind of landscape surrounds the cowboy and his horse?"<br> },<br> {<br> "role": "assistant",<br> "content": "The cowboy and his horse are surrounded by a grassy field, which creates a natural and open setting for the scene."<br> },<br> {<br> "role": "user",<br> "content": "What is the significance of the cowboy attire and its relation to horse riding?"<br> },<br> {<br> "role": "assistant",<br> "content": "Cowboy attire is deeply rooted in the history, culture, and traditions of the American West. It is both functional and symbolic, representing the hardworking and rugged lifestyle of cowboys who spent much of their time working with livestock and riding horses.\n\nThe key elements of cowboy attire, like the cowboy hat and chaps, serve specific purposes:\n1. Cowboy Hat: A wide-brimmed hat made of stiffened felt or straw is designed to protect the cowboy from the sun, wind, and rain. The hat's distinctive high crown allows for air circulation and provides a touch of style.\n2. Chaps: These are sturdy leather leg coverings worn over pants to protect the cowboy's legs from rough terrain, thorny bushes, and other elements while riding or working with livestock. Chaps provide an additional layer of protection and durability, particularly while riding through rugged landscapes and herding cattle.\n\nIn summary, the cowboy attire seen in the image is not just a fashion statement but is deeply connected to the history and traditions of the American West. 
The functional pieces of clothing, like the cowboy hat and chaps, are designed to protect and support the cowboy during horse riding and working with livestock."<br> }<br> ],<br> "images": [<br> {<br> "src": "https://datasets-server.huggingface.co/assets/BUAADreamer/llava-en-zh-2k/--/fba994c834822bddd3cd79e929c33135f4289d2b/--/en/train/1/images/image-1d100e9.jpg?Expires=1726591851&Signature=WyNDGZXVbzPOU9iOQSDPFt1MizgmdT-KqdVAG8nIVSK0Gg8OO-qmhKxgIVjyWMHnWyNbW5svuMoukPMyv9hiHMsNh0YmzdjMR9Gwb6mRvsisEAdaLl71Q053MYxEqkZWCB6PbXG5yEazHL4RHvDphsUEhZS-0Yk8Kzx0HHc12HNaJfiO4fO4IPkY3eLw5xLgNoKIcvvO9TDo0JEbc1ej6YkxGUdqXyVrG2Y4zYnhrCM0drgKVzq24cQ9YZ78HW5f-EsXsftbj0ZzEg4SKcuVgrqaKG8SJ~i0aV-OtkXiTCWxW16D4hfsmpXZShZAHesa1EOGprkYdtQG4Kfte12maQ__&Key-Pair-Id=K3EI6M078Z3AC3",<br> "height": 288,<br> "width": 384<br> }<br> ]<br> },<br> "truncated_cells": []<br> }<br>]</code> | <code>NEGATIVE: DATASET_NAME: "passing2961/photochat_plus"<br>FEATURES: {'photo_description': {'dtype': 'string', '_type': 'Value'}, 'trigger_sentences': {'feature': {'dtype': 'string', '_type': 'Value'}, '_type': 'Sequence'}, 'dialogue_id': {'dtype': 'int64', '_type': 'Value'}, 'photo_url': {'dtype': 'string', '_type': 'Value'}, 'dialogue': [{'message': {'dtype': 'string', '_type': 'Value'}, 'share_photo': {'dtype': 'bool', '_type': 'Value'}, 'user_id': {'dtype': 'int64', '_type': 'Value'}}], 'image_descriptions': {'feature': {'dtype': 'string', '_type': 'Value'}, '_type': 'Sequence'}, 'intents': {'feature': {'dtype': 'string', '_type': 'Value'}, '_type': 'Sequence'}, 'salient_information': {'feature': {'dtype': 'string', '_type': 'Value'}, '_type': 'Sequence'}, 'photo_id': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "photo_description": "The photo has your brother Kannon. Objects in the photo: Man",<br> "trigger_sentences": [<br> "How is Kannon doing?"<br> ],<br> "dialogue_id": 500,<br> "photo_url": "https://farm6.staticflickr.com/151/369716968_bde7e83418_o.jpg",<br> "dialogue": [<br> {<br> "message": "Hello, how have you been, dear friend?",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "Great!",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Thanks for asking",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "And how have you been?",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "It seems like we haven't talked in forever",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "I have been doing well, keeping busy, spent a lot of time outdoors. What have you been up to?",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "Last night my brother Kannon did a poetry reading",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Really? How did it go? You know how much I love poetry.",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "It went really well",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Do you remember my brother Kannon?",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Absolutely! 
How could I forget, he left quite an impression",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "How is Kannon doing?",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "",<br> "share_photo": true,<br> "user_id": 0<br> },<br> {<br> "message": "Great",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Here is a photo from last night",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Wow, he seems so confident in that pic! Wish that I could have been there.",<br> "share_photo": false,<br> "user_id": 1<br> }<br> ],<br> "image_descriptions": [<br> "A photo of Kannon",<br> "A picture of Kannon.",<br> "a photo of recent situation"<br> ],<br> "intents": [<br> "Information Dissemination",<br> "Social Bonding"<br> ],<br> "salient_information": [<br> "poetry",<br> "How is Kannon doing?",<br> "Kannon doing"<br> ],<br> "photo_id": "train/19e8f436d4b2fc25"<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "photo_description": "The photo has your uncle Kieran. Objects in the photo: Clothing, Man",<br> "trigger_sentences": [<br> "guess what new animal he got?",<br> "He's always had goats and chickens, but guess what new animal he got?"<br> ],<br> "dialogue_id": 501,<br> "photo_url": "https://farm8.staticflickr.com/53/189664134_f70fc8947a_o.jpg",<br> "dialogue": [<br> {<br> "message": "Hey! You remember my uncle who owns the hobby farm, right?",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Yeah i do",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "Uncle Keiran?",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "How about him?",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "He's always had goats and chickens, but guess what new animal he got?",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "Dog?",<br> "share_photo": false,<br> "user_id": 1<br> },<br> {<br> "message": "Nope, a wild hog!",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "And not the motorcycle kind ;)",<br> "share_photo": false,<br> "user_id": 0<br> },<br> {<br> "message": "",<br> "share_photo": true,<br> "user_id": 0<br> },<br> {<br> "message": "Wow",<br> "share_photo": false,<br> "user_id": 1<br> }<br> ],<br> "image_descriptions": [<br> "A photo of the hog's appearance.",<br> "a photo of wild hog",<br> "An image of the new wild hog"<br> ],<br> "intents": [<br> "Social Bonding",<br> "Visual Clarification"<br> ],<br> "salient_information": [<br> "hog",<br> "not the motorcycle kind",<br> "wild hog",<br> "a wild hog"<br> ],<br> "photo_id": "train/07d688f5e2142b87"<br> },<br> "truncated_cells": []<br> }<br>]</code> | | <code>USER_QUERY: question answering dataset for technical documentation</code> | <code>HUB_DATASET_PREVIEW: DATASET_NAME: "hythyt/preguntas-respuestas-RAG"<br>FEATURES: {'DATOS': {'dtype': 'string', '_type': 'Value'}, 'PREGUNTA': {'dtype': 'string', '_type': 'Value'}, 'RESPUESTA': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "DATOS": "ARISbyhyt Manual de Usuario\nSecci\u00f3n 8\nARISbyhyt es un programa dise\u00f1ado para gestionar proyectos de forma eficiente. Permite crear tareas, asignar recursos y realizar un seguimiento detallado del progreso. Para agregar una nueva tarea, accede al men\u00fa \"Proyectos\" y selecciona \"Nueva Tarea\". 
Completa la informaci\u00f3n requerida, como nombre, fecha de inicio y asignaci\u00f3n de recursos.\n\nSecci\u00f3n 4\nEn ARISbyhyt, puedes personalizar tus informes para obtener datos espec\u00edficos sobre el progreso del proyecto. Accede al men\u00fa \"Informes\", elige la opci\u00f3n \"Personalizar Informe\" y selecciona los campos que deseas incluir, como el estado de las tareas o el tiempo dedicado a cada fase.",<br> "PREGUNTA": "\u00bfC\u00f3mo puedo crear una nueva tarea en ARISbyhyt?",<br> "RESPUESTA": "Para agregar una nueva tarea en ARISbyhyt, sigue estos pasos:\n1. Ve al men\u00fa \"Proyectos\".\n2. Selecciona \"Nueva Tarea\".\n3. Completa la informaci\u00f3n solicitada, como nombre, fecha de inicio y asignaci\u00f3n de recursos."<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "DATOS": "ARISbyhyt Manual de Usuario \nSecci\u00f3n 2\nEn ARISbyhyt, puedes personalizar tus informes para obtener datos espec\u00edficos sobre el progreso del proyecto. Accede al men\u00fa \"Informes\", elige la opci\u00f3n \"Personalizar Informe\" y selecciona los campos que deseas incluir, como el estado de las tareas o el tiempo dedicado a cada fase.",<br> "PREGUNTA": "\u00bfC\u00f3mo puedo personalizar un informe en ARISbyhyt para obtener datos espec\u00edficos sobre el progreso del proyecto?",<br> "RESPUESTA": "Para personalizar un informe en ARISbyhyt, sigue estos pasos:\n1. Dir\u00edgete al men\u00fa \"Informes\".\n2. Selecciona \"Personalizar Informe\".\n3. Elige los campos que deseas incluir, como el estado de las tareas o el tiempo dedicado a cada fase."<br> },<br> "truncated_cells": []<br> }<br>]</code> | <code>NEGATIVE: DATASET_NAME: "cmalaviya/expertqa"<br>FEATURES: {'example_id': {'dtype': 'int64', '_type': 'Value'}, 'context': {'dtype': 'string', '_type': 'Value'}, 'question': {'dtype': 'string', '_type': 'Value'}, 'answer': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "example_id": 0,<br> "context": "",<br> "question": "Some customers of mine are not paying their debts on time. Do I have to keep all my customers?",<br> "answer": "You don't necessarily have to keep all your customers, especially if they consistently fail to pay their debts on time. There are different types of non-paying customers, such as cash-strapped, purposefully late, and non-payer by nature . It is essential to maintain a positive attitude and treat your customers with respect while trying to collect their debts . However, if you consistently face issues with particular customers not paying their debts, you may opt to discontinue providing services or products to them and focus on other reliable customers. You may need to consult a professional debt collector or a business attorney in such cases to decide the appropriate next steps in debt collections . 
To prevent nonpayment issues in the future, you can implement various strategies, such as researching new prospects, being clear with your payment policies, and setting up contracts detailing payment expectations and late fees ."<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "example_id": 1,<br> "context": "",<br> "question": "When accounts are faced with ethical dilemmas that often bring their integrity into question, the question is whether they are equipped enough tp deal with those?",<br> "answer": "<Answer> The context provided does not give specific information on whether accountants are adequately equipped to handle ethical dilemmas that could question their integrity. The text does suggest, however, that when faced with an ethical dilemma, one must question the situation honestly and transparently. And, if doubts persist, they have the obligation to raise these questions with those in authority . This suggests the need for a strong understanding of ethics to navigate such situations. The text also implies a correlation between integrity and ethics stating, \"Integrity can be measured by ethics\" . In a broader perspective, the text suggests that professionals, like nurses for example, often face dilemmas uncommon to the general populace . Due to the rapid advancement in medical technology, the study of ethics has become increasingly relevant, indicating that equipping professionals with adequate knowledge in ethics is necessary to navigate the demands and challenges of their roles effectively . Furthermore, it shows that managers grapple with ethical decisions involving questions of morality and integrity especially in situations where prior decisions by other managers create ethical dilemmas . While this analysis provides general insights on the significance of ethical decision-making and the need for professionals to navigate ethical dilemmas effectively, it does not provide a specific commentary on the readiness or the adequacy of training or framework available to accountants to deal with such scenarios. Hence, it is not possible to definitively answer the question based on the context provided. <Evidences> In South Africa SAICA has equipped accountants with code of professional conduct that they should follow when faced with ethical dilemmas. the code gives them guidance on how to deal with those. SAICA code of professional conduct https://www.misti.com/internal-audit-insights/ethics-and-the-internal-auditor Ethics and the Internal Auditor freely express these thoughts and ideas, the culture may be sending the wrong message. When you are personally faced with an ethical dilemma, you must ask yourself whether you are looking at the situation as honestly and transparently as you can. If questions still arise, it is your obligation to raise those questions to individuals in positions of responsibility. Integrity can be measured by ethics If someone had you name the top three people in history that you felt displayed unquestionable integrity, would those same individuals measure high on the ethics scale? Most likely they would. Integrity is adherence to https://misti.com/internal-audit-insights/ethics-and-the-internal-auditor Ethics and the Internal Auditor thoughts and ideas, the culture may be sending the wrong message. When you are personally faced with an ethical dilemma, you must ask yourself whether you are looking at the situation as honestly and transparently as you can. 
If questions still arise, it is your obligation to raise those questions to individuals in positions of responsibility. Integrity can be measured by ethics If someone had you name the top three people in history that you felt displayed unquestionable integrity, would those same individuals measure high on the ethics scale? Most likely they would. Integrity is adherence to a moral code, https://www.misti.co.uk/internal-audit-insights/ethics-and-the-internal-auditor Ethics and the Internal Auditor wrong message. When you are personally faced with an ethical dilemma, you must ask yourself whether you are looking at the situation as honestly and transparently as you can. If questions still arise, it is your obligation to raise those questions to individuals in positions of responsibility. Integrity can be measured by ethics If someone had you name the top 3 people in history that you felt displayed unquestionable integrity, would those same individuals measure high on the ethics scale? Most likely they would. Integrity is adherence to a moral code, reflected in honesty and harmony in what one thinks, SAICA equip accountants with all the relevant information in order to be able to identify ethical dilemmas https://www.misti.com/internal-audit-insights/ethics-and-the-internal-auditor Ethics and the Internal Auditor freely express these thoughts and ideas, the culture may be sending the wrong message. When you are personally faced with an ethical dilemma, you must ask yourself whether you are looking at the situation as honestly and transparently as you can. If questions still arise, it is your obligation to raise those questions to individuals in positions of responsibility. Integrity can be measured by ethics If someone had you name the top three people in history that you felt displayed unquestionable integrity, would those same individuals measure high on the ethics scale? Most likely they would. Integrity is adherence to https://misti.com/internal-audit-insights/ethics-and-the-internal-auditor Ethics and the Internal Auditor thoughts and ideas, the culture may be sending the wrong message. When you are personally faced with an ethical dilemma, you must ask yourself whether you are looking at the situation as honestly and transparently as you can. If questions still arise, it is your obligation to raise those questions to individuals in positions of responsibility. Integrity can be measured by ethics If someone had you name the top three people in history that you felt displayed unquestionable integrity, would those same individuals measure high on the ethics scale? Most likely they would. Integrity is adherence to a moral code, https://www.misti.co.uk/internal-audit-insights/ethics-and-the-internal-auditor Ethics and the Internal Auditor wrong message. When you are personally faced with an ethical dilemma, you must ask yourself whether you are looking at the situation as honestly and transparently as you can. If questions still arise, it is your obligation to raise those questions to individuals in positions of responsibility. Integrity can be measured by ethics If someone had you name the top 3 people in history that you felt displayed unquestionable integrity, would those same individuals measure high on the ethics scale? Most likely they would. 
Integrity is adherence to a moral code, reflected in honesty and harmony in what one thinks, https://www.bartleby.com/essay/The-Ethical-Dilemma-Of-A-Family-Nurse-F3H66JS4CPLLX The Ethical Dilemma Of A Family Nurse Practitioner | Bartleby or external factors. Due to the increased complexity of the health system, nowadays nurses are faced with ethical and legal decisions and often come across dilemmas regarding patient care. From this perspective a good question to be raised would be whether or not nurses have the necessary background, knowledge and skills to make appropriate Ethics : Ethics And Ethics professionals who often face dilemmas that are not experienced by the general population. The fast-paced growth of medical technology has made the study of ethics even more relevant. The study of bioethics, or biomedical ethics, refers to moral dilemmas due to https://www.bartleby.com/essay/The-Ethical-Dilemma-Of-A-Family-Nurse-F3H66JS4CPLLX The Ethical Dilemma Of A Family Nurse Practitioner | Bartleby or external factors. Due to the increased complexity of the health system, nowadays nurses are faced with ethical and legal decisions and often come across dilemmas regarding patient care. From this perspective a good question to be raised would be whether or not nurses have the necessary background, knowledge and skills to make appropriate Ethics : Ethics And Ethics professionals who often face dilemmas that are not experienced by the general population. The fast-paced growth of medical technology has made the study of ethics even more relevant. The study of bioethics, or biomedical ethics, refers to moral dilemmas due to http://projectmanager.com.au/ethics-governance-individual-project-manager/ \u00bb Ethics, governance and the individual project manager Project Manager if the payments were not made and as a consequence the mine was not adequately protected leading to equipment damaged and mine workers being attacked and injured, who would be to blame? The managers were faced with a classic dilemma. At the personal level, making the right ethical decisions can be difficult and involve questions of morality and integrity. Your decision becomes more difficult when it involves dealing with a situation created by earlier decisions made by other managers working for the organisation. The managers who decided to make payments to Indonesian officials that may contravene the Bribery of Foreign SAICA equip accountants with all the relevant information in order to be able to identify ethical dilemmas SAICA equip accountants with all the relevant information in order to be able to identify ethical dilemmas"<br> },<br> "truncated_cells": []<br> }<br>]</code> | | <code>USER_QUERY: conversational ai training dataset</code> | <code>HUB_DATASET_PREVIEW: DATASET_NAME: "dataautogpt3/flan1m-alpaca-uncensored"<br>FEATURES: {'instruction': {'dtype': 'string', '_type': 'Value'}, 'input': {'dtype': 'string', '_type': 'Value'}, 'output': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "instruction": "You are a helpful assistant, who always provide explanation. Think like you are answering to a five year old.",<br> "input": "Continue the following story.\n\nEmily held and rocked Kayla as they sobbed because \nSelect from the following.\n +Emily really needed a good friend.\n +Kayla really needed a good friend.",<br> "output": "Once upon a time, Emily held and rocked Kayla as they both sobbed because Kayla really needed a good friend. 
You see, little Kayla was feeling very lonely and sad because she didn't have anyone to play with or talk to.\n\nEmily wanted to help Kayla and be her friend, so she did something truly special. Emily decided to throw a magical friendship party for Kayla. Emily knew that this would cheer her up and help her make some new friends!\n\nEmily worked really hard, preparing for the big day. She bought balloons, baked cookies, and even created a treasure hunt for everyone to enjoy. She invited lots of other kids from their neighborhood, so Kayla could meet them all and find a friend to play with.\n\nOn the day of the party, the sun was shining, and the sky was filled with fluffy clouds. All the kids in the neighborhood were excited to come to the party and play with Kayla.\n\nEmily and Kayla welcomed their guests with huge smiles and warm hugs. The kids played games, laughed, and explored the treasure hunt that Emily had set up all around the backyard.\n\nSoon, Kayla was surrounded by new friends, chatting and laughing together. She began to feel so happy and loved. As the sun began to set and the party ended, Kayla felt like she finally had found the friends she needed.\n\nThat night, as Emily tucked Kayla into bed, Kayla thanked her for being such an amazing friend and throwing the best party ever. Emily smiled and said, \"That's what friends are for, to help each other when we're feeling sad and lonely.\"\n\nAnd from that day on, Emily and Kayla were the best of friends and shared many more magical adventures together. Now, Kayla knew that she always had a good friend by her side."<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "instruction": "You are an AI assistant that helps people find information. User will you give you a question. Your task is to answer as faithfully as you can. While answering think step-bystep and justify your answer.",<br> "input": "For this chain-of-thought reasoning and answer, what was the question?\nA boy playing ball is not necessarily not very good at playing ball. There is no way to know what level of skill the boy has.\n A: it is not possible to tell",<br> "output": "The question likely was:\n\n\"Does a boy playing ball automatically mean he is not very good at playing ball?\""<br> },<br> "truncated_cells": []<br> }<br>]</code> | <code>NEGATIVE: DATASET_NAME: "sgp-bench/open-instruct-gpt4o_40k"<br>FEATURES: {'instruction': {'dtype': 'string', '_type': 'Value'}, 'input': {'dtype': 'string', '_type': 'Value'}, 'output': {'dtype': 'string', '_type': 'Value'}}<br>DATA SAMPLE:<br>[<br> {<br> "row_idx": 0,<br> "row": {<br> "instruction": "Can you tell me how cookies work for tracking user data?",<br> "input": "",<br> "output": "Sure! Cookies are small pieces of data that websites can store on your computer, allowing them to track your activities and improve the user experience. For example, cookies can remember your preferences, keep track of your shopping history, or save information you enter into forms. They\u2019re typically stored in a file on your hard drive, and you can configure your browser to block or delete cookies, or\n"<br> },<br> "truncated_cells": []<br> },<br> {<br> "row_idx": 1,<br> "row": {<br> "instruction": "Given the sentence \"White fluffy dog hangs out of the window of a turquoise building.\" is it true that \"The dog is indoors.\"? Let's be accurate as possible and think first.",<br> "input": "",<br> "output": "A dog being in a turquoise building is the same as a dog being indoors. 
Final answer: yes."<br> },<br> "truncated_cells": []<br> }<br>]</code> | * Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 4 - `per_device_eval_batch_size`: 4 - `num_train_epochs`: 200 - `warmup_ratio`: 0.1 - `bf16`: True - `load_best_model_at_end`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 4 - `per_device_eval_batch_size`: 4 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 200 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - 
`torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `eval_use_gather_object`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs

| Epoch | Step | Training Loss | loss | max_accuracy |
|:----------:|:--------:|:-------------:|:----------:|:------------:|
| 0 | 0 | - | - | 0.5 |
| 0.3497 | 100 | 1.0509 | 0.7070 | - |
| 0.6993 | 200 | 0.6183 | 0.3396 | - |
| 1.0490 | 300 | 0.3746 | 0.2282 | - |
| 1.3986 | 400 | 0.2481 | 0.1616 | - |
| 1.7483 | 500 | 0.2198 | 0.1302 | - |
| 2.0979 | 600 | 0.166 | 0.1164 | - |
| 2.4476 | 700 | 0.1045 | 0.1174 | - |
| 2.7972 | 800 | 0.0797 | 0.1095 | - |
| 3.1469 | 900 | 0.0422 | 0.1176 | - |
| 3.4965 | 1000 | 0.0595 | 0.1115 | - |
| 3.8462 | 1100 | 0.0416 | 0.1008 | - |
| 4.1958 | 1200 | 0.0174 | 0.1233 | - |
| 4.5455 | 1300 | 0.0273 | 0.1032 | - |
| 4.8951 | 1400 | 0.0389 | 0.0990 | - |
| **5.2448** | **1500** | **0.0126** | **0.0963** | **-** |
| 5.5944 | 1600 | 0.0074 | 0.1193 | - |
| 5.9441 | 1700 | 0.0165 | 0.1379 | - |
| 6.2937 | 1800 | 0.0046 | 0.1127 | - |
| 6.6434 | 1900 | 0.0158 | 0.1289 | - |
| 6.9930 | 2000 | 0.0157 | 0.1009 | - |
| 7.3427 | 2100 | 0.0032 | 0.1075 | - |
| 7.6923 | 2200 | 0.0072 | 0.1289 | - |
| 8.0420 | 2300 | 0.0192 | 0.1176 | - |
| 8.3916 | 2400 | 0.001 | 0.1214 | - |
| 8.7413 | 2500 | 0.024 | 0.1320 | 1.0 |

* The bold row denotes the saved checkpoint.

### Framework Versions

- Python: 3.10.12
- Sentence Transformers: 3.1.0
- Transformers: 4.44.2
- PyTorch: 2.4.0+cu121
- Accelerate: 0.34.2
- Datasets: 3.0.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### CachedMultipleNegativesRankingLoss

```bibtex
@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

<!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* -->
<!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* -->
<!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
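The training examples above pair a short retrieval query (the anchor) with a serialized dataset-card preview as the positive and a hard negative. As a rough illustration only, not this project's actual training script, a minimal Sentence Transformers setup using the `CachedMultipleNegativesRankingLoss` parameters reported above might look like the sketch below; the base checkpoint and the example strings are placeholders.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss

# Placeholder base checkpoint; substitute the actual base model for this card.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# (anchor, positive, negative) triplets shaped like the samples above:
# a search query, the serialized preview of a relevant dataset, and a hard negative.
train_dataset = Dataset.from_dict({
    "anchor": ["USER_QUERY: multimodal conversation dataset"],
    "positive": ["DATASET_NAME: BUAADreamer/llava-en-zh-2k ..."],
    "negative": ["DATASET_NAME: passing2961/photochat_plus ..."],
})

# scale=20.0 and cosine similarity match the loss parameters reported above; the
# "cached" variant embeds in mini-batches with gradient caching, so in-batch
# negatives can be used without holding the whole batch in memory at once.
loss = CachedMultipleNegativesRankingLoss(model, scale=20.0, mini_batch_size=4)

args = SentenceTransformerTrainingArguments(
    output_dir="dataset-card-retriever-sketch",
    per_device_train_batch_size=4,  # mirrors the per-device batch size reported above
    num_train_epochs=1,
    bf16=True,
)

trainer = SentenceTransformerTrainer(
    model=model, args=args, train_dataset=train_dataset, loss=loss
)
trainer.train()
```

With this loss, every other positive in the batch serves as an additional negative for a given anchor, which is why larger effective batch sizes tend to help retrieval quality.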
[ "NAMED_ENTITY_RECOGNITION", "TEXT_CLASSIFICATION", "QUESTION_ANSWERING", "TRANSLATION" ]
[ "BEAR" ]
Non_BioNLP
EleutherAI/pythia-6.9b-deduped-v0
EleutherAI
text-generation
[ "transformers", "pytorch", "gpt_neox", "text-generation", "causal-lm", "pythia", "pythia_v0", "en", "dataset:EleutherAI/the_pile_deduplicated", "arxiv:2101.00027", "arxiv:2201.07311", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,666
1,688
558
20
---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
---

The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research. It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. All Pythia models are available [on Hugging Face](https://huggingface.co/models?other=pythia).

The Pythia model suite was deliberately designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites.

Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts.

## Pythia-6.9B-deduped

### Model Details

- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).

<figure>

| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |

<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. “Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption>
</figure>

### Uses and Limitations

#### Intended Use

The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments.
To enable the study of how language models change in the course of training, we provide 143 evenly spaced intermediate checkpoints per model. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model.

You may also further fine-tune and adapt Pythia-6.9B-deduped for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-6.9B-deduped as a basis for your fine-tuned model, please conduct your own risk and bias assessment.

#### Out-of-scope use

The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. Pythia models are English-language only, and are not suitable for translation or generating text in other languages.

Pythia-6.9B-deduped has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose or commercial chatbots. This means Pythia-6.9B-deduped will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “understand” human instructions.

#### Limitations and biases

The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text. Never rely on Pythia-6.9B-deduped to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-6.9B-deduped may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting it to other people. Please inform your audience that the text was generated by Pythia-6.9B-deduped.

### Quickstart

Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# Load the model weights from a specific training checkpoint (branch "step3000").
model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

# The tokenizer is the same across checkpoints; pinning the same revision keeps
# the cache directory consistent with the model weights above.
tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```

Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia).

### Training

#### Training data

Pythia-6.9B-deduped was trained on the Pile **after the dataset has been globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English.
It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).

#### Training procedure

All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.

All *Pythia* models trained for the equivalent of 143000 steps at a batch size of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch size of 4M tokens listed were originally trained for 71500 steps instead, with checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for consistency with all 2M batch models, so `step1000` is the first checkpoint for `pythia-1.4b` that was saved (corresponding to step 500 in training), and `step1000` is likewise the first `pythia-6.9b` checkpoint that was saved (corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).

### Evaluations

All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM.

<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>

<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>

<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>

<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>

<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>

### Naming convention and parameter count

*Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">

| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |

</figure>
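One way to read this table: the gap between total and non-embedding parameters should correspond to two vocab-size x model-dim matrices, the input embedding and the output unembedding (which these models do not tie). The short check below derives the implied padded vocabulary size at each scale from the figures above; those derived vocabulary sizes are an inference from the table, not numbers stated in this card.

```python
rows = {  # suffix: (total params, non-embedding params, model dim from the table above)
    "70M":  (70_426_624,     18_915_328,     512),
    "160M": (162_322_944,    85_056_000,     768),
    "410M": (405_334_016,    302_311_424,    1024),
    "1B":   (1_011_781_632,  805_736_448,    2048),
    "1.4B": (1_414_647_808,  1_208_602_624,  2048),
    "2.8B": (2_775_208_960,  2_517_652_480,  2560),
    "6.9B": (6_857_302_016,  6_444_163_072,  4096),
    "12B":  (11_846_072_320, 11_327_027_200, 5120),
}

for name, (total, non_embedding, d_model) in rows.items():
    gap = total - non_embedding
    vocab, remainder = divmod(gap, 2 * d_model)
    assert remainder == 0, name  # the gap divides exactly into two embedding matrices
    print(f"{name}: implied padded vocab size = {vocab}")
```

Running this gives 50,304 for the models up to 2.8B, 50,432 for 6.9B, and 50,688 for 12B. That the implied vocabulary grows slightly at the largest scales is consistent with padding the embedding matrix for efficient sharding, though the card itself does not spell this out.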
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
LoneStriker/OpenBioLLM-Llama3-70B-6.0bpw-h6-exl2
LoneStriker
text-generation
[ "transformers", "pytorch", "llama", "text-generation", "llama-3", "Mixtral", "instruct", "finetune", "chatml", "DPO", "RLHF", "gpt4", "distillation", "conversational", "en", "arxiv:2305.18290", "arxiv:2303.13375", "arxiv:2212.13138", "arxiv:2305.09617", "arxiv:2402.07023", "base_model:meta-llama/Meta-Llama-3-70B-Instruct", "base_model:quantized:meta-llama/Meta-Llama-3-70B-Instruct", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "6-bit", "exl2", "region:us" ]
1,714
1,714
8
0
--- base_model: meta-llama/Meta-Llama-3-70B-Instruct language: - en license: llama3 tags: - llama-3 - llama - Mixtral - instruct - finetune - chatml - DPO - RLHF - gpt4 - distillation widget: - example_title: OpenBioLLM-70B messages: - role: system content: You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. - role: user content: How long does it take for newborn jaundice to go away? output: text: 'Newborn jaundice, also known as neonatal jaundice, is a common condition in newborns where the yellowing of the skin and eyes occurs due to an elevated level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when red blood cells break down. In most cases, newborn jaundice resolves on its own without any specific treatment. The duration of newborn jaundice can vary depending on several factors such as the underlying cause, gestational age at birth, and individual variations in bilirubin metabolism. Here are some general guidelines: 1. Physiological jaundice: This is the most common type of newborn jaundice and usually appears within 24-72 hours after birth. It tends to peak between the second and fifth day of life and gradually improves over the next week or two. By the time the baby is one week old, the jaundice should have mostly resolved. 2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and may appear later than physiological jaundice, typically between the fifth and fourteenth day of life. It tends to persist for a longer duration but usually resolves within six weeks after birth. 3. Pathological jaundice: This type of jaundice is less common and occurs due to an underlying medical condition that affects bilirubin metabolism or liver function. The duration of pathological jaundice depends on the specific cause and may require treatment. It''s important for parents to monitor their newborn''s jaundice closely and seek medical advice if the jaundice progresses rapidly, becomes severe, or is accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness. In these cases, further evaluation and management may be necessary. Remember that each baby is unique, and the timing of jaundice resolution can vary. If you have concerns about your newborn''s jaundice, it''s always best to consult with a healthcare professional for personalized advice and guidance.' 
model-index: - name: OpenBioLLM-70B results: [] --- <div align="center"> <img width="260px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/BrQCb95lmEIFz79QAmoNA.png"></div> ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/fJIOPJnY6Ff6fUiSIuMEt.png) <div align="center"> <h1>Advancing Open-source Large Language Models in Medical Domain</h1> </div> <p align="center" style="margin-top: 0px;"> <a href="https://colab.research.google.com/drive/1F5oV20InEYeAJGmBwYF9NM_QhLmjBkKJ?usp=sharing"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="OpenChat Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text" style=" margin-right: 5px;">Online Demo</span> </a> | <a href="https://github.com/openlifescience-ai"> <img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" alt="GitHub Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text" style=" margin-right: 5px;">GitHub</span> </a> | <a href="#"> <img src="https://github.com/alpayariyak/openchat/blob/master/assets/arxiv-logomark-small-square-border.png?raw=true" alt="ArXiv Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text" style="margin-right: 5px;">Paper</span> </a> | <a href="https://discord.gg/A5Fjf5zC69"> <img src="https://cloud.githubusercontent.com/assets/6291467/26705903/96c2d66e-477c-11e7-9f4e-f3c0efe96c9a.png" alt="Discord Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text">Discord</span> </a> </p> ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/KGmRE5w2sepNtwsEu8t7K.jpeg) Introducing OpenBioLLM-70B: A State-of-the-Art Open Source Biomedical Large Language Model OpenBioLLM-70B is an advanced open source language model designed specifically for the biomedical domain. Developed by Saama AI Labs, this model leverages cutting-edge techniques to achieve state-of-the-art performance on a wide range of biomedical tasks. 🏥 **Biomedical Specialization**: OpenBioLLM-70B is tailored for the unique language and knowledge requirements of the medical and life sciences fields. It was fine-tuned on a vast corpus of high-quality biomedical data, enabling it to understand and generate text with domain-specific accuracy and fluency. 🎓 **Superior Performance**: With 70 billion parameters, OpenBioLLM-70B outperforms other open source biomedical language models of similar scale. It has also demonstrated better results compared to larger proprietary & open-source models like GPT-4, Gemini, Meditron-70B, Med-PaLM-1 & Med-PaLM-2 on biomedical benchmarks. 🧠 **Advanced Training Techniques**: OpenBioLLM-70B builds upon the powerful foundations of the **Meta-Llama-3-70B-Instruct** and [Meta-Llama-3-70B-Instruct](meta-llama/Meta-Llama-3-70B-Instruct) models. It incorporates the DPO dataset and fine-tuning recipe along with a custom diverse medical instruction dataset. 
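To make the DPO step concrete before the pipeline summary that follows, here is a minimal sketch of such a preference-optimization stage with the TRL library. This is an illustration only, not the OpenBioLLM recipe: the preference pairs, hyperparameters, and output path are placeholders, the real run used QLoRA adapters (see the Peft hyperparameters below), and older TRL releases pass the tokenizer via `tokenizer=` rather than `processing_class=`.

```python
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

# Base model named in this card; in practice a 70B model would be loaded with
# quantization and LoRA adapters rather than in full precision.
model_name = "meta-llama/Meta-Llama-3-70B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# DPO trains on preference pairs: a prompt plus a chosen and a rejected answer.
# These rows are invented placeholders in the spirit of the card's examples.
train_dataset = Dataset.from_dict({
    "prompt": ["How long does newborn jaundice usually take to resolve?"],
    "chosen": ["Physiological jaundice typically appears within 24-72 hours of birth and mostly resolves within one to two weeks."],
    "rejected": ["Newborn jaundice is permanent and never resolves."],
})

args = DPOConfig(
    output_dir="openbiollm-dpo-sketch",  # hypothetical path
    beta=0.1,  # strength of the KL-style penalty toward the reference model
    per_device_train_batch_size=1,
    num_train_epochs=1,
)

trainer = DPOTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,
)
trainer.train()
```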
Key components of the training pipeline include:

<div align="center">
<img width="1200px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/oPchsJsEpQoGcGXVbh7YS.png">
</div>

- **Policy Optimization**: [Direct Preference Optimization: Your Language Model is Secretly a Reward Model (DPO)](https://arxiv.org/abs/2305.18290)
- **Fine-tuning dataset**: Custom Medical Instruct dataset (We plan to release a sample training dataset in our upcoming paper; please stay updated)

This combination of cutting-edge techniques enables OpenBioLLM-70B to align with key capabilities and preferences for biomedical applications.

⚙️ **Release Details**:

- **Model Size**: 70 billion parameters
- **Quantization**: Optimized quantized versions available [Here](https://huggingface.co/aaditya/OpenBioLLM-70B-GGUF)
- **Language(s) (NLP):** en
- **Developed By**: [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) from Saama AI Labs
- **License:** Meta-Llama License
- **Fine-tuned from models:** [Meta-Llama-3-70B-Instruct](meta-llama/Meta-Llama-3-70B-Instruct)
- **Resources for more information:**
  - Paper: Coming soon

The model can be fine-tuned for more specialized tasks and datasets as needed.

OpenBioLLM-70B represents an important step forward in democratizing advanced language AI for the biomedical community. By leveraging state-of-the-art architectures and training techniques from leading open source efforts like Llama-3, we have created a powerful tool to accelerate innovation and discovery in healthcare and the life sciences.

We are excited to share OpenBioLLM-70B with researchers and developers around the world.

### Use with transformers

**Important: Please use the exact chat template provided by Llama-3 instruct version. Otherwise there will be a degradation in the performance. The model output can be verbose in rare cases. Please consider setting temperature = 0 to make this happen less.**

See the snippet below for usage with Transformers:

```python
import transformers
import torch

model_id = "aaditya/OpenBioLLM-Llama3-70B"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",  # shard across available devices; pipeline's `device` arg does not accept "auto"
)

messages = [
    {"role": "system", "content": "You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. Your name is OpenBioLLM, and you were developed by Saama AI Labs. who's willing to help answer the user's query with explanation. In your explanation, leverage your deep medical expertise such as relevant anatomical structures, physiological processes, diagnostic criteria, treatment guidelines, or other pertinent medical concepts. Use precise medical terminology while still aiming to make the explanation clear and accessible to a general audience."},
    {"role": "user", "content": "How can i split a 3mg or 4mg waefin pill so i can get a 2.5mg pill?"},
]

prompt = pipeline.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    eos_token_id=terminators,
    do_sample=False,  # greedy decoding, i.e. temperature = 0 as recommended above
                      # (do_sample=True with temperature=0.0 raises an error in transformers)
)

print(outputs[0]["generated_text"][len(prompt):])
```

## **Training procedure**

### **Training hyperparameters**

<details>
<summary>Click to see details</summary>

- learning_rate: 0.0002
- lr_scheduler: cosine
- train_batch_size: 12
- eval_batch_size: 8
- GPU: H100 80GB SXM5
- num_devices: 8
- optimizer: adamw_bnb_8bit
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
</details>

### **Peft hyperparameters**

<details>
<summary>Click to see details</summary>

- adapter: qlora
- lora_r: 128
- lora_alpha: 256
- lora_dropout: 0.05
- lora_target_linear: true
- lora_target_modules:
  - q_proj
  - v_proj
  - k_proj
  - o_proj
  - gate_proj
  - down_proj
  - up_proj
</details>

### **Training results**

### **Framework versions**

- Transformers 4.39.3
- Pytorch 2.1.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1
- Axolotl
- Lm harness for evaluation

# Benchmark Results

🔥 OpenBioLLM-70B demonstrates superior performance compared to larger models, such as GPT-4, Gemini, Meditron-70B, Med-PaLM-1 & Med-PaLM-2 across 9 diverse biomedical datasets, achieving state-of-the-art results with an average score of 86.06%, despite having a significantly smaller parameter count. The model's strong performance in domain-specific tasks, such as Clinical KG, Medical Genetics, and PubMedQA, highlights its ability to effectively capture and apply biomedical knowledge.

🚨 The GPT-4, Med-PaLM-1, and Med-PaLM-2 results are taken from their official papers. Since Med-PaLM doesn't provide zero-shot accuracy, we are using 5-shot accuracy from their paper for comparison. All results presented are in the zero-shot setting, except for Med-PaLM-2 and Med-PaLM-1, which use 5-shot accuracy.
| | Clinical KG | Medical Genetics | Anatomy | Pro Medicine | College Biology | College Medicine | MedQA 4 opts | PubMedQA | MedMCQA | Avg |
|--------------------|-------------|------------------|---------|--------------|-----------------|------------------|--------------|----------|---------|-------|
| **OpenBioLLM-70B** | **92.93** | **93.197** | **83.904** | 93.75 | 93.827 | **85.749** | 78.162 | 78.97 | **74.014** | **86.05588** |
| Med-PaLM-2 (5-shot) | 88.3 | 90 | 77.8 | **95.2** | 94.4 | 80.9 | **79.7** | **79.2** | 71.3 | 84.08 |
| **GPT-4** | 86.04 | 91 | 80 | 93.01 | **95.14** | 76.88 | 78.87 | 75.2 | 69.52 | 82.85 |
| Med-PaLM-1 (Flan-PaLM, 5-shot) | 80.4 | 75 | 63.7 | 83.8 | 88.9 | 76.3 | 67.6 | 79 | 57.6 | 74.7 |
| **OpenBioLLM-8B** | 76.101 | 86.1 | 69.829 | 78.21 | 84.213 | 68.042 | 58.993 | 74.12 | 56.913 | 72.502 |
| Gemini-1.0 | 76.7 | 75.8 | 66.7 | 77.7 | 88 | 69.2 | 58 | 70.7 | 54.3 | 70.79 |
| GPT-3.5 Turbo 1106 | 74.71 | 74 | 72.79 | 72.79 | 72.91 | 64.73 | 57.71 | 72.66 | 53.79 | 66 |
| Meditron-70B | 66.79 | 69 | 53.33 | 71.69 | 76.38 | 63 | 57.1 | 76.6 | 46.85 | 64.52 |
| gemma-7b | 69.81 | 70 | 59.26 | 66.18 | 79.86 | 60.12 | 47.21 | 76.2 | 48.96 | 64.18 |
| Mistral-7B-v0.1 | 68.68 | 71 | 55.56 | 68.38 | 68.06 | 59.54 | 50.82 | 75.4 | 48.2 | 62.85 |
| Apollo-7B | 62.26 | 72 | 61.48 | 69.12 | 70.83 | 55.49 | 55.22 | 39.8 | 53.77 | 60 |
| MedAlpaca-7b | 57.36 | 69 | 57.04 | 67.28 | 65.28 | 54.34 | 41.71 | 72.8 | 37.51 | 58.03 |
| BioMistral-7B | 59.9 | 64 | 56.5 | 60.4 | 59 | 54.7 | 50.6 | 77.5 | 48.1 | 57.3 |
| AlpaCare-llama2-7b | 49.81 | 49 | 45.92 | 33.82 | 50 | 43.35 | 29.77 | 72.2 | 34.42 | 45.36 |
| ClinicalGPT | 30.56 | 27 | 30.37 | 19.48 | 25 | 24.27 | 26.08 | 63.8 | 28.18 | 30.52 |

<div align="center">
<img width="1600px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_SzdcJSBjZyo8RS1bTEkP.png">
</div>

## Detailed Medical Subjectwise accuracy

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/UXF-V0col0Z0sS6BGPBkE.png)

# Use Cases & Examples

🚨 **Below results are from the quantized version of OpenBioLLM-70B.**

# Summarize Clinical Notes

OpenBioLLM-70B can efficiently analyze and summarize complex clinical notes, EHR data, and discharge summaries, extracting key information and generating concise, structured summaries.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/xdwdBgOxNi_TfML0hKlI8.png)

# Answer Medical Questions

OpenBioLLM-70B can provide answers to a wide range of medical questions.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/zO95GlwOQEZqCKQF69mE6.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/OKBczKw7gWeW5xsuDpc27.png)

<details>
<summary>Click to see details</summary>

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/eJGHT5khppYvJb8fQ-YW4.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Cnbwrqa_-ORHRuNRC2P6Y.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/J9DhdcvukAc9mnnW9fj2C.png)
</details>

# Clinical Entity Recognition

OpenBioLLM-70B can perform advanced clinical entity recognition by identifying and extracting key medical concepts, such as diseases, symptoms, medications, procedures, and anatomical structures, from unstructured clinical text.
By leveraging its deep understanding of medical terminology and context, the model can accurately annotate and categorize clinical entities, enabling more efficient information retrieval, data analysis, and knowledge discovery from electronic health records, research articles, and other biomedical text sources. This capability can support various downstream applications, such as clinical decision support, pharmacovigilance, and medical research. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_69BW4k9LVABFwtxixL45.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/DKy5wYCoPhoPPUc1-x8_J.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/7WD9zCCBZT4-4XlfnIQjl.png) # Biomarkers Extraction ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/ZttoM4AiteT7gFYVhjIpN.png) # Classification OpenBioLLM-70B can perform various biomedical classification tasks, such as disease prediction, sentiment analysis, medical document categorization ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Bf5MW1d75qT-1F_TR_hC0.png) # De-Identification OpenBioLLM-70B can detect and remove personally identifiable information (PII) from medical records, ensuring patient privacy and compliance with data protection regulations like HIPAA. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/hKX4kzm--Tw5bj6K78msy.png) **Advisory Notice!**  While OpenBioLLM-70B leverages high-quality data sources, its outputs may still contain inaccuracies, biases, or misalignments that could pose risks if relied upon for medical decision-making without further testing and refinement. The model's performance has not yet been rigorously evaluated in randomized controlled trials or real-world healthcare environments. Therefore, we strongly advise against using OpenBioLLM-70B for any direct patient care, clinical decision support, or other professional medical purposes at this time. Its use should be limited to research, development, and exploratory applications by qualified individuals who understand its limitations. OpenBioLLM-70B is intended solely as a research tool to assist healthcare professionals and should never be considered a replacement for the professional judgment and expertise of a qualified medical doctor. Appropriately adapting and validating OpenBioLLM-70B for specific medical use cases would require significant additional work, potentially including: - Thorough testing and evaluation in relevant clinical scenarios - Alignment with evidence-based guidelines and best practices - Mitigation of potential biases and failure modes - Integration with human oversight and interpretation - Compliance with regulatory and ethical standards Always consult a qualified healthcare provider for personal medical needs. # Citation If you find OpenBioLLM-70B & 8B useful in your work, please cite the model as follows: ``` @misc{OpenBioLLMs, author = {Ankit Pal, Malaikannan Sankarasubbu}, title = {OpenBioLLMs: Advancing Open-Source Large Language Models for Healthcare and Life Sciences}, year = {2024}, publisher = {Hugging Face}, journal = {Hugging Face repository}, howpublished = {\url{https://huggingface.co/aaditya/OpenBioLLM-Llama3-70B}} } ``` The accompanying paper is currently in progress and will be released soon. 
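As a quick arithmetic sanity check on the headline number, the 86.06% average quoted for OpenBioLLM-70B in the benchmark table earlier in this card is simply the unweighted mean of its nine per-dataset scores:

```python
# Scores copied from the OpenBioLLM-70B row of the benchmark table above.
scores = {
    "Clinical KG": 92.93, "Medical Genetics": 93.197, "Anatomy": 83.904,
    "Pro Medicine": 93.75, "College Biology": 93.827, "College Medicine": 85.749,
    "MedQA 4 opts": 78.162, "PubMedQA": 78.97, "MedMCQA": 74.014,
}
average = sum(scores.values()) / len(scores)
print(f"{average:.4f}")  # 86.0559, reported in the table as 86.05588 and rounded to 86.06%
```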
<div align="center"> <h2> 💌 Contact </h2> </div> We look forward to hearing you and collaborating on this exciting project! **Contributors:** - [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) [aadityaura at gmail dot com] - Saama AI Labs - Note: I am looking for a funded PhD opportunity, especially if it fits my Responsible Generative AI, Multimodal LLMs, Geometric Deep Learning, and Healthcare AI skillset. # References We thank the [Meta Team](meta-llama/Meta-Llama-3-70B-Instruct) for their amazing models! Result sources - [1] GPT-4 [Capabilities of GPT-4 on Medical Challenge Problems] (https://arxiv.org/abs/2303.13375) - [2] Med-PaLM-1 [Large Language Models Encode Clinical Knowledge](https://arxiv.org/abs/2212.13138) - [3] Med-PaLM-2 [Towards Expert-Level Medical Question Answering with Large Language Models](https://arxiv.org/abs/2305.09617) - [4] Gemini-1.0 [Gemini Goes to Med School](https://arxiv.org/abs/2402.07023)
[ "QUESTION_ANSWERING" ]
[ "MEDQA", "PUBMEDQA" ]
BioNLP
HIT-TMG/KaLM-embedding-multilingual-mini-instruct-v1
HIT-TMG
sentence-similarity
[ "sentence-transformers", "safetensors", "qwen2", "feature-extraction", "sentence-similarity", "mteb", "arxiv:2501.01028", "license:mit", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,729
1,741
36,178
32
--- license: mit pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb model-index: - name: KaLM-Embedding results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en-ext) type: mteb/amazon_counterfactual config: en-ext split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 94.35532233883059 - type: ap value: 60.40219300665376 - type: ap_weighted value: 60.40219300665376 - type: f1 value: 86.52001470357649 - type: f1_weighted value: 94.65531755022661 - type: main_score value: 94.35532233883059 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 91.71641791044776 - type: ap value: 68.4050364584575 - type: ap_weighted value: 68.4050364584575 - type: f1 value: 87.91854774634491 - type: f1_weighted value: 92.0430596057422 - type: main_score value: 91.71641791044776 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 96.49945000000001 - type: ap value: 94.97348227456295 - type: ap_weighted value: 94.97348227456295 - type: f1 value: 96.49855824500423 - type: f1_weighted value: 96.49855824500422 - type: main_score value: 96.49945000000001 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 61.242 - type: f1 value: 59.353696237560094 - type: f1_weighted value: 59.35369623756011 - type: main_score value: 61.242 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: main_score value: 56.569 - type: map_at_1 value: 31.080999999999996 - type: map_at_10 value: 47.432 - type: map_at_100 value: 48.247 - type: map_at_1000 value: 48.251 - type: map_at_20 value: 48.114000000000004 - type: map_at_3 value: 42.425000000000004 - type: map_at_5 value: 45.128 - type: mrr_at_1 value: 31.57894736842105 - type: mrr_at_10 value: 47.6253132832081 - type: mrr_at_100 value: 48.440395388879296 - type: mrr_at_1000 value: 48.44416076630039 - type: mrr_at_20 value: 48.30706364782469 - type: mrr_at_3 value: 42.59127548601235 - type: mrr_at_5 value: 45.347321005215804 - type: nauc_map_at_1000_diff1 value: 7.110790588301176 - type: nauc_map_at_1000_max value: -12.892696039828866 - type: nauc_map_at_1000_std value: -15.5709273320573 - type: nauc_map_at_100_diff1 value: 7.117551663882657 - type: nauc_map_at_100_max value: -12.882680977142957 - type: nauc_map_at_100_std value: -15.56350483617667 - type: nauc_map_at_10_diff1 value: 6.903272993199564 - type: nauc_map_at_10_max value: -13.012877497725961 - type: nauc_map_at_10_std value: -15.947400478856006 - type: nauc_map_at_1_diff1 value: 10.03503740028087 - type: nauc_map_at_1_max value: -13.351553937797 - type: nauc_map_at_1_std value: -14.137614923859612 - type: nauc_map_at_20_diff1 value: 7.01754882034529 - type: nauc_map_at_20_max value: -12.864438636302197 - type: nauc_map_at_20_std value: -15.541510619190976 - type: nauc_map_at_3_diff1 value: 7.018587254951812 - type: nauc_map_at_3_max value: -13.38420244471981 - type: 
nauc_map_at_3_std value: -16.127099270987785 - type: nauc_map_at_5_diff1 value: 6.920961668066123 - type: nauc_map_at_5_max value: -13.169892625713931 - type: nauc_map_at_5_std value: -16.21272880801226 - type: nauc_mrr_at_1000_diff1 value: 5.5525831294754004 - type: nauc_mrr_at_1000_max value: -12.98089269414052 - type: nauc_mrr_at_1000_std value: -15.396489593627944 - type: nauc_mrr_at_100_diff1 value: 5.559525360367539 - type: nauc_mrr_at_100_max value: -12.970885236428334 - type: nauc_mrr_at_100_std value: -15.389102542398783 - type: nauc_mrr_at_10_diff1 value: 5.38828048977972 - type: nauc_mrr_at_10_max value: -13.096637253890634 - type: nauc_mrr_at_10_std value: -15.775810422484374 - type: nauc_mrr_at_1_diff1 value: 8.58091801149426 - type: nauc_mrr_at_1_max value: -12.352949021555306 - type: nauc_mrr_at_1_std value: -13.545487974417847 - type: nauc_mrr_at_20_diff1 value: 5.4666282281067735 - type: nauc_mrr_at_20_max value: -12.952039027828944 - type: nauc_mrr_at_20_std value: -15.367907454271231 - type: nauc_mrr_at_3_diff1 value: 5.1862331302405735 - type: nauc_mrr_at_3_max value: -13.816401285559108 - type: nauc_mrr_at_3_std value: -15.872101319770382 - type: nauc_mrr_at_5_diff1 value: 5.471097057115419 - type: nauc_mrr_at_5_max value: -13.269134531334442 - type: nauc_mrr_at_5_std value: -15.95735511276538 - type: nauc_ndcg_at_1000_diff1 value: 6.8032235432235275 - type: nauc_ndcg_at_1000_max value: -12.52617810408163 - type: nauc_ndcg_at_1000_std value: -15.38677998208727 - type: nauc_ndcg_at_100_diff1 value: 6.971743190062509 - type: nauc_ndcg_at_100_max value: -12.284060222136334 - type: nauc_ndcg_at_100_std value: -15.203583619739097 - type: nauc_ndcg_at_10_diff1 value: 5.9423315360857005 - type: nauc_ndcg_at_10_max value: -12.649746010742199 - type: nauc_ndcg_at_10_std value: -16.72153869758235 - type: nauc_ndcg_at_1_diff1 value: 10.03503740028087 - type: nauc_ndcg_at_1_max value: -13.351553937797 - type: nauc_ndcg_at_1_std value: -14.137614923859612 - type: nauc_ndcg_at_20_diff1 value: 6.379802915097805 - type: nauc_ndcg_at_20_max value: -12.01427315352701 - type: nauc_ndcg_at_20_std value: -15.108250307425825 - type: nauc_ndcg_at_3_diff1 value: 6.298556094258956 - type: nauc_ndcg_at_3_max value: -13.536187803253377 - type: nauc_ndcg_at_3_std value: -16.999347732797407 - type: nauc_ndcg_at_5_diff1 value: 6.099858591554027 - type: nauc_ndcg_at_5_max value: -13.097631098081774 - type: nauc_ndcg_at_5_std value: -17.215525664264348 - type: nauc_precision_at_1000_diff1 value: -21.130247827110427 - type: nauc_precision_at_1000_max value: 24.21748822806628 - type: nauc_precision_at_1000_std value: 83.6578697460551 - type: nauc_precision_at_100_diff1 value: 29.395727608507894 - type: nauc_precision_at_100_max value: 51.676651935775695 - type: nauc_precision_at_100_std value: 62.92260397258278 - type: nauc_precision_at_10_diff1 value: -0.25306953208178373 - type: nauc_precision_at_10_max value: -9.710491261292093 - type: nauc_precision_at_10_std value: -21.697648668302183 - type: nauc_precision_at_1_diff1 value: 10.03503740028087 - type: nauc_precision_at_1_max value: -13.351553937797 - type: nauc_precision_at_1_std value: -14.137614923859612 - type: nauc_precision_at_20_diff1 value: -2.084669856957687 - type: nauc_precision_at_20_max value: 6.736841084303921 - type: nauc_precision_at_20_std value: -0.330152716888139 - type: nauc_precision_at_3_diff1 value: 4.202256387521114 - type: nauc_precision_at_3_max value: -14.043068948669681 - type: nauc_precision_at_3_std value: 
-19.71625712734227 - type: nauc_precision_at_5_diff1 value: 3.2694130100522667 - type: nauc_precision_at_5_max value: -12.7772753118202 - type: nauc_precision_at_5_std value: -20.917228577779888 - type: nauc_recall_at_1000_diff1 value: -21.13024782711332 - type: nauc_recall_at_1000_max value: 24.21748822806101 - type: nauc_recall_at_1000_std value: 83.6578697460535 - type: nauc_recall_at_100_diff1 value: 29.395727608504448 - type: nauc_recall_at_100_max value: 51.67665193577227 - type: nauc_recall_at_100_std value: 62.92260397258032 - type: nauc_recall_at_10_diff1 value: -0.2530695320818313 - type: nauc_recall_at_10_max value: -9.710491261292015 - type: nauc_recall_at_10_std value: -21.697648668302048 - type: nauc_recall_at_1_diff1 value: 10.03503740028087 - type: nauc_recall_at_1_max value: -13.351553937797 - type: nauc_recall_at_1_std value: -14.137614923859612 - type: nauc_recall_at_20_diff1 value: -2.0846698569576856 - type: nauc_recall_at_20_max value: 6.736841084303534 - type: nauc_recall_at_20_std value: -0.3301527168878837 - type: nauc_recall_at_3_diff1 value: 4.202256387521115 - type: nauc_recall_at_3_max value: -14.043068948669694 - type: nauc_recall_at_3_std value: -19.716257127342317 - type: nauc_recall_at_5_diff1 value: 3.26941301005235 - type: nauc_recall_at_5_max value: -12.777275311820102 - type: nauc_recall_at_5_std value: -20.917228577779866 - type: ndcg_at_1 value: 31.080999999999996 - type: ndcg_at_10 value: 56.569 - type: ndcg_at_100 value: 59.772999999999996 - type: ndcg_at_1000 value: 59.843 - type: ndcg_at_20 value: 58.933 - type: ndcg_at_3 value: 46.209 - type: ndcg_at_5 value: 51.090999999999994 - type: precision_at_1 value: 31.080999999999996 - type: precision_at_10 value: 8.578 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.744000000000001 - type: precision_at_3 value: 19.061 - type: precision_at_5 value: 13.812 - type: recall_at_1 value: 31.080999999999996 - type: recall_at_10 value: 85.775 - type: recall_at_100 value: 99.14699999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_20 value: 94.879 - type: recall_at_3 value: 57.18299999999999 - type: recall_at_5 value: 69.06099999999999 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: main_score value: 48.009758343820856 - type: v_measure value: 48.009758343820856 - type: v_measure_std value: 14.203651443985635 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: main_score value: 39.401811401341035 - type: v_measure value: 39.401811401341035 - type: v_measure_std value: 14.736655369522248 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: main_score value: 60.158996366210474 - type: map value: 60.158996366210474 - type: mrr value: 74.69034428175702 - type: nAUC_map_diff1 value: 7.7660414737755605 - type: nAUC_map_max value: 20.377348037855818 - type: nAUC_map_std value: 18.290516035806565 - type: nAUC_mrr_diff1 value: 10.721266751736124 - type: nAUC_mrr_max value: 31.3686330442438 - type: nAUC_mrr_std value: 19.240868443170196 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default 
split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cosine_pearson value: 87.53887478826596 - type: cosine_spearman value: 86.32606338345799 - type: euclidean_pearson value: 86.76233071291158 - type: euclidean_spearman value: 86.32606338345799 - type: main_score value: 86.32606338345799 - type: manhattan_pearson value: 86.05455915524152 - type: manhattan_spearman value: 85.8868967502423 - type: pearson value: 87.53887478826596 - type: spearman value: 86.32606338345799 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.92857142857144 - type: f1 value: 84.30505630131526 - type: f1_weighted value: 84.30505630131528 - type: main_score value: 84.92857142857144 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: main_score value: 40.014867273983484 - type: v_measure value: 40.014867273983484 - type: v_measure_std value: 0.6558905123714063 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: main_score value: 33.79424302438114 - type: v_measure value: 33.79424302438114 - type: v_measure_std value: 0.837779778459544 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: mteb/cqadupstack-android config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: main_score value: 52.884 - type: map_at_1 value: 34.634 - type: map_at_10 value: 46.339000000000006 - type: map_at_100 value: 47.857 - type: map_at_1000 value: 47.97 - type: map_at_20 value: 47.205000000000005 - type: map_at_3 value: 42.543 - type: map_at_5 value: 44.772 - type: mrr_at_1 value: 41.3447782546495 - type: mrr_at_10 value: 52.23857210981671 - type: mrr_at_100 value: 52.90915062899396 - type: mrr_at_1000 value: 52.95240146583995 - type: mrr_at_20 value: 52.655345331835804 - type: mrr_at_3 value: 49.71387696709583 - type: mrr_at_5 value: 51.23748211731041 - type: nauc_map_at_1000_diff1 value: 51.49705927936061 - type: nauc_map_at_1000_max value: 35.528845247090466 - type: nauc_map_at_1000_std value: -4.253741985593714 - type: nauc_map_at_100_diff1 value: 51.508584685268886 - type: nauc_map_at_100_max value: 35.56248075672379 - type: nauc_map_at_100_std value: -4.176500881199186 - type: nauc_map_at_10_diff1 value: 51.338718973920614 - type: nauc_map_at_10_max value: 34.946543347441214 - type: nauc_map_at_10_std value: -5.33037717427031 - type: nauc_map_at_1_diff1 value: 56.23620820617472 - type: nauc_map_at_1_max value: 30.320970401424987 - type: nauc_map_at_1_std value: -6.655365474007067 - type: nauc_map_at_20_diff1 value: 51.51775947048102 - type: nauc_map_at_20_max value: 35.25983470448141 - type: nauc_map_at_20_std value: -4.612859125963163 - type: nauc_map_at_3_diff1 value: 52.725269902770755 - type: nauc_map_at_3_max value: 33.299803481018195 - type: nauc_map_at_3_std value: -6.33353874546021 - type: nauc_map_at_5_diff1 value: 51.672349084315485 - type: nauc_map_at_5_max value: 34.645370794379886 - type: nauc_map_at_5_std value: -5.94117791112353 - type: nauc_mrr_at_1000_diff1 value: 49.12249635354981 - type: nauc_mrr_at_1000_max value: 36.29480359532615 - type: nauc_mrr_at_1000_std value: -4.665759763477847 - 
type: nauc_mrr_at_100_diff1 value: 49.11255442003998 - type: nauc_mrr_at_100_max value: 36.29806935257465 - type: nauc_mrr_at_100_std value: -4.663481407381479 - type: nauc_mrr_at_10_diff1 value: 48.779673215220065 - type: nauc_mrr_at_10_max value: 36.12766214960087 - type: nauc_mrr_at_10_std value: -5.0778877090392625 - type: nauc_mrr_at_1_diff1 value: 53.70331290003521 - type: nauc_mrr_at_1_max value: 35.17671705244682 - type: nauc_mrr_at_1_std value: -6.289432335416569 - type: nauc_mrr_at_20_diff1 value: 48.98440189775321 - type: nauc_mrr_at_20_max value: 36.24567442841102 - type: nauc_mrr_at_20_std value: -4.808524080549843 - type: nauc_mrr_at_3_diff1 value: 50.09142180621504 - type: nauc_mrr_at_3_max value: 36.57201237478509 - type: nauc_mrr_at_3_std value: -5.10258589719658 - type: nauc_mrr_at_5_diff1 value: 49.15413181233011 - type: nauc_mrr_at_5_max value: 36.64387975128488 - type: nauc_mrr_at_5_std value: -5.2142664019104 - type: nauc_ndcg_at_1000_diff1 value: 49.48338541267995 - type: nauc_ndcg_at_1000_max value: 36.71225124867686 - type: nauc_ndcg_at_1000_std value: -2.1565353674328636 - type: nauc_ndcg_at_100_diff1 value: 49.378803009143354 - type: nauc_ndcg_at_100_max value: 37.05072158645242 - type: nauc_ndcg_at_100_std value: -1.1554881315239078 - type: nauc_ndcg_at_10_diff1 value: 48.217255194293706 - type: nauc_ndcg_at_10_max value: 35.70709987917217 - type: nauc_ndcg_at_10_std value: -4.5843409864100835 - type: nauc_ndcg_at_1_diff1 value: 53.70331290003521 - type: nauc_ndcg_at_1_max value: 35.17671705244682 - type: nauc_ndcg_at_1_std value: -6.289432335416569 - type: nauc_ndcg_at_20_diff1 value: 48.90479671421663 - type: nauc_ndcg_at_20_max value: 35.63061062961699 - type: nauc_ndcg_at_20_std value: -3.2759049624453924 - type: nauc_ndcg_at_3_diff1 value: 50.66992100707998 - type: nauc_ndcg_at_3_max value: 35.647144096807054 - type: nauc_ndcg_at_3_std value: -4.675684277632912 - type: nauc_ndcg_at_5_diff1 value: 48.86023024957704 - type: nauc_ndcg_at_5_max value: 36.36204191994049 - type: nauc_ndcg_at_5_std value: -4.979721506683613 - type: nauc_precision_at_1000_diff1 value: -20.176146428291695 - type: nauc_precision_at_1000_max value: -4.944333530911747 - type: nauc_precision_at_1000_std value: -2.6416464331580256 - type: nauc_precision_at_100_diff1 value: -11.455305661135391 - type: nauc_precision_at_100_max value: 9.563783942313348 - type: nauc_precision_at_100_std value: 9.987888995757324 - type: nauc_precision_at_10_diff1 value: 6.577302086017673 - type: nauc_precision_at_10_max value: 25.67586949524924 - type: nauc_precision_at_10_std value: 5.543682394632135 - type: nauc_precision_at_1_diff1 value: 53.70331290003521 - type: nauc_precision_at_1_max value: 35.17671705244682 - type: nauc_precision_at_1_std value: -6.289432335416569 - type: nauc_precision_at_20_diff1 value: 0.0352451246393809 - type: nauc_precision_at_20_max value: 19.02340589034973 - type: nauc_precision_at_20_std value: 10.156322995661567 - type: nauc_precision_at_3_diff1 value: 31.114868446262108 - type: nauc_precision_at_3_max value: 35.740653736733925 - type: nauc_precision_at_3_std value: -0.4754489918596968 - type: nauc_precision_at_5_diff1 value: 17.05966182310583 - type: nauc_precision_at_5_max value: 32.37346687203089 - type: nauc_precision_at_5_std value: 1.4954175443689899 - type: nauc_recall_at_1000_diff1 value: 42.86116448480766 - type: nauc_recall_at_1000_max value: 63.759509563968976 - type: nauc_recall_at_1000_std value: 61.175429354991614 - type: nauc_recall_at_100_diff1 value: 
40.88375670987642 - type: nauc_recall_at_100_max value: 44.62608189829668 - type: nauc_recall_at_100_std value: 25.55163256804942 - type: nauc_recall_at_10_diff1 value: 37.759771219935175 - type: nauc_recall_at_10_max value: 31.146081092167627 - type: nauc_recall_at_10_std value: -4.512890345394815 - type: nauc_recall_at_1_diff1 value: 56.23620820617472 - type: nauc_recall_at_1_max value: 30.320970401424987 - type: nauc_recall_at_1_std value: -6.655365474007067 - type: nauc_recall_at_20_diff1 value: 38.4827047216752 - type: nauc_recall_at_20_max value: 30.50125803520275 - type: nauc_recall_at_20_std value: 0.8771358044937425 - type: nauc_recall_at_3_diff1 value: 47.487079446530906 - type: nauc_recall_at_3_max value: 32.19896007873808 - type: nauc_recall_at_3_std value: -5.164803420738882 - type: nauc_recall_at_5_diff1 value: 41.699415045286415 - type: nauc_recall_at_5_max value: 33.168829040464196 - type: nauc_recall_at_5_std value: -5.366546702094067 - type: ndcg_at_1 value: 41.345 - type: ndcg_at_10 value: 52.884 - type: ndcg_at_100 value: 57.94200000000001 - type: ndcg_at_1000 value: 59.68 - type: ndcg_at_20 value: 54.957 - type: ndcg_at_3 value: 47.692 - type: ndcg_at_5 value: 50.251000000000005 - type: precision_at_1 value: 41.345 - type: precision_at_10 value: 10.057 - type: precision_at_100 value: 1.574 - type: precision_at_1000 value: 0.201 - type: precision_at_20 value: 5.9799999999999995 - type: precision_at_3 value: 22.842000000000002 - type: precision_at_5 value: 16.595 - type: recall_at_1 value: 34.634 - type: recall_at_10 value: 65.185 - type: recall_at_100 value: 85.703 - type: recall_at_1000 value: 96.65599999999999 - type: recall_at_20 value: 72.322 - type: recall_at_3 value: 50.182 - type: recall_at_5 value: 57.159000000000006 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: mteb/cqadupstack-english config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: main_score value: 48.264 - type: map_at_1 value: 31.224 - type: map_at_10 value: 42.332 - type: map_at_100 value: 43.533 - type: map_at_1000 value: 43.662 - type: map_at_20 value: 42.972 - type: map_at_3 value: 39.159 - type: map_at_5 value: 41.047 - type: mrr_at_1 value: 39.04458598726115 - type: mrr_at_10 value: 48.18686179354971 - type: mrr_at_100 value: 48.803902946647234 - type: mrr_at_1000 value: 48.84702137486075 - type: mrr_at_20 value: 48.56368295512913 - type: mrr_at_3 value: 45.83864118895968 - type: mrr_at_5 value: 47.20806794055207 - type: nauc_map_at_1000_diff1 value: 51.86414274615986 - type: nauc_map_at_1000_max value: 34.717053941484025 - type: nauc_map_at_1000_std value: -4.340680651811943 - type: nauc_map_at_100_diff1 value: 51.84970191815774 - type: nauc_map_at_100_max value: 34.64676814212115 - type: nauc_map_at_100_std value: -4.4387297635880385 - type: nauc_map_at_10_diff1 value: 52.119436277416945 - type: nauc_map_at_10_max value: 33.94135075756255 - type: nauc_map_at_10_std value: -5.625757602694689 - type: nauc_map_at_1_diff1 value: 57.92845763299044 - type: nauc_map_at_1_max value: 29.3199164115535 - type: nauc_map_at_1_std value: -11.586283921183611 - type: nauc_map_at_20_diff1 value: 51.92734614822424 - type: nauc_map_at_20_max value: 34.35250084161699 - type: nauc_map_at_20_std value: -5.049283716884917 - type: nauc_map_at_3_diff1 value: 52.783776450874356 - type: nauc_map_at_3_max value: 32.394255917655535 - type: nauc_map_at_3_std value: -8.05902730660978 - type: nauc_map_at_5_diff1 value: 52.14993873615333 - type: 
nauc_map_at_5_max value: 33.48431923578608 - type: nauc_map_at_5_std value: -6.472903440360678 - type: nauc_mrr_at_1000_diff1 value: 50.49829531271091 - type: nauc_mrr_at_1000_max value: 37.183131918098425 - type: nauc_mrr_at_1000_std value: 0.4928095353543418 - type: nauc_mrr_at_100_diff1 value: 50.494636141021424 - type: nauc_mrr_at_100_max value: 37.185446950719715 - type: nauc_mrr_at_100_std value: 0.5056844413835279 - type: nauc_mrr_at_10_diff1 value: 50.55418166759066 - type: nauc_mrr_at_10_max value: 37.17369235180479 - type: nauc_mrr_at_10_std value: 0.3511264489316608 - type: nauc_mrr_at_1_diff1 value: 55.09381247060509 - type: nauc_mrr_at_1_max value: 37.17089033507927 - type: nauc_mrr_at_1_std value: -2.545073558300969 - type: nauc_mrr_at_20_diff1 value: 50.46232349188045 - type: nauc_mrr_at_20_max value: 37.22028938157565 - type: nauc_mrr_at_20_std value: 0.4342508184428254 - type: nauc_mrr_at_3_diff1 value: 50.98797216868357 - type: nauc_mrr_at_3_max value: 37.32821622965925 - type: nauc_mrr_at_3_std value: -0.6918122573096884 - type: nauc_mrr_at_5_diff1 value: 50.477903924122025 - type: nauc_mrr_at_5_max value: 37.343161615517296 - type: nauc_mrr_at_5_std value: 0.34187371397979793 - type: nauc_ndcg_at_1000_diff1 value: 49.71083273417971 - type: nauc_ndcg_at_1000_max value: 36.08714449707927 - type: nauc_ndcg_at_1000_std value: 0.3359295264579242 - type: nauc_ndcg_at_100_diff1 value: 49.64047591726873 - type: nauc_ndcg_at_100_max value: 36.0502827680962 - type: nauc_ndcg_at_100_std value: 0.4394585830222923 - type: nauc_ndcg_at_10_diff1 value: 50.3895028633975 - type: nauc_ndcg_at_10_max value: 35.51838515595454 - type: nauc_ndcg_at_10_std value: -1.8340842845181509 - type: nauc_ndcg_at_1_diff1 value: 55.09381247060509 - type: nauc_ndcg_at_1_max value: 37.17089033507927 - type: nauc_ndcg_at_1_std value: -2.545073558300969 - type: nauc_ndcg_at_20_diff1 value: 49.975850062007375 - type: nauc_ndcg_at_20_max value: 35.8777155711073 - type: nauc_ndcg_at_20_std value: -1.1833564484981665 - type: nauc_ndcg_at_3_diff1 value: 50.3823214340417 - type: nauc_ndcg_at_3_max value: 35.776477162991746 - type: nauc_ndcg_at_3_std value: -3.0969092422279623 - type: nauc_ndcg_at_5_diff1 value: 50.18424405483706 - type: nauc_ndcg_at_5_max value: 35.886540678742485 - type: nauc_ndcg_at_5_std value: -2.2048728336054912 - type: nauc_precision_at_1000_diff1 value: -8.409825453277659 - type: nauc_precision_at_1000_max value: 14.148796859940632 - type: nauc_precision_at_1000_std value: 28.34712816378856 - type: nauc_precision_at_100_diff1 value: -4.133099395945424 - type: nauc_precision_at_100_max value: 23.436894225838895 - type: nauc_precision_at_100_std value: 31.777687917658554 - type: nauc_precision_at_10_diff1 value: 12.456499608847746 - type: nauc_precision_at_10_max value: 34.40385767678226 - type: nauc_precision_at_10_std value: 22.64168731207244 - type: nauc_precision_at_1_diff1 value: 55.09381247060509 - type: nauc_precision_at_1_max value: 37.17089033507927 - type: nauc_precision_at_1_std value: -2.545073558300969 - type: nauc_precision_at_20_diff1 value: 4.838516065171166 - type: nauc_precision_at_20_max value: 31.381417947568412 - type: nauc_precision_at_20_std value: 26.974660907322917 - type: nauc_precision_at_3_diff1 value: 28.180760599976384 - type: nauc_precision_at_3_max value: 36.40321247194992 - type: nauc_precision_at_3_std value: 9.375871028699667 - type: nauc_precision_at_5_diff1 value: 19.689988735115058 - type: nauc_precision_at_5_max value: 35.98837508752083 - type: 
nauc_precision_at_5_std value: 16.284464606894232 - type: nauc_recall_at_1000_diff1 value: 33.594125915695884 - type: nauc_recall_at_1000_max value: 31.574941156196807 - type: nauc_recall_at_1000_std value: 20.460707032380316 - type: nauc_recall_at_100_diff1 value: 38.54327301097089 - type: nauc_recall_at_100_max value: 33.368528599783126 - type: nauc_recall_at_100_std value: 15.321500393966641 - type: nauc_recall_at_10_diff1 value: 44.219731053687255 - type: nauc_recall_at_10_max value: 31.484342080988824 - type: nauc_recall_at_10_std value: 0.22452148883121484 - type: nauc_recall_at_1_diff1 value: 57.92845763299044 - type: nauc_recall_at_1_max value: 29.3199164115535 - type: nauc_recall_at_1_std value: -11.586283921183611 - type: nauc_recall_at_20_diff1 value: 41.39285600168573 - type: nauc_recall_at_20_max value: 32.966202138611465 - type: nauc_recall_at_20_std value: 3.365583403518244 - type: nauc_recall_at_3_diff1 value: 47.33546382576856 - type: nauc_recall_at_3_max value: 30.988541475501425 - type: nauc_recall_at_3_std value: -5.87940259105687 - type: nauc_recall_at_5_diff1 value: 45.27313627261692 - type: nauc_recall_at_5_max value: 32.34545008582682 - type: nauc_recall_at_5_std value: -1.6738776274622713 - type: ndcg_at_1 value: 39.045 - type: ndcg_at_10 value: 48.264 - type: ndcg_at_100 value: 52.493 - type: ndcg_at_1000 value: 54.457 - type: ndcg_at_20 value: 49.888 - type: ndcg_at_3 value: 43.86 - type: ndcg_at_5 value: 45.983000000000004 - type: precision_at_1 value: 39.045 - type: precision_at_10 value: 9.096 - type: precision_at_100 value: 1.442 - type: precision_at_1000 value: 0.191 - type: precision_at_20 value: 5.309 - type: precision_at_3 value: 21.316 - type: precision_at_5 value: 15.197 - type: recall_at_1 value: 31.224 - type: recall_at_10 value: 59.080999999999996 - type: recall_at_100 value: 76.897 - type: recall_at_1000 value: 89.23 - type: recall_at_20 value: 64.891 - type: recall_at_3 value: 46.076 - type: recall_at_5 value: 51.964 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: mteb/cqadupstack-gaming config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: main_score value: 62.366 - type: map_at_1 value: 42.703 - type: map_at_10 value: 56.281000000000006 - type: map_at_100 value: 57.260999999999996 - type: map_at_1000 value: 57.30800000000001 - type: map_at_20 value: 56.871 - type: map_at_3 value: 52.897000000000006 - type: map_at_5 value: 54.773 - type: mrr_at_1 value: 48.589341692789965 - type: mrr_at_10 value: 59.43538836642291 - type: mrr_at_100 value: 59.999373625798235 - type: mrr_at_1000 value: 60.02341349127948 - type: mrr_at_20 value: 59.78236245014694 - type: mrr_at_3 value: 56.99059561128534 - type: mrr_at_5 value: 58.373040752351216 - type: nauc_map_at_1000_diff1 value: 51.724911969542475 - type: nauc_map_at_1000_max value: 31.59720256654406 - type: nauc_map_at_1000_std value: -8.448863423330733 - type: nauc_map_at_100_diff1 value: 51.721207885585294 - type: nauc_map_at_100_max value: 31.598189555174677 - type: nauc_map_at_100_std value: -8.415293705149518 - type: nauc_map_at_10_diff1 value: 51.74316546903847 - type: nauc_map_at_10_max value: 31.370796021816087 - type: nauc_map_at_10_std value: -9.144187110965651 - type: nauc_map_at_1_diff1 value: 55.602123379999405 - type: nauc_map_at_1_max value: 26.15423784568626 - type: nauc_map_at_1_std value: -11.354042579102689 - type: nauc_map_at_20_diff1 value: 51.71343659271482 - type: nauc_map_at_20_max value: 31.53988815092091 - type: 
nauc_map_at_20_std value: -8.65212495986148 - type: nauc_map_at_3_diff1 value: 52.064639443577846 - type: nauc_map_at_3_max value: 30.3485604522721 - type: nauc_map_at_3_std value: -10.751274075635509 - type: nauc_map_at_5_diff1 value: 51.72321940513861 - type: nauc_map_at_5_max value: 30.392319659455435 - type: nauc_map_at_5_std value: -9.939778501885101 - type: nauc_mrr_at_1000_diff1 value: 51.184984251728025 - type: nauc_mrr_at_1000_max value: 32.69216958808548 - type: nauc_mrr_at_1000_std value: -8.500776574802599 - type: nauc_mrr_at_100_diff1 value: 51.17941032241811 - type: nauc_mrr_at_100_max value: 32.70608756736136 - type: nauc_mrr_at_100_std value: -8.477679942920167 - type: nauc_mrr_at_10_diff1 value: 51.07904444322852 - type: nauc_mrr_at_10_max value: 32.65962497893277 - type: nauc_mrr_at_10_std value: -8.709383804816481 - type: nauc_mrr_at_1_diff1 value: 54.53142920528978 - type: nauc_mrr_at_1_max value: 30.926785799334677 - type: nauc_mrr_at_1_std value: -10.41145527848442 - type: nauc_mrr_at_20_diff1 value: 51.14693383001116 - type: nauc_mrr_at_20_max value: 32.73093259139165 - type: nauc_mrr_at_20_std value: -8.447633887171534 - type: nauc_mrr_at_3_diff1 value: 51.17432400675771 - type: nauc_mrr_at_3_max value: 32.85252288214242 - type: nauc_mrr_at_3_std value: -9.21642979066159 - type: nauc_mrr_at_5_diff1 value: 51.036935248981905 - type: nauc_mrr_at_5_max value: 32.502626235077095 - type: nauc_mrr_at_5_std value: -8.948887571702919 - type: nauc_ndcg_at_1000_diff1 value: 50.73024891705996 - type: nauc_ndcg_at_1000_max value: 33.26584662078177 - type: nauc_ndcg_at_1000_std value: -6.163854205845618 - type: nauc_ndcg_at_100_diff1 value: 50.67040290788501 - type: nauc_ndcg_at_100_max value: 33.68165097437155 - type: nauc_ndcg_at_100_std value: -5.301942481514177 - type: nauc_ndcg_at_10_diff1 value: 50.407269736351054 - type: nauc_ndcg_at_10_max value: 33.1723247102446 - type: nauc_ndcg_at_10_std value: -7.313191608002288 - type: nauc_ndcg_at_1_diff1 value: 54.53142920528978 - type: nauc_ndcg_at_1_max value: 30.926785799334677 - type: nauc_ndcg_at_1_std value: -10.41145527848442 - type: nauc_ndcg_at_20_diff1 value: 50.45722009686969 - type: nauc_ndcg_at_20_max value: 33.54250850995858 - type: nauc_ndcg_at_20_std value: -6.008420175252642 - type: nauc_ndcg_at_3_diff1 value: 50.769657622259686 - type: nauc_ndcg_at_3_max value: 31.792120043553002 - type: nauc_ndcg_at_3_std value: -10.040327445335686 - type: nauc_ndcg_at_5_diff1 value: 50.398976656987614 - type: nauc_ndcg_at_5_max value: 31.61780666125045 - type: nauc_ndcg_at_5_std value: -8.943124136769121 - type: nauc_precision_at_1000_diff1 value: -17.275717791952 - type: nauc_precision_at_1000_max value: 7.275527027803384 - type: nauc_precision_at_1000_std value: 16.685486896410826 - type: nauc_precision_at_100_diff1 value: -11.162266422032406 - type: nauc_precision_at_100_max value: 12.70258577369679 - type: nauc_precision_at_100_std value: 21.391285680664513 - type: nauc_precision_at_10_diff1 value: 7.81828602576801 - type: nauc_precision_at_10_max value: 24.78598247621288 - type: nauc_precision_at_10_std value: 9.374021745818432 - type: nauc_precision_at_1_diff1 value: 54.53142920528978 - type: nauc_precision_at_1_max value: 30.926785799334677 - type: nauc_precision_at_1_std value: -10.41145527848442 - type: nauc_precision_at_20_diff1 value: 0.1631191398252266 - type: nauc_precision_at_20_max value: 20.619391150501272 - type: nauc_precision_at_20_std value: 16.276264697116872 - type: nauc_precision_at_3_diff1 value: 
27.04714503298839 - type: nauc_precision_at_3_max value: 30.101606964258337 - type: nauc_precision_at_3_std value: -3.681729229946907 - type: nauc_precision_at_5_diff1 value: 17.843974173274304 - type: nauc_precision_at_5_max value: 25.676881643654763 - type: nauc_precision_at_5_std value: 1.5965157990195873 - type: nauc_recall_at_1000_diff1 value: 29.087262485289735 - type: nauc_recall_at_1000_max value: 59.55059060998873 - type: nauc_recall_at_1000_std value: 62.21218125216127 - type: nauc_recall_at_100_diff1 value: 41.30594954847261 - type: nauc_recall_at_100_max value: 48.03865105456248 - type: nauc_recall_at_100_std value: 28.904820877938946 - type: nauc_recall_at_10_diff1 value: 43.528832373563795 - type: nauc_recall_at_10_max value: 36.333747103215266 - type: nauc_recall_at_10_std value: -0.586937217589867 - type: nauc_recall_at_1_diff1 value: 55.602123379999405 - type: nauc_recall_at_1_max value: 26.15423784568626 - type: nauc_recall_at_1_std value: -11.354042579102689 - type: nauc_recall_at_20_diff1 value: 42.86486871096986 - type: nauc_recall_at_20_max value: 39.37052680687811 - type: nauc_recall_at_20_std value: 7.7270172598031985 - type: nauc_recall_at_3_diff1 value: 46.744057097749746 - type: nauc_recall_at_3_max value: 32.0901543978326 - type: nauc_recall_at_3_std value: -9.836059759091158 - type: nauc_recall_at_5_diff1 value: 44.52443640046374 - type: nauc_recall_at_5_max value: 31.155871822952808 - type: nauc_recall_at_5_std value: -7.116612032547676 - type: ndcg_at_1 value: 48.589 - type: ndcg_at_10 value: 62.366 - type: ndcg_at_100 value: 66.011 - type: ndcg_at_1000 value: 66.88199999999999 - type: ndcg_at_20 value: 63.979 - type: ndcg_at_3 value: 56.764 - type: ndcg_at_5 value: 59.426 - type: precision_at_1 value: 48.589 - type: precision_at_10 value: 9.981 - type: precision_at_100 value: 1.277 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_20 value: 5.514 - type: precision_at_3 value: 25.308000000000003 - type: precision_at_5 value: 17.241 - type: recall_at_1 value: 42.703 - type: recall_at_10 value: 77.08 - type: recall_at_100 value: 92.374 - type: recall_at_1000 value: 98.402 - type: recall_at_20 value: 82.87400000000001 - type: recall_at_3 value: 62.138000000000005 - type: recall_at_5 value: 68.679 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: mteb/cqadupstack-gis config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: main_score value: 39.971000000000004 - type: map_at_1 value: 25.06 - type: map_at_10 value: 34.551 - type: map_at_100 value: 35.568 - type: map_at_1000 value: 35.65 - type: map_at_20 value: 35.127 - type: map_at_3 value: 31.936999999999998 - type: map_at_5 value: 33.186 - type: mrr_at_1 value: 27.11864406779661 - type: mrr_at_10 value: 36.72652676889963 - type: mrr_at_100 value: 37.57204686098606 - type: mrr_at_1000 value: 37.63141267969674 - type: mrr_at_20 value: 37.19310670147632 - type: mrr_at_3 value: 34.27495291902072 - type: mrr_at_5 value: 35.438794726930304 - type: nauc_map_at_1000_diff1 value: 43.63829107634628 - type: nauc_map_at_1000_max value: 23.954060999822257 - type: nauc_map_at_1000_std value: -0.5807446969781898 - type: nauc_map_at_100_diff1 value: 43.610748406014466 - type: nauc_map_at_100_max value: 23.94949736158448 - type: nauc_map_at_100_std value: -0.5982601848367343 - type: nauc_map_at_10_diff1 value: 43.72900243122612 - type: nauc_map_at_10_max value: 23.508469522079885 - type: nauc_map_at_10_std value: -0.5258931194184133 - type: 
nauc_map_at_1_diff1 value: 50.922871467903654 - type: nauc_map_at_1_max value: 24.6067671408884 - type: nauc_map_at_1_std value: -4.630126214452492 - type: nauc_map_at_20_diff1 value: 43.63024854824786 - type: nauc_map_at_20_max value: 23.874524344212734 - type: nauc_map_at_20_std value: -0.556366665388133 - type: nauc_map_at_3_diff1 value: 44.38253552931588 - type: nauc_map_at_3_max value: 22.561513802056236 - type: nauc_map_at_3_std value: -3.005119773408719 - type: nauc_map_at_5_diff1 value: 44.016586535650795 - type: nauc_map_at_5_max value: 23.302456735449038 - type: nauc_map_at_5_std value: -1.7618309245289323 - type: nauc_mrr_at_1000_diff1 value: 42.68205493907015 - type: nauc_mrr_at_1000_max value: 26.024690905326025 - type: nauc_mrr_at_1000_std value: 0.6287706252427459 - type: nauc_mrr_at_100_diff1 value: 42.654961103491004 - type: nauc_mrr_at_100_max value: 26.029087860328065 - type: nauc_mrr_at_100_std value: 0.6163052064323858 - type: nauc_mrr_at_10_diff1 value: 42.56564515109072 - type: nauc_mrr_at_10_max value: 25.666414824261224 - type: nauc_mrr_at_10_std value: 0.7949641234835698 - type: nauc_mrr_at_1_diff1 value: 49.966125488185206 - type: nauc_mrr_at_1_max value: 27.193710462071348 - type: nauc_mrr_at_1_std value: -2.2786990240033718 - type: nauc_mrr_at_20_diff1 value: 42.65274684886744 - type: nauc_mrr_at_20_max value: 26.052180768841172 - type: nauc_mrr_at_20_std value: 0.7171447318848092 - type: nauc_mrr_at_3_diff1 value: 43.22408289408012 - type: nauc_mrr_at_3_max value: 25.34061478734211 - type: nauc_mrr_at_3_std value: -1.1093305128661515 - type: nauc_mrr_at_5_diff1 value: 42.87983482470224 - type: nauc_mrr_at_5_max value: 25.91557396366082 - type: nauc_mrr_at_5_std value: -0.13066697110897257 - type: nauc_ndcg_at_1000_diff1 value: 41.53426396594562 - type: nauc_ndcg_at_1000_max value: 25.526814765685046 - type: nauc_ndcg_at_1000_std value: 2.2841859589382487 - type: nauc_ndcg_at_100_diff1 value: 40.61825803826763 - type: nauc_ndcg_at_100_max value: 25.344384823963455 - type: nauc_ndcg_at_100_std value: 1.9818508179504288 - type: nauc_ndcg_at_10_diff1 value: 40.82184056229221 - type: nauc_ndcg_at_10_max value: 23.832384873845786 - type: nauc_ndcg_at_10_std value: 2.4835478280573966 - type: nauc_ndcg_at_1_diff1 value: 49.966125488185206 - type: nauc_ndcg_at_1_max value: 27.193710462071348 - type: nauc_ndcg_at_1_std value: -2.2786990240033718 - type: nauc_ndcg_at_20_diff1 value: 40.648257910495396 - type: nauc_ndcg_at_20_max value: 25.1143676738966 - type: nauc_ndcg_at_20_std value: 2.2994895733337084 - type: nauc_ndcg_at_3_diff1 value: 42.115026070978224 - type: nauc_ndcg_at_3_max value: 22.895171049309084 - type: nauc_ndcg_at_3_std value: -2.160818780944711 - type: nauc_ndcg_at_5_diff1 value: 41.608274106869516 - type: nauc_ndcg_at_5_max value: 23.8694881434902 - type: nauc_ndcg_at_5_std value: -0.2034244843217431 - type: nauc_precision_at_1000_diff1 value: -0.08291845059826138 - type: nauc_precision_at_1000_max value: 20.313650012376964 - type: nauc_precision_at_1000_std value: 13.510706405842074 - type: nauc_precision_at_100_diff1 value: 9.885311318637227 - type: nauc_precision_at_100_max value: 26.374081882816075 - type: nauc_precision_at_100_std value: 12.021731392392521 - type: nauc_precision_at_10_diff1 value: 25.883633917220507 - type: nauc_precision_at_10_max value: 26.552638392568888 - type: nauc_precision_at_10_std value: 14.460458912586468 - type: nauc_precision_at_1_diff1 value: 49.966125488185206 - type: nauc_precision_at_1_max value: 
27.193710462071348 - type: nauc_precision_at_1_std value: -2.2786990240033718 - type: nauc_precision_at_20_diff1 value: 20.695053025711932 - type: nauc_precision_at_20_max value: 29.151449538281586 - type: nauc_precision_at_20_std value: 13.496486151691874 - type: nauc_precision_at_3_diff1 value: 33.475423305252995 - type: nauc_precision_at_3_max value: 24.486060318210537 - type: nauc_precision_at_3_std value: 1.9847009660547001 - type: nauc_precision_at_5_diff1 value: 31.14043721035368 - type: nauc_precision_at_5_max value: 27.224889907879906 - type: nauc_precision_at_5_std value: 6.539905565691817 - type: nauc_recall_at_1000_diff1 value: 34.33506268392135 - type: nauc_recall_at_1000_max value: 37.11939420491589 - type: nauc_recall_at_1000_std value: 31.371417780064085 - type: nauc_recall_at_100_diff1 value: 26.348832193119886 - type: nauc_recall_at_100_max value: 28.096364816659065 - type: nauc_recall_at_100_std value: 11.980597075104523 - type: nauc_recall_at_10_diff1 value: 31.684763745718985 - type: nauc_recall_at_10_max value: 21.556273820201323 - type: nauc_recall_at_10_std value: 10.480665669920347 - type: nauc_recall_at_1_diff1 value: 50.922871467903654 - type: nauc_recall_at_1_max value: 24.6067671408884 - type: nauc_recall_at_1_std value: -4.630126214452492 - type: nauc_recall_at_20_diff1 value: 30.160960913064304 - type: nauc_recall_at_20_max value: 26.303437539000505 - type: nauc_recall_at_20_std value: 10.389326804314718 - type: nauc_recall_at_3_diff1 value: 36.88184391262179 - type: nauc_recall_at_3_max value: 20.190953608016223 - type: nauc_recall_at_3_std value: -1.3089868832214695 - type: nauc_recall_at_5_diff1 value: 34.99254305849935 - type: nauc_recall_at_5_max value: 22.230820355560727 - type: nauc_recall_at_5_std value: 2.678023175693563 - type: ndcg_at_1 value: 27.119 - type: ndcg_at_10 value: 39.971000000000004 - type: ndcg_at_100 value: 44.952 - type: ndcg_at_1000 value: 46.821 - type: ndcg_at_20 value: 41.881 - type: ndcg_at_3 value: 34.727000000000004 - type: ndcg_at_5 value: 36.814 - type: precision_at_1 value: 27.119 - type: precision_at_10 value: 6.271 - type: precision_at_100 value: 0.9249999999999999 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_20 value: 3.605 - type: precision_at_3 value: 14.991 - type: precision_at_5 value: 10.26 - type: recall_at_1 value: 25.06 - type: recall_at_10 value: 54.635 - type: recall_at_100 value: 77.639 - type: recall_at_1000 value: 91.301 - type: recall_at_20 value: 61.763 - type: recall_at_3 value: 40.143 - type: recall_at_5 value: 45.193 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: mteb/cqadupstack-mathematica config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: main_score value: 30.308 - type: map_at_1 value: 16.154 - type: map_at_10 value: 24.743000000000002 - type: map_at_100 value: 26.069 - type: map_at_1000 value: 26.197 - type: map_at_20 value: 25.46 - type: map_at_3 value: 21.816 - type: map_at_5 value: 23.443 - type: mrr_at_1 value: 20.149253731343283 - type: mrr_at_10 value: 29.10847547974411 - type: mrr_at_100 value: 30.13595361660887 - type: mrr_at_1000 value: 30.211025784243766 - type: mrr_at_20 value: 29.706267830545784 - type: mrr_at_3 value: 26.451077943615264 - type: mrr_at_5 value: 27.868988391376444 - type: nauc_map_at_1000_diff1 value: 31.47263576493308 - type: nauc_map_at_1000_max value: 18.49384617286511 - type: nauc_map_at_1000_std value: 0.5754985941500461 - type: nauc_map_at_100_diff1 value: 
31.44160594144755 - type: nauc_map_at_100_max value: 18.46607563648124 - type: nauc_map_at_100_std value: 0.5879794819886102 - type: nauc_map_at_10_diff1 value: 31.71626861875994 - type: nauc_map_at_10_max value: 18.662179744257916 - type: nauc_map_at_10_std value: -0.013163124651967131 - type: nauc_map_at_1_diff1 value: 37.33971420967126 - type: nauc_map_at_1_max value: 17.543923177907566 - type: nauc_map_at_1_std value: -0.6312070176608349 - type: nauc_map_at_20_diff1 value: 31.443960381506987 - type: nauc_map_at_20_max value: 18.39695256653282 - type: nauc_map_at_20_std value: 0.24204111048796523 - type: nauc_map_at_3_diff1 value: 32.66647821102399 - type: nauc_map_at_3_max value: 17.166769100670678 - type: nauc_map_at_3_std value: 0.2511302116485242 - type: nauc_map_at_5_diff1 value: 31.814363889022516 - type: nauc_map_at_5_max value: 17.450292361372707 - type: nauc_map_at_5_std value: -0.45123652210324744 - type: nauc_mrr_at_1000_diff1 value: 31.885514197021163 - type: nauc_mrr_at_1000_max value: 18.697001653609462 - type: nauc_mrr_at_1000_std value: -0.7075589181761113 - type: nauc_mrr_at_100_diff1 value: 31.859235999194958 - type: nauc_mrr_at_100_max value: 18.685923862530778 - type: nauc_mrr_at_100_std value: -0.7027394321332194 - type: nauc_mrr_at_10_diff1 value: 32.00819090481358 - type: nauc_mrr_at_10_max value: 18.858552402155677 - type: nauc_mrr_at_10_std value: -0.8729017160389365 - type: nauc_mrr_at_1_diff1 value: 36.55463496530352 - type: nauc_mrr_at_1_max value: 17.893580417517832 - type: nauc_mrr_at_1_std value: -2.7268036629932895 - type: nauc_mrr_at_20_diff1 value: 31.79086317678036 - type: nauc_mrr_at_20_max value: 18.72847970596078 - type: nauc_mrr_at_20_std value: -0.7526268512949703 - type: nauc_mrr_at_3_diff1 value: 32.24844813811655 - type: nauc_mrr_at_3_max value: 17.810304497390504 - type: nauc_mrr_at_3_std value: -1.3573591649881485 - type: nauc_mrr_at_5_diff1 value: 32.29719658849603 - type: nauc_mrr_at_5_max value: 18.01176246232617 - type: nauc_mrr_at_5_std value: -1.3156140758149915 - type: nauc_ndcg_at_1000_diff1 value: 30.420235654700672 - type: nauc_ndcg_at_1000_max value: 20.14284394608303 - type: nauc_ndcg_at_1000_std value: 2.409633449702056 - type: nauc_ndcg_at_100_diff1 value: 29.54867297316048 - type: nauc_ndcg_at_100_max value: 19.63470407851956 - type: nauc_ndcg_at_100_std value: 3.062730904774899 - type: nauc_ndcg_at_10_diff1 value: 30.288655944213627 - type: nauc_ndcg_at_10_max value: 20.304033843092395 - type: nauc_ndcg_at_10_std value: 0.7042902099149692 - type: nauc_ndcg_at_1_diff1 value: 36.55463496530352 - type: nauc_ndcg_at_1_max value: 17.893580417517832 - type: nauc_ndcg_at_1_std value: -2.7268036629932895 - type: nauc_ndcg_at_20_diff1 value: 29.315712836253248 - type: nauc_ndcg_at_20_max value: 19.55539590463071 - type: nauc_ndcg_at_20_std value: 1.4238452417516618 - type: nauc_ndcg_at_3_diff1 value: 31.54355638372054 - type: nauc_ndcg_at_3_max value: 17.766299875547816 - type: nauc_ndcg_at_3_std value: 0.28964137714040095 - type: nauc_ndcg_at_5_diff1 value: 30.818060499932542 - type: nauc_ndcg_at_5_max value: 18.068091310151164 - type: nauc_ndcg_at_5_std value: -0.16020203299958868 - type: nauc_precision_at_1000_diff1 value: 1.8177927649439825 - type: nauc_precision_at_1000_max value: 1.9156412467603505 - type: nauc_precision_at_1000_std value: -1.0195378172264247 - type: nauc_precision_at_100_diff1 value: 7.852064632368817 - type: nauc_precision_at_100_max value: 11.41378732164787 - type: nauc_precision_at_100_std value: 
8.845589790612463 - type: nauc_precision_at_10_diff1 value: 19.576158908850957 - type: nauc_precision_at_10_max value: 22.963840017872794 - type: nauc_precision_at_10_std value: 2.426835326713512 - type: nauc_precision_at_1_diff1 value: 36.55463496530352 - type: nauc_precision_at_1_max value: 17.893580417517832 - type: nauc_precision_at_1_std value: -2.7268036629932895 - type: nauc_precision_at_20_diff1 value: 15.305985286454149 - type: nauc_precision_at_20_max value: 18.827005672571858 - type: nauc_precision_at_20_std value: 3.992229421735929 - type: nauc_precision_at_3_diff1 value: 26.358279542321966 - type: nauc_precision_at_3_max value: 19.340749761958552 - type: nauc_precision_at_3_std value: 0.8501109386129221 - type: nauc_precision_at_5_diff1 value: 22.462129435924727 - type: nauc_precision_at_5_max value: 18.890119720243188 - type: nauc_precision_at_5_std value: 0.21756962337473482 - type: nauc_recall_at_1000_diff1 value: 25.079504569576184 - type: nauc_recall_at_1000_max value: 36.71138367024086 - type: nauc_recall_at_1000_std value: 18.882277140819067 - type: nauc_recall_at_100_diff1 value: 19.980741195591563 - type: nauc_recall_at_100_max value: 21.648381374802273 - type: nauc_recall_at_100_std value: 14.541121099803092 - type: nauc_recall_at_10_diff1 value: 24.61930855038573 - type: nauc_recall_at_10_max value: 22.98083391642699 - type: nauc_recall_at_10_std value: 2.860945348018573 - type: nauc_recall_at_1_diff1 value: 37.33971420967126 - type: nauc_recall_at_1_max value: 17.543923177907566 - type: nauc_recall_at_1_std value: -0.6312070176608349 - type: nauc_recall_at_20_diff1 value: 20.478434900407255 - type: nauc_recall_at_20_max value: 20.439655780702832 - type: nauc_recall_at_20_std value: 5.4039574030039885 - type: nauc_recall_at_3_diff1 value: 27.845972047264578 - type: nauc_recall_at_3_max value: 16.649682003649193 - type: nauc_recall_at_3_std value: 2.171037068117454 - type: nauc_recall_at_5_diff1 value: 26.76354795664187 - type: nauc_recall_at_5_max value: 17.488511178851763 - type: nauc_recall_at_5_std value: 0.7909085800561211 - type: ndcg_at_1 value: 20.149 - type: ndcg_at_10 value: 30.308 - type: ndcg_at_100 value: 36.361 - type: ndcg_at_1000 value: 39.128 - type: ndcg_at_20 value: 32.719 - type: ndcg_at_3 value: 24.969 - type: ndcg_at_5 value: 27.409 - type: precision_at_1 value: 20.149 - type: precision_at_10 value: 5.784000000000001 - type: precision_at_100 value: 1.011 - type: precision_at_1000 value: 0.13799999999999998 - type: precision_at_20 value: 3.5319999999999996 - type: precision_at_3 value: 12.106 - type: precision_at_5 value: 9.030000000000001 - type: recall_at_1 value: 16.154 - type: recall_at_10 value: 43.092000000000006 - type: recall_at_100 value: 68.998 - type: recall_at_1000 value: 88.127 - type: recall_at_20 value: 51.937999999999995 - type: recall_at_3 value: 28.473 - type: recall_at_5 value: 34.624 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: mteb/cqadupstack-physics config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: main_score value: 46.931 - type: map_at_1 value: 30.036 - type: map_at_10 value: 40.753 - type: map_at_100 value: 42.098 - type: map_at_1000 value: 42.201 - type: map_at_20 value: 41.494 - type: map_at_3 value: 37.55 - type: map_at_5 value: 39.266 - type: mrr_at_1 value: 36.57362848893166 - type: mrr_at_10 value: 46.15953985058891 - type: mrr_at_100 value: 46.964409847048735 - type: mrr_at_1000 value: 47.006684152310186 - type: mrr_at_20 value: 
      46.63576095668375
    - type: mrr_at_3
      value: 43.39108116778952
    - type: mrr_at_5
      value: 45.0609560474815
    - type: nauc_map_at_1000_diff1
      value: 50.008393482865934
    - type: nauc_map_at_1000_max
      value: 27.44292854668337
    - type: nauc_map_at_1000_std
      value: -1.1827744848485413
    - type: nauc_map_at_100_diff1
      value: 50.01593736030433
    - type: nauc_map_at_100_max
      value: 27.401227555060693
    - type: nauc_map_at_100_std
      value: -1.226830892874052
    - type: nauc_map_at_10_diff1
      value: 50.22186852707843
    - type: nauc_map_at_10_max
      value: 26.882005386152162
    - type: nauc_map_at_10_std
      value: -1.7817280491798217
    - type: nauc_map_at_1_diff1
      value: 53.70420852974904
    - type: nauc_map_at_1_max
      value: 25.134260139256465
    - type: nauc_map_at_1_std
      value: -5.16360510676616
    - type: nauc_map_at_20_diff1
      value: 50.03553131371993
    - type: nauc_map_at_20_max
      value: 27.028712351429306
    - type: nauc_map_at_20_std
      value: -1.4264982725018232
    - type: nauc_map_at_3_diff1
      value: 50.56170061459129
    - type: nauc_map_at_3_max
      value: 27.125222360081885
    - type: nauc_map_at_3_std
      value: -2.1772011676637457
    - type: nauc_map_at_5_diff1
      value: 50.55287654401218
    - type: nauc_map_at_5_max
      value: 27.179943148291034
    - type: nauc_map_at_5_std
      value: -1.9278191493666326
    - type: nauc_mrr_at_1000_diff1
      value: 50.19001608358556
    - type: nauc_mrr_at_1000_max
      value: 30.11015154646845
    - type: nauc_mrr_at_1000_std
      value: -0.01731538046574592
    - type: nauc_mrr_at_100_diff1
      value: 50.17990723644671
    - type: nauc_mrr_at_100_max
      value: 30.08888004508371
    - type: nauc_mrr_at_100_std
      value: -0.03777479539357456
    - type: nauc_mrr_at_10_diff1
      value: 50.29875316793952
    - type: nauc_mrr_at_10_max
      value: 30.0700394599554
    - type: nauc_mrr_at_10_std
      value: -0.1129279328368799
    - type: nauc_mrr_at_1_diff1
      value: 53.13267349109123
    - type: nauc_mrr_at_1_max
      value: 29.600631965679142
    - type: nauc_mrr_at_1_std
      value: -1.0534342020289145
    - type: nauc_mrr_at_20_diff1
      value: 50.20426738346865
    - type: nauc_mrr_at_20_max
      value: 30.03033165917099
    - type: nauc_mrr_at_20_std
      value: -0.0990630706915973
    - type: nauc_mrr_at_3_diff1
      value: 50.44930547118647
    - type: nauc_mrr_at_3_max
      value: 30.18069271699821
    - type: nauc_mrr_at_3_std
      value: -0.4106548753200651
    - type: nauc_mrr_at_5_diff1
      value: 50.42405239937933
    - type: nauc_mrr_at_5_max
      value: 30.323511080797132
    - type: nauc_mrr_at_5_std
      value: -0.10914898852912731
    - type: nauc_ndcg_at_1000_diff1
      value: 48.4023648301636
    - type: nauc_ndcg_at_1000_max
      value: 29.372043713546457
    - type: nauc_ndcg_at_1000_std
      value: 1.4160068477128542
    - type: nauc_ndcg_at_100_diff1
      value: 48.43331450594402
    - type: nauc_ndcg_at_100_max
      value: 28.62936981969224
    - type: nauc_ndcg_at_100_std
      value: 0.8983763064461433
    - type: nauc_ndcg_at_10_diff1
      value: 49.03974183137114
    - type: nauc_ndcg_at_10_max
      value: 27.134352966349006
    - type: nauc_ndcg_at_10_std
      value: -0.7394110214476277
    - type: nauc_ndcg_at_1_diff1
      value: 53.13267349109123
    - type: nauc_ndcg_at_1_max
      value: 29.600631965679142
    - type: nauc_ndcg_at_1_std
      value: -1.0534342020289145
    - type: nauc_ndcg_at_20_diff1
      value: 48.48145045039039
    - type: nauc_ndcg_at_20_max
      value: 27.312478220117836
    - type: nauc_ndcg_at_20_std
      value: -0.08007639532022988
    - type: nauc_ndcg_at_3_diff1
      value: 49.795198984753725
    - type: nauc_ndcg_at_3_max
      value: 28.851373164423457
    - type: nauc_ndcg_at_3_std
      value: -0.7114306314589505
    - type: nauc_ndcg_at_5_diff1
      value: 49.76549299850904
    - type: nauc_ndcg_at_5_max
      value: 28.333095297025384
    - type: nauc_ndcg_at_5_std
      value: -0.6065340225903514
    - type: nauc_precision_at_1000_diff1
      value: -14.995860825405593
    - type: nauc_precision_at_1000_max
      value: 10.497503977177239
    - type: nauc_precision_at_1000_std
      value: 15.472908805216562
    - type: nauc_precision_at_100_diff1
      value: -5.056728888436733
    - type: nauc_precision_at_100_max
      value: 16.225279572994932
    - type: nauc_precision_at_100_std
      value: 17.333024162674036
    - type: nauc_precision_at_10_diff1
      value: 18.485355184593836
    - type: nauc_precision_at_10_max
      value: 21.53388484848657
    - type: nauc_precision_at_10_std
      value: 9.864926100512946
    - type: nauc_precision_at_1_diff1
      value: 53.13267349109123
    - type: nauc_precision_at_1_max
      value: 29.600631965679142
    - type: nauc_precision_at_1_std
      value: -1.0534342020289145
    - type: nauc_precision_at_20_diff1
      value: 9.420119338006966
    - type: nauc_precision_at_20_max
      value: 19.132214665647382
    - type: nauc_precision_at_20_std
      value: 13.168229893698857
    - type: nauc_precision_at_3_diff1
      value: 34.51509644827664
    - type: nauc_precision_at_3_max
      value: 28.988501800675305
    - type: nauc_precision_at_3_std
      value: 6.887544108087535
    - type: nauc_precision_at_5_diff1
      value: 29.728890633704864
    - type: nauc_precision_at_5_max
      value: 27.527807375891044
    - type: nauc_precision_at_5_std
      value: 8.615115789487243
    - type: nauc_recall_at_1000_diff1
      value: 21.30536250453658
    - type: nauc_recall_at_1000_max
      value: 45.66826079811565
    - type: nauc_recall_at_1000_std
      value: 40.948489257124734
    - type: nauc_recall_at_100_diff1
      value: 36.41578755512283
    - type: nauc_recall_at_100_max
      value: 25.843843547872236
    - type: nauc_recall_at_100_std
      value: 9.98566808528975
    - type: nauc_recall_at_10_diff1
      value: 42.73428373449279
    - type: nauc_recall_at_10_max
      value: 22.45723124505396
    - type: nauc_recall_at_10_std
      value: 0.6596133636511106
    - type: nauc_recall_at_1_diff1
      value: 53.70420852974904
    - type: nauc_recall_at_1_max
      value: 25.134260139256465
    - type: nauc_recall_at_1_std
      value: -5.16360510676616
    - type: nauc_recall_at_20_diff1
      value: 39.67103657607903
    - type: nauc_recall_at_20_max
      value: 21.767425036370714
    - type: nauc_recall_at_20_std
      value: 2.792759310018829
    - type: nauc_recall_at_3_diff1
      value: 46.672591160111224
    - type: nauc_recall_at_3_max
      value: 26.876529270231792
    - type: nauc_recall_at_3_std
      value: -1.1160005181114536
    - type: nauc_recall_at_5_diff1
      value: 45.77174277314153
    - type: nauc_recall_at_5_max
      value: 26.349199537996853
    - type: nauc_recall_at_5_std
      value: -0.09454430813945205
    - type: ndcg_at_1
      value: 36.574
    - type: ndcg_at_10
      value: 46.931
    - type: ndcg_at_100
      value: 52.40899999999999
    - type: ndcg_at_1000
      value: 54.31
    - type: ndcg_at_20
      value: 49.098000000000006
    - type: ndcg_at_3
      value: 41.585
    - type: ndcg_at_5
      value: 44.009
    - type: precision_at_1
      value: 36.574
    - type: precision_at_10
      value: 8.518
    - type: precision_at_100
      value: 1.317
    - type: precision_at_1000
      value: 0.167
    - type: precision_at_20
      value: 4.99
    - type: precision_at_3
      value: 19.794999999999998
    - type: precision_at_5
      value: 13.879
    - type: recall_at_1
      value: 30.036
    - type: recall_at_10
      value: 60.043
    - type: recall_at_100
      value: 82.78999999999999
    - type: recall_at_1000
      value: 95.017
    - type: recall_at_20
      value: 67.509
    - type: recall_at_3
      value: 44.769
    - type: recall_at_5
      value: 51.23
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackProgrammersRetrieval
      type: mteb/cqadupstack-programmers
      config: default
      split: test
      revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
    metrics:
    - type: main_score
      value: 43.147999999999996
    - type: map_at_1
      value: 27.299
    - type: map_at_10
      value: 37.441
    - type: map_at_100
      value: 38.977000000000004
    - type: map_at_1000
      value: 39.068999999999996
    - type: map_at_20
      value: 38.282
    - type: map_at_3
      value: 34.217
    - type: map_at_5
      value: 36.027
    - type: mrr_at_1
      value: 33.44748858447489
    - type: mrr_at_10
      value: 42.456738783793554
    - type: mrr_at_100
      value: 43.485313174917046
    - type: mrr_at_1000
      value: 43.52577210412886
    - type: mrr_at_20
      value: 43.02431629929082
    - type: mrr_at_3
      value: 39.72602739726027
    - type: mrr_at_5
      value: 41.32420091324198
    - type: nauc_map_at_1000_diff1
      value: 42.430993099089214
    - type: nauc_map_at_1000_max
      value: 28.098034312926952
    - type: nauc_map_at_1000_std
      value: 3.231295090968473
    - type: nauc_map_at_100_diff1
      value: 42.42649976590143
    - type: nauc_map_at_100_max
      value: 28.07518501114065
    - type: nauc_map_at_100_std
      value: 3.2663627223954257
    - type: nauc_map_at_10_diff1
      value: 42.37108247761657
    - type: nauc_map_at_10_max
      value: 27.784006301694887
    - type: nauc_map_at_10_std
      value: 2.1562734801370382
    - type: nauc_map_at_1_diff1
      value: 46.996543750833226
    - type: nauc_map_at_1_max
      value: 23.22775877678291
    - type: nauc_map_at_1_std
      value: -3.185987618625673
    - type: nauc_map_at_20_diff1
      value: 42.285605547136605
    - type: nauc_map_at_20_max
      value: 27.87619604505037
    - type: nauc_map_at_20_std
      value: 2.868182127790041
    - type: nauc_map_at_3_diff1
      value: 43.17884748984982
    - type: nauc_map_at_3_max
      value: 26.640107029543174
    - type: nauc_map_at_3_std
      value: -0.6337177522670645
    - type: nauc_map_at_5_diff1
      value: 42.55295619170691
    - type: nauc_map_at_5_max
      value: 27.09386543850697
    - type: nauc_map_at_5_std
      value: 1.1172301120800785
    - type: nauc_mrr_at_1000_diff1
      value: 41.44240071604904
    - type: nauc_mrr_at_1000_max
      value: 29.942727017459177
    - type: nauc_mrr_at_1000_std
      value: 4.847580130462551
    - type: nauc_mrr_at_100_diff1
      value: 41.43634208329461
    - type: nauc_mrr_at_100_max
      value: 29.94502158371524
    - type: nauc_mrr_at_100_std
      value: 4.873085525046516
    - type: nauc_mrr_at_10_diff1
      value: 41.434406767394215
    - type: nauc_mrr_at_10_max
      value: 29.961051443508534
    - type: nauc_mrr_at_10_std
      value: 4.490183376727645
    - type: nauc_mrr_at_1_diff1
      value: 46.01681006012476
    - type: nauc_mrr_at_1_max
      value: 28.39735171499139
    - type: nauc_mrr_at_1_std
      value: 0.8500045602957598
    - type: nauc_mrr_at_20_diff1
      value: 41.324947979964605
    - type: nauc_mrr_at_20_max
      value: 29.939799023317963
    - type: nauc_mrr_at_20_std
      value: 4.8458435024129685
    - type: nauc_mrr_at_3_diff1
      value: 41.87918200877444
    - type: nauc_mrr_at_3_max
      value: 29.878707844397507
    - type: nauc_mrr_at_3_std
      value: 2.754394941481161
    - type: nauc_mrr_at_5_diff1
      value: 41.17158211294708
    - type: nauc_mrr_at_5_max
      value: 29.525114418603625
    - type: nauc_mrr_at_5_std
      value: 3.6695976231626792
    - type: nauc_ndcg_at_1000_diff1
      value: 40.85015584223998
    - type: nauc_ndcg_at_1000_max
      value: 30.175833847400003
    - type: nauc_ndcg_at_1000_std
      value: 7.454581754774201
    - type: nauc_ndcg_at_100_diff1
      value: 40.679563549502475
    - type: nauc_ndcg_at_100_max
      value: 30.105638179098303
    - type: nauc_ndcg_at_100_std
      value: 8.61962835140906
    - type: nauc_ndcg_at_10_diff1
      value: 40.37700967457906
    - type: nauc_ndcg_at_10_max
      value: 29.33300077317775
    - type: nauc_ndcg_at_10_std
      value: 5.023758212980035
    - type: nauc_ndcg_at_1_diff1
      value: 46.01681006012476
    - type: nauc_ndcg_at_1_max
      value: 28.39735171499139
    - type: nauc_ndcg_at_1_std
      value: 0.8500045602957598
    - type: nauc_ndcg_at_20_diff1
      value: 39.98886010789604
    - type: nauc_ndcg_at_20_max
      value: 29.36296219371212
    - type: nauc_ndcg_at_20_std
      value: 7.1201782062536925
    - type: nauc_ndcg_at_3_diff1
      value: 40.92324084648135
    - type: nauc_ndcg_at_3_max
      value: 28.520942397787785
    - type: nauc_ndcg_at_3_std
      value: 1.0293165278727892
    - type: nauc_ndcg_at_5_diff1
      value: 40.317533959797814
    - type: nauc_ndcg_at_5_max
      value: 28.339428764903264
    - type: nauc_ndcg_at_5_std
      value: 3.1896497530161687
    - type: nauc_precision_at_1000_diff1
      value: -6.9969817860247625
    - type: nauc_precision_at_1000_max
      value: 9.347778794059506
    - type: nauc_precision_at_1000_std
      value: 7.9646208472184625
    - type: nauc_precision_at_100_diff1
      value: 2.991937395454712
    - type: nauc_precision_at_100_max
      value: 18.71624281667294
    - type: nauc_precision_at_100_std
      value: 21.600526590609512
    - type: nauc_precision_at_10_diff1
      value: 18.37445514123775
    - type: nauc_precision_at_10_max
      value: 29.699257376065063
    - type: nauc_precision_at_10_std
      value: 18.095751349204832
    - type: nauc_precision_at_1_diff1
      value: 46.01681006012476
    - type: nauc_precision_at_1_max
      value: 28.39735171499139
    - type: nauc_precision_at_1_std
      value: 0.8500045602957598
    - type: nauc_precision_at_20_diff1
      value: 11.472713745988054
    - type: nauc_precision_at_20_max
      value: 25.690985880662325
    - type: nauc_precision_at_20_std
      value: 22.46754877988948
    - type: nauc_precision_at_3_diff1
      value: 29.052028827439607
    - type: nauc_precision_at_3_max
      value: 31.04481903220871
    - type: nauc_precision_at_3_std
      value: 8.208096616199493
    - type: nauc_precision_at_5_diff1
      value: 23.711708272374533
    - type: nauc_precision_at_5_max
      value: 30.24946804680551
    - type: nauc_precision_at_5_std
      value: 12.681259000978528
    - type: nauc_recall_at_1000_diff1
      value: 16.82259171106293
    - type: nauc_recall_at_1000_max
      value: 42.76820203485854
    - type: nauc_recall_at_1000_std
      value: 55.97238149176407
    - type: nauc_recall_at_100_diff1
      value: 27.21094062723115
    - type: nauc_recall_at_100_max
      value: 33.698956290459584
    - type: nauc_recall_at_100_std
      value: 37.63664733891902
    - type: nauc_recall_at_10_diff1
      value: 33.26348363515544
    - type: nauc_recall_at_10_max
      value: 29.5718227632449
    - type: nauc_recall_at_10_std
      value: 10.62584355073482
    - type: nauc_recall_at_1_diff1
      value: 46.996543750833226
    - type: nauc_recall_at_1_max
      value: 23.22775877678291
    - type: nauc_recall_at_1_std
      value: -3.185987618625673
    - type: nauc_recall_at_20_diff1
      value: 30.615386537256107
    - type: nauc_recall_at_20_max
      value: 29.459243404458636
    - type: nauc_recall_at_20_std
      value: 18.849849153868913
    - type: nauc_recall_at_3_diff1
      value: 37.22492629427872
    - type: nauc_recall_at_3_max
      value: 27.49351222866847
    - type: nauc_recall_at_3_std
      value: 0.31700586087567145
    - type: nauc_recall_at_5_diff1
      value: 34.4555753891359
    - type: nauc_recall_at_5_max
      value: 27.219221048995283
    - type: nauc_recall_at_5_std
      value: 6.057763073329902
    - type: ndcg_at_1
      value: 33.446999999999996
    - type: ndcg_at_10
      value: 43.147999999999996
    - type: ndcg_at_100
      value: 49.601
    - type: ndcg_at_1000
      value: 51.437
    - type: ndcg_at_20
      value: 45.704
    - type: ndcg_at_3
      value: 37.978
    - type: ndcg_at_5
      value: 40.431
    - type: precision_at_1
      value: 33.446999999999996
    - type: precision_at_10
      value: 7.888000000000001
    - type: precision_at_100
      value: 1.298
    - type: precision_at_1000
      value: 0.16199999999999998
    - type: precision_at_20
      value: 4.749
    - type: precision_at_3
      value: 17.922
    - type: precision_at_5
      value: 12.9
    - type: recall_at_1
      value: 27.299
    - type: recall_at_10
      value: 54.92399999999999
    - type: recall_at_100
      value: 82.308
    - type: recall_at_1000
      value: 94.451
    - type: recall_at_20
      value: 63.952
    - type: recall_at_3
      value: 40.788000000000004
    - type: recall_at_5
      value: 47.198
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackRetrieval
      type: CQADupstackRetrieval_is_a_combined_dataset
      config: default
      split: test
      revision: CQADupstackRetrieval_is_a_combined_dataset
    metrics:
    - type: main_score
      value: 42.21466666666666
    - type: ndcg_at_10
      value: 42.21466666666666
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackStatsRetrieval
      type: mteb/cqadupstack-stats
      config: default
      split: test
      revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
    metrics:
    - type: main_score
      value: 35.535
    - type: map_at_1
      value: 23.082
    - type: map_at_10
      value: 30.991000000000003
    - type: map_at_100
      value: 31.968000000000004
    - type: map_at_1000
      value: 32.07
    - type: map_at_20
      value: 31.535000000000004
    - type: map_at_3
      value: 28.605000000000004
    - type: map_at_5
      value: 30.06
    - type: mrr_at_1
      value: 25.920245398773005
    - type: mrr_at_10
      value: 33.93191888207225
    - type: mrr_at_100
      value: 34.77251067424867
    - type: mrr_at_1000
      value: 34.838890717603476
    - type: mrr_at_20
      value: 34.396659782410225
    - type: mrr_at_3
      value: 31.722903885480587
    - type: mrr_at_5
      value: 33.11860940695298
    - type: nauc_map_at_1000_diff1
      value: 50.959235687536655
    - type: nauc_map_at_1000_max
      value: 30.655083426929526
    - type: nauc_map_at_1000_std
      value: 4.551329335263164
    - type: nauc_map_at_100_diff1
      value: 50.95439619487166
    - type: nauc_map_at_100_max
      value: 30.623042271335667
    - type: nauc_map_at_100_std
      value: 4.553201745695824
    - type: nauc_map_at_10_diff1
      value: 50.67983398647876
    - type: nauc_map_at_10_max
      value: 30.286986966981583
    - type: nauc_map_at_10_std
      value: 3.9148983660544125
    - type: nauc_map_at_1_diff1
      value: 58.20205266764334
    - type: nauc_map_at_1_max
      value: 28.58134257489169
    - type: nauc_map_at_1_std
      value: 0.40198884745343
    - type: nauc_map_at_20_diff1
      value: 50.90472178620438
    - type: nauc_map_at_20_max
      value: 30.563325966498205
    - type: nauc_map_at_20_std
      value: 4.369655492671673
    - type: nauc_map_at_3_diff1
      value: 52.084512866747325
    - type: nauc_map_at_3_max
      value: 29.374244156637356
    - type: nauc_map_at_3_std
      value: 2.0818606642419963
    - type: nauc_map_at_5_diff1
      value: 51.27705609284862
    - type: nauc_map_at_5_max
      value: 30.17700495077409
    - type: nauc_map_at_5_std
      value: 3.2722185125269103
    - type: nauc_mrr_at_1000_diff1
      value: 51.909591752092425
    - type: nauc_mrr_at_1000_max
      value: 33.36453135370183
    - type: nauc_mrr_at_1000_std
      value: 7.404496516950065
    - type: nauc_mrr_at_100_diff1
      value: 51.900856693619126
    - type: nauc_mrr_at_100_max
      value: 33.350334085938364
    - type: nauc_mrr_at_100_std
      value: 7.410015907741515
    - type: nauc_mrr_at_10_diff1
      value: 51.82074175684569
    - type: nauc_mrr_at_10_max
      value: 33.32820656085001
    - type: nauc_mrr_at_10_std
      value: 7.043558257826565
    - type: nauc_mrr_at_1_diff1
      value: 60.46456002140532
    - type: nauc_mrr_at_1_max
      value: 33.31049028455304
    - type: nauc_mrr_at_1_std
      value: 4.830131026566884
    - type: nauc_mrr_at_20_diff1
      value: 51.8644842944308
    - type: nauc_mrr_at_20_max
      value: 33.3675144190388
    - type: nauc_mrr_at_20_std
      value: 7.256444002173675
    - type: nauc_mrr_at_3_diff1
      value: 52.904828169011154
    - type: nauc_mrr_at_3_max
      value: 32.55024244450511
    - type: nauc_mrr_at_3_std
      value: 6.014060915782276
    - type: nauc_mrr_at_5_diff1
      value: 52.361187623943614
    - type: nauc_mrr_at_5_max
      value: 33.38079408144374
    - type: nauc_mrr_at_5_std
      value: 6.854165091950606
    - type: nauc_ndcg_at_1000_diff1
      value: 48.30949790825087
    - type: nauc_ndcg_at_1000_max
      value: 32.568281800544476
    - type: nauc_ndcg_at_1000_std
      value: 8.966636096573168
    - type: nauc_ndcg_at_100_diff1
      value: 47.9550901718591
    - type: nauc_ndcg_at_100_max
      value: 31.969231434862483
    - type: nauc_ndcg_at_100_std
      value: 8.909343996509326
    - type: nauc_ndcg_at_10_diff1
      value: 47.56929495928323
    - type: nauc_ndcg_at_10_max
      value: 31.131109409439638
    - type: nauc_ndcg_at_10_std
      value: 6.03049937873584
    - type: nauc_ndcg_at_1_diff1
      value: 60.46456002140532
    - type: nauc_ndcg_at_1_max
      value: 33.31049028455304
    - type: nauc_ndcg_at_1_std
      value: 4.830131026566884
    - type: nauc_ndcg_at_20_diff1
      value: 47.99938648902949
    - type: nauc_ndcg_at_20_max
      value: 31.584023047520475
    - type: nauc_ndcg_at_20_std
      value: 7.3552147944361685
    - type: nauc_ndcg_at_3_diff1
      value: 50.28269131499986
    - type: nauc_ndcg_at_3_max
      value: 30.233582570806007
    - type: nauc_ndcg_at_3_std
      value: 3.78476869218036
    - type: nauc_ndcg_at_5_diff1
      value: 49.049921852112895
    - type: nauc_ndcg_at_5_max
      value: 31.174764383636816
    - type: nauc_ndcg_at_5_std
      value: 4.931908749150788
    - type: nauc_precision_at_1000_diff1
      value: 6.883972818358869
    - type: nauc_precision_at_1000_max
      value: 21.834322765687677
    - type: nauc_precision_at_1000_std
      value: 20.000731976327703
    - type: nauc_precision_at_100_diff1
      value: 19.786688523669632
    - type: nauc_precision_at_100_max
      value: 30.328428959273722
    - type: nauc_precision_at_100_std
      value: 26.147922491368902
    - type: nauc_precision_at_10_diff1
      value: 31.41218497795092
    - type: nauc_precision_at_10_max
      value: 33.95003889463453
    - type: nauc_precision_at_10_std
      value: 19.08301072890509
    - type: nauc_precision_at_1_diff1
      value: 60.46456002140532
    - type: nauc_precision_at_1_max
      value: 33.31049028455304
    - type: nauc_precision_at_1_std
      value: 4.830131026566884
    - type: nauc_precision_at_20_diff1
      value: 30.502712564255486
    - type: nauc_precision_at_20_max
      value: 35.178872501427975
    - type: nauc_precision_at_20_std
      value: 23.358935743161783
    - type: nauc_precision_at_3_diff1
      value: 43.1022211297112
    - type: nauc_precision_at_3_max
      value: 33.93732742672912
    - type: nauc_precision_at_3_std
      value: 10.823942310140167
    - type: nauc_precision_at_5_diff1
      value: 38.63486834833309
    - type: nauc_precision_at_5_max
      value: 36.23894828623807
    - type: nauc_precision_at_5_std
      value: 14.675475211699615
    - type: nauc_recall_at_1000_diff1
      value: 23.04089688983766
    - type: nauc_recall_at_1000_max
      value: 40.167606539321355
    - type: nauc_recall_at_1000_std
      value: 43.02153663005034
    - type: nauc_recall_at_100_diff1
      value: 32.000202612409794
    - type: nauc_recall_at_100_max
      value: 31.12741249551696
    - type: nauc_recall_at_100_std
      value: 24.54365478830203
    - type: nauc_recall_at_10_diff1
      value: 36.14374447048929
    - type: nauc_recall_at_10_max
      value: 29.498316079260555
    - type: nauc_recall_at_10_std
      value: 8.641435315254533
    - type: nauc_recall_at_1_diff1
      value: 58.20205266764334
    - type: nauc_recall_at_1_max
      value: 28.58134257489169
    - type: nauc_recall_at_1_std
      value: 0.40198884745343
    - type: nauc_recall_at_20_diff1
      value: 36.22347557385489
    - type: nauc_recall_at_20_max
      value: 29.750817583764405
    - type: nauc_recall_at_20_std
      value: 13.219998916877149
    - type: nauc_recall_at_3_diff1
      value: 43.42606106046774
    - type: nauc_recall_at_3_max
      value: 27.02370831585066
    - type: nauc_recall_at_3_std
      value: 2.148594878901326
    - type: nauc_recall_at_5_diff1
      value: 40.74252027743906
    - type: nauc_recall_at_5_max
      value: 29.661893704694375
    - type: nauc_recall_at_5_std
      value: 5.8950594952457145
    - type: ndcg_at_1
      value: 25.919999999999998
    - type: ndcg_at_10
      value: 35.535
    - type: ndcg_at_100
      value: 40.316
    - type: ndcg_at_1000
      value: 42.84
    - type: ndcg_at_20
      value: 37.424
    - type: ndcg_at_3
      value: 31.223
    - type: ndcg_at_5
      value: 33.521
    - type: precision_at_1
      value: 25.919999999999998
    - type: precision_at_10
      value: 5.736
    - type: precision_at_100
      value: 0.876
    - type: precision_at_1000
      value: 0.117
    - type: precision_at_20
      value: 3.359
    - type: precision_at_3
      value: 13.804
    - type: precision_at_5
      value: 9.754999999999999
    - type: recall_at_1
      value: 23.082
    - type: recall_at_10
      value: 46.399
    - type: recall_at_100
      value: 68.06
    - type: recall_at_1000
      value: 86.821
    - type: recall_at_20
      value: 53.525
    - type: recall_at_3
      value: 34.871
    - type: recall_at_5
      value: 40.492
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackTexRetrieval
      type: mteb/cqadupstack-tex
      config: default
      split: test
      revision: 46989137a86843e03a6195de44b09deda022eec7
    metrics:
    - type: main_score
      value: 29.707
    - type: map_at_1
      value: 17.159
    - type: map_at_10
      value: 24.869
    - type: map_at_100
      value: 26.021
    - type: map_at_1000
      value: 26.151000000000003
    - type: map_at_20
      value: 25.526
    - type: map_at_3
      value: 22.538
    - type: map_at_5
      value: 23.796999999999997
    - type: mrr_at_1
      value: 20.99105299380592
    - type: mrr_at_10
      value: 28.786336971127096
    - type: mrr_at_100
      value: 29.74490721636805
    - type: mrr_at_1000
      value: 29.823214274100618
    - type: mrr_at_20
      value: 29.363881329195756
    - type: mrr_at_3
      value: 26.531314521679345
    - type: mrr_at_5
      value: 27.7339986235376
    - type: nauc_map_at_1000_diff1
      value: 37.13825685322779
    - type: nauc_map_at_1000_max
      value: 25.949209359787055
    - type: nauc_map_at_1000_std
      value: -0.1789880172036093
    - type: nauc_map_at_100_diff1
      value: 37.13565311027618
    - type: nauc_map_at_100_max
      value: 25.909889375022395
    - type: nauc_map_at_100_std
      value: -0.20169274828783654
    - type: nauc_map_at_10_diff1
      value: 37.412949674073325
    - type: nauc_map_at_10_max
      value: 25.837714322449912
    - type: nauc_map_at_10_std
      value: -0.7989713426808079
    - type: nauc_map_at_1_diff1
      value: 43.66535106611136
    - type: nauc_map_at_1_max
      value: 24.934157845499076
    - type: nauc_map_at_1_std
      value: -2.798761696625911
    - type: nauc_map_at_20_diff1
      value: 37.25188765578179
    - type: nauc_map_at_20_max
      value: 25.887533682661708
    - type: nauc_map_at_20_std
      value: -0.48710070531162597
    - type: nauc_map_at_3_diff1
      value: 38.747478927053876
    - type: nauc_map_at_3_max
      value: 25.551679823476835
    - type: nauc_map_at_3_std
      value: -1.5848393331273871
    - type: nauc_map_at_5_diff1
      value: 38.11902875922142
    - type: nauc_map_at_5_max
      value: 25.84766602647597
    - type: nauc_map_at_5_std
      value: -1.0063039730468788
    - type: nauc_mrr_at_1000_diff1
      value: 35.409856966860396
    - type: nauc_mrr_at_1000_max
      value: 27.86922067595656
    - type: nauc_mrr_at_1000_std
      value: -0.0734512410447464
    - type: nauc_mrr_at_100_diff1
      value: 35.400804471162054
    - type: nauc_mrr_at_100_max
      value: 27.866591373962002
    - type: nauc_mrr_at_100_std
      value: -0.04959722841487173
    - type: nauc_mrr_at_10_diff1
      value: 35.5199909370886
    - type: nauc_mrr_at_10_max
      value: 27.962695735822045
    - type: nauc_mrr_at_10_std
      value: -0.5296125062220955
    - type: nauc_mrr_at_1_diff1
      value: 41.65630429652082
    - type: nauc_mrr_at_1_max
      value: 27.826862844728982
    - type: nauc_mrr_at_1_std
      value: -2.0718644041769205
    - type: nauc_mrr_at_20_diff1
      value: 35.38119545574273
    - type: nauc_mrr_at_20_max
      value: 27.888497220693953
    - type: nauc_mrr_at_20_std
      value: -0.2890434589026467
    - type: nauc_mrr_at_3_diff1
      value: 36.603117913849466
    - type: nauc_mrr_at_3_max
      value: 27.947449591583933
    - type: nauc_mrr_at_3_std
      value: -1.0865714056168478
    - type: nauc_mrr_at_5_diff1
      value: 35.92459791709931
    - type: nauc_mrr_at_5_max
      value: 28.035251623858272
    - type: nauc_mrr_at_5_std
      value: -0.8711878495606741
    - type: nauc_ndcg_at_1000_diff1
      value: 34.06248430947299
    - type: nauc_ndcg_at_1000_max
      value: 26.7997542315953
    - type: nauc_ndcg_at_1000_std
      value: 3.3240363708933742
    - type: nauc_ndcg_at_100_diff1
      value: 33.68748871110203
    - type: nauc_ndcg_at_100_max
      value: 26.362138300414788
    - type: nauc_ndcg_at_100_std
      value: 3.3435049793759717
    - type: nauc_ndcg_at_10_diff1
      value: 34.50272053437263
    - type: nauc_ndcg_at_10_max
      value: 26.41321919372202
    - type: nauc_ndcg_at_10_std
      value: 0.44722981908997034
    - type: nauc_ndcg_at_1_diff1
      value: 41.65630429652082
    - type: nauc_ndcg_at_1_max
      value: 27.826862844728982
    - type: nauc_ndcg_at_1_std
      value: -2.0718644041769205
    - type: nauc_ndcg_at_20_diff1
      value: 34.095928245730065
    - type: nauc_ndcg_at_20_max
      value: 26.278658129351108
    - type: nauc_ndcg_at_20_std
      value: 1.333694029082928
    - type: nauc_ndcg_at_3_diff1
      value: 36.69705632103637
    - type: nauc_ndcg_at_3_max
      value: 26.78968350373072
    - type: nauc_ndcg_at_3_std
      value: -1.0804397591306258
    - type: nauc_ndcg_at_5_diff1
      value: 35.72910772416993
    - type: nauc_ndcg_at_5_max
      value: 26.70057707274289
    - type: nauc_ndcg_at_5_std
      value: -0.13486271460127894
    - type: nauc_precision_at_1000_diff1
      value: 0.05861252770643225
    - type: nauc_precision_at_1000_max
      value: 18.601946335509112
    - type: nauc_precision_at_1000_std
      value: 9.800060286260463
    - type: nauc_precision_at_100_diff1
      value: 7.363883419620025
    - type: nauc_precision_at_100_max
      value: 22.20848267682575
    - type: nauc_precision_at_100_std
      value: 12.714551550333642
    - type: nauc_precision_at_10_diff1
      value: 21.331506854435275
    - type: nauc_precision_at_10_max
      value: 28.684902701505965
    - type: nauc_precision_at_10_std
      value: 3.6550639959191207
    - type: nauc_precision_at_1_diff1
      value: 41.65630429652082
    - type: nauc_precision_at_1_max
      value: 27.826862844728982
    - type: nauc_precision_at_1_std
      value: -2.0718644041769205
    - type: nauc_precision_at_20_diff1
      value: 16.844978902521646
    - type: nauc_precision_at_20_max
      value: 27.441958887770646
    - type: nauc_precision_at_20_std
      value: 6.3826805047558315
    - type: nauc_precision_at_3_diff1
      value: 30.639398097322594
    - type: nauc_precision_at_3_max
      value: 29.939776959172697
    - type: nauc_precision_at_3_std
      value: -0.20286831584574574
    - type: nauc_precision_at_5_diff1
      value: 26.70376825047474
    - type: nauc_precision_at_5_max
      value: 29.60604358978513
    - type: nauc_precision_at_5_std
      value: 1.5809149742471655
    - type: nauc_recall_at_1000_diff1
      value: 17.785715749599042
    - type: nauc_recall_at_1000_max
      value: 23.48376672770539
    - type: nauc_recall_at_1000_std
      value: 30.385000337970858
    - type: nauc_recall_at_100_diff1
      value: 21.05284222570054
    - type: nauc_recall_at_100_max
      value: 21.945063586716614
    - type: nauc_recall_at_100_std
      value: 17.466562038077875
    - type: nauc_recall_at_10_diff1
      value: 26.597231762971674
    - type: nauc_recall_at_10_max
      value: 23.5079436519741
    - type: nauc_recall_at_10_std
      value: 3.263135880492641
    - type: nauc_recall_at_1_diff1
      value: 43.66535106611136
    - type: nauc_recall_at_1_max
      value: 24.934157845499076
    - type: nauc_recall_at_1_std
      value: -2.798761696625911
    - type: nauc_recall_at_20_diff1
      value: 24.832091637143787
    - type: nauc_recall_at_20_max
      value: 22.315764495952237
    - type: nauc_recall_at_20_std
      value: 6.129833251765541
    - type: nauc_recall_at_3_diff1
      value: 32.85408650886733
    - type: nauc_recall_at_3_max
      value: 24.409412121823397
    - type: nauc_recall_at_3_std
      value: 0.04999270761091106
    - type: nauc_recall_at_5_diff1
      value: 30.258414223370007
    - type: nauc_recall_at_5_max
      value: 24.512878195644664
    - type: nauc_recall_at_5_std
      value: 1.849046122226546
    - type: ndcg_at_1
      value: 20.991
    - type: ndcg_at_10
      value: 29.707
    - type: ndcg_at_100
      value: 35.043
    - type: ndcg_at_1000
      value: 38.032
    - type: ndcg_at_20
      value: 31.828
    - type: ndcg_at_3
      value: 25.488
    - type: ndcg_at_5
      value: 27.348
    - type: precision_at_1
      value: 20.991
    - type: precision_at_10
      value: 5.416
    - type: precision_at_100
      value: 0.947
    - type: precision_at_1000
      value: 0.13899999999999998
    - type: precision_at_20
      value: 3.324
    - type: precision_at_3
      value: 12.113
    - type: precision_at_5
      value: 8.734
    - type: recall_at_1
      value: 17.159
    - type: recall_at_10
      value: 40.397
    - type: recall_at_100
      value: 64.139
    - type: recall_at_1000
      value: 85.328
    - type: recall_at_20
      value: 48.193000000000005
    - type: recall_at_3
      value: 28.555999999999997
    - type: recall_at_5
      value: 33.394
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackUnixRetrieval
      type: mteb/cqadupstack-unix
      config: default
      split: test
      revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
    metrics:
    - type: main_score
      value: 41.831
    - type: map_at_1
      value: 25.889
    - type: map_at_10
      value: 36.131
    - type: map_at_100
      value: 37.277
    - type: map_at_1000
      value: 37.383
    - type: map_at_20
      value: 36.797000000000004
    - type: map_at_3
      value: 33.194
    - type: map_at_5
      value: 34.88
    - type: mrr_at_1
      value: 30.69029850746269
    - type: mrr_at_10
      value: 40.34274312959011
    - type: mrr_at_100
      value: 41.15568315924076
    - type: mrr_at_1000
      value: 41.21534922643823
    - type: mrr_at_20
      value: 40.81612888073637
    - type: mrr_at_3
      value: 37.624378109452735
    - type: mrr_at_5
      value: 39.25217661691539
    - type: nauc_map_at_1000_diff1
      value: 51.0475973810661
    - type: nauc_map_at_1000_max
      value: 38.75825500903846
    - type: nauc_map_at_1000_std
      value: 1.6136905986292485
    - type: nauc_map_at_100_diff1
      value: 51.04820272616417
    - type: nauc_map_at_100_max
      value: 38.74584044282816
    - type: nauc_map_at_100_std
      value: 1.5969607728429231
    - type: nauc_map_at_10_diff1
      value: 50.94166583581915
    - type: nauc_map_at_10_max
      value: 38.37738102486977
    - type: nauc_map_at_10_std
      value: 1.2635889890868346
    - type: nauc_map_at_1_diff1
      value: 59.242331404755774
    - type: nauc_map_at_1_max
      value: 39.02663876284084
    - type: nauc_map_at_1_std
      value: -0.4739614669668662
    - type: nauc_map_at_20_diff1
      value: 50.97455684751073
    - type: nauc_map_at_20_max
      value: 38.57646135094768
    - type: nauc_map_at_20_std
      value: 1.4640361871795349
    - type: nauc_map_at_3_diff1
      value: 51.608034622903176
    - type: nauc_map_at_3_max
      value: 38.433045221071325
    - type: nauc_map_at_3_std
      value: 0.5831392788488381
    - type: nauc_map_at_5_diff1
      value: 50.947880732714445
    - type: nauc_map_at_5_max
      value: 38.60925399151572
    - type: nauc_map_at_5_std
      value: 1.291960076749259
    - type: nauc_mrr_at_1000_diff1
      value: 50.210650177335104
    - type: nauc_mrr_at_1000_max
      value: 37.951469256285804
    - type: nauc_mrr_at_1000_std
      value: 0.7902286837699785
    - type: nauc_mrr_at_100_diff1
      value: 50.20638219267218
    - type: nauc_mrr_at_100_max
      value: 37.9377948931531
    - type: nauc_mrr_at_100_std
      value: 0.774713370156735
    - type: nauc_mrr_at_10_diff1
      value: 49.836111870473935
    - type: nauc_mrr_at_10_max
      value: 37.65348449064669
    - type: nauc_mrr_at_10_std
      value: 0.5231944356104865
    - type: nauc_mrr_at_1_diff1
      value: 57.56522049860187
    - type: nauc_mrr_at_1_max
      value: 39.39798439825698
    - type: nauc_mrr_at_1_std
      value: -0.4516317740083426
    - type: nauc_mrr_at_20_diff1
      value: 50.1006649557446
    - type: nauc_mrr_at_20_max
      value: 37.84223094800734
    - type: nauc_mrr_at_20_std
      value: 0.8086280885894073
    - type: nauc_mrr_at_3_diff1
      value: 50.441725884115996
    - type: nauc_mrr_at_3_max
      value: 37.90807984566849
    - type: nauc_mrr_at_3_std
      value: 0.02550782712808399
    - type: nauc_mrr_at_5_diff1
      value: 49.85802035503023
    - type: nauc_mrr_at_5_max
      value: 38.065589153711116
    - type: nauc_mrr_at_5_std
      value: 0.6274639011716443
    - type: nauc_ndcg_at_1000_diff1
      value: 49.03659827838649
    - type: nauc_ndcg_at_1000_max
      value: 39.132735746113575
    - type: nauc_ndcg_at_1000_std
      value: 3.627422164709519
    - type: nauc_ndcg_at_100_diff1
      value: 49.00264137357818
    - type: nauc_ndcg_at_100_max
      value: 39.01919928439472
    - type: nauc_ndcg_at_100_std
      value: 3.558699165061359
    - type: nauc_ndcg_at_10_diff1
      value: 48.26671791603934
    - type: nauc_ndcg_at_10_max
      value: 37.571416815576114
    - type: nauc_ndcg_at_10_std
      value: 1.9403797342170153
    - type: nauc_ndcg_at_1_diff1
      value: 57.56522049860187
    - type: nauc_ndcg_at_1_max
      value: 39.39798439825698
    - type: nauc_ndcg_at_1_std
      value: -0.4516317740083426
    - type: nauc_ndcg_at_20_diff1
      value: 48.66105608484808
    - type: nauc_ndcg_at_20_max
      value: 38.22139553816886
    - type: nauc_ndcg_at_20_std
      value: 2.8911511133753782
    - type: nauc_ndcg_at_3_diff1
      value: 49.00804557017609
    - type: nauc_ndcg_at_3_max
      value: 37.72179482159779
    - type: nauc_ndcg_at_3_std
      value: 0.8400931058476853
    - type: nauc_ndcg_at_5_diff1
      value: 48.24457268105435
    - type: nauc_ndcg_at_5_max
      value: 38.191301845180604
    - type: nauc_ndcg_at_5_std
      value: 1.9471919379129263
    - type: nauc_precision_at_1000_diff1
      value: -12.33889190395623
    - type: nauc_precision_at_1000_max
      value: -1.3115353486004107
    - type: nauc_precision_at_1000_std
      value: 2.495795006465732
    - type: nauc_precision_at_100_diff1
      value: 2.1703067538960084
    - type: nauc_precision_at_100_max
      value: 15.898479441971332
    - type: nauc_precision_at_100_std
      value: 7.910076672658263
    - type: nauc_precision_at_10_diff1
      value: 22.20759907514248
    - type: nauc_precision_at_10_max
      value: 25.51471885225117
    - type: nauc_precision_at_10_std
      value: 2.3609262388624512
    - type: nauc_precision_at_1_diff1
      value: 57.56522049860187
    - type: nauc_precision_at_1_max
      value: 39.39798439825698
    - type: nauc_precision_at_1_std
      value: -0.4516317740083426
    - type: nauc_precision_at_20_diff1
      value: 15.94035009026911
    - type: nauc_precision_at_20_max
      value: 23.178150944386744
    - type: nauc_precision_at_20_std
      value: 5.207387751900332
    - type: nauc_precision_at_3_diff1
      value: 34.99396954995648
    - type: nauc_precision_at_3_max
      value: 33.14418980052923
    - type: nauc_precision_at_3_std
      value: 1.660740116435417
    - type: nauc_precision_at_5_diff1
      value: 29.544849162475362
    - type: nauc_precision_at_5_max
      value: 32.150735196144645
    - type: nauc_precision_at_5_std
      value: 3.323068902360027
    - type: nauc_recall_at_1000_diff1
      value: 30.978839058267006
    - type: nauc_recall_at_1000_max
      value: 48.722880061794
    - type: nauc_recall_at_1000_std
      value: 46.28381322993451
    - type: nauc_recall_at_100_diff1
      value: 40.22130846505397
    - type: nauc_recall_at_100_max
      value: 38.28644243336189
    - type: nauc_recall_at_100_std
      value: 15.77321980757386
    - type: nauc_recall_at_10_diff1
      value: 38.9910969333204
    - type: nauc_recall_at_10_max
      value: 32.807008720875984
    - type: nauc_recall_at_10_std
      value: 5.065337152044106
    - type: nauc_recall_at_1_diff1
      value: 59.242331404755774
    - type: nauc_recall_at_1_max
      value: 39.02663876284084
    - type: nauc_recall_at_1_std
      value: -0.4739614669668662
    - type: nauc_recall_at_20_diff1
      value: 40.14875646536079
    - type: nauc_recall_at_20_max
      value: 34.83600129324774
    - type: nauc_recall_at_20_std
      value: 9.01370840232733
    - type: nauc_recall_at_3_diff1
      value: 42.65832338786475
    - type: nauc_recall_at_3_max
      value: 35.56970517818321
    - type: nauc_recall_at_3_std
      value: 1.8050805176967801
    - type: nauc_recall_at_5_diff1
      value: 40.07274624634327
    - type: nauc_recall_at_5_max
      value: 35.74226371272684
    - type: nauc_recall_at_5_std
      value: 4.873290118594757
    - type: ndcg_at_1
      value: 30.69
    - type: ndcg_at_10
      value: 41.831
    - type: ndcg_at_100
      value: 46.966
    - type: ndcg_at_1000
      value: 49.334
    - type: ndcg_at_20
      value: 43.927
    - type: ndcg_at_3
      value: 36.534
    - type: ndcg_at_5
      value: 39.126
    - type: precision_at_1
      value: 30.69
    - type: precision_at_10
      value: 7.1739999999999995
    - type: precision_at_100
      value: 1.095
    - type: precision_at_1000
      value: 0.14200000000000002
    - type: precision_at_20
      value: 4.184
    - type: precision_at_3
      value: 16.853
    - type: precision_at_5
      value: 11.922
    - type: recall_at_1
      value: 25.889
    - type: recall_at_10
      value: 54.962999999999994
    - type: recall_at_100
      value: 77.239
    - type: recall_at_1000
      value: 93.729
    - type: recall_at_20
      value: 62.534
    - type: recall_at_3
      value: 40.336
    - type: recall_at_5
      value: 47.083000000000006
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackWebmastersRetrieval
      type: mteb/cqadupstack-webmasters
      config: default
      split: test
      revision: 160c094312a0e1facb97e55eeddb698c0abe3571
    metrics:
    - type: main_score
      value: 41.695
    - type: map_at_1
      value: 26.296999999999997
    - type: map_at_10
      value: 35.929
    - type: map_at_100
      value: 37.625
    - type: map_at_1000
      value: 37.856
    - type: map_at_20
      value: 36.831
    - type: map_at_3
      value: 33.042
    - type: map_at_5
      value: 34.552
    - type: mrr_at_1
      value: 31.422924901185773
    - type: mrr_at_10
      value: 40.36718112805069
    - type: mrr_at_100
      value: 41.48635728771627
    - type: mrr_at_1000
      value: 41.53760971899895
    - type: mrr_at_20
      value: 41.05566983667548
    - type: mrr_at_3
      value: 38.24110671936759
    - type: mrr_at_5
      value: 39.49604743083004
    - type: nauc_map_at_1000_diff1
      value: 45.88419073101831
    - type: nauc_map_at_1000_max
      value: 32.272696879964606
    - type: nauc_map_at_1000_std
      value: 6.435633876271509
    - type: nauc_map_at_100_diff1
      value: 46.118272363764085
    - type: nauc_map_at_100_max
      value: 32.459168722724094
    - type: nauc_map_at_100_std
      value: 6.292246088710509
    - type: nauc_map_at_10_diff1
      value: 46.603302676569655
    - type: nauc_map_at_10_max
      value: 32.38318941747706
    - type: nauc_map_at_10_std
      value: 4.720511340512196
    - type: nauc_map_at_1_diff1
      value: 53.474193431022286
    - type: nauc_map_at_1_max
      value: 30.096745684269028
    - type: nauc_map_at_1_std
      value: 0.1635051536400562
    - type: nauc_map_at_20_diff1
      value: 46.25945687266626
    - type: nauc_map_at_20_max
      value: 32.47553839186572
    - type: nauc_map_at_20_std
      value: 5.566329221862548
    - type: nauc_map_at_3_diff1
      value: 47.86679192761851
    - type: nauc_map_at_3_max
      value: 31.531646616728803
    - type: nauc_map_at_3_std
      value: 3.1837781149112496
    - type: nauc_map_at_5_diff1
      value: 46.4585625030729
    - type: nauc_map_at_5_max
      value: 32.013423473733624
    - type: nauc_map_at_5_std
      value: 4.403527966937636
    - type: nauc_mrr_at_1000_diff1
      value: 44.168029521898646
    - type: nauc_mrr_at_1000_max
      value: 33.231405944995004
    - type: nauc_mrr_at_1000_std
      value: 8.153326593928266
    - type: nauc_mrr_at_100_diff1
      value: 44.17027683367582
    - type: nauc_mrr_at_100_max
      value: 33.23422175046355
    - type: nauc_mrr_at_100_std
      value: 8.198284732472755
    - type: nauc_mrr_at_10_diff1
      value: 44.2496903067119
    - type: nauc_mrr_at_10_max
      value: 33.055178332856116
    - type: nauc_mrr_at_10_std
      value: 7.831026978775937
    - type: nauc_mrr_at_1_diff1
      value: 48.4273290718694
    - type: nauc_mrr_at_1_max
      value: 31.89937877913926
    - type: nauc_mrr_at_1_std
      value: 3.873149993747884
    - type: nauc_mrr_at_20_diff1
      value: 44.09113284049905
    - type: nauc_mrr_at_20_max
      value: 33.22019452622306
    - type: nauc_mrr_at_20_std
      value: 8.133802855890329
    - type: nauc_mrr_at_3_diff1
      value: 44.86167450862544
    - type: nauc_mrr_at_3_max
      value: 32.98194923216794
    - type: nauc_mrr_at_3_std
      value: 6.9890614678195
    - type: nauc_mrr_at_5_diff1
      value: 43.939080994503634
    - type: nauc_mrr_at_5_max
      value: 33.25648484685068
    - type: nauc_mrr_at_5_std
      value: 7.943963197772268
    - type: nauc_ndcg_at_1000_diff1
      value: 43.42006126140444
    - type: nauc_ndcg_at_1000_max
      value: 32.89416354016926
    - type: nauc_ndcg_at_1000_std
      value: 9.740672987523162
    - type: nauc_ndcg_at_100_diff1
      value: 43.737763705105145
    - type: nauc_ndcg_at_100_max
      value: 33.102019342275725
    - type: nauc_ndcg_at_100_std
      value: 10.354524698232671
    - type: nauc_ndcg_at_10_diff1
      value: 43.574979909615
    - type: nauc_ndcg_at_10_max
      value: 32.22335464466024
    - type: nauc_ndcg_at_10_std
      value: 7.827717817165889
    - type: nauc_ndcg_at_1_diff1
      value: 48.4273290718694
    - type: nauc_ndcg_at_1_max
      value: 31.89937877913926
    - type: nauc_ndcg_at_1_std
      value: 3.873149993747884
    - type: nauc_ndcg_at_20_diff1
      value: 43.135943873988566
    - type: nauc_ndcg_at_20_max
      value: 32.88264995288679
    - type: nauc_ndcg_at_20_std
      value: 9.104351404942863
    - type: nauc_ndcg_at_3_diff1
      value: 45.18739397775064
    - type: nauc_ndcg_at_3_max
      value: 31.580166756620283
    - type: nauc_ndcg_at_3_std
      value: 6.137398763080745
    - type: nauc_ndcg_at_5_diff1
      value: 42.950299500112955
    - type: nauc_ndcg_at_5_max
      value: 32.04130248991469
    - type: nauc_ndcg_at_5_std
      value: 8.322547993875903
    - type: nauc_precision_at_1000_diff1
      value: -23.129419591612365
    - type: nauc_precision_at_1000_max
      value: -11.41420275910081
    - type: nauc_precision_at_1000_std
      value: 19.146268912764334
    - type: nauc_precision_at_100_diff1
      value: -12.413671568737618
    - type: nauc_precision_at_100_max
      value: 0.537649304108213
    - type: nauc_precision_at_100_std
      value: 27.325180241816415
    - type: nauc_precision_at_10_diff1
      value: 15.277020606429655
    - type: nauc_precision_at_10_max
      value: 23.51972448360081
    - type: nauc_precision_at_10_std
      value: 19.103862771406927
    - type: nauc_precision_at_1_diff1
      value: 48.4273290718694
    - type: nauc_precision_at_1_max
      value: 31.89937877913926
    - type: nauc_precision_at_1_std
      value: 3.873149993747884
    - type: nauc_precision_at_20_diff1
      value: 4.910626579631313
    - type: nauc_precision_at_20_max
      value: 17.000613397246163
    - type: nauc_precision_at_20_std
      value: 24.370825263718903
    - type: nauc_precision_at_3_diff1
      value: 31.259123635562613
    - type: nauc_precision_at_3_max
      value: 28.91653697836493
    - type: nauc_precision_at_3_std
      value: 11.718828024267332
    - type: nauc_precision_at_5_diff1
      value: 21.896001023343413
    - type: nauc_precision_at_5_max
      value: 26.53717311029016
    - type: nauc_precision_at_5_std
      value: 17.506215861477873
    - type: nauc_recall_at_1000_diff1
      value: 15.545423862859614
    - type: nauc_recall_at_1000_max
      value: 33.54097556941026
    - type: nauc_recall_at_1000_std
      value: 41.970927423554926
    - type: nauc_recall_at_100_diff1
      value: 32.29112323650048
    - type: nauc_recall_at_100_max
      value: 31.72353031716839
    - type: nauc_recall_at_100_std
      value: 30.06509939448423
    - type: nauc_recall_at_10_diff1
      value: 36.223842357407875
    - type: nauc_recall_at_10_max
      value: 29.16462133003001
    - type: nauc_recall_at_10_std
      value: 9.404694229411104
    - type: nauc_recall_at_1_diff1
      value: 53.474193431022286
    - type: nauc_recall_at_1_max
      value: 30.096745684269028
    - type: nauc_recall_at_1_std
      value: 0.1635051536400562
    - type: nauc_recall_at_20_diff1
      value: 32.2732032299642
    - type: nauc_recall_at_20_max
      value: 30.699505625402928
    - type: nauc_recall_at_20_std
      value: 15.947782026021681
    - type: nauc_recall_at_3_diff1
      value: 41.746081012759426
    - type: nauc_recall_at_3_max
      value: 29.019436574100016
    - type: nauc_recall_at_3_std
      value: 4.757836484193213
    - type: nauc_recall_at_5_diff1
      value: 35.74337633697516
    - type: nauc_recall_at_5_max
      value: 30.17283125351457
    - type: nauc_recall_at_5_std
      value: 9.488723875013253
    - type: ndcg_at_1
      value: 31.423000000000002
    - type: ndcg_at_10
      value: 41.695
    - type: ndcg_at_100
      value: 48.109
    - type: ndcg_at_1000
      value: 50.39900000000001
    - type: ndcg_at_20
      value: 44.208999999999996
    - type: ndcg_at_3
      value: 37.241
    - type: ndcg_at_5
      value: 39.228
    - type: precision_at_1
      value: 31.423000000000002
    - type: precision_at_10
      value: 7.866
    - type: precision_at_100
      value: 1.603
    - type: precision_at_1000
      value: 0.245
    - type: precision_at_20
      value: 4.99
    - type: precision_at_3
      value: 17.523
    - type: precision_at_5
      value: 12.49
    - type: recall_at_1
      value: 26.296999999999997
    - type: recall_at_10
      value: 52.778000000000006
    - type: recall_at_100
      value: 80.961
    - type: recall_at_1000
      value: 94.894
    - type: recall_at_20
      value: 62.239
    - type: recall_at_3
      value: 39.814
    - type: recall_at_5
      value: 45.381
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackWordpressRetrieval
      type: mteb/cqadupstack-wordpress
      config: default
      split: test
      revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
    metrics:
    - type: main_score
      value: 33.936
    - type: map_at_1
      value: 21.297
    - type: map_at_10
      value: 29.29
    - type: map_at_100
      value: 30.407
    - type: map_at_1000
      value: 30.514999999999997
    - type: map_at_20
      value: 29.983999999999998
    - type: map_at_3
      value: 26.950000000000003
    - type: map_at_5
      value: 28.287000000000003
    - type: mrr_at_1
      value: 23.65988909426987
    - type: mrr_at_10
      value: 31.57996655224012
    - type: mrr_at_100
      value: 32.58133076268842
    - type: mrr_at_1000
      value: 32.659811204298656
    - type: mrr_at_20
      value: 32.18205959735665
    - type: mrr_at_3
      value: 29.482439926062852
    - type: mrr_at_5
      value: 30.600739371534207
    - type: nauc_map_at_1000_diff1
      value: 33.65655465193916
    - type: nauc_map_at_1000_max
      value: 29.523610574712706
    - type: nauc_map_at_1000_std
      value: -0.48883917163984836
    - type: nauc_map_at_100_diff1
      value: 33.657822812150975
    - type: nauc_map_at_100_max
      value: 29.531870292234302
    - type: nauc_map_at_100_std
      value: -0.49454342105691873
    - type: nauc_map_at_10_diff1
      value: 34.03649741206849
    - type: nauc_map_at_10_max
      value: 29.48133710519135
    - type: nauc_map_at_10_std
      value: -1.3003031064360702
    - type: nauc_map_at_1_diff1
      value: 41.319491034458395
    - type: nauc_map_at_1_max
      value: 30.08436727224079
    - type: nauc_map_at_1_std
      value: -4.283931261517225
    - type: nauc_map_at_20_diff1
      value: 33.644132189750856
    - type: nauc_map_at_20_max
      value: 29.57915168728321
    - type: nauc_map_at_20_std
      value: -0.71252104365507
    - type: nauc_map_at_3_diff1
      value: 33.8965524645013
    - type: nauc_map_at_3_max
      value: 28.898722773976697
    - type: nauc_map_at_3_std
      value: -1.8649217196078969
    - type: nauc_map_at_5_diff1
      value: 33.65177546877711
    - type: nauc_map_at_5_max
      value: 29.449552621308055
    - type: nauc_map_at_5_std
      value: -1.9217932476234898
    - type: nauc_mrr_at_1000_diff1
      value: 34.21675867856096
    - type: nauc_mrr_at_1000_max
      value: 30.198504997318466
    - type: nauc_mrr_at_1000_std
      value: 0.5352461648974925
    - type: nauc_mrr_at_100_diff1
      value: 34.210091539379874
    - type: nauc_mrr_at_100_max
      value: 30.19136090320817
    - type: nauc_mrr_at_100_std
      value: 0.5431068443349623
    - type: nauc_mrr_at_10_diff1
      value: 34.50092238629405
    - type: nauc_mrr_at_10_max
      value: 30.360381404088816
    - type: nauc_mrr_at_10_std
      value: 0.007947172236616928
    - type: nauc_mrr_at_1_diff1
      value: 41.47500594264137
    - type: nauc_mrr_at_1_max
      value: 30.932862195893563
    - type: nauc_mrr_at_1_std
      value: -3.0060183101242157
    - type: nauc_mrr_at_20_diff1
      value: 34.15523281231642
    - type: nauc_mrr_at_20_max
      value: 30.251528444714324
    - type: nauc_mrr_at_20_std
      value: 0.41483749048122587
    - type: nauc_mrr_at_3_diff1
      value: 34.54541333351149
    - type: nauc_mrr_at_3_max
      value: 30.357741809442512
    - type: nauc_mrr_at_3_std
      value: -0.5977586679572796
    - type: nauc_mrr_at_5_diff1
      value: 34.058033979119465
    - type: nauc_mrr_at_5_max
      value: 30.19093785155445
    - type: nauc_mrr_at_5_std
      value: -0.6829700596355942
    - type: nauc_ndcg_at_1000_diff1
      value: 31.530363860261506
    - type: nauc_ndcg_at_1000_max
      value: 29.90327018263153
    - type: nauc_ndcg_at_1000_std
      value: 3.033100623143071
    - type: nauc_ndcg_at_100_diff1
      value: 31.56967408174602
    - type: nauc_ndcg_at_100_max
      value: 29.53643288504651
    - type: nauc_ndcg_at_100_std
      value: 3.4997411689634883
    - type: nauc_ndcg_at_10_diff1
      value: 32.27374955735248
    - type: nauc_ndcg_at_10_max
      value: 29.519348153684348
    - type: nauc_ndcg_at_10_std
      value: 0.3042011208954651
    - type: nauc_ndcg_at_1_diff1
      value: 41.47500594264137
    - type: nauc_ndcg_at_1_max
      value: 30.932862195893563
    - type: nauc_ndcg_at_1_std
      value: -3.0060183101242157
    - type: nauc_ndcg_at_20_diff1
      value: 31.102403306150194
    - type: nauc_ndcg_at_20_max
      value: 29.677553740846967
    - type: nauc_ndcg_at_20_std
      value: 2.1195261321395766
    - type: nauc_ndcg_at_3_diff1
      value: 32.02333047452249
    - type: nauc_ndcg_at_3_max
      value: 28.888372073027796
    - type: nauc_ndcg_at_3_std
      value: -0.924661397180436
    - type: nauc_ndcg_at_5_diff1
      value: 31.466174122311667
    - type: nauc_ndcg_at_5_max
      value: 29.307628068867754
    - type: nauc_ndcg_at_5_std
      value: -1.2046829876982417
    - type: nauc_precision_at_1000_diff1
      value: -6.075546300902165
    - type: nauc_precision_at_1000_max
      value: -2.187623217222419
    - type: nauc_precision_at_1000_std
      value: 12.584752959282211
    - type: nauc_precision_at_100_diff1
      value: 14.295101079499434
    - type: nauc_precision_at_100_max
      value: 20.388641516894513
    - type: nauc_precision_at_100_std
      value: 21.887960759975524
    - type: nauc_precision_at_10_diff1
      value: 24.536039003837043
    - type: nauc_precision_at_10_max
      value: 29.357326635020637
    - type: nauc_precision_at_10_std
      value: 7.65955284021577
    - type: nauc_precision_at_1_diff1
      value: 41.47500594264137
    - type: nauc_precision_at_1_max
      value: 30.932862195893563
    - type: nauc_precision_at_1_std
      value: -3.0060183101242157
    - type: nauc_precision_at_20_diff1
      value: 18.634308701475955
    - type: nauc_precision_at_20_max
      value: 27.88621903726711
    - type: nauc_precision_at_20_std
      value: 14.96789816785273
    - type: nauc_precision_at_3_diff1
      value: 26.928594601514146
    - type: nauc_precision_at_3_max
      value: 29.653482500006007
    - type: nauc_precision_at_3_std
      value: 2.114869053308719
    - type: nauc_precision_at_5_diff1
      value: 24.137817643228992
    - type: nauc_precision_at_5_max
      value: 29.467809315433215
    - type: nauc_precision_at_5_std
      value: 2.2335268351775777
    - type: nauc_recall_at_1000_diff1
      value: 7.561889223723366
    - type: nauc_recall_at_1000_max
      value: 34.64462683484328
    - type: nauc_recall_at_1000_std
      value: 32.07766726976165
    - type: nauc_recall_at_100_diff1
      value: 21.87202458692393
    - type: nauc_recall_at_100_max
      value: 26.060326662408357
    - type: nauc_recall_at_100_std
      value: 20.038540279921996
    - type: nauc_recall_at_10_diff1
      value: 26.59257905849799
    - type: nauc_recall_at_10_max
      value: 27.840231433969887
    - type: nauc_recall_at_10_std
      value: 3.547350776489353
    - type: nauc_recall_at_1_diff1
      value: 41.319491034458395
    - type: nauc_recall_at_1_max
      value: 30.08436727224079
    - type: nauc_recall_at_1_std
      value: -4.283931261517225
    - type: nauc_recall_at_20_diff1
      value: 21.84062118775981
    - type: nauc_recall_at_20_max
      value: 27.960813344120865
    - type: nauc_recall_at_20_std
      value: 9.945117730379264
    - type: nauc_recall_at_3_diff1
      value: 26.240584213234957
    - type: nauc_recall_at_3_max
      value: 27.32563942378109
    - type: nauc_recall_at_3_std
      value: 0.15754039149189397
    - type: nauc_recall_at_5_diff1
      value: 25.327116061029542
    - type: nauc_recall_at_5_max
      value: 28.12294625143933
    - type: nauc_recall_at_5_std
      value: -0.7151467503960333
    - type: ndcg_at_1
      value: 23.66
    - type: ndcg_at_10
      value: 33.936
    - type: ndcg_at_100
      value: 39.172000000000004
    - type: ndcg_at_1000
      value: 41.858000000000004
    - type: ndcg_at_20
      value: 36.248999999999995
    - type: ndcg_at_3
      value: 29.454
    - type: ndcg_at_5
      value: 31.555
    - type: precision_at_1
      value: 23.66
    - type: precision_at_10
      value: 5.323
    - type: precision_at_100
      value: 0.8500000000000001
    - type: precision_at_1000
      value: 0.11900000000000001
    - type: precision_at_20
      value: 3.216
    - type: precision_at_3
      value: 12.568999999999999
    - type: precision_at_5
      value: 8.909
    - type: recall_at_1
      value: 21.297
    - type: recall_at_10
      value: 46.007
    - type: recall_at_100
      value: 69.73700000000001
    - type: recall_at_1000
      value: 89.91900000000001
    - type: recall_at_20
      value: 54.806
    - type: recall_at_3
      value: 33.727000000000004
    - type: recall_at_5
      value: 38.675
  - task:
      type: Retrieval
    dataset:
      name: MTEB ClimateFEVER
      type: mteb/climate-fever
      config: default
      split: test
      revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
    metrics:
    - type: main_score
      value: 27.794999999999998
    - type: map_at_1
      value: 11.138
    - type: map_at_10
      value: 19.56
    - type: map_at_100
      value: 21.416
    - type: map_at_1000
      value: 21.6
    - type: map_at_20
      value: 20.556
    - type: map_at_3
      value: 16.066
    - type: map_at_5
      value: 17.883
    - type: mrr_at_1
      value: 24.364820846905538
    - type: mrr_at_10
      value: 36.21314823432085
    - type: mrr_at_100
      value: 37.17398469553677
    - type: mrr_at_1000
      value: 37.21013480329614
    - type: mrr_at_20
      value: 36.78840955357927
    - type: mrr_at_3
      value: 32.486427795874
    - type: mrr_at_5
      value: 34.77958740499451
    - type: nauc_map_at_1000_diff1
      value: 22.20775473369687
    - type: nauc_map_at_1000_max
      value: 36.19769366030157
    - type: nauc_map_at_1000_std
      value: 17.568432565671753
    - type: nauc_map_at_100_diff1
      value: 22.202037755951228
    - type: nauc_map_at_100_max
      value: 36.13800341266643
    - type: nauc_map_at_100_std
      value: 17.486248132972992
    - type: nauc_map_at_10_diff1
      value: 22.9042018284273
    - type: nauc_map_at_10_max
      value: 36.08475064127247
    - type: nauc_map_at_10_std
      value: 15.726587888083884
    - type: nauc_map_at_1_diff1
      value: 28.652249616122717
    - type: nauc_map_at_1_max
      value: 32.05131359795648
    - type: nauc_map_at_1_std
      value: 11.262948253532807
    - type: nauc_map_at_20_diff1
      value: 22.451026108322598
    - type: nauc_map_at_20_max
      value: 36.32385371085683
    - type: nauc_map_at_20_std
      value: 16.64337500445571
    - type: nauc_map_at_3_diff1
      value: 23.16011840834893
    - type: nauc_map_at_3_max
      value: 33.24586608916762
    - type: nauc_map_at_3_std
      value: 12.56332091941363
    - type: nauc_map_at_5_diff1
      value: 22.93957941358747
    - type: nauc_map_at_5_max
      value: 34.699460514009
    - type: nauc_map_at_5_std
      value: 14.063661191876298
    - type: nauc_mrr_at_1000_diff1
      value: 23.24777062437872
    - type: nauc_mrr_at_1000_max
      value: 33.450026215376866
    - type: nauc_mrr_at_1000_std
      value: 20.426349474081853
    - type: nauc_mrr_at_100_diff1
      value: 23.23401699847253
    - type: nauc_mrr_at_100_max
      value: 33.45459692613422
    - type: nauc_mrr_at_100_std
      value: 20.440070448958714
    - type: nauc_mrr_at_10_diff1
      value: 23.281604083585396
    - type: nauc_mrr_at_10_max
      value: 33.4988527620155
    - type: nauc_mrr_at_10_std
      value: 20.367252947781857
    - type: nauc_mrr_at_1_diff1
      value: 26.355328110953575
    - type: nauc_mrr_at_1_max
      value: 30.471508547730092
    - type: nauc_mrr_at_1_std
      value: 16.11568495246132
    - type: nauc_mrr_at_20_diff1
      value: 23.140683139461732
    - type: nauc_mrr_at_20_max
      value: 33.48554958878313
    - type: nauc_mrr_at_20_std
      value: 20.44494070529154
    - type: nauc_mrr_at_3_diff1
      value: 23.301943271387042
    - type: nauc_mrr_at_3_max
      value: 32.422994068557635
    - type: nauc_mrr_at_3_std
      value: 18.939596173947923
    - type: nauc_mrr_at_5_diff1
      value: 23.33922409006143
    - type: nauc_mrr_at_5_max
      value: 33.0752792306208
    - type: nauc_mrr_at_5_std
      value: 19.768166202806604
    - type: nauc_ndcg_at_1000_diff1
      value: 19.998881742304263
    - type: nauc_ndcg_at_1000_max
      value: 37.786993391629984
    - type: nauc_ndcg_at_1000_std
      value: 24.51994563648888
    - type: nauc_ndcg_at_100_diff1
      value: 19.98463107036392
    - type: nauc_ndcg_at_100_max
      value: 37.00722603001812
    - type: nauc_ndcg_at_100_std
      value: 23.717744758974426
    - type: nauc_ndcg_at_10_diff1
      value: 21.62784861661253
    - type: nauc_ndcg_at_10_max
      value: 37.16285223589196
    - type: nauc_ndcg_at_10_std
      value: 19.34332938831155
    - type: nauc_ndcg_at_1_diff1
      value: 26.355328110953575
    - type: nauc_ndcg_at_1_max
      value: 30.471508547730092
    - type: nauc_ndcg_at_1_std
      value: 16.11568495246132
    - type: nauc_ndcg_at_20_diff1
      value: 20.55696079927241
    - type: nauc_ndcg_at_20_max
      value: 37.60669992563356
    - type: nauc_ndcg_at_20_std
      value: 21.09713313195671
    - type: nauc_ndcg_at_3_diff1
      value: 22.08438430773322
    - type: nauc_ndcg_at_3_max
      value: 32.68110059834722
    - type: nauc_ndcg_at_3_std
      value: 15.267429669015595
    - type: nauc_ndcg_at_5_diff1
      value: 21.715020935808575
    - type: nauc_ndcg_at_5_max
      value: 35.17110301407326
    - type: nauc_ndcg_at_5_std
      value: 16.78243466311895
    - type: nauc_precision_at_1000_diff1
      value: -3.2231794613702007
    - type: nauc_precision_at_1000_max
      value: 10.42559310530991
    - type: nauc_precision_at_1000_std
      value: 24.602086786850514
    - type: nauc_precision_at_100_diff1
      value: 1.7021223120345566
    - type: nauc_precision_at_100_max
      value: 17.38852629914526
    - type: nauc_precision_at_100_std
      value: 29.337128327095286
    - type: nauc_precision_at_10_diff1
      value: 12.164922485567033
    - type: nauc_precision_at_10_max
      value: 32.37319082664107
    - type: nauc_precision_at_10_std
      value: 26.300541100072984
    - type: nauc_precision_at_1_diff1
      value: 26.355328110953575
    - type: nauc_precision_at_1_max
      value: 30.471508547730092
    - type: nauc_precision_at_1_std
      value: 16.11568495246132
    - type: nauc_precision_at_20_diff1
      value: 7.385735474290768
    - type: nauc_precision_at_20_max
      value: 28.422173054750115
    - type: nauc_precision_at_20_std
      value: 27.035109636511876
    - type: nauc_precision_at_3_diff1
      value: 16.418314508072836
    - type: nauc_precision_at_3_max
      value: 31.785139366157615
    - type: nauc_precision_at_3_std
      value: 20.32896371836789
    - type: nauc_precision_at_5_diff1
      value: 14.937559885788062
    - type: nauc_precision_at_5_max
      value: 32.24391988837453
    - type: nauc_precision_at_5_std
      value: 23.17707476156323
    - type: nauc_recall_at_1000_diff1
      value: 5.616430433184691
    - type: nauc_recall_at_1000_max
      value: 36.55384286718441
    - type: nauc_recall_at_1000_std
      value: 38.50298604014725
    - type: nauc_recall_at_100_diff1
      value: 8.877636292128273
    - type: nauc_recall_at_100_max
      value: 30.860213540250705
    - type: nauc_recall_at_100_std
      value: 28.929321541751467
    - type: nauc_recall_at_10_diff1
      value: 16.07834176997954
    - type: nauc_recall_at_10_max
      value: 35.937627989165364
    - type: nauc_recall_at_10_std
      value: 18.808606461025498
    - type: nauc_recall_at_1_diff1
      value: 28.652249616122717
    - type: nauc_recall_at_1_max
      value: 32.05131359795648
    - type: nauc_recall_at_1_std
      value: 11.262948253532807
    - type: nauc_recall_at_20_diff1
      value: 12.600911526162099
    - type: nauc_recall_at_20_max
      value: 35.177943309574985
    - type: nauc_recall_at_20_std
      value: 21.99092004265232
    - type: nauc_recall_at_3_diff1
      value: 17.49507952659312
    - type: nauc_recall_at_3_max
      value: 31.406559780417105
    - type: nauc_recall_at_3_std
      value: 12.274503076493051
    - type: nauc_recall_at_5_diff1
      value: 16.612956574037305
    - type: nauc_recall_at_5_max
      value: 33.34670088062603
    - type: nauc_recall_at_5_std
      value: 14.445553526736607
    - type: ndcg_at_1
      value: 24.365000000000002
    - type: ndcg_at_10
      value: 27.794999999999998
    - type: ndcg_at_100
      value: 35.11
    - type: ndcg_at_1000
      value: 38.383
    - type: ndcg_at_20
      value: 30.616
    - type: ndcg_at_3
      value: 21.97
    - type: ndcg_at_5
      value: 24.264
    - type: precision_at_1
      value: 24.365000000000002
    - type: precision_at_10
      value: 8.827
    - type: precision_at_100
      value: 1.6660000000000001
    - type: precision_at_1000
      value: 0.22799999999999998
    - type: precision_at_20
      value: 5.6160000000000005
    - type: precision_at_3
      value: 16.2
    - type: precision_at_5
      value: 13.055
    - type: recall_at_1
      value: 11.138
    - type: recall_at_10
      value: 34.454
    - type: recall_at_100
      value: 59.648
    - type: recall_at_1000
      value: 77.823
    - type: recall_at_20
      value: 42.476
    - type: recall_at_3
      value: 20.630000000000003
    - type: recall_at_5
      value: 26.517000000000003
  - task:
      type: Retrieval
    dataset:
      name: MTEB DBPedia
      type: mteb/dbpedia
      config: default
      split: test
      revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
    metrics:
    - type: main_score
      value: 39.582
    - type: map_at_1
      value: 8.193
    - type: map_at_10
      value: 18.838
    - type: map_at_100
      value: 26.791999999999998
    - type: map_at_1000
      value: 28.659000000000002
    - type: map_at_20
      value: 21.678
    - type: map_at_3
      value: 13.535
    - type: map_at_5
      value: 15.706000000000001
    - type: mrr_at_1
      value: 61.25000000000001
    - type: mrr_at_10
      value: 71.5827380952381
    - type: mrr_at_100
      value: 71.92227940484834
    - type: mrr_at_1000
      value: 71.92843656364919
    - type: mrr_at_20
      value: 71.82391254578756
    - type: mrr_at_3
      value: 69.54166666666667
    - type: mrr_at_5
      value: 70.89166666666667
    - type: nauc_map_at_1000_diff1
      value: 20.81525104511085
    - type: nauc_map_at_1000_max
      value: 12.28738487676873
    - type: nauc_map_at_1000_std
      value: 24.87551199629768
    - type: nauc_map_at_100_diff1
      value: 21.693837182713217
    - type: nauc_map_at_100_max
      value: 8.69725977707396
    - type: nauc_map_at_100_std
      value: 21.354633072475515
    - type: nauc_map_at_10_diff1
      value: 24.388731902741767
    - type: nauc_map_at_10_max
      value: -4.0866423282629585
    - type: nauc_map_at_10_std
      value: -0.9510081645949322
    - type: nauc_map_at_1_diff1
      value: 32.58191575261803
    - type: nauc_map_at_1_max
      value: -10.57813486927926
    - type: nauc_map_at_1_std
      value: -9.588423425329879
    - type: nauc_map_at_20_diff1
      value: 24.050743021827124
    - type: nauc_map_at_20_max
      value: 0.6686240161106345
    - type: nauc_map_at_20_std
      value: 6.53795559839344
    - type: nauc_map_at_3_diff1
      value: 26.43827919066607
    - type: nauc_map_at_3_max
      value: -10.727017270257825
    - type: nauc_map_at_3_std
      value: -9.512078389268677
    - type: nauc_map_at_5_diff1
      value: 25.71002404847907
    - type: nauc_map_at_5_max
      value: -7.097015507701878
    - type: nauc_map_at_5_std
      value: -6.476602516100202
    - type: nauc_mrr_at_1000_diff1
      value: 45.608034553728835
    - type: nauc_mrr_at_1000_max
      value: 30.922028122514266
    - type: nauc_mrr_at_1000_std
      value: 34.21750207725521
    - type: nauc_mrr_at_100_diff1
      value: 45.590642197534805
    - type: nauc_mrr_at_100_max
      value: 30.930031708368194
    - type: nauc_mrr_at_100_std
      value: 34.21945637610545
    - type: nauc_mrr_at_10_diff1
      value: 45.540994123130126
    - type: nauc_mrr_at_10_max
      value: 30.83734303048343
    - type: nauc_mrr_at_10_std
      value: 34.348404162478694
    - type: nauc_mrr_at_1_diff1
      value: 49.560483335546415
    - type: nauc_mrr_at_1_max
      value: 28.883661816871232
    - type: nauc_mrr_at_1_std
      value: 30.89553654418874
    - type: nauc_mrr_at_20_diff1
      value: 45.499322734057515
    - type: nauc_mrr_at_20_max
      value: 30.918972161205733
    - type: nauc_mrr_at_20_std
      value: 34.282904222510595
    - type: nauc_mrr_at_3_diff1
      value: 45.39622724954005
    - type: nauc_mrr_at_3_max
      value: 31.457074078677454
    - type: nauc_mrr_at_3_std
      value: 34.079043384571555
    - type: nauc_mrr_at_5_diff1
      value: 44.71358730464237
    - type: nauc_mrr_at_5_max
      value: 30.69295376764748
    - type: nauc_mrr_at_5_std
      value: 34.31128800389916
    - type: nauc_ndcg_at_1000_diff1
      value: 23.109017019057422
    - type: nauc_ndcg_at_1000_max
      value: 23.08462483716398
    - type: nauc_ndcg_at_1000_std
      value: 36.8911815972109
    - type: nauc_ndcg_at_100_diff1
      value: 23.827280037818173
    - type: nauc_ndcg_at_100_max
      value: 13.309666633249211
    - type: nauc_ndcg_at_100_std
      value: 28.44384667395871
    - type: nauc_ndcg_at_10_diff1
      value: 26.972856731999386
    - type: nauc_ndcg_at_10_max
      value: 14.620707258357266
    - type: nauc_ndcg_at_10_std
      value: 23.111341368346462
    - type: nauc_ndcg_at_1_diff1
      value: 43.59088178770794
    - type: nauc_ndcg_at_1_max
      value: 21.904917923054317
    - type: nauc_ndcg_at_1_std
      value: 21.98647522718905
    - type: nauc_ndcg_at_20_diff1
      value: 26.283361626051914
    - type: nauc_ndcg_at_20_max
      value: 11.10518046266052
    - type: nauc_ndcg_at_20_std
      value: 21.355473505613944
    - type: nauc_ndcg_at_3_diff1
      value: 30.024148446672083
    - type: nauc_ndcg_at_3_max
      value: 18.48737788479935
    - type: nauc_ndcg_at_3_std
      value: 23.24967559220411
    - type: nauc_ndcg_at_5_diff1
      value: 27.31687195788342
    - type: nauc_ndcg_at_5_max
      value: 17.233426051712428
    - type: nauc_ndcg_at_5_std
      value: 22.98467702068255
    - type: nauc_precision_at_1000_diff1
      value: -13.448141290306074
    - type: nauc_precision_at_1000_max
      value: 42.26049965587544
    - type: nauc_precision_at_1000_std
      value: 17.838997647650835
    - type: nauc_precision_at_100_diff1
      value: -5.070670934466766
    - type: nauc_precision_at_100_max
      value: 33.96276536553548
    - type: nauc_precision_at_100_std
      value: 47.592571562595765
    - type: nauc_precision_at_10_diff1
      value: 5.079452111840327
    - type: nauc_precision_at_10_max
      value: 33.145301874068146
    - type: nauc_precision_at_10_std
      value: 46.26256386765269
    - type: nauc_precision_at_1_diff1
      value: 49.560483335546415
    - type: nauc_precision_at_1_max
      value: 28.883661816871232
    - type: nauc_precision_at_1_std
      value: 30.89553654418874
    - type: nauc_precision_at_20_diff1
      value: 3.253674888888517
    - type: nauc_precision_at_20_max
      value: 34.667104498369575
    - type: nauc_precision_at_20_std
      value: 49.202859485875535
    - type: nauc_precision_at_3_diff1
      value: 15.790066053828234
    - type: nauc_precision_at_3_max
      value: 27.215083484496542
    - type: nauc_precision_at_3_std
      value: 33.11505410450215
    - type: nauc_precision_at_5_diff1
      value: 9.530674873702113
    - type: nauc_precision_at_5_max
      value: 31.21998248355014
    - type: nauc_precision_at_5_std
      value: 39.07247161423012
    - type: nauc_recall_at_1000_diff1
      value: 5.70231960458697
    - type: nauc_recall_at_1000_max
      value: 16.173798281531525
    - type: nauc_recall_at_1000_std
      value: 40.45772368713694
    - type: nauc_recall_at_100_diff1
      value: 9.815485122352673
    - type: nauc_recall_at_100_max
      value: 3.5894004884530735
    - type: nauc_recall_at_100_std
      value: 23.442799836302864
    - type: nauc_recall_at_10_diff1
      value: 14.537879655467389
    - type: nauc_recall_at_10_max
      value: -10.56087357341994
    - type: nauc_recall_at_10_std
      value: -7.372934296480146
    - type: nauc_recall_at_1_diff1
      value: 32.58191575261803
    - type: nauc_recall_at_1_max
      value: -10.57813486927926
    - type: nauc_recall_at_1_std
      value: -9.588423425329879
    - type: nauc_recall_at_20_diff1
      value: 13.359604621352824
    - type: nauc_recall_at_20_max
      value: -6.037674048018859
    - type: nauc_recall_at_20_std
      value: -0.191231970406073
    - type: nauc_recall_at_3_diff1
      value: 20.620776298724362
    - type: nauc_recall_at_3_max
      value: -14.34692846751201
    - type: nauc_recall_at_3_std
      value: -12.202460021792232
    - type: nauc_recall_at_5_diff1
      value: 17.573424943863706
    - type: nauc_recall_at_5_max
      value: -10.968843043485661
    - type: nauc_recall_at_5_std
      value: -10.513373048008399
    - type: ndcg_at_1
      value: 48.375
    - type: ndcg_at_10
      value: 39.582
    - type: ndcg_at_100
      value: 45.259
    - type: ndcg_at_1000
      value: 53.022000000000006
    - type: ndcg_at_20
      value: 39.038000000000004
    - type: ndcg_at_3
      value: 42.802
    - type: ndcg_at_5
      value: 40.538000000000004
    - type: precision_at_1
      value: 61.25000000000001
    - type: precision_at_10
      value: 32.2
    - type: precision_at_100
      value: 10.545
    - type: precision_at_1000
      value: 2.2880000000000003
    - type: precision_at_20
      value: 24.05
    - type: precision_at_3
      value: 48.083
    - type: precision_at_5
      value: 40.65
    - type: recall_at_1
      value: 8.193
    - type: recall_at_10
      value: 25.519
    - type: recall_at_100
      value: 54.124
    - type: recall_at_1000
      value: 77.92099999999999
    - type: recall_at_20
      value: 32.385999999999996
    - type: recall_at_3
      value: 15.211
    - type: recall_at_5
      value: 18.891
  - task:
      type: Classification
    dataset:
      name: MTEB EmotionClassification
      type: mteb/emotion
      config: default
      split: test
      revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
    metrics:
    - type: accuracy
      value: 85.565
    - type: f1
      value: 81.12346731656551
    - type: f1_weighted
      value: 85.98372374550102
    - type: main_score
      value: 85.565
  - task:
      type: Retrieval
    dataset:
      name: MTEB FEVER
      type: mteb/fever
      config: default
      split: test
      revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
    metrics:
    - type: main_score
      value: 86.026
    - type: map_at_1
      value: 73.339
    - type: map_at_10
      value: 81.943
    - type: map_at_100
      value: 82.12899999999999
    - type: map_at_1000
      value: 82.145
    - type: map_at_20
      value: 82.05799999999999
    - type: map_at_3
      value: 80.827
    - type: map_at_5
      value: 81.628
    - type: mrr_at_1
      value: 78.98289828982898
    - type: mrr_at_10
      value: 87.04412703175062
    - type: mrr_at_100
      value: 87.1023996343652
    - type: mrr_at_1000
      value: 87.10370910386118
    - type: mrr_at_20
      value: 87.08615223713309
    - type: mrr_at_3
      value: 86.2386238623861
    - type: mrr_at_5
      value: 86.86568656865666
    - type: nauc_map_at_1000_diff1
      value: 48.22616948132843
    - type: nauc_map_at_1000_max
      value: 1.6340021380561394
    - type: nauc_map_at_1000_std
      value: -25.200746351372793
    - type: nauc_map_at_100_diff1
      value: 48.198187398812806
    - type: nauc_map_at_100_max
      value: 1.6220191408228601
    - type: nauc_map_at_100_std
      value: -25.193042721137566
    - type: nauc_map_at_10_diff1
      value: 48.00585391806132
    - type: nauc_map_at_10_max
      value: 1.4817376575907626
    - type: nauc_map_at_10_std
      value: -25.201484788329843
    - type: nauc_map_at_1_diff1
      value: 52.76212226788538
    - type: nauc_map_at_1_max
      value: 0.0520314144507071
    - type: nauc_map_at_1_std
      value: -26.20833932232049
    - type: nauc_map_at_20_diff1
      value: 48.12533777970878
    - type: nauc_map_at_20_max
      value: 1.5240294773565493
    - type: nauc_map_at_20_std
      value: -25.192450618181123
    - type: nauc_map_at_3_diff1
      value: 47.96480519565094
    - type: nauc_map_at_3_max
      value: 1.1887774816136902
    - type: nauc_map_at_3_std
      value: -26.31363833371711
    - type: nauc_map_at_5_diff1
      value: 47.79080333430883
    - type: nauc_map_at_5_max
      value: 1.6220551876503297
    - type: nauc_map_at_5_std
      value: -25.250585439913415
    - type: nauc_mrr_at_1000_diff1
      value: 64.95992140968579
    - type: nauc_mrr_at_1000_max
      value: 1.6737288643493216
    - type: nauc_mrr_at_1000_std
      value: -37.732249646223224
    - type: nauc_mrr_at_100_diff1
      value: 64.95845005240741
    - type: nauc_mrr_at_100_max
      value: 1.6807060884331666
    - type: nauc_mrr_at_100_std
      value: -37.73314881154047
    - type: nauc_mrr_at_10_diff1
      value: 64.9115307834577
    - type: nauc_mrr_at_10_max
      value: 1.7195209183889257
    - type: nauc_mrr_at_10_std
      value: -37.88536525017639
    - type: nauc_mrr_at_1_diff1
      value: 66.13713227430745
    - type: nauc_mrr_at_1_max
      value: 0.37082095312916874
    - type: nauc_mrr_at_1_std
      value: -34.379038222842254
    - type: nauc_mrr_at_20_diff1
      value: 64.95488651854674
    - type: nauc_mrr_at_20_max
      value: 1.6985375216432168
    - type: nauc_mrr_at_20_std
      value: -37.755703989608705
    - type: nauc_mrr_at_3_diff1
      value: 64.9535677343948
    - type: nauc_mrr_at_3_max
      value: 1.5195414353630512
    - type: nauc_mrr_at_3_std
      value: -39.21735562852805
    - type: nauc_mrr_at_5_diff1
      value: 64.85513437757459
    - type: nauc_mrr_at_5_max
      value: 1.9382830256224208
    - type: nauc_mrr_at_5_std
      value: -38.043842104083545
    - type: nauc_ndcg_at_1000_diff1
      value: 49.74095915307536
    - type: nauc_ndcg_at_1000_max
      value: 2.605169283095937
    - type: nauc_ndcg_at_1000_std
      value: -25.835814259340832
    - type: nauc_ndcg_at_100_diff1
      value: 49.002859024867945
    - type: nauc_ndcg_at_100_max
      value: 2.5116469969385884
    - type: nauc_ndcg_at_100_std
      value: -25.479921013562272
    - type: nauc_ndcg_at_10_diff1
      value: 48.25197176801494
    - type: nauc_ndcg_at_10_max
      value: 1.9108104946028264
    - type: nauc_ndcg_at_10_std
      value: -25.780784974391295
    - type: nauc_ndcg_at_1_diff1
      value: 66.13713227430745
    - type: nauc_ndcg_at_1_max
      value: 0.37082095312916874
    - type: nauc_ndcg_at_1_std
      value: -34.379038222842254
    - type: nauc_ndcg_at_20_diff1
      value: 48.59674729644139
    - type: nauc_ndcg_at_20_max
      value: 1.9950884849133927
    - type: nauc_ndcg_at_20_std
      value: -25.569135598052622
    - type: nauc_ndcg_at_3_diff1
      value: 49.305511135576275
    - type: nauc_ndcg_at_3_max
      value: 1.8638668857901368
    - type: nauc_ndcg_at_3_std
      value: -29.02269314595723
    - type: nauc_ndcg_at_5_diff1
      value: 48.1680764938404
    - type: nauc_ndcg_at_5_max
      value: 2.4842182285117964
    - type: nauc_ndcg_at_5_std
      value: -26.244542780767375
    - type: nauc_precision_at_1000_diff1
      value: -4.478420343136971
    - type: nauc_precision_at_1000_max
      value: 11.70949232501659
    - type: nauc_precision_at_1000_std
      value: 1.7386198733671119
    - type: nauc_precision_at_100_diff1
      value: -4.172269763651759
    - type: nauc_precision_at_100_max
      value: 13.082661117154743
    - type: nauc_precision_at_100_std
      value: 1.8002212793127355
    - type: nauc_precision_at_10_diff1
      value: 5.702289274109695
    - type: nauc_precision_at_10_max
      value: 8.484620250928458
    - type: nauc_precision_at_10_std
      value: -8.132389694515703
- type: nauc_precision_at_1_diff1 value: 66.13713227430745 - type: nauc_precision_at_1_max value: 0.37082095312916874 - type: nauc_precision_at_1_std value: -34.379038222842254 - type: nauc_precision_at_20_diff1 value: 0.5564831263316283 - type: nauc_precision_at_20_max value: 8.881191911131173 - type: nauc_precision_at_20_std value: -3.696180671957281 - type: nauc_precision_at_3_diff1 value: 35.75913314270679 - type: nauc_precision_at_3_max value: 7.896253718358011 - type: nauc_precision_at_3_std value: -33.8336411888768 - type: nauc_precision_at_5_diff1 value: 17.101795422527648 - type: nauc_precision_at_5_max value: 11.993885038446976 - type: nauc_precision_at_5_std value: -16.39044303210142 - type: nauc_recall_at_1000_diff1 value: 1.765610982286282 - type: nauc_recall_at_1000_max value: 16.0490507693684 - type: nauc_recall_at_1000_std value: 28.474043694387696 - type: nauc_recall_at_100_diff1 value: 6.2725603406909265 - type: nauc_recall_at_100_max value: 10.665282199745704 - type: nauc_recall_at_100_std value: 13.266482323582757 - type: nauc_recall_at_10_diff1 value: 16.010002473322103 - type: nauc_recall_at_10_max value: 4.051158641772395 - type: nauc_recall_at_10_std value: -3.963886778602456 - type: nauc_recall_at_1_diff1 value: 52.76212226788538 - type: nauc_recall_at_1_max value: 0.0520314144507071 - type: nauc_recall_at_1_std value: -26.20833932232049 - type: nauc_recall_at_20_diff1 value: 12.763325751516286 - type: nauc_recall_at_20_max value: 4.589618045061225 - type: nauc_recall_at_20_std value: 2.3135711002947525 - type: nauc_recall_at_3_diff1 value: 31.878202992328298 - type: nauc_recall_at_3_max value: 2.398044119809843 - type: nauc_recall_at_3_std value: -22.48228292127779 - type: nauc_recall_at_5_diff1 value: 22.01091185405021 - type: nauc_recall_at_5_max value: 6.161863454884261 - type: nauc_recall_at_5_std value: -10.442113305092082 - type: ndcg_at_1 value: 78.983 - type: ndcg_at_10 value: 86.026 - type: ndcg_at_100 value: 86.666 - type: ndcg_at_1000 value: 86.945 - type: ndcg_at_20 value: 86.333 - type: ndcg_at_3 value: 84.269 - type: ndcg_at_5 value: 85.439 - type: precision_at_1 value: 78.983 - type: precision_at_10 value: 10.282 - type: precision_at_100 value: 1.078 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_20 value: 5.2330000000000005 - type: precision_at_3 value: 32.218 - type: precision_at_5 value: 20.06 - type: recall_at_1 value: 73.339 - type: recall_at_10 value: 93.557 - type: recall_at_100 value: 96.03399999999999 - type: recall_at_1000 value: 97.784 - type: recall_at_20 value: 94.6 - type: recall_at_3 value: 88.851 - type: recall_at_5 value: 91.81 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: main_score value: 45.019 - type: map_at_1 value: 21.923000000000002 - type: map_at_10 value: 36.661 - type: map_at_100 value: 38.727000000000004 - type: map_at_1000 value: 38.896 - type: map_at_20 value: 37.821 - type: map_at_3 value: 31.812 - type: map_at_5 value: 34.474 - type: mrr_at_1 value: 43.05555555555556 - type: mrr_at_10 value: 52.714824612972755 - type: mrr_at_100 value: 53.47543808894285 - type: mrr_at_1000 value: 53.50616025822894 - type: mrr_at_20 value: 53.14543059863263 - type: mrr_at_3 value: 50.10288065843621 - type: mrr_at_5 value: 51.715534979423836 - type: nauc_map_at_1000_diff1 value: 41.776705312708486 - type: nauc_map_at_1000_max value: 24.93532754336337 - type: nauc_map_at_1000_std value: 
-2.794190590614799 - type: nauc_map_at_100_diff1 value: 41.73579109673881 - type: nauc_map_at_100_max value: 24.80625280860252 - type: nauc_map_at_100_std value: -2.814441295874619 - type: nauc_map_at_10_diff1 value: 41.75395538260581 - type: nauc_map_at_10_max value: 23.219207680303324 - type: nauc_map_at_10_std value: -3.5779070328036138 - type: nauc_map_at_1_diff1 value: 48.46545399169614 - type: nauc_map_at_1_max value: 16.49315969594624 - type: nauc_map_at_1_std value: -7.505787454483636 - type: nauc_map_at_20_diff1 value: 41.53641801097531 - type: nauc_map_at_20_max value: 24.00770569213574 - type: nauc_map_at_20_std value: -3.191754163523877 - type: nauc_map_at_3_diff1 value: 41.75052616046243 - type: nauc_map_at_3_max value: 19.115081667001014 - type: nauc_map_at_3_std value: -6.668596004487064 - type: nauc_map_at_5_diff1 value: 42.45446754604312 - type: nauc_map_at_5_max value: 20.947253345126185 - type: nauc_map_at_5_std value: -5.125992439200763 - type: nauc_mrr_at_1000_diff1 value: 52.09717084990717 - type: nauc_mrr_at_1000_max value: 38.086957556354456 - type: nauc_mrr_at_1000_std value: -0.68079244284855 - type: nauc_mrr_at_100_diff1 value: 52.081504543550686 - type: nauc_mrr_at_100_max value: 38.10189737899758 - type: nauc_mrr_at_100_std value: -0.6731400759499799 - type: nauc_mrr_at_10_diff1 value: 51.962775327926934 - type: nauc_mrr_at_10_max value: 37.860734658269976 - type: nauc_mrr_at_10_std value: -0.8627588620266099 - type: nauc_mrr_at_1_diff1 value: 56.643374967422865 - type: nauc_mrr_at_1_max value: 37.424164231372195 - type: nauc_mrr_at_1_std value: -3.808604224746232 - type: nauc_mrr_at_20_diff1 value: 51.9634718440668 - type: nauc_mrr_at_20_max value: 37.99992134394818 - type: nauc_mrr_at_20_std value: -0.5725435512805715 - type: nauc_mrr_at_3_diff1 value: 51.9083290591896 - type: nauc_mrr_at_3_max value: 37.49495462369628 - type: nauc_mrr_at_3_std value: -2.193915400523023 - type: nauc_mrr_at_5_diff1 value: 52.24074329239152 - type: nauc_mrr_at_5_max value: 37.96365352861984 - type: nauc_mrr_at_5_std value: -1.5116002789297864 - type: nauc_ndcg_at_1000_diff1 value: 43.88564426048843 - type: nauc_ndcg_at_1000_max value: 31.371070838376326 - type: nauc_ndcg_at_1000_std value: 1.182058822041445 - type: nauc_ndcg_at_100_diff1 value: 43.47882005622348 - type: nauc_ndcg_at_100_max value: 30.23626893448966 - type: nauc_ndcg_at_100_std value: 1.3554256181078206 - type: nauc_ndcg_at_10_diff1 value: 42.78328747987686 - type: nauc_ndcg_at_10_max value: 26.971284497406334 - type: nauc_ndcg_at_10_std value: -0.9361763271905158 - type: nauc_ndcg_at_1_diff1 value: 56.643374967422865 - type: nauc_ndcg_at_1_max value: 37.424164231372195 - type: nauc_ndcg_at_1_std value: -3.808604224746232 - type: nauc_ndcg_at_20_diff1 value: 42.51200178317055 - type: nauc_ndcg_at_20_max value: 27.807479427212844 - type: nauc_ndcg_at_20_std value: -0.16279719845344157 - type: nauc_ndcg_at_3_diff1 value: 41.983935082179556 - type: nauc_ndcg_at_3_max value: 28.446235814415143 - type: nauc_ndcg_at_3_std value: -3.0007943000595003 - type: nauc_ndcg_at_5_diff1 value: 43.21852196702825 - type: nauc_ndcg_at_5_max value: 26.601248066336986 - type: nauc_ndcg_at_5_std value: -2.5471886292781702 - type: nauc_precision_at_1000_diff1 value: -0.26010199321259797 - type: nauc_precision_at_1000_max value: 35.79601474558423 - type: nauc_precision_at_1000_std value: 14.342818001909988 - type: nauc_precision_at_100_diff1 value: 6.004698224173632 - type: nauc_precision_at_100_max value: 38.52857855255943 - type: 
nauc_precision_at_100_std value: 16.21705591642149 - type: nauc_precision_at_10_diff1 value: 17.49728453546782 - type: nauc_precision_at_10_max value: 38.24671033647839 - type: nauc_precision_at_10_std value: 12.030940471652098 - type: nauc_precision_at_1_diff1 value: 56.643374967422865 - type: nauc_precision_at_1_max value: 37.424164231372195 - type: nauc_precision_at_1_std value: -3.808604224746232 - type: nauc_precision_at_20_diff1 value: 13.057739432783794 - type: nauc_precision_at_20_max value: 37.84177604877064 - type: nauc_precision_at_20_std value: 13.135243737603359 - type: nauc_precision_at_3_diff1 value: 29.106393446078787 - type: nauc_precision_at_3_max value: 33.51402929333319 - type: nauc_precision_at_3_std value: 1.9298573035534488 - type: nauc_precision_at_5_diff1 value: 25.039378213923403 - type: nauc_precision_at_5_max value: 36.213261098065125 - type: nauc_precision_at_5_std value: 7.142334933169122 - type: nauc_recall_at_1000_diff1 value: 24.897608581023757 - type: nauc_recall_at_1000_max value: 24.60932291382376 - type: nauc_recall_at_1000_std value: 30.05990115014322 - type: nauc_recall_at_100_diff1 value: 30.807527684131564 - type: nauc_recall_at_100_max value: 22.540558835740985 - type: nauc_recall_at_100_std value: 14.493739358980907 - type: nauc_recall_at_10_diff1 value: 31.683742260409076 - type: nauc_recall_at_10_max value: 17.828711448272134 - type: nauc_recall_at_10_std value: 1.899605838015785 - type: nauc_recall_at_1_diff1 value: 48.46545399169614 - type: nauc_recall_at_1_max value: 16.49315969594624 - type: nauc_recall_at_1_std value: -7.505787454483636 - type: nauc_recall_at_20_diff1 value: 30.08305577595204 - type: nauc_recall_at_20_max value: 18.75062281011906 - type: nauc_recall_at_20_std value: 4.502661433146342 - type: nauc_recall_at_3_diff1 value: 33.53153516576839 - type: nauc_recall_at_3_max value: 14.790607412204485 - type: nauc_recall_at_3_std value: -6.1140323409194846 - type: nauc_recall_at_5_diff1 value: 35.64279484984148 - type: nauc_recall_at_5_max value: 15.401875599379574 - type: nauc_recall_at_5_std value: -3.2844856697915774 - type: ndcg_at_1 value: 43.056 - type: ndcg_at_10 value: 45.019 - type: ndcg_at_100 value: 51.98199999999999 - type: ndcg_at_1000 value: 54.581999999999994 - type: ndcg_at_20 value: 47.721999999999994 - type: ndcg_at_3 value: 40.54 - type: ndcg_at_5 value: 42.142 - type: precision_at_1 value: 43.056 - type: precision_at_10 value: 12.531 - type: precision_at_100 value: 1.9949999999999999 - type: precision_at_1000 value: 0.245 - type: precision_at_20 value: 7.446 - type: precision_at_3 value: 27.058 - type: precision_at_5 value: 20.061999999999998 - type: recall_at_1 value: 21.923000000000002 - type: recall_at_10 value: 52.85300000000001 - type: recall_at_100 value: 78.133 - type: recall_at_1000 value: 93.75 - type: recall_at_20 value: 61.085 - type: recall_at_3 value: 37.118 - type: recall_at_5 value: 44.031 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: main_score value: 68.65299999999999 - type: map_at_1 value: 38.893 - type: map_at_10 value: 59.375 - type: map_at_100 value: 60.303 - type: map_at_1000 value: 60.364 - type: map_at_20 value: 59.964 - type: map_at_3 value: 55.718 - type: map_at_5 value: 57.99999999999999 - type: mrr_at_1 value: 77.78528021607022 - type: mrr_at_10 value: 84.49470006323453 - type: mrr_at_100 value: 84.6519637218647 - type: mrr_at_1000 value: 84.65768034160618 - type: 
mrr_at_20 value: 84.61055874712832 - type: mrr_at_3 value: 83.59441818591013 - type: mrr_at_5 value: 84.19266261534956 - type: nauc_map_at_1000_diff1 value: 15.948378928650673 - type: nauc_map_at_1000_max value: 15.711635353869994 - type: nauc_map_at_1000_std value: 0.937019577383957 - type: nauc_map_at_100_diff1 value: 15.918426215773247 - type: nauc_map_at_100_max value: 15.699627284031124 - type: nauc_map_at_100_std value: 0.9584857374941618 - type: nauc_map_at_10_diff1 value: 15.879270822613408 - type: nauc_map_at_10_max value: 15.463063162099125 - type: nauc_map_at_10_std value: 0.15481877422177437 - type: nauc_map_at_1_diff1 value: 71.30652188008001 - type: nauc_map_at_1_max value: 32.60802008342313 - type: nauc_map_at_1_std value: -12.29496015891874 - type: nauc_map_at_20_diff1 value: 15.853758892635852 - type: nauc_map_at_20_max value: 15.570900027569573 - type: nauc_map_at_20_std value: 0.6783433634347852 - type: nauc_map_at_3_diff1 value: 17.97014394473015 - type: nauc_map_at_3_max value: 15.218485551181926 - type: nauc_map_at_3_std value: -2.4303445320319272 - type: nauc_map_at_5_diff1 value: 16.50404017618271 - type: nauc_map_at_5_max value: 15.285663669100073 - type: nauc_map_at_5_std value: -0.989351556289713 - type: nauc_mrr_at_1000_diff1 value: 70.0763435325149 - type: nauc_mrr_at_1000_max value: 34.01106818267054 - type: nauc_mrr_at_1000_std value: -10.558570244805534 - type: nauc_mrr_at_100_diff1 value: 70.0763826742575 - type: nauc_mrr_at_100_max value: 34.01329127860268 - type: nauc_mrr_at_100_std value: -10.553859035770314 - type: nauc_mrr_at_10_diff1 value: 70.03690200308235 - type: nauc_mrr_at_10_max value: 34.10786779680883 - type: nauc_mrr_at_10_std value: -10.509981664609755 - type: nauc_mrr_at_1_diff1 value: 71.30652188008001 - type: nauc_mrr_at_1_max value: 32.60802008342313 - type: nauc_mrr_at_1_std value: -12.29496015891874 - type: nauc_mrr_at_20_diff1 value: 70.07320564989382 - type: nauc_mrr_at_20_max value: 34.01911070550699 - type: nauc_mrr_at_20_std value: -10.532501476325248 - type: nauc_mrr_at_3_diff1 value: 69.73518331018965 - type: nauc_mrr_at_3_max value: 33.7438084424745 - type: nauc_mrr_at_3_std value: -11.302692900313119 - type: nauc_mrr_at_5_diff1 value: 69.86565354847778 - type: nauc_mrr_at_5_max value: 34.135593857390504 - type: nauc_mrr_at_5_std value: -10.380178093077621 - type: nauc_ndcg_at_1000_diff1 value: 20.865436555566845 - type: nauc_ndcg_at_1000_max value: 18.83121871269731 - type: nauc_ndcg_at_1000_std value: 3.566623532300052 - type: nauc_ndcg_at_100_diff1 value: 19.90357263881322 - type: nauc_ndcg_at_100_max value: 18.387111355628193 - type: nauc_ndcg_at_100_std value: 4.243680531655493 - type: nauc_ndcg_at_10_diff1 value: 19.721051339510907 - type: nauc_ndcg_at_10_max value: 17.558512453515227 - type: nauc_ndcg_at_10_std value: 1.2891095080720567 - type: nauc_ndcg_at_1_diff1 value: 71.30652188008001 - type: nauc_ndcg_at_1_max value: 32.60802008342313 - type: nauc_ndcg_at_1_std value: -12.29496015891874 - type: nauc_ndcg_at_20_diff1 value: 19.519425870891023 - type: nauc_ndcg_at_20_max value: 17.77152674804043 - type: nauc_ndcg_at_20_std value: 2.7253915106561712 - type: nauc_ndcg_at_3_diff1 value: 23.595619290089495 - type: nauc_ndcg_at_3_max value: 17.443501928111456 - type: nauc_ndcg_at_3_std value: -3.1185231896019183 - type: nauc_ndcg_at_5_diff1 value: 21.128676475251222 - type: nauc_ndcg_at_5_max value: 17.427440887891148 - type: nauc_ndcg_at_5_std value: -0.8006655617871765 - type: nauc_precision_at_1000_diff1 value: 
-18.605360521020412 - type: nauc_precision_at_1000_max value: 13.992651128348118 - type: nauc_precision_at_1000_std value: 34.896942379633316 - type: nauc_precision_at_100_diff1 value: -11.425102107370272 - type: nauc_precision_at_100_max value: 11.216164840931667 - type: nauc_precision_at_100_std value: 27.722125456439343 - type: nauc_precision_at_10_diff1 value: -3.1401539776631653 - type: nauc_precision_at_10_max value: 10.416214004945402 - type: nauc_precision_at_10_std value: 10.251563605515335 - type: nauc_precision_at_1_diff1 value: 71.30652188008001 - type: nauc_precision_at_1_max value: 32.60802008342313 - type: nauc_precision_at_1_std value: -12.29496015891874 - type: nauc_precision_at_20_diff1 value: -6.456921653790667 - type: nauc_precision_at_20_max value: 10.23022445081364 - type: nauc_precision_at_20_std value: 15.935771905722302 - type: nauc_precision_at_3_diff1 value: 8.38156786039047 - type: nauc_precision_at_3_max value: 12.08129239567508 - type: nauc_precision_at_3_std value: 0.05626041327325479 - type: nauc_precision_at_5_diff1 value: 2.4102262974666653 - type: nauc_precision_at_5_max value: 11.160384909564122 - type: nauc_precision_at_5_std value: 4.587163311214582 - type: nauc_recall_at_1000_diff1 value: -18.605360521019925 - type: nauc_recall_at_1000_max value: 13.992651128348363 - type: nauc_recall_at_1000_std value: 34.89694237963353 - type: nauc_recall_at_100_diff1 value: -11.425102107370193 - type: nauc_recall_at_100_max value: 11.216164840931476 - type: nauc_recall_at_100_std value: 27.72212545643919 - type: nauc_recall_at_10_diff1 value: -3.140153977663016 - type: nauc_recall_at_10_max value: 10.416214004945413 - type: nauc_recall_at_10_std value: 10.251563605515395 - type: nauc_recall_at_1_diff1 value: 71.30652188008001 - type: nauc_recall_at_1_max value: 32.60802008342313 - type: nauc_recall_at_1_std value: -12.29496015891874 - type: nauc_recall_at_20_diff1 value: -6.45692165379055 - type: nauc_recall_at_20_max value: 10.230224450813735 - type: nauc_recall_at_20_std value: 15.935771905722335 - type: nauc_recall_at_3_diff1 value: 8.381567860390362 - type: nauc_recall_at_3_max value: 12.081292395675078 - type: nauc_recall_at_3_std value: 0.05626041327321052 - type: nauc_recall_at_5_diff1 value: 2.4102262974666355 - type: nauc_recall_at_5_max value: 11.160384909564078 - type: nauc_recall_at_5_std value: 4.587163311214529 - type: ndcg_at_1 value: 77.78500000000001 - type: ndcg_at_10 value: 68.65299999999999 - type: ndcg_at_100 value: 71.69200000000001 - type: ndcg_at_1000 value: 72.869 - type: ndcg_at_20 value: 70.078 - type: ndcg_at_3 value: 63.568000000000005 - type: ndcg_at_5 value: 66.402 - type: precision_at_1 value: 77.78500000000001 - type: precision_at_10 value: 14.386 - type: precision_at_100 value: 1.672 - type: precision_at_1000 value: 0.183 - type: precision_at_20 value: 7.6499999999999995 - type: precision_at_3 value: 40.473 - type: precision_at_5 value: 26.515 - type: recall_at_1 value: 38.893 - type: recall_at_10 value: 71.931 - type: recall_at_100 value: 83.619 - type: recall_at_1000 value: 91.431 - type: recall_at_20 value: 76.496 - type: recall_at_3 value: 60.709 - type: recall_at_5 value: 66.286 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 95.0268 - type: ap value: 92.72653250341486 - type: ap_weighted value: 92.72653250341486 - type: f1 value: 95.02503365717179 - type: f1_weighted value: 
95.02503365717179 - type: main_score value: 95.0268 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: main_score value: 35.191 - type: map_at_1 value: 16.139 - type: map_at_10 value: 28.101 - type: map_at_100 value: 29.461 - type: map_at_1000 value: 29.515 - type: map_at_20 value: 28.936 - type: map_at_3 value: 23.954 - type: map_at_5 value: 26.308999999999997 - type: mrr_at_1 value: 16.59025787965616 - type: mrr_at_10 value: 28.583100241051344 - type: mrr_at_100 value: 29.89488944200741 - type: mrr_at_1000 value: 29.94198818201922 - type: mrr_at_20 value: 29.397153289126486 - type: mrr_at_3 value: 24.512893982807853 - type: mrr_at_5 value: 26.840974212034318 - type: nauc_map_at_1000_diff1 value: 29.92308133337915 - type: nauc_map_at_1000_max value: -4.792013160789208 - type: nauc_map_at_1000_std value: -20.365722765519205 - type: nauc_map_at_100_diff1 value: 29.927608009586475 - type: nauc_map_at_100_max value: -4.813011061550381 - type: nauc_map_at_100_std value: -20.34066079647475 - type: nauc_map_at_10_diff1 value: 29.85964417677257 - type: nauc_map_at_10_max value: -5.020819297438392 - type: nauc_map_at_10_std value: -21.185600868900707 - type: nauc_map_at_1_diff1 value: 31.91727354325134 - type: nauc_map_at_1_max value: -3.3836191178002637 - type: nauc_map_at_1_std value: -18.94420033626203 - type: nauc_map_at_20_diff1 value: 29.909409775064265 - type: nauc_map_at_20_max value: -4.882624170262229 - type: nauc_map_at_20_std value: -20.737422787243176 - type: nauc_map_at_3_diff1 value: 29.96619551770926 - type: nauc_map_at_3_max value: -4.521984358305567 - type: nauc_map_at_3_std value: -20.675567430573214 - type: nauc_map_at_5_diff1 value: 29.672157845793336 - type: nauc_map_at_5_max value: -4.784226867946108 - type: nauc_map_at_5_std value: -21.090554010504313 - type: nauc_mrr_at_1000_diff1 value: 29.57786251899136 - type: nauc_mrr_at_1000_max value: -4.554864207268301 - type: nauc_mrr_at_1000_std value: -20.124071230468733 - type: nauc_mrr_at_100_diff1 value: 29.57869911178864 - type: nauc_mrr_at_100_max value: -4.568738533954914 - type: nauc_mrr_at_100_std value: -20.097461372571754 - type: nauc_mrr_at_10_diff1 value: 29.50101055760309 - type: nauc_mrr_at_10_max value: -4.699465165716407 - type: nauc_mrr_at_10_std value: -20.85880213075095 - type: nauc_mrr_at_1_diff1 value: 31.5283761916309 - type: nauc_mrr_at_1_max value: -3.2410968598060226 - type: nauc_mrr_at_1_std value: -18.877804738741848 - type: nauc_mrr_at_20_diff1 value: 29.55469091898283 - type: nauc_mrr_at_20_max value: -4.6114669798589585 - type: nauc_mrr_at_20_std value: -20.433076769992457 - type: nauc_mrr_at_3_diff1 value: 29.62441465248462 - type: nauc_mrr_at_3_max value: -4.317634456438896 - type: nauc_mrr_at_3_std value: -20.545356421989975 - type: nauc_mrr_at_5_diff1 value: 29.3174731757817 - type: nauc_mrr_at_5_max value: -4.524554398532275 - type: nauc_mrr_at_5_std value: -20.87564955466439 - type: nauc_ndcg_at_1000_diff1 value: 29.417049449756306 - type: nauc_ndcg_at_1000_max value: -4.429863573283831 - type: nauc_ndcg_at_1000_std value: -18.672687178180762 - type: nauc_ndcg_at_100_diff1 value: 29.52545788575206 - type: nauc_ndcg_at_100_max value: -4.839548635918072 - type: nauc_ndcg_at_100_std value: -17.445902376477168 - type: nauc_ndcg_at_10_diff1 value: 29.349337034114708 - type: nauc_ndcg_at_10_max value: -5.654575625474153 - type: nauc_ndcg_at_10_std value: -21.867391862075433 - type: 
nauc_ndcg_at_1_diff1 value: 31.5283761916309 - type: nauc_ndcg_at_1_max value: -3.2410968598060226 - type: nauc_ndcg_at_1_std value: -18.877804738741848 - type: nauc_ndcg_at_20_diff1 value: 29.478679665234736 - type: nauc_ndcg_at_20_max value: -5.348280869926551 - type: nauc_ndcg_at_20_std value: -20.32251566103604 - type: nauc_ndcg_at_3_diff1 value: 29.41586840338385 - type: nauc_ndcg_at_3_max value: -4.737448759293484 - type: nauc_ndcg_at_3_std value: -21.114595209094198 - type: nauc_ndcg_at_5_diff1 value: 28.95897834819025 - type: nauc_ndcg_at_5_max value: -5.144033504465505 - type: nauc_ndcg_at_5_std value: -21.73482008242439 - type: nauc_precision_at_1000_diff1 value: -4.773246418887565 - type: nauc_precision_at_1000_max value: 18.94086713593158 - type: nauc_precision_at_1000_std value: 14.940921913943725 - type: nauc_precision_at_100_diff1 value: 15.529104524208284 - type: nauc_precision_at_100_max value: 4.152043132226839 - type: nauc_precision_at_100_std value: 15.362588630598356 - type: nauc_precision_at_10_diff1 value: 26.327252473718293 - type: nauc_precision_at_10_max value: -6.385696358427295 - type: nauc_precision_at_10_std value: -22.43695468265468 - type: nauc_precision_at_1_diff1 value: 31.5283761916309 - type: nauc_precision_at_1_max value: -3.2410968598060226 - type: nauc_precision_at_1_std value: -18.877804738741848 - type: nauc_precision_at_20_diff1 value: 25.09386904802987 - type: nauc_precision_at_20_max value: -4.384006847324815 - type: nauc_precision_at_20_std value: -15.476174306633775 - type: nauc_precision_at_3_diff1 value: 27.88147581285313 - type: nauc_precision_at_3_max value: -5.10330889992625 - type: nauc_precision_at_3_std value: -22.17804890064486 - type: nauc_precision_at_5_diff1 value: 26.673260429548385 - type: nauc_precision_at_5_max value: -5.849985467654149 - type: nauc_precision_at_5_std value: -23.22704929951935 - type: nauc_recall_at_1000_diff1 value: 11.078337058729081 - type: nauc_recall_at_1000_max value: 29.31329518339392 - type: nauc_recall_at_1000_std value: 61.689932707089845 - type: nauc_recall_at_100_diff1 value: 27.694660226790095 - type: nauc_recall_at_100_max value: -4.662880554456902 - type: nauc_recall_at_100_std value: 17.291575712920476 - type: nauc_recall_at_10_diff1 value: 28.14620642731046 - type: nauc_recall_at_10_max value: -7.883918071832969 - type: nauc_recall_at_10_std value: -23.85382911185965 - type: nauc_recall_at_1_diff1 value: 31.91727354325134 - type: nauc_recall_at_1_max value: -3.3836191178002637 - type: nauc_recall_at_1_std value: -18.94420033626203 - type: nauc_recall_at_20_diff1 value: 28.411188230736368 - type: nauc_recall_at_20_max value: -7.489052404904147 - type: nauc_recall_at_20_std value: -17.923010929300084 - type: nauc_recall_at_3_diff1 value: 28.13888531840714 - type: nauc_recall_at_3_max value: -5.385513963117635 - type: nauc_recall_at_3_std value: -22.09635477229696 - type: nauc_recall_at_5_diff1 value: 27.197531472369057 - type: nauc_recall_at_5_max value: -6.204044942502606 - type: nauc_recall_at_5_std value: -23.25902678179945 - type: ndcg_at_1 value: 16.59 - type: ndcg_at_10 value: 35.191 - type: ndcg_at_100 value: 41.778999999999996 - type: ndcg_at_1000 value: 43.126999999999995 - type: ndcg_at_20 value: 38.153 - type: ndcg_at_3 value: 26.718999999999998 - type: ndcg_at_5 value: 30.919999999999998 - type: precision_at_1 value: 16.59 - type: precision_at_10 value: 5.992999999999999 - type: precision_at_100 value: 0.927 - type: precision_at_1000 value: 0.104 - type: precision_at_20 value: 
3.6020000000000003 - type: precision_at_3 value: 11.815000000000001 - type: precision_at_5 value: 9.218 - type: recall_at_1 value: 16.139 - type: recall_at_10 value: 57.272999999999996 - type: recall_at_100 value: 87.819 - type: recall_at_1000 value: 98.10900000000001 - type: recall_at_20 value: 68.77 - type: recall_at_3 value: 34.172999999999995 - type: recall_at_5 value: 44.259 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 97.57637938896491 - type: f1 value: 97.39941554989736 - type: f1_weighted value: 97.58495129362304 - type: main_score value: 97.57637938896491 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 86.08071135430917 - type: f1 value: 60.67695519910473 - type: f1_weighted value: 86.22253292076088 - type: main_score value: 86.08071135430917 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 74.9394754539341 - type: f1 value: 71.84595519829237 - type: f1_weighted value: 73.7724380212837 - type: main_score value: 74.9394754539341 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 82.0611970410222 - type: f1 value: 80.96764019308867 - type: f1_weighted value: 81.75048816703206 - type: main_score value: 82.0611970410222 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: main_score value: 35.535315182381275 - type: v_measure value: 35.535315182381275 - type: v_measure_std value: 1.2947784991789062 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: main_score value: 32.701317380058356 - type: v_measure value: 32.701317380058356 - type: v_measure_std value: 1.212859415243672 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7 metrics: - type: main_score value: 31.586146859630325 - type: map value: 31.586146859630325 - type: mrr value: 32.74920599119196 - type: nAUC_map_diff1 value: 11.669586995601716 - type: nAUC_map_max value: -19.043343922416184 - type: nAUC_map_std value: -0.002926267520007513 - type: nAUC_mrr_diff1 value: 11.132898797866952 - type: nAUC_mrr_max value: -13.521554137760747 - type: nAUC_mrr_std value: 1.6662256096686372 - task: type: Retrieval dataset: name: MTEB NFCorpus type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: main_score value: 34.493 - type: map_at_1 value: 5.469 - type: map_at_10 value: 12.681999999999999 - type: map_at_100 value: 16.136 - type: map_at_1000 value: 17.574 - type: map_at_20 value: 14.063 - type: map_at_3 value: 9.252 - type: map_at_5 value: 11.03 - type: mrr_at_1 value: 43.962848297213625 - type: mrr_at_10 value: 
53.748095729519854 - type: mrr_at_100 value: 54.31371383821993 - type: mrr_at_1000 value: 54.34550446424 - type: mrr_at_20 value: 54.05753630252571 - type: mrr_at_3 value: 51.34158926728587 - type: mrr_at_5 value: 52.951496388028886 - type: nauc_map_at_1000_diff1 value: 22.42945451651053 - type: nauc_map_at_1000_max value: 25.044939905094555 - type: nauc_map_at_1000_std value: 14.6947376252321 - type: nauc_map_at_100_diff1 value: 24.05126858377848 - type: nauc_map_at_100_max value: 24.260286968462943 - type: nauc_map_at_100_std value: 11.274560706750162 - type: nauc_map_at_10_diff1 value: 28.610449405636412 - type: nauc_map_at_10_max value: 17.669350840567517 - type: nauc_map_at_10_std value: -0.5603965547026133 - type: nauc_map_at_1_diff1 value: 44.546139576048574 - type: nauc_map_at_1_max value: 3.5966098414779686 - type: nauc_map_at_1_std value: -15.204463497276185 - type: nauc_map_at_20_diff1 value: 26.93971089998854 - type: nauc_map_at_20_max value: 20.89952744553902 - type: nauc_map_at_20_std value: 4.323667205452283 - type: nauc_map_at_3_diff1 value: 34.03753780494977 - type: nauc_map_at_3_max value: 10.951970261908517 - type: nauc_map_at_3_std value: -8.942935860299977 - type: nauc_map_at_5_diff1 value: 31.13647526539977 - type: nauc_map_at_5_max value: 13.55486409562657 - type: nauc_map_at_5_std value: -6.285335121924455 - type: nauc_mrr_at_1000_diff1 value: 33.04380727929978 - type: nauc_mrr_at_1000_max value: 40.97460730083534 - type: nauc_mrr_at_1000_std value: 22.68307762886138 - type: nauc_mrr_at_100_diff1 value: 33.038505852668905 - type: nauc_mrr_at_100_max value: 41.004813808229976 - type: nauc_mrr_at_100_std value: 22.727078227914703 - type: nauc_mrr_at_10_diff1 value: 32.945102642427294 - type: nauc_mrr_at_10_max value: 40.59087425732438 - type: nauc_mrr_at_10_std value: 22.2969763977488 - type: nauc_mrr_at_1_diff1 value: 34.55355095202985 - type: nauc_mrr_at_1_max value: 34.35691144716251 - type: nauc_mrr_at_1_std value: 16.025738199559136 - type: nauc_mrr_at_20_diff1 value: 33.01684360381644 - type: nauc_mrr_at_20_max value: 40.82433798731643 - type: nauc_mrr_at_20_std value: 22.56838707992269 - type: nauc_mrr_at_3_diff1 value: 33.2000664328818 - type: nauc_mrr_at_3_max value: 40.65557927809233 - type: nauc_mrr_at_3_std value: 21.640445622194292 - type: nauc_mrr_at_5_diff1 value: 33.14724263980201 - type: nauc_mrr_at_5_max value: 40.37502720649393 - type: nauc_mrr_at_5_std value: 20.91483571628846 - type: nauc_ndcg_at_1000_diff1 value: 23.13999445390973 - type: nauc_ndcg_at_1000_max value: 40.904356797688244 - type: nauc_ndcg_at_1000_std value: 31.135131225973755 - type: nauc_ndcg_at_100_diff1 value: 21.60764588276507 - type: nauc_ndcg_at_100_max value: 34.72455917031235 - type: nauc_ndcg_at_100_std value: 26.084570343364895 - type: nauc_ndcg_at_10_diff1 value: 21.273666650824712 - type: nauc_ndcg_at_10_max value: 36.42637032684147 - type: nauc_ndcg_at_10_std value: 25.854371107614753 - type: nauc_ndcg_at_1_diff1 value: 35.40190534464431 - type: nauc_ndcg_at_1_max value: 34.09394953710087 - type: nauc_ndcg_at_1_std value: 15.082336268368568 - type: nauc_ndcg_at_20_diff1 value: 20.629683502494935 - type: nauc_ndcg_at_20_max value: 35.01440571472175 - type: nauc_ndcg_at_20_std value: 26.1516323412204 - type: nauc_ndcg_at_3_diff1 value: 27.314585132007803 - type: nauc_ndcg_at_3_max value: 38.19301088947643 - type: nauc_ndcg_at_3_std value: 22.37292581921333 - type: nauc_ndcg_at_5_diff1 value: 24.033794102904647 - type: nauc_ndcg_at_5_max value: 36.466778291326506 - type: 
nauc_ndcg_at_5_std value: 23.15763774408816 - type: nauc_precision_at_1000_diff1 value: -13.984096369493178 - type: nauc_precision_at_1000_max value: 8.50221544384146 - type: nauc_precision_at_1000_std value: 35.62592696752026 - type: nauc_precision_at_100_diff1 value: -12.115042643624523 - type: nauc_precision_at_100_max value: 21.139964351279062 - type: nauc_precision_at_100_std value: 45.41323150126541 - type: nauc_precision_at_10_diff1 value: 3.5604358960435594 - type: nauc_precision_at_10_max value: 38.21371536948471 - type: nauc_precision_at_10_std value: 40.093467246870674 - type: nauc_precision_at_1_diff1 value: 34.55355095202985 - type: nauc_precision_at_1_max value: 34.35691144716251 - type: nauc_precision_at_1_std value: 16.025738199559136 - type: nauc_precision_at_20_diff1 value: -2.2994929672216142 - type: nauc_precision_at_20_max value: 33.41182551515417 - type: nauc_precision_at_20_std value: 42.926074063475376 - type: nauc_precision_at_3_diff1 value: 17.026846985190286 - type: nauc_precision_at_3_max value: 40.78926087324481 - type: nauc_precision_at_3_std value: 28.26154405706766 - type: nauc_precision_at_5_diff1 value: 10.066105504177528 - type: nauc_precision_at_5_max value: 38.397299240351515 - type: nauc_precision_at_5_std value: 31.504726528569105 - type: nauc_recall_at_1000_diff1 value: 5.433767085525343 - type: nauc_recall_at_1000_max value: 17.082294989371675 - type: nauc_recall_at_1000_std value: 17.867147762696924 - type: nauc_recall_at_100_diff1 value: 10.513494371628159 - type: nauc_recall_at_100_max value: 19.63867418942476 - type: nauc_recall_at_100_std value: 14.421450754520809 - type: nauc_recall_at_10_diff1 value: 22.750728383486376 - type: nauc_recall_at_10_max value: 15.735611146890621 - type: nauc_recall_at_10_std value: -0.40290229377136233 - type: nauc_recall_at_1_diff1 value: 44.546139576048574 - type: nauc_recall_at_1_max value: 3.5966098414779686 - type: nauc_recall_at_1_std value: -15.204463497276185 - type: nauc_recall_at_20_diff1 value: 22.44097500377964 - type: nauc_recall_at_20_max value: 19.99783526750806 - type: nauc_recall_at_20_std value: 5.831968175648315 - type: nauc_recall_at_3_diff1 value: 30.742501145388644 - type: nauc_recall_at_3_max value: 11.887713348765457 - type: nauc_recall_at_3_std value: -7.507756416467706 - type: nauc_recall_at_5_diff1 value: 25.251057623903268 - type: nauc_recall_at_5_max value: 11.530971742020508 - type: nauc_recall_at_5_std value: -6.9727238554804005 - type: ndcg_at_1 value: 42.57 - type: ndcg_at_10 value: 34.493 - type: ndcg_at_100 value: 31.912000000000003 - type: ndcg_at_1000 value: 40.485 - type: ndcg_at_20 value: 32.314 - type: ndcg_at_3 value: 39.546 - type: ndcg_at_5 value: 38.009 - type: precision_at_1 value: 43.963 - type: precision_at_10 value: 25.728 - type: precision_at_100 value: 8.297 - type: precision_at_1000 value: 2.094 - type: precision_at_20 value: 19.288 - type: precision_at_3 value: 37.564 - type: precision_at_5 value: 33.375 - type: recall_at_1 value: 5.469 - type: recall_at_10 value: 16.733 - type: recall_at_100 value: 32.867000000000004 - type: recall_at_1000 value: 63.873000000000005 - type: recall_at_20 value: 20.312 - type: recall_at_3 value: 10.386 - type: recall_at_5 value: 13.679 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: main_score value: 49.539 - type: map_at_1 value: 26.016000000000002 - type: map_at_10 value: 41.23 - type: map_at_100 value: 42.466 - type: 
map_at_1000 value: 42.494 - type: map_at_20 value: 42.049 - type: map_at_3 value: 36.272999999999996 - type: map_at_5 value: 39.172000000000004 - type: mrr_at_1 value: 29.634994206257243 - type: mrr_at_10 value: 43.814949695598514 - type: mrr_at_100 value: 44.75158330890793 - type: mrr_at_1000 value: 44.76933611785972 - type: mrr_at_20 value: 44.450136580422104 - type: mrr_at_3 value: 39.56160679799143 - type: mrr_at_5 value: 42.083333333333336 - type: nauc_map_at_1000_diff1 value: 31.377733390159623 - type: nauc_map_at_1000_max value: 10.852802240297759 - type: nauc_map_at_1000_std value: -8.156368414989963 - type: nauc_map_at_100_diff1 value: 31.37926107010834 - type: nauc_map_at_100_max value: 10.866567017386616 - type: nauc_map_at_100_std value: -8.13083658675661 - type: nauc_map_at_10_diff1 value: 31.302395420970413 - type: nauc_map_at_10_max value: 10.696471249499485 - type: nauc_map_at_10_std value: -8.608828614048587 - type: nauc_map_at_1_diff1 value: 34.515378947817545 - type: nauc_map_at_1_max value: 8.23278785130009 - type: nauc_map_at_1_std value: -8.790135666737623 - type: nauc_map_at_20_diff1 value: 31.405784027747636 - type: nauc_map_at_20_max value: 10.743222784357599 - type: nauc_map_at_20_std value: -8.336520716356294 - type: nauc_map_at_3_diff1 value: 30.790756885918242 - type: nauc_map_at_3_max value: 9.611996527156451 - type: nauc_map_at_3_std value: -10.30419579409286 - type: nauc_map_at_5_diff1 value: 31.018701056437692 - type: nauc_map_at_5_max value: 10.415471498676181 - type: nauc_map_at_5_std value: -9.267868426130615 - type: nauc_mrr_at_1000_diff1 value: 30.954103753005274 - type: nauc_mrr_at_1000_max value: 11.65610034595576 - type: nauc_mrr_at_1000_std value: -6.236607914879178 - type: nauc_mrr_at_100_diff1 value: 30.95419964742793 - type: nauc_mrr_at_100_max value: 11.67074501272962 - type: nauc_mrr_at_100_std value: -6.2148004414091504 - type: nauc_mrr_at_10_diff1 value: 30.909871849241917 - type: nauc_mrr_at_10_max value: 11.663150347843652 - type: nauc_mrr_at_10_std value: -6.412145873320221 - type: nauc_mrr_at_1_diff1 value: 33.69803436461973 - type: nauc_mrr_at_1_max value: 9.810616582626253 - type: nauc_mrr_at_1_std value: -6.5168183653335845 - type: nauc_mrr_at_20_diff1 value: 30.97036659208301 - type: nauc_mrr_at_20_max value: 11.615291040042264 - type: nauc_mrr_at_20_std value: -6.317206649176624 - type: nauc_mrr_at_3_diff1 value: 30.347687412668307 - type: nauc_mrr_at_3_max value: 11.045997984562728 - type: nauc_mrr_at_3_std value: -7.344237528386735 - type: nauc_mrr_at_5_diff1 value: 30.607591550974323 - type: nauc_mrr_at_5_max value: 11.478687020349025 - type: nauc_mrr_at_5_std value: -6.773130489910162 - type: nauc_ndcg_at_1000_diff1 value: 30.721715941822435 - type: nauc_ndcg_at_1000_max value: 12.363613568822352 - type: nauc_ndcg_at_1000_std value: -6.083916245339269 - type: nauc_ndcg_at_100_diff1 value: 30.608831858292408 - type: nauc_ndcg_at_100_max value: 12.894646588979683 - type: nauc_ndcg_at_100_std value: -5.148801091143074 - type: nauc_ndcg_at_10_diff1 value: 30.483771661792847 - type: nauc_ndcg_at_10_max value: 12.18129035771911 - type: nauc_ndcg_at_10_std value: -7.165744970217042 - type: nauc_ndcg_at_1_diff1 value: 33.79845141868468 - type: nauc_ndcg_at_1_max value: 9.88864563426806 - type: nauc_ndcg_at_1_std value: -6.43552016535101 - type: nauc_ndcg_at_20_diff1 value: 30.77504113488907 - type: nauc_ndcg_at_20_max value: 12.28245448589153 - type: nauc_ndcg_at_20_std value: -6.325276590452571 - type: nauc_ndcg_at_3_diff1 value: 
29.602918057743278 - type: nauc_ndcg_at_3_max value: 10.39055264754259 - type: nauc_ndcg_at_3_std value: -10.014843769784985 - type: nauc_ndcg_at_5_diff1 value: 29.94463296702168 - type: nauc_ndcg_at_5_max value: 11.551920125900473 - type: nauc_ndcg_at_5_std value: -8.48593988495145 - type: nauc_precision_at_1000_diff1 value: -5.690546724212895 - type: nauc_precision_at_1000_max value: 9.109366247129207 - type: nauc_precision_at_1000_std value: 14.65465630262207 - type: nauc_precision_at_100_diff1 value: -1.2336613199255233 - type: nauc_precision_at_100_max value: 14.632255993612098 - type: nauc_precision_at_100_std value: 20.106751006299508 - type: nauc_precision_at_10_diff1 value: 16.156638161044377 - type: nauc_precision_at_10_max value: 15.461271728023455 - type: nauc_precision_at_10_std value: 4.613330902566019 - type: nauc_precision_at_1_diff1 value: 33.79845141868468 - type: nauc_precision_at_1_max value: 9.88864563426806 - type: nauc_precision_at_1_std value: -6.43552016535101 - type: nauc_precision_at_20_diff1 value: 10.833258836740004 - type: nauc_precision_at_20_max value: 14.399547246551503 - type: nauc_precision_at_20_std value: 10.691750912308304 - type: nauc_precision_at_3_diff1 value: 23.440967729505452 - type: nauc_precision_at_3_max value: 12.708378101618688 - type: nauc_precision_at_3_std value: -7.2002199170375105 - type: nauc_precision_at_5_diff1 value: 20.632161061662867 - type: nauc_precision_at_5_max value: 14.803138265646187 - type: nauc_precision_at_5_std value: -1.9170585171231866 - type: nauc_recall_at_1000_diff1 value: 17.469814268756277 - type: nauc_recall_at_1000_max value: 67.91132861575576 - type: nauc_recall_at_1000_std value: 59.719785001643054 - type: nauc_recall_at_100_diff1 value: 20.871489158949146 - type: nauc_recall_at_100_max value: 42.25616221901811 - type: nauc_recall_at_100_std value: 41.83257983711543 - type: nauc_recall_at_10_diff1 value: 26.116159187824273 - type: nauc_recall_at_10_max value: 15.673928195577544 - type: nauc_recall_at_10_std value: -4.068034337550412 - type: nauc_recall_at_1_diff1 value: 34.515378947817545 - type: nauc_recall_at_1_max value: 8.23278785130009 - type: nauc_recall_at_1_std value: -8.790135666737623 - type: nauc_recall_at_20_diff1 value: 26.830515495608314 - type: nauc_recall_at_20_max value: 17.956121895077352 - type: nauc_recall_at_20_std value: 1.8149755315374414 - type: nauc_recall_at_3_diff1 value: 25.57777694351554 - type: nauc_recall_at_3_max value: 10.768605841163243 - type: nauc_recall_at_3_std value: -11.548054988544685 - type: nauc_recall_at_5_diff1 value: 25.69071002325843 - type: nauc_recall_at_5_max value: 13.248151375739594 - type: nauc_recall_at_5_std value: -8.31127808515032 - type: ndcg_at_1 value: 29.605999999999998 - type: ndcg_at_10 value: 49.539 - type: ndcg_at_100 value: 54.67999999999999 - type: ndcg_at_1000 value: 55.287 - type: ndcg_at_20 value: 52.196 - type: ndcg_at_3 value: 40.111999999999995 - type: ndcg_at_5 value: 44.983000000000004 - type: precision_at_1 value: 29.605999999999998 - type: precision_at_10 value: 8.607 - type: precision_at_100 value: 1.147 - type: precision_at_1000 value: 0.121 - type: precision_at_20 value: 4.938 - type: precision_at_3 value: 18.627 - type: precision_at_5 value: 13.927999999999999 - type: recall_at_1 value: 26.016000000000002 - type: recall_at_10 value: 72.51100000000001 - type: recall_at_100 value: 94.60499999999999 - type: recall_at_1000 value: 99.054 - type: recall_at_20 value: 82.353 - type: recall_at_3 value: 47.989 - type: recall_at_5 value: 
59.243 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 metrics: - type: main_score value: 89.387 - type: map_at_1 value: 71.61699999999999 - type: map_at_10 value: 85.785 - type: map_at_100 value: 86.407 - type: map_at_1000 value: 86.42 - type: map_at_20 value: 86.206 - type: map_at_3 value: 82.867 - type: map_at_5 value: 84.736 - type: mrr_at_1 value: 82.49 - type: mrr_at_10 value: 88.59147619047603 - type: mrr_at_100 value: 88.67100295673903 - type: mrr_at_1000 value: 88.67132516200078 - type: mrr_at_20 value: 88.6561804240649 - type: mrr_at_3 value: 87.72499999999982 - type: mrr_at_5 value: 88.34599999999975 - type: nauc_map_at_1000_diff1 value: 77.75322227698767 - type: nauc_map_at_1000_max value: 27.15325474904755 - type: nauc_map_at_1000_std value: -45.950703261401266 - type: nauc_map_at_100_diff1 value: 77.75046471198675 - type: nauc_map_at_100_max value: 27.125684918574887 - type: nauc_map_at_100_std value: -46.00793046653974 - type: nauc_map_at_10_diff1 value: 77.96301805869726 - type: nauc_map_at_10_max value: 26.63787475984541 - type: nauc_map_at_10_std value: -48.2092244990593 - type: nauc_map_at_1_diff1 value: 81.04847175933422 - type: nauc_map_at_1_max value: 20.828021860691376 - type: nauc_map_at_1_std value: -40.4427741623345 - type: nauc_map_at_20_diff1 value: 77.82691021180123 - type: nauc_map_at_20_max value: 26.979439675350086 - type: nauc_map_at_20_std value: -46.94206477224242 - type: nauc_map_at_3_diff1 value: 78.57251235300281 - type: nauc_map_at_3_max value: 24.306776325229592 - type: nauc_map_at_3_std value: -50.446232609379706 - type: nauc_map_at_5_diff1 value: 78.23538738312993 - type: nauc_map_at_5_max value: 26.005150155221003 - type: nauc_map_at_5_std value: -49.72081450369548 - type: nauc_mrr_at_1000_diff1 value: 78.29655431237718 - type: nauc_mrr_at_1000_max value: 29.392496550114718 - type: nauc_mrr_at_1000_std value: -41.08607589889516 - type: nauc_mrr_at_100_diff1 value: 78.29662146607758 - type: nauc_mrr_at_100_max value: 29.393300424020218 - type: nauc_mrr_at_100_std value: -41.086465937239026 - type: nauc_mrr_at_10_diff1 value: 78.30206302797494 - type: nauc_mrr_at_10_max value: 29.367617601691403 - type: nauc_mrr_at_10_std value: -41.241804159667225 - type: nauc_mrr_at_1_diff1 value: 79.00375724290345 - type: nauc_mrr_at_1_max value: 29.763227602149133 - type: nauc_mrr_at_1_std value: -37.58361433096388 - type: nauc_mrr_at_20_diff1 value: 78.29875275029173 - type: nauc_mrr_at_20_max value: 29.39463895371502 - type: nauc_mrr_at_20_std value: -41.13808938179999 - type: nauc_mrr_at_3_diff1 value: 78.04981713424701 - type: nauc_mrr_at_3_max value: 28.760448174610858 - type: nauc_mrr_at_3_std value: -42.25770370267669 - type: nauc_mrr_at_5_diff1 value: 78.24030781659526 - type: nauc_mrr_at_5_max value: 29.4627965404159 - type: nauc_mrr_at_5_std value: -41.48382971161236 - type: nauc_ndcg_at_1000_diff1 value: 77.63586978346414 - type: nauc_ndcg_at_1000_max value: 28.36041361858413 - type: nauc_ndcg_at_1000_std value: -43.84956631664592 - type: nauc_ndcg_at_100_diff1 value: 77.5782899412669 - type: nauc_ndcg_at_100_max value: 28.175349147299023 - type: nauc_ndcg_at_100_std value: -44.03384730985532 - type: nauc_ndcg_at_10_diff1 value: 77.65612732311726 - type: nauc_ndcg_at_10_max value: 27.447934213310145 - type: nauc_ndcg_at_10_std value: -47.477846933136206 - type: nauc_ndcg_at_1_diff1 value: 79.00375724290345 - type: nauc_ndcg_at_1_max value: 
29.763227602149133 - type: nauc_ndcg_at_1_std value: -37.58361433096388 - type: nauc_ndcg_at_20_diff1 value: 77.6857905925127 - type: nauc_ndcg_at_20_max value: 27.85965135690326 - type: nauc_ndcg_at_20_std value: -46.035623659567534 - type: nauc_ndcg_at_3_diff1 value: 77.20000663124452 - type: nauc_ndcg_at_3_max value: 25.83926946771269 - type: nauc_ndcg_at_3_std value: -48.46047480037077 - type: nauc_ndcg_at_5_diff1 value: 77.47304156996891 - type: nauc_ndcg_at_5_max value: 27.277217473255703 - type: nauc_ndcg_at_5_std value: -48.29036456924513 - type: nauc_precision_at_1000_diff1 value: -44.34289619168728 - type: nauc_precision_at_1000_max value: -3.3267888861609882 - type: nauc_precision_at_1000_std value: 40.7640626789122 - type: nauc_precision_at_100_diff1 value: -44.40180123691582 - type: nauc_precision_at_100_max value: -4.036815279824888 - type: nauc_precision_at_100_std value: 40.258738157948144 - type: nauc_precision_at_10_diff1 value: -40.174969736392725 - type: nauc_precision_at_10_max value: -1.2107921107014503 - type: nauc_precision_at_10_std value: 26.914317558152383 - type: nauc_precision_at_1_diff1 value: 79.00375724290345 - type: nauc_precision_at_1_max value: 29.763227602149133 - type: nauc_precision_at_1_std value: -37.58361433096388 - type: nauc_precision_at_20_diff1 value: -42.997551532370395 - type: nauc_precision_at_20_max value: -2.7260912846581435 - type: nauc_precision_at_20_std value: 33.47494527610656 - type: nauc_precision_at_3_diff1 value: -21.172181060238913 - type: nauc_precision_at_3_max value: 4.5591660958836835 - type: nauc_precision_at_3_std value: 4.474651862429931 - type: nauc_precision_at_5_diff1 value: -33.376618015297154 - type: nauc_precision_at_5_max value: 1.7302644290575764 - type: nauc_precision_at_5_std value: 16.980633045220895 - type: nauc_recall_at_1000_diff1 value: 58.24743045343488 - type: nauc_recall_at_1000_max value: -21.258859048904625 - type: nauc_recall_at_1000_std value: 5.841590725271873 - type: nauc_recall_at_100_diff1 value: 64.62432244425025 - type: nauc_recall_at_100_max value: 11.438889005688548 - type: nauc_recall_at_100_std value: -48.21565456849923 - type: nauc_recall_at_10_diff1 value: 73.84516212868728 - type: nauc_recall_at_10_max value: 21.581336143130912 - type: nauc_recall_at_10_std value: -71.40446430175044 - type: nauc_recall_at_1_diff1 value: 81.04847175933422 - type: nauc_recall_at_1_max value: 20.828021860691376 - type: nauc_recall_at_1_std value: -40.4427741623345 - type: nauc_recall_at_20_diff1 value: 74.07490425440125 - type: nauc_recall_at_20_max value: 22.741699258253938 - type: nauc_recall_at_20_std value: -75.22910750948694 - type: nauc_recall_at_3_diff1 value: 74.81258758793922 - type: nauc_recall_at_3_max value: 19.256464797371688 - type: nauc_recall_at_3_std value: -61.27309744783545 - type: nauc_recall_at_5_diff1 value: 73.49570838483187 - type: nauc_recall_at_5_max value: 22.485129670655922 - type: nauc_recall_at_5_std value: -64.95541946081566 - type: ndcg_at_1 value: 82.49 - type: ndcg_at_10 value: 89.387 - type: ndcg_at_100 value: 90.464 - type: ndcg_at_1000 value: 90.533 - type: ndcg_at_20 value: 90.01599999999999 - type: ndcg_at_3 value: 86.726 - type: ndcg_at_5 value: 88.249 - type: precision_at_1 value: 82.49 - type: precision_at_10 value: 13.543 - type: precision_at_100 value: 1.5350000000000001 - type: precision_at_1000 value: 0.157 - type: precision_at_20 value: 7.185 - type: precision_at_3 value: 37.983 - type: precision_at_5 value: 24.954 - type: recall_at_1 value: 71.61699999999999 - 
type: recall_at_10 value: 96.207 - type: recall_at_100 value: 99.726 - type: recall_at_1000 value: 99.991 - type: recall_at_20 value: 98.188 - type: recall_at_3 value: 88.466 - type: recall_at_5 value: 92.83200000000001 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: main_score value: 55.165421795067815 - type: v_measure value: 55.165421795067815 - type: v_measure_std value: 4.407201142010862 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: main_score value: 64.40104113698271 - type: v_measure value: 64.40104113698271 - type: v_measure_std value: 13.302523246335362 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88 metrics: - type: main_score value: 20.429 - type: map_at_1 value: 4.868 - type: map_at_10 value: 12.27 - type: map_at_100 value: 14.332 - type: map_at_1000 value: 14.625 - type: map_at_20 value: 13.333 - type: map_at_3 value: 8.795 - type: map_at_5 value: 10.392 - type: mrr_at_1 value: 24.0 - type: mrr_at_10 value: 34.65333333333329 - type: mrr_at_100 value: 35.674251079833766 - type: mrr_at_1000 value: 35.73520785942911 - type: mrr_at_20 value: 35.22774876654128 - type: mrr_at_3 value: 31.166666666666664 - type: mrr_at_5 value: 33.281666666666624 - type: nauc_map_at_1000_diff1 value: 17.399043123319522 - type: nauc_map_at_1000_max value: 31.2734183775543 - type: nauc_map_at_1000_std value: 17.077403711100832 - type: nauc_map_at_100_diff1 value: 17.403713887640865 - type: nauc_map_at_100_max value: 31.27377201272501 - type: nauc_map_at_100_std value: 16.87360366282937 - type: nauc_map_at_10_diff1 value: 17.359001538120168 - type: nauc_map_at_10_max value: 30.468920168811948 - type: nauc_map_at_10_std value: 13.380268231544715 - type: nauc_map_at_1_diff1 value: 21.421764472532455 - type: nauc_map_at_1_max value: 22.406495947870948 - type: nauc_map_at_1_std value: 7.278461750059741 - type: nauc_map_at_20_diff1 value: 17.309681501618616 - type: nauc_map_at_20_max value: 30.723309484933736 - type: nauc_map_at_20_std value: 15.103661234366466 - type: nauc_map_at_3_diff1 value: 19.21373088647576 - type: nauc_map_at_3_max value: 28.20473469906757 - type: nauc_map_at_3_std value: 8.112728025403056 - type: nauc_map_at_5_diff1 value: 18.058060387271972 - type: nauc_map_at_5_max value: 30.126841947570814 - type: nauc_map_at_5_std value: 10.52754125285907 - type: nauc_mrr_at_1000_diff1 value: 19.441702934302622 - type: nauc_mrr_at_1000_max value: 25.596393086654306 - type: nauc_mrr_at_1000_std value: 12.03335655261492 - type: nauc_mrr_at_100_diff1 value: 19.45550504725835 - type: nauc_mrr_at_100_max value: 25.616075945406113 - type: nauc_mrr_at_100_std value: 12.064272002353919 - type: nauc_mrr_at_10_diff1 value: 19.439283557585867 - type: nauc_mrr_at_10_max value: 25.630347604493288 - type: nauc_mrr_at_10_std value: 12.031032042077703 - type: nauc_mrr_at_1_diff1 value: 21.522585669781943 - type: nauc_mrr_at_1_max value: 22.47948118859334 - type: nauc_mrr_at_1_std value: 7.382278936017263 - type: nauc_mrr_at_20_diff1 value: 19.41398208318509 - type: nauc_mrr_at_20_max value: 25.627882587061446 - type: nauc_mrr_at_20_std value: 12.073194157092846 - type: nauc_mrr_at_3_diff1 value: 19.605200019472257 - type: 
nauc_mrr_at_3_max value: 25.325244620209876 - type: nauc_mrr_at_3_std value: 9.621890524197736 - type: nauc_mrr_at_5_diff1 value: 19.39540169944071 - type: nauc_mrr_at_5_max value: 25.603584740156034 - type: nauc_mrr_at_5_std value: 11.176904475558963 - type: nauc_ndcg_at_1000_diff1 value: 16.677472512130397 - type: nauc_ndcg_at_1000_max value: 30.803531883263386 - type: nauc_ndcg_at_1000_std value: 24.271183062150264 - type: nauc_ndcg_at_100_diff1 value: 17.36630862763037 - type: nauc_ndcg_at_100_max value: 31.94802140143363 - type: nauc_ndcg_at_100_std value: 23.50492571448407 - type: nauc_ndcg_at_10_diff1 value: 16.96591943739385 - type: nauc_ndcg_at_10_max value: 29.983229462186355 - type: nauc_ndcg_at_10_std value: 16.195748077489096 - type: nauc_ndcg_at_1_diff1 value: 21.522585669781943 - type: nauc_ndcg_at_1_max value: 22.47948118859334 - type: nauc_ndcg_at_1_std value: 7.382278936017263 - type: nauc_ndcg_at_20_diff1 value: 16.95752397256498 - type: nauc_ndcg_at_20_max value: 30.17083071239411 - type: nauc_ndcg_at_20_std value: 18.58280825082001 - type: nauc_ndcg_at_3_diff1 value: 18.84612108439313 - type: nauc_ndcg_at_3_max value: 27.98191818651593 - type: nauc_ndcg_at_3_std value: 9.424277024329921 - type: nauc_ndcg_at_5_diff1 value: 17.508065912086675 - type: nauc_ndcg_at_5_max value: 29.611412732203608 - type: nauc_ndcg_at_5_std value: 12.623793734445126 - type: nauc_precision_at_1000_diff1 value: 6.265199779097322 - type: nauc_precision_at_1000_max value: 20.008066463216657 - type: nauc_precision_at_1000_std value: 35.98021866405677 - type: nauc_precision_at_100_diff1 value: 11.877723135952802 - type: nauc_precision_at_100_max value: 28.979530033834557 - type: nauc_precision_at_100_std value: 33.61448120665875 - type: nauc_precision_at_10_diff1 value: 13.347374773447774 - type: nauc_precision_at_10_max value: 29.532781336663056 - type: nauc_precision_at_10_std value: 20.58195880074721 - type: nauc_precision_at_1_diff1 value: 21.522585669781943 - type: nauc_precision_at_1_max value: 22.47948118859334 - type: nauc_precision_at_1_std value: 7.382278936017263 - type: nauc_precision_at_20_diff1 value: 12.623490622184555 - type: nauc_precision_at_20_max value: 27.985132320790147 - type: nauc_precision_at_20_std value: 24.017624920206707 - type: nauc_precision_at_3_diff1 value: 17.586564287642346 - type: nauc_precision_at_3_max value: 30.03148650786217 - type: nauc_precision_at_3_std value: 10.379451374554094 - type: nauc_precision_at_5_diff1 value: 14.824891223085926 - type: nauc_precision_at_5_max value: 31.410239486293527 - type: nauc_precision_at_5_std value: 15.624402346760954 - type: nauc_recall_at_1000_diff1 value: 6.310837044332995 - type: nauc_recall_at_1000_max value: 20.095529403256776 - type: nauc_recall_at_1000_std value: 36.54872612878018 - type: nauc_recall_at_100_diff1 value: 12.038563848928966 - type: nauc_recall_at_100_max value: 28.986817020127525 - type: nauc_recall_at_100_std value: 33.54721716249713 - type: nauc_recall_at_10_diff1 value: 13.26933896316366 - type: nauc_recall_at_10_max value: 29.38186602785486 - type: nauc_recall_at_10_std value: 20.275621953504526 - type: nauc_recall_at_1_diff1 value: 21.421764472532455 - type: nauc_recall_at_1_max value: 22.406495947870948 - type: nauc_recall_at_1_std value: 7.278461750059741 - type: nauc_recall_at_20_diff1 value: 12.570312459960123 - type: nauc_recall_at_20_max value: 27.709620758158497 - type: nauc_recall_at_20_std value: 23.607200666051515 - type: nauc_recall_at_3_diff1 value: 17.403838471827413 - type: 
nauc_recall_at_3_max value: 30.03567479942994 - type: nauc_recall_at_3_std value: 10.168877039526405 - type: nauc_recall_at_5_diff1 value: 14.617283448905278 - type: nauc_recall_at_5_max value: 31.260794318671316 - type: nauc_recall_at_5_std value: 15.292480271424239 - type: ndcg_at_1 value: 24.0 - type: ndcg_at_10 value: 20.429 - type: ndcg_at_100 value: 28.327999999999996 - type: ndcg_at_1000 value: 33.489999999999995 - type: ndcg_at_20 value: 23.236 - type: ndcg_at_3 value: 19.36 - type: ndcg_at_5 value: 16.866 - type: precision_at_1 value: 24.0 - type: precision_at_10 value: 10.58 - type: precision_at_100 value: 2.196 - type: precision_at_1000 value: 0.344 - type: precision_at_20 value: 6.9 - type: precision_at_3 value: 17.967 - type: precision_at_5 value: 14.74 - type: recall_at_1 value: 4.868 - type: recall_at_10 value: 21.47 - type: recall_at_100 value: 44.622 - type: recall_at_1000 value: 69.777 - type: recall_at_20 value: 28.028 - type: recall_at_3 value: 10.933 - type: recall_at_5 value: 14.948 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: 20a6d6f312dd54037fe07a32d58e5e168867909d metrics: - type: cosine_pearson value: 83.56937382314794 - type: cosine_spearman value: 79.63245426461405 - type: euclidean_pearson value: 81.23038281326936 - type: euclidean_spearman value: 79.63246287500021 - type: main_score value: 79.63245426461405 - type: manhattan_pearson value: 81.22715334724163 - type: manhattan_spearman value: 79.47235517811446 - type: pearson value: 83.56937382314794 - type: spearman value: 79.63245426461405 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cosine_pearson value: 87.94074172378106 - type: cosine_spearman value: 81.49535893255212 - type: euclidean_pearson value: 85.67127466141365 - type: euclidean_spearman value: 81.49519105826656 - type: main_score value: 81.49535893255212 - type: manhattan_pearson value: 85.7939378777207 - type: manhattan_spearman value: 81.68788285150019 - type: pearson value: 87.94074172378106 - type: spearman value: 81.49535893255212 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cosine_pearson value: 83.13868249088958 - type: cosine_spearman value: 84.49255715794354 - type: euclidean_pearson value: 83.94702761019037 - type: euclidean_spearman value: 84.49261181536836 - type: main_score value: 84.49255715794354 - type: manhattan_pearson value: 84.05461037469608 - type: manhattan_spearman value: 84.58504951653568 - type: pearson value: 83.13868249088958 - type: spearman value: 84.49255715794354 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cosine_pearson value: 80.86639951141099 - type: cosine_spearman value: 80.05601661201852 - type: euclidean_pearson value: 80.97495767233256 - type: euclidean_spearman value: 80.05600716279979 - type: main_score value: 80.05601661201852 - type: manhattan_pearson value: 80.68673997093622 - type: manhattan_spearman value: 79.895855702411 - type: pearson value: 80.86639951141099 - type: spearman value: 80.05601661201852 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cosine_pearson value: 84.13791770600066 - 
type: cosine_spearman value: 86.54345663501209 - type: euclidean_pearson value: 85.62978165451675 - type: euclidean_spearman value: 86.54346234593214 - type: main_score value: 86.54345663501209 - type: manhattan_pearson value: 85.3032964455555 - type: manhattan_spearman value: 86.30088652823572 - type: pearson value: 84.13791770600066 - type: spearman value: 86.54345663501209 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cosine_pearson value: 84.40315982722548 - type: cosine_spearman value: 85.40751435377788 - type: euclidean_pearson value: 84.35271010578505 - type: euclidean_spearman value: 85.40751373941698 - type: main_score value: 85.40751435377788 - type: manhattan_pearson value: 84.17785174793401 - type: manhattan_spearman value: 85.23156904732424 - type: pearson value: 84.40315982722548 - type: spearman value: 85.40751435377788 - task: type: STS dataset: name: MTEB STS17 (en-ar) type: mteb/sts17-crosslingual-sts config: en-ar split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 59.98924365555529 - type: cosine_spearman value: 60.12821686053337 - type: euclidean_pearson value: 60.90431312863765 - type: euclidean_spearman value: 60.12821686053337 - type: main_score value: 60.12821686053337 - type: manhattan_pearson value: 59.05369093717122 - type: manhattan_spearman value: 57.65837693471568 - type: pearson value: 59.98924365555529 - type: spearman value: 60.12821686053337 - task: type: STS dataset: name: MTEB STS17 (en-de) type: mteb/sts17-crosslingual-sts config: en-de split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 74.95271349225828 - type: cosine_spearman value: 75.43839974308261 - type: euclidean_pearson value: 75.68179466828151 - type: euclidean_spearman value: 75.43839974308261 - type: main_score value: 75.43839974308261 - type: manhattan_pearson value: 75.4848070012919 - type: manhattan_spearman value: 74.92507658877852 - type: pearson value: 74.95271349225828 - type: spearman value: 75.43839974308261 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 86.18555151297676 - type: cosine_spearman value: 86.40304228488033 - type: euclidean_pearson value: 86.8788548303146 - type: euclidean_spearman value: 86.40304228488033 - type: main_score value: 86.40304228488033 - type: manhattan_pearson value: 86.79312171236047 - type: manhattan_spearman value: 86.26008520753594 - type: pearson value: 86.18555151297676 - type: spearman value: 86.40304228488033 - task: type: STS dataset: name: MTEB STS17 (en-tr) type: mteb/sts17-crosslingual-sts config: en-tr split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 54.99479996647493 - type: cosine_spearman value: 53.67766339389046 - type: euclidean_pearson value: 55.32473081178422 - type: euclidean_spearman value: 53.67766339389046 - type: main_score value: 53.67766339389046 - type: manhattan_pearson value: 54.66604584985125 - type: manhattan_spearman value: 52.48322788533404 - type: pearson value: 54.99479996647493 - type: spearman value: 53.67766339389046 - task: type: STS dataset: name: MTEB STS17 (es-en) type: mteb/sts17-crosslingual-sts config: es-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: 
cosine_pearson value: 76.65184590191937 - type: cosine_spearman value: 78.04569100389011 - type: euclidean_pearson value: 77.11425698246029 - type: euclidean_spearman value: 78.04569100389011 - type: main_score value: 78.04569100389011 - type: manhattan_pearson value: 77.34799982307821 - type: manhattan_spearman value: 78.22975685912238 - type: pearson value: 76.65184590191937 - type: spearman value: 78.04569100389011 - task: type: STS dataset: name: MTEB STS17 (fr-en) type: mteb/sts17-crosslingual-sts config: fr-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 76.30743924244035 - type: cosine_spearman value: 75.2110676227775 - type: euclidean_pearson value: 77.10837892816058 - type: euclidean_spearman value: 75.2110676227775 - type: main_score value: 75.2110676227775 - type: manhattan_pearson value: 76.814009334774 - type: manhattan_spearman value: 74.96159426113054 - type: pearson value: 76.30743924244035 - type: spearman value: 75.2110676227775 - task: type: STS dataset: name: MTEB STS17 (it-en) type: mteb/sts17-crosslingual-sts config: it-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 75.11771819741416 - type: cosine_spearman value: 74.96778304560281 - type: euclidean_pearson value: 75.56941540554674 - type: euclidean_spearman value: 74.96778304560281 - type: main_score value: 74.96778304560281 - type: manhattan_pearson value: 75.18422319871718 - type: manhattan_spearman value: 74.45788102060328 - type: pearson value: 75.11771819741416 - type: spearman value: 74.96778304560281 - task: type: STS dataset: name: MTEB STS17 (nl-en) type: mteb/sts17-crosslingual-sts config: nl-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 72.42454093118816 - type: cosine_spearman value: 71.9097547231894 - type: euclidean_pearson value: 73.04051728705643 - type: euclidean_spearman value: 71.9097547231894 - type: main_score value: 71.9097547231894 - type: manhattan_pearson value: 72.5487755597775 - type: manhattan_spearman value: 71.080265405627 - type: pearson value: 72.42454093118816 - type: spearman value: 71.9097547231894 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 69.3881685924264 - type: cosine_spearman value: 69.37162939123382 - type: euclidean_pearson value: 70.5377770359738 - type: euclidean_spearman value: 69.37162939123382 - type: main_score value: 69.37162939123382 - type: manhattan_pearson value: 70.86501303890763 - type: manhattan_spearman value: 69.54018077011284 - type: pearson value: 69.3881685924264 - type: spearman value: 69.37162939123382 - task: type: STS dataset: name: MTEB STS22 (de-en) type: mteb/sts22-crosslingual-sts config: de-en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 64.64985744446284 - type: cosine_spearman value: 63.89323074678119 - type: euclidean_pearson value: 66.9623010036117 - type: euclidean_spearman value: 63.89323074678119 - type: main_score value: 63.89323074678119 - type: manhattan_pearson value: 68.60076281156398 - type: manhattan_spearman value: 64.80183430943912 - type: pearson value: 64.64985744446284 - type: spearman value: 63.89323074678119 - task: type: STS dataset: name: MTEB STS22 (es-en) type: mteb/sts22-crosslingual-sts config: es-en split: test revision: 
de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 78.95094282575697 - type: cosine_spearman value: 80.66341954222823 - type: euclidean_pearson value: 79.7677956183949 - type: euclidean_spearman value: 80.66341954222823 - type: main_score value: 80.66341954222823 - type: manhattan_pearson value: 81.52201735972797 - type: manhattan_spearman value: 81.65309541429473 - type: pearson value: 78.95094282575697 - type: spearman value: 80.66341954222823 - task: type: STS dataset: name: MTEB STS22 (pl-en) type: mteb/sts22-crosslingual-sts config: pl-en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 77.99167158750629 - type: cosine_spearman value: 77.00326330683939 - type: euclidean_pearson value: 77.60571751826936 - type: euclidean_spearman value: 77.00326330683939 - type: main_score value: 77.00326330683939 - type: manhattan_pearson value: 78.19839585217989 - type: manhattan_spearman value: 78.44894390841364 - type: pearson value: 77.99167158750629 - type: spearman value: 77.00326330683939 - task: type: STS dataset: name: MTEB STS22 (zh-en) type: mteb/sts22-crosslingual-sts config: zh-en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 77.21035942564082 - type: cosine_spearman value: 76.57212143103963 - type: euclidean_pearson value: 78.03973868360728 - type: euclidean_spearman value: 76.57212143103963 - type: main_score value: 76.57212143103963 - type: manhattan_pearson value: 78.16591898142042 - type: manhattan_spearman value: 76.83958214147293 - type: pearson value: 77.21035942564082 - type: spearman value: 76.57212143103963 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cosine_pearson value: 81.21615375003084 - type: cosine_spearman value: 84.2970803211202 - type: euclidean_pearson value: 83.54765755364517 - type: euclidean_spearman value: 84.2970803211202 - type: main_score value: 84.2970803211202 - type: manhattan_pearson value: 83.2769664077453 - type: manhattan_spearman value: 84.09545601307758 - type: pearson value: 81.21615375003084 - type: spearman value: 84.2970803211202 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: main_score value: 80.72245608609909 - type: map value: 80.72245608609909 - type: mrr value: 94.86804408373035 - type: nAUC_map_diff1 value: 3.565293868431913 - type: nAUC_map_max value: 53.87118155384518 - type: nAUC_map_std value: 
69.73850807835032 - type: nAUC_mrr_diff1 value: 48.33938058863373 - type: nAUC_mrr_max value: 82.0796869926262 - type: nAUC_mrr_std value: 79.20228314778093 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: main_score value: 72.604 - type: map_at_1 value: 57.05 - type: map_at_10 value: 68.026 - type: map_at_100 value: 68.54299999999999 - type: map_at_1000 value: 68.56 - type: map_at_20 value: 68.329 - type: map_at_3 value: 65.565 - type: map_at_5 value: 66.81899999999999 - type: mrr_at_1 value: 60.0 - type: mrr_at_10 value: 68.97116402116401 - type: mrr_at_100 value: 69.43171438050388 - type: mrr_at_1000 value: 69.44900642374887 - type: mrr_at_20 value: 69.25799802049801 - type: mrr_at_3 value: 67.11111111111111 - type: mrr_at_5 value: 68.27777777777779 - type: nauc_map_at_1000_diff1 value: 66.45098144160822 - type: nauc_map_at_1000_max value: 52.26713946112144 - type: nauc_map_at_1000_std value: -3.2435941711161194 - type: nauc_map_at_100_diff1 value: 66.45069255591629 - type: nauc_map_at_100_max value: 52.277529223166994 - type: nauc_map_at_100_std value: -3.236289003540743 - type: nauc_map_at_10_diff1 value: 66.50847900934123 - type: nauc_map_at_10_max value: 52.56336813799116 - type: nauc_map_at_10_std value: -3.2225840417202547 - type: nauc_map_at_1_diff1 value: 69.8066007922827 - type: nauc_map_at_1_max value: 46.19700236373352 - type: nauc_map_at_1_std value: -11.167127232139137 - type: nauc_map_at_20_diff1 value: 66.49775686319742 - type: nauc_map_at_20_max value: 52.31488178119375 - type: nauc_map_at_20_std value: -3.528866881477926 - type: nauc_map_at_3_diff1 value: 67.0124735448113 - type: nauc_map_at_3_max value: 51.47207513467635 - type: nauc_map_at_3_std value: -4.688170694240992 - type: nauc_map_at_5_diff1 value: 66.37338579400031 - type: nauc_map_at_5_max value: 51.03182506884805 - type: nauc_map_at_5_std value: -4.090110073585303 - type: nauc_mrr_at_1000_diff1 value: 66.13316468798861 - type: nauc_mrr_at_1000_max value: 53.18661162667272 - type: nauc_mrr_at_1000_std value: -1.1549432899803578 - type: nauc_mrr_at_100_diff1 value: 66.13308912088833 - type: nauc_mrr_at_100_max value: 53.196523181344176 - type: nauc_mrr_at_100_std value: -1.148961396684306 - type: nauc_mrr_at_10_diff1 value: 66.11198414850364 - type: nauc_mrr_at_10_max value: 53.45434553493992 - type: nauc_mrr_at_10_std value: -1.0202103385535555 - type: nauc_mrr_at_1_diff1 value: 69.18818640546156 - type: nauc_mrr_at_1_max value: 50.224102107450285 - type: nauc_mrr_at_1_std value: -4.4508756307510104 - type: nauc_mrr_at_20_diff1 value: 66.12038286624204 - type: nauc_mrr_at_20_max value: 53.23900442821744 - type: nauc_mrr_at_20_std value: -1.3453691424031584 - type: nauc_mrr_at_3_diff1 value: 66.23482655095762 - type: nauc_mrr_at_3_max value: 53.519304370411625 - type: nauc_mrr_at_3_std value: -1.0512555098049736 - type: nauc_mrr_at_5_diff1 value: 65.63605277411375 - type: nauc_mrr_at_5_max value: 53.17390536531564 - type: nauc_mrr_at_5_std value: -0.5198682324341892 - type: nauc_ndcg_at_1000_diff1 value: 65.85075826609345 - type: nauc_ndcg_at_1000_max value: 53.814329968179045 - type: nauc_ndcg_at_1000_std value: -0.9856729250792472 - type: nauc_ndcg_at_100_diff1 value: 65.78229528993444 - type: nauc_ndcg_at_100_max value: 54.1747645815977 - type: nauc_ndcg_at_100_std value: -0.47502756295876847 - type: nauc_ndcg_at_10_diff1 value: 66.00876580480991 - type: nauc_ndcg_at_10_max value: 
55.06235713538037 - type: nauc_ndcg_at_10_std value: -1.5534145585575012 - type: nauc_ndcg_at_1_diff1 value: 69.18818640546156 - type: nauc_ndcg_at_1_max value: 50.224102107450285 - type: nauc_ndcg_at_1_std value: -4.4508756307510104 - type: nauc_ndcg_at_20_diff1 value: 65.95831573232856 - type: nauc_ndcg_at_20_max value: 54.24206688010573 - type: nauc_ndcg_at_20_std value: -2.705254164112238 - type: nauc_ndcg_at_3_diff1 value: 66.14046065126678 - type: nauc_ndcg_at_3_max value: 54.07332075118414 - type: nauc_ndcg_at_3_std value: -2.0119140501882793 - type: nauc_ndcg_at_5_diff1 value: 65.21102868019805 - type: nauc_ndcg_at_5_max value: 52.596880916483165 - type: nauc_ndcg_at_5_std value: -2.1720193236802023 - type: nauc_precision_at_1000_diff1 value: -21.99504940846271 - type: nauc_precision_at_1000_max value: 19.25403291298791 - type: nauc_precision_at_1000_std value: 46.296476764054404 - type: nauc_precision_at_100_diff1 value: -11.741691903205695 - type: nauc_precision_at_100_max value: 25.699636707900623 - type: nauc_precision_at_100_std value: 43.96233624765463 - type: nauc_precision_at_10_diff1 value: 11.568895847591932 - type: nauc_precision_at_10_max value: 39.43006347212197 - type: nauc_precision_at_10_std value: 28.751839941496836 - type: nauc_precision_at_1_diff1 value: 69.18818640546156 - type: nauc_precision_at_1_max value: 50.224102107450285 - type: nauc_precision_at_1_std value: -4.4508756307510104 - type: nauc_precision_at_20_diff1 value: 4.854833212085455 - type: nauc_precision_at_20_max value: 34.19851755381116 - type: nauc_precision_at_20_std value: 28.728626880402068 - type: nauc_precision_at_3_diff1 value: 35.04823458092479 - type: nauc_precision_at_3_max value: 47.8670338954734 - type: nauc_precision_at_3_std value: 19.389299130775157 - type: nauc_precision_at_5_diff1 value: 25.605002849466736 - type: nauc_precision_at_5_max value: 43.50575999348689 - type: nauc_precision_at_5_std value: 24.80257266140189 - type: nauc_recall_at_1000_diff1 value: 55.07703081232429 - type: nauc_recall_at_1000_max value: 70.71661998132596 - type: nauc_recall_at_1000_std value: 64.58916900093288 - type: nauc_recall_at_100_diff1 value: 59.97732426303837 - type: nauc_recall_at_100_max value: 71.64532479658504 - type: nauc_recall_at_100_std value: 37.87515006002412 - type: nauc_recall_at_10_diff1 value: 64.45621875630812 - type: nauc_recall_at_10_max value: 64.72171592433827 - type: nauc_recall_at_10_std value: 0.9026532647803642 - type: nauc_recall_at_1_diff1 value: 69.8066007922827 - type: nauc_recall_at_1_max value: 46.19700236373352 - type: nauc_recall_at_1_std value: -11.167127232139137 - type: nauc_recall_at_20_diff1 value: 63.79448821637328 - type: nauc_recall_at_20_max value: 61.597381158568524 - type: nauc_recall_at_20_std value: -7.27449509788767 - type: nauc_recall_at_3_diff1 value: 64.75442031192492 - type: nauc_recall_at_3_max value: 56.12106077054382 - type: nauc_recall_at_3_std value: -2.661587128227682 - type: nauc_recall_at_5_diff1 value: 60.82940800383688 - type: nauc_recall_at_5_max value: 53.647222430433736 - type: nauc_recall_at_5_std value: -0.793229884870239 - type: ndcg_at_1 value: 60.0 - type: ndcg_at_10 value: 72.604 - type: ndcg_at_100 value: 74.83800000000001 - type: ndcg_at_1000 value: 75.27199999999999 - type: ndcg_at_20 value: 73.599 - type: ndcg_at_3 value: 68.509 - type: ndcg_at_5 value: 70.352 - type: precision_at_1 value: 60.0 - type: precision_at_10 value: 9.733 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.11199999999999999 - 
type: precision_at_20 value: 5.067 - type: precision_at_3 value: 27.444000000000003 - type: precision_at_5 value: 17.666999999999998 - type: recall_at_1 value: 57.05 - type: recall_at_10 value: 85.422 - type: recall_at_100 value: 95.333 - type: recall_at_1000 value: 98.667 - type: recall_at_20 value: 89.156 - type: recall_at_3 value: 74.211 - type: recall_at_5 value: 79.094 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cosine_accuracy value: 99.76237623762376 - type: cosine_accuracy_threshold value: 89.08973932266235 - type: cosine_ap value: 93.82184396471453 - type: cosine_f1 value: 87.87878787878789 - type: cosine_f1_threshold value: 89.08973932266235 - type: cosine_precision value: 88.77551020408163 - type: cosine_recall value: 87.0 - type: dot_accuracy value: 99.76237623762376 - type: dot_accuracy_threshold value: 89.08973932266235 - type: dot_ap value: 93.82179339271785 - type: dot_f1 value: 87.87878787878789 - type: dot_f1_threshold value: 89.08973932266235 - type: dot_precision value: 88.77551020408163 - type: dot_recall value: 87.0 - type: euclidean_accuracy value: 99.76237623762376 - type: euclidean_accuracy_threshold value: 46.71244025230408 - type: euclidean_ap value: 93.82184396471453 - type: euclidean_f1 value: 87.87878787878789 - type: euclidean_f1_threshold value: 46.71244025230408 - type: euclidean_precision value: 88.77551020408163 - type: euclidean_recall value: 87.0 - type: main_score value: 94.18170827750167 - type: manhattan_accuracy value: 99.77425742574258 - type: manhattan_accuracy_threshold value: 1095.131492614746 - type: manhattan_ap value: 94.18170827750167 - type: manhattan_f1 value: 88.45577211394303 - type: manhattan_f1_threshold value: 1108.85648727417 - type: manhattan_precision value: 88.41158841158841 - type: manhattan_recall value: 88.5 - type: max_ap value: 94.18170827750167 - type: max_f1 value: 88.45577211394303 - type: max_precision value: 88.77551020408163 - type: max_recall value: 88.5 - type: similarity_accuracy value: 99.76237623762376 - type: similarity_accuracy_threshold value: 89.08973932266235 - type: similarity_ap value: 93.82184396471453 - type: similarity_f1 value: 87.87878787878789 - type: similarity_f1_threshold value: 89.08973932266235 - type: similarity_precision value: 88.77551020408163 - type: similarity_recall value: 87.0 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: main_score value: 65.93583959980214 - type: v_measure value: 65.93583959980214 - type: v_measure_std value: 3.9403815544270233 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: main_score value: 35.594885571404724 - type: v_measure value: 35.594885571404724 - type: v_measure_std value: 1.5163847345337254 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: main_score value: 49.85213562933509 - type: map value: 49.85213562933509 - type: mrr value: 50.62702922077922 - type: nAUC_map_diff1 value: 36.55011836042864 - type: 
nAUC_map_max value: 13.45991062036654 - type: nAUC_map_std value: 10.192881915639742 - type: nAUC_mrr_diff1 value: 37.058265888016976 - type: nAUC_mrr_max value: 14.081819232783383 - type: nAUC_mrr_std value: 11.215978874656958 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cosine_pearson value: 25.349220308622627 - type: cosine_spearman value: 27.880975911253458 - type: dot_pearson value: 25.349197273883224 - type: dot_spearman value: 27.880903951553655 - type: main_score value: 27.880975911253458 - type: pearson value: 25.349220308622627 - type: spearman value: 27.880975911253458 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: bb9466bac8153a0349341eb1b22e06409e78ef4e metrics: - type: main_score value: 79.536 - type: map_at_1 value: 0.215 - type: map_at_10 value: 2.048 - type: map_at_100 value: 12.842999999999998 - type: map_at_1000 value: 31.032 - type: map_at_20 value: 3.8379999999999996 - type: map_at_3 value: 0.64 - type: map_at_5 value: 1.052 - type: mrr_at_1 value: 84.0 - type: mrr_at_10 value: 91.16666666666666 - type: mrr_at_100 value: 91.16666666666666 - type: mrr_at_1000 value: 91.16666666666666 - type: mrr_at_20 value: 91.16666666666666 - type: mrr_at_3 value: 90.66666666666666 - type: mrr_at_5 value: 91.16666666666666 - type: nauc_map_at_1000_diff1 value: -18.530580290412697 - type: nauc_map_at_1000_max value: 43.14744028154331 - type: nauc_map_at_1000_std value: 79.6699665194256 - type: nauc_map_at_100_diff1 value: -21.271315814062437 - type: nauc_map_at_100_max value: 17.55081814849073 - type: nauc_map_at_100_std value: 48.17729810787553 - type: nauc_map_at_10_diff1 value: -11.002124943974252 - type: nauc_map_at_10_max value: -9.6495971981689 - type: nauc_map_at_10_std value: 6.648364965330221 - type: nauc_map_at_1_diff1 value: 0.1251393811417004 - type: nauc_map_at_1_max value: -12.601700488498643 - type: nauc_map_at_1_std value: -3.5018878780762366 - type: nauc_map_at_20_diff1 value: -19.526191160714987 - type: nauc_map_at_20_max value: -4.175483070077258 - type: nauc_map_at_20_std value: 16.014345473073693 - type: nauc_map_at_3_diff1 value: -0.8632406748675692 - type: nauc_map_at_3_max value: -12.9654502212951 - type: nauc_map_at_3_std value: -1.5551804410996426 - type: nauc_map_at_5_diff1 value: -9.294941718115151 - type: nauc_map_at_5_max value: -12.795655812948572 - type: nauc_map_at_5_std value: 0.6128051906803516 - type: nauc_mrr_at_1000_diff1 value: 33.997935217447434 - type: nauc_mrr_at_1000_max value: 41.160149696734955 - type: nauc_mrr_at_1000_std value: 27.657024869568446 - type: nauc_mrr_at_100_diff1 value: 33.997935217447434 - type: nauc_mrr_at_100_max value: 41.160149696734955 - type: nauc_mrr_at_100_std value: 27.657024869568446 - type: nauc_mrr_at_10_diff1 value: 33.997935217447434 - type: nauc_mrr_at_10_max value: 41.160149696734955 - type: nauc_mrr_at_10_std value: 27.657024869568446 - type: nauc_mrr_at_1_diff1 value: 37.279086892488884 - type: nauc_mrr_at_1_max value: 43.292832596956316 - type: nauc_mrr_at_1_std value: 20.305596465390227 - type: nauc_mrr_at_20_diff1 value: 33.997935217447434 - type: nauc_mrr_at_20_max value: 41.160149696734955 - type: nauc_mrr_at_20_std value: 27.657024869568446 - type: nauc_mrr_at_3_diff1 value: 31.138610414926326 - type: nauc_mrr_at_3_max value: 39.545043163464186 - type: nauc_mrr_at_3_std value: 31.70252018936244 - type: 
nauc_mrr_at_5_diff1 value: 33.997935217447434 - type: nauc_mrr_at_5_max value: 41.160149696734955 - type: nauc_mrr_at_5_std value: 27.657024869568446 - type: nauc_ndcg_at_1000_diff1 value: -20.948326611476556 - type: nauc_ndcg_at_1000_max value: 36.766927406101956 - type: nauc_ndcg_at_1000_std value: 75.32635798841658 - type: nauc_ndcg_at_100_diff1 value: -14.54815381092273 - type: nauc_ndcg_at_100_max value: 51.38801585344711 - type: nauc_ndcg_at_100_std value: 76.47002281413397 - type: nauc_ndcg_at_10_diff1 value: -12.80351464937073 - type: nauc_ndcg_at_10_max value: 35.71831279387225 - type: nauc_ndcg_at_10_std value: 52.15347275643156 - type: nauc_ndcg_at_1_diff1 value: 20.42160737812909 - type: nauc_ndcg_at_1_max value: 34.20619235836624 - type: nauc_ndcg_at_1_std value: 13.088179936005965 - type: nauc_ndcg_at_20_diff1 value: -18.116251292365128 - type: nauc_ndcg_at_20_max value: 46.9808896232964 - type: nauc_ndcg_at_20_std value: 61.73761431506857 - type: nauc_ndcg_at_3_diff1 value: -4.44558396286013 - type: nauc_ndcg_at_3_max value: 26.953553278525938 - type: nauc_ndcg_at_3_std value: 33.375410187254786 - type: nauc_ndcg_at_5_diff1 value: -15.495190925371652 - type: nauc_ndcg_at_5_max value: 29.21035888164427 - type: nauc_ndcg_at_5_std value: 41.168078957076396 - type: nauc_precision_at_1000_diff1 value: 6.339888107354097 - type: nauc_precision_at_1000_max value: 51.87294743895088 - type: nauc_precision_at_1000_std value: 49.22667294372217 - type: nauc_precision_at_100_diff1 value: -10.245901160105356 - type: nauc_precision_at_100_max value: 56.07707608097002 - type: nauc_precision_at_100_std value: 78.96626562096216 - type: nauc_precision_at_10_diff1 value: -4.590219332829025 - type: nauc_precision_at_10_max value: 47.52908614003191 - type: nauc_precision_at_10_std value: 59.53043786106239 - type: nauc_precision_at_1_diff1 value: 37.279086892488884 - type: nauc_precision_at_1_max value: 43.292832596956316 - type: nauc_precision_at_1_std value: 20.305596465390227 - type: nauc_precision_at_20_diff1 value: -14.763079024242392 - type: nauc_precision_at_20_max value: 56.25820402898436 - type: nauc_precision_at_20_std value: 67.6952843431086 - type: nauc_precision_at_3_diff1 value: 2.9292734630949067 - type: nauc_precision_at_3_max value: 41.296148445888285 - type: nauc_precision_at_3_std value: 46.551771604768255 - type: nauc_precision_at_5_diff1 value: -15.368719472623535 - type: nauc_precision_at_5_max value: 39.706937186186984 - type: nauc_precision_at_5_std value: 45.991734125764275 - type: nauc_recall_at_1000_diff1 value: -18.70157967410686 - type: nauc_recall_at_1000_max value: 27.303031147629746 - type: nauc_recall_at_1000_std value: 63.59247900235757 - type: nauc_recall_at_100_diff1 value: -21.505202598262795 - type: nauc_recall_at_100_max value: 3.1053955846040666 - type: nauc_recall_at_100_std value: 35.59388419574821 - type: nauc_recall_at_10_diff1 value: -13.309140466736356 - type: nauc_recall_at_10_max value: -16.90482412154473 - type: nauc_recall_at_10_std value: 2.1355678490728542 - type: nauc_recall_at_1_diff1 value: 0.1251393811417004 - type: nauc_recall_at_1_max value: -12.601700488498643 - type: nauc_recall_at_1_std value: -3.5018878780762366 - type: nauc_recall_at_20_diff1 value: -21.303497421292096 - type: nauc_recall_at_20_max value: -13.765429909809388 - type: nauc_recall_at_20_std value: 9.07482009539061 - type: nauc_recall_at_3_diff1 value: -6.017177782774693 - type: nauc_recall_at_3_max value: -19.064966459546255 - type: nauc_recall_at_3_std value: 
-3.0227410013796967 - type: nauc_recall_at_5_diff1 value: -14.078289790672653 - type: nauc_recall_at_5_max value: -19.52038684292809 - type: nauc_recall_at_5_std value: -2.6267198328675994 - type: ndcg_at_1 value: 78.0 - type: ndcg_at_10 value: 79.536 - type: ndcg_at_100 value: 62.65500000000001 - type: ndcg_at_1000 value: 56.359 - type: ndcg_at_20 value: 77.561 - type: ndcg_at_3 value: 80.296 - type: ndcg_at_5 value: 79.806 - type: precision_at_1 value: 84.0 - type: precision_at_10 value: 85.6 - type: precision_at_100 value: 64.92 - type: precision_at_1000 value: 24.89 - type: precision_at_20 value: 83.2 - type: precision_at_3 value: 87.333 - type: precision_at_5 value: 87.2 - type: recall_at_1 value: 0.215 - type: recall_at_10 value: 2.246 - type: recall_at_100 value: 15.784 - type: recall_at_1000 value: 53.427 - type: recall_at_20 value: 4.281 - type: recall_at_3 value: 0.688 - type: recall_at_5 value: 1.142 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: main_score value: 31.186999999999998 - type: map_at_1 value: 3.4070000000000005 - type: map_at_10 value: 13.313 - type: map_at_100 value: 19.900000000000002 - type: map_at_1000 value: 21.437 - type: map_at_20 value: 15.714 - type: map_at_3 value: 6.923 - type: map_at_5 value: 9.054 - type: mrr_at_1 value: 44.89795918367347 - type: mrr_at_10 value: 56.63832199546485 - type: mrr_at_100 value: 57.666166033512965 - type: mrr_at_1000 value: 57.666166033512965 - type: mrr_at_20 value: 57.51229496127455 - type: mrr_at_3 value: 53.40136054421768 - type: mrr_at_5 value: 55.1360544217687 - type: nauc_map_at_1000_diff1 value: 6.929929189103678 - type: nauc_map_at_1000_max value: -20.5925373398606 - type: nauc_map_at_1000_std value: 7.835669658058121 - type: nauc_map_at_100_diff1 value: 7.528899533894891 - type: nauc_map_at_100_max value: -21.032199268806018 - type: nauc_map_at_100_std value: 5.370650925959299 - type: nauc_map_at_10_diff1 value: 14.176770339374578 - type: nauc_map_at_10_max value: -19.194036092916633 - type: nauc_map_at_10_std value: -14.964801890692026 - type: nauc_map_at_1_diff1 value: 16.059944358241733 - type: nauc_map_at_1_max value: -25.302527766801695 - type: nauc_map_at_1_std value: -13.565207797491604 - type: nauc_map_at_20_diff1 value: 11.361043123465297 - type: nauc_map_at_20_max value: -18.0301938420575 - type: nauc_map_at_20_std value: -7.25573010108597 - type: nauc_map_at_3_diff1 value: 21.973707928327727 - type: nauc_map_at_3_max value: -20.079194093834058 - type: nauc_map_at_3_std value: -20.173080790091422 - type: nauc_map_at_5_diff1 value: 19.669071376698206 - type: nauc_map_at_5_max value: -23.679751632414845 - type: nauc_map_at_5_std value: -20.28001860761147 - type: nauc_mrr_at_1000_diff1 value: 6.875737447320781 - type: nauc_mrr_at_1000_max value: -44.8769334243922 - type: nauc_mrr_at_1000_std value: 7.361962913444513 - type: nauc_mrr_at_100_diff1 value: 6.875737447320781 - type: nauc_mrr_at_100_max value: -44.8769334243922 - type: nauc_mrr_at_100_std value: 7.361962913444513 - type: nauc_mrr_at_10_diff1 value: 6.574806453972689 - type: nauc_mrr_at_10_max value: -47.267277277496596 - type: nauc_mrr_at_10_std value: 8.783148855636174 - type: nauc_mrr_at_1_diff1 value: 12.940754496022242 - type: nauc_mrr_at_1_max value: -35.544013626458145 - type: nauc_mrr_at_1_std value: 6.0616339439628915 - type: nauc_mrr_at_20_diff1 value: 7.179017109424859 - type: nauc_mrr_at_20_max value: 
-45.52183055340191 - type: nauc_mrr_at_20_std value: 6.960503593984209 - type: nauc_mrr_at_3_diff1 value: 2.10431985300728 - type: nauc_mrr_at_3_max value: -41.662819302741184 - type: nauc_mrr_at_3_std value: 5.68448693989341 - type: nauc_mrr_at_5_diff1 value: 5.25929369032379 - type: nauc_mrr_at_5_max value: -44.62592534259141 - type: nauc_mrr_at_5_std value: 6.26151671868977 - type: nauc_ndcg_at_1000_diff1 value: -6.563466320842519 - type: nauc_ndcg_at_1000_max value: -33.15200693567147 - type: nauc_ndcg_at_1000_std value: 29.09290649197198 - type: nauc_ndcg_at_100_diff1 value: -4.290185637900728 - type: nauc_ndcg_at_100_max value: -35.6991058391752 - type: nauc_ndcg_at_100_std value: 24.47606141799262 - type: nauc_ndcg_at_10_diff1 value: 4.171305930645993 - type: nauc_ndcg_at_10_max value: -33.02156808389195 - type: nauc_ndcg_at_10_std value: -0.7115167969929295 - type: nauc_ndcg_at_1_diff1 value: 4.295135743080979 - type: nauc_ndcg_at_1_max value: -30.841816609035575 - type: nauc_ndcg_at_1_std value: 11.08702259742227 - type: nauc_ndcg_at_20_diff1 value: 5.716130418772172 - type: nauc_ndcg_at_20_max value: -32.02017772879846 - type: nauc_ndcg_at_20_std value: 0.42043490374547515 - type: nauc_ndcg_at_3_diff1 value: 0.7696408676847266 - type: nauc_ndcg_at_3_max value: -28.19446012238678 - type: nauc_ndcg_at_3_std value: 1.4270173161697919 - type: nauc_ndcg_at_5_diff1 value: 4.011877087450832 - type: nauc_ndcg_at_5_max value: -35.474817068811866 - type: nauc_ndcg_at_5_std value: -1.0183501951460643 - type: nauc_precision_at_1000_diff1 value: -18.852617887278956 - type: nauc_precision_at_1000_max value: 26.536677685298997 - type: nauc_precision_at_1000_std value: 31.17777014427175 - type: nauc_precision_at_100_diff1 value: -21.993356262198738 - type: nauc_precision_at_100_max value: -14.151354806872973 - type: nauc_precision_at_100_std value: 68.01931004336306 - type: nauc_precision_at_10_diff1 value: 3.518175306600991 - type: nauc_precision_at_10_max value: -34.29876549408336 - type: nauc_precision_at_10_std value: 8.571886047048881 - type: nauc_precision_at_1_diff1 value: 12.940754496022242 - type: nauc_precision_at_1_max value: -35.544013626458145 - type: nauc_precision_at_1_std value: 6.0616339439628915 - type: nauc_precision_at_20_diff1 value: 6.23454071647187 - type: nauc_precision_at_20_max value: -29.16565290719762 - type: nauc_precision_at_20_std value: 25.567483624610297 - type: nauc_precision_at_3_diff1 value: 8.77511441582519 - type: nauc_precision_at_3_max value: -29.389312907952135 - type: nauc_precision_at_3_std value: -6.397150206890867 - type: nauc_precision_at_5_diff1 value: 9.795445750266063 - type: nauc_precision_at_5_max value: -38.88827845334236 - type: nauc_precision_at_5_std value: -3.397760151003072 - type: nauc_recall_at_1000_diff1 value: -28.033327034031043 - type: nauc_recall_at_1000_max value: -15.30930042500693 - type: nauc_recall_at_1000_std value: 69.27496829698434 - type: nauc_recall_at_100_diff1 value: -12.558500592244782 - type: nauc_recall_at_100_max value: -27.109814142314832 - type: nauc_recall_at_100_std value: 40.23660136119213 - type: nauc_recall_at_10_diff1 value: 8.859020421080002 - type: nauc_recall_at_10_max value: -26.101835112681034 - type: nauc_recall_at_10_std value: -12.02508230851673 - type: nauc_recall_at_1_diff1 value: 16.059944358241733 - type: nauc_recall_at_1_max value: -25.302527766801695 - type: nauc_recall_at_1_std value: -13.565207797491604 - type: nauc_recall_at_20_diff1 value: 6.598503996413421 - type: nauc_recall_at_20_max 
value: -25.661355219947264 - type: nauc_recall_at_20_std value: -0.5270972932429998 - type: nauc_recall_at_3_diff1 value: 15.848752699477423 - type: nauc_recall_at_3_max value: -20.67227958185249 - type: nauc_recall_at_3_std value: -19.687883601951533 - type: nauc_recall_at_5_diff1 value: 15.210234895525055 - type: nauc_recall_at_5_max value: -30.20253332454299 - type: nauc_recall_at_5_std value: -19.986130369906242 - type: ndcg_at_1 value: 40.816 - type: ndcg_at_10 value: 31.186999999999998 - type: ndcg_at_100 value: 42.742000000000004 - type: ndcg_at_1000 value: 53.230999999999995 - type: ndcg_at_20 value: 31.057000000000002 - type: ndcg_at_3 value: 34.382000000000005 - type: ndcg_at_5 value: 32.038 - type: precision_at_1 value: 44.897999999999996 - type: precision_at_10 value: 27.143 - type: precision_at_100 value: 8.735 - type: precision_at_1000 value: 1.59 - type: precision_at_20 value: 19.898 - type: precision_at_3 value: 34.694 - type: precision_at_5 value: 31.019999999999996 - type: recall_at_1 value: 3.4070000000000005 - type: recall_at_10 value: 19.987 - type: recall_at_100 value: 52.888999999999996 - type: recall_at_1000 value: 85.172 - type: recall_at_20 value: 27.025 - type: recall_at_3 value: 7.774 - type: recall_at_5 value: 11.571 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 89.0380859375 - type: ap value: 34.26536468203791 - type: ap_weighted value: 34.26536468203791 - type: f1 value: 73.86921962038298 - type: f1_weighted value: 90.61132302248866 - type: main_score value: 89.0380859375 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 76.49405772495754 - type: f1 value: 76.73610452546936 - type: f1_weighted value: 76.14362047024868 - type: main_score value: 76.49405772495754 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: main_score value: 49.554702818248735 - type: v_measure value: 49.554702818248735 - type: v_measure_std value: 0.9278298624304031 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cosine_accuracy value: 85.29534481730941 - type: cosine_accuracy_threshold value: 90.6567394733429 - type: cosine_ap value: 71.59976408272617 - type: cosine_f1 value: 66.54452180285818 - type: cosine_f1_threshold value: 88.94971013069153 - type: cosine_precision value: 61.95133045258131 - type: cosine_recall value: 71.87335092348285 - type: dot_accuracy value: 85.29534481730941 - type: dot_accuracy_threshold value: 90.65674543380737 - type: dot_ap value: 71.5997871796046 - type: dot_f1 value: 66.54452180285818 - type: dot_f1_threshold value: 88.94971013069153 - type: dot_precision value: 61.95133045258131 - type: dot_recall value: 71.87335092348285 - type: euclidean_accuracy value: 85.29534481730941 - type: euclidean_accuracy_threshold value: 43.2279109954834 - type: euclidean_ap value: 71.59977967634174 - type: euclidean_f1 value: 66.54452180285818 - type: euclidean_f1_threshold value: 
47.01125621795654 - type: euclidean_precision value: 61.95133045258131 - type: euclidean_recall value: 71.87335092348285 - type: main_score value: 71.5997871796046 - type: manhattan_accuracy value: 85.1820945341837 - type: manhattan_accuracy_threshold value: 1019.9851989746094 - type: manhattan_ap value: 71.22149639016482 - type: manhattan_f1 value: 66.31834750911301 - type: manhattan_f1_threshold value: 1109.6149444580078 - type: manhattan_precision value: 61.46396396396396 - type: manhattan_recall value: 72.00527704485488 - type: max_ap value: 71.5997871796046 - type: max_f1 value: 66.54452180285818 - type: max_precision value: 61.95133045258131 - type: max_recall value: 72.00527704485488 - type: similarity_accuracy value: 85.29534481730941 - type: similarity_accuracy_threshold value: 90.6567394733429 - type: similarity_ap value: 71.59976408272617 - type: similarity_f1 value: 66.54452180285818 - type: similarity_f1_threshold value: 88.94971013069153 - type: similarity_precision value: 61.95133045258131 - type: similarity_recall value: 71.87335092348285 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cosine_accuracy value: 89.19936352699189 - type: cosine_accuracy_threshold value: 87.72701621055603 - type: cosine_ap value: 86.32764736710979 - type: cosine_f1 value: 78.40269966254218 - type: cosine_f1_threshold value: 86.80565357208252 - type: cosine_precision value: 76.41426692004093 - type: cosine_recall value: 80.49738219895288 - type: dot_accuracy value: 89.19936352699189 - type: dot_accuracy_threshold value: 87.72701621055603 - type: dot_ap value: 86.32762879051161 - type: dot_f1 value: 78.40269966254218 - type: dot_f1_threshold value: 86.80565357208252 - type: dot_precision value: 76.41426692004093 - type: dot_recall value: 80.49738219895288 - type: euclidean_accuracy value: 89.19936352699189 - type: euclidean_accuracy_threshold value: 49.54388439655304 - type: euclidean_ap value: 86.3276630523782 - type: euclidean_f1 value: 78.40269966254218 - type: euclidean_f1_threshold value: 51.36992931365967 - type: euclidean_precision value: 76.41426692004093 - type: euclidean_recall value: 80.49738219895288 - type: main_score value: 86.3276630523782 - type: manhattan_accuracy value: 89.16637559669344 - type: manhattan_accuracy_threshold value: 1150.1700401306152 - type: manhattan_ap value: 86.28674414277404 - type: manhattan_f1 value: 78.34183768482997 - type: manhattan_f1_threshold value: 1213.088321685791 - type: manhattan_precision value: 75.87475651107424 - type: manhattan_recall value: 80.97474591931014 - type: max_ap value: 86.3276630523782 - type: max_f1 value: 78.40269966254218 - type: max_precision value: 76.41426692004093 - type: max_recall value: 80.97474591931014 - type: similarity_accuracy value: 89.19936352699189 - type: similarity_accuracy_threshold value: 87.72701621055603 - type: similarity_ap value: 86.32764736710979 - type: similarity_f1 value: 78.40269966254218 - type: similarity_f1_threshold value: 86.80565357208252 - type: similarity_precision value: 76.41426692004093 - type: similarity_recall value: 80.49738219895288 - task: type: STS dataset: name: MTEB AFQMC type: C-MTEB/AFQMC config: default split: validation revision: b44c3b011063adb25877c13823db83bb193913c4 metrics: - type: cosine_pearson value: 38.59465613249044 - type: cosine_spearman value: 39.876884773191065 - type: euclidean_pearson value: 
38.370163017159996 - type: euclidean_spearman value: 39.87692498028858 - type: main_score value: 39.876884773191065 - type: manhattan_pearson value: 38.058013850119785 - type: manhattan_spearman value: 39.531271872106856 - type: pearson value: 38.59465613249044 - type: spearman value: 39.876884773191065 - task: type: STS dataset: name: MTEB ATEC type: C-MTEB/ATEC config: default split: test revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865 metrics: - type: cosine_pearson value: 46.457799031090666 - type: cosine_spearman value: 47.170032935367935 - type: euclidean_pearson value: 49.399858337266004 - type: euclidean_spearman value: 47.17003293450119 - type: main_score value: 47.170032935367935 - type: manhattan_pearson value: 49.19428772786887 - type: manhattan_spearman value: 46.94649743167009 - type: pearson value: 46.457799031090666 - type: spearman value: 47.170032935367935 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 50.246 - type: f1 value: 45.84988588370862 - type: f1_weighted value: 45.84988588370862 - type: main_score value: 50.246 - task: type: STS dataset: name: MTEB BQ type: C-MTEB/BQ config: default split: test revision: e3dda5e115e487b39ec7e618c0c6a29137052a55 metrics: - type: cosine_pearson value: 53.67950003884396 - type: cosine_spearman value: 54.36088598761955 - type: euclidean_pearson value: 53.09394654913335 - type: euclidean_spearman value: 54.36088252221325 - type: main_score value: 54.36088598761955 - type: manhattan_pearson value: 52.805415867146955 - type: manhattan_spearman value: 54.06705049402532 - type: pearson value: 53.67950003884396 - type: spearman value: 54.36088598761955 - task: type: Clustering dataset: name: MTEB CLSClusteringP2P type: C-MTEB/CLSClusteringP2P config: default split: test revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476 metrics: - type: main_score value: 41.608876653105966 - type: v_measure value: 41.608876653105966 - type: v_measure_std value: 1.0624705258546963 - task: type: Clustering dataset: name: MTEB CLSClusteringS2S type: C-MTEB/CLSClusteringS2S config: default split: test revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f metrics: - type: main_score value: 39.7110966049789 - type: v_measure value: 39.7110966049789 - type: v_measure_std value: 0.875231943450341 - task: type: Reranking dataset: name: MTEB CMedQAv1 type: C-MTEB/CMedQAv1-reranking config: default split: test revision: 8d7f1e942507dac42dc58017c1a001c3717da7df metrics: - type: main_score value: 81.7193302624052 - type: map value: 81.7193302624052 - type: mrr value: 84.58841269841271 - type: nAUC_map_diff1 value: 57.41916975321788 - type: nAUC_map_max value: 61.409376634272874 - type: nAUC_map_std value: 28.913154318201233 - type: nAUC_mrr_diff1 value: 64.85350793018186 - type: nAUC_mrr_max value: 69.46338529223004 - type: nAUC_mrr_std value: 35.373588518165235 - task: type: Reranking dataset: name: MTEB CMedQAv2 type: C-MTEB/CMedQAv2-reranking config: default split: test revision: 23d186750531a14a0357ca22cd92d712fd512ea0 metrics: - type: main_score value: 82.59163356780259 - type: map value: 82.59163356780259 - type: mrr value: 85.54900793650792 - type: nAUC_map_diff1 value: 61.10665055831455 - type: nAUC_map_max value: 60.91441391850925 - type: nAUC_map_std value: 21.471788062972436 - type: nAUC_mrr_diff1 value: 69.95883630916767 - type: nAUC_mrr_max value: 71.06959737866757 - type: 
nAUC_mrr_std value: 30.819473605657606 - task: type: Retrieval dataset: name: MTEB CmedqaRetrieval type: C-MTEB/CmedqaRetrieval config: default split: dev revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301 metrics: - type: main_score value: 42.631 - type: map_at_1 value: 24.834 - type: map_at_10 value: 36.447 - type: map_at_100 value: 38.04 - type: map_at_1000 value: 38.179 - type: map_at_20 value: 37.281 - type: map_at_3 value: 32.761 - type: map_at_5 value: 34.871 - type: mrr_at_1 value: 38.05951487871968 - type: mrr_at_10 value: 45.57554071057435 - type: mrr_at_100 value: 46.447190120013 - type: mrr_at_1000 value: 46.50606585607273 - type: mrr_at_20 value: 46.057122452003696 - type: mrr_at_3 value: 43.34000166708336 - type: mrr_at_5 value: 44.58531299491537 - type: nauc_map_at_1000_diff1 value: 48.47252945149055 - type: nauc_map_at_1000_max value: 34.62100533042246 - type: nauc_map_at_1000_std value: -2.684326419049642 - type: nauc_map_at_100_diff1 value: 48.43175156549248 - type: nauc_map_at_100_max value: 34.58371253483366 - type: nauc_map_at_100_std value: -2.719072576245476 - type: nauc_map_at_10_diff1 value: 48.18476956739444 - type: nauc_map_at_10_max value: 33.52918292302435 - type: nauc_map_at_10_std value: -3.746440821126843 - type: nauc_map_at_1_diff1 value: 52.68253139221022 - type: nauc_map_at_1_max value: 26.033202075590157 - type: nauc_map_at_1_std value: -5.756330655143574 - type: nauc_map_at_20_diff1 value: 48.33335064427594 - type: nauc_map_at_20_max value: 34.08423189594616 - type: nauc_map_at_20_std value: -3.2957587803371693 - type: nauc_map_at_3_diff1 value: 49.07970552101722 - type: nauc_map_at_3_max value: 30.931354812941592 - type: nauc_map_at_3_std value: -5.397714078300849 - type: nauc_map_at_5_diff1 value: 48.582852045037974 - type: nauc_map_at_5_max value: 32.37350218464533 - type: nauc_map_at_5_std value: -4.604286079722004 - type: nauc_mrr_at_1000_diff1 value: 55.36516647246729 - type: nauc_mrr_at_1000_max value: 41.8197309169163 - type: nauc_mrr_at_1000_std value: 1.2938880389263046 - type: nauc_mrr_at_100_diff1 value: 55.33480230365865 - type: nauc_mrr_at_100_max value: 41.82044267368069 - type: nauc_mrr_at_100_std value: 1.3168989639934452 - type: nauc_mrr_at_10_diff1 value: 55.25761484350501 - type: nauc_mrr_at_10_max value: 41.625145381930565 - type: nauc_mrr_at_10_std value: 1.0129282219497187 - type: nauc_mrr_at_1_diff1 value: 60.68654871568434 - type: nauc_mrr_at_1_max value: 43.033167419208546 - type: nauc_mrr_at_1_std value: 0.4003726817671297 - type: nauc_mrr_at_20_diff1 value: 55.265505678078995 - type: nauc_mrr_at_20_max value: 41.7232926738926 - type: nauc_mrr_at_20_std value: 1.1959474260609984 - type: nauc_mrr_at_3_diff1 value: 56.49797535079964 - type: nauc_mrr_at_3_max value: 41.922468081636865 - type: nauc_mrr_at_3_std value: 0.7461678066019137 - type: nauc_mrr_at_5_diff1 value: 55.726696029505305 - type: nauc_mrr_at_5_max value: 41.7068087576993 - type: nauc_mrr_at_5_std value: 0.9345604936396126 - type: nauc_ndcg_at_1000_diff1 value: 49.12475845061519 - type: nauc_ndcg_at_1000_max value: 38.13450613159849 - type: nauc_ndcg_at_1000_std value: 0.9070870161011241 - type: nauc_ndcg_at_100_diff1 value: 48.12044160559342 - type: nauc_ndcg_at_100_max value: 37.98858612073559 - type: nauc_ndcg_at_100_std value: 1.398027778560473 - type: nauc_ndcg_at_10_diff1 value: 47.49083707975477 - type: nauc_ndcg_at_10_max value: 35.424124038022484 - type: nauc_ndcg_at_10_std value: -1.9285006153671742 - type: nauc_ndcg_at_1_diff1 value: 60.68654871568434 - 
type: nauc_ndcg_at_1_max value: 43.033167419208546 - type: nauc_ndcg_at_1_std value: 0.4003726817671297 - type: nauc_ndcg_at_20_diff1 value: 47.692259910508014 - type: nauc_ndcg_at_20_max value: 36.20333827999666 - type: nauc_ndcg_at_20_std value: -1.1366081258269927 - type: nauc_ndcg_at_3_diff1 value: 49.926059859304004 - type: nauc_ndcg_at_3_max value: 36.554915901584614 - type: nauc_ndcg_at_3_std value: -1.7727717324767251 - type: nauc_ndcg_at_5_diff1 value: 48.504726113001304 - type: nauc_ndcg_at_5_max value: 35.2222520201459 - type: nauc_ndcg_at_5_std value: -2.1147823162180046 - type: nauc_precision_at_1000_diff1 value: 5.95771915067704 - type: nauc_precision_at_1000_max value: 29.222734901088483 - type: nauc_precision_at_1000_std value: 21.021319062045514 - type: nauc_precision_at_100_diff1 value: 12.441767269549631 - type: nauc_precision_at_100_max value: 37.028610753731876 - type: nauc_precision_at_100_std value: 22.59370573792191 - type: nauc_precision_at_10_diff1 value: 25.3055255305395 - type: nauc_precision_at_10_max value: 41.57346735024518 - type: nauc_precision_at_10_std value: 10.851514810119529 - type: nauc_precision_at_1_diff1 value: 60.68654871568434 - type: nauc_precision_at_1_max value: 43.033167419208546 - type: nauc_precision_at_1_std value: 0.4003726817671297 - type: nauc_precision_at_20_diff1 value: 21.503387725334118 - type: nauc_precision_at_20_max value: 40.35637914704234 - type: nauc_precision_at_20_std value: 14.15622720179941 - type: nauc_precision_at_3_diff1 value: 37.92102588120911 - type: nauc_precision_at_3_max value: 42.61959379379323 - type: nauc_precision_at_3_std value: 4.531204029823331 - type: nauc_precision_at_5_diff1 value: 31.822114101121624 - type: nauc_precision_at_5_max value: 42.00621213856077 - type: nauc_precision_at_5_std value: 7.038453918682581 - type: nauc_recall_at_1000_diff1 value: 30.906717381989445 - type: nauc_recall_at_1000_max value: 49.86631344507457 - type: nauc_recall_at_1000_std value: 44.77133994051694 - type: nauc_recall_at_100_diff1 value: 29.06337979940958 - type: nauc_recall_at_100_max value: 35.64030149194558 - type: nauc_recall_at_100_std value: 16.019430611168264 - type: nauc_recall_at_10_diff1 value: 34.92848768468913 - type: nauc_recall_at_10_max value: 28.566945065867454 - type: nauc_recall_at_10_std value: -2.1058035354561557 - type: nauc_recall_at_1_diff1 value: 52.68253139221022 - type: nauc_recall_at_1_max value: 26.033202075590157 - type: nauc_recall_at_1_std value: -5.756330655143574 - type: nauc_recall_at_20_diff1 value: 33.82932775397309 - type: nauc_recall_at_20_max value: 29.679872190739044 - type: nauc_recall_at_20_std value: 0.10165951410954753 - type: nauc_recall_at_3_diff1 value: 42.53700938223526 - type: nauc_recall_at_3_max value: 27.477725171266385 - type: nauc_recall_at_3_std value: -5.201557627828334 - type: nauc_recall_at_5_diff1 value: 39.158896850349116 - type: nauc_recall_at_5_max value: 27.90842581577196 - type: nauc_recall_at_5_std value: -3.646479982111823 - type: ndcg_at_1 value: 38.06 - type: ndcg_at_10 value: 42.631 - type: ndcg_at_100 value: 49.114000000000004 - type: ndcg_at_1000 value: 51.745 - type: ndcg_at_20 value: 44.895 - type: ndcg_at_3 value: 38.153999999999996 - type: ndcg_at_5 value: 39.994 - type: precision_at_1 value: 38.06 - type: precision_at_10 value: 9.35 - type: precision_at_100 value: 1.471 - type: precision_at_1000 value: 0.181 - type: precision_at_20 value: 5.461 - type: precision_at_3 value: 21.555 - type: precision_at_5 value: 15.443999999999999 - type: 
recall_at_1 value: 24.834 - type: recall_at_10 value: 51.881 - type: recall_at_100 value: 79.095 - type: recall_at_1000 value: 97.077 - type: recall_at_20 value: 59.471 - type: recall_at_3 value: 37.836 - type: recall_at_5 value: 43.913999999999994 - task: type: PairClassification dataset: name: MTEB Cmnli type: C-MTEB/CMNLI config: default split: validation revision: 41bc36f332156f7adc9e38f53777c959b2ae9766 metrics: - type: cosine_accuracy value: 69.29645219482863 - type: cosine_accuracy_threshold value: 83.89029502868652 - type: cosine_ap value: 76.28529631089978 - type: cosine_f1 value: 72.18316549496485 - type: cosine_f1_threshold value: 79.37869429588318 - type: cosine_precision value: 60.79372699631941 - type: cosine_recall value: 88.82394201543138 - type: dot_accuracy value: 69.29645219482863 - type: dot_accuracy_threshold value: 83.890300989151 - type: dot_ap value: 76.28533525182606 - type: dot_f1 value: 72.18316549496485 - type: dot_f1_threshold value: 79.37869429588318 - type: dot_precision value: 60.79372699631941 - type: dot_recall value: 88.82394201543138 - type: euclidean_accuracy value: 69.29645219482863 - type: euclidean_accuracy_threshold value: 56.762146949768066 - type: euclidean_ap value: 76.28547969937172 - type: euclidean_f1 value: 72.18316549496485 - type: euclidean_f1_threshold value: 64.22040462493896 - type: euclidean_precision value: 60.79372699631941 - type: euclidean_recall value: 88.82394201543138 - type: main_score value: 76.28547969937172 - type: manhattan_accuracy value: 68.86349969933855 - type: manhattan_accuracy_threshold value: 1325.539207458496 - type: manhattan_ap value: 75.73527179489312 - type: manhattan_f1 value: 71.93284110448064 - type: manhattan_f1_threshold value: 1450.2345085144043 - type: manhattan_precision value: 63.386809269162214 - type: manhattan_recall value: 83.14238952536824 - type: max_ap value: 76.28547969937172 - type: max_f1 value: 72.18316549496485 - type: max_precision value: 63.386809269162214 - type: max_recall value: 88.82394201543138 - type: similarity_accuracy value: 69.29645219482863 - type: similarity_accuracy_threshold value: 83.89029502868652 - type: similarity_ap value: 76.28529631089978 - type: similarity_f1 value: 72.18316549496485 - type: similarity_f1_threshold value: 79.37869429588318 - type: similarity_precision value: 60.79372699631941 - type: similarity_recall value: 88.82394201543138 - task: type: Retrieval dataset: name: MTEB CovidRetrieval type: C-MTEB/CovidRetrieval config: default split: dev revision: 1271c7809071a13532e05f25fb53511ffce77117 metrics: - type: main_score value: 82.76599999999999 - type: map_at_1 value: 70.99600000000001 - type: map_at_10 value: 79.022 - type: map_at_100 value: 79.262 - type: map_at_1000 value: 79.266 - type: map_at_20 value: 79.211 - type: map_at_3 value: 77.081 - type: map_at_5 value: 78.348 - type: mrr_at_1 value: 71.12750263435194 - type: mrr_at_10 value: 79.00563667686959 - type: mrr_at_100 value: 79.24545000482046 - type: mrr_at_1000 value: 79.24986213861123 - type: mrr_at_20 value: 79.19503716749968 - type: mrr_at_3 value: 77.1338250790306 - type: mrr_at_5 value: 78.38250790305591 - type: nauc_map_at_1000_diff1 value: 79.78007097062118 - type: nauc_map_at_1000_max value: 31.495494389521216 - type: nauc_map_at_1000_std value: -44.554113523471585 - type: nauc_map_at_100_diff1 value: 79.77901003479913 - type: nauc_map_at_100_max value: 31.501728637681925 - type: nauc_map_at_100_std value: -44.54526589087225 - type: nauc_map_at_10_diff1 value: 79.70465086616332 - type: 
nauc_map_at_10_max value: 31.447942385382856 - type: nauc_map_at_10_std value: -44.86102015819248 - type: nauc_map_at_1_diff1 value: 81.89774804895447 - type: nauc_map_at_1_max value: 29.53109235427305 - type: nauc_map_at_1_std value: -42.80277721451948 - type: nauc_map_at_20_diff1 value: 79.77871635635559 - type: nauc_map_at_20_max value: 31.560274527206733 - type: nauc_map_at_20_std value: -44.55008236120152 - type: nauc_map_at_3_diff1 value: 79.37871528079008 - type: nauc_map_at_3_max value: 30.314627717947655 - type: nauc_map_at_3_std value: -46.583081505018214 - type: nauc_map_at_5_diff1 value: 79.47410569600237 - type: nauc_map_at_5_max value: 30.717452787943255 - type: nauc_map_at_5_std value: -45.56487302807213 - type: nauc_mrr_at_1000_diff1 value: 79.83396133475738 - type: nauc_mrr_at_1000_max value: 31.902081193300802 - type: nauc_mrr_at_1000_std value: -44.32825329012893 - type: nauc_mrr_at_100_diff1 value: 79.832888351025 - type: nauc_mrr_at_100_max value: 31.90821451879506 - type: nauc_mrr_at_100_std value: -44.31946551133598 - type: nauc_mrr_at_10_diff1 value: 79.75766328526763 - type: nauc_mrr_at_10_max value: 31.84709271229474 - type: nauc_mrr_at_10_std value: -44.64251370779262 - type: nauc_mrr_at_1_diff1 value: 81.88675621341875 - type: nauc_mrr_at_1_max value: 30.624768062722435 - type: nauc_mrr_at_1_std value: -41.826968180693456 - type: nauc_mrr_at_20_diff1 value: 79.83221800317402 - type: nauc_mrr_at_20_max value: 31.96340672339527 - type: nauc_mrr_at_20_std value: -44.32956320098315 - type: nauc_mrr_at_3_diff1 value: 79.34629346809106 - type: nauc_mrr_at_3_max value: 31.358295528236113 - type: nauc_mrr_at_3_std value: -45.97803582281396 - type: nauc_mrr_at_5_diff1 value: 79.494177213373 - type: nauc_mrr_at_5_max value: 31.52236804483443 - type: nauc_mrr_at_5_std value: -45.138775893398694 - type: nauc_ndcg_at_1000_diff1 value: 79.42223230573576 - type: nauc_ndcg_at_1000_max value: 32.28843903409106 - type: nauc_ndcg_at_1000_std value: -44.3133954110294 - type: nauc_ndcg_at_100_diff1 value: 79.3929907054809 - type: nauc_ndcg_at_100_max value: 32.49291426150998 - type: nauc_ndcg_at_100_std value: -43.996604718501075 - type: nauc_ndcg_at_10_diff1 value: 79.11644773352661 - type: nauc_ndcg_at_10_max value: 32.54744027915217 - type: nauc_ndcg_at_10_std value: -45.44820798746672 - type: nauc_ndcg_at_1_diff1 value: 81.71471193659804 - type: nauc_ndcg_at_1_max value: 30.56723762753589 - type: nauc_ndcg_at_1_std value: -42.00582595178881 - type: nauc_ndcg_at_20_diff1 value: 79.34070754205227 - type: nauc_ndcg_at_20_max value: 33.08175655505984 - type: nauc_ndcg_at_20_std value: -43.93297429354463 - type: nauc_ndcg_at_3_diff1 value: 78.41040890432154 - type: nauc_ndcg_at_3_max value: 30.540602587995053 - type: nauc_ndcg_at_3_std value: -48.682741281966244 - type: nauc_ndcg_at_5_diff1 value: 78.52045059102817 - type: nauc_ndcg_at_5_max value: 31.145620595701786 - type: nauc_ndcg_at_5_std value: -46.96161213475506 - type: nauc_precision_at_1000_diff1 value: -26.20700295711843 - type: nauc_precision_at_1000_max value: 50.992072309587066 - type: nauc_precision_at_1000_std value: 49.034232966809896 - type: nauc_precision_at_100_diff1 value: -1.2318650992746658 - type: nauc_precision_at_100_max value: 54.103623972545876 - type: nauc_precision_at_100_std value: 38.158651434354105 - type: nauc_precision_at_10_diff1 value: 47.40081635911143 - type: nauc_precision_at_10_max value: 46.01760789553407 - type: nauc_precision_at_10_std value: -22.545587533051467 - type: 
nauc_precision_at_1_diff1 value: 81.71471193659804 - type: nauc_precision_at_1_max value: 30.56723762753589 - type: nauc_precision_at_1_std value: -42.00582595178881 - type: nauc_precision_at_20_diff1 value: 31.902645462266044 - type: nauc_precision_at_20_max value: 60.06037928799191 - type: nauc_precision_at_20_std value: 10.125381568485691 - type: nauc_precision_at_3_diff1 value: 70.23181696295782 - type: nauc_precision_at_3_max value: 31.33307476962615 - type: nauc_precision_at_3_std value: -52.773523783308995 - type: nauc_precision_at_5_diff1 value: 63.24118340779976 - type: nauc_precision_at_5_max value: 35.536460706118284 - type: nauc_precision_at_5_std value: -43.859100503715496 - type: nauc_recall_at_1000_diff1 value: 63.10783066308766 - type: nauc_recall_at_1000_max value: 64.17746555050037 - type: nauc_recall_at_1000_std value: -1.1314627694685895 - type: nauc_recall_at_100_diff1 value: 70.70747402244945 - type: nauc_recall_at_100_max value: 63.81462634298472 - type: nauc_recall_at_100_std value: 2.7329437124855858 - type: nauc_recall_at_10_diff1 value: 74.5724683430861 - type: nauc_recall_at_10_max value: 42.06028697147503 - type: nauc_recall_at_10_std value: -50.426163431789384 - type: nauc_recall_at_1_diff1 value: 81.89774804895447 - type: nauc_recall_at_1_max value: 29.53109235427305 - type: nauc_recall_at_1_std value: -42.80277721451948 - type: nauc_recall_at_20_diff1 value: 74.1386367152198 - type: nauc_recall_at_20_max value: 60.26605112943992 - type: nauc_recall_at_20_std value: -24.167905489617926 - type: nauc_recall_at_3_diff1 value: 74.68360442418249 - type: nauc_recall_at_3_max value: 29.73174978017023 - type: nauc_recall_at_3_std value: -58.048521143234844 - type: nauc_recall_at_5_diff1 value: 73.33434605574439 - type: nauc_recall_at_5_max value: 31.829043506426963 - type: nauc_recall_at_5_std value: -55.33176739081927 - type: ndcg_at_1 value: 71.233 - type: ndcg_at_10 value: 82.76599999999999 - type: ndcg_at_100 value: 83.799 - type: ndcg_at_1000 value: 83.898 - type: ndcg_at_20 value: 83.44 - type: ndcg_at_3 value: 79.03999999999999 - type: ndcg_at_5 value: 81.285 - type: precision_at_1 value: 71.233 - type: precision_at_10 value: 9.526 - type: precision_at_100 value: 0.9990000000000001 - type: precision_at_1000 value: 0.101 - type: precision_at_20 value: 4.8950000000000005 - type: precision_at_3 value: 28.346 - type: precision_at_5 value: 18.124000000000002 - type: recall_at_1 value: 70.99600000000001 - type: recall_at_10 value: 94.31 - type: recall_at_100 value: 98.84100000000001 - type: recall_at_1000 value: 99.579 - type: recall_at_20 value: 96.944 - type: recall_at_3 value: 84.589 - type: recall_at_5 value: 89.98899999999999 - task: type: Retrieval dataset: name: MTEB DuRetrieval type: C-MTEB/DuRetrieval config: default split: dev revision: a1a333e290fe30b10f3f56498e3a0d911a693ced metrics: - type: main_score value: 82.353 - type: map_at_1 value: 23.408 - type: map_at_10 value: 73.302 - type: map_at_100 value: 76.532 - type: map_at_1000 value: 76.578 - type: map_at_20 value: 75.765 - type: map_at_3 value: 49.297999999999995 - type: map_at_5 value: 62.96000000000001 - type: mrr_at_1 value: 83.3 - type: mrr_at_10 value: 88.85841269841264 - type: mrr_at_100 value: 88.937851229216 - type: mrr_at_1000 value: 88.94253811030754 - type: mrr_at_20 value: 88.90522789194803 - type: mrr_at_3 value: 88.31666666666662 - type: mrr_at_5 value: 88.66416666666662 - type: nauc_map_at_1000_diff1 value: 3.978108165433077 - type: nauc_map_at_1000_max value: 32.84013060265069 - type: 
nauc_map_at_1000_std value: 17.104374545928255 - type: nauc_map_at_100_diff1 value: 3.9594456007844183 - type: nauc_map_at_100_max value: 32.84323698444807 - type: nauc_map_at_100_std value: 17.083360165851175 - type: nauc_map_at_10_diff1 value: 6.564428602685249 - type: nauc_map_at_10_max value: 29.490007273766956 - type: nauc_map_at_10_std value: 6.955854455105477 - type: nauc_map_at_1_diff1 value: 43.01902060700144 - type: nauc_map_at_1_max value: -8.940094269879843 - type: nauc_map_at_1_std value: -28.063233795166276 - type: nauc_map_at_20_diff1 value: 4.446904145850981 - type: nauc_map_at_20_max value: 32.47424290913474 - type: nauc_map_at_20_std value: 14.957146942696257 - type: nauc_map_at_3_diff1 value: 25.91745605988797 - type: nauc_map_at_3_max value: 3.661124903759869 - type: nauc_map_at_3_std value: -21.936610233451646 - type: nauc_map_at_5_diff1 value: 16.629939273347865 - type: nauc_map_at_5_max value: 14.666913498454564 - type: nauc_map_at_5_std value: -12.39941441022446 - type: nauc_mrr_at_1000_diff1 value: 26.08525262735903 - type: nauc_mrr_at_1000_max value: 47.86393129438558 - type: nauc_mrr_at_1000_std value: 28.811634091001743 - type: nauc_mrr_at_100_diff1 value: 26.081836904532153 - type: nauc_mrr_at_100_max value: 47.880134050815 - type: nauc_mrr_at_100_std value: 28.828980969011475 - type: nauc_mrr_at_10_diff1 value: 26.09549377249783 - type: nauc_mrr_at_10_max value: 48.11004429436051 - type: nauc_mrr_at_10_std value: 29.041772733561455 - type: nauc_mrr_at_1_diff1 value: 26.095576390896717 - type: nauc_mrr_at_1_max value: 40.102786808829485 - type: nauc_mrr_at_1_std value: 21.16142603421125 - type: nauc_mrr_at_20_diff1 value: 26.078553311053394 - type: nauc_mrr_at_20_max value: 47.9955055491724 - type: nauc_mrr_at_20_std value: 28.92844826033336 - type: nauc_mrr_at_3_diff1 value: 25.736821420614447 - type: nauc_mrr_at_3_max value: 48.30695057366758 - type: nauc_mrr_at_3_std value: 29.295726311215475 - type: nauc_mrr_at_5_diff1 value: 25.979034861669714 - type: nauc_mrr_at_5_max value: 48.500915285456344 - type: nauc_mrr_at_5_std value: 29.449704923164106 - type: nauc_ndcg_at_1000_diff1 value: 6.624272455812551 - type: nauc_ndcg_at_1000_max value: 41.526519286613414 - type: nauc_ndcg_at_1000_std value: 27.91983541845217 - type: nauc_ndcg_at_100_diff1 value: 6.033169661320914 - type: nauc_ndcg_at_100_max value: 41.6841728152419 - type: nauc_ndcg_at_100_std value: 28.35967524719135 - type: nauc_ndcg_at_10_diff1 value: 5.627968389448389 - type: nauc_ndcg_at_10_max value: 37.18261001317417 - type: nauc_ndcg_at_10_std value: 19.757054878692408 - type: nauc_ndcg_at_1_diff1 value: 26.095576390896717 - type: nauc_ndcg_at_1_max value: 40.102786808829485 - type: nauc_ndcg_at_1_std value: 21.16142603421125 - type: nauc_ndcg_at_20_diff1 value: 5.678380464964442 - type: nauc_ndcg_at_20_max value: 40.70268508824627 - type: nauc_ndcg_at_20_std value: 25.003203457508622 - type: nauc_ndcg_at_3_diff1 value: 5.7196343030730645 - type: nauc_ndcg_at_3_max value: 34.50950904905902 - type: nauc_ndcg_at_3_std value: 20.099411226966403 - type: nauc_ndcg_at_5_diff1 value: 7.398974214665505 - type: nauc_ndcg_at_5_max value: 31.777872881596885 - type: nauc_ndcg_at_5_std value: 14.212532410116573 - type: nauc_precision_at_1000_diff1 value: -26.784369186388286 - type: nauc_precision_at_1000_max value: 20.9055343942668 - type: nauc_precision_at_1000_std value: 48.97851074406537 - type: nauc_precision_at_100_diff1 value: -27.79381730090699 - type: nauc_precision_at_100_max value: 22.80005440633608 
- type: nauc_precision_at_100_std value: 50.935594672026795 - type: nauc_precision_at_10_diff1 value: -30.285772529280557 - type: nauc_precision_at_10_max value: 32.73392928068347 - type: nauc_precision_at_10_std value: 47.96878369413408 - type: nauc_precision_at_1_diff1 value: 26.095576390896717 - type: nauc_precision_at_1_max value: 40.102786808829485 - type: nauc_precision_at_1_std value: 21.16142603421125 - type: nauc_precision_at_20_diff1 value: -28.93118180068221 - type: nauc_precision_at_20_max value: 27.34554979821627 - type: nauc_precision_at_20_std value: 50.768062841591245 - type: nauc_precision_at_3_diff1 value: -20.842604987632818 - type: nauc_precision_at_3_max value: 38.567385349160865 - type: nauc_precision_at_3_std value: 34.962189381111585 - type: nauc_precision_at_5_diff1 value: -27.39434681486595 - type: nauc_precision_at_5_max value: 36.46059763518038 - type: nauc_precision_at_5_std value: 39.893251684847286 - type: nauc_recall_at_1000_diff1 value: -11.949093496228018 - type: nauc_recall_at_1000_max value: 73.88534051191724 - type: nauc_recall_at_1000_std value: 74.63173870654316 - type: nauc_recall_at_100_diff1 value: -10.612653444299633 - type: nauc_recall_at_100_max value: 55.332461824335255 - type: nauc_recall_at_100_std value: 55.6971441098854 - type: nauc_recall_at_10_diff1 value: 1.6381390695279527 - type: nauc_recall_at_10_max value: 30.7773121587242 - type: nauc_recall_at_10_std value: 5.983376763709044 - type: nauc_recall_at_1_diff1 value: 43.01902060700144 - type: nauc_recall_at_1_max value: -8.940094269879843 - type: nauc_recall_at_1_std value: -28.063233795166276 - type: nauc_recall_at_20_diff1 value: -3.5879888483690268 - type: nauc_recall_at_20_max value: 42.56780359254684 - type: nauc_recall_at_20_std value: 28.64620011473346 - type: nauc_recall_at_3_diff1 value: 24.423753178927363 - type: nauc_recall_at_3_max value: 0.28631207577281326 - type: nauc_recall_at_3_std value: -24.79099042560129 - type: nauc_recall_at_5_diff1 value: 15.716357450134492 - type: nauc_recall_at_5_max value: 9.923967009889193 - type: nauc_recall_at_5_std value: -18.11714448988651 - type: ndcg_at_1 value: 83.3 - type: ndcg_at_10 value: 82.353 - type: ndcg_at_100 value: 85.952 - type: ndcg_at_1000 value: 86.393 - type: ndcg_at_20 value: 84.333 - type: ndcg_at_3 value: 79.128 - type: ndcg_at_5 value: 78.96300000000001 - type: precision_at_1 value: 83.3 - type: precision_at_10 value: 40.36 - type: precision_at_100 value: 4.769 - type: precision_at_1000 value: 0.488 - type: precision_at_20 value: 22.295 - type: precision_at_3 value: 71.25 - type: precision_at_5 value: 61.18 - type: recall_at_1 value: 23.408 - type: recall_at_10 value: 85.44800000000001 - type: recall_at_100 value: 96.712 - type: recall_at_1000 value: 98.988 - type: recall_at_20 value: 91.304 - type: recall_at_3 value: 52.65 - type: recall_at_5 value: 69.81 - task: type: Retrieval dataset: name: MTEB EcomRetrieval type: C-MTEB/EcomRetrieval config: default split: dev revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9 metrics: - type: main_score value: 63.712999999999994 - type: map_at_1 value: 49.0 - type: map_at_10 value: 58.620000000000005 - type: map_at_100 value: 59.183 - type: map_at_1000 value: 59.19799999999999 - type: map_at_20 value: 58.948 - type: map_at_3 value: 55.883 - type: map_at_5 value: 57.452999999999996 - type: mrr_at_1 value: 49.0 - type: mrr_at_10 value: 58.61988095238089 - type: mrr_at_100 value: 59.18251462760907 - type: mrr_at_1000 value: 59.1981896580556 - type: mrr_at_20 value: 
58.94805232134562 - type: mrr_at_3 value: 55.883333333333304 - type: mrr_at_5 value: 57.4533333333333 - type: nauc_map_at_1000_diff1 value: 60.33101842801658 - type: nauc_map_at_1000_max value: 19.502683068762945 - type: nauc_map_at_1000_std value: -9.052741690420172 - type: nauc_map_at_100_diff1 value: 60.320202163437884 - type: nauc_map_at_100_max value: 19.511425958183473 - type: nauc_map_at_100_std value: -9.046775711361885 - type: nauc_map_at_10_diff1 value: 60.32228179956949 - type: nauc_map_at_10_max value: 19.6159978656515 - type: nauc_map_at_10_std value: -9.132522477977544 - type: nauc_map_at_1_diff1 value: 61.89621977613427 - type: nauc_map_at_1_max value: 15.015734335373715 - type: nauc_map_at_1_std value: -12.641774992365185 - type: nauc_map_at_20_diff1 value: 60.351130642660486 - type: nauc_map_at_20_max value: 19.433343357030232 - type: nauc_map_at_20_std value: -9.21598413872683 - type: nauc_map_at_3_diff1 value: 60.26725821298107 - type: nauc_map_at_3_max value: 18.3498595109406 - type: nauc_map_at_3_std value: -10.051517839346984 - type: nauc_map_at_5_diff1 value: 60.164921439673925 - type: nauc_map_at_5_max value: 18.593900545400267 - type: nauc_map_at_5_std value: -9.934110598947624 - type: nauc_mrr_at_1000_diff1 value: 60.33101842801658 - type: nauc_mrr_at_1000_max value: 19.502683068762945 - type: nauc_mrr_at_1000_std value: -9.052741690420172 - type: nauc_mrr_at_100_diff1 value: 60.320202163437884 - type: nauc_mrr_at_100_max value: 19.511425958183473 - type: nauc_mrr_at_100_std value: -9.046775711361885 - type: nauc_mrr_at_10_diff1 value: 60.32228179956949 - type: nauc_mrr_at_10_max value: 19.6159978656515 - type: nauc_mrr_at_10_std value: -9.132522477977544 - type: nauc_mrr_at_1_diff1 value: 61.89621977613427 - type: nauc_mrr_at_1_max value: 15.015734335373715 - type: nauc_mrr_at_1_std value: -12.641774992365185 - type: nauc_mrr_at_20_diff1 value: 60.351130642660486 - type: nauc_mrr_at_20_max value: 19.433343357030232 - type: nauc_mrr_at_20_std value: -9.21598413872683 - type: nauc_mrr_at_3_diff1 value: 60.26725821298107 - type: nauc_mrr_at_3_max value: 18.3498595109406 - type: nauc_mrr_at_3_std value: -10.051517839346984 - type: nauc_mrr_at_5_diff1 value: 60.164921439673925 - type: nauc_mrr_at_5_max value: 18.593900545400267 - type: nauc_mrr_at_5_std value: -9.934110598947624 - type: nauc_ndcg_at_1000_diff1 value: 60.190733838614676 - type: nauc_ndcg_at_1000_max value: 22.361539210340222 - type: nauc_ndcg_at_1000_std value: -5.745163462434749 - type: nauc_ndcg_at_100_diff1 value: 59.89473232352801 - type: nauc_ndcg_at_100_max value: 22.68282893350434 - type: nauc_ndcg_at_100_std value: -5.4179387740783 - type: nauc_ndcg_at_10_diff1 value: 60.07971889322107 - type: nauc_ndcg_at_10_max value: 22.591286648072977 - type: nauc_ndcg_at_10_std value: -6.68500894448089 - type: nauc_ndcg_at_1_diff1 value: 61.89621977613427 - type: nauc_ndcg_at_1_max value: 15.015734335373715 - type: nauc_ndcg_at_1_std value: -12.641774992365185 - type: nauc_ndcg_at_20_diff1 value: 60.182873920240475 - type: nauc_ndcg_at_20_max value: 21.964898434175247 - type: nauc_ndcg_at_20_std value: -6.906365610289816 - type: nauc_ndcg_at_3_diff1 value: 59.8208566369894 - type: nauc_ndcg_at_3_max value: 19.388884168625417 - type: nauc_ndcg_at_3_std value: -9.151250601081255 - type: nauc_ndcg_at_5_diff1 value: 59.599342583351955 - type: nauc_ndcg_at_5_max value: 19.8910854628725 - type: nauc_ndcg_at_5_std value: -8.885354650481215 - type: nauc_precision_at_1000_diff1 value: 63.58164887576627 - type: 
nauc_precision_at_1000_max value: 92.23383046912454 - type: nauc_precision_at_1000_std value: 87.13881949176067 - type: nauc_precision_at_100_diff1 value: 53.73002142033278 - type: nauc_precision_at_100_max value: 70.37128576920941 - type: nauc_precision_at_100_std value: 55.41687263140533 - type: nauc_precision_at_10_diff1 value: 59.41629120257138 - type: nauc_precision_at_10_max value: 38.24957021696883 - type: nauc_precision_at_10_std value: 6.335412380239172 - type: nauc_precision_at_1_diff1 value: 61.89621977613427 - type: nauc_precision_at_1_max value: 15.015734335373715 - type: nauc_precision_at_1_std value: -12.641774992365185 - type: nauc_precision_at_20_diff1 value: 59.95367722749617 - type: nauc_precision_at_20_max value: 38.11970211089507 - type: nauc_precision_at_20_std value: 8.468361991180146 - type: nauc_precision_at_3_diff1 value: 58.418401476502524 - type: nauc_precision_at_3_max value: 22.708479411978058 - type: nauc_precision_at_3_std value: -6.238867799833925 - type: nauc_precision_at_5_diff1 value: 57.54249152786323 - type: nauc_precision_at_5_max value: 24.64947877432984 - type: nauc_precision_at_5_std value: -5.018047100033905 - type: nauc_recall_at_1000_diff1 value: 63.581648875766604 - type: nauc_recall_at_1000_max value: 92.23383046912458 - type: nauc_recall_at_1000_std value: 87.13881949176098 - type: nauc_recall_at_100_diff1 value: 53.73002142033278 - type: nauc_recall_at_100_max value: 70.37128576920976 - type: nauc_recall_at_100_std value: 55.41687263140555 - type: nauc_recall_at_10_diff1 value: 59.41629120257145 - type: nauc_recall_at_10_max value: 38.2495702169689 - type: nauc_recall_at_10_std value: 6.335412380239176 - type: nauc_recall_at_1_diff1 value: 61.89621977613427 - type: nauc_recall_at_1_max value: 15.015734335373715 - type: nauc_recall_at_1_std value: -12.641774992365185 - type: nauc_recall_at_20_diff1 value: 59.95367722749639 - type: nauc_recall_at_20_max value: 38.11970211089514 - type: nauc_recall_at_20_std value: 8.468361991180268 - type: nauc_recall_at_3_diff1 value: 58.41840147650248 - type: nauc_recall_at_3_max value: 22.708479411978043 - type: nauc_recall_at_3_std value: -6.238867799833981 - type: nauc_recall_at_5_diff1 value: 57.542491527863206 - type: nauc_recall_at_5_max value: 24.649478774330014 - type: nauc_recall_at_5_std value: -5.018047100033782 - type: ndcg_at_1 value: 49.0 - type: ndcg_at_10 value: 63.712999999999994 - type: ndcg_at_100 value: 66.523 - type: ndcg_at_1000 value: 66.922 - type: ndcg_at_20 value: 64.904 - type: ndcg_at_3 value: 58.099000000000004 - type: ndcg_at_5 value: 60.913 - type: precision_at_1 value: 49.0 - type: precision_at_10 value: 7.99 - type: precision_at_100 value: 0.932 - type: precision_at_1000 value: 0.096 - type: precision_at_20 value: 4.2299999999999995 - type: precision_at_3 value: 21.5 - type: precision_at_5 value: 14.26 - type: recall_at_1 value: 49.0 - type: recall_at_10 value: 79.9 - type: recall_at_100 value: 93.2 - type: recall_at_1000 value: 96.3 - type: recall_at_20 value: 84.6 - type: recall_at_3 value: 64.5 - type: recall_at_5 value: 71.3 - task: type: Classification dataset: name: MTEB IFlyTek type: C-MTEB/IFlyTek-classification config: default split: validation revision: 421605374b29664c5fc098418fe20ada9bd55f8a metrics: - type: accuracy value: 49.188149288187766 - type: f1 value: 35.82742058478872 - type: f1_weighted value: 46.33812923348324 - type: main_score value: 49.188149288187766 - task: type: Classification dataset: name: MTEB JDReview type: C-MTEB/JDReview-classification 
config: default split: test revision: b7c64bd89eb87f8ded463478346f76731f07bf8b metrics: - type: accuracy value: 83.45215759849907 - type: ap value: 49.602287249765666 - type: ap_weighted value: 49.602287249765666 - type: f1 value: 77.84519218126933 - type: f1_weighted value: 84.83784419250833 - type: main_score value: 83.45215759849907 - task: type: STS dataset: name: MTEB LCQMC type: C-MTEB/LCQMC config: default split: test revision: 17f9b096f80380fce5ed12a9be8be7784b337daf metrics: - type: cosine_pearson value: 66.78399631818323 - type: cosine_spearman value: 70.38648345929874 - type: euclidean_pearson value: 68.79036522204457 - type: euclidean_spearman value: 70.38649454085622 - type: main_score value: 70.38648345929874 - type: manhattan_pearson value: 68.74927335399974 - type: manhattan_spearman value: 70.3453886791424 - type: pearson value: 66.78399631818323 - type: spearman value: 70.38648345929874 - task: type: Reranking dataset: name: MTEB MMarcoReranking type: C-MTEB/Mmarco-reranking config: default split: dev revision: 8e0c766dbe9e16e1d221116a3f36795fbade07f6 metrics: - type: main_score value: 26.991570930656568 - type: map value: 26.991570930656568 - type: mrr value: 25.460714285714285 - type: nAUC_map_diff1 value: 12.174277381054415 - type: nAUC_map_max value: 5.768145859960792 - type: nAUC_map_std value: -0.6863999286086584 - type: nAUC_mrr_diff1 value: 11.83053464449912 - type: nAUC_mrr_max value: 4.893060023643725 - type: nAUC_mrr_std value: -0.22755376963555723 - task: type: Retrieval dataset: name: MTEB MMarcoRetrieval type: C-MTEB/MMarcoRetrieval config: default split: dev revision: 539bbde593d947e2a124ba72651aafc09eb33fc2 metrics: - type: main_score value: 78.679 - type: map_at_1 value: 65.349 - type: map_at_10 value: 74.802 - type: map_at_100 value: 75.141 - type: map_at_1000 value: 75.151 - type: map_at_20 value: 75.03999999999999 - type: map_at_3 value: 72.831 - type: map_at_5 value: 74.09400000000001 - type: mrr_at_1 value: 67.55014326647564 - type: mrr_at_10 value: 75.31912038932084 - type: mrr_at_100 value: 75.6225574951573 - type: mrr_at_1000 value: 75.63176308010398 - type: mrr_at_20 value: 75.53574557856176 - type: mrr_at_3 value: 73.59598853868198 - type: mrr_at_5 value: 74.70343839541526 - type: nauc_map_at_1000_diff1 value: 77.81972509758704 - type: nauc_map_at_1000_max value: 27.445457824343595 - type: nauc_map_at_1000_std value: -18.60670002314929 - type: nauc_map_at_100_diff1 value: 77.81776087022583 - type: nauc_map_at_100_max value: 27.465677796741794 - type: nauc_map_at_100_std value: -18.574455053179566 - type: nauc_map_at_10_diff1 value: 77.668921503636 - type: nauc_map_at_10_max value: 27.564476726876563 - type: nauc_map_at_10_std value: -18.67577233314456 - type: nauc_map_at_1_diff1 value: 80.13251752826227 - type: nauc_map_at_1_max value: 19.700940114548352 - type: nauc_map_at_1_std value: -24.276498497801104 - type: nauc_map_at_20_diff1 value: 77.76444686257037 - type: nauc_map_at_20_max value: 27.507355610895434 - type: nauc_map_at_20_std value: -18.570029885207234 - type: nauc_map_at_3_diff1 value: 77.62870706241021 - type: nauc_map_at_3_max value: 25.979199504514654 - type: nauc_map_at_3_std value: -20.480776195240768 - type: nauc_map_at_5_diff1 value: 77.68046637184071 - type: nauc_map_at_5_max value: 27.068345296401887 - type: nauc_map_at_5_std value: -19.515458511154968 - type: nauc_mrr_at_1000_diff1 value: 78.12673001253819 - type: nauc_mrr_at_1000_max value: 28.23584877768183 - type: nauc_mrr_at_1000_std value: -17.765605843184606 - type: 
nauc_mrr_at_100_diff1 value: 78.12476632443614 - type: nauc_mrr_at_100_max value: 28.255499574563654 - type: nauc_mrr_at_100_std value: -17.73302695902061 - type: nauc_mrr_at_10_diff1 value: 77.98552897771079 - type: nauc_mrr_at_10_max value: 28.433270245298903 - type: nauc_mrr_at_10_std value: -17.721467674164725 - type: nauc_mrr_at_1_diff1 value: 80.74164178463916 - type: nauc_mrr_at_1_max value: 23.400992011183135 - type: nauc_mrr_at_1_std value: -23.155846305708668 - type: nauc_mrr_at_20_diff1 value: 78.08519488707572 - type: nauc_mrr_at_20_max value: 28.305974768972476 - type: nauc_mrr_at_20_std value: -17.70766096956611 - type: nauc_mrr_at_3_diff1 value: 77.99203426607973 - type: nauc_mrr_at_3_max value: 27.39053740753677 - type: nauc_mrr_at_3_std value: -19.110899565832597 - type: nauc_mrr_at_5_diff1 value: 77.99012861357085 - type: nauc_mrr_at_5_max value: 28.018453732422905 - type: nauc_mrr_at_5_std value: -18.45275089190139 - type: nauc_ndcg_at_1000_diff1 value: 77.37899152370498 - type: nauc_ndcg_at_1000_max value: 29.715512454119402 - type: nauc_ndcg_at_1000_std value: -15.311768186844196 - type: nauc_ndcg_at_100_diff1 value: 77.30487512550962 - type: nauc_ndcg_at_100_max value: 30.358291073116767 - type: nauc_ndcg_at_100_std value: -14.276238712942787 - type: nauc_ndcg_at_10_diff1 value: 76.55306779956729 - type: nauc_ndcg_at_10_max value: 31.003218536597576 - type: nauc_ndcg_at_10_std value: -14.528637377688142 - type: nauc_ndcg_at_1_diff1 value: 80.74164178463916 - type: nauc_ndcg_at_1_max value: 23.400992011183135 - type: nauc_ndcg_at_1_std value: -23.155846305708668 - type: nauc_ndcg_at_20_diff1 value: 76.92359358217516 - type: nauc_ndcg_at_20_max value: 30.734983558658648 - type: nauc_ndcg_at_20_std value: -14.12117266760052 - type: nauc_ndcg_at_3_diff1 value: 76.65174056138369 - type: nauc_ndcg_at_3_max value: 27.744998584618365 - type: nauc_ndcg_at_3_std value: -18.596857381234265 - type: nauc_ndcg_at_5_diff1 value: 76.64434516875298 - type: nauc_ndcg_at_5_max value: 29.580949778455096 - type: nauc_ndcg_at_5_std value: -16.820146947848347 - type: nauc_precision_at_1000_diff1 value: -15.819998326963425 - type: nauc_precision_at_1000_max value: 22.790060032171432 - type: nauc_precision_at_1000_std value: 25.646210332652032 - type: nauc_precision_at_100_diff1 value: -3.225658983047692 - type: nauc_precision_at_100_max value: 31.046785086458396 - type: nauc_precision_at_100_std value: 30.64496213174489 - type: nauc_precision_at_10_diff1 value: 22.399826113454544 - type: nauc_precision_at_10_max value: 37.17215584865757 - type: nauc_precision_at_10_std value: 16.375879066453813 - type: nauc_precision_at_1_diff1 value: 80.74164178463916 - type: nauc_precision_at_1_max value: 23.400992011183135 - type: nauc_precision_at_1_std value: -23.155846305708668 - type: nauc_precision_at_20_diff1 value: 11.824890141102545 - type: nauc_precision_at_20_max value: 35.7858012680296 - type: nauc_precision_at_20_std value: 24.36537306318588 - type: nauc_precision_at_3_diff1 value: 46.964579254137156 - type: nauc_precision_at_3_max value: 31.240508812172248 - type: nauc_precision_at_3_std value: -4.790609954536406 - type: nauc_precision_at_5_diff1 value: 35.92331054363029 - type: nauc_precision_at_5_max value: 34.58921599366064 - type: nauc_precision_at_5_std value: 3.955705927038542 - type: nauc_recall_at_1000_diff1 value: 69.82124326053469 - type: nauc_recall_at_1000_max value: 77.26332872982017 - type: nauc_recall_at_1000_std value: 74.20589405678723 - type: nauc_recall_at_100_diff1 value: 
71.09335151657598 - type: nauc_recall_at_100_max value: 74.66551138520433 - type: nauc_recall_at_100_std value: 62.296014312578606 - type: nauc_recall_at_10_diff1 value: 68.34266216578438 - type: nauc_recall_at_10_max value: 51.776074855673635 - type: nauc_recall_at_10_std value: 11.551590635685633 - type: nauc_recall_at_1_diff1 value: 80.13251752826227 - type: nauc_recall_at_1_max value: 19.700940114548352 - type: nauc_recall_at_1_std value: -24.276498497801104 - type: nauc_recall_at_20_diff1 value: 68.44098404116468 - type: nauc_recall_at_20_max value: 58.0709257934264 - type: nauc_recall_at_20_std value: 27.20288447881239 - type: nauc_recall_at_3_diff1 value: 72.224364274587 - type: nauc_recall_at_3_max value: 32.11973511168104 - type: nauc_recall_at_3_std value: -13.287781131985849 - type: nauc_recall_at_5_diff1 value: 70.97684486885963 - type: nauc_recall_at_5_max value: 39.47238239221433 - type: nauc_recall_at_5_std value: -5.749985209368605 - type: ndcg_at_1 value: 67.55 - type: ndcg_at_10 value: 78.679 - type: ndcg_at_100 value: 80.16 - type: ndcg_at_1000 value: 80.42 - type: ndcg_at_20 value: 79.50500000000001 - type: ndcg_at_3 value: 74.96199999999999 - type: ndcg_at_5 value: 77.093 - type: precision_at_1 value: 67.55 - type: precision_at_10 value: 9.589 - type: precision_at_100 value: 1.031 - type: precision_at_1000 value: 0.105 - type: precision_at_20 value: 4.966 - type: precision_at_3 value: 28.319 - type: precision_at_5 value: 18.129 - type: recall_at_1 value: 65.349 - type: recall_at_10 value: 90.10000000000001 - type: recall_at_100 value: 96.685 - type: recall_at_1000 value: 98.714 - type: recall_at_20 value: 93.298 - type: recall_at_3 value: 80.324 - type: recall_at_5 value: 85.37700000000001 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 70.94149293880295 - type: f1 value: 67.43015916458866 - type: f1_weighted value: 70.02165762549619 - type: main_score value: 70.94149293880295 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-TW) type: mteb/amazon_massive_intent config: zh-TW split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 63.05312710154675 - type: f1 value: 61.11778922874984 - type: f1_weighted value: 61.425454449692396 - type: main_score value: 63.05312710154675 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 77.79757901815736 - type: f1 value: 76.85610655879204 - type: f1_weighted value: 77.36623686607157 - type: main_score value: 77.79757901815736 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-TW) type: mteb/amazon_massive_scenario config: zh-TW split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 71.34498991257566 - type: f1 value: 71.42538497861686 - type: f1_weighted value: 70.47776598531958 - type: main_score value: 71.34498991257566 - task: type: Retrieval dataset: name: MTEB MedicalRetrieval type: C-MTEB/MedicalRetrieval config: default split: dev revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6 metrics: - type: main_score value: 57.528999999999996 - type: map_at_1 value: 48.699999999999996 - type: map_at_10 value: 54.674 - type: map_at_100 
value: 55.187 - type: map_at_1000 value: 55.24 - type: map_at_20 value: 54.933 - type: map_at_3 value: 53.367 - type: map_at_5 value: 54.081999999999994 - type: mrr_at_1 value: 48.8 - type: mrr_at_10 value: 54.71369047619046 - type: mrr_at_100 value: 55.23606881716415 - type: mrr_at_1000 value: 55.2887596380029 - type: mrr_at_20 value: 54.98226974307081 - type: mrr_at_3 value: 53.41666666666666 - type: mrr_at_5 value: 54.131666666666646 - type: nauc_map_at_1000_diff1 value: 79.392997677128 - type: nauc_map_at_1000_max value: 47.4042544614244 - type: nauc_map_at_1000_std value: 23.2164546714886 - type: nauc_map_at_100_diff1 value: 79.3811285055918 - type: nauc_map_at_100_max value: 47.399489637525214 - type: nauc_map_at_100_std value: 23.24298678047571 - type: nauc_map_at_10_diff1 value: 79.51795702164893 - type: nauc_map_at_10_max value: 47.3775323018549 - type: nauc_map_at_10_std value: 22.863584607876017 - type: nauc_map_at_1_diff1 value: 82.77387889149895 - type: nauc_map_at_1_max value: 48.92316018033766 - type: nauc_map_at_1_std value: 20.670920881420933 - type: nauc_map_at_20_diff1 value: 79.36321354500926 - type: nauc_map_at_20_max value: 47.347135287818695 - type: nauc_map_at_20_std value: 23.128792587733724 - type: nauc_map_at_3_diff1 value: 79.89693675044646 - type: nauc_map_at_3_max value: 47.999519454025815 - type: nauc_map_at_3_std value: 22.67285215587248 - type: nauc_map_at_5_diff1 value: 79.72880868956226 - type: nauc_map_at_5_max value: 47.870829359727615 - type: nauc_map_at_5_std value: 22.75976001331719 - type: nauc_mrr_at_1000_diff1 value: 79.2558524289943 - type: nauc_mrr_at_1000_max value: 47.68193948210489 - type: nauc_mrr_at_1000_std value: 23.488171939833503 - type: nauc_mrr_at_100_diff1 value: 79.2441760972466 - type: nauc_mrr_at_100_max value: 47.67677923765432 - type: nauc_mrr_at_100_std value: 23.51432250784555 - type: nauc_mrr_at_10_diff1 value: 79.39423493974832 - type: nauc_mrr_at_10_max value: 47.672297066929545 - type: nauc_mrr_at_10_std value: 23.13845505800058 - type: nauc_mrr_at_1_diff1 value: 82.51854957699533 - type: nauc_mrr_at_1_max value: 49.43475537911197 - type: nauc_mrr_at_1_std value: 21.172657021240443 - type: nauc_mrr_at_20_diff1 value: 79.22702612117199 - type: nauc_mrr_at_20_max value: 47.62286080846738 - type: nauc_mrr_at_20_std value: 23.398587017649174 - type: nauc_mrr_at_3_diff1 value: 79.76301529177348 - type: nauc_mrr_at_3_max value: 48.26663425470944 - type: nauc_mrr_at_3_std value: 22.935349467987145 - type: nauc_mrr_at_5_diff1 value: 79.5934610019844 - type: nauc_mrr_at_5_max value: 48.1407033814883 - type: nauc_mrr_at_5_std value: 23.025008156084695 - type: nauc_ndcg_at_1000_diff1 value: 77.97548063568358 - type: nauc_ndcg_at_1000_max value: 46.670156188276266 - type: nauc_ndcg_at_1000_std value: 25.32524568996684 - type: nauc_ndcg_at_100_diff1 value: 77.58788261282791 - type: nauc_ndcg_at_100_max value: 46.366231150510664 - type: nauc_ndcg_at_100_std value: 26.02842093987038 - type: nauc_ndcg_at_10_diff1 value: 78.15883898742274 - type: nauc_ndcg_at_10_max value: 46.181496192291974 - type: nauc_ndcg_at_10_std value: 23.997358704992077 - type: nauc_ndcg_at_1_diff1 value: 82.77387889149895 - type: nauc_ndcg_at_1_max value: 48.92316018033766 - type: nauc_ndcg_at_1_std value: 20.670920881420933 - type: nauc_ndcg_at_20_diff1 value: 77.51209948232727 - type: nauc_ndcg_at_20_max value: 46.02903895633775 - type: nauc_ndcg_at_20_std value: 25.023178998194467 - type: nauc_ndcg_at_3_diff1 value: 79.0464751622174 - type: nauc_ndcg_at_3_max 
value: 47.65456262552185 - type: nauc_ndcg_at_3_std value: 23.50005981191216 - type: nauc_ndcg_at_5_diff1 value: 78.73621060890696 - type: nauc_ndcg_at_5_max value: 47.4490746627881 - type: nauc_ndcg_at_5_std value: 23.70727530773819 - type: nauc_precision_at_1000_diff1 value: 63.42066238259988 - type: nauc_precision_at_1000_max value: 43.54369198659821 - type: nauc_precision_at_1000_std value: 55.676388202339524 - type: nauc_precision_at_100_diff1 value: 67.14856074074835 - type: nauc_precision_at_100_max value: 40.92023184354666 - type: nauc_precision_at_100_std value: 45.790641988757145 - type: nauc_precision_at_10_diff1 value: 73.22243545156664 - type: nauc_precision_at_10_max value: 41.458823923773686 - type: nauc_precision_at_10_std value: 28.142697919198138 - type: nauc_precision_at_1_diff1 value: 82.77387889149895 - type: nauc_precision_at_1_max value: 48.92316018033766 - type: nauc_precision_at_1_std value: 20.670920881420933 - type: nauc_precision_at_20_diff1 value: 69.5822714276579 - type: nauc_precision_at_20_max value: 40.258145844180724 - type: nauc_precision_at_20_std value: 33.443132096498665 - type: nauc_precision_at_3_diff1 value: 76.48729951428531 - type: nauc_precision_at_3_max value: 46.58972515297812 - type: nauc_precision_at_3_std value: 26.07700999310317 - type: nauc_precision_at_5_diff1 value: 75.58859746051998 - type: nauc_precision_at_5_max value: 46.09484444567729 - type: nauc_precision_at_5_std value: 26.82420134602608 - type: nauc_recall_at_1000_diff1 value: 63.42066238260002 - type: nauc_recall_at_1000_max value: 43.543691986598645 - type: nauc_recall_at_1000_std value: 55.67638820233998 - type: nauc_recall_at_100_diff1 value: 67.14856074074834 - type: nauc_recall_at_100_max value: 40.92023184354673 - type: nauc_recall_at_100_std value: 45.79064198875728 - type: nauc_recall_at_10_diff1 value: 73.22243545156665 - type: nauc_recall_at_10_max value: 41.45882392377375 - type: nauc_recall_at_10_std value: 28.14269791919819 - type: nauc_recall_at_1_diff1 value: 82.77387889149895 - type: nauc_recall_at_1_max value: 48.92316018033766 - type: nauc_recall_at_1_std value: 20.670920881420933 - type: nauc_recall_at_20_diff1 value: 69.58227142765797 - type: nauc_recall_at_20_max value: 40.25814584418081 - type: nauc_recall_at_20_std value: 33.443132096498665 - type: nauc_recall_at_3_diff1 value: 76.4872995142853 - type: nauc_recall_at_3_max value: 46.589725152978076 - type: nauc_recall_at_3_std value: 26.07700999310312 - type: nauc_recall_at_5_diff1 value: 75.58859746051999 - type: nauc_recall_at_5_max value: 46.09484444567737 - type: nauc_recall_at_5_std value: 26.8242013460261 - type: ndcg_at_1 value: 48.699999999999996 - type: ndcg_at_10 value: 57.528999999999996 - type: ndcg_at_100 value: 60.38 - type: ndcg_at_1000 value: 61.937 - type: ndcg_at_20 value: 58.518 - type: ndcg_at_3 value: 54.818999999999996 - type: ndcg_at_5 value: 56.101 - type: precision_at_1 value: 48.699999999999996 - type: precision_at_10 value: 6.65 - type: precision_at_100 value: 0.8059999999999999 - type: precision_at_1000 value: 0.093 - type: precision_at_20 value: 3.5249999999999995 - type: precision_at_3 value: 19.667 - type: precision_at_5 value: 12.42 - type: recall_at_1 value: 48.699999999999996 - type: recall_at_10 value: 66.5 - type: recall_at_100 value: 80.60000000000001 - type: recall_at_1000 value: 93.2 - type: recall_at_20 value: 70.5 - type: recall_at_3 value: 59.0 - type: recall_at_5 value: 62.1 - task: type: Classification dataset: name: MTEB MultilingualSentiment type: 
C-MTEB/MultilingualSentiment-classification config: default split: test revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a metrics: - type: accuracy value: 78.08 - type: f1 value: 77.44308848942492 - type: f1_weighted value: 77.44308848942492 - type: main_score value: 78.08 - task: type: PairClassification dataset: name: MTEB Ocnli type: C-MTEB/OCNLI config: default split: validation revision: 66e76a618a34d6d565d5538088562851e6daa7ec metrics: - type: cosine_accuracy value: 66.8651867893882 - type: cosine_accuracy_threshold value: 84.34688448905945 - type: cosine_ap value: 69.83287846115917 - type: cosine_f1 value: 71.33520074696546 - type: cosine_f1_threshold value: 83.85992050170898 - type: cosine_precision value: 63.93305439330545 - type: cosine_recall value: 80.67581837381204 - type: dot_accuracy value: 66.8651867893882 - type: dot_accuracy_threshold value: 84.34690237045288 - type: dot_ap value: 69.83287846115917 - type: dot_f1 value: 71.33520074696546 - type: dot_f1_threshold value: 83.85992050170898 - type: dot_precision value: 63.93305439330545 - type: dot_recall value: 80.67581837381204 - type: euclidean_accuracy value: 66.8651867893882 - type: euclidean_accuracy_threshold value: 55.95196485519409 - type: euclidean_ap value: 69.83287846115917 - type: euclidean_f1 value: 71.33520074696546 - type: euclidean_f1_threshold value: 56.81561827659607 - type: euclidean_precision value: 63.93305439330545 - type: euclidean_recall value: 80.67581837381204 - type: main_score value: 69.83287846115917 - type: manhattan_accuracy value: 66.0530590146183 - type: manhattan_accuracy_threshold value: 1215.458583831787 - type: manhattan_ap value: 69.51465499538298 - type: manhattan_f1 value: 70.56159420289853 - type: manhattan_f1_threshold value: 1344.7942733764648 - type: manhattan_precision value: 61.77636796193497 - type: manhattan_recall value: 82.259767687434 - type: max_ap value: 69.83287846115917 - type: max_f1 value: 71.33520074696546 - type: max_precision value: 63.93305439330545 - type: max_recall value: 82.259767687434 - type: similarity_accuracy value: 66.8651867893882 - type: similarity_accuracy_threshold value: 84.34688448905945 - type: similarity_ap value: 69.83287846115917 - type: similarity_f1 value: 71.33520074696546 - type: similarity_f1_threshold value: 83.85992050170898 - type: similarity_precision value: 63.93305439330545 - type: similarity_recall value: 80.67581837381204 - task: type: Classification dataset: name: MTEB OnlineShopping type: C-MTEB/OnlineShopping-classification config: default split: test revision: e610f2ebd179a8fda30ae534c3878750a96db120 metrics: - type: accuracy value: 93.66999999999999 - type: ap value: 92.68160375501351 - type: ap_weighted value: 92.68160375501351 - type: f1 value: 93.6673524115384 - type: f1_weighted value: 93.67269842799493 - type: main_score value: 93.66999999999999 - task: type: STS dataset: name: MTEB PAWSX type: C-MTEB/PAWSX config: default split: test revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1 metrics: - type: cosine_pearson value: 14.427978400689973 - type: cosine_spearman value: 15.182736434509348 - type: euclidean_pearson value: 17.726048874983753 - type: euclidean_spearman value: 15.201779286945575 - type: main_score value: 15.182736434509348 - type: manhattan_pearson value: 17.715716154164234 - type: manhattan_spearman value: 15.250986981738777 - type: pearson value: 14.427978400689973 - type: spearman value: 15.182736434509348 - task: type: STS dataset: name: MTEB QBQTC type: C-MTEB/QBQTC config: default split: test 
revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7 metrics: - type: cosine_pearson value: 28.677852039385687 - type: cosine_spearman value: 30.317414500566187 - type: euclidean_pearson value: 28.546943523039168 - type: euclidean_spearman value: 30.31773442605619 - type: main_score value: 30.317414500566187 - type: manhattan_pearson value: 29.06524931618951 - type: manhattan_spearman value: 30.85475318983088 - type: pearson value: 28.677852039385687 - type: spearman value: 30.317414500566187 - task: type: STS dataset: name: MTEB STS22 (zh) type: mteb/sts22-crosslingual-sts config: zh split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 74.25169654144152 - type: cosine_spearman value: 74.02188505990078 - type: euclidean_pearson value: 71.78459076777199 - type: euclidean_spearman value: 74.02188505990078 - type: main_score value: 74.02188505990078 - type: manhattan_pearson value: 71.38471936226554 - type: manhattan_spearman value: 73.72453020549669 - type: pearson value: 74.25169654144152 - type: spearman value: 74.02188505990078 - task: type: STS dataset: name: MTEB STSB type: C-MTEB/STSB config: default split: test revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0 metrics: - type: cosine_pearson value: 76.73366278962006 - type: cosine_spearman value: 78.136597096582 - type: euclidean_pearson value: 77.15227584574502 - type: euclidean_spearman value: 78.13622498113003 - type: main_score value: 78.136597096582 - type: manhattan_pearson value: 77.02225035694117 - type: manhattan_spearman value: 78.03964720563964 - type: pearson value: 76.73366278962006 - type: spearman value: 78.136597096582 - task: type: Reranking dataset: name: MTEB T2Reranking type: C-MTEB/T2Reranking config: default split: dev revision: 76631901a18387f85eaa53e5450019b87ad58ef9 metrics: - type: main_score value: 66.38154648171584 - type: map value: 66.38154648171584 - type: mrr value: 76.14530606871499 - type: nAUC_map_diff1 value: -9.806394737932642 - type: nAUC_map_max value: 33.96115791248053 - type: nAUC_map_std value: -3.643316859964786 - type: nAUC_mrr_diff1 value: -6.510263484170889 - type: nAUC_mrr_max value: 26.441557887574124 - type: nAUC_mrr_std value: -4.608018494327204 - task: type: Retrieval dataset: name: MTEB T2Retrieval type: C-MTEB/T2Retrieval config: default split: dev revision: 8731a845f1bf500a4f111cf1070785c793d10e64 metrics: - type: main_score value: 84.133 - type: map_at_1 value: 27.297 - type: map_at_10 value: 76.494 - type: map_at_100 value: 80.119 - type: map_at_1000 value: 80.185 - type: map_at_20 value: 79.251 - type: map_at_3 value: 53.864999999999995 - type: map_at_5 value: 66.143 - type: mrr_at_1 value: 89.57566193231632 - type: mrr_at_10 value: 92.13000711126722 - type: mrr_at_100 value: 92.21882184581148 - type: mrr_at_1000 value: 92.22214774256558 - type: mrr_at_20 value: 92.18699134744894 - type: mrr_at_3 value: 91.66228300894257 - type: mrr_at_5 value: 91.97264597580231 - type: nauc_map_at_1000_diff1 value: 15.207460819974095 - type: nauc_map_at_1000_max value: 42.32453165892631 - type: nauc_map_at_1000_std value: 21.593634336302127 - type: nauc_map_at_100_diff1 value: 15.216272171820561 - type: nauc_map_at_100_max value: 42.22983840076597 - type: nauc_map_at_100_std value: 21.534370324932652 - type: nauc_map_at_10_diff1 value: 19.599553856210008 - type: nauc_map_at_10_max value: 30.246318219245573 - type: nauc_map_at_10_std value: 5.914404965156733 - type: nauc_map_at_1_diff1 value: 52.87085305237716 - type: nauc_map_at_1_max value: 
-24.27989564325726 - type: nauc_map_at_1_std value: -35.442050298290376 - type: nauc_map_at_20_diff1 value: 15.87998380728732 - type: nauc_map_at_20_max value: 39.78308211411551 - type: nauc_map_at_20_std value: 18.241218939315434 - type: nauc_map_at_3_diff1 value: 39.155089053329014 - type: nauc_map_at_3_max value: -11.970155586820502 - type: nauc_map_at_3_std value: -31.83333979404834 - type: nauc_map_at_5_diff1 value: 31.43539185744996 - type: nauc_map_at_5_max value: 3.5586067754503152 - type: nauc_map_at_5_std value: -20.89939723260621 - type: nauc_mrr_at_1000_diff1 value: 47.58856242843391 - type: nauc_mrr_at_1000_max value: 73.33044542878086 - type: nauc_mrr_at_1000_std value: 41.41370720044016 - type: nauc_mrr_at_100_diff1 value: 47.58885589082642 - type: nauc_mrr_at_100_max value: 73.33895048178488 - type: nauc_mrr_at_100_std value: 41.42862248729776 - type: nauc_mrr_at_10_diff1 value: 47.60432720674615 - type: nauc_mrr_at_10_max value: 73.47964069672504 - type: nauc_mrr_at_10_std value: 41.60604407817306 - type: nauc_mrr_at_1_diff1 value: 47.84195771830615 - type: nauc_mrr_at_1_max value: 68.95221045759685 - type: nauc_mrr_at_1_std value: 35.145250281429824 - type: nauc_mrr_at_20_diff1 value: 47.58534671931297 - type: nauc_mrr_at_20_max value: 73.39618815713096 - type: nauc_mrr_at_20_std value: 41.50538366605475 - type: nauc_mrr_at_3_diff1 value: 47.54080143480509 - type: nauc_mrr_at_3_max value: 73.27456449852177 - type: nauc_mrr_at_3_std value: 41.190010138623364 - type: nauc_mrr_at_5_diff1 value: 47.631799071300314 - type: nauc_mrr_at_5_max value: 73.50427384392508 - type: nauc_mrr_at_5_std value: 41.41445819292792 - type: nauc_ndcg_at_1000_diff1 value: 19.178203338132032 - type: nauc_ndcg_at_1000_max value: 54.846002008332206 - type: nauc_ndcg_at_1000_std value: 33.669755579706234 - type: nauc_ndcg_at_100_diff1 value: 18.825625578528154 - type: nauc_ndcg_at_100_max value: 53.96154830438667 - type: nauc_ndcg_at_100_std value: 33.63879617215427 - type: nauc_ndcg_at_10_diff1 value: 18.95559446945268 - type: nauc_ndcg_at_10_max value: 44.21334528575739 - type: nauc_ndcg_at_10_std value: 22.47737214494352 - type: nauc_ndcg_at_1_diff1 value: 47.84195771830615 - type: nauc_ndcg_at_1_max value: 68.95221045759685 - type: nauc_ndcg_at_1_std value: 35.145250281429824 - type: nauc_ndcg_at_20_diff1 value: 18.915787332802143 - type: nauc_ndcg_at_20_max value: 48.64628634208606 - type: nauc_ndcg_at_20_std value: 27.471901227649102 - type: nauc_ndcg_at_3_diff1 value: 14.800326460175548 - type: nauc_ndcg_at_3_max value: 58.714123081214986 - type: nauc_ndcg_at_3_std value: 32.87146819333138 - type: nauc_ndcg_at_5_diff1 value: 15.117887863548916 - type: nauc_ndcg_at_5_max value: 51.62270126506565 - type: nauc_ndcg_at_5_std value: 28.21637936542305 - type: nauc_precision_at_1000_diff1 value: -34.6115257538737 - type: nauc_precision_at_1000_max value: 46.57505454335497 - type: nauc_precision_at_1000_std value: 58.73410354296305 - type: nauc_precision_at_100_diff1 value: -34.51864090348213 - type: nauc_precision_at_100_max value: 48.12778307352527 - type: nauc_precision_at_100_std value: 60.33112526548986 - type: nauc_precision_at_10_diff1 value: -33.913446995683536 - type: nauc_precision_at_10_max value: 51.827800576762726 - type: nauc_precision_at_10_std value: 56.15214316846719 - type: nauc_precision_at_1_diff1 value: 47.84195771830615 - type: nauc_precision_at_1_max value: 68.95221045759685 - type: nauc_precision_at_1_std value: 35.145250281429824 - type: nauc_precision_at_20_diff1 value: 
-34.25535498799855 - type: nauc_precision_at_20_max value: 50.23119733433027 - type: nauc_precision_at_20_std value: 59.671418737988546 - type: nauc_precision_at_3_diff1 value: -28.417107232598877 - type: nauc_precision_at_3_max value: 61.16886341335774 - type: nauc_precision_at_3_std value: 48.34533128391697 - type: nauc_precision_at_5_diff1 value: -33.54570066440394 - type: nauc_precision_at_5_max value: 56.522769824532936 - type: nauc_precision_at_5_std value: 51.704950593707935 - type: nauc_recall_at_1000_diff1 value: 2.93977183499487 - type: nauc_recall_at_1000_max value: 59.19161397622145 - type: nauc_recall_at_1000_std value: 62.44563668374114 - type: nauc_recall_at_100_diff1 value: 8.013825549311562 - type: nauc_recall_at_100_max value: 49.846341160862714 - type: nauc_recall_at_100_std value: 48.1170998033127 - type: nauc_recall_at_10_diff1 value: 18.010735796887985 - type: nauc_recall_at_10_max value: 21.358569425898903 - type: nauc_recall_at_10_std value: 1.3301139186106035 - type: nauc_recall_at_1_diff1 value: 52.87085305237716 - type: nauc_recall_at_1_max value: -24.27989564325726 - type: nauc_recall_at_1_std value: -35.442050298290376 - type: nauc_recall_at_20_diff1 value: 11.816321531579238 - type: nauc_recall_at_20_max value: 36.13782953010234 - type: nauc_recall_at_20_std value: 23.555109581359886 - type: nauc_recall_at_3_diff1 value: 37.46336191367832 - type: nauc_recall_at_3_max value: -16.038670342884316 - type: nauc_recall_at_3_std value: -34.074784083025214 - type: nauc_recall_at_5_diff1 value: 30.274716744272567 - type: nauc_recall_at_5_max value: -4.34067124108913 - type: nauc_recall_at_5_std value: -26.21894992157237 - type: ndcg_at_1 value: 89.576 - type: ndcg_at_10 value: 84.133 - type: ndcg_at_100 value: 87.773 - type: ndcg_at_1000 value: 88.421 - type: ndcg_at_20 value: 85.909 - type: ndcg_at_3 value: 85.539 - type: ndcg_at_5 value: 84.143 - type: precision_at_1 value: 89.576 - type: precision_at_10 value: 41.789 - type: precision_at_100 value: 4.995 - type: precision_at_1000 value: 0.515 - type: precision_at_20 value: 23.224 - type: precision_at_3 value: 74.79400000000001 - type: precision_at_5 value: 62.683 - type: recall_at_1 value: 27.297 - type: recall_at_10 value: 83.035 - type: recall_at_100 value: 94.915 - type: recall_at_1000 value: 98.225 - type: recall_at_20 value: 88.984 - type: recall_at_3 value: 55.533 - type: recall_at_5 value: 69.575 - task: type: Classification dataset: name: MTEB TNews type: C-MTEB/TNews-classification config: default split: validation revision: 317f262bf1e6126357bbe89e875451e4b0938fe4 metrics: - type: accuracy value: 51.664 - type: f1 value: 49.254634831292336 - type: f1_weighted value: 51.23047453836118 - type: main_score value: 51.664 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringP2P type: C-MTEB/ThuNewsClusteringP2P config: default split: test revision: 5798586b105c0434e4f0fe5e767abe619442cf93 metrics: - type: main_score value: 62.931149356482294 - type: v_measure value: 62.931149356482294 - type: v_measure_std value: 1.2113879267357022 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringS2S type: C-MTEB/ThuNewsClusteringS2S config: default split: test revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d metrics: - type: main_score value: 59.18138500076393 - type: v_measure value: 59.18138500076393 - type: v_measure_std value: 1.441163494106974 - task: type: Retrieval dataset: name: MTEB VideoRetrieval type: C-MTEB/VideoRetrieval config: default split: dev revision: 
## Evaluation Results

The MTEB scores below summarize performance across retrieval, classification, clustering, reranking, pair classification, STS, and summarization tasks (mostly French-language). All results are on the `test` split; scores are on a 0–100 scale, higher is better. Entries listed with two scores correspond to two reported settings of the same dataset.

| Task | Dataset | Main metric | Score |
|---|---|---|---|
| Retrieval | (revision 58c2597a5943a2ba48f4668c3b90d796283c5639) | ndcg_at_10 | 72.15 |
| Classification | Waimai (C-MTEB/waimai-classification) | accuracy | 89.12 |
| Clustering | AlloProfClusteringP2P (lyon-nlp/alloprof) | v_measure | 66.71 / 47.68 |
| Reranking | AlloprofReranking (lyon-nlp/mteb-fr-reranking-alloprof-s2p) | map | 75.05 |
| Retrieval | AlloprofRetrieval (lyon-nlp/alloprof) | ndcg_at_10 | 50.64 |
| Classification | AmazonReviewsClassification (fr) (mteb/amazon_reviews_multi) | accuracy | 52.62 |
| Retrieval | BSARDRetrieval (maastrichtlawtech/bsard) | recall_at_100 | 59.46 |
| Clustering | HALClusteringS2S (lyon-nlp/clustering-hal-s2s) | v_measure | 26.96 |
| Clustering | MLSUMClusteringP2P (fr) (reciTAL/mlsum) | v_measure | 46.16 / 45.73 |
| Classification | MTOPDomainClassification (fr) (mteb/mtop_domain) | accuracy | 95.79 |
| Classification | MTOPIntentClassification (fr) (mteb/mtop_intent) | accuracy | 80.88 |
| Classification | MasakhaNEWSClassification (fra) (mteb/masakhanews) | accuracy | 83.63 |
| Clustering | MasakhaNEWSClusteringP2P (fra) (masakhane/masakhanews) | v_measure | 77.20 / 65.93 |
| Classification | MassiveIntentClassification (fr) (mteb/amazon_massive_intent) | accuracy | 69.54 |
| Classification | MassiveScenarioClassification (fr) (mteb/amazon_massive_scenario) | accuracy | 78.53 |
| Retrieval | MintakaRetrieval (fr) (jinaai/mintakaqa) | ndcg_at_10 | 26.45 |
| PairClassification | OpusparcusPC (fr) (GEM/opusparcus) | cosine_ap | 93.40 |
| PairClassification | PawsXPairClassification (fr) (google-research-datasets/paws-x) | max_ap | 60.73 |
| STS | SICKFr (Lajavaness/SICK-fr) | cosine_spearman | 77.47 |
| STS | STS22 (fr) (mteb/sts22-crosslingual-sts) | cosine_spearman | 81.71 |
| STS | STS22 (de-fr) (mteb/sts22-crosslingual-sts) | cosine_spearman | 64.21 |
| STS | STS22 (fr-pl) (mteb/sts22-crosslingual-sts) | cosine_spearman | 84.52 |
| STS | STSBenchmarkMultilingualSTS (fr) (mteb/stsb_multi_mt) | cosine_spearman | 81.07 |
| Summarization | SummEvalFr (lyon-nlp/summarization-summeval-fr-p2p) | cosine_spearman | 30.89 |
| Reranking | SyntecReranking (lyon-nlp/mteb-fr-reranking-syntec-s2p) | map | 86.60 |
| Retrieval | SyntecRetrieval (lyon-nlp/mteb-fr-retrieval-syntec-s2p) | ndcg_at_10 | 81.90 |
| Retrieval | XPQARetrieval (fra-fra) (jinaai/xpqa) | ndcg_at_10 | 62.09 |
| Retrieval | XPQARetrieval (eng-fra) (jinaai/xpqa) | ndcg_at_10 | 34.80 |
| Retrieval | XPQARetrieval (fra-eng) (jinaai/xpqa) | ndcg_at_10 | 55.17 |
nauc_recall_at_5_max value: 36.67737500141328 - type: nauc_recall_at_5_std value: -7.16711230678949 - type: ndcg_at_1 value: 47.797 - type: ndcg_at_10 value: 55.165 - type: ndcg_at_100 value: 61.072 - type: ndcg_at_1000 value: 62.766999999999996 - type: ndcg_at_20 value: 57.603 - type: ndcg_at_3 value: 50.134 - type: ndcg_at_5 value: 51.711 - type: precision_at_1 value: 47.797 - type: precision_at_10 value: 13.150999999999998 - type: precision_at_100 value: 1.8370000000000002 - type: precision_at_1000 value: 0.20600000000000002 - type: precision_at_20 value: 7.517 - type: precision_at_3 value: 30.975 - type: precision_at_5 value: 22.27 - type: recall_at_1 value: 30.070999999999998 - type: recall_at_10 value: 65.352 - type: recall_at_100 value: 88.31099999999999 - type: recall_at_1000 value: 99.417 - type: recall_at_20 value: 72.65 - type: recall_at_3 value: 49.891000000000005 - type: recall_at_5 value: 56.949000000000005 - task: type: Classification dataset: name: MTEB AllegroReviews type: PL-MTEB/allegro-reviews config: default split: test revision: b89853e6de927b0e3bfa8ecc0e56fe4e02ceafc6 metrics: - type: accuracy value: 53.48906560636182 - type: f1 value: 41.948000361532074 - type: f1_weighted value: 50.64284561538599 - type: main_score value: 53.48906560636182 - task: type: Retrieval dataset: name: MTEB ArguAna-PL type: clarin-knext/arguana-pl config: default split: test revision: 63fc86750af76253e8c760fc9e534bbf24d260a2 metrics: - type: main_score value: 49.913000000000004 - type: map_at_1 value: 24.04 - type: map_at_10 value: 40.493 - type: map_at_100 value: 41.447 - type: map_at_1000 value: 41.454 - type: map_at_20 value: 41.197 - type: map_at_3 value: 35.099999999999994 - type: map_at_5 value: 38.196999999999996 - type: mrr_at_1 value: 24.537695590327168 - type: mrr_at_10 value: 40.67259929102034 - type: mrr_at_100 value: 41.639618460436125 - type: mrr_at_1000 value: 41.64596845247576 - type: mrr_at_20 value: 41.38915253517258 - type: mrr_at_3 value: 35.27738264580362 - type: mrr_at_5 value: 38.40327169274532 - type: nauc_map_at_1000_diff1 value: 7.431509810863732 - type: nauc_map_at_1000_max value: -2.981272220393634 - type: nauc_map_at_1000_std value: -7.60710973485905 - type: nauc_map_at_100_diff1 value: 7.436737619273204 - type: nauc_map_at_100_max value: -2.967184788185936 - type: nauc_map_at_100_std value: -7.597426337410871 - type: nauc_map_at_10_diff1 value: 7.255093659685807 - type: nauc_map_at_10_max value: -2.9042962900147544 - type: nauc_map_at_10_std value: -7.934694729089717 - type: nauc_map_at_1_diff1 value: 12.509203312194646 - type: nauc_map_at_1_max value: -5.881727148045224 - type: nauc_map_at_1_std value: -7.791332615643759 - type: nauc_map_at_20_diff1 value: 7.327100008464186 - type: nauc_map_at_20_max value: -2.837417061935196 - type: nauc_map_at_20_std value: -7.727026459254324 - type: nauc_map_at_3_diff1 value: 6.852993254257847 - type: nauc_map_at_3_max value: -4.051470844228069 - type: nauc_map_at_3_std value: -7.896963683580916 - type: nauc_map_at_5_diff1 value: 6.528299731268904 - type: nauc_map_at_5_max value: -3.6970340215693476 - type: nauc_map_at_5_std value: -7.655276417735266 - type: nauc_mrr_at_1000_diff1 value: 5.711449969160694 - type: nauc_mrr_at_1000_max value: -3.4753506470039266 - type: nauc_mrr_at_1000_std value: -7.794020380041222 - type: nauc_mrr_at_100_diff1 value: 5.717019799542202 - type: nauc_mrr_at_100_max value: -3.461221495753972 - type: nauc_mrr_at_100_std value: -7.784340755281538 - type: nauc_mrr_at_10_diff1 value: 
5.509993731954919 - type: nauc_mrr_at_10_max value: -3.4562614853854345 - type: nauc_mrr_at_10_std value: -8.172557318463994 - type: nauc_mrr_at_1_diff1 value: 10.815838583441858 - type: nauc_mrr_at_1_max value: -5.323382194534891 - type: nauc_mrr_at_1_std value: -8.038288156705363 - type: nauc_mrr_at_20_diff1 value: 5.622966175346149 - type: nauc_mrr_at_20_max value: -3.3271519171448602 - type: nauc_mrr_at_20_std value: -7.911979321248223 - type: nauc_mrr_at_3_diff1 value: 5.1203118177676945 - type: nauc_mrr_at_3_max value: -4.663436282182911 - type: nauc_mrr_at_3_std value: -8.16687342201878 - type: nauc_mrr_at_5_diff1 value: 4.899936200607895 - type: nauc_mrr_at_5_max value: -4.238888324916206 - type: nauc_mrr_at_5_std value: -7.911378372003927 - type: nauc_ndcg_at_1000_diff1 value: 7.208621858675132 - type: nauc_ndcg_at_1000_max value: -1.9047927444267347 - type: nauc_ndcg_at_1000_std value: -6.986137159109878 - type: nauc_ndcg_at_100_diff1 value: 7.409545817332008 - type: nauc_ndcg_at_100_max value: -1.4631671846013694 - type: nauc_ndcg_at_100_std value: -6.630280309037233 - type: nauc_ndcg_at_10_diff1 value: 6.4667756391170395 - type: nauc_ndcg_at_10_max value: -0.6950268010456382 - type: nauc_ndcg_at_10_std value: -8.022144927522392 - type: nauc_ndcg_at_1_diff1 value: 12.509203312194646 - type: nauc_ndcg_at_1_max value: -5.881727148045224 - type: nauc_ndcg_at_1_std value: -7.791332615643759 - type: nauc_ndcg_at_20_diff1 value: 6.726279074146785 - type: nauc_ndcg_at_20_max value: -0.3861052348420354 - type: nauc_ndcg_at_20_std value: -7.221277273790139 - type: nauc_ndcg_at_3_diff1 value: 5.5538863803913365 - type: nauc_ndcg_at_3_max value: -3.5651217527245946 - type: nauc_ndcg_at_3_std value: -7.826880086024049 - type: nauc_ndcg_at_5_diff1 value: 4.878905871379252 - type: nauc_ndcg_at_5_max value: -2.821048486985759 - type: nauc_ndcg_at_5_std value: -7.31598311150453 - type: nauc_precision_at_1000_diff1 value: 31.595672412803232 - type: nauc_precision_at_1000_max value: 42.56487657246246 - type: nauc_precision_at_1000_std value: 76.77064740096077 - type: nauc_precision_at_100_diff1 value: 37.959767569852325 - type: nauc_precision_at_100_max value: 61.03819238774345 - type: nauc_precision_at_100_std value: 57.75475522584779 - type: nauc_precision_at_10_diff1 value: 3.679895666980749 - type: nauc_precision_at_10_max value: 11.38829056417457 - type: nauc_precision_at_10_std value: -8.650914185729293 - type: nauc_precision_at_1_diff1 value: 12.509203312194646 - type: nauc_precision_at_1_max value: -5.881727148045224 - type: nauc_precision_at_1_std value: -7.791332615643759 - type: nauc_precision_at_20_diff1 value: 4.065515107956777 - type: nauc_precision_at_20_max value: 23.888067135216097 - type: nauc_precision_at_20_std value: -1.4622436922054596 - type: nauc_precision_at_3_diff1 value: 2.1003082872796663 - type: nauc_precision_at_3_max value: -2.24675019839533 - type: nauc_precision_at_3_std value: -7.604178336955303 - type: nauc_precision_at_5_diff1 value: -0.246824792648523 - type: nauc_precision_at_5_max value: 0.0642032358424201 - type: nauc_precision_at_5_std value: -6.0892549043276745 - type: nauc_recall_at_1000_diff1 value: 31.59567241280578 - type: nauc_recall_at_1000_max value: 42.564876572459895 - type: nauc_recall_at_1000_std value: 76.7706474009625 - type: nauc_recall_at_100_diff1 value: 37.95976756985206 - type: nauc_recall_at_100_max value: 61.03819238774383 - type: nauc_recall_at_100_std value: 57.75475522584684 - type: nauc_recall_at_10_diff1 value: 3.679895666980674 - 
type: nauc_recall_at_10_max value: 11.388290564174538 - type: nauc_recall_at_10_std value: -8.650914185729265 - type: nauc_recall_at_1_diff1 value: 12.509203312194646 - type: nauc_recall_at_1_max value: -5.881727148045224 - type: nauc_recall_at_1_std value: -7.791332615643759 - type: nauc_recall_at_20_diff1 value: 4.065515107957231 - type: nauc_recall_at_20_max value: 23.888067135216005 - type: nauc_recall_at_20_std value: -1.462243692205622 - type: nauc_recall_at_3_diff1 value: 2.100308287279676 - type: nauc_recall_at_3_max value: -2.2467501983953024 - type: nauc_recall_at_3_std value: -7.604178336955286 - type: nauc_recall_at_5_diff1 value: -0.24682479264852286 - type: nauc_recall_at_5_max value: 0.06420323584243659 - type: nauc_recall_at_5_std value: -6.089254904327643 - type: ndcg_at_1 value: 24.04 - type: ndcg_at_10 value: 49.913000000000004 - type: ndcg_at_100 value: 54.057 - type: ndcg_at_1000 value: 54.213 - type: ndcg_at_20 value: 52.42400000000001 - type: ndcg_at_3 value: 38.842999999999996 - type: ndcg_at_5 value: 44.416 - type: precision_at_1 value: 24.04 - type: precision_at_10 value: 8.009 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.495 - type: precision_at_3 value: 16.572 - type: precision_at_5 value: 12.645999999999999 - type: recall_at_1 value: 24.04 - type: recall_at_10 value: 80.085 - type: recall_at_100 value: 98.36399999999999 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_20 value: 89.9 - type: recall_at_3 value: 49.716 - type: recall_at_5 value: 63.229 - task: type: Classification dataset: name: MTEB CBD type: PL-MTEB/cbd config: default split: test revision: 36ddb419bcffe6a5374c3891957912892916f28d metrics: - type: accuracy value: 64.91 - type: ap value: 20.253009474993238 - type: ap_weighted value: 20.253009474993238 - type: f1 value: 54.83698737303514 - type: f1_weighted value: 69.53194816160229 - type: main_score value: 64.91 - task: type: PairClassification dataset: name: MTEB CDSC-E type: PL-MTEB/cdsce-pairclassification config: default split: test revision: 0a3d4aa409b22f80eb22cbf59b492637637b536d metrics: - type: cosine_accuracy value: 87.5 - type: cosine_accuracy_threshold value: 97.39001989364624 - type: cosine_ap value: 71.63899566137869 - type: cosine_f1 value: 64.39024390243902 - type: cosine_f1_threshold value: 94.18535828590393 - type: cosine_precision value: 60.0 - type: cosine_recall value: 69.47368421052632 - type: dot_accuracy value: 87.5 - type: dot_accuracy_threshold value: 97.39001989364624 - type: dot_ap value: 71.63899566137869 - type: dot_f1 value: 64.39024390243902 - type: dot_f1_threshold value: 94.18535232543945 - type: dot_precision value: 60.0 - type: dot_recall value: 69.47368421052632 - type: euclidean_accuracy value: 87.5 - type: euclidean_accuracy_threshold value: 22.847232222557068 - type: euclidean_ap value: 71.63899566137869 - type: euclidean_f1 value: 64.39024390243902 - type: euclidean_f1_threshold value: 34.101736545562744 - type: euclidean_precision value: 60.0 - type: euclidean_recall value: 69.47368421052632 - type: main_score value: 71.83631821171632 - type: manhattan_accuracy value: 87.6 - type: manhattan_accuracy_threshold value: 499.97105598449707 - type: manhattan_ap value: 71.83631821171632 - type: manhattan_f1 value: 64.5631067961165 - type: manhattan_f1_threshold value: 809.0234756469727 - type: manhattan_precision value: 59.909909909909906 - type: manhattan_recall value: 70.0 - type: max_ap value: 71.83631821171632 - type: max_f1 
value: 64.5631067961165 - type: max_precision value: 60.0 - type: max_recall value: 70.0 - type: similarity_accuracy value: 87.5 - type: similarity_accuracy_threshold value: 97.39001989364624 - type: similarity_ap value: 71.63899566137869 - type: similarity_f1 value: 64.39024390243902 - type: similarity_f1_threshold value: 94.18535828590393 - type: similarity_precision value: 60.0 - type: similarity_recall value: 69.47368421052632 - task: type: STS dataset: name: MTEB CDSC-R type: PL-MTEB/cdscr-sts config: default split: test revision: 1cd6abbb00df7d14be3dbd76a7dcc64b3a79a7cd metrics: - type: cosine_pearson value: 89.9839992597087 - type: cosine_spearman value: 90.27044716786627 - type: euclidean_pearson value: 87.74719535276023 - type: euclidean_spearman value: 90.2703874383013 - type: main_score value: 90.27044716786627 - type: manhattan_pearson value: 87.81149530960033 - type: manhattan_spearman value: 90.37098083828207 - type: pearson value: 89.9839992597087 - type: spearman value: 90.27044716786627 - task: type: Retrieval dataset: name: MTEB DBPedia-PL type: clarin-knext/dbpedia-pl config: default split: test revision: 76afe41d9af165cc40999fcaa92312b8b012064a metrics: - type: main_score value: 29.225 - type: map_at_1 value: 5.92 - type: map_at_10 value: 13.052 - type: map_at_100 value: 18.054000000000002 - type: map_at_1000 value: 19.378999999999998 - type: map_at_20 value: 14.921000000000001 - type: map_at_3 value: 9.517000000000001 - type: map_at_5 value: 11.122 - type: mrr_at_1 value: 45.0 - type: mrr_at_10 value: 57.3967261904762 - type: mrr_at_100 value: 57.83804567388388 - type: mrr_at_1000 value: 57.86075000832548 - type: mrr_at_20 value: 57.66969785675282 - type: mrr_at_3 value: 55.16666666666667 - type: mrr_at_5 value: 56.64166666666669 - type: nauc_map_at_1000_diff1 value: 29.411798531506246 - type: nauc_map_at_1000_max value: 20.900134633305655 - type: nauc_map_at_1000_std value: 31.404039472246353 - type: nauc_map_at_100_diff1 value: 30.843903551109808 - type: nauc_map_at_100_max value: 17.39151067247246 - type: nauc_map_at_100_std value: 27.44650726590824 - type: nauc_map_at_10_diff1 value: 37.979613569219495 - type: nauc_map_at_10_max value: 9.222700346624988 - type: nauc_map_at_10_std value: 12.007799385555293 - type: nauc_map_at_1_diff1 value: 53.50284116730185 - type: nauc_map_at_1_max value: 1.370522275254312 - type: nauc_map_at_1_std value: -0.30640006292692257 - type: nauc_map_at_20_diff1 value: 35.67559578714465 - type: nauc_map_at_20_max value: 12.765002402346221 - type: nauc_map_at_20_std value: 17.73265858605054 - type: nauc_map_at_3_diff1 value: 45.619789003130585 - type: nauc_map_at_3_max value: 1.045838638341231 - type: nauc_map_at_3_std value: 2.319308580529236 - type: nauc_map_at_5_diff1 value: 42.08058689946505 - type: nauc_map_at_5_max value: 5.337616164644746 - type: nauc_map_at_5_std value: 4.73118790791731 - type: nauc_mrr_at_1000_diff1 value: 34.33930133013396 - type: nauc_mrr_at_1000_max value: 29.38799773918778 - type: nauc_mrr_at_1000_std value: 32.26009048699902 - type: nauc_mrr_at_100_diff1 value: 34.3197444457885 - type: nauc_mrr_at_100_max value: 29.413059576309497 - type: nauc_mrr_at_100_std value: 32.26908951100588 - type: nauc_mrr_at_10_diff1 value: 34.30610810384026 - type: nauc_mrr_at_10_max value: 29.25358347303212 - type: nauc_mrr_at_10_std value: 32.42735770220712 - type: nauc_mrr_at_1_diff1 value: 38.47836050546717 - type: nauc_mrr_at_1_max value: 25.549990178746796 - type: nauc_mrr_at_1_std value: 27.017285405617763 - type: 
nauc_mrr_at_20_diff1 value: 34.32685063678914 - type: nauc_mrr_at_20_max value: 29.382152716878547 - type: nauc_mrr_at_20_std value: 32.36225065070027 - type: nauc_mrr_at_3_diff1 value: 34.94513788944085 - type: nauc_mrr_at_3_max value: 28.948106098297938 - type: nauc_mrr_at_3_std value: 31.752978523564845 - type: nauc_mrr_at_5_diff1 value: 34.22773791436512 - type: nauc_mrr_at_5_max value: 28.645995406061914 - type: nauc_mrr_at_5_std value: 31.947761641656065 - type: nauc_ndcg_at_1000_diff1 value: 23.59930215160307 - type: nauc_ndcg_at_1000_max value: 30.004827423326873 - type: nauc_ndcg_at_1000_std value: 45.14606063029462 - type: nauc_ndcg_at_100_diff1 value: 27.150265390833766 - type: nauc_ndcg_at_100_max value: 21.542350038665962 - type: nauc_ndcg_at_100_std value: 37.04783459199997 - type: nauc_ndcg_at_10_diff1 value: 30.44928623138369 - type: nauc_ndcg_at_10_max value: 21.38523283782705 - type: nauc_ndcg_at_10_std value: 31.948655996496527 - type: nauc_ndcg_at_1_diff1 value: 38.141954118151105 - type: nauc_ndcg_at_1_max value: 20.764788523221725 - type: nauc_ndcg_at_1_std value: 24.457971796268065 - type: nauc_ndcg_at_20_diff1 value: 31.668458090974728 - type: nauc_ndcg_at_20_max value: 20.1903988669924 - type: nauc_ndcg_at_20_std value: 30.646872442412544 - type: nauc_ndcg_at_3_diff1 value: 30.030850630038053 - type: nauc_ndcg_at_3_max value: 19.919461574491066 - type: nauc_ndcg_at_3_std value: 28.065728170179188 - type: nauc_ndcg_at_5_diff1 value: 30.06324115773368 - type: nauc_ndcg_at_5_max value: 21.013491210996943 - type: nauc_ndcg_at_5_std value: 29.390767365137947 - type: nauc_precision_at_1000_diff1 value: -15.2968288893292 - type: nauc_precision_at_1000_max value: 48.371418703337305 - type: nauc_precision_at_1000_std value: 33.90852748893144 - type: nauc_precision_at_100_diff1 value: -7.607176962046647 - type: nauc_precision_at_100_max value: 35.35122884806948 - type: nauc_precision_at_100_std value: 46.4742326977524 - type: nauc_precision_at_10_diff1 value: 0.0234083902358811 - type: nauc_precision_at_10_max value: 34.310462135642645 - type: nauc_precision_at_10_std value: 46.22745495492598 - type: nauc_precision_at_1_diff1 value: 38.47836050546717 - type: nauc_precision_at_1_max value: 25.549990178746796 - type: nauc_precision_at_1_std value: 27.017285405617763 - type: nauc_precision_at_20_diff1 value: -0.7281234339501458 - type: nauc_precision_at_20_max value: 34.879992298927796 - type: nauc_precision_at_20_std value: 46.6455237720046 - type: nauc_precision_at_3_diff1 value: 12.557632325001943 - type: nauc_precision_at_3_max value: 27.472641291674343 - type: nauc_precision_at_3_std value: 32.76253410590738 - type: nauc_precision_at_5_diff1 value: 5.72403051661784 - type: nauc_precision_at_5_max value: 31.623557984213747 - type: nauc_precision_at_5_std value: 37.60956680129879 - type: nauc_recall_at_1000_diff1 value: 5.745409852861974 - type: nauc_recall_at_1000_max value: 27.497512598172698 - type: nauc_recall_at_1000_std value: 48.07303762126119 - type: nauc_recall_at_100_diff1 value: 17.211282922855617 - type: nauc_recall_at_100_max value: 17.98582110327383 - type: nauc_recall_at_100_std value: 34.86455715009784 - type: nauc_recall_at_10_diff1 value: 28.755279638184874 - type: nauc_recall_at_10_max value: 8.106029595934537 - type: nauc_recall_at_10_std value: 12.493783688335569 - type: nauc_recall_at_1_diff1 value: 53.50284116730185 - type: nauc_recall_at_1_max value: 1.370522275254312 - type: nauc_recall_at_1_std value: -0.30640006292692257 - type: 
nauc_recall_at_20_diff1 value: 27.994527440411993 - type: nauc_recall_at_20_max value: 12.916323071056604 - type: nauc_recall_at_20_std value: 17.70928825635808 - type: nauc_recall_at_3_diff1 value: 39.80550258552395 - type: nauc_recall_at_3_max value: -0.8593780074939045 - type: nauc_recall_at_3_std value: 2.086691158003704 - type: nauc_recall_at_5_diff1 value: 34.29080510342918 - type: nauc_recall_at_5_max value: 2.8885937240283113 - type: nauc_recall_at_5_std value: 2.6609799835271852 - type: ndcg_at_1 value: 35.875 - type: ndcg_at_10 value: 29.225 - type: ndcg_at_100 value: 33.554 - type: ndcg_at_1000 value: 40.908 - type: ndcg_at_20 value: 28.910000000000004 - type: ndcg_at_3 value: 32.405 - type: ndcg_at_5 value: 30.408 - type: precision_at_1 value: 45.0 - type: precision_at_10 value: 23.599999999999998 - type: precision_at_100 value: 7.68 - type: precision_at_1000 value: 1.804 - type: precision_at_20 value: 17.5 - type: precision_at_3 value: 36.167 - type: precision_at_5 value: 30.15 - type: recall_at_1 value: 5.92 - type: recall_at_10 value: 18.658 - type: recall_at_100 value: 40.144999999999996 - type: recall_at_1000 value: 63.914 - type: recall_at_20 value: 23.91 - type: recall_at_3 value: 11.334 - type: recall_at_5 value: 14.251 - task: type: Clustering dataset: name: MTEB 8TagsClustering type: PL-MTEB/8tags-clustering config: default split: test revision: 78b962b130c6690659c65abf67bf1c2f030606b6 metrics: - type: main_score value: 37.57372573379629 - type: v_measure value: 37.57372573379629 - type: v_measure_std value: 1.576502898019969 - task: type: Retrieval dataset: name: MTEB FiQA-PL type: clarin-knext/fiqa-pl config: default split: test revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e metrics: - type: main_score value: 25.322 - type: map_at_1 value: 12.084 - type: map_at_10 value: 19.402 - type: map_at_100 value: 20.766000000000002 - type: map_at_1000 value: 20.958 - type: map_at_20 value: 20.085 - type: map_at_3 value: 16.794 - type: map_at_5 value: 18.242 - type: mrr_at_1 value: 23.30246913580247 - type: mrr_at_10 value: 31.084594846168915 - type: mrr_at_100 value: 32.081458268143486 - type: mrr_at_1000 value: 32.15082259510916 - type: mrr_at_20 value: 31.641799089124518 - type: mrr_at_3 value: 28.703703703703713 - type: mrr_at_5 value: 30.12345679012346 - type: nauc_map_at_1000_diff1 value: 33.497391865616606 - type: nauc_map_at_1000_max value: 15.431683878656488 - type: nauc_map_at_1000_std value: 10.827813213986468 - type: nauc_map_at_100_diff1 value: 33.534068616502886 - type: nauc_map_at_100_max value: 15.291439989133599 - type: nauc_map_at_100_std value: 10.715061061847777 - type: nauc_map_at_10_diff1 value: 33.49437614167937 - type: nauc_map_at_10_max value: 14.377484560226964 - type: nauc_map_at_10_std value: 9.487834206589557 - type: nauc_map_at_1_diff1 value: 39.87810373637443 - type: nauc_map_at_1_max value: 10.730137705508765 - type: nauc_map_at_1_std value: 3.2660873686456195 - type: nauc_map_at_20_diff1 value: 33.37736866727796 - type: nauc_map_at_20_max value: 14.70143784805556 - type: nauc_map_at_20_std value: 9.989663285421791 - type: nauc_map_at_3_diff1 value: 34.368864609204216 - type: nauc_map_at_3_max value: 12.768667645519768 - type: nauc_map_at_3_std value: 7.982752811874638 - type: nauc_map_at_5_diff1 value: 33.58267051366728 - type: nauc_map_at_5_max value: 13.529005222918848 - type: nauc_map_at_5_std value: 8.565140707894367 - type: nauc_mrr_at_1000_diff1 value: 34.518749214862446 - type: nauc_mrr_at_1000_max value: 20.004412541379317 - type: 
nauc_mrr_at_1000_std value: 10.794450592562008 - type: nauc_mrr_at_100_diff1 value: 34.502828469831684 - type: nauc_mrr_at_100_max value: 20.016402128122674 - type: nauc_mrr_at_100_std value: 10.770953740589398 - type: nauc_mrr_at_10_diff1 value: 34.464123530074744 - type: nauc_mrr_at_10_max value: 19.812317084561315 - type: nauc_mrr_at_10_std value: 10.660604975440622 - type: nauc_mrr_at_1_diff1 value: 39.735267543303344 - type: nauc_mrr_at_1_max value: 20.218792748481526 - type: nauc_mrr_at_1_std value: 7.574870456628672 - type: nauc_mrr_at_20_diff1 value: 34.4112636812203 - type: nauc_mrr_at_20_max value: 19.736403323847995 - type: nauc_mrr_at_20_std value: 10.58825811173397 - type: nauc_mrr_at_3_diff1 value: 34.322321922524765 - type: nauc_mrr_at_3_max value: 19.48120229919887 - type: nauc_mrr_at_3_std value: 10.241852033769396 - type: nauc_mrr_at_5_diff1 value: 34.41273362560696 - type: nauc_mrr_at_5_max value: 19.80166599189298 - type: nauc_mrr_at_5_std value: 10.535257678547225 - type: nauc_ndcg_at_1000_diff1 value: 31.756209625205372 - type: nauc_ndcg_at_1000_max value: 19.79815198505404 - type: nauc_ndcg_at_1000_std value: 15.747292429924494 - type: nauc_ndcg_at_100_diff1 value: 32.24612802150064 - type: nauc_ndcg_at_100_max value: 18.490724459073633 - type: nauc_ndcg_at_100_std value: 14.606523975785374 - type: nauc_ndcg_at_10_diff1 value: 32.17599943968043 - type: nauc_ndcg_at_10_max value: 15.73203247263979 - type: nauc_ndcg_at_10_std value: 11.361059016427816 - type: nauc_ndcg_at_1_diff1 value: 39.735267543303344 - type: nauc_ndcg_at_1_max value: 20.218792748481526 - type: nauc_ndcg_at_1_std value: 7.574870456628672 - type: nauc_ndcg_at_20_diff1 value: 31.750276068192886 - type: nauc_ndcg_at_20_max value: 15.761403266813346 - type: nauc_ndcg_at_20_std value: 11.939341736048261 - type: nauc_ndcg_at_3_diff1 value: 32.60001850916417 - type: nauc_ndcg_at_3_max value: 16.484580482661286 - type: nauc_ndcg_at_3_std value: 9.93945065513519 - type: nauc_ndcg_at_5_diff1 value: 32.44524427279313 - type: nauc_ndcg_at_5_max value: 15.875506598237141 - type: nauc_ndcg_at_5_std value: 9.982281820511833 - type: nauc_precision_at_1000_diff1 value: 5.371199115978502 - type: nauc_precision_at_1000_max value: 32.2390464051828 - type: nauc_precision_at_1000_std value: 14.878904307648414 - type: nauc_precision_at_100_diff1 value: 16.16681952079101 - type: nauc_precision_at_100_max value: 31.799356005933838 - type: nauc_precision_at_100_std value: 19.248994737500986 - type: nauc_precision_at_10_diff1 value: 22.009585966198923 - type: nauc_precision_at_10_max value: 25.75349877480564 - type: nauc_precision_at_10_std value: 16.27236030310856 - type: nauc_precision_at_1_diff1 value: 39.735267543303344 - type: nauc_precision_at_1_max value: 20.218792748481526 - type: nauc_precision_at_1_std value: 7.574870456628672 - type: nauc_precision_at_20_diff1 value: 18.58140182399686 - type: nauc_precision_at_20_max value: 25.678514022441874 - type: nauc_precision_at_20_std value: 16.797936080303757 - type: nauc_precision_at_3_diff1 value: 26.928025721272824 - type: nauc_precision_at_3_max value: 20.657641661666794 - type: nauc_precision_at_3_std value: 13.0985390930848 - type: nauc_precision_at_5_diff1 value: 23.36859898010871 - type: nauc_precision_at_5_max value: 22.374908445175237 - type: nauc_precision_at_5_std value: 14.246505892972294 - type: nauc_recall_at_1000_diff1 value: 11.980972712740272 - type: nauc_recall_at_1000_max value: 19.76758314007667 - type: nauc_recall_at_1000_std value: 37.01896226544845 
- type: nauc_recall_at_100_diff1 value: 21.23333081030157 - type: nauc_recall_at_100_max value: 17.273702477754753 - type: nauc_recall_at_100_std value: 22.66184024937999 - type: nauc_recall_at_10_diff1 value: 24.654784002876422 - type: nauc_recall_at_10_max value: 11.299238954418193 - type: nauc_recall_at_10_std value: 12.933536657323804 - type: nauc_recall_at_1_diff1 value: 39.87810373637443 - type: nauc_recall_at_1_max value: 10.730137705508765 - type: nauc_recall_at_1_std value: 3.2660873686456195 - type: nauc_recall_at_20_diff1 value: 22.912968265183142 - type: nauc_recall_at_20_max value: 10.463163094071744 - type: nauc_recall_at_20_std value: 13.342666469120315 - type: nauc_recall_at_3_diff1 value: 26.200195626449702 - type: nauc_recall_at_3_max value: 10.661728055293116 - type: nauc_recall_at_3_std value: 10.101882781882052 - type: nauc_recall_at_5_diff1 value: 25.286289446845807 - type: nauc_recall_at_5_max value: 11.353540373539142 - type: nauc_recall_at_5_std value: 10.67026258089847 - type: ndcg_at_1 value: 23.302 - type: ndcg_at_10 value: 25.322 - type: ndcg_at_100 value: 31.452 - type: ndcg_at_1000 value: 35.378 - type: ndcg_at_20 value: 27.392 - type: ndcg_at_3 value: 22.238 - type: ndcg_at_5 value: 23.436 - type: precision_at_1 value: 23.302 - type: precision_at_10 value: 7.037 - type: precision_at_100 value: 1.321 - type: precision_at_1000 value: 0.2 - type: precision_at_20 value: 4.344 - type: precision_at_3 value: 14.557999999999998 - type: precision_at_5 value: 10.988000000000001 - type: recall_at_1 value: 12.084 - type: recall_at_10 value: 31.011 - type: recall_at_100 value: 54.782 - type: recall_at_1000 value: 78.828 - type: recall_at_20 value: 37.573 - type: recall_at_3 value: 20.918999999999997 - type: recall_at_5 value: 25.434 - task: type: Retrieval dataset: name: MTEB HotpotQA-PL type: clarin-knext/hotpotqa-pl config: default split: test revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907 metrics: - type: main_score value: 61.76199999999999 - type: map_at_1 value: 36.462 - type: map_at_10 value: 52.595000000000006 - type: map_at_100 value: 53.486 - type: map_at_1000 value: 53.561 - type: map_at_20 value: 53.116 - type: map_at_3 value: 49.55 - type: map_at_5 value: 51.468 - type: mrr_at_1 value: 72.92370020256584 - type: mrr_at_10 value: 79.14170498269061 - type: mrr_at_100 value: 79.39082829565201 - type: mrr_at_1000 value: 79.4039312237504 - type: mrr_at_20 value: 79.30320990617905 - type: mrr_at_3 value: 78.06887238352448 - type: mrr_at_5 value: 78.74746792707597 - type: nauc_map_at_1000_diff1 value: 26.629197478945656 - type: nauc_map_at_1000_max value: 20.417296536263652 - type: nauc_map_at_1000_std value: 7.824861166949661 - type: nauc_map_at_100_diff1 value: 26.597747680876964 - type: nauc_map_at_100_max value: 20.394321293004854 - type: nauc_map_at_100_std value: 7.812277969136019 - type: nauc_map_at_10_diff1 value: 26.733323682484784 - type: nauc_map_at_10_max value: 20.271344228458663 - type: nauc_map_at_10_std value: 7.0935616016511815 - type: nauc_map_at_1_diff1 value: 73.40480136620272 - type: nauc_map_at_1_max value: 38.86815860879837 - type: nauc_map_at_1_std value: 4.8325955891477275 - type: nauc_map_at_20_diff1 value: 26.568842010897114 - type: nauc_map_at_20_max value: 20.275169904863905 - type: nauc_map_at_20_std value: 7.56661656432979 - type: nauc_map_at_3_diff1 value: 28.824845889064793 - type: nauc_map_at_3_max value: 20.76852907202902 - type: nauc_map_at_3_std value: 5.754512537392399 - type: nauc_map_at_5_diff1 value: 27.454615905979974 - 
type: nauc_map_at_5_max value: 20.352277144385937 - type: nauc_map_at_5_std value: 6.601409288581079 - type: nauc_mrr_at_1000_diff1 value: 72.29337975556386 - type: nauc_mrr_at_1000_max value: 41.162812968303555 - type: nauc_mrr_at_1000_std value: 7.658983139015768 - type: nauc_mrr_at_100_diff1 value: 72.28963649528013 - type: nauc_mrr_at_100_max value: 41.16405855619647 - type: nauc_mrr_at_100_std value: 7.671105812656405 - type: nauc_mrr_at_10_diff1 value: 72.20735283859506 - type: nauc_mrr_at_10_max value: 41.22707207638071 - type: nauc_mrr_at_10_std value: 7.642216005282447 - type: nauc_mrr_at_1_diff1 value: 73.40480136620272 - type: nauc_mrr_at_1_max value: 38.86815860879837 - type: nauc_mrr_at_1_std value: 4.8325955891477275 - type: nauc_mrr_at_20_diff1 value: 72.28084176981353 - type: nauc_mrr_at_20_max value: 41.19699794135133 - type: nauc_mrr_at_20_std value: 7.673602725654943 - type: nauc_mrr_at_3_diff1 value: 72.2517312298736 - type: nauc_mrr_at_3_max value: 41.23050336709122 - type: nauc_mrr_at_3_std value: 7.055398076214827 - type: nauc_mrr_at_5_diff1 value: 72.3010580466702 - type: nauc_mrr_at_5_max value: 41.16023128418148 - type: nauc_mrr_at_5_std value: 7.224799100313062 - type: nauc_ndcg_at_1000_diff1 value: 31.836096618552684 - type: nauc_ndcg_at_1000_max value: 24.19594101782851 - type: nauc_ndcg_at_1000_std value: 11.27051039772318 - type: nauc_ndcg_at_100_diff1 value: 31.010910429281985 - type: nauc_ndcg_at_100_max value: 23.73763527936943 - type: nauc_ndcg_at_100_std value: 11.202567249866915 - type: nauc_ndcg_at_10_diff1 value: 31.630736903110733 - type: nauc_ndcg_at_10_max value: 23.29057670190408 - type: nauc_ndcg_at_10_std value: 8.622063436605352 - type: nauc_ndcg_at_1_diff1 value: 73.40480136620272 - type: nauc_ndcg_at_1_max value: 38.86815860879837 - type: nauc_ndcg_at_1_std value: 4.8325955891477275 - type: nauc_ndcg_at_20_diff1 value: 31.022867077795073 - type: nauc_ndcg_at_20_max value: 23.20240329652894 - type: nauc_ndcg_at_20_std value: 9.910412291823127 - type: nauc_ndcg_at_3_diff1 value: 35.496569057786346 - type: nauc_ndcg_at_3_max value: 24.448277354535833 - type: nauc_ndcg_at_3_std value: 6.498237519761217 - type: nauc_ndcg_at_5_diff1 value: 33.251227793460906 - type: nauc_ndcg_at_5_max value: 23.605853646520984 - type: nauc_ndcg_at_5_std value: 7.54284385208763 - type: nauc_precision_at_1000_diff1 value: -0.47079501803456375 - type: nauc_precision_at_1000_max value: 15.089814566667142 - type: nauc_precision_at_1000_std value: 27.847788246114057 - type: nauc_precision_at_100_diff1 value: 3.0595485970514704 - type: nauc_precision_at_100_max value: 14.360431203666717 - type: nauc_precision_at_100_std value: 22.31753410548815 - type: nauc_precision_at_10_diff1 value: 11.454235819834814 - type: nauc_precision_at_10_max value: 14.979788854311145 - type: nauc_precision_at_10_std value: 11.290542607411098 - type: nauc_precision_at_1_diff1 value: 73.40480136620272 - type: nauc_precision_at_1_max value: 38.86815860879837 - type: nauc_precision_at_1_std value: 4.8325955891477275 - type: nauc_precision_at_20_diff1 value: 7.60972218209098 - type: nauc_precision_at_20_max value: 13.692113405742418 - type: nauc_precision_at_20_std value: 15.359273788872974 - type: nauc_precision_at_3_diff1 value: 22.002230799209492 - type: nauc_precision_at_3_max value: 19.075064977055266 - type: nauc_precision_at_3_std value: 7.1760372858256956 - type: nauc_precision_at_5_diff1 value: 16.565606958337607 - type: nauc_precision_at_5_max value: 16.550935196750206 - type: 
nauc_precision_at_5_std value: 8.807234374696868 - type: nauc_recall_at_1000_diff1 value: -0.47079501803429247 - type: nauc_recall_at_1000_max value: 15.089814566667334 - type: nauc_recall_at_1000_std value: 27.847788246114025 - type: nauc_recall_at_100_diff1 value: 3.0595485970514558 - type: nauc_recall_at_100_max value: 14.360431203666705 - type: nauc_recall_at_100_std value: 22.317534105488054 - type: nauc_recall_at_10_diff1 value: 11.4542358198349 - type: nauc_recall_at_10_max value: 14.979788854311154 - type: nauc_recall_at_10_std value: 11.290542607411085 - type: nauc_recall_at_1_diff1 value: 73.40480136620272 - type: nauc_recall_at_1_max value: 38.86815860879837 - type: nauc_recall_at_1_std value: 4.8325955891477275 - type: nauc_recall_at_20_diff1 value: 7.609722182091017 - type: nauc_recall_at_20_max value: 13.692113405742424 - type: nauc_recall_at_20_std value: 15.35927378887301 - type: nauc_recall_at_3_diff1 value: 22.002230799209435 - type: nauc_recall_at_3_max value: 19.07506497705519 - type: nauc_recall_at_3_std value: 7.176037285825619 - type: nauc_recall_at_5_diff1 value: 16.56560695833764 - type: nauc_recall_at_5_max value: 16.55093519675023 - type: nauc_recall_at_5_std value: 8.807234374696902 - type: ndcg_at_1 value: 72.924 - type: ndcg_at_10 value: 61.76199999999999 - type: ndcg_at_100 value: 64.943 - type: ndcg_at_1000 value: 66.42 - type: ndcg_at_20 value: 63.105 - type: ndcg_at_3 value: 57.318000000000005 - type: ndcg_at_5 value: 59.80799999999999 - type: precision_at_1 value: 72.924 - type: precision_at_10 value: 12.723999999999998 - type: precision_at_100 value: 1.521 - type: precision_at_1000 value: 0.172 - type: precision_at_20 value: 6.795 - type: precision_at_3 value: 35.863 - type: precision_at_5 value: 23.487 - type: recall_at_1 value: 36.462 - type: recall_at_10 value: 63.619 - type: recall_at_100 value: 76.036 - type: recall_at_1000 value: 85.8 - type: recall_at_20 value: 67.95400000000001 - type: recall_at_3 value: 53.795 - type: recall_at_5 value: 58.717 - task: type: Retrieval dataset: name: MTEB MSMARCO-PL type: clarin-knext/msmarco-pl config: default split: test revision: 8634c07806d5cce3a6138e260e59b81760a0a640 metrics: - type: main_score value: 45.132 - type: map_at_1 value: 1.667 - type: map_at_10 value: 8.405999999999999 - type: map_at_100 value: 20.796 - type: map_at_1000 value: 25.679999999999996 - type: map_at_20 value: 11.882 - type: map_at_3 value: 3.4000000000000004 - type: map_at_5 value: 5.289 - type: mrr_at_1 value: 62.7906976744186 - type: mrr_at_10 value: 71.9767441860465 - type: mrr_at_100 value: 72.19001178866145 - type: mrr_at_1000 value: 72.21077590826278 - type: mrr_at_20 value: 71.9767441860465 - type: mrr_at_3 value: 69.76744186046511 - type: mrr_at_5 value: 71.9767441860465 - type: nauc_map_at_1000_diff1 value: 13.121496890926018 - type: nauc_map_at_1000_max value: 64.4620914971356 - type: nauc_map_at_1000_std value: 70.89107882842627 - type: nauc_map_at_100_diff1 value: 6.569373263154751 - type: nauc_map_at_100_max value: 54.52329917268778 - type: nauc_map_at_100_std value: 57.970012281008195 - type: nauc_map_at_10_diff1 value: 12.479881525075633 - type: nauc_map_at_10_max value: 16.416934605814358 - type: nauc_map_at_10_std value: 16.562025084061755 - type: nauc_map_at_1_diff1 value: -13.480148625088354 - type: nauc_map_at_1_max value: -12.48386553446901 - type: nauc_map_at_1_std value: -19.47568765990734 - type: nauc_map_at_20_diff1 value: 8.75113737642458 - type: nauc_map_at_20_max value: 28.316394733873455 - type: 
nauc_map_at_20_std value: 28.706433416288757 - type: nauc_map_at_3_diff1 value: 0.4892858373106769 - type: nauc_map_at_3_max value: 4.82429174133813 - type: nauc_map_at_3_std value: 2.685691736161667 - type: nauc_map_at_5_diff1 value: 7.407280581282287 - type: nauc_map_at_5_max value: 7.810182361989069 - type: nauc_map_at_5_std value: 7.1694430987177915 - type: nauc_mrr_at_1000_diff1 value: -1.3143171207174462 - type: nauc_mrr_at_1000_max value: 55.56132775818817 - type: nauc_mrr_at_1000_std value: 44.747614607383106 - type: nauc_mrr_at_100_diff1 value: -1.224506180649995 - type: nauc_mrr_at_100_max value: 55.600720798015224 - type: nauc_mrr_at_100_std value: 44.73970951740156 - type: nauc_mrr_at_10_diff1 value: -1.404072265069855 - type: nauc_mrr_at_10_max value: 55.81202913496246 - type: nauc_mrr_at_10_std value: 45.1755213724528 - type: nauc_mrr_at_1_diff1 value: -3.3932017924925764 - type: nauc_mrr_at_1_max value: 45.85906083891651 - type: nauc_mrr_at_1_std value: 36.94174294169342 - type: nauc_mrr_at_20_diff1 value: -1.404072265069855 - type: nauc_mrr_at_20_max value: 55.81202913496246 - type: nauc_mrr_at_20_std value: 45.1755213724528 - type: nauc_mrr_at_3_diff1 value: -1.9535315867645546 - type: nauc_mrr_at_3_max value: 54.66533478368106 - type: nauc_mrr_at_3_std value: 42.93031026511843 - type: nauc_mrr_at_5_diff1 value: -1.404072265069855 - type: nauc_mrr_at_5_max value: 55.81202913496246 - type: nauc_mrr_at_5_std value: 45.1755213724528 - type: nauc_ndcg_at_1000_diff1 value: 15.612187648926648 - type: nauc_ndcg_at_1000_max value: 66.0369696987196 - type: nauc_ndcg_at_1000_std value: 69.96669745374349 - type: nauc_ndcg_at_100_diff1 value: 8.757636842486582 - type: nauc_ndcg_at_100_max value: 60.74693277069104 - type: nauc_ndcg_at_100_std value: 63.76108092965522 - type: nauc_ndcg_at_10_diff1 value: 6.45234697262411 - type: nauc_ndcg_at_10_max value: 47.130858592103536 - type: nauc_ndcg_at_10_std value: 46.654922458779126 - type: nauc_ndcg_at_1_diff1 value: -4.400276896768569 - type: nauc_ndcg_at_1_max value: 24.736725318748277 - type: nauc_ndcg_at_1_std value: 15.100951232927404 - type: nauc_ndcg_at_20_diff1 value: -0.44419635404462504 - type: nauc_ndcg_at_20_max value: 53.81470890104093 - type: nauc_ndcg_at_20_std value: 54.65514527813791 - type: nauc_ndcg_at_3_diff1 value: 4.176276992379476 - type: nauc_ndcg_at_3_max value: 33.4079755228582 - type: nauc_ndcg_at_3_std value: 26.097236468435497 - type: nauc_ndcg_at_5_diff1 value: 9.966039505450683 - type: nauc_ndcg_at_5_max value: 40.118178652342394 - type: nauc_ndcg_at_5_std value: 34.33405125137147 - type: nauc_precision_at_1000_diff1 value: 13.757669487153102 - type: nauc_precision_at_1000_max value: 52.007228955531794 - type: nauc_precision_at_1000_std value: 62.70603005119199 - type: nauc_precision_at_100_diff1 value: 7.1595084301066105 - type: nauc_precision_at_100_max value: 57.56055309573276 - type: nauc_precision_at_100_std value: 69.09674838687823 - type: nauc_precision_at_10_diff1 value: 10.548904389246808 - type: nauc_precision_at_10_max value: 58.361747853932435 - type: nauc_precision_at_10_std value: 62.35890309913381 - type: nauc_precision_at_1_diff1 value: -3.3932017924925764 - type: nauc_precision_at_1_max value: 45.85906083891651 - type: nauc_precision_at_1_std value: 36.94174294169342 - type: nauc_precision_at_20_diff1 value: 0.5486557649755647 - type: nauc_precision_at_20_max value: 55.8966200841496 - type: nauc_precision_at_20_std value: 64.46833667077514 - type: nauc_precision_at_3_diff1 value: 
3.74969726265482 - type: nauc_precision_at_3_max value: 50.98538299147468 - type: nauc_precision_at_3_std value: 47.52256580019106 - type: nauc_precision_at_5_diff1 value: 14.409304075805396 - type: nauc_precision_at_5_max value: 52.63426384539844 - type: nauc_precision_at_5_std value: 48.72540538657435 - type: nauc_recall_at_1000_diff1 value: 14.810856570503505 - type: nauc_recall_at_1000_max value: 56.70402594077228 - type: nauc_recall_at_1000_std value: 62.44988045776601 - type: nauc_recall_at_100_diff1 value: -0.547033022823402 - type: nauc_recall_at_100_max value: 37.5943435400723 - type: nauc_recall_at_100_std value: 42.055737611040904 - type: nauc_recall_at_10_diff1 value: 5.6072575274918695 - type: nauc_recall_at_10_max value: 6.244507044627988 - type: nauc_recall_at_10_std value: 5.1959433044082575 - type: nauc_recall_at_1_diff1 value: -13.480148625088354 - type: nauc_recall_at_1_max value: -12.48386553446901 - type: nauc_recall_at_1_std value: -19.47568765990734 - type: nauc_recall_at_20_diff1 value: 1.5008424440815344 - type: nauc_recall_at_20_max value: 16.711622731636748 - type: nauc_recall_at_20_std value: 16.46978349884905 - type: nauc_recall_at_3_diff1 value: -2.3329900069251996 - type: nauc_recall_at_3_max value: 2.511711071593615 - type: nauc_recall_at_3_std value: -0.5855889251226093 - type: nauc_recall_at_5_diff1 value: 4.1075104414046315 - type: nauc_recall_at_5_max value: 0.34189966462509463 - type: nauc_recall_at_5_std value: -1.89085195502975 - type: ndcg_at_1 value: 50.0 - type: ndcg_at_10 value: 45.132 - type: ndcg_at_100 value: 41.504999999999995 - type: ndcg_at_1000 value: 49.738 - type: ndcg_at_20 value: 42.569 - type: ndcg_at_3 value: 45.423 - type: ndcg_at_5 value: 45.611000000000004 - type: precision_at_1 value: 62.791 - type: precision_at_10 value: 54.419 - type: precision_at_100 value: 25.047000000000004 - type: precision_at_1000 value: 5.002 - type: precision_at_20 value: 46.394999999999996 - type: precision_at_3 value: 57.364000000000004 - type: precision_at_5 value: 57.208999999999996 - type: recall_at_1 value: 1.667 - type: recall_at_10 value: 10.933 - type: recall_at_100 value: 35.169 - type: recall_at_1000 value: 59.955999999999996 - type: recall_at_20 value: 16.399 - type: recall_at_3 value: 3.7379999999999995 - type: recall_at_5 value: 6.365 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 62.55548083389375 - type: f1 value: 55.243883281423955 - type: f1_weighted value: 61.53554902108963 - type: main_score value: 62.55548083389375 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 71.7518493611298 - type: f1 value: 69.39084021404145 - type: f1_weighted value: 71.48397679382578 - type: main_score value: 71.7518493611298 - task: type: Retrieval dataset: name: MTEB NFCorpus-PL type: clarin-knext/nfcorpus-pl config: default split: test revision: 9a6f9567fda928260afed2de480d79c98bf0bec0 metrics: - type: main_score value: 27.359 - type: map_at_1 value: 4.013 - type: map_at_10 value: 9.243 - type: map_at_100 value: 11.417 - type: map_at_1000 value: 12.465 - type: map_at_20 value: 10.241999999999999 - type: map_at_3 value: 6.6739999999999995 - type: map_at_5 value: 7.720000000000001 - type: mrr_at_1 value: 
36.84210526315789 - type: mrr_at_10 value: 45.80704211509165 - type: mrr_at_100 value: 46.43056530919217 - type: mrr_at_1000 value: 46.481813685972384 - type: mrr_at_20 value: 46.2328011230761 - type: mrr_at_3 value: 43.653250773993804 - type: mrr_at_5 value: 44.75232198142416 - type: nauc_map_at_1000_diff1 value: 24.84177430292285 - type: nauc_map_at_1000_max value: 17.115036682746375 - type: nauc_map_at_1000_std value: 24.075727964418853 - type: nauc_map_at_100_diff1 value: 25.813465171019708 - type: nauc_map_at_100_max value: 15.890774948775189 - type: nauc_map_at_100_std value: 20.733065453457606 - type: nauc_map_at_10_diff1 value: 29.488943622716107 - type: nauc_map_at_10_max value: 9.776720754233569 - type: nauc_map_at_10_std value: 10.581345052422016 - type: nauc_map_at_1_diff1 value: 48.57974934948135 - type: nauc_map_at_1_max value: 1.149527115405564 - type: nauc_map_at_1_std value: -2.0301983395175363 - type: nauc_map_at_20_diff1 value: 27.744545091489826 - type: nauc_map_at_20_max value: 12.800210322701194 - type: nauc_map_at_20_std value: 15.036851255880851 - type: nauc_map_at_3_diff1 value: 37.25540055051418 - type: nauc_map_at_3_max value: 4.906473702901897 - type: nauc_map_at_3_std value: 1.462933406016024 - type: nauc_map_at_5_diff1 value: 33.75262117705747 - type: nauc_map_at_5_max value: 5.349094540200769 - type: nauc_map_at_5_std value: 4.009473353212513 - type: nauc_mrr_at_1000_diff1 value: 25.923316236906224 - type: nauc_mrr_at_1000_max value: 30.218473131172814 - type: nauc_mrr_at_1000_std value: 34.32841034971355 - type: nauc_mrr_at_100_diff1 value: 25.89160877435761 - type: nauc_mrr_at_100_max value: 30.26076316909358 - type: nauc_mrr_at_100_std value: 34.38168790885202 - type: nauc_mrr_at_10_diff1 value: 25.94165965662626 - type: nauc_mrr_at_10_max value: 29.92861838955619 - type: nauc_mrr_at_10_std value: 34.217857324602384 - type: nauc_mrr_at_1_diff1 value: 27.77544038178182 - type: nauc_mrr_at_1_max value: 23.544571519690063 - type: nauc_mrr_at_1_std value: 29.133288904288985 - type: nauc_mrr_at_20_diff1 value: 25.817823276199377 - type: nauc_mrr_at_20_max value: 30.212951519162534 - type: nauc_mrr_at_20_std value: 34.38656845672502 - type: nauc_mrr_at_3_diff1 value: 27.253167791083772 - type: nauc_mrr_at_3_max value: 28.668229911423044 - type: nauc_mrr_at_3_std value: 32.24039598508148 - type: nauc_mrr_at_5_diff1 value: 26.50152942042588 - type: nauc_mrr_at_5_max value: 29.014104429398657 - type: nauc_mrr_at_5_std value: 33.10408829199384 - type: nauc_ndcg_at_1000_diff1 value: 21.670441606508682 - type: nauc_ndcg_at_1000_max value: 35.085480170350294 - type: nauc_ndcg_at_1000_std value: 40.26959838435534 - type: nauc_ndcg_at_100_diff1 value: 20.56655267151386 - type: nauc_ndcg_at_100_max value: 29.059496472106172 - type: nauc_ndcg_at_100_std value: 36.20604882231693 - type: nauc_ndcg_at_10_diff1 value: 19.327892822047392 - type: nauc_ndcg_at_10_max value: 22.970443207173847 - type: nauc_ndcg_at_10_std value: 33.63485024562264 - type: nauc_ndcg_at_1_diff1 value: 29.440869586898806 - type: nauc_ndcg_at_1_max value: 21.1892146993199 - type: nauc_ndcg_at_1_std value: 27.715145294772626 - type: nauc_ndcg_at_20_diff1 value: 19.84119342340242 - type: nauc_ndcg_at_20_max value: 24.648907071153918 - type: nauc_ndcg_at_20_std value: 34.21144991558109 - type: nauc_ndcg_at_3_diff1 value: 22.475236266303952 - type: nauc_ndcg_at_3_max value: 22.5673625414089 - type: nauc_ndcg_at_3_std value: 30.40344427150939 - type: nauc_ndcg_at_5_diff1 value: 20.435706146454795 - type: 
nauc_ndcg_at_5_max value: 20.807509478884405 - type: nauc_ndcg_at_5_std value: 30.50756403953348 - type: nauc_precision_at_1000_diff1 value: -7.734779276193169 - type: nauc_precision_at_1000_max value: 10.369447288094234 - type: nauc_precision_at_1000_std value: 38.88122374339474 - type: nauc_precision_at_100_diff1 value: -5.148267935551239 - type: nauc_precision_at_100_max value: 22.682811622480507 - type: nauc_precision_at_100_std value: 52.14414978661011 - type: nauc_precision_at_10_diff1 value: 4.2440553409575115 - type: nauc_precision_at_10_max value: 24.922198902459577 - type: nauc_precision_at_10_std value: 44.24729160099345 - type: nauc_precision_at_1_diff1 value: 28.683873179972423 - type: nauc_precision_at_1_max value: 24.333474443231477 - type: nauc_precision_at_1_std value: 29.657103597064992 - type: nauc_precision_at_20_diff1 value: 0.981459375147628 - type: nauc_precision_at_20_max value: 26.656822900511944 - type: nauc_precision_at_20_std value: 47.61829905274704 - type: nauc_precision_at_3_diff1 value: 14.009226282963393 - type: nauc_precision_at_3_max value: 25.206963221334643 - type: nauc_precision_at_3_std value: 34.640163356829575 - type: nauc_precision_at_5_diff1 value: 9.732199396026699 - type: nauc_precision_at_5_max value: 21.620896160839308 - type: nauc_precision_at_5_std value: 36.54829562203162 - type: nauc_recall_at_1000_diff1 value: 13.592706145413594 - type: nauc_recall_at_1000_max value: 26.905710458923515 - type: nauc_recall_at_1000_std value: 27.77232599212786 - type: nauc_recall_at_100_diff1 value: 11.474980161550619 - type: nauc_recall_at_100_max value: 24.6542606788053 - type: nauc_recall_at_100_std value: 26.088933416325894 - type: nauc_recall_at_10_diff1 value: 20.86627786542471 - type: nauc_recall_at_10_max value: 12.310575849201342 - type: nauc_recall_at_10_std value: 8.93720284107538 - type: nauc_recall_at_1_diff1 value: 48.57974934948135 - type: nauc_recall_at_1_max value: 1.149527115405564 - type: nauc_recall_at_1_std value: -2.0301983395175363 - type: nauc_recall_at_20_diff1 value: 17.03977114136929 - type: nauc_recall_at_20_max value: 15.132361504438405 - type: nauc_recall_at_20_std value: 14.39504435329145 - type: nauc_recall_at_3_diff1 value: 33.90735954186142 - type: nauc_recall_at_3_max value: 7.589690453066397 - type: nauc_recall_at_3_std value: 0.8609172933612455 - type: nauc_recall_at_5_diff1 value: 27.37452904528661 - type: nauc_recall_at_5_max value: 6.950034812753282 - type: nauc_recall_at_5_std value: 2.9248007586594396 - type: ndcg_at_1 value: 35.294 - type: ndcg_at_10 value: 27.359 - type: ndcg_at_100 value: 24.285999999999998 - type: ndcg_at_1000 value: 32.438 - type: ndcg_at_20 value: 25.418000000000003 - type: ndcg_at_3 value: 31.328 - type: ndcg_at_5 value: 29.269000000000002 - type: precision_at_1 value: 36.533 - type: precision_at_10 value: 20.681 - type: precision_at_100 value: 6.087 - type: precision_at_1000 value: 1.7469999999999999 - type: precision_at_20 value: 15.325 - type: precision_at_3 value: 29.309 - type: precision_at_5 value: 25.201 - type: recall_at_1 value: 4.013 - type: recall_at_10 value: 13.153 - type: recall_at_100 value: 24.549000000000003 - type: recall_at_1000 value: 53.908 - type: recall_at_20 value: 16.453 - type: recall_at_3 value: 7.832999999999999 - type: recall_at_5 value: 9.693999999999999 - task: type: Retrieval dataset: name: MTEB NQ-PL type: clarin-knext/nq-pl config: default split: test revision: f171245712cf85dd4700b06bef18001578d0ca8d metrics: - type: main_score value: 30.842000000000002 - 
type: map_at_1 value: 15.584999999999999 - type: map_at_10 value: 25.141999999999996 - type: map_at_100 value: 26.387 - type: map_at_1000 value: 26.458 - type: map_at_20 value: 25.897 - type: map_at_3 value: 21.792 - type: map_at_5 value: 23.605 - type: mrr_at_1 value: 17.526071842410197 - type: mrr_at_10 value: 27.034281943754777 - type: mrr_at_100 value: 28.093499231975112 - type: mrr_at_1000 value: 28.151579697181628 - type: mrr_at_20 value: 27.685578601768064 - type: mrr_at_3 value: 23.966782541521876 - type: mrr_at_5 value: 25.63538045577454 - type: nauc_map_at_1000_diff1 value: 25.629659206470034 - type: nauc_map_at_1000_max value: 19.50903133109958 - type: nauc_map_at_1000_std value: 11.369355803540456 - type: nauc_map_at_100_diff1 value: 25.63185640379452 - type: nauc_map_at_100_max value: 19.49043016244933 - type: nauc_map_at_100_std value: 11.349471698782217 - type: nauc_map_at_10_diff1 value: 25.801905100212085 - type: nauc_map_at_10_max value: 18.71914313595772 - type: nauc_map_at_10_std value: 10.101933080218412 - type: nauc_map_at_1_diff1 value: 27.69756013829008 - type: nauc_map_at_1_max value: 13.265356278967614 - type: nauc_map_at_1_std value: 4.845453511488002 - type: nauc_map_at_20_diff1 value: 25.57617091165384 - type: nauc_map_at_20_max value: 19.22087134146287 - type: nauc_map_at_20_std value: 10.863338999338074 - type: nauc_map_at_3_diff1 value: 26.04936647826419 - type: nauc_map_at_3_max value: 17.00014461889098 - type: nauc_map_at_3_std value: 8.345803797704802 - type: nauc_map_at_5_diff1 value: 25.926914766086163 - type: nauc_map_at_5_max value: 17.909768342318312 - type: nauc_map_at_5_std value: 8.99533665314055 - type: nauc_mrr_at_1000_diff1 value: 24.821439280682775 - type: nauc_mrr_at_1000_max value: 20.48215524313607 - type: nauc_mrr_at_1000_std value: 13.302755245100787 - type: nauc_mrr_at_100_diff1 value: 24.822888515699727 - type: nauc_mrr_at_100_max value: 20.476125364875305 - type: nauc_mrr_at_100_std value: 13.303370196580808 - type: nauc_mrr_at_10_diff1 value: 24.827095834283377 - type: nauc_mrr_at_10_max value: 19.906455259365014 - type: nauc_mrr_at_10_std value: 12.461215626420783 - type: nauc_mrr_at_1_diff1 value: 27.354076617153282 - type: nauc_mrr_at_1_max value: 15.421589080989397 - type: nauc_mrr_at_1_std value: 7.854191402321044 - type: nauc_mrr_at_20_diff1 value: 24.707829956282353 - type: nauc_mrr_at_20_max value: 20.343614549048684 - type: nauc_mrr_at_20_std value: 12.991368337778994 - type: nauc_mrr_at_3_diff1 value: 25.001495195422212 - type: nauc_mrr_at_3_max value: 18.670877184315987 - type: nauc_mrr_at_3_std value: 11.073823459359353 - type: nauc_mrr_at_5_diff1 value: 25.09633485104506 - type: nauc_mrr_at_5_max value: 19.289598809877393 - type: nauc_mrr_at_5_std value: 11.447861090124427 - type: nauc_ndcg_at_1000_diff1 value: 24.454331896090252 - type: nauc_ndcg_at_1000_max value: 24.54817880813177 - type: nauc_ndcg_at_1000_std value: 18.291577235898664 - type: nauc_ndcg_at_100_diff1 value: 24.4900499476292 - type: nauc_ndcg_at_100_max value: 24.3113863055596 - type: nauc_ndcg_at_100_std value: 18.283249505464127 - type: nauc_ndcg_at_10_diff1 value: 24.75304628631047 - type: nauc_ndcg_at_10_max value: 21.346414904765112 - type: nauc_ndcg_at_10_std value: 13.144087870627114 - type: nauc_ndcg_at_1_diff1 value: 27.354076617153282 - type: nauc_ndcg_at_1_max value: 15.421589080989397 - type: nauc_ndcg_at_1_std value: 7.854191402321044 - type: nauc_ndcg_at_20_diff1 value: 24.054443970465634 - type: nauc_ndcg_at_20_max value: 23.02090178343728 
- type: nauc_ndcg_at_20_std value: 15.466706732549639 - type: nauc_ndcg_at_3_diff1 value: 25.21593203645425 - type: nauc_ndcg_at_3_max value: 18.366389791319857 - type: nauc_ndcg_at_3_std value: 9.886764558221312 - type: nauc_ndcg_at_5_diff1 value: 25.18968308632415 - type: nauc_ndcg_at_5_max value: 19.714457143715883 - type: nauc_ndcg_at_5_std value: 10.810267333820615 - type: nauc_precision_at_1000_diff1 value: 5.311743560049695 - type: nauc_precision_at_1000_max value: 31.8449636551786 - type: nauc_precision_at_1000_std value: 38.560980646256645 - type: nauc_precision_at_100_diff1 value: 11.642708984639716 - type: nauc_precision_at_100_max value: 33.08348545702312 - type: nauc_precision_at_100_std value: 38.84569611188958 - type: nauc_precision_at_10_diff1 value: 19.39529546701617 - type: nauc_precision_at_10_max value: 27.35329522618733 - type: nauc_precision_at_10_std value: 21.657982938733863 - type: nauc_precision_at_1_diff1 value: 27.354076617153282 - type: nauc_precision_at_1_max value: 15.421589080989397 - type: nauc_precision_at_1_std value: 7.854191402321044 - type: nauc_precision_at_20_diff1 value: 15.315200424520157 - type: nauc_precision_at_20_max value: 30.813032263448335 - type: nauc_precision_at_20_std value: 28.51929835139947 - type: nauc_precision_at_3_diff1 value: 23.171414749401624 - type: nauc_precision_at_3_max value: 22.230781193639906 - type: nauc_precision_at_3_std value: 14.39995607518812 - type: nauc_precision_at_5_diff1 value: 22.12050049652593 - type: nauc_precision_at_5_max value: 24.47739013891615 - type: nauc_precision_at_5_std value: 15.911936861665232 - type: nauc_recall_at_1000_diff1 value: 18.49721947186244 - type: nauc_recall_at_1000_max value: 59.77562391547361 - type: nauc_recall_at_1000_std value: 67.25992226904116 - type: nauc_recall_at_100_diff1 value: 21.08120571727416 - type: nauc_recall_at_100_max value: 41.81711687017934 - type: nauc_recall_at_100_std value: 45.46881224307712 - type: nauc_recall_at_10_diff1 value: 22.267969061265276 - type: nauc_recall_at_10_max value: 26.20350836241132 - type: nauc_recall_at_10_std value: 18.312586912516927 - type: nauc_recall_at_1_diff1 value: 27.69756013829008 - type: nauc_recall_at_1_max value: 13.265356278967614 - type: nauc_recall_at_1_std value: 4.845453511488002 - type: nauc_recall_at_20_diff1 value: 19.7184358966775 - type: nauc_recall_at_20_max value: 32.18279692099271 - type: nauc_recall_at_20_std value: 26.185137240814377 - type: nauc_recall_at_3_diff1 value: 23.501740451271914 - type: nauc_recall_at_3_max value: 19.91360673787573 - type: nauc_recall_at_3_std value: 11.210024942573977 - type: nauc_recall_at_5_diff1 value: 23.437183434421655 - type: nauc_recall_at_5_max value: 22.272023416475623 - type: nauc_recall_at_5_std value: 12.814496156956142 - type: ndcg_at_1 value: 17.526 - type: ndcg_at_10 value: 30.842000000000002 - type: ndcg_at_100 value: 36.629 - type: ndcg_at_1000 value: 38.495000000000005 - type: ndcg_at_20 value: 33.382 - type: ndcg_at_3 value: 24.252000000000002 - type: ndcg_at_5 value: 27.339000000000002 - type: precision_at_1 value: 17.526 - type: precision_at_10 value: 5.548 - type: precision_at_100 value: 0.88 - type: precision_at_1000 value: 0.106 - type: precision_at_20 value: 3.3649999999999998 - type: precision_at_3 value: 11.25 - type: precision_at_5 value: 8.517 - type: recall_at_1 value: 15.584999999999999 - type: recall_at_10 value: 46.521 - type: recall_at_100 value: 72.571 - type: recall_at_1000 value: 86.86500000000001 - type: recall_at_20 value: 56.004 - type: 
recall_at_3 value: 29.195999999999998 - type: recall_at_5 value: 36.324 - task: type: Classification dataset: name: MTEB PAC type: laugustyniak/abusive-clauses-pl config: default split: test revision: fc69d1c153a8ccdcf1eef52f4e2a27f88782f543 metrics: - type: accuracy value: 64.80162177816392 - type: ap value: 74.10348604798286 - type: ap_weighted value: 74.10348604798286 - type: f1 value: 61.280331645723685 - type: f1_weighted value: 65.03859489177282 - type: main_score value: 64.80162177816392 - task: type: PairClassification dataset: name: MTEB PSC type: PL-MTEB/psc-pairclassification config: default split: test revision: d05a294af9e1d3ff2bfb6b714e08a24a6cabc669 metrics: - type: cosine_accuracy value: 97.77365491651206 - type: cosine_accuracy_threshold value: 81.08445405960083 - type: cosine_ap value: 99.43195082030653 - type: cosine_f1 value: 96.40718562874251 - type: cosine_f1_threshold value: 81.08445405960083 - type: cosine_precision value: 94.70588235294117 - type: cosine_recall value: 98.17073170731707 - type: dot_accuracy value: 97.77365491651206 - type: dot_accuracy_threshold value: 81.08445405960083 - type: dot_ap value: 99.43195082030653 - type: dot_f1 value: 96.40718562874251 - type: dot_f1_threshold value: 81.08445405960083 - type: dot_precision value: 94.70588235294117 - type: dot_recall value: 98.17073170731707 - type: euclidean_accuracy value: 97.77365491651206 - type: euclidean_accuracy_threshold value: 61.50695085525513 - type: euclidean_ap value: 99.43195082030653 - type: euclidean_f1 value: 96.40718562874251 - type: euclidean_f1_threshold value: 61.50695085525513 - type: euclidean_precision value: 94.70588235294117 - type: euclidean_recall value: 98.17073170731707 - type: main_score value: 99.46339853695966 - type: manhattan_accuracy value: 98.05194805194806 - type: manhattan_accuracy_threshold value: 1428.3578872680664 - type: manhattan_ap value: 99.46339853695966 - type: manhattan_f1 value: 96.83257918552036 - type: manhattan_f1_threshold value: 1428.3578872680664 - type: manhattan_precision value: 95.82089552238806 - type: manhattan_recall value: 97.86585365853658 - type: max_ap value: 99.46339853695966 - type: max_f1 value: 96.83257918552036 - type: max_precision value: 95.82089552238806 - type: max_recall value: 98.17073170731707 - type: similarity_accuracy value: 97.77365491651206 - type: similarity_accuracy_threshold value: 81.08445405960083 - type: similarity_ap value: 99.43195082030653 - type: similarity_f1 value: 96.40718562874251 - type: similarity_f1_threshold value: 81.08445405960083 - type: similarity_precision value: 94.70588235294117 - type: similarity_recall value: 98.17073170731707 - task: type: Classification dataset: name: MTEB PolEmo2.0-IN type: PL-MTEB/polemo2_in config: default split: test revision: d90724373c70959f17d2331ad51fb60c71176b03 metrics: - type: accuracy value: 80.54016620498614 - type: f1 value: 74.07868803329357 - type: f1_weighted value: 78.52375884318697 - type: main_score value: 80.54016620498614 - task: type: Classification dataset: name: MTEB PolEmo2.0-OUT type: PL-MTEB/polemo2_out config: default split: test revision: 6a21ab8716e255ab1867265f8b396105e8aa63d4 metrics: - type: accuracy value: 61.37651821862349 - type: f1 value: 46.60510896853889 - type: f1_weighted value: 61.3956699958363 - type: main_score value: 61.37651821862349 - task: type: PairClassification dataset: name: MTEB PPC type: PL-MTEB/ppc-pairclassification config: default split: test revision: 2c7d2df57801a591f6b1e3aaf042e7a04ec7d9f2 metrics: - type: 
cosine_accuracy value: 76.0 - type: cosine_accuracy_threshold value: 93.07277202606201 - type: cosine_ap value: 87.43755817552731 - type: cosine_f1 value: 80.46989720998532 - type: cosine_f1_threshold value: 90.98483324050903 - type: cosine_precision value: 72.29551451187335 - type: cosine_recall value: 90.72847682119205 - type: dot_accuracy value: 76.0 - type: dot_accuracy_threshold value: 93.07277798652649 - type: dot_ap value: 87.43751021710085 - type: dot_f1 value: 80.46989720998532 - type: dot_f1_threshold value: 90.98482728004456 - type: dot_precision value: 72.29551451187335 - type: dot_recall value: 90.72847682119205 - type: euclidean_accuracy value: 76.0 - type: euclidean_accuracy_threshold value: 37.221553921699524 - type: euclidean_ap value: 87.43751021710085 - type: euclidean_f1 value: 80.46989720998532 - type: euclidean_f1_threshold value: 42.46214032173157 - type: euclidean_precision value: 72.29551451187335 - type: euclidean_recall value: 90.72847682119205 - type: main_score value: 87.43755817552731 - type: manhattan_accuracy value: 75.2 - type: manhattan_accuracy_threshold value: 858.4394454956055 - type: manhattan_ap value: 87.28751334847506 - type: manhattan_f1 value: 80.47162859248341 - type: manhattan_f1_threshold value: 981.0188293457031 - type: manhattan_precision value: 72.50996015936255 - type: manhattan_recall value: 90.39735099337747 - type: max_ap value: 87.43755817552731 - type: max_f1 value: 80.47162859248341 - type: max_precision value: 72.50996015936255 - type: max_recall value: 90.72847682119205 - type: similarity_accuracy value: 76.0 - type: similarity_accuracy_threshold value: 93.07277202606201 - type: similarity_ap value: 87.43755817552731 - type: similarity_f1 value: 80.46989720998532 - type: similarity_f1_threshold value: 90.98483324050903 - type: similarity_precision value: 72.29551451187335 - type: similarity_recall value: 90.72847682119205 - task: type: Retrieval dataset: name: MTEB Quora-PL type: clarin-knext/quora-pl config: default split: test revision: 0be27e93455051e531182b85e85e425aba12e9d4 metrics: - type: main_score value: 75.235 - type: map_at_1 value: 57.720000000000006 - type: map_at_10 value: 70.322 - type: map_at_100 value: 71.208 - type: map_at_1000 value: 71.247 - type: map_at_20 value: 70.889 - type: map_at_3 value: 67.278 - type: map_at_5 value: 69.07900000000001 - type: mrr_at_1 value: 66.44 - type: mrr_at_10 value: 74.32428571428532 - type: mrr_at_100 value: 74.67001717307676 - type: mrr_at_1000 value: 74.68049849872023 - type: mrr_at_20 value: 74.55920910032467 - type: mrr_at_3 value: 72.6349999999996 - type: mrr_at_5 value: 73.67099999999938 - type: nauc_map_at_1000_diff1 value: 69.03523613954961 - type: nauc_map_at_1000_max value: 30.29022964222993 - type: nauc_map_at_1000_std value: -13.13676129820498 - type: nauc_map_at_100_diff1 value: 69.03918889242972 - type: nauc_map_at_100_max value: 30.28851815152789 - type: nauc_map_at_100_std value: -13.173343854637487 - type: nauc_map_at_10_diff1 value: 69.11834037559699 - type: nauc_map_at_10_max value: 29.609089948792388 - type: nauc_map_at_10_std value: -14.511647137697395 - type: nauc_map_at_1_diff1 value: 72.50653845898617 - type: nauc_map_at_1_max value: 22.521228683262873 - type: nauc_map_at_1_std value: -17.72541519468729 - type: nauc_map_at_20_diff1 value: 69.0572096712263 - type: nauc_map_at_20_max value: 30.09049337817234 - type: nauc_map_at_20_std value: -13.69213787699562 - type: nauc_map_at_3_diff1 value: 69.4118549460786 - type: nauc_map_at_3_max value: 
27.31606724944123 - type: nauc_map_at_3_std value: -16.430296769671298 - type: nauc_map_at_5_diff1 value: 69.18608931793607 - type: nauc_map_at_5_max value: 28.681802217476093 - type: nauc_map_at_5_std value: -15.492619374306827 - type: nauc_mrr_at_1000_diff1 value: 70.27871731978331 - type: nauc_mrr_at_1000_max value: 33.89585229097829 - type: nauc_mrr_at_1000_std value: -9.231498078778678 - type: nauc_mrr_at_100_diff1 value: 70.27656223213475 - type: nauc_mrr_at_100_max value: 33.90583650980198 - type: nauc_mrr_at_100_std value: -9.213247629622375 - type: nauc_mrr_at_10_diff1 value: 70.1800255282438 - type: nauc_mrr_at_10_max value: 33.975132933927085 - type: nauc_mrr_at_10_std value: -9.344439026014577 - type: nauc_mrr_at_1_diff1 value: 72.72425945481199 - type: nauc_mrr_at_1_max value: 31.239650246117385 - type: nauc_mrr_at_1_std value: -11.607242701686696 - type: nauc_mrr_at_20_diff1 value: 70.24166041655792 - type: nauc_mrr_at_20_max value: 33.9613048334359 - type: nauc_mrr_at_20_std value: -9.219736983314839 - type: nauc_mrr_at_3_diff1 value: 70.06664104900666 - type: nauc_mrr_at_3_max value: 33.5732140539362 - type: nauc_mrr_at_3_std value: -9.778577982149953 - type: nauc_mrr_at_5_diff1 value: 70.14739007028493 - type: nauc_mrr_at_5_max value: 33.796518466305834 - type: nauc_mrr_at_5_std value: -9.649151783176043 - type: nauc_ndcg_at_1000_diff1 value: 68.62634218438664 - type: nauc_ndcg_at_1000_max value: 33.057143795018696 - type: nauc_ndcg_at_1000_std value: -9.563352961803663 - type: nauc_ndcg_at_100_diff1 value: 68.58213175533443 - type: nauc_ndcg_at_100_max value: 33.35336572393414 - type: nauc_ndcg_at_100_std value: -9.127811506992467 - type: nauc_ndcg_at_10_diff1 value: 68.26726256015203 - type: nauc_ndcg_at_10_max value: 32.33115112923283 - type: nauc_ndcg_at_10_std value: -11.874276014971688 - type: nauc_ndcg_at_1_diff1 value: 72.66000012395291 - type: nauc_ndcg_at_1_max value: 31.283711202542207 - type: nauc_ndcg_at_1_std value: -11.501503096057867 - type: nauc_ndcg_at_20_diff1 value: 68.39658663907474 - type: nauc_ndcg_at_20_max value: 33.08529095010713 - type: nauc_ndcg_at_20_std value: -10.437492609480433 - type: nauc_ndcg_at_3_diff1 value: 68.05324210316826 - type: nauc_ndcg_at_3_max value: 30.30824001099573 - type: nauc_ndcg_at_3_std value: -13.044199992428771 - type: nauc_ndcg_at_5_diff1 value: 68.10994364753626 - type: nauc_ndcg_at_5_max value: 31.182072802471055 - type: nauc_ndcg_at_5_std value: -12.836057047748234 - type: nauc_precision_at_1000_diff1 value: -32.848796455727836 - type: nauc_precision_at_1000_max value: 6.715546095139156 - type: nauc_precision_at_1000_std value: 32.9655373056535 - type: nauc_precision_at_100_diff1 value: -28.794521134307093 - type: nauc_precision_at_100_max value: 11.155432738297682 - type: nauc_precision_at_100_std value: 33.30986182557851 - type: nauc_precision_at_10_diff1 value: -10.613535245108128 - type: nauc_precision_at_10_max value: 19.057316698279582 - type: nauc_precision_at_10_std value: 19.87457963908978 - type: nauc_precision_at_1_diff1 value: 72.66000012395291 - type: nauc_precision_at_1_max value: 31.283711202542207 - type: nauc_precision_at_1_std value: -11.501503096057867 - type: nauc_precision_at_20_diff1 value: -19.6984185276961 - type: nauc_precision_at_20_max value: 16.497527862287058 - type: nauc_precision_at_20_std value: 26.871607334073012 - type: nauc_precision_at_3_diff1 value: 17.130494007304765 - type: nauc_precision_at_3_max value: 23.99199625132106 - type: nauc_precision_at_3_std value: 
5.234797091652211 - type: nauc_precision_at_5_diff1 value: 3.0202641879085697 - type: nauc_precision_at_5_max value: 22.31257369308076 - type: nauc_precision_at_5_std value: 12.502866671883032 - type: nauc_recall_at_1000_diff1 value: 49.899967761974196 - type: nauc_recall_at_1000_max value: 54.39990257883846 - type: nauc_recall_at_1000_std value: 42.663306287015196 - type: nauc_recall_at_100_diff1 value: 57.87887190551234 - type: nauc_recall_at_100_max value: 48.03395851487758 - type: nauc_recall_at_100_std value: 25.008694604591312 - type: nauc_recall_at_10_diff1 value: 60.99359933290845 - type: nauc_recall_at_10_max value: 34.817508290483154 - type: nauc_recall_at_10_std value: -10.355946195658207 - type: nauc_recall_at_1_diff1 value: 72.50653845898617 - type: nauc_recall_at_1_max value: 22.521228683262873 - type: nauc_recall_at_1_std value: -17.72541519468729 - type: nauc_recall_at_20_diff1 value: 59.63721580389802 - type: nauc_recall_at_20_max value: 39.78324293003396 - type: nauc_recall_at_20_std value: -0.7738431870195353 - type: nauc_recall_at_3_diff1 value: 64.28146361759069 - type: nauc_recall_at_3_max value: 27.55821665783294 - type: nauc_recall_at_3_std value: -16.385154477134336 - type: nauc_recall_at_5_diff1 value: 62.687585623754046 - type: nauc_recall_at_5_max value: 30.357420406058328 - type: nauc_recall_at_5_std value: -14.95291415876769 - type: ndcg_at_1 value: 66.47 - type: ndcg_at_10 value: 75.235 - type: ndcg_at_100 value: 77.847 - type: ndcg_at_1000 value: 78.396 - type: ndcg_at_20 value: 76.539 - type: ndcg_at_3 value: 71.219 - type: ndcg_at_5 value: 73.235 - type: precision_at_1 value: 66.47 - type: precision_at_10 value: 11.596 - type: precision_at_100 value: 1.424 - type: precision_at_1000 value: 0.153 - type: precision_at_20 value: 6.331 - type: precision_at_3 value: 31.130000000000003 - type: precision_at_5 value: 20.735999999999997 - type: recall_at_1 value: 57.720000000000006 - type: recall_at_10 value: 85.249 - type: recall_at_100 value: 95.39699999999999 - type: recall_at_1000 value: 98.81 - type: recall_at_20 value: 89.739 - type: recall_at_3 value: 73.978 - type: recall_at_5 value: 79.355 - task: type: Retrieval dataset: name: MTEB SCIDOCS-PL type: clarin-knext/scidocs-pl config: default split: test revision: 45452b03f05560207ef19149545f168e596c9337 metrics: - type: main_score value: 15.174000000000001 - type: map_at_1 value: 3.6580000000000004 - type: map_at_10 value: 8.796 - type: map_at_100 value: 10.391 - type: map_at_1000 value: 10.646 - type: map_at_20 value: 9.592 - type: map_at_3 value: 6.489000000000001 - type: map_at_5 value: 7.600999999999999 - type: mrr_at_1 value: 18.0 - type: mrr_at_10 value: 26.845317460317457 - type: mrr_at_100 value: 28.04995949015167 - type: mrr_at_1000 value: 28.121893269944824 - type: mrr_at_20 value: 27.566026091211864 - type: mrr_at_3 value: 23.916666666666686 - type: mrr_at_5 value: 25.551666666666666 - type: nauc_map_at_1000_diff1 value: 17.302827041650488 - type: nauc_map_at_1000_max value: 26.65992706695422 - type: nauc_map_at_1000_std value: 18.96964501922404 - type: nauc_map_at_100_diff1 value: 17.21226432890004 - type: nauc_map_at_100_max value: 26.45824637348571 - type: nauc_map_at_100_std value: 18.573352847100065 - type: nauc_map_at_10_diff1 value: 17.02056023363081 - type: nauc_map_at_10_max value: 24.48428170985602 - type: nauc_map_at_10_std value: 14.014378375804235 - type: nauc_map_at_1_diff1 value: 21.638506619768716 - type: nauc_map_at_1_max value: 19.709230810058283 - type: nauc_map_at_1_std value: 
9.042419739024966 - type: nauc_map_at_20_diff1 value: 17.067893569553323 - type: nauc_map_at_20_max value: 25.69106547536296 - type: nauc_map_at_20_std value: 16.535327068913993 - type: nauc_map_at_3_diff1 value: 18.56349850011108 - type: nauc_map_at_3_max value: 22.127177599224744 - type: nauc_map_at_3_std value: 9.47260767358392 - type: nauc_map_at_5_diff1 value: 18.05585009830461 - type: nauc_map_at_5_max value: 23.31477343090323 - type: nauc_map_at_5_std value: 11.257936348356862 - type: nauc_mrr_at_1000_diff1 value: 19.71318833342125 - type: nauc_mrr_at_1000_max value: 22.359300102570092 - type: nauc_mrr_at_1000_std value: 13.89561747692388 - type: nauc_mrr_at_100_diff1 value: 19.709804653242603 - type: nauc_mrr_at_100_max value: 22.365551370687967 - type: nauc_mrr_at_100_std value: 13.918573803759068 - type: nauc_mrr_at_10_diff1 value: 19.74677273038544 - type: nauc_mrr_at_10_max value: 22.348783997030335 - type: nauc_mrr_at_10_std value: 13.606175345418963 - type: nauc_mrr_at_1_diff1 value: 21.957688664351128 - type: nauc_mrr_at_1_max value: 19.50356102866365 - type: nauc_mrr_at_1_std value: 9.323755394169037 - type: nauc_mrr_at_20_diff1 value: 19.5076818806823 - type: nauc_mrr_at_20_max value: 22.192342439483934 - type: nauc_mrr_at_20_std value: 13.705438410110608 - type: nauc_mrr_at_3_diff1 value: 19.784830140193804 - type: nauc_mrr_at_3_max value: 21.606746947165416 - type: nauc_mrr_at_3_std value: 12.289045699872666 - type: nauc_mrr_at_5_diff1 value: 20.139962218896674 - type: nauc_mrr_at_5_max value: 22.139813460789266 - type: nauc_mrr_at_5_std value: 13.177813432176084 - type: nauc_ndcg_at_1000_diff1 value: 17.78059204124948 - type: nauc_ndcg_at_1000_max value: 29.830544327132436 - type: nauc_ndcg_at_1000_std value: 28.03254237837783 - type: nauc_ndcg_at_100_diff1 value: 17.62481104076364 - type: nauc_ndcg_at_100_max value: 28.629131876483665 - type: nauc_ndcg_at_100_std value: 26.019853664301124 - type: nauc_ndcg_at_10_diff1 value: 17.25237540570343 - type: nauc_ndcg_at_10_max value: 25.128032787033604 - type: nauc_ndcg_at_10_std value: 16.571629975349868 - type: nauc_ndcg_at_1_diff1 value: 21.957688664351128 - type: nauc_ndcg_at_1_max value: 19.50356102866365 - type: nauc_ndcg_at_1_std value: 9.323755394169037 - type: nauc_ndcg_at_20_diff1 value: 16.549388210526494 - type: nauc_ndcg_at_20_max value: 26.1871953370256 - type: nauc_ndcg_at_20_std value: 19.971064555030125 - type: nauc_ndcg_at_3_diff1 value: 18.707127276019474 - type: nauc_ndcg_at_3_max value: 22.042786711511813 - type: nauc_ndcg_at_3_std value: 11.103829353868623 - type: nauc_ndcg_at_5_diff1 value: 18.45321448876598 - type: nauc_ndcg_at_5_max value: 23.475902453066492 - type: nauc_ndcg_at_5_std value: 13.216222368946411 - type: nauc_precision_at_1000_diff1 value: 11.843768977161584 - type: nauc_precision_at_1000_max value: 30.300299347010352 - type: nauc_precision_at_1000_std value: 41.123748924498585 - type: nauc_precision_at_100_diff1 value: 13.765676375073074 - type: nauc_precision_at_100_max value: 29.769561801824956 - type: nauc_precision_at_100_std value: 37.56343888054612 - type: nauc_precision_at_10_diff1 value: 14.123009605345343 - type: nauc_precision_at_10_max value: 26.045793706986558 - type: nauc_precision_at_10_std value: 20.45802977436883 - type: nauc_precision_at_1_diff1 value: 21.957688664351128 - type: nauc_precision_at_1_max value: 19.50356102866365 - type: nauc_precision_at_1_std value: 9.323755394169037 - type: nauc_precision_at_20_diff1 value: 12.080580953868749 - type: 
nauc_precision_at_20_max value: 26.741203934729374 - type: nauc_precision_at_20_std value: 26.249289307014976 - type: nauc_precision_at_3_diff1 value: 17.390833784290034 - type: nauc_precision_at_3_max value: 22.639415005064585 - type: nauc_precision_at_3_std value: 11.481404394862311 - type: nauc_precision_at_5_diff1 value: 17.18007614612505 - type: nauc_precision_at_5_max value: 24.244045184229563 - type: nauc_precision_at_5_std value: 15.180528647694574 - type: nauc_recall_at_1000_diff1 value: 11.507406580463488 - type: nauc_recall_at_1000_max value: 30.78976497232251 - type: nauc_recall_at_1000_std value: 41.618419379918855 - type: nauc_recall_at_100_diff1 value: 13.408507737517144 - type: nauc_recall_at_100_max value: 29.849796157178197 - type: nauc_recall_at_100_std value: 37.58778281760627 - type: nauc_recall_at_10_diff1 value: 13.942112101503866 - type: nauc_recall_at_10_max value: 26.228452951171487 - type: nauc_recall_at_10_std value: 20.14835260352246 - type: nauc_recall_at_1_diff1 value: 21.638506619768716 - type: nauc_recall_at_1_max value: 19.709230810058283 - type: nauc_recall_at_1_std value: 9.042419739024966 - type: nauc_recall_at_20_diff1 value: 11.905542570350702 - type: nauc_recall_at_20_max value: 26.84107459006622 - type: nauc_recall_at_20_std value: 25.888986621614645 - type: nauc_recall_at_3_diff1 value: 17.056201299401692 - type: nauc_recall_at_3_max value: 22.94288018834461 - type: nauc_recall_at_3_std value: 11.337560544201224 - type: nauc_recall_at_5_diff1 value: 16.89022137209632 - type: nauc_recall_at_5_max value: 24.564195711081545 - type: nauc_recall_at_5_std value: 14.979769166201622 - type: ndcg_at_1 value: 18.0 - type: ndcg_at_10 value: 15.174000000000001 - type: ndcg_at_100 value: 22.047 - type: ndcg_at_1000 value: 27.057 - type: ndcg_at_20 value: 17.628 - type: ndcg_at_3 value: 14.536999999999999 - type: ndcg_at_5 value: 12.590000000000002 - type: precision_at_1 value: 18.0 - type: precision_at_10 value: 7.82 - type: precision_at_100 value: 1.773 - type: precision_at_1000 value: 0.298 - type: precision_at_20 value: 5.335 - type: precision_at_3 value: 13.5 - type: precision_at_5 value: 10.92 - type: recall_at_1 value: 3.6580000000000004 - type: recall_at_10 value: 15.867999999999999 - type: recall_at_100 value: 36.068 - type: recall_at_1000 value: 60.608 - type: recall_at_20 value: 21.653 - type: recall_at_3 value: 8.248 - type: recall_at_5 value: 11.108 - task: type: PairClassification dataset: name: MTEB SICK-E-PL type: PL-MTEB/sicke-pl-pairclassification config: default split: test revision: 71bba34b0ece6c56dfcf46d9758a27f7a90f17e9 metrics: - type: cosine_accuracy value: 78.12882185079495 - type: cosine_accuracy_threshold value: 95.76345682144165 - type: cosine_ap value: 63.56538407363026 - type: cosine_f1 value: 60.88388690639582 - type: cosine_f1_threshold value: 92.86266565322876 - type: cosine_precision value: 49.53104064314426 - type: cosine_recall value: 78.98860398860398 - type: dot_accuracy value: 78.12882185079495 - type: dot_accuracy_threshold value: 95.76345682144165 - type: dot_ap value: 63.56553287602377 - type: dot_f1 value: 60.88388690639582 - type: dot_f1_threshold value: 92.86266565322876 - type: dot_precision value: 49.53104064314426 - type: dot_recall value: 78.98860398860398 - type: euclidean_accuracy value: 78.12882185079495 - type: euclidean_accuracy_threshold value: 29.108554124832153 - type: euclidean_ap value: 63.56543484315041 - type: euclidean_f1 value: 60.88388690639582 - type: euclidean_f1_threshold value: 37.781822681427 
- type: euclidean_precision value: 49.53104064314426 - type: euclidean_recall value: 78.98860398860398 - type: main_score value: 63.56553287602377 - type: manhattan_accuracy value: 77.82307378719935 - type: manhattan_accuracy_threshold value: 658.8656902313232 - type: manhattan_ap value: 63.12761769067177 - type: manhattan_f1 value: 60.76436623590872 - type: manhattan_f1_threshold value: 888.3136749267578 - type: manhattan_precision value: 49.48499776085983 - type: manhattan_recall value: 78.70370370370371 - type: max_ap value: 63.56553287602377 - type: max_f1 value: 60.88388690639582 - type: max_precision value: 49.53104064314426 - type: max_recall value: 78.98860398860398 - type: similarity_accuracy value: 78.12882185079495 - type: similarity_accuracy_threshold value: 95.76345682144165 - type: similarity_ap value: 63.56538407363026 - type: similarity_f1 value: 60.88388690639582 - type: similarity_f1_threshold value: 92.86266565322876 - type: similarity_precision value: 49.53104064314426 - type: similarity_recall value: 78.98860398860398 - task: type: STS dataset: name: MTEB SICK-R-PL type: PL-MTEB/sickr-pl-sts config: default split: test revision: fd5c2441b7eeff8676768036142af4cfa42c1339 metrics: - type: cosine_pearson value: 71.75502609028113 - type: cosine_spearman value: 66.52097638938338 - type: euclidean_pearson value: 68.6974439167054 - type: euclidean_spearman value: 66.52095939114172 - type: main_score value: 66.52097638938338 - type: manhattan_pearson value: 68.53848708135571 - type: manhattan_spearman value: 66.29909223435631 - type: pearson value: 71.75502609028113 - type: spearman value: 66.52097638938338 - task: type: STS dataset: name: MTEB STS22 (pl) type: mteb/sts22-crosslingual-sts config: pl split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 40.06621394078099 - type: cosine_spearman value: 45.160446103264285 - type: euclidean_pearson value: 25.38908314629843 - type: euclidean_spearman value: 45.160446103264285 - type: main_score value: 45.160446103264285 - type: manhattan_pearson value: 25.13217941116968 - type: manhattan_spearman value: 45.05397285684081 - type: pearson value: 40.06621394078099 - type: spearman value: 45.160446103264285 - task: type: STS dataset: name: MTEB STS22 (de-pl) type: mteb/sts22-crosslingual-sts config: de-pl split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 40.2221719679774 - type: cosine_spearman value: 57.18465019880842 - type: euclidean_pearson value: 42.11211158455479 - type: euclidean_spearman value: 57.18465019880842 - type: main_score value: 57.18465019880842 - type: manhattan_pearson value: 43.24148614152723 - type: manhattan_spearman value: 56.35320940431847 - type: pearson value: 40.2221719679774 - type: spearman value: 57.18465019880842 - task: type: Retrieval dataset: name: MTEB SciFact-PL type: clarin-knext/scifact-pl config: default split: test revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e metrics: - type: main_score value: 62.064 - type: map_at_1 value: 48.317 - type: map_at_10 value: 57.693000000000005 - type: map_at_100 value: 58.392999999999994 - type: map_at_1000 value: 58.428999999999995 - type: map_at_20 value: 58.108000000000004 - type: map_at_3 value: 55.293000000000006 - type: map_at_5 value: 56.595 - type: mrr_at_1 value: 51.0 - type: mrr_at_10 value: 59.019576719576705 - type: mrr_at_100 value: 59.58007358566797 - type: mrr_at_1000 value: 59.61403985820887 - type: mrr_at_20 value: 59.35199007075942 - type: 
mrr_at_3 value: 57.166666666666664 - type: mrr_at_5 value: 58.08333333333332 - type: nauc_map_at_1000_diff1 value: 55.90310480193163 - type: nauc_map_at_1000_max value: 40.922646499130586 - type: nauc_map_at_1000_std value: 6.308307542867231 - type: nauc_map_at_100_diff1 value: 55.87923016501095 - type: nauc_map_at_100_max value: 40.930429212300396 - type: nauc_map_at_100_std value: 6.302652510324859 - type: nauc_map_at_10_diff1 value: 55.96811326806582 - type: nauc_map_at_10_max value: 40.91912121040118 - type: nauc_map_at_10_std value: 6.315081020792943 - type: nauc_map_at_1_diff1 value: 61.615316460538374 - type: nauc_map_at_1_max value: 34.4312789344494 - type: nauc_map_at_1_std value: -2.151749018851701 - type: nauc_map_at_20_diff1 value: 55.781940594193316 - type: nauc_map_at_20_max value: 40.877518039008585 - type: nauc_map_at_20_std value: 6.170527123248918 - type: nauc_map_at_3_diff1 value: 58.104315292507216 - type: nauc_map_at_3_max value: 39.524635028616544 - type: nauc_map_at_3_std value: 4.367263811245541 - type: nauc_map_at_5_diff1 value: 56.60725686218003 - type: nauc_map_at_5_max value: 40.362341129747456 - type: nauc_map_at_5_std value: 5.222556427559299 - type: nauc_mrr_at_1000_diff1 value: 56.243518111487454 - type: nauc_mrr_at_1000_max value: 41.92306224416779 - type: nauc_mrr_at_1000_std value: 7.331011181148979 - type: nauc_mrr_at_100_diff1 value: 56.21745814714038 - type: nauc_mrr_at_100_max value: 41.92847851363498 - type: nauc_mrr_at_100_std value: 7.322136402819359 - type: nauc_mrr_at_10_diff1 value: 56.22224221410973 - type: nauc_mrr_at_10_max value: 42.020110225540144 - type: nauc_mrr_at_10_std value: 7.367785001729785 - type: nauc_mrr_at_1_diff1 value: 61.65968884760533 - type: nauc_mrr_at_1_max value: 39.22611274899148 - type: nauc_mrr_at_1_std value: 3.3484556807524357 - type: nauc_mrr_at_20_diff1 value: 56.140226618395495 - type: nauc_mrr_at_20_max value: 41.92506913405156 - type: nauc_mrr_at_20_std value: 7.20339996949852 - type: nauc_mrr_at_3_diff1 value: 57.82506573973446 - type: nauc_mrr_at_3_max value: 41.962001263558484 - type: nauc_mrr_at_3_std value: 6.909954113302328 - type: nauc_mrr_at_5_diff1 value: 56.659054585223565 - type: nauc_mrr_at_5_max value: 42.220145330498326 - type: nauc_mrr_at_5_std value: 6.914754115832333 - type: nauc_ndcg_at_1000_diff1 value: 54.101423320176956 - type: nauc_ndcg_at_1000_max value: 42.35761455565217 - type: nauc_ndcg_at_1000_std value: 9.158968107515042 - type: nauc_ndcg_at_100_diff1 value: 53.193377266960695 - type: nauc_ndcg_at_100_max value: 42.39818084789296 - type: nauc_ndcg_at_100_std value: 8.982680006715663 - type: nauc_ndcg_at_10_diff1 value: 52.7521864873992 - type: nauc_ndcg_at_10_max value: 42.25954681169497 - type: nauc_ndcg_at_10_std value: 9.025856795668409 - type: nauc_ndcg_at_1_diff1 value: 61.65968884760533 - type: nauc_ndcg_at_1_max value: 39.22611274899148 - type: nauc_ndcg_at_1_std value: 3.3484556807524357 - type: nauc_ndcg_at_20_diff1 value: 52.24054304553779 - type: nauc_ndcg_at_20_max value: 42.14484844258701 - type: nauc_ndcg_at_20_std value: 8.522811774790046 - type: nauc_ndcg_at_3_diff1 value: 56.65801023652111 - type: nauc_ndcg_at_3_max value: 41.59901000744857 - type: nauc_ndcg_at_3_std value: 6.866411754213651 - type: nauc_ndcg_at_5_diff1 value: 54.25032835371862 - type: nauc_ndcg_at_5_max value: 41.52568005051319 - type: nauc_ndcg_at_5_std value: 6.747184564934237 - type: nauc_precision_at_1000_diff1 value: -12.438995870489618 - type: nauc_precision_at_1000_max value: 
33.65458584888833 - type: nauc_precision_at_1000_std value: 38.65000092313945 - type: nauc_precision_at_100_diff1 value: -3.7051397832573696 - type: nauc_precision_at_100_max value: 36.777033924925384 - type: nauc_precision_at_100_std value: 32.24732998272339 - type: nauc_precision_at_10_diff1 value: 14.458974499542448 - type: nauc_precision_at_10_max value: 45.75828754327736 - type: nauc_precision_at_10_std value: 31.734511856215665 - type: nauc_precision_at_1_diff1 value: 61.65968884760533 - type: nauc_precision_at_1_max value: 39.22611274899148 - type: nauc_precision_at_1_std value: 3.3484556807524357 - type: nauc_precision_at_20_diff1 value: 6.911000226020142 - type: nauc_precision_at_20_max value: 42.75953196446269 - type: nauc_precision_at_20_std value: 30.293217657388254 - type: nauc_precision_at_3_diff1 value: 39.95888414475174 - type: nauc_precision_at_3_max value: 46.81095681980396 - type: nauc_precision_at_3_std value: 20.732734118894037 - type: nauc_precision_at_5_diff1 value: 27.25227607416867 - type: nauc_precision_at_5_max value: 45.278620768210615 - type: nauc_precision_at_5_std value: 22.7094842525771 - type: nauc_recall_at_1000_diff1 value: 54.66853408029846 - type: nauc_recall_at_1000_max value: 69.49112978524705 - type: nauc_recall_at_1000_std value: 84.76890756302552 - type: nauc_recall_at_100_diff1 value: 33.641140071848085 - type: nauc_recall_at_100_max value: 49.94619316653212 - type: nauc_recall_at_100_std value: 26.970675275760104 - type: nauc_recall_at_10_diff1 value: 38.56340942303001 - type: nauc_recall_at_10_max value: 44.13889679913801 - type: nauc_recall_at_10_std value: 17.814455740104584 - type: nauc_recall_at_1_diff1 value: 61.615316460538374 - type: nauc_recall_at_1_max value: 34.4312789344494 - type: nauc_recall_at_1_std value: -2.151749018851701 - type: nauc_recall_at_20_diff1 value: 33.86997626483988 - type: nauc_recall_at_20_max value: 44.31136705663488 - type: nauc_recall_at_20_std value: 16.58271492635832 - type: nauc_recall_at_3_diff1 value: 52.39739118413791 - type: nauc_recall_at_3_max value: 40.56472420414715 - type: nauc_recall_at_3_std value: 7.856902134348368 - type: nauc_recall_at_5_diff1 value: 45.693766776717595 - type: nauc_recall_at_5_max value: 41.817545551209086 - type: nauc_recall_at_5_std value: 9.066813773598692 - type: ndcg_at_1 value: 51.0 - type: ndcg_at_10 value: 62.064 - type: ndcg_at_100 value: 65.45 - type: ndcg_at_1000 value: 66.366 - type: ndcg_at_20 value: 63.418 - type: ndcg_at_3 value: 57.915000000000006 - type: ndcg_at_5 value: 59.65200000000001 - type: precision_at_1 value: 51.0 - type: precision_at_10 value: 8.433 - type: precision_at_100 value: 1.03 - type: precision_at_1000 value: 0.11 - type: precision_at_20 value: 4.517 - type: precision_at_3 value: 23.0 - type: precision_at_5 value: 15.067 - type: recall_at_1 value: 48.317 - type: recall_at_10 value: 74.078 - type: recall_at_100 value: 90.167 - type: recall_at_1000 value: 97.333 - type: recall_at_20 value: 79.256 - type: recall_at_3 value: 62.561 - type: recall_at_5 value: 67.039 - task: type: Retrieval dataset: name: MTEB TRECCOVID-PL type: clarin-knext/trec-covid-pl config: default split: test revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd metrics: - type: main_score value: 69.244 - type: map_at_1 value: 0.216 - type: map_at_10 value: 1.717 - type: map_at_100 value: 9.051 - type: map_at_1000 value: 21.688 - type: map_at_20 value: 2.972 - type: map_at_3 value: 0.624 - type: map_at_5 value: 0.9809999999999999 - type: mrr_at_1 value: 82.0 - type: mrr_at_10 
value: 88.41666666666666 - type: mrr_at_100 value: 88.57051282051282 - type: mrr_at_1000 value: 88.57051282051282 - type: mrr_at_20 value: 88.57051282051282 - type: mrr_at_3 value: 87.66666666666666 - type: mrr_at_5 value: 88.16666666666666 - type: nauc_map_at_1000_diff1 value: -21.210172839828886 - type: nauc_map_at_1000_max value: 50.364439193708456 - type: nauc_map_at_1000_std value: 82.23413161215711 - type: nauc_map_at_100_diff1 value: -3.737989437317314 - type: nauc_map_at_100_max value: 40.24314095187729 - type: nauc_map_at_100_std value: 74.6556355692718 - type: nauc_map_at_10_diff1 value: 24.069758586207186 - type: nauc_map_at_10_max value: 25.978576944212445 - type: nauc_map_at_10_std value: 30.92185789388276 - type: nauc_map_at_1_diff1 value: 33.44422662406722 - type: nauc_map_at_1_max value: 18.58849173002632 - type: nauc_map_at_1_std value: 23.001195148863555 - type: nauc_map_at_20_diff1 value: 16.195748164952704 - type: nauc_map_at_20_max value: 32.418991157208055 - type: nauc_map_at_20_std value: 45.053299350375795 - type: nauc_map_at_3_diff1 value: 32.94899528110181 - type: nauc_map_at_3_max value: 16.721379232494304 - type: nauc_map_at_3_std value: 18.336699336799814 - type: nauc_map_at_5_diff1 value: 30.34930846309755 - type: nauc_map_at_5_max value: 19.37661209832802 - type: nauc_map_at_5_std value: 20.312897662543314 - type: nauc_mrr_at_1000_diff1 value: 49.418158929182006 - type: nauc_mrr_at_1000_max value: 67.05328023364747 - type: nauc_mrr_at_1000_std value: 70.85520896614209 - type: nauc_mrr_at_100_diff1 value: 49.418158929182006 - type: nauc_mrr_at_100_max value: 67.05328023364747 - type: nauc_mrr_at_100_std value: 70.85520896614209 - type: nauc_mrr_at_10_diff1 value: 49.50157932873256 - type: nauc_mrr_at_10_max value: 65.88227845429796 - type: nauc_mrr_at_10_std value: 70.87422352601853 - type: nauc_mrr_at_1_diff1 value: 44.82872563057607 - type: nauc_mrr_at_1_max value: 70.45930168520755 - type: nauc_mrr_at_1_std value: 69.88104416785988 - type: nauc_mrr_at_20_diff1 value: 49.418158929182006 - type: nauc_mrr_at_20_max value: 67.05328023364747 - type: nauc_mrr_at_20_std value: 70.85520896614209 - type: nauc_mrr_at_3_diff1 value: 49.71407489393107 - type: nauc_mrr_at_3_max value: 67.77215590165227 - type: nauc_mrr_at_3_std value: 72.72379898279185 - type: nauc_mrr_at_5_diff1 value: 50.328834220772976 - type: nauc_mrr_at_5_max value: 66.34746357369875 - type: nauc_mrr_at_5_std value: 71.51800332961842 - type: nauc_ndcg_at_1000_diff1 value: -11.723371568664843 - type: nauc_ndcg_at_1000_max value: 53.41150083076567 - type: nauc_ndcg_at_1000_std value: 81.94372023908832 - type: nauc_ndcg_at_100_diff1 value: -15.990454633114279 - type: nauc_ndcg_at_100_max value: 45.35431514782352 - type: nauc_ndcg_at_100_std value: 75.73014493320755 - type: nauc_ndcg_at_10_diff1 value: 4.30050518239422 - type: nauc_ndcg_at_10_max value: 50.83631607203189 - type: nauc_ndcg_at_10_std value: 63.1087699434136 - type: nauc_ndcg_at_1_diff1 value: 17.206529677661354 - type: nauc_ndcg_at_1_max value: 62.14050255620695 - type: nauc_ndcg_at_1_std value: 64.51116243264046 - type: nauc_ndcg_at_20_diff1 value: -5.9182205607515685 - type: nauc_ndcg_at_20_max value: 49.12802457140552 - type: nauc_ndcg_at_20_std value: 68.77672262568693 - type: nauc_ndcg_at_3_diff1 value: 22.158007969692125 - type: nauc_ndcg_at_3_max value: 48.17593837968984 - type: nauc_ndcg_at_3_std value: 58.4991887813489 - type: nauc_ndcg_at_5_diff1 value: 16.89487399786786 - type: nauc_ndcg_at_5_max value: 46.752900245009414 - 
type: nauc_ndcg_at_5_std value: 60.870638593862914 - type: nauc_precision_at_1000_diff1 value: -24.67751088399524 - type: nauc_precision_at_1000_max value: 42.70887481946044 - type: nauc_precision_at_1000_std value: 49.219386318590566 - type: nauc_precision_at_100_diff1 value: -19.829901963316278 - type: nauc_precision_at_100_max value: 44.4613898680245 - type: nauc_precision_at_100_std value: 74.8829067578589 - type: nauc_precision_at_10_diff1 value: -0.6759004971171398 - type: nauc_precision_at_10_max value: 52.16154071543153 - type: nauc_precision_at_10_std value: 62.98886080224083 - type: nauc_precision_at_1_diff1 value: 44.82872563057607 - type: nauc_precision_at_1_max value: 70.45930168520755 - type: nauc_precision_at_1_std value: 69.88104416785988 - type: nauc_precision_at_20_diff1 value: -11.458671607862547 - type: nauc_precision_at_20_max value: 49.71202888307331 - type: nauc_precision_at_20_std value: 71.79100842422972 - type: nauc_precision_at_3_diff1 value: 30.23048096153466 - type: nauc_precision_at_3_max value: 48.24954855245538 - type: nauc_precision_at_3_std value: 54.344575833478935 - type: nauc_precision_at_5_diff1 value: 13.925893655561437 - type: nauc_precision_at_5_max value: 46.23506752573775 - type: nauc_precision_at_5_std value: 59.610666544378944 - type: nauc_recall_at_1000_diff1 value: -13.691809447793393 - type: nauc_recall_at_1000_max value: 50.39633577248049 - type: nauc_recall_at_1000_std value: 76.65225154588104 - type: nauc_recall_at_100_diff1 value: 4.67778695632382 - type: nauc_recall_at_100_max value: 30.19071079451134 - type: nauc_recall_at_100_std value: 65.03682595699173 - type: nauc_recall_at_10_diff1 value: 26.24600831247693 - type: nauc_recall_at_10_max value: 22.235399614875632 - type: nauc_recall_at_10_std value: 27.653841671594176 - type: nauc_recall_at_1_diff1 value: 33.44422662406722 - type: nauc_recall_at_1_max value: 18.58849173002632 - type: nauc_recall_at_1_std value: 23.001195148863555 - type: nauc_recall_at_20_diff1 value: 19.13211263378722 - type: nauc_recall_at_20_max value: 26.697525172621827 - type: nauc_recall_at_20_std value: 40.9095035359023 - type: nauc_recall_at_3_diff1 value: 30.47343886364865 - type: nauc_recall_at_3_max value: 12.854379330237647 - type: nauc_recall_at_3_std value: 14.711252261798258 - type: nauc_recall_at_5_diff1 value: 28.344400535065112 - type: nauc_recall_at_5_max value: 14.755638630484144 - type: nauc_recall_at_5_std value: 15.864031786019787 - type: ndcg_at_1 value: 72.0 - type: ndcg_at_10 value: 69.244 - type: ndcg_at_100 value: 50.834 - type: ndcg_at_1000 value: 45.535 - type: ndcg_at_20 value: 65.676 - type: ndcg_at_3 value: 73.776 - type: ndcg_at_5 value: 72.715 - type: precision_at_1 value: 82.0 - type: precision_at_10 value: 73.6 - type: precision_at_100 value: 52.22 - type: precision_at_1000 value: 20.380000000000003 - type: precision_at_20 value: 69.0 - type: precision_at_3 value: 81.333 - type: precision_at_5 value: 79.2 - type: recall_at_1 value: 0.216 - type: recall_at_10 value: 1.8900000000000001 - type: recall_at_100 value: 12.359 - type: recall_at_1000 value: 42.791000000000004 - type: recall_at_20 value: 3.44 - type: recall_at_3 value: 0.653 - type: recall_at_5 value: 1.048 - task: type: MultilabelClassification dataset: name: MTEB CEDRClassification type: ai-forever/cedr-classification config: default split: test revision: c0ba03d058e3e1b2f3fd20518875a4563dd12db4 metrics: - type: accuracy value: 43.29968119022317 - type: f1 value: 41.112000768648386 - type: lrap value: 72.06216790648348 - 
type: main_score value: 43.29968119022317 - task: type: Classification dataset: name: MTEB GeoreviewClassification type: ai-forever/georeview-classification config: default split: test revision: 3765c0d1de6b7d264bc459433c45e5a75513839c metrics: - type: accuracy value: 52.0361328125 - type: f1 value: 47.84397823612054 - type: f1_weighted value: 47.84111706041435 - type: main_score value: 52.0361328125 - task: type: Clustering dataset: name: MTEB GeoreviewClusteringP2P type: ai-forever/georeview-clustering-p2p config: default split: test revision: 97a313c8fc85b47f13f33e7e9a95c1ad888c7fec metrics: - type: main_score value: 60.28266888390485 - type: v_measure value: 60.28266888390485 - type: v_measure_std value: 1.0348363132473835 - task: type: Classification dataset: name: MTEB HeadlineClassification type: ai-forever/headline-classification config: default split: test revision: 2fe05ee6b5832cda29f2ef7aaad7b7fe6a3609eb metrics: - type: accuracy value: 83.4033203125 - type: f1 value: 83.39708551274371 - type: f1_weighted value: 83.39502222187862 - type: main_score value: 83.4033203125 - task: type: Classification dataset: name: MTEB InappropriatenessClassification type: ai-forever/inappropriateness-classification config: default split: test revision: 601651fdc45ef243751676e62dd7a19f491c0285 metrics: - type: accuracy value: 64.140625 - type: ap value: 59.28880813167948 - type: ap_weighted value: 59.28880813167948 - type: f1 value: 63.72032598814496 - type: f1_weighted value: 63.72032598814496 - type: main_score value: 64.140625 - task: type: Classification dataset: name: MTEB KinopoiskClassification type: ai-forever/kinopoisk-sentiment-classification config: default split: test revision: 5911f26666ac11af46cb9c6849d0dc80a378af24 metrics: - type: accuracy value: 63.15333333333333 - type: f1 value: 59.395986541732384 - type: f1_weighted value: 59.395986541732384 - type: main_score value: 63.15333333333333 - task: type: Reranking dataset: name: MTEB MIRACLReranking (ru) type: miracl/mmteb-miracl-reranking config: ru split: dev revision: 6d1962c527217f8927fca80f890f14f36b2802af metrics: - type: MAP@1(MIRACL) value: 29.732999999999997 - type: MAP@10(MIRACL) value: 48.333 - type: MAP@100(MIRACL) value: 50.517 - type: MAP@1000(MIRACL) value: 50.517 - type: MAP@20(MIRACL) value: 49.85 - type: MAP@3(MIRACL) value: 41.843 - type: MAP@5(MIRACL) value: 45.323 - type: NDCG@1(MIRACL) value: 48.436 - type: NDCG@10(MIRACL) value: 56.111999999999995 - type: NDCG@100(MIRACL) value: 60.617 - type: NDCG@1000(MIRACL) value: 60.617 - type: NDCG@20(MIRACL) value: 58.826 - type: NDCG@3(MIRACL) value: 50.483999999999995 - type: NDCG@5(MIRACL) value: 52.61 - type: P@1(MIRACL) value: 48.436 - type: P@10(MIRACL) value: 14.667 - type: P@100(MIRACL) value: 1.9529999999999998 - type: P@1000(MIRACL) value: 0.19499999999999998 - type: P@20(MIRACL) value: 8.665000000000001 - type: P@3(MIRACL) value: 31.302000000000003 - type: P@5(MIRACL) value: 23.384 - type: Recall@1(MIRACL) value: 29.732999999999997 - type: Recall@10(MIRACL) value: 66.532 - type: Recall@100(MIRACL) value: 79.952 - type: Recall@1000(MIRACL) value: 79.952 - type: Recall@20(MIRACL) value: 73.75 - type: Recall@3(MIRACL) value: 49.541000000000004 - type: Recall@5(MIRACL) value: 57.389 - type: main_score value: 56.111999999999995 - type: nAUC_MAP@1000_diff1(MIRACL) value: 15.8510181843185 - type: nAUC_MAP@1000_max(MIRACL) value: 27.452155305037095 - type: nAUC_MAP@1000_std(MIRACL) value: 15.147015882448075 - type: nAUC_MAP@100_diff1(MIRACL) value: 15.8510181843185 - 
type: nAUC_MAP@100_max(MIRACL) value: 27.452155305037095 - type: nAUC_MAP@100_std(MIRACL) value: 15.147015882448075 - type: nAUC_MAP@10_diff1(MIRACL) value: 17.808742699385363 - type: nAUC_MAP@10_max(MIRACL) value: 25.21217663908093 - type: nAUC_MAP@10_std(MIRACL) value: 13.970995033749716 - type: nAUC_MAP@1_diff1(MIRACL) value: 34.30066727981356 - type: nAUC_MAP@1_max(MIRACL) value: 11.096793012814972 - type: nAUC_MAP@1_std(MIRACL) value: 4.298644702770651 - type: nAUC_MAP@20_diff1(MIRACL) value: 16.499957004860978 - type: nAUC_MAP@20_max(MIRACL) value: 26.676987318433714 - type: nAUC_MAP@20_std(MIRACL) value: 15.166175199040485 - type: nAUC_MAP@3_diff1(MIRACL) value: 23.797870452650084 - type: nAUC_MAP@3_max(MIRACL) value: 18.20460307122738 - type: nAUC_MAP@3_std(MIRACL) value: 8.985118628338126 - type: nAUC_MAP@5_diff1(MIRACL) value: 20.549029352694866 - type: nAUC_MAP@5_max(MIRACL) value: 21.528805328834324 - type: nAUC_MAP@5_std(MIRACL) value: 11.131951589460492 - type: nAUC_NDCG@1000_diff1(MIRACL) value: 5.973372149854828 - type: nAUC_NDCG@1000_max(MIRACL) value: 36.70565868748619 - type: nAUC_NDCG@1000_std(MIRACL) value: 19.551007976769245 - type: nAUC_NDCG@100_diff1(MIRACL) value: 5.973372149854828 - type: nAUC_NDCG@100_max(MIRACL) value: 36.70565868748619 - type: nAUC_NDCG@100_std(MIRACL) value: 19.551007976769245 - type: nAUC_NDCG@10_diff1(MIRACL) value: 10.894100451667919 - type: nAUC_NDCG@10_max(MIRACL) value: 31.735109695399416 - type: nAUC_NDCG@10_std(MIRACL) value: 17.674556265190706 - type: nAUC_NDCG@1_diff1(MIRACL) value: 22.04892839322977 - type: nAUC_NDCG@1_max(MIRACL) value: 32.51034181981298 - type: nAUC_NDCG@1_std(MIRACL) value: 14.343760356007765 - type: nAUC_NDCG@20_diff1(MIRACL) value: 8.074119776676103 - type: nAUC_NDCG@20_max(MIRACL) value: 34.52221220694718 - type: nAUC_NDCG@20_std(MIRACL) value: 19.94006423667 - type: nAUC_NDCG@3_diff1(MIRACL) value: 16.284195830367825 - type: nAUC_NDCG@3_max(MIRACL) value: 26.521965826220352 - type: nAUC_NDCG@3_std(MIRACL) value: 13.850033289666094 - type: nAUC_NDCG@5_diff1(MIRACL) value: 14.362693198633952 - type: nAUC_NDCG@5_max(MIRACL) value: 27.781809390068872 - type: nAUC_NDCG@5_std(MIRACL) value: 14.879808284537981 - type: nAUC_P@1000_diff1(MIRACL) value: -27.606682296231373 - type: nAUC_P@1000_max(MIRACL) value: 33.03084251491326 - type: nAUC_P@1000_std(MIRACL) value: 15.674013757663898 - type: nAUC_P@100_diff1(MIRACL) value: -27.606682296231327 - type: nAUC_P@100_max(MIRACL) value: 33.03084251491332 - type: nAUC_P@100_std(MIRACL) value: 15.674013757663937 - type: nAUC_P@10_diff1(MIRACL) value: -23.575685602922174 - type: nAUC_P@10_max(MIRACL) value: 36.72548498655645 - type: nAUC_P@10_std(MIRACL) value: 21.317694028285104 - type: nAUC_P@1_diff1(MIRACL) value: 22.04892839322977 - type: nAUC_P@1_max(MIRACL) value: 32.51034181981298 - type: nAUC_P@1_std(MIRACL) value: 14.343760356007765 - type: nAUC_P@20_diff1(MIRACL) value: -26.064734965649322 - type: nAUC_P@20_max(MIRACL) value: 34.10936682680113 - type: nAUC_P@20_std(MIRACL) value: 20.31615496254574 - type: nAUC_P@3_diff1(MIRACL) value: -10.903444655544746 - type: nAUC_P@3_max(MIRACL) value: 34.33585029049373 - type: nAUC_P@3_std(MIRACL) value: 18.620142249622834 - type: nAUC_P@5_diff1(MIRACL) value: -18.454884144221385 - type: nAUC_P@5_max(MIRACL) value: 35.620428961110036 - type: nAUC_P@5_std(MIRACL) value: 20.265460635926893 - type: nAUC_Recall@1000_diff1(MIRACL) value: -28.25716669219796 - type: nAUC_Recall@1000_max(MIRACL) value: 59.88673755432144 - type: 
nAUC_Recall@1000_std(MIRACL) value: 29.916576785101622 - type: nAUC_Recall@100_diff1(MIRACL) value: -28.25716669219796 - type: nAUC_Recall@100_max(MIRACL) value: 59.88673755432144 - type: nAUC_Recall@100_std(MIRACL) value: 29.916576785101622 - type: nAUC_Recall@10_diff1(MIRACL) value: -2.5731369116803466 - type: nAUC_Recall@10_max(MIRACL) value: 34.37108435281944 - type: nAUC_Recall@10_std(MIRACL) value: 20.744457001608925 - type: nAUC_Recall@1_diff1(MIRACL) value: 34.30066727981356 - type: nAUC_Recall@1_max(MIRACL) value: 11.096793012814972 - type: nAUC_Recall@1_std(MIRACL) value: 4.298644702770651 - type: nAUC_Recall@20_diff1(MIRACL) value: -13.667980220614172 - type: nAUC_Recall@20_max(MIRACL) value: 44.947659106700044 - type: nAUC_Recall@20_std(MIRACL) value: 29.413435369376923 - type: nAUC_Recall@3_diff1(MIRACL) value: 15.838199908854786 - type: nAUC_Recall@3_max(MIRACL) value: 17.368565662731196 - type: nAUC_Recall@3_std(MIRACL) value: 10.538072940876807 - type: nAUC_Recall@5_diff1(MIRACL) value: 8.199967584892176 - type: nAUC_Recall@5_max(MIRACL) value: 23.500985460573578 - type: nAUC_Recall@5_std(MIRACL) value: 13.477424183539433 - task: type: Retrieval dataset: name: MTEB MIRACLRetrieval (ru) type: miracl/mmteb-miracl config: ru split: dev revision: main metrics: - type: main_score value: 52.211 - type: map_at_1 value: 23.238 - type: map_at_10 value: 41.559000000000005 - type: map_at_100 value: 44.757999999999996 - type: map_at_1000 value: 44.861000000000004 - type: map_at_20 value: 43.461 - type: map_at_3 value: 34.593 - type: map_at_5 value: 38.056 - type: mrr_at_1 value: 47.04472843450479 - type: mrr_at_10 value: 59.587485420153286 - type: mrr_at_100 value: 60.17662556783717 - type: mrr_at_1000 value: 60.1850174860852 - type: mrr_at_20 value: 60.003979383733544 - type: mrr_at_3 value: 56.62939297124608 - type: mrr_at_5 value: 58.33067092651768 - type: nauc_map_at_1000_diff1 value: 26.665139374258256 - type: nauc_map_at_1000_max value: 20.20801190375824 - type: nauc_map_at_1000_std value: 3.35434510540552 - type: nauc_map_at_100_diff1 value: 26.691816652639787 - type: nauc_map_at_100_max value: 20.193510183457917 - type: nauc_map_at_100_std value: 3.371679544337864 - type: nauc_map_at_10_diff1 value: 27.24904607990151 - type: nauc_map_at_10_max value: 18.26589731339405 - type: nauc_map_at_10_std value: 1.0177924180874538 - type: nauc_map_at_1_diff1 value: 34.53595808193455 - type: nauc_map_at_1_max value: 10.970155439499656 - type: nauc_map_at_1_std value: -3.8626873246816373 - type: nauc_map_at_20_diff1 value: 26.8513788979128 - type: nauc_map_at_20_max value: 19.367475736662428 - type: nauc_map_at_20_std value: 2.2475091146613564 - type: nauc_map_at_3_diff1 value: 28.911815196615866 - type: nauc_map_at_3_max value: 15.474121149651292 - type: nauc_map_at_3_std value: -1.0664535264565158 - type: nauc_map_at_5_diff1 value: 27.772031743222787 - type: nauc_map_at_5_max value: 16.241638808384145 - type: nauc_map_at_5_std value: -0.6044307972013538 - type: nauc_mrr_at_1000_diff1 value: 26.66563442138901 - type: nauc_mrr_at_1000_max value: 27.74734004586503 - type: nauc_mrr_at_1000_std value: 10.663042801330587 - type: nauc_mrr_at_100_diff1 value: 26.66809693875436 - type: nauc_mrr_at_100_max value: 27.7565667281779 - type: nauc_mrr_at_100_std value: 10.671838040923266 - type: nauc_mrr_at_10_diff1 value: 26.587658592417736 - type: nauc_mrr_at_10_max value: 27.872712998242328 - type: nauc_mrr_at_10_std value: 10.979716151856918 - type: nauc_mrr_at_1_diff1 value: 29.30751401472168 - 
type: nauc_mrr_at_1_max value: 24.98212676568516 - type: nauc_mrr_at_1_std value: 6.094206809391165 - type: nauc_mrr_at_20_diff1 value: 26.52396413399926 - type: nauc_mrr_at_20_max value: 27.720568784204847 - type: nauc_mrr_at_20_std value: 10.749903126459412 - type: nauc_mrr_at_3_diff1 value: 26.993782403961802 - type: nauc_mrr_at_3_max value: 27.810128603605342 - type: nauc_mrr_at_3_std value: 10.526250026174825 - type: nauc_mrr_at_5_diff1 value: 26.491056284663404 - type: nauc_mrr_at_5_max value: 27.938292238745838 - type: nauc_mrr_at_5_std value: 10.620036152236098 - type: nauc_ndcg_at_1000_diff1 value: 24.743263734342236 - type: nauc_ndcg_at_1000_max value: 25.632023742967196 - type: nauc_ndcg_at_1000_std value: 9.54979482991325 - type: nauc_ndcg_at_100_diff1 value: 24.884477288371073 - type: nauc_ndcg_at_100_max value: 25.856099754401797 - type: nauc_ndcg_at_100_std value: 10.275002448873611 - type: nauc_ndcg_at_10_diff1 value: 25.813663674330005 - type: nauc_ndcg_at_10_max value: 21.4632558325771 - type: nauc_ndcg_at_10_std value: 4.793772488457711 - type: nauc_ndcg_at_1_diff1 value: 29.30751401472168 - type: nauc_ndcg_at_1_max value: 24.98212676568516 - type: nauc_ndcg_at_1_std value: 6.094206809391165 - type: nauc_ndcg_at_20_diff1 value: 24.96712085611002 - type: nauc_ndcg_at_20_max value: 23.176681160212546 - type: nauc_ndcg_at_20_std value: 6.936886476037671 - type: nauc_ndcg_at_3_diff1 value: 25.475637018641205 - type: nauc_ndcg_at_3_max value: 22.040672063815855 - type: nauc_ndcg_at_3_std value: 5.327531594448605 - type: nauc_ndcg_at_5_diff1 value: 25.70702625003538 - type: nauc_ndcg_at_5_max value: 20.273499330943313 - type: nauc_ndcg_at_5_std value: 3.733783938564952 - type: nauc_precision_at_1000_diff1 value: -14.918023025551047 - type: nauc_precision_at_1000_max value: 18.668936317187704 - type: nauc_precision_at_1000_std value: 19.15643973163778 - type: nauc_precision_at_100_diff1 value: -12.902497092152561 - type: nauc_precision_at_100_max value: 22.117700522212857 - type: nauc_precision_at_100_std value: 23.367379142816734 - type: nauc_precision_at_10_diff1 value: -3.319884895143968 - type: nauc_precision_at_10_max value: 25.207453700919412 - type: nauc_precision_at_10_std value: 16.768944029523773 - type: nauc_precision_at_1_diff1 value: 29.30751401472168 - type: nauc_precision_at_1_max value: 24.98212676568516 - type: nauc_precision_at_1_std value: 6.094206809391165 - type: nauc_precision_at_20_diff1 value: -8.101925051455304 - type: nauc_precision_at_20_max value: 23.93155685736234 - type: nauc_precision_at_20_std value: 19.599852197885983 - type: nauc_precision_at_3_diff1 value: 8.604157546918138 - type: nauc_precision_at_3_max value: 26.8274074367336 - type: nauc_precision_at_3_std value: 13.210078569814973 - type: nauc_precision_at_5_diff1 value: 2.0240126571446004 - type: nauc_precision_at_5_max value: 25.068271323836683 - type: nauc_precision_at_5_std value: 13.423044252359 - type: nauc_recall_at_1000_diff1 value: 2.5057442905176264 - type: nauc_recall_at_1000_max value: 57.765040045333485 - type: nauc_recall_at_1000_std value: 75.40225417846978 - type: nauc_recall_at_100_diff1 value: 13.982399962667946 - type: nauc_recall_at_100_max value: 36.06499090419987 - type: nauc_recall_at_100_std value: 38.55877836909554 - type: nauc_recall_at_10_diff1 value: 19.09907433139298 - type: nauc_recall_at_10_max value: 14.320755651797818 - type: nauc_recall_at_10_std value: 3.68835109545608 - type: nauc_recall_at_1_diff1 value: 34.53595808193455 - type: nauc_recall_at_1_max 
value: 10.970155439499656 - type: nauc_recall_at_1_std value: -3.8626873246816373 - type: nauc_recall_at_20_diff1 value: 15.80854510984775 - type: nauc_recall_at_20_max value: 17.20627614536354 - type: nauc_recall_at_20_std value: 9.028188051323042 - type: nauc_recall_at_3_diff1 value: 23.88853757885772 - type: nauc_recall_at_3_max value: 13.29954353582913 - type: nauc_recall_at_3_std value: -0.42190904806759966 - type: nauc_recall_at_5_diff1 value: 20.720312115028822 - type: nauc_recall_at_5_max value: 12.324541527710025 - type: nauc_recall_at_5_std value: -0.19420222400103399 - type: ndcg_at_1 value: 47.044999999999995 - type: ndcg_at_10 value: 52.211 - type: ndcg_at_100 value: 60.777 - type: ndcg_at_1000 value: 61.951 - type: ndcg_at_20 value: 56.215 - type: ndcg_at_3 value: 45.871 - type: ndcg_at_5 value: 47.643 - type: precision_at_1 value: 47.044999999999995 - type: precision_at_10 value: 16.062 - type: precision_at_100 value: 2.563 - type: precision_at_1000 value: 0.27899999999999997 - type: precision_at_20 value: 9.9 - type: precision_at_3 value: 31.575999999999997 - type: precision_at_5 value: 24.153 - type: recall_at_1 value: 23.238 - type: recall_at_10 value: 63.479 - type: recall_at_100 value: 91.51899999999999 - type: recall_at_1000 value: 97.906 - type: recall_at_20 value: 74.705 - type: recall_at_3 value: 42.082 - type: recall_at_5 value: 50.708 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ru) type: mteb/amazon_massive_intent config: ru split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 68.61466039004706 - type: f1 value: 63.790707574282045 - type: f1_weighted value: 67.28456899088164 - type: main_score value: 68.61466039004706 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ru) type: mteb/amazon_massive_scenario config: ru split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 77.97579018157364 - type: f1 value: 76.31497051309336 - type: f1_weighted value: 77.54198422119202 - type: main_score value: 77.97579018157364 - task: type: STS dataset: name: MTEB RUParaPhraserSTS type: merionum/ru_paraphraser config: default split: test revision: 43265056790b8f7c59e0139acb4be0a8dad2c8f4 metrics: - type: cosine_pearson value: 62.072853635744465 - type: cosine_spearman value: 68.32627155640247 - type: euclidean_pearson value: 65.56072460948485 - type: euclidean_spearman value: 68.32632364995054 - type: main_score value: 68.32627155640247 - type: manhattan_pearson value: 65.54799770948776 - type: manhattan_spearman value: 68.2428132570697 - type: pearson value: 62.072853635744465 - type: spearman value: 68.32627155640247 - task: type: Retrieval dataset: name: MTEB RiaNewsRetrieval type: ai-forever/ria-news-retrieval config: default split: test revision: 82374b0bbacda6114f39ff9c5b925fa1512ca5d7 metrics: - type: main_score value: 79.42399999999999 - type: map_at_1 value: 67.42 - type: map_at_10 value: 75.81700000000001 - type: map_at_100 value: 76.103 - type: map_at_1000 value: 76.11099999999999 - type: map_at_20 value: 76.011 - type: map_at_3 value: 74.38 - type: map_at_5 value: 75.31400000000001 - type: mrr_at_1 value: 67.42 - type: mrr_at_10 value: 75.81702380952322 - type: mrr_at_100 value: 76.10294206257022 - type: mrr_at_1000 value: 76.11127333184083 - type: mrr_at_20 value: 76.01092756817413 - type: mrr_at_3 value: 74.37999999999947 - type: mrr_at_5 value: 75.31449999999931 - type: nauc_map_at_1000_diff1 value: 74.47312749692254 - 
type: nauc_map_at_1000_max value: 24.255650636762592 - type: nauc_map_at_1000_std value: -13.538045103707466 - type: nauc_map_at_100_diff1 value: 74.46935527123232 - type: nauc_map_at_100_max value: 24.260637479032273 - type: nauc_map_at_100_std value: -13.526893488105108 - type: nauc_map_at_10_diff1 value: 74.37904649319015 - type: nauc_map_at_10_max value: 24.25477514829031 - type: nauc_map_at_10_std value: -13.673101053529166 - type: nauc_map_at_1_diff1 value: 77.41742450291483 - type: nauc_map_at_1_max value: 21.561634939014 - type: nauc_map_at_1_std value: -15.302925641163046 - type: nauc_map_at_20_diff1 value: 74.44339113303336 - type: nauc_map_at_20_max value: 24.281346979231508 - type: nauc_map_at_20_std value: -13.533874833150467 - type: nauc_map_at_3_diff1 value: 74.31017752460161 - type: nauc_map_at_3_max value: 24.209272036097506 - type: nauc_map_at_3_std value: -14.053104049162751 - type: nauc_map_at_5_diff1 value: 74.42859541067173 - type: nauc_map_at_5_max value: 24.16570861589971 - type: nauc_map_at_5_std value: -13.948432311463257 - type: nauc_mrr_at_1000_diff1 value: 74.47312785315074 - type: nauc_mrr_at_1000_max value: 24.255652429274488 - type: nauc_mrr_at_1000_std value: -13.538043692357599 - type: nauc_mrr_at_100_diff1 value: 74.46935527123232 - type: nauc_mrr_at_100_max value: 24.260637479032273 - type: nauc_mrr_at_100_std value: -13.526893488105108 - type: nauc_mrr_at_10_diff1 value: 74.37904649319015 - type: nauc_mrr_at_10_max value: 24.25477514829031 - type: nauc_mrr_at_10_std value: -13.673101053529166 - type: nauc_mrr_at_1_diff1 value: 77.41742450291483 - type: nauc_mrr_at_1_max value: 21.561634939014 - type: nauc_mrr_at_1_std value: -15.302925641163046 - type: nauc_mrr_at_20_diff1 value: 74.44339113303336 - type: nauc_mrr_at_20_max value: 24.281346979231508 - type: nauc_mrr_at_20_std value: -13.533874833150467 - type: nauc_mrr_at_3_diff1 value: 74.31017752460161 - type: nauc_mrr_at_3_max value: 24.209272036097506 - type: nauc_mrr_at_3_std value: -14.053104049162751 - type: nauc_mrr_at_5_diff1 value: 74.42859541067173 - type: nauc_mrr_at_5_max value: 24.16570861589971 - type: nauc_mrr_at_5_std value: -13.948432311463257 - type: nauc_ndcg_at_1000_diff1 value: 73.67049349073889 - type: nauc_ndcg_at_1000_max value: 25.36219767677513 - type: nauc_ndcg_at_1000_std value: -12.018149673769434 - type: nauc_ndcg_at_100_diff1 value: 73.52540106541404 - type: nauc_ndcg_at_100_max value: 25.54104779422804 - type: nauc_ndcg_at_100_std value: -11.596858470683141 - type: nauc_ndcg_at_10_diff1 value: 73.13668875552696 - type: nauc_ndcg_at_10_max value: 25.555285618887662 - type: nauc_ndcg_at_10_std value: -12.31485256997023 - type: nauc_ndcg_at_1_diff1 value: 77.41742450291483 - type: nauc_ndcg_at_1_max value: 21.561634939014 - type: nauc_ndcg_at_1_std value: -15.302925641163046 - type: nauc_ndcg_at_20_diff1 value: 73.35771732216482 - type: nauc_ndcg_at_20_max value: 25.73112191366883 - type: nauc_ndcg_at_20_std value: -11.69854261340669 - type: nauc_ndcg_at_3_diff1 value: 73.20274751289709 - type: nauc_ndcg_at_3_max value: 25.285529084214925 - type: nauc_ndcg_at_3_std value: -13.37770120862227 - type: nauc_ndcg_at_5_diff1 value: 73.33594229336342 - type: nauc_ndcg_at_5_max value: 25.281830078361644 - type: nauc_ndcg_at_5_std value: -13.088615162069974 - type: nauc_precision_at_1000_diff1 value: 55.90120106013352 - type: nauc_precision_at_1000_max value: 55.70083105705886 - type: nauc_precision_at_1000_std value: 36.2217350708384 - type: nauc_precision_at_100_diff1 value: 
59.2870776629234 - type: nauc_precision_at_100_max value: 47.133189559008834 - type: nauc_precision_at_100_std value: 28.301920571571802 - type: nauc_precision_at_10_diff1 value: 65.12757705051081 - type: nauc_precision_at_10_max value: 34.0578425463014 - type: nauc_precision_at_10_std value: -2.7826038995063618 - type: nauc_precision_at_1_diff1 value: 77.41742450291483 - type: nauc_precision_at_1_max value: 21.561634939014 - type: nauc_precision_at_1_std value: -15.302925641163046 - type: nauc_precision_at_20_diff1 value: 64.13592064044578 - type: nauc_precision_at_20_max value: 39.3269437952694 - type: nauc_precision_at_20_std value: 7.181669511985859 - type: nauc_precision_at_3_diff1 value: 68.88283614651107 - type: nauc_precision_at_3_max value: 29.546078723110387 - type: nauc_precision_at_3_std value: -10.635148066667597 - type: nauc_precision_at_5_diff1 value: 68.11610612745827 - type: nauc_precision_at_5_max value: 30.708733892411683 - type: nauc_precision_at_5_std value: -8.722606142068399 - type: nauc_recall_at_1000_diff1 value: 55.90120106013372 - type: nauc_recall_at_1000_max value: 55.70083105705975 - type: nauc_recall_at_1000_std value: 36.22173507083937 - type: nauc_recall_at_100_diff1 value: 59.287077662923856 - type: nauc_recall_at_100_max value: 47.1331895590096 - type: nauc_recall_at_100_std value: 28.30192057157174 - type: nauc_recall_at_10_diff1 value: 65.1275770505108 - type: nauc_recall_at_10_max value: 34.057842546301245 - type: nauc_recall_at_10_std value: -2.7826038995065376 - type: nauc_recall_at_1_diff1 value: 77.41742450291483 - type: nauc_recall_at_1_max value: 21.561634939014 - type: nauc_recall_at_1_std value: -15.302925641163046 - type: nauc_recall_at_20_diff1 value: 64.13592064044556 - type: nauc_recall_at_20_max value: 39.32694379526965 - type: nauc_recall_at_20_std value: 7.181669511986287 - type: nauc_recall_at_3_diff1 value: 68.88283614651114 - type: nauc_recall_at_3_max value: 29.54607872311032 - type: nauc_recall_at_3_std value: -10.635148066667742 - type: nauc_recall_at_5_diff1 value: 68.11610612745811 - type: nauc_recall_at_5_max value: 30.70873389241151 - type: nauc_recall_at_5_std value: -8.722606142068207 - type: ndcg_at_1 value: 67.42 - type: ndcg_at_10 value: 79.42399999999999 - type: ndcg_at_100 value: 80.754 - type: ndcg_at_1000 value: 80.979 - type: ndcg_at_20 value: 80.118 - type: ndcg_at_3 value: 76.543 - type: ndcg_at_5 value: 78.215 - type: precision_at_1 value: 67.42 - type: precision_at_10 value: 9.052 - type: precision_at_100 value: 0.966 - type: precision_at_1000 value: 0.098 - type: precision_at_20 value: 4.662 - type: precision_at_3 value: 27.589999999999996 - type: precision_at_5 value: 17.36 - type: recall_at_1 value: 67.42 - type: recall_at_10 value: 90.52 - type: recall_at_100 value: 96.61 - type: recall_at_1000 value: 98.39 - type: recall_at_20 value: 93.24 - type: recall_at_3 value: 82.77 - type: recall_at_5 value: 86.8 - task: type: Reranking dataset: name: MTEB RuBQReranking type: ai-forever/rubq-reranking config: default split: test revision: 2e96b8f098fa4b0950fc58eacadeb31c0d0c7fa2 metrics: - type: main_score value: 68.48180892753541 - type: map value: 68.48180892753541 - type: mrr value: 73.69372550223615 - type: nAUC_map_diff1 value: 37.93778560797301 - type: nAUC_map_max value: 10.858022431340633 - type: nAUC_map_std value: 6.446466714820493 - type: nAUC_mrr_diff1 value: 39.83698029227208 - type: nAUC_mrr_max value: 14.378309445768284 - type: nAUC_mrr_std value: 10.579567761464919 - task: type: Retrieval dataset: name: 
MTEB RuBQRetrieval type: ai-forever/rubq-retrieval config: default split: test revision: e19b6ffa60b3bc248e0b41f4cc37c26a55c2a67b metrics: - type: main_score value: 66.77 - type: map_at_1 value: 36.525 - type: map_at_10 value: 58.021 - type: map_at_100 value: 59.016000000000005 - type: map_at_1000 value: 59.041999999999994 - type: map_at_20 value: 58.709 - type: map_at_3 value: 51.808 - type: map_at_5 value: 55.706999999999994 - type: mrr_at_1 value: 52.95508274231678 - type: mrr_at_10 value: 66.10029926076034 - type: mrr_at_100 value: 66.46489903689454 - type: mrr_at_1000 value: 66.47135430048212 - type: mrr_at_20 value: 66.36282360130573 - type: mrr_at_3 value: 63.347123719464236 - type: mrr_at_5 value: 65.20291568163925 - type: nauc_map_at_1000_diff1 value: 36.39353112777031 - type: nauc_map_at_1000_max value: 14.511234479555156 - type: nauc_map_at_1000_std value: -12.003784393055856 - type: nauc_map_at_100_diff1 value: 36.396297354858326 - type: nauc_map_at_100_max value: 14.532932252459755 - type: nauc_map_at_100_std value: -11.9933713072409 - type: nauc_map_at_10_diff1 value: 36.19731963995984 - type: nauc_map_at_10_max value: 14.331593327284844 - type: nauc_map_at_10_std value: -12.607001882190588 - type: nauc_map_at_1_diff1 value: 39.04224394212489 - type: nauc_map_at_1_max value: 9.44079807509392 - type: nauc_map_at_1_std value: -8.725551038382205 - type: nauc_map_at_20_diff1 value: 36.27250811060138 - type: nauc_map_at_20_max value: 14.521970331255876 - type: nauc_map_at_20_std value: -12.033391150828098 - type: nauc_map_at_3_diff1 value: 35.966460233965485 - type: nauc_map_at_3_max value: 11.62955834976298 - type: nauc_map_at_3_std value: -13.649024048480133 - type: nauc_map_at_5_diff1 value: 36.131815002934644 - type: nauc_map_at_5_max value: 13.157509275481777 - type: nauc_map_at_5_std value: -13.36839170298778 - type: nauc_mrr_at_1000_diff1 value: 40.191647456610056 - type: nauc_mrr_at_1000_max value: 16.63142892913043 - type: nauc_mrr_at_1000_std value: -12.671951113868769 - type: nauc_mrr_at_100_diff1 value: 40.18726742271696 - type: nauc_mrr_at_100_max value: 16.638314382103207 - type: nauc_mrr_at_100_std value: -12.664912420744438 - type: nauc_mrr_at_10_diff1 value: 40.028293277796855 - type: nauc_mrr_at_10_max value: 16.841638035795718 - type: nauc_mrr_at_10_std value: -12.781785759758687 - type: nauc_mrr_at_1_diff1 value: 42.26303997344821 - type: nauc_mrr_at_1_max value: 14.211014905785252 - type: nauc_mrr_at_1_std value: -11.030701637062437 - type: nauc_mrr_at_20_diff1 value: 40.12680433695074 - type: nauc_mrr_at_20_max value: 16.75915749592042 - type: nauc_mrr_at_20_std value: -12.613807048523782 - type: nauc_mrr_at_3_diff1 value: 40.32434278687767 - type: nauc_mrr_at_3_max value: 15.811615950737387 - type: nauc_mrr_at_3_std value: -13.957860180387636 - type: nauc_mrr_at_5_diff1 value: 40.09422159913817 - type: nauc_mrr_at_5_max value: 16.64090259238879 - type: nauc_mrr_at_5_std value: -13.230746065794726 - type: nauc_ndcg_at_1000_diff1 value: 36.67352791454268 - type: nauc_ndcg_at_1000_max value: 16.749915190801016 - type: nauc_ndcg_at_1000_std value: -11.008545008175378 - type: nauc_ndcg_at_100_diff1 value: 36.58072887287039 - type: nauc_ndcg_at_100_max value: 17.22374718832945 - type: nauc_ndcg_at_100_std value: -10.559637745205016 - type: nauc_ndcg_at_10_diff1 value: 35.786024269753334 - type: nauc_ndcg_at_10_max value: 17.217091860749864 - type: nauc_ndcg_at_10_std value: -12.505927857541066 - type: nauc_ndcg_at_1_diff1 value: 42.41055520049291 - type: 
nauc_ndcg_at_1_max value: 14.001922648893919 - type: nauc_ndcg_at_1_std value: -11.224085018036103 - type: nauc_ndcg_at_20_diff1 value: 35.9577978619838 - type: nauc_ndcg_at_20_max value: 17.612142353807204 - type: nauc_ndcg_at_20_std value: -10.715656533623179 - type: nauc_ndcg_at_3_diff1 value: 35.92331458170165 - type: nauc_ndcg_at_3_max value: 12.972908846104833 - type: nauc_ndcg_at_3_std value: -14.90499944816046 - type: nauc_ndcg_at_5_diff1 value: 35.87509174776851 - type: nauc_ndcg_at_5_max value: 15.016606655112842 - type: nauc_ndcg_at_5_std value: -14.252766370474959 - type: nauc_precision_at_1000_diff1 value: -7.854237065573715 - type: nauc_precision_at_1000_max value: 7.340193640831781 - type: nauc_precision_at_1000_std value: 5.270139452495764 - type: nauc_precision_at_100_diff1 value: -5.433762342336105 - type: nauc_precision_at_100_max value: 10.323131724715576 - type: nauc_precision_at_100_std value: 6.065361232063088 - type: nauc_precision_at_10_diff1 value: 1.6163013309854788 - type: nauc_precision_at_10_max value: 13.853149437703955 - type: nauc_precision_at_10_std value: -0.4630873244645538 - type: nauc_precision_at_1_diff1 value: 42.41055520049291 - type: nauc_precision_at_1_max value: 14.001922648893919 - type: nauc_precision_at_1_std value: -11.224085018036103 - type: nauc_precision_at_20_diff1 value: -2.406608082278331 - type: nauc_precision_at_20_max value: 12.672408320017443 - type: nauc_precision_at_20_std value: 4.420612595577876 - type: nauc_precision_at_3_diff1 value: 15.724555799730243 - type: nauc_precision_at_3_max value: 12.818558415088615 - type: nauc_precision_at_3_std value: -11.49979730611224 - type: nauc_precision_at_5_diff1 value: 8.485573750280292 - type: nauc_precision_at_5_max value: 13.304773839372094 - type: nauc_precision_at_5_std value: -6.633911950881821 - type: nauc_recall_at_1000_diff1 value: -7.902591492154048 - type: nauc_recall_at_1000_max value: 54.202835032879946 - type: nauc_recall_at_1000_std value: 68.22401286555711 - type: nauc_recall_at_100_diff1 value: 14.88281690495126 - type: nauc_recall_at_100_max value: 41.9305338281276 - type: nauc_recall_at_100_std value: 30.260295038603324 - type: nauc_recall_at_10_diff1 value: 23.09613458762812 - type: nauc_recall_at_10_max value: 24.921985669652386 - type: nauc_recall_at_10_std value: -9.990910822464661 - type: nauc_recall_at_1_diff1 value: 39.04224394212489 - type: nauc_recall_at_1_max value: 9.44079807509392 - type: nauc_recall_at_1_std value: -8.725551038382205 - type: nauc_recall_at_20_diff1 value: 19.41298369752395 - type: nauc_recall_at_20_max value: 31.91169321346991 - type: nauc_recall_at_20_std value: 4.514353181881159 - type: nauc_recall_at_3_diff1 value: 29.514018426239197 - type: nauc_recall_at_3_max value: 10.600179069626673 - type: nauc_recall_at_3_std value: -17.02685998662361 - type: nauc_recall_at_5_diff1 value: 26.66966838912029 - type: nauc_recall_at_5_max value: 15.359436829533934 - type: nauc_recall_at_5_std value: -15.87666175175801 - type: ndcg_at_1 value: 52.896 - type: ndcg_at_10 value: 66.77 - type: ndcg_at_100 value: 69.98100000000001 - type: ndcg_at_1000 value: 70.408 - type: ndcg_at_20 value: 68.53200000000001 - type: ndcg_at_3 value: 58.074999999999996 - type: ndcg_at_5 value: 62.841 - type: precision_at_1 value: 52.896 - type: precision_at_10 value: 13.8 - type: precision_at_100 value: 1.609 - type: precision_at_1000 value: 0.166 - type: precision_at_20 value: 7.444000000000001 - type: precision_at_3 value: 32.623999999999995 - type: precision_at_5 value: 
23.735 - type: recall_at_1 value: 36.525 - type: recall_at_10 value: 83.893 - type: recall_at_100 value: 96.345 - type: recall_at_1000 value: 99.126 - type: recall_at_20 value: 89.812 - type: recall_at_3 value: 62.58899999999999 - type: recall_at_5 value: 73.64500000000001 - task: type: Classification dataset: name: MTEB RuReviewsClassification type: ai-forever/ru-reviews-classification config: default split: test revision: f6d2c31f4dc6b88f468552750bfec05b4b41b05a metrics: - type: accuracy value: 68.2373046875 - type: f1 value: 66.6798984937843 - type: f1_weighted value: 66.67858774240374 - type: main_score value: 68.2373046875 - task: type: STS dataset: name: MTEB RuSTSBenchmarkSTS type: ai-forever/ru-stsbenchmark-sts config: default split: test revision: 7cf24f325c6da6195df55bef3d86b5e0616f3018 metrics: - type: cosine_pearson value: 77.06911833905438 - type: cosine_spearman value: 77.84605139753621 - type: euclidean_pearson value: 76.3616511204864 - type: euclidean_spearman value: 77.84487946345095 - type: main_score value: 77.84605139753621 - type: manhattan_pearson value: 76.35303659263998 - type: manhattan_spearman value: 77.87677782965115 - type: pearson value: 77.06911833905438 - type: spearman value: 77.84605139753621 - task: type: Classification dataset: name: MTEB RuSciBenchGRNTIClassification type: ai-forever/ru-scibench-grnti-classification config: default split: test revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1 metrics: - type: accuracy value: 61.23535156249999 - type: f1 value: 59.029291161802334 - type: f1_weighted value: 59.041548793589406 - type: main_score value: 61.23535156249999 - task: type: Clustering dataset: name: MTEB RuSciBenchGRNTIClusteringP2P type: ai-forever/ru-scibench-grnti-classification config: default split: test revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1 metrics: - type: main_score value: 56.82815630686135 - type: v_measure value: 56.82815630686135 - type: v_measure_std value: 0.6871068462323323 - task: type: Classification dataset: name: MTEB RuSciBenchOECDClassification type: ai-forever/ru-scibench-oecd-classification config: default split: test revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471 metrics: - type: accuracy value: 48.1005859375 - type: f1 value: 44.918516110124315 - type: f1_weighted value: 44.91942618115105 - type: main_score value: 48.1005859375 - task: type: Clustering dataset: name: MTEB RuSciBenchOECDClusteringP2P type: ai-forever/ru-scibench-oecd-classification config: default split: test revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471 metrics: - type: main_score value: 48.72707742931753 - type: v_measure value: 48.72707742931753 - type: v_measure_std value: 0.7258468439420995 - task: type: STS dataset: name: MTEB STS22 (ru) type: mteb/sts22-crosslingual-sts config: ru split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 64.95220904597029 - type: cosine_spearman value: 67.35282990065247 - type: euclidean_pearson value: 64.72045496418937 - type: euclidean_spearman value: 67.35282990065247 - type: main_score value: 67.35282990065247 - type: manhattan_pearson value: 64.40621455763392 - type: manhattan_spearman value: 66.99408273892949 - type: pearson value: 64.95220904597029 - type: spearman value: 67.35282990065247 - task: type: MultilabelClassification dataset: name: MTEB SensitiveTopicsClassification type: ai-forever/sensitive-topics-classification config: default split: test revision: 416b34a802308eac30e4192afc0ff99bb8dcc7f2 metrics: - type: accuracy value: 
29.624023437500004 - type: f1 value: 33.214028020582894 - type: lrap value: 44.53599717881868 - type: main_score value: 29.624023437500004 - task: type: PairClassification dataset: name: MTEB TERRa type: ai-forever/terra-pairclassification config: default split: dev revision: 7b58f24536063837d644aab9a023c62199b2a612 metrics: - type: cosine_accuracy value: 57.98045602605863 - type: cosine_accuracy_threshold value: 83.04829597473145 - type: cosine_ap value: 55.56580974377611 - type: cosine_f1 value: 66.9603524229075 - type: cosine_f1_threshold value: 73.216313123703 - type: cosine_precision value: 50.498338870431894 - type: cosine_recall value: 99.34640522875817 - type: dot_accuracy value: 57.98045602605863 - type: dot_accuracy_threshold value: 83.04829597473145 - type: dot_ap value: 55.56580974377611 - type: dot_f1 value: 66.9603524229075 - type: dot_f1_threshold value: 73.21631908416748 - type: dot_precision value: 50.498338870431894 - type: dot_recall value: 99.34640522875817 - type: euclidean_accuracy value: 57.98045602605863 - type: euclidean_accuracy_threshold value: 58.226633071899414 - type: euclidean_ap value: 55.56580974377611 - type: euclidean_f1 value: 66.9603524229075 - type: euclidean_f1_threshold value: 73.18969368934631 - type: euclidean_precision value: 50.498338870431894 - type: euclidean_recall value: 99.34640522875817 - type: main_score value: 55.56580974377611 - type: manhattan_accuracy value: 57.98045602605863 - type: manhattan_accuracy_threshold value: 1336.6012573242188 - type: manhattan_ap value: 55.5371135438789 - type: manhattan_f1 value: 66.95842450765863 - type: manhattan_f1_threshold value: 1720.5078125 - type: manhattan_precision value: 50.32894736842105 - type: manhattan_recall value: 100.0 - type: max_ap value: 55.56580974377611 - type: max_f1 value: 66.9603524229075 - type: max_precision value: 50.498338870431894 - type: max_recall value: 100.0 - type: similarity_accuracy value: 57.98045602605863 - type: similarity_accuracy_threshold value: 83.04829597473145 - type: similarity_ap value: 55.56580974377611 - type: similarity_f1 value: 66.9603524229075 - type: similarity_f1_threshold value: 73.216313123703 - type: similarity_precision value: 50.498338870431894 - type: similarity_recall value: 99.34640522875817 --- - <h1 align="center">KaLM-Embedding</h1> **KaLM-Embedding** is a series of embedding models adapted from auto-regressive LLMs with superior training data. KaLM-embedding-multilingual-mini is trained from [Qwen/Qwen2-0.5B](https://huggingface.co/Qwen/Qwen2-0.5B) with massive weakly-supervised pre-training and supervised fine-tuning data. 
## 📑 Open-source Plan

- [x] Model Checkpoint
  - [x] [KaLM-embedding-multilingual-mini-v1](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-v1)
  - [x] [KaLM-embedding-multilingual-mini-instruct-v1](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-instruct-v1)
  - [x] [KaLM-embedding-multilingual-mini-instruct-v1.5](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-instruct-v1.5)
  - [ ] KaLM-embedding-multilingual-max-v1
- [x] Training and Evaluation Code: [HITsz-TMG/KaLM-Embedding](https://github.com/HITsz-TMG/KaLM-Embedding)
- [x] Technical Report: [KaLM-Embedding: Superior Training Data Brings A Stronger Embedding Model](https://arxiv.org/abs/2501.01028)
- [ ] Training Data

## Evaluation

| Model Name | Model Size | C-MTEB(35) | MTEB(56) | avg |
|:----:|:---:|:---:|:---:|:---:|
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 560M | 58.81 | 61.5 | 60.16 |
| [bge-m3 (dense)](https://huggingface.co/BAAI/bge-m3) | 560M | 60.80 | 59.84 | 60.32 |
| [gte-multilingual-base (dense)](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) | **305M** | 62.72 | 61.40 | 62.06 |
| [KaLM-embedding-multilingual-mini-v1](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-v1) | 494M | 62.31 | 61.87 | 62.09 |
| [KaLM-embedding-multilingual-mini-instruct-v1](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-instruct-v1) | 494M | 63.57 | 64.74 | 64.16 |
| [KaLM-embedding-multilingual-mini-instruct-v1.5](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-instruct-v1.5) | 494M | **64.13** | **64.94** | **64.53** |

## Requirements

Since this model uses Qwen2, we advise you to install `transformers>=4.37.0`; otherwise you might encounter the following error:

```
KeyError: 'qwen2'
```

## Usage

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('{MODEL_NAME_OR_PATH}')  # Do NOT set trust_remote_code
model.max_seq_length = 512

embeddings = model.encode(
    sentences,
    normalize_embeddings=True,
    batch_size=256,
    show_progress_bar=True
)
print(embeddings)
```

<!-- We add instruction for asymmetric tasks: retrieval, reranking, classification and clustering. -->
We add instructions for classification and clustering. If you want to add an instruction to the query (with no instruction for the corpus), you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('{MODEL_NAME_OR_PATH}')  # Do NOT set trust_remote_code
model.max_seq_length = 512

prompt = "Instruct: Classifying the category of french news. \n Query: "
embeddings = model.encode(
    sentences,
    prompt=prompt,
    normalize_embeddings=True,
    batch_size=256,
    show_progress_bar=True
)
print(embeddings)
```
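To make the asymmetric query/corpus setup concrete, here is a minimal retrieval-style sketch. The instruction string, queries, and documents below are illustrative placeholders, not taken from the original card; since the embeddings are L2-normalized, a plain dot product gives cosine similarity:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('{MODEL_NAME_OR_PATH}')  # Do NOT set trust_remote_code
model.max_seq_length = 512

# Hypothetical instruction: queries get the prompt, corpus documents do not.
prompt = "Instruct: Given a question, retrieve passages that answer the question. \n Query: "
queries = ["How do I install sentence-transformers?"]
docs = [
    "Run `pip install -U sentence-transformers` to install the library.",
    "The Qwen2 tokenizer requires transformers>=4.37.0.",
]

q_emb = model.encode(queries, prompt=prompt, normalize_embeddings=True)
d_emb = model.encode(docs, normalize_embeddings=True)

# With normalized vectors, the dot product equals cosine similarity.
scores = q_emb @ d_emb.T  # shape: (num_queries, num_docs)
print(scores)
```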
## Citation

Please cite the repo if you use the model or code in this repo.

```
@article{hu2025kalm,
  title={KaLM-Embedding: Superior Training Data Brings A Stronger Embedding Model},
  author={Hu, Xinshuo and Shan, Zifei and Zhao, Xinping and Sun, Zetian and Liu, Zhenyu and Li, Dongfang and Ye, Shaolin and Wei, Xinyuan and Chen, Qian and Hu, Baotian and others},
  journal={arXiv preprint arXiv:2501.01028},
  year={2025}
}
```

## Contact

If you encounter any issue, feel free to contact us via the email: [email protected]
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
maastrichtlawtech/distilcamembert-lleqa
maastrichtlawtech
sentence-similarity
[ "sentence-transformers", "pytorch", "safetensors", "camembert", "feature-extraction", "sentence-similarity", "fr", "dataset:maastrichtlawtech/lleqa", "arxiv:2309.17050", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,695
1,730
48
3
--- datasets: - maastrichtlawtech/lleqa language: fr library_name: sentence-transformers license: apache-2.0 metrics: - recall pipeline_tag: sentence-similarity tags: - feature-extraction - sentence-similarity inference: true widget: - source_sentence: Je reçois des confidences liées à mon emploi. Qu'est-ce que je risque si je viole le secret professionnel ? sentences: - 'Art. 1 : Les médecins, chirurgiens, officiers de santé, pharmaciens, sages-femmes et toutes autres personnes dépositaires, par état ou par profession, des secrets qu''on leur confie, qui, hors le cas où ils sont appelés à rendre témoignage en justice ou devant une commission d''enquête parlementaire et celui où la loi, le décret ou l''ordonnance les oblige ou les autoriseà faire connaître ces secrets, les auront révélés, seront punis d''un emprisonnement d''un an à trois ans et d''une amende de cent euros à mille euros ou d''une de ces peines seulement.' - 'Art. 2 : L''allocataire peut demander l''allocation de naissance à partir du sixième mois de la grossesse et en obtenir le paiement deux mois avant la date probable de la naissance mentionnée sur le certificat médical à joindre à la demande.L''allocation de naissance demandée conformément à l''alinéa 1er est due par la caisse d''allocations familiales, par l''autorité ou par l''établissement public qui serait compétent, selon le cas, pour payer les allocations familiales à la date à laquelle la demande de paiement anticipé est introduite.' - 'Art. 3 : La periode de maternité constitue une période de repos de douze semaines, ou de treize semainesen cas de naissance multiple, au cours de laquelle la titulaire ne peut exercer son activité professionnelle habituelle ni aucune autre activité professionnelle.' example_title: Secret professionnel --- # distilcamembert-lleqa This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search. The model was trained on the [LLeQA](https://huggingface.co/datasets/maastrichtlawtech/lleqa) dataset for legal information retrieval in **French**. ## Usage *** #### Sentence-Transformers Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('maastrichtlawtech/distilcamembert-lleqa') embeddings = model.encode(sentences) print(embeddings) ``` #### 🤗 Transformers Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings. 
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean Pooling - take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('maastrichtlawtech/distilcamembert-lleqa')
model = AutoModel.from_pretrained('maastrichtlawtech/distilcamembert-lleqa')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print(sentence_embeddings)
```

## Evaluation

***

We evaluate the model on the test set of LLeQA, which consists of 195 legal questions and a knowledge corpus of 27.9K candidate articles. We report the mean reciprocal rank (MRR), normalized discounted cumulative gain (NDCG), mean average precision (MAP), and recall at various cut-offs (R@k).

| MRR@10 | NDCG@10 | MAP@10 | R@10 | R@100 | R@500 |
|---------:|----------:|---------:|-------:|--------:|--------:|
| 36.67 | 37.24 | 29.26 | 52.95 | 78.07 | 90.17 |

## Training

***

#### Background

We utilized the [distilcamembert-base](https://huggingface.co/cmarkea/distilcamembert-base) model and fine-tuned it on 9.3K question-article pairs in French. We used a contrastive learning objective: given a short legal question, the model should predict which one out of a set of sampled legal articles was actually paired with it in the dataset. Formally, we compute the cosine similarity for every possible pair in the batch, then apply a cross-entropy loss with a temperature of 0.05, using the true pairs as labels (a minimal code sketch of this objective is given after the Data section below).

#### Hyperparameters

We trained the model on a single Tesla V100 GPU with 32GB of memory for 20 epochs (i.e., 5.4k steps) using a batch size of 32. We used the AdamW optimizer with an initial learning rate of 2e-05, weight decay of 0.01, learning rate warmup over the first 50 steps, and linear decay of the learning rate. The sequence length was limited to 384 tokens.

#### Data

We use the [Long-form Legal Question Answering (LLeQA)](https://huggingface.co/datasets/maastrichtlawtech/lleqa) dataset to fine-tune the model. LLeQA is a French-native dataset for studying legal information retrieval and question answering. It consists of a knowledge corpus of 27,941 statutory articles collected from Belgian legislation, and 1,868 legal questions posed by Belgian citizens and labeled by experienced jurists with a comprehensive answer rooted in relevant articles from the corpus.
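The following is a minimal, self-contained sketch of the in-batch contrastive objective described above (cosine similarity over all batch pairs, cross-entropy with temperature 0.05). It illustrates the training signal only and is not the authors' actual training code:

```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(question_emb, article_emb, temperature=0.05):
    """Cross-entropy over in-batch cosine similarities.

    question_emb, article_emb: (batch_size, dim) tensors where row i of each
    forms a true (question, article) pair; all other rows act as negatives.
    """
    # Cosine similarity for every (question, article) pair in the batch.
    q = F.normalize(question_emb, dim=-1)
    a = F.normalize(article_emb, dim=-1)
    sim = q @ a.T / temperature  # (batch_size, batch_size)

    # The true article for question i sits on the diagonal.
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)

# Toy usage with random embeddings (batch size 32, as in training)
loss = in_batch_contrastive_loss(torch.randn(32, 768), torch.randn(32, 768))
print(loss.item())
```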
[ "QUESTION_ANSWERING" ]
[ "CAS" ]
Non_BioNLP
am-azadi/KaLM-embedding-multilingual-mini-v1_Fine_Tuned_1e
am-azadi
sentence-similarity
[ "sentence-transformers", "safetensors", "qwen2", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:21769", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:HIT-TMG/KaLM-embedding-multilingual-mini-v1", "base_model:finetune:HIT-TMG/KaLM-embedding-multilingual-mini-v1", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,740
1,740
16
0
--- base_model: HIT-TMG/KaLM-embedding-multilingual-mini-v1 library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:21769 - loss:MultipleNegativesRankingLoss widget: - source_sentence: 'Blooming Canals of Venice, Italy. by: [IG] ' sentences: - This comparison shows the values of gasoline in Cádiz and in Gibraltar in 2021 The comparison of fuel prices circulates in Spain at least since 2018 - Genuine image of a Venice canal laden with lotus blossoms The lotus blossoms were digitally inserted into this image by a graphic artist - PT deputy presented PL for police to carry unloaded weapons Bill "5439/2022" for police officers to carry unloaded weapons does not exist - source_sentence: 'The story of a young mother in Italy who is dying due to corona, makes a last request to stop her 18-month-old baby''s incessant crying. The doctor allowed by covering his whole body with transparent wax and then placing the baby on his chest, miraculously the baby remained silent and his mother also remained silent forever. Apparently, a mother''s love for a child has no limits and has a very strange aura as Allah swt said and the Messenger of Allah reminded. Make it a lesson about mother''s love.. Greetings to all mothers especially my mother.. Puan Hjh Ashakiah Basri...... ' sentences: - Photo shows a COVID-19 victim in Italy holding her child for the last time This photo shows a baby waiting for a bone marrow transplant in the United States in 1985 - Applesauce tests positive for covid-19 antigens The video of the positive antigen test on applesauce does not show that these tests are useless - '"Submission to Rouen" with the removal of the statue of Napoleon 1st No, the removal of the statue of Napoleon I in Rouen is unrelated to the controversies linked to colonial history' - source_sentence: 'COVID-19 - HUNT FOR AFRICANS IN CHINA This is the real hell that the whole black community in China is going through in its last days. Accused by a frank Chinese population of being carriers of the COVID-19 virus, although the African continent was one of the last to experience this virus and especially the least affected at present, black Africans, are victims of another severe and cruel form of discrimination in China. They are brutally chased from their homes, banned and rejected in all hotels, violently beaten in the streets. A real open sky in this country where the population is spread all over Africa and occupies a place of choice in our African society. The Kenyan government has therefore given an Ultimatum to the entire Chinese community residing in the country, so that it go away in a short time.. AFRICAN PEOPLE, UNITY IN OUR CONTINENT SHOULD BE AN EMERGENCY MISSION . ' sentences: - Video of flooded metro as Typhoon In-Fa hit Shanghai The video predates Typhoon In-Fa that hit eastern China in 2021 - Video shows attack on Africans in China No, this video does not show an attack on Africans in China but a fight in New York a month ago - These figures show that Covid-19 kills fewer people than many other diseases/accidents Figures that put the mortality due to Covid-19 in the world into perspective? Beware of these comparisons - source_sentence: 'I will not be vaccinated. - Swiss Vaccine Protection Federation How to protect us / - see below i vaccinate will not receive Dr. Gisele Werner: Prevention, like many colleagues waiting before getting vaccinated It is recommended. 
We believe in this novel gene therapy necessary to ensure safety There is no possibility. This vaccine against COVID 19 is prevent the spread of the virus and barrier gestures does not release We have long-term side effects I don''t know until now more than benefits witness the danger there is. Look out! Swiss Vaccine Protection Federation This information campaign aims to address the risks inherent in vaccination. Supported by knowledgeable Swiss citizens and doctors. Sante' sentences: - This picture shows a man in hospital after eating a bat after a novel coronavirus outbreak This picture has circulated on a fundraising site for men with lung pain unrelated to the novel coronavirus - Cardiologist McCullough affirms “without the slightest shadow of a doubt” that “the Covid-19 vaccine causes myocarditis”, “and the FDA knows that because there are 200 published scientific documents that prove it Dr. Peter McCullough's sayings about vaccines against covid-19 verified - Swiss medical orgnisation publishes this poster to advise people not to get vaccinated against Covid-19. This poster has been doctored from a health notice promoting the coronavirus jab - source_sentence: 'The putschists sold out Mali Read it is extremely serious MOVEMENT COORDINATION OF AZAWAD (CMA) - having regard to the CMA charter - having regard to the Internal Regulations of the Management Committee - given the need for the service STEERING COMMITTEE Decision N°013/Pdt CMA Bearing the Boundary of the State of AZAWAD and the State of Mali. The President of the CMA: تنسيقية الحركات الأزوادية DECIDED: Article 1: The start of bormage works between the State of Azawad and the State of Mali in order to avoid all conflicts of interest. Article 2: Prohibition of all military operations without the prior agreement of 40-1tl.: 0:51 +1.XIA the State of Azawad and its partners (Barkhane and Minusma) who provided efforts for our independence. Amplification: EMGA/CMA.. Fama area of Gao01 Minusma Kidal: Barkhane Kidal: Article 3: This decision takes effect from the date of its signature and will be recorded and published wherever needed. 01 ...01 01 Kidal, February 2, 2004 THE PRESIDENTIA SIDI IBRAHIM OULD SIDATT' sentences: - 'Pfizer announces Covid-19 vaccine update with Microsoft chip for symptom reduction Pfizer did not announce an agreement with Microsoft: the article about the chip in the covid vaccine is a satire' - Selensyj indicated rabbit ears to Putin here. This image of Zelenskyy showing Putin rabbit ears is manipulated - The CMA announces the start of the demarcation between the State of Azawad and the State of Mali Please note, this document attributed to former Tuareg separatist rebels is a fake --- # SentenceTransformer based on HIT-TMG/KaLM-embedding-multilingual-mini-v1 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [HIT-TMG/KaLM-embedding-multilingual-mini-v1](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-v1). It maps sentences & paragraphs to a 896-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [HIT-TMG/KaLM-embedding-multilingual-mini-v1](https://huggingface.co/HIT-TMG/KaLM-embedding-multilingual-mini-v1) <!-- at revision 685312fd77f877ad457efcf17bf31b5de0a8ed1c --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 896 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: Qwen2Model (1): Pooling({'word_embedding_dimension': 896, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("sentence_transformers_model_id") # Run inference sentences = [ 'The putschists sold out Mali Read it is extremely serious MOVEMENT COORDINATION OF AZAWAD (CMA) - having regard to the CMA charter - having regard to the Internal Regulations of the Management Committee - given the need for the service STEERING COMMITTEE Decision N°013/Pdt CMA Bearing the Boundary of the State of AZAWAD and the State of Mali. The President of the CMA: تنسيقية الحركات الأزوادية DECIDED: Article 1: The start of bormage works between the State of Azawad and the State of Mali in order to avoid all conflicts of interest. Article 2: Prohibition of all military operations without the prior agreement of 40-1tl.: 0:51 +1.XIA the State of Azawad and its partners (Barkhane and Minusma) who provided efforts for our independence. Amplification: EMGA/CMA.. Fama area of Gao01 Minusma Kidal: Barkhane Kidal: Article 3: This decision takes effect from the date of its signature and will be recorded and published wherever needed. 01 ...01 01 Kidal, February 2, 2004 THE PRESIDENTIA SIDI IBRAHIM OULD SIDATT', 'The CMA announces the start of the demarcation between the State of Azawad and the State of Mali Please note, this document attributed to former Tuareg separatist rebels is a fake', 'Pfizer announces Covid-19 vaccine update with Microsoft chip for symptom reduction Pfizer did not announce an agreement with Microsoft: the article about the chip in the covid vaccine is a satire', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 896] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 21,769 training samples * Columns: <code>sentence_0</code> and <code>sentence_1</code> * Approximate statistics based on the first 1000 samples: | | sentence_0 | sentence_1 | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 2 tokens</li><li>mean: 111.86 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 34.13 tokens</li><li>max: 127 tokens</li></ul> | * Samples: | sentence_0 | sentence_1 | |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>CAMP CANBERRA - the biggest gathering in Canberra of all time! Police report they let 1.4 million vehicles through and that was yesterday. People were still pouring in overnight and all morning. Most vehicles had more than one person in them. Amongst the vehicles there were 100s of special buses that came full of people from all over Australia. So doubling that number can still be considered quite a conservative estimate. Population of Australia : 25 million. When 5 million show up, that's 20% of the country and there's HEAPS of us that couldn't make it! Here's a HUGE SHOUT-OUT and THANK YOU to ALL who did! Lighting a candle for all who are rising up all over the world. I Love it when we stand Peacefully in Love as One! TO THE REBELS This is for ones that see the through the deception and lies. That actively resist tyranny and live a life which is lead by their own intuition and heart. They are owned by no one. 
To the brave Women and Men who courageously risk their reputation and relat...</code> | <code>Anti-vaccine mandate protests attract over one million vehicles to Canberra Facebook posts share false claim about size of anti-vaccine mandate protest in Australia</code> | | <code>Typhoon fireworks land in Shanghai emergency (12) The extension line of Shanghai Metro No. 1 began to flood. video </code> | <code>Video of flooded metro as Typhoon In-Fa hit Shanghai The video predates Typhoon In-Fa that hit eastern China in 2021</code> | | <code>WHO declares PCR tests unreliable At the same time as Joe Biden was sworn in as the new US President, the WHO questioned the reliability of the PCR test. I can't remember who kept mentioning this before? Was that me in the end? According to the WHO, a PCR test alone is not enough to detect an infection. How many millions of people have sat in quarantine for nothing? Positive test results in symptom-free people are not usable! For my critical statements about the PCR test, I was disparaged by self-appointed fact-checkers. I also remember a discussion on Servus-TV, where Prof. Manfred Spitzer, whom I valued before, suppressed any criticism of the PCR test in a highly authoritarian and almost aggressive manner. "Positive PCR test means infected!" We now know that the basis for the tightened and probably ever-extended lockdown is a political and not a scientific decision. Of course, politicians refer to scientists. However, only to those who support the political course. Two brave editors ...</code> | <code>The WHO confirmed that PCR tests are unsuitable for detecting corona WHO recommendations on PCR tests are misinterpreted</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `num_train_epochs`: 1 - `multi_dataset_batch_sampler`: round_robin #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: no - `prediction_loss_only`: True - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.0 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - 
`dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: batch_sampler - `multi_dataset_batch_sampler`: round_robin </details> ### Training Logs | Epoch | Step | Training Loss | |:------:|:-----:|:-------------:| | 0.0459 | 500 | 0.0142 | | 0.0919 | 1000 | 0.0367 | | 0.1378 | 1500 | 0.0444 | | 0.1837 | 2000 | 0.0581 | | 0.2297 | 2500 | 0.045 | | 0.2756 | 3000 | 0.0736 | | 0.3215 | 3500 | 0.0567 | | 0.3675 | 4000 | 0.0314 | | 0.4134 | 4500 | 0.0362 | | 0.4593 | 5000 | 0.029 | | 0.5053 | 5500 | 0.0621 | | 0.5512 | 6000 | 0.0328 | | 0.5972 | 6500 | 0.0279 | | 0.6431 | 7000 | 0.0343 | | 0.6890 | 7500 | 0.0251 | | 0.7350 | 8000 | 0.0437 | | 0.7809 | 8500 | 0.0328 | | 0.8268 | 9000 | 0.0123 | | 0.8728 | 9500 | 0.0177 | | 0.9187 | 10000 | 0.0332 | | 0.9646 | 10500 | 0.0214 | ### Framework Versions - Python: 3.11.11 - Sentence Transformers: 3.4.1 - Transformers: 4.48.3 - PyTorch: 2.5.1+cu124 - Accelerate: 1.3.0 - Datasets: 3.3.2 - Tokenizers: 0.21.0 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson 
and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
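For reference, here is a minimal sketch of how a comparable fine-tuning run could be set up with the sentence-transformers trainer, using the loss (MultipleNegativesRankingLoss with scale 20.0 and cosine similarity), batch size, and epoch count reported above. The dataset contents and output directory are placeholders, and this is not the exact script used to train this model:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

# Hypothetical pair data standing in for the unnamed 21,769-pair dataset.
train_dataset = Dataset.from_dict({
    "sentence_0": ["claim text ...", "another claim ..."],
    "sentence_1": ["fact-check summary ...", "another summary ..."],
})

model = SentenceTransformer("HIT-TMG/KaLM-embedding-multilingual-mini-v1")

# In-batch negatives loss with the card's parameters (scale=20.0, cos_sim default).
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="kalm-finetuned",        # placeholder
    num_train_epochs=1,
    per_device_train_batch_size=2,
    learning_rate=5e-5,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```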
[ "TEXT_CLASSIFICATION" ]
[ "PCR" ]
Non_BioNLP
markaw/NV-Embed-v2
markaw
feature-extraction
[ "transformers", "safetensors", "nvembed", "feature-extraction", "mteb", "custom_code", "en", "arxiv:2405.17428", "arxiv:2407.15831", "license:cc-by-nc-4.0", "model-index", "endpoints_compatible", "region:us" ]
1,727
1,727
15
0
--- language: - en library_name: transformers license: cc-by-nc-4.0 tags: - mteb model-index: - name: NV-Embed-v2 results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 94.28358208955224 - type: accuracy_stderr value: 0.40076780842082305 - type: ap value: 76.49097318319616 - type: ap_stderr value: 1.2418692675183929 - type: f1 value: 91.41982003001168 - type: f1_stderr value: 0.5043921413093579 - type: main_score value: 94.28358208955224 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 97.74185000000001 - type: accuracy_stderr value: 0.07420471683120942 - type: ap value: 96.4737144875525 - type: ap_stderr value: 0.2977518241541558 - type: f1 value: 97.7417581594921 - type: f1_stderr value: 0.07428763617010377 - type: main_score value: 97.74185000000001 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 63.96000000000001 - type: accuracy_stderr value: 1.815555011559825 - type: f1 value: 62.49361841640459 - type: f1_stderr value: 2.829339314126457 - type: main_score value: 63.96000000000001 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 46.515 - type: map_at_10 value: 62.392 - type: map_at_100 value: 62.732 - type: map_at_1000 value: 62.733000000000004 - type: map_at_3 value: 58.701 - type: map_at_5 value: 61.027 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 46.515 - type: ndcg_at_10 value: 70.074 - type: ndcg_at_100 value: 71.395 - type: ndcg_at_1000 value: 71.405 - type: ndcg_at_3 value: 62.643 - type: ndcg_at_5 value: 66.803 - type: precision_at_1 value: 46.515 - type: precision_at_10 value: 9.41 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 24.68 - type: precision_at_5 value: 16.814 - type: recall_at_1 value: 46.515 - type: recall_at_10 value: 94.097 - type: recall_at_100 value: 99.57300000000001 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 74.03999999999999 - type: recall_at_5 value: 84.068 - type: main_score value: 70.074 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: main_score value: 55.79933795955242 - type: v_measure value: 55.79933795955242 - type: v_measure_std value: 14.575108141916148 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: main_score value: 51.262845995850334 - type: v_measure value: 51.262845995850334 - type: v_measure_std value: 14.727824473104173 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 
2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 67.46477327480808 - type: mrr value: 79.50160488941653 - type: main_score value: 67.46477327480808 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cosine_pearson value: 89.74311007980987 - type: cosine_spearman value: 87.41644967443246 - type: manhattan_pearson value: 88.57457108347744 - type: manhattan_spearman value: 87.59295972042997 - type: euclidean_pearson value: 88.27108977118459 - type: euclidean_spearman value: 87.41644967443246 - type: main_score value: 87.41644967443246 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 92.41558441558443 - type: accuracy_stderr value: 0.37701502251934443 - type: f1 value: 92.38130170447671 - type: f1_stderr value: 0.39115151225617767 - type: main_score value: 92.41558441558443 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: main_score value: 54.08649516394218 - type: v_measure value: 54.08649516394218 - type: v_measure_std value: 0.5303233693045373 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: main_score value: 49.60352214167779 - type: v_measure value: 49.60352214167779 - type: v_measure_std value: 0.7176198612516721 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: CQADupstackRetrieval_is_a_combined_dataset config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 31.913249999999998 - type: map_at_10 value: 43.87733333333334 - type: map_at_100 value: 45.249916666666664 - type: map_at_1000 value: 45.350583333333326 - type: map_at_3 value: 40.316833333333335 - type: map_at_5 value: 42.317083333333336 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 38.30616666666667 - type: ndcg_at_10 value: 50.24175000000001 - type: ndcg_at_100 value: 55.345333333333336 - type: ndcg_at_1000 value: 56.91225000000001 - type: ndcg_at_3 value: 44.67558333333333 - type: ndcg_at_5 value: 47.32333333333334 - type: precision_at_1 value: 38.30616666666667 - type: precision_at_10 value: 9.007416666666666 - type: precision_at_100 value: 1.3633333333333333 - type: precision_at_1000 value: 0.16691666666666666 - type: precision_at_3 value: 20.895666666666667 - type: precision_at_5 value: 14.871666666666666 - type: recall_at_1 value: 31.913249999999998 - type: recall_at_10 value: 64.11891666666666 - type: recall_at_100 value: 85.91133333333333 - type: recall_at_1000 value: 96.28225 - type: recall_at_3 value: 48.54749999999999 - type: recall_at_5 value: 55.44283333333334 - type: main_score value: 50.24175000000001 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 19.556 - type: map_at_10 value: 34.623 - type: map_at_100 value: 36.97 - type: map_at_1000 value: 37.123 - type: map_at_3 
value: 28.904999999999998 - type: map_at_5 value: 31.955 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 44.104 - type: ndcg_at_10 value: 45.388 - type: ndcg_at_100 value: 52.793 - type: ndcg_at_1000 value: 55.108999999999995 - type: ndcg_at_3 value: 38.604 - type: ndcg_at_5 value: 40.806 - type: precision_at_1 value: 44.104 - type: precision_at_10 value: 14.143 - type: precision_at_100 value: 2.2190000000000003 - type: precision_at_1000 value: 0.266 - type: precision_at_3 value: 29.316 - type: precision_at_5 value: 21.98 - type: recall_at_1 value: 19.556 - type: recall_at_10 value: 52.120999999999995 - type: recall_at_100 value: 76.509 - type: recall_at_1000 value: 89.029 - type: recall_at_3 value: 34.919 - type: recall_at_5 value: 42.18 - type: main_score value: 45.388 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 10.714 - type: map_at_10 value: 25.814999999999998 - type: map_at_100 value: 37.845 - type: map_at_1000 value: 39.974 - type: map_at_3 value: 17.201 - type: map_at_5 value: 21.062 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 66.0 - type: ndcg_at_10 value: 53.496 - type: ndcg_at_100 value: 58.053 - type: ndcg_at_1000 value: 64.886 - type: ndcg_at_3 value: 57.656 - type: ndcg_at_5 value: 55.900000000000006 - type: precision_at_1 value: 77.25 - type: precision_at_10 value: 43.65 - type: precision_at_100 value: 13.76 - type: precision_at_1000 value: 2.5940000000000003 - type: precision_at_3 value: 61.0 - type: precision_at_5 value: 54.65 - type: recall_at_1 value: 10.714 - type: recall_at_10 value: 31.173000000000002 - type: recall_at_100 value: 63.404 - type: recall_at_1000 value: 85.874 - type: recall_at_3 value: 18.249000000000002 - type: recall_at_5 value: 23.69 - type: main_score value: 53.496 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 93.38499999999999 - type: accuracy_stderr value: 0.13793114224133846 - type: f1 value: 90.12141028353496 - type: f1_stderr value: 0.174640257706043 - type: main_score value: 93.38499999999999 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 84.66900000000001 - type: map_at_10 value: 91.52799999999999 - type: map_at_100 value: 91.721 - type: map_at_1000 value: 91.73 - type: map_at_3 value: 90.752 - type: map_at_5 value: 91.262 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 91.20899999999999 - type: ndcg_at_10 value: 93.74900000000001 - type: ndcg_at_100 value: 94.279 - type: ndcg_at_1000 value: 94.408 - type: ndcg_at_3 value: 92.923 - type: ndcg_at_5 value: 93.376 - type: precision_at_1 value: 91.20899999999999 - type: precision_at_10 value: 11.059 - type: precision_at_100 value: 1.1560000000000001 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 
35.129 - type: precision_at_5 value: 21.617 - type: recall_at_1 value: 84.66900000000001 - type: recall_at_10 value: 97.03399999999999 - type: recall_at_100 value: 98.931 - type: recall_at_1000 value: 99.65899999999999 - type: recall_at_3 value: 94.76299999999999 - type: recall_at_5 value: 95.968 - type: main_score value: 93.74900000000001 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 34.866 - type: map_at_10 value: 58.06099999999999 - type: map_at_100 value: 60.028999999999996 - type: map_at_1000 value: 60.119 - type: map_at_3 value: 51.304 - type: map_at_5 value: 55.054 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 64.815 - type: ndcg_at_10 value: 65.729 - type: ndcg_at_100 value: 71.14 - type: ndcg_at_1000 value: 72.336 - type: ndcg_at_3 value: 61.973 - type: ndcg_at_5 value: 62.858000000000004 - type: precision_at_1 value: 64.815 - type: precision_at_10 value: 17.87 - type: precision_at_100 value: 2.373 - type: precision_at_1000 value: 0.258 - type: precision_at_3 value: 41.152 - type: precision_at_5 value: 29.568 - type: recall_at_1 value: 34.866 - type: recall_at_10 value: 72.239 - type: recall_at_100 value: 91.19 - type: recall_at_1000 value: 98.154 - type: recall_at_3 value: 56.472 - type: recall_at_5 value: 63.157 - type: main_score value: 65.729 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 44.651999999999994 - type: map_at_10 value: 79.95100000000001 - type: map_at_100 value: 80.51700000000001 - type: map_at_1000 value: 80.542 - type: map_at_3 value: 77.008 - type: map_at_5 value: 78.935 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 89.305 - type: ndcg_at_10 value: 85.479 - type: ndcg_at_100 value: 87.235 - type: ndcg_at_1000 value: 87.669 - type: ndcg_at_3 value: 81.648 - type: ndcg_at_5 value: 83.88600000000001 - type: precision_at_1 value: 89.305 - type: precision_at_10 value: 17.807000000000002 - type: precision_at_100 value: 1.9140000000000001 - type: precision_at_1000 value: 0.197 - type: precision_at_3 value: 53.756 - type: precision_at_5 value: 34.018 - type: recall_at_1 value: 44.651999999999994 - type: recall_at_10 value: 89.034 - type: recall_at_100 value: 95.719 - type: recall_at_1000 value: 98.535 - type: recall_at_3 value: 80.635 - type: recall_at_5 value: 85.044 - type: main_score value: 85.479 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 97.1376 - type: accuracy_stderr value: 0.04571914259913447 - type: ap value: 95.92783808558808 - type: ap_stderr value: 0.05063782483358255 - type: f1 value: 97.13755519177172 - type: f1_stderr value: 0.04575943074086138 - type: main_score value: 97.1376 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 0.0 - type: map_at_10 value: 38.342 - type: map_at_100 value: 0.0 - type: 
map_at_1000 value: 0.0 - type: map_at_3 value: 0.0 - type: map_at_5 value: 0.0 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 0.0 - type: ndcg_at_10 value: 45.629999999999995 - type: ndcg_at_100 value: 0.0 - type: ndcg_at_1000 value: 0.0 - type: ndcg_at_3 value: 0.0 - type: ndcg_at_5 value: 0.0 - type: precision_at_1 value: 0.0 - type: precision_at_10 value: 7.119000000000001 - type: precision_at_100 value: 0.0 - type: precision_at_1000 value: 0.0 - type: precision_at_3 value: 0.0 - type: precision_at_5 value: 0.0 - type: recall_at_1 value: 0.0 - type: recall_at_10 value: 67.972 - type: recall_at_100 value: 0.0 - type: recall_at_1000 value: 0.0 - type: recall_at_3 value: 0.0 - type: recall_at_5 value: 0.0 - type: main_score value: 45.629999999999995 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 99.24988600091199 - type: accuracy_stderr value: 0.04496826931900734 - type: f1 value: 99.15933275095276 - type: f1_stderr value: 0.05565039139747446 - type: main_score value: 99.24988600091199 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 94.3684450524396 - type: accuracy_stderr value: 0.8436548701322188 - type: f1 value: 77.33022623133307 - type: f1_stderr value: 0.9228425861187275 - type: main_score value: 94.3684450524396 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 86.09616677874916 - type: accuracy_stderr value: 0.9943208055590853 - type: f1 value: 83.4902056490062 - type: f1_stderr value: 0.7626189310074184 - type: main_score value: 86.09616677874916 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 92.17215870880968 - type: accuracy_stderr value: 0.25949941333658166 - type: f1 value: 91.36757392422702 - type: f1_stderr value: 0.29139507298154815 - type: main_score value: 92.17215870880968 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: main_score value: 46.09497344077905 - type: v_measure value: 46.09497344077905 - type: v_measure_std value: 1.44871520869784 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: main_score value: 44.861049989560684 - type: v_measure value: 44.861049989560684 - type: v_measure_std value: 1.432199293162203 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.75936162919999 - type: mrr value: 32.966812736541236 - type: main_score value: 31.75936162919999 - task: type: Retrieval dataset: name: MTEB NFCorpus type: 
mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 7.893999999999999 - type: map_at_10 value: 17.95 - type: map_at_100 value: 23.474 - type: map_at_1000 value: 25.412000000000003 - type: map_at_3 value: 12.884 - type: map_at_5 value: 15.171000000000001 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 55.728 - type: ndcg_at_10 value: 45.174 - type: ndcg_at_100 value: 42.18 - type: ndcg_at_1000 value: 50.793 - type: ndcg_at_3 value: 50.322 - type: ndcg_at_5 value: 48.244 - type: precision_at_1 value: 57.276 - type: precision_at_10 value: 33.437 - type: precision_at_100 value: 10.671999999999999 - type: precision_at_1000 value: 2.407 - type: precision_at_3 value: 46.646 - type: precision_at_5 value: 41.672 - type: recall_at_1 value: 7.893999999999999 - type: recall_at_10 value: 22.831000000000003 - type: recall_at_100 value: 43.818 - type: recall_at_1000 value: 75.009 - type: recall_at_3 value: 14.371 - type: recall_at_5 value: 17.752000000000002 - type: main_score value: 45.174 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 49.351 - type: map_at_10 value: 66.682 - type: map_at_100 value: 67.179 - type: map_at_1000 value: 67.18499999999999 - type: map_at_3 value: 62.958999999999996 - type: map_at_5 value: 65.364 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 55.417 - type: ndcg_at_10 value: 73.568 - type: ndcg_at_100 value: 75.35 - type: ndcg_at_1000 value: 75.478 - type: ndcg_at_3 value: 67.201 - type: ndcg_at_5 value: 70.896 - type: precision_at_1 value: 55.417 - type: precision_at_10 value: 11.036999999999999 - type: precision_at_100 value: 1.204 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 29.654000000000003 - type: precision_at_5 value: 20.006 - type: recall_at_1 value: 49.351 - type: recall_at_10 value: 91.667 - type: recall_at_100 value: 98.89 - type: recall_at_1000 value: 99.812 - type: recall_at_3 value: 75.715 - type: recall_at_5 value: 84.072 - type: main_score value: 73.568 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 metrics: - type: map_at_1 value: 71.358 - type: map_at_10 value: 85.474 - type: map_at_100 value: 86.101 - type: map_at_1000 value: 86.114 - type: map_at_3 value: 82.562 - type: map_at_5 value: 84.396 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 82.12 - type: ndcg_at_10 value: 89.035 - type: ndcg_at_100 value: 90.17399999999999 - type: ndcg_at_1000 value: 90.243 - type: ndcg_at_3 value: 86.32300000000001 - type: ndcg_at_5 value: 87.85 - type: precision_at_1 value: 82.12 - type: precision_at_10 value: 13.55 - type: precision_at_100 value: 1.54 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.89 - type: precision_at_5 value: 24.9 - type: recall_at_1 value: 71.358 - type: recall_at_10 value: 95.855 - type: recall_at_100 value: 99.711 - type: recall_at_1000 value: 99.994 - 
type: recall_at_3 value: 88.02 - type: recall_at_5 value: 92.378 - type: main_score value: 89.035 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: main_score value: 71.0984522742521 - type: v_measure value: 71.0984522742521 - type: v_measure_std value: 3.5668139917058044 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: main_score value: 74.94499641904133 - type: v_measure value: 74.94499641904133 - type: v_measure_std value: 11.419672879389248 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88 metrics: - type: map_at_1 value: 5.343 - type: map_at_10 value: 13.044 - type: map_at_100 value: 15.290999999999999 - type: map_at_1000 value: 15.609 - type: map_at_3 value: 9.227 - type: map_at_5 value: 11.158 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 26.3 - type: ndcg_at_10 value: 21.901 - type: ndcg_at_100 value: 30.316 - type: ndcg_at_1000 value: 35.547000000000004 - type: ndcg_at_3 value: 20.560000000000002 - type: ndcg_at_5 value: 18.187 - type: precision_at_1 value: 26.3 - type: precision_at_10 value: 11.34 - type: precision_at_100 value: 2.344 - type: precision_at_1000 value: 0.359 - type: precision_at_3 value: 18.967 - type: precision_at_5 value: 15.920000000000002 - type: recall_at_1 value: 5.343 - type: recall_at_10 value: 22.997 - type: recall_at_100 value: 47.562 - type: recall_at_1000 value: 72.94500000000001 - type: recall_at_3 value: 11.533 - type: recall_at_5 value: 16.148 - type: main_score value: 21.901 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: 20a6d6f312dd54037fe07a32d58e5e168867909d metrics: - type: cosine_pearson value: 87.3054603493591 - type: cosine_spearman value: 82.14763206055602 - type: manhattan_pearson value: 84.78737790237557 - type: manhattan_spearman value: 81.88455356002758 - type: euclidean_pearson value: 85.00668629311117 - type: euclidean_spearman value: 82.14763037860851 - type: main_score value: 82.14763206055602 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cosine_pearson value: 86.6911864687294 - type: cosine_spearman value: 77.89286260403269 - type: manhattan_pearson value: 82.87240347680857 - type: manhattan_spearman value: 78.10055393740326 - type: euclidean_pearson value: 82.72282535777123 - type: euclidean_spearman value: 77.89256648406325 - type: main_score value: 77.89286260403269 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cosine_pearson value: 87.7220832598633 - type: cosine_spearman value: 88.30238972017452 - type: manhattan_pearson value: 87.88214789140248 - type: manhattan_spearman value: 88.24770220032391 - type: euclidean_pearson value: 87.98610386257103 - type: euclidean_spearman value: 88.30238972017452 - type: main_score value: 88.30238972017452 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default 
split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cosine_pearson value: 85.70614623247714 - type: cosine_spearman value: 84.29920990970672 - type: manhattan_pearson value: 84.9836190531721 - type: manhattan_spearman value: 84.40933470597638 - type: euclidean_pearson value: 84.96652336693347 - type: euclidean_spearman value: 84.29920989531965 - type: main_score value: 84.29920990970672 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cosine_pearson value: 88.4169972425264 - type: cosine_spearman value: 89.03555007807218 - type: manhattan_pearson value: 88.83068699455478 - type: manhattan_spearman value: 89.21877175674125 - type: euclidean_pearson value: 88.7251052947544 - type: euclidean_spearman value: 89.03557389893083 - type: main_score value: 89.03555007807218 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cosine_pearson value: 85.63830579034632 - type: cosine_spearman value: 86.77353371581373 - type: manhattan_pearson value: 86.24830492396637 - type: manhattan_spearman value: 86.96754348626189 - type: euclidean_pearson value: 86.09837038778359 - type: euclidean_spearman value: 86.77353371581373 - type: main_score value: 86.77353371581373 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cosine_pearson value: 91.2204675588959 - type: cosine_spearman value: 90.66976712249057 - type: manhattan_pearson value: 91.11007808242346 - type: manhattan_spearman value: 90.51739232964488 - type: euclidean_pearson value: 91.19588941007903 - type: euclidean_spearman value: 90.66976712249057 - type: main_score value: 90.66976712249057 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cosine_pearson value: 69.34416749707114 - type: cosine_spearman value: 68.11632448161046 - type: manhattan_pearson value: 68.99243488935281 - type: manhattan_spearman value: 67.8398546438258 - type: euclidean_pearson value: 69.06376010216088 - type: euclidean_spearman value: 68.11632448161046 - type: main_score value: 68.11632448161046 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cosine_pearson value: 88.10309739429758 - type: cosine_spearman value: 88.40520383147418 - type: manhattan_pearson value: 88.50753383813232 - type: manhattan_spearman value: 88.66382629460927 - type: euclidean_pearson value: 88.35050664609376 - type: euclidean_spearman value: 88.40520383147418 - type: main_score value: 88.40520383147418 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 87.58627126942797 - type: mrr value: 97.01098103058887 - type: main_score value: 87.58627126942797 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 62.883 - type: map_at_10 value: 75.371 - type: map_at_100 value: 75.66000000000001 - type: map_at_1000 value: 75.667 - 
type: map_at_3 value: 72.741 - type: map_at_5 value: 74.74 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 66.0 - type: ndcg_at_10 value: 80.12700000000001 - type: ndcg_at_100 value: 81.291 - type: ndcg_at_1000 value: 81.464 - type: ndcg_at_3 value: 76.19 - type: ndcg_at_5 value: 78.827 - type: precision_at_1 value: 66.0 - type: precision_at_10 value: 10.567 - type: precision_at_100 value: 1.117 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 30.333 - type: precision_at_5 value: 20.133000000000003 - type: recall_at_1 value: 62.883 - type: recall_at_10 value: 93.556 - type: recall_at_100 value: 98.667 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 83.322 - type: recall_at_5 value: 89.756 - type: main_score value: 80.12700000000001 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.87524752475248 - type: cos_sim_accuracy_threshold value: 74.86587762832642 - type: cos_sim_ap value: 97.02222446606328 - type: cos_sim_f1 value: 93.66197183098592 - type: cos_sim_f1_threshold value: 74.74223375320435 - type: cos_sim_precision value: 94.23076923076923 - type: cos_sim_recall value: 93.10000000000001 - type: dot_accuracy value: 99.87524752475248 - type: dot_accuracy_threshold value: 74.86587762832642 - type: dot_ap value: 97.02222688043362 - type: dot_f1 value: 93.66197183098592 - type: dot_f1_threshold value: 74.74223375320435 - type: dot_precision value: 94.23076923076923 - type: dot_recall value: 93.10000000000001 - type: euclidean_accuracy value: 99.87524752475248 - type: euclidean_accuracy_threshold value: 70.9000825881958 - type: euclidean_ap value: 97.02222446606329 - type: euclidean_f1 value: 93.66197183098592 - type: euclidean_f1_threshold value: 71.07426524162292 - type: euclidean_precision value: 94.23076923076923 - type: euclidean_recall value: 93.10000000000001 - type: manhattan_accuracy value: 99.87623762376238 - type: manhattan_accuracy_threshold value: 3588.5040283203125 - type: manhattan_ap value: 97.09194643777883 - type: manhattan_f1 value: 93.7375745526839 - type: manhattan_f1_threshold value: 3664.3760681152344 - type: manhattan_precision value: 93.18181818181817 - type: manhattan_recall value: 94.3 - type: max_accuracy value: 99.87623762376238 - type: max_ap value: 97.09194643777883 - type: max_f1 value: 93.7375745526839 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: main_score value: 82.10134099988541 - type: v_measure value: 82.10134099988541 - type: v_measure_std value: 2.7926349897769533 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: main_score value: 48.357450742397404 - type: v_measure value: 48.357450742397404 - type: v_measure_std value: 1.520118876440547 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - 
type: map value: 55.79277200802986 - type: mrr value: 56.742517082590616 - type: main_score value: 55.79277200802986 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cosine_spearman value: 30.701215774712693 - type: cosine_pearson value: 31.26740037278488 - type: dot_spearman value: 30.701215774712693 - type: dot_pearson value: 31.267404144879997 - type: main_score value: 30.701215774712693 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: bb9466bac8153a0349341eb1b22e06409e78ef4e metrics: - type: map_at_1 value: 0.23800000000000002 - type: map_at_10 value: 2.31 - type: map_at_100 value: 15.495000000000001 - type: map_at_1000 value: 38.829 - type: map_at_3 value: 0.72 - type: map_at_5 value: 1.185 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 91.0 - type: ndcg_at_10 value: 88.442 - type: ndcg_at_100 value: 71.39 - type: ndcg_at_1000 value: 64.153 - type: ndcg_at_3 value: 89.877 - type: ndcg_at_5 value: 89.562 - type: precision_at_1 value: 92.0 - type: precision_at_10 value: 92.60000000000001 - type: precision_at_100 value: 73.74000000000001 - type: precision_at_1000 value: 28.222 - type: precision_at_3 value: 94.0 - type: precision_at_5 value: 93.60000000000001 - type: recall_at_1 value: 0.23800000000000002 - type: recall_at_10 value: 2.428 - type: recall_at_100 value: 18.099999999999998 - type: recall_at_1000 value: 60.79599999999999 - type: recall_at_3 value: 0.749 - type: recall_at_5 value: 1.238 - type: main_score value: 88.442 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 3.4939999999999998 - type: map_at_10 value: 12.531999999999998 - type: map_at_100 value: 19.147 - type: map_at_1000 value: 20.861 - type: map_at_3 value: 7.558 - type: map_at_5 value: 9.49 - type: mrr_at_1 value: 0.0 - type: mrr_at_10 value: 0.0 - type: mrr_at_100 value: 0.0 - type: mrr_at_1000 value: 0.0 - type: mrr_at_3 value: 0.0 - type: mrr_at_5 value: 0.0 - type: ndcg_at_1 value: 47.959 - type: ndcg_at_10 value: 31.781 - type: ndcg_at_100 value: 42.131 - type: ndcg_at_1000 value: 53.493 - type: ndcg_at_3 value: 39.204 - type: ndcg_at_5 value: 34.635 - type: precision_at_1 value: 48.980000000000004 - type: precision_at_10 value: 27.143 - type: precision_at_100 value: 8.224 - type: precision_at_1000 value: 1.584 - type: precision_at_3 value: 38.775999999999996 - type: precision_at_5 value: 33.061 - type: recall_at_1 value: 3.4939999999999998 - type: recall_at_10 value: 18.895 - type: recall_at_100 value: 50.192 - type: recall_at_1000 value: 85.167 - type: recall_at_3 value: 8.703 - type: recall_at_5 value: 11.824 - type: main_score value: 31.781 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 92.7402 - type: accuracy_stderr value: 1.020764595781027 - type: ap value: 44.38594756333084 - type: ap_stderr value: 1.817150701258273 - type: f1 value: 79.95699280019547 - type: f1_stderr value: 1.334582498702029 - type: main_score value: 92.7402 - task: type: 
Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 80.86870401810978 - type: accuracy_stderr value: 0.22688467782004712 - type: f1 value: 81.1829040745744 - type: f1_stderr value: 0.19774920574849694 - type: main_score value: 80.86870401810978 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: main_score value: 64.82048869927482 - type: v_measure value: 64.82048869927482 - type: v_measure_std value: 0.9170394252450564 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 88.44251057996067 - type: cos_sim_accuracy_threshold value: 70.2150285243988 - type: cos_sim_ap value: 81.11422351199913 - type: cos_sim_f1 value: 73.71062868615887 - type: cos_sim_f1_threshold value: 66.507488489151 - type: cos_sim_precision value: 70.2799712849964 - type: cos_sim_recall value: 77.4934036939314 - type: dot_accuracy value: 88.44251057996067 - type: dot_accuracy_threshold value: 70.2150285243988 - type: dot_ap value: 81.11420529068658 - type: dot_f1 value: 73.71062868615887 - type: dot_f1_threshold value: 66.50749444961548 - type: dot_precision value: 70.2799712849964 - type: dot_recall value: 77.4934036939314 - type: euclidean_accuracy value: 88.44251057996067 - type: euclidean_accuracy_threshold value: 77.18156576156616 - type: euclidean_ap value: 81.11422421732487 - type: euclidean_f1 value: 73.71062868615887 - type: euclidean_f1_threshold value: 81.84436559677124 - type: euclidean_precision value: 70.2799712849964 - type: euclidean_recall value: 77.4934036939314 - type: manhattan_accuracy value: 88.26369434344639 - type: manhattan_accuracy_threshold value: 3837.067413330078 - type: manhattan_ap value: 80.81442360477725 - type: manhattan_f1 value: 73.39883099117024 - type: manhattan_f1_threshold value: 4098.833847045898 - type: manhattan_precision value: 69.41896024464832 - type: manhattan_recall value: 77.86279683377309 - type: max_accuracy value: 88.44251057996067 - type: max_ap value: 81.11422421732487 - type: max_f1 value: 73.71062868615887 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 90.03182365040556 - type: cos_sim_accuracy_threshold value: 64.46443796157837 - type: cos_sim_ap value: 87.86649113691112 - type: cos_sim_f1 value: 80.45644844577821 - type: cos_sim_f1_threshold value: 61.40774488449097 - type: cos_sim_precision value: 77.54052702992216 - type: cos_sim_recall value: 83.60024638127503 - type: dot_accuracy value: 90.03182365040556 - type: dot_accuracy_threshold value: 64.46444988250732 - type: dot_ap value: 87.86649011954319 - type: dot_f1 value: 80.45644844577821 - type: dot_f1_threshold value: 61.407750844955444 - type: dot_precision value: 77.54052702992216 - type: dot_recall value: 83.60024638127503 - type: euclidean_accuracy value: 90.03182365040556 - type: euclidean_accuracy_threshold value: 84.30368900299072 - type: euclidean_ap value: 87.86649114275045 - type: euclidean_f1 value: 
80.45644844577821 - type: euclidean_f1_threshold value: 87.8547191619873 - type: euclidean_precision value: 77.54052702992216 - type: euclidean_recall value: 83.60024638127503 - type: manhattan_accuracy value: 89.99883572010712 - type: manhattan_accuracy_threshold value: 4206.838607788086 - type: manhattan_ap value: 87.8600826607838 - type: manhattan_f1 value: 80.44054508120217 - type: manhattan_f1_threshold value: 4372.755432128906 - type: manhattan_precision value: 78.08219178082192 - type: manhattan_recall value: 82.94579611949491 - type: max_accuracy value: 90.03182365040556 - type: max_ap value: 87.86649114275045 - type: max_f1 value: 80.45644844577821
---

## Introduction

We present NV-Embed-v2, a generalist embedding model that ranks No. 1 on the Massive Text Embedding Benchmark ([MTEB benchmark](https://huggingface.co/spaces/mteb/leaderboard)) (as of Aug 30, 2024) with a score of 72.31 across 56 text embedding tasks. It also holds the No. 1 spot in the retrieval sub-category (a score of 62.65 across 15 tasks) on the leaderboard, which is essential to the development of RAG technology.

NV-Embed-v2 presents several new designs, including having the LLM attend to latent vectors for better pooled embedding output, and a two-stage instruction tuning method that enhances the accuracy of both retrieval and non-retrieval tasks. Additionally, NV-Embed-v2 incorporates a novel hard-negative mining method that takes the positive relevance score into account for better removal of false negatives. For more technical details, refer to our paper: [NV-Embed: Improved Techniques for Training LLMs as Generalist Embedding Models](https://arxiv.org/pdf/2405.17428).

## Model Details
- Base Decoder-only LLM: [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- Pooling Type: Latent-Attention
- Embedding Dimension: 4096
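Latent-attention pooling is, roughly, cross-attention against a small set of trainable latent vectors followed by masked mean pooling. The sketch below illustrates that idea only and is not the model's actual code: the class name, sizes, and residual wiring are our assumptions based on a reading of the paper, and the real implementation ships with the model's `trust_remote_code` package.

```python
import torch
import torch.nn as nn

class LatentAttentionPooling(nn.Module):
    """Sketch: per-token hidden states attend to a trainable latent array, then mean-pool."""

    def __init__(self, hidden_dim: int = 4096, num_latents: int = 512, num_heads: int = 8):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, hidden_dim) * 0.02)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.GELU(), nn.Linear(hidden_dim, hidden_dim)
        )

    def forward(self, hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden_dim) from the decoder's last layer
        kv = self.latents.unsqueeze(0).expand(hidden_states.size(0), -1, -1)
        # each token state queries the shared latent array (cross-attention)
        out, _ = self.attn(hidden_states, kv, kv)
        out = out + self.mlp(out)  # residual MLP (assumption)
        # masked mean pool over tokens -> one embedding per input
        mask = attention_mask.unsqueeze(-1).type_as(out)
        return (out * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)
```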
## How to use

Here is an example of how to encode queries and passages using HuggingFace Transformers and Sentence-Transformers. Please find the required package versions [here](https://huggingface.co/nvidia/NV-Embed-v2#2-required-packages).

### Usage (HuggingFace Transformers)

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Each query needs to be accompanied by a corresponding instruction describing the task.
task_name_to_instruct = {"example": "Given a question, retrieve passages that answer the question",}

query_prefix = "Instruct: "+task_name_to_instruct["example"]+"\nQuery: "
queries = [
    'are judo throws allowed in wrestling?',
    'how to become a radiology technician in michigan?'
    ]

# No instruction needed for retrieval passages
passage_prefix = ""
passages = [
    "Since you're reading this, you are probably someone from a judo background or someone who is just wondering how judo techniques can be applied under wrestling rules. So without further ado, let's get to the question. Are Judo throws allowed in wrestling? Yes, judo throws are allowed in freestyle and folkstyle wrestling. You only need to be careful to follow the slam rules when executing judo throws. In wrestling, a slam is lifting and returning an opponent to the mat with unnecessary force.",
    "Below are the basic steps to becoming a radiologic technologist in Michigan:Earn a high school diploma. As with most careers in health care, a high school education is the first step to finding entry-level employment. Taking classes in math and science, such as anatomy, biology, chemistry, physiology, and physics, can help prepare students for their college studies and future careers.Earn an associate degree. Entry-level radiologic positions typically require at least an Associate of Applied Science. Before enrolling in one of these degree programs, students should make sure it has been properly accredited by the Joint Review Committee on Education in Radiologic Technology (JRCERT).Get licensed or certified in the state of Michigan."
    ]

# load model with tokenizer
model = AutoModel.from_pretrained('nvidia/NV-Embed-v2', trust_remote_code=True)

# get the embeddings
max_length = 32768
query_embeddings = model.encode(queries, instruction=query_prefix, max_length=max_length)
passage_embeddings = model.encode(passages, instruction=passage_prefix, max_length=max_length)

# normalize embeddings
query_embeddings = F.normalize(query_embeddings, p=2, dim=1)
passage_embeddings = F.normalize(passage_embeddings, p=2, dim=1)

# get the embeddings with DataLoader (splitting the datasets into multiple mini-batches)
# batch_size=2
# query_embeddings = model._do_encode(queries, batch_size=batch_size, instruction=query_prefix, max_length=max_length, num_workers=32, return_numpy=True)
# passage_embeddings = model._do_encode(passages, batch_size=batch_size, instruction=passage_prefix, max_length=max_length, num_workers=32, return_numpy=True)

scores = (query_embeddings @ passage_embeddings.T) * 100
print(scores.tolist())
# [[87.42693328857422, 0.46283677220344543], [0.965264618396759, 86.03721618652344]]
```

### Usage (Sentence-Transformers)

```python
import torch
from sentence_transformers import SentenceTransformer

# Each query needs to be accompanied by a corresponding instruction describing the task.
task_name_to_instruct = {"example": "Given a question, retrieve passages that answer the question",}

query_prefix = "Instruct: "+task_name_to_instruct["example"]+"\nQuery: "
queries = [
    'are judo throws allowed in wrestling?',
    'how to become a radiology technician in michigan?'
    ]

# No instruction needed for retrieval passages
passages = [
    "Since you're reading this, you are probably someone from a judo background or someone who is just wondering how judo techniques can be applied under wrestling rules. So without further ado, let's get to the question. Are Judo throws allowed in wrestling? Yes, judo throws are allowed in freestyle and folkstyle wrestling. You only need to be careful to follow the slam rules when executing judo throws. In wrestling, a slam is lifting and returning an opponent to the mat with unnecessary force.",
    "Below are the basic steps to becoming a radiologic technologist in Michigan:Earn a high school diploma. As with most careers in health care, a high school education is the first step to finding entry-level employment. Taking classes in math and science, such as anatomy, biology, chemistry, physiology, and physics, can help prepare students for their college studies and future careers.Earn an associate degree. Entry-level radiologic positions typically require at least an Associate of Applied Science. Before enrolling in one of these degree programs, students should make sure it has been properly accredited by the Joint Review Committee on Education in Radiologic Technology (JRCERT).Get licensed or certified in the state of Michigan."
    ]
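# Note: for this model the EOS token must be appended to each input manually and padding
# must be on the right; the add_eos helper and tokenizer settings below handle both, since
# sentence-transformers does not do this automatically.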
# load model with tokenizer
model = SentenceTransformer('nvidia/NV-Embed-v2', trust_remote_code=True)
model.max_seq_length = 32768
model.tokenizer.padding_side = "right"

def add_eos(input_examples):
    input_examples = [input_example + model.tokenizer.eos_token for input_example in input_examples]
    return input_examples

# get the embeddings
batch_size = 2
query_embeddings = model.encode(add_eos(queries), batch_size=batch_size, prompt=query_prefix, normalize_embeddings=True)
passage_embeddings = model.encode(add_eos(passages), batch_size=batch_size, normalize_embeddings=True)

scores = (query_embeddings @ passage_embeddings.T) * 100
print(scores.tolist())
```

## License
This model should not be used for any commercial purpose. Refer to the [license](https://spdx.org/licenses/CC-BY-NC-4.0) for the detailed terms. For commercial use, we recommend the models of [NeMo Retriever Microservices (NIMs)](https://build.nvidia.com/explore/retrieval).

## Correspondence to
Chankyu Lee ([email protected]), Wei Ping ([email protected])

## Citation
If you find this code useful in your research, please consider citing:

```bibtex
@article{lee2024nv,
  title={NV-Embed: Improved Techniques for Training LLMs as Generalist Embedding Models},
  author={Lee, Chankyu and Roy, Rajarshi and Xu, Mengyao and Raiman, Jonathan and Shoeybi, Mohammad and Catanzaro, Bryan and Ping, Wei},
  journal={arXiv preprint arXiv:2405.17428},
  year={2024}
}
```

```bibtex
@article{moreira2024nv,
  title={NV-Retriever: Improving text embedding models with effective hard-negative mining},
  author={Moreira, Gabriel de Souza P and Osmulski, Radek and Xu, Mengyao and Ak, Ronay and Schifferer, Benedikt and Oldridge, Even},
  journal={arXiv preprint arXiv:2407.15831},
  year={2024}
}
```

## Troubleshooting

#### 1. Instruction template for MTEB benchmarks

For the MTEB retrieval, STS, and summarization sub-tasks, please use the instruction prefix template in [instructions.json](https://huggingface.co/nvidia/NV-Embed-v2/blob/main/instructions.json). For classification, clustering, and reranking, please use the instructions provided in Table 7 of the [NV-Embed paper](https://arxiv.org/pdf/2405.17428).

#### 2. Required Packages

If you run into trouble, try installing the Python packages below:

```bash
pip uninstall -y transformer-engine
pip install torch==2.2.0
pip install transformers==4.42.4
pip install flash-attn==2.2.0
pip install sentence-transformers==2.7.0
```

#### 3. How to enable Multi-GPU (note: this applies to the HuggingFace Transformers usage)

```python
from transformers import AutoModel
from torch.nn import DataParallel

embedding_model = AutoModel.from_pretrained("nvidia/NV-Embed-v2")
for module_key, module in embedding_model._modules.items():
    embedding_model._modules[module_key] = DataParallel(module)
```

#### 4. Fixing "nvidia/NV-Embed-v2 is not the path to a directory containing a file named config.json"

Switch to your local model path, open config.json, and replace the value of **"_name_or_path"** with your local model path.
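If you prefer to patch it programmatically, here is a minimal sketch (the directory below is a hypothetical placeholder; point it at wherever you downloaded the checkpoint):

```python
import json
from pathlib import Path

# hypothetical local checkpoint directory, replace with your own path
local_model_path = Path("/path/to/NV-Embed-v2")

config_file = local_model_path / "config.json"
config = json.loads(config_file.read_text())

# point _name_or_path at the local directory instead of the hub repo id
config["_name_or_path"] = str(local_model_path)

config_file.write_text(json.dumps(config, indent=2))
```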
#### 5. Access to model nvidia/NV-Embed-v2 is restricted. You must be authenticated to access it

Use your huggingface access [token](https://huggingface.co/settings/tokens) to execute *"huggingface-cli login"*.

#### 6. How to resolve a slight mismatch in Sentence-Transformers results

A slight mismatch in the Sentence-Transformers implementation is caused by a discrepancy in the calculation of the instruction prefix length within the sentence-transformers package. To fix this issue, you need to build the sentence-transformers package from source, making the necessary modification in this [line](https://github.com/UKPLab/sentence-transformers/blob/v2.7-release/sentence_transformers/SentenceTransformer.py#L353) as below.

```bash
git clone https://github.com/UKPLab/sentence-transformers.git
cd sentence-transformers
git checkout v2.7-release
# Modify L353 in SentenceTransformer.py to 'extra_features["prompt_length"] = tokenized_prompt["input_ids"].shape[-1]'
pip install -e .
```
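After reinstalling, re-running the Sentence-Transformers snippet above should produce scores matching the HuggingFace Transformers output shown earlier.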
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
TitanML/jina-v2-base-en-embed
TitanML
feature-extraction
[ "sentence-transformers", "pytorch", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "custom_code", "en", "dataset:allenai/c4", "arxiv:2108.12409", "arxiv:2310.19923", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "region:us" ]
1,713
1,713
26
0
--- datasets: - allenai/c4 language: en license: apache-2.0 tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb inference: false model-index: - name: jina-embedding-b-en-v2 results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 74.73134328358209 - type: ap value: 37.765427081831035 - type: f1 value: 68.79367444339518 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 88.544275 - type: ap value: 84.61328675662887 - type: f1 value: 88.51879035862375 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 45.263999999999996 - type: f1 value: 43.778759656699435 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 21.693 - type: map_at_10 value: 35.487 - type: map_at_100 value: 36.862 - type: map_at_1000 value: 36.872 - type: map_at_3 value: 30.049999999999997 - type: map_at_5 value: 32.966 - type: mrr_at_1 value: 21.977 - type: mrr_at_10 value: 35.565999999999995 - type: mrr_at_100 value: 36.948 - type: mrr_at_1000 value: 36.958 - type: mrr_at_3 value: 30.121 - type: mrr_at_5 value: 33.051 - type: ndcg_at_1 value: 21.693 - type: ndcg_at_10 value: 44.181 - type: ndcg_at_100 value: 49.982 - type: ndcg_at_1000 value: 50.233000000000004 - type: ndcg_at_3 value: 32.830999999999996 - type: ndcg_at_5 value: 38.080000000000005 - type: precision_at_1 value: 21.693 - type: precision_at_10 value: 7.248 - type: precision_at_100 value: 0.9769999999999999 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 13.632 - type: precision_at_5 value: 10.725 - type: recall_at_1 value: 21.693 - type: recall_at_10 value: 72.475 - type: recall_at_100 value: 97.653 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 40.896 - type: recall_at_5 value: 53.627 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 45.39242428696777 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 36.675626784714 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 62.247725694904034 - type: mrr value: 74.91359978894604 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 82.68003802970496 - type: cos_sim_spearman value: 81.23438110096286 - type: euclidean_pearson value: 81.87462986142582 - type: euclidean_spearman value: 81.23438110096286 - type: manhattan_pearson value: 81.61162566600755 - type: manhattan_spearman value: 81.11329400456184 - task: type: Classification dataset: name: 
MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.01298701298701 - type: f1 value: 83.31690714969382 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 37.050108150972086 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 30.15731442819715 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 31.391999999999996 - type: map_at_10 value: 42.597 - type: map_at_100 value: 44.07 - type: map_at_1000 value: 44.198 - type: map_at_3 value: 38.957 - type: map_at_5 value: 40.961 - type: mrr_at_1 value: 37.196 - type: mrr_at_10 value: 48.152 - type: mrr_at_100 value: 48.928 - type: mrr_at_1000 value: 48.964999999999996 - type: mrr_at_3 value: 45.446 - type: mrr_at_5 value: 47.205999999999996 - type: ndcg_at_1 value: 37.196 - type: ndcg_at_10 value: 49.089 - type: ndcg_at_100 value: 54.471000000000004 - type: ndcg_at_1000 value: 56.385 - type: ndcg_at_3 value: 43.699 - type: ndcg_at_5 value: 46.22 - type: precision_at_1 value: 37.196 - type: precision_at_10 value: 9.313 - type: precision_at_100 value: 1.478 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 20.839 - type: precision_at_5 value: 14.936 - type: recall_at_1 value: 31.391999999999996 - type: recall_at_10 value: 61.876 - type: recall_at_100 value: 84.214 - type: recall_at_1000 value: 95.985 - type: recall_at_3 value: 46.6 - type: recall_at_5 value: 53.588 - type: map_at_1 value: 29.083 - type: map_at_10 value: 38.812999999999995 - type: map_at_100 value: 40.053 - type: map_at_1000 value: 40.188 - type: map_at_3 value: 36.111 - type: map_at_5 value: 37.519000000000005 - type: mrr_at_1 value: 36.497 - type: mrr_at_10 value: 44.85 - type: mrr_at_100 value: 45.546 - type: mrr_at_1000 value: 45.593 - type: mrr_at_3 value: 42.686 - type: mrr_at_5 value: 43.909 - type: ndcg_at_1 value: 36.497 - type: ndcg_at_10 value: 44.443 - type: ndcg_at_100 value: 48.979 - type: ndcg_at_1000 value: 51.154999999999994 - type: ndcg_at_3 value: 40.660000000000004 - type: ndcg_at_5 value: 42.193000000000005 - type: precision_at_1 value: 36.497 - type: precision_at_10 value: 8.433 - type: precision_at_100 value: 1.369 - type: precision_at_1000 value: 0.185 - type: precision_at_3 value: 19.894000000000002 - type: precision_at_5 value: 13.873 - type: recall_at_1 value: 29.083 - type: recall_at_10 value: 54.313 - type: recall_at_100 value: 73.792 - type: recall_at_1000 value: 87.629 - type: recall_at_3 value: 42.257 - type: recall_at_5 value: 47.066 - type: map_at_1 value: 38.556000000000004 - type: map_at_10 value: 50.698 - type: map_at_100 value: 51.705 - type: map_at_1000 value: 51.768 - type: map_at_3 value: 47.848 - type: map_at_5 value: 49.358000000000004 - type: mrr_at_1 value: 43.95 - type: mrr_at_10 value: 54.191 - type: mrr_at_100 value: 54.852999999999994 - type: mrr_at_1000 value: 54.885 - type: mrr_at_3 value: 51.954 - type: mrr_at_5 value: 53.13 - type: ndcg_at_1 value: 43.95 - type: ndcg_at_10 value: 56.516 - type: ndcg_at_100 value: 60.477000000000004 - type: ndcg_at_1000 value: 61.746 - 
type: ndcg_at_3 value: 51.601 - type: ndcg_at_5 value: 53.795 - type: precision_at_1 value: 43.95 - type: precision_at_10 value: 9.009 - type: precision_at_100 value: 1.189 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 22.989 - type: precision_at_5 value: 15.473 - type: recall_at_1 value: 38.556000000000004 - type: recall_at_10 value: 70.159 - type: recall_at_100 value: 87.132 - type: recall_at_1000 value: 96.16 - type: recall_at_3 value: 56.906 - type: recall_at_5 value: 62.332 - type: map_at_1 value: 24.238 - type: map_at_10 value: 32.5 - type: map_at_100 value: 33.637 - type: map_at_1000 value: 33.719 - type: map_at_3 value: 30.026999999999997 - type: map_at_5 value: 31.555 - type: mrr_at_1 value: 26.328000000000003 - type: mrr_at_10 value: 34.44 - type: mrr_at_100 value: 35.455999999999996 - type: mrr_at_1000 value: 35.521 - type: mrr_at_3 value: 32.034 - type: mrr_at_5 value: 33.565 - type: ndcg_at_1 value: 26.328000000000003 - type: ndcg_at_10 value: 37.202 - type: ndcg_at_100 value: 42.728 - type: ndcg_at_1000 value: 44.792 - type: ndcg_at_3 value: 32.368 - type: ndcg_at_5 value: 35.008 - type: precision_at_1 value: 26.328000000000003 - type: precision_at_10 value: 5.7059999999999995 - type: precision_at_100 value: 0.8880000000000001 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 13.672 - type: precision_at_5 value: 9.74 - type: recall_at_1 value: 24.238 - type: recall_at_10 value: 49.829 - type: recall_at_100 value: 75.21 - type: recall_at_1000 value: 90.521 - type: recall_at_3 value: 36.867 - type: recall_at_5 value: 43.241 - type: map_at_1 value: 15.378 - type: map_at_10 value: 22.817999999999998 - type: map_at_100 value: 23.977999999999998 - type: map_at_1000 value: 24.108 - type: map_at_3 value: 20.719 - type: map_at_5 value: 21.889 - type: mrr_at_1 value: 19.03 - type: mrr_at_10 value: 27.022000000000002 - type: mrr_at_100 value: 28.011999999999997 - type: mrr_at_1000 value: 28.096 - type: mrr_at_3 value: 24.855 - type: mrr_at_5 value: 26.029999999999998 - type: ndcg_at_1 value: 19.03 - type: ndcg_at_10 value: 27.526 - type: ndcg_at_100 value: 33.040000000000006 - type: ndcg_at_1000 value: 36.187000000000005 - type: ndcg_at_3 value: 23.497 - type: ndcg_at_5 value: 25.334 - type: precision_at_1 value: 19.03 - type: precision_at_10 value: 4.963 - type: precision_at_100 value: 0.893 - type: precision_at_1000 value: 0.13 - type: precision_at_3 value: 11.360000000000001 - type: precision_at_5 value: 8.134 - type: recall_at_1 value: 15.378 - type: recall_at_10 value: 38.061 - type: recall_at_100 value: 61.754 - type: recall_at_1000 value: 84.259 - type: recall_at_3 value: 26.788 - type: recall_at_5 value: 31.326999999999998 - type: map_at_1 value: 27.511999999999997 - type: map_at_10 value: 37.429 - type: map_at_100 value: 38.818000000000005 - type: map_at_1000 value: 38.924 - type: map_at_3 value: 34.625 - type: map_at_5 value: 36.064 - type: mrr_at_1 value: 33.300999999999995 - type: mrr_at_10 value: 43.036 - type: mrr_at_100 value: 43.894 - type: mrr_at_1000 value: 43.936 - type: mrr_at_3 value: 40.825 - type: mrr_at_5 value: 42.028 - type: ndcg_at_1 value: 33.300999999999995 - type: ndcg_at_10 value: 43.229 - type: ndcg_at_100 value: 48.992000000000004 - type: ndcg_at_1000 value: 51.02100000000001 - type: ndcg_at_3 value: 38.794000000000004 - type: ndcg_at_5 value: 40.65 - type: precision_at_1 value: 33.300999999999995 - type: precision_at_10 value: 7.777000000000001 - type: precision_at_100 value: 1.269 - type: 
precision_at_1000 value: 0.163 - type: precision_at_3 value: 18.351 - type: precision_at_5 value: 12.762 - type: recall_at_1 value: 27.511999999999997 - type: recall_at_10 value: 54.788000000000004 - type: recall_at_100 value: 79.105 - type: recall_at_1000 value: 92.49199999999999 - type: recall_at_3 value: 41.924 - type: recall_at_5 value: 47.026 - type: map_at_1 value: 24.117 - type: map_at_10 value: 33.32 - type: map_at_100 value: 34.677 - type: map_at_1000 value: 34.78 - type: map_at_3 value: 30.233999999999998 - type: map_at_5 value: 31.668000000000003 - type: mrr_at_1 value: 29.566 - type: mrr_at_10 value: 38.244 - type: mrr_at_100 value: 39.245000000000005 - type: mrr_at_1000 value: 39.296 - type: mrr_at_3 value: 35.864000000000004 - type: mrr_at_5 value: 36.919999999999995 - type: ndcg_at_1 value: 29.566 - type: ndcg_at_10 value: 39.127 - type: ndcg_at_100 value: 44.989000000000004 - type: ndcg_at_1000 value: 47.189 - type: ndcg_at_3 value: 34.039 - type: ndcg_at_5 value: 35.744 - type: precision_at_1 value: 29.566 - type: precision_at_10 value: 7.385999999999999 - type: precision_at_100 value: 1.204 - type: precision_at_1000 value: 0.158 - type: precision_at_3 value: 16.286 - type: precision_at_5 value: 11.484 - type: recall_at_1 value: 24.117 - type: recall_at_10 value: 51.559999999999995 - type: recall_at_100 value: 77.104 - type: recall_at_1000 value: 91.79899999999999 - type: recall_at_3 value: 36.82 - type: recall_at_5 value: 41.453 - type: map_at_1 value: 25.17625 - type: map_at_10 value: 34.063916666666664 - type: map_at_100 value: 35.255500000000005 - type: map_at_1000 value: 35.37275 - type: map_at_3 value: 31.351666666666667 - type: map_at_5 value: 32.80608333333333 - type: mrr_at_1 value: 29.59783333333333 - type: mrr_at_10 value: 38.0925 - type: mrr_at_100 value: 38.957249999999995 - type: mrr_at_1000 value: 39.01608333333333 - type: mrr_at_3 value: 35.77625 - type: mrr_at_5 value: 37.04991666666667 - type: ndcg_at_1 value: 29.59783333333333 - type: ndcg_at_10 value: 39.343666666666664 - type: ndcg_at_100 value: 44.488249999999994 - type: ndcg_at_1000 value: 46.83358333333334 - type: ndcg_at_3 value: 34.69708333333333 - type: ndcg_at_5 value: 36.75075 - type: precision_at_1 value: 29.59783333333333 - type: precision_at_10 value: 6.884083333333332 - type: precision_at_100 value: 1.114 - type: precision_at_1000 value: 0.15108333333333332 - type: precision_at_3 value: 15.965250000000003 - type: precision_at_5 value: 11.246500000000001 - type: recall_at_1 value: 25.17625 - type: recall_at_10 value: 51.015999999999984 - type: recall_at_100 value: 73.60174999999998 - type: recall_at_1000 value: 89.849 - type: recall_at_3 value: 37.88399999999999 - type: recall_at_5 value: 43.24541666666666 - type: map_at_1 value: 24.537 - type: map_at_10 value: 31.081999999999997 - type: map_at_100 value: 32.042 - type: map_at_1000 value: 32.141 - type: map_at_3 value: 29.137 - type: map_at_5 value: 30.079 - type: mrr_at_1 value: 27.454 - type: mrr_at_10 value: 33.694 - type: mrr_at_100 value: 34.579 - type: mrr_at_1000 value: 34.649 - type: mrr_at_3 value: 32.004 - type: mrr_at_5 value: 32.794000000000004 - type: ndcg_at_1 value: 27.454 - type: ndcg_at_10 value: 34.915 - type: ndcg_at_100 value: 39.641 - type: ndcg_at_1000 value: 42.105 - type: ndcg_at_3 value: 31.276 - type: ndcg_at_5 value: 32.65 - type: precision_at_1 value: 27.454 - type: precision_at_10 value: 5.337 - type: precision_at_100 value: 0.8250000000000001 - type: precision_at_1000 value: 0.11199999999999999 - type: 
precision_at_3 value: 13.241 - type: precision_at_5 value: 8.895999999999999 - type: recall_at_1 value: 24.537 - type: recall_at_10 value: 44.324999999999996 - type: recall_at_100 value: 65.949 - type: recall_at_1000 value: 84.017 - type: recall_at_3 value: 33.857 - type: recall_at_5 value: 37.316 - type: map_at_1 value: 17.122 - type: map_at_10 value: 24.32 - type: map_at_100 value: 25.338 - type: map_at_1000 value: 25.462 - type: map_at_3 value: 22.064 - type: map_at_5 value: 23.322000000000003 - type: mrr_at_1 value: 20.647 - type: mrr_at_10 value: 27.858 - type: mrr_at_100 value: 28.743999999999996 - type: mrr_at_1000 value: 28.819 - type: mrr_at_3 value: 25.769 - type: mrr_at_5 value: 26.964 - type: ndcg_at_1 value: 20.647 - type: ndcg_at_10 value: 28.849999999999998 - type: ndcg_at_100 value: 33.849000000000004 - type: ndcg_at_1000 value: 36.802 - type: ndcg_at_3 value: 24.799 - type: ndcg_at_5 value: 26.682 - type: precision_at_1 value: 20.647 - type: precision_at_10 value: 5.2170000000000005 - type: precision_at_100 value: 0.906 - type: precision_at_1000 value: 0.134 - type: precision_at_3 value: 11.769 - type: precision_at_5 value: 8.486 - type: recall_at_1 value: 17.122 - type: recall_at_10 value: 38.999 - type: recall_at_100 value: 61.467000000000006 - type: recall_at_1000 value: 82.716 - type: recall_at_3 value: 27.601 - type: recall_at_5 value: 32.471 - type: map_at_1 value: 24.396 - type: map_at_10 value: 33.415 - type: map_at_100 value: 34.521 - type: map_at_1000 value: 34.631 - type: map_at_3 value: 30.703999999999997 - type: map_at_5 value: 32.166 - type: mrr_at_1 value: 28.825 - type: mrr_at_10 value: 37.397000000000006 - type: mrr_at_100 value: 38.286 - type: mrr_at_1000 value: 38.346000000000004 - type: mrr_at_3 value: 35.028 - type: mrr_at_5 value: 36.32 - type: ndcg_at_1 value: 28.825 - type: ndcg_at_10 value: 38.656 - type: ndcg_at_100 value: 43.856 - type: ndcg_at_1000 value: 46.31 - type: ndcg_at_3 value: 33.793 - type: ndcg_at_5 value: 35.909 - type: precision_at_1 value: 28.825 - type: precision_at_10 value: 6.567 - type: precision_at_100 value: 1.0330000000000001 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 15.516 - type: precision_at_5 value: 10.914 - type: recall_at_1 value: 24.396 - type: recall_at_10 value: 50.747 - type: recall_at_100 value: 73.477 - type: recall_at_1000 value: 90.801 - type: recall_at_3 value: 37.1 - type: recall_at_5 value: 42.589 - type: map_at_1 value: 25.072 - type: map_at_10 value: 34.307 - type: map_at_100 value: 35.725 - type: map_at_1000 value: 35.943999999999996 - type: map_at_3 value: 30.906 - type: map_at_5 value: 32.818000000000005 - type: mrr_at_1 value: 29.644 - type: mrr_at_10 value: 38.673 - type: mrr_at_100 value: 39.459 - type: mrr_at_1000 value: 39.527 - type: mrr_at_3 value: 35.771 - type: mrr_at_5 value: 37.332 - type: ndcg_at_1 value: 29.644 - type: ndcg_at_10 value: 40.548 - type: ndcg_at_100 value: 45.678999999999995 - type: ndcg_at_1000 value: 48.488 - type: ndcg_at_3 value: 34.887 - type: ndcg_at_5 value: 37.543 - type: precision_at_1 value: 29.644 - type: precision_at_10 value: 7.688000000000001 - type: precision_at_100 value: 1.482 - type: precision_at_1000 value: 0.23600000000000002 - type: precision_at_3 value: 16.206 - type: precision_at_5 value: 12.016 - type: recall_at_1 value: 25.072 - type: recall_at_10 value: 53.478 - type: recall_at_100 value: 76.07300000000001 - type: recall_at_1000 value: 93.884 - type: recall_at_3 value: 37.583 - type: recall_at_5 value: 44.464 - type: 
map_at_1 value: 20.712 - type: map_at_10 value: 27.467999999999996 - type: map_at_100 value: 28.502 - type: map_at_1000 value: 28.610000000000003 - type: map_at_3 value: 24.887999999999998 - type: map_at_5 value: 26.273999999999997 - type: mrr_at_1 value: 22.736 - type: mrr_at_10 value: 29.553 - type: mrr_at_100 value: 30.485 - type: mrr_at_1000 value: 30.56 - type: mrr_at_3 value: 27.078999999999997 - type: mrr_at_5 value: 28.401 - type: ndcg_at_1 value: 22.736 - type: ndcg_at_10 value: 32.023 - type: ndcg_at_100 value: 37.158 - type: ndcg_at_1000 value: 39.823 - type: ndcg_at_3 value: 26.951999999999998 - type: ndcg_at_5 value: 29.281000000000002 - type: precision_at_1 value: 22.736 - type: precision_at_10 value: 5.213 - type: precision_at_100 value: 0.832 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 11.459999999999999 - type: precision_at_5 value: 8.244 - type: recall_at_1 value: 20.712 - type: recall_at_10 value: 44.057 - type: recall_at_100 value: 67.944 - type: recall_at_1000 value: 87.925 - type: recall_at_3 value: 30.305 - type: recall_at_5 value: 36.071999999999996 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 10.181999999999999 - type: map_at_10 value: 16.66 - type: map_at_100 value: 18.273 - type: map_at_1000 value: 18.45 - type: map_at_3 value: 14.141 - type: map_at_5 value: 15.455 - type: mrr_at_1 value: 22.15 - type: mrr_at_10 value: 32.062000000000005 - type: mrr_at_100 value: 33.116 - type: mrr_at_1000 value: 33.168 - type: mrr_at_3 value: 28.827 - type: mrr_at_5 value: 30.892999999999997 - type: ndcg_at_1 value: 22.15 - type: ndcg_at_10 value: 23.532 - type: ndcg_at_100 value: 30.358 - type: ndcg_at_1000 value: 33.783 - type: ndcg_at_3 value: 19.222 - type: ndcg_at_5 value: 20.919999999999998 - type: precision_at_1 value: 22.15 - type: precision_at_10 value: 7.185999999999999 - type: precision_at_100 value: 1.433 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 13.941 - type: precision_at_5 value: 10.906 - type: recall_at_1 value: 10.181999999999999 - type: recall_at_10 value: 28.104000000000003 - type: recall_at_100 value: 51.998999999999995 - type: recall_at_1000 value: 71.311 - type: recall_at_3 value: 17.698 - type: recall_at_5 value: 22.262999999999998 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 6.669 - type: map_at_10 value: 15.552 - type: map_at_100 value: 21.865000000000002 - type: map_at_1000 value: 23.268 - type: map_at_3 value: 11.309 - type: map_at_5 value: 13.084000000000001 - type: mrr_at_1 value: 55.50000000000001 - type: mrr_at_10 value: 66.46600000000001 - type: mrr_at_100 value: 66.944 - type: mrr_at_1000 value: 66.956 - type: mrr_at_3 value: 64.542 - type: mrr_at_5 value: 65.717 - type: ndcg_at_1 value: 44.75 - type: ndcg_at_10 value: 35.049 - type: ndcg_at_100 value: 39.073 - type: ndcg_at_1000 value: 46.208 - type: ndcg_at_3 value: 39.525 - type: ndcg_at_5 value: 37.156 - type: precision_at_1 value: 55.50000000000001 - type: precision_at_10 value: 27.800000000000004 - type: precision_at_100 value: 9.013 - type: precision_at_1000 value: 1.8800000000000001 - type: precision_at_3 value: 42.667 - type: precision_at_5 value: 36.0 - type: recall_at_1 value: 6.669 - type: recall_at_10 value: 21.811 - type: recall_at_100 value: 45.112 - type: recall_at_1000 value: 67.806 - type: recall_at_3 value: 13.373 - 
type: recall_at_5 value: 16.615 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 48.769999999999996 - type: f1 value: 42.91448356376592 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 54.013 - type: map_at_10 value: 66.239 - type: map_at_100 value: 66.62599999999999 - type: map_at_1000 value: 66.644 - type: map_at_3 value: 63.965 - type: map_at_5 value: 65.45400000000001 - type: mrr_at_1 value: 58.221000000000004 - type: mrr_at_10 value: 70.43700000000001 - type: mrr_at_100 value: 70.744 - type: mrr_at_1000 value: 70.75099999999999 - type: mrr_at_3 value: 68.284 - type: mrr_at_5 value: 69.721 - type: ndcg_at_1 value: 58.221000000000004 - type: ndcg_at_10 value: 72.327 - type: ndcg_at_100 value: 73.953 - type: ndcg_at_1000 value: 74.312 - type: ndcg_at_3 value: 68.062 - type: ndcg_at_5 value: 70.56400000000001 - type: precision_at_1 value: 58.221000000000004 - type: precision_at_10 value: 9.521 - type: precision_at_100 value: 1.045 - type: precision_at_1000 value: 0.109 - type: precision_at_3 value: 27.348 - type: precision_at_5 value: 17.794999999999998 - type: recall_at_1 value: 54.013 - type: recall_at_10 value: 86.957 - type: recall_at_100 value: 93.911 - type: recall_at_1000 value: 96.38 - type: recall_at_3 value: 75.555 - type: recall_at_5 value: 81.671 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 21.254 - type: map_at_10 value: 33.723 - type: map_at_100 value: 35.574 - type: map_at_1000 value: 35.730000000000004 - type: map_at_3 value: 29.473 - type: map_at_5 value: 31.543 - type: mrr_at_1 value: 41.358 - type: mrr_at_10 value: 49.498 - type: mrr_at_100 value: 50.275999999999996 - type: mrr_at_1000 value: 50.308 - type: mrr_at_3 value: 47.016000000000005 - type: mrr_at_5 value: 48.336 - type: ndcg_at_1 value: 41.358 - type: ndcg_at_10 value: 41.579 - type: ndcg_at_100 value: 48.455 - type: ndcg_at_1000 value: 51.165000000000006 - type: ndcg_at_3 value: 37.681 - type: ndcg_at_5 value: 38.49 - type: precision_at_1 value: 41.358 - type: precision_at_10 value: 11.543000000000001 - type: precision_at_100 value: 1.87 - type: precision_at_1000 value: 0.23600000000000002 - type: precision_at_3 value: 24.743000000000002 - type: precision_at_5 value: 17.994 - type: recall_at_1 value: 21.254 - type: recall_at_10 value: 48.698 - type: recall_at_100 value: 74.588 - type: recall_at_1000 value: 91.00200000000001 - type: recall_at_3 value: 33.939 - type: recall_at_5 value: 39.367000000000004 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 35.922 - type: map_at_10 value: 52.32599999999999 - type: map_at_100 value: 53.18000000000001 - type: map_at_1000 value: 53.245 - type: map_at_3 value: 49.294 - type: map_at_5 value: 51.202999999999996 - type: mrr_at_1 value: 71.843 - type: mrr_at_10 value: 78.24600000000001 - type: mrr_at_100 value: 78.515 - type: mrr_at_1000 value: 78.527 - type: mrr_at_3 value: 77.17500000000001 - type: mrr_at_5 value: 77.852 - type: ndcg_at_1 value: 71.843 - type: ndcg_at_10 value: 61.379 - type: ndcg_at_100 value: 64.535 - type: ndcg_at_1000 value: 65.888 - type: ndcg_at_3 value: 56.958 - type: ndcg_at_5 value: 59.434 - type: precision_at_1 value: 
71.843 - type: precision_at_10 value: 12.686 - type: precision_at_100 value: 1.517 - type: precision_at_1000 value: 0.16999999999999998 - type: precision_at_3 value: 35.778 - type: precision_at_5 value: 23.422 - type: recall_at_1 value: 35.922 - type: recall_at_10 value: 63.43 - type: recall_at_100 value: 75.868 - type: recall_at_1000 value: 84.88900000000001 - type: recall_at_3 value: 53.666000000000004 - type: recall_at_5 value: 58.555 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 79.4408 - type: ap value: 73.52820871620366 - type: f1 value: 79.36240238685001 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 21.826999999999998 - type: map_at_10 value: 34.04 - type: map_at_100 value: 35.226 - type: map_at_1000 value: 35.275 - type: map_at_3 value: 30.165999999999997 - type: map_at_5 value: 32.318000000000005 - type: mrr_at_1 value: 22.464000000000002 - type: mrr_at_10 value: 34.631 - type: mrr_at_100 value: 35.752 - type: mrr_at_1000 value: 35.795 - type: mrr_at_3 value: 30.798 - type: mrr_at_5 value: 32.946999999999996 - type: ndcg_at_1 value: 22.464000000000002 - type: ndcg_at_10 value: 40.919 - type: ndcg_at_100 value: 46.632 - type: ndcg_at_1000 value: 47.833 - type: ndcg_at_3 value: 32.992 - type: ndcg_at_5 value: 36.834 - type: precision_at_1 value: 22.464000000000002 - type: precision_at_10 value: 6.494 - type: precision_at_100 value: 0.9369999999999999 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 14.021 - type: precision_at_5 value: 10.347000000000001 - type: recall_at_1 value: 21.826999999999998 - type: recall_at_10 value: 62.132 - type: recall_at_100 value: 88.55199999999999 - type: recall_at_1000 value: 97.707 - type: recall_at_3 value: 40.541 - type: recall_at_5 value: 49.739 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 95.68399452804377 - type: f1 value: 95.25490609832268 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 83.15321477428182 - type: f1 value: 60.35476439087966 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 71.92669804976462 - type: f1 value: 69.22815107207565 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.4855413584398 - type: f1 value: 72.92107516103387 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 32.412679360205544 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 28.09211869875204 - task: type: Reranking 
dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 30.540919056982545 - type: mrr value: 31.529904607063536 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.745 - type: map_at_10 value: 12.013 - type: map_at_100 value: 15.040000000000001 - type: map_at_1000 value: 16.427 - type: map_at_3 value: 8.841000000000001 - type: map_at_5 value: 10.289 - type: mrr_at_1 value: 45.201 - type: mrr_at_10 value: 53.483999999999995 - type: mrr_at_100 value: 54.20700000000001 - type: mrr_at_1000 value: 54.252 - type: mrr_at_3 value: 51.29 - type: mrr_at_5 value: 52.73 - type: ndcg_at_1 value: 43.808 - type: ndcg_at_10 value: 32.445 - type: ndcg_at_100 value: 30.031000000000002 - type: ndcg_at_1000 value: 39.007 - type: ndcg_at_3 value: 37.204 - type: ndcg_at_5 value: 35.07 - type: precision_at_1 value: 45.201 - type: precision_at_10 value: 23.684 - type: precision_at_100 value: 7.600999999999999 - type: precision_at_1000 value: 2.043 - type: precision_at_3 value: 33.953 - type: precision_at_5 value: 29.412 - type: recall_at_1 value: 5.745 - type: recall_at_10 value: 16.168 - type: recall_at_100 value: 30.875999999999998 - type: recall_at_1000 value: 62.686 - type: recall_at_3 value: 9.75 - type: recall_at_5 value: 12.413 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 37.828 - type: map_at_10 value: 53.239000000000004 - type: map_at_100 value: 54.035999999999994 - type: map_at_1000 value: 54.067 - type: map_at_3 value: 49.289 - type: map_at_5 value: 51.784 - type: mrr_at_1 value: 42.497 - type: mrr_at_10 value: 55.916999999999994 - type: mrr_at_100 value: 56.495 - type: mrr_at_1000 value: 56.516999999999996 - type: mrr_at_3 value: 52.800000000000004 - type: mrr_at_5 value: 54.722 - type: ndcg_at_1 value: 42.468 - type: ndcg_at_10 value: 60.437 - type: ndcg_at_100 value: 63.731 - type: ndcg_at_1000 value: 64.41799999999999 - type: ndcg_at_3 value: 53.230999999999995 - type: ndcg_at_5 value: 57.26 - type: precision_at_1 value: 42.468 - type: precision_at_10 value: 9.47 - type: precision_at_100 value: 1.1360000000000001 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 23.724999999999998 - type: precision_at_5 value: 16.593 - type: recall_at_1 value: 37.828 - type: recall_at_10 value: 79.538 - type: recall_at_100 value: 93.646 - type: recall_at_1000 value: 98.72999999999999 - type: recall_at_3 value: 61.134 - type: recall_at_5 value: 70.377 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.548 - type: map_at_10 value: 84.466 - type: map_at_100 value: 85.10600000000001 - type: map_at_1000 value: 85.123 - type: map_at_3 value: 81.57600000000001 - type: map_at_5 value: 83.399 - type: mrr_at_1 value: 81.24 - type: mrr_at_10 value: 87.457 - type: mrr_at_100 value: 87.574 - type: mrr_at_1000 value: 87.575 - type: mrr_at_3 value: 86.507 - type: mrr_at_5 value: 87.205 - type: ndcg_at_1 value: 81.25 - type: ndcg_at_10 value: 88.203 - type: ndcg_at_100 value: 89.457 - type: ndcg_at_1000 value: 89.563 - type: ndcg_at_3 value: 85.465 - type: ndcg_at_5 value: 87.007 - type: precision_at_1 value: 81.25 - type: precision_at_10 value: 13.373 - type: precision_at_100 value: 1.5270000000000001 - type: precision_at_1000 
value: 0.157 - type: precision_at_3 value: 37.417 - type: precision_at_5 value: 24.556 - type: recall_at_1 value: 70.548 - type: recall_at_10 value: 95.208 - type: recall_at_100 value: 99.514 - type: recall_at_1000 value: 99.988 - type: recall_at_3 value: 87.214 - type: recall_at_5 value: 91.696 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 53.04822095496839 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 60.30778476474675 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.692 - type: map_at_10 value: 11.766 - type: map_at_100 value: 13.904 - type: map_at_1000 value: 14.216999999999999 - type: map_at_3 value: 8.245 - type: map_at_5 value: 9.92 - type: mrr_at_1 value: 23.0 - type: mrr_at_10 value: 33.78 - type: mrr_at_100 value: 34.922 - type: mrr_at_1000 value: 34.973 - type: mrr_at_3 value: 30.2 - type: mrr_at_5 value: 32.565 - type: ndcg_at_1 value: 23.0 - type: ndcg_at_10 value: 19.863 - type: ndcg_at_100 value: 28.141 - type: ndcg_at_1000 value: 33.549 - type: ndcg_at_3 value: 18.434 - type: ndcg_at_5 value: 16.384 - type: precision_at_1 value: 23.0 - type: precision_at_10 value: 10.39 - type: precision_at_100 value: 2.235 - type: precision_at_1000 value: 0.35300000000000004 - type: precision_at_3 value: 17.133000000000003 - type: precision_at_5 value: 14.44 - type: recall_at_1 value: 4.692 - type: recall_at_10 value: 21.025 - type: recall_at_100 value: 45.324999999999996 - type: recall_at_1000 value: 71.675 - type: recall_at_3 value: 10.440000000000001 - type: recall_at_5 value: 14.64 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 84.96178184892842 - type: cos_sim_spearman value: 79.6487740813199 - type: euclidean_pearson value: 82.06661161625023 - type: euclidean_spearman value: 79.64876769031183 - type: manhattan_pearson value: 82.07061164575131 - type: manhattan_spearman value: 79.65197039464537 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 84.15305604100027 - type: cos_sim_spearman value: 74.27447427941591 - type: euclidean_pearson value: 80.52737337565307 - type: euclidean_spearman value: 74.27416077132192 - type: manhattan_pearson value: 80.53728571140387 - type: manhattan_spearman value: 74.28853605753457 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 83.44386080639279 - type: cos_sim_spearman value: 84.17947648159536 - type: euclidean_pearson value: 83.34145388129387 - type: euclidean_spearman value: 84.17947648159536 - type: manhattan_pearson value: 83.30699061927966 - type: manhattan_spearman value: 84.18125737380451 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 81.57392220985612 - type: cos_sim_spearman value: 78.80745014464101 
- type: euclidean_pearson value: 80.01660371487199 - type: euclidean_spearman value: 78.80741240102256 - type: manhattan_pearson value: 79.96810779507953 - type: manhattan_spearman value: 78.75600400119448 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.85421063026625 - type: cos_sim_spearman value: 87.55320285299192 - type: euclidean_pearson value: 86.69750143323517 - type: euclidean_spearman value: 87.55320284326378 - type: manhattan_pearson value: 86.63379169960379 - type: manhattan_spearman value: 87.4815029877984 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 84.31314130411842 - type: cos_sim_spearman value: 85.3489588181433 - type: euclidean_pearson value: 84.13240933463535 - type: euclidean_spearman value: 85.34902871403281 - type: manhattan_pearson value: 84.01183086503559 - type: manhattan_spearman value: 85.19316703166102 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 89.09979781689536 - type: cos_sim_spearman value: 88.87813323759015 - type: euclidean_pearson value: 88.65413031123792 - type: euclidean_spearman value: 88.87813323759015 - type: manhattan_pearson value: 88.61818758256024 - type: manhattan_spearman value: 88.81044100494604 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 62.30693258111531 - type: cos_sim_spearman value: 62.195516523251946 - type: euclidean_pearson value: 62.951283701049476 - type: euclidean_spearman value: 62.195516523251946 - type: manhattan_pearson value: 63.068322281439535 - type: manhattan_spearman value: 62.10621171028406 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.27092833763909 - type: cos_sim_spearman value: 84.84429717949759 - type: euclidean_pearson value: 84.8516966060792 - type: euclidean_spearman value: 84.84429717949759 - type: manhattan_pearson value: 84.82203139242881 - type: manhattan_spearman value: 84.8358503952945 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 83.10290863981409 - type: mrr value: 95.31168450286097 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 52.161 - type: map_at_10 value: 62.138000000000005 - type: map_at_100 value: 62.769 - type: map_at_1000 value: 62.812 - type: map_at_3 value: 59.111000000000004 - type: map_at_5 value: 60.995999999999995 - type: mrr_at_1 value: 55.333 - type: mrr_at_10 value: 63.504000000000005 - type: mrr_at_100 value: 64.036 - type: mrr_at_1000 value: 64.08 - type: mrr_at_3 value: 61.278 - type: mrr_at_5 value: 62.778 - type: ndcg_at_1 value: 55.333 - type: ndcg_at_10 value: 66.678 - type: ndcg_at_100 value: 69.415 - type: ndcg_at_1000 value: 70.453 - type: ndcg_at_3 value: 61.755 - type: ndcg_at_5 value: 64.546 - type: 
precision_at_1 value: 55.333 - type: precision_at_10 value: 9.033 - type: precision_at_100 value: 1.043 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 24.221999999999998 - type: precision_at_5 value: 16.333000000000002 - type: recall_at_1 value: 52.161 - type: recall_at_10 value: 79.156 - type: recall_at_100 value: 91.333 - type: recall_at_1000 value: 99.333 - type: recall_at_3 value: 66.43299999999999 - type: recall_at_5 value: 73.272 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.81287128712871 - type: cos_sim_ap value: 95.30034785910676 - type: cos_sim_f1 value: 90.28629856850716 - type: cos_sim_precision value: 92.36401673640168 - type: cos_sim_recall value: 88.3 - type: dot_accuracy value: 99.81287128712871 - type: dot_ap value: 95.30034785910676 - type: dot_f1 value: 90.28629856850716 - type: dot_precision value: 92.36401673640168 - type: dot_recall value: 88.3 - type: euclidean_accuracy value: 99.81287128712871 - type: euclidean_ap value: 95.30034785910676 - type: euclidean_f1 value: 90.28629856850716 - type: euclidean_precision value: 92.36401673640168 - type: euclidean_recall value: 88.3 - type: manhattan_accuracy value: 99.80990099009901 - type: manhattan_ap value: 95.26880751950654 - type: manhattan_f1 value: 90.22177419354838 - type: manhattan_precision value: 90.95528455284553 - type: manhattan_recall value: 89.5 - type: max_accuracy value: 99.81287128712871 - type: max_ap value: 95.30034785910676 - type: max_f1 value: 90.28629856850716 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 58.518662504351184 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 34.96168178378587 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 52.04862593471896 - type: mrr value: 52.97238402936932 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.092545236479946 - type: cos_sim_spearman value: 31.599851000175498 - type: dot_pearson value: 30.092542723901676 - type: dot_spearman value: 31.599851000175498 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.189 - type: map_at_10 value: 1.662 - type: map_at_100 value: 9.384 - type: map_at_1000 value: 22.669 - type: map_at_3 value: 0.5559999999999999 - type: map_at_5 value: 0.9039999999999999 - type: mrr_at_1 value: 68.0 - type: mrr_at_10 value: 81.01899999999999 - type: mrr_at_100 value: 81.01899999999999 - type: mrr_at_1000 value: 81.01899999999999 - type: mrr_at_3 value: 79.333 - type: mrr_at_5 value: 80.733 - type: ndcg_at_1 value: 63.0 - type: ndcg_at_10 value: 65.913 - type: ndcg_at_100 value: 51.895 - type: ndcg_at_1000 value: 46.967 - type: ndcg_at_3 value: 
65.49199999999999 - type: ndcg_at_5 value: 66.69699999999999 - type: precision_at_1 value: 68.0 - type: precision_at_10 value: 71.6 - type: precision_at_100 value: 53.66 - type: precision_at_1000 value: 21.124000000000002 - type: precision_at_3 value: 72.667 - type: precision_at_5 value: 74.0 - type: recall_at_1 value: 0.189 - type: recall_at_10 value: 1.913 - type: recall_at_100 value: 12.601999999999999 - type: recall_at_1000 value: 44.296 - type: recall_at_3 value: 0.605 - type: recall_at_5 value: 1.018 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.701 - type: map_at_10 value: 10.445 - type: map_at_100 value: 17.324 - type: map_at_1000 value: 19.161 - type: map_at_3 value: 5.497 - type: map_at_5 value: 7.278 - type: mrr_at_1 value: 30.612000000000002 - type: mrr_at_10 value: 45.534 - type: mrr_at_100 value: 45.792 - type: mrr_at_1000 value: 45.806999999999995 - type: mrr_at_3 value: 37.755 - type: mrr_at_5 value: 43.469 - type: ndcg_at_1 value: 26.531 - type: ndcg_at_10 value: 26.235000000000003 - type: ndcg_at_100 value: 39.17 - type: ndcg_at_1000 value: 51.038 - type: ndcg_at_3 value: 23.625 - type: ndcg_at_5 value: 24.338 - type: precision_at_1 value: 30.612000000000002 - type: precision_at_10 value: 24.285999999999998 - type: precision_at_100 value: 8.224 - type: precision_at_1000 value: 1.6179999999999999 - type: precision_at_3 value: 24.490000000000002 - type: precision_at_5 value: 24.898 - type: recall_at_1 value: 2.701 - type: recall_at_10 value: 17.997 - type: recall_at_100 value: 51.766999999999996 - type: recall_at_1000 value: 87.863 - type: recall_at_3 value: 6.295000000000001 - type: recall_at_5 value: 9.993 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 73.3474 - type: ap value: 15.393431414459924 - type: f1 value: 56.466681887882416 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 62.062818336163 - type: f1 value: 62.11230840463252 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 42.464892820845115 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.15962329379508 - type: cos_sim_ap value: 74.73674057919256 - type: cos_sim_f1 value: 68.81245642574947 - type: cos_sim_precision value: 61.48255813953488 - type: cos_sim_recall value: 78.12664907651715 - type: dot_accuracy value: 86.15962329379508 - type: dot_ap value: 74.7367634988281 - type: dot_f1 value: 68.81245642574947 - type: dot_precision value: 61.48255813953488 - type: dot_recall value: 78.12664907651715 - type: euclidean_accuracy value: 86.15962329379508 - type: euclidean_ap value: 74.7367761466634 - type: euclidean_f1 value: 68.81245642574947 - type: euclidean_precision value: 61.48255813953488 - type: euclidean_recall value: 78.12664907651715 - type: manhattan_accuracy value: 
86.21326816474935 - type: manhattan_ap value: 74.64416473733951 - type: manhattan_f1 value: 68.80924855491331 - type: manhattan_precision value: 61.23456790123457 - type: manhattan_recall value: 78.52242744063325 - type: max_accuracy value: 86.21326816474935 - type: max_ap value: 74.7367761466634 - type: max_f1 value: 68.81245642574947 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.97620988085536 - type: cos_sim_ap value: 86.08680845745758 - type: cos_sim_f1 value: 78.02793637114438 - type: cos_sim_precision value: 73.11082699683736 - type: cos_sim_recall value: 83.65414228518632 - type: dot_accuracy value: 88.97620988085536 - type: dot_ap value: 86.08681149437946 - type: dot_f1 value: 78.02793637114438 - type: dot_precision value: 73.11082699683736 - type: dot_recall value: 83.65414228518632 - type: euclidean_accuracy value: 88.97620988085536 - type: euclidean_ap value: 86.08681215460771 - type: euclidean_f1 value: 78.02793637114438 - type: euclidean_precision value: 73.11082699683736 - type: euclidean_recall value: 83.65414228518632 - type: manhattan_accuracy value: 88.88888888888889 - type: manhattan_ap value: 86.02916327562438 - type: manhattan_f1 value: 78.02063045516843 - type: manhattan_precision value: 73.38851947346994 - type: manhattan_recall value: 83.2768709578072 - type: max_accuracy value: 88.97620988085536 - type: max_ap value: 86.08681215460771 - type: max_f1 value: 78.02793637114438
---

<br><br>

<p align="center">
<img src="https://aeiljuispo.cloudimg.io/v7/https://cdn-uploads.huggingface.co/production/uploads/603763514de52ff951d89793/AFoybzd5lpBQXEBrQHuTt.png?w=200&h=200&f=face" alt="Finetuner logo: Finetuner helps you to create experiments in order to improve embeddings on search tasks. It accompanies you to deliver the last mile of performance-tuning for neural search applications." width="150px">
</p>

<p align="center">
<b>The text embedding set trained by <a href="https://jina.ai/"><b>Jina AI</b></a>.</b>
</p>

## Quick Start

The easiest way to start using `jina-embeddings-v2-base-en` is to use Jina AI's [Embedding API](https://jina.ai/embeddings/).

## Intended Usage & Model Info

`jina-embeddings-v2-base-en` is an English, monolingual **embedding model** supporting an **8192-token sequence length**. It is based on a BERT architecture (JinaBERT) that supports the symmetric bidirectional variant of [ALiBi](https://arxiv.org/abs/2108.12409) to allow longer sequence lengths. The backbone `jina-bert-v2-base-en` is pretrained on the C4 dataset. The model is further trained on Jina AI's collection of more than 400 million sentence pairs and hard negatives. These pairs were drawn from a variety of domains and carefully selected through a thorough cleaning process.

The embedding model was trained with a 512-token sequence length, but extrapolates to 8k tokens (or even longer) thanks to ALiBi. This makes the model useful for use cases that involve processing long documents, including long document retrieval, semantic textual similarity, text reranking, recommendation, and RAG- and LLM-based generative search.

With a standard size of 137 million parameters, the model enables fast inference while delivering better performance than our small model. A single GPU is recommended for inference.
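To make the long-context claim concrete, here is a minimal sketch of encoding a single long document. The synthetic document, the 8192-token cap, and the expected 768-dimensional output are illustrative assumptions; the `encode` helper ships with the model's custom code and is covered in detail in the Usage section below.

```python
from transformers import AutoModel

# trust_remote_code=True loads the JinaBERT implementation with ALiBi support.
model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en", trust_remote_code=True
)

# A synthetic "long document", far beyond the 512-token training length.
long_document = " ".join(["token"] * 6000)

# max_length caps the ALiBi-extrapolated context window at 8192 tokens.
embeddings = model.encode([long_document], max_length=8192)
print(embeddings.shape)  # expected: (1, 768) -- one fixed-size vector per input
```

Because ALiBi biases attention by relative distance rather than using learned position embeddings, the same weights trained at 512 tokens can attend over the full 8k window at inference time without additional fine-tuning.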
Additionally, we provide the following embedding models:

- [`jina-embeddings-v2-small-en`](https://huggingface.co/jinaai/jina-embeddings-v2-small-en): 33 million parameters.
- [`jina-embeddings-v2-base-en`](https://huggingface.co/jinaai/jina-embeddings-v2-base-en): 137 million parameters **(you are here)**.
- [`jina-embeddings-v2-base-zh`](https://huggingface.co/jinaai/jina-embeddings-v2-base-zh): Chinese-English bilingual embeddings.
- [`jina-embeddings-v2-base-de`](https://huggingface.co/jinaai/jina-embeddings-v2-base-de): German-English bilingual embeddings.
- [`jina-embeddings-v2-base-es`](https://huggingface.co/jinaai/jina-embeddings-v2-base-es): Spanish-English bilingual embeddings.

## Data & Parameters

See the Jina Embeddings V2 [technical report](https://arxiv.org/abs/2310.19923).

## Usage

**<details><summary>Please apply mean pooling when integrating the model.</summary>**
<p>

### Why mean pooling?

Mean pooling takes all token embeddings from the model output and averages them at the sentence/paragraph level. It has proven to be the most effective way to produce high-quality sentence embeddings. We offer an `encode` function that handles this for you. However, if you would like to do it without using the default `encode` function:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

sentences = ['How is the weather today?', 'What is the current weather like today?']

tokenizer = AutoTokenizer.from_pretrained('jinaai/jina-embeddings-v2-small-en')
model = AutoModel.from_pretrained('jinaai/jina-embeddings-v2-small-en', trust_remote_code=True)

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    model_output = model(**encoded_input)

embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)
```

</p>
</details>

You can use Jina Embedding models directly from the `transformers` package. First, make sure that you are logged into Hugging Face. You can use the huggingface-cli tool (after installing the `transformers` package) and pass your [Hugging Face access token](https://huggingface.co/docs/hub/security-tokens):

```bash
huggingface-cli login
```

Alternatively, you can provide the access token as an environment variable in the shell:

```bash
export HF_TOKEN="<your token here>"
```

or in Python:

```python
import os

os.environ['HF_TOKEN'] = "<your token here>"
```

Then, you can load and use the model via the `AutoModel` class:

```python
!pip install transformers
from transformers import AutoModel
from numpy.linalg import norm

cos_sim = lambda a,b: (a @ b.T) / (norm(a)*norm(b))
model = AutoModel.from_pretrained('jinaai/jina-embeddings-v2-base-en', trust_remote_code=True) # trust_remote_code is needed to use the encode method
embeddings = model.encode(['How is the weather today?', 'What is the current weather like today?'])
print(cos_sim(embeddings[0], embeddings[1]))
```

If you only want to handle shorter sequences, such as 2k tokens, pass the `max_length` parameter to the `encode` function:
```python
embeddings = model.encode(
    ['Very long ... document'],
    max_length=2048
)
```

As of its latest release (v2.3.0), `sentence-transformers` also supports Jina embeddings (please make sure that you are logged into Hugging Face as well):

```python
!pip install -U sentence-transformers
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

model = SentenceTransformer(
    "jinaai/jina-embeddings-v2-base-en", # switch to en/zh for English or Chinese
    trust_remote_code=True
)

# control your input sequence length up to 8192
model.max_seq_length = 1024

embeddings = model.encode([
    'How is the weather today?',
    'What is the current weather like today?'
])
print(cos_sim(embeddings[0], embeddings[1]))
```

## Alternatives to Using the Transformers (or Sentence-Transformers) Package

1. _Managed SaaS_: Get started with a free key on Jina AI's [Embedding API](https://jina.ai/embeddings/).
2. _Private and high-performance deployment_: Get started by picking from our suite of models and deploy them on [AWS Sagemaker](https://aws.amazon.com/marketplace/seller-profile?id=seller-stch2ludm6vgy).

## Use Jina Embeddings for RAG

According to the latest blog post from [LlamaIndex](https://blog.llamaindex.ai/boosting-rag-picking-the-best-embedding-reranker-models-42d079022e83),

> In summary, to achieve the peak performance in both hit rate and MRR, the combination of OpenAI or JinaAI-Base embeddings with the CohereRerank/bge-reranker-large reranker stands out.

<img src="https://miro.medium.com/v2/resize:fit:4800/format:webp/1*ZP2RVejCZovF3FDCg-Bx3A.png" width="780px">

A minimal sketch of this retrieve-then-rerank pattern appears just before the Citation section below.

## Plans

1. Bilingual embedding models supporting more European & Asian languages, including Spanish, French, Italian and Japanese.
2. Multimodal embedding models to enable multimodal RAG applications.
3. High-performance rerankers.

## Troubleshooting

**Loading of Model Code failed**

If you forgot to pass the `trust_remote_code=True` flag when calling `AutoModel.from_pretrained` or initializing the model via the `SentenceTransformer` class, you will receive an error that the model weights could not be initialized. This is caused by `transformers` falling back to creating a default BERT model, instead of a jina-embeddings model:

```bash
Some weights of the model checkpoint at jinaai/jina-embeddings-v2-base-en were not used when initializing BertModel: ['encoder.layer.2.mlp.layernorm.weight', 'encoder.layer.3.mlp.layernorm.weight', 'encoder.layer.10.mlp.wo.bias', 'encoder.layer.5.mlp.wo.bias', 'encoder.layer.2.mlp.layernorm.bias', 'encoder.layer.1.mlp.gated_layers.weight', 'encoder.layer.5.mlp.gated_layers.weight', 'encoder.layer.8.mlp.layernorm.bias', ...
```

**User is not logged into Hugging Face**

The model is only available under [gated access](https://huggingface.co/docs/hub/models-gated). This means you need to be logged into Hugging Face to load it. If you receive the following error, you need to provide an access token, either by using the huggingface-cli or by providing the token via an environment variable as described above:

```bash
OSError: jinaai/jina-embeddings-v2-base-en is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
```

## Contact

Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.
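As promised in the RAG section above, here is a minimal sketch of the retrieve-then-rerank pattern. The corpus, query, and top-k value are illustrative, and the final reranking step is deliberately left as a placeholder, since it would be performed by a separate cross-encoder such as bge-reranker-large.

```python
import numpy as np
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en", trust_remote_code=True
)

# Toy corpus and query, purely for illustration.
corpus = [
    "Jina AI trains open-source embedding models.",
    "The weather in Berlin is sunny today.",
    "ALiBi lets attention extrapolate to longer sequences.",
]
query = "Which company trains embedding models?"

doc_vecs = model.encode(corpus)
query_vec = model.encode([query])[0]

# Stage 1: dense retrieval by cosine similarity.
sims = doc_vecs @ query_vec / (
    np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
)
candidates = [corpus[i] for i in np.argsort(-sims)[:2]]

# Stage 2 (placeholder): pass `candidates` to a cross-encoder reranker
# such as bge-reranker-large to produce the final ordering.
print(candidates)
```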
## Citation

If you find Jina Embeddings useful in your research, please cite the following paper:

```
@misc{günther2023jina,
      title={Jina Embeddings 2: 8192-Token General-Purpose Text Embeddings for Long Documents},
      author={Michael Günther and Jackmin Ong and Isabelle Mohr and Alaeddine Abdessalem and Tanguy Abel and Mohammad Kalim Akram and Susana Guzman and Georgios Mastrapas and Saba Sturua and Bo Wang and Maximilian Werk and Nan Wang and Han Xiao},
      year={2023},
      eprint={2310.19923},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf
RichardErkhov
null
[ "gguf", "arxiv:2101.00027", "arxiv:2201.07311", "endpoints_compatible", "region:us" ]
1,730
1,730
87
0
---
{}
---

Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

pythia-2.8b-deduped-v0 - GGUF
- Model creator: https://huggingface.co/EleutherAI/
- Original model: https://huggingface.co/EleutherAI/pythia-2.8b-deduped-v0/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [pythia-2.8b-deduped-v0.Q2_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q2_K.gguf) | Q2_K | 1.01GB |
| [pythia-2.8b-deduped-v0.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q3_K_S.gguf) | Q3_K_S | 1.16GB |
| [pythia-2.8b-deduped-v0.Q3_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q3_K.gguf) | Q3_K | 1.38GB |
| [pythia-2.8b-deduped-v0.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q3_K_M.gguf) | Q3_K_M | 1.38GB |
| [pythia-2.8b-deduped-v0.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q3_K_L.gguf) | Q3_K_L | 1.49GB |
| [pythia-2.8b-deduped-v0.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.IQ4_XS.gguf) | IQ4_XS | 1.43GB |
| [pythia-2.8b-deduped-v0.Q4_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q4_0.gguf) | Q4_0 | 1.49GB |
| [pythia-2.8b-deduped-v0.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.IQ4_NL.gguf) | IQ4_NL | 1.5GB |
| [pythia-2.8b-deduped-v0.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q4_K_S.gguf) | Q4_K_S | 1.5GB |
| [pythia-2.8b-deduped-v0.Q4_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q4_K.gguf) | Q4_K | 1.66GB |
| [pythia-2.8b-deduped-v0.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q4_K_M.gguf) | Q4_K_M | 1.66GB |
| [pythia-2.8b-deduped-v0.Q4_1.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q4_1.gguf) | Q4_1 | 1.64GB |
| [pythia-2.8b-deduped-v0.Q5_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q5_0.gguf) | Q5_0 | 1.8GB |
| [pythia-2.8b-deduped-v0.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q5_K_S.gguf) | Q5_K_S | 1.8GB |
| [pythia-2.8b-deduped-v0.Q5_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q5_K.gguf) | Q5_K | 1.93GB |
| [pythia-2.8b-deduped-v0.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q5_K_M.gguf) | Q5_K_M | 1.93GB |
| [pythia-2.8b-deduped-v0.Q5_1.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q5_1.gguf) | Q5_1 | 1.95GB |
| [pythia-2.8b-deduped-v0.Q6_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q6_K.gguf) | Q6_K | 2.13GB |
| [pythia-2.8b-deduped-v0.Q8_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf/blob/main/pythia-2.8b-deduped-v0.Q8_0.gguf) | Q8_0 | 2.75GB |

Original model description:

---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
license: apache-2.0
datasets:
- EleutherAI/the_pile_deduplicated
---

The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research. It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. All Pythia models are available [on Hugging Face](https://huggingface.co/models?other=pythia).

The Pythia model suite was deliberately designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites.

Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts.

## Pythia-2.8B-deduped

### Model Details

- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).

<figure>

| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |

<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters.
“Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption>
</figure>

### Uses and Limitations

#### Intended Use

The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. To enable the study of how language models change over the course of training, we provide 143 evenly spaced intermediate checkpoints per model. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model.

You may also further fine-tune and adapt Pythia-2.8B-deduped for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-2.8B-deduped as a basis for your fine-tuned model, please conduct your own risk and bias assessment.

#### Out-of-scope use

The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions.

Pythia models are English-language only, and are not suitable for translation or generating text in other languages.

Pythia-2.8B-deduped has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose or commercial chatbots. This means Pythia-2.8B-deduped will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “understand” human instructions.

#### Limitations and biases

The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text; the sketch at the end of this section shows what this next-token distribution looks like in practice. Never rely on Pythia-2.8B-deduped to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-2.8B-deduped may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting them to other people. Please inform your audience that the text was generated by Pythia-2.8B-deduped.
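To make the next-token framing above concrete, the following sketch inspects the raw next-token distribution instead of sampling from it. The prompt is arbitrary, and the 70M deduped model is used purely to keep the example light; any Pythia checkpoint behaves the same way.

```python
import torch
from transformers import AutoTokenizer, GPTNeoXForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m-deduped")
model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/pythia-70m-deduped")

inputs = tokenizer("The capital of France is", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence, vocab)

# Probability distribution over the next token, given the prompt.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id.item())!r}: {prob.item():.3f}")
```

The top entries are simply the statistically likeliest continuations under the training distribution; nothing in this computation checks them for factual accuracy.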
### Quickstart

Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```

Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia).

### Training

#### Training data

Pythia-2.8B-deduped was trained on the Pile **after the dataset has been globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).

#### Training procedure

All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.

All *Pythia* models trained for the equivalent of 143,000 steps at a batch size of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. The models listed with a 4M-token batch size were originally trained for 71,500 steps instead, with checkpoints every 500 steps. Those checkpoints are renamed on Hugging Face for consistency with the 2M-batch models, so `step1000` is the first saved checkpoint for `pythia-1.4b` (corresponding to step 500 in training), while `step1000` is likewise the first saved `pythia-6.9b` checkpoint (corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
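The arithmetic behind these checkpoint and step counts can be verified directly; a small sanity check using only the numbers quoted above:

```python
# All values come from this card's Training procedure section.
tokens_total = 299_892_736_000
tokens_per_checkpoint = 2_097_152_000
batch_tokens = 2_097_152  # the 2M-token batch size

print(tokens_total / tokens_per_checkpoint)  # 143.0   -> 143 checkpoints
print(tokens_total / batch_tokens)           # 143000.0 -> steps at a 2M batch
print(tokens_total / (2 * batch_tokens))     # 71500.0  -> steps at a 4M batch
```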
### Evaluations

All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM.

<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>

<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>

<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>

<details>
<summary>AI2 Reasoning Challenge – Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>

<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>

### Naming convention and parameter count

*Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.

<figure style="width:32em">

| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |

</figure>
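The original Pythia model card ends here. As a usage note for the quantized files listed at the top of this card: GGUF files are intended to be run with llama.cpp or one of its bindings. Below is a minimal sketch using the `llama-cpp-python` bindings together with `huggingface_hub`; the choice of the Q4_K_M file, the context size, and the prompt are illustrative assumptions, and any file from the table above works the same way.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one of the quantized files listed above. Q4_K_M is a common
# quality/size trade-off; smaller quants (e.g. Q2_K) save memory at the
# cost of output quality, while Q8_0 stays closest to the original weights.
gguf_path = hf_hub_download(
    repo_id="RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-v0-gguf",
    filename="pythia-2.8b-deduped-v0.Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=2048)
output = llm("Hello, I am", max_tokens=32)
print(output["choices"][0]["text"])
```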
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
tomaarsen/mpnet-base-natural-questions-icl
tomaarsen
sentence-similarity
[ "sentence-transformers", "safetensors", "mpnet", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:100231", "loss:ImprovedContrastiveLoss", "en", "dataset:sentence-transformers/natural-questions", "arxiv:1908.10084", "base_model:microsoft/mpnet-base", "base_model:finetune:microsoft/mpnet-base", "license:apache-2.0", "model-index", "co2_eq_emissions", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,719
1,720
4
0
--- base_model: microsoft/mpnet-base datasets: - sentence-transformers/natural-questions language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 - dot_accuracy@1 - dot_accuracy@3 - dot_accuracy@5 - dot_accuracy@10 - dot_precision@1 - dot_precision@3 - dot_precision@5 - dot_precision@10 - dot_recall@1 - dot_recall@3 - dot_recall@5 - dot_recall@10 - dot_ndcg@10 - dot_mrr@10 - dot_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:100231 - loss:ImprovedContrastiveLoss widget: - source_sentence: when did the british leave new york city sentences: - Golden State Warriors The Golden State Warriors are an American professional basketball team based in Oakland, California. The Warriors compete in the National Basketball Association (NBA) as a member of the league's Western Conference Pacific Division. The Warriors play their home games at the Oracle Arena in Oakland. The Warriors have reached nine NBA Finals, winning five NBA championships in 1947,[b] 1956, 1975, 2015 and 2017. Golden State's five NBA championships are tied for fourth-most in NBA history with the San Antonio Spurs, and behind only the Boston Celtics (17), Los Angeles Lakers (16) and Chicago Bulls (6). As of 2017, the Warriors are the third most valuable NBA franchise according to Forbes, with an estimated value of $2.6 billion.[6] - Evacuation Day (New York) Evacuation Day on November 25 marks the day in 1783 when British troops departed from New York City on Manhattan Island, after the end of the American Revolutionary War. After this British Army evacuation, General George Washington triumphantly led the Continental Army from his former headquarters, north of the city, across the Harlem River south down Manhattan through the town to The Battery at the foot of Broadway.[1] - Biochemical oxygen demand BOD can be used as a gauge of the effectiveness of wastewater treatment plants. It is listed as a conventional pollutant in the U.S. Clean Water Act.[2] - source_sentence: what is the newest generation of the ipad sentences: - Alex Karev Alex is fired by Dr. Lebackes when Maggie Pierce accidentally reveals to him that Karev was thinking about leaving the job. Webber recommended Bailey to fill Yang's board seat after she left, so Bailey and Alex fight over the chair. They both make presentations to the board and eventually Bailey wins, with a unanimous vote in her favor. He is hired back as an attending Peds surgeon and takes over full-time as Arizona pursues a fellowship with Dr. Herman. Alex continues to date Jo and his friendship with Meredith grows stronger than ever, with him taking on the role of her new person. When Derek dies and Meredith runs away, Alex is upset by her leaving without telling him where she went and calls her everyday. Eventually she calls him, tells him she is okay, and to stop calling. When she goes into labor and gives birth to Ellis Shepherd, Alex goes to see her since he is her emergency contact. He brings Meredith and her kids back to her house. She asks to move back in with him in her old house. Alex sells Meredith back the house and he and Jo rent a loft. 
- List of presidents of the United States by age The median age upon accession to the presidency is 55 years and 3 months. This is how old Lyndon B. Johnson was at the time of his inauguration. The youngest person to assume the office was Theodore Roosevelt, who became president at the age of 42 years, 322 days, following William McKinley's assassination; the oldest was Donald Trump, who was 70 years, 220 days old at his inauguration. The youngest person to be elected president was John F. Kennedy, at 43 years, 163 days of age on election day; the oldest was Ronald Reagan, who was 73 years, 274 days old at the time of his election to a second term. - iPad (2018) The iPad (officially sixth-generation iPad) is a 9.7-inch (25cm) tablet computer designed, developed, and marketed by Apple Inc. It was announced on March 27, 2018 during an education-focused event in Chicago and it is a revision of the 2017 model, upgraded with the Apple A10 Fusion SoC and support for styluses such as Apple Pencil.[2] The iPad is marketed towards educators and schools. - source_sentence: what is the average speed of passenger airplane sentences: - Fixed exchange-rate system In the 21st century, the currencies associated with large economies typically do not fix or peg exchange rates to other currencies. The last large economy to use a fixed exchange rate system was the People's Republic of China which, in July 2005, adopted a slightly more flexible exchange rate system called a managed exchange rate.[2] The European Exchange Rate Mechanism is also used on a temporary basis to establish a final conversion rate against the Euro (€) from the local currencies of countries joining the Eurozone. - Tenth Doctor The Tenth Doctor is an incarnation of the Doctor, the protagonist of the BBC science fiction television programme Doctor Who, who is played by David Tennant in three series as well as nine specials. As with previous incarnations of the Doctor, the character has also appeared in other Doctor Who spin-offs. In the programme's narrative, the Doctor is a centuries-old Time Lord alien from the planet Gallifrey who travels in time in his TARDIS, frequently with companions. When the Doctor is critically injured beyond medical repair, he can regenerate his body; in doing so, his physical appearance and personality change, and a new actor assumes the role. Tennant's portrayal of the Doctor is of an outwardly charismatic and charming adventurer whose likable and easygoing attitude can quickly turn to righteous fury when provoked. - Cruise (aeronautics) The typical cruising airspeed for a long-distance commercial passenger aircraft is approximately 475–500 knots (878–926 km/h; 546–575 mph). - source_sentence: when is cars three going to be released sentences: - Benedict's reagent The color of the obtained precipitate gives an idea about the quantity of sugar present in the solution, hence the test is semi-quantitative. A greenish precipitate indicates about 0.5 g% concentration; yellow precipitate indicates 1 g% concentration; orange indicates 1.5 g% and red indicates 2 g% or higher concentration. - Cars 3 The film was released on June 16, 2017, has grossed over $362 million worldwide and received generally positive reviews, with many critics considering it an improvement over its predecessor, as well as praising its emotional story and animation.[7] - Sleeping Beauty At the christening of a king and queen's long-wished-for child, seven good fairies are invited to be godmothers to the infant princess. 
The fairies attend the banquet at the palace. Each fairy is presented with a golden plate and drinking cups adorned with jewels. Soon after, an old fairy enters the palace and is seated with a plate of fine china and a crystal drinking glass. This old fairy is overlooked because she has been within a tower for many years and everyone had believed her to be deceased. Six of the other seven fairies then offer their gifts of beauty, wit, grace, dance, song, and goodness to the infant princess. The evil fairy is very angry about having been forgotten, and as her gift, enchants the infant princess so that she will one day prick her finger on a spindle of a spinning wheel and die. The seventh fairy, who hasn't yet given her gift, attempts to reverse the evil fairy's curse. However, she can only do so partially. Instead of dying, the Princess will fall into a deep sleep for 100 years and be awakened by a kiss from a king's son. - source_sentence: who was ancient china's main enemy that lived to the north sentences: - Betty Lynn Elizabeth Ann Theresa "Betty" Lynn[1] (born August 29, 1926) is a former American actress. She is best known for her role as Thelma Lou, Deputy Barney Fife's girlfriend, on The Andy Griffith Show. - Sampath Bank Sampath Bank PLC is a licensed commercial bank incorporated in Sri Lanka in 1986 with 229 branches and 373 ATMs island wide. It has won the "Bank of the Year" award by "The Banker" of Financial Times Limited – London, for the second consecutive year and the "National Business Excellence Awards 2010".[citation needed] It has become the third largest private sector bank in Sri Lanka with Rs. 453 billion in deposits as of 30 June 2016.[1] - 'Sui dynasty The Sui Dynasty (Chinese: 隋朝; pinyin: Suí cháo) was a short-lived imperial dynasty of China of pivotal significance. The Sui unified the Northern and Southern dynasties and reinstalled the rule of ethnic Han Chinese in the entirety of China proper, along with sinicization of former nomadic ethnic minorities (the Five Barbarians) within its territory. It was succeeded by the Tang dynasty, which largely inherited its foundation.' 
co2_eq_emissions: emissions: 171.00505800984172 energy_consumed: 0.4399387140015789 source: codecarbon training_type: fine-tuning on_cloud: false cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K ram_total_size: 31.777088165283203 hours_used: 1.139 hardware_used: 1 x NVIDIA GeForce RTX 3090 model-index: - name: MPNet base trained on Natural Questions pairs results: - task: type: information-retrieval name: Information Retrieval dataset: name: natural questions dev type: natural-questions-dev metrics: - type: cosine_accuracy@1 value: 0.5886032645880168 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.8148763561724172 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.8832958655067931 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9410614798162448 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.5886032645880168 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.27162545205747235 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.17665917310135862 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09410614798162449 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.5886032645880168 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.8148763561724172 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.8832958655067931 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9410614798162448 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.769304304207993 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.7136417796519368 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.7163262351468975 name: Cosine Map@100 - type: dot_accuracy@1 value: 0.5153943896002345 name: Dot Accuracy@1 - type: dot_accuracy@3 value: 0.7485094321180725 name: Dot Accuracy@3 - type: dot_accuracy@5 value: 0.8219137914182387 name: Dot Accuracy@5 - type: dot_accuracy@10 value: 0.8932655654383735 name: Dot Accuracy@10 - type: dot_precision@1 value: 0.5153943896002345 name: Dot Precision@1 - type: dot_precision@3 value: 0.2495031440393575 name: Dot Precision@3 - type: dot_precision@5 value: 0.16438275828364773 name: Dot Precision@5 - type: dot_precision@10 value: 0.08932655654383737 name: Dot Precision@10 - type: dot_recall@1 value: 0.5153943896002345 name: Dot Recall@1 - type: dot_recall@3 value: 0.7485094321180725 name: Dot Recall@3 - type: dot_recall@5 value: 0.8219137914182387 name: Dot Recall@5 - type: dot_recall@10 value: 0.8932655654383735 name: Dot Recall@10 - type: dot_ndcg@10 value: 0.7056782708639685 name: Dot Ndcg@10 - type: dot_mrr@10 value: 0.6453053511503243 name: Dot Mrr@10 - type: dot_map@100 value: 0.6498747716288641 name: Dot Map@100 --- # MPNet base trained on Natural Questions pairs This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
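Because the model was trained on Natural Questions query–answer pairs, a natural use is ranking candidate passages for a question. Below is a minimal semantic-search sketch; the query is taken from the widget examples above, and the candidate passages are illustrative paraphrases rather than exact training data.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("tomaarsen/mpnet-base-natural-questions-icl")

# Illustrative query (from the widget examples) and candidate passages
query = "when did the british leave new york city"
passages = [
    "Evacuation Day on November 25 marks the day in 1783 when British troops departed from New York City.",
    "The Golden State Warriors are an American professional basketball team based in Oakland, California.",
]

# Encode and rank the passages by cosine similarity to the query
query_emb = model.encode([query])
passage_embs = model.encode(passages)
scores = model.similarity(query_emb, passage_embs)[0]

best = scores.argmax().item()
print(passages[best], scores[best].item())
```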
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity - **Training Dataset:** - [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("tomaarsen/mpnet-base-natural-questions-icl") # Run inference sentences = [ "who was ancient china's main enemy that lived to the north", 'Sui dynasty The Sui Dynasty (Chinese: 隋朝; pinyin: Suí cháo) was a short-lived imperial dynasty of China of pivotal significance. The Sui unified the Northern and Southern dynasties and reinstalled the rule of ethnic Han Chinese in the entirety of China proper, along with sinicization of former nomadic ethnic minorities (the Five Barbarians) within its territory. It was succeeded by the Tang dynasty, which largely inherited its foundation.', 'Sampath Bank Sampath Bank PLC is a licensed commercial bank incorporated in Sri Lanka in 1986 with 229 branches and 373 ATMs island wide. It has won the "Bank of the Year" award by "The Banker" of Financial Times Limited – London, for the second consecutive year and the "National Business Excellence Awards 2010".[citation needed] It has become the third largest private sector bank in Sri Lanka with Rs. 453 billion in deposits as of 30 June 2016.[1]', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset.
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `natural-questions-dev` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.5886 | | cosine_accuracy@3 | 0.8149 | | cosine_accuracy@5 | 0.8833 | | cosine_accuracy@10 | 0.9411 | | cosine_precision@1 | 0.5886 | | cosine_precision@3 | 0.2716 | | cosine_precision@5 | 0.1767 | | cosine_precision@10 | 0.0941 | | cosine_recall@1 | 0.5886 | | cosine_recall@3 | 0.8149 | | cosine_recall@5 | 0.8833 | | cosine_recall@10 | 0.9411 | | cosine_ndcg@10 | 0.7693 | | cosine_mrr@10 | 0.7136 | | **cosine_map@100** | **0.7163** | | dot_accuracy@1 | 0.5154 | | dot_accuracy@3 | 0.7485 | | dot_accuracy@5 | 0.8219 | | dot_accuracy@10 | 0.8933 | | dot_precision@1 | 0.5154 | | dot_precision@3 | 0.2495 | | dot_precision@5 | 0.1644 | | dot_precision@10 | 0.0893 | | dot_recall@1 | 0.5154 | | dot_recall@3 | 0.7485 | | dot_recall@5 | 0.8219 | | dot_recall@10 | 0.8933 | | dot_ndcg@10 | 0.7057 | | dot_mrr@10 | 0.6453 | | dot_map@100 | 0.6499 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### natural-questions * Dataset: [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) * Size: 100,231 training samples * Columns: <code>query</code> and <code>answer</code> * Approximate statistics based on the first 1000 samples: | | query | answer | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 11.74 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 135.66 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | query | answer | 
|:----------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>when did richmond last play in a preliminary final</code> | <code>Richmond Football Club Richmond began 2017 with 5 straight wins, a feat it had not achieved since 1995. A series of close losses hampered the Tigers throughout the middle of the season, including a 5-point loss to the Western Bulldogs, 2-point loss to Fremantle, and a 3-point loss to the Giants. Richmond ended the season strongly with convincing victories over Fremantle and St Kilda in the final two rounds, elevating the club to 3rd on the ladder. Richmond's first final of the season against the Cats at the MCG attracted a record qualifying final crowd of 95,028; the Tigers won by 51 points. Having advanced to the first preliminary finals for the first time since 2001, Richmond defeated Greater Western Sydney by 36 points in front of a crowd of 94,258 to progress to the Grand Final against Adelaide, their first Grand Final appearance since 1982. The attendance was 100,021, the largest crowd to a grand final since 1986. The Crows led at quarter time and led by as many as 13, but the Tigers took over the game as it progressed and scored seven straight goals at one point. They eventually would win by 48 points – 16.12 (108) to Adelaide's 8.12 (60) – to end their 37-year flag drought.[22] Dustin Martin also became the first player to win a Premiership medal, the Brownlow Medal and the Norm Smith Medal in the same season, while Damien Hardwick was named AFL Coaches Association Coach of the Year. 
Richmond's jump from 13th to premiers also marked the biggest jump from one AFL season to the next.</code> | | <code>who sang what in the world's come over you</code> | <code>Jack Scott (singer) At the beginning of 1960, Scott again changed record labels, this time to Top Rank Records.[1] He then recorded four Billboard Hot 100 hits – "What in the World's Come Over You" (#5), "Burning Bridges" (#3) b/w "Oh Little One" (#34), and "It Only Happened Yesterday" (#38).[1] "What in the World's Come Over You" was Scott's second gold disc winner.[6] Scott continued to record and perform during the 1960s and 1970s.[1] His song "You're Just Gettin' Better" reached the country charts in 1974.[1] In May 1977, Scott recorded a Peel session for BBC Radio 1 disc jockey, John Peel.</code> | | <code>who produces the most wool in the world</code> | <code>Wool Global wool production is about 2 million tonnes per year, of which 60% goes into apparel. Wool comprises ca 3% of the global textile market, but its value is higher owing to dying and other modifications of the material.[1] Australia is a leading producer of wool which is mostly from Merino sheep but has been eclipsed by China in terms of total weight.[30] New Zealand (2016) is the third-largest producer of wool, and the largest producer of crossbred wool. Breeds such as Lincoln, Romney, Drysdale, and Elliotdale produce coarser fibers, and wool from these sheep is usually used for making carpets.</code> | * Loss: <code>__main__.ImprovedContrastiveLoss</code> with these parameters: ```json { "temperature": 0.01 } ``` ### Evaluation Dataset #### natural-questions * Dataset: [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) * Size: 100,231 evaluation samples * Columns: <code>query</code> and <code>answer</code> * Approximate statistics based on the first 1000 samples: | | query | answer | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 11.79 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 142.78 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | query | answer | 
|:--------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>who betrayed siraj ud daula in the battle of plassey in 1757</code> | <code>Siraj ud-Daulah The Battle of Plassey (or Palashi) is widely considered the turning point in the history of the subcontinent, and opened the way to eventual British domination. After Siraj-ud-Daulah's conquest of Calcutta, the British sent fresh troops from Madras to recapture the fort and avenge the attack. A retreating Siraj-ud-Daulah met the British at Plassey. He had to make camp 27 miles away from Murshidabad. On 23 June 1757 Siraj-ud-Daulah called on Mir Jafar because he was saddened by the sudden fall of Mir Mardan who was a very dear companion of Siraj in battles. The Nawab asked for help from Mir Jafar. Mir Jafar advised Siraj to retreat for that day. The Nawab made the blunder in giving the order to stop the fight. Following his command, the soldiers of the Nawab were returning to their camps. At that time, Robert Clive attacked the soldiers with his army. At such a sudden attack, the army of Siraj became indisciplined and could think of no way to fight. So all fled away in such a situation. Betrayed by a conspiracy plotted by Jagat Seth, Mir Jafar, Krishna Chandra, Omichund etc., he lost the battle and had to escape. He went first to Murshidabad and then to Patna by boat, but was eventually arrested by Mir Jafar's soldiers.</code> | | <code>what is the meaning of single malt whisky</code> | <code>Single malt whisky Single malt whisky is malt whisky from a single distillery, that is, whisky distilled from fermented mash made exclusively with malted grain (usually barley), as distinguished from unmalted grain.</code> | | <code>when is despicable me 3 going to release</code> | <code>Despicable Me 3 Despicable Me 3 premiered on June 14, 2017, at the Annecy International Animated Film Festival, and was released in the United States on June 30, 2017, by Universal Pictures in 3D, RealD 3D, Dolby Cinema, and IMAX 3D. The film received mixed reviews from critics[7] and has grossed over $1 billion worldwide, making it the third highest-grossing film of 2017, the fifth highest-grossing animated film of all time and the 28th highest-grossing overall. 
It is Illumination's second film to gross over $1 billion, after Minions in 2015, becoming the first ever animated franchise to do so.</code> | * Loss: <code>__main__.ImprovedContrastiveLoss</code> with these parameters: ```json { "temperature": 0.01 } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 32 - `learning_rate`: 2e-05 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 - `bf16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 32 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - 
`split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | loss | natural-questions-dev_cosine_map@100 | |:------:|:----:|:-------------:|:------:|:------------------------------------:| | 0 | 0 | - | - | 0.1228 | | 0.0004 | 1 | 12.7798 | - | - | | 0.0355 | 100 | 3.9819 | 1.0786 | 0.5069 | | 0.0711 | 200 | 0.9481 | 0.8211 | 0.6407 | | 0.1066 | 300 | 0.8286 | 0.8080 | 0.6565 | | 0.1422 | 400 | 0.8069 | 0.7917 | 0.6608 | | 0.1777 | 500 | 0.8148 | 0.7781 | 0.6778 | | 0.2133 | 600 | 0.7887 | 0.7719 | 0.6790 | | 0.2488 | 700 | 0.7866 | 0.7651 | 0.6817 | | 0.2844 | 800 | 0.7848 | 0.7768 | 0.6836 | | 0.3199 | 900 | 0.7702 | 0.7628 | 0.6863 | | 0.3555 | 1000 | 0.7774 | 0.7558 | 0.6987 | | 0.3910 | 1100 | 0.7537 | 0.7630 | 0.6871 | | 0.4266 | 1200 | 0.7588 | 0.7524 | 0.7012 | | 0.4621 | 1300 | 0.7688 | 0.7544 | 0.6942 | | 0.4977 | 1400 | 0.7454 | 0.7567 | 0.6910 | | 0.5332 | 1500 | 0.7371 | 0.7498 | 0.7047 | | 0.5688 | 1600 | 0.7581 | 0.7529 | 0.6953 | | 0.6043 | 1700 | 0.7922 | 0.7465 | 0.6967 | | 0.6399 | 1800 | 0.7528 | 0.7474 | 0.7021 | | 0.6754 | 1900 | 0.7572 | 0.7482 | 0.7048 | | 0.7110 | 2000 | 0.7384 | 0.7460 | 0.7050 | | 0.7465 | 2100 | 0.7523 | 0.7439 | 0.7069 | | 0.7821 | 2200 | 0.7587 | 0.7437 | 0.7072 | | 0.8176 | 2300 | 0.7416 | 0.7424 | 0.7080 | | 0.8532 | 2400 | 0.7407 | 0.7416 | 0.7112 | | 0.8887 | 2500 | 0.7634 | 0.7397 | 0.7125 | | 0.9243 | 2600 | 0.7513 | 0.7383 | 0.7137 | | 0.9598 | 2700 | 0.7392 | 0.7383 | 0.7149 | | 0.9954 | 2800 | 0.7398 | 0.7379 | 0.7147 | | 1.0 | 2813 | - | - | 0.7163 | ### Environmental Impact Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon). - **Energy Consumed**: 0.440 kWh - **Carbon Emitted**: 0.171 kg of CO2 - **Hours Used**: 1.139 hours ### Training Hardware - **On Cloud**: No - **GPU Model**: 1 x NVIDIA GeForce RTX 3090 - **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K - **RAM Size**: 31.78 GB ### Framework Versions - Python: 3.11.6 - Sentence Transformers: 3.1.0.dev0 - Transformers: 4.41.2 - PyTorch: 2.3.1+cu121 - Accelerate: 0.31.0 - Datasets: 2.20.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
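A note on the loss: `__main__.ImprovedContrastiveLoss` is defined in the training script rather than in the Sentence Transformers library, so its exact implementation is not reproduced in this card. As a rough reference only, the sketch below shows a common in-batch InfoNCE-style contrastive loss with the listed temperature of 0.01; it is an assumption about the loss's general shape, not the actual `ImprovedContrastiveLoss` code.

```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(query_embs: torch.Tensor,
                              answer_embs: torch.Tensor,
                              temperature: float = 0.01) -> torch.Tensor:
    """Hypothetical stand-in: each query's positive is its paired answer;
    the other answers in the batch serve as in-batch negatives."""
    q = F.normalize(query_embs, dim=-1)
    a = F.normalize(answer_embs, dim=-1)
    logits = q @ a.T / temperature                  # (batch, batch) cosine similarities
    labels = torch.arange(len(q), device=q.device)  # diagonal entries are the positives
    # Symmetric cross-entropy over query->answer and answer->query directions
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2
```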
[ "TEXT_CLASSIFICATION" ]
[ "MEDAL" ]
Non_BioNLP
Gopal2002/Material_Receipt_Report_ZEON
Gopal2002
text-classification
[ "setfit", "safetensors", "bert", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:BAAI/bge-small-en-v1.5", "base_model:finetune:BAAI/bge-small-en-v1.5", "model-index", "region:us" ]
1,705
1,705
4
0
--- base_model: BAAI/bge-small-en-v1.5 library_name: setfit metrics: - accuracy pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification - generated_from_setfit_trainer widget: - text: "* 04 Hindalco Industries Ltd\nHirkaud Smelter Stores\n\n \n\n* Service Recei\ \ ot\nBUYER _ Lp / GATE ENRTY NO:\noe ADL D vA /2/0A\nRECEIPT DATE: 04-MAR-22\ \ ATU\" ! : 1-SAMBALPUR\nUNIQUE ENTERPRISES ad ZL POL CPi pg 6 ee Q/748/2022\n\ ASS Cer ag fe oO\nos \" -\n\n \n \n \n\nORG CODE:\n\nBOE NO:\nBOE DATE:\ncut\n\ \n \n\nTT\n\nWAY BILL AIRBILL NO\n\nPo\nSoe\nDATE:\n\nTOTAL RECEIVED 21074.8 Nes\ \ REMARKS/REFERENCE: | SUPPLY FOR PAINTING\nAMOUNT INCL TAX Reverse Charge: No\ \ ~\n\nINR) : Tax Point Basis : INVOICE\n\nPO Description SUPPLY FOR PAINTER FOR\ \ 85KA EMD\n\n \n\n \n \n \n\n \n \n \n \n\n \n \n \n\n\ \ \n \n \n \n\n \n \n \n\n \n \n\nLOCATOR\nShelf Life\nCONTROL\n\n\ QUANTITY:\nCHALAN/INVOICE\nRECEIVED\n\nQUANTITY:\nACCEPTED\nREJECTED\n\n \n\n\ \ \n\n \n \n\nITEM CODE DESCRIPTION HSN / SAC\nPR NUMBER SUB INVENTORY\ \ CODE\n\nPO NO. BU/cost Center/ Account Code along with GL ACCOUNT\n\nREQUESTER\ \ CODE\n\nNote to receiver\n\n1 - 801015110326 - HIRE: MANPOWER, SKILLED;RATE\ \ TYP:STANDARD, : MANDAY\nLVL/DSGNTN:PAINTER\n\n[=] = b07-\n\nS/PO/SRV/2122/054\n\ 2\n\n- Sekhar, Mr.\nChandra Makthala\n\n \n \n\n: No Control\n\n \n \n\n\ \ \n \n \n\n- 3711.204.910103.50803112.9999.9999.9999.9999.9999\n- Hirakud\ \ Smelter Plant.Aluminium Smelter.Electrical.Repairs to\nMachinery- Electrical.Default.Default.Default.Default.\ \ Default\n\nP ruchasuil dG ~L— gw\n\n \n\n4atos- OF + 2622. .e, oer |\nPREPARER\ \ SECTION HEAD / INSPECTOR SECTION HEAD /\nSTORES KEEPER AREA HEAD -RECEIVING\ \ AREA HEAD — CUSTODY & ISSUE\nor\n\nals\n\f" - text: " \n\n \n\nDELIVERY CHALLAN ~ Phone : (0891) 2577077 |\nALUFLUORIDE LIMITED\n\ MULAGADA VILLAGE, MINDHI POST,\nVISAKHAPATNAM - 530 012 |\n\n \n\n \n\n \n\n \n\ \n \n\n \n\n \n\n \n\n \n\nDc Nox: g22 - - : ; “Date 02-02-2016\n| HINDALCO INDUSTRIES\ \ LTD HIRAKUD\nSAMBALPUR\nODISHA\nPIN CODE: 768016\nYour Order No: ~HKDRM/1516/0001\ \ DT: 01/04/2015\nReceived the below mentioned in good condition. Carrier No:\ \ AP 16 TC 9339\n—SI.No | ~~ PARTICULARS” | Qty. | Rate / MT\n: = | ae\n: 7\n\ ALUMINIUM FLUORIDE . | 21.000 | ; sbatS\n|\n420 BagsX 50.120 kg. = 21.0504 MT\ \ |\nWeight of Emppty Bags:& Liners: 0.050 MT\nSoa Net Weight of Material: ~ 21.000\ \ ~MT\nInvoice No.: 822 Date 02-02-2016\"\nAPVAT TIN : 37900106541 Dt: 02.06.2014\ \ CST No.: 37900106541 Dt: 02.06.2014\nReceiver's Signature Signature\n\n \n\f" - text: " \n\n \n\n \n\n \n\n \n\n \n\n| rad nas Bi Tiapz Ke en\nap | pa\ \ ape EE By EY ED ITT? ON matte / ON moray |\nP| airing swodanraa boc pia oe ne\ \ ed ee v , 4\n! e i ma | VeACLA Baus §uOQ souBisua¢ of\n| “P io | . [ | seBieUo\ \ IS | wal VY | Loo abi +A Buipe spun |\n| | fe) De [ nl oman «| OE U :\nmS, (Spe\ \ fb) to ae\n| eo Ss | | Pepe (GEOUVHO | GE SOF ae\nE 4 ’ : E sapesecascnsctute\ \ saps Ln + ad et an\nme | | a | es ' | xR Uag ob iw aa ae 32\n' a a] i as aN\ \ Ne paneer\nRe is pad on\n| ee | Sel Nmd Oe oy ld,\n| ix | ; | ‘lwnov L PP. ‘dg\ \ py\n| . Pe eh\n\n \n\nmo sory oR! wor,\n\nou d&- ane eer\n\n: \"ORL\n\n \n\ \ \n\n \n\n‘PO 0Es - “ay Sink /BUSIA,\n‘eyemfes eipug weayediueaewueyepsd JeaK\n\ \"UINYD BPISGG SE-’-S7Z ON 100G\n\nBu. NOUMIS BNDIOOS\n\ney\nWeve! se\n\n \n\n\ \ \n\nhceaitbaaor re\n\n! AMoaAM\n\n \n\n \n\n> tewe-3™\n\noy eee\n\nY3WOISH)\ \ Ad GAUNSNI SI ODUYO\n— MSIH S.HSNMO LY\n\nAdOD HONDIS. 
NOD\n\nene os roarans\n\ \n \n\nKINO NOMIC unr\n\nWaalarad Ta soz - ‘Sn\n\n \n\n- “eu = 3 re\n\neagaee\n\ \nGY oe Ae\n\nBA OFT OVI\nfoe, 17 :\n\n“OL\n\n \n\nivan OL.Givs) NOiAIOSaa\n\ \n \n\neT ea ‘ON aGOW\n\n \n\n \n\n(sour g) 9292 94924 920P : 181 600 OOF\ \ - IVAW angus Wi0l <\n‘OVOY OTIS .G 'Zy “.BSNOH X3dINI PVHIA. ¢°O\"H\n\n? tAd\ \ LHOdSNU 4! 88909 LVENS\n\n-_ wd\nfe\n\n»\n\f" - text: "SOT Ue\n\n \n\n \n\noH\n\n| ia\n\nI\nod\n\nHi\n\na\n\n|\nTo) Sig\ \ Pere\na\n\nal |g\n&%\n5)\n\nwS\\\neB\nSB\n“5\n“O\nS\n€X\n\nBea\n\nem\n\nPe eS\n\ \nse aE a\n\n4 |] | tat [ety\n\ntt pe Ta\n&\na\n\nOK\n\n¢\n\nSRLS ia Leh coe\n\ \n \n \n\f" - text: " \n \n \n \n\nAUSEOOUSRGSEEENSSRCESRORROGS\n\nMise oaeta\nMis tnaes Lo\ \ Q) duty at col ane\n\nDate 12.8820\n‘Stra Bort as Corry Ub 2.\n\nexeauscscotecne:\ \ aneasese\n\nMm. €.M. NBUSTRIES\n\nAn ISO 9001 : 2008 COMPANY\n\n“PODDAR COURT\"\ , Phones : 2235 2096 / 3985 2494 Lo Wi. TEE OLL, a¥ahe Package Ae 2\natadiee Fax\ \ 033-2235 1868\n\nE-mail : [email protected] Tame Ahr SLM, Freight eng\n\ \n \n\nRaut WAR OKA O Van weg 9 at ai sl age Reve\nCorny u. )\n\nGABLES ARE\ \ IN GUR CONTROL\n\nFrease sign & return VAT No. : 19570720098 e TIN/ CST No.\ \ : 19570720292\n—~ = Office : 55, Ezra Street, 2nd Floor, Kolkata - 700 001\n\ \f" inference: true model-index: - name: SetFit with BAAI/bge-small-en-v1.5 results: - task: type: text-classification name: Text Classification dataset: name: Unknown type: unknown split: test metrics: - type: accuracy value: 1.0 name: Accuracy --- # SetFit with BAAI/bge-small-en-v1.5 This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. 
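A minimal sketch of that two-stage procedure with the SetFit 1.0 `Trainer` API is shown below. The few-shot dataset here is illustrative (the card's actual training data is not published); the batch size and epoch count match the hyperparameters listed under Training Details.

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Illustrative few-shot examples; label 1 = material receipt report, label 0 = other documents
train_ds = Dataset.from_dict({
    "text": ["MATERIAL RECEIPT REPORT ...", "TAX INVOICE ..."],
    "label": [1, 0],
})

model = SetFitModel.from_pretrained("BAAI/bge-small-en-v1.5")
args = TrainingArguments(batch_size=32, num_epochs=2)  # as in the card's hyperparameters

trainer = Trainer(model=model, args=args, train_dataset=train_ds)
trainer.train()  # stage 1: contrastive fine-tuning of the body; stage 2: fits the LogisticRegression head

preds = model.predict(["DELIVERY CHALLAN ..."])
```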
## Model Details ### Model Description - **Model Type:** SetFit - **Sentence Transformer body:** [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance - **Maximum Sequence Length:** 512 tokens - **Number of Classes:** 2 classes <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit) - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055) - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit) ### Model Labels | Label | Examples | |:------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | <ul><li>'_\ni.\nSe\nNew\n~~\ned\nTy\nSw\nNe\nNw\ned\n2:\n\n \n\x0c'</li><li>'ne.\n\n \n \n\n \n \n\n \n\nbBo fy20 5 ‘ )\n- wi Pas BOOKING STATION\nstat” SURAT GEIS TRA: BSPORT HOT. LTE, DIMPLE COURT, 2ND FLOOR,\n<ESEE> H.O.: “VIRAJ IMPEX HOUSE”, 47, D\' M= -toRoaD, + AT_OW. ER’S RISK oer\n\' , a” MUMBAI - 400 009 Tel. : 4076 7676 sianan Gece i al CARGO iS INSUR BY CUSTOMER — PH, : 033-30821697, 22\n{ 1. Consignor’s Name & Address As. ExOme peas Br. Code\ndT ncuer\n| acai denn EE Motie iho. ;\n| Weal © Gab TES 1 eensests Uasansensssssseonsenoereneorsenvenesnneasy\n\n\' 2.Cons ignce Bai:x’s Names wl ke iy at < CoO ale ysien b> € (to!\n\n“Litsakuod smalter f eat Lireatéuel Bor oa thin ~ behets ___\n\n \n\n|\n| %\n| on Sen Me te INS a sna iene tl er sues EES _KM__\nat i ag Se are ~ 7 oo 2 ne\n\'| L. US | - 1265 . - HY f Y -ataucl =\nate OF _ QESCHIPTION (SAIC TD ee wy ss _ WEIGHT at GATE PY 2 FRGH GH\neer . | we Re, ?. i\n\n| UFG Re Matta PS RO [aa =r 52 fences\nwe by “Matrtale O%, EFT Gora), ed\n\nhr\n\niia Sa ea eterna eas ean a\n\n \n\n \n\nTin Me a! pene __ aod i osem ge Wleg\n\' Lone CHARS 4 Hanne oe | 5 & ;\nt—- cee eee a = _ Ss Reece!\n| hig © pap Loading by 7S TAP ut. Crarges fon = aw\ntal | 7 “a eet ci a" or a — © =\n\nfree = w JBODs C } se ren st tet , Re 1 SURAT GOODS TRANSPORT VTALTD. * *\n\nTruck No. / Trailer No. : Capacity __\n\nscreens: eat BY SoH BUNS hs BENESME Pp\n\n \n\n \n\n \n\n. 
Lo\n\nAeookutd Clerk\n\n \n\x0c'</li><li>'J o\nALL DISPUTES SUBJECT TO KOLKATA JURISDICTION @ : 2236 6735\n\n"os Mobile : 98300 08461\n-*, _ TAXINVOICE RIG UYER\'\n\nUNITED ENTERPRISES\n\n— = Deals in : TARPAULIN, POLYTHENE, HDPE FABRIC, COTTON CLOTH,\nHESSIAN, GRANULES & GENERAL ORDER SUPPLIERS\n\n3, Amartalla Lane, Kolkata - 700 001 ~ 3 MAY 2Ui5\n\n \n \n \n\nws. HlinPAL so Taposreics Limireep | BN. bf LSS nnn\nDato......1 Sf94. LA csanscsonss\n\nSOD LSA LARS Bn Tee. Chalian No.....1.6. AS: ~(§\nDist: caumpac pon Opis HAD Date ....... LOfoy Iss sessssessseee\n\nCC OSECCLETTTECOETSSOECOHH TS ETTSSEOTHAU HE HOVER SHEUMOSECEDSOUCODESCODECE ODI SMousON RE RED\n\nBayar YATIEST No. BSS. za BIG san\n\n \n \n \n \n\n \n\n \n\nCAP TCT o Ce ERE veTe Darden vavoryDEETOeseeEDOOTEEDE\n\nRupees inwords .N.| why Fou These —\nmA YS..ntL ard cl\n\nVat No. 19511424085 dt. 20/3/08\nC.S.T. No. 19511424289 dt. 20/3/06\n\n \n \n \n \n\x0c'</li></ul> | | 1 | <ul><li>"Posatis ils. H\n\n \n\niS\nvs\na (uf\n\noe\n\n \n\n-\n\n \n\nSarichor Pls: q\n\nPea :\n\nITEM /\n\n1. Description/ Received Reject Delivered Retur\n\n \n \n\nSPARE TX. Phat\n\n(MARKETED BY MESAPD\n\nPact eta\n\n \n\nMATERIAL RECEIPT REPORT\n\n \n\n \n \n \n\n \n\nCUM nea\n\n00 LeTlooo 0.000\n\nPAS\n\n \n \n\nELT\n\nJUPLICATE FOR TRANSPORTE?-\nOGPY (EMGISE INVOICE) RECEIVED\n\nMite ariant Eee\n\nPRAM MUIMAFE RCL RE\n\n \n\n \n\nFrys\n\n \n\not\n\nSuds oT\n\n \n \n\npeas\n\nee ase\n\n. Tax Gelig\n\nGrand Tooke\n\ni\n\nRM\n\nRate/Unit\n\nMRR SUBMITTED\nwv\n\nITH PARTY'S INVIGCE\n\nEET RY MO SSO OT Soe ELS\n\nLS.\n\n \n\n \n\n \n\nWee\n\n7; Ae 18\n\nTrcic\n\ni\nSu\n\n~s\n\n“en\n\nnny\n\x0c"</li><li>"«= ITER /\ncit BDescription/ Received\n\nms\n\n \n \n\n \n\nIces\n\ne to\n\ntea tae\n\nhoimeryh bea\n\nPorccheninernyh Qerkees\n\nRican dec\n\nrarer:\n\nPAD RP eAR eR\n\nMeare\n\n \n\nMATERIAL RECEIPT\n\n \n\nREPORT\n\n \n\nwe ie 7\nhe\n\nSeba.\nbh ETS\n\n \n\nReject Delivered Retur\n\nTESLA y’\n\n \n\n \n\n \n\nLF PIE\n\nTAIT a\n\nSUPLICATE FOR TRANSPORTER\nOGPY (EXGISE INVOICE) RECEIVED\n\noy\n\nf\n\n“soarewe Pk Beak\nree\n\nRAF |\n\n \n\nep oe:\n\nPATE\n\nenc\n\n \n\nMarat\nmw LA\n\n \n \n\nNeneh cat\n\nMRR SUBMITTED\n\\AITH PARTY'S INVIOCE\n\nvee oat\n\nPO Mea PEC SPR AL?\nPi Davtess Bech.\naS OMMOL\n\nRate/Unit\n\nouts 8\n\nI.\n\nfity ¥\n\n \n\n \n\n \n\n \n\nValue\niRise. }\n\n \n\nhare\n\nfMats Terkalis\n\nCaw Wa\n\nresid\n\nTera l.\n\nHae\n\n \n\nEVheres\n\n \n\n \n\nLrpechaarcies\n\nih\n\nAaB\n\n \n\noa,\n\n_\n\na\n\n_ alls\n\x0c"</li><li>'| ie\n\n \n \n \n \n\n \n\ntn eee i he _#\nTrivveiece Dae oo og OF\n1 Cxors d arimeant hoo &\n\nLearner: £ DA ted\n\n \n \n \n \n \n\n \n\nae ‘Beam teas” 8 GIR-sae? DY .mada 18 & GTR BBse “DT.13.1.38 GENO, S388\n4, Mandar Meum 2 DTV2.2.18 & G.E.NMO.S164 DT. LSeud. Le INV.NO.G5¥=1 71.8-EM-O1BS\nExcess a » DT.?.L.18 :\nSUMAM IND-AGRO SALES PYT. LTD.\n7 i\n(Te Quantity-—----—----— Value\nCAL) sence me i ee et “Received Reject Delivered: en ag ne tec enw\n\nLOCATION\n\nat\nSat OD\n\nROLFES7 5.\n\n \n\nAES FORCE | EXTRACT I,\n\non ie.\n\nDs so17Eave. au\n“6 OMELETTE MOTORISED\n\nhs norzasra 2.000\nCOMPLETE MOTORISED\n\n| GLOBE VALVE\n\nOO PATERIAL~OAS1. SIZE\n\n \n\nest AF 18 BO LEXS\nreli\n\n» COMPLETE MOM PETUBM VaLVr\na VTE TAL -- CAS d. a SIZE SOME,\n\nVALME\ney hai: Pu. WABI SIZE .\n\nALE\nTAMOHIMG TYRE.\n—LOONE ,\n\n \n\nMRR SUBMITTED -\n\n‘MATERIAL RECEIPT + REPORT -_ WITH PARTY S. 
INVIOCE |\n\neneeiae me\neden\n\n: “RRR Reece i at Pig\n, MARA Re ced pt Dahes\n\nPTY SPRY i Fibs\nOF -FER-LS\n\nv9\nore\n\nPO phos\n\nPEC SFRY v8 Ore\nFO Thahes\n\nOL AU?\n\n-#\n\n9.000 3.600 EET a OK 1SE460. ‘ OO a\n\nNMOS LST Tae oe PEGE IO pS\n860\nON-4as RELIVERY DATE es\nLaF EE 8 Srctual Tax Vailue 4922.20. ;\nOILST. ATTY “CO ; Stabs Torhals LoS\n\n2900\n\nSn encewn es bovese es an be neeven os ones ntES Oe pts wt 90H On eden ov ET Om aUReeR ones Mt eretereneneesa stoner mint o>\n\nOu OK)\nGTs--\n““LEOME,, 800\nDELIVERY DATE\n© Date FETE 1\n\nLE OOOD 00\nIGST Taxaiex\n\n -3ROOGO JOD\n\n- 6BA00.00 |\n\nPEI:\n\na\na\n\nfaz tua dL. Tax MaiLure E8400. 00.\n\nDIST. OTYVEC\n2.000\n\n2,000 oO\npene\n\nste os evenen enan en enetan ue saareberernestenereens eueaan ane ed ateras ony wReniboens mnotvnes cesumewtneey\n\n0.000\nTYRE\n\nnw CHD. BL OOO o OD\n\nABO . OO\nIGST Tax@iex\n\n75600. oe\n\nLOOODELUVERY DATE\n\noo Pe LASS. END CCOMMEDTION - BUTT 2a-FEERIS Actual Tax Value s FBO « 00\nPS. WELD. | . senpoeapcinatimane licleshicisunanpatal fe sini\nkG 2 BIST. OTY-/CC Suh Terkale 4PEGOO 00\nSat ya 8h Be KOCH. aetctemnneeetenectnimetnnnngeeeren tienen manent eneeremencinessirnatibioe\n\nmy\n(TW beeeninenminnien casein annnsnene sae wonenaennnnntnnneennneenunedenennineneneniannnecnenucntannennnniennacuccnannpaansuneaancinnnnnennnn nn aeeseanininc\nTN | Grand Total — LAO7F S82. 80\n-~ | SUPLICATE FOR TRANSPORTER : eo\n\n \n\n7 senvecauvenenen eqs quanvernsemn seesmaneseseneasnen amen etetenanenacesense eves anne ne on enemies ests\n\n \n \n\n“Pa ane a: of 2\n\n \n\n \n\noi eoeneens ater et et ote neat eegtas ent cege antes enewen ten mes eeenme webeei anemone anetes eran en seeeaterarts dat aneneree spans cums ct maretenen et seeterieen ment te et arereratet srereveneias cosesnesescipsenaceeie sncbntensuseeeth pesasemmeccnsnsaunsier sees lenses\n\nA ym\n\naren ra nit i\n\nee\n\ni\n\nnoe en Sep eet St ee\n\nagai e teoncrescs 7 aS\n\naaa Se Ss:\n\ncote\n\nco hegiecssoscse\n\nsenalt\n\naa\n\nJI J FF JF JF DF JD\n\nee\n\nee\n\n \n\nKoy\n\nwy \\\nae “ r\n\\\n\nZ\n\n \n\x0c'</li></ul> | ## Evaluation ### Metrics | Label | Accuracy | |:--------|:---------| | **all** | 1.0 | ## Uses ### Direct Use for Inference First install the SetFit library: ```bash pip install setfit ``` Then you can load this model and run inference. ```python from setfit import SetFitModel # Download from the 🤗 Hub model = SetFitModel.from_pretrained("Gopal2002/Material_Receipt_Report_ZEON") # Run inference preds = model("SOT Ue oH | ia I od Hi a | To) Sig Pere a al |g &% 5) wS\ eB SB “5 “O S €X Bea em Pe eS se aE a 4 |] | tat [ety tt pe Ta & a OK ¢ SRLS ia Leh coe ") ``` <!-- ### Downstream Use *List how someone could finetune this model on their own dataset.* --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Set Metrics | Training set | Min | Median | Max | |:-------------|:----|:---------|:-----| | Word count | 1 | 182.1336 | 1108 | | Label | Training Sample Count | |:------|:----------------------| | 0 | 202 | | 1 | 45 | ### Training Hyperparameters - batch_size: (32, 32) - num_epochs: (2, 2) - max_steps: -1 - sampling_strategy: oversampling - body_learning_rate: (2e-05, 1e-05) - head_learning_rate: 0.01 - loss: CosineSimilarityLoss - distance_metric: cosine_distance - margin: 0.25 - end_to_end: False - use_amp: False - warmup_proportion: 0.1 - seed: 42 - eval_max_steps: -1 - load_best_model_at_end: False ### Training Results | Epoch | Step | Training Loss | Validation Loss | |:------:|:----:|:-------------:|:---------------:| | 0.0007 | 1 | 0.2952 | - | | 0.0371 | 50 | 0.2253 | - | | 0.0742 | 100 | 0.1234 | - | | 0.1114 | 150 | 0.0115 | - | | 0.1485 | 200 | 0.0036 | - | | 0.1856 | 250 | 0.0024 | - | | 0.2227 | 300 | 0.0015 | - | | 0.2598 | 350 | 0.0011 | - | | 0.2970 | 400 | 0.0009 | - | | 0.3341 | 450 | 0.0007 | - | | 0.3712 | 500 | 0.0011 | - | | 0.4083 | 550 | 0.0008 | - | | 0.4454 | 600 | 0.0008 | - | | 0.4826 | 650 | 0.0007 | - | | 0.5197 | 700 | 0.0005 | - | | 0.5568 | 750 | 0.0006 | - | | 0.5939 | 800 | 0.0005 | - | | 0.6310 | 850 | 0.0005 | - | | 0.6682 | 900 | 0.0004 | - | | 0.7053 | 950 | 0.0003 | - | | 0.7424 | 1000 | 0.0004 | - | | 0.7795 | 1050 | 0.0005 | - | | 0.8166 | 1100 | 0.0004 | - | | 0.8537 | 1150 | 0.0004 | - | | 0.8909 | 1200 | 0.0005 | - | | 0.9280 | 1250 | 0.0004 | - | | 0.9651 | 1300 | 0.0003 | - | | 1.0022 | 1350 | 0.0003 | - | | 1.0393 | 1400 | 0.0003 | - | | 1.0765 | 1450 | 0.0004 | - | | 1.1136 | 1500 | 0.0003 | - | | 1.1507 | 1550 | 0.0004 | - | | 1.1878 | 1600 | 0.0004 | - | | 1.2249 | 1650 | 0.0004 | - | | 1.2621 | 1700 | 0.0003 | - | | 1.2992 | 1750 | 0.0003 | - | | 1.3363 | 1800 | 0.0003 | - | | 1.3734 | 1850 | 0.0003 | - | | 1.4105 | 1900 | 0.0003 | - | | 1.4477 | 1950 | 0.0002 | - | | 1.4848 | 2000 | 0.0003 | - | | 1.5219 | 2050 | 0.0003 | - | | 1.5590 | 2100 | 0.0003 | - | | 1.5961 | 2150 | 0.0002 | - | | 1.6333 | 2200 | 0.0003 | - | | 1.6704 | 2250 | 0.0004 | - | | 1.7075 | 2300 | 0.0004 | - | | 1.7446 | 2350 | 0.0003 | - | | 1.7817 | 2400 | 0.0002 | - | | 1.8189 | 2450 | 0.0002 | - | | 1.8560 | 2500 | 0.0003 | - | | 1.8931 | 2550 | 0.0002 | - | | 1.9302 | 2600 | 0.0003 | - | | 1.9673 | 2650 | 0.0003 | - | ### Framework Versions - Python: 3.10.12 - SetFit: 1.0.3 - Sentence Transformers: 2.2.2 - Transformers: 4.35.2 - PyTorch: 2.1.0+cu121 - Datasets: 2.16.1 - Tokenizers: 0.15.0 ## Citation ### BibTeX ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have 
updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
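In addition to the hard labels returned under Direct Use above, the LogisticRegression head can expose per-class probabilities via `predict_proba`; a short sketch (the document snippets are abbreviated placeholders):

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("Gopal2002/Material_Receipt_Report_ZEON")

docs = ["MATERIAL RECEIPT REPORT ...", "TAX INVOICE ..."]  # abbreviated OCR snippets
probs = model.predict_proba(docs)  # shape (n_docs, 2): columns follow label order 0, 1
preds = model.predict(docs)
for doc, p, y in zip(docs, probs, preds):
    print(y, p)
```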
[ "TEXT_CLASSIFICATION" ]
[ "CAS", "CPI" ]
Non_BioNLP
lixsh6/XLM-3B5-embedding
lixsh6
null
[ "mteb", "model-index", "region:us" ]
1,690
1,690
0
0
--- tags: - mteb model-index: - name: xlm3b5_step3len260_b128g8_lr1e-5 results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 66.94029850746269 - type: ap value: 28.832990644897478 - type: f1 value: 60.32686940828024 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 94.697425 - type: ap value: 92.35377895045687 - type: f1 value: 94.6945423828739 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 51.586 - type: f1 value: 49.90891720350314 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 17.781 - type: map_at_10 value: 30.854 - type: map_at_100 value: 32.344 - type: map_at_1000 value: 32.364 - type: map_at_3 value: 25.711000000000002 - type: map_at_5 value: 28.254 - type: mrr_at_1 value: 18.563 - type: mrr_at_10 value: 31.137999999999998 - type: mrr_at_100 value: 32.621 - type: mrr_at_1000 value: 32.641 - type: mrr_at_3 value: 25.984 - type: mrr_at_5 value: 28.53 - type: ndcg_at_1 value: 17.781 - type: ndcg_at_10 value: 39.206 - type: ndcg_at_100 value: 45.751 - type: ndcg_at_1000 value: 46.225 - type: ndcg_at_3 value: 28.313 - type: ndcg_at_5 value: 32.919 - type: precision_at_1 value: 17.781 - type: precision_at_10 value: 6.65 - type: precision_at_100 value: 0.9560000000000001 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 11.949 - type: precision_at_5 value: 9.417 - type: recall_at_1 value: 17.781 - type: recall_at_10 value: 66.501 - type: recall_at_100 value: 95.59 - type: recall_at_1000 value: 99.21799999999999 - type: recall_at_3 value: 35.846000000000004 - type: recall_at_5 value: 47.083999999999996 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 44.44154312957711 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 34.189712542346385 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 62.72571219134687 - type: mrr value: 76.3612979817966 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 83.62762841254953 - type: cos_sim_spearman value: 80.72111639383013 - type: euclidean_pearson value: 82.63506732956259 - type: euclidean_spearman value: 81.177753304636 - type: manhattan_pearson value: 82.5891836637346 - type: manhattan_spearman value: 81.06811225217339 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 
80.34090909090908 - type: f1 value: 79.4054298683183 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 37.82441952130262 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 32.132057843418416 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 34.23 - type: map_at_10 value: 46.763 - type: map_at_100 value: 48.454 - type: map_at_1000 value: 48.58 - type: map_at_3 value: 43.167 - type: map_at_5 value: 45.214 - type: mrr_at_1 value: 42.775 - type: mrr_at_10 value: 53.190000000000005 - type: mrr_at_100 value: 53.928 - type: mrr_at_1000 value: 53.964 - type: mrr_at_3 value: 51.168 - type: mrr_at_5 value: 52.434000000000005 - type: ndcg_at_1 value: 42.775 - type: ndcg_at_10 value: 53.376999999999995 - type: ndcg_at_100 value: 58.748 - type: ndcg_at_1000 value: 60.461 - type: ndcg_at_3 value: 48.929 - type: ndcg_at_5 value: 50.99399999999999 - type: precision_at_1 value: 42.775 - type: precision_at_10 value: 10.428999999999998 - type: precision_at_100 value: 1.678 - type: precision_at_1000 value: 0.215 - type: precision_at_3 value: 23.939 - type: precision_at_5 value: 17.082 - type: recall_at_1 value: 34.23 - type: recall_at_10 value: 64.96300000000001 - type: recall_at_100 value: 86.803 - type: recall_at_1000 value: 97.917 - type: recall_at_3 value: 51.815 - type: recall_at_5 value: 57.781000000000006 - type: map_at_1 value: 28.935 - type: map_at_10 value: 39.574999999999996 - type: map_at_100 value: 40.891 - type: map_at_1000 value: 41.043 - type: map_at_3 value: 36.248999999999995 - type: map_at_5 value: 38.157999999999994 - type: mrr_at_1 value: 36.624 - type: mrr_at_10 value: 45.241 - type: mrr_at_100 value: 46.028000000000006 - type: mrr_at_1000 value: 46.082 - type: mrr_at_3 value: 42.93 - type: mrr_at_5 value: 44.417 - type: ndcg_at_1 value: 36.624 - type: ndcg_at_10 value: 45.423 - type: ndcg_at_100 value: 49.971 - type: ndcg_at_1000 value: 52.382 - type: ndcg_at_3 value: 41.019 - type: ndcg_at_5 value: 43.254 - type: precision_at_1 value: 36.624 - type: precision_at_10 value: 8.86 - type: precision_at_100 value: 1.458 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 20.276 - type: precision_at_5 value: 14.573 - type: recall_at_1 value: 28.935 - type: recall_at_10 value: 55.745999999999995 - type: recall_at_100 value: 74.977 - type: recall_at_1000 value: 90.505 - type: recall_at_3 value: 42.575 - type: recall_at_5 value: 48.902 - type: map_at_1 value: 38.828 - type: map_at_10 value: 50.888999999999996 - type: map_at_100 value: 52.001 - type: map_at_1000 value: 52.054 - type: map_at_3 value: 47.638999999999996 - type: map_at_5 value: 49.423 - type: mrr_at_1 value: 44.765 - type: mrr_at_10 value: 54.408 - type: mrr_at_100 value: 55.116 - type: mrr_at_1000 value: 55.144000000000005 - type: mrr_at_3 value: 52.038 - type: mrr_at_5 value: 53.323 - type: ndcg_at_1 value: 44.765 - type: ndcg_at_10 value: 56.724 - type: ndcg_at_100 value: 61.058 - type: ndcg_at_1000 value: 62.125 - type: ndcg_at_3 value: 51.324000000000005 - type: ndcg_at_5 value: 53.805 - type: precision_at_1 value: 44.765 - type: precision_at_10 value: 9.248000000000001 - 
type: precision_at_100 value: 1.234 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 23.093 - type: precision_at_5 value: 15.799 - type: recall_at_1 value: 38.828 - type: recall_at_10 value: 70.493 - type: recall_at_100 value: 89.293 - type: recall_at_1000 value: 96.872 - type: recall_at_3 value: 55.74400000000001 - type: recall_at_5 value: 61.95 - type: map_at_1 value: 22.085 - type: map_at_10 value: 30.070000000000004 - type: map_at_100 value: 31.206 - type: map_at_1000 value: 31.291999999999998 - type: map_at_3 value: 27.011000000000003 - type: map_at_5 value: 28.854999999999997 - type: mrr_at_1 value: 23.842 - type: mrr_at_10 value: 31.755 - type: mrr_at_100 value: 32.778 - type: mrr_at_1000 value: 32.845 - type: mrr_at_3 value: 28.851 - type: mrr_at_5 value: 30.574 - type: ndcg_at_1 value: 23.842 - type: ndcg_at_10 value: 35.052 - type: ndcg_at_100 value: 40.550999999999995 - type: ndcg_at_1000 value: 42.789 - type: ndcg_at_3 value: 29.096 - type: ndcg_at_5 value: 32.251000000000005 - type: precision_at_1 value: 23.842 - type: precision_at_10 value: 5.605 - type: precision_at_100 value: 0.877 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 12.316 - type: precision_at_5 value: 9.13 - type: recall_at_1 value: 22.085 - type: recall_at_10 value: 48.815999999999995 - type: recall_at_100 value: 74.039 - type: recall_at_1000 value: 90.872 - type: recall_at_3 value: 33.098 - type: recall_at_5 value: 40.647 - type: map_at_1 value: 14.088999999999999 - type: map_at_10 value: 21.526 - type: map_at_100 value: 22.832 - type: map_at_1000 value: 22.958000000000002 - type: map_at_3 value: 18.747 - type: map_at_5 value: 20.396 - type: mrr_at_1 value: 17.662 - type: mrr_at_10 value: 25.513 - type: mrr_at_100 value: 26.621 - type: mrr_at_1000 value: 26.698 - type: mrr_at_3 value: 22.658 - type: mrr_at_5 value: 24.449 - type: ndcg_at_1 value: 17.662 - type: ndcg_at_10 value: 26.506999999999998 - type: ndcg_at_100 value: 32.782 - type: ndcg_at_1000 value: 35.709999999999994 - type: ndcg_at_3 value: 21.279 - type: ndcg_at_5 value: 23.998 - type: precision_at_1 value: 17.662 - type: precision_at_10 value: 5.124 - type: precision_at_100 value: 0.951 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 10.323 - type: precision_at_5 value: 8.158999999999999 - type: recall_at_1 value: 14.088999999999999 - type: recall_at_10 value: 37.874 - type: recall_at_100 value: 65.34100000000001 - type: recall_at_1000 value: 86.06099999999999 - type: recall_at_3 value: 23.738999999999997 - type: recall_at_5 value: 30.359 - type: map_at_1 value: 24.75 - type: map_at_10 value: 34.156 - type: map_at_100 value: 35.638999999999996 - type: map_at_1000 value: 35.754999999999995 - type: map_at_3 value: 31.047000000000004 - type: map_at_5 value: 32.823 - type: mrr_at_1 value: 30.991000000000003 - type: mrr_at_10 value: 39.509 - type: mrr_at_100 value: 40.582 - type: mrr_at_1000 value: 40.636 - type: mrr_at_3 value: 37.103 - type: mrr_at_5 value: 38.503 - type: ndcg_at_1 value: 30.991000000000003 - type: ndcg_at_10 value: 39.719 - type: ndcg_at_100 value: 45.984 - type: ndcg_at_1000 value: 48.293 - type: ndcg_at_3 value: 34.92 - type: ndcg_at_5 value: 37.253 - type: precision_at_1 value: 30.991000000000003 - type: precision_at_10 value: 7.3340000000000005 - type: precision_at_100 value: 1.225 - type: precision_at_1000 value: 0.16 - type: precision_at_3 value: 16.586000000000002 - type: precision_at_5 value: 12.127 - type: recall_at_1 value: 24.75 - type: 
recall_at_10 value: 51.113 - type: recall_at_100 value: 77.338 - type: recall_at_1000 value: 92.764 - type: recall_at_3 value: 37.338 - type: recall_at_5 value: 43.437 - type: map_at_1 value: 23.158 - type: map_at_10 value: 32.877 - type: map_at_100 value: 34.226 - type: map_at_1000 value: 34.35 - type: map_at_3 value: 29.43 - type: map_at_5 value: 31.319000000000003 - type: mrr_at_1 value: 29.224 - type: mrr_at_10 value: 38.080000000000005 - type: mrr_at_100 value: 39.04 - type: mrr_at_1000 value: 39.097 - type: mrr_at_3 value: 35.407 - type: mrr_at_5 value: 36.771 - type: ndcg_at_1 value: 29.224 - type: ndcg_at_10 value: 38.805 - type: ndcg_at_100 value: 44.746 - type: ndcg_at_1000 value: 47.038000000000004 - type: ndcg_at_3 value: 33.269 - type: ndcg_at_5 value: 35.611 - type: precision_at_1 value: 29.224 - type: precision_at_10 value: 7.454 - type: precision_at_100 value: 1.221 - type: precision_at_1000 value: 0.16199999999999998 - type: precision_at_3 value: 16.134 - type: precision_at_5 value: 11.895 - type: recall_at_1 value: 23.158 - type: recall_at_10 value: 51.487 - type: recall_at_100 value: 77.464 - type: recall_at_1000 value: 92.525 - type: recall_at_3 value: 35.478 - type: recall_at_5 value: 41.722 - type: map_at_1 value: 24.456916666666668 - type: map_at_10 value: 33.5495 - type: map_at_100 value: 34.86808333333333 - type: map_at_1000 value: 34.98908333333333 - type: map_at_3 value: 30.59158333333334 - type: map_at_5 value: 32.24916666666667 - type: mrr_at_1 value: 29.387250000000005 - type: mrr_at_10 value: 37.73958333333333 - type: mrr_at_100 value: 38.6595 - type: mrr_at_1000 value: 38.718250000000005 - type: mrr_at_3 value: 35.31658333333333 - type: mrr_at_5 value: 36.69441666666667 - type: ndcg_at_1 value: 29.387250000000005 - type: ndcg_at_10 value: 38.910333333333334 - type: ndcg_at_100 value: 44.40241666666666 - type: ndcg_at_1000 value: 46.72008333333334 - type: ndcg_at_3 value: 34.045583333333326 - type: ndcg_at_5 value: 36.33725 - type: precision_at_1 value: 29.387250000000005 - type: precision_at_10 value: 7.034666666666668 - type: precision_at_100 value: 1.1698333333333333 - type: precision_at_1000 value: 0.15599999999999997 - type: precision_at_3 value: 15.866416666666666 - type: precision_at_5 value: 11.456333333333331 - type: recall_at_1 value: 24.456916666666668 - type: recall_at_10 value: 50.47758333333333 - type: recall_at_100 value: 74.52275 - type: recall_at_1000 value: 90.7105 - type: recall_at_3 value: 36.86275 - type: recall_at_5 value: 42.76533333333333 - type: map_at_1 value: 19.356 - type: map_at_10 value: 25.378 - type: map_at_100 value: 26.349 - type: map_at_1000 value: 26.451 - type: map_at_3 value: 23.403 - type: map_at_5 value: 24.614 - type: mrr_at_1 value: 22.086 - type: mrr_at_10 value: 28.072000000000003 - type: mrr_at_100 value: 28.887 - type: mrr_at_1000 value: 28.965999999999998 - type: mrr_at_3 value: 26.074 - type: mrr_at_5 value: 27.293 - type: ndcg_at_1 value: 22.086 - type: ndcg_at_10 value: 29.107 - type: ndcg_at_100 value: 34.0 - type: ndcg_at_1000 value: 36.793 - type: ndcg_at_3 value: 25.407999999999998 - type: ndcg_at_5 value: 27.375 - type: precision_at_1 value: 22.086 - type: precision_at_10 value: 4.678 - type: precision_at_100 value: 0.7779999999999999 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 10.992 - type: precision_at_5 value: 7.853000000000001 - type: recall_at_1 value: 19.356 - type: recall_at_10 value: 37.913999999999994 - type: recall_at_100 value: 60.507999999999996 - type: recall_at_1000 
value: 81.459 - type: recall_at_3 value: 27.874 - type: recall_at_5 value: 32.688 - type: map_at_1 value: 16.008 - type: map_at_10 value: 22.431 - type: map_at_100 value: 23.61 - type: map_at_1000 value: 23.743 - type: map_at_3 value: 20.358 - type: map_at_5 value: 21.371000000000002 - type: mrr_at_1 value: 19.752 - type: mrr_at_10 value: 26.333000000000002 - type: mrr_at_100 value: 27.297 - type: mrr_at_1000 value: 27.378000000000004 - type: mrr_at_3 value: 24.358 - type: mrr_at_5 value: 25.354 - type: ndcg_at_1 value: 19.752 - type: ndcg_at_10 value: 26.712000000000003 - type: ndcg_at_100 value: 32.294 - type: ndcg_at_1000 value: 35.410000000000004 - type: ndcg_at_3 value: 22.974 - type: ndcg_at_5 value: 24.412 - type: precision_at_1 value: 19.752 - type: precision_at_10 value: 4.986 - type: precision_at_100 value: 0.924 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 10.966 - type: precision_at_5 value: 7.832 - type: recall_at_1 value: 16.008 - type: recall_at_10 value: 35.716 - type: recall_at_100 value: 60.76200000000001 - type: recall_at_1000 value: 83.204 - type: recall_at_3 value: 25.092 - type: recall_at_5 value: 28.858 - type: map_at_1 value: 24.743000000000002 - type: map_at_10 value: 34.492 - type: map_at_100 value: 35.716 - type: map_at_1000 value: 35.815999999999995 - type: map_at_3 value: 31.201 - type: map_at_5 value: 32.926 - type: mrr_at_1 value: 29.384 - type: mrr_at_10 value: 38.333 - type: mrr_at_100 value: 39.278 - type: mrr_at_1000 value: 39.330999999999996 - type: mrr_at_3 value: 35.65 - type: mrr_at_5 value: 36.947 - type: ndcg_at_1 value: 29.384 - type: ndcg_at_10 value: 40.195 - type: ndcg_at_100 value: 45.686 - type: ndcg_at_1000 value: 47.906 - type: ndcg_at_3 value: 34.477000000000004 - type: ndcg_at_5 value: 36.89 - type: precision_at_1 value: 29.384 - type: precision_at_10 value: 7.164 - type: precision_at_100 value: 1.111 - type: precision_at_1000 value: 0.13999999999999999 - type: precision_at_3 value: 15.983 - type: precision_at_5 value: 11.418000000000001 - type: recall_at_1 value: 24.743000000000002 - type: recall_at_10 value: 53.602000000000004 - type: recall_at_100 value: 77.266 - type: recall_at_1000 value: 92.857 - type: recall_at_3 value: 37.921 - type: recall_at_5 value: 44.124 - type: map_at_1 value: 26.531 - type: map_at_10 value: 35.933 - type: map_at_100 value: 37.913000000000004 - type: map_at_1000 value: 38.146 - type: map_at_3 value: 32.713 - type: map_at_5 value: 34.339999999999996 - type: mrr_at_1 value: 32.806000000000004 - type: mrr_at_10 value: 41.728 - type: mrr_at_100 value: 42.731 - type: mrr_at_1000 value: 42.777 - type: mrr_at_3 value: 39.065 - type: mrr_at_5 value: 40.467999999999996 - type: ndcg_at_1 value: 32.806000000000004 - type: ndcg_at_10 value: 42.254999999999995 - type: ndcg_at_100 value: 48.687999999999995 - type: ndcg_at_1000 value: 50.784 - type: ndcg_at_3 value: 37.330999999999996 - type: ndcg_at_5 value: 39.305 - type: precision_at_1 value: 32.806000000000004 - type: precision_at_10 value: 8.34 - type: precision_at_100 value: 1.7209999999999999 - type: precision_at_1000 value: 0.252 - type: precision_at_3 value: 17.589 - type: precision_at_5 value: 12.845999999999998 - type: recall_at_1 value: 26.531 - type: recall_at_10 value: 53.266000000000005 - type: recall_at_100 value: 81.49499999999999 - type: recall_at_1000 value: 94.506 - type: recall_at_3 value: 38.848 - type: recall_at_5 value: 44.263000000000005 - type: map_at_1 value: 20.77 - type: map_at_10 value: 28.504 - type: 
map_at_100 value: 29.580000000000002 - type: map_at_1000 value: 29.681 - type: map_at_3 value: 26.134 - type: map_at_5 value: 27.551 - type: mrr_at_1 value: 22.736 - type: mrr_at_10 value: 30.713 - type: mrr_at_100 value: 31.628 - type: mrr_at_1000 value: 31.701 - type: mrr_at_3 value: 28.497 - type: mrr_at_5 value: 29.799999999999997 - type: ndcg_at_1 value: 22.736 - type: ndcg_at_10 value: 33.048 - type: ndcg_at_100 value: 38.321 - type: ndcg_at_1000 value: 40.949999999999996 - type: ndcg_at_3 value: 28.521 - type: ndcg_at_5 value: 30.898999999999997 - type: precision_at_1 value: 22.736 - type: precision_at_10 value: 5.194 - type: precision_at_100 value: 0.86 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 12.2 - type: precision_at_5 value: 8.762 - type: recall_at_1 value: 20.77 - type: recall_at_10 value: 44.741 - type: recall_at_100 value: 68.987 - type: recall_at_1000 value: 88.984 - type: recall_at_3 value: 32.830999999999996 - type: recall_at_5 value: 38.452999999999996 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 9.646 - type: map_at_10 value: 17.432 - type: map_at_100 value: 19.347 - type: map_at_1000 value: 19.555 - type: map_at_3 value: 14.355 - type: map_at_5 value: 15.83 - type: mrr_at_1 value: 21.433 - type: mrr_at_10 value: 32.583 - type: mrr_at_100 value: 33.708 - type: mrr_at_1000 value: 33.751999999999995 - type: mrr_at_3 value: 28.979 - type: mrr_at_5 value: 30.979 - type: ndcg_at_1 value: 21.433 - type: ndcg_at_10 value: 25.025 - type: ndcg_at_100 value: 32.818999999999996 - type: ndcg_at_1000 value: 36.549 - type: ndcg_at_3 value: 19.689 - type: ndcg_at_5 value: 21.462 - type: precision_at_1 value: 21.433 - type: precision_at_10 value: 8.085 - type: precision_at_100 value: 1.6340000000000001 - type: precision_at_1000 value: 0.233 - type: precision_at_3 value: 14.832 - type: precision_at_5 value: 11.530999999999999 - type: recall_at_1 value: 9.646 - type: recall_at_10 value: 31.442999999999998 - type: recall_at_100 value: 58.48 - type: recall_at_1000 value: 79.253 - type: recall_at_3 value: 18.545 - type: recall_at_5 value: 23.362 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.48 - type: map_at_10 value: 18.127 - type: map_at_100 value: 25.563999999999997 - type: map_at_1000 value: 27.386 - type: map_at_3 value: 13.189 - type: map_at_5 value: 15.417 - type: mrr_at_1 value: 63.74999999999999 - type: mrr_at_10 value: 71.34899999999999 - type: mrr_at_100 value: 71.842 - type: mrr_at_1000 value: 71.851 - type: mrr_at_3 value: 69.167 - type: mrr_at_5 value: 70.479 - type: ndcg_at_1 value: 51.87500000000001 - type: ndcg_at_10 value: 38.792 - type: ndcg_at_100 value: 43.889 - type: ndcg_at_1000 value: 51.561 - type: ndcg_at_3 value: 42.686 - type: ndcg_at_5 value: 40.722 - type: precision_at_1 value: 63.74999999999999 - type: precision_at_10 value: 30.375000000000004 - type: precision_at_100 value: 10.103 - type: precision_at_1000 value: 2.257 - type: precision_at_3 value: 45.167 - type: precision_at_5 value: 38.95 - type: recall_at_1 value: 8.48 - type: recall_at_10 value: 23.008 - type: recall_at_100 value: 48.875 - type: recall_at_1000 value: 73.402 - type: recall_at_3 value: 14.377 - type: recall_at_5 value: 17.819 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test 
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 47.83 - type: f1 value: 41.76842531751529 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 62.247 - type: map_at_10 value: 72.782 - type: map_at_100 value: 73.095 - type: map_at_1000 value: 73.112 - type: map_at_3 value: 70.928 - type: map_at_5 value: 72.173 - type: mrr_at_1 value: 67.372 - type: mrr_at_10 value: 77.538 - type: mrr_at_100 value: 77.741 - type: mrr_at_1000 value: 77.74600000000001 - type: mrr_at_3 value: 75.938 - type: mrr_at_5 value: 77.054 - type: ndcg_at_1 value: 67.372 - type: ndcg_at_10 value: 78.001 - type: ndcg_at_100 value: 79.295 - type: ndcg_at_1000 value: 79.648 - type: ndcg_at_3 value: 74.71 - type: ndcg_at_5 value: 76.712 - type: precision_at_1 value: 67.372 - type: precision_at_10 value: 9.844999999999999 - type: precision_at_100 value: 1.065 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 29.308 - type: precision_at_5 value: 18.731 - type: recall_at_1 value: 62.247 - type: recall_at_10 value: 89.453 - type: recall_at_100 value: 94.998 - type: recall_at_1000 value: 97.385 - type: recall_at_3 value: 80.563 - type: recall_at_5 value: 85.58099999999999 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 22.587 - type: map_at_10 value: 37.316 - type: map_at_100 value: 39.542 - type: map_at_1000 value: 39.701 - type: map_at_3 value: 32.332 - type: map_at_5 value: 35.172 - type: mrr_at_1 value: 42.437999999999995 - type: mrr_at_10 value: 51.98500000000001 - type: mrr_at_100 value: 52.910999999999994 - type: mrr_at_1000 value: 52.944 - type: mrr_at_3 value: 49.691 - type: mrr_at_5 value: 51.15 - type: ndcg_at_1 value: 42.437999999999995 - type: ndcg_at_10 value: 45.016 - type: ndcg_at_100 value: 52.541000000000004 - type: ndcg_at_1000 value: 54.99699999999999 - type: ndcg_at_3 value: 41.175 - type: ndcg_at_5 value: 42.647 - type: precision_at_1 value: 42.437999999999995 - type: precision_at_10 value: 12.855 - type: precision_at_100 value: 2.049 - type: precision_at_1000 value: 0.247 - type: precision_at_3 value: 27.675 - type: precision_at_5 value: 20.617 - type: recall_at_1 value: 22.587 - type: recall_at_10 value: 51.547 - type: recall_at_100 value: 78.88 - type: recall_at_1000 value: 93.741 - type: recall_at_3 value: 37.256 - type: recall_at_5 value: 44.295 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 32.451 - type: map_at_10 value: 48.082 - type: map_at_100 value: 49.08 - type: map_at_1000 value: 49.163000000000004 - type: map_at_3 value: 44.766 - type: map_at_5 value: 46.722 - type: mrr_at_1 value: 64.902 - type: mrr_at_10 value: 72.195 - type: mrr_at_100 value: 72.572 - type: mrr_at_1000 value: 72.589 - type: mrr_at_3 value: 70.774 - type: mrr_at_5 value: 71.611 - type: ndcg_at_1 value: 64.902 - type: ndcg_at_10 value: 57.14399999999999 - type: ndcg_at_100 value: 60.916000000000004 - type: ndcg_at_1000 value: 62.649 - type: ndcg_at_3 value: 52.09 - type: ndcg_at_5 value: 54.70399999999999 - type: precision_at_1 value: 64.902 - type: precision_at_10 value: 12.136 - type: precision_at_100 value: 1.51 - type: precision_at_1000 value: 0.174 - type: precision_at_3 value: 32.933 - type: precision_at_5 value: 21.823 - type: recall_at_1 value: 32.451 - type: recall_at_10 value: 
60.682 - type: recall_at_100 value: 75.523 - type: recall_at_1000 value: 87.063 - type: recall_at_3 value: 49.399 - type: recall_at_5 value: 54.55799999999999 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 89.6584 - type: ap value: 85.36881978624284 - type: f1 value: 89.64170045393931 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 17.942 - type: map_at_10 value: 29.755 - type: map_at_100 value: 31.008000000000003 - type: map_at_1000 value: 31.067 - type: map_at_3 value: 25.959 - type: map_at_5 value: 28.044999999999998 - type: mrr_at_1 value: 18.467 - type: mrr_at_10 value: 30.253000000000004 - type: mrr_at_100 value: 31.461 - type: mrr_at_1000 value: 31.513 - type: mrr_at_3 value: 26.528000000000002 - type: mrr_at_5 value: 28.588 - type: ndcg_at_1 value: 18.467 - type: ndcg_at_10 value: 36.510999999999996 - type: ndcg_at_100 value: 42.748999999999995 - type: ndcg_at_1000 value: 44.188 - type: ndcg_at_3 value: 28.752 - type: ndcg_at_5 value: 32.462 - type: precision_at_1 value: 18.467 - type: precision_at_10 value: 6.006 - type: precision_at_100 value: 0.9169999999999999 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 12.55 - type: precision_at_5 value: 9.395000000000001 - type: recall_at_1 value: 17.942 - type: recall_at_10 value: 57.440000000000005 - type: recall_at_100 value: 86.66199999999999 - type: recall_at_1000 value: 97.613 - type: recall_at_3 value: 36.271 - type: recall_at_5 value: 45.167 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.76652986776104 - type: f1 value: 93.726741953801 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 67.79753761969903 - type: f1 value: 45.8547023848409 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.26563550773369 - type: f1 value: 67.37602000921103 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.51244115669132 - type: f1 value: 73.79891534060464 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 33.88016176143737 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 32.07643038274053 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 30.81344342001539 - type: mrr value: 31.82078962760685 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: 
default split: test revision: None metrics: - type: map_at_1 value: 4.617 - type: map_at_10 value: 11.501 - type: map_at_100 value: 14.729999999999999 - type: map_at_1000 value: 16.209 - type: map_at_3 value: 8.275 - type: map_at_5 value: 9.853000000000002 - type: mrr_at_1 value: 41.486000000000004 - type: mrr_at_10 value: 51.471999999999994 - type: mrr_at_100 value: 52.020999999999994 - type: mrr_at_1000 value: 52.066 - type: mrr_at_3 value: 49.484 - type: mrr_at_5 value: 50.660000000000004 - type: ndcg_at_1 value: 38.854 - type: ndcg_at_10 value: 31.567 - type: ndcg_at_100 value: 29.842999999999996 - type: ndcg_at_1000 value: 38.995000000000005 - type: ndcg_at_3 value: 36.785000000000004 - type: ndcg_at_5 value: 34.955000000000005 - type: precision_at_1 value: 40.867 - type: precision_at_10 value: 23.591 - type: precision_at_100 value: 7.771 - type: precision_at_1000 value: 2.11 - type: precision_at_3 value: 35.397 - type: precision_at_5 value: 30.959999999999997 - type: recall_at_1 value: 4.617 - type: recall_at_10 value: 15.609 - type: recall_at_100 value: 31.313999999999997 - type: recall_at_1000 value: 63.085 - type: recall_at_3 value: 9.746 - type: recall_at_5 value: 12.295 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 28.797 - type: map_at_10 value: 44.822 - type: map_at_100 value: 45.891999999999996 - type: map_at_1000 value: 45.919 - type: map_at_3 value: 40.237 - type: map_at_5 value: 42.913000000000004 - type: mrr_at_1 value: 32.561 - type: mrr_at_10 value: 46.982 - type: mrr_at_100 value: 47.827 - type: mrr_at_1000 value: 47.843999999999994 - type: mrr_at_3 value: 43.26 - type: mrr_at_5 value: 45.527 - type: ndcg_at_1 value: 32.532 - type: ndcg_at_10 value: 52.832 - type: ndcg_at_100 value: 57.343999999999994 - type: ndcg_at_1000 value: 57.93899999999999 - type: ndcg_at_3 value: 44.246 - type: ndcg_at_5 value: 48.698 - type: precision_at_1 value: 32.532 - type: precision_at_10 value: 9.003 - type: precision_at_100 value: 1.1480000000000001 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 20.605999999999998 - type: precision_at_5 value: 14.954 - type: recall_at_1 value: 28.797 - type: recall_at_10 value: 75.065 - type: recall_at_100 value: 94.6 - type: recall_at_1000 value: 98.967 - type: recall_at_3 value: 52.742 - type: recall_at_5 value: 63.012 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 69.84700000000001 - type: map_at_10 value: 83.91499999999999 - type: map_at_100 value: 84.568 - type: map_at_1000 value: 84.584 - type: map_at_3 value: 80.87299999999999 - type: map_at_5 value: 82.76299999999999 - type: mrr_at_1 value: 80.4 - type: mrr_at_10 value: 86.843 - type: mrr_at_100 value: 86.956 - type: mrr_at_1000 value: 86.957 - type: mrr_at_3 value: 85.843 - type: mrr_at_5 value: 86.521 - type: ndcg_at_1 value: 80.4 - type: ndcg_at_10 value: 87.787 - type: ndcg_at_100 value: 89.039 - type: ndcg_at_1000 value: 89.137 - type: ndcg_at_3 value: 84.76700000000001 - type: ndcg_at_5 value: 86.413 - type: precision_at_1 value: 80.4 - type: precision_at_10 value: 13.391 - type: precision_at_100 value: 1.533 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.123 - type: precision_at_5 value: 24.462 - type: recall_at_1 value: 69.84700000000001 - type: recall_at_10 value: 95.296 - type: recall_at_100 value: 99.543 - type: recall_at_1000 value: 99.98700000000001 - 
type: recall_at_3 value: 86.75 - type: recall_at_5 value: 91.33099999999999 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 54.24501738730203 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 61.28243705082983 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 3.473 - type: map_at_10 value: 8.944 - type: map_at_100 value: 11.21 - type: map_at_1000 value: 11.601 - type: map_at_3 value: 6.167 - type: map_at_5 value: 7.438000000000001 - type: mrr_at_1 value: 17.1 - type: mrr_at_10 value: 26.487 - type: mrr_at_100 value: 27.888 - type: mrr_at_1000 value: 27.961000000000002 - type: mrr_at_3 value: 23.25 - type: mrr_at_5 value: 24.91 - type: ndcg_at_1 value: 17.1 - type: ndcg_at_10 value: 15.615000000000002 - type: ndcg_at_100 value: 24.667 - type: ndcg_at_1000 value: 31.467 - type: ndcg_at_3 value: 14.035 - type: ndcg_at_5 value: 12.443 - type: precision_at_1 value: 17.1 - type: precision_at_10 value: 8.4 - type: precision_at_100 value: 2.149 - type: precision_at_1000 value: 0.378 - type: precision_at_3 value: 13.200000000000001 - type: precision_at_5 value: 11.06 - type: recall_at_1 value: 3.473 - type: recall_at_10 value: 17.087 - type: recall_at_100 value: 43.641999999999996 - type: recall_at_1000 value: 76.7 - type: recall_at_3 value: 8.037999999999998 - type: recall_at_5 value: 11.232000000000001 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 86.07032781899852 - type: cos_sim_spearman value: 81.86668245459153 - type: euclidean_pearson value: 83.75572948495356 - type: euclidean_spearman value: 81.88575221829207 - type: manhattan_pearson value: 83.73171218997966 - type: manhattan_spearman value: 81.85928771458329 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 80.29008828604368 - type: cos_sim_spearman value: 70.7510437896188 - type: euclidean_pearson value: 76.65867322096001 - type: euclidean_spearman value: 70.53984435296805 - type: manhattan_pearson value: 76.6398826461678 - type: manhattan_spearman value: 70.55153706770477 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 83.55610063096913 - type: cos_sim_spearman value: 84.36676850545378 - type: euclidean_pearson value: 82.81438612985889 - type: euclidean_spearman value: 84.182693686057 - type: manhattan_pearson value: 82.8355239074719 - type: manhattan_spearman value: 84.19280249146543 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 78.94275022740113 - type: cos_sim_spearman value: 74.50851813226338 - type: euclidean_pearson value: 77.30867917552419 - type: euclidean_spearman value: 74.55661368823343 - type: manhattan_pearson value: 77.31883134876524 - type: manhattan_spearman value: 74.58999819014154 
- task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 85.62907185533146 - type: cos_sim_spearman value: 86.40667080261993 - type: euclidean_pearson value: 85.15184748925726 - type: euclidean_spearman value: 86.33853519247509 - type: manhattan_pearson value: 85.21542426870172 - type: manhattan_spearman value: 86.4076178438401 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 83.42449758804275 - type: cos_sim_spearman value: 84.7411616479609 - type: euclidean_pearson value: 83.56616729612806 - type: euclidean_spearman value: 84.44493050289694 - type: manhattan_pearson value: 83.50906591764574 - type: manhattan_spearman value: 84.39704993090794 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 88.84843806728331 - type: cos_sim_spearman value: 89.03139214250334 - type: euclidean_pearson value: 89.63615835813032 - type: euclidean_spearman value: 89.33022202130817 - type: manhattan_pearson value: 89.67071925715891 - type: manhattan_spearman value: 89.29339683171531 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 65.65559857216783 - type: cos_sim_spearman value: 65.86805861979079 - type: euclidean_pearson value: 66.69697475461513 - type: euclidean_spearman value: 66.07735691378713 - type: manhattan_pearson value: 66.63427637906918 - type: manhattan_spearman value: 65.95720565040364 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 86.06435608928308 - type: cos_sim_spearman value: 86.46139340079428 - type: euclidean_pearson value: 86.4874804471064 - type: euclidean_spearman value: 86.19390771731406 - type: manhattan_pearson value: 86.51184704840284 - type: manhattan_spearman value: 86.19094101171963 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 85.10723925640346 - type: mrr value: 95.62579305226365 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 56.233 - type: map_at_10 value: 64.94 - type: map_at_100 value: 65.508 - type: map_at_1000 value: 65.537 - type: map_at_3 value: 62.121 - type: map_at_5 value: 63.92400000000001 - type: mrr_at_1 value: 58.667 - type: mrr_at_10 value: 66.352 - type: mrr_at_100 value: 66.751 - type: mrr_at_1000 value: 66.777 - type: mrr_at_3 value: 64.22200000000001 - type: mrr_at_5 value: 65.656 - type: ndcg_at_1 value: 58.667 - type: ndcg_at_10 value: 69.318 - type: ndcg_at_100 value: 71.822 - type: ndcg_at_1000 value: 72.578 - type: ndcg_at_3 value: 64.532 - type: ndcg_at_5 value: 67.292 - type: precision_at_1 value: 58.667 - type: precision_at_10 value: 9.133 - type: precision_at_100 value: 1.05 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 24.889 - type: precision_at_5 value: 16.733 - type: 
recall_at_1 value: 56.233 - type: recall_at_10 value: 81.206 - type: recall_at_100 value: 92.80000000000001 - type: recall_at_1000 value: 98.667 - type: recall_at_3 value: 68.672 - type: recall_at_5 value: 75.378 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.56336633663366 - type: cos_sim_ap value: 86.13024319858586 - type: cos_sim_f1 value: 76.80157946692991 - type: cos_sim_precision value: 75.82846003898635 - type: cos_sim_recall value: 77.8 - type: dot_accuracy value: 99.56336633663366 - type: dot_ap value: 86.13028343072267 - type: dot_f1 value: 76.80157946692991 - type: dot_precision value: 75.82846003898635 - type: dot_recall value: 77.8 - type: euclidean_accuracy value: 99.56336633663366 - type: euclidean_ap value: 86.13029040641543 - type: euclidean_f1 value: 76.80157946692991 - type: euclidean_precision value: 75.82846003898635 - type: euclidean_recall value: 77.8 - type: manhattan_accuracy value: 99.56534653465347 - type: manhattan_ap value: 86.24817068330776 - type: manhattan_f1 value: 77.13580246913581 - type: manhattan_precision value: 76.19512195121952 - type: manhattan_recall value: 78.10000000000001 - type: max_accuracy value: 99.56534653465347 - type: max_ap value: 86.24817068330776 - type: max_f1 value: 77.13580246913581 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 64.69564559409538 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 34.23127531581388 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 49.845357053686975 - type: mrr value: 50.59803656311009 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 29.02241691876377 - type: cos_sim_spearman value: 29.017719340560923 - type: dot_pearson value: 29.59373129445045 - type: dot_spearman value: 29.616196388331968 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.157 - type: map_at_10 value: 0.9440000000000001 - type: map_at_100 value: 4.61 - type: map_at_1000 value: 11.488 - type: map_at_3 value: 0.396 - type: map_at_5 value: 0.569 - type: mrr_at_1 value: 57.99999999999999 - type: mrr_at_10 value: 71.672 - type: mrr_at_100 value: 71.707 - type: mrr_at_1000 value: 71.707 - type: mrr_at_3 value: 68.333 - type: mrr_at_5 value: 70.533 - type: ndcg_at_1 value: 54.0 - type: ndcg_at_10 value: 45.216 - type: ndcg_at_100 value: 32.623999999999995 - type: ndcg_at_1000 value: 33.006 - type: ndcg_at_3 value: 51.76500000000001 - type: ndcg_at_5 value: 47.888999999999996 - type: precision_at_1 value: 57.99999999999999 - type: precision_at_10 value: 48.0 - type: precision_at_100 value: 32.74 - type: precision_at_1000 value: 14.588000000000001 - type: precision_at_3 value: 55.333 - 
type: precision_at_5 value: 51.2 - type: recall_at_1 value: 0.157 - type: recall_at_10 value: 1.212 - type: recall_at_100 value: 7.868 - type: recall_at_1000 value: 31.583 - type: recall_at_3 value: 0.443 - type: recall_at_5 value: 0.6779999999999999 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 1.545 - type: map_at_10 value: 4.6690000000000005 - type: map_at_100 value: 8.982 - type: map_at_1000 value: 10.453999999999999 - type: map_at_3 value: 2.35 - type: map_at_5 value: 3.168 - type: mrr_at_1 value: 18.367 - type: mrr_at_10 value: 28.599999999999998 - type: mrr_at_100 value: 30.287 - type: mrr_at_1000 value: 30.339 - type: mrr_at_3 value: 24.490000000000002 - type: mrr_at_5 value: 27.040999999999997 - type: ndcg_at_1 value: 17.347 - type: ndcg_at_10 value: 13.868 - type: ndcg_at_100 value: 25.499 - type: ndcg_at_1000 value: 37.922 - type: ndcg_at_3 value: 13.746 - type: ndcg_at_5 value: 13.141 - type: precision_at_1 value: 18.367 - type: precision_at_10 value: 12.653 - type: precision_at_100 value: 5.776 - type: precision_at_1000 value: 1.3860000000000001 - type: precision_at_3 value: 13.605 - type: precision_at_5 value: 13.061 - type: recall_at_1 value: 1.545 - type: recall_at_10 value: 9.305 - type: recall_at_100 value: 38.084 - type: recall_at_1000 value: 75.897 - type: recall_at_3 value: 2.903 - type: recall_at_5 value: 4.8919999999999995 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 70.8454 - type: ap value: 14.744783758537974 - type: f1 value: 54.86055534008869 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 58.71250707413695 - type: f1 value: 58.76581794782603 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 49.314744135178934 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 84.13899982118377 - type: cos_sim_ap value: 68.03329474978145 - type: cos_sim_f1 value: 63.31192005710206 - type: cos_sim_precision value: 57.6473136915078 - type: cos_sim_recall value: 70.21108179419525 - type: dot_accuracy value: 84.13899982118377 - type: dot_ap value: 68.03324775052695 - type: dot_f1 value: 63.31192005710206 - type: dot_precision value: 57.6473136915078 - type: dot_recall value: 70.21108179419525 - type: euclidean_accuracy value: 84.13899982118377 - type: euclidean_ap value: 68.03331114508686 - type: euclidean_f1 value: 63.31192005710206 - type: euclidean_precision value: 57.6473136915078 - type: euclidean_recall value: 70.21108179419525 - type: manhattan_accuracy value: 84.12111819753234 - type: manhattan_ap value: 67.97378509663328 - type: manhattan_f1 value: 63.38468945594607 - type: manhattan_precision value: 58.2779991146525 - type: manhattan_recall value: 69.47229551451187 - type: max_accuracy value: 84.13899982118377 - type: max_ap value: 
68.03331114508686 - type: max_f1 value: 63.38468945594607 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 87.68774013272791 - type: cos_sim_ap value: 83.51733662214374 - type: cos_sim_f1 value: 75.82190771045259 - type: cos_sim_precision value: 72.72341628959276 - type: cos_sim_recall value: 79.19618109023713 - type: dot_accuracy value: 87.68774013272791 - type: dot_ap value: 83.5173527754126 - type: dot_f1 value: 75.82190771045259 - type: dot_precision value: 72.72341628959276 - type: dot_recall value: 79.19618109023713 - type: euclidean_accuracy value: 87.68774013272791 - type: euclidean_ap value: 83.51734651146224 - type: euclidean_f1 value: 75.82190771045259 - type: euclidean_precision value: 72.72341628959276 - type: euclidean_recall value: 79.19618109023713 - type: manhattan_accuracy value: 87.67221640082276 - type: manhattan_ap value: 83.51179463759505 - type: manhattan_f1 value: 75.76243980738361 - type: manhattan_precision value: 71.99112590127565 - type: manhattan_recall value: 79.95072374499537 - type: max_accuracy value: 87.68774013272791 - type: max_ap value: 83.5173527754126 - type: max_f1 value: 75.82190771045259 ---
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
Muennighoff/SGPT-5.8B-weightedmean-msmarco-specb-bitfit
Muennighoff
sentence-similarity
[ "sentence-transformers", "pytorch", "gptj", "feature-extraction", "sentence-similarity", "mteb", "arxiv:2202.08904", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,646
1,679
83
23
--- pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb model-index: - name: SGPT-5.8B-weightedmean-msmarco-specb-bitfit results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: 2d8a100785abf0ae21420d2a55b0c56e3e1ea996 metrics: - type: accuracy value: 69.22388059701493 - type: ap value: 32.04724673950256 - type: f1 value: 63.25719825770428 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: 80714f8dcf8cefc218ef4f8c5a966dd83f75a0e1 metrics: - type: accuracy value: 71.26109999999998 - type: ap value: 66.16336378255403 - type: f1 value: 70.89719145825303 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: c379a6705fec24a2493fa68e011692605f44e119 metrics: - type: accuracy value: 39.19199999999999 - type: f1 value: 38.580766731113826 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: 5b3e3697907184a9b77a3c99ee9ea1a9cbb1e4e3 metrics: - type: map_at_1 value: 27.311999999999998 - type: map_at_10 value: 42.620000000000005 - type: map_at_100 value: 43.707 - type: map_at_1000 value: 43.714999999999996 - type: map_at_3 value: 37.624 - type: map_at_5 value: 40.498 - type: mrr_at_1 value: 27.667 - type: mrr_at_10 value: 42.737 - type: mrr_at_100 value: 43.823 - type: mrr_at_1000 value: 43.830999999999996 - type: mrr_at_3 value: 37.743 - type: mrr_at_5 value: 40.616 - type: ndcg_at_1 value: 27.311999999999998 - type: ndcg_at_10 value: 51.37500000000001 - type: ndcg_at_100 value: 55.778000000000006 - type: ndcg_at_1000 value: 55.96600000000001 - type: ndcg_at_3 value: 41.087 - type: ndcg_at_5 value: 46.269 - type: precision_at_1 value: 27.311999999999998 - type: precision_at_10 value: 7.945 - type: precision_at_100 value: 0.9820000000000001 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 17.046 - type: precision_at_5 value: 12.745000000000001 - type: recall_at_1 value: 27.311999999999998 - type: recall_at_10 value: 79.445 - type: recall_at_100 value: 98.151 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 51.13799999999999 - type: recall_at_5 value: 63.727000000000004 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: 0bbdb47bcbe3a90093699aefeed338a0f28a7ee8 metrics: - type: v_measure value: 45.59037428592033 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: b73bd54100e5abfa6e3a23dcafb46fe4d2438dc3 metrics: - type: v_measure value: 38.86371701986363 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 4d853f94cd57d85ec13805aeeac3ae3e5eb4c49c metrics: - type: map value: 61.625568691427766 - type: mrr value: 75.83256386580486 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: 9ee918f184421b6bd48b78f6c714d86546106103 metrics: - type: cos_sim_pearson value: 89.96074355094802 - type: cos_sim_spearman value: 86.2501580394454 - type: euclidean_pearson value: 82.18427440380462 - type: euclidean_spearman value: 80.14760935017947 - type: manhattan_pearson 
value: 82.24621578156392 - type: manhattan_spearman value: 80.00363016590163 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 44fa15921b4c889113cc5df03dd4901b49161ab7 metrics: - type: accuracy value: 84.49350649350649 - type: f1 value: 84.4249343233736 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 11d0121201d1f1f280e8cc8f3d98fb9c4d9f9c55 metrics: - type: v_measure value: 36.551459722989385 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: c0fab014e1bcb8d3a5e31b2088972a1e01547dc1 metrics: - type: v_measure value: 33.69901851846774 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db metrics: - type: map_at_1 value: 30.499 - type: map_at_10 value: 41.208 - type: map_at_100 value: 42.638 - type: map_at_1000 value: 42.754 - type: map_at_3 value: 37.506 - type: map_at_5 value: 39.422000000000004 - type: mrr_at_1 value: 37.339 - type: mrr_at_10 value: 47.051 - type: mrr_at_100 value: 47.745 - type: mrr_at_1000 value: 47.786 - type: mrr_at_3 value: 44.086999999999996 - type: mrr_at_5 value: 45.711 - type: ndcg_at_1 value: 37.339 - type: ndcg_at_10 value: 47.666 - type: ndcg_at_100 value: 52.994 - type: ndcg_at_1000 value: 54.928999999999995 - type: ndcg_at_3 value: 41.982 - type: ndcg_at_5 value: 44.42 - type: precision_at_1 value: 37.339 - type: precision_at_10 value: 9.127 - type: precision_at_100 value: 1.4749999999999999 - type: precision_at_1000 value: 0.194 - type: precision_at_3 value: 20.076 - type: precision_at_5 value: 14.449000000000002 - type: recall_at_1 value: 30.499 - type: recall_at_10 value: 60.328 - type: recall_at_100 value: 82.57900000000001 - type: recall_at_1000 value: 95.074 - type: recall_at_3 value: 44.17 - type: recall_at_5 value: 50.94 - type: map_at_1 value: 30.613 - type: map_at_10 value: 40.781 - type: map_at_100 value: 42.018 - type: map_at_1000 value: 42.132999999999996 - type: map_at_3 value: 37.816 - type: map_at_5 value: 39.389 - type: mrr_at_1 value: 38.408 - type: mrr_at_10 value: 46.631 - type: mrr_at_100 value: 47.332 - type: mrr_at_1000 value: 47.368 - type: mrr_at_3 value: 44.384 - type: mrr_at_5 value: 45.661 - type: ndcg_at_1 value: 38.408 - type: ndcg_at_10 value: 46.379999999999995 - type: ndcg_at_100 value: 50.81 - type: ndcg_at_1000 value: 52.663000000000004 - type: ndcg_at_3 value: 42.18 - type: ndcg_at_5 value: 43.974000000000004 - type: precision_at_1 value: 38.408 - type: precision_at_10 value: 8.656 - type: precision_at_100 value: 1.3860000000000001 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 20.276 - type: precision_at_5 value: 14.241999999999999 - type: recall_at_1 value: 30.613 - type: recall_at_10 value: 56.44 - type: recall_at_100 value: 75.044 - type: recall_at_1000 value: 86.426 - type: recall_at_3 value: 43.766 - type: recall_at_5 value: 48.998000000000005 - type: map_at_1 value: 37.370999999999995 - type: map_at_10 value: 49.718 - type: map_at_100 value: 50.737 - type: map_at_1000 value: 50.79 - type: map_at_3 value: 46.231 - type: map_at_5 value: 48.329 - type: mrr_at_1 value: 42.884 - type: mrr_at_10 value: 53.176 - type: mrr_at_100 value: 53.81700000000001 - type: mrr_at_1000 value: 53.845 - type: mrr_at_3 value: 50.199000000000005 
- type: mrr_at_5 value: 52.129999999999995 - type: ndcg_at_1 value: 42.884 - type: ndcg_at_10 value: 55.826 - type: ndcg_at_100 value: 59.93000000000001 - type: ndcg_at_1000 value: 61.013 - type: ndcg_at_3 value: 49.764 - type: ndcg_at_5 value: 53.025999999999996 - type: precision_at_1 value: 42.884 - type: precision_at_10 value: 9.046999999999999 - type: precision_at_100 value: 1.212 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 22.131999999999998 - type: precision_at_5 value: 15.524 - type: recall_at_1 value: 37.370999999999995 - type: recall_at_10 value: 70.482 - type: recall_at_100 value: 88.425 - type: recall_at_1000 value: 96.03399999999999 - type: recall_at_3 value: 54.43 - type: recall_at_5 value: 62.327999999999996 - type: map_at_1 value: 22.875999999999998 - type: map_at_10 value: 31.715 - type: map_at_100 value: 32.847 - type: map_at_1000 value: 32.922000000000004 - type: map_at_3 value: 29.049999999999997 - type: map_at_5 value: 30.396 - type: mrr_at_1 value: 24.52 - type: mrr_at_10 value: 33.497 - type: mrr_at_100 value: 34.455000000000005 - type: mrr_at_1000 value: 34.510000000000005 - type: mrr_at_3 value: 30.791 - type: mrr_at_5 value: 32.175 - type: ndcg_at_1 value: 24.52 - type: ndcg_at_10 value: 36.95 - type: ndcg_at_100 value: 42.238 - type: ndcg_at_1000 value: 44.147999999999996 - type: ndcg_at_3 value: 31.435000000000002 - type: ndcg_at_5 value: 33.839000000000006 - type: precision_at_1 value: 24.52 - type: precision_at_10 value: 5.9319999999999995 - type: precision_at_100 value: 0.901 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 13.446 - type: precision_at_5 value: 9.469 - type: recall_at_1 value: 22.875999999999998 - type: recall_at_10 value: 51.38 - type: recall_at_100 value: 75.31099999999999 - type: recall_at_1000 value: 89.718 - type: recall_at_3 value: 36.26 - type: recall_at_5 value: 42.248999999999995 - type: map_at_1 value: 14.984 - type: map_at_10 value: 23.457 - type: map_at_100 value: 24.723 - type: map_at_1000 value: 24.846 - type: map_at_3 value: 20.873 - type: map_at_5 value: 22.357 - type: mrr_at_1 value: 18.159 - type: mrr_at_10 value: 27.431 - type: mrr_at_100 value: 28.449 - type: mrr_at_1000 value: 28.52 - type: mrr_at_3 value: 24.979000000000003 - type: mrr_at_5 value: 26.447 - type: ndcg_at_1 value: 18.159 - type: ndcg_at_10 value: 28.627999999999997 - type: ndcg_at_100 value: 34.741 - type: ndcg_at_1000 value: 37.516 - type: ndcg_at_3 value: 23.902 - type: ndcg_at_5 value: 26.294 - type: precision_at_1 value: 18.159 - type: precision_at_10 value: 5.485 - type: precision_at_100 value: 0.985 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 11.774 - type: precision_at_5 value: 8.731 - type: recall_at_1 value: 14.984 - type: recall_at_10 value: 40.198 - type: recall_at_100 value: 67.11500000000001 - type: recall_at_1000 value: 86.497 - type: recall_at_3 value: 27.639000000000003 - type: recall_at_5 value: 33.595000000000006 - type: map_at_1 value: 29.067 - type: map_at_10 value: 39.457 - type: map_at_100 value: 40.83 - type: map_at_1000 value: 40.94 - type: map_at_3 value: 35.995 - type: map_at_5 value: 38.159 - type: mrr_at_1 value: 34.937000000000005 - type: mrr_at_10 value: 44.755 - type: mrr_at_100 value: 45.549 - type: mrr_at_1000 value: 45.589 - type: mrr_at_3 value: 41.947 - type: mrr_at_5 value: 43.733 - type: ndcg_at_1 value: 34.937000000000005 - type: ndcg_at_10 value: 45.573 - type: ndcg_at_100 value: 51.266999999999996 - type: ndcg_at_1000 value: 53.184 - type: ndcg_at_3 
value: 39.961999999999996 - type: ndcg_at_5 value: 43.02 - type: precision_at_1 value: 34.937000000000005 - type: precision_at_10 value: 8.296000000000001 - type: precision_at_100 value: 1.32 - type: precision_at_1000 value: 0.167 - type: precision_at_3 value: 18.8 - type: precision_at_5 value: 13.763 - type: recall_at_1 value: 29.067 - type: recall_at_10 value: 58.298 - type: recall_at_100 value: 82.25099999999999 - type: recall_at_1000 value: 94.476 - type: recall_at_3 value: 42.984 - type: recall_at_5 value: 50.658 - type: map_at_1 value: 25.985999999999997 - type: map_at_10 value: 35.746 - type: map_at_100 value: 37.067 - type: map_at_1000 value: 37.191 - type: map_at_3 value: 32.599000000000004 - type: map_at_5 value: 34.239000000000004 - type: mrr_at_1 value: 31.735000000000003 - type: mrr_at_10 value: 40.515 - type: mrr_at_100 value: 41.459 - type: mrr_at_1000 value: 41.516 - type: mrr_at_3 value: 37.938 - type: mrr_at_5 value: 39.25 - type: ndcg_at_1 value: 31.735000000000003 - type: ndcg_at_10 value: 41.484 - type: ndcg_at_100 value: 47.047 - type: ndcg_at_1000 value: 49.427 - type: ndcg_at_3 value: 36.254999999999995 - type: ndcg_at_5 value: 38.375 - type: precision_at_1 value: 31.735000000000003 - type: precision_at_10 value: 7.66 - type: precision_at_100 value: 1.234 - type: precision_at_1000 value: 0.16 - type: precision_at_3 value: 17.427999999999997 - type: precision_at_5 value: 12.328999999999999 - type: recall_at_1 value: 25.985999999999997 - type: recall_at_10 value: 53.761 - type: recall_at_100 value: 77.149 - type: recall_at_1000 value: 93.342 - type: recall_at_3 value: 39.068000000000005 - type: recall_at_5 value: 44.693 - type: map_at_1 value: 24.949749999999998 - type: map_at_10 value: 34.04991666666667 - type: map_at_100 value: 35.26825 - type: map_at_1000 value: 35.38316666666667 - type: map_at_3 value: 31.181333333333335 - type: map_at_5 value: 32.77391666666667 - type: mrr_at_1 value: 29.402833333333334 - type: mrr_at_10 value: 38.01633333333333 - type: mrr_at_100 value: 38.88033333333334 - type: mrr_at_1000 value: 38.938500000000005 - type: mrr_at_3 value: 35.5175 - type: mrr_at_5 value: 36.93808333333333 - type: ndcg_at_1 value: 29.402833333333334 - type: ndcg_at_10 value: 39.403166666666664 - type: ndcg_at_100 value: 44.66408333333333 - type: ndcg_at_1000 value: 46.96283333333333 - type: ndcg_at_3 value: 34.46633333333334 - type: ndcg_at_5 value: 36.78441666666667 - type: precision_at_1 value: 29.402833333333334 - type: precision_at_10 value: 6.965833333333333 - type: precision_at_100 value: 1.1330833333333334 - type: precision_at_1000 value: 0.15158333333333335 - type: precision_at_3 value: 15.886666666666665 - type: precision_at_5 value: 11.360416666666667 - type: recall_at_1 value: 24.949749999999998 - type: recall_at_10 value: 51.29325 - type: recall_at_100 value: 74.3695 - type: recall_at_1000 value: 90.31299999999999 - type: recall_at_3 value: 37.580083333333334 - type: recall_at_5 value: 43.529666666666664 - type: map_at_1 value: 22.081999999999997 - type: map_at_10 value: 29.215999999999998 - type: map_at_100 value: 30.163 - type: map_at_1000 value: 30.269000000000002 - type: map_at_3 value: 26.942 - type: map_at_5 value: 28.236 - type: mrr_at_1 value: 24.847 - type: mrr_at_10 value: 31.918999999999997 - type: mrr_at_100 value: 32.817 - type: mrr_at_1000 value: 32.897 - type: mrr_at_3 value: 29.831000000000003 - type: mrr_at_5 value: 31.019999999999996 - type: ndcg_at_1 value: 24.847 - type: ndcg_at_10 value: 33.4 - type: ndcg_at_100 value: 38.354 - 
type: ndcg_at_1000 value: 41.045 - type: ndcg_at_3 value: 29.236 - type: ndcg_at_5 value: 31.258000000000003 - type: precision_at_1 value: 24.847 - type: precision_at_10 value: 5.353 - type: precision_at_100 value: 0.853 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 12.679000000000002 - type: precision_at_5 value: 8.988 - type: recall_at_1 value: 22.081999999999997 - type: recall_at_10 value: 43.505 - type: recall_at_100 value: 66.45400000000001 - type: recall_at_1000 value: 86.378 - type: recall_at_3 value: 32.163000000000004 - type: recall_at_5 value: 37.059999999999995 - type: map_at_1 value: 15.540000000000001 - type: map_at_10 value: 22.362000000000002 - type: map_at_100 value: 23.435 - type: map_at_1000 value: 23.564 - type: map_at_3 value: 20.143 - type: map_at_5 value: 21.324 - type: mrr_at_1 value: 18.892 - type: mrr_at_10 value: 25.942999999999998 - type: mrr_at_100 value: 26.883000000000003 - type: mrr_at_1000 value: 26.968999999999998 - type: mrr_at_3 value: 23.727 - type: mrr_at_5 value: 24.923000000000002 - type: ndcg_at_1 value: 18.892 - type: ndcg_at_10 value: 26.811 - type: ndcg_at_100 value: 32.066 - type: ndcg_at_1000 value: 35.166 - type: ndcg_at_3 value: 22.706 - type: ndcg_at_5 value: 24.508 - type: precision_at_1 value: 18.892 - type: precision_at_10 value: 4.942 - type: precision_at_100 value: 0.878 - type: precision_at_1000 value: 0.131 - type: precision_at_3 value: 10.748000000000001 - type: precision_at_5 value: 7.784000000000001 - type: recall_at_1 value: 15.540000000000001 - type: recall_at_10 value: 36.742999999999995 - type: recall_at_100 value: 60.525 - type: recall_at_1000 value: 82.57600000000001 - type: recall_at_3 value: 25.252000000000002 - type: recall_at_5 value: 29.872 - type: map_at_1 value: 24.453 - type: map_at_10 value: 33.363 - type: map_at_100 value: 34.579 - type: map_at_1000 value: 34.686 - type: map_at_3 value: 30.583 - type: map_at_5 value: 32.118 - type: mrr_at_1 value: 28.918 - type: mrr_at_10 value: 37.675 - type: mrr_at_100 value: 38.567 - type: mrr_at_1000 value: 38.632 - type: mrr_at_3 value: 35.260999999999996 - type: mrr_at_5 value: 36.576 - type: ndcg_at_1 value: 28.918 - type: ndcg_at_10 value: 38.736 - type: ndcg_at_100 value: 44.261 - type: ndcg_at_1000 value: 46.72 - type: ndcg_at_3 value: 33.81 - type: ndcg_at_5 value: 36.009 - type: precision_at_1 value: 28.918 - type: precision_at_10 value: 6.586 - type: precision_at_100 value: 1.047 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 15.360999999999999 - type: precision_at_5 value: 10.857999999999999 - type: recall_at_1 value: 24.453 - type: recall_at_10 value: 50.885999999999996 - type: recall_at_100 value: 75.03 - type: recall_at_1000 value: 92.123 - type: recall_at_3 value: 37.138 - type: recall_at_5 value: 42.864999999999995 - type: map_at_1 value: 24.57 - type: map_at_10 value: 33.672000000000004 - type: map_at_100 value: 35.244 - type: map_at_1000 value: 35.467 - type: map_at_3 value: 30.712 - type: map_at_5 value: 32.383 - type: mrr_at_1 value: 29.644 - type: mrr_at_10 value: 38.344 - type: mrr_at_100 value: 39.219 - type: mrr_at_1000 value: 39.282000000000004 - type: mrr_at_3 value: 35.771 - type: mrr_at_5 value: 37.273 - type: ndcg_at_1 value: 29.644 - type: ndcg_at_10 value: 39.567 - type: ndcg_at_100 value: 45.097 - type: ndcg_at_1000 value: 47.923 - type: ndcg_at_3 value: 34.768 - type: ndcg_at_5 value: 37.122 - type: precision_at_1 value: 29.644 - type: precision_at_10 value: 7.5889999999999995 - type: 
precision_at_100 value: 1.478 - type: precision_at_1000 value: 0.23500000000000001 - type: precision_at_3 value: 16.337 - type: precision_at_5 value: 12.055 - type: recall_at_1 value: 24.57 - type: recall_at_10 value: 51.00900000000001 - type: recall_at_100 value: 75.423 - type: recall_at_1000 value: 93.671 - type: recall_at_3 value: 36.925999999999995 - type: recall_at_5 value: 43.245 - type: map_at_1 value: 21.356 - type: map_at_10 value: 27.904 - type: map_at_100 value: 28.938000000000002 - type: map_at_1000 value: 29.036 - type: map_at_3 value: 25.726 - type: map_at_5 value: 26.935 - type: mrr_at_1 value: 22.551 - type: mrr_at_10 value: 29.259 - type: mrr_at_100 value: 30.272 - type: mrr_at_1000 value: 30.348000000000003 - type: mrr_at_3 value: 27.295 - type: mrr_at_5 value: 28.358 - type: ndcg_at_1 value: 22.551 - type: ndcg_at_10 value: 31.817 - type: ndcg_at_100 value: 37.164 - type: ndcg_at_1000 value: 39.82 - type: ndcg_at_3 value: 27.595999999999997 - type: ndcg_at_5 value: 29.568 - type: precision_at_1 value: 22.551 - type: precision_at_10 value: 4.917 - type: precision_at_100 value: 0.828 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_3 value: 11.583 - type: precision_at_5 value: 8.133 - type: recall_at_1 value: 21.356 - type: recall_at_10 value: 42.489 - type: recall_at_100 value: 67.128 - type: recall_at_1000 value: 87.441 - type: recall_at_3 value: 31.165 - type: recall_at_5 value: 35.853 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: 392b78eb68c07badcd7c2cd8f39af108375dfcce metrics: - type: map_at_1 value: 12.306000000000001 - type: map_at_10 value: 21.523 - type: map_at_100 value: 23.358 - type: map_at_1000 value: 23.541 - type: map_at_3 value: 17.809 - type: map_at_5 value: 19.631 - type: mrr_at_1 value: 27.948 - type: mrr_at_10 value: 40.355000000000004 - type: mrr_at_100 value: 41.166000000000004 - type: mrr_at_1000 value: 41.203 - type: mrr_at_3 value: 36.819 - type: mrr_at_5 value: 38.958999999999996 - type: ndcg_at_1 value: 27.948 - type: ndcg_at_10 value: 30.462 - type: ndcg_at_100 value: 37.473 - type: ndcg_at_1000 value: 40.717999999999996 - type: ndcg_at_3 value: 24.646 - type: ndcg_at_5 value: 26.642 - type: precision_at_1 value: 27.948 - type: precision_at_10 value: 9.648 - type: precision_at_100 value: 1.7239999999999998 - type: precision_at_1000 value: 0.232 - type: precision_at_3 value: 18.48 - type: precision_at_5 value: 14.293 - type: recall_at_1 value: 12.306000000000001 - type: recall_at_10 value: 37.181 - type: recall_at_100 value: 61.148 - type: recall_at_1000 value: 79.401 - type: recall_at_3 value: 22.883 - type: recall_at_5 value: 28.59 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: f097057d03ed98220bc7309ddb10b71a54d667d6 metrics: - type: map_at_1 value: 9.357 - type: map_at_10 value: 18.849 - type: map_at_100 value: 25.369000000000003 - type: map_at_1000 value: 26.950000000000003 - type: map_at_3 value: 13.625000000000002 - type: map_at_5 value: 15.956999999999999 - type: mrr_at_1 value: 67.75 - type: mrr_at_10 value: 74.734 - type: mrr_at_100 value: 75.1 - type: mrr_at_1000 value: 75.10900000000001 - type: mrr_at_3 value: 73.542 - type: mrr_at_5 value: 74.167 - type: ndcg_at_1 value: 55.375 - type: ndcg_at_10 value: 39.873999999999995 - type: ndcg_at_100 value: 43.098 - type: ndcg_at_1000 value: 50.69200000000001 - type: ndcg_at_3 value: 44.856 - type: ndcg_at_5 value: 42.138999999999996 - 
type: precision_at_1 value: 67.75 - type: precision_at_10 value: 31.1 - type: precision_at_100 value: 9.303 - type: precision_at_1000 value: 2.0060000000000002 - type: precision_at_3 value: 48.25 - type: precision_at_5 value: 40.949999999999996 - type: recall_at_1 value: 9.357 - type: recall_at_10 value: 23.832 - type: recall_at_100 value: 47.906 - type: recall_at_1000 value: 71.309 - type: recall_at_3 value: 14.512 - type: recall_at_5 value: 18.3 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 829147f8f75a25f005913200eb5ed41fae320aa1 metrics: - type: accuracy value: 49.655 - type: f1 value: 45.51976190938951 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: 1429cf27e393599b8b359b9b72c666f96b2525f9 metrics: - type: map_at_1 value: 62.739999999999995 - type: map_at_10 value: 73.07000000000001 - type: map_at_100 value: 73.398 - type: map_at_1000 value: 73.41 - type: map_at_3 value: 71.33800000000001 - type: map_at_5 value: 72.423 - type: mrr_at_1 value: 67.777 - type: mrr_at_10 value: 77.873 - type: mrr_at_100 value: 78.091 - type: mrr_at_1000 value: 78.094 - type: mrr_at_3 value: 76.375 - type: mrr_at_5 value: 77.316 - type: ndcg_at_1 value: 67.777 - type: ndcg_at_10 value: 78.24 - type: ndcg_at_100 value: 79.557 - type: ndcg_at_1000 value: 79.814 - type: ndcg_at_3 value: 75.125 - type: ndcg_at_5 value: 76.834 - type: precision_at_1 value: 67.777 - type: precision_at_10 value: 9.832 - type: precision_at_100 value: 1.061 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 29.433 - type: precision_at_5 value: 18.665000000000003 - type: recall_at_1 value: 62.739999999999995 - type: recall_at_10 value: 89.505 - type: recall_at_100 value: 95.102 - type: recall_at_1000 value: 96.825 - type: recall_at_3 value: 81.028 - type: recall_at_5 value: 85.28099999999999 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: 41b686a7f28c59bcaaa5791efd47c67c8ebe28be metrics: - type: map_at_1 value: 18.467 - type: map_at_10 value: 30.020999999999997 - type: map_at_100 value: 31.739 - type: map_at_1000 value: 31.934 - type: map_at_3 value: 26.003 - type: map_at_5 value: 28.338 - type: mrr_at_1 value: 35.339999999999996 - type: mrr_at_10 value: 44.108999999999995 - type: mrr_at_100 value: 44.993 - type: mrr_at_1000 value: 45.042 - type: mrr_at_3 value: 41.667 - type: mrr_at_5 value: 43.14 - type: ndcg_at_1 value: 35.339999999999996 - type: ndcg_at_10 value: 37.202 - type: ndcg_at_100 value: 43.852999999999994 - type: ndcg_at_1000 value: 47.235 - type: ndcg_at_3 value: 33.5 - type: ndcg_at_5 value: 34.985 - type: precision_at_1 value: 35.339999999999996 - type: precision_at_10 value: 10.247 - type: precision_at_100 value: 1.7149999999999999 - type: precision_at_1000 value: 0.232 - type: precision_at_3 value: 22.222 - type: precision_at_5 value: 16.573999999999998 - type: recall_at_1 value: 18.467 - type: recall_at_10 value: 44.080999999999996 - type: recall_at_100 value: 68.72200000000001 - type: recall_at_1000 value: 89.087 - type: recall_at_3 value: 30.567 - type: recall_at_5 value: 36.982 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: 766870b35a1b9ca65e67a0d1913899973551fc6c metrics: - type: map_at_1 value: 35.726 - type: map_at_10 value: 50.207 - type: map_at_100 value: 51.05499999999999 - type: map_at_1000 value: 51.12799999999999 - type: map_at_3 value: 
47.576 - type: map_at_5 value: 49.172 - type: mrr_at_1 value: 71.452 - type: mrr_at_10 value: 77.41900000000001 - type: mrr_at_100 value: 77.711 - type: mrr_at_1000 value: 77.723 - type: mrr_at_3 value: 76.39399999999999 - type: mrr_at_5 value: 77.00099999999999 - type: ndcg_at_1 value: 71.452 - type: ndcg_at_10 value: 59.260999999999996 - type: ndcg_at_100 value: 62.424 - type: ndcg_at_1000 value: 63.951 - type: ndcg_at_3 value: 55.327000000000005 - type: ndcg_at_5 value: 57.416999999999994 - type: precision_at_1 value: 71.452 - type: precision_at_10 value: 12.061 - type: precision_at_100 value: 1.455 - type: precision_at_1000 value: 0.166 - type: precision_at_3 value: 34.36 - type: precision_at_5 value: 22.266 - type: recall_at_1 value: 35.726 - type: recall_at_10 value: 60.304 - type: recall_at_100 value: 72.75500000000001 - type: recall_at_1000 value: 82.978 - type: recall_at_3 value: 51.54 - type: recall_at_5 value: 55.665 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 8d743909f834c38949e8323a8a6ce8721ea6c7f4 metrics: - type: accuracy value: 66.63759999999999 - type: ap value: 61.48938261286748 - type: f1 value: 66.35089269264965 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: validation revision: e6838a846e2408f22cf5cc337ebc83e0bcf77849 metrics: - type: map_at_1 value: 20.842 - type: map_at_10 value: 32.992 - type: map_at_100 value: 34.236 - type: map_at_1000 value: 34.286 - type: map_at_3 value: 29.049000000000003 - type: map_at_5 value: 31.391999999999996 - type: mrr_at_1 value: 21.375 - type: mrr_at_10 value: 33.581 - type: mrr_at_100 value: 34.760000000000005 - type: mrr_at_1000 value: 34.803 - type: mrr_at_3 value: 29.704000000000004 - type: mrr_at_5 value: 32.015 - type: ndcg_at_1 value: 21.375 - type: ndcg_at_10 value: 39.905 - type: ndcg_at_100 value: 45.843 - type: ndcg_at_1000 value: 47.083999999999996 - type: ndcg_at_3 value: 31.918999999999997 - type: ndcg_at_5 value: 36.107 - type: precision_at_1 value: 21.375 - type: precision_at_10 value: 6.393 - type: precision_at_100 value: 0.935 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 13.663 - type: precision_at_5 value: 10.324 - type: recall_at_1 value: 20.842 - type: recall_at_10 value: 61.17 - type: recall_at_100 value: 88.518 - type: recall_at_1000 value: 97.993 - type: recall_at_3 value: 39.571 - type: recall_at_5 value: 49.653999999999996 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: a7e2a951126a26fc8c6a69f835f33a346ba259e3 metrics: - type: accuracy value: 93.46557227542178 - type: f1 value: 92.87345917772146 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: 6299947a7777084cc2d4b64235bf7190381ce755 metrics: - type: accuracy value: 72.42134062927497 - type: f1 value: 55.03624810959269 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 072a486a144adf7f4479a4a0dddb2152e161e1ea metrics: - type: accuracy value: 70.3866845998655 - type: f1 value: 68.9674519872921 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 76.27774041694687 - type: f1 value: 
76.72936190462792 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: dcefc037ef84348e49b0d29109e891c01067226b metrics: - type: v_measure value: 31.511745925773337 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 3cd0e71dfbe09d4de0f9e5ecba43e7ce280959dc metrics: - type: v_measure value: 28.764235987575365 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 32.29353136386601 - type: mrr value: 33.536774455851685 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: 7eb63cc0c1eb59324d709ebed25fcab851fa7610 metrics: - type: map_at_1 value: 5.702 - type: map_at_10 value: 13.642000000000001 - type: map_at_100 value: 17.503 - type: map_at_1000 value: 19.126 - type: map_at_3 value: 9.748 - type: map_at_5 value: 11.642 - type: mrr_at_1 value: 45.82 - type: mrr_at_10 value: 54.821 - type: mrr_at_100 value: 55.422000000000004 - type: mrr_at_1000 value: 55.452999999999996 - type: mrr_at_3 value: 52.373999999999995 - type: mrr_at_5 value: 53.937000000000005 - type: ndcg_at_1 value: 44.272 - type: ndcg_at_10 value: 36.213 - type: ndcg_at_100 value: 33.829 - type: ndcg_at_1000 value: 42.557 - type: ndcg_at_3 value: 40.814 - type: ndcg_at_5 value: 39.562000000000005 - type: precision_at_1 value: 45.511 - type: precision_at_10 value: 27.214 - type: precision_at_100 value: 8.941 - type: precision_at_1000 value: 2.1870000000000003 - type: precision_at_3 value: 37.874 - type: precision_at_5 value: 34.489 - type: recall_at_1 value: 5.702 - type: recall_at_10 value: 17.638 - type: recall_at_100 value: 34.419 - type: recall_at_1000 value: 66.41 - type: recall_at_3 value: 10.914 - type: recall_at_5 value: 14.032 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: 6062aefc120bfe8ece5897809fb2e53bfe0d128c metrics: - type: map_at_1 value: 30.567 - type: map_at_10 value: 45.01 - type: map_at_100 value: 46.091 - type: map_at_1000 value: 46.126 - type: map_at_3 value: 40.897 - type: map_at_5 value: 43.301 - type: mrr_at_1 value: 34.56 - type: mrr_at_10 value: 47.725 - type: mrr_at_100 value: 48.548 - type: mrr_at_1000 value: 48.571999999999996 - type: mrr_at_3 value: 44.361 - type: mrr_at_5 value: 46.351 - type: ndcg_at_1 value: 34.531 - type: ndcg_at_10 value: 52.410000000000004 - type: ndcg_at_100 value: 56.999 - type: ndcg_at_1000 value: 57.830999999999996 - type: ndcg_at_3 value: 44.734 - type: ndcg_at_5 value: 48.701 - type: precision_at_1 value: 34.531 - type: precision_at_10 value: 8.612 - type: precision_at_100 value: 1.118 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 20.307 - type: precision_at_5 value: 14.519000000000002 - type: recall_at_1 value: 30.567 - type: recall_at_10 value: 72.238 - type: recall_at_100 value: 92.154 - type: recall_at_1000 value: 98.375 - type: recall_at_3 value: 52.437999999999995 - type: recall_at_5 value: 61.516999999999996 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: 6205996560df11e3a3da9ab4f926788fc30a7db4 metrics: - type: map_at_1 value: 65.98 - type: map_at_10 value: 80.05600000000001 - type: map_at_100 value: 80.76299999999999 - type: map_at_1000 value: 80.786 - type: map_at_3 value: 76.848 
- type: map_at_5 value: 78.854 - type: mrr_at_1 value: 75.86 - type: mrr_at_10 value: 83.397 - type: mrr_at_100 value: 83.555 - type: mrr_at_1000 value: 83.557 - type: mrr_at_3 value: 82.033 - type: mrr_at_5 value: 82.97 - type: ndcg_at_1 value: 75.88000000000001 - type: ndcg_at_10 value: 84.58099999999999 - type: ndcg_at_100 value: 86.151 - type: ndcg_at_1000 value: 86.315 - type: ndcg_at_3 value: 80.902 - type: ndcg_at_5 value: 82.953 - type: precision_at_1 value: 75.88000000000001 - type: precision_at_10 value: 12.986 - type: precision_at_100 value: 1.5110000000000001 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 35.382999999999996 - type: precision_at_5 value: 23.555999999999997 - type: recall_at_1 value: 65.98 - type: recall_at_10 value: 93.716 - type: recall_at_100 value: 99.21799999999999 - type: recall_at_1000 value: 99.97 - type: recall_at_3 value: 83.551 - type: recall_at_5 value: 88.998 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: b2805658ae38990172679479369a78b86de8c390 metrics: - type: v_measure value: 40.45148482612238 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: v_measure value: 55.749490673039126 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: 5c59ef3e437a0a9651c8fe6fde943e7dce59fba5 metrics: - type: map_at_1 value: 4.903 - type: map_at_10 value: 11.926 - type: map_at_100 value: 13.916999999999998 - type: map_at_1000 value: 14.215 - type: map_at_3 value: 8.799999999999999 - type: map_at_5 value: 10.360999999999999 - type: mrr_at_1 value: 24.099999999999998 - type: mrr_at_10 value: 34.482 - type: mrr_at_100 value: 35.565999999999995 - type: mrr_at_1000 value: 35.619 - type: mrr_at_3 value: 31.433 - type: mrr_at_5 value: 33.243 - type: ndcg_at_1 value: 24.099999999999998 - type: ndcg_at_10 value: 19.872999999999998 - type: ndcg_at_100 value: 27.606 - type: ndcg_at_1000 value: 32.811 - type: ndcg_at_3 value: 19.497999999999998 - type: ndcg_at_5 value: 16.813 - type: precision_at_1 value: 24.099999999999998 - type: precision_at_10 value: 10.08 - type: precision_at_100 value: 2.122 - type: precision_at_1000 value: 0.337 - type: precision_at_3 value: 18.2 - type: precision_at_5 value: 14.62 - type: recall_at_1 value: 4.903 - type: recall_at_10 value: 20.438000000000002 - type: recall_at_100 value: 43.043 - type: recall_at_1000 value: 68.41000000000001 - type: recall_at_3 value: 11.068 - type: recall_at_5 value: 14.818000000000001 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: 20a6d6f312dd54037fe07a32d58e5e168867909d metrics: - type: cos_sim_pearson value: 78.58086597995997 - type: cos_sim_spearman value: 69.63214182814991 - type: euclidean_pearson value: 72.76175489042691 - type: euclidean_spearman value: 67.84965161872971 - type: manhattan_pearson value: 72.73812689782592 - type: manhattan_spearman value: 67.83610439531277 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: fdf84275bb8ce4b49c971d02e84dd1abc677a50f metrics: - type: cos_sim_pearson value: 75.13970861325006 - type: cos_sim_spearman value: 67.5020551515597 - type: euclidean_pearson value: 66.33415412418276 - type: euclidean_spearman value: 66.82145056673268 - type: manhattan_pearson value: 
66.55489484006415 - type: manhattan_spearman value: 66.95147433279057 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 1591bfcbe8c69d4bf7fe2a16e2451017832cafb9 metrics: - type: cos_sim_pearson value: 78.85850536483447 - type: cos_sim_spearman value: 79.1633350177206 - type: euclidean_pearson value: 72.74090561408477 - type: euclidean_spearman value: 73.57374448302961 - type: manhattan_pearson value: 72.92980654233226 - type: manhattan_spearman value: 73.72777155112588 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: e2125984e7df8b7871f6ae9949cf6b6795e7c54b metrics: - type: cos_sim_pearson value: 79.51125593897028 - type: cos_sim_spearman value: 74.46048326701329 - type: euclidean_pearson value: 70.87726087052985 - type: euclidean_spearman value: 67.7721470654411 - type: manhattan_pearson value: 71.05892792135637 - type: manhattan_spearman value: 67.93472619779037 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: 1cd7298cac12a96a373b6a2f18738bb3e739a9b6 metrics: - type: cos_sim_pearson value: 83.8299348880489 - type: cos_sim_spearman value: 84.47194637929275 - type: euclidean_pearson value: 78.68768462480418 - type: euclidean_spearman value: 79.80526323901917 - type: manhattan_pearson value: 78.6810718151946 - type: manhattan_spearman value: 79.7820584821254 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 360a0b2dff98700d09e634a01e1cc1624d3e42cd metrics: - type: cos_sim_pearson value: 79.99206664843005 - type: cos_sim_spearman value: 80.96089203722137 - type: euclidean_pearson value: 71.31216213716365 - type: euclidean_spearman value: 71.45258140049407 - type: manhattan_pearson value: 71.26140340402836 - type: manhattan_spearman value: 71.3896894666943 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: 9fc37e8c632af1c87a3d23e685d49552a02582a0 metrics: - type: cos_sim_pearson value: 87.35697089594868 - type: cos_sim_spearman value: 87.78202647220289 - type: euclidean_pearson value: 84.20969668786667 - type: euclidean_spearman value: 83.91876425459982 - type: manhattan_pearson value: 84.24429755612542 - type: manhattan_spearman value: 83.98826315103398 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 2de6ce8c1921b71a755b262c6b57fef195dd7906 metrics: - type: cos_sim_pearson value: 69.06962775868384 - type: cos_sim_spearman value: 69.34889515492327 - type: euclidean_pearson value: 69.28108180412313 - type: euclidean_spearman value: 69.6437114853659 - type: manhattan_pearson value: 69.39974983734993 - type: manhattan_spearman value: 69.69057284482079 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: 8913289635987208e6e7c72789e4be2fe94b6abd metrics: - type: cos_sim_pearson value: 82.42553734213958 - type: cos_sim_spearman value: 81.38977341532744 - type: euclidean_pearson value: 76.47494587945522 - type: euclidean_spearman value: 75.92794860531089 - type: manhattan_pearson value: 76.4768777169467 - type: manhattan_spearman value: 75.9252673228599 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: 56a6d0140cf6356659e2a7c1413286a774468d44 metrics: - type: map value: 80.78825425914722 - type: mrr value: 
94.60017197762296 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: a75ae049398addde9b70f6b268875f5cbce99089 metrics: - type: map_at_1 value: 60.633 - type: map_at_10 value: 70.197 - type: map_at_100 value: 70.758 - type: map_at_1000 value: 70.765 - type: map_at_3 value: 67.082 - type: map_at_5 value: 69.209 - type: mrr_at_1 value: 63.333 - type: mrr_at_10 value: 71.17 - type: mrr_at_100 value: 71.626 - type: mrr_at_1000 value: 71.633 - type: mrr_at_3 value: 68.833 - type: mrr_at_5 value: 70.6 - type: ndcg_at_1 value: 63.333 - type: ndcg_at_10 value: 74.697 - type: ndcg_at_100 value: 76.986 - type: ndcg_at_1000 value: 77.225 - type: ndcg_at_3 value: 69.527 - type: ndcg_at_5 value: 72.816 - type: precision_at_1 value: 63.333 - type: precision_at_10 value: 9.9 - type: precision_at_100 value: 1.103 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 26.889000000000003 - type: precision_at_5 value: 18.2 - type: recall_at_1 value: 60.633 - type: recall_at_10 value: 87.36699999999999 - type: recall_at_100 value: 97.333 - type: recall_at_1000 value: 99.333 - type: recall_at_3 value: 73.656 - type: recall_at_5 value: 82.083 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: 5a8256d0dff9c4bd3be3ba3e67e4e70173f802ea metrics: - type: cos_sim_accuracy value: 99.76633663366337 - type: cos_sim_ap value: 93.84024096781063 - type: cos_sim_f1 value: 88.08080808080808 - type: cos_sim_precision value: 88.9795918367347 - type: cos_sim_recall value: 87.2 - type: dot_accuracy value: 99.46336633663367 - type: dot_ap value: 75.78127156965245 - type: dot_f1 value: 71.41403865717193 - type: dot_precision value: 72.67080745341616 - type: dot_recall value: 70.19999999999999 - type: euclidean_accuracy value: 99.67524752475248 - type: euclidean_ap value: 88.61274955249769 - type: euclidean_f1 value: 82.30852211434735 - type: euclidean_precision value: 89.34426229508196 - type: euclidean_recall value: 76.3 - type: manhattan_accuracy value: 99.67722772277227 - type: manhattan_ap value: 88.77516158012779 - type: manhattan_f1 value: 82.36536430834212 - type: manhattan_precision value: 87.24832214765101 - type: manhattan_recall value: 78.0 - type: max_accuracy value: 99.76633663366337 - type: max_ap value: 93.84024096781063 - type: max_f1 value: 88.08080808080808 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 70a89468f6dccacc6aa2b12a6eac54e74328f235 metrics: - type: v_measure value: 59.20812266121527 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: d88009ab563dd0b16cfaf4436abaf97fa3550cf0 metrics: - type: v_measure value: 33.954248554638056 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: ef807ea29a75ec4f91b50fd4191cb4ee4589a9f9 metrics: - type: map value: 51.52800990025549 - type: mrr value: 52.360394915541974 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: 8753c2788d36c01fc6f05d03fe3f7268d63f9122 metrics: - type: cos_sim_pearson value: 30.737881131277355 - type: cos_sim_spearman value: 31.45979323917254 - type: dot_pearson value: 26.24686017962023 - type: 
dot_spearman value: 25.006732878791745 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: 2c8041b2c07a79b6f7ba8fe6acc72e5d9f92d217 metrics: - type: map_at_1 value: 0.253 - type: map_at_10 value: 2.1399999999999997 - type: map_at_100 value: 12.873000000000001 - type: map_at_1000 value: 31.002000000000002 - type: map_at_3 value: 0.711 - type: map_at_5 value: 1.125 - type: mrr_at_1 value: 96.0 - type: mrr_at_10 value: 98.0 - type: mrr_at_100 value: 98.0 - type: mrr_at_1000 value: 98.0 - type: mrr_at_3 value: 98.0 - type: mrr_at_5 value: 98.0 - type: ndcg_at_1 value: 94.0 - type: ndcg_at_10 value: 84.881 - type: ndcg_at_100 value: 64.694 - type: ndcg_at_1000 value: 56.85 - type: ndcg_at_3 value: 90.061 - type: ndcg_at_5 value: 87.155 - type: precision_at_1 value: 96.0 - type: precision_at_10 value: 88.8 - type: precision_at_100 value: 65.7 - type: precision_at_1000 value: 25.080000000000002 - type: precision_at_3 value: 92.667 - type: precision_at_5 value: 90.0 - type: recall_at_1 value: 0.253 - type: recall_at_10 value: 2.292 - type: recall_at_100 value: 15.78 - type: recall_at_1000 value: 53.015 - type: recall_at_3 value: 0.7270000000000001 - type: recall_at_5 value: 1.162 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: 527b7d77e16e343303e68cb6af11d6e18b9f7b3b metrics: - type: map_at_1 value: 2.116 - type: map_at_10 value: 9.625 - type: map_at_100 value: 15.641 - type: map_at_1000 value: 17.127 - type: map_at_3 value: 4.316 - type: map_at_5 value: 6.208 - type: mrr_at_1 value: 32.653 - type: mrr_at_10 value: 48.083999999999996 - type: mrr_at_100 value: 48.631 - type: mrr_at_1000 value: 48.649 - type: mrr_at_3 value: 42.857 - type: mrr_at_5 value: 46.224 - type: ndcg_at_1 value: 29.592000000000002 - type: ndcg_at_10 value: 25.430999999999997 - type: ndcg_at_100 value: 36.344 - type: ndcg_at_1000 value: 47.676 - type: ndcg_at_3 value: 26.144000000000002 - type: ndcg_at_5 value: 26.304 - type: precision_at_1 value: 32.653 - type: precision_at_10 value: 24.082 - type: precision_at_100 value: 7.714 - type: precision_at_1000 value: 1.5310000000000001 - type: precision_at_3 value: 26.531 - type: precision_at_5 value: 26.939 - type: recall_at_1 value: 2.116 - type: recall_at_10 value: 16.794 - type: recall_at_100 value: 47.452 - type: recall_at_1000 value: 82.312 - type: recall_at_3 value: 5.306 - type: recall_at_5 value: 9.306000000000001 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 67.709 - type: ap value: 13.541535578501716 - type: f1 value: 52.569619919446794 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: 62146448f05be9e52a36b8ee9936447ea787eede metrics: - type: accuracy value: 56.850594227504246 - type: f1 value: 57.233377364910574 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 091a54f9a36281ce7d6590ec8c75dd485e7e01d4 metrics: - type: v_measure value: 39.463722986090474 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: 
- type: cos_sim_accuracy value: 84.09131549144662 - type: cos_sim_ap value: 66.86677647503386 - type: cos_sim_f1 value: 62.94631710362049 - type: cos_sim_precision value: 59.73933649289099 - type: cos_sim_recall value: 66.51715039577837 - type: dot_accuracy value: 80.27656911247541 - type: dot_ap value: 54.291720398612085 - type: dot_f1 value: 54.77150537634409 - type: dot_precision value: 47.58660957571039 - type: dot_recall value: 64.5118733509235 - type: euclidean_accuracy value: 82.76211480002385 - type: euclidean_ap value: 62.430397690753296 - type: euclidean_f1 value: 59.191590539356774 - type: euclidean_precision value: 56.296119971435374 - type: euclidean_recall value: 62.401055408970976 - type: manhattan_accuracy value: 82.7561542588067 - type: manhattan_ap value: 62.41882051995577 - type: manhattan_f1 value: 59.32101002778785 - type: manhattan_precision value: 54.71361711611321 - type: manhattan_recall value: 64.77572559366754 - type: max_accuracy value: 84.09131549144662 - type: max_ap value: 66.86677647503386 - type: max_f1 value: 62.94631710362049 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.79574649745798 - type: cos_sim_ap value: 85.28960532524223 - type: cos_sim_f1 value: 77.98460043358001 - type: cos_sim_precision value: 75.78090948714224 - type: cos_sim_recall value: 80.32029565753002 - type: dot_accuracy value: 85.5939767920208 - type: dot_ap value: 76.14131706694056 - type: dot_f1 value: 72.70246298696868 - type: dot_precision value: 65.27012127894156 - type: dot_recall value: 82.04496458269172 - type: euclidean_accuracy value: 86.72332828812046 - type: euclidean_ap value: 80.84854809178995 - type: euclidean_f1 value: 72.47657499809551 - type: euclidean_precision value: 71.71717171717171 - type: euclidean_recall value: 73.25223283030489 - type: manhattan_accuracy value: 86.7563162184189 - type: manhattan_ap value: 80.87598895575626 - type: manhattan_f1 value: 72.54617892068092 - type: manhattan_precision value: 68.49268225960881 - type: manhattan_recall value: 77.10963966738528 - type: max_accuracy value: 88.79574649745798 - type: max_ap value: 85.28960532524223 - type: max_f1 value: 77.98460043358001
---

# SGPT-5.8B-weightedmean-msmarco-specb-bitfit

## Usage

For usage instructions, refer to our codebase: https://github.com/Muennighoff/sgpt

## Evaluation Results

For eval results, refer to our paper: https://arxiv.org/abs/2202.08904

## Training

The model was trained with the parameters:

**DataLoader**:

`torch.utils.data.dataloader.DataLoader` of length 249592 with parameters:
```
{'batch_size': 2, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```

**Loss**:

`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```

Parameters of the fit()-Method:
```
{
    "epochs": 10,
    "evaluation_steps": 0,
    "evaluator": "NoneType",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'transformers.optimization.AdamW'>",
    "optimizer_params": {
        "lr": 5e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 1000,
    "weight_decay": 0.01
}
```

## Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 300, 'do_lower_case': False}) with Transformer model: GPTJModel
  (1): Pooling({'word_embedding_dimension': 4096, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': True, 'pooling_mode_lasttoken': False})
)
```

## Citing & Authors

```bibtex
@article{muennighoff2022sgpt,
  title={SGPT: GPT Sentence Embeddings for Semantic Search},
  author={Muennighoff, Niklas},
  journal={arXiv preprint arXiv:2202.08904},
  year={2022}
}
```
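The card defers usage details to the SGPT codebase linked above. As a rough orientation only, here is a minimal encoding sketch with sentence-transformers. It assumes the checkpoint is published under the repo id `Muennighoff/SGPT-5.8B-weightedmean-msmarco-specb-bitfit` (inferred from the card title, not stated in it) and that the SentenceTransformer config shown above loads as-is; the `specb` variant additionally marks queries and documents with special bracket tokens (handled in the linked codebase), which this sketch omits.

```python
# A minimal sketch, not the official SGPT recipe: load the checkpoint with
# sentence-transformers and compare a query and a document by cosine similarity.
# Assumption: the repo id below is inferred from the card title, and the model
# keeps the config shown above (GPT-J encoder + weighted-mean pooling).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Muennighoff/SGPT-5.8B-weightedmean-msmarco-specb-bitfit")

# encode() yields one 4096-dimensional vector per input text, produced by
# weighted-mean pooling over the token embeddings (max_seq_length 300).
query_emb = model.encode("How do decoder-only models produce sentence embeddings?")
doc_emb = model.encode("SGPT derives sentence embeddings from GPT-style decoders.")

print(util.cos_sim(query_emb, doc_emb))
```

Cosine similarity is the natural comparison here because training used MultipleNegativesRankingLoss configured with `similarity_fct: cos_sim` (scale 20.0).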
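Similarly, the hyperparameters listed under "Training" map directly onto the classic sentence-transformers `fit()` API. The sketch below is a hedged, toy-data reconstruction of that call, not the actual run: the real setup used the 5.8B GPT-J encoder, a 249592-batch MS MARCO dataloader, and BitFit-style tuning, all of which are replaced here with a small public checkpoint and two hypothetical example pairs.

```python
# A hedged reconstruction of the training configuration described above.
# Toy stand-ins: a small public checkpoint instead of the 5.8B GPT-J encoder,
# and two invented (query, passage) pairs instead of MS MARCO.
from sentence_transformers import SentenceTransformer, InputExample, losses
from torch.utils.data import DataLoader

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
train_examples = [
    InputExample(texts=["what is sgpt", "SGPT is a GPT-based sentence encoder."]),
    InputExample(texts=["weighted mean pooling", "Pooling that weights tokens by position."]),
]

# batch_size 2 and MultipleNegativesRankingLoss with scale 20.0 match the card;
# cos_sim is the loss's default similarity function.
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=10,
    warmup_steps=1000,
    weight_decay=0.01,
    max_grad_norm=1,
    optimizer_params={"lr": 5e-05},
    scheduler="WarmupLinear",
)
```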
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
RichardErkhov/dunzhang_-_stella_en_1.5B_v5-4bits
RichardErkhov
null
[ "safetensors", "qwen2", "custom_code", "arxiv:2205.13147", "4-bit", "bitsandbytes", "region:us" ]
1,730
1,730
4
0
---
{}
---

Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

stella_en_1.5B_v5 - bnb 4bits

- Model creator: https://huggingface.co/dunzhang/
- Original model: https://huggingface.co/dunzhang/stella_en_1.5B_v5/

Original model description:

--- model-index: - name: stella_en_1.5B_v5 results: - dataset: config: en name: MTEB AmazonCounterfactualClassification (en) revision: e8379541af4e31359cca9fbcf4b00f2671dba205 split: test type: mteb/amazon_counterfactual metrics: - type: accuracy value: 92.86567164179104 - type: ap value: 72.13503907102613 - type: ap_weighted value: 72.13503907102613 - type: f1 value: 89.5586886376355 - type: f1_weighted value: 93.13621183004571 - type: main_score value: 92.86567164179104 task: type: Classification - dataset: config: default name: MTEB AmazonPolarityClassification revision: e2d317d38cd51312af73b3d32a06d1a08b442046 split: test type: mteb/amazon_polarity metrics: - type: accuracy value: 97.16485 - type: ap value: 96.05546315415225 - type: ap_weighted value: 96.05546315415225 - type: f1 value: 97.16351087403213 - type: f1_weighted value: 97.16351087403213 - type: main_score value: 97.16485 task: type: Classification - dataset: config: en name: MTEB AmazonReviewsClassification (en) revision: 1399c76144fd37290681b995c656ef9b2e06e26d split: test type: mteb/amazon_reviews_multi metrics: - type: accuracy value: 59.358 - type: f1 value: 59.0264615883114 - type: f1_weighted value: 59.0264615883114 - type: main_score value: 59.358 task: type: Classification - dataset: config: default name: MTEB ArguAna revision: c22ab2a51041ffd869aaddef7af8d8215647e41a split: test type: mteb/arguana metrics: - type: main_score value: 65.269 - type: map_at_1 value: 41.607 - type: map_at_10 value: 57.104 - type: map_at_100 value: 57.621 - type: map_at_1000 value: 57.621 - type: map_at_20 value: 57.533 - type: map_at_3 value: 52.891999999999996 - type: map_at_5 value: 55.371 - type: mrr_at_1 value: 42.318634423897585 - type: mrr_at_10 value: 57.353970511865406 - type: mrr_at_100 value: 57.88398078476526 - type: mrr_at_1000 value: 57.88467807648422 - type: mrr_at_20 value: 57.796730533206166 - type: mrr_at_3 value: 53.200568990042775 - type: mrr_at_5 value: 55.6330014224753 - type: nauc_map_at_1000_diff1 value: 24.54414600428287 - type: nauc_map_at_1000_max value: -8.389738078358459 - type: nauc_map_at_1000_std value: -18.188787645801366 - type: nauc_map_at_100_diff1 value: 24.543138576462308 - type: nauc_map_at_100_max value: -8.390896839752044 - type: nauc_map_at_100_std value: -18.192549240185247 - type: nauc_map_at_10_diff1 value: 24.219607088995822 - type: nauc_map_at_10_max value: -8.245734391254308 - type: nauc_map_at_10_std value: -18.229706566466447 - type: nauc_map_at_1_diff1 value: 29.325201664812788 - type: nauc_map_at_1_max value: -11.742800494823971 - type: nauc_map_at_1_std value: -18.610215769702528 - type: nauc_map_at_20_diff1 value: 24.471097562798803 - type: nauc_map_at_20_max value: -8.318035874000799 - type: nauc_map_at_20_std value: -18.171541096773108 - type: nauc_map_at_3_diff1 value: 24.275846107642824 - type: nauc_map_at_3_max value: -8.212242049581894 - type: nauc_map_at_3_std value: -17.920379368937496 - type: nauc_map_at_5_diff1 value: 23.873692493209255 - type: nauc_map_at_5_max value: -8.110347163828767 - type: nauc_map_at_5_std value: -18.20863325596931 - type: nauc_mrr_at_1000_diff1 value: 22.656410956419975 - type:
nauc_mrr_at_1000_max value: -8.924888102233243 - type: nauc_mrr_at_1000_std value: -18.103674384502526 - type: nauc_mrr_at_100_diff1 value: 22.655448817140968 - type: nauc_mrr_at_100_max value: -8.926034318499038 - type: nauc_mrr_at_100_std value: -18.10743930104164 - type: nauc_mrr_at_10_diff1 value: 22.297536272996872 - type: nauc_mrr_at_10_max value: -8.836407556658274 - type: nauc_mrr_at_10_std value: -18.1598393044477 - type: nauc_mrr_at_1_diff1 value: 27.419572424489708 - type: nauc_mrr_at_1_max value: -11.42241314820691 - type: nauc_mrr_at_1_std value: -18.54893865856313 - type: nauc_mrr_at_20_diff1 value: 22.590227214657418 - type: nauc_mrr_at_20_max value: -8.849986456376993 - type: nauc_mrr_at_20_std value: -18.0862391777352 - type: nauc_mrr_at_3_diff1 value: 22.415270167774988 - type: nauc_mrr_at_3_max value: -8.692871854156435 - type: nauc_mrr_at_3_std value: -17.6740102891955 - type: nauc_mrr_at_5_diff1 value: 21.96284578521464 - type: nauc_mrr_at_5_max value: -8.757031535546025 - type: nauc_mrr_at_5_std value: -18.210766964081294 - type: nauc_ndcg_at_1000_diff1 value: 23.939400161569115 - type: nauc_ndcg_at_1000_max value: -7.866999120512983 - type: nauc_ndcg_at_1000_std value: -17.981457019643617 - type: nauc_ndcg_at_100_diff1 value: 23.920033349619317 - type: nauc_ndcg_at_100_max value: -7.889849409678031 - type: nauc_ndcg_at_100_std value: -18.054931990360537 - type: nauc_ndcg_at_10_diff1 value: 22.543020461303534 - type: nauc_ndcg_at_10_max value: -7.072111788010867 - type: nauc_ndcg_at_10_std value: -18.26397604573537 - type: nauc_ndcg_at_1_diff1 value: 29.325201664812788 - type: nauc_ndcg_at_1_max value: -11.742800494823971 - type: nauc_ndcg_at_1_std value: -18.610215769702528 - type: nauc_ndcg_at_20_diff1 value: 23.551587021207972 - type: nauc_ndcg_at_20_max value: -7.298056222649139 - type: nauc_ndcg_at_20_std value: -18.056004880930608 - type: nauc_ndcg_at_3_diff1 value: 22.669089506345273 - type: nauc_ndcg_at_3_max value: -7.278024373570137 - type: nauc_ndcg_at_3_std value: -17.816657759914193 - type: nauc_ndcg_at_5_diff1 value: 21.72619728226575 - type: nauc_ndcg_at_5_max value: -6.959741647471228 - type: nauc_ndcg_at_5_std value: -18.35173705190235 - type: nauc_precision_at_1000_diff1 value: 5.0388241058076995 - type: nauc_precision_at_1000_max value: 34.439879624882145 - type: nauc_precision_at_1000_std value: 77.22610895194498 - type: nauc_precision_at_100_diff1 value: 1.340670767252794 - type: nauc_precision_at_100_max value: 19.30870025961241 - type: nauc_precision_at_100_std value: 35.37688289157788 - type: nauc_precision_at_10_diff1 value: 7.734227153124332 - type: nauc_precision_at_10_max value: 4.202399088422237 - type: nauc_precision_at_10_std value: -18.383890254046698 - type: nauc_precision_at_1_diff1 value: 29.325201664812788 - type: nauc_precision_at_1_max value: -11.742800494823971 - type: nauc_precision_at_1_std value: -18.610215769702528 - type: nauc_precision_at_20_diff1 value: 9.48070999361637 - type: nauc_precision_at_20_max value: 19.056709637253025 - type: nauc_precision_at_20_std value: -13.266821166159485 - type: nauc_precision_at_3_diff1 value: 17.245260303409747 - type: nauc_precision_at_3_max value: -4.202455033452335 - type: nauc_precision_at_3_std value: -17.514264039955332 - type: nauc_precision_at_5_diff1 value: 12.074628162049974 - type: nauc_precision_at_5_max value: -1.9145501461107832 - type: nauc_precision_at_5_std value: -19.162525528916344 - type: nauc_recall_at_1000_diff1 value: 5.038824105805915 - type: 
nauc_recall_at_1000_max value: 34.43987962487738 - type: nauc_recall_at_1000_std value: 77.22610895193765 - type: nauc_recall_at_100_diff1 value: 1.3406707672497025 - type: nauc_recall_at_100_max value: 19.30870025960776 - type: nauc_recall_at_100_std value: 35.37688289157515 - type: nauc_recall_at_10_diff1 value: 7.734227153124366 - type: nauc_recall_at_10_max value: 4.202399088421976 - type: nauc_recall_at_10_std value: -18.38389025404673 - type: nauc_recall_at_1_diff1 value: 29.325201664812788 - type: nauc_recall_at_1_max value: -11.742800494823971 - type: nauc_recall_at_1_std value: -18.610215769702528 - type: nauc_recall_at_20_diff1 value: 9.480709993616845 - type: nauc_recall_at_20_max value: 19.05670963725301 - type: nauc_recall_at_20_std value: -13.266821166158651 - type: nauc_recall_at_3_diff1 value: 17.24526030340978 - type: nauc_recall_at_3_max value: -4.202455033452323 - type: nauc_recall_at_3_std value: -17.51426403995538 - type: nauc_recall_at_5_diff1 value: 12.074628162049992 - type: nauc_recall_at_5_max value: -1.914550146110865 - type: nauc_recall_at_5_std value: -19.162525528916362 - type: ndcg_at_1 value: 41.607 - type: ndcg_at_10 value: 65.269 - type: ndcg_at_100 value: 67.289 - type: ndcg_at_1000 value: 67.29899999999999 - type: ndcg_at_20 value: 66.76299999999999 - type: ndcg_at_3 value: 56.604 - type: ndcg_at_5 value: 61.07900000000001 - type: precision_at_1 value: 41.607 - type: precision_at_10 value: 9.118 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.8469999999999995 - type: precision_at_3 value: 22.451 - type: precision_at_5 value: 15.647 - type: recall_at_1 value: 41.607 - type: recall_at_10 value: 91.181 - type: recall_at_100 value: 99.57300000000001 - type: recall_at_1000 value: 99.644 - type: recall_at_20 value: 96.942 - type: recall_at_3 value: 67.354 - type: recall_at_5 value: 78.236 task: type: Retrieval - dataset: config: default name: MTEB ArxivClusteringP2P revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d split: test type: mteb/arxiv-clustering-p2p metrics: - type: main_score value: 55.437138353189994 - type: v_measure value: 55.437138353189994 - type: v_measure_std value: 14.718556601335491 task: type: Clustering - dataset: config: default name: MTEB ArxivClusteringS2S revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 split: test type: mteb/arxiv-clustering-s2s metrics: - type: main_score value: 50.65858459544658 - type: v_measure value: 50.65858459544658 - type: v_measure_std value: 14.887033747525146 task: type: Clustering - dataset: config: default name: MTEB AskUbuntuDupQuestions revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 split: test type: mteb/askubuntudupquestions-reranking metrics: - type: main_score value: 67.32597152838535 - type: map value: 67.32597152838535 - type: mrr value: 78.98683111286988 - type: nAUC_map_diff1 value: 16.8624639710487 - type: nAUC_map_max value: 24.91996491142433 - type: nAUC_map_std value: 17.91865808793225 - type: nAUC_mrr_diff1 value: 25.03766425631947 - type: nAUC_mrr_max value: 41.64561939958336 - type: nAUC_mrr_std value: 23.179909345891968 task: type: Reranking - dataset: config: default name: MTEB BIOSSES revision: d3fb88f8f02e40887cd149695127462bbcf29b4a split: test type: mteb/biosses-sts metrics: - type: cosine_pearson value: 85.790820496042 - type: cosine_spearman value: 83.10731534330517 - type: euclidean_pearson value: 84.61741304343133 - type: euclidean_spearman value: 83.17297949010973 - type: main_score value: 83.10731534330517 - 
type: manhattan_pearson value: 85.2137696526676 - type: manhattan_spearman value: 84.39168195786738 - type: pearson value: 85.790820496042 - type: spearman value: 83.10731534330517 task: type: STS - dataset: config: default name: MTEB Banking77Classification revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 split: test type: mteb/banking77 metrics: - type: accuracy value: 89.78896103896105 - type: f1 value: 89.76107366333488 - type: f1_weighted value: 89.76107366333488 - type: main_score value: 89.78896103896105 task: type: Classification - dataset: config: default name: MTEB BiorxivClusteringP2P revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 split: test type: mteb/biorxiv-clustering-p2p metrics: - type: main_score value: 50.68092296236376 - type: v_measure value: 50.68092296236376 - type: v_measure_std value: 0.7832640983085436 task: type: Clustering - dataset: config: default name: MTEB BiorxivClusteringS2S revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 split: test type: mteb/biorxiv-clustering-s2s metrics: - type: main_score value: 46.86629236732983 - type: v_measure value: 46.86629236732983 - type: v_measure_std value: 0.8784322236350974 task: type: Clustering - dataset: config: default name: MTEB CQADupstackRetrieval revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 split: test type: mteb/cqadupstack metrics: - type: main_score value: 47.74883333333334 - type: map_at_1 value: 30.179249999999996 - type: map_at_10 value: 41.60824999999999 - type: map_at_100 value: 42.94008333333332 - type: map_at_1000 value: 43.04666666666667 - type: map_at_20 value: 42.36833333333334 - type: map_at_3 value: 38.23491666666666 - type: map_at_5 value: 40.10183333333333 - type: mrr_at_1 value: 36.47676085808166 - type: mrr_at_10 value: 46.300991916437155 - type: mrr_at_100 value: 47.12155753713262 - type: mrr_at_1000 value: 47.168033610799945 - type: mrr_at_20 value: 46.80405724560391 - type: mrr_at_3 value: 43.77000352801797 - type: mrr_at_5 value: 45.22295361704542 - type: nauc_map_at_1000_diff1 value: 46.953671666941524 - type: nauc_map_at_1000_max value: 32.260396316089675 - type: nauc_map_at_1000_std value: 0.6657766120094878 - type: nauc_map_at_100_diff1 value: 46.94717463394555 - type: nauc_map_at_100_max value: 32.25088350678177 - type: nauc_map_at_100_std value: 0.6257017014549283 - type: nauc_map_at_10_diff1 value: 46.974678429336464 - type: nauc_map_at_10_max value: 31.862230807295504 - type: nauc_map_at_10_std value: -0.14758828549579284 - type: nauc_map_at_1_diff1 value: 52.48913346466124 - type: nauc_map_at_1_max value: 29.874374024967725 - type: nauc_map_at_1_std value: -2.433547569836134 - type: nauc_map_at_20_diff1 value: 46.96088684217651 - type: nauc_map_at_20_max value: 32.08954208613205 - type: nauc_map_at_20_std value: 0.25946321113436527 - type: nauc_map_at_3_diff1 value: 47.703230121518345 - type: nauc_map_at_3_max value: 30.977880095983107 - type: nauc_map_at_3_std value: -1.342777563991804 - type: nauc_map_at_5_diff1 value: 47.1615010199957 - type: nauc_map_at_5_max value: 31.420885812683284 - type: nauc_map_at_5_std value: -0.8789297099444306 - type: nauc_mrr_at_1000_diff1 value: 46.69178645962615 - type: nauc_mrr_at_1000_max value: 34.392807413340655 - type: nauc_mrr_at_1000_std value: 1.6155464863667934 - type: nauc_mrr_at_100_diff1 value: 46.67417236349189 - type: nauc_mrr_at_100_max value: 34.384607045512624 - type: nauc_mrr_at_100_std value: 1.6259917384109652 - type: nauc_mrr_at_10_diff1 value: 46.60497560446239 - type: nauc_mrr_at_10_max value: 34.32918897817958 
- type: nauc_mrr_at_10_std value: 1.39387793769014 - type: nauc_mrr_at_1_diff1 value: 51.61608573254137 - type: nauc_mrr_at_1_max value: 35.18105023234596 - type: nauc_mrr_at_1_std value: 0.17943702145478177 - type: nauc_mrr_at_20_diff1 value: 46.635943069860254 - type: nauc_mrr_at_20_max value: 34.37050973118794 - type: nauc_mrr_at_20_std value: 1.5346464678860607 - type: nauc_mrr_at_3_diff1 value: 47.154389369038334 - type: nauc_mrr_at_3_max value: 34.41036411855465 - type: nauc_mrr_at_3_std value: 0.924551812357872 - type: nauc_mrr_at_5_diff1 value: 46.6690101691763 - type: nauc_mrr_at_5_max value: 34.29740388138466 - type: nauc_mrr_at_5_std value: 1.0567184149139792 - type: nauc_ndcg_at_1000_diff1 value: 45.375448289173264 - type: nauc_ndcg_at_1000_max value: 33.47957083714482 - type: nauc_ndcg_at_1000_std value: 3.192251100225568 - type: nauc_ndcg_at_100_diff1 value: 44.93601014699499 - type: nauc_ndcg_at_100_max value: 33.21249888295249 - type: nauc_ndcg_at_100_std value: 3.609842852934217 - type: nauc_ndcg_at_10_diff1 value: 44.87893284011915 - type: nauc_ndcg_at_10_max value: 32.384885249478515 - type: nauc_ndcg_at_10_std value: 1.454493065035396 - type: nauc_ndcg_at_1_diff1 value: 51.61608573254137 - type: nauc_ndcg_at_1_max value: 35.18105023234596 - type: nauc_ndcg_at_1_std value: 0.17943702145478177 - type: nauc_ndcg_at_20_diff1 value: 44.867752179050605 - type: nauc_ndcg_at_20_max value: 32.689535921840196 - type: nauc_ndcg_at_20_std value: 2.337765158573901 - type: nauc_ndcg_at_3_diff1 value: 45.87485821381341 - type: nauc_ndcg_at_3_max value: 32.33282450558947 - type: nauc_ndcg_at_3_std value: 0.0681643829273283 - type: nauc_ndcg_at_5_diff1 value: 45.202902131892394 - type: nauc_ndcg_at_5_max value: 32.1026971523917 - type: nauc_ndcg_at_5_std value: 0.3565572833774486 - type: nauc_precision_at_1000_diff1 value: -8.935267931198956 - type: nauc_precision_at_1000_max value: 6.464981960169269 - type: nauc_precision_at_1000_std value: 10.662786182234633 - type: nauc_precision_at_100_diff1 value: -1.64091517847155 - type: nauc_precision_at_100_max value: 15.175617871025024 - type: nauc_precision_at_100_std value: 16.924256989248075 - type: nauc_precision_at_10_diff1 value: 15.676651966277047 - type: nauc_precision_at_10_max value: 26.243734188847117 - type: nauc_precision_at_10_std value: 10.601741034956333 - type: nauc_precision_at_1_diff1 value: 51.61608573254137 - type: nauc_precision_at_1_max value: 35.18105023234596 - type: nauc_precision_at_1_std value: 0.17943702145478177 - type: nauc_precision_at_20_diff1 value: 9.447267260198654 - type: nauc_precision_at_20_max value: 23.024130858142723 - type: nauc_precision_at_20_std value: 13.739145648899603 - type: nauc_precision_at_3_diff1 value: 30.11583572134629 - type: nauc_precision_at_3_max value: 31.37321080069495 - type: nauc_precision_at_3_std value: 4.705512374126024 - type: nauc_precision_at_5_diff1 value: 23.192015335996093 - type: nauc_precision_at_5_max value: 29.415746835998764 - type: nauc_precision_at_5_std value: 6.843498772798558 - type: nauc_recall_at_1000_diff1 value: 25.36573313426033 - type: nauc_recall_at_1000_max value: 43.06672256524168 - type: nauc_recall_at_1000_std value: 47.93664853815292 - type: nauc_recall_at_100_diff1 value: 31.222880916617406 - type: nauc_recall_at_100_max value: 31.761159904172658 - type: nauc_recall_at_100_std value: 23.034218976635877 - type: nauc_recall_at_10_diff1 value: 36.23439028915225 - type: nauc_recall_at_10_max value: 28.473458977606438 - type: nauc_recall_at_10_std value: 
3.7797969934159 - type: nauc_recall_at_1_diff1 value: 52.48913346466124 - type: nauc_recall_at_1_max value: 29.874374024967725 - type: nauc_recall_at_1_std value: -2.433547569836134 - type: nauc_recall_at_20_diff1 value: 34.678676952584766 - type: nauc_recall_at_20_max value: 29.04638392522168 - type: nauc_recall_at_20_std value: 8.148894982082549 - type: nauc_recall_at_3_diff1 value: 41.31029996231311 - type: nauc_recall_at_3_max value: 28.44199443414157 - type: nauc_recall_at_3_std value: -0.747324057600377 - type: nauc_recall_at_5_diff1 value: 38.535873899920674 - type: nauc_recall_at_5_max value: 27.942667805948375 - type: nauc_recall_at_5_std value: 0.30652206930973686 - type: ndcg_at_1 value: 36.47675 - type: ndcg_at_10 value: 47.74883333333334 - type: ndcg_at_100 value: 52.902416666666674 - type: ndcg_at_1000 value: 54.69116666666667 - type: ndcg_at_20 value: 49.89758333333333 - type: ndcg_at_3 value: 42.462250000000004 - type: ndcg_at_5 value: 44.91841666666667 - type: precision_at_1 value: 36.47675 - type: precision_at_10 value: 8.582416666666665 - type: precision_at_100 value: 1.31475 - type: precision_at_1000 value: 0.16458333333333333 - type: precision_at_20 value: 5.021833333333333 - type: precision_at_3 value: 20.004499999999997 - type: precision_at_5 value: 14.178666666666665 - type: recall_at_1 value: 30.179249999999996 - type: recall_at_10 value: 60.950166666666675 - type: recall_at_100 value: 83.19025 - type: recall_at_1000 value: 95.27774999999998 - type: recall_at_20 value: 68.80175 - type: recall_at_3 value: 46.01841666666666 - type: recall_at_5 value: 52.482416666666666 task: type: Retrieval - dataset: config: default name: MTEB ClimateFEVER revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 split: test type: mteb/climate-fever metrics: - type: main_score value: 46.113 - type: map_at_1 value: 20.122999999999998 - type: map_at_10 value: 35.474 - type: map_at_100 value: 37.592 - type: map_at_1000 value: 37.773 - type: map_at_20 value: 36.637 - type: map_at_3 value: 29.731 - type: map_at_5 value: 32.964 - type: mrr_at_1 value: 46.71009771986971 - type: mrr_at_10 value: 58.855669303552105 - type: mrr_at_100 value: 59.389249674038425 - type: mrr_at_1000 value: 59.408448104362364 - type: mrr_at_20 value: 59.23881203149016 - type: mrr_at_3 value: 56.18892508143328 - type: mrr_at_5 value: 57.85342019543985 - type: nauc_map_at_1000_diff1 value: 27.047031037721958 - type: nauc_map_at_1000_max value: 43.25240279148033 - type: nauc_map_at_1000_std value: 20.795849418696037 - type: nauc_map_at_100_diff1 value: 27.044739015116452 - type: nauc_map_at_100_max value: 43.24042159787812 - type: nauc_map_at_100_std value: 20.799952124137683 - type: nauc_map_at_10_diff1 value: 27.372696854670338 - type: nauc_map_at_10_max value: 43.054456574721684 - type: nauc_map_at_10_std value: 19.537162110136645 - type: nauc_map_at_1_diff1 value: 43.65424623953092 - type: nauc_map_at_1_max value: 45.17986509998762 - type: nauc_map_at_1_std value: 8.497107052335414 - type: nauc_map_at_20_diff1 value: 27.224535846566074 - type: nauc_map_at_20_max value: 43.12222854561229 - type: nauc_map_at_20_std value: 20.29982972202669 - type: nauc_map_at_3_diff1 value: 30.87847002319001 - type: nauc_map_at_3_max value: 42.890027891707575 - type: nauc_map_at_3_std value: 13.857451947580929 - type: nauc_map_at_5_diff1 value: 27.966867093591542 - type: nauc_map_at_5_max value: 42.35826637592201 - type: nauc_map_at_5_std value: 16.993102524058624 - type: nauc_mrr_at_1000_diff1 value: 30.191544077608164 - type: 
nauc_mrr_at_1000_max value: 44.959438920351644 - type: nauc_mrr_at_1000_std value: 24.065801376465114 - type: nauc_mrr_at_100_diff1 value: 30.170368115494 - type: nauc_mrr_at_100_max value: 44.955868115761156 - type: nauc_mrr_at_100_std value: 24.093510767847707 - type: nauc_mrr_at_10_diff1 value: 30.128430637520175 - type: nauc_mrr_at_10_max value: 44.97689261350708 - type: nauc_mrr_at_10_std value: 24.037049561818897 - type: nauc_mrr_at_1_diff1 value: 35.323351939108214 - type: nauc_mrr_at_1_max value: 43.85026244855636 - type: nauc_mrr_at_1_std value: 17.040662141218974 - type: nauc_mrr_at_20_diff1 value: 30.192006556160443 - type: nauc_mrr_at_20_max value: 45.02814530774032 - type: nauc_mrr_at_20_std value: 24.20885865448696 - type: nauc_mrr_at_3_diff1 value: 29.88250163424518 - type: nauc_mrr_at_3_max value: 44.25768944883186 - type: nauc_mrr_at_3_std value: 22.804183393364198 - type: nauc_mrr_at_5_diff1 value: 30.269824490420767 - type: nauc_mrr_at_5_max value: 44.97443265796657 - type: nauc_mrr_at_5_std value: 23.894159916141177 - type: nauc_ndcg_at_1000_diff1 value: 24.533764005407356 - type: nauc_ndcg_at_1000_max value: 44.50902713386608 - type: nauc_ndcg_at_1000_std value: 27.589506980238404 - type: nauc_ndcg_at_100_diff1 value: 24.209785073940353 - type: nauc_ndcg_at_100_max value: 44.18257063893669 - type: nauc_ndcg_at_100_std value: 27.963150866401943 - type: nauc_ndcg_at_10_diff1 value: 25.168069201989486 - type: nauc_ndcg_at_10_max value: 43.84940910683214 - type: nauc_ndcg_at_10_std value: 24.810707270956435 - type: nauc_ndcg_at_1_diff1 value: 35.323351939108214 - type: nauc_ndcg_at_1_max value: 43.85026244855636 - type: nauc_ndcg_at_1_std value: 17.040662141218974 - type: nauc_ndcg_at_20_diff1 value: 24.829924800466834 - type: nauc_ndcg_at_20_max value: 43.738574327059716 - type: nauc_ndcg_at_20_std value: 26.252370278684072 - type: nauc_ndcg_at_3_diff1 value: 27.321943393906274 - type: nauc_ndcg_at_3_max value: 42.16584786993447 - type: nauc_ndcg_at_3_std value: 18.24775079455969 - type: nauc_ndcg_at_5_diff1 value: 26.043785418347998 - type: nauc_ndcg_at_5_max value: 42.874593895388344 - type: nauc_ndcg_at_5_std value: 21.294004555506117 - type: nauc_precision_at_1000_diff1 value: -22.073027615308582 - type: nauc_precision_at_1000_max value: -6.549723766317357 - type: nauc_precision_at_1000_std value: 18.301749191241306 - type: nauc_precision_at_100_diff1 value: -15.654286887593619 - type: nauc_precision_at_100_max value: 6.401516251421999 - type: nauc_precision_at_100_std value: 29.170680324929805 - type: nauc_precision_at_10_diff1 value: -4.362381972892247 - type: nauc_precision_at_10_max value: 22.10943515872447 - type: nauc_precision_at_10_std value: 31.869699459530022 - type: nauc_precision_at_1_diff1 value: 35.323351939108214 - type: nauc_precision_at_1_max value: 43.85026244855636 - type: nauc_precision_at_1_std value: 17.040662141218974 - type: nauc_precision_at_20_diff1 value: -7.50749661117875 - type: nauc_precision_at_20_max value: 16.80584016023257 - type: nauc_precision_at_20_std value: 31.976755897112437 - type: nauc_precision_at_3_diff1 value: 7.402667538773083 - type: nauc_precision_at_3_max value: 31.2088401330676 - type: nauc_precision_at_3_std value: 24.287905698405662 - type: nauc_precision_at_5_diff1 value: 0.7479172565343901 - type: nauc_precision_at_5_max value: 26.28427734237825 - type: nauc_precision_at_5_std value: 28.246947120310317 - type: nauc_recall_at_1000_diff1 value: 2.4778431086370496 - type: nauc_recall_at_1000_max value: 
40.2231995797509 - type: nauc_recall_at_1000_std value: 52.62124052183862 - type: nauc_recall_at_100_diff1 value: 8.960962419741463 - type: nauc_recall_at_100_max value: 35.81132850291491 - type: nauc_recall_at_100_std value: 40.020903251786166 - type: nauc_recall_at_10_diff1 value: 15.603400751376636 - type: nauc_recall_at_10_max value: 37.570127529136485 - type: nauc_recall_at_10_std value: 28.07128410238545 - type: nauc_recall_at_1_diff1 value: 43.65424623953092 - type: nauc_recall_at_1_max value: 45.17986509998762 - type: nauc_recall_at_1_std value: 8.497107052335414 - type: nauc_recall_at_20_diff1 value: 13.844820282832346 - type: nauc_recall_at_20_max value: 36.0106148516309 - type: nauc_recall_at_20_std value: 31.453103910565254 - type: nauc_recall_at_3_diff1 value: 24.359328154117748 - type: nauc_recall_at_3_max value: 39.93774251377568 - type: nauc_recall_at_3_std value: 16.214921517509648 - type: nauc_recall_at_5_diff1 value: 18.75788451360292 - type: nauc_recall_at_5_max value: 38.177646107055516 - type: nauc_recall_at_5_std value: 22.17196825834675 - type: ndcg_at_1 value: 46.71 - type: ndcg_at_10 value: 46.113 - type: ndcg_at_100 value: 53.035 - type: ndcg_at_1000 value: 55.724 - type: ndcg_at_20 value: 48.929 - type: ndcg_at_3 value: 39.501999999999995 - type: ndcg_at_5 value: 41.792 - type: precision_at_1 value: 46.71 - type: precision_at_10 value: 14.274000000000001 - type: precision_at_100 value: 2.1870000000000003 - type: precision_at_1000 value: 0.269 - type: precision_at_20 value: 8.375 - type: precision_at_3 value: 29.881 - type: precision_at_5 value: 22.697 - type: recall_at_1 value: 20.122999999999998 - type: recall_at_10 value: 52.22 - type: recall_at_100 value: 75.388 - type: recall_at_1000 value: 89.938 - type: recall_at_20 value: 60.077000000000005 - type: recall_at_3 value: 35.150999999999996 - type: recall_at_5 value: 42.748000000000005 task: type: Retrieval - dataset: config: default name: MTEB DBPedia revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 split: test type: mteb/dbpedia metrics: - type: main_score value: 52.276999999999994 - type: map_at_1 value: 9.949 - type: map_at_10 value: 24.891 - type: map_at_100 value: 37.111 - type: map_at_1000 value: 39.266 - type: map_at_20 value: 29.685 - type: map_at_3 value: 16.586000000000002 - type: map_at_5 value: 19.982 - type: mrr_at_1 value: 76.25 - type: mrr_at_10 value: 82.4518849206349 - type: mrr_at_100 value: 82.70302194564499 - type: mrr_at_1000 value: 82.70909729942254 - type: mrr_at_20 value: 82.60492765962964 - type: mrr_at_3 value: 81.33333333333331 - type: mrr_at_5 value: 82.14583333333331 - type: nauc_map_at_1000_diff1 value: 21.427201262456556 - type: nauc_map_at_1000_max value: 35.357361590816076 - type: nauc_map_at_1000_std value: 24.785419223353717 - type: nauc_map_at_100_diff1 value: 22.82358692021537 - type: nauc_map_at_100_max value: 35.07399692072945 - type: nauc_map_at_100_std value: 22.679878828987025 - type: nauc_map_at_10_diff1 value: 26.491769223479643 - type: nauc_map_at_10_max value: 20.78079385443902 - type: nauc_map_at_10_std value: -4.910406292079661 - type: nauc_map_at_1_diff1 value: 35.20851030208876 - type: nauc_map_at_1_max value: 5.783003346365858 - type: nauc_map_at_1_std value: -21.11679133835354 - type: nauc_map_at_20_diff1 value: 24.80097499300491 - type: nauc_map_at_20_max value: 26.807021360774975 - type: nauc_map_at_20_std value: 4.793103995429955 - type: nauc_map_at_3_diff1 value: 29.238193458890173 - type: nauc_map_at_3_max value: 10.300839972189456 - type: 
nauc_map_at_3_std value: -17.889666731981592 - type: nauc_map_at_5_diff1 value: 28.773624870573926 - type: nauc_map_at_5_max value: 14.951435645422887 - type: nauc_map_at_5_std value: -13.319697827173565 - type: nauc_mrr_at_1000_diff1 value: 55.232544856708785 - type: nauc_mrr_at_1000_max value: 64.73225637682637 - type: nauc_mrr_at_1000_std value: 37.57480399594188 - type: nauc_mrr_at_100_diff1 value: 55.219251601773735 - type: nauc_mrr_at_100_max value: 64.73305063663611 - type: nauc_mrr_at_100_std value: 37.56458562909293 - type: nauc_mrr_at_10_diff1 value: 55.123463838253464 - type: nauc_mrr_at_10_max value: 64.91914041040233 - type: nauc_mrr_at_10_std value: 37.76482503851598 - type: nauc_mrr_at_1_diff1 value: 56.45461238513347 - type: nauc_mrr_at_1_max value: 63.11782510293676 - type: nauc_mrr_at_1_std value: 33.592561284868985 - type: nauc_mrr_at_20_diff1 value: 55.15401961460458 - type: nauc_mrr_at_20_max value: 64.77145835613156 - type: nauc_mrr_at_20_std value: 37.471561418305804 - type: nauc_mrr_at_3_diff1 value: 54.64387438697658 - type: nauc_mrr_at_3_max value: 64.27618995019164 - type: nauc_mrr_at_3_std value: 39.391637295269014 - type: nauc_mrr_at_5_diff1 value: 55.08702591239485 - type: nauc_mrr_at_5_max value: 64.6071475650635 - type: nauc_mrr_at_5_std value: 37.97185134269896 - type: nauc_ndcg_at_1000_diff1 value: 31.696698876400387 - type: nauc_ndcg_at_1000_max value: 52.12183760001191 - type: nauc_ndcg_at_1000_std value: 40.197596211778716 - type: nauc_ndcg_at_100_diff1 value: 33.253120193433666 - type: nauc_ndcg_at_100_max value: 49.47167758554746 - type: nauc_ndcg_at_100_std value: 32.643833139756204 - type: nauc_ndcg_at_10_diff1 value: 27.065541392580013 - type: nauc_ndcg_at_10_max value: 45.83504281289289 - type: nauc_ndcg_at_10_std value: 27.11739500732328 - type: nauc_ndcg_at_1_diff1 value: 49.42808250022517 - type: nauc_ndcg_at_1_max value: 53.502615048520354 - type: nauc_ndcg_at_1_std value: 27.17555908836708 - type: nauc_ndcg_at_20_diff1 value: 29.374791382330308 - type: nauc_ndcg_at_20_max value: 43.91246842479055 - type: nauc_ndcg_at_20_std value: 23.419410620550316 - type: nauc_ndcg_at_3_diff1 value: 26.71550354496204 - type: nauc_ndcg_at_3_max value: 43.9641457892003 - type: nauc_ndcg_at_3_std value: 27.320024167947686 - type: nauc_ndcg_at_5_diff1 value: 27.020654974589487 - type: nauc_ndcg_at_5_max value: 46.130417266030584 - type: nauc_ndcg_at_5_std value: 28.392009019010068 - type: nauc_precision_at_1000_diff1 value: -21.47455482181002 - type: nauc_precision_at_1000_max value: -9.721907229236024 - type: nauc_precision_at_1000_std value: -1.061132062651487 - type: nauc_precision_at_100_diff1 value: -12.35759246101943 - type: nauc_precision_at_100_max value: 15.509512444892168 - type: nauc_precision_at_100_std value: 36.21183578592014 - type: nauc_precision_at_10_diff1 value: -6.136998947343125 - type: nauc_precision_at_10_max value: 32.30037906748288 - type: nauc_precision_at_10_std value: 41.4500302476981 - type: nauc_precision_at_1_diff1 value: 56.45461238513347 - type: nauc_precision_at_1_max value: 63.11782510293676 - type: nauc_precision_at_1_std value: 33.592561284868985 - type: nauc_precision_at_20_diff1 value: -7.335890123683174 - type: nauc_precision_at_20_max value: 28.31417075291312 - type: nauc_precision_at_20_std value: 41.405935715061815 - type: nauc_precision_at_3_diff1 value: 7.117255890225942 - type: nauc_precision_at_3_max value: 39.19894132683829 - type: nauc_precision_at_3_std value: 38.48255841994843 - type: nauc_precision_at_5_diff1 
value: 1.861523090114206 - type: nauc_precision_at_5_max value: 38.11649223007208 - type: nauc_precision_at_5_std value: 40.52993530374645 - type: nauc_recall_at_1000_diff1 value: 26.497648584314636 - type: nauc_recall_at_1000_max value: 44.48069746734414 - type: nauc_recall_at_1000_std value: 53.16438130228715 - type: nauc_recall_at_100_diff1 value: 26.353456899511446 - type: nauc_recall_at_100_max value: 37.57379787884197 - type: nauc_recall_at_100_std value: 29.197468295989548 - type: nauc_recall_at_10_diff1 value: 22.80445738351114 - type: nauc_recall_at_10_max value: 15.895630778449046 - type: nauc_recall_at_10_std value: -8.746224797644501 - type: nauc_recall_at_1_diff1 value: 35.20851030208876 - type: nauc_recall_at_1_max value: 5.783003346365858 - type: nauc_recall_at_1_std value: -21.11679133835354 - type: nauc_recall_at_20_diff1 value: 22.34028867678706 - type: nauc_recall_at_20_max value: 21.42373427646772 - type: nauc_recall_at_20_std value: 0.4533036151015875 - type: nauc_recall_at_3_diff1 value: 24.96853445599229 - type: nauc_recall_at_3_max value: 6.245185375804208 - type: nauc_recall_at_3_std value: -20.200240127099622 - type: nauc_recall_at_5_diff1 value: 24.749259476710623 - type: nauc_recall_at_5_max value: 11.024592845995942 - type: nauc_recall_at_5_std value: -16.15683085641543 - type: ndcg_at_1 value: 64.125 - type: ndcg_at_10 value: 52.276999999999994 - type: ndcg_at_100 value: 57.440000000000005 - type: ndcg_at_1000 value: 64.082 - type: ndcg_at_20 value: 51.383 - type: ndcg_at_3 value: 55.769000000000005 - type: ndcg_at_5 value: 53.978 - type: precision_at_1 value: 76.25 - type: precision_at_10 value: 43.05 - type: precision_at_100 value: 14.09 - type: precision_at_1000 value: 2.662 - type: precision_at_20 value: 33.112 - type: precision_at_3 value: 59.833000000000006 - type: precision_at_5 value: 53.05 - type: recall_at_1 value: 9.949 - type: recall_at_10 value: 30.424 - type: recall_at_100 value: 64.062 - type: recall_at_1000 value: 85.916 - type: recall_at_20 value: 39.895 - type: recall_at_3 value: 17.876 - type: recall_at_5 value: 22.536 task: type: Retrieval - dataset: config: default name: MTEB EmotionClassification revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 split: test type: mteb/emotion metrics: - type: accuracy value: 84.29499999999999 - type: f1 value: 79.76188258172078 - type: f1_weighted value: 84.96026012933847 - type: main_score value: 84.29499999999999 task: type: Classification - dataset: config: default name: MTEB FEVER revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 split: test type: mteb/fever metrics: - type: main_score value: 94.83200000000001 - type: map_at_1 value: 87.339 - type: map_at_10 value: 92.92099999999999 - type: map_at_100 value: 93.108 - type: map_at_1000 value: 93.116 - type: map_at_20 value: 93.041 - type: map_at_3 value: 92.219 - type: map_at_5 value: 92.664 - type: mrr_at_1 value: 93.99939993999399 - type: mrr_at_10 value: 96.55188137861403 - type: mrr_at_100 value: 96.5652366009286 - type: mrr_at_1000 value: 96.5652625550811 - type: mrr_at_20 value: 96.5601781754844 - type: mrr_at_3 value: 96.45714571457142 - type: mrr_at_5 value: 96.544904490449 - type: nauc_map_at_1000_diff1 value: 51.81676454961933 - type: nauc_map_at_1000_max value: 24.904822914926118 - type: nauc_map_at_1000_std value: -3.8110347821630404 - type: nauc_map_at_100_diff1 value: 51.77514975011158 - type: nauc_map_at_100_max value: 24.912497341800094 - type: nauc_map_at_100_std value: -3.76229517662447 - type: nauc_map_at_10_diff1 value: 
51.29608296382479 - type: nauc_map_at_10_max value: 24.78704970246707 - type: nauc_map_at_10_std value: -3.723130815783328 - type: nauc_map_at_1_diff1 value: 59.90813138005125 - type: nauc_map_at_1_max value: 24.58479295693794 - type: nauc_map_at_1_std value: -8.056152492777027 - type: nauc_map_at_20_diff1 value: 51.428639331678326 - type: nauc_map_at_20_max value: 24.849214517705086 - type: nauc_map_at_20_std value: -3.685550123874596 - type: nauc_map_at_3_diff1 value: 50.94399923719279 - type: nauc_map_at_3_max value: 24.359700180006207 - type: nauc_map_at_3_std value: -5.407767408816422 - type: nauc_map_at_5_diff1 value: 50.767302682959546 - type: nauc_map_at_5_max value: 24.491113461892215 - type: nauc_map_at_5_std value: -4.058336127339082 - type: nauc_mrr_at_1000_diff1 value: 79.86042313551833 - type: nauc_mrr_at_1000_max value: 23.20960445633933 - type: nauc_mrr_at_1000_std value: -23.54334295120471 - type: nauc_mrr_at_100_diff1 value: 79.85991247027636 - type: nauc_mrr_at_100_max value: 23.210085926780106 - type: nauc_mrr_at_100_std value: -23.542508200789197 - type: nauc_mrr_at_10_diff1 value: 79.71095155563415 - type: nauc_mrr_at_10_max value: 23.24128650883908 - type: nauc_mrr_at_10_std value: -23.408502781834102 - type: nauc_mrr_at_1_diff1 value: 82.6349900233902 - type: nauc_mrr_at_1_max value: 21.994548214014227 - type: nauc_mrr_at_1_std value: -22.549769792179262 - type: nauc_mrr_at_20_diff1 value: 79.76465012873038 - type: nauc_mrr_at_20_max value: 23.17575026523213 - type: nauc_mrr_at_20_std value: -23.492660166315048 - type: nauc_mrr_at_3_diff1 value: 79.91074933379953 - type: nauc_mrr_at_3_max value: 24.14246499097892 - type: nauc_mrr_at_3_std value: -25.22601708389664 - type: nauc_mrr_at_5_diff1 value: 79.62092651565847 - type: nauc_mrr_at_5_max value: 23.315937737034425 - type: nauc_mrr_at_5_std value: -23.317659360058403 - type: nauc_ndcg_at_1000_diff1 value: 54.404537986779225 - type: nauc_ndcg_at_1000_max value: 25.38408304128995 - type: nauc_ndcg_at_1000_std value: -4.916709117696968 - type: nauc_ndcg_at_100_diff1 value: 53.2448598868241 - type: nauc_ndcg_at_100_max value: 25.75325255295546 - type: nauc_ndcg_at_100_std value: -3.680507005630751 - type: nauc_ndcg_at_10_diff1 value: 50.81057355170232 - type: nauc_ndcg_at_10_max value: 25.006448273343807 - type: nauc_ndcg_at_10_std value: -2.8979899112515577 - type: nauc_ndcg_at_1_diff1 value: 82.6349900233902 - type: nauc_ndcg_at_1_max value: 21.994548214014227 - type: nauc_ndcg_at_1_std value: -22.549769792179262 - type: nauc_ndcg_at_20_diff1 value: 51.205023097166304 - type: nauc_ndcg_at_20_max value: 25.22133626556826 - type: nauc_ndcg_at_20_std value: -2.9506328244150155 - type: nauc_ndcg_at_3_diff1 value: 51.79780256736321 - type: nauc_ndcg_at_3_max value: 24.81137324438439 - type: nauc_ndcg_at_3_std value: -6.881223858227807 - type: nauc_ndcg_at_5_diff1 value: 50.290038260564565 - type: nauc_ndcg_at_5_max value: 24.57250792165796 - type: nauc_ndcg_at_5_std value: -3.5124628344654596 - type: nauc_precision_at_1000_diff1 value: -20.215211396894333 - type: nauc_precision_at_1000_max value: -14.165452298769171 - type: nauc_precision_at_1000_std value: -2.0952871214470816 - type: nauc_precision_at_100_diff1 value: -22.340257474494607 - type: nauc_precision_at_100_max value: -12.697885641360282 - type: nauc_precision_at_100_std value: 1.0688624940286244 - type: nauc_precision_at_10_diff1 value: -24.78271817420798 - type: nauc_precision_at_10_max value: -12.625257500222656 - type: nauc_precision_at_10_std value: 
3.223250450607087 - type: nauc_precision_at_1_diff1 value: 82.6349900233902 - type: nauc_precision_at_1_max value: 21.994548214014227 - type: nauc_precision_at_1_std value: -22.549769792179262 - type: nauc_precision_at_20_diff1 value: -24.375756227194177 - type: nauc_precision_at_20_max value: -12.341015011563536 - type: nauc_precision_at_20_std value: 2.7475274619387955 - type: nauc_precision_at_3_diff1 value: -24.8251306777365 - type: nauc_precision_at_3_max value: -13.109579709589042 - type: nauc_precision_at_3_std value: -1.2233442335420748 - type: nauc_precision_at_5_diff1 value: -26.955418583344894 - type: nauc_precision_at_5_max value: -13.598630838071015 - type: nauc_precision_at_5_std value: 2.545780631940738 - type: nauc_recall_at_1000_diff1 value: 0.2542680835344437 - type: nauc_recall_at_1000_max value: 49.38194243035277 - type: nauc_recall_at_1000_std value: 57.021502715846026 - type: nauc_recall_at_100_diff1 value: 5.062154815367015 - type: nauc_recall_at_100_max value: 45.41178380188437 - type: nauc_recall_at_100_std value: 50.78382225901813 - type: nauc_recall_at_10_diff1 value: 20.429153629007818 - type: nauc_recall_at_10_max value: 27.516855026155508 - type: nauc_recall_at_10_std value: 21.367491371755467 - type: nauc_recall_at_1_diff1 value: 59.90813138005125 - type: nauc_recall_at_1_max value: 24.58479295693794 - type: nauc_recall_at_1_std value: -8.056152492777027 - type: nauc_recall_at_20_diff1 value: 13.072430858896942 - type: nauc_recall_at_20_max value: 29.5522659183247 - type: nauc_recall_at_20_std value: 28.70569974090291 - type: nauc_recall_at_3_diff1 value: 30.419084482663617 - type: nauc_recall_at_3_max value: 25.627389580252835 - type: nauc_recall_at_3_std value: 2.5557690877637054 - type: nauc_recall_at_5_diff1 value: 22.92561435069869 - type: nauc_recall_at_5_max value: 25.545265063475455 - type: nauc_recall_at_5_std value: 14.736172663072786 - type: ndcg_at_1 value: 93.999 - type: ndcg_at_10 value: 94.83200000000001 - type: ndcg_at_100 value: 95.363 - type: ndcg_at_1000 value: 95.478 - type: ndcg_at_20 value: 95.077 - type: ndcg_at_3 value: 94.143 - type: ndcg_at_5 value: 94.525 - type: precision_at_1 value: 93.999 - type: precision_at_10 value: 11.029 - type: precision_at_100 value: 1.1560000000000001 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_20 value: 5.62 - type: precision_at_3 value: 35.219 - type: precision_at_5 value: 21.584 - type: recall_at_1 value: 87.339 - type: recall_at_10 value: 97.026 - type: recall_at_100 value: 98.936 - type: recall_at_1000 value: 99.599 - type: recall_at_20 value: 97.744 - type: recall_at_3 value: 95.069 - type: recall_at_5 value: 96.177 task: type: Retrieval - dataset: config: default name: MTEB FiQA2018 revision: 27a168819829fe9bcd655c2df245fb19452e8e06 split: test type: mteb/fiqa metrics: - type: main_score value: 60.480000000000004 - type: map_at_1 value: 31.529 - type: map_at_10 value: 52.081 - type: map_at_100 value: 54.342 - type: map_at_1000 value: 54.449000000000005 - type: map_at_20 value: 53.479 - type: map_at_3 value: 45.471000000000004 - type: map_at_5 value: 49.164 - type: mrr_at_1 value: 60.03086419753087 - type: mrr_at_10 value: 67.73754409171075 - type: mrr_at_100 value: 68.332432152368 - type: mrr_at_1000 value: 68.34150941774908 - type: mrr_at_20 value: 68.14780993838725 - type: mrr_at_3 value: 65.6378600823045 - type: mrr_at_5 value: 66.88014403292176 - type: nauc_map_at_1000_diff1 value: 45.36598134579052 - type: nauc_map_at_1000_max value: 31.891451119906943 - type: 
nauc_map_at_1000_std value: -15.41454384137943 - type: nauc_map_at_100_diff1 value: 45.31268291874018 - type: nauc_map_at_100_max value: 31.811055683002092 - type: nauc_map_at_100_std value: -15.348503855591417 - type: nauc_map_at_10_diff1 value: 45.22606983565892 - type: nauc_map_at_10_max value: 30.46108534749699 - type: nauc_map_at_10_std value: -16.618086029682555 - type: nauc_map_at_1_diff1 value: 49.94952823753276 - type: nauc_map_at_1_max value: 13.770377574254548 - type: nauc_map_at_1_std value: -14.946357968858653 - type: nauc_map_at_20_diff1 value: 45.29274207897926 - type: nauc_map_at_20_max value: 31.27332015148257 - type: nauc_map_at_20_std value: -15.782946115613129 - type: nauc_map_at_3_diff1 value: 47.94248233566038 - type: nauc_map_at_3_max value: 24.022838776825456 - type: nauc_map_at_3_std value: -17.103518542262208 - type: nauc_map_at_5_diff1 value: 45.85345590031722 - type: nauc_map_at_5_max value: 27.78341379004547 - type: nauc_map_at_5_std value: -17.490850791756326 - type: nauc_mrr_at_1000_diff1 value: 58.225141047822824 - type: nauc_mrr_at_1000_max value: 43.39606904140525 - type: nauc_mrr_at_1000_std value: -14.64093518199122 - type: nauc_mrr_at_100_diff1 value: 58.22137274179545 - type: nauc_mrr_at_100_max value: 43.39567568136935 - type: nauc_mrr_at_100_std value: -14.62512313985582 - type: nauc_mrr_at_10_diff1 value: 58.03217329957151 - type: nauc_mrr_at_10_max value: 43.633561683075186 - type: nauc_mrr_at_10_std value: -14.563703576023808 - type: nauc_mrr_at_1_diff1 value: 61.48979902647692 - type: nauc_mrr_at_1_max value: 43.1938079066948 - type: nauc_mrr_at_1_std value: -15.808138277440465 - type: nauc_mrr_at_20_diff1 value: 58.13185370150794 - type: nauc_mrr_at_20_max value: 43.35607721183147 - type: nauc_mrr_at_20_std value: -14.635812702971263 - type: nauc_mrr_at_3_diff1 value: 58.698963168321264 - type: nauc_mrr_at_3_max value: 43.633129249785405 - type: nauc_mrr_at_3_std value: -15.733246346983854 - type: nauc_mrr_at_5_diff1 value: 57.94156745229547 - type: nauc_mrr_at_5_max value: 43.14152462640525 - type: nauc_mrr_at_5_std value: -15.318685307750895 - type: nauc_ndcg_at_1000_diff1 value: 47.871896043731496 - type: nauc_ndcg_at_1000_max value: 37.159845167533426 - type: nauc_ndcg_at_1000_std value: -13.067288160833485 - type: nauc_ndcg_at_100_diff1 value: 47.046171407204426 - type: nauc_ndcg_at_100_max value: 36.422514360855835 - type: nauc_ndcg_at_100_std value: -11.636859259571441 - type: nauc_ndcg_at_10_diff1 value: 46.232628149078096 - type: nauc_ndcg_at_10_max value: 34.82402625088358 - type: nauc_ndcg_at_10_std value: -14.768545542980114 - type: nauc_ndcg_at_1_diff1 value: 61.48979902647692 - type: nauc_ndcg_at_1_max value: 43.1938079066948 - type: nauc_ndcg_at_1_std value: -15.808138277440465 - type: nauc_ndcg_at_20_diff1 value: 46.51116172390955 - type: nauc_ndcg_at_20_max value: 35.36362650568298 - type: nauc_ndcg_at_20_std value: -12.849406209182826 - type: nauc_ndcg_at_3_diff1 value: 47.39832263785871 - type: nauc_ndcg_at_3_max value: 35.67466264628456 - type: nauc_ndcg_at_3_std value: -17.257717349296943 - type: nauc_ndcg_at_5_diff1 value: 45.91049493804232 - type: nauc_ndcg_at_5_max value: 33.8405091138445 - type: nauc_ndcg_at_5_std value: -17.477069902735895 - type: nauc_precision_at_1000_diff1 value: -12.037873000917767 - type: nauc_precision_at_1000_max value: 26.043220150002295 - type: nauc_precision_at_1000_std value: 6.84910668321572 - type: nauc_precision_at_100_diff1 value: -9.383403459051864 - type: nauc_precision_at_100_max 
value: 29.68713170610003 - type: nauc_precision_at_100_std value: 10.079531587056152 - type: nauc_precision_at_10_diff1 value: 3.3433323353925135 - type: nauc_precision_at_10_max value: 38.31790111725993 - type: nauc_precision_at_10_std value: 0.7888123304710856 - type: nauc_precision_at_1_diff1 value: 61.48979902647692 - type: nauc_precision_at_1_max value: 43.1938079066948 - type: nauc_precision_at_1_std value: -15.808138277440465 - type: nauc_precision_at_20_diff1 value: -2.083500986294448 - type: nauc_precision_at_20_max value: 35.77143835726343 - type: nauc_precision_at_20_std value: 5.318547021874003 - type: nauc_precision_at_3_diff1 value: 23.335617788912586 - type: nauc_precision_at_3_max value: 39.81973275320871 - type: nauc_precision_at_3_std value: -8.442769390555561 - type: nauc_precision_at_5_diff1 value: 11.521087842589482 - type: nauc_precision_at_5_max value: 39.527792539828255 - type: nauc_precision_at_5_std value: -5.412729503701626 - type: nauc_recall_at_1000_diff1 value: 10.6830893047453 - type: nauc_recall_at_1000_max value: 8.834504311238423 - type: nauc_recall_at_1000_std value: 24.670754304859692 - type: nauc_recall_at_100_diff1 value: 20.646020385527358 - type: nauc_recall_at_100_max value: 20.121595011523294 - type: nauc_recall_at_100_std value: 19.42307459311791 - type: nauc_recall_at_10_diff1 value: 33.01029313733417 - type: nauc_recall_at_10_max value: 27.948634980368702 - type: nauc_recall_at_10_std value: -10.239767371462975 - type: nauc_recall_at_1_diff1 value: 49.94952823753276 - type: nauc_recall_at_1_max value: 13.770377574254548 - type: nauc_recall_at_1_std value: -14.946357968858653 - type: nauc_recall_at_20_diff1 value: 30.040111045267963 - type: nauc_recall_at_20_max value: 25.984919302418184 - type: nauc_recall_at_20_std value: -1.4998001817460804 - type: nauc_recall_at_3_diff1 value: 42.24410559113653 - type: nauc_recall_at_3_max value: 20.269503583626914 - type: nauc_recall_at_3_std value: -17.09578532600584 - type: nauc_recall_at_5_diff1 value: 36.124149735848945 - type: nauc_recall_at_5_max value: 22.708022306002622 - type: nauc_recall_at_5_std value: -16.966976847236193 - type: ndcg_at_1 value: 60.031 - type: ndcg_at_10 value: 60.480000000000004 - type: ndcg_at_100 value: 66.94099999999999 - type: ndcg_at_1000 value: 68.303 - type: ndcg_at_20 value: 63.536 - type: ndcg_at_3 value: 55.903999999999996 - type: ndcg_at_5 value: 57.387 - type: precision_at_1 value: 60.031 - type: precision_at_10 value: 16.682 - type: precision_at_100 value: 2.336 - type: precision_at_1000 value: 0.259 - type: precision_at_20 value: 9.66 - type: precision_at_3 value: 37.191 - type: precision_at_5 value: 27.253 - type: recall_at_1 value: 31.529 - type: recall_at_10 value: 68.035 - type: recall_at_100 value: 90.925 - type: recall_at_1000 value: 98.688 - type: recall_at_20 value: 77.453 - type: recall_at_3 value: 50.221000000000004 - type: recall_at_5 value: 58.209999999999994 task: type: Retrieval - dataset: config: default name: MTEB HotpotQA revision: ab518f4d6fcca38d87c25209f94beba119d02014 split: test type: mteb/hotpotqa metrics: - type: main_score value: 76.67399999999999 - type: map_at_1 value: 43.822 - type: map_at_10 value: 68.82000000000001 - type: map_at_100 value: 69.659 - type: map_at_1000 value: 69.714 - type: map_at_20 value: 69.305 - type: map_at_3 value: 65.517 - type: map_at_5 value: 67.633 - type: mrr_at_1 value: 87.643484132343 - type: mrr_at_10 value: 91.28134679485098 - type: mrr_at_100 value: 91.37985230614755 - type: mrr_at_1000 value: 
91.38202467630681 - type: mrr_at_20 value: 91.34718855278429 - type: mrr_at_3 value: 90.75849651136599 - type: mrr_at_5 value: 91.10961062345235 - type: nauc_map_at_1000_diff1 value: 3.7670405082837477 - type: nauc_map_at_1000_max value: 14.410594409695182 - type: nauc_map_at_1000_std value: 7.94738583292685 - type: nauc_map_at_100_diff1 value: 3.738796209193936 - type: nauc_map_at_100_max value: 14.408029101534694 - type: nauc_map_at_100_std value: 7.979641077687816 - type: nauc_map_at_10_diff1 value: 3.334917978089454 - type: nauc_map_at_10_max value: 13.975255289147748 - type: nauc_map_at_10_std value: 7.491959628012161 - type: nauc_map_at_1_diff1 value: 75.35066482050009 - type: nauc_map_at_1_max value: 53.573503488571475 - type: nauc_map_at_1_std value: -6.542030594426993 - type: nauc_map_at_20_diff1 value: 3.5197129341582083 - type: nauc_map_at_20_max value: 14.159880698006816 - type: nauc_map_at_20_std value: 7.856574384998483 - type: nauc_map_at_3_diff1 value: 3.0992333232864064 - type: nauc_map_at_3_max value: 12.513959281222112 - type: nauc_map_at_3_std value: 4.352912866014865 - type: nauc_map_at_5_diff1 value: 3.0351688998572537 - type: nauc_map_at_5_max value: 13.21599457624529 - type: nauc_map_at_5_std value: 6.246882983214777 - type: nauc_mrr_at_1000_diff1 value: 75.23953736361132 - type: nauc_mrr_at_1000_max value: 56.64260717262164 - type: nauc_mrr_at_1000_std value: -4.865932053762276 - type: nauc_mrr_at_100_diff1 value: 75.24091372816497 - type: nauc_mrr_at_100_max value: 56.64831104504846 - type: nauc_mrr_at_100_std value: -4.850966297943324 - type: nauc_mrr_at_10_diff1 value: 75.26540178053416 - type: nauc_mrr_at_10_max value: 56.828755673428965 - type: nauc_mrr_at_10_std value: -4.8401126970944635 - type: nauc_mrr_at_1_diff1 value: 75.35066482050009 - type: nauc_mrr_at_1_max value: 53.573503488571475 - type: nauc_mrr_at_1_std value: -6.542030594426993 - type: nauc_mrr_at_20_diff1 value: 75.24453050729845 - type: nauc_mrr_at_20_max value: 56.69220588401435 - type: nauc_mrr_at_20_std value: -4.843700730832108 - type: nauc_mrr_at_3_diff1 value: 74.98411648336175 - type: nauc_mrr_at_3_max value: 56.766537573537114 - type: nauc_mrr_at_3_std value: -4.909712671649337 - type: nauc_mrr_at_5_diff1 value: 75.20599020991028 - type: nauc_mrr_at_5_max value: 56.64236207782237 - type: nauc_mrr_at_5_std value: -5.208907367513977 - type: nauc_ndcg_at_1000_diff1 value: 11.48307079099774 - type: nauc_ndcg_at_1000_max value: 20.893326881675176 - type: nauc_ndcg_at_1000_std value: 10.43489838692119 - type: nauc_ndcg_at_100_diff1 value: 10.395588735754927 - type: nauc_ndcg_at_100_max value: 20.529573302516912 - type: nauc_ndcg_at_100_std value: 11.252973083654268 - type: nauc_ndcg_at_10_diff1 value: 8.596739352741972 - type: nauc_ndcg_at_10_max value: 18.475863682540673 - type: nauc_ndcg_at_10_std value: 9.175831033463352 - type: nauc_ndcg_at_1_diff1 value: 75.35066482050009 - type: nauc_ndcg_at_1_max value: 53.573503488571475 - type: nauc_ndcg_at_1_std value: -6.542030594426993 - type: nauc_ndcg_at_20_diff1 value: 8.998033972471749 - type: nauc_ndcg_at_20_max value: 18.892085875404522 - type: nauc_ndcg_at_20_std value: 10.3241608901084 - type: nauc_ndcg_at_3_diff1 value: 8.796384949533579 - type: nauc_ndcg_at_3_max value: 16.515261419885274 - type: nauc_ndcg_at_3_std value: 4.081902976576701 - type: nauc_ndcg_at_5_diff1 value: 8.277259464605025 - type: nauc_ndcg_at_5_max value: 17.163053202909527 - type: nauc_ndcg_at_5_std value: 6.652669449704474 - type: nauc_precision_at_1000_diff1 
value: -3.490556596304827 - type: nauc_precision_at_1000_max value: 31.0473259001597 - type: nauc_precision_at_1000_std value: 52.36921397692622 - type: nauc_precision_at_100_diff1 value: -6.420747959222489 - type: nauc_precision_at_100_max value: 20.555887056005936 - type: nauc_precision_at_100_std value: 36.119132870798495 - type: nauc_precision_at_10_diff1 value: -6.461726057290426 - type: nauc_precision_at_10_max value: 12.161081825341915 - type: nauc_precision_at_10_std value: 17.961318451839993 - type: nauc_precision_at_1_diff1 value: 75.35066482050009 - type: nauc_precision_at_1_max value: 53.573503488571475 - type: nauc_precision_at_1_std value: -6.542030594426993 - type: nauc_precision_at_20_diff1 value: -7.361461296416161 - type: nauc_precision_at_20_max value: 12.663621261696733 - type: nauc_precision_at_20_std value: 23.312476851670286 - type: nauc_precision_at_3_diff1 value: -3.299056912774522 - type: nauc_precision_at_3_max value: 9.85602375812038 - type: nauc_precision_at_3_std value: 6.4962782003155475 - type: nauc_precision_at_5_diff1 value: -5.3155827772027795 - type: nauc_precision_at_5_max value: 10.32907751171833 - type: nauc_precision_at_5_std value: 11.384098087196932 - type: nauc_recall_at_1000_diff1 value: -3.4905565963043332 - type: nauc_recall_at_1000_max value: 31.04732590016041 - type: nauc_recall_at_1000_std value: 52.36921397692641 - type: nauc_recall_at_100_diff1 value: -6.420747959222586 - type: nauc_recall_at_100_max value: 20.55588705600596 - type: nauc_recall_at_100_std value: 36.11913287079825 - type: nauc_recall_at_10_diff1 value: -6.461726057290347 - type: nauc_recall_at_10_max value: 12.161081825342022 - type: nauc_recall_at_10_std value: 17.96131845184002 - type: nauc_recall_at_1_diff1 value: 75.35066482050009 - type: nauc_recall_at_1_max value: 53.573503488571475 - type: nauc_recall_at_1_std value: -6.542030594426993 - type: nauc_recall_at_20_diff1 value: -7.361461296416054 - type: nauc_recall_at_20_max value: 12.66362126169679 - type: nauc_recall_at_20_std value: 23.312476851670382 - type: nauc_recall_at_3_diff1 value: -3.2990569127745886 - type: nauc_recall_at_3_max value: 9.856023758120296 - type: nauc_recall_at_3_std value: 6.496278200315444 - type: nauc_recall_at_5_diff1 value: -5.315582777202729 - type: nauc_recall_at_5_max value: 10.329077511718229 - type: nauc_recall_at_5_std value: 11.384098087196932 - type: ndcg_at_1 value: 87.643 - type: ndcg_at_10 value: 76.67399999999999 - type: ndcg_at_100 value: 79.462 - type: ndcg_at_1000 value: 80.43599999999999 - type: ndcg_at_20 value: 77.83 - type: ndcg_at_3 value: 72.256 - type: ndcg_at_5 value: 74.789 - type: precision_at_1 value: 87.643 - type: precision_at_10 value: 15.726999999999999 - type: precision_at_100 value: 1.791 - type: precision_at_1000 value: 0.192 - type: precision_at_20 value: 8.236 - type: precision_at_3 value: 45.919 - type: precision_at_5 value: 29.558 - type: recall_at_1 value: 43.822 - type: recall_at_10 value: 78.636 - type: recall_at_100 value: 89.527 - type: recall_at_1000 value: 95.868 - type: recall_at_20 value: 82.363 - type: recall_at_3 value: 68.879 - type: recall_at_5 value: 73.896 task: type: Retrieval - dataset: config: default name: MTEB ImdbClassification revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 split: test type: mteb/imdb metrics: - type: accuracy value: 96.6608 - type: ap value: 95.14657820401189 - type: ap_weighted value: 95.14657820401189 - type: f1 value: 96.66029695623422 - type: f1_weighted value: 96.66029695623423 - type: main_score value: 
96.6608 task: type: Classification - dataset: config: default name: MTEB MSMARCO revision: c5a29a104738b98a9e76336939199e264163d4a0 split: dev type: mteb/msmarco metrics: - type: main_score value: 45.217 - type: map_at_1 value: 24.728 - type: map_at_10 value: 37.933 - type: map_at_100 value: 39.074999999999996 - type: map_at_1000 value: 39.115 - type: map_at_20 value: 38.663 - type: map_at_3 value: 33.904 - type: map_at_5 value: 36.217 - type: mrr_at_1 value: 25.44412607449857 - type: mrr_at_10 value: 38.52640196479737 - type: mrr_at_100 value: 39.60462889736067 - type: mrr_at_1000 value: 39.638904296248526 - type: mrr_at_20 value: 39.2234365827559 - type: mrr_at_3 value: 34.59646609360076 - type: mrr_at_5 value: 36.8801337153773 - type: nauc_map_at_1000_diff1 value: 37.645652178132174 - type: nauc_map_at_1000_max value: 9.953357023361367 - type: nauc_map_at_1000_std value: -20.800238036721503 - type: nauc_map_at_100_diff1 value: 37.643073495974555 - type: nauc_map_at_100_max value: 9.95921239641703 - type: nauc_map_at_100_std value: -20.76517765535793 - type: nauc_map_at_10_diff1 value: 37.44380763335014 - type: nauc_map_at_10_max value: 9.917273043055342 - type: nauc_map_at_10_std value: -21.467951225710898 - type: nauc_map_at_1_diff1 value: 41.02118887981969 - type: nauc_map_at_1_max value: 8.301113449711778 - type: nauc_map_at_1_std value: -19.436814224415027 - type: nauc_map_at_20_diff1 value: 37.58156586490493 - type: nauc_map_at_20_max value: 9.972927967610659 - type: nauc_map_at_20_std value: -20.951374218839387 - type: nauc_map_at_3_diff1 value: 37.67246795684178 - type: nauc_map_at_3_max value: 9.307031378909478 - type: nauc_map_at_3_std value: -21.77026217965021 - type: nauc_map_at_5_diff1 value: 37.39086482095963 - type: nauc_map_at_5_max value: 9.732739107368566 - type: nauc_map_at_5_std value: -21.8424296893692 - type: nauc_mrr_at_1000_diff1 value: 37.36666719603192 - type: nauc_mrr_at_1000_max value: 9.79040465289953 - type: nauc_mrr_at_1000_std value: -20.590147245965568 - type: nauc_mrr_at_100_diff1 value: 37.36560296629318 - type: nauc_mrr_at_100_max value: 9.798113710672162 - type: nauc_mrr_at_100_std value: -20.556791838504292 - type: nauc_mrr_at_10_diff1 value: 37.19257605840734 - type: nauc_mrr_at_10_max value: 9.749429811638063 - type: nauc_mrr_at_10_std value: -21.206407664327276 - type: nauc_mrr_at_1_diff1 value: 40.98478651095172 - type: nauc_mrr_at_1_max value: 8.173841799119707 - type: nauc_mrr_at_1_std value: -19.530027987868017 - type: nauc_mrr_at_20_diff1 value: 37.29973172861245 - type: nauc_mrr_at_20_max value: 9.815127660001345 - type: nauc_mrr_at_20_std value: -20.700860112175928 - type: nauc_mrr_at_3_diff1 value: 37.282848009425734 - type: nauc_mrr_at_3_max value: 9.172741713108193 - type: nauc_mrr_at_3_std value: -21.563630513502996 - type: nauc_mrr_at_5_diff1 value: 37.08609827303586 - type: nauc_mrr_at_5_max value: 9.604643424273284 - type: nauc_mrr_at_5_std value: -21.580110806494094 - type: nauc_ndcg_at_1000_diff1 value: 37.086587020218545 - type: nauc_ndcg_at_1000_max value: 10.696860688467472 - type: nauc_ndcg_at_1000_std value: -19.50989939916873 - type: nauc_ndcg_at_100_diff1 value: 37.03794531268128 - type: nauc_ndcg_at_100_max value: 10.940820719182339 - type: nauc_ndcg_at_100_std value: -18.28651832370893 - type: nauc_ndcg_at_10_diff1 value: 36.21062857920633 - type: nauc_ndcg_at_10_max value: 10.845172882571733 - type: nauc_ndcg_at_10_std value: -21.454301679510106 - type: nauc_ndcg_at_1_diff1 value: 40.98478651095172 - type: 
nauc_ndcg_at_1_max value: 8.173841799119707 - type: nauc_ndcg_at_1_std value: -19.530027987868017 - type: nauc_ndcg_at_20_diff1 value: 36.583262733100526 - type: nauc_ndcg_at_20_max value: 11.10492720898974 - type: nauc_ndcg_at_20_std value: -19.41753284137609 - type: nauc_ndcg_at_3_diff1 value: 36.57271365035382 - type: nauc_ndcg_at_3_max value: 9.56073433062999 - type: nauc_ndcg_at_3_std value: -22.324263670932915 - type: nauc_ndcg_at_5_diff1 value: 36.09419372820154 - type: nauc_ndcg_at_5_max value: 10.357384992631271 - type: nauc_ndcg_at_5_std value: -22.389578276324894 - type: nauc_precision_at_1000_diff1 value: -2.7435338714030597 - type: nauc_precision_at_1000_max value: 4.302274933383809 - type: nauc_precision_at_1000_std value: 8.456846348638948 - type: nauc_precision_at_100_diff1 value: 15.149466332615983 - type: nauc_precision_at_100_max value: 12.501013731673163 - type: nauc_precision_at_100_std value: 15.909667509021785 - type: nauc_precision_at_10_diff1 value: 28.699788688314214 - type: nauc_precision_at_10_max value: 13.024586051842347 - type: nauc_precision_at_10_std value: -19.197658937078703 - type: nauc_precision_at_1_diff1 value: 40.98478651095172 - type: nauc_precision_at_1_max value: 8.173841799119707 - type: nauc_precision_at_1_std value: -19.530027987868017 - type: nauc_precision_at_20_diff1 value: 26.519292942353395 - type: nauc_precision_at_20_max value: 14.389979272056438 - type: nauc_precision_at_20_std value: -7.030956994938155 - type: nauc_precision_at_3_diff1 value: 32.87913492278213 - type: nauc_precision_at_3_max value: 9.673660161387776 - type: nauc_precision_at_3_std value: -23.905612656592172 - type: nauc_precision_at_5_diff1 value: 30.903850113238597 - type: nauc_precision_at_5_max value: 11.482375434154898 - type: nauc_precision_at_5_std value: -23.828657095254247 - type: nauc_recall_at_1000_diff1 value: 35.80765639589219 - type: nauc_recall_at_1000_max value: 50.94532805969448 - type: nauc_recall_at_1000_std value: 66.79910877083275 - type: nauc_recall_at_100_diff1 value: 34.96182828311028 - type: nauc_recall_at_100_max value: 21.729699631790556 - type: nauc_recall_at_100_std value: 23.509439011686474 - type: nauc_recall_at_10_diff1 value: 31.88371369567137 - type: nauc_recall_at_10_max value: 14.425389702697073 - type: nauc_recall_at_10_std value: -20.95578001880924 - type: nauc_recall_at_1_diff1 value: 41.02118887981969 - type: nauc_recall_at_1_max value: 8.301113449711778 - type: nauc_recall_at_1_std value: -19.436814224415027 - type: nauc_recall_at_20_diff1 value: 32.42718780622455 - type: nauc_recall_at_20_max value: 16.90686126329399 - type: nauc_recall_at_20_std value: -9.38158227016737 - type: nauc_recall_at_3_diff1 value: 33.68966646043966 - type: nauc_recall_at_3_max value: 10.336277419708532 - type: nauc_recall_at_3_std value: -23.80165869168538 - type: nauc_recall_at_5_diff1 value: 32.26258807452426 - type: nauc_recall_at_5_max value: 12.303713005399935 - type: nauc_recall_at_5_std value: -23.87721891164968 - type: ndcg_at_1 value: 25.444 - type: ndcg_at_10 value: 45.217 - type: ndcg_at_100 value: 50.575 - type: ndcg_at_1000 value: 51.519999999999996 - type: ndcg_at_20 value: 47.786 - type: ndcg_at_3 value: 37.067 - type: ndcg_at_5 value: 41.184 - type: precision_at_1 value: 25.444 - type: precision_at_10 value: 7.07 - type: precision_at_100 value: 0.9730000000000001 - type: precision_at_1000 value: 0.106 - type: precision_at_20 value: 4.072 - type: precision_at_3 value: 15.754999999999999 - type: precision_at_5 value: 11.544 - type: 
recall_at_1 value: 24.728 - type: recall_at_10 value: 67.607 - type: recall_at_100 value: 92.094 - type: recall_at_1000 value: 99.165 - type: recall_at_20 value: 77.529 - type: recall_at_3 value: 45.535 - type: recall_at_5 value: 55.394 task: type: Retrieval - dataset: config: en name: MTEB MTOPDomainClassification (en) revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf split: test type: mteb/mtop_domain metrics: - type: accuracy value: 99.01276789785682 - type: f1 value: 98.9288649250924 - type: f1_weighted value: 99.01406884928141 - type: main_score value: 99.01276789785682 task: type: Classification - dataset: config: en name: MTEB MTOPIntentClassification (en) revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba split: test type: mteb/mtop_intent metrics: - type: accuracy value: 92.78385772913816 - type: f1 value: 79.78115704297824 - type: f1_weighted value: 93.90424147486428 - type: main_score value: 92.78385772913816 task: type: Classification - dataset: config: en name: MTEB MassiveIntentClassification (en) revision: 4672e20407010da34463acc759c162ca9734bca6 split: test type: mteb/amazon_massive_intent metrics: - type: accuracy value: 85.83053127101546 - type: f1 value: 82.72036139888232 - type: f1_weighted value: 85.81759723866098 - type: main_score value: 85.83053127101546 task: type: Classification - dataset: config: en name: MTEB MassiveScenarioClassification (en) revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 split: test type: mteb/amazon_massive_scenario metrics: - type: accuracy value: 90.19838601210489 - type: f1 value: 89.55260197964978 - type: f1_weighted value: 90.11422965504119 - type: main_score value: 90.19838601210489 task: type: Classification - dataset: config: default name: MTEB MedrxivClusteringP2P revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 split: test type: mteb/medrxiv-clustering-p2p metrics: - type: main_score value: 46.866746897607094 - type: v_measure value: 46.866746897607094 - type: v_measure_std value: 1.0966477896919726 task: type: Clustering - dataset: config: default name: MTEB MedrxivClusteringS2S revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 split: test type: mteb/medrxiv-clustering-s2s metrics: - type: main_score value: 44.6538827415503 - type: v_measure value: 44.6538827415503 - type: v_measure_std value: 1.1649569936599116 task: type: Clustering - dataset: config: default name: MTEB MindSmallReranking revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7 split: test type: mteb/mind_small metrics: - type: main_score value: 33.05449204940555 - type: map value: 33.05449204940555 - type: mrr value: 34.32562058439585 - type: nAUC_map_diff1 value: 11.465656013162807 - type: nAUC_map_max value: -20.400088169502308 - type: nAUC_map_std value: -2.638964886362445 - type: nAUC_mrr_diff1 value: 10.644290702481207 - type: nAUC_mrr_max value: -15.304687384645769 - type: nAUC_mrr_std value: -0.519919931348978 task: type: Reranking - dataset: config: default name: MTEB NFCorpus revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 split: test type: mteb/nfcorpus metrics: - type: main_score value: 41.998000000000005 - type: map_at_1 value: 6.907000000000001 - type: map_at_10 value: 16.397000000000002 - type: map_at_100 value: 21.69 - type: map_at_1000 value: 23.652 - type: map_at_20 value: 18.629 - type: map_at_3 value: 11.969000000000001 - type: map_at_5 value: 13.894 - type: mrr_at_1 value: 53.25077399380805 - type: mrr_at_10 value: 61.8561108653988 - type: mrr_at_100 value: 62.42447851935404 - type: mrr_at_1000 value: 62.459626424428095 - type: mrr_at_20 
value: 62.287236389990696 - type: mrr_at_3 value: 60.42311661506711 - type: mrr_at_5 value: 61.36738906088753 - type: nauc_map_at_1000_diff1 value: 17.159461939643844 - type: nauc_map_at_1000_max value: 32.42764938789903 - type: nauc_map_at_1000_std value: 11.039427848422093 - type: nauc_map_at_100_diff1 value: 19.089532984187503 - type: nauc_map_at_100_max value: 31.96721085058713 - type: nauc_map_at_100_std value: 6.947468655726444 - type: nauc_map_at_10_diff1 value: 25.77255342629802 - type: nauc_map_at_10_max value: 26.163590320961543 - type: nauc_map_at_10_std value: -5.2588093720998375 - type: nauc_map_at_1_diff1 value: 46.31602607957798 - type: nauc_map_at_1_max value: 11.807757660801942 - type: nauc_map_at_1_std value: -13.984889089354317 - type: nauc_map_at_20_diff1 value: 22.308161130465365 - type: nauc_map_at_20_max value: 29.070587307827722 - type: nauc_map_at_20_std value: -1.0103056620851558 - type: nauc_map_at_3_diff1 value: 33.580827849617506 - type: nauc_map_at_3_max value: 17.661630885799042 - type: nauc_map_at_3_std value: -11.463282544041888 - type: nauc_map_at_5_diff1 value: 30.32603342696912 - type: nauc_map_at_5_max value: 20.938905485667245 - type: nauc_map_at_5_std value: -10.537086968155755 - type: nauc_mrr_at_1000_diff1 value: 24.45065397805829 - type: nauc_mrr_at_1000_max value: 48.17519860927417 - type: nauc_mrr_at_1000_std value: 30.350767549118903 - type: nauc_mrr_at_100_diff1 value: 24.444061606534486 - type: nauc_mrr_at_100_max value: 48.1922894212229 - type: nauc_mrr_at_100_std value: 30.379257816584094 - type: nauc_mrr_at_10_diff1 value: 24.25598717198779 - type: nauc_mrr_at_10_max value: 48.10437607774264 - type: nauc_mrr_at_10_std value: 30.090202482685996 - type: nauc_mrr_at_1_diff1 value: 26.907595285201264 - type: nauc_mrr_at_1_max value: 44.006974050369955 - type: nauc_mrr_at_1_std value: 26.921001962861062 - type: nauc_mrr_at_20_diff1 value: 24.462771570553738 - type: nauc_mrr_at_20_max value: 48.264688196799746 - type: nauc_mrr_at_20_std value: 30.498095141265914 - type: nauc_mrr_at_3_diff1 value: 24.76829388237229 - type: nauc_mrr_at_3_max value: 48.213758704739924 - type: nauc_mrr_at_3_std value: 30.1502853918892 - type: nauc_mrr_at_5_diff1 value: 24.476494932330247 - type: nauc_mrr_at_5_max value: 47.977250552198804 - type: nauc_mrr_at_5_std value: 29.65248143104835 - type: nauc_ndcg_at_1000_diff1 value: 13.055818920426246 - type: nauc_ndcg_at_1000_max value: 46.00986444256306 - type: nauc_ndcg_at_1000_std value: 29.622662054922085 - type: nauc_ndcg_at_100_diff1 value: 12.260551238228816 - type: nauc_ndcg_at_100_max value: 39.89783048267698 - type: nauc_ndcg_at_100_std value: 23.806961617956613 - type: nauc_ndcg_at_10_diff1 value: 11.002915931619567 - type: nauc_ndcg_at_10_max value: 39.79323759244374 - type: nauc_ndcg_at_10_std value: 23.053072152911046 - type: nauc_ndcg_at_1_diff1 value: 27.560910719974434 - type: nauc_ndcg_at_1_max value: 41.21084046258119 - type: nauc_ndcg_at_1_std value: 26.112891742912893 - type: nauc_ndcg_at_20_diff1 value: 10.085854089024496 - type: nauc_ndcg_at_20_max value: 37.88629173784684 - type: nauc_ndcg_at_20_std value: 23.17664322248358 - type: nauc_ndcg_at_3_diff1 value: 16.58969583405987 - type: nauc_ndcg_at_3_max value: 41.282222954101435 - type: nauc_ndcg_at_3_std value: 21.080670648392747 - type: nauc_ndcg_at_5_diff1 value: 13.893127947909885 - type: nauc_ndcg_at_5_max value: 40.21188015992804 - type: nauc_ndcg_at_5_std value: 21.417443978842652 - type: nauc_precision_at_1000_diff1 value: 
-17.227504530334564 - type: nauc_precision_at_1000_max value: 3.798554468439066 - type: nauc_precision_at_1000_std value: 35.73617809452683 - type: nauc_precision_at_100_diff1 value: -17.63388230218776 - type: nauc_precision_at_100_max value: 15.079399882407094 - type: nauc_precision_at_100_std value: 41.83698491321226 - type: nauc_precision_at_10_diff1 value: -11.850925959645156 - type: nauc_precision_at_10_max value: 35.93283968364352 - type: nauc_precision_at_10_std value: 34.391271855921296 - type: nauc_precision_at_1_diff1 value: 27.730860778824823 - type: nauc_precision_at_1_max value: 43.97462471516834 - type: nauc_precision_at_1_std value: 27.491068270978896 - type: nauc_precision_at_20_diff1 value: -14.281328840943347 - type: nauc_precision_at_20_max value: 29.469099781759006 - type: nauc_precision_at_20_std value: 38.54703022340941 - type: nauc_precision_at_3_diff1 value: 3.486986910413196 - type: nauc_precision_at_3_max value: 41.21107780473768 - type: nauc_precision_at_3_std value: 24.057479124531216 - type: nauc_precision_at_5_diff1 value: -3.0623787872866233 - type: nauc_precision_at_5_max value: 37.49266386466702 - type: nauc_precision_at_5_std value: 26.894454268004935 - type: nauc_recall_at_1000_diff1 value: -2.446891864334283 - type: nauc_recall_at_1000_max value: 23.867293584643377 - type: nauc_recall_at_1000_std value: 16.34707128224595 - type: nauc_recall_at_100_diff1 value: 4.891133690841179 - type: nauc_recall_at_100_max value: 24.56727964996522 - type: nauc_recall_at_100_std value: 9.847212953200797 - type: nauc_recall_at_10_diff1 value: 19.211912363585288 - type: nauc_recall_at_10_max value: 24.825344777920737 - type: nauc_recall_at_10_std value: -5.447989195041898 - type: nauc_recall_at_1_diff1 value: 46.31602607957798 - type: nauc_recall_at_1_max value: 11.807757660801942 - type: nauc_recall_at_1_std value: -13.984889089354317 - type: nauc_recall_at_20_diff1 value: 12.233372054304805 - type: nauc_recall_at_20_max value: 22.284108685207148 - type: nauc_recall_at_20_std value: -4.317138366746209 - type: nauc_recall_at_3_diff1 value: 28.394631527225815 - type: nauc_recall_at_3_max value: 15.593864852625462 - type: nauc_recall_at_3_std value: -12.383531804314593 - type: nauc_recall_at_5_diff1 value: 24.457441304950343 - type: nauc_recall_at_5_max value: 19.080049396281623 - type: nauc_recall_at_5_std value: -11.879747703626627 - type: ndcg_at_1 value: 51.548 - type: ndcg_at_10 value: 41.998000000000005 - type: ndcg_at_100 value: 39.626 - type: ndcg_at_1000 value: 48.707 - type: ndcg_at_20 value: 40.181 - type: ndcg_at_3 value: 48.06 - type: ndcg_at_5 value: 45.829 - type: precision_at_1 value: 52.941 - type: precision_at_10 value: 31.330999999999996 - type: precision_at_100 value: 10.421 - type: precision_at_1000 value: 2.428 - type: precision_at_20 value: 24.118000000000002 - type: precision_at_3 value: 45.408 - type: precision_at_5 value: 39.938 - type: recall_at_1 value: 6.907000000000001 - type: recall_at_10 value: 20.51 - type: recall_at_100 value: 40.857 - type: recall_at_1000 value: 73.616 - type: recall_at_20 value: 26.52 - type: recall_at_3 value: 13.267999999999999 - type: recall_at_5 value: 16.141 task: type: Retrieval - dataset: config: default name: MTEB NQ revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 split: test type: mteb/nq metrics: - type: main_score value: 71.8 - type: map_at_1 value: 47.629 - type: map_at_10 value: 64.846 - type: map_at_100 value: 65.40899999999999 - type: map_at_1000 value: 65.416 - type: map_at_20 value: 65.239 - type: 
map_at_3 value: 61.185 - type: map_at_5 value: 63.583 - type: mrr_at_1 value: 53.15758980301275 - type: mrr_at_10 value: 67.12880961577366 - type: mrr_at_100 value: 67.44006405426018 - type: mrr_at_1000 value: 67.44519150402294 - type: mrr_at_20 value: 67.34317135515428 - type: mrr_at_3 value: 64.5905755117805 - type: mrr_at_5 value: 66.24613750482806 - type: nauc_map_at_1000_diff1 value: 45.73812106517133 - type: nauc_map_at_1000_max value: 35.21262031755756 - type: nauc_map_at_1000_std value: -5.549443574026027 - type: nauc_map_at_100_diff1 value: 45.74254652176879 - type: nauc_map_at_100_max value: 35.22349167515518 - type: nauc_map_at_100_std value: -5.53697496044773 - type: nauc_map_at_10_diff1 value: 45.62837128377087 - type: nauc_map_at_10_max value: 35.3261562342222 - type: nauc_map_at_10_std value: -5.761924414031163 - type: nauc_map_at_1_diff1 value: 48.69187848570499 - type: nauc_map_at_1_max value: 28.687996096473476 - type: nauc_map_at_1_std value: -7.518605958272523 - type: nauc_map_at_20_diff1 value: 45.702303442220035 - type: nauc_map_at_20_max value: 35.30719944705456 - type: nauc_map_at_20_std value: -5.59505654742681 - type: nauc_map_at_3_diff1 value: 45.376813726832474 - type: nauc_map_at_3_max value: 34.68452149643597 - type: nauc_map_at_3_std value: -7.329014950379634 - type: nauc_map_at_5_diff1 value: 45.29528861989316 - type: nauc_map_at_5_max value: 35.35741440869229 - type: nauc_map_at_5_std value: -6.028788612259288 - type: nauc_mrr_at_1000_diff1 value: 46.11808147912517 - type: nauc_mrr_at_1000_max value: 35.59241850411947 - type: nauc_mrr_at_1000_std value: -3.4072428526109317 - type: nauc_mrr_at_100_diff1 value: 46.121345545514046 - type: nauc_mrr_at_100_max value: 35.60147795073431 - type: nauc_mrr_at_100_std value: -3.3965322447588826 - type: nauc_mrr_at_10_diff1 value: 46.0920068210502 - type: nauc_mrr_at_10_max value: 35.79649987854354 - type: nauc_mrr_at_10_std value: -3.339624589368137 - type: nauc_mrr_at_1_diff1 value: 49.101364605656194 - type: nauc_mrr_at_1_max value: 31.500796071482146 - type: nauc_mrr_at_1_std value: -4.183818500718156 - type: nauc_mrr_at_20_diff1 value: 46.088076630465594 - type: nauc_mrr_at_20_max value: 35.682131663053205 - type: nauc_mrr_at_20_std value: -3.35939023178519 - type: nauc_mrr_at_3_diff1 value: 45.47570812708642 - type: nauc_mrr_at_3_max value: 35.741892517632984 - type: nauc_mrr_at_3_std value: -4.135335963822013 - type: nauc_mrr_at_5_diff1 value: 45.78903474184014 - type: nauc_mrr_at_5_max value: 35.91273593700205 - type: nauc_mrr_at_5_std value: -3.467873421286869 - type: nauc_ndcg_at_1000_diff1 value: 45.5056583000012 - type: nauc_ndcg_at_1000_max value: 36.34328379251593 - type: nauc_ndcg_at_1000_std value: -4.0759698229323345 - type: nauc_ndcg_at_100_diff1 value: 45.61918946477166 - type: nauc_ndcg_at_100_max value: 36.675460335836235 - type: nauc_ndcg_at_100_std value: -3.6795334726235986 - type: nauc_ndcg_at_10_diff1 value: 45.15343994274541 - type: nauc_ndcg_at_10_max value: 37.48139242964657 - type: nauc_ndcg_at_10_std value: -4.287039084554882 - type: nauc_ndcg_at_1_diff1 value: 49.101364605656194 - type: nauc_ndcg_at_1_max value: 31.500796071482146 - type: nauc_ndcg_at_1_std value: -4.183818500718156 - type: nauc_ndcg_at_20_diff1 value: 45.310026313402375 - type: nauc_ndcg_at_20_max value: 37.32177497902133 - type: nauc_ndcg_at_20_std value: -3.8214360391282587 - type: nauc_ndcg_at_3_diff1 value: 44.27064370528994 - type: nauc_ndcg_at_3_max value: 36.380294033571396 - type: nauc_ndcg_at_3_std value: 
-6.844263370898355 - type: nauc_ndcg_at_5_diff1 value: 44.29933499225583 - type: nauc_ndcg_at_5_max value: 37.46477041822136 - type: nauc_ndcg_at_5_std value: -4.866548530467956 - type: nauc_precision_at_1000_diff1 value: -14.666553359142306 - type: nauc_precision_at_1000_max value: -0.5599759853201481 - type: nauc_precision_at_1000_std value: 16.8370925526591 - type: nauc_precision_at_100_diff1 value: -11.816251306246278 - type: nauc_precision_at_100_max value: 2.969819268208207 - type: nauc_precision_at_100_std value: 18.59422946634747 - type: nauc_precision_at_10_diff1 value: 1.2050200086029401 - type: nauc_precision_at_10_max value: 17.59930352911209 - type: nauc_precision_at_10_std value: 13.714495717588985 - type: nauc_precision_at_1_diff1 value: 49.101364605656194 - type: nauc_precision_at_1_max value: 31.500796071482146 - type: nauc_precision_at_1_std value: -4.183818500718156 - type: nauc_precision_at_20_diff1 value: -5.263476664822757 - type: nauc_precision_at_20_max value: 11.42004823600046 - type: nauc_precision_at_20_std value: 16.510514518664994 - type: nauc_precision_at_3_diff1 value: 20.116460379305828 - type: nauc_precision_at_3_max value: 31.32235038301311 - type: nauc_precision_at_3_std value: 2.7486717133871923 - type: nauc_precision_at_5_diff1 value: 9.57451645335723 - type: nauc_precision_at_5_max value: 25.28449126580587 - type: nauc_precision_at_5_std value: 9.955736162466767 - type: nauc_recall_at_1000_diff1 value: -21.632253065978794 - type: nauc_recall_at_1000_max value: 70.14409090958776 - type: nauc_recall_at_1000_std value: 65.61658090892989 - type: nauc_recall_at_100_diff1 value: 51.83161124806711 - type: nauc_recall_at_100_max value: 77.49921361841523 - type: nauc_recall_at_100_std value: 48.352508746719444 - type: nauc_recall_at_10_diff1 value: 39.86695231362791 - type: nauc_recall_at_10_max value: 50.12029094799474 - type: nauc_recall_at_10_std value: 0.1650940628131058 - type: nauc_recall_at_1_diff1 value: 48.69187848570499 - type: nauc_recall_at_1_max value: 28.687996096473476 - type: nauc_recall_at_1_std value: -7.518605958272523 - type: nauc_recall_at_20_diff1 value: 39.14155398061627 - type: nauc_recall_at_20_max value: 56.78559423716229 - type: nauc_recall_at_20_std value: 7.9728224572344075 - type: nauc_recall_at_3_diff1 value: 38.69589523432158 - type: nauc_recall_at_3_max value: 39.53271258375579 - type: nauc_recall_at_3_std value: -8.646925065787512 - type: nauc_recall_at_5_diff1 value: 37.45922652959002 - type: nauc_recall_at_5_max value: 44.4911958995867 - type: nauc_recall_at_5_std value: -3.5659842556375594 - type: ndcg_at_1 value: 53.15800000000001 - type: ndcg_at_10 value: 71.8 - type: ndcg_at_100 value: 73.85199999999999 - type: ndcg_at_1000 value: 74.017 - type: ndcg_at_20 value: 72.933 - type: ndcg_at_3 value: 65.479 - type: ndcg_at_5 value: 69.182 - type: precision_at_1 value: 53.15800000000001 - type: precision_at_10 value: 10.805 - type: precision_at_100 value: 1.2 - type: precision_at_1000 value: 0.122 - type: precision_at_20 value: 5.694 - type: precision_at_3 value: 28.939999999999998 - type: precision_at_5 value: 19.641000000000002 - type: recall_at_1 value: 47.629 - type: recall_at_10 value: 90.204 - type: recall_at_100 value: 98.66 - type: recall_at_1000 value: 99.874 - type: recall_at_20 value: 94.24 - type: recall_at_3 value: 74.394 - type: recall_at_5 value: 82.711 task: type: Retrieval - dataset: config: default name: MTEB QuoraRetrieval revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 split: test type: mteb/quora metrics: 
- type: main_score value: 90.025 - type: map_at_1 value: 72.222 - type: map_at_10 value: 86.58500000000001 - type: map_at_100 value: 87.176 - type: map_at_1000 value: 87.188 - type: map_at_20 value: 86.97399999999999 - type: map_at_3 value: 83.736 - type: map_at_5 value: 85.554 - type: mrr_at_1 value: 83.04 - type: mrr_at_10 value: 89.05599603174585 - type: mrr_at_100 value: 89.12398891419457 - type: mrr_at_1000 value: 89.12434072241001 - type: mrr_at_20 value: 89.10416280692111 - type: mrr_at_3 value: 88.23833333333312 - type: mrr_at_5 value: 88.82233333333308 - type: nauc_map_at_1000_diff1 value: 78.29348113313218 - type: nauc_map_at_1000_max value: 32.31386754277228 - type: nauc_map_at_1000_std value: -50.47543661484052 - type: nauc_map_at_100_diff1 value: 78.29618548618575 - type: nauc_map_at_100_max value: 32.301475680947846 - type: nauc_map_at_100_std value: -50.50303428814228 - type: nauc_map_at_10_diff1 value: 78.47383776440803 - type: nauc_map_at_10_max value: 31.839339990133563 - type: nauc_map_at_10_std value: -52.832713555976 - type: nauc_map_at_1_diff1 value: 82.46330147467418 - type: nauc_map_at_1_max value: 23.497664918373538 - type: nauc_map_at_1_std value: -43.824657665520704 - type: nauc_map_at_20_diff1 value: 78.34772176474422 - type: nauc_map_at_20_max value: 32.16495182893947 - type: nauc_map_at_20_std value: -51.503292726558605 - type: nauc_map_at_3_diff1 value: 79.07823813069432 - type: nauc_map_at_3_max value: 29.395911687513976 - type: nauc_map_at_3_std value: -54.16377546873304 - type: nauc_map_at_5_diff1 value: 78.73076619520454 - type: nauc_map_at_5_max value: 30.700453118585237 - type: nauc_map_at_5_std value: -54.130514177664054 - type: nauc_mrr_at_1000_diff1 value: 79.04736184471865 - type: nauc_mrr_at_1000_max value: 34.43004593837643 - type: nauc_mrr_at_1000_std value: -46.137269068195316 - type: nauc_mrr_at_100_diff1 value: 79.04698704288086 - type: nauc_mrr_at_100_max value: 34.4305553741175 - type: nauc_mrr_at_100_std value: -46.13786687786434 - type: nauc_mrr_at_10_diff1 value: 79.04490677485934 - type: nauc_mrr_at_10_max value: 34.38170181522227 - type: nauc_mrr_at_10_std value: -46.38129875681807 - type: nauc_mrr_at_1_diff1 value: 79.87159215719124 - type: nauc_mrr_at_1_max value: 34.05882339253136 - type: nauc_mrr_at_1_std value: -43.56093395137571 - type: nauc_mrr_at_20_diff1 value: 79.04384174535653 - type: nauc_mrr_at_20_max value: 34.442136494675005 - type: nauc_mrr_at_20_std value: -46.205458519638654 - type: nauc_mrr_at_3_diff1 value: 78.78154519155487 - type: nauc_mrr_at_3_max value: 34.74995000500305 - type: nauc_mrr_at_3_std value: -46.36264203155416 - type: nauc_mrr_at_5_diff1 value: 79.02631187177 - type: nauc_mrr_at_5_max value: 34.538698249632205 - type: nauc_mrr_at_5_std value: -46.468881576157465 - type: nauc_ndcg_at_1000_diff1 value: 78.25260097014645 - type: nauc_ndcg_at_1000_max value: 33.68584498704271 - type: nauc_ndcg_at_1000_std value: -48.44716779494868 - type: nauc_ndcg_at_100_diff1 value: 78.25115412256716 - type: nauc_ndcg_at_100_max value: 33.63652663447088 - type: nauc_ndcg_at_100_std value: -48.489243909024715 - type: nauc_ndcg_at_10_diff1 value: 78.23875101557334 - type: nauc_ndcg_at_10_max value: 32.65217430043823 - type: nauc_ndcg_at_10_std value: -52.57770468845309 - type: nauc_ndcg_at_1_diff1 value: 79.87159215719124 - type: nauc_ndcg_at_1_max value: 34.05882339253136 - type: nauc_ndcg_at_1_std value: -43.56093395137571 - type: nauc_ndcg_at_20_diff1 value: 78.23478552311765 - type: nauc_ndcg_at_20_max value: 
33.30691737901109 - type: nauc_ndcg_at_20_std value: -50.78412614854527 - type: nauc_ndcg_at_3_diff1 value: 77.66134485470224 - type: nauc_ndcg_at_3_max value: 32.19504710373125 - type: nauc_ndcg_at_3_std value: -52.01636728550155 - type: nauc_ndcg_at_5_diff1 value: 78.04734137324255 - type: nauc_ndcg_at_5_max value: 31.94593625591248 - type: nauc_ndcg_at_5_std value: -53.02169800690546 - type: nauc_precision_at_1000_diff1 value: -45.771948123542636 - type: nauc_precision_at_1000_max value: -5.182406190477681 - type: nauc_precision_at_1000_std value: 41.14460438707817 - type: nauc_precision_at_100_diff1 value: -45.64767154261461 - type: nauc_precision_at_100_max value: -5.046308286851713 - type: nauc_precision_at_100_std value: 41.07186716587844 - type: nauc_precision_at_10_diff1 value: -42.26779562305825 - type: nauc_precision_at_10_max value: -1.1264852893323076 - type: nauc_precision_at_10_std value: 27.62275729822392 - type: nauc_precision_at_1_diff1 value: 79.87159215719124 - type: nauc_precision_at_1_max value: 34.05882339253136 - type: nauc_precision_at_1_std value: -43.56093395137571 - type: nauc_precision_at_20_diff1 value: -44.24293221128388 - type: nauc_precision_at_20_max value: -3.1345628837361867 - type: nauc_precision_at_20_std value: 34.23625492740366 - type: nauc_precision_at_3_diff1 value: -24.925251389823348 - type: nauc_precision_at_3_max value: 6.622188833369412 - type: nauc_precision_at_3_std value: 6.424741786858512 - type: nauc_precision_at_5_diff1 value: -36.1407949990387 - type: nauc_precision_at_5_max value: 1.7533948968374462 - type: nauc_precision_at_5_std value: 17.914083278982634 - type: nauc_recall_at_1000_diff1 value: 52.26815466244496 - type: nauc_recall_at_1000_max value: 69.73611104239443 - type: nauc_recall_at_1000_std value: 73.18969965863008 - type: nauc_recall_at_100_diff1 value: 70.80557513785271 - type: nauc_recall_at_100_max value: 33.333440086544556 - type: nauc_recall_at_100_std value: -38.75992366905504 - type: nauc_recall_at_10_diff1 value: 74.45948457438163 - type: nauc_recall_at_10_max value: 26.64948512428989 - type: nauc_recall_at_10_std value: -82.90334292052363 - type: nauc_recall_at_1_diff1 value: 82.46330147467418 - type: nauc_recall_at_1_max value: 23.497664918373538 - type: nauc_recall_at_1_std value: -43.824657665520704 - type: nauc_recall_at_20_diff1 value: 73.80140280887753 - type: nauc_recall_at_20_max value: 30.361616426734965 - type: nauc_recall_at_20_std value: -81.1418804447414 - type: nauc_recall_at_3_diff1 value: 75.19854736087834 - type: nauc_recall_at_3_max value: 26.12298005045584 - type: nauc_recall_at_3_std value: -63.42583714745169 - type: nauc_recall_at_5_diff1 value: 74.16423451950358 - type: nauc_recall_at_5_max value: 25.552390331018987 - type: nauc_recall_at_5_std value: -71.15891947773912 - type: ndcg_at_1 value: 83.04 - type: ndcg_at_10 value: 90.025 - type: ndcg_at_100 value: 91.006 - type: ndcg_at_1000 value: 91.061 - type: ndcg_at_20 value: 90.556 - type: ndcg_at_3 value: 87.493 - type: ndcg_at_5 value: 88.955 - type: precision_at_1 value: 83.04 - type: precision_at_10 value: 13.667000000000002 - type: precision_at_100 value: 1.542 - type: precision_at_1000 value: 0.157 - type: precision_at_20 value: 7.221 - type: precision_at_3 value: 38.433 - type: precision_at_5 value: 25.228 - type: recall_at_1 value: 72.222 - type: recall_at_10 value: 96.604 - type: recall_at_100 value: 99.786 - type: recall_at_1000 value: 99.996 - type: recall_at_20 value: 98.253 - type: recall_at_3 value: 89.276 - type: recall_at_5 
value: 93.46 task: type: Retrieval - dataset: config: default name: MTEB RedditClustering revision: 24640382cdbf8abc73003fb0fa6d111a705499eb split: test type: mteb/reddit-clustering metrics: - type: main_score value: 72.86492101891123 - type: v_measure value: 72.86492101891123 - type: v_measure_std value: 2.778711445144635 task: type: Clustering - dataset: config: default name: MTEB RedditClusteringP2P revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 split: test type: mteb/reddit-clustering-p2p metrics: - type: main_score value: 75.27316726548479 - type: v_measure value: 75.27316726548479 - type: v_measure_std value: 8.87871936725338 task: type: Clustering - dataset: config: default name: MTEB SCIDOCS revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88 split: test type: mteb/scidocs metrics: - type: main_score value: 26.638 - type: map_at_1 value: 6.128 - type: map_at_10 value: 16.472 - type: map_at_100 value: 19.522000000000002 - type: map_at_1000 value: 19.898 - type: map_at_20 value: 18.098 - type: map_at_3 value: 11.283 - type: map_at_5 value: 13.771 - type: mrr_at_1 value: 30.2 - type: mrr_at_10 value: 42.621150793650735 - type: mrr_at_100 value: 43.740858712021954 - type: mrr_at_1000 value: 43.762699500220904 - type: mrr_at_20 value: 43.383639927753634 - type: mrr_at_3 value: 38.83333333333331 - type: mrr_at_5 value: 41.14833333333326 - type: nauc_map_at_1000_diff1 value: 13.13534664124808 - type: nauc_map_at_1000_max value: 29.346654566149795 - type: nauc_map_at_1000_std value: 18.08121186982413 - type: nauc_map_at_100_diff1 value: 13.098072728041538 - type: nauc_map_at_100_max value: 29.299084480697523 - type: nauc_map_at_100_std value: 17.961620202918464 - type: nauc_map_at_10_diff1 value: 14.001743720394682 - type: nauc_map_at_10_max value: 28.04128290996403 - type: nauc_map_at_10_std value: 13.744481555974716 - type: nauc_map_at_1_diff1 value: 22.1926640424872 - type: nauc_map_at_1_max value: 21.32609279586034 - type: nauc_map_at_1_std value: 6.566596302915438 - type: nauc_map_at_20_diff1 value: 13.57313142419664 - type: nauc_map_at_20_max value: 28.93840146319476 - type: nauc_map_at_20_std value: 16.50869367365676 - type: nauc_map_at_3_diff1 value: 17.707700541948462 - type: nauc_map_at_3_max value: 26.058174051376238 - type: nauc_map_at_3_std value: 9.943924560735267 - type: nauc_map_at_5_diff1 value: 17.11844492157723 - type: nauc_map_at_5_max value: 27.865247403049388 - type: nauc_map_at_5_std value: 11.372588172121546 - type: nauc_mrr_at_1000_diff1 value: 21.11248719936198 - type: nauc_mrr_at_1000_max value: 26.734172102201466 - type: nauc_mrr_at_1000_std value: 11.766121765437228 - type: nauc_mrr_at_100_diff1 value: 21.107109982277702 - type: nauc_mrr_at_100_max value: 26.741616065723267 - type: nauc_mrr_at_100_std value: 11.789802686224208 - type: nauc_mrr_at_10_diff1 value: 20.74108639793207 - type: nauc_mrr_at_10_max value: 26.920838463358333 - type: nauc_mrr_at_10_std value: 11.849217361926522 - type: nauc_mrr_at_1_diff1 value: 22.177437860573356 - type: nauc_mrr_at_1_max value: 21.88074521417754 - type: nauc_mrr_at_1_std value: 6.776011900101789 - type: nauc_mrr_at_20_diff1 value: 21.126633710175994 - type: nauc_mrr_at_20_max value: 26.860736480370974 - type: nauc_mrr_at_20_std value: 11.815411633726338 - type: nauc_mrr_at_3_diff1 value: 21.689245200066466 - type: nauc_mrr_at_3_max value: 26.187305092831625 - type: nauc_mrr_at_3_std value: 10.895380313134332 - type: nauc_mrr_at_5_diff1 value: 20.898811082479778 - type: nauc_mrr_at_5_max value: 26.939217247104036 - 
type: nauc_mrr_at_5_std value: 11.77832949822472 - type: nauc_ndcg_at_1000_diff1 value: 13.251184947898546 - type: nauc_ndcg_at_1000_max value: 30.879594164526146 - type: nauc_ndcg_at_1000_std value: 23.125206047366625 - type: nauc_ndcg_at_100_diff1 value: 12.549100649053676 - type: nauc_ndcg_at_100_max value: 30.634680845419123 - type: nauc_ndcg_at_100_std value: 23.296226055422984 - type: nauc_ndcg_at_10_diff1 value: 14.475144549294322 - type: nauc_ndcg_at_10_max value: 29.450349815417336 - type: nauc_ndcg_at_10_std value: 15.94068314781612 - type: nauc_ndcg_at_1_diff1 value: 22.177437860573356 - type: nauc_ndcg_at_1_max value: 21.88074521417754 - type: nauc_ndcg_at_1_std value: 6.776011900101789 - type: nauc_ndcg_at_20_diff1 value: 14.173669585802266 - type: nauc_ndcg_at_20_max value: 30.475890854725 - type: nauc_ndcg_at_20_std value: 19.863898148221704 - type: nauc_ndcg_at_3_diff1 value: 18.93971261196868 - type: nauc_ndcg_at_3_max value: 27.3707298720736 - type: nauc_ndcg_at_3_std value: 11.439810510051224 - type: nauc_ndcg_at_5_diff1 value: 17.89535958094687 - type: nauc_ndcg_at_5_max value: 29.272740466638425 - type: nauc_ndcg_at_5_std value: 13.402467626635909 - type: nauc_precision_at_1000_diff1 value: -3.811547048784123 - type: nauc_precision_at_1000_max value: 22.55165337197117 - type: nauc_precision_at_1000_std value: 35.98524999650108 - type: nauc_precision_at_100_diff1 value: 0.6474234774922896 - type: nauc_precision_at_100_max value: 25.06920726527032 - type: nauc_precision_at_100_std value: 32.31439698982313 - type: nauc_precision_at_10_diff1 value: 7.943127218139508 - type: nauc_precision_at_10_max value: 28.571937636787197 - type: nauc_precision_at_10_std value: 18.8472620918488 - type: nauc_precision_at_1_diff1 value: 22.177437860573356 - type: nauc_precision_at_1_max value: 21.88074521417754 - type: nauc_precision_at_1_std value: 6.776011900101789 - type: nauc_precision_at_20_diff1 value: 6.981574259607366 - type: nauc_precision_at_20_max value: 28.986094397038727 - type: nauc_precision_at_20_std value: 25.83129974001146 - type: nauc_precision_at_3_diff1 value: 17.197490724039355 - type: nauc_precision_at_3_max value: 29.17569320583099 - type: nauc_precision_at_3_std value: 13.430554945991846 - type: nauc_precision_at_5_diff1 value: 14.952364330739362 - type: nauc_precision_at_5_max value: 31.053243354846977 - type: nauc_precision_at_5_std value: 15.856312752807822 - type: nauc_recall_at_1000_diff1 value: -4.8224253128926975 - type: nauc_recall_at_1000_max value: 21.3989024429911 - type: nauc_recall_at_1000_std value: 39.152234275603604 - type: nauc_recall_at_100_diff1 value: 0.11936808422867201 - type: nauc_recall_at_100_max value: 24.261739241957823 - type: nauc_recall_at_100_std value: 32.62984573938928 - type: nauc_recall_at_10_diff1 value: 7.851256165018388 - type: nauc_recall_at_10_max value: 27.936406600938746 - type: nauc_recall_at_10_std value: 18.683634320636113 - type: nauc_recall_at_1_diff1 value: 22.1926640424872 - type: nauc_recall_at_1_max value: 21.32609279586034 - type: nauc_recall_at_1_std value: 6.566596302915438 - type: nauc_recall_at_20_diff1 value: 6.8107211705182165 - type: nauc_recall_at_20_max value: 28.286284094687787 - type: nauc_recall_at_20_std value: 25.932013268120862 - type: nauc_recall_at_3_diff1 value: 17.04156818427151 - type: nauc_recall_at_3_max value: 28.645439108719216 - type: nauc_recall_at_3_std value: 13.346047828494411 - type: nauc_recall_at_5_diff1 value: 14.906284329771822 - type: nauc_recall_at_5_max value: 30.58628602415921 
- type: nauc_recall_at_5_std value: 15.755157478191755 - type: ndcg_at_1 value: 30.2 - type: ndcg_at_10 value: 26.638 - type: ndcg_at_100 value: 37.135 - type: ndcg_at_1000 value: 42.576 - type: ndcg_at_20 value: 30.75 - type: ndcg_at_3 value: 24.675 - type: ndcg_at_5 value: 21.836 - type: precision_at_1 value: 30.2 - type: precision_at_10 value: 14.06 - type: precision_at_100 value: 2.904 - type: precision_at_1000 value: 0.42 - type: precision_at_20 value: 9.4 - type: precision_at_3 value: 23.233 - type: precision_at_5 value: 19.439999999999998 - type: recall_at_1 value: 6.128 - type: recall_at_10 value: 28.471999999999998 - type: recall_at_100 value: 58.952000000000005 - type: recall_at_1000 value: 85.137 - type: recall_at_20 value: 38.17 - type: recall_at_3 value: 14.127999999999998 - type: recall_at_5 value: 19.673 task: type: Retrieval - dataset: config: default name: MTEB SICK-R revision: 20a6d6f312dd54037fe07a32d58e5e168867909d split: test type: mteb/sickr-sts metrics: - type: cosine_pearson value: 86.86608529160739 - type: cosine_spearman value: 82.88625166203383 - type: euclidean_pearson value: 84.15494418856142 - type: euclidean_spearman value: 82.88449294676421 - type: main_score value: 82.88625166203383 - type: manhattan_pearson value: 84.39068623474428 - type: manhattan_spearman value: 82.88065412169463 - type: pearson value: 86.86608529160739 - type: spearman value: 82.88625166203383 task: type: STS - dataset: config: default name: MTEB STS12 revision: a0d554a64d88156834ff5ae9920b964011b16384 split: test type: mteb/sts12-sts metrics: - type: cosine_pearson value: 87.0445014940449 - type: cosine_spearman value: 80.0880365116599 - type: euclidean_pearson value: 83.80250772928852 - type: euclidean_spearman value: 80.0892465260778 - type: main_score value: 80.0880365116599 - type: manhattan_pearson value: 83.96793981929336 - type: manhattan_spearman value: 80.24881789268238 - type: pearson value: 87.0445014940449 - type: spearman value: 80.0880365116599 task: type: STS - dataset: config: default name: MTEB STS13 revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca split: test type: mteb/sts13-sts metrics: - type: cosine_pearson value: 89.33900828959968 - type: cosine_spearman value: 89.68256358526733 - type: euclidean_pearson value: 89.29188708262265 - type: euclidean_spearman value: 89.68204344658601 - type: main_score value: 89.68256358526733 - type: manhattan_pearson value: 89.13996588193149 - type: manhattan_spearman value: 89.61372804425623 - type: pearson value: 89.33900828959968 - type: spearman value: 89.68256358526733 task: type: STS - dataset: config: default name: MTEB STS14 revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 split: test type: mteb/sts14-sts metrics: - type: cosine_pearson value: 86.42029843639123 - type: cosine_spearman value: 85.0707889220723 - type: euclidean_pearson value: 85.75114239552562 - type: euclidean_spearman value: 85.06858160270725 - type: main_score value: 85.0707889220723 - type: manhattan_pearson value: 85.86461900459038 - type: manhattan_spearman value: 85.28671103475605 - type: pearson value: 86.42029843639123 - type: spearman value: 85.0707889220723 task: type: STS - dataset: config: default name: MTEB STS15 revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 split: test type: mteb/sts15-sts metrics: - type: cosine_pearson value: 88.3660081271444 - type: cosine_spearman value: 89.39375083609528 - type: euclidean_pearson value: 89.21818482894895 - type: euclidean_spearman value: 89.39361588875443 - type: main_score value: 
89.39375083609528 - type: manhattan_pearson value: 89.53535068014057 - type: manhattan_spearman value: 89.81077130567752 - type: pearson value: 88.3660081271444 - type: spearman value: 89.39375083609528 task: type: STS - dataset: config: default name: MTEB STS16 revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 split: test type: mteb/sts16-sts metrics: - type: cosine_pearson value: 85.60708247171874 - type: cosine_spearman value: 87.15234952832193 - type: euclidean_pearson value: 86.21743555548137 - type: euclidean_spearman value: 87.14450217418016 - type: main_score value: 87.15234952832193 - type: manhattan_pearson value: 86.2467748746084 - type: manhattan_spearman value: 87.2197479717654 - type: pearson value: 85.60708247171874 - type: spearman value: 87.15234952832193 task: type: STS - dataset: config: en-en name: MTEB STS17 (en-en) revision: faeb762787bd10488a50c8b5be4a3b82e411949c split: test type: mteb/sts17-crosslingual-sts metrics: - type: cosine_pearson value: 91.25898556808458 - type: cosine_spearman value: 91.35372390581641 - type: euclidean_pearson value: 91.319520321348 - type: euclidean_spearman value: 91.30821135416925 - type: main_score value: 91.35372390581641 - type: manhattan_pearson value: 91.14800959939069 - type: manhattan_spearman value: 91.09775424245629 - type: pearson value: 91.25898556808458 - type: spearman value: 91.35372390581641 task: type: STS - dataset: config: en name: MTEB STS22 (en) revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 split: test type: mteb/sts22-crosslingual-sts metrics: - type: cosine_pearson value: 67.61637111515797 - type: cosine_spearman value: 68.10379096526697 - type: euclidean_pearson value: 69.2652309491375 - type: euclidean_spearman value: 68.18436357033228 - type: main_score value: 68.10379096526697 - type: manhattan_pearson value: 69.52531340510775 - type: manhattan_spearman value: 68.17874790391862 - type: pearson value: 67.61637111515797 - type: spearman value: 68.10379096526697 task: type: STS - dataset: config: default name: MTEB STSBenchmark revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 split: test type: mteb/stsbenchmark-sts metrics: - type: cosine_pearson value: 87.81592853782297 - type: cosine_spearman value: 88.2302550329183 - type: euclidean_pearson value: 88.01165144519526 - type: euclidean_spearman value: 88.23342148890097 - type: main_score value: 88.2302550329183 - type: manhattan_pearson value: 88.148592564938 - type: manhattan_spearman value: 88.49226317320988 - type: pearson value: 87.81592853782297 - type: spearman value: 88.2302550329183 task: type: STS - dataset: config: default name: MTEB SciDocsRR revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab split: test type: mteb/scidocs-reranking metrics: - type: main_score value: 89.196009707431 - type: map value: 89.196009707431 - type: mrr value: 97.07198121413808 - type: nAUC_map_diff1 value: -14.066667940115352 - type: nAUC_map_max value: 49.73702475027407 - type: nAUC_map_std value: 64.0986775782592 - type: nAUC_mrr_diff1 value: 21.96846389417319 - type: nAUC_mrr_max value: 86.38341077184032 - type: nAUC_mrr_std value: 75.38945014727746 task: type: Reranking - dataset: config: default name: MTEB SciFact revision: 0228b52cf27578f30900b9e5271d331663a030d7 split: test type: mteb/scifact metrics: - type: main_score value: 80.08999999999999 - type: map_at_1 value: 63.161 - type: map_at_10 value: 75.163 - type: map_at_100 value: 75.408 - type: map_at_1000 value: 75.409 - type: map_at_20 value: 75.332 - type: map_at_3 value: 71.839 - type: map_at_5 value: 
74.32600000000001 - type: mrr_at_1 value: 66.33333333333333 - type: mrr_at_10 value: 75.95978835978836 - type: mrr_at_100 value: 76.15647881281473 - type: mrr_at_1000 value: 76.15736533763744 - type: mrr_at_20 value: 76.08557368557368 - type: mrr_at_3 value: 73.55555555555556 - type: mrr_at_5 value: 75.4888888888889 - type: nauc_map_at_1000_diff1 value: 77.31229383811176 - type: nauc_map_at_1000_max value: 58.848319058605156 - type: nauc_map_at_1000_std value: -14.290090263454985 - type: nauc_map_at_100_diff1 value: 77.31325400213969 - type: nauc_map_at_100_max value: 58.848885054155275 - type: nauc_map_at_100_std value: -14.285806618869273 - type: nauc_map_at_10_diff1 value: 77.1806705504232 - type: nauc_map_at_10_max value: 59.02905805134415 - type: nauc_map_at_10_std value: -14.132954900037467 - type: nauc_map_at_1_diff1 value: 81.03932970557837 - type: nauc_map_at_1_max value: 49.02073230264529 - type: nauc_map_at_1_std value: -22.977452975845512 - type: nauc_map_at_20_diff1 value: 77.22581364818562 - type: nauc_map_at_20_max value: 58.90740400399768 - type: nauc_map_at_20_std value: -14.245079150986745 - type: nauc_map_at_3_diff1 value: 76.99793243255563 - type: nauc_map_at_3_max value: 54.9930733886623 - type: nauc_map_at_3_std value: -19.297708446082407 - type: nauc_map_at_5_diff1 value: 77.1671608360295 - type: nauc_map_at_5_max value: 57.27757489519526 - type: nauc_map_at_5_std value: -15.446338357667708 - type: nauc_mrr_at_1000_diff1 value: 77.4806080821202 - type: nauc_mrr_at_1000_max value: 60.9213776129792 - type: nauc_mrr_at_1000_std value: -12.139599632228343 - type: nauc_mrr_at_100_diff1 value: 77.48158073865281 - type: nauc_mrr_at_100_max value: 60.9218657185361 - type: nauc_mrr_at_100_std value: -12.13532070453677 - type: nauc_mrr_at_10_diff1 value: 77.32428546014407 - type: nauc_mrr_at_10_max value: 61.018407010343466 - type: nauc_mrr_at_10_std value: -12.143193773309347 - type: nauc_mrr_at_1_diff1 value: 80.99806778887115 - type: nauc_mrr_at_1_max value: 59.17855969530095 - type: nauc_mrr_at_1_std value: -12.30545640831458 - type: nauc_mrr_at_20_diff1 value: 77.3811067653992 - type: nauc_mrr_at_20_max value: 60.9648880366335 - type: nauc_mrr_at_20_std value: -12.124066076541853 - type: nauc_mrr_at_3_diff1 value: 77.31304316321959 - type: nauc_mrr_at_3_max value: 60.75536766404163 - type: nauc_mrr_at_3_std value: -12.997876030849623 - type: nauc_mrr_at_5_diff1 value: 77.12952864141742 - type: nauc_mrr_at_5_max value: 60.995943754968685 - type: nauc_mrr_at_5_std value: -11.353447465605694 - type: nauc_ndcg_at_1000_diff1 value: 76.81788665683746 - type: nauc_ndcg_at_1000_max value: 60.35947755262391 - type: nauc_ndcg_at_1000_std value: -12.884942372460362 - type: nauc_ndcg_at_100_diff1 value: 76.87388230365198 - type: nauc_ndcg_at_100_max value: 60.38813162962434 - type: nauc_ndcg_at_100_std value: -12.64384717800478 - type: nauc_ndcg_at_10_diff1 value: 75.87713506026317 - type: nauc_ndcg_at_10_max value: 61.39356554675667 - type: nauc_ndcg_at_10_std value: -12.144227584144218 - type: nauc_ndcg_at_1_diff1 value: 80.99806778887115 - type: nauc_ndcg_at_1_max value: 59.17855969530095 - type: nauc_ndcg_at_1_std value: -12.30545640831458 - type: nauc_ndcg_at_20_diff1 value: 76.09913944506627 - type: nauc_ndcg_at_20_max value: 61.01644448834147 - type: nauc_ndcg_at_20_std value: -12.456209267623857 - type: nauc_ndcg_at_3_diff1 value: 75.52717946614608 - type: nauc_ndcg_at_3_max value: 58.96433090721983 - type: nauc_ndcg_at_3_std value: -15.849280494339556 - type: 
nauc_ndcg_at_5_diff1 value: 75.69026981016921 - type: nauc_ndcg_at_5_max value: 58.924044405851326 - type: nauc_ndcg_at_5_std value: -13.182728827923107 - type: nauc_precision_at_1000_diff1 value: -31.634022001609914 - type: nauc_precision_at_1000_max value: 31.46271490784504 - type: nauc_precision_at_1000_std value: 60.44801276891442 - type: nauc_precision_at_100_diff1 value: -29.722363469948103 - type: nauc_precision_at_100_max value: 32.05464592020074 - type: nauc_precision_at_100_std value: 60.832570595613554 - type: nauc_precision_at_10_diff1 value: -11.91731376599939 - type: nauc_precision_at_10_max value: 45.43646553157129 - type: nauc_precision_at_10_std value: 52.962408871791276 - type: nauc_precision_at_1_diff1 value: 80.99806778887115 - type: nauc_precision_at_1_max value: 59.17855969530095 - type: nauc_precision_at_1_std value: -12.30545640831458 - type: nauc_precision_at_20_diff1 value: -18.43293701721667 - type: nauc_precision_at_20_max value: 39.53434874203934 - type: nauc_precision_at_20_std value: 53.6291982468461 - type: nauc_precision_at_3_diff1 value: 30.84789043003892 - type: nauc_precision_at_3_max value: 55.660727758110376 - type: nauc_precision_at_3_std value: 17.87243920840355 - type: nauc_precision_at_5_diff1 value: 4.099395181445625 - type: nauc_precision_at_5_max value: 50.346770968709386 - type: nauc_precision_at_5_std value: 44.66722483255029 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 100.0 - type: nauc_recall_at_100_max value: 72.2222222222207 - type: nauc_recall_at_100_std value: 86.92810457516407 - type: nauc_recall_at_10_diff1 value: 62.18887555022005 - type: nauc_recall_at_10_max value: 75.14339068960916 - type: nauc_recall_at_10_std value: -1.4912631719357108 - type: nauc_recall_at_1_diff1 value: 81.03932970557837 - type: nauc_recall_at_1_max value: 49.02073230264529 - type: nauc_recall_at_1_std value: -22.977452975845512 - type: nauc_recall_at_20_diff1 value: 59.27414444038499 - type: nauc_recall_at_20_max value: 76.32241302318047 - type: nauc_recall_at_20_std value: -0.8322169447488666 - type: nauc_recall_at_3_diff1 value: 69.58783002593157 - type: nauc_recall_at_3_max value: 55.89660919896563 - type: nauc_recall_at_3_std value: -21.183005510917862 - type: nauc_recall_at_5_diff1 value: 65.53660499878802 - type: nauc_recall_at_5_max value: 58.218018535135805 - type: nauc_recall_at_5_std value: -8.328952210032455 - type: ndcg_at_1 value: 66.333 - type: ndcg_at_10 value: 80.08999999999999 - type: ndcg_at_100 value: 81.24900000000001 - type: ndcg_at_1000 value: 81.28800000000001 - type: ndcg_at_20 value: 80.625 - type: ndcg_at_3 value: 74.98700000000001 - type: ndcg_at_5 value: 78.553 - type: precision_at_1 value: 66.333 - type: precision_at_10 value: 10.667 - type: precision_at_100 value: 1.127 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_20 value: 5.45 - type: precision_at_3 value: 29.555999999999997 - type: precision_at_5 value: 20.133000000000003 - type: recall_at_1 value: 63.161 - type: recall_at_10 value: 94.167 - type: recall_at_100 value: 99.667 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 96.167 - type: recall_at_3 value: 80.972 - type: recall_at_5 value: 89.90599999999999 task: type: Retrieval - dataset: config: default name: MTEB SprintDuplicateQuestions revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 split: test type: mteb/sprintduplicatequestions-pairclassification 
metrics: - type: cosine_accuracy value: 99.81881188118813 - type: cosine_accuracy_threshold value: 85.55081486701965 - type: cosine_ap value: 96.0359661816236 - type: cosine_f1 value: 90.6584992343032 - type: cosine_f1_threshold value: 84.82859134674072 - type: cosine_precision value: 92.59645464025026 - type: cosine_recall value: 88.8 - type: dot_accuracy value: 99.81881188118813 - type: dot_accuracy_threshold value: 84.91908311843872 - type: dot_ap value: 96.05740121094365 - type: dot_f1 value: 90.81885856079404 - type: dot_f1_threshold value: 83.84919166564941 - type: dot_precision value: 90.14778325123153 - type: dot_recall value: 91.5 - type: euclidean_accuracy value: 99.82079207920792 - type: euclidean_accuracy_threshold value: 54.49706315994263 - type: euclidean_ap value: 96.03223527068818 - type: euclidean_f1 value: 90.72270630445925 - type: euclidean_f1_threshold value: 54.49706315994263 - type: euclidean_precision value: 93.05993690851734 - type: euclidean_recall value: 88.5 - type: main_score value: 96.32671902439806 - type: manhattan_accuracy value: 99.83267326732673 - type: manhattan_accuracy_threshold value: 3818.192672729492 - type: manhattan_ap value: 96.32671902439806 - type: manhattan_f1 value: 91.52032112393378 - type: manhattan_f1_threshold value: 3818.192672729492 - type: manhattan_precision value: 91.8429003021148 - type: manhattan_recall value: 91.2 - type: max_ap value: 96.32671902439806 - type: max_f1 value: 91.52032112393378 - type: max_precision value: 93.05993690851734 - type: max_recall value: 91.5 - type: similarity_accuracy value: 99.81881188118813 - type: similarity_accuracy_threshold value: 85.55081486701965 - type: similarity_ap value: 96.0359661816236 - type: similarity_f1 value: 90.6584992343032 - type: similarity_f1_threshold value: 84.82859134674072 - type: similarity_precision value: 92.59645464025026 - type: similarity_recall value: 88.8 task: type: PairClassification - dataset: config: default name: MTEB StackExchangeClustering revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 split: test type: mteb/stackexchange-clustering metrics: - type: main_score value: 80.28558559137414 - type: v_measure value: 80.28558559137414 - type: v_measure_std value: 2.795276520287584 task: type: Clustering - dataset: config: default name: MTEB StackExchangeClusteringP2P revision: 815ca46b2622cec33ccafc3735d572c266efdb44 split: test type: mteb/stackexchange-clustering-p2p metrics: - type: main_score value: 49.57135582416209 - type: v_measure value: 49.57135582416209 - type: v_measure_std value: 1.6414135468423754 task: type: Clustering - dataset: config: default name: MTEB StackOverflowDupQuestions revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 split: test type: mteb/stackoverflowdupquestions-reranking metrics: - type: main_score value: 55.253002583598644 - type: map value: 55.253002583598644 - type: mrr value: 56.24172396231219 - type: nAUC_map_diff1 value: 40.00053248203427 - type: nAUC_map_max value: 10.05441740585869 - type: nAUC_map_std value: 8.227169286387552 - type: nAUC_mrr_diff1 value: 40.250446264233744 - type: nAUC_mrr_max value: 10.586310195339053 - type: nAUC_mrr_std value: 8.47326494370076 task: type: Reranking - dataset: config: default name: MTEB SummEval revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c split: test type: mteb/summeval metrics: - type: cosine_pearson value: 31.19874648747059 - type: cosine_spearman value: 31.493550648844863 - type: dot_pearson value: 31.157847680289407 - type: dot_spearman value: 31.575299712180538 - type: 
main_score value: 31.493550648844863 - type: pearson value: 31.19874648747059 - type: spearman value: 31.493550648844863 task: type: Summarization - dataset: config: default name: MTEB TRECCOVID revision: bb9466bac8153a0349341eb1b22e06409e78ef4e split: test type: mteb/trec-covid metrics: - type: main_score value: 85.983 - type: map_at_1 value: 0.247 - type: map_at_10 value: 2.177 - type: map_at_100 value: 14.804 - type: map_at_1000 value: 37.045 - type: map_at_20 value: 4.12 - type: map_at_3 value: 0.7000000000000001 - type: map_at_5 value: 1.1320000000000001 - type: mrr_at_1 value: 96.0 - type: mrr_at_10 value: 98.0 - type: mrr_at_100 value: 98.0 - type: mrr_at_1000 value: 98.0 - type: mrr_at_20 value: 98.0 - type: mrr_at_3 value: 98.0 - type: mrr_at_5 value: 98.0 - type: nauc_map_at_1000_diff1 value: -0.9165125200337213 - type: nauc_map_at_1000_max value: 40.260117798042764 - type: nauc_map_at_1000_std value: 71.72789335831554 - type: nauc_map_at_100_diff1 value: 20.493827311583953 - type: nauc_map_at_100_max value: 21.005742079276462 - type: nauc_map_at_100_std value: 62.53815607831659 - type: nauc_map_at_10_diff1 value: 31.289297684528215 - type: nauc_map_at_10_max value: 7.86554294370268 - type: nauc_map_at_10_std value: 37.26191657133897 - type: nauc_map_at_1_diff1 value: 25.57568148849456 - type: nauc_map_at_1_max value: -5.9767435623941445 - type: nauc_map_at_1_std value: 30.849871717506755 - type: nauc_map_at_20_diff1 value: 30.896018204532087 - type: nauc_map_at_20_max value: 8.667077299744314 - type: nauc_map_at_20_std value: 41.512687168412924 - type: nauc_map_at_3_diff1 value: 29.44724521006598 - type: nauc_map_at_3_max value: 1.597496889532064 - type: nauc_map_at_3_std value: 32.25013773854697 - type: nauc_map_at_5_diff1 value: 27.387036605618825 - type: nauc_map_at_5_max value: 5.402983746211454 - type: nauc_map_at_5_std value: 33.940523962472184 - type: nauc_mrr_at_1000_diff1 value: -14.122315592903503 - type: nauc_mrr_at_1000_max value: 33.84687208216605 - type: nauc_mrr_at_1000_std value: 86.11111111111092 - type: nauc_mrr_at_100_diff1 value: -14.122315592903503 - type: nauc_mrr_at_100_max value: 33.84687208216605 - type: nauc_mrr_at_100_std value: 86.11111111111092 - type: nauc_mrr_at_10_diff1 value: -14.122315592903503 - type: nauc_mrr_at_10_max value: 33.84687208216605 - type: nauc_mrr_at_10_std value: 86.11111111111092 - type: nauc_mrr_at_1_diff1 value: -14.122315592903831 - type: nauc_mrr_at_1_max value: 33.84687208216637 - type: nauc_mrr_at_1_std value: 86.11111111111124 - type: nauc_mrr_at_20_diff1 value: -14.122315592903503 - type: nauc_mrr_at_20_max value: 33.84687208216605 - type: nauc_mrr_at_20_std value: 86.11111111111092 - type: nauc_mrr_at_3_diff1 value: -14.122315592903503 - type: nauc_mrr_at_3_max value: 33.84687208216605 - type: nauc_mrr_at_3_std value: 86.11111111111092 - type: nauc_mrr_at_5_diff1 value: -14.122315592903503 - type: nauc_mrr_at_5_max value: 33.84687208216605 - type: nauc_mrr_at_5_std value: 86.11111111111092 - type: nauc_ndcg_at_1000_diff1 value: 8.745907669561928 - type: nauc_ndcg_at_1000_max value: 45.43307237994533 - type: nauc_ndcg_at_1000_std value: 74.93357447176336 - type: nauc_ndcg_at_100_diff1 value: -3.9719350773353765 - type: nauc_ndcg_at_100_max value: 44.43705332397461 - type: nauc_ndcg_at_100_std value: 61.59493812371758 - type: nauc_ndcg_at_10_diff1 value: 15.230915878367348 - type: nauc_ndcg_at_10_max value: 48.332840970836635 - type: nauc_ndcg_at_10_std value: 46.888785065125774 - type: nauc_ndcg_at_1_diff1 value: 
13.219732337379442 - type: nauc_ndcg_at_1_max value: 45.19919078742603 - type: nauc_ndcg_at_1_std value: 64.68253968253977 - type: nauc_ndcg_at_20_diff1 value: 12.479648691964865 - type: nauc_ndcg_at_20_max value: 48.76688248450331 - type: nauc_ndcg_at_20_std value: 51.450399755887545 - type: nauc_ndcg_at_3_diff1 value: 6.165414201871464 - type: nauc_ndcg_at_3_max value: 45.089689347691035 - type: nauc_ndcg_at_3_std value: 41.08249161845213 - type: nauc_ndcg_at_5_diff1 value: 7.411245806844721 - type: nauc_ndcg_at_5_max value: 47.818748093538076 - type: nauc_ndcg_at_5_std value: 45.907685763676575 - type: nauc_precision_at_1000_diff1 value: -30.574290219847345 - type: nauc_precision_at_1000_max value: 32.56926126118719 - type: nauc_precision_at_1000_std value: 14.584504392628874 - type: nauc_precision_at_100_diff1 value: -10.199740234718847 - type: nauc_precision_at_100_max value: 41.0213226769777 - type: nauc_precision_at_100_std value: 56.975760776771324 - type: nauc_precision_at_10_diff1 value: 7.865792689701161 - type: nauc_precision_at_10_max value: 52.00432275201737 - type: nauc_precision_at_10_std value: 43.89512276413724 - type: nauc_precision_at_1_diff1 value: -14.122315592903831 - type: nauc_precision_at_1_max value: 33.84687208216637 - type: nauc_precision_at_1_std value: 86.11111111111124 - type: nauc_precision_at_20_diff1 value: 5.481424191880084 - type: nauc_precision_at_20_max value: 46.86629331792725 - type: nauc_precision_at_20_std value: 49.245692667517496 - type: nauc_precision_at_3_diff1 value: -5.870408807869163 - type: nauc_precision_at_3_max value: 48.73657612128875 - type: nauc_precision_at_3_std value: 41.15152062088262 - type: nauc_precision_at_5_diff1 value: -4.550610529125413 - type: nauc_precision_at_5_max value: 60.390115878205386 - type: nauc_precision_at_5_std value: 44.16494295055696 - type: nauc_recall_at_1000_diff1 value: 8.047794367079034 - type: nauc_recall_at_1000_max value: 37.07551482870489 - type: nauc_recall_at_1000_std value: 66.20862163364201 - type: nauc_recall_at_100_diff1 value: 25.08104923597475 - type: nauc_recall_at_100_max value: 9.971294642165734 - type: nauc_recall_at_100_std value: 51.737814074891254 - type: nauc_recall_at_10_diff1 value: 32.33148478369628 - type: nauc_recall_at_10_max value: 1.3767192150014917 - type: nauc_recall_at_10_std value: 30.801926742876308 - type: nauc_recall_at_1_diff1 value: 25.57568148849456 - type: nauc_recall_at_1_max value: -5.9767435623941445 - type: nauc_recall_at_1_std value: 30.849871717506755 - type: nauc_recall_at_20_diff1 value: 31.716580022934654 - type: nauc_recall_at_20_max value: -0.1281270579464631 - type: nauc_recall_at_20_std value: 33.76185294993676 - type: nauc_recall_at_3_diff1 value: 29.758810004388348 - type: nauc_recall_at_3_max value: -1.9442985017191816 - type: nauc_recall_at_3_std value: 27.45550076962206 - type: nauc_recall_at_5_diff1 value: 27.047710181576672 - type: nauc_recall_at_5_max value: 1.5237000700880248 - type: nauc_recall_at_5_std value: 28.235297950159698 - type: ndcg_at_1 value: 94.0 - type: ndcg_at_10 value: 85.983 - type: ndcg_at_100 value: 69.195 - type: ndcg_at_1000 value: 62.541000000000004 - type: ndcg_at_20 value: 83.405 - type: ndcg_at_3 value: 89.98899999999999 - type: ndcg_at_5 value: 87.905 - type: precision_at_1 value: 96.0 - type: precision_at_10 value: 89.4 - type: precision_at_100 value: 71.54 - type: precision_at_1000 value: 27.594 - type: precision_at_20 value: 87.2 - type: precision_at_3 value: 92.667 - type: precision_at_5 value: 90.8 - type: 
recall_at_1 value: 0.247 - type: recall_at_10 value: 2.315 - type: recall_at_100 value: 17.574 - type: recall_at_1000 value: 59.336999999999996 - type: recall_at_20 value: 4.491 - type: recall_at_3 value: 0.7250000000000001 - type: recall_at_5 value: 1.1820000000000002 task: type: Retrieval - dataset: config: default name: MTEB Touche2020 revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f split: test type: mteb/touche2020 metrics: - type: main_score value: 29.944 - type: map_at_1 value: 3.064 - type: map_at_10 value: 11.501999999999999 - type: map_at_100 value: 18.736 - type: map_at_1000 value: 20.333000000000002 - type: map_at_20 value: 14.057 - type: map_at_3 value: 6.300999999999999 - type: map_at_5 value: 8.463 - type: mrr_at_1 value: 44.89795918367347 - type: mrr_at_10 value: 58.41188856494979 - type: mrr_at_100 value: 58.93964266413245 - type: mrr_at_1000 value: 58.93964266413245 - type: mrr_at_20 value: 58.767485349118 - type: mrr_at_3 value: 54.42176870748299 - type: mrr_at_5 value: 56.666666666666664 - type: nauc_map_at_1000_diff1 value: 11.478593385608479 - type: nauc_map_at_1000_max value: 10.309889845044324 - type: nauc_map_at_1000_std value: 21.16721939940238 - type: nauc_map_at_100_diff1 value: 11.570438543562418 - type: nauc_map_at_100_max value: 8.426183648064834 - type: nauc_map_at_100_std value: 18.56231985033613 - type: nauc_map_at_10_diff1 value: 22.37735506247481 - type: nauc_map_at_10_max value: 5.455946239060806 - type: nauc_map_at_10_std value: -4.2848826518388154 - type: nauc_map_at_1_diff1 value: 27.853645380676824 - type: nauc_map_at_1_max value: 7.30739948053113 - type: nauc_map_at_1_std value: -0.2773663157814586 - type: nauc_map_at_20_diff1 value: 14.724669779924648 - type: nauc_map_at_20_max value: 10.12882779173533 - type: nauc_map_at_20_std value: 4.4803777672120875 - type: nauc_map_at_3_diff1 value: 31.891173385921263 - type: nauc_map_at_3_max value: 4.889652271827218 - type: nauc_map_at_3_std value: -9.477460238651643 - type: nauc_map_at_5_diff1 value: 31.489012040465003 - type: nauc_map_at_5_max value: 1.7330092417337482 - type: nauc_map_at_5_std value: -8.137018608469637 - type: nauc_mrr_at_1000_diff1 value: 24.411522237082416 - type: nauc_mrr_at_1000_max value: 11.286971076556688 - type: nauc_mrr_at_1000_std value: 23.443174210894043 - type: nauc_mrr_at_100_diff1 value: 24.411522237082416 - type: nauc_mrr_at_100_max value: 11.286971076556688 - type: nauc_mrr_at_100_std value: 23.443174210894043 - type: nauc_mrr_at_10_diff1 value: 23.948152308265186 - type: nauc_mrr_at_10_max value: 12.22420979621155 - type: nauc_mrr_at_10_std value: 23.557939024705544 - type: nauc_mrr_at_1_diff1 value: 17.902334894536107 - type: nauc_mrr_at_1_max value: 17.36969662861018 - type: nauc_mrr_at_1_std value: 19.425714969048734 - type: nauc_mrr_at_20_diff1 value: 24.635893795899797 - type: nauc_mrr_at_20_max value: 11.330541067194913 - type: nauc_mrr_at_20_std value: 23.74518583400233 - type: nauc_mrr_at_3_diff1 value: 25.045536328282587 - type: nauc_mrr_at_3_max value: 7.497967004732733 - type: nauc_mrr_at_3_std value: 24.167153007320078 - type: nauc_mrr_at_5_diff1 value: 24.328479930592454 - type: nauc_mrr_at_5_max value: 10.037126854938336 - type: nauc_mrr_at_5_std value: 25.236208055346136 - type: nauc_ndcg_at_1000_diff1 value: 15.555347444667389 - type: nauc_ndcg_at_1000_max value: 13.356591700655718 - type: nauc_ndcg_at_1000_std value: 42.42395845935052 - type: nauc_ndcg_at_100_diff1 value: 13.110526060413708 - type: nauc_ndcg_at_100_max value: 3.140006440162515 - 
type: nauc_ndcg_at_100_std value: 39.02733288398033 - type: nauc_ndcg_at_10_diff1 value: 20.68853369009725 - type: nauc_ndcg_at_10_max value: 2.435389817058852 - type: nauc_ndcg_at_10_std value: 10.038202768784316 - type: nauc_ndcg_at_1_diff1 value: 20.17287594582385 - type: nauc_ndcg_at_1_max value: 12.487205168273196 - type: nauc_ndcg_at_1_std value: 20.639827614373075 - type: nauc_ndcg_at_20_diff1 value: 16.987577348502985 - type: nauc_ndcg_at_20_max value: 2.9978717644469266 - type: nauc_ndcg_at_20_std value: 13.015690866750354 - type: nauc_ndcg_at_3_diff1 value: 32.392223079245575 - type: nauc_ndcg_at_3_max value: 1.587587110582544 - type: nauc_ndcg_at_3_std value: 12.850592473446609 - type: nauc_ndcg_at_5_diff1 value: 32.80244517369626 - type: nauc_ndcg_at_5_max value: 5.8939933777508084 - type: nauc_ndcg_at_5_std value: 15.779687411463414 - type: nauc_precision_at_1000_diff1 value: -14.314031720452537 - type: nauc_precision_at_1000_max value: 32.87886666567266 - type: nauc_precision_at_1000_std value: 21.49347046886851 - type: nauc_precision_at_100_diff1 value: -9.4034008613839 - type: nauc_precision_at_100_max value: 16.784075123309645 - type: nauc_precision_at_100_std value: 73.14688535393604 - type: nauc_precision_at_10_diff1 value: 6.855101404043058 - type: nauc_precision_at_10_max value: 6.52491228645612 - type: nauc_precision_at_10_std value: 16.104602266016744 - type: nauc_precision_at_1_diff1 value: 17.902334894536107 - type: nauc_precision_at_1_max value: 17.36969662861018 - type: nauc_precision_at_1_std value: 19.425714969048734 - type: nauc_precision_at_20_diff1 value: -5.337534613602212 - type: nauc_precision_at_20_max value: 17.722925454767218 - type: nauc_precision_at_20_std value: 34.26680462132849 - type: nauc_precision_at_3_diff1 value: 31.054623397809255 - type: nauc_precision_at_3_max value: -0.92038600946826 - type: nauc_precision_at_3_std value: 8.326997076862916 - type: nauc_precision_at_5_diff1 value: 29.784942296920462 - type: nauc_precision_at_5_max value: 6.337469263434779 - type: nauc_precision_at_5_std value: 12.789597196020974 - type: nauc_recall_at_1000_diff1 value: -3.8177981862041364 - type: nauc_recall_at_1000_max value: 14.206064332229163 - type: nauc_recall_at_1000_std value: 74.18853420771269 - type: nauc_recall_at_100_diff1 value: 0.7677996771461106 - type: nauc_recall_at_100_max value: -4.139924106878441 - type: nauc_recall_at_100_std value: 48.319930706362896 - type: nauc_recall_at_10_diff1 value: 12.038835537494322 - type: nauc_recall_at_10_max value: -2.0498983557854418 - type: nauc_recall_at_10_std value: -2.0339180690854493 - type: nauc_recall_at_1_diff1 value: 27.853645380676824 - type: nauc_recall_at_1_max value: 7.30739948053113 - type: nauc_recall_at_1_std value: -0.2773663157814586 - type: nauc_recall_at_20_diff1 value: 0.7907893667756708 - type: nauc_recall_at_20_max value: 0.8795499810558195 - type: nauc_recall_at_20_std value: 11.512483291688282 - type: nauc_recall_at_3_diff1 value: 33.19440392639576 - type: nauc_recall_at_3_max value: -1.5494237697432613 - type: nauc_recall_at_3_std value: -8.560408808376984 - type: nauc_recall_at_5_diff1 value: 27.42193873870941 - type: nauc_recall_at_5_max value: -4.74350293281128 - type: nauc_recall_at_5_std value: -7.618060131179654 - type: ndcg_at_1 value: 42.857 - type: ndcg_at_10 value: 29.944 - type: ndcg_at_100 value: 42.624 - type: ndcg_at_1000 value: 53.384 - type: ndcg_at_20 value: 30.135 - type: ndcg_at_3 value: 34.847 - type: ndcg_at_5 value: 32.573 - type: precision_at_1 value: 
44.897999999999996 - type: precision_at_10 value: 25.306 - type: precision_at_100 value: 8.694 - type: precision_at_1000 value: 1.616 - type: precision_at_20 value: 19.082 - type: precision_at_3 value: 34.014 - type: precision_at_5 value: 31.019999999999996 - type: recall_at_1 value: 3.064 - type: recall_at_10 value: 17.849999999999998 - type: recall_at_100 value: 53.217999999999996 - type: recall_at_1000 value: 87.095 - type: recall_at_20 value: 26.111 - type: recall_at_3 value: 7.383000000000001 - type: recall_at_5 value: 11.434 task: type: Retrieval - dataset: config: default name: MTEB ToxicConversationsClassification revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de split: test type: mteb/toxic_conversations_50k metrics: - type: accuracy value: 88.759765625 - type: ap value: 36.49152357863017 - type: ap_weighted value: 36.49152357863017 - type: f1 value: 74.4692714448641 - type: f1_weighted value: 90.54372649306606 - type: main_score value: 88.759765625 task: type: Classification - dataset: config: default name: MTEB TweetSentimentExtractionClassification revision: d604517c81ca91fe16a244d1248fc021f9ecee7a split: test type: mteb/tweet_sentiment_extraction metrics: - type: accuracy value: 74.8443689869836 - type: f1 value: 75.1139662898148 - type: f1_weighted value: 74.7369003946243 - type: main_score value: 74.8443689869836 task: type: Classification - dataset: config: default name: MTEB TwentyNewsgroupsClustering revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 split: test type: mteb/twentynewsgroups-clustering metrics: - type: main_score value: 61.42918790942448 - type: v_measure value: 61.42918790942448 - type: v_measure_std value: 1.0156550098843082 task: type: Clustering - dataset: config: default name: MTEB TwitterSemEval2015 revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 split: test type: mteb/twittersemeval2015-pairclassification metrics: - type: cosine_accuracy value: 88.22197055492639 - type: cosine_accuracy_threshold value: 83.30042362213135 - type: cosine_ap value: 80.57754959194938 - type: cosine_f1 value: 73.70579190158894 - type: cosine_f1_threshold value: 81.04978799819946 - type: cosine_precision value: 71.64922770303936 - type: cosine_recall value: 75.8839050131926 - type: dot_accuracy value: 88.23985217857782 - type: dot_accuracy_threshold value: 83.31039547920227 - type: dot_ap value: 80.57533213448181 - type: dot_f1 value: 73.61309601143302 - type: dot_f1_threshold value: 81.33968114852905 - type: dot_precision value: 72.51087791144101 - type: dot_recall value: 74.74934036939314 - type: euclidean_accuracy value: 88.22197055492639 - type: euclidean_accuracy_threshold value: 58.290231227874756 - type: euclidean_ap value: 80.57982723880139 - type: euclidean_f1 value: 73.63426519620417 - type: euclidean_f1_threshold value: 61.55576705932617 - type: euclidean_precision value: 71.63173652694611 - type: euclidean_recall value: 75.75197889182058 - type: main_score value: 80.57982723880139 - type: manhattan_accuracy value: 88.14448351910353 - type: manhattan_accuracy_threshold value: 3907.2471618652344 - type: manhattan_ap value: 80.3538079655539 - type: manhattan_f1 value: 73.40466675261054 - type: manhattan_f1_threshold value: 4103.794097900391 - type: manhattan_precision value: 71.76707839677337 - type: manhattan_recall value: 75.11873350923483 - type: max_ap value: 80.57982723880139 - type: max_f1 value: 73.70579190158894 - type: max_precision value: 72.51087791144101 - type: max_recall value: 75.8839050131926 - type: similarity_accuracy value: 88.22197055492639 
- type: similarity_accuracy_threshold value: 83.30042362213135 - type: similarity_ap value: 80.57754959194938 - type: similarity_f1 value: 73.70579190158894 - type: similarity_f1_threshold value: 81.04978799819946 - type: similarity_precision value: 71.64922770303936 - type: similarity_recall value: 75.8839050131926 task: type: PairClassification - dataset: config: default name: MTEB TwitterURLCorpus revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf split: test type: mteb/twitterurlcorpus-pairclassification metrics: - type: cosine_accuracy value: 89.88628866379477 - type: cosine_accuracy_threshold value: 80.8050274848938 - type: cosine_ap value: 87.57594591596816 - type: cosine_f1 value: 80.0812257707218 - type: cosine_f1_threshold value: 77.990061044693 - type: cosine_precision value: 76.93126197063205 - type: cosine_recall value: 83.50015398829689 - type: dot_accuracy value: 89.87852679784221 - type: dot_accuracy_threshold value: 80.84419965744019 - type: dot_ap value: 87.56136742222151 - type: dot_f1 value: 80.05898617511521 - type: dot_f1_threshold value: 77.92385816574097 - type: dot_precision value: 76.80554573106035 - type: dot_recall value: 83.60024638127503 - type: euclidean_accuracy value: 89.86882446540149 - type: euclidean_accuracy_threshold value: 62.08193898200989 - type: euclidean_ap value: 87.57517549192228 - type: euclidean_f1 value: 80.05286925872892 - type: euclidean_f1_threshold value: 66.65036082267761 - type: euclidean_precision value: 76.51063232507545 - type: euclidean_recall value: 83.93902063443178 - type: main_score value: 87.64162614197194 - type: manhattan_accuracy value: 89.8959909962355 - type: manhattan_accuracy_threshold value: 4176.108169555664 - type: manhattan_ap value: 87.64162614197194 - type: manhattan_f1 value: 80.17116279069768 - type: manhattan_f1_threshold value: 4433.153533935547 - type: manhattan_precision value: 77.57615035644848 - type: manhattan_recall value: 82.94579611949491 - type: max_ap value: 87.64162614197194 - type: max_f1 value: 80.17116279069768 - type: max_precision value: 77.57615035644848 - type: max_recall value: 83.93902063443178 - type: similarity_accuracy value: 89.88628866379477 - type: similarity_accuracy_threshold value: 80.8050274848938 - type: similarity_ap value: 87.57594591596816 - type: similarity_f1 value: 80.0812257707218 - type: similarity_f1_threshold value: 77.990061044693 - type: similarity_precision value: 76.93126197063205 - type: similarity_recall value: 83.50015398829689 task: type: PairClassification tags: - mteb - sentence-transformers - transformers - sentence-similarity license: mit ---

# Updates

New open-source models and a to-do list will be posted at https://github.com/DunZhang/Stella/blob/main/news_and_todo.md. You can also find these models on my [homepage](https://huggingface.co/infgrad).

# Introduction

The models are trained on top of `Alibaba-NLP/gte-large-en-v1.5` and `Alibaba-NLP/gte-Qwen2-1.5B-instruct`. Thanks to the authors for their contributions!

**We simplify prompt usage by providing two prompts that cover most general tasks: one for s2p (sentence-to-passage) and one for s2s (sentence-to-sentence).**

Prompt for the s2p task (e.g., retrieval tasks):

```text
Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: {query}
```

Prompt for the s2s task (e.g., semantic textual similarity tasks):

```text
Instruct: Retrieve semantically similar text.\nQuery: {query}
```

Finally, the models are trained with [MRL](https://arxiv.org/abs/2205.13147), so they support multiple output dimensions: 512, 768, 1024, 2048, 4096, 6144 and 8192.
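Since each output dimension corresponds to its own `2_Dense_{dims}` head (see the directory-structure section below), selecting a non-default dimension amounts to repointing the pipeline at a different folder. Here is a minimal sketch of that edit on a local clone — the `model_dir` path is hypothetical, and the `modules.json` layout is assumed to follow the standard SentenceTransformer list-of-modules format:

```python
import json

# Hypothetical sketch: switch a local clone from the default 1024-d head
# to the 256-d head by editing modules.json. `model_dir` is an assumed path.
model_dir = "./stella_en_1.5B_v5"

with open(f"{model_dir}/modules.json") as f:
    modules = json.load(f)

# Each entry is a pipeline module with a "path" field; the Dense module's
# path decides which 2_Dense_{dims} weights are loaded.
for module in modules:
    if module.get("path", "").startswith("2_Dense"):
        module["path"] = "2_Dense_256"  # default is "2_Dense_1024"

with open(f"{model_dir}/modules.json", "w") as f:
    json.dump(modules, f, indent=2)
```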
The higher the dimension, the better the performance. **Generally speaking, 1024d is good enough.** The MTEB score of 1024d is only 0.001 lower than that of 8192d.

# Model directory structure

The model directory structure is very simple: it is a standard SentenceTransformer directory **with a series of `2_Dense_{dims}` folders**, where `dims` is the final vector dimension. For example, the `2_Dense_256` folder stores the Linear weights that project vectors down to 256 dimensions. Please refer to the following sections for specific instructions on how to use them.

# Usage

You can use the `sentence-transformers` or `transformers` library to encode text.

## Sentence Transformers

```python
from sentence_transformers import SentenceTransformer

# This model supports two prompts: "s2p_query" and "s2s_query" for sentence-to-passage
# and sentence-to-sentence tasks, respectively.
# They are defined in `config_sentence_transformers.json`.
query_prompt_name = "s2p_query"
queries = [
    "What are some ways to reduce stress?",
    "What are the benefits of drinking green tea?",
]
# docs do not need any prompts
docs = [
    "There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
    "Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]

# !The default dimension is 1024. If you need another dimension, clone the model and modify
# `modules.json` to replace `2_Dense_1024` with another folder, e.g. `2_Dense_256` or `2_Dense_8192`!
model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True).cuda()
query_embeddings = model.encode(queries, prompt_name=query_prompt_name)
doc_embeddings = model.encode(docs)
print(query_embeddings.shape, doc_embeddings.shape)
# (2, 1024) (2, 1024)

similarities = model.similarity(query_embeddings, doc_embeddings)
print(similarities)
# tensor([[0.8179, 0.2958],
#         [0.3194, 0.7854]])
```

## Transformers

```python
import os

import torch
from sklearn.preprocessing import normalize
from transformers import AutoModel, AutoTokenizer

query_prompt = "Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: "
queries = [
    "What are some ways to reduce stress?",
    "What are the benefits of drinking green tea?",
]
queries = [query_prompt + query for query in queries]
# docs do not need any prompts
docs = [
    "There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
    "Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]

# The path of your model after cloning it
model_dir = "{Your MODEL_PATH}"

vector_dim = 1024
vector_linear_directory = f"2_Dense_{vector_dim}"
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).cuda().eval()
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
# Load the Linear head that maps hidden states to the chosen vector dimension.
vector_linear = torch.nn.Linear(in_features=model.config.hidden_size, out_features=vector_dim)
vector_linear_dict = {
    k.replace("linear.", ""): v
    for k, v in torch.load(os.path.join(model_dir, f"{vector_linear_directory}/pytorch_model.bin")).items()
}
vector_linear.load_state_dict(vector_linear_dict)
vector_linear.cuda()

# Embed the queries
with torch.no_grad():
    input_data = tokenizer(queries, padding="longest", truncation=True, max_length=512, return_tensors="pt")
    input_data = {k: v.cuda() for k, v in input_data.items()}
    attention_mask = input_data["attention_mask"]
    last_hidden_state = model(**input_data)[0]
    # Mean pooling over non-padding tokens.
    last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
    query_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
    query_vectors = normalize(vector_linear(query_vectors).cpu().numpy())

# Embed the documents
with torch.no_grad():
    input_data = tokenizer(docs, padding="longest", truncation=True, max_length=512, return_tensors="pt")
    input_data = {k: v.cuda() for k, v in input_data.items()}
    attention_mask = input_data["attention_mask"]
    last_hidden_state = model(**input_data)[0]
    last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
    docs_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
    docs_vectors = normalize(vector_linear(docs_vectors).cpu().numpy())

print(query_vectors.shape, docs_vectors.shape)
# (2, 1024) (2, 1024)

similarities = query_vectors @ docs_vectors.T
print(similarities)
# [[0.8178789 0.2958377 ]
#  [0.31938642 0.7853526 ]]
```

# FAQ

Q: What are the details of the training?

A: The training method and datasets will be released in the future (timing unknown; they may be described in a paper).

Q: How do I choose a suitable prompt for my own task?

A: In most cases, please use the s2p and s2s prompts. These two prompts account for the vast majority of the training data.

Q: How do I reproduce the MTEB results?

A: Please use the evaluation scripts in `Alibaba-NLP/gte-Qwen2-1.5B-instruct` or `intfloat/e5-mistral-7b-instruct`.

Q: Why does each dimension have its own linear weight?

A: MRL supports multiple training methods; we chose this one because it gave the best performance.

Q: What is the sequence length of the models?

A: 512 is recommended. In our experiments, almost all models perform poorly on specialized long-text retrieval datasets. Besides, the model is trained on sequences of length 512, so this may be an area for future optimization.

If you have any questions, please start a discussion in the community tab.
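As a complement to the s2p-oriented examples above, here is a minimal sketch of the s2s prompt for similarity-style tasks. It reuses the `s2s_query` prompt name from the Sentence Transformers example; the sentences are illustrative only, and the usual convention of encoding both sides with the same prompt is assumed:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True).cuda()

# For s2s (e.g. semantic textual similarity) both inputs are short texts,
# so both are encoded with the same "s2s_query" prompt.
sentences = [
    "A man is playing a guitar on stage.",
    "Someone performs a song with a guitar.",
    "The stock market fell sharply today.",
]
embeddings = model.encode(sentences, prompt_name="s2s_query")
print(model.similarity(embeddings, embeddings))
```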
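For the MTEB question in the FAQ, the referenced repositories' scripts are the authoritative route; as an alternative, a hedged sketch with the open-source `mteb` package is shown below (the task name and output folder are arbitrary choices, and scores may differ slightly from those reported in this card):

```python
import mteb
from sentence_transformers import SentenceTransformer

# Hedged sketch: evaluate one MTEB task with the `mteb` package rather than
# the evaluation scripts referenced in the FAQ.
model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True)
tasks = mteb.get_tasks(tasks=["SciFact"])
evaluation = mteb.MTEB(tasks=tasks)
evaluation.run(model, output_folder="mteb_results")
```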
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
adriansanz/sitges2608bai-4ep
adriansanz
sentence-similarity
[ "sentence-transformers", "safetensors", "xlm-roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:4173", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:2205.13147", "arxiv:1705.00652", "base_model:BAAI/bge-m3", "base_model:finetune:BAAI/bge-m3", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,724
1,724
4
0
--- base_model: BAAI/bge-m3 datasets: [] language: [] library_name: sentence-transformers metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:4173 - loss:MatryoshkaLoss - loss:MultipleNegativesRankingLoss widget: - source_sentence: Si dins el termini que s'hagi atorgat amb aquesta finalitat els habitatges que en disposen no s'han adaptat, la llicència pot ésser revocada. sentences: - Qui pot sol·licitar la pròrroga de la prestació? - Quin és el resultat de la constatació dels fets denunciats per part de l'Ajuntament? - Què passa si no s'adapten els habitatges d'ús turístic dins el termini establert? - source_sentence: En cas que a la sepultura hi hagi despulles, la persona titular podrà triar entre traslladar-les a una altra sepultura de la què en sigui el/la titular o bé que l'Ajuntament les traslladi a l'ossera general. sentences: - Què passa amb les despulles si la persona titular decideix traslladar-les a una altra sepultura? - Quins són els beneficis de la llicència de publicitat dinàmica? - Quan es va aprovar els models d'aval per part de la Junta de Govern Local? - source_sentence: La colònia felina té un paper important en la reducció del nombre d'animals abandonats, ja que proporciona un refugi segur i un entorn adequat per als animals que es troben en situació de risc o abandonament. sentences: - Quin és el termini per justificar la realització del projecte/activitat subvencionada? - Quins són els tractaments mèdics que beneficien la salut de l'empleat municipal? - Quin és el paper de la colònia felina en la reducció del nombre d'animals abandonats? - source_sentence: 'La realització de les obres que s’indiquen a continuació està subjecta a l’obtenció d’una llicència d’obra major atorgada per l’Ajuntament: ... Compartimentació de naus industrials existents...' sentences: - Quin tipus d’obra es refereix a la compartimentació de naus industrials existents? - Quin és el benefici principal del tràmit de canvi de titular de la llicència de gual? - Quin és el tipus de garantia que es pot fer mitjançant una assegurança de caució? - source_sentence: Els membres de la Corporació tenen dret a obtenir dels òrgans de l'Ajuntament les dades o informacions... sentences: - Quin és el paper dels òrgans de l'Ajuntament en relació amb les sol·licituds dels membres de la Corporació? - Quin és el motiu principal perquè un beneficiari pugui perdre el dret a una subvenció? - Quin és el benefici de la presentació de recursos? 
model-index: - name: SentenceTransformer based on BAAI/bge-m3 results: - task: type: information-retrieval name: Information Retrieval dataset: name: dim 768 type: dim_768 metrics: - type: cosine_accuracy@1 value: 0.07543103448275862 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.14439655172413793 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.21336206896551724 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.3900862068965517 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.07543103448275862 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.048132183908045974 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.04267241379310344 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.039008620689655174 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.07543103448275862 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.14439655172413793 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.21336206896551724 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.3900862068965517 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.19775448839983267 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.14087729200875768 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.1670966505747688 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 512 type: dim_512 metrics: - type: cosine_accuracy@1 value: 0.07543103448275862 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.1400862068965517 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.20905172413793102 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.3922413793103448 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.07543103448275862 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.046695402298850566 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.04181034482758621 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.03922413793103448 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.07543103448275862 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.1400862068965517 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.20905172413793102 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.3922413793103448 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.1973388128367381 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.14006910235358525 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.1660059682423787 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 256 type: dim_256 metrics: - type: cosine_accuracy@1 value: 0.07112068965517242 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.14439655172413793 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.20905172413793102 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.3793103448275862 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.07112068965517242 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.048132183908045974 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.04181034482758621 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.03793103448275861 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.07112068965517242 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.14439655172413793 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.20905172413793102 name: Cosine Recall@5 - type: cosine_recall@10 value: 
0.3793103448275862 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.19451734912520316 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.13957307060755345 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.1658323397622155 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 128 type: dim_128 metrics: - type: cosine_accuracy@1 value: 0.06465517241379311 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.13793103448275862 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.21336206896551724 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.3577586206896552 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.06465517241379311 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.04597701149425287 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.04267241379310345 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.03577586206896552 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.06465517241379311 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.13793103448275862 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.21336206896551724 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.3577586206896552 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.18381656342161204 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.13181616037219498 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.15919561658705733 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 64 type: dim_64 metrics: - type: cosine_accuracy@1 value: 0.06896551724137931 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.13577586206896552 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.20905172413793102 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.35344827586206895 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.06896551724137931 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.04525862068965517 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.041810344827586214 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.03534482758620689 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.06896551724137931 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.13577586206896552 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.20905172413793102 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.35344827586206895 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.18256713591724985 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.131704980842912 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.1580121500031178 name: Cosine Map@100 --- # SentenceTransformer based on BAAI/bge-m3 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("adriansanz/sitges2608bai-4ep")
# Run inference
sentences = [
    "Els membres de la Corporació tenen dret a obtenir dels òrgans de l'Ajuntament les dades o informacions...",
    "Quin és el paper dels òrgans de l'Ajuntament en relació amb les sol·licituds dels membres de la Corporació?",
    'Quin és el benefici de la presentació de recursos?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `dim_768` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.0754 | | cosine_accuracy@3 | 0.1444 | | cosine_accuracy@5 | 0.2134 | | cosine_accuracy@10 | 0.3901 | | cosine_precision@1 | 0.0754 | | cosine_precision@3 | 0.0481 | | cosine_precision@5 | 0.0427 | | cosine_precision@10 | 0.039 | | cosine_recall@1 | 0.0754 | | cosine_recall@3 | 0.1444 | | cosine_recall@5 | 0.2134 | | cosine_recall@10 | 0.3901 | | cosine_ndcg@10 | 0.1978 | | cosine_mrr@10 | 0.1409 | | **cosine_map@100** | **0.1671** | #### Information Retrieval * Dataset: `dim_512` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:----------| | cosine_accuracy@1 | 0.0754 | | cosine_accuracy@3 | 0.1401 | | cosine_accuracy@5 | 0.2091 | | cosine_accuracy@10 | 0.3922 | | cosine_precision@1 | 0.0754 | | cosine_precision@3 | 0.0467 | | cosine_precision@5 | 0.0418 | | cosine_precision@10 | 0.0392 | | cosine_recall@1 | 0.0754 | | cosine_recall@3 | 0.1401 | | cosine_recall@5 | 0.2091 | | cosine_recall@10 | 0.3922 | | cosine_ndcg@10 | 0.1973 | | cosine_mrr@10 | 0.1401 | | **cosine_map@100** | **0.166** | #### Information Retrieval * Dataset: `dim_256` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.0711 | | cosine_accuracy@3 | 0.1444 | | cosine_accuracy@5 | 0.2091 | | cosine_accuracy@10 | 0.3793 | | cosine_precision@1 | 0.0711 | | cosine_precision@3 | 0.0481 | | cosine_precision@5 | 0.0418 | | cosine_precision@10 | 0.0379 | | cosine_recall@1 | 0.0711 | | cosine_recall@3 | 0.1444 | | cosine_recall@5 | 0.2091 | | cosine_recall@10 | 0.3793 | | cosine_ndcg@10 | 0.1945 | | cosine_mrr@10 | 0.1396 | | **cosine_map@100** | **0.1658** | #### Information Retrieval * Dataset: `dim_128` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.0647 | | cosine_accuracy@3 | 0.1379 | | cosine_accuracy@5 | 0.2134 | | cosine_accuracy@10 | 0.3578 | | cosine_precision@1 | 0.0647 | | cosine_precision@3 | 0.046 | | cosine_precision@5 | 0.0427 | | cosine_precision@10 | 0.0358 | | cosine_recall@1 | 0.0647 | | cosine_recall@3 | 0.1379 | | cosine_recall@5 | 0.2134 | | cosine_recall@10 | 0.3578 | | cosine_ndcg@10 | 0.1838 | | cosine_mrr@10 | 0.1318 | | **cosine_map@100** | **0.1592** | #### Information Retrieval * Dataset: `dim_64` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric 
| Value | |:--------------------|:----------| | cosine_accuracy@1 | 0.069 | | cosine_accuracy@3 | 0.1358 | | cosine_accuracy@5 | 0.2091 | | cosine_accuracy@10 | 0.3534 | | cosine_precision@1 | 0.069 | | cosine_precision@3 | 0.0453 | | cosine_precision@5 | 0.0418 | | cosine_precision@10 | 0.0353 | | cosine_recall@1 | 0.069 | | cosine_recall@3 | 0.1358 | | cosine_recall@5 | 0.2091 | | cosine_recall@10 | 0.3534 | | cosine_ndcg@10 | 0.1826 | | cosine_mrr@10 | 0.1317 | | **cosine_map@100** | **0.158** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 4,173 training samples * Columns: <code>positive</code> and <code>anchor</code> * Approximate statistics based on the first 1000 samples: | | positive | anchor | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 48.65 tokens</li><li>max: 125 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 20.96 tokens</li><li>max: 45 tokens</li></ul> | * Samples: | positive | anchor | |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------| | <code>Quan es produeix la caducitat del dret funerari per haver transcorregut el termini de concessió i un cop que l'Ajuntament hagi resolt el procediment legalment establert per a la declaració de caducitat, és imprescindible formalitzar la nova concessió del dret.</code> | <code>Quan es produeix la caducitat del dret funerari?</code> | | <code>Les persones beneficiàries de l'ajut per a la creació de noves empreses per persones donades d'alta al règim especial de treballadors autònoms.</code> | <code>Quin és el tipus de persones que poden beneficiar-se de l'ajut?</code> | | <code>Les entitats beneficiàries són les responsables de la gestió dels recursos econòmics i materials assignats per a la realització del projecte o activitat subvencionat.</code> | <code>Quin és el paper de les entitats beneficiàries en la gestió dels recursos?</code> | * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters: ```json { "loss": "MultipleNegativesRankingLoss", "matryoshka_dims": [ 768, 512, 256, 128, 64 ], "matryoshka_weights": [ 1, 1, 1, 1, 1 ], "n_dims_per_step": -1 } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: epoch - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `gradient_accumulation_steps`: 2 - `learning_rate`: 2e-05 - `num_train_epochs`: 4 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `bf16`: True - `tf32`: False - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: 
False - `eval_strategy`: epoch - `prediction_loss_only`: True - `per_device_train_batch_size`: 2 - `per_device_eval_batch_size`: 2 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 2 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: False - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs <details><summary>Click to expand</summary> | Epoch | Step | Training Loss | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 | 
|:----------:|:--------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|:----------------------:| | 0.0096 | 10 | 0.4269 | - | - | - | - | - | | 0.0192 | 20 | 0.2328 | - | - | - | - | - | | 0.0287 | 30 | 0.2803 | - | - | - | - | - | | 0.0383 | 40 | 0.312 | - | - | - | - | - | | 0.0479 | 50 | 0.0631 | - | - | - | - | - | | 0.0575 | 60 | 0.1824 | - | - | - | - | - | | 0.0671 | 70 | 0.3102 | - | - | - | - | - | | 0.0767 | 80 | 0.2966 | - | - | - | - | - | | 0.0862 | 90 | 0.3715 | - | - | - | - | - | | 0.0958 | 100 | 0.0719 | - | - | - | - | - | | 0.1054 | 110 | 0.279 | - | - | - | - | - | | 0.1150 | 120 | 0.0954 | - | - | - | - | - | | 0.1246 | 130 | 0.4912 | - | - | - | - | - | | 0.1342 | 140 | 0.2877 | - | - | - | - | - | | 0.1437 | 150 | 0.1933 | - | - | - | - | - | | 0.1533 | 160 | 0.5942 | - | - | - | - | - | | 0.1629 | 170 | 0.1336 | - | - | - | - | - | | 0.1725 | 180 | 0.1755 | - | - | - | - | - | | 0.1821 | 190 | 0.1455 | - | - | - | - | - | | 0.1917 | 200 | 0.4391 | - | - | - | - | - | | 0.2012 | 210 | 0.0567 | - | - | - | - | - | | 0.2108 | 220 | 0.2368 | - | - | - | - | - | | 0.2204 | 230 | 0.0249 | - | - | - | - | - | | 0.2300 | 240 | 0.0518 | - | - | - | - | - | | 0.2396 | 250 | 0.015 | - | - | - | - | - | | 0.2492 | 260 | 0.4096 | - | - | - | - | - | | 0.2587 | 270 | 0.115 | - | - | - | - | - | | 0.2683 | 280 | 0.0532 | - | - | - | - | - | | 0.2779 | 290 | 0.0407 | - | - | - | - | - | | 0.2875 | 300 | 0.082 | - | - | - | - | - | | 0.2971 | 310 | 0.1086 | - | - | - | - | - | | 0.3067 | 320 | 0.0345 | - | - | - | - | - | | 0.3162 | 330 | 0.3144 | - | - | - | - | - | | 0.3258 | 340 | 0.0056 | - | - | - | - | - | | 0.3354 | 350 | 0.0867 | - | - | - | - | - | | 0.3450 | 360 | 0.1011 | - | - | - | - | - | | 0.3546 | 370 | 0.6417 | - | - | - | - | - | | 0.3642 | 380 | 0.0689 | - | - | - | - | - | | 0.3737 | 390 | 0.0075 | - | - | - | - | - | | 0.3833 | 400 | 0.0822 | - | - | - | - | - | | 0.3929 | 410 | 0.098 | - | - | - | - | - | | 0.4025 | 420 | 0.0442 | - | - | - | - | - | | 0.4121 | 430 | 0.1759 | - | - | - | - | - | | 0.4217 | 440 | 0.2625 | - | - | - | - | - | | 0.4312 | 450 | 0.1123 | - | - | - | - | - | | 0.4408 | 460 | 0.1174 | - | - | - | - | - | | 0.4504 | 470 | 0.0529 | - | - | - | - | - | | 0.4600 | 480 | 0.5396 | - | - | - | - | - | | 0.4696 | 490 | 0.1985 | - | - | - | - | - | | 0.4792 | 500 | 0.0016 | - | - | - | - | - | | 0.4887 | 510 | 0.0496 | - | - | - | - | - | | 0.4983 | 520 | 0.3138 | - | - | - | - | - | | 0.5079 | 530 | 0.1974 | - | - | - | - | - | | 0.5175 | 540 | 0.3489 | - | - | - | - | - | | 0.5271 | 550 | 0.3332 | - | - | - | - | - | | 0.5367 | 560 | 0.7838 | - | - | - | - | - | | 0.5462 | 570 | 0.8335 | - | - | - | - | - | | 0.5558 | 580 | 0.5018 | - | - | - | - | - | | 0.5654 | 590 | 0.3391 | - | - | - | - | - | | 0.5750 | 600 | 0.0055 | - | - | - | - | - | | 0.5846 | 610 | 0.0264 | - | - | - | - | - | | 0.5942 | 620 | 0.1397 | - | - | - | - | - | | 0.6037 | 630 | 0.1114 | - | - | - | - | - | | 0.6133 | 640 | 0.337 | - | - | - | - | - | | 0.6229 | 650 | 0.0027 | - | - | - | - | - | | 0.6325 | 660 | 0.1454 | - | - | - | - | - | | 0.6421 | 670 | 0.2212 | - | - | - | - | - | | 0.6517 | 680 | 0.0472 | - | - | - | - | - | | 0.6612 | 690 | 0.6882 | - | - | - | - | - | | 0.6708 | 700 | 0.0266 | - | - | - | - | - | | 0.6804 | 710 | 1.0057 | - | - | - | - | - | | 0.6900 | 720 | 0.1456 | - | - | - | - | - | | 0.6996 | 730 | 0.4195 | - | - | - | - | - | | 0.7092 | 740 | 0.0732 | - | - | - | - | - | 
| 0.7187 | 750 | 0.0588 | - | - | - | - | - | | 0.7283 | 760 | 0.0033 | - | - | - | - | - | | 0.7379 | 770 | 0.0156 | - | - | - | - | - | | 0.7475 | 780 | 0.0997 | - | - | - | - | - | | 0.7571 | 790 | 0.856 | - | - | - | - | - | | 0.7667 | 800 | 0.2394 | - | - | - | - | - | | 0.7762 | 810 | 0.0322 | - | - | - | - | - | | 0.7858 | 820 | 0.1821 | - | - | - | - | - | | 0.7954 | 830 | 0.1883 | - | - | - | - | - | | 0.8050 | 840 | 0.0994 | - | - | - | - | - | | 0.8146 | 850 | 0.3889 | - | - | - | - | - | | 0.8241 | 860 | 0.0221 | - | - | - | - | - | | 0.8337 | 870 | 0.0106 | - | - | - | - | - | | 0.8433 | 880 | 0.0031 | - | - | - | - | - | | 0.8529 | 890 | 0.1453 | - | - | - | - | - | | 0.8625 | 900 | 0.487 | - | - | - | - | - | | 0.8721 | 910 | 0.2987 | - | - | - | - | - | | 0.8816 | 920 | 0.0347 | - | - | - | - | - | | 0.8912 | 930 | 0.2024 | - | - | - | - | - | | 0.9008 | 940 | 0.0087 | - | - | - | - | - | | 0.9104 | 950 | 0.3944 | - | - | - | - | - | | 0.9200 | 960 | 0.0935 | - | - | - | - | - | | 0.9296 | 970 | 0.2408 | - | - | - | - | - | | 0.9391 | 980 | 0.1545 | - | - | - | - | - | | 0.9487 | 990 | 0.1168 | - | - | - | - | - | | 0.9583 | 1000 | 0.0051 | - | - | - | - | - | | 0.9679 | 1010 | 0.681 | - | - | - | - | - | | 0.9775 | 1020 | 0.0198 | - | - | - | - | - | | 0.9871 | 1030 | 0.7243 | - | - | - | - | - | | 0.9966 | 1040 | 0.0341 | - | - | - | - | - | | 0.9995 | 1043 | - | 0.1608 | 0.1639 | 0.1678 | 0.1526 | 0.1610 | | 1.0062 | 1050 | 0.001 | - | - | - | - | - | | 1.0158 | 1060 | 0.0864 | - | - | - | - | - | | 1.0254 | 1070 | 0.0209 | - | - | - | - | - | | 1.0350 | 1080 | 0.2703 | - | - | - | - | - | | 1.0446 | 1090 | 0.1857 | - | - | - | - | - | | 1.0541 | 1100 | 0.0032 | - | - | - | - | - | | 1.0637 | 1110 | 0.118 | - | - | - | - | - | | 1.0733 | 1120 | 0.0029 | - | - | - | - | - | | 1.0829 | 1130 | 0.0393 | - | - | - | - | - | | 1.0925 | 1140 | 0.3103 | - | - | - | - | - | | 1.1021 | 1150 | 0.0323 | - | - | - | - | - | | 1.1116 | 1160 | 0.0925 | - | - | - | - | - | | 1.1212 | 1170 | 0.0963 | - | - | - | - | - | | 1.1308 | 1180 | 0.0481 | - | - | - | - | - | | 1.1404 | 1190 | 0.0396 | - | - | - | - | - | | 1.1500 | 1200 | 0.0033 | - | - | - | - | - | | 1.1596 | 1210 | 0.1555 | - | - | - | - | - | | 1.1691 | 1220 | 0.0938 | - | - | - | - | - | | 1.1787 | 1230 | 0.1347 | - | - | - | - | - | | 1.1883 | 1240 | 0.3057 | - | - | - | - | - | | 1.1979 | 1250 | 0.0005 | - | - | - | - | - | | 1.2075 | 1260 | 0.0634 | - | - | - | - | - | | 1.2171 | 1270 | 0.0013 | - | - | - | - | - | | 1.2266 | 1280 | 0.0012 | - | - | - | - | - | | 1.2362 | 1290 | 0.0119 | - | - | - | - | - | | 1.2458 | 1300 | 0.002 | - | - | - | - | - | | 1.2554 | 1310 | 0.016 | - | - | - | - | - | | 1.2650 | 1320 | 0.0169 | - | - | - | - | - | | 1.2746 | 1330 | 0.0332 | - | - | - | - | - | | 1.2841 | 1340 | 0.0076 | - | - | - | - | - | | 1.2937 | 1350 | 0.0029 | - | - | - | - | - | | 1.3033 | 1360 | 0.0011 | - | - | - | - | - | | 1.3129 | 1370 | 0.0477 | - | - | - | - | - | | 1.3225 | 1380 | 0.014 | - | - | - | - | - | | 1.3321 | 1390 | 0.0002 | - | - | - | - | - | | 1.3416 | 1400 | 0.012 | - | - | - | - | - | | 1.3512 | 1410 | 0.0175 | - | - | - | - | - | | 1.3608 | 1420 | 0.0088 | - | - | - | - | - | | 1.3704 | 1430 | 0.0022 | - | - | - | - | - | | 1.3800 | 1440 | 0.0007 | - | - | - | - | - | | 1.3896 | 1450 | 0.0098 | - | - | - | - | - | | 1.3991 | 1460 | 0.0003 | - | - | - | - | - | | 1.4087 | 1470 | 0.0804 | - | - | - | - | - | | 1.4183 | 1480 | 0.0055 | - | - | - | - | - | | 1.4279 | 1490 | 0.1131 | - | - | - | - | 
- | | 1.4375 | 1500 | 0.0018 | - | - | - | - | - | | 1.4471 | 1510 | 0.0002 | - | - | - | - | - | | 1.4566 | 1520 | 0.0143 | - | - | - | - | - | | 1.4662 | 1530 | 0.0876 | - | - | - | - | - | | 1.4758 | 1540 | 0.003 | - | - | - | - | - | | 1.4854 | 1550 | 0.0087 | - | - | - | - | - | | 1.4950 | 1560 | 0.0005 | - | - | - | - | - | | 1.5046 | 1570 | 0.0002 | - | - | - | - | - | | 1.5141 | 1580 | 0.1614 | - | - | - | - | - | | 1.5237 | 1590 | 0.0017 | - | - | - | - | - | | 1.5333 | 1600 | 0.0013 | - | - | - | - | - | | 1.5429 | 1610 | 0.0041 | - | - | - | - | - | | 1.5525 | 1620 | 0.0021 | - | - | - | - | - | | 1.5621 | 1630 | 0.1113 | - | - | - | - | - | | 1.5716 | 1640 | 0.0003 | - | - | - | - | - | | 1.5812 | 1650 | 0.0003 | - | - | - | - | - | | 1.5908 | 1660 | 0.0018 | - | - | - | - | - | | 1.6004 | 1670 | 0.0004 | - | - | - | - | - | | 1.6100 | 1680 | 0.0003 | - | - | - | - | - | | 1.6195 | 1690 | 0.0017 | - | - | - | - | - | | 1.6291 | 1700 | 0.0023 | - | - | - | - | - | | 1.6387 | 1710 | 0.0167 | - | - | - | - | - | | 1.6483 | 1720 | 0.0023 | - | - | - | - | - | | 1.6579 | 1730 | 0.0095 | - | - | - | - | - | | 1.6675 | 1740 | 0.0005 | - | - | - | - | - | | 1.6770 | 1750 | 0.0014 | - | - | - | - | - | | 1.6866 | 1760 | 0.0007 | - | - | - | - | - | | 1.6962 | 1770 | 0.0014 | - | - | - | - | - | | 1.7058 | 1780 | 0.0 | - | - | - | - | - | | 1.7154 | 1790 | 0.0016 | - | - | - | - | - | | 1.7250 | 1800 | 0.0004 | - | - | - | - | - | | 1.7345 | 1810 | 0.0007 | - | - | - | - | - | | 1.7441 | 1820 | 0.3356 | - | - | - | - | - | | 1.7537 | 1830 | 0.001 | - | - | - | - | - | | 1.7633 | 1840 | 0.0436 | - | - | - | - | - | | 1.7729 | 1850 | 0.0839 | - | - | - | - | - | | 1.7825 | 1860 | 0.0019 | - | - | - | - | - | | 1.7920 | 1870 | 0.0406 | - | - | - | - | - | | 1.8016 | 1880 | 0.0496 | - | - | - | - | - | | 1.8112 | 1890 | 0.0164 | - | - | - | - | - | | 1.8208 | 1900 | 0.0118 | - | - | - | - | - | | 1.8304 | 1910 | 0.001 | - | - | - | - | - | | 1.8400 | 1920 | 0.0004 | - | - | - | - | - | | 1.8495 | 1930 | 0.002 | - | - | - | - | - | | 1.8591 | 1940 | 0.0051 | - | - | - | - | - | | 1.8687 | 1950 | 0.0624 | - | - | - | - | - | | 1.8783 | 1960 | 0.0033 | - | - | - | - | - | | 1.8879 | 1970 | 0.0001 | - | - | - | - | - | | 1.8975 | 1980 | 0.1594 | - | - | - | - | - | | 1.9070 | 1990 | 0.007 | - | - | - | - | - | | 1.9166 | 2000 | 0.0002 | - | - | - | - | - | | 1.9262 | 2010 | 0.0012 | - | - | - | - | - | | 1.9358 | 2020 | 0.0011 | - | - | - | - | - | | 1.9454 | 2030 | 0.0264 | - | - | - | - | - | | 1.9550 | 2040 | 0.0004 | - | - | - | - | - | | 1.9645 | 2050 | 0.008 | - | - | - | - | - | | 1.9741 | 2060 | 0.1025 | - | - | - | - | - | | 1.9837 | 2070 | 0.0745 | - | - | - | - | - | | 1.9933 | 2080 | 0.006 | - | - | - | - | - | | 2.0 | 2087 | - | 0.1609 | 0.1644 | 0.1708 | 0.1499 | 0.1696 | | 2.0029 | 2090 | 0.001 | - | - | - | - | - | | 2.0125 | 2100 | 0.0004 | - | - | - | - | - | | 2.0220 | 2110 | 0.0003 | - | - | - | - | - | | 2.0316 | 2120 | 0.0001 | - | - | - | - | - | | 2.0412 | 2130 | 0.0003 | - | - | - | - | - | | 2.0508 | 2140 | 0.0002 | - | - | - | - | - | | 2.0604 | 2150 | 0.0006 | - | - | - | - | - | | 2.0700 | 2160 | 0.04 | - | - | - | - | - | | 2.0795 | 2170 | 0.0055 | - | - | - | - | - | | 2.0891 | 2180 | 0.1454 | - | - | - | - | - | | 2.0987 | 2190 | 0.0029 | - | - | - | - | - | | 2.1083 | 2200 | 0.0006 | - | - | - | - | - | | 2.1179 | 2210 | 0.0001 | - | - | - | - | - | | 2.1275 | 2220 | 0.0129 | - | - | - | - | - | | 2.1370 | 2230 | 0.0001 | - | - | - | - | - | | 2.1466 | 2240 | 
0.0003 | - | - | - | - | - | | 2.1562 | 2250 | 0.4145 | - | - | - | - | - | | 2.1658 | 2260 | 0.0048 | - | - | - | - | - | | 2.1754 | 2270 | 0.0706 | - | - | - | - | - | | 2.1850 | 2280 | 0.0026 | - | - | - | - | - | | 2.1945 | 2290 | 0.008 | - | - | - | - | - | | 2.2041 | 2300 | 0.0051 | - | - | - | - | - | | 2.2137 | 2310 | 0.0307 | - | - | - | - | - | | 2.2233 | 2320 | 0.0017 | - | - | - | - | - | | 2.2329 | 2330 | 0.0005 | - | - | - | - | - | | 2.2425 | 2340 | 0.0001 | - | - | - | - | - | | 2.2520 | 2350 | 0.0001 | - | - | - | - | - | | 2.2616 | 2360 | 0.0001 | - | - | - | - | - | | 2.2712 | 2370 | 0.0461 | - | - | - | - | - | | 2.2808 | 2380 | 0.0001 | - | - | - | - | - | | 2.2904 | 2390 | 0.0003 | - | - | - | - | - | | 2.3000 | 2400 | 0.001 | - | - | - | - | - | | 2.3095 | 2410 | 0.0002 | - | - | - | - | - | | 2.3191 | 2420 | 0.1568 | - | - | - | - | - | | 2.3287 | 2430 | 0.0001 | - | - | - | - | - | | 2.3383 | 2440 | 0.0005 | - | - | - | - | - | | 2.3479 | 2450 | 0.0072 | - | - | - | - | - | | 2.3575 | 2460 | 0.014 | - | - | - | - | - | | 2.3670 | 2470 | 0.0003 | - | - | - | - | - | | 2.3766 | 2480 | 0.0 | - | - | - | - | - | | 2.3862 | 2490 | 0.0001 | - | - | - | - | - | | 2.3958 | 2500 | 0.0008 | - | - | - | - | - | | 2.4054 | 2510 | 0.0 | - | - | - | - | - | | 2.4149 | 2520 | 0.0002 | - | - | - | - | - | | 2.4245 | 2530 | 0.061 | - | - | - | - | - | | 2.4341 | 2540 | 0.0005 | - | - | - | - | - | | 2.4437 | 2550 | 0.0 | - | - | - | - | - | | 2.4533 | 2560 | 0.0003 | - | - | - | - | - | | 2.4629 | 2570 | 0.0095 | - | - | - | - | - | | 2.4724 | 2580 | 0.0002 | - | - | - | - | - | | 2.4820 | 2590 | 0.0 | - | - | - | - | - | | 2.4916 | 2600 | 0.0003 | - | - | - | - | - | | 2.5012 | 2610 | 0.0002 | - | - | - | - | - | | 2.5108 | 2620 | 0.0035 | - | - | - | - | - | | 2.5204 | 2630 | 0.0001 | - | - | - | - | - | | 2.5299 | 2640 | 0.0 | - | - | - | - | - | | 2.5395 | 2650 | 0.0017 | - | - | - | - | - | | 2.5491 | 2660 | 0.0 | - | - | - | - | - | | 2.5587 | 2670 | 0.0066 | - | - | - | - | - | | 2.5683 | 2680 | 0.0004 | - | - | - | - | - | | 2.5779 | 2690 | 0.0001 | - | - | - | - | - | | 2.5874 | 2700 | 0.0 | - | - | - | - | - | | 2.5970 | 2710 | 0.0 | - | - | - | - | - | | 2.6066 | 2720 | 0.131 | - | - | - | - | - | | 2.6162 | 2730 | 0.0001 | - | - | - | - | - | | 2.6258 | 2740 | 0.0001 | - | - | - | - | - | | 2.6354 | 2750 | 0.0001 | - | - | - | - | - | | 2.6449 | 2760 | 0.0 | - | - | - | - | - | | 2.6545 | 2770 | 0.0003 | - | - | - | - | - | | 2.6641 | 2780 | 0.0095 | - | - | - | - | - | | 2.6737 | 2790 | 0.0 | - | - | - | - | - | | 2.6833 | 2800 | 0.0003 | - | - | - | - | - | | 2.6929 | 2810 | 0.0001 | - | - | - | - | - | | 2.7024 | 2820 | 0.0002 | - | - | - | - | - | | 2.7120 | 2830 | 0.0007 | - | - | - | - | - | | 2.7216 | 2840 | 0.0008 | - | - | - | - | - | | 2.7312 | 2850 | 0.0 | - | - | - | - | - | | 2.7408 | 2860 | 0.0002 | - | - | - | - | - | | 2.7504 | 2870 | 0.0003 | - | - | - | - | - | | 2.7599 | 2880 | 0.0062 | - | - | - | - | - | | 2.7695 | 2890 | 0.0415 | - | - | - | - | - | | 2.7791 | 2900 | 0.0002 | - | - | - | - | - | | 2.7887 | 2910 | 0.0024 | - | - | - | - | - | | 2.7983 | 2920 | 0.0022 | - | - | - | - | - | | 2.8079 | 2930 | 0.0014 | - | - | - | - | - | | 2.8174 | 2940 | 0.1301 | - | - | - | - | - | | 2.8270 | 2950 | 0.0 | - | - | - | - | - | | 2.8366 | 2960 | 0.0 | - | - | - | - | - | | 2.8462 | 2970 | 0.0 | - | - | - | - | - | | 2.8558 | 2980 | 0.0006 | - | - | - | - | - | | 2.8654 | 2990 | 0.0 | - | - | - | - | - | | 2.8749 | 3000 | 0.0235 | - | - | - | - | - | | 
2.8845 | 3010 | 0.0001 | - | - | - | - | - | | 2.8941 | 3020 | 0.0285 | - | - | - | - | - | | 2.9037 | 3030 | 0.0 | - | - | - | - | - | | 2.9133 | 3040 | 0.0002 | - | - | - | - | - | | 2.9229 | 3050 | 0.0 | - | - | - | - | - | | 2.9324 | 3060 | 0.0005 | - | - | - | - | - | | 2.9420 | 3070 | 0.0001 | - | - | - | - | - | | 2.9516 | 3080 | 0.0011 | - | - | - | - | - | | 2.9612 | 3090 | 0.0 | - | - | - | - | - | | 2.9708 | 3100 | 0.0001 | - | - | - | - | - | | 2.9804 | 3110 | 0.0046 | - | - | - | - | - | | 2.9899 | 3120 | 0.0001 | - | - | - | - | - | | **2.9995** | **3130** | **0.0005** | **0.1622** | **0.1647** | **0.1635** | **0.1564** | **0.1617** | | 3.0091 | 3140 | 0.0 | - | - | - | - | - | | 3.0187 | 3150 | 0.0 | - | - | - | - | - | | 3.0283 | 3160 | 0.0 | - | - | - | - | - | | 3.0379 | 3170 | 0.0002 | - | - | - | - | - | | 3.0474 | 3180 | 0.0004 | - | - | - | - | - | | 3.0570 | 3190 | 0.1022 | - | - | - | - | - | | 3.0666 | 3200 | 0.0012 | - | - | - | - | - | | 3.0762 | 3210 | 0.0001 | - | - | - | - | - | | 3.0858 | 3220 | 0.0677 | - | - | - | - | - | | 3.0954 | 3230 | 0.0 | - | - | - | - | - | | 3.1049 | 3240 | 0.0002 | - | - | - | - | - | | 3.1145 | 3250 | 0.0001 | - | - | - | - | - | | 3.1241 | 3260 | 0.0005 | - | - | - | - | - | | 3.1337 | 3270 | 0.0002 | - | - | - | - | - | | 3.1433 | 3280 | 0.0 | - | - | - | - | - | | 3.1529 | 3290 | 0.0021 | - | - | - | - | - | | 3.1624 | 3300 | 0.0001 | - | - | - | - | - | | 3.1720 | 3310 | 0.0077 | - | - | - | - | - | | 3.1816 | 3320 | 0.0001 | - | - | - | - | - | | 3.1912 | 3330 | 0.1324 | - | - | - | - | - | | 3.2008 | 3340 | 0.0 | - | - | - | - | - | | 3.2103 | 3350 | 0.1278 | - | - | - | - | - | | 3.2199 | 3360 | 0.0001 | - | - | - | - | - | | 3.2295 | 3370 | 0.0 | - | - | - | - | - | | 3.2391 | 3380 | 0.0001 | - | - | - | - | - | | 3.2487 | 3390 | 0.0001 | - | - | - | - | - | | 3.2583 | 3400 | 0.0 | - | - | - | - | - | | 3.2678 | 3410 | 0.0001 | - | - | - | - | - | | 3.2774 | 3420 | 0.0 | - | - | - | - | - | | 3.2870 | 3430 | 0.0001 | - | - | - | - | - | | 3.2966 | 3440 | 0.0001 | - | - | - | - | - | | 3.3062 | 3450 | 0.0001 | - | - | - | - | - | | 3.3158 | 3460 | 0.0263 | - | - | - | - | - | | 3.3253 | 3470 | 0.0001 | - | - | - | - | - | | 3.3349 | 3480 | 0.0002 | - | - | - | - | - | | 3.3445 | 3490 | 0.0003 | - | - | - | - | - | | 3.3541 | 3500 | 0.0 | - | - | - | - | - | | 3.3637 | 3510 | 0.0 | - | - | - | - | - | | 3.3733 | 3520 | 0.0 | - | - | - | - | - | | 3.3828 | 3530 | 0.0002 | - | - | - | - | - | | 3.3924 | 3540 | 0.0001 | - | - | - | - | - | | 3.4020 | 3550 | 0.0 | - | - | - | - | - | | 3.4116 | 3560 | 0.0001 | - | - | - | - | - | | 3.4212 | 3570 | 0.0001 | - | - | - | - | - | | 3.4308 | 3580 | 0.0122 | - | - | - | - | - | | 3.4403 | 3590 | 0.0 | - | - | - | - | - | | 3.4499 | 3600 | 0.0001 | - | - | - | - | - | | 3.4595 | 3610 | 0.0003 | - | - | - | - | - | | 3.4691 | 3620 | 0.0 | - | - | - | - | - | | 3.4787 | 3630 | 0.0 | - | - | - | - | - | | 3.4883 | 3640 | 0.0001 | - | - | - | - | - | | 3.4978 | 3650 | 0.0 | - | - | - | - | - | | 3.5074 | 3660 | 0.0002 | - | - | - | - | - | | 3.5170 | 3670 | 0.0004 | - | - | - | - | - | | 3.5266 | 3680 | 0.0003 | - | - | - | - | - | | 3.5362 | 3690 | 0.0004 | - | - | - | - | - | | 3.5458 | 3700 | 0.0 | - | - | - | - | - | | 3.5553 | 3710 | 0.0001 | - | - | - | - | - | | 3.5649 | 3720 | 0.0001 | - | - | - | - | - | | 3.5745 | 3730 | 0.0 | - | - | - | - | - | | 3.5841 | 3740 | 0.0001 | - | - | - | - | - | | 3.5937 | 3750 | 0.0003 | - | - | - | - | - | | 3.6033 | 3760 | 0.0 | - | - | - | - | 
- | | 3.6128 | 3770 | 0.0002 | - | - | - | - | - | | 3.6224 | 3780 | 0.0 | - | - | - | - | - | | 3.6320 | 3790 | 0.0 | - | - | - | - | - | | 3.6416 | 3800 | 0.0 | - | - | - | - | - | | 3.6512 | 3810 | 0.0 | - | - | - | - | - | | 3.6608 | 3820 | 0.0 | - | - | - | - | - | | 3.6703 | 3830 | 0.0 | - | - | - | - | - | | 3.6799 | 3840 | 0.0001 | - | - | - | - | - | | 3.6895 | 3850 | 0.0001 | - | - | - | - | - | | 3.6991 | 3860 | 0.0002 | - | - | - | - | - | | 3.7087 | 3870 | 0.0 | - | - | - | - | - | | 3.7183 | 3880 | 0.0001 | - | - | - | - | - | | 3.7278 | 3890 | 0.0002 | - | - | - | - | - | | 3.7374 | 3900 | 0.0001 | - | - | - | - | - | | 3.7470 | 3910 | 0.0003 | - | - | - | - | - | | 3.7566 | 3920 | 0.0003 | - | - | - | - | - | | 3.7662 | 3930 | 0.0021 | - | - | - | - | - | | 3.7758 | 3940 | 0.0002 | - | - | - | - | - | | 3.7853 | 3950 | 0.0001 | - | - | - | - | - | | 3.7949 | 3960 | 0.0001 | - | - | - | - | - | | 3.8045 | 3970 | 0.0001 | - | - | - | - | - | | 3.8141 | 3980 | 0.0002 | - | - | - | - | - | | 3.8237 | 3990 | 0.0001 | - | - | - | - | - | | 3.8333 | 4000 | 0.0001 | - | - | - | - | - | | 3.8428 | 4010 | 0.0001 | - | - | - | - | - | | 3.8524 | 4020 | 0.0001 | - | - | - | - | - | | 3.8620 | 4030 | 0.0 | - | - | - | - | - | | 3.8716 | 4040 | 0.0003 | - | - | - | - | - | | 3.8812 | 4050 | 0.0 | - | - | - | - | - | | 3.8908 | 4060 | 0.002 | - | - | - | - | - | | 3.9003 | 4070 | 0.0 | - | - | - | - | - | | 3.9099 | 4080 | 0.0 | - | - | - | - | - | | 3.9195 | 4090 | 0.0001 | - | - | - | - | - | | 3.9291 | 4100 | 0.0 | - | - | - | - | - | | 3.9387 | 4110 | 0.0 | - | - | - | - | - | | 3.9483 | 4120 | 0.0 | - | - | - | - | - | | 3.9578 | 4130 | 0.0 | - | - | - | - | - | | 3.9674 | 4140 | 0.0 | - | - | - | - | - | | 3.9770 | 4150 | 0.0 | - | - | - | - | - | | 3.9866 | 4160 | 0.0004 | - | - | - | - | - | | 3.9962 | 4170 | 0.0 | - | - | - | - | - | | 3.9981 | 4172 | - | 0.1592 | 0.1658 | 0.1660 | 0.1580 | 0.1671 | * The bold row denotes the saved checkpoint. 
</details> ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.0.1 - Transformers: 4.42.4 - PyTorch: 2.3.1+cu121 - Accelerate: 0.34.0.dev0 - Datasets: 2.21.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MatryoshkaLoss ```bibtex @misc{kusupati2024matryoshka, title={Matryoshka Representation Learning}, author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi}, year={2024}, eprint={2205.13147}, archivePrefix={arXiv}, primaryClass={cs.LG} } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
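As a supplement to the Training Details above, the following minimal sketch shows how the `MatryoshkaLoss` configuration listed in this card could be reconstructed with the `sentence-transformers` v3 API. The variable names are illustrative and the data loading is omitted, so treat this as a sketch under those assumptions rather than the exact training script:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

# Start from the same base model this card was fine-tuned from.
model = SentenceTransformer("BAAI/bge-m3")

# Wrap the ranking loss so it is applied at every truncated dimension,
# mirroring the matryoshka_dims / matryoshka_weights configuration above.
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)
# `loss` can then be passed to a SentenceTransformerTrainer together with
# a dataset of (positive, anchor) pairs like the one described above.
```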
[ "TEXT_CLASSIFICATION" ]
[ "CAS" ]
Non_BioNLP
BAAI/bge-large-zh
BAAI
feature-extraction
[ "transformers", "pytorch", "safetensors", "bert", "feature-extraction", "zh", "arxiv:2310.07554", "arxiv:2309.07597", "license:mit", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,690
1,697
21,308
320
---
language:
- zh
license: mit
---

**We recommend switching to the newest [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5), which has a more reasonable similarity distribution and the same method of usage.**

<h1 align="center">FlagEmbedding</h1>

<h4 align="center">
    <p>
        <a href=#model-list>Model List</a> |
        <a href=#frequently-asked-questions>FAQ</a> |
        <a href=#usage>Usage</a> |
        <a href="#evaluation">Evaluation</a> |
        <a href="#train">Train</a> |
        <a href="#contact">Contact</a> |
        <a href="#citation">Citation</a> |
        <a href="#license">License</a>
    <p>
</h4>

For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, or semantic search. It can also be used in vector databases for LLMs.

************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
    - **New reranker models**: release the cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using or fine-tuning them to re-rank the top-k documents returned by embedding models.
    - **Updated embedding models**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance their retrieval ability without instruction.

<details>
<summary>More</summary>
<!-- ### More -->

- 09/07/2023: Update the [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding an instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models with the **best performance among models of the same size 🤗**
- 08/02/2023: Release the `bge-large-*` (short for BAAI General Embedding) models, which **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.

</details>

## Model List

`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |

[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed; just use the original query directly. In all cases, **no instruction** needs to be added to passages.

[2\]: Different from the embedding model, the reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by simpler models. For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results.

All models have been uploaded to the Hugging Face Hub, and you can see them at https://huggingface.co/BAAI. If you cannot open the Hugging Face Hub, you can also download the models at https://model.baai.ac.cn/models .

## Frequently asked questions

<details>
<summary>1. How to fine-tune the bge embedding model?</summary>

<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.

</details>

<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>

<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**

Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar.

For downstream tasks, such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold, please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
<summary>3. When does the query instruction need to be used</summary>

<!-- ### When does the query instruction need to be used -->

For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used. Using no instruction causes only a slight degradation in retrieval performance compared with using an instruction, so for convenience you can generate embeddings without an instruction in all cases.

For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions to these short queries.
**The best method to decide whether to add instructions for queries is choosing the setting that achieves better performance on your task.**
In all cases, the documents/passages do not need the instruction.

</details>

## Usage

### Usage for Embedding Model

Here are some examples for using `bge` models with [FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If it doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more ways to install FlagEmbedding.

```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# for an s2p (short query to long passage) retrieval task, we suggest using encode_queries(), which will automatically add the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since it doesn't need the instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).

By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
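Relatedly, if you do need the threshold-based filtering discussed in FAQ 2 above, here is a minimal sketch built on the FlagEmbedding example; the 0.85 threshold is purely illustrative and should be tuned on the similarity distribution of your own data:

```python
from FlagEmbedding import FlagModel

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5', use_fp16=True)

# Embeddings are normalized by default, so the inner product is the cosine similarity.
similarity = model.encode(sentences_1) @ model.encode(sentences_2).T

threshold = 0.85  # illustrative value; tune it on your own data
for i, row in enumerate(similarity):
    for j, score in enumerate(row):
        if score >= threshold:
            print(f"similar pair ({i}, {j}) with score {score:.4f}")
```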
#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For the s2p (short query to long passage) retrieval task, each short query should start with an instruction (see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for instructions). But the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain

You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model, then select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.

```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for the s2p (short query to long passage) retrieval task, add an instruction to each query (do not add an instruction to passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
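
Because the reranker outputs an unbounded logit, a common convention (not something the model itself defines) is to squash the score through a logistic sigmoid when a value in (0, 1) is needed; the relative order of scores is unchanged. A minimal sketch:

```python
import math

def to_unit_interval(logit: float) -> float:
    """Map an unbounded reranker logit to (0, 1) with the logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-logit))

print(to_unit_interval(2.3))   # ~0.909
print(to_unit_interval(-1.0))  # ~0.269
```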
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using HuggingFace Transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```

## Evaluation

`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).

- **MTEB**:

| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-small-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |

- **C-MTEB**:
We created the benchmark C-MTEB for Chinese text embedding, which consists of 31 datasets from 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.

| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |

- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for
the evaluation script.

| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |

\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks.

## Train

### BAAI Embedding

We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale paired data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text, so the pre-trained model cannot be used for similarity calculation directly; it needs to be fine-tuned.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

A cross-encoder performs full attention over the input pair, which is more accurate than an embedding model (i.e., bi-encoder) but more time-consuming. Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual paired data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).

## Contact

If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao([email protected]) and Zheng Liu([email protected]).

## Citation

If you find this repository useful, please consider giving a star :star: and a citation

```
@misc{bge_embedding,
      title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
      author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
      year={2023},
      eprint={2309.07597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
[ "SEMANTIC_SIMILARITY", "SUMMARIZATION" ]
[ "BEAR" ]
Non_BioNLP
RomainDarous/large_directTwoEpoch_meanPooling_mistranslationModel
RomainDarous
sentence-similarity
[ "sentence-transformers", "safetensors", "xlm-roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:4460010", "loss:CoSENTLoss", "dataset:RomainDarous/corrupted_os_by_language", "arxiv:1908.10084", "base_model:RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel", "base_model:finetune:RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,740
1,740
22
0
--- base_model: RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel datasets: - RomainDarous/corrupted_os_by_language library_name: sentence-transformers metrics: - pearson_cosine - spearman_cosine pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:4460010 - loss:CoSENTLoss widget: - source_sentence: Malformed target specific variable definition sentences: - Hedefe özgü değişken tanımı bozuk - Kan alle data in die gids lees - "слава Украине! героям слава!\uFEFF" - source_sentence: Can't write an inode bitmap sentences: - Skontrolujte stav aktualizácií alebo to skúste znova neskôr. - Malsukcesis skribi i nodan bitmapon - Zastępuje wersję GL obsługiwaną przez sterownik - source_sentence: Optimize soft proofing color transformations sentences: - 'arkadaslar biz artik her an kirmizi kart yiyecek,bencil,pas yapamayan,isabetsiz orta yapani istemiyoruz. sozde efsaneniz bu sezon Besiktasa en cok zarar verenlerden biriydi. kendini dusunmeden once Besiktasi dusunecek adam lazim bize. o yuzden #GoHomeQuaresma' - Yav bizim dedikodusunu yaptığımız insanın bile bi vizyonu var. Senin hakkında neden oturup konuşalım? - Ik ben een transgender. - source_sentence: 'Pass 1: Checking @is, @bs, and sizes' sentences: - Bu adam cidden kurabiye gibi ben bunu çayın yanında yerim - sagnat. errada. invisible. justificació. idioma - Wilt u echt de primaire sleutel verplaatsen? (j N) - source_sentence: Search for matching log entries sentences: - quem te lembra? caralho tô assustada aqui kkkkk - sendotasunik gabeko\ egoera bistaratuko den ala ez adierazten du - En aquest cas, hem d'incloure les imatges del contenidor )sr iov per a càrregues de treball de telco (per exemple, com a referència, es podrien obtenir des de valors de helm chart) model-index: - name: SentenceTransformer based on RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel results: - task: type: semantic-similarity name: Semantic Similarity dataset: name: sts eval type: sts-eval metrics: - type: pearson_cosine value: 0.9795611378598187 name: Pearson Cosine - type: spearman_cosine value: 0.8656183557127043 name: Spearman Cosine - task: type: semantic-similarity name: Semantic Similarity dataset: name: sts test type: sts-test metrics: - type: pearson_cosine value: 0.9796319177718953 name: Pearson Cosine - type: spearman_cosine value: 0.8656754104676266 name: Spearman Cosine --- # SentenceTransformer based on RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel](https://huggingface.co/RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel) on the [corrupted_open_os_by_language](https://huggingface.co/datasets/RomainDarous/corrupted_os_by_language) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
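
As a quick illustration of one of the uses listed above, the sketch below runs paraphrase mining with the `util.paraphrase_mining` helper from sentence-transformers. The sentence list is a placeholder built from the widget examples above, and the second sentence is a made-up paraphrase added for the demonstration.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("RomainDarous/large_directTwoEpoch_meanPooling_mistranslationModel")

# Placeholder corpus; paraphrase_mining scores every sentence pair by cosine similarity
sentences = [
    "Search for matching log entries",
    "Search for log entries that match",  # hypothetical paraphrase for illustration
    "Can't write an inode bitmap",
]
for score, i, j in util.paraphrase_mining(model, sentences):
    print(f"{score:.3f}  {sentences[i]}  <->  {sentences[j]}")
```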
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel](https://huggingface.co/RomainDarous/large_directOneEpoch_meanPooling_mistranslationModel) <!-- at revision ce7addcfaf672d0a5ad38a5fdd89a785c1a46440 --> - **Maximum Sequence Length:** 128 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity - **Training Dataset:** - [corrupted_open_os_by_language](https://huggingface.co/datasets/RomainDarous/corrupted_os_by_language) <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: XLMRobertaModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("RomainDarous/large_directTwoEpoch_meanPooling_mistranslationModel") # Run inference sentences = [ 'Search for matching log entries', 'quem te lembra? caralho tô assustada aqui kkkkk', 'sendotasunik gabeko\\ egoera bistaratuko den ala ez adierazten du', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Semantic Similarity * Datasets: `sts-eval` and `sts-test` * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) | Metric | sts-eval | sts-test | |:--------------------|:-----------|:-----------| | pearson_cosine | 0.9796 | 0.9796 | | **spearman_cosine** | **0.8656** | **0.8657** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### corrupted_open_os_by_language * Dataset: [corrupted_open_os_by_language](https://huggingface.co/datasets/RomainDarous/corrupted_os_by_language) at [9d25780](https://huggingface.co/datasets/RomainDarous/corrupted_os_by_language/tree/9d25780e2032b1e8f06af6a4ff55124d7a930c3c) * Size: 4,460,010 training samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 6 tokens</li><li>mean: 18.33 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 26.47 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>0: ~50.60%</li><li>1: ~49.40%</li></ul> | * Samples: | sentence1 | sentence2 | score | |:--------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------|:---------------| | <code>Check spelling. Print the document. Show completion window. General. Show help</code> | <code>Kontrolli õigekirja. присоединяюсь. </code> | <code>0</code> | | <code>EXIF not supported for this file format.</code> | <code>Šiam failo formatui EXIF nepalaikomas.</code> | <code>1</code> | | <code>This package includes the documentation for texlive everyhook</code> | <code>Paket ini menyertakan dokumentasi untuk texlive everyhook</code> | <code>1</code> | * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` ### Evaluation Dataset #### corrupted_open_os_by_language * Dataset: [corrupted_open_os_by_language](https://huggingface.co/datasets/RomainDarous/corrupted_os_by_language) at [9d25780](https://huggingface.co/datasets/RomainDarous/corrupted_os_by_language/tree/9d25780e2032b1e8f06af6a4ff55124d7a930c3c) * Size: 4,460,010 evaluation samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 5 tokens</li><li>mean: 17.71 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 26.95 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>0: ~50.60%</li><li>1: ~49.40%</li></ul> | * Samples: | sentence1 | sentence2 | score | 
|:----------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | <code>Could not identify the current seat.</code> | <code> 天天花着男人的钱还这这创造新词汇男权你可真牛批,你也就这一出了一问男权,就说是我是吧,到现在我也没听到你给我们讲的男权,你也就是在网上喷喷,现实走道都不敢探头自卑,你现实要把你女权的劲拿出来总低啥头,您老应该去国家教育局把男权加上是吧,你们女权天天说自己生活不好没地位,给你们地位了你们能干啥?用你们的女权打到全世界男性是吧,能相出男权这一词您老也是人才呀,是不是庆幸自己是个女的,活在自己想想的世界里不觉得孤单吗,假象有男权是吧,自己假象和男权还说自己不是田园女权,田园女权能连自己都骂说自己妈是驴爸是大鼎的也是奇葩呀,那我们国家大肆宣扬过你们这么田园女权吗,国家要的是女性人群自主自理,你们可好看看你们女权干的啥事,给你们女权地位高了,看看你们女权干的事n绿地集团高管怎么都不说呀,人家可是有钱有地位,也不是我们说三从四德洗衣做饭你们女权会吗?,那我问问你们女权干过啥惊天大事,还甩锅给孔子,还封建社会,那我问问你们女权在福利面前为啥说自己是女性呀不是社会主义社会吗不应该男女平等吗,天天自己也不知道是不是抱个手机天天欧巴欧巴,你家那位要是不陪你看一会就会问你是不是不爱我了是吧大姐,您老也就赚这白菜钱操心国家事,中国五千年的历史被您老一句否决,还嘲讽人家日本女性,好意思说自己不是女权,三从四德流传这么久到您这变成日本文化了,我就想问问男权您老是怎么想的,那你问孔子老人家呗为什么女人要三从四德,我说的是女权你干嘛自己对号入座,连中华人民传承的东西都不认跟我这谈男权,还男权您老给我举个例子呗,让我们男权听听都是h啥,这些不都是你们女权的标准吗?,还男权,您老醒醒吧这里是现实,不是你的公主世界,总觉得自己多么多么重要,地球没你是不能转了还是人类要灭亡呀,我真的想问一句你给我找一条男权的新闻,咋了我们男人不能提女权呗你老授权了呗,那我们谈论田园女权你老对号入座干嘛,天天过节要礼物,还嫌弃自己男朋友没有钱,我寻思你找个有钱人包养你呗,对了有钱人怎么可能看上你这种女权的呢,还要孩子跟女方姓我也没看见你没跟你妈姓呀,年年过节男人给你们送礼物你们女人给男人送过礼物吗?,一问我不是陪着他吗我对他说我爱你了这不是最好的礼物吗?,男人只要不送礼物就是不爱你们了呗,人家国际女权讲的男人能做的我们女人也能做,田园女权男人能做的我们女人为啥要做,还男权我笑了,以前结婚几头牛换个衣服原装的,现在几十万彩...</code> | <code>0</code> | | <code>Undoing Date and Time Adjustment</code> | <code>正在取消日期和时间调整</code> | <code>1</code> | | <code>Dependency package for gsl_2_6 gnu hpc</code> | <code>Pacotes de desenvolvimento do KDE</code> | <code>1</code> | * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 64 - `per_device_eval_batch_size`: 64 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 64 - `per_device_eval_batch_size`: 64 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - 
`log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: batch_sampler - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | corrupted open os by language loss | sts-eval_spearman_cosine | sts-test_spearman_cosine | |:-----:|:-----:|:-------------:|:----------------------------------:|:------------------------:|:------------------------:| | 1.0 | 55751 | 0.2403 | 0.2550 | 0.8656 | - | | -1 | -1 | - | - | - | 0.8657 | ### Framework Versions - Python: 3.10.13 - Sentence Transformers: 3.4.1 - Transformers: 4.48.2 - PyTorch: 2.1.2+cu121 - Accelerate: 1.3.0 - Datasets: 2.16.1 - Tokenizers: 0.21.0 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = 
"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### CoSENTLoss ```bibtex @online{kexuefm-8847, title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT}, author={Su Jianlin}, year={2022}, month={Jan}, url={https://kexue.fm/archives/8847}, } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
[ "TEXT_CLASSIFICATION", "SEMANTIC_SIMILARITY", "TRANSLATION" ]
[ "CAS" ]
Non_BioNLP
BAAI/bge-large-zh-noinstruct
BAAI
feature-extraction
[ "transformers", "pytorch", "bert", "feature-extraction", "zh", "arxiv:2310.07554", "arxiv:2309.07597", "license:mit", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,690
1,697
69
11
---
language:
- zh
license: mit
---

**We recommend switching to the newest [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5), which has a more reasonable similarity distribution and the same method of usage.**

<h1 align="center">FlagEmbedding</h1>

<h4 align="center">
    <p>
        <a href=#model-list>Model List</a> |
        <a href=#frequently-asked-questions>FAQ</a> |
        <a href=#usage>Usage</a> |
        <a href="#evaluation">Evaluation</a> |
        <a href="#train">Train</a> |
        <a href="#contact">Contact</a> |
        <a href="#citation">Citation</a> |
        <a href="#license">License</a>
    <p>
</h4>

For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, or semantic search. It can also be used in vector databases for LLMs.

************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
    - **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
    - **Updated embedding model**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance retrieval ability without instruction.

<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding an instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*` (short for BAAI General Embedding) models, **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.

</details>

## Model List

`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] | |:-------------------------------|:--------:| :--------:| :--------:|:--------:| | [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) | | [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | | | [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | | | [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to 
`bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |

[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed: just use the original query directly. In all cases, **no instruction** needs to be added to passages.

[2\]: Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by other, simpler models. For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results.

All models have been uploaded to the Huggingface Hub, and you can see them at https://huggingface.co/BAAI. If you cannot open the Huggingface Hub, you can also download the models at https://model.baai.ac.cn/models .

## Frequently asked questions

<details>
<summary>1. How to fine-tune bge embedding model?</summary>

<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank top-k results. Hard negatives are also needed to fine-tune the reranker.

</details>

<details>
<summary>2. 
The similarity score between two dissimilar sentences is higher than 0.5</summary>

<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**

Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar.

For downstream tasks, such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not the absolute value.** If you need to filter similar sentences based on a similarity threshold, please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
<summary>3. When does the query instruction need to be used</summary>

<!-- ### When does the query instruction need to be used -->

For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used. Omitting the instruction causes only a slight degradation in retrieval performance compared with using it. So, for convenience, you can generate embeddings without an instruction in all cases.

For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions for these short queries. **The best way to decide whether to add instructions for queries is to choose the setting that achieves better performance on your task.** In all cases, no instruction needs to be added to the documents/passages.

</details>

## Usage

### Usage for Embedding Model

Here are some examples for using `bge` models with [FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If it doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more ways to install FlagEmbedding.

```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# for the s2p (short query to long passage) retrieval task, we suggest using encode_queries(), which automatically adds the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since passages don't need the instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).

By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
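
Footnote [2] in the model list describes pairing an embedding model with the bge reranker. The sketch below wires the two stages together with the same FlagEmbedding classes used in this README; the query, corpus, and cutoff values are illustrative only:

```python
from FlagEmbedding import FlagModel, FlagReranker

retriever = FlagModel('BAAI/bge-large-zh-v1.5',
                      query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                      use_fp16=True)
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)

query = "样例查询"                                   # illustrative query
corpus = ["样例文档-1", "样例文档-2", "样例文档-3"]   # illustrative corpus

# Stage 1: dense retrieval keeps the top-2 passages by inner product
q_emb = retriever.encode_queries([query])
p_emb = retriever.encode(corpus)
dense_scores = (q_emb @ p_emb.T)[0]
candidates = dense_scores.argsort()[::-1][:2]

# Stage 2: the cross-encoder re-ranks only the retrieved candidates
rerank_scores = reranker.compute_score([[query, corpus[i]] for i in candidates])
best = candidates[max(range(len(candidates)), key=lambda k: rerank_scores[k])]
print(corpus[best])
```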
#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For the s2p (short query to long passage) retrieval task, each short query should start with an instruction (see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for instructions). But the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain

You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model, then select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.

```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for the s2p (short query to long passage) retrieval task, add an instruction to each query (do not add an instruction to passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using HuggingFace Transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```

## Evaluation

`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).

- **MTEB**:

| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-small-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |

- **C-MTEB**:
We created the benchmark C-MTEB for Chinese text embedding, which consists of 31 datasets from 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.

| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |

- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for
the evaluation script.

| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |

\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks.

## Train

### BAAI Embedding

We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale paired data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text, so the pre-trained model cannot be used for similarity calculation directly; it needs to be fine-tuned.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

A cross-encoder performs full attention over the input pair, which is more accurate than an embedding model (i.e., bi-encoder) but more time-consuming. Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual paired data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).

## Contact

If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao([email protected]) and Zheng Liu([email protected]).

## Citation

If you find this repository useful, please consider giving a star :star: and a citation

```
@misc{bge_embedding,
      title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
      author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
      year={2023},
      eprint={2309.07597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
[ "SEMANTIC_SIMILARITY", "SUMMARIZATION" ]
[ "BEAR" ]
Non_BioNLP
udrearobert999/multi-qa-mpnet-base-cos-v1-test
udrearobert999
text-classification
[ "setfit", "safetensors", "mpnet", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:sentence-transformers/multi-qa-mpnet-base-cos-v1", "base_model:finetune:sentence-transformers/multi-qa-mpnet-base-cos-v1", "model-index", "region:us" ]
1,714
1,714
5
2
--- base_model: sentence-transformers/multi-qa-mpnet-base-cos-v1 library_name: setfit metrics: - accuracy pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification - generated_from_setfit_trainer widget: - text: in durankulak near varna is another important example other signs of early metals are found from the third millennium bc in palmela portugal los millares spain and stonehenge united kingdom the precise beginnings however have not be clearly ascertained and new discoveries are both continuous and ongoing in tamilnadu in approximately 1900 bc ancient iron smelting sites were functioning in tamil nadu in the near east about 3500 bc it was discovered that by combining copper and tin a superior metal could be made an alloy called bronze this represented a major technological shift known as the bronze age the extraction of iron from its ore into a workable metal is much more difficult than for copper or tin the process appears to have been invented by the hittites in about 1200 bc beginning the iron age the secret of extracting and working iron was a key factor in the success of the philistineshistorical developments in ferrous metallurgy can be found in a wide variety of past cultures and civilizations this includes the ancient and medieval kingdoms and empires of the middle east and near east ancient iran ancient egypt ancient nubia and anatolia in presentday turkey ancient nok carthage the greeks and romans of ancient europe medieval europe ancient and medieval china ancient and medieval india ancient and medieval japan amongst others many applications practices and devices associated or involved in metallurgy were established in ancient china such as the innovation of the blast furnace cast iron hydraulicpowered trip hammers and double acting piston bellowsa 16th century book by georg agricola de re metallica describes the highly developed and complex processes of mining metal ores metal extraction and metallurgy of the time agricola has been described as the father of metallurgy extractive metallurgy is the practice of removing valuable metals from an ore and refining the extracted raw metals into a purer form in order to convert a metal oxide or sulphide to a purer metal the ore must be reduced physically chemically or electrolytically extractive metallurgists are interested in three primary streams feed concentrate metal oxidesulphide and tailings waste after mining large pieces of the ore feed are broken through crushing or grinding in order to obtain particles small enough where each particle is either mostly valuable or mostly waste concentrating the particles of value in a form supporting separation enables the desired metal to be removed from waste products mining may not be necessary if the ore body and physical environment are conducive to leaching leaching dissolves minerals in an ore body and results in an enriched solution the solution is collected and processed to extract valuable metals ore - text: '##rch procedure that evaluates the objective function p x displaystyle pmathbf x on a grid of candidate source locations g displaystyle mathcal g to estimate the spatial location of the sound source x s displaystyle textbf xs as the point of the grid that provides the maximum srp modifications of the classical srpphat algorithm have been proposed to reduce the computational cost of the gridsearch step of the algorithm and to increase the robustness of the method in the classical srpphat for each microphone pair and for each point of the 
grid a unique integer tdoa value is selected to be the acoustic delay corresponding to that grid point this procedure does not guarantee that all tdoas are associated to points on the grid nor that the spatial grid is consistent since some of the points may not correspond to an intersection of hyperboloids this issue becomes more problematic with coarse grids since when the number of points is reduced part of the tdoa information gets lost because most delays are not anymore associated to any point in the grid the modified srpphat collects and uses the tdoa information related to the volume surrounding each spatial point of the search grid by considering a modified objective function where l m 1 m 2 l x displaystyle lm1m2lmathbf x and l m 1 m 2 u x displaystyle lm1m2umathbf x are the lower and upper accumulation limits of gcc delays which depend on the spatial location x displaystyle mathbf x the accumulation limits can be calculated beforehand in an exact way by exploring the boundaries separating the regions corresponding to the points of the grid alternatively they can be selected by considering the spatial gradient of the tdoa ∇ τ m 1 m 2 x ∇ x τ m 1 m 2 x ∇ y τ m 1 m 2 x ∇ z τ m 1 m 2 x t displaystyle nabla tau m1m2mathbf x nabla xtau m1m2mathbf x nabla ytau m1m2mathbf x nabla ztau m1m2mathbf x t where each component γ ∈ x y z displaystyle gamma in leftxyzright of the gradient is for a rectangular grid where neighboring points are separated a distance r displaystyle r the lower and upper accumulation limits are given by where d r 2 min 1 sin θ cos [UNK] 1 sin θ sin [UNK] 1 cos θ displaystyle dr2min leftfrac 1vert sintheta cosphi vert frac 1vert sintheta sinphi vert frac 1vert' - text: authority to select projects and mandated new metropolitan planning initiatives for the first time state transportation officials were required to consult seriously with local representatives on mpo governing boards regarding matters of project prioritization and decisionmaking these changes had their roots in the need to address increasingly difficult transportation problems — in particular the more complicated patterns of traffic congestion that arose with the suburban development boom in the previous decades many recognized that the problems could only be addressed effectively through a stronger federal commitment to regional planning the legislation that emerged the intermodal surface transportation efficiency act istea was signed into federal law by president george h w bush in december 1991 it focused on improving transportation not as an end in itself but as the means to achieve important national goals including economic progress cleaner air energy conservation and social equity istea promoted a transportation system in which different modes and facilities — highway transit pedestrian bicycle aviation and marine — were integrated to allow a seamless movement of both goods and people new funding programs provided greater flexibility in the use of funds particularly regarding using previously restricted highway funds for transit development improved intermodal connections and emphasized upgrades to existing facilities over building new capacity — particularly roadway capacity to accomplish more serious metropolitan planning istea doubled federal funding for mpo operations and required the agencies to evaluate a variety of multimodal solutions to roadway congestion and other transportation problems mpos also were required to broaden public participation in the planning process and to see that 
investment decisions contributed to meeting the air quality standards of the clean air act amendments in addition istea placed a new requirement on mpos to conduct fiscally constrained planning and ensure that longrange transportation plans and shortterm transportation improvement programs were fiscally constrained in other words adopted plans and programs can not include more projects than reasonably can be expected to be funded through existing or projected sources of revenues this new requirement represented a major conceptual shift for many mpos and others in the planning community since the imposition of fiscal discipline on plans now required not only understanding how much money might be available but how to prioritize investment needs and make difficult choices among competing needs adding to this complexity is the need to plan across transportation modes and develop approaches for multimodal investment prioritization and decision making it is in this context of greater prominence funding and requirements that mpos function today an annual element is composed of transportation improvement projects contained in an areas transportation improvement program tip which is proposed for implementation during the current year the annual element is submitted to the us department of transportation as part of the required planning process the passage of safe accountable flexible efficient transportation equity act a legacy for users safetealu - text: '##pignygiroux served as an assistant professor from 1997 2003 associate professor from 2003 2014 chair of the department of geography from 2015 2018 and professor beginning in 2014 with secondary appointments in department of geology the college of education social services and rubenstein school of environment natural resources she teaches courses in meteorology climatology physical geography remote sensing and landsurface processes in her work as state climatologist for vermont dupignygiroux uses her expertise hydrology and extreme weather such as floods droughts and storms to keep the residents of vermont informed on how climate change will affect their homes health and livelihoods she assists other state agencies in preparing for and adapting to current and future impacts of climate change on vermonts transportation system emergency management planning and agriculture and forestry industries for example she has published analyses of the impacts of climate change on the health of vermonts sugar maples a hardwood species of key economic and cultural importance to the state as cochair of vermonts state ’ s drought task force she played a key role in developing the 2018 vermont state hazard mitigation plandupignygiroux served as secretary for the american association of state climatologists from 20102011 and president elect from 20192020 in june 2020 she was elected as president of the american association of state climatologists which is a twoyear term in addition to her research on climate change dupignygiroux is known for her efforts to research and promote climate literacy climate literacy is an understanding of the influences of and influences on the climate system including how people change the climate how climate metrics are observed and modelled and how climate change affects society “ being climate literate is more critical than ever before ” lesleyann dupignygiroux stated for a 2020 article on climate literacy “ if we do not understand weather climate and climate change as intricate and interconnected systems then our appreciation of 
the big picture is lost ” dupignygiroux is known for her climate literacy work with elementary and high school teachers and students she cofounded the satellites weather and climate swac project in 2008 which is a professional development program for k12 teachers designed to promote climate literacy and interest in the stem science technology engineering and mathematics careers dupignygiroux is also a founding member of the climate literacy and energy awareness network clean formerly climate literacy network a communitybased effort to support climate literacy and communication in a 2016 interview dupignygiroux stated “ sharing knowledge and giving back to my community are my two axioms in life watching students mature and flourish in' - text: no solutions to x n y n z n displaystyle xnynzn for all n ≥ 3 displaystyle ngeq 3 this claim appears in his annotations in the margins of his copy of diophantus euler the interest of leonhard euler 1707 – 1783 in number theory was first spurred in 1729 when a friend of his the amateur goldbach pointed him towards some of fermats work on the subject this has been called the rebirth of modern number theory after fermats relative lack of success in getting his contemporaries attention for the subject eulers work on number theory includes the following proofs for fermats statements this includes fermats little theorem generalised by euler to nonprime moduli the fact that p x 2 y 2 displaystyle px2y2 if and only if p ≡ 1 mod 4 displaystyle pequiv 1bmod 4 initial work towards a proof that every integer is the sum of four squares the first complete proof is by josephlouis lagrange 1770 soon improved by euler himself the lack of nonzero integer solutions to x 4 y 4 z 2 displaystyle x4y4z2 implying the case n4 of fermats last theorem the case n3 of which euler also proved by a related method pells equation first misnamed by euler he wrote on the link between continued fractions and pells equation first steps towards analytic number theory in his work of sums of four squares partitions pentagonal numbers and the distribution of prime numbers euler pioneered the use of what can be seen as analysis in particular infinite series in number theory since he lived before the development of complex analysis most of his work is restricted to the formal manipulation of power series he did however do some very notable though not fully rigorous early work on what would later be called the riemann zeta function quadratic forms following fermats lead euler did further research on the question of which primes can be expressed in the form x 2 n y 2 displaystyle x2ny2 some of it prefiguring quadratic reciprocity diophantine equations euler worked on some diophantine equations of genus 0 and 1 in particular he studied diophantuss work he tried to systematise it but the time was not yet ripe for such an endeavour — algebraic geometry was still in its infancy he did notice there was a connection between diophantine problems and elliptic integrals whose study he had himself initiated lagrange legendre and gauss josephlouis inference: true model-index: - name: SetFit with sentence-transformers/multi-qa-mpnet-base-cos-v1 results: - task: type: text-classification name: Text Classification dataset: name: Unknown type: unknown split: test metrics: - type: accuracy value: 0.6908674054260604 name: Accuracy --- # SetFit with sentence-transformers/multi-qa-mpnet-base-cos-v1 This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. 
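As a quick orientation before the details below, here is a minimal inference sketch, assuming the `setfit` package is installed; the input text is a fragment of one of the widget examples above.

```python
from setfit import SetFitModel

# Load the fine-tuned SetFit model from the Hugging Face Hub.
model = SetFitModel.from_pretrained("udrearobert999/multi-qa-mpnet-base-cos-v1-test")

# Predict one class id per input text; the ids correspond to the
# labels listed in the Model Labels table below.
preds = model.predict([
    "extractive metallurgy is the practice of removing valuable metals from an ore",
])
print(preds)
```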
This SetFit model uses [sentence-transformers/multi-qa-mpnet-base-cos-v1](https://huggingface.co/sentence-transformers/multi-qa-mpnet-base-cos-v1) as the Sentence Transformer embedding model. A [SetFitHead](https://huggingface.co/docs/setfit/reference/main#setfit.SetFitHead) instance is used for classification. The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.

## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/multi-qa-mpnet-base-cos-v1](https://huggingface.co/sentence-transformers/multi-qa-mpnet-base-cos-v1)
- **Classification head:** a [SetFitHead](https://huggingface.co/docs/setfit/reference/main#setfit.SetFitHead) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 43 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------|:---------|
| 20 | <ul><li>'##les approach which combined geography history and the sociological approaches of the annee sociologique many members of which were their colleagues at strasbourg to produce an approach which rejected the predominant emphasis on politics diplomacy and war of many 19th and early 20thcentury historians as spearheaded by historians whom febvre called les sorbonnistes instead they pioneered an approach to a study of longterm historical structures la longue duree over events and political transformations geography material culture and what later annalistes called mentalites or the psychology of the epoch are also characteristic areas of study the
goal of the annales was to undo the work of the sorbonnistes to turn french historians away from the narrowly political and diplomatic toward the new vistas in social and economic historycofounder marc bloch 1886 – 1944 was a quintessential modernist who studied at the elite ecole normale superieure and in germany serving as a professor at the university of strasbourg until he was called to the sorbonne in paris in 1936 as professor of economic history blochs interests were highly interdisciplinary influenced by the geography of paul vidal de la blache 1845 – 1918 and the sociology of emile durkheim 1858 – 1917 his own ideas especially those expressed in his masterworks french rural history les caracteres originaux de lhistoire rurale francaise 1931 and feudal society were incorporated by the secondgeneration annalistes led by fernand braudel georges duby a leader of the school wrote that the history he taught relegated the sensational to the sidelines and was reluctant to give a simple accounting of events but strove on the contrary to pose and solve problems and neglecting surface disturbances to observe the long and mediumterm evolution of economy society and civilisationthe annalistes especially lucien febvre advocated a histoire totale or histoire tout court a complete study of a historic problem bloch was shot by the gestapo during the german occupation of france in world war ii for his active membership of the french resistance and febvre carried on the annales approach in the 1940s and 1950s it was during this time that he mentored braudel who would become one of the bestknown exponents of this school braudels work came to define a second era of annales historiography and was very influential throughout the 1960s and 1970s especially for his work on the mediterranean region in the era of philip ii of spain braudel developed the idea often associated with annalistes of different modes of historical time lhistoire quasi immobile the quasi motionless history of historical'</li><li>'is important because the persuasiveness of a source usually depends upon its history primary sources may include cases constitutions statutes administrative regulations and other sources of binding legal authority while secondary legal sources may include books the headnotes of case reports articles and encyclopedias legal writers usually prefer to cite primary sources because only primary sources are authoritative and precedential while secondary sources are only persuasive at best family history a secondary source is a record or statement of an event or circumstance made by a noneyewitness or by someone not closely connected with the event or circumstances recorded or stated verbally either at or sometime after the event or by an eyewitness at a time after the event when the fallibility of memory is an important factor consequently according to this definition a firsthand account written long after the event when the fallibility of memory is an important factor is a secondary source even though it may be the first published description of that event autobiographies an autobiography can be a secondary source in history or the humanities when used for information about topics other than its subject for example many firsthand accounts of events in world war i written in the postwar years were influenced by the then prevailing perception of the war which was significantly different from contemporary opinion original research jules r benjamin a students guide to history 2013 isbn 9781457621444 edward h carr 
what is history basingstoke palgrave 2001 isbn 9780333977019 wood gray historians handbook a key to the study and writing of history prospect heights il waveland press 1991 ©1964 isbn 9780881336269 derek harland a basic course in genealogy volume two research procedure and evaluation of evidence bookcraft inc 1958 worldcat record richard holmes tommy harpercollins 2004 isbn 9780007137510 martha c howell and walter prevenier from reliable sources an introduction to historical methods 2001 isbn 9780801435737 richard a marius and melvin e page a short guide to writing about history 8th edition 2012 isbn 9780205118601 hayden white metahistory the historical imagination in nineteenthcentury europe baltimore johns hopkins university press 1973 isbn 9780801814693'</li><li>'have a meticulous approach to reconstructing the costumes or material culture of past eras but who are perceived to lack much understanding of the cultural values and historical contexts of the periods in question a college or society of antiquaries was founded in london in c 1586 to debate matters of antiquarian interest members included william camden sir robert cotton john stow william lambarde richard carew and others this body existed until 1604 when it fell under suspicion of being political in its aims and was abolished by king james i papers read at their meetings are preserved in cottons collections and were printed by thomas hearne in 1720 under the title a collection of curious discourses a second edition appearing in 1771 in 1707 a number of english antiquaries began to hold regular meetings for the discussion of their hobby and in 1717 the society of antiquaries was formally reconstituted finally receiving a charter from king george ii in 1751 in 1780 king george iii granted the society apartments in somerset house and in 1874 it moved into its present accommodation in burlington house piccadilly the society was governed by a council of twenty and a president who is ex officio a trustee of the british museum the society of antiquaries of scotland was founded in 1780 and had the management of a large national antiquarian museum in edinburgh the society of antiquaries of newcastle upon tyne the oldest provincial antiquarian society in england was founded in 1813 in ireland a society was founded in 1849 called the kilkenny archaeological society holding its meetings at kilkenny in 1869 its name was changed to the royal historical and archaeological association of ireland and in 1890 to the royal society of antiquaries of ireland its office being transferred to dublin in france the societe des antiquaires de france was formed in 1813 by the reconstruction of the academie celtique which had existed since 1804 the american antiquarian society was founded in 1812 with its headquarters at worcester massachusetts in modern times its library has grown to over 4 million items and as an institution it is internationally recognized as a repository and research library for early pre1876 american printed materials in denmark the kongelige nordiske oldskriftselskab also known as la societe royale des antiquaires du nord or the royal society of northern antiquaries was founded at copenhagen in 1825 in germany the gesamtverein der deutschen geschichts und altertumsvereine was founded in 1852in addition a number of local historical and archaeological societies have adopted the word antiquarian in their titles these have included the cambridge antiquarian society'</li></ul> | | 42 | <ul><li>'been described as the worlds largest 
repository of covid19 sequences and by far the worlds largest database of sarscov2 sequences by midapril 2021 gisaids sarscov2 database reached over 1200000 submissions a testament to the hard work of researchers in over 170 different countries only three months later the number of uploaded sarscov2 sequences had doubled again to over 24 million by late 2021 the database contained over 5 million genome sequences as of december 2021 over 6 million sequences had been submitted by april 2022 there were 10 million sequences accumulated and in january 2023 the number had reached 144 millionin january 2020 the sarscov2 genetic sequence data was shared through gisaid throughout the first year of the covid19 pandemic most of the sarscov2 wholegenome sequences that were generated and shared globally were submitted through gisaid when the sarscov2 omicron variant was detected in south africa by quickly uploading the sequence to gisaid the national institute for communicable diseases there was able to learn that botswana and hong kong had also reported cases possessing the same gene sequencein march 2023 gisaid temporarily suspended database access for some scientists removing raw data relevant to investigations of the origins of sarscov2 gisaid stated that they do not delete records from their database but data may become temporarily invisible during updates or corrections availability of the data was restored with an additional restriction that any analysis based thereon would not be shared with the public the board of friends of gisaid consists of peter bogner and two german lawyers who are not involved in the daytoday operations of the organisation scientific advice to the organization is provided by its scientific advisory council including directors of leading public health laboratories such as who collaborating centres for influenza in 2023 gisaids lack of transparency was criticized by some gisaid funders including the european commission and the rockefeller foundation with longterm funding being denied from international federation of pharmaceutical manufacturers and associations ifpma in june 2023 it was reported in vanity fair that bogner had said that gisaid will soon launch an independent compliance board responsible for addressing a wide range of governance matters the telegraph similarly reported that gisaids inhouse counsel was developing new governance processes intended to be transparent and allow for the resolution of scientific disputes without the involvement of bogner the creation of the gisaid database was motivated in part by concerns raised by researchers from developing countries with scientific american noting in 2009 that that a previous datasharing system run by who forced them to give up intellectual'</li><li>'viruses can be named based on the antibodies they react with the use of the antibodies which were once exclusively derived from the serum blood fluid of animals is called serology once an antibody – reaction has taken place in a test other methods are needed to confirm this older methods included complement fixation tests hemagglutination inhibition and virus neutralisation newer methods use enzyme immunoassays eiain the years before pcr was invented immunofluorescence was used to quickly confirm viral infections it is an infectivity assay that is virus species specific because antibodies are used the antibodies are tagged with a dye that is luminescencent and when using an optical microscope with a modified light source infected cells glow in the dark pcr is a 
mainstay method for detecting viruses in all species including plants and animals it works by detecting traces of virus specific rna or dna it is very sensitive and specific but can be easily compromised by contamination most of the tests used in veterinary virology and medical virology are based on pcr or similar methods such as transcription mediated amplification when a novel virus emerges such as the covid coronavirus a specific test can be devised quickly so long as the viral genome has been sequenced and unique regions of the viral dna or rna identified the invention of microfluidic tests as allowed for most of these tests to be automated despite its specificity and sensitivity pcr has a disadvantage in that it does not differentiate infectious and noninfectious viruses and tests of cure have to be delayed for up to 21 days to allow for residual viral nucleic acid to clear from the site of the infection in laboratories many of the diagnostic test for detecting viruses are nucleic acid amplification methods such as pcr some tests detect the viruses or their components as these include electron microscopy and enzymeimmunoassays the socalled home or selftesting gadgets are usually lateral flow tests which detect the virus using a tagged monoclonal antibody these are also used in agriculture food and environmental sciences counting viruses quantitation has always had an important role in virology and has become central to the control of some infections of humans where the viral load is measured there are two basic methods those that count the fully infective virus particles which are called infectivity assays and those that count all the particles including the defective ones infectivity assays measure the amount concentration of infective viruses in a sample of known volume for host cells plants or cultures of bacterial or animal cells are used laboratory animals such as mice'</li><li>'vpx is a virionassociated protein encoded by human immunodeficiency virus type 2 hiv2 and most simian immunodeficiency virus siv strains but that is absent from hiv1 it is similar in structure to the protein vpr that is carried by siv and hiv2 as well as hiv1 vpx is one of five accessory proteins vif vpx vpr vpu and nef carried by lentiviruses that enhances viral replication by inhibiting host antiviral factorsvpx enhances hiv2 replication in humans by counteracting the host factor samhd1 samhd1 is a host factor found in human myeloid cells such as dendritic cells and macrophages that restricts hiv1 replication by depleting the cytoplasmic pool of deoxynucleoside triphosphates needed for viral dna production samhd1 does not however restrict hiv2 replication in myeloid cells due to the presence of viral vpx vpx counteracts restriction by inducing the ubiquitinproteasomedependent degradation of samhd1 vpxmediated degradation of samhd1 therefore decreases deoxynucleoside triphosphate hydrolysis thereby increasing the availability of dntps for viral reverse transcription in the cytoplasm it has been postulated that samhd1 degradation is required for hiv2 replication because the hiv2 reverse transcriptase rt is less active than the hiv1 rt which would be the reason for the absence of vpx from hiv1 because vpx is required for hiv2 reverse transcription and the early stages of the viral life cycle it is packaged into virions in significant amountsvpx is also involved in the nuclear import of the hiv2siv genomes and associated proteins but the specific mechanisms and interactions are currently unknown although 
vpr and vpx are similar in size both are 100 amino acids with 2025 sequence similarity and structure both are predicted to have similar tertiary structure with three major helices they serve very different roles in viral replication vpx targets a host restriction factor for proteasomal degradation while vpr arrests the host cell cycle in the g2 phase however they are both involved in the import of the viral preintegration complex into the host nucleus'</li></ul> | | 19 | <ul><li>'##es insulin blood glucose from the portal vein enters liver cells hepatocytes insulin acts on the hepatocytes to stimulate the action of several enzymes including glycogen synthase glucose molecules are added to the chains of glycogen as long as both insulin and glucose remain plentiful in this postprandial or fed state the liver takes in more glucose from the blood than it releases after a meal has been digested and glucose levels begin to fall insulin secretion is reduced and glycogen synthesis stops when it is needed for energy glycogen is broken down and converted again to glucose glycogen phosphorylase is the primary enzyme of glycogen breakdown for the next 8 – 12 hours glucose derived from liver glycogen is the primary source of blood glucose used by the rest of the body for fuel glucagon another hormone produced by the pancreas in many respects serves as a countersignal to insulin in response to insulin levels being below normal when blood levels of glucose begin to fall below the normal range glucagon is secreted in increasing amounts and stimulates both glycogenolysis the breakdown of glycogen and gluconeogenesis the production of glucose from other sources muscle glycogen appears to function as an immediate reserve source of available phosphorylated glucose in the form of glucose1phosphate for muscle cells glycogen contained within skeletal muscle cells are primarily in the form of β particles other cells that contain small amounts use it locally as well as muscle cells lack glucose6phosphatase which is required to pass glucose into the blood the glycogen they store is available solely for internal use and is not shared with other cells this is in contrast to liver cells which on demand readily do break down their stored glycogen into glucose and send it through the blood stream as fuel for other organsskeletal muscle needs atp provides energy for muscle contraction and relaxation in what is known as the sliding filament theory skeletal muscle relies predominantly on glycogenolysis for the first few minutes as it transitions from rest to activity as well as throughout highintensity aerobic activity and all anaerobic activity during anaerobic activity such as weightlifting and isometric exercise the phosphagen system atppcr and muscle glycogen are the only substrates used as they do not require oxygen nor blood flowdifferent bioenergetic systems produce atp at different speeds with atp produced'</li><li>'glycogen storage disease type i gsd i is an inherited disease that prevents the liver from properly breaking down stored glycogen which is necessary to maintain adequate blood sugar levels gsd i is divided into two main types gsd ia and gsd ib which differ in cause presentation and treatment there are also possibly rarer subtypes the translocases for inorganic phosphate gsd ic or glucose gsd id however a recent study suggests that the biochemical assays used to differentiate gsd ic and gsd id from gsd ib are not reliable and are therefore gsd ibgsd ia is caused by a deficiency in the enzyme 
glucose6phosphatase gsd ib a deficiency in the transport protein glucose6phosphate translocase because glycogenolysis is the principal metabolic mechanism by which the liver supplies glucose to the body during fasting both deficiencies cause severe hypoglycemia and over time excess glycogen storage in the liver and in some cases in the kidneys because of the glycogen buildup gsd i patients typically present with enlarged livers from nonalcoholic fatty liver disease other functions of the liver and kidneys are initially intact in gsd i but are susceptible to other problems without proper treatment gsd i causes chronic low blood sugar which can lead to excessive lactic acid and abnormally high lipids in the blood and other problems frequent feedings of cornstarch or other carbohydrates are the principal treatment for all forms of gsd i gsd ib also features chronic neutropenia due to a dysfunction in the production of neutrophils in the bone marrow this immunodeficiency if untreated makes gsd ib patients susceptible to infection the principal treatment for this feature of gsd ib is filgrastim however patients often still require treatment for frequent infections and a chronically enlarged spleen is a common side effect gsd ib patients often present with inflammatory bowel diseaseit is the most common of the glycogen storage diseases gsd i has an incidence of approximately 1 in 100000 births in the american population and approximately 1 in 20000 births among ashkenazi jews the disease was named after german doctor edgar von gierke who first described it in 1929 early research into gsd i identified numerous clinical manifestations falsely thought to be primary features of the genetic disorder however continuing research has revealed that these clinical features are the consequences of only one in gsd ia or two in gsd ib'</li><li>'##patic arteries and threaded through the gastroduodenal mostly or celiac artery the catheter is fixed in this position and the pump is placed in a subcutaneous pocket finally to confirm adequate placement and hepatic perfusion and to rule out extrahepatic perfusion a dye fluorescein or methylene blue is injected into the pump after the procedure and before starting the hai based treatment a technetium 99mlabeled macroaggregated albumin scan is performed to again confirm adequate hepatic perfusion and no misperfusion outside of the liver the complications of hai therapy can be divided into those related to the surgical placement of the pump technical catheterrelated complications and those related to the chemotherapeutic agents usedrelating to the surgical hai pump placement early postoperative complications consist of arterial injury leading to hepatic artery thrombosis inadequate perfusion of the entire liver due to the inability to identify an accessory hepatic artery extrahepatic perfusion to the stomach or duodenum or hematoma formation in the subcutaneous pump pocket late complications are more common and include inflammation or ulceration of the stomach or duodenum and pump pocket infectionthe most common catheter related complications include displacement of the catheter occlusion of the hepatic artery because of the catheter and catheter thrombosis these catheter related complications dont occur as frequently with increased surgical experience and with improvements in pump designthe most common toxicities caused by the chemotherapeutic agents were gastrointestinal symptoms chemical hepatitis and bone marrow inhibition it is important to note that the most 
serious and dose limiting complication of hai is hepatobiliary toxicity this occurs more commonly with fudr than any other chemotherapeutic agent patients undergoing hai therapy therefore have regular liver function tests to monitor any damage to the liver as previously mentioned studies have been carried out to come up with treatment algorithms to minimize this serious side effect it has been shown that adding leucovorin and fudr for infusion through the pump not only reduces the biliary toxicity of the drug but also increases the response rate however biliary sclerosis is not seen with hai using 5fu 5fu is associated with an increased risk of myelosuppression logically it would make sense to therefore consider alternating between hai fudr and hai 5fu'</li></ul> | | 11 | <ul><li>'and arms within the cranium the two vertebral arteries fuse into the basilar artery posterior inferior cerebellar artery pica basilar artery supplies the midbrain cerebellum and usually branches into the posterior cerebral artery anterior inferior cerebellar artery aica pontine branches superior cerebellar artery sca posterior cerebral artery pca posterior communicating artery the venous drainage of the cerebrum can be separated into two subdivisions superficial and deep the superficial systemthe superficial system is composed of dural venous sinuses sinuses channels within the dura mater the dural sinuses are therefore located on the surface of the cerebrum the most prominent of these sinuses is the superior sagittal sinus which is located in the sagittal plane under the midline of the cerebral vault posteriorly and inferiorly to the confluence of sinuses where the superficial drainage joins with the sinus that primarily drains the deep venous system from here two transverse sinuses bifurcate and travel laterally and inferiorly in an sshaped curve that forms the sigmoid sinuses which go on to form the two jugular veins in the neck the jugular veins parallel the upward course of the carotid arteries and drain blood into the superior vena cava the veins puncture the relevant dural sinus piercing the arachnoid and dura mater as bridging veins that drain their contents into the sinus the deep venous systemthe deep venous system is primarily composed of traditional veins inside the deep structures of the brain which join behind the midbrain to form the great cerebral vein vein of galen this vein merges with the inferior sagittal sinus to form the straight sinus which then joins the superficial venous system mentioned above at the confluence of sinuses cerebral blood flow cbf is the blood supply to the brain in a given period of time in an adult cbf is typically 750 millilitres per minute or 15 of the cardiac output this equates to an average perfusion of 50 to 54 millilitres of blood per 100 grams of brain tissue per minute cbf is tightly regulated to meet the brains metabolic demands too much blood a clinical condition of a normal homeostatic response of hyperemia can raise intracranial pressure icp which can compress and damage delicate brain tissue too little blood flow ischemia results if blood flow to the brain is below 18 to 20 ml per 100 g per minute and tissue death occurs if flow dips below 8 to'</li><li>'##ie b infection it is mostly unnecessary for treatment purposes to diagnose which virus is causing the symptoms in question though it may be epidemiologically useful coxsackie b infections usually do not cause serious disease although for newborns in the first 1 – 2 weeks of life coxsackie b infections can 
easily be fatal the pancreas is a frequent target which can cause pancreatitiscoxsackie b3 cb3 infections are the most common enterovirus cause of myocarditis and sudden cardiac death cb3 infection causes ion channel pathology in the heart leading to ventricular arrhythmia studies in mice suggest that cb3 enters cells by means of tolllike receptor 4 both cb3 and cb4 exploit cellular autophagy to promote replication the b4 coxsackie viruses cb4 serotype was suggested to be a possible cause of diabetes mellitus type 1 t1d an autoimmune response to coxsackie virus b infection upon the islets of langerhans may be a cause of t1dother research implicates strains b1 a4 a2 and a16 in the destruction of beta cells with some suggestion that strains b3 and b6 may have protective effects via immunological crossprotection as of 2008 there is no wellaccepted treatment for the coxsackie b group of viruses palliative care is available however and patients with chest pain or stiffness of the neck should be examined for signs of cardiac or central nervous system involvement respectively some measure of prevention can usually be achieved by basic sanitation on the part of foodservice workers though the viruses are highly contagious care should be taken in washing ones hands and in cleaning the body after swimming in the event of coxsackieinduced myocarditis or pericarditis antiinflammatories can be given to reduce damage to the heart muscle enteroviruses are usually only capable of acute infections that are rapidly cleared by the adaptive immune response however mutations which enterovirus b serotypes such as coxsackievirus b and echovirus acquire in the host during the acute phase can transform these viruses into the noncytolytic form also known as noncytopathic or defective enterovirus this form is a mutated quasispecies of enterovirus which is capable of causing persistent infection in human tissues and such infections have been found in the pancreas in type 1 diabetes in chronic myocarditis and dilated cardiomyopathy in valvular'</li><li>'the biomedical research center brc is a research center at qatar university focusing on biomedical research brc was founded in 2014 and partners with the ministry of public health qatar and hamad medical corporation hmc the incidence of genetic disorders in qatar is high with the top three causes of death in the country cancer heart diseases and diabetes the government saw the creation of brc as a strategy for proactively preventing diseases to foster public healthbrc labs received the isoiec 17025 accreditation from the american association for laboratory accreditation a2la the centres research activities focus on the domains of infectious diseases virology and microbiology metabolic disorders and biomedical omics since its inauguration in 2014 brc researchers have published research papers with more than 530 publicationsthe centres research projects include antibiotic profiling of antibiotics resistant microbes in humans and animals one health approach identified for the first time the reason of why some obese people gets type2 diabetes while others do not conducted six research on covid19 to assist in fighting and recovery provided a study on protection against the omicron variant in qatar decoded the genetic code of qatari falcons and various endangered animal species dna sequence of the dugong sea cow study a nanomedicinebased preventative strategy to controlling diseases and improve health brc introduced the use of zebrafish as an animal model in biomedical 
research at qu and established a facility for it in 2015 the facility is used as a research unit to study many genetic diseases therefore ministry of public health qatar clearly articulated an institutional research policy irp on human use of zebrafish in research and qu circulated it to qu community for implementation the brc facilities include biosafety level 3 bsl3 built by certek usa it is equipped for viral and bacterial research on risk group 3 pathogens sequencing unit to conduct stateoftheart research in genomics mariam al maadeed sidra medical and research center'</li></ul> | | 17 | <ul><li>'and rainfall there are many ways to date a core once dated it gives valuable information about changes of climate and terrain for example cores in the ocean floor soil and ice have altered the view of the geologic history of the pleistocene entirely reverse circulation drilling is a method in which rock cuttings are continuously extracted through the hollow drill rod and can be sampled for analysis the method may be faster and use less water than core drilling but does not produce cores of relatively undisturbed material so less information on the rock structure can be derived from analysis if compressed air is used for cutting extraction the sample remains uncontaminated is available almost immediately and the method has a low environmental impact core drill ice core integrated ocean drilling program scientific drilling'</li><li>'##cial environments tend to be found in higher latitudes since there is more land at these latitudes in the north most of this effect is seen in the northern hemisphere however in lower latitudes the direct effect of the suns radiation is greater so the freezethaw effect is seen but permafrost is much less widespread altitude – air temperature drops by approximately 1 °c for every 100 m rise above sea level this means that on mountain ranges modern periglacial conditions are found nearer the equator than they are lower down ocean currents – cold surface currents from polar regions reduce mean average temperatures in places where they exert their effect so that ice caps and periglacial conditions will show nearer to the equator as in labrador for example conversely warm surface currents from tropical seas increases mean temperatures the cold conditions are then found only in more northerly places this is apparent in western north america which is affected by the north pacific current in the same way but more markedly the gulf stream affects western europe continentality – away from the moderating influence of the ocean seasonal temperature variation is more extreme and freezethaw goes deeper in the centres of canada and siberia the permafrost typical of periglaciation goes deeper and extends further towards the equator similarly solifluction associated with freezethaw extends into somewhat lower latitudes than on western coasts periglaciation results in a variety of ground conditions but especially those involving irregular mixed deposits created by ice wedges solifluction gelifluction frost creep and rockfalls periglacial environments trend towards stable geomorphologies coombe and head deposits – coombe deposits are chalk deposits found below chalk escarpments in southern england head deposits are more common below outcrops of granite on dartmoor patterned ground – patterned ground occurs where stones form circles polygons and stripes local topography affects which of these are expressed a process called frost heaving is responsible for these features solifluction 
lobes – solifluction lobes are formed when waterlogged soil slips down a slope due to gravity forming u shaped lobes blockfields or felsenmeer – blockfields are areas covered by large angular blocks traditionally believed to have been created by freezethaw action a good example of a blockfield can be found in the snowdonia national park wales blockfields are common in the unglaciated parts of the appalachian mountains in the northeastern united states such as at the river of rocks or hickory run boulder field lehigh county pennsylvaniaother landforms include bratschen palsa periglacial lake pingo'</li><li>'climate was cooler during the overarching little ice age than it is today ice cores scientists have studied the chemical composition of ice cores long tubes of ice that are drilled from glaciers and ice sheets to learn of past climate conditions tree rings the width of tree rings can be used to reconstruct past climate conditions as trees grow more slowly in cooler temperatures tree ring data from the little ice age seems to prove a reduction in solar activityoverall the evidence suggests that the amount of solar radiation reaching the earths surface was slightly lower during the grindelwald fluctuation and this reduction in solar radiation is thought to have contributed to the expansion of the glaciers human activities such as deforestation and land use changes are known to negatively affect local climate patterns william ruddiman a palaeoclimatologist proposed the hypothesis that human activity has been affecting the earths climate for much longer than previously thought in particular ruddiman has argued that the early adoption of agriculture and landuse practices by human societies beginning around 8000 years ago led to the release of significant amounts of greenhouse gases into the atmosphere which may have contributed to the warming of the earths climateit is difficult to accurately assess the extent of depopulation that occurred during both the 1500s and 1600s as reliable population data from this period is limited however it is known that this period was one of significant upheaval and change with many regions experiencing significant population drops due to wars plagues famines and natural disasters the bubonic plague for instance killed between 75 and 200 million people in europe alone it is also believed that an onset of disease during the little ice age may have led to further depopulationthis decline in population meant that cultivated lands became unkempt allowing for the regrowth of wild plants this is perceived to be the cause for the drop in atmospheric carbon dioxide in the sixteenth century thus exacerbating the extreme cooling period however of the causes depopulation is the least significant in historical records the grindelwald fluctuation is characterised by a further drop in temperatures and more frequent cold spells throughout many parts of the world the more notable records written by a jacobean weather enthusiast in bristol chronicle some of the effects the weather fluctuation had on agriculture and society they specifically discuss food shortages and crop failures taking precedence throughout the area'</li></ul> | | 14 | <ul><li>'needle aspiration fna biopsy can be fast and least painful a very thin hollow needle and slight suction will be used to remove a small sample from under the nipple using a local anesthetic to numb the skin may not be necessary since a thin needle is used for the biopsy receiving an injection to prevent pain from the biopsy may be more 
painful than the biopsy itselfsome men develop a condition known as gynecomastia in which the breast tissue under the nipple develops and grows discharge from the nipple can occur the nipple may swell in some men possibly due to increased levels of estrogen changes in appearance may be normal or related to disease inverted nipples – this is normal if the nipples have always been indented inward and can easily point out when touched if the nipples are pointing in and this is new this is an unexpected change skin puckering of the nipple – this can be caused by scar tissue from surgery or an infection often scar tissue forms for no reason most of the time this issue does not need treatment this is an unexpected change this change can be of concern since puckering or retraction of the nipple can indicate an underlying change in breast tissue that may be cancerous the nipple is warm to the touch red or painful – this can be an infection it is rarely due to breast cancer scaly flaking or itchy nipple – this is most often due to eczema or a bacterial or fungal infection this change is not expected flaking scaly or itchy nipples can be a sign of pagets disease thickened skin with large pores – this is called peau dorange because the skin looks like an orange peel an infection in the breast or inflammatory breast cancer can cause this problem this is not an expected change retracted nipples – the nipple was raised above the surface but changes begins to pull inward and does not come out when stimulatedthe average projection and size of human female nipples is slightly more than 3⁄8 inch 95 mm symptoms of breast cancer can often be seen first by changes of the nipple and areola although not all women have the same symptoms and some people do not have any signs or symptoms at all a person may find out they have breast cancer after a routine mammogram warning signs can include new lump in the nipple or breast or armpit thickening or swelling of part of the breast areola or nipple irritation or dimpling of breast skin redness or flaky skin in the nipple area or the breast pulling in of the nipple or pain in the nipple area nipple discharge other than breast milk including blood any change'</li><li>'the mother over the chorion frondosum this part of the endometrium is called the decidua basalis forms the decidual plate the decidual plate is tightly attached to the chorion frondosum and goes on to form the actual placenta endometrium on the opposite side to the decidua basalis is the decidua parietalis this fuses with the chorion laevae thus filling up the uterine cavityin the case of twins dichorionic placentation refers to the presence of two placentas in all dizygotic and some monozygotic twins monochorionic placentation occurs when monozygotic twins develop with only one placenta and bears a higher risk of complications during pregnancy abnormal placentation can lead to an early termination of pregnancy for example in preeclampsia as placentation often results during the evolution of live birth the more than 100 origins of live birth in lizards and snakes squamata have seen close to an equal number of independent origins of placentation this means that the occurrence of placentation in squamata is more frequent than in all other vertebrates combined making them ideal for research on the evolution of placentation and viviparity itself in most squamates two separate placentae form utilising separate embryonic tissue the chorioallantoic and yolksac placentae in species with more complex placentation we 
see regional specialisation for gas amino acid and lipid transport placentae form following implantation into uterine tissue as seen in mammals and formation is likely facilitated by a plasma membrane transformationmost reptiles exhibit strict epitheliochorial placentation eg pseudemoia entrecasteauxii however at least two examples of endotheliochorial placentation have been identified mabuya sp and trachylepis ivensi unlike eutherian mammals epitheliochorial placentation is not maintained by maternal tissue as embryos do not readily invade tissues outside of the uterus the placenta is an organ that has evolved multiple times independently evolved relatively recently in some lineages and exists in intermediate forms in living species for these reasons it is an outstanding model to study the evolution of complex organs in animals research into the genetic mechanisms that underpin the evolution of the placenta have been conducted in a diversity of animals including reptiles seahorses and mammalsthe genetic processes that support the evolution of the placenta can be best understood by separating those that result'</li><li>'the myometrium once these cells penetrate through the first few layers of cells of the decidua they lose their ability to proliferate and become invasive this departure from the cell cycle seems to be due to factors such as tgfβ and decorin although these invasive interstitial cytotrophoblasts can no longer divide they retain their ability to form syncytia multinucleated giant cells small syncytia are found in the placental bed and myometrium as a result of the fusion of interstitial cytotrophoblastsinterstitial cytotrophoblasts may also transform into endovascular cytotrophoblasts the primary function of the endovascular cytotrophoblast is to penetrate maternal spiral arteries and route the blood flow through the placenta for the growing embryo to use they arise from interstitial cytotrophoblasts from the process of phenocopying this changes the phenotype of these cells from epithelial to endothelial endovascular cytotrophoblasts like their interstitial predecessor are nonproliferating and invasive proper cytotrophoblast function is essential in the implantation of a blastocyst after hatching the embryonic pole of the blastocyst faces the uterine endometrium once they make contact the trophoblast begins to rapidly proliferate the cytotrophoblast secretes proteolytic enzymes to break down the extracellular matrix between the endometrial cells to allow fingerlike projections of trophoblast to penetrate through projections of cytotrophoblast and syncytiotrophoblast pull the embryo into the endometrium until it is fully covered by endometrial epithelium save for the coagulation plug the most common associated disorder is preeclampsia affecting approximately 7 of all births it is characterized by a failure of the cytotrophoblast to invade the uterus and its vasculature specifically the spiral arteries that the endovascular cytotrophoblast should invade the result of this is decreased blood flow to the fetus which may cause intrauterine growth restriction clinical symptoms of preeclampsia in the mother are most commonly high blood pressure proteinuria and edema conversely if there is too much invasion of uterine tissue by the trophoblast then'</li></ul> | | 36 | <ul><li>'to some decision or course of action socrates great myth illustrates this motif most clearly when the soul is depicted as a charioteer and its horses being led around a heavenly circuit this is the occasion for 
the first appearance in platos dialogues of the prominent platonic doctrine that life is motion the soul being the principle or source of life is that which moves itself as opposed to inanimate objects that require an external source of motion to move them the view that life is selfmotion and that the soul is a selfmover is used by plato to guarantee the immortality of the soul making this a novel argument for the souls immortality not found in the phaedo plato relies further on the view that the soul is a mind in order to explain how its motions are possible plato combines the view that the soul is a selfmover with the view that the soul is a mind in order to explain how the soul can move things in the first place eg how it can move the body to which it is attached in life souls move things by means of their thoughts in thomas manns novella death in venice the narrators young love tadzio is associated with phaedrus in mary renaults 1953 novel the charioteer a text of phaedrus is passed among the characters gay men during world war ii and the image of the charioteer and his white and black horses recurs as the protagonist struggles to choose between consummated and unconsummated love in a key scene from the film adaptation of maurice students including maurice attend dean cornwalliss translation class in which two undergraduates orally translate into english the text based on phaedrus stephanus 251a 255a – e during which the dean instructs one to omit the reference to the unspeakable vice of the greeks the 2016 film knight of cups by terrence malick is inspired in part by phaedrus in robert m pirsigs fictionalized autobiographical novel zen and the art of motorcycle maintenance pirsig refers to his past self from before undergoing electroconvulsive therapy in the third person and using the name phaedrus intended to reflect his opposition to certain educational and philosophical ideas the character reappears in the followup lila an inquiry into morals in virginia woolfs 1922 novel jacobs room jacob reads phaedrus alone in his room after a visit to the enormous mind as woolf characterizes the british museum jowett translation at standardebooks greek text at perseus plato nichols j h tr and ed phaedrus cornell university press'</li><li>'other lacks so much the betterthe first two of young becker and pikes four phases of written rogerian argument are based on the first two of rapoports three principles of ethical debate the third of rapoports principles — increasing the perceived similarity between self and other — is a principle that young becker and pike considered to be equally as important as the other two but they said it should be an attitude assumed throughout the discourse and is not a phase of writingmaxine hairston in a section on rogerian or nonthreatening argument in her textbook a contemporary rhetoric advised that one shouldnt start writing with a detailed plan in mind but might start by making four lists the others concerns ones own key points anticipated problems and points of agreement or common ground she gave a different version of young becker and pikes four phases which she expanded to five and called elements of the nonthreatening argument a brief and objective statement of the issue a neutrally worded analysis of the others position a neutrally worded analysis of ones own position a statement of the common aspects goals and values that the positions share and a proposal for resolving the issue that shows how both sides may gain she said that the rogerian approach 
requires calm patience and effort and will work if one is more concerned about increasing understanding and communication than about scoring a triumph in a related article she noted the similarity between rogerian argument and john stuart mills wellknown phrase from on liberty he who knows only his own side of the case knows little of thatrobert keith millers textbook the informed argument first published in 1986 presented five phases adapted from an earlier textbook by richard coe millers phases were an introduction to the problem a summary of views that oppose the writers position a statement of understanding of the region of validity of the opposing views a statement of the writers position a statement of the situations in which the writers position has merit and a statement of the benefits of accepting the writers positionin 1992 rebecca stephens built on the vague and abstract rogerian principles of other rhetoricians to create a set of 23 concrete and detailed questions that she called a rogerianbased heuristic for rhetorical invention intended to help people think in a rogerian way while discovering ideas and arguments for example the first two of her 23 questions are what is the nature of the issue in general terms and she recommended that the answer should itself be stated as a question and whose lives are affected by the issue the last two questions are what would have to happen to eliminate the disagreement among the opposing groups and what are the chances that this will occur lisa'</li><li>'reestablishes equilibrium and health in the collective imaginary which are jeopardized by the repressive aspects of societythe state of political satire in a given society reflects the tolerance or intolerance that characterizes it and the state of civil liberties and human rights under totalitarian regimes any criticism of a political system and especially satire is suppressed a typical example is the soviet union where the dissidents such as aleksandr solzhenitsyn and andrei sakharov were under strong pressure from the government while satire of everyday life in the ussr was allowed the most prominent satirist being arkady raikin political satire existed in the form of anecdotes that made fun of soviet political leaders especially brezhnev famous for his narrowmindedness and love for awards and decorations satire is a diverse genre which is complex to classify and define with a wide range of satiric modes satirical literature can commonly be categorized as either horatian juvenalian or menippean horatian horatian satire named for the roman satirist horace 65 – 8 bce playfully criticizes some social vice through gentle mild and lighthearted humour horace quintus horatius flaccus wrote satires to gently ridicule the dominant opinions and philosophical beliefs of ancient rome and greece rather than writing in harsh or accusing tones he addressed issues with humor and clever mockery horatian satire follows this same pattern of gently ridiculing the absurdities and follies of human beingsit directs wit exaggeration and selfdeprecating humour toward what it identifies as folly rather than evil horatian satires sympathetic tone is common in modern society a horatian satirists goal is to heal the situation with smiles rather than by anger horatian satire is a gentle reminder to take life less seriously and evokes a wry smile juvenalian juvenalian satire named for the writings of the roman satirist juvenal late first century – early second century ad is more contemptuous and abrasive than the 
horatian juvenal disagreed with the opinions of the public figures and institutions of the republic and actively attacked them through his literature he utilized the satirical tools of exaggeration and parody to make his targets appear monstrous and incompetent juvenals satire follows this same pattern of abrasively ridiculing societal structures juvenal also unlike horace attacked public officials and governmental organizations through his satires regarding their opinions as not just wrong but evil following in this tradition juvenalia'</li></ul> | | 27 | <ul><li>'rod is so small newtons third law of physics applies for any action there is a reaction when the electrons are pulled across the surface of the rod so too is the rod pulled in the opposite direction the first recorded success of a nanosubmarine was performed by a team of students led by dan peer from tel aviv university in israel this was a continuation to peers work at harvard on nanosubmarines and targeted drug delivery tests have proven successful in delivering drugs to heal mice with ulcerative colitis tests will continue and the team plans to experiment on the human body soon fantastic voyage novel and movie based on the nanosubmarine theme'</li><li>'electronbeaminduced deposition ebid is a process of decomposing gaseous molecules by an electron beam leading to deposition of nonvolatile fragments onto a nearby substrate the electron beam is usually provided by a scanning electron microscope which results in high spatial accuracy potentially below one nanometer and the possibility to produce freestanding threedimensional structures the focused electron beam of a scanning electron microscope sem or scanning transmission electron microscope stem is commonly used another method is ionbeaminduced deposition ibid where a focused ion beam is applied instead precursor materials are typically liquid or solid and gasified prior to deposition usually through vaporization or sublimation and introduced at accurately controlled rate into the highvacuum chamber of the electron microscope alternatively solid precursors can be sublimated by the electron beam itself when deposition occurs at a high temperature or involves corrosive gases a specially designed deposition chamber is used it is isolated from the microscope and the beam is introduced into it through a micrometresized orifice the small orifice size maintains differential pressure in the microscope vacuum and deposition chamber no vacuum such deposition mode has been used for ebid of diamondin the presence of the precursor gas the electron beam is scanned over the substrate resulting in deposition of material the scanning is usually computercontrolled the deposition rate depends on a variety of processing parameters such as the partial precursor pressure substrate temperature electron beam parameters applied current density etc it usually is in the order of 10 nms primary electron energies in sems or stems are usually between 10 and 300 kev where reactions induced by electron impact ie precursor dissociation have a relatively low cross section the majority of decomposition occurs via low energy electron impact either by low energy secondary electrons which cross the substratevacuum interface and contribute to the total current density or inelastically scattered backscattered electrons primary stem electrons can be focused into spots as small as 0045 nm while the smallest structures deposited so far by ebid are point deposits of 07 nm diameter deposits usually have a larger lateral 
size than the beam spot size the reason are the socalled proximity effects meaning that secondary backscattered and forward scattered if the beam dwells on already deposited material electrons contribute to the deposition as these electrons can leave the substrate up to several microns away from the point of impact of the electron beam depending on its energy material deposition is not necessarily confined to the irradiated spot to overcome this problem compensation algorithms can be applied which is typical for electron beam lithography as of 2008 the range of materials deposited by ebid included al au amor'</li><li>'##onment this presents a challenge in maintaining protein arrays in a stable condition over extended periods of time in situ methods — invented and published by mingyue he and michael taussig in 2001 — involve onchip synthesis of proteins as and when required directly from the dna using cellfree protein expression systems since dna is a highly stable molecule it does not deteriorate over time and is therefore suited to longterm storage this approach is also advantageous in that it circumvents the laborious and often costly processes of separate protein purification and dna cloning since proteins are made and immobilised simultaneously in a single step on the chip surface examples of in situ techniques are pisa protein in situ array nappa nucleic acid programmable protein array and dapa dna array to protein array there are three types of protein microarrays that are currently used to study the biochemical activities of proteins analytical microarrays are also known as capture arrays in this technique a library of antibodies aptamers or affibodies is arrayed on the support surface these are used as capture molecules since each binds specifically to a particular protein the array is probed with a complex protein solution such as a cell lysate analysis of the resulting binding reactions using various detection systems can provide information about expression levels of particular proteins in the sample as well as measurements of binding affinities and specificities this type of microarray is especially useful in comparing protein expression in different solutions for instance the response of the cells to a particular factor can be identified by comparing the lysates of cells treated with specific substances or grown under certain conditions with the lysates of control cells another application is in the identification and profiling of diseased tissues reverse phase protein microarray rppa involve complex samples such as tissue lysates cells are isolated from various tissues of interest and are lysed the lysate is arrayed onto the microarray and probed with antibodies against the target protein of interest these antibodies are typically detected with chemiluminescent fluorescent or colorimetric assays reference peptides are printed on the slides to allow for protein quantification of the sample lysates rpas allow for the determination of the presence of altered proteins or other agents that may be the result of disease specifically posttranslational modifications which are typically altered as a result of disease can be detected using rpas functional protein microarrays also known as target protein arrays are constructed by immobilising large numbers of purified proteins and are used to'</li></ul> | | 9 | <ul><li>'a circular chromosome is a chromosome in bacteria archaea mitochondria and chloroplasts in the form of a molecule of circular dna unlike the linear chromosome of most 
eukaryotes most prokaryote chromosomes contain a circular dna molecule – there are no free ends to the dna free ends would otherwise create significant challenges to cells with respect to dna replication and stability cells that do contain chromosomes with dna ends or telomeres most eukaryotes have acquired elaborate mechanisms to overcome these challenges however a circular chromosome can provide other challenges for cells after replication the two progeny circular chromosomes can sometimes remain interlinked or tangled and they must be resolved so that each cell inherits one complete copy of the chromosome during cell division the circular bacteria chromosome replication is best understood in the wellstudied bacteria escherichia coli and bacillus subtilis chromosome replication proceeds in three major stages initiation elongation and termination the initiation stage starts with the ordered assembly of initiator proteins at the origin region of the chromosome called oric these assembly stages are regulated to ensure that chromosome replication occurs only once in each cell cycle during the elongation phase of replication the enzymes that were assembled at oric during initiation proceed along each arm replichore of the chromosome in opposite directions away from the oric replicating the dna to create two identical copies this process is known as bidirectional replication the entire assembly of molecules involved in dna replication on each arm is called a replisome at the forefront of the replisome is a dna helicase that unwinds the two strands of dna creating a moving replication fork the two unwound single strands of dna serve as templates for dna polymerase which moves with the helicase together with other proteins to synthesise a complementary copy of each strand in this way two identical copies of the original dna are created eventually the two replication forks moving around the circular chromosome meet in a specific zone of the chromosome approximately opposite oric called the terminus region the elongation enzymes then disassemble and the two daughter chromosomes are resolved before cell division is completed the e coli origin of replication called oric consists of dna sequences that are recognised by the dnaa protein which is highly conserved amongst different bacterial species dnaa binding to the origin initiates the regulated recruitment of other enzymes and proteins that will eventually lead to the establishment of two complete replisomes for bidirectional replicationdna sequence elements within oric that are important for its function include dnaa boxes a 9mer repeat with a highly'</li><li>'the second step of this process has recently fallen into question for the past few decades the common view was that a trimeric multiheme ctype hao converts hydroxylamine into nitrite in the periplasm with production of four electrons 12 the stream of four electrons is channeled through cytochrome c554 to a membranebound cytochrome c552 two of the electrons are routed back to amo where they are used for the oxidation of ammonia quinol pool the remaining two electrons are used to generate a proton motive force and reduce nadp through reverse electron transportrecent results however show that hao does not produce nitrite as a direct product of catalysis this enzyme instead produces nitric oxide and three electrons nitric oxide can then be oxidized by other enzymes or oxygen to nitrite in this paradigm the electron balance for overall metabolism needs to be reconsidered nitrite produced in the 
first step of autotrophic nitrification is oxidized to nitrate by nitrite oxidoreductase nxr 2 it is a membraneassociated ironsulfur molybdo protein and is part of an electron transfer chain which channels electrons from nitrite to molecular oxygen the enzymatic mechanisms involved in nitriteoxidizing bacteria are less described than that of ammonium oxidation recent research eg woznica a et al 2013 proposes a new hypothetical model of nob electron transport chain and nxr mechanisms here in contrast to earlier models the nxr would act on the outside of the plasma membrane and directly contribute to a mechanism of proton gradient generation as postulated by spieck and coworkers nevertheless the molecular mechanism of nitrite oxidation is an open question the twostep conversion of ammonia to nitrate observed in ammoniaoxidizing bacteria ammoniaoxidizing archaea and nitriteoxidizing bacteria such as nitrobacter is puzzling to researchers complete nitrification the conversion of ammonia to nitrate in a single step known as comammox has an energy yield ∆g° ′ of −349 kj mol−1 nh3 while the energy yields for the ammoniaoxidation and nitriteoxidation steps of the observed twostep reaction are −275 kj mol−1 nh3 and −74 kj mol−1 no2− respectively these values indicate that it would be energetically favourable for an organism to carry out complete nitrification from ammonia to nitrate comammox rather'</li><li>'young animals and nonnative breeds the clinical signs of disease are caused by an increased vascular permeability and consequent oedema and hypovolemia the symptoms include neurological signs such as tremors and head pressing respiratory signs such as coughing and nasal discharge and systemic signs such as fever and loss of appetite physical examination may reveal petechiae of the mucous membranes tachycardia and muffled heart sounds heartwater can also cause reproductive and gastrointestinal disease it is frequently fatal on post mortem examination a light yellow transudate that coagulates on exposure to air is often found within the thorax pericardium and abdomen most fatal cases have the hydropericardium that gives the disease its common name pulmonary oedema and mucosal congestion are regularly seen along with frothy fluid in the airways and cut surfaces of the lungs to definitively diagnose the disease c ruminantium must be demonstrated either in preparations of the hippocampus under giemsa staining or by histopathology of brain or kidney during the early stages of disease animals may be treated with sulfonamides and tetracyclines in advanced disease prognosis is poor tetracyclines can also be used prophylactically when animals are introduced into an area endemic with heartwater ectoparasiticides used as dips can be used to reduce exposure the animals exposure to bont ticks in areas endemic for heartwater the use of dips against other ticks of domestic animals such as rhipicephalus boophilus and hyalomma species is likely and this will usually contribute to control of vectors of e ruminantium a live blood vaccine is available for protection of young stock but animals may require treatment for the disease after vaccination several experimental vaccines are currently being developed examples include attenuated recombinant and multiepitope dna vaccines depending on the species of the animal the mortality rate of the disease may vary from 5 to 90 mortality rates appear to be the highest within the various sheep and goat species but this is not always the case as some sheep species such as the 
afrikaner have mortality rates only reaching as high as 6 heartwater is notifiable to the world organization for animal health the us department of agriculture believes that an outbreak in the us could cost the livestock industry up to 762 million in losses annually the tick that carries the disease is thought to be capable of being transported by migratory birds from the caribbean to at least florida the'</li></ul> | | 29 | <ul><li>'fixed circle of latitude or zonal region if the coriolis parameter is large the effect of the earths rotation on the body is significant since it will need a larger angular frequency to stay in equilibrium with the coriolis forces alternatively if the coriolis parameter is small the effect of the earths rotation is small since only a small fraction of the centripetal force on the body is canceled by the coriolis force thus the magnitude of $f$ strongly affects the relevant dynamics contributing to the bodys motion these considerations are captured in the nondimensionalized rossby number in stability calculations the rate of change of $f$ along the meridional direction becomes significant this is called the rossby parameter and is usually denoted $\beta = \frac{\partial f}{\partial y}$ where $y$ is in the local direction of increasing meridian this parameter becomes important for example in calculations involving rossby waves beta plane earths rotation rossbygravity waves'</li><li>'of silicic acid to nitrate because larger diatoms that require silicic acid to make their opal silica shells are less prevalent unlike the southern ocean and the north pacific the equatorial pacific experiences temporal silicate availability which leads to large seasonal diatom bloomsthe distribution of trace metals and relative abundance of macronutrients are reflected in the plankton community structure for example the selection of phytoplankton with a high surface area to volume ratio results in hnlc regions being dominated by nano and picoplankton this ratio allows for optimal utilization of available dissolved nutrients larger phytoplankton such as diatoms cannot energetically sustain themselves in these regions common picoplankton within these regions include genera such as prochlorococcus not generally found in the north pacific synechococcus and various eukaryotes grazing protists likely control the abundance and distribution of these small phytoplanktonthe generally lower net primary production in hnlc zones results in lower biological drawdown of atmospheric carbon dioxide and thus these regions are generally considered a net source of carbon dioxide to the atmosphere hnlc areas are of interest to geoengineers and some in the scientific community who believe fertilizing large patches of these waters with iron could potentially lower dissolved carbon dioxide and offset increased anthropogenic carbon emissions analysis of antarctic ice core data over the last million years shows correlation between high levels of dust and low temperature indicating that addition of diffuse ironrich dust to the sea has been a natural amplifier of climate cooling the discovery and naming of the first hnlc region the north pacific was formalized in a seminal paper published in 1988 the study concluded that surface waters of the eastern north pacific are generally dominated by picoplankton despite the relative abundance of macronutrients in other words larger phytoplankton such as diatoms which thrive in nutrientrich waters were not found
instead the surface waters were replete with smaller pico and nanoplankton based on laboratory nutrient experiments iron was hypothesized to be a key limiting micronutrientthe pacific ocean is the largest and oldest body of water on earth the north pacific is characterized by the general clockwise rotation of the north pacific gyre which is driven by trade winds spatial variations in tradewinds result in cooler air temperatures in the western north pacific and milder air temperatures in the eastern north pacific ie subarctic pacific iron is supplied to the north pacific by dust storms that occur in asia'</li><li>'atmospheric pressure 101325 pa whereas water has a density of 09998 – 0999863 gcm3 at the same temperature and pressure liquid water is densest essentially 100 gcm3 at 4 °c and begins to lose its density as the water molecules begin to form the hexagonal crystals of ice as the freezing point is reached this is due to hydrogen bonding dominating the intermolecular forces which results in a packing of molecules less compact in the solid density of ice increases slightly with decreasing temperature and has a value of 09340 gcm3 at −180 °c 93 kwhen water freezes it increases in volume about 9 for fresh water the effect of expansion during freezing can be dramatic and ice expansion is a basic cause of freezethaw weathering of rock in nature and damage to building foundations and roadways from frost heaving it is also a common cause of the flooding of houses when water pipes burst due to the pressure of expanding water when it freezes the result of this process is that ice in its most common form floats on liquid water which is an important feature in earths biosphere it has been argued that without this property natural bodies of water would freeze in some cases permanently from the bottom up resulting in a loss of bottomdependent animal and plant life in fresh and sea water sufficiently thin ice sheets allow light to pass through while protecting the underside from shortterm weather extremes such as wind chill this creates a sheltered environment for bacterial and algal colonies when sea water freezes the ice is riddled with brinefilled channels which sustain sympagic organisms such as bacteria algae copepods and annelids which in turn provide food for animals such as krill and specialised fish like the bald notothen fed upon in turn by larger animals such as emperor penguins and minke whaleswhen ice melts it absorbs as much energy as it would take to heat an equivalent mass of water by 80 °c during the melting process the temperature remains constant at 0 °c while melting any energy added breaks the hydrogen bonds between ice water molecules energy becomes available to increase the thermal energy temperature only after enough hydrogen bonds are broken that the ice can be considered liquid water the amount of energy consumed in breaking hydrogen bonds in the transition from ice to water is known as the heat of fusion as with water ice absorbs light at the red end of the spectrum preferentially as the result of an overtone of an oxygen – hydrogen o – h bond stretch compared with water this absorption is shifted toward slightly lower energies thus ice appears blue with'</li></ul> | | 13 | <ul><li>'has offered artworks in the form of graphics downloadable to the home personal computer – for example by peter halley the thing has enabled a diverse group of artists critics curators and activists to use the internet in its early stages at its core the thing is a social network made up of 
individuals from diverse backgrounds with a wide range of expert knowledge from this social hub the thing has built an array of programs and initiatives in both technological and cultural networks during its first five years tt became widely recognized as one of the founding and leading online centers for new media culture its activities include hosting artists projects and mailing lists as well as publishing cultural criticism the thing has also organized many public events and symposia on such topics as the state of new media arts the preservation of online privacy artistic innovations in robotics and the possibilities of community empowerment through wireless technologies in 1997 thingnet communications llc an internet service provider isp was incorporated by wolfgang staehle gisela ehrenfried and max kossatz the isp was to provide a financial backbone for the thing inc a 501 c 3 non profit organization thingnet has hosted arts and activist groups and publications including ps1 contemporary art center artforum mabou mines willoughby sharp gallery zingmagazine journal of contemporary art rtmark and tenantnet among many others artists and projects associated with thingnet have included sawad brooks heath bunting cercle ramo nash vuk cosic ricardo dominguez ursula endlicher etoy gh hovagimyan jerome joy john klima jenny marketou mariko mori olivier mosset prema murty mark napier joseph nechvatal phil niblock daniel pflumm francesca da rimini beat streuli and beth stryker the thing amsterdam was founded by walter van der cruijsen the thing basel was founded by barbara strebel and rik gelles the thing berlin was founded by ulf schleth the thing cologne was founded by michael krome the thing dusseldorf was founded by jorg sasse the thing frankfurt was founded by andreas kallfelz the thing hamburg 1993 – 94 was founded by hansjoachim lenger the thing hamburg 2006 – 2009 was founded by the local art association the thing hamburg the thing london was founded by andreas ruethi the thing new york was founded by wolfgang staehle the thing stockholm was founded by magnus borg the thing vienna was founded by helmut mark and max kossatz the thing roma was founded by marco deseriis and giuseppe marano'</li><li>'of using locative media to better understand and connect in their environmentsyzygryd is a collaboration with three other arts organizations interpretive arson false profit labs ardent heavy industries to create a large scale interactive art piece to be unveiled at the 2010 burning man event the first five resident artists alphonzo solorzano gabriel dunne ryan alexander miles stemper and daniel massey moved into the space in july 2009 in 2010 three of these resident artists remained gabriel dunne ryan alexander and daniel massey in 2021 gray area partnered with the human rights foundation to launch the art in protest residency program the program s an opportunity for artists whose art is dedicated to promoting democracy and human rights globally to explore and expand their digital practices the gray area incubator is a peerdriven community of creators developing work at the intersection of art and technology membership is a 6month commitment though many have continued on much longer to develop their works in the incubator artists work in the disciplines of visual media arts creative code virtual augmented reality civic engagement digital activism social entrepreneurship data science sound audio and software hardware gray areas josette melchor was selected as one of the five innovators showcased 
on fords the edge of progress tourafter the 2016 oakland ghostship warehouse fire gray area raised approximately 13 million from over 12000 donors which it distributed to 390 applicants ranging from deceased victims next of kin displaced residents people injured in the fire as well as people who would not be acknowledged by traditional disaster relief organizations including chosen family within marginalized communities'</li><li>'nfts being used in the filmindustry include a collection of nftartworks for godzilla vs kong the release of both kevin smiths horrormovie killroy was here and the 2021 film zero contact as nfts in 2021 in april 2021 an nft was released for the score of the movie triumph composed by gregg leonard in november 2021 film director quentin tarantino released seven nfts based on uncut scenes of pulp fiction miramax subsequently filed a lawsuit claiming that their film rights were violated and that the original 1993 contract with tarantino gave them the right to mint nfts in relation to pulp fiction in august 2022 muse released album will of the people as 1000 nfts and it became the first album for which nft sales would qualify for the uk and australian chartsby february 2021 nfts accounted for us25 million of revenue generated through the sale of artwork and songs as nfts on february 28 2021 electronic dance musician 3lau sold a collection of 33 nfts for a total of us117 million to commemorate the threeyear anniversary of his ultraviolet album on march 3 2021 an nft was made to promote the kings of leon album when you see yourself other musicians who have used nfts include american rapper lil pump grimes visual artist shepard fairey in collaboration with record producer mike dean and rapper eminema paper presented at the 40th international conference on information systems in munich in 2019 suggested using nfts as tickets for different types of events this would enable organizers of the respective events or artists performing there to receive royalties on the resale of each ticket other associated files a number of internet memes have been associated with nfts which were minted and sold by their creators or by their subjects examples include doge an image of a shiba inu dog as well as charlie bit my finger nyan cat and disaster girl some virtual worlds often marketed as metaverses have incorporated nfts as a means of trading virtual items and virtual real estate some pornographic works have been sold as nfts though hostility from nft marketplaces towards pornographic material has presented significant drawbacks for creators by using nfts people engaged in this area of the entertainmentindustry are able to publish their works without thirdparty platforms being able to delete them the first credited political protest nft destruction of nazi monument symbolizing contemporary lithuania was a video filmed by professor stanislovas tomas on april 8 2019 and minted on march 29 2021 in the video tomas uses a sledgehammer to destroy a statesponsored'</li></ul> | | 7 | <ul><li>'lot of solutions available for people with hearing impairments some examples of solutions would be blinking lights on different things like their phones alarms and things that are important to alert them cochlear implants are an option too cochlear implants are surgically placed devices that stimulate the cochlear nerve in order to help the person hear a cochlear implant is used instead of hearing aids in order to help when someone has difficulties understanding speech in a cultural context deaf culture 
refers to a tightknit cultural group of people whose primary language is signed and who practice social and cultural norms which are distinct from those of the surrounding hearing community this community does not automatically include all those who are clinically or legally deaf nor does it exclude every hearing person according to baker and padden it includes any person who identifies himherself as a member of the deaf community and other members accept that person as a part of the community an example being children of deaf adults with normal hearing ability it includes the set of social beliefs behaviors art literary traditions history values and shared institutions of communities that are influenced by deafness and which use sign languages as the main means of communication members of the deaf community tend to view deafness as a difference in human experience rather than a disability or diseasemany nondisabled people continue to assume that deaf people have no autonomy and fail to provide people with support beyond hearing aids which is something that must be addressed different nongovernmental organizations around the world have created programs towards closing the gap between deaf and nondisabled people in developing countries the quota international organization with headquarters in the united states provided immense educational support in the philippines where it started providing free education to deaf children in the leganes resource center for the deaf the sounds seekers british organization also provided support by offering audiology maintenance technology to better assist those who are deaf in hardtoreach places the nippon foundation also supports deaf students at gallaudet university and the national technical institute for the deaf through sponsoring international scholarships programs to encourage students to become future leaders in the deaf community the more aid these organizations give to the deaf people the more opportunities and resources disabled people must speak up about their struggles and goals that they aim to achieve when more people understand how to leverage their privilege for the marginalized groups in the community then we can build a more inclusive and tolerant environment for the generations that are yet to come the first known record of sign language in history comes from platos cratylus written in the fifth century bce in a dialogue on the correctness of names socrates says suppose'</li><li>'the ear canal external acoustic meatus external auditory meatus eam is a pathway running from the outer ear to the middle ear the adult human ear canal extends from the pinna to the eardrum and is about 25 centimetres 1 in in length and 07 centimetres 03 in in diameter the human ear canal is divided into two parts the elastic cartilage part forms the outer third of the canal its anterior and lower wall are cartilaginous whereas its superior and back wall are fibrous the cartilage is the continuation of the cartilage framework of pinna the cartilaginous portion of the ear canal contains small hairs and specialized sweat glands called apocrine glands which produce cerumen ear wax the bony part forms the inner two thirds the bony part is much shorter in children and is only a ring annulus tympanicus in the newborn the layer of epithelium encompassing the bony portion of the ear canal is much thinner and therefore more sensitive in comparison to the cartilaginous portion size and shape of the canal vary among individuals the canal is approximately 25 centimetres 1 in 
long and 07 centimetres 028 in in diameter it has a sigmoid form and runs from behind and above downward and forward on the crosssection it is of oval shape these are important factors to consider when fitting earplugs due to its relative exposure to the outside world the ear canal is susceptible to diseases and other disorders some disorders include atresia of the ear canal cerumen impaction bone exposure caused by the wearing away of skin in the canal auditory canal osteoma bony outgrowths of the temporal bone cholesteatoma contact dermatitis of the ear canal fungal infection otomycosis ear mites in animals ear myiasis an extremely rare infestation of maggots foreign body in ear granuloma a scar usually caused by tympanostomy tubes otitis externa swimmers ear bacteriacaused inflammation of the ear canal stenosis a gradual closing of the canal earwax also known as cerumen is a yellowish waxy substance secreted in the ear canals it plays an important role in the human ear canal assisting in cleaning and lubrication and also provides some protection from bacteria fungi and insects excess or impacted cerumen can press against the eardrum andor occlude the external auditory canal and impair hearing causing conductive hearing loss if left untreated cerumen impaction can also increase the risk of developing an infection within the ear canal list of specialized glands within the'</li><li>'##anometry and speech audiometry may be helpful testing is performed by an audiologist there is no proven or recommended treatment or cure for snhl management of hearing loss is usually by hearing strategies and hearing aids in cases of profound or total deafness a cochlear implant is a specialised hearing aid that may restore a functional level of hearing snhl is at least partially preventable by avoiding environmental noise ototoxic chemicals and drugs and head trauma and treating or inoculating against certain triggering diseases and conditions like meningitis since the inner ear is not directly accessible to instruments identification is by patient report of the symptoms and audiometric testing of those who present to their doctor with sensorineural hearing loss 90 report having diminished hearing 57 report having a plugged feeling in ear and 49 report having ringing in ear tinnitus about half report vestibular vertigo problemsfor a detailed exposition of symptoms useful for screening a selfassessment questionnaire was developed by the american academy of otolaryngology called the hearing handicap inventory for adults hhia it is a 25question survey of subjective symptoms sensorineural hearing loss may be genetic or acquired ie as a consequence of disease noise trauma etc people may have a hearing loss from birth congenital or the hearing loss may come on later many cases are related to old age agerelated hearing loss can be inherited more than 40 genes have been implicated in the cause of deafness there are 300 syndromes with related hearing loss and each syndrome may have causative genesrecessive dominant xlinked or mitochondrial genetic mutations can affect the structure or metabolism of the inner ear some may be single point mutations whereas others are due to chromosomal abnormalities some genetic causes give rise to a late onset hearing loss mitochondrial mutations can cause snhl ie m1555ag which makes the individual sensitive to the ototoxic effects of aminoglycoside antibiotics the most common cause of recessive genetic congenital hearing impairment in developed countries is dfnb1 also known as 
connexin 26 deafness or gjb2related deafness the most common syndromic forms of hearing impairment include dominant stickler syndrome and waardenburg syndrome and recessive pendred syndrome and usher syndrome mitochondrial mutations causing deafness are rare mttl1 mutations cause midd maternally inherited deafness and diabetes and other conditions which may include deafness as part of the picture tmprss3 gene was identified by its association with both congenital and childhood onset autosomal recessive deafness this gene is expressed in fetal co'</li></ul> | | 23 | <ul><li>'tolerogenic dendritic cells a k a toldcs tdcs or dcregs are heterogenous pool of dendritic cells with immunosuppressive properties priming immune system into tolerogenic state against various antigens these tolerogenic effects are mostly mediated through regulation of t cells such as inducing t cell anergy t cell apoptosis and induction of tregs toldcs also affect local microenvironment toward tolerogenic state by producing antiinflammatory cytokines toldcs are not lineage specific and their immunesuppressive functions is due to their state of activation andor differentiation generally properties of all types of dendritic cells can be highly affected by local microenvironment such as presence of pro or antiinflammatory cytokines therefore tolerogenic properties of toldcs are often context dependant and can be even eventually overridden into proinflammatory phenotypetolerogenic dcs present a potential strategy for treatment of autoimmune diseases allergic diseases and transplant rejections moreover agspecific tolerance in humans can be induced in vivo via vaccination with agpulsed ex vivo generated tolerogenic dcs for that reason tolerogenic dcs are an important promising therapeutic tool dendritic cells dcs were first discovered and described in 1973 by ralph m steinman they represent a bridge between innate and adaptive immunity and play a key role in the regulation of initiation of immune responses dcs populate almost all body surfaces and they do not kill the pathogens directly they utilize and subsequently degrade antigens to peptides by their proteolytic activity after that they present these peptides in complexes together with their mhc molecules on their cell surface dcs are also the only cell type which can activate naive t cells and induce antigenspecific immune responsestherefore their role is crucially important in balance between tolerance and immune response tolerogenic dcs are essential in maintenance of central and peripheral tolerance through induction of t cell clonal deletion t cell anergy and generation and activation of regulatory t treg cells for that reason tolerogenic dcs are possible candidates for specific cellular therapy for treatment of allergic diseases autoimmune diseases eg type 1 diabetes multiple sclerosis rheumatoid arthritis or transplant rejectionstolerogenic dcs often display an immature or semimature phenotype with characteristically low expression of costimulatory eg cd80 cd86 and mhc molecules'</li><li>'distribution of il2 receptors cd25 cd122 cd132 on different cell populations resulting in different cells that are activated by high and low dose il2 in general high doses are immune suppressive while low doses can stimulate type 1 immunity lowdose il2 has been reported to reduce hepatitis c and b infectionil2 has been used in clinical trials for the treatment of chronic viral infections and as a booster adjuvant for vaccines the use of large doses of il2 given every 6 – 8 weeks in 
hiv therapy similar to its use in cancer therapy was found to be ineffective in preventing progression to an aids diagnosis in two large clinical trials published in 2009more recently low dose il2 has shown early success in modulating the immune system in disease like type 1 diabetes and vasculitis there are also promising studies looking to use low dose il2 in ischaemic heart disease il2 cannot accomplish its role as a promising immunotherapeutic agent due to significant drawbacks which are listed above some of the issues can be overcome using il2 ic they are composed of il2 and some of its monoclonal antibody mab and can potentiate biologic activity of il2 in vivo the main mechanism of this phenomenon in vivo is due to the prolongation of the cytokine halflife in circulation depending on the clone of il2 mab il2 ic can selectively stimulate either cd25high il2jes61 complexes or cd122high cells il2s4b6 il2s4b6 immune complexes have high stimulatory activity for nk cells and memory cd8 t cells and they could thus replace the conventional il2 in cancer immunotherapy on the other hand il2jes61 highly selectively stimulate regulatory t cells and they could be potentially useful for transplantations and in treatment of autoimmune diseases according to an immunology textbook il2 is particularly important historically as it is the first type i cytokine that was cloned the first type i cytokine for which a receptor component was cloned and was the first shortchain type i cytokine whose receptor structure was solved many general principles have been derived from studies of this cytokine including its being the first cytokine demonstrated to act in a growth factor – like fashion through specific highaffinity receptors analogous to the growth factors being studied by endocrinologists and biochemists 712 in the mid1960s studies reported activities in leukocyteconditioned media'</li><li>'the immune system during puberty and postpuberty than during the rest of a males adult life physical changes during puberty such as thymic involution also affect immunological response ecoimmunology or ecological immunology explores the relationship between the immune system of an organism and its social biotic and abiotic environment more recent ecoimmunological research has focused on host pathogen defences traditionally considered nonimmunological such as pathogen avoidance selfmedication symbiontmediated defenses and fecundity tradeoffs behavioural immunity a phrase coined by mark schaller specifically refers to psychological pathogen avoidance drivers such as disgust aroused by stimuli encountered around pathogeninfected individuals such as the smell of vomit more broadly behavioural ecological immunity has been demonstrated in multiple species for example the monarch butterfly often lays its eggs on certain toxic milkweed species when infected with parasites these toxins reduce parasite growth in the offspring of the infected monarch however when uninfected monarch butterflies are forced to feed only on these toxic plants they suffer a fitness cost as reduced lifespan relative to other uninfected monarch butterflies this indicates that laying eggs on toxic plants is a costly behaviour in monarchs which has probably evolved to reduce the severity of parasite infectionsymbiontmediated defenses are also heritable across host generations despite a nongenetic direct basis for the transmission aphids for example rely on several different symbionts for defense from key parasites and can vertically transmit their 
symbionts from parent to offspring therefore a symbiont that successfully confers protection from a parasite is more likely to be passed to the host offspring allowing coevolution with parasites attacking the host in a way similar to traditional immunity the preserved immune tissues of extinct species such as the thylacine thylacine cynocephalus can also provide insights into their biology the study of the interaction of the immune system with cancer cells can lead to diagnostic tests and therapies with which to find and fight cancer the immunology concerned with physiological reaction characteristic of the immune state this area of the immunology is devoted to the study of immunological aspects of the reproductive process including fetus acceptance the term has also been used by fertility clinics to address fertility problems recurrent miscarriages premature deliveries and dangerous complications such as preeclampsia list of immunologists immunomics international reviews of immunology outline of immunology history of immunology osteoimmunology'</li></ul> | | 25 | <ul><li>'then convergence to i − a − 1 b displaystyle ia1b occurs if the magnitudes of all eigenvalues of a displaystyle a are less than 1 every bounded sequence in r n displaystyle mathbb r n has a convergent subsequence by the bolzano – weierstrass theorem if these all have the same limit then the original sequence converges to that limit if it can be shown that all of the subsequences of f displaystyle f have the same limit such as by showing that there is a unique fixed point of the transformation t displaystyle t then the initial sequence must also converge to that limit every bounded monotonic sequence in r n displaystyle mathbb r n converges to a limit this approach can also be applied to sequences that are not monotonic instead it is possible to define a function v r n → r displaystyle vmathbb r nrightarrow mathbb r such that v f n displaystyle vfn is monotonic in n displaystyle n if the v displaystyle v satisfies the conditions to be a lyapunov function then f displaystyle f is convergent lyapunovs theorem is normally stated for ordinary differential equations but can also be applied to sequences of iterates by replacing derivatives with discrete differences the basic requirements on v displaystyle v are that v f n 1 − v f n 0 displaystyle vfn1vfn0 for f n = 0 displaystyle fnneq 0 and v 0 0 displaystyle v00 or v [UNK] x 0 displaystyle dot vx0 for x = 0 displaystyle xneq 0 v x 0 displaystyle vx0 for all x = 0 displaystyle xneq 0 and v 0 0 displaystyle v00 v displaystyle v be radially unbounded so that v x displaystyle vx goes to infinity for any sequence with ‖ x ‖ displaystyle x that tends to infinityin many cases a lyapunov function of the form v x x t a x displaystyle vxxtax can be found although more complex forms are also used for delay differential equations a similar approach applies with lyapunov functions replaced by lyapunov functionals also called lyapunovkrasovskii functionals if the inequality in the condition 1 is weak lasalles invariance principle may be used to consider the convergence of sequences of functions it is necessary to define a distance between functions to replace the euclidean norm these often include convergence in the'</li><li>'this is a list of convexity topics by wikipedia page alpha blending the process of combining a translucent foreground color with a background color thereby producing a new blended color this is a convex combination of two colors allowing for transparency effects in 
computer graphics barycentric coordinates a coordinate system in which the location of a point of a simplex a triangle tetrahedron etc is specified as the center of mass or barycenter of masses placed at its vertices the coordinates are nonnegative for points in the convex hull borsuks conjecture a conjecture about the number of pieces required to cover a body with a larger diameter solved by hadwiger for the case of smooth convex bodies bond convexity a measure of the nonlinear relationship between price and yield duration of a bond to changes in interest rates the second derivative of the price of the bond with respect to interest rates a basic form of convexity in finance caratheodorys theorem convex hull if a point x of rd lies in the convex hull of a set p there is a subset of p with d1 or fewer points such that x lies in its convex hull choquet theory an area of functional analysis and convex analysis concerned with measures with support on the extreme points of a convex set c roughly speaking all vectors of c should appear as averages of extreme points complex convexity — extends the notion of convexity to complex numbers convex analysis the branch of mathematics devoted to the study of properties of convex functions and convex sets often with applications in convex minimization convex combination a linear combination of points where all coefficients are nonnegative and sum to 1 all convex combinations are within the convex hull of the given points convex and concave a print by escher in which many of the structures features can be seen as both convex shapes and concave impressions convex body a compact convex set in a euclidean space whose interior is nonempty convex conjugate a dual of a real functional in a vector space can be interpreted as an encoding of the convex hull of the functions epigraph in terms of its supporting hyperplanes convex curve a plane curve that lies entirely on one side of each of its supporting lines the interior of a closed convex curve is a convex set convex function a function in which the line segment between any two points on the graph of the function lies above the graph closed convex function a convex function all of whose sublevel sets are closed sets proper convex function a convex function whose effective domain is nonempty and it never attains minus infinity concave function the negative of a convex function convex geometry the branch of geometry studying'</li><li>'##regularization is useful as it can often be used in a way such that the various symmetries of the physical system are preserved zetafunction regularization is used in conformal field theory renormalization and in fixing the critical spacetime dimension of string theory zeta function regularization is equivalent to dimensional regularization see4 however the main advantage of the zeta regularization is that it can be used whenever the dimensional regularization fails for example if there are matrices or tensors inside the calculations [UNK] i j k displaystyle epsilon ijk zetafunction regularization gives an analytic structure to any sums over an arithmetic function fn such sums are known as dirichlet series the regularized form f s [UNK] n 1 ∞ f n n − s displaystyle tilde fssum n1infty fnns converts divergences of the sum into simple poles on the complex splane in numerical calculations the zetafunction regularization is inappropriate as it is extremely slow to converge for numerical purposes a more rapidly converging sum is the exponential regularization given by f t [UNK] n 1 ∞ f n 
e − t n displaystyle ftsum n1infty fnetn this is sometimes called the ztransform of f where z exp−t the analytic structure of the exponential and zetaregularizations are related by expanding the exponential sum as a laurent series f t a n t n a n − 1 t n − 1 [UNK] displaystyle ftfrac antnfrac an1tn1cdots one finds that the zetaseries has the structure f s a n s − n [UNK] displaystyle tilde fsfrac ansncdots the structure of the exponential and zetaregulators are related by means of the mellin transform the one may be converted to the other by making use of the integral representation of the gamma function γ s [UNK] 0 ∞ t s − 1 e − t d t displaystyle gamma sint 0infty ts1etdt which leads to the identity γ s f s [UNK] 0 ∞ t s − 1 f t d t displaystyle gamma stilde fsint 0infty ts1ftdt relating the exponential and zetaregulators and converting poles in the splane to divergent terms in the laurent series the sum f s [UNK] n a n e − s ω n displaystyle fssum nanesomega n is sometimes called a heat kernel or a heatkernel regularized sum this name stems from the idea that the ω n'</li></ul> | | 37 | <ul><li>'##dicative adjective must also be connected by a copula some theories of syntax adopt a subjectpredicate distinction for instance a textbook phrase structure grammar typically divides an english declarative sentence s into a noun phrase np and verb phrase vp the subject np is shown in green and the predicate vp in blue languages with more flexible word order often called nonconfigurational languages are often also treated differently in phrase structure approaches on the other hand dependency grammar rejects the binary subjectpredicate division and places the finite verb as the root of the sentence the matrix predicate is marked in blue and its two arguments are in green while the predicate cannot be construed as a constituent in the formal sense it is a catena barring a discontinuity predicates and their arguments are always catenae in dependency structures some theories of grammar accept both a binary division of sentences into subject and predicate while also giving the head of the predicate a special status in such contexts the term predicator is used to refer to that head there are cases in which the semantic predicand has a syntactic function other than subject this happens in raising constructions such as the following here you is the object of the make verb phrase the head of the main clause but it is also the predicand of the subordinate think clause which has no subject 329 – 335 the term predicate is also used to refer to properties and to words or phrases which denote them this usage of the term comes from the concept of a predicate in logic in logic predicates are symbols which are interpreted as relations or functions over arguments in semantics the denotations of some linguistic expressions are analyzed along similar lines expressions which denote predicates in the semantic sense are sometimes themselves referred to as predication the seminal work of greg carlson distinguishes between types of predicates based on carlsons work predicates have been divided into the following subclasses which roughly pertain to how a predicate relates to its subject stagelevel predicates a stagelevel predicate is true of a temporal stage of its subject for example if john is hungry then he typically will eat some food his state of being hungry therefore lasts a certain amount of time and not his entire lifespan stagelevel predicates can occur in a wide range of grammatical constructions and are 
probably the most versatile kind of predicate individuallevel predicates an individuallevel predicate is true throughout the existence of an individual for example if john is smart this is a property that he has regardless of which particular point'</li><li>'that there can be exactly the same relation between two completely different objects greek philosophers such as plato and aristotle used a wider notion of analogy they saw analogy as a shared abstraction analogous objects did not share necessarily a relation but also an idea a pattern a regularity an attribute an effect or a philosophy these authors also accepted that comparisons metaphors and images allegories could be used as arguments and sometimes they called them analogies analogies should also make those abstractions easier to understand and give confidence to those who use them james francis ross in portraying analogy 1982 the first substantive examination of the topic since cajetans de nominum analogia demonstrated that analogy is a systematic and universal feature of natural languages with identifiable and lawlike characteristics which explain how the meanings of words in a sentence are interdependent on the contrary ibn taymiyya francis bacon and later john stuart mill argued that analogy is simply a special case of induction in their view analogy is an inductive inference from common known attributes to another probable common attribute which is known about only in the source of the analogy in the following form premises a is c d e f g b is c d e f conclusion b is probably g contemporary cognitive scientists use a wide notion of analogy extensionally close to that of plato and aristotle but framed by gentners 1983 structure mapping theory the same idea of mapping between source and target is used by conceptual metaphor and conceptual blending theorists structure mapping theory concerns both psychology and computer science according to this view analogy depends on the mapping or alignment of the elements of source and target the mapping takes place not only between objects but also between relations of objects and between relations of relations the whole mapping yields the assignment of a predicate or a relation to the target structure mapping theory has been applied and has found considerable confirmation in psychology it has had reasonable success in computer science and artificial intelligence see below some studies extended the approach to specific subjects such as metaphor and similarity logicians analyze how analogical reasoning is used in arguments from analogy an analogy can be stated using is to and as when representing the analogous relationship between two pairs of expressions for example smile is to mouth as wink is to eye in the field of mathematics and logic this can be formalized with colon notation to represent the relationships using single colon for ratio and double colon for equalityin the field of testing the colon notation of ratios and equality is often borrowed so that the example above might be rendered smile mouth wink eye and pronounced the same way an analogy can be the linguistic process that reduces word forms thought to break rules to more common forms that follow these rules for example'</li><li>'this approach can be used to cover a wide variety of semantic phenomena a lambek grammar is an elaboration of this idea that has a concatenation operator for types and several other inference rules mati pentus has shown that these still have the generative capacity of contextfree grammars for the lambek 
calculus there is a type concatenation operator [UNK] displaystyle star so that prim ⊆ tp prim displaystyle textprimsubseteq texttptextprim and if x y ∈ tp prim displaystyle xyin texttptextprim then x y x [UNK] y x [UNK] y ∈ tp prim displaystyle xyxbackslash yxstar yin texttptextprim the lambek calculus consists of several deduction rules which specify how type inclusion assertions can be derived in the following rules upper case roman letters stand for types upper case greek letters stand for sequences of types a sequent of the form x ← γ displaystyle xleftarrow gamma can be read a string is of type x if it consists of the concatenation of strings of each of the types in γ if a type is interpreted as a set of strings then the ← may be interpreted as [UNK] that is includes as a subset a horizontal line means that the inclusion above the line implies the one below the line the process is begun by the axiom rule which has no antecedents and just says that any type includes itself axiom x ← x displaystyle textaxiomquad over xleftarrow x the cut rule says that inclusions can be composed cut z ← δ x δ ′ x ← γ z ← δ γ δ ′ displaystyle textcutquad zleftarrow delta xdelta qquad xleftarrow gamma over zleftarrow delta gamma delta the other rules come in pairs one pair for each type construction operator each pair consisting of one rule for the operator in the target one in the source of the arrow the name of a rule consists of the operator and an arrow with the operator on the side of the arrow on which it occurs in the conclusion for an example here is a derivation of type raising which says that b a [UNK] b ← a displaystyle babackslash bleftarrow a the names of rules and the substitutions used are to the right b ← b a ← a b ← b a a b a [UNK] b ← a axioms ← z y b x a γ a δ δ ′ [UNK] ← y b x b a γ a displaystyle dfra'</li></ul> | | 30 | <ul><li>'on february 5 2005 for its operations of a vermiculite mine in libby montana the indictment accused grace of wire fraud knowing endangerment of residents by concealing air monitoring results obstruction of justice by interfering with an environmental protection agency epa investigation violation of the clean air act providing asbestos materials to schools and local residents and conspiracy to release asbestos and cover up health problems from asbestos contamination the department of justice said 1200 residents had developed asbestosrelated diseases and some had died and there could be many more injuries and deathson june 8 2006 a federal judge dismissed the conspiracy charge of knowing endangerment because some of the defendant officials had left the company before the fiveyear statute of limitations had begun to run the wire fraud charge was dropped by prosecutors in march other prosecutions on april 2 1998 three men were indicted in a conspiracy to use homeless men for illegal asbestos removal from an aging wisconsin manufacturing plant thenus attorney general janet reno said knowingly removing asbestos improperly is criminal exploiting the homeless to do this work is cruelon december 12 2004 owners of new york asbestos abatement companies were sentenced to the longest federal jail sentences for environmental crimes in us history after they were convicted on 18 counts of conspiracy to violate the clean air act and the toxic substances control act and actual violations of the clean air act and racketeerinfluenced and corrupt organizations act the crimes involved a 10year scheme to illegally remove asbestos the rico counts included obstruction of justice 
money laundering mail fraud and bid rigging all related to the asbestos cleanupon january 11 2006 san diego gas electric co two of its employees and a contractor were indicted by a federal grand jury on charges that they violated safety standards while removing asbestos from pipes in lemon grove california the defendants were charged with five counts of conspiracy violating asbestos work practice standards and making false statements'</li><li>'is standard in medicalbilling terminology especially when billing for a growth whose pathology has yet to be determined epidemiology of cancer list of biological development disorders pleomorphism somatic evolution in cancer'</li><li>'atm these epigenetic defects occurred in various cancers including breast ovarian colorectal and head and neck cancers two or three deficiencies in expression of ercc1 xpf or pms2 occur simultaneously in the majority of the 49 colon cancers evaluated by facista et al epigenetic alterations causing reduced expression of dna repair genes is shown in a central box at the third level from the top of the figure in this section and the consequent dna repair deficiency is shown at the fourth level when expression of dna repair genes is reduced dna damages accumulate in cells at a higher than normal level and these excess damages cause increased frequencies of mutation or epimutation mutation rates strongly increase in cells defective in dna mismatch repair or in homologous recombinational repair hrrduring repair of dna double strand breaks or repair of other dna damages incompletely cleared sites of repair can cause epigenetic gene silencing dna repair deficiencies level 4 in the figure cause increased dna damages level 5 in the figure which result in increased somatic mutations and epigenetic alterations level 6 in the figure field defects normalappearing tissue with multiple alterations and discussed in the section below are common precursors to development of the disordered and improperly proliferating clone of tissue in a malignant neoplasm such field defects second level from bottom of figure may have multiple mutations and epigenetic alterations once a cancer is formed it usually has genome instability this instability is likely due to reduced dna repair or excessive dna damage because of such instability the cancer continues to evolve and to produce sub clones for example a renal cancer sampled in 9 areas had 40 ubiquitous mutations demonstrating tumor heterogeneity ie present in all areas of the cancer 59 mutations shared by some but not all areas and 29 private mutations only present in one of the areas of the cancer various other terms have been used to describe this phenomenon including field effect field cancerization and field carcinogenesis the term field cancerization was first used in 1953 to describe an area or field of epithelium that has been preconditioned by at that time largely unknown processes so as to predispose it towards development of cancer since then the terms field cancerization and field defect have been used to describe premalignant tissue in which new cancers are likely to arisefield defects are important in progression to cancer however in most cancer research as pointed out by rubin the vast majority of studies in cancer research has been done on welldefined tumors in vivo or on discrete neoplastic foci in vitro'</li></ul> | | 2 | <ul><li>'in algebra a resolvent cubic is one of several distinct although related cubic polynomials defined from a monic polynomial of degree four p x x 4 a 3 x 3 
a 2 x 2 a 1 x a 0 displaystyle pxx4a3x3a2x2a1xa0 in each case the coefficients of the resolvent cubic can be obtained from the coefficients of px using only sums subtractions and multiplications knowing the roots of the resolvent cubic of px is useful for finding the roots of px itself hence the name “ resolvent cubic ” the polynomial px has a multiple root if and only if its resolvent cubic has a multiple root suppose that the coefficients of px belong to a field k whose characteristic is different from 2 in other words we are working in a field in which 1 1 = 0 whenever roots of px are mentioned they belong to some extension k of k such that px factors into linear factors in kx if k is the field q of rational numbers then k can be the field c of complex numbers or the field q of algebraic numbers in some cases the concept of resolvent cubic is defined only when px is a quartic in depressed form — that is when a3 0 note that the fourth and fifth definitions below also make sense and that the relationship between these resolvent cubics and px are still valid if the characteristic of k is equal to 2 suppose that px is a depressed quartic — that is that a3 0 a possible definition of the resolvent cubic of px is r 1 y 8 y 3 8 a 2 y 2 2 a 2 2 − 8 a 0 y − a 1 2 displaystyle r1y8y38a2y22a228a0ya12 the origin of this definition lies in applying ferraris method to find the roots of px to be more precise p x 0 [UNK] x 4 a 2 x 2 − a 1 x − a 0 [UNK] x 2 a 2 2 2 − a 1 x − a 0 a 2 2 4 displaystyle beginalignedpx0longleftrightarrow x4a2x2a1xa0longleftrightarrow leftx2frac a22right2a1xa0frac a224endaligned add a new unknown y to x2 a22 now you have x 2 a 2 2 y 2 − a 1 x − a 0 a 2 2 4 2 x 2 y a 2 y y 2 2 y x 2 − a 1 x − a'</li><li>'in particular in characteristic zero all complex solutions are sought searching for the real or rational solutions are much more difficult problems that are not considered in this article the set of solutions is not always finite for example the solutions of the system x x − 1 0 x y − 1 0 displaystyle beginalignedxx10xy10endaligned are a point xy 11 and a line x 0 even when the solution set is finite there is in general no closedform expression of the solutions in the case of a single equation this is abel – ruffini theorem the barth surface shown in the figure is the geometric representation of the solutions of a polynomial system reduced to a single equation of degree 6 in 3 variables some of its numerous singular points are visible on the image they are the solutions of a system of 4 equations of degree 5 in 3 variables such an overdetermined system has no solution in general that is if the coefficients are not specific if it has a finite number of solutions this number is at most 53 125 by bezouts theorem however it has been shown that for the case of the singular points of a surface of degree 6 the maximum number of solutions is 65 and is reached by the barth surface a system is overdetermined if the number of equations is higher than the number of variables a system is inconsistent if it has no complex solution or if the coefficients are not complex numbers no solution in an algebraically closed field containing the coefficients by hilberts nullstellensatz this means that 1 is a linear combination with polynomials as coefficients of the first members of the equations most but not all overdetermined systems when constructed with random coefficients are inconsistent for example the system x3 – 1 0 x2 – 1 0 is overdetermined having two equations but only one unknown but it 
is not inconsistent since it has the solution x 1 a system is underdetermined if the number of equations is lower than the number of the variables an underdetermined system is either inconsistent or has infinitely many complex solutions or solutions in an algebraically closed field that contains the coefficients of the equations this is a nontrivial result of commutative algebra that involves in particular hilberts nullstellensatz and krulls principal ideal theorem a system is zerodimensional if it has a finite number of complex solutions or solutions in an algebraically closed field this terminology comes from the fact that the algebraic variety of the solutions has dimension zero a system with infinitely many solutions is said to be positivedimensional a zerodimensional system with as'</li><li>'##gu endif endwhile return factors the correctness of this algorithm relies on the fact that the ring fqxf is a direct product of the fields fqxfi where fi runs on the irreducible factors of f as all these fields have qd elements the component of g in any of these fields is zero with probability q d − 1 2 q d [UNK] 1 2 displaystyle frac qd12qdsim tfrac 12 this implies that the polynomial gcdg u is the product of the factors of g for which the component of g is zero it has been shown that the average number of iterations of the while loop of the algorithm is less than 25 log 2 r displaystyle 25log 2r giving an average number of arithmetic operations in fq which is o d n 2 log r log q displaystyle odn2logrlogq in the typical case where dlogq n this complexity may be reduced to o n 2 log r log q n displaystyle on2logrlogqn by choosing h in the kernel of the linear map v → v q − v mod f displaystyle vto vqvpmod f and replacing the instruction g h q d − 1 2 − 1 mod f displaystyle ghfrac qd121pmod f by g h q − 1 2 − 1 mod f displaystyle ghfrac q121pmod f the proof of validity is the same as above replacing the direct product of the fields fqxfi by the direct product of their subfields with q elements the complexity is decomposed in o n 2 log r log q displaystyle on2logrlogq for the algorithm itself o n 2 log q n displaystyle on2logqn for the computation of the matrix of the linear map which may be already computed in the squarefree factorization and on3 for computing its kernel it may be noted that this algorithm works also if the factors have not the same degree in this case the number r of factors needed for stopping the while loop is found as the dimension of the kernel nevertheless the complexity is slightly better if squarefree factorization is done before using this algorithm as n may decrease with squarefree factorization this reduces the complexity of the critical steps victor shoups algorithm like the algorithms of the preceding section victor shoups algorithm is an equaldegree factorization algorithm unlike them it is a deterministic algorithm however it is less efficient in practice than the algorithms of preceding section for shoups algorithm the input is restricted'</li></ul> | | 0 | <ul><li>'occupational noise is the amount of acoustic energy received by an employees auditory system when they are working in the industry occupational noise or industrial noise is often a term used in occupational safety and health as sustained exposure can cause permanent hearing damage occupational noise is considered an occupational hazard traditionally linked to loud industries such as shipbuilding mining railroad work welding and construction but can be present in any workplace where hazardous noise is 
present in the us the national institute for occupational safety and health niosh and the occupational safety and health administration osha work together to provide standards and regulations for noise in the workplacenational institute for occupational safety and health niosh occupational safety and health administration osha mine safety and health administration msha federal railroad administration fra have all set standards on hazardous occupational noise in their respective industries each industry is different as workers tasks and equipment differ but most regulations agree that noise becomes hazardous when it exceeds 85 decibels for an 8hour time exposure typical work shift this relationship between allotted noise level and exposure time is known as an exposure action value eav or permissible exposure limit pel the eav or pel can be seen as equations which manipulate the allotted exposure time according to the intensity of the industrial noise this equation works as an inverse exponential relationship as the industrial noise intensity increases the allotted exposure time to still remain safe decreases thus a worker exposed to a noise level of 100 decibels for 15 minutes would be at the same risk level as a worker exposed to 85 decibels for 8 hours using this mathematical relationship an employer can calculate whether or not their employees are being overexposed to noise when it is suspected that an employee will reach or exceed the pel a monitoring program for that employee should be implemented by the employerthe above calculations of pel and eav are based on measurements taken to determine the intensity of that particular industrial noise aweighted measurements are commonly used to determine noise levels that can cause harm to the human ear there are also special exposure meters available that integrate noise over a period of time to give an leq value equivalent sound pressure level defined by standards these numerical values do not fully reflect the real situation for example the osha standard sets the action level 85 dba and the pel 90 dba but in practice the compliance safety and health officer must record the excess of these values with a margin in order to take into account the potential measurement error and instead of pel 90 dba it turns out 92 dba and instead of al 85 dba its 87 dba occupational noise if experienced repeatedly at high intensity for an extended period of time can cause noiseinduce'</li><li>'the lowest frequency which can be localized depends on the ear distance animals with a greater ear distance can localize lower frequencies than humans can for animals with a smaller ear distance the lowest localizable frequency is higher than for humans if the ears are located at the side of the head interaural level differences appear for higher frequencies and can be evaluated for localization tasks for animals with ears at the top of the head no shadowing by the head will appear and therefore there will be much less interaural level differences which could be evaluated many of these animals can move their ears and these ear movements can be used as a lateral localization cue for many mammals there are also pronounced structures in the pinna near the entry of the ear canal as a consequence directiondependent resonances can appear which could be used as an additional localization cue similar to the localization in the median plane in the human auditory system there are additional localization cues which are also used by animals for sound localization in the median plane 
elevation of the sound also two detectors can be used which are positioned at different heights in animals however rough elevation information is gained simply by tilting the head provided that the sound lasts long enough to complete the movement this explains the innate behavior of cocking the head to one side when trying to localize a sound precisely to get instantaneous localization in more than two dimensions from timedifference or amplitudedifference cues requires more than two detectors the tiny parasitic fly ormia ochracea has become a model organism in sound localization experiments because of its unique ear the animal is too small for the time difference of sound arriving at the two ears to be calculated in the usual way yet it can determine the direction of sound sources with exquisite precision the tympanic membranes of opposite ears are directly connected mechanically allowing resolution of submicrosecond time differences and requiring a new neural coding strategy ho showed that the coupledeardrum system in frogs can produce increased interaural vibration disparities when only small arrival time and sound level differences were available to the animals head efforts to build directional microphones based on the coupledeardrum structure are underway most owls are nocturnal or crepuscular birds of prey because they hunt at night they must rely on nonvisual senses experiments by roger payne have shown that owls are sensitive to the sounds made by their prey not the heat or the smell in fact the sound cues are both necessary and sufficient for localization of mice from a distant location where they are perched for this to work the owls must be able to accurately localize both'</li><li>'##benmelodie in rock music from the late 1960s to the 2000s the timbre of specific sounds is important to a song for example in heavy metal music the sonic impact of the heavily amplified heavily distorted power chord played on electric guitar through very loud guitar amplifiers and rows of speaker cabinets is an essential part of the styles musical identity often listeners can identify an instrument even at different pitches and loudness in different environments and with different players in the case of the clarinet acoustic analysis shows waveforms irregular enough to suggest three instruments rather than one david luce suggests that this implies that certain strong regularities in the acoustic waveform of the above instruments must exist which are invariant with respect to the above variables however robert erickson argues that there are few regularities and they do not explain our powers of recognition and identification he suggests borrowing the concept of subjective constancy from studies of vision and visual perceptionpsychoacoustic experiments from the 1960s onwards tried to elucidate the nature of timbre one method involves playing pairs of sounds to listeners then using a multidimensional scaling algorithm to aggregate their dissimilarity judgments into a timbre space the most consistent outcomes from such experiments are that brightness or spectral energy distribution and the bite or rate and synchronicity and rise time of the attack are important factors the concept of tristimulus originates in the world of color describing the way three primary colors can be mixed together to create a given color by analogy the musical tristimulus measures the mixture of harmonics in a given sound grouped into three sections it is basically a proposal of reducing a huge number of sound partials that can 
amount to dozens or hundreds in some cases down to only three values the first tristimulus measures the relative weight of the first harmonic the second tristimulus measures the relative weight of the second third and fourth harmonics taken together and the third tristimulus measures the relative weight of all the remaining harmonics t 1 a 1 [UNK] h 1 h a h t 2 a 2 a 3 a 4 [UNK] h 1 h a h t 3 [UNK] h 5 h a h [UNK] h 1 h a h displaystyle t1frac a1sum h1hahqquad t2frac a2a3a4sum h1hahqquad t3frac sum h5hahsum h1hah however more evidence studies and applications would be needed regarding this type of representation in order to validate it the term brightness is also used in discussions of sound timbres in a rough analogy'</li></ul> | | 39 | <ul><li>'waste heat is heat that is produced by a machine or other process that uses energy as a byproduct of doing work all such processes give off some waste heat as a fundamental result of the laws of thermodynamics waste heat has lower utility or in thermodynamics lexicon a lower exergy or higher entropy than the original energy source sources of waste heat include all manner of human activities natural systems and all organisms for example incandescent light bulbs get hot a refrigerator warms the room air a building gets hot during peak hours an internal combustion engine generates hightemperature exhaust gases and electronic components get warm when in operation instead of being wasted by release into the ambient environment sometimes waste heat or cold can be used by another process such as using hot engine coolant to heat a vehicle or a portion of heat that would otherwise be wasted can be reused in the same process if makeup heat is added to the system as with heat recovery ventilation in a building thermal energy storage which includes technologies both for short and longterm retention of heat or cold can create or improve the utility of waste heat or cold one example is waste heat from air conditioning machinery stored in a buffer tank to aid in night time heating another is seasonal thermal energy storage stes at a foundry in sweden the heat is stored in the bedrock surrounding a cluster of heat exchanger equipped boreholes and is used for space heating in an adjacent factory as needed even months later an example of using stes to use natural waste heat is the drake landing solar community in alberta canada which by using a cluster of boreholes in bedrock for interseasonal heat storage obtains 97 percent of its yearround heat from solar thermal collectors on the garage roofs another stes application is storing winter cold underground for summer air conditioningon a biological scale all organisms reject waste heat as part of their metabolic processes and will die if the ambient temperature is too high to allow this anthropogenic waste heat can contribute to the urban heat island effect the biggest point sources of waste heat originate from machines such as electrical generators or industrial processes such as steel or glass production and heat loss through building envelopes the burning of transport fuels is a major contribution to waste heat machines converting energy contained in fuels to mechanical work or electric energy produce heat as a byproduct in the majority of energy applications energy is required in multiple forms these energy forms typically include some combination of heating ventilation and air conditioning mechanical energy and electric power often these additional forms of energy are produced by a heat engine running on a 
source of hightemperat'</li><li>'boundaries at the flow extremes for a particular speed which are caused by different phenomena the steepness of the high flow part of a constant speed line is due to the effects of compressibility the position of the other end of the line is located by blade or passage flow separation there is a welldefined lowflow boundary marked on the map as a stall or surge line at which blade stall occurs due to positive incidence separation not marked as such on maps for turbochargers and gas turbine engines is a more gradually approached highflow boundary at which passages choke when the gas velocity reaches the speed of sound this boundary is identified for industrial compressors as overload choke sonic or stonewall the approach to this flow limit is indicated by the speed lines becoming more vertical other areas of the map are regions where fluctuating vane stalling may interact with blade structural modes leading to failure ie rotating stall causing metal fatigue different applications move over their particular map along different paths an example map with no operating lines is shown as a pictorial reference with the stallsurge line on the left and the steepening speed lines towards choke and overload on the right maps have similar features and general shape because they all apply to machines with spinning vanes which use similar principles for pumping a compressible fluid not all machines have stationary vanes centrifugal compressors may have either vaned or vaneless diffusers however a compressor operating as part of a gas turbine or turbocharged engine behaves differently to an industrial compressor because its flow and pressure characteristics have to match those of its driving turbine and other engine components such as power turbine or jet nozzle for a gas turbine and for a turbocharger the engine airflow which depends on engine speed and charge pressure a link between a gas turbine compressor and its engine can be shown with lines of constant engine temperature ratio ie the effect of fuellingincreased turbine temperature which raises the running line as the temperature ratio increases one manifestation of different behaviour appears in the choke region on the righthand side of a map it is a noload condition in a gas turbine turbocharger or industrial axial compressor but overload in an industrial centrifugal compressor hiereth et al shows a turbocharger compressor fullload or maximum fuelling curve runs up close to the surge line a gas turbine compressor fullload line also runs close to the surge line the industrial compressor overload is a capacity limit and requires high power levels to pass the high flow rates required excess power is available to inadvertently take the compressor beyond the overload limit to a hazardous condition'</li><li>'quantity thus it is useful to derive relationships between μ j t displaystyle mu mathrm jt and other more conveniently measured quantities as described below the first step in obtaining these results is to note that the joule – thomson coefficient involves the three variables t p and h a useful result is immediately obtained by applying the cyclic rule in terms of these three variables that rule may be written ∂ t ∂ p h ∂ h ∂ t p ∂ p ∂ h t − 1 displaystyle leftfrac partial tpartial prighthleftfrac partial hpartial trightpleftfrac partial ppartial hrightt1 each of the three partial derivatives in this expression has a specific meaning the first is μ j t displaystyle mu mathrm jt the second is the constant pressure 
heat capacity c p displaystyle cmathrm p defined by c p ∂ h ∂ t p displaystyle cmathrm p leftfrac partial hpartial trightp and the third is the inverse of the isothermal joule – thomson coefficient μ t displaystyle mu mathrm t defined by μ t ∂ h ∂ p t displaystyle mu mathrm t leftfrac partial hpartial prightt this last quantity is more easily measured than μ j t displaystyle mu mathrm jt thus the expression from the cyclic rule becomes μ j t − μ t c p displaystyle mu mathrm jt frac mu mathrm t cp this equation can be used to obtain joule – thomson coefficients from the more easily measured isothermal joule – thomson coefficient it is used in the following to obtain a mathematical expression for the joule – thomson coefficient in terms of the volumetric properties of a fluid to proceed further the starting point is the fundamental equation of thermodynamics in terms of enthalpy this is d h t d s v d p displaystyle mathrm d htmathrm d svmathrm d p now dividing through by dp while holding temperature constant yields ∂ h ∂ p t t ∂ s ∂ p t v displaystyle leftfrac partial hpartial prightttleftfrac partial spartial prighttv the partial derivative on the left is the isothermal joule – thomson coefficient μ t displaystyle mu mathrm t and the one on the right can be expressed in terms of the coefficient of thermal expansion via a maxwell relation the appropriate relation is ∂ s ∂ p t − ∂ v ∂ t p − v α displaystyle leftfrac partial spartial prighttleftfrac partial'</li></ul> | | 21 | <ul><li>'##agate this type of plant this means that the characteristics of a determined cultivar remain unalteredbulbs can reproduce vegetatively in a number of ways depending on the type of storage organ the plant has bulbs can be evergreen such as clivia agapanthus and some species and varieties of iris and hemerocallis however the majority are deciduous dying down to the storage organ for part of the year this characteristic has been taken advantage of in the commercialization of these plants at the beginning of the rest period the bulbs can be dug out of the ground and prepared for sale as if they remain dry they do not need any nutrition for weeks or monthsbulbous plants are produced on an industrial scale for two main markets cut flowers and dried bulbs the bulbs are produced to satisfy the demand for bulbs for parks gardens and as house plants in addition to providing the bulbs necessary for the production of cut flowers the international trade in cut flowers has a worldwide value of approximately 11000 million euros which gives an idea of the economic importance of this activity the netherlands has been the leader in commercial production since the start of the 16th century both for the dried bulb market and for cut flowers in fact with approximately 30000 hectares dedicated to this activity the production of bulbs in the netherlands represents 65 of global production the netherlands also produces 95 of the international market in bulbs dedicated to the production of cut flowers the united states is the second largest producer followed by france japan italy united kingdom israel brazil and spain international bulb society httpwwwbulbsocietyorgestablished in 1933 this society is an international educational and scientific organization it is a charity dedicated to the dissemination of information regarding the cultivation conservation and botany of all types of bulbous plants their website contains an excellent gallery of high quality photographs of bulbous plantsthe pacific bulb society 
httpwwwpacificbulbsocietyorgorganized in 2002 this society disseminates information and shares experiences regarding the cultivation of ornamental bulbous plants their website contains an exceptional educational resource pacific bulb society wiki with images and information regarding numerous species of bulbous plantsaustralian bulb association httpswebarchiveorgweb20090518011847httpwwwausbulbsorgindexhtmorganized in 2001 it possessed an excellent collection of photographs of bulbous plants on its website list of flower bulbs hessayon dg 1999 the bulb expert london transworld publishers mathew brian 1978 the larger bulbs london bt batsford in association with the royal horticultural society isbn 9780'</li><li>'soil conservation is the prevention of loss of the topmost layer of the soil from erosion or prevention of reduced fertility caused by over usage acidification salinization or other chemical soil contamination slashandburn and other unsustainable methods of subsistence farming are practiced in some lesser developed areas a consequence of deforestation is typically largescale erosion loss of soil nutrients and sometimes total desertification techniques for improved soil conservation include crop rotation cover crops conservation tillage and planted windbreaks affect both erosion and fertility when plants die they decay and become part of the soil code 330 defines standard methods recommended by the us natural resources conservation service farmers have practiced soil conservation for millennia in europe policies such as the common agricultural policy are targeting the application of best management practices such as reduced tillage winter cover crops plant residues and grass margins in order to better address soil conservation political and economic action is further required to solve the erosion problem a simple governance hurdle concerns how we value the land and this can be changed by cultural adaptation soil carbon is a carbon sink playing a role in climate change mitigation contour ploughing orients furrows following the contour lines of the farmed area furrows move left and right to maintain a constant altitude which reduces runoff contour plowing was practiced by the ancient phoenicians for slopes between two and ten percent contour plowing can increase crop yields from 10 to 50 percent partially as a result of greater soil retention terracing is the practice of creating nearly level areas in a hillside area the terraces form a series of steps each at a higher level than the previous terraces are protected from erosion by other soil barriers terraced farming is more common on small farms keyline design is the enhancement of contour farming where the total watershed properties are taken into account in forming the contour lines tree shrubs and groundcover are effective perimeter treatment for soil erosion prevention by impeding surface flows a special form of this perimeter or interrow treatment is the use of a grass way that both channels and dissipates runoff through surface friction impeding surface runoff and encouraging infiltration of the slowed surface water windbreaks are sufficiently dense rows of trees at the windward exposure of an agricultural field subject to wind erosion evergreen species provide yearround protection however as long as foliage is present in the seasons of bare soil surfaces the effect of deciduous trees may be adequate cover crops such as nitrogenfixing legumes white turnips radishes and other species are rotated with cash crops to blanket the soil 
yearround and act as green manure that rep'</li><li>'blackberries are also cultivated in the same way in a tropical climate temperatures are prone to soar above all normal levels in such cases foggersmisters are used to reduce the temperature this does not increase the humidity levels in the poly house as the evaporated droplets are almost immediately ventilated to open air hightech poly houses even have spaceheating systems as well as soilheating systems to purify the soil of unwanted viruses bacteria and other organisms the recent indoisrael collaboration at gharunda near karnal is an excellent example of polyhouse farming taking place in a developing country if developing countries were to develop a special incentive program solely for fruitandvegetable farmers especially in demographically large nations like india then the migration rate from rural to urban areas as well as the loss of horticultural and fruitvegetable farmers to urban areas may be reduced this brings a huge potential to improve the farming sector which is key to longterm economic stability the small polytunnels used by each farmer in each village promote the cultivation of vegetables both onseason and offseason and would actually help to moderate the market rate for fruit and vegetables in long run on a yearround basis and would help to satisfy local market needs for example in india the inability to grow tomatoes generates price spikes during the monsoon season this is seen as an ideal time to grow tomatoes in polytunnels since they provide the ideal climate for the crop in india the abhinav farmers club grows flowers and organic vegetables in polytunnels hoophouses have existed at least since the 1940s but they are much more commonly used with each passing decade and their design continues to evolve because of the wide variety of constantly changing designs in reality there is an entirely continuous spectrum from high tunnels through low tunnels to the simplest row covers although they are often thought about as discrete steps major themes of continuing development are 1 achieving the same results with lighter construction and less cost and 2 making hoophouses easily movable the advantages of mobile hoophouses include greater return on investment with the same unit of investment getting greater use per year across different crops in different months and more flexibility on crop rotation without ever having to bother to dig the soil out of a stationary house or use soil steam sterilization to cure greenhouse soil sickness a us department of agriculture program is helping farmers install polytunnels the program was announced at the us white house garden in december 2009farmers in iraq are building these in increasing number and adding drip irrigation to grow tomatoes'</li></ul> | | 18 | <ul><li>'the first postage stamps those of the united kingdom had no name in 1874 the universal postal union exempted the united kingdom from its rule which stated that a countrys name had to appear on their postage stamps so a profile of the reigning monarch was all that was required for identification of the uks stamps to this day the uk remains the only country not required to name itself on its stamps for all other upu members the name must appear in latin letters many countries using nonlatin alphabets used only those on their early stamps and they remain difficult for most collectors to identify today the name chosen is typically the countrys own name for itself with a modern trend towards using simpler and shorter forms or 
abbreviations for instance the republic of south africa inscribes with rsa while jordan originally used the hashemite kingdom of jordan and now just jordan some countries have multiple allowed forms from which the designer may choose the most suitable the name may appear in an adjectival form as in posta romana romanian post for romania dependent territories may or may not include the name of the parent country the graphic element of a stamp design falls into one of four major categories portrait bust profile or fullface emblem coat of arms flag national symbol posthorn etc numeric a design built around the numeral of value pictorialthe use of portrait busts of the ruler or other significant person or emblems was typical of the first stamps by extension from currency which was the closest model available to the early stamp designers usage pattern has varied considerably for 60 years from 1840 to 1900 all british stamps used exactly the same portrait bust of victoria enclosed in a dizzying variety of frames while spain periodically updated the image of alfonso xiii as he grew from child to adult norway has issued stamps with the same posthorn motif for over a century changing only the details from time to time as printing technology improves while the us has placed the flag of the united states into a wide variety of settings since first using it on a stamp in the 1950s while numeral designs are eminently practical in that they emphasize the most important element of the stamp they are the exception rather than the rule by far the greatest variety of stamp design seen today is in pictorial issues the choice of image is nearly unlimited ranging from plants and animals to figures from history to landscapes to original artwork images may represent realworld objects or be allegories or abstract designs the choice of pictorial designs is governed by a combination of anniversaries required annual issues such as christmas stamps postal rate changes exhaustion of existing stamp stocks and popular demand since postal administrations are either a branch'</li><li>'##ionism in both cases reflecting the influence of french impressionism which had spread internationally they are also known for their conceptual art as well as an internal split in the group which led to the formation of a new secession 1910 – 1914 key figures included walter leistikow franz skarbina max liebermann hermann struck and the norwegian painter edvard munch cologne 1909 – 1916 — also known as the sonderbund or the separate league of west german art lovers and artists the sonderbund westdeutscher kunstfreunde und kunstler was known for its landmark exhibitions introducing french impressionism postimpressionism and modernism to germany its 1912 show aimed to organize the most disputed paintings of our time and was later credited for helping develop a german version of expressionism while also presenting the most significant exhibition of european modernism prior to world war i the following year in fact it inspired a similar show in new york artists associated with the group included julius bretz max clarenbach august deusser walter ophey ernst osthaus egon schiele wilhelm schmurr alfred sohnrethel karli sohnrethel and otto sohnrethel along with collectors and curators of art dresden 1919 – 1925 — formed in reaction to the oppression of post world war i and the rise of the weimar republic otto schubert conrad felixmuller and otto dix are considered key figures in the dresden secession they are known for a highly accomplished form 
of german expressionism that was later labeled degenerate by the nazis selection was limited by availability academic art – style of painting and sculpture preraphaelite – group of english painters poets and critics founded in 1848pages displaying short descriptions of redirect targets salon des refuses art exhibition in paris first held in 1863 of works rejected by the academie des beauxarts simon hansulrich sezessionismus kunstgewerbe in literarischer und bildender kunst j b metzlersche verlagsbuchhandlung stuttgart 1976 isbn 3476002896'</li><li>'then still known as the vienna method was the monumental collection of 100 statistical charts gesellschaft und wirtschaft 1930 the first rule of isotype is that greater quantities are not represented by an enlarged pictogram but by a greater number of the samesized pictogram in neurath ’ s view variation in size does not allow accurate comparison what is to be compared – heightlength or area whereas repeated pictograms which always represent a fixed value within a certain chart can be counted if necessary isotype pictograms almost never depicted things in perspective in order to preserve this clarity and there were other guidelines for graphic configuration and use of colour the best exposition of isotype technique remains otto neurath ’ s book international picture language 1936 visual education was always the prime motive behind isotype which was worked out in exhibitions and books designed to inform ordinary citizens including schoolchildren about their place in the world it was never intended to replace verbal language it was a helping language always accompanied by verbal elements otto neurath realized that it could never be a fully developed language so instead he called it a “ languagelike technique ” as more requests came to the vienna museum from abroad a partner institute called mundaneum a name adopted from an abortive collaboration with paul otlet was established in 19312 to promote international work it formed branches containing small exhibitions in berlin the hague london and new york city members of the vienna team travelled periodically to the soviet union during the early 1930s in order to help set up the allunion institute of pictorial statistics of soviet construction and economy всесоюзныи институт изобразительнои статистики советского строительства и хозяиства commonly abbreviated to izostat изостат which produced statistical graphics about the five year plans among other things after the closure of the gesellschafts und wirtschaftsmuseum in 1934 neurath reidemeister and arntz fled to the netherlands where they set up the international foundation for visual education in the hague during the 1930s significant commissions were received from the us including a series of massproduced charts for the national tuberculosis association and otto neurath ’ s book modern man in the making 1939 a high point of isotype on which he reidemeister and arntz worked in close'</li></ul> | | 5 | <ul><li>'giant stars and white and red dwarf stars could support a timeintegrated biota up to 1046 kgyears in the galaxy and 1057 kgyears in the universesuch astroecology considerations quantify the immense potentials of future life in space with commensurate biodiversity and possibly intelligence chemical analysis of carbonaceous chondrite meteorites show that they contain extractable bioavailable water organic carbon and essential phosphate nitrate and potassium nutrients the results allow assessing the soil fertilities of the parent asteroids and 
planets and the amounts of biomass that they can sustainlaboratory experiments showed that material from the murchison meteorite when ground into a fine powder and combined with earths water and air can provide the nutrients to support a variety of organisms including bacteria nocardia asteroides algae and plant cultures such as potato and asparagus the microorganisms used organics in the carbonaceous meteorites as the carbon source algae and plant cultures grew well also on mars meteorites because of their high bioavailable phosphate contents the martian materials achieved soil fertility ratings comparable to productive agricultural soils this offers some data relating to terraforming of marsterrestrial analogues of planetary materials are also used in such experiments for comparison and to test the effects of space conditions on microorganismsthe biomass that can be constructed from resources can be calculated by comparing the concentration of elements in the resource materials and in biomass equation 1 a given mass of resource materials mresource can support mbiomass x of biomass containing element x considering x as the limiting nutrient where cresource x is the concentration mass per unit mass of element x in the resource material and cbiomass x is its concentration in the biomass m b i o m a s s x m r e s o u r c e x c r e s o u r c e x c b i o m a s s x displaystyle mbiomassxmresourcexfrac cresourcexcbiomassx 1 assuming that 100000 kg biomass supports one human the asteroids may then sustain about 6e15 six million billion people equal to a million earths a million times the present population similar materials in the comets could support biomass and populations about one hundred times larger solar energy can sustain these populations for the predicted further five billion years of the sun these considerations yield a maximum timeintegrated biota of 3e30 kgyears in the solar system after the sun becomes a white dwarf star and other white dwarf stars can provide energy'</li><li>'astronomer and astrobiology pioneer gavriil adrianovich tikhov tikhov is considered to be the father of astrobotany research in the field has been conducted both with growing earth plants in space environments and searching for botanical life on other planets the first organisms in space were specially developed strains of seeds launched to 134 km 83 mi on 9 july 1946 on a us launched v2 rocket these samples were not recovered the first seeds launched into space and successfully recovered were maize seeds launched on 30 july 1946 which were soon followed by rye and cotton these early suborbital biological experiments were handled by harvard university and the naval research laboratory and were concerned with radiation exposure on living tissue in 1971 500 tree seeds loblolly pine sycamore sweetgum redwood and douglas fir were flown around the moon on apollo 14 these moon trees were planted and grown with controls back on earth where no changes were detected in 1982 the crew of the soviet salyut 7 space station conducted an experiment prepared by lithuanian scientists alfonsas merkys and others and grew some arabidopsis using fiton3 experimental microgreenhouse apparatus thus becoming the first plants to flower and produce seeds in space a skylab experiment studied the effects of gravity and light on rice plants the svet2 space greenhouse successfully achieved seed to seed plant growth in 1997 aboard space station mir bion 5 carried daucus carota and bion 7 carried maize aka corn plant research continued on the 
international space station biomass production system was used on the iss expedition 4 the vegetable production system veggie system was later used aboard iss plants tested in veggie before going into space included lettuce swiss chard radishes chinese cabbage and peas red romaine lettuce was grown in space on expedition 40 which were harvested when mature frozen and tested back on earth expedition 44 members became the first american astronauts to eat plants grown in space on 10 august 2015 when their crop of red romaine was harvested since 2003 russian cosmonauts have been eating half of their crop while the other half goes towards further research in 2012 a sunflower bloomed aboard the iss under the care of nasa astronaut donald pettit in january 2016 us astronauts announced that a zinnia had blossomed aboard the issin 2018 the veggie3 experiment was tested with plant pillows and root mats one of the goals is to grow food for crew consumption crops tested at this time include cabbage lettuce and mizuna plants that have been grown in space include arabidopsis thale cress bok choy tokyo bekana'</li><li>'the planet simulator also known as a planetary simulator is a climatecontrolled simulation chamber designed to study the origin of life the device was announced by researchers at mcmaster university on behalf of the origins institute on 4 october 2018 the simulator project begun in 2012 and was funded with 1 million from the canada foundation for innovation the ontario government and mcmaster university it was built and manufactured by angstrom engineering inc of kitchener ontariothe device was designed and developed by biophysicist maikel rheinstadter and coprincipal investigators biochemist yingfu li and astrophysicist ralph pudritz for researchers to study a theory that suggests life on early earth began in warm little ponds rather than in deep ocean vents nearly four billion years ago the device can recreate conditions of the primitive earth to see whether cellular life can be created and then later evolvein an 2018 news release maikel rheinstadter stated we want to understand how the first living cell was formed how the earth moved from a chemical world to a biological worldthe planet simulator can mimic the environmental conditions consistent on the early earth and other astronomical bodies including other planets and exoplanets by controlling temperature humidity pressure atmosphere and radiation levels within the simulation chamber according to researchers preliminary tests with the simulator under possible conditions of the early earth created protocells cells which are not living but very important nonetheless according to biologist david deamer the device is a game changer and the cells produced so far are significant the cells are not alive but are evolutionary steps toward a living system of molecules the simulator opens up a lot of experimental activities that were literally impossible before ” based on initial tests with the new simulator technology project director rheinstadter stated that it seems that the formation of life is probably a relatively frequent process in the universe'</li></ul> | | 28 | <ul><li>'##nfjgk0 if k = 1 displaystyle kneq 1 and [UNK] j 1 n a j 1 [UNK] j 1 n f j e n displaystyle sum j1naj1sum j1nfjen let a ∗ displaystyle aast denote the conjugate transpose of a then a a ∗ a ∗ a n i displaystyle aaast aast ani this implies the desired orthogonality relationship for the characters ie [UNK] k 1 n f k ∗ g i f k g j n δ i j displaystyle sum 
k1nfkgifkgjndelta ij where δ i j displaystyle delta ij is the kronecker delta and f k ∗ g i displaystyle fkgi is the complex conjugate of f k g i displaystyle fkgi pontryagin duality'</li><li>'j x p i ν p i − 1 [UNK] j i 1 ω x p j ν p j x [UNK] i 1 ω x ν p i x p i x x [UNK] p prime p [UNK] x ν p x p displaystyle dxsum i1omega xleftnu pixleftprod j1i1pjnu pjxrightpinu pi1leftprod ji1omega xpjnu pjxrightrightsum i1omega xfrac nu pixpixxsum stackrel pmid xptext primefrac nu pxp where ωx a prime omega function is the number of distinct prime factors in x and νpx is the padic valuation of x for example d 60 d 2 2 ⋅ 3 ⋅ 5 2 2 1 3 1 5 ⋅ 60 92 displaystyle d60d22cdot 3cdot 5leftfrac 22frac 13frac 15rightcdot 6092 or d 81 d 3 4 4 ⋅ 3 3 ⋅ d 3 4 ⋅ 27 ⋅ 1 108 displaystyle d81d344cdot 33cdot d34cdot 27cdot 1108 the sequence of number derivatives for k 0 1 2 … begins sequence a003415 in the oeis 0 0 1 1 4 1 5 1 12 6 7 1 16 1 9 … displaystyle 00114151126711619ldots the logarithmic derivative ld x d x x [UNK] p prime p [UNK] x ν p x p displaystyle operatorname ld xfrac dxxsum stackrel pmid xptext primefrac nu pxp is a totally additive function ld x ⋅ y ld x ld y displaystyle operatorname ld xcdot yoperatorname ld xoperatorname ld y the arithmetic partial derivative of x displaystyle x with respect to p displaystyle p is defined as x p ′ ν p x p x displaystyle xpprime frac nu pxpx so the arithmetic derivative of x displaystyle x is given as d x [UNK] p prime p [UNK] x x p ′ displaystyle dxsum stackrel pmid xptext primexpprime an arithmetic function f displaystyle f is leibnizadditive if there is a totally multiplicative function h f displaystyle hf such that f m n f m h f n f n h f m displaystyle fmnfmhfnfnhfm for all positive integers m displaystyle m and n displaystyle n a motivation for this concept is'</li><li>'and every rcoloring of the integers greater than one there is a finite monochromatic subset s of these integers such that the conjecture was proven in 2003 by ernest s croot iii znams problem and primary pseudoperfect numbers are closely related to the existence of egyptian fractions of the form for instance the primary pseudoperfect number 1806 is the product of the prime numbers 2 3 7 and 43 and gives rise to the egyptian fraction 1 12 13 17 143 11806 egyptian fractions are normally defined as requiring all denominators to be distinct but this requirement can be relaxed to allow repeated denominators however this relaxed form of egyptian fractions does not allow for any number to be represented using fewer fractions as any expansion with repeated fractions can be converted to an egyptian fraction of equal or smaller length by repeated application of the replacement if k is odd or simply by replacing 1k 1k by 2k if k is even this result was first proven by takenouchi 1921 graham and jewett proved that it is similarly possible to convert expansions with repeated denominators to longer egyptian fractions via the replacement this method can lead to long expansions with large denominators such as botts 1967 had originally used this replacement technique to show that any rational number has egyptian fraction representations with arbitrarily large minimum denominators any fraction xy has an egyptian fraction representation in which the maximum denominator is bounded by and a representation with at most terms the number of terms must sometimes be at least proportional to log log y for instance this is true for the fractions in the sequence 12 23 67 4243 18061807 whose denominators form sylvesters 
sequence it has been conjectured that olog log y terms are always enough it is also possible to find representations in which both the maximum denominator and the number of terms are small graham 1964 characterized the numbers that can be represented by egyptian fractions in which all denominators are nth powers in particular a rational number q can be represented as an egyptian fraction with square denominators if and only if q lies in one of the two halfopen intervals martin 1999 showed that any rational number has very dense expansions using a constant fraction of the denominators up to n for any sufficiently large n engel expansion sometimes called an egyptian product is a form of egyptian fraction expansion in which each denominator is a multiple of the previous one in addition the sequence of multipliers ai is required to be nondecreasi'</li></ul> | | 38 | <ul><li>'##ken the global language system theorises that language groups are engaged in unequal competition on different levels globally using the notions of a periphery semiperiphery and a core which are concepts of the world system theory de swaan relates them to the four levels present in the hierarchy of the global language system peripheral central supercentral and hypercentralde swaan also argues that the greater the range of potential uses and users of a language the higher the tendency of an individual to move up the hierarchy in the global language system and learn a more central language thus de swaan views the learning of second languages as proceeding up rather than down the hierarchy in the sense that they learn a language that is on the next level up for instance speakers of catalan a peripheral language have to learn spanish a central language to function in their own society spain meanwhile speakers of persian a central language have to learn arabic a supercentral language to function in their region on the other hand speakers of a supercentral language have to learn the hypercentral language to function globally as is evident from the huge number of nonnative english speakersaccording to de swaan languages exist in constellations and the global language system comprises a sociological classification of languages based on their social role for their speakers the worlds languages and multilinguals are connected in a strongly ordered hierarchical pattern there are thousands of peripheral or minority languages in the world each of which are connected to one of a hundred central languages the connections and patterns between each language is what makes up the global language system the four levels of language are the peripheral central supercentral and hypercentral languages peripheral languages at the lowest level peripheral languages or minority languages form the majority of languages spoken in the world 98 of the worlds languages are peripheral languages and spoken by less than 10 of the world ’ s population unlike central languages these are languages of conversation and narration rather than reading and writing of memory and remembrance rather than record they are used by native speakers within a particular area and are in danger of becoming extinct with increasing globalisation which sees more and more speakers of peripheral languages acquiring more central languages in order to communicate with others central languages the next level constitutes about 100 central languages spoken by 95 of the worlds population and generally used in education media and administration typically they are the national and official 
languages of the ruling state these are the languages of record and much of what has been said and written in those languages is saved in newspaper reports minutes and proceedings stored in archives included in history books collections of the classics of folk talks and folk ways increasingly recorded on electronic media and'</li><li>'the common misconception that aave carries ungrammatical features or that any speaker who speaks aave are uneducated or sloppy however like all dialects aave shows consistent internal logic and grammatical complexity as explained in the following examplesthe use of done coupled with the past tense of the verb in a sentence as seen in they done used all the good ones is a persistent structural trait of aave that is shared with southern european american vernacular varieties of english although the verbal particle done also occurs in caribbean creoles its syntactic configuration and semanticpragmatic function in aave differ somewhat from its creole counterpartsin aave done occurs only in preverbal auxiliary position with past tense forms whereas it occurs with a bare verb stem eg they done go and can occur in clausefinal position in some creoles in many aspects it functions in aave like a perfect tense referring to an action completed in the recent past but it can also be used to highlight the change of state or to intensify an activity as in the sentence i done told you not to mess up it is a stable feature but it is more frequently used in southern rural versions of aave than in urban aavedouble negation is also another feature commonly found in aave referring to the marking of negation on the auxiliary verb and indefinite pronoun an example would be she aint tellin nobody which would be she isnt telling anybody in standard english another feature copula absence or the absence of is or are in certain contexts can be observed as well he workin or they going home are some examples the habitual aspect marker or the invariant be habitual be as seen in he be workin they be tryin or i be like is a typical feature of aave it is the use of the base form of the copula verb be instead of the inflected forms such as are and am this is probably the most salient grammatical trait of aave both within the community and outside of it to the point of it being a stereotype prominently figured in representations of aave especially in the mediathe link between language and identity can be stretched into a tripartite where culture becomes key the addition of culture to the way language is linked to identity blur the lines because culture can be considered an abstract concept particularly in america it is nearly impossible to pinpoint a common culture in a country filled with so many different cultures especially when many of them are several generations removed from their origins because of the racial makeup of the country it is not ideal to include all american citizens under a'</li><li>'patois pl same or is speech or language that is considered nonstandard although the term is not formally defined in linguistics as such patois can refer to pidgins creoles dialects or vernaculars but not commonly to jargon or slang which are vocabularybased forms of cant in colloquial usage of the term especially in france class distinctions are implied by the very meaning of the term since in french patois refers to any sociolect associated with uneducated rural classes in contrast with the dominant prestige language standard french spoken by the middle and high classes of cities or as used in 
literature and formal settings the acrolect the term patois comes from old french patois local or regional dialect originally meaning rough clumsy or uncultivated speech possibly from the verb patoier to treat roughly from patte paw from old low franconian patta paw sole of the foot plus the suffix ois in france and other francophone countries patois has been used to describe nonstandard french and regional languages such as picard occitan and francoprovencal since 1643 and catalan after 1700 when the king louis xiv banned its use the word assumes the view of such languages being backward countrified and unlettered thus patois being potentially considered offensive when used by outsiders jean jaures said one names patois the language of a defeated nation in france and switzerland however the term patois no longer holds any offensive connotation and has indeed become a celebrated and distinguished variant of the numerous local tonguesthe vernacular form of english spoken in jamaica is also referred to as patois or patwa it is noted especially in reference to jamaican patois from 1934 jamaican patois language comprises words of the native languages of the many ethnic and cultural groups within the caribbean including spanish portuguese chinese amerindian and english along with several african languages some islands have creole dialects influenced by their linguistic diversity french spanish arabic hebrew german dutch italian chinese vietnamese and others jamaican patois is also spoken in costa rica and french creole is spoken in caribbean countries such as trinidad and tobago and guyana in south america often these patois are popularly considered broken english or slang but cases such as jamaican patois are classified with more correctness as a creole language in fact in the francophone caribbean the analogous term for local basilectal languages is creole see also jamaican english and jamaican creole antillean creole spoken in several present or formerly french islands of the lesser antilles includes vocabulary and grammar of african and carib origin in addition to french its dialects often contain folketymological derivatives of french words for example la'</li></ul> | | 40 | <ul><li>'##2 is the invariant of rohlin1991 clifford taubes forselfdual yangmills connections on nonselfdual 4manifolds journal of differential geometry 17 1982 no 1 139 – 170 gauge theory on asymptotically periodic 4manifolds j differential geom 25 1987 no 3 363 – 430 cassons invariant and gauge theory j differential geom 31 1990 no 2 547 – 5991996 richard s hamilton forthe formation of singularities in the ricci flow surveys in differential geometry vol ii cambridge ma 1993 7 – 136 int press cambridge ma 1995 fourmanifolds with positive isotropic curvature comm anal geom 5 1997 no 1 1 – 921996 gang tian foron calabis conjecture for complex surfaces with positive first chern class invent math 101 1990 no 1 101 – 172 compactness theorems for kahlereinstein manifolds of dimension 3 and up j differential geom 35 1992 no 3 535 – 558 a mathematical theory of quantum cohomology j differential geom 42 1995 no 2 259 – 367 with yongbin ruan kahlereinstein metrics with positive scalar curvature invent math 130 1997 no 1 1 – 372001 jeff cheeger forfamilies index for manifolds with boundary superconnections and cones i families of manifolds with boundary and dirac operators j funct anal 89 1990 no 2 313 – 363 with jeanmichel bismut families index for manifolds with boundary superconnections and cones ii the chern character j 
funct anal 90 1990 no 2 306 – 354 with jeanmichel bismut lower bounds on ricci curvature and the almost rigidity of warped products ann of math 2 144 1996 no 1 189 – 237 with tobias colding on the structure of spaces with ricci curvature bounded below i j differential geom 46 1997 no 3 406 – 480 with tobias colding2001 yakov eliashberg forcombinatorial methods in symplectic geometry proceedings of the international congress of mathematicians vol 1 2 berkeley calif 1986 531 – 539 amer math soc providence ri 1987 classification of overtwisted contact structures on 3manifolds invent math 98 1989 no 3 623 – 6372001 michael j hopkins fornilpotence and stable homotopy theory i ann of math 2 128 1988 no 2 207 – 241 with ethan devinatz and jeffrey smith the rigid analytic period mapping lubintate space and stable homotopy theory bull amer math'</li><li>'this case the two metric spaces are essentially identical they are called quasiisometric if there is a quasiisometry between them a normed vector space is a vector space equipped with a norm which is a function that measures the length of vectors the norm of a vector v is typically denoted by ‖ v ‖ displaystyle lvert vrvert any normed vector space can be equipped with a metric in which the distance between two vectors x and y is given by the metric d is said to be induced by the norm ‖ ⋅ ‖ displaystyle lvert cdot rvert conversely if a metric d on a vector space x is translation invariant d x y d x a y a displaystyle dxydxaya for every x y and a in x and absolutely homogeneous d α x α y α d x y displaystyle dalpha xalpha yalpha dxy for every x and y in x and real number αthen it is the metric induced by the norm a similar relationship holds between seminorms and pseudometrics among examples of metrics induced by a norm are the metrics d1 d2 and d∞ on r 2 displaystyle mathbb r 2 which are induced by the manhattan norm the euclidean norm and the maximum norm respectively more generally the kuratowski embedding allows one to see any metric space as a subspace of a normed vector space infinitedimensional normed vector spaces particularly spaces of functions are studied in functional analysis completeness is particularly important in this context a complete normed vector space is known as a banach space an unusual property of normed vector spaces is that linear transformations between them are continuous if and only if they are lipschitz such transformations are known as bounded operators a curve in a metric space m d is a continuous function γ 0 t → m displaystyle gamma 0tto m the length of γ is measured by in general this supremum may be infinite a curve of finite length is called rectifiable suppose that the length of the curve γ is equal to the distance between its endpoints — that is its the shortest possible path between its endpoints after reparametrization by arc length γ becomes a geodesic a curve which is a distancepreserving function a geodesic is a shortest possible path between any two of its pointsa geodesic metric space is a metric space which admits a geodesic between any two of its points the spaces r 2 d 1 displaystyle mathbb r 2d1 and r 2 d 2 displaystyle mathbb r 2d2 are both geo'</li><li>'symmetryprotected topological spt order is a kind of order in zerotemperature quantummechanical states of matter that have a symmetry and a finite energy gap to derive the results in a mostinvariant way renormalization group methods are used leading to equivalence classes corresponding to certain fixed points the spt order has the following defining 
properties a distinct spt states with a given symmetry cannot be smoothly deformed into each other without a phase transition if the deformation preserves the symmetry b however they all can be smoothly deformed into the same trivial product state without a phase transition if the symmetry is broken during the deformation the above definition works for both bosonic systems and fermionic systems which leads to the notions of bosonic spt order and fermionic spt order using the notion of quantum entanglement we can say that spt states are shortrange entangled states with a symmetry by contrast for longrange entanglement see topological order which is not related to the famous epr paradox since shortrange entangled states have only trivial topological orders we may also refer the spt order as symmetry protected trivial order the boundary effective theory of a nontrivial spt state always has pure gauge anomaly or mixed gaugegravity anomaly for the symmetry group as a result the boundary of a spt state is either gapless or degenerate regardless how we cut the sample to form the boundary a gapped nondegenerate boundary is impossible for a nontrivial spt state if the boundary is a gapped degenerate state the degeneracy may be caused by spontaneous symmetry breaking andor intrinsic topological order monodromy defects in nontrivial 21d spt states carry nontrival statistics and fractional quantum numbers of the symmetry group monodromy defects are created by twisting the boundary condition along a cut by a symmetry transformation the ends of such cut are the monodromy defects for example 21d bosonic zn spt states are classified by a zn integer m one can show that n identical elementary monodromy defects in a zn spt state labeled by m will carry a total zn quantum number 2m which is not a multiple of n 21d bosonic u1 spt states have a hall conductance that is quantized as an even integer 21d bosonic so3 spt states have a quantized spin hall conductance spt states are shortrange entangled while topologically ordered states are longrange entangled both intrinsic topological order and also sp'</li></ul> | | 4 | <ul><li>'hormone auxin which activates meristem growth alongside other mechanisms to control the relative angle of buds around the stem from a biological perspective arranging leaves as far apart as possible in any given space is favoured by natural selection as it maximises access to resources especially sunlight for photosynthesis in mathematics a dynamical system is chaotic if it is highly sensitive to initial conditions the socalled butterfly effect which requires the mathematical properties of topological mixing and dense periodic orbitsalongside fractals chaos theory ranks as an essentially universal influence on patterns in nature there is a relationship between chaos and fractals — the strange attractors in chaotic systems have a fractal dimension some cellular automata simple sets of mathematical rules that generate patterns have chaotic behaviour notably stephen wolframs rule 30vortex streets are zigzagging patterns of whirling vortices created by the unsteady separation of flow of a fluid most often air or water over obstructing objects smooth laminar flow starts to break up when the size of the obstruction or the velocity of the flow become large enough compared to the viscosity of the fluid meanders are sinuous bends in rivers or other channels which form as a fluid most often water flows around bends as soon as the path is slightly curved the size and curvature of each loop increases 
as helical flow drags material like sand and gravel across the river to the inside of the bend the outside of the loop is left clean and unprotected so erosion accelerates further increasing the meandering in a powerful positive feedback loop waves are disturbances that carry energy as they move mechanical waves propagate through a medium – air or water making it oscillate as they pass by wind waves are sea surface waves that create the characteristic chaotic pattern of any large body of water though their statistical behaviour can be predicted with wind wave models as waves in water or wind pass over sand they create patterns of ripples when winds blow over large bodies of sand they create dunes sometimes in extensive dune fields as in the taklamakan desert dunes may form a range of patterns including crescents very long straight lines stars domes parabolas and longitudinal or seif sword shapesbarchans or crescent dunes are produced by wind acting on desert sand the two horns of the crescent and the slip face point downwind sand blows over the upwind face which stands at about 15 degrees from the horizontal and falls onto the slip face where it accumulates up to the angle of repose of the sand which is about 35 degrees when the slip face'</li><li>'singleparticle trajectories spts consist of a collection of successive discrete points causal in time these trajectories are acquired from images in experimental data in the context of cell biology the trajectories are obtained by the transient activation by a laser of small dyes attached to a moving molecule molecules can now by visualized based on recent superresolution microscopy which allow routine collections of thousands of short and long trajectories these trajectories explore part of a cell either on the membrane or in 3 dimensions and their paths are critically influenced by the local crowded organization and molecular interaction inside the cell as emphasized in various cell types such as neuronal cells astrocytes immune cells and many others spt allowed observing moving particles these trajectories are used to investigate cytoplasm or membrane organization but also the cell nucleus dynamics remodeler dynamics or mrna production due to the constant improvement of the instrumentation the spatial resolution is continuously decreasing reaching now values of approximately 20 nm while the acquisition time step is usually in the range of 10 to 50 ms to capture short events occurring in live tissues a variant of superresolution microscopy called sptpalm is used to detect the local and dynamically changing organization of molecules in cells or events of dna binding by transcription factors in mammalian nucleus superresolution image acquisition and particle tracking are crucial to guarantee a high quality data once points are acquired the next step is to reconstruct a trajectory this step is done known tracking algorithms to connect the acquired points tracking algorithms are based on a physical model of trajectories perturbed by an additive random noise the redundancy of many short spts is a key feature to extract biophysical information parameters from empirical data at a molecular level in contrast long isolated trajectories have been used to extract information along trajectories destroying the natural spatial heterogeneity associated to the various positions the main statistical tool is to compute the meansquare displacement msd or second order statistical moment ⟨ x t δ t − x t 2 ⟩ [UNK] t α displaystyle langle xtdelta txt2rangle sim 
talpha average over realizations where α displaystyle alpha is the called the anomalous exponentfor a brownian motion ⟨ x t δ t − x t 2 ⟩ 2 n d t displaystyle langle xtdelta txt2rangle 2ndt where d is the diffusion coefficient n is dimension of the space some other properties can also be recovered from long trajectories such as the'</li><li>'each n displaystyle n the new function is defined at the points a a h a 2 h … a n h … displaystyle aaha2hldots anhldots the fundamental theorem of calculus states that differentiation and integration are inverse operations more precisely it relates the difference quotients to the riemann sums it can also be interpreted as a precise statement of the fact that differentiation is the inverse of integration the fundamental theorem of calculus if a function f displaystyle f is defined on a partition of the interval a b displaystyle ab b a n h displaystyle banh and if f displaystyle f is a function whose difference quotient is f displaystyle f then we have [UNK] i 0 n − 1 f a i h h 2 δ x f b − f a displaystyle sum i0n1faihh2delta xfbfa furthermore for every m 0 1 2 … n − 1 textstyle m012ldots n1 we have δ δ x [UNK] i 0 m f a i h h 2 δ x f a m h h 2 displaystyle frac delta delta xsum i0mfaihh2delta xfamhh2 this is also a prototype solution of a difference equation difference equations relate an unknown function to its difference or difference quotient and are ubiquitous in the sciences the early history of discrete calculus is the history of calculus such basic ideas as the difference quotients and the riemann sums appear implicitly or explicitly in definitions and proofs after the limit is taken however they are never to be seen again however the kirchhoffs voltage law 1847 can be expressed in terms of the onedimensional discrete exterior derivative during the 20th century discrete calculus remains interlinked with infinitesimal calculus especially differential forms but also starts to draw from algebraic topology as both develop the main contributions come from the following individuals henri poincare triangulations barycentric subdivision dual triangulation poincare lemma the first proof of the general stokes theorem and a lot more l e j brouwer simplicial approximation theorem elie cartan georges de rham the notion of differential form the exterior derivative as a coordinateindependent linear operator exactnessclosedness of forms emmy noether heinz hopf leopold vietoris walther mayer modules of chains the boundary operator chain complexes j w alexander solomon lefschetz lev pontryagin andrey kolmogorov norman steenrod eduard cech the early cochain notions hermann weyl the kirchhoff laws'</li></ul> | | 6 | <ul><li>'##ativistic degenerate matter a polytrope with index n 3 is a good model for the cores of white dwarfs of higher masses according to the equation of state of relativistic degenerate matter a polytrope with index n 3 is usually also used to model mainsequence stars like the sun at least in the radiation zone corresponding to the eddington standard model of stellar structure a polytrope with index n 5 has an infinite radius it corresponds to the simplest plausible model of a selfconsistent stellar system first studied by arthur schuster in 1883 and it has an exact solution a polytrope with index n ∞ corresponds to what is called an isothermal sphere that is an isothermal selfgravitating sphere of gas whose structure is identical to the structure of a collisionless system of stars like a globular cluster this is because for an ideal gas the 
temperature is proportional to ρ1n so infinite n corresponds to a constant temperaturein general as the polytropic index increases the density distribution is more heavily weighted toward the center r 0 of the body polytropic process equation of state murnaghan equation of state'</li><li>'together the analysis was expanded upon by alar toomre in 1964 and presented in a more general and comprehensive framework'</li><li>'the bidirectional reflectance distribution function brdf symbol f r ω i ω r displaystyle ftextromega textiomega textr is a function of four real variables that defines how light is reflected at an opaque surface it is employed in the optics of realworld light in computer graphics algorithms and in computer vision algorithms the function takes an incoming light direction ω i displaystyle omega texti and outgoing direction ω r displaystyle omega textr taken in a coordinate system where the surface normal n displaystyle mathbf n lies along the zaxis and returns the ratio of reflected radiance exiting along ω r displaystyle omega textr to the irradiance incident on the surface from direction ω i displaystyle omega texti each direction ω displaystyle omega is itself parameterized by azimuth angle [UNK] displaystyle phi and zenith angle θ displaystyle theta therefore the brdf as a whole is a function of 4 variables the brdf has units sr−1 with steradians sr being a unit of solid angle the brdf was first defined by fred nicodemus around 1965 the definition is where l displaystyle l is radiance or power per unit solidangleinthedirectionofaray per unit projectedareaperpendiculartotheray e displaystyle e is irradiance or power per unit surface area and θ i displaystyle theta texti is the angle between ω i displaystyle omega texti and the surface normal n displaystyle mathbf n the index i displaystyle texti indicates incident light whereas the index r displaystyle textr indicates reflected light the reason the function is defined as a quotient of two differentials and not directly as a quotient between the undifferentiated quantities is because irradiating light other than d e i ω i displaystyle mathrm d etextiomega texti which are of no interest for f r ω i ω r displaystyle ftextromega textiomega textr might illuminate the surface which would unintentionally affect l r ω r displaystyle ltextromega textr whereas d l r ω r displaystyle mathrm d ltextromega textr is only affected by d e i ω i displaystyle mathrm d etextiomega texti the spatially varying bidirectional reflectance distribution function svbrdf is a 6dimensional function f r ω i ω r x displaystyle ftextromega textiomega textrmathbf x where x displaystyle mathbf x describes a 2d'</li></ul> |
| 35 | <ul><li>'microbiologically induced calcium carbonate precipitation micp is a biogeochemical process that induces calcium carbonate precipitation within the soil matrix biomineralization in the form of calcium carbonate precipitation can be traced back to the precambrian period calcium carbonate can be precipitated in three polymorphic forms which in the order of their usual stabilities are calcite aragonite and vaterite the main groups of microorganisms that can induce the carbonate precipitation are photosynthetic microorganisms such as cyanobacteria and microalgae sulfatereducing bacteria and some species of microorganisms involved in nitrogen cycle several mechanisms have been identified by which bacteria can induce the calcium carbonate precipitation including urea hydrolysis denitrification sulfate production and iron reduction two different pathways or autotrophic and heterotrophic pathways through which calcium carbonate is produced have been identified there are three autotrophic pathways which all result in depletion of carbon dioxide and favouring calcium carbonate precipitation in heterotrophic pathway two metabolic cycles can be involved the nitrogen cycle and the sulfur cycle several applications of this process have been proposed such as remediation of cracks and corrosion prevention in concrete biogrout sequestration of radionuclides and heavy metals all three principal kinds of bacteria that are involved in autotrophic production of carbonate obtain carbon from gaseous or dissolved carbon dioxide these pathways include nonmethylotrophic methanogenesis anoxygenic photosynthesis and oxygenic photosynthesis nonmethylotrophic methanogenesis is carried out by methanogenic archaebacteria which use co2 and h2 in anaerobiosis to give ch4 two separate and often concurrent heterotrophic pathways that lead to calcium carbonate precipitation may occur including active and passive carbonatogenesis during active carbonatogenesis the carbonate particles are produced by ionic exchanges through the cell membrane by activation of calcium andor magnesium ionic pumps or channels probably coupled with carbonate ion production during passive carbonatogenesis two metabolic cycles can be involved the nitrogen cycle and the sulfur cycle three different pathways can be involved in the nitrogen cycle ammonification of amino acids dissimilatory reduction of nitrate and degradation of urea or uric acid in the sulfur cycle bacteria follow the dissimilatory reduction of sulfate ureolysis or degradation of urea the microbial urease catalyzes the hydrolysis of urea into ammonium and carbonate one mole of urea is hydrolyzed intracellular'</li><li>'brown earth is a type of soil brown earths are mostly located between 35° and 55° north of the equator the largest expanses cover western and central europe large areas of western and transuralian russia the east coast of america and eastern asia here areas of brown earth soil types are found particularly in japan korea china eastern australia and new zealand brown earths cover 45 of the land in england and wales they are common in lowland areas below 1000 feet on permeable parent material the most common vegetation types are deciduous woodland and grassland due to the reasonable natural fertility of brown earths large tracts of deciduous woodland have been cut down and the land is now used for farming they are normally located in regions with a humid temperate climate rainfall totals are moderate usually below 76 cm per year and temperatures range from 4 °c in the winter to 18 °c in the summer they are welldrained fertile soils with a ph of between 50 and 65 soils generally have three horizons the a b and c horizon horizon a is usually a brownish colour and over 20 cm in depth it is composed of mull humus well decomposed alkaline organic matter and mineral matter it is biologically active with many soil organisms and plant roots mixing the mull humus with mineral particles as a result the boundary between the a and b horizons can be illdefined in unploughed examples horizon b is mostly composed of mineral matter which has been weathered from the parent material but it often contains inclusions of more organic material carried in by organisms especially earthworms it is lighter in colour than the a horizon and is often weakly illuviated enriched with material from overlying horizons due to limited leaching only the more soluble bases are moved down through the profile horizon c is made up of the parent material which is generally permeable and non or slightly acidic for example clay loam brown earths are important because they are permeable and usually easy to work throughout the year so they are valued for agriculture they also support a much wider range of forest trees than can be found on wetter land they are freely drained soils with welldeveloped a and b horizons they often develop over relatively permeable bedrock of some kind but are also found over unconsolidated parent materials like river gravels some soil classifications include welldrained alluvial soils in the brown earths too typically the brown earths have dark brown topsoils with loamy particle sizeclasses and good structure – especially under grassland the b horizon lacks the grey colours and mottles characteristic of gley'</li><li>'and it is about twice the carbon content of the atmosphere or around four times larger than the human emissions of carbon between the start of the industrial revolution and 2011 further most of this carbon 1035 billion tons is stored in what is defined as the nearsurface permafrost no deeper than 3 metres 98 ft below the surface however only a fraction of this stored carbon is expected to enter the atmosphere in general the volume of permafrost in the upper 3 m of ground is expected to decrease by about 25 per 1 °c 18 °f of global warming 1283 yet even under the rcp85 scenario associated with over 4 °c 72 °f of global warming by the end of the 21st century about 5 to 15 of permafrost carbon is expected to be lost over decades and centuriesthe exact amount of carbon that will be released due to warming in a given permafrost area depends on depth of thaw carbon content within the thawed soil physical changes to the environment and microbial and vegetation activity in the soil notably estimates of carbon release alone do not fully represent the impact of permafrost thaw on climate change this is because carbon can be released through either aerobic or anaerobic respiration which results in carbon dioxide co2 or methane ch4 emissions respectively while methane lasts less than 12 years in the atmosphere its global warming potential is around 80 times larger than that of co2 over a 20year period and about 28 times larger over a 100year period while only a small fraction of permafrost carbon will enter the atmosphere as methane those emissions will cause 4070 of the total warming caused by permafrost thaw during the 21st century much of the uncertainty about the eventual extent of permafrost methane emissions is caused by the difficulty of accounting for the recently discovered abrupt thaw processes which often increase the fraction of methane emitted over carbon dioxide in comparison to the usual gradual thaw processes another factor which complicates projections of permafrost carbon emissions is the ongoing greening of the arctic as climate change warms the air and the soil the region becomes more hospitable to plants including larger shrubs and trees which could not survive there before thus the arctic is losing more and more of its tundra biomes yet it gains more plants which proceed to absorb more carbon some of the emissions caused by permafrost thaw will be offset by this increased plant growth but the exact proportion is uncertain it is considered very unlikely that this greening could offset all of the emissions from permafrost thaw during the'</li></ul> |
| 8 | <ul><li>'the enhanced avionics system or easy is an integrated modular avionics suite and cockpit display system used on dassault falcon business jets since falcon 900ex and later used in other newer falcon aircraft such as falcon 2000ex and falcon 7xeasy has been jointly developed by dassault and honeywell and is based on honeywell primus epic dassault aviation started to develop the easy flight deck concept in the mid1990s with a goal to have a much better integration of aircraft systems such as fmseasy was first integrated and certificated on falcon 900ex the first easy equipped 900ex was delivered in december 2003 honeywell primus epic base of easy was then integrated on other business jets and helicopterseasy was certified on the falcon 2000ex in june 2004 with deliveries starting shortly after falcon 7x was developed from the groundup with easy avionics in october 2008 dassault announced the launch of easy phase ii program at the annual nbaa meeting in orlando easy phase ii include several enhancements to easy such as synthetic vision system adsb out paperless charts future air navigation system fans1a using controller pilot data link communications cpdlc localizer performance with vertical guidance lpveasy phase ii was certified on falcon 900lx in june 2011 and on falcon 7x in may 2013 easy architecture is based on integrated modular avionics the processing modules are called mau modular avionics units the core operating system of easy is provided by ddci integrated modular avionics ima cockpit display system dassault falcon 7x dassault aviation'</li><li>'briefly before being replaced by sonne and bernard erika transmitted a vhf signal on 3033 mhz which could be received by standard ebl 3 receivers the signal was adjusted in phase between a ref point and a navigation point after processing the fug 121 displayed an angle from the beacon by using two beacons it was possible to achieve a fix however this was a problem as four receivers were required two listening to each station on smaller aircraft there was not enough space and german industry was by now having trouble supplying enough radios to the air force without adding 4 more receivers per plane the system was not deployed some sources indicate that there may have been a version called electra that operated at 250 to 300 khz but details are lacking or contradictorysonne this system transmitted on 270 – 480 khz and could be received on a fug 10 no special receiver was required as the pattern was discernable with the ear all that was required was the special charts at least 6 stations were built providing coverage from the bay of biscay to norway accuracy was reasonable during the day but errors up to 4 degrees occurred at night the allies captured the maps with resulted in the being issued to allied units because of this the allies left the sonne system alone after the war the stations were rebuilt and operated into the 1970s the system was called consol by that time mond development work was done on sonne sun to remove the night time errors this system was called mond moon work was never completed truhe this system was based on the british gee system after british units were captured the germans set up a project to clone the units the first unit was the fug 122 which allowed the reception of british gee signals units in france received these units and were able to navigate using british signals the germans then developed the concept to produce fug 123 receivers which would allow a wider turning range this allowed the germans to setup gee chains of their own further inside germany where the british gee signals were unusable there seems to have been some idea of using frequencies very close to the british frequencies to make jamming by the allies hard to do without jamming their own gee system one chain became operational around berlin fubl 1 used the lorenz landing beam system consisted of the ebl 1 and ebl 2 receivers with display device anf 2 the ebl 1 operated between 30 and 33 mhz and received the azimuth signals from a transmitter at the far end of the runway the ebl 2 operated at 38 mhz and received the two marker beacons as the aircraft approached the threshold to land the afn 2 provided the pilot with'</li><li>'a ground proximity warning system gpws is a system designed to alert pilots if their aircraft is in immediate danger of flying into the ground or an obstacle the united states federal aviation administration faa defines gpws as a type of terrain awareness and warning system taws more advanced systems introduced in 1996 are known as enhanced ground proximity warning systems egpws a modern type of taws in the late 1960s a series of controlled flight into terrain cfit accidents took the lives of hundreds of people a cfit accident is one where a properly functioning airplane under the control of a fully qualified and certified crew is flown into terrain water or obstacles with no apparent awareness on the part of the crewbeginning in the early 1970s a number of studies examined the occurrence of cfit accidents findings from these studies indicated that many such accidents could have been avoided if a warning device called a ground proximity warning system gpws had been used as a result of these studies and recommendations from the us national transportation safety board ntsb in 1974 the faa required all large turbine and turbojet airplanes to install tsoapproved gpws equipmentthe un international civil aviation organization icao recommended the installation of gpws in 1979c donald bateman a canadianborn engineer developed and is credited with the invention of gpwsin march 2000 the us faa amended operating rules to require that all us registered turbinepowered airplanes with six or more passenger seats exclusive of pilot and copilot seating be equipped with an faaapproved taws the mandate affects aircraft manufactured after march 29 2002 prior to the development of gpws large passenger aircraft were involved in 35 fatal cfit accidents per year falling to 2 per year in the mid1970s a 2006 report stated that from 1974 when the us faa made it a requirement for large aircraft to carry such equipment until the time of the report there had not been a single passenger fatality in a cfit crash by a large jet in us airspaceafter 1974 there were still some cfit accidents that gpws was unable to help prevent due to the blind spot of those early gpws systems more advanced systems were developed older taws or deactivation of the egpws or ignoring its warnings when an airport is not in its database still leave aircraft vulnerable to possible cfit incidents in april 2010 a polish air force tupolev tu154m aircraft crashed near smolensk russia in a possible cfit accident killing all passengers and crew including the president of poland lech kaczynski the aircraft was equipped with taws made by universal avionics systems of tucson according to the russian interstate aviation committee'</li></ul> |
| 12 | <ul><li>'of s m displaystyle sm for some integers m displaystyle m whose base k displaystyle k representations are close to that of n displaystyle n constantrecursive sequences can be thought of as 1 displaystyle 1 regular sequences where the base1 representation of n displaystyle n consists of n displaystyle n copies of the digit 1 displaystyle 1'</li><li>'the small triangles whose vertices all have different numbers are shaded in the graph each small triangle becomes a node in the new graph derived from the triangulation the small letters identify the areas eight inside the figure and area i designates the space outside of it as described previously those nodes that share an edge whose endpoints are numbered 1 and 2 are joined in the derived graph for example node d shares an edge with the outer area i and its vertices all have different numbers so it is also shaded node b is not shaded because two vertices have the same number but it is joined to the outer area one could add a new fullnumbered triangle say by inserting a node numbered 3 into the edge between 1 and 1 of node a and joining that node to the other vertex of a doing so would have to create a pair of new nodes like the situation with nodes f and g suppose there is a ddimensional simplex of sidelength n and it is triangulated into subsimplices of sidelength 1 there is a function that given any vertex of the triangulation returns its color the coloring is guaranteed to satisfy sperners boundary condition how many times do we have to call the function in order to find a rainbow simplex obviously we can go over all the triangulation vertices whose number is ond which is polynomial in n when the dimension is fixed but can it be done in time this problem was first studied by christos papadimitriou he introduced a complexity class called ppad which contains this as well as related problems such as finding a brouwer fixed point he proved that finding a sperner simplex is ppadcomplete even for d3 some 15 years later chen and deng proved ppadcompleteness even for d2 it is believed that ppadhard problems cannot be solved in time opolylog n suppose that each vertex of the triangulation may be labeled with multiple colors so that the coloring function is f s → 2n1 for every subsimplex the set of labelings on its vertices is a setfamily over the set of colors n 1 this setfamily can be seen as a hypergraph if for every vertex v on a face of the simplex the colors in fv are a subset of the set of colors on the face endpoints then there exists a subsimplex with a balanced labeling – a labeling in which the corresponding hypergraph admits a perfect fractional matching to illustrate here are some balanced labeling examples for n 2'</li><li>'labeling is also odd l − v − l v displaystyle lvlv hence by tuckers lemma there are two adjacent vertices u v displaystyle uv with opposite labels assume wlog that the labels are l u 1 l v − 1 displaystyle lu1lv1 by the definition of l this means that in both g u displaystyle gu and g v displaystyle gv coordinate 1 is the largest coordinate in g u displaystyle gu this coordinate is positive while in g v displaystyle gv it is negative by the construction of the triangulation the distance between g u displaystyle gu and g v displaystyle gv is at most [UNK] displaystyle epsilon so in particular g u 1 − g v 1 g u 1 g v 1 ≤ [UNK] displaystyle gu1gv1gu1gv1leq epsilon since g u 1 displaystyle gu1 and g v 1 displaystyle gv1 have opposite signs and so g u 1 ≤ [UNK] displaystyle gu1leq epsilon but since the largest coordinate of g u displaystyle gu is coordinate 1 this means that g u k ≤ [UNK] displaystyle gukleq epsilon for each 1 ≤ k ≤ n displaystyle 1leq kleq n so g u ≤ c n [UNK] displaystyle guleq cnepsilon where c n displaystyle cn is some constant depending on n displaystyle n and the norm ⋅ displaystyle cdot which you have chosen the above is true for every [UNK] 0 displaystyle epsilon 0 since s n displaystyle sn is compact there must hence be a point u in which g u 0 displaystyle gu0 no subset of r n displaystyle mathbb r n is homeomorphic to s n displaystyle sn the ham sandwich theorem for any compact sets a1 an in r n displaystyle mathbb r n we can always find a hyperplane dividing each of them into two subsets of equal measure above we showed how to prove the borsuk – ulam theorem from tuckers lemma the converse is also true it is possible to prove tuckers lemma from the borsuk – ulam theorem therefore these two theorems are equivalent there are several fixedpoint theorems which come in three equivalent variants an algebraic topology variant a combinatorial variant and a setcovering variant each variant can be proved separately using totally different arguments but each variant can also be reduced to the other variants in its row additionally each result in the top row can be deduced from the one below it in the same column in the original theorem the domain'</li></ul> |
| 33 | <ul><li>'xenoglossy also written xenoglossia and sometimes also known as xenolalia is the supposedly paranormal phenomenon in which a person is allegedly able to speak write or understand a foreign language that they could not have acquired by natural means the term derives from the ancient greek xenos ξενος foreigner and glossa γλωσσα tongue or language the term xenoglossy was first used by french parapsychologist charles richet in 1905 claims of xenoglossy are found in the new testament and contemporary claims have been made by parapsychologists and reincarnation researchers such as ian stevenson doubts have been expressed that xenoglossy is an actual phenomenon and there is no scientifically admissible evidence supporting any of the alleged instances of xenoglossytwo types of xenoglossy are distinguished recitative xenoglossy is the use of an unacquired language incomprehensibly while responsive xenoglossy refers to the ability to intelligibly employ the unlearned language as if already acquired this phenomenon is mentioned in acts of the apostles chapter 2 at pentecost when the first disciples of jesus christ gathered together numbering one hundred and twenty and of the tongues of fire landed on each of them formalizing the coming of the spirit in an episode of inspired communication that allows the disciples to express themselves in languages other than galilean and to be understood by strangers several accounts of miraculous abilities of some people to read write speak or understand a foreign language as mentioned in the bible have been related in similar christian accounts in the middle ages similar claims were also made by some pentecostal theologians in 1901 claims of mediums speaking foreign languages were made by spiritualists in the 19th century more recent claims of xenoglossy have come from reincarnation researchers who have alleged that individuals were able to recall a language spoken in a past life some reports of xenoglossy have surfaced in the popular press such as czech speedway rider matej kus who in september 2007 supposedly awoke after a crash and was able to converse in perfect english however press reports of his fluency in english were based entirely on anecdotal stories told by his czech teammates xenoglossy has been claimed to have occurred during exorcisms canadian parapsychologist and psychiatrist at the university of virginia ian stevenson claimed there were a handful of cases that suggested evidence of xenoglossy these included two where a subject under hypnosis could'</li><li>'have lost but if asked directly in the context of a psychic reading whether they have such an item the client may be shocked and assume that the reader learned the information directly from the deceased loved one robert todd carroll notes in the skeptics dictionary that some would consider this to be cold reading the rainbow ruse is a crafted statement which simultaneously awards the subject a specific personality trait as well as the opposite of that trait with such a phrase a cold reader can cover all possibilities and appear to have made an accurate deduction in the mind of the subject despite the fact that a rainbow ruse statement is vague and contradictory this technique is used since personality traits are not quantifiable and also because nearly everybody has experienced both sides of a particular emotion at some time in their lives statements of this type include most of the time you are positive and cheerful but there has been a time in the past when you were very upset you are a very kind and considerate person but when somebody does something to break your trust you feel deepseated anger i would say that you are mostly shy and quiet but when the mood strikes you you can easily become the center of attentiona cold reader can choose from a variety of personality traits think of its opposite and then bind the two together in a phrase vaguely linked by factors such as mood time or potential the mentalist branch of the stagemagician community approves of reading as long as it is presented strictly as an artistic entertainment and one is not pretending to be psychicsome performers who use cold reading are honest about their use of the technique lynne kelly kari coleman ian rowland and derren brown have used these techniques at either private fortunetelling sessions or open forum talking with the dead sessions in the manner of those who claim to be genuine mediums only after receiving acclaim and applause from their audience do they reveal that they needed no psychic power for the performance only a sound knowledge of psychology and cold reading in an episode of his trick of the mind series broadcast in march 2006 derren brown showed how easily people can be influenced through cold reading techniques by repeating bertram forers famous demonstration of the personal validation fallacy or forer effect in a detailed review of four sittings conducted by medium tyler henry edward and susan gerbic reviewed all statements made by him on the tv show hollywood medium in their opinion not one statement made by henry was accurate yet each sitter felt that their reading was highly successful in interviews with each sitter after their sitting all four claimed specific statements made by henry but after reviewing the show it was shown that he had not made those statements each sit'</li><li>'al concluding that the ganzfeld studies have not been independently replicated and had thus failed to produce evidence for psi according to hyman reliance on metaanalysis as the sole basis for justifying the claim that an anomaly exists and that the evidence for it is consistent and replicable is fallacious it distorts what scientists mean by confirmatory evidence storm et al published a response to hyman claiming the ganzfeld experimental design has proved to be consistent and reliable but parapsychology is a struggling discipline that has not received much attention so further research on the subject is necessary rouder et al in 2013 wrote that critical evaluation of storm et als metaanalysis reveals no evidence for psi no plausible mechanism and omitted replication failuresa 2016 paper examined questionable research practices in the ganzfeld experiments and simulated how such practices could cause erroneous positive results there are several common criticisms of some or all of the ganzfeld experiments isolation – richard wiseman and others argue that not all of the studies used soundproof rooms so it is possible that when videos were playing the experimenter could have heard it and later given involuntary cues to the receiver during the selection process it could even have been possible that the receiver themselves could hear the video randomization – when subjects are asked to choose from a variety of selections there is an inherent bias to choose the first selection they are shown if the order in which they are shown the selections is randomized each time this bias will be averaged out the randomization procedures used in the experiment have been criticized for not randomizing satisfactorily the psi assumption – the assumption that any statistical deviation from chance is evidence for telepathy is highly controversial strictly speaking a deviation from chance is only evidence that either this was a rare statistically unlikely occurrence that happened by chance or something was causing a deviation from chance flaws in the experimental design are a common cause of this and so the assumption that it must be telepathy is fallaciouswriting in 1985 c e m hansel discovered weaknesses in the design and possibilities of sensory leakage in the ganzfeld experiments reported by carl sargent and other parapsychologists hansel concluded the ganzfeld studies had not been independently replicated and that esp is no nearer to being established than it was a hundred years agodavid marks in his book the psychology of the psychic 2000 has noted that during the autoganzfeld experiments the experimenter sat only fourteen feet from the senders room soundproofing tiles were eventually added but they were designed to absorb sound not to prevent transmission according to marks this was inadequate'</li></ul> |
| 22 | <ul><li>'water resources are natural resources of water that are potentially useful for humans for example as a source of drinking water supply or irrigation water 97 of the water on earth is salt water and only three percent is fresh water slightly over twothirds of this is frozen in glaciers and polar ice caps the remaining unfrozen freshwater is found mainly as groundwater with only a small fraction present above ground or in the air natural sources of fresh water include surface water under river flow groundwater and frozen water artificial sources of fresh water can include treated wastewater wastewater reuse and desalinated seawater human uses of water resources include agricultural industrial household recreational and environmental activities water resources are under threat from water scarcity water pollution water conflict and climate change fresh water is a renewable resource yet the worlds supply of groundwater is steadily decreasing with depletion occurring most prominently in asia south america and north america although it is still unclear how much natural renewal balances this usage and whether ecosystems
are threatened natural sources of fresh water include surface water under river flow groundwater and frozen water surface water is water in a river lake or fresh water wetland surface water is naturally replenished by precipitation and naturally lost through discharge to the oceans evaporation evapotranspiration and groundwater recharge the only natural input to any surface water system is precipitation within its watershed the total quantity of water in that system at any given time is also dependent on many other factors these factors include storage capacity in lakes wetlands and artificial reservoirs the permeability of the soil beneath these storage bodies the runoff characteristics of the land in the watershed the timing of the precipitation and local evaporation rates all of these factors also affect the proportions of water loss humans often increase storage capacity by constructing reservoirs and decrease it by draining wetlands humans often increase runoff quantities and velocities by paving areas and channelizing the stream flow natural surface water can be augmented by importing surface water from another watershed through a canal or pipeline brazil is estimated to have the largest supply of fresh water in the world followed by russia and canada water from glaciers glacier runoff is considered to be surface water the himalayas which are often called the roof of the world contain some of the most extensive and rough high altitude areas on earth as well as the greatest area of glaciers and permafrost outside of the poles ten of asias largest rivers flow from there and more than a billion peoples livelihoods depend on them to complicate matters temperatures there are rising more rapidly than the global average in nepal the temperature has risen by 06 degrees celsius over the last decade whereas globally the earth has'</li><li>'##ng magnitude from leftright the finite water content vadose zone flux method works with any monotonic water retention curveunsaturated hydraulic conductivity relations such as brooks and corey clapp and hornberger and van genuchtenmualem the method might work with hysteretic water retention relations these have not yet been tested the finite water content method lacks the effect of soil water diffusion this omission does not affect the accuracy of flux calculations using the method because the mean of the diffusive flux is small practically this means that the shape of the wetting front plays no role in driving the infiltration the method is thus far limited to 1d in practical applications the infiltration equation was extended to 2 and quasi3 dimensions more work remains in extending the entire method into more than one dimension the paper describing this method was selected by the early career hydrogeologists network of the international association of hydrogeologists to receive the coolest paper published in 2015 award in recognition of the potential impact of the publication on the future of hydrogeology richards equation infiltration hydrology soil moisture velocity equation'</li><li>'stress distribution in soil is a function of the type of soil the relative rigidity of the soil and the footing and the depth of foundation at level of contact between footing and soilthe estimation of vertical stresses at any point in a soil mass due to external loading is essential to the prediction of settlements of buildings bridges and pressure the solution to the problem of calculating the stresses in an elastic half space subjected to a vertical point load at the 
surface will be of value in estimating the stresses induced in a deposit of soil whose depth is large compared to the dimensions of that part of the surface that is loaded $\Delta\sigma_z = \frac{3P}{2\pi R^2}\cos^3\theta$ $\Delta\sigma_r = \frac{P}{2\pi R^2}\left[3\cos\theta\sin^2\theta - \frac{1-2\mu}{1+\cos\theta}\right]$ $\Delta\sigma_t = \frac{P}{2\pi R^2}(1-2\mu)\left[\cos\theta - \frac{1}{1+\cos\theta}\right]$ $\Delta\tau = \frac{3P}{2\pi R^2}\cos^2\theta\sin\theta$ where $\cos\theta = \frac{z}{R}$ and $R = \sqrt{r^2+z^2}$ so that $\Delta\sigma_z = \frac{3Pz^3}{2\pi R^5} = \frac{3P}{2\pi}\frac{z^3}{(r^2+z^2)^{5/2}} = \frac{3P}{2\pi z^2}\left[1+\left(\frac{r}{z}\right)^2\right]^{-5/2}$ and $\sigma = q\left[1-\left(\frac{1}{1+(r/z)^2}\right)^{3/2}\right]$'</li></ul> | | 3 | <ul><li>'##ilise and suggest other technologies such as mobile phones or psion organisers as such feedback studies involve asynchronous communication between the participants and the researchers as the participants’ data is recorded in their diary first and then passed on to the researchers once complete feedback studies are scalable that is a largescale sample can be used since it is mainly the participants themselves who are responsible for collecting and recording data in elicitation studies participants capture media as soon as the phenomenon occurs the media is usually in the form of a photograph but can be in other different forms as well and so the recording is generally quick and less effortful than feedback studies these media are then used as prompts and memory cues to elicit memories and discussion in interviews that take place much later as such elicitation studies involve synchronous communication between the participants and the researchers usually through interviews in these later interviews the media and other memory cues such as what activities were done before and after the event can improve participants’ episodic memory in particular photos were found to elicit more specific recall than all other media types there are two prominent tradeoffs between each type of study feedback studies involve answering questions more frequently and in situ therefore enabling more accurate recall but more effortful recording in contrast elicitation studies involve quickly capturing media in situ but answering questions much later therefore enabling less effortful recording but potentially inaccurate recall diary studies are most often used when observing behavior over time in a natural environment they can be beneficial when one is looking to find new qualitative and quantitative data advantages of diary studies are numerous they allow collecting longitudinal and temporal information reporting events and experiences in context and inthemoment participants to diary their behaviours thoughts and feelings inthemoment thereby minimising the potential for post rationalisation determining the antecedents correlations and consequences of daily experiences and behaviors there are some limitations of diary studies mainly due to their characteristics of reliance on memory and selfreport measures there is low control low participation and there is a risk of disturbing the action in feedback studies it can be troubling and disturbing to write everything down the validity of diary studies rests on the assumption that participants will accurately recall and record their experiences this is
somewhat more easily enabled by the fact that diaries are completed media is captured in a natural environment and closer in realtime to any occurrences of the phenomenon of interest however there are multiple barriers to obtaining accurate data such as social desirability bias where participants may answer in a way that makes them appear more socially desirable this may be more prominent in longitudinal studies'</li><li>'indigenous media can reference film video music digital art and sound produced and created by and for indigenous people it refers to the use of communication tools pathways and outlets by indigenous peoples for their own political and cultural purposes indigenous media is the use of modern media techniques by indigenous peoples also called fourth world peoples indigenous media helps communities in their fight against cultural extinction economic and ecological decline and forced displacement most often in the field of indigenous media the creators of the media are also the consumers together with the neighboring communities sometimes the media is also received by institutions and film festivals located far away from the production location like the american indian film festival the production is usually locally based low budget and small scale but it can also be sponsored by different support groups and governments 34 – 35 the concept of indigenous media could be extended to first world alternative media like aids activist video the research of indigenous media and the international indigenous movement in the process of globalization develop in parallel in the second half of the 20th century united nations agencies including the united nations working group on indigenous populations wgip led the movement the united nations general assembly adopted a declaration aimed at protecting the rights of indigenous peoples in 2007 the theoretical development of indigenous media research first occurred in anthropology in 1980 it was accompanied by a critical research method that diverged from postcolonialism and poststructuralism the newer method attempted to minimize the power imbalance between the researcher and the researched leading up to this ethnographic films that gave photographic techniques to locals can be traced back as far as the navajo project in 1960 the project was the pioneering work of sol worth and john adair to which the origin of a new anthropological language and style of ethnography can be attributedhowever the indigenous media movement was not a significant phenomenon for another decade the widely recognized start of the new media movement was a collaboration between american anthropologist eric michaels and australia ’ s warlpiri aboriginal broadcasting this new type of collaborative anthropological project exemplified a change from a simple observation of the life of the indigenous people to a cultural record by the indigenous people themselves following the warlpiri project the brazilian kayapo village project of vincent carelli and terence turner and the indigenous series by maori producer barry barclay in new zealand have been important milestones in the development of indigenous media however it was faye ginsburg an american anthropologist who laid the theoretical foundation for the study of indigenous media her research in 1991 expounded the faustian dilemma between technology and tribal life and inspired later indigenous media researchers the important theories of recent indigenous media studies have highlighted the dynamic relationship between local 
indigenous communities and their countries and globalization lorna roth'</li><li>'results did not predict any prejudices towards black individuals this study used emic approaches of study by conducting interviews with the locals and etic approaches by giving participants generalized personality tests exonym and endonymother explorations of the differences between reality and humans models of it blind men and an elephant emic and etic units internalism and externalism map – territory relation creswell j w 1998 qualitative enquiry and research design choosing among five traditions london uk sage dundes alan 1962 from etic to emic units in the structural study of folktales journal of american folklore 75 296 95 – 105 doi102307538171 jstor i223629 goodenough ward 1970 describing a culture description and comparison in cultural anthropology cambridge uk cambridge university press pp 104 – 119 isbn 9780202308616 harris marvin 1976 history and significance of the emicetic distinction annual review of anthropology 5 329 – 350 doi101146annurevan05100176001553 harris marvin 1980 chapter two the epistemology of cultural materialism cultural materialism the struggle for a science of culture new york random house pp 29 – 45 isbn 9780759101340 headland thomas pike kenneth harris marvin eds 1990 emics and etics the insideroutsider debate sage jahoda g 1977 y j poortinga ed in pursuit of the emicetic distinction can we ever capture it basic problems in crosscultural psychology pp 55 – 63 jardine nick 2004 etics and emics not to mention anemics and emetics in the history of the sciences history of science 42 3 261 – 278 bibcode2004hissc42261j doi101177007327530404200301 s2cid 141081973 jingfeng xia 2013 an anthropological emicetic perspective on open access practices academic search premier kitayama shinobu cohen dov 2007 handbook of cultural psychology new york guilford press kottak conrad 2006 mirror for humanity new york mcgraw hill isbn 9780078034909 nattiez jeanjacques 1987 musicologie generale et semiologue music and discourse toward a semiology of music translated by carolyn abbate isbn 9780691027142 pike kenneth lee ed 1967 language in relation to a unified theory of structure of human behavior 2nd ed the hague netherlands mouton'</li></ul> | | 34 | <ul><li>'democratic education is a type of formal education that is organized democratically so that students can manage their own learning and participate in the governance of their school democratic education is often specifically emancipatory with the students voices being equal to the teachersthe history of democratic education spans from at least the 17th century while it is associated with a number of individuals there has been no central figure establishment or nation that advocated democratic education in 1693 john locke published some thoughts concerning education in describing the teaching of children he declares none of the things they are to learn should ever be made a burthen to them or imposd on them as a task whatever is so proposd presently becomes irksome the mind takes an aversion to it though before it were a thing of delight or indifferency let a child but be orderd to whip his top at a certain time every day whether he has or has not a mind to it let this be but requird of him as a duty wherein he must spend so many hours morning and afternoon and see whether he will not soon be weary of any play at this rate jeanjacques rousseaus book of advice on education emile was first published in 1762 emile the imaginary pupil he uses for 
illustration was only to learn what he could appreciate as useful he was to enjoy his lessons and learn to rely on his own judgement and experience the tutor must not lay down precepts he must let them be discovered wrote rousseau and urged him not make emile learn science but let him discover it he also said that we should not substitute books for personal experience because this does not teach us to reason it teaches us to use other peoples reasoning it teaches us to believe a great deal but never to know anything while locke and rousseau were concerned only with the education of the children of the wealthy in the 19th century leo tolstoy set up a school for peasant children this was on his own estate at yasnaya polyana russia in the late 19th century he tells us that the school evolved freely from principles introduced by teachers and pupils that in spite of the preponderating influence of the teacher the pupil had always had the right not to come to school or having come not to listen to the teacher and that the teacher had the right not to admit a pupil and was able to use all the influence he could muster to win over the community where the children were always in the majority dom sierot in 1912 janusz korczak founded dom sierot the jewish orphanage in warsaw which was run on democratic lines in 1940 dom si'</li><li>'is done through six points of reference learners studentsteachers in dialogue approach their acts of knowing as grounded in individual experience and circumstance learners approach the historical and cultural world as a transformable reality shaped by human ideological representations of reality learners make connections between their own conditions and the conditions produced through the making of reality learners consider the ways that they can shape this reality through their methods of knowing this new reality is collective shared and shifting learners develop literacy skills that put their ideas into print thus giving potency to the act of knowing learners identify the myths in the dominant discourse and work to destabilize these myths ending the cycle of oppression the montessori method developed by maria montessori is an example of problemposing education in an early childhood model ira shor a professor of composition and rhetoric at cuny who has worked closely with freire also advocates a problem posing model in his use of critical pedagogy he has published on the use of contract grading the physical setup of the classroom and the political aspects of student and teacher rolesjames d kirylo in his book paulo freire the man from recife reiterated freires thought and stated that a problemposing education is one where human beings are viewed as conscious beings who are unfinished yet in process of becoming other advocates of problemposing critical pedagogy include henry giroux peter mclaren and bell hooks inquirybased learning problembased learning unschooling'</li><li>'ambiguity tolerance – intolerance is a psychological construct that describes the relationship that individuals have with ambiguous stimuli or events individuals view these stimuli in a neutral and open way or as a threat ambiguity tolerance – intolerance is a construct that was first introduced in 1949 through the work of else frenkelbrunswik while researching ethnocentrism in children and was perpetuated by her research of ambiguity intolerance in connection to authoritarian personality it serves to define and measure how well an individual responds when presented with an event that results in 
ambiguous stimuli or situations in her study she tested the notion that children who are ethnically prejudiced also tend to reject ambiguity more so than their peers she studied children who ranked high and low on prejudice in a story recall test and then studied their responses to an ambiguous disc shaped figure the children who scored high in prejudice were expected to take longer to give a response to the shape less likely to make changes on their response and less likely to change their perspectives a study by kenny and ginsberg 1958 retesting frenkelbrunswiks original connection of ambiguity intolerance to ethnocentrism and authoritarian personality found that the results were unreplicable however it was discussed that this may be due to the fact that at the time the study was done incorrect methodology was used and that there lacked a concrete definition as to what the construct was most of the research on this subject was completed in the two decades after the publication of the authoritarian personality however the construct is still studied in psychological research today budner gives three examples as to what could be considered ambiguous situations a situation with no familiar cues a situation in which there are many cues to be taken into consideration and a situation in which cues suggest the existence of different structures to be adhered to there have been many attempts to conceptualize the construct of ambiguity tolerance – intolerance as to give researchers a more standard concept to work with many of these conceptualizations are based on the work of frenkelbrunswik budner 1962 defines the construct as the following intolerance of ambiguity may be defined as the tendency to perceive ie interpret ambiguous situations as sources of threat tolerance of ambiguity as the tendency to perceive ambiguous situations as desirableadditionally bochner 1965 categorized attributes given by frenkelbrunswiks theory of individuals who are intolerant to ambiguity the nine primary characteristics describe intolerance of ambiguity and are as follows need for categorization need for certainty inability to allow good and bad traits to exist in the same person'</li></ul> | | 31 | <ul><li>'in philosophy transcendence is the basic ground concept from the words literal meaning from latin of climbing or going beyond albeit with varying connotations in its different historical and cultural stages it includes philosophies systems and approaches that describe the fundamental structures of being not as an ontology theory of being but as the framework of emergence and validation of knowledge of being these definitions are generally grounded in reason and empirical observation and seek to provide a framework for understanding the world that is not reliant on religious beliefs or supernatural forces transcendental is a word derived from the scholastic designating the extracategorical attributes of beings in religion transcendence refers to the aspect of gods nature and power which is wholly independent of the material universe beyond all physical laws this is contrasted with immanence where a god is said to be fully present in the physical world and thus accessible to creatures in various ways in religious experience transcendence is a state of being that has overcome the limitations of physical existence and by some definitions has also become independent of it this is typically manifested in prayer seance meditation psychedelics and paranormal visions it is affirmed in various religious traditions concept 
of the divine which contrasts with the notion of a god or the absolute that exists exclusively in the physical order immanentism or indistinguishable from it pantheism transcendence can be attributed to the divine not only in its being but also in its knowledge thus god may transcend both the universe and knowledge is beyond the grasp of the human mind although transcendence is defined as the opposite of immanence the two are not necessarily mutually exclusive some theologians and metaphysicians of various religious traditions affirm that a god is both within and beyond the universe panentheism in it but not of it simultaneously pervading it and surpassing it the ethics of baruch spinoza used the expression transcendental terms in latin termini transcendentales to indicate concepts like being thing something which are so general not to be included in the definitions of species genus and category in modern philosophy immanuel kant introduced a new term transcendental thus instituting a new third meaning in his theory of knowledge this concept is concerned with the condition of possibility of knowledge itself he also opposed the term transcendental to the term transcendent the latter meaning that which goes beyond transcends any possible knowledge of a human being for him transcendental meant knowledge about our cognitive faculty with regard to how objects are possible a priori i call all knowledge transcendental if it is occupied not with objects'</li><li>'atoms in molecules — collision theory — ligand field theory successor to crystal field theory — variational transitionstate theory — benson group increment theory — specific ion interaction theory climatology climate change theory general study of climate changes and anthropogenic climate change acc global warming agw theories due to human activity computer science automata theory — queueing theory cosmology big bang theory — cosmic inflation — loop quantum gravity — superstring theory — supergravity — supersymmetric theory — multiverse theory — holographic principle — quantum gravity — mtheory economics macroeconomic theory — microeconomic theory — law of supply and demand education constructivist theory — critical pedagogy theory — education theory — multiple intelligence theory — progressive education theory engineering circuit theory — control theory — signal theory — systems theory — information theory film film theory geology plate tectonics humanities critical theory jurisprudence or legal theory natural law — legal positivism — legal realism — critical legal studies law see jurisprudence also case theory linguistics xbar theory — government and binding — principles and parameters — universal grammar literature literary theory mathematics approximation theory — arakelov theory — asymptotic theory — bifurcation theory — catastrophe theory — category theory — chaos theory — choquet theory — coding theory — combinatorial game theory — computability theory — computational complexity theory — deformation theory — dimension theory — ergodic theory — field theory — galois theory — game theory — gauge theory — graph theory — group theory — hodge theory — homology theory — homotopy theory — ideal theory — intersection theory — invariant theory — iwasawa theory — ktheory — kktheory — knot theory — ltheory — lie theory — littlewood – paley theory — matrix theory — measure theory — model theory — module theory — morse theory — nevanlinna theory — number theory — obstruction theory — operator theory — order theory — pcf theory — perturbation 
theory — potential theory — probability theory — ramsey theory — rational choice theory — representation theory — ring theory — set theory — shape theory — small cancellation theory — spectral theory — stability theory — stable theory — sturm – liouville theory — surgery theory — twistor theory — yang – mills theory music music theory philosophy proof theory — speculative reason — theory of truth — type theory — value theory — virtue theory physics acoustic theory — antenna theory — atomic theory — bcs theory — conformal field theory — dirac hole theory — dynamo theory — landau theory — mtheory — perturbation theory — theory'</li><li>'##ism turned this world on its head he argues for the nominalists all real being was individual or particular and universals were thus mere fictionsanother scholar victor bruno follows the same line according to bruno nominalism is one of the first signs of rupture in the medieval system the dismembering of the particulars the dangerous attribution to individuals to a status of totalization of possibilities in themselves all this will unfold in an existential fissure that is both objective and material the result of this fissure will be the essays to establish the nation state indian philosophy encompasses various realist and nominalist traditions certain orthodox hindu schools defend the realist position notably purva mimamsa nyaya and vaisheshika maintaining that the referent of the word is both the individual object perceived by the subject of knowledge and the universal class to which the thing belongs according to indian realism both the individual and the universal exist objectively with the second underlying the former buddhists take the nominalist position especially those of the sautrantika and yogacara schools they were of the opinion that words have as referent not true objects but only concepts produced in the intellect these concepts are not real since they do not have efficient existence that is causal powers words as linguistic conventions are useful to thought and discourse but even so it should not be accepted that words apprehend reality as it is dignaga formulated a nominalist theory of meaning called apohavada or theory of exclusions the theory seeks to explain how it is possible for words to refer to classes of objects even if no such class has an objective existence dignagas thesis is that classes do not refer to positive qualities that their members share in common on the contrary universal classes are exclusions apoha as such the cow class for example is composed of all exclusions common to individual cows they are all nonhorse nonelephant etc nominalism arose in reaction to the problem of universals specifically accounting for the fact that some things are of the same type for example fluffy and kitzler are both cats or the fact that certain properties are repeatable such as the grass the shirt and kermit the frog are green one wants to know by virtue of what are fluffy and kitzler both cats and what makes the grass the shirt and kermit green the platonist answer is that all the green things are green in virtue of the existence of a universal a single abstract thing that in this case is a part of all the green things with respect to the color of the grass the'</li></ul> | | 41 | <ul><li>'along streams and rivers through parks and across commons another type is the alley normally providing access to the rear of properties or connecting builtup roads not easily reached by vehicles towpaths are another kind of urban footpath but they are 
often shared with cyclists a typical footpath in a park is found along the seawall in stanley park vancouver british columbia canada this is a segregated path with one lane for skaters and cyclists and the other for pedestriansin the us and canada where urban sprawl has begun to strike even the most rural communities developers and local leaders are currently striving to make their communities more conducive to nonmotorized transportation through the use of less traditional paths the robert wood johnson foundation has established the active living by design program to improve the livability of communities in part through developing trails the upper valley trails alliance has done similar work on traditional trails while the somerville community path and related paths are examples of urban initiatives in st johns newfoundland canada the grand concourse is an integrated walkway system that has over 160 kilometers 99 mi of footpaths which link every major park river pond and green space in six municipalities in london england there are several longdistance walking routes which combine footpaths and roads to link green spaces these include the capital ring london outer orbital path and the jubilee walkway the use of which have been endorsed by transport for london an alley is a narrow usually paved pedestrian path often between the walls of buildings in towns and cities this type is usually short and straight and on steep ground can consist partially or entirely of steps in older cities and towns in europe alleys are often what is left of a medieval street network or a right of way or ancient footpath similar paths also exist in some older north american towns and cities in some older urban development in north america lanes at the rear of houses to allow for deliveries and garbage collection are called alleys alleys may be paved or unpaved and a blind alley is a culdesac some alleys are roofed because they are within buildings such as the traboules of lyon or when they are a pedestrian passage through railway embankments in britain the latter follow the line of rightsof way that existed before the railway was built because of topography steps stairs are the predominant form of alley in hilly cities and towns this includes pittsburgh see steps of pittsburgh cincinnati see steps of cincinnati portland oregon seattle and san francisco in the united states as well as hong kong and rome footpaths and other rights of way have been combined and new paths created so as to produce longdistance walking routes in a number of countries these'</li><li>'the minot area growth through investment and cooperation fund or magic fund is a growth fund financed through a one percent sales tax in the city of minot north dakota the fund was approved by voters on may 1 1990 and the money is used for economic development capital improvements and property tax relief as of 2012 the magic fund has invested over 33 million into 200 projects in 44 communities forty percent of the one percent tax is earmarked for economic development and is used to help finance relocations startups and expansions in the minot area minot area development corporation the lead economic development agency for the city of minot targets primary sector businesses such as those in valueadded agriculture knowledgebased business and the energy industry the availability of magic funds makes minot more appealing to businesses the magic fund is very progressive in that it was one of the first growth funds in the state of north dakota and the first one 
to be used regionally when the magic fund was originally established it was designed to operate with minimal guidelines to allow for the high level of flexibility necessary when assembling financing and incentive packages to benefit potential businesses and the community of minot this nonrestrictive nature of the fund has been a source of some criticism though local leadership acknowledges that throughout the life of the magic fund it has been a challenge maintain openness with the public about specific spending while at the same time respecting the confidentiality of business information leaders are striving however to keep communications clearin 2005 new magic fund guidelines were set in place to clearly define “ full time ” and to require a breakdown — not an average of — salaries of proposed positions more recently in october 2008 the guidelines of the magic fund underwent public review and area residents were encouraged to offer suggestions suggestions included making magic funds available for private sector projects such as housing recreation and childcare or using the money for infrastructure purposes such as streets and sewer in order to encourage more housing projects after consideration the guidelines review committee decided to continue using magic funding for businessrelated projects the initial creation of the magic fund in may 1990 established it through 2006 and come june 2004 city voters approved an extension of the 1 city sales tax through the year 2014 the magic fund has a rich history of aiding economic development in the minot region and study after study shows the local economy has benefited drastically from its availability historically magic funds have been used in three main areas of primary sector economic development knowledgebased employment agriculture and energy five of the ten largest employers conducting business in minot today were recruited using magic funds choice hotels international was one of the first businesses to be recruited using'</li><li>'##tes to solve problems everything promised by compact cities can be delivered'</li></ul> | | 16 | <ul><li>'physiographic regions are a means of defining earths landforms into distinct mutually exclusive areas independent of political boundaries it is based upon the classic threetiered approach by nevin m fenneman in 1916 that separates landforms into physiographic divisions physiographic provinces and physiographic sectionsthe classification mechanism has become a popular geographical tool in the united states indicated by the publication of a usgs shapefile that maps the regions of the original work and the national park servicess use of the terminology to describe the regions in which its parks are locatedoriginally used in north america the model became the basis for similar classifications of other continents during the early 1900s the study of regionalscale geomorphology was termed physiography physiography later was considered to be a portmanteau of physical and geography and therefore synonymous with physical geography and the concept became embroiled in controversy surrounding the appropriate concerns of that discipline some geomorphologists held to a geological basis for physiography and emphasized a concept of physiographic regions while a conflicting trend among geographers was to equate physiography with pure morphology separated from its geological heritage in the period following world war ii the emergence of process climatic and quantitative studies led to a preference by many earth scientists for 
the term geomorphology in order to suggest an analytical approach to landscapes rather than a descriptive one in current usage physiography still lends itself to confusion as to which meaning is meant the more specialized geomorphological definition or the more encompassing physical geography definition for the purposes of physiographic mapping landforms are classified according to both their geologic structures and histories distinctions based on geologic age also correspond to physiographic distinctions where the forms are so recent as to be in their first erosion cycle as is generally the case with sheets of glacial drift generally forms which result from similar histories are characterized by certain similar features and differences in history result in corresponding differences of form usually resulting in distinctive features which are obvious to the casual observer but this is not always the case a maturely dissected plateau may grade without a break from rugged mountains on the one hand to mildly rolling farm lands on the other so also forms which are not classified together may be superficially similar for example a young coastal plain and a peneplain in a large number of cases the boundary lines are also geologic lines due to differences in the nature or structure of the underlying rocks the history of physiography itself is at best a complicated effort much of'</li><li>'##ythagoras contrary to popular belief most educated people in the middle ages did not believe the earth was flat this misconception is often called the myth of the flat earth as evidenced by thinkers such as thomas aquinas the european belief in a spherical earth was widespread by this point in time prior to circumnavigation of the planet and the introduction of space flight belief in a spherical earth was based on observations of the secondary effects of the earths shape and parallels drawn with the shape of other planets humans have commonly traveled for business pleasure discovery and adventure all made easier in recent human history as a result of technologies like cars trains planes and ships land navigation is an aspect of travel and refers to progressing through unfamiliar terrain using navigational tools like maps with references to terrain a compass or satellite navigation navigation on land is often facilitated by reference to landmarks – enduring and recognizable natural or artificial features that stand out from their nearby environment and are often visible from long distances natural landmarks can be characteristic features such as mountains or plateaus with examples including table mountain in south africa mount ararat in turkey the grand canyon in the united states uluru in'</li><li>'##width extra versatility compared to the strahler number however unlike the strahler number the pathwidth is defined only for the whole graph and not separately for each node in the graph main stem of a river typically found by following the branch with the highest strahler number pfafstetter coding system'</li></ul> | | 24 | <ul><li>'glenstone is a private contemporary art museum in potomac maryland founded in 2006 by american billionaire mitchell rales and his wife emily wei rales the museums exhibitions are drawn from a collection of about 1300 works from postworld war ii artists around the world it is the largest private contemporary art museum in the united states holding more than 46 billion in net assets and is noted for its setting in a broad natural landscape glenstones original building was designed by 
charles gwathmey with it being expanded several times on its 230acre 93 ha campus its most significant expansion was finished in the late 2010s with outdoor sculpture installations landscaping a new complex designed by thomas phifer and an environmental center being added glenstone has been compared to other private museums such as the frick collection and the phillips collection the museum is free to the public with it seeing over 100000 visitors in 2022 in 1986 billionaire american businessman mitchell rales purchased the property in potomac maryland to build a home starting in 1990 rales began collecting art for that home following a neardeath accident on a helicopter trip in russia rales decided to take on a philanthropic project which became the establishment of a private contemporary art museum built on land that was formerly a fox hunting club glenstone is named for the nearby glen road and because of stone quarries located in the vicinity located 15 miles 24 km from downtown washington dc the museums initial 30000squarefoot 2800 m2 modernist limestone gallery opened in 2006 and admitted visitors two days a week in its first seven years the museum admitted only 10000 visitorsthough several smaller expansions took place in the years after the museums opening the largest expansion was announced in 2013 and was completed in 2018 opening to the public on october 4 2018 with a cost of approximately 219 million the expansion increased the size of the museums gallery space by a factor of five increasing the propertys size by 130 acres 53 ha and included substantial landscaping changes with the expansion glenstone became the largest private contemporary art museum in the united states in 2019 the expansion was named as a museum opening of the year by apollowith the expansion glenstone opened to the public with free tickets available online in the year following the expansion glenstone admitted nearly 100000 visitorsin 2015 glenstone was one of several private museums questioned by the us senate finance committee over its nonprofit tax status after reporting from the new york times had questioned the validity of nonprofit tax status for institutions like glenstone which at the time welcomed very few visitors the committee sought to investigate whether highvalue individuals and families were using private museums as a form of tax shelter committee chairman senator orrin hatch said'</li><li>'in consistently producing organic litter is believed to be more important in reducing erosion than its direct speedreducing effects on raindrops nevertheless gardens are less effective than natural forests in erosion reduction harvesting of rice — the dominant staple of indonesia — influences the use of pekarangans in some ways production in the gardens decreases during riceharvesting season but peaks during the rest of the year lowerincome villagers benefit from the consistent productivity of starch crops in the gardens especially in a period of food shortage prerice harvest or after a failed rice harvest by droughtsettlement dynamics affect pekarangans in various ways expansion of settlements to new lands caused by population growth is the cause of the wide presence of food crops in newly made pekarangans people who resettled via the indonesian transmigration program might support plant diversity in the gardens in the places they migrate to plant species brought by internal migrants need to adapt well to the local environmentcommercialization fragmentation and urbanization are major hazards to 
pekarangans plant diversity these change the organic cycles within the gardens threatening their ecological sustainability commercialization requires a systemic change of crop planting to optimize and produce more crops a pekarangans owner must specialize in its crops making a small number of crops dominate the garden some owners turn them into monoculture gardens fragmentation stems from the traditional system of inheritance consequences from the reduction of plant diversity include the loss of canopy structures and organic litter resulting in less protection of the gardens soil loss of pestcontrol agents increasing the use of pesticides loss of production stability loss of nutrients diversity and the disappearance of yieldssharing culture despite urbanizations negative effect in reducing their plant diversity it increases that of the ornamental plantsa case study of home gardens in napu valley central sulawesi shows that the decrease in soil protection is caused by insufficient soil fertility management regular weeding and waste burning dumping waste in garbage pits instead of using it for compost and spread of inorganic waste the decrease of soil fertility worsens the decrease of crop diversity in the gardens products from pekarangans have multiple uses for example a coconut tree can provide food oil fuel and building materials and also be used in rituals and ceremonies the gardens plants are known for their products nutritional benefits and diversity while rice is low in vitamins a and c products from the gardens offer an abundance of them pekarangans with more perennial crops tend to create more carbohydrates and proteins and those with more annual plants tend to create more portions of vitamin a pekarangans also act as a source of fire'</li><li>'the german fountain turkish alman cesmesi german deutscher brunnen is a gazebo styled fountain in the northern end of old hippodrome sultanahmet square istanbul turkey and across from the mausoleum of sultan ahmed i it was constructed to commemorate the second anniversary of german emperor wilhelm iis visit to istanbul in 1898 it was built in germany then transported piece by piece and assembled in its current site in 1900 the neobyzantine style fountains octagonal dome has eight marble columns and domes interior is covered with golden mosaics the idea of great palace of constantinoples empire lodge kathisma being on the site of the german fountains conflicts with the view that carceres gates of hippodrome was found on the site of the fountain however the hypothesis of carceres gates being on the site enforces the view that quadriga of lysippos was used to stand on the site of the german fountainduring his reign as german emperor and king of prussia wilhelm ii visited several european and eastern countries his trip started in istanbul ottoman empire on 18 october 1898 during the reign of abdulhamid ii according to peter hopkirk the visit to ottoman empire was an ego trip and also had longterm motivations the emperors primary motivation for visiting was to construct the baghdad railway which would run from berlin to the persian gulf and would further connect to british india through persia this railway could provide a short and quick route from europe to asia and could carry german exports troops and artillery at the time the ottoman empire could not afford such a railway and abdulhamid ii was grateful to wilhelms offer but was suspicious over the german motives abdulhamid iis secret service believed that german archeologists in the emperors 
retinue were in fact geologists with designs on the oil wealth of the ottoman empire later the secret service uncovered a german report which noted that the oilfields in mosul northern mesopotamia were richer than that in the caucuses in his first visit wilhelm secured the sale of germanmade rifles to ottoman army and in his second visit he secured a promise for german companies to construct the istanbulbaghdad railway the german government constructed the german fountain for wilhelm ii and empress augustas 1898 istanbul visit according to afife batur the fountains plans were drawn by architect spitta and constructed by architect schoele also german architect carlitzik and italian architect joseph anthony worked on this projectaccording to the ottoman inscription the fountains construction started in the hejira 1319 1898 – 1899 although the inauguration of the fountain was planned to take place on 1'</li></ul> | | 10 | <ul><li>'inhibits the growth of some harmful gramnegative and grampositive bacteria along with yeasts molds and protozoa l reuteri can secrete sufficient amounts of reuterin to inhibit the growth of harmful gut organisms without killing beneficial gut bacteria allowing l reuteri to remove gut invaders while keeping normal gut flora intactreuterin is watersoluble effective in a wide range of ph resistant to proteolytic and lipolytic enzymes and has been studied as a food preservative or auxiliary therapeutic agentreuterin as an extracted compound has been shown capable of killing escherichia coli o157h7 and listeria monocytogenes with the addition of lactic acid increasing its efficacy it has also been demonstrated to kill escherichia coli o157h7 when produced by l reuteri'</li><li>'thus can affect biological function of the fsl lipids in fsl kode constructs include diacyldiakyl eg dope sterols eg cholesterol ceramides one of the important functions of an fsl construct is that it can optimise the presentation of antigens both on cell surfaces and solidphase membranes this optimisation is achieved primarily by the spacer and secondarily by the lipid tail in a typical immunoassay the antigen is deposited directly onto the microplate surface and binds to the surface either in a random fashion or in a preferred orientation depending on the residues present on the surface of this antigen usually this deposition process is uncontrolled in contrast the fsl kode construct bound to a microplate presents the antigen away from the surface in an orientation with a high level of exposure to the environment furthermore typical immunoassays use recombinant peptides rather than discrete peptide antigens as the recombinant peptide is many times bigger than the epitope of interest a lot of undesired and unwanted peptide sequences are also represented on the microplate these additional sequences may include unwanted microbial related sequences as determined by a blast analysis that can cause issues of low level crossreactivity often the mechanism by which an immunoassay is able to overcome this low level activity is to dilute the serum so that the low level microbial reactive antibodies are not seen and only highlevel specific antibodies result in an interpretable result in contrast fsl kode constructs usually use specifically selected peptide fragments up to 40 amino acids thereby overcoming crossreactivity with microbial sequences and allowing for the use of undiluted serum which increases sensitivity the f component can be further enhanced by presentation of it in multimeric formats and with 
specific spacing the four types of multimeric format include linear repeating units linear repeating units with spacing clusters and branching fig 4 the fsl kode construct by nature of its composition in possessing both hydrophobic and hydrophilic regions are amphiphilic or amphipathic this characteristic determines the way in which the construct will interact with surfaces when present in a solution they may form simple micelles or adopt more complex bilayer structures with two simplistic examples shown in fig 5a more complex structures are expected the actual nature of fsl micelles has not been determined however based on normal structural function of micelles it is expected that it will be determined in part by the combination of functional group spacer and lipid together'</li><li>'##n1 il1 etc which do not have a signal sequence they do not use the classical ergolgi pathway these are secreted through various nonclassical pathways at least four nonclassical unconventional protein secretion pathways have been described they include direct protein translocation across the plasma membrane likely through membrane transport proteins blebbing lysosomal secretion release via exosomes derived from multivesicular bodiesin addition proteins can be released from cells by mechanical or physiological wounding and through nonlethal transient oncotic pores in the plasma membrane induced by washing cells with serumfree media or buffers many human cell types have the ability to be secretory cells they have a welldeveloped endoplasmic reticulum and golgi apparatus to fulfill this function tissues that produce secretions include the gastrointestinal tract which secretes digestive enzymes and gastric acid the lungs which secrete surfactants and sebaceous glands which secrete sebum to lubricate the skin and hair meibomian glands in the eyelid secrete meibum to lubricate and protect the eye secretion is not unique to eukaryotes – it is also present in bacteria and archaea as well atp binding cassette abc type transporters are common to the three domains of life some secreted proteins are translocated across the cytoplasmic membrane by the secyeg translocon one of two translocation systems which requires the presence of an nterminal signal peptide on the secreted protein others are translocated across the cytoplasmic membrane by the twinarginine translocation pathway tat gramnegative bacteria have two membranes thus making secretion topologically more complex there are at least six specialized secretion systems in gramnegative bacteria many secreted proteins are particularly important in bacterial pathogenesis type i secretion is a chaperone dependent secretion system employing the hly and tol gene clusters the process begins as a leader sequence on the protein to be secreted is recognized by hlya and binds hlyb on the membrane this signal sequence is extremely specific for the abc transporter the hlyab complex stimulates hlyd which begins to uncoil and reaches the outer membrane where tolc recognizes a terminal molecule or signal on hlyd hlyd recruits tolc to the inner membrane and hlya is excreted outside of the outer membrane via a longtunnel protein channel type i secretion system transports various molecules from ions drugs to'</li></ul> | | 1 | <ul><li>'first to form followed by the oblique shock shock diamonds are most commonly associated with jet and rocket propulsion but they can form in other systems shock diamonds can be seen during gas pipeline blowdowns because the gas is under high pressure and 
exits the blowdown valve at extreme speeds when artillery pieces are fired gas exits the cannon muzzle at supersonic speeds and produces a series of shock diamonds the diamonds cause a bright muzzle flash which can expose the location of gun emplacements to the enemy it was found that when the ratio between the flow pressure and atmospheric pressure is close which can be achieved with a flash suppressor the shock diamonds were greatly minimized adding a muzzle brake to the end of the muzzle balances the pressures and prevents shock diamonds 41 some radio jets powerful jets of plasma that emanate from quasars and radio galaxies are observed to have regularlyspaced knots of enhanced radio emissions 68 the jets travel at supersonic speed through a thin atmosphere of gas in space 51 so it is hypothesized that these knots are shock diamonds index of aviation articles plume hydrodynamics rocket engine nozzle'</li><li>'##al change in location of the marker can be calculated by collecting results from a few markers the degree to which the model is flexibly yielding due to the air load can be calculated there are many different kinds of wind tunnels they are typically classified by the range of speeds that are achieved in the test section as follows lowspeed wind tunnel high speed wind tunnel subsonic and transonic wind tunnel supersonic wind tunnel hypersonic wind tunnel high enthalpy wind tunnelwind tunnels are also classified by the orientation of air flow in the test section with respect to gravity typically they are oriented horizontally as happens during level flight a different class of wind tunnels are oriented vertically so that gravity can be balanced by drag instead of lift and these have become a popular form of recreation for simulating skydiving vertical wind tunnelwind tunnels are also classified based on their main use for those used with land vehicles such as cars and trucks the type of floor aerodynamics is also important these vary from stationary floors through to full moving floors with smaller moving floors and some attempt at boundary level control also being important the main subcategories in the aeronautical wind tunnels are high reynolds number tunnels reynolds number is one of the governing similarity parameters for the simulation of flow in a wind tunnel for mach number less than 03 it is the primary parameter that governs the flow characteristics there are three main ways to simulate high reynolds number since it is not practical to obtain full scale reynolds number by use of a full scale vehicle pressurised tunnels here test gases are pressurised to increase the reynolds number heavy gas tunnels heavier gases like freon and r134a are used as test gases the transonic dynamics tunnel at nasa langley is an example of such a tunnel cryogenic tunnels here test gas is cooled down to increase the reynolds number the european transonic wind tunnel uses this technique highaltitude tunnels these are designed to test the effects of shock waves against various aircraft shapes in near vacuum in 1952 the university of california constructed the first two highaltitude wind tunnels one for testing objects at 50 to 70 miles above the earth and the second for tests at 80 to 200 miles above the earth vstol tunnels vstol tunnels require large cross section area but only small velocities since power varies with the cube of velocity the power required for the operation is also less an example of a vstol tunnel is the nasa langley 14 by 22 ft 43 by 67 m tunnel spin tunnels aircraft have a 
tendency to spin when they stall these tunnels are used to study that phenomenon automotive wind tunnels fall into two categories'</li><li>'high speed requires at least a 2dimensional treatment when all 3 spatial dimensions and perhaps the time dimension as well are important we often resort to computerized solutions of the governing equations the mach number m is defined as the ratio of the speed of an object or of a flow to the speed of sound for instance in air at room temperature the speed of sound is about 340 ms 1100 fts m can range from 0 to ∞ but this broad range falls naturally into several flow regimes these regimes are subsonic transonic supersonic hypersonic and hypervelocity flow the figure below illustrates the mach number spectrum of these flow regimes these flow regimes are not chosen arbitrarily but rather arise naturally from the strong mathematical background that underlies compressible flow see the cited reference textbooks at very slow flow speeds the speed of sound is so much faster that it is mathematically ignored and the mach number is irrelevant once the speed of the flow approaches the speed of sound however the mach number becomes allimportant and shock waves begin to appear thus the transonic regime is described by a different and much more complex mathematical treatment in the supersonic regime the flow is dominated by wave motion at oblique angles similar to the mach angle above about mach 5 these wave angles grow so small that a different mathematical approach is required defining the hypersonic speed regime finally at speeds comparable to that of planetary atmospheric entry from orbit in the range of several kms the speed of sound is now comparatively so slow that it is once again mathematically ignored in the hypervelocity regime as an object accelerates from subsonic toward supersonic speed in a gas different types of wave phenomena occur to illustrate these changes the next figure shows a stationary point m 0 that emits symmetric sound waves the speed of sound is the same in all directions in a uniform fluid so these waves are simply concentric spheres as the soundgenerating point begins to accelerate the sound waves bunch up in the direction of motion and stretch out in the opposite direction when the point reaches sonic speed m 1 it travels at the same speed as the sound waves it creates therefore an infinite number of these sound waves pile up ahead of the point forming a shock wave upon achieving supersonic flow the particle is moving so fast that it continuously leaves its sound waves behind when this occurs the locus of these waves trailing behind the point creates an angle known as the mach wave angle or mach angle μ μ arcsin a v arcsin 1 m displaystyle mu arcsin leftfrac avrightarcsin leftfrac 1mright where a displaystyle a'</li></ul> | | 32 | <ul><li>'for producing precision lengths by stacking components which are joined temporarily in a similar fashion'</li><li>'this step does the preforming of green raw bodies of the mould inserts sintering by sintering the preformed green bodies are compressed and hardened in order to do this the green body is heated to a temperature below the melting temperature the sintering process consists of three phases first the volume and the porosity is reduced and secondly the open porosity is reduced in the third phase sinter necks are formed which enhance the materials strength premachining the step of premachining creates the main form of the optical insert it typically contains four process steps these steps 
are grinding the innerouter diameter grinding the parallelend faces of the insert grindinglapping of the fitting of insert and finally the nearnetshape grinding of the cavity normally the cavity is only premachined to a flat or a bestfit sphere grinding grinding or finishmachining creates the final form and the surface finish of the cavity in the mould insert usually the finish is carried out by grinding a subsequent polishing step is optionally required finish grinding can require several changes of the grinding tool and several truing steps of the tool finishmachining of the mould is an iterative process as long as the machined mould shows deviations from the nominal contour in the measurement step after grinding it has to be reground there is no welldefined border between premachining and fine grinding throughout the grinding process of the cavity the grain size of the tool the feed rate and the cutting depth are reduced whereas machining time increases convex surfaces are easier to manufacture the necessary steps of workpiece preparation are the mould alignment and the mould referencing grinding tool alignment grinding tool referencing and grinding tool truing also have to be done after that polishing can be necessary to remove the anisotropic structure which remains after grinding it can be performed manually or by a cncmachine coating coating is the process step in which a layer is applied on the cavity surface of the optical insert which protects the mould against wear corrosion friction sticking of glass and chemical reactions with glass for coating the surface of moulds by physical vapour deposition pvd metals are evaporated in combination with processgasbased chemicals on the tool surface highly adherent thin coatings are synthesized materials for coatings on optical inserts are platinumbased pvd mostly iridiumalloyed standard diamondlike carbon not yet commercially available sic cvd on sicceramics not yet commercially available have to be postmachined or tialn not yet commercially available to achieve a homogeneous layer thickness the'</li><li>'gag bennet 1974 electricity and modern physics 2nd ed edward arnold uk isbn 0713124598 is grant wr phillips manchester physics 2008 electromagnetism 2nd ed john wiley sons isbn 9780471927129 dj griffiths 2007 introduction to electrodynamics 3rd ed pearson education dorling kindersley isbn 9788177582932 lh greenberg 1978 physics with modern applications holtsaunders international wb saunders and co isbn 0721642470 jb marion wf hornyak 1984 principles of physics holtsaunders international saunders college isbn 4833701952 a beiser 1987 concepts of modern physics 4th ed mcgrawhill international isbn 0071001441 hd young ra freedman 2008 university physics – with modern physics 12th ed addisonwesley pearson international isbn 9780321501301'</li></ul> | | 26 | <ul><li>'between roughness because due to this tangential component plastic deformation comes with a lower load than when ignoring this component a more realistic description then of the area of each single junction that is created is given by with α displaystyle alpha constant and a tangent force f → i displaystyle vec fi applied to the joint to obtain even more realistic considerations the phenomenon of the third body should also be considered ie the presence of foreign materials such as moisture oxides or lubricants between the two solids in contact a coefficient c is then introduced which is able to correlate the shear strength t of the pure material and that of the third body t t b 
displaystyle ttb with 0 c 1 by studying the behavior at the limits it will be that for c 0 t 0 and for c 1 it returns to the condition in which the surfaces are directly in contact and there is no presence of a third body keeping in mind what has just been said it is possible to correct the friction coefficient formula as follows in conclusion the case of elastic bodies in interaction with each other is considered similarly to what we have just seen it is possible to define an equation of the type where in this case k depends on the elastic properties of the materials also for the elastic bodies the tangential force depends on the coefficient c seen above and it will be and therefore a fairly exhaustive description of the friction coefficient can be obtained friction measurements the simplest and most immediate method for evaluating the friction coefficient of two surfaces is the use of an inclined plane on which a block of material is made to slide as can be seen in the figure the normal force of the plane is given by m g cos θ displaystyle mgcos theta while the frictional force is equal to m g sin θ displaystyle mgsin theta this allows us to state that the coefficient of friction can be calculated very easily by means of the tangent of the angle in which the block begins to slip in fact we have then from the inclined plane we moved on to more sophisticated systems which allow us to consider all the possible environmental conditions in which the measurement is made such as the crossroller machine or the pin and disk machine today there are digital machines such as the friction tester which allows by means of a software support to insert all the desired variables another widely used process is the ring compression test a flat ring of the material to be studied is plastically deformed by means of a press if the deformation is an expansion in both the inner and the outer circle then there will be low or zero friction coefficients otherwise for a deformation that expands only in'</li><li>'the metallurgical production of the republic of azerbaijan is considered high due to the large deposits of alunite polymetallic ores deposits of iron ore etc the metallurgy industry of azerbaijan encompasses both ferrous and nonferrous branches ferrous metallurgy includes extraction of iron smelting and refining of iron ore rolling and ferroalloys production the ferrous metallurgy production of the country started to meet the demand of oil and gas industry due to pipe production and grew further in order to improve other branches of the industry dashkasan iron ore in 4 deposits dashkesen south dashkasan hamanchay demiroglu in the valley of goshagarchay plays a key role in development of ferrous metallurgy the cities of baku sumgait and dashkesan are major centers of metallurgy in terms of extraction and processing of iron ore the sumgait piperolling plant produces drill pipes casing tubing oil and gas pipes etc bentonite clay deposits in the village of dash salakhly gazakh district is used in steel smelting baku steel company the largest metallurgical enterprise in azerbaijan was opened in 2001 on the initiative of heydar aliyev with two electric arc furnaces and three rolling lines the annual steel production capacity of company increased to 1000000 tons aluminum copper molybdenum cobalt mercury reserves and most importantly electricity for the smelting process has led to the development of nonferrous metallurgy the zeylik mine in daskasan district is the main provider of the alunite for aluminum production 
the extracted ore here transported through guschualabashli railway to the aluminum plant located in ganja city the obtained aluminum oxide is brought to sumgayit aluminum plant in order produce aluminum metal ganja aluminum plant produces sulfuric acid aluminum oxide and potassium fertilizer through extracted ore from zalik deposit in dashkesen aluminum oxide is also produced in sumgait azergold cjsc created by the presidential decree no 1047 on february 11 2015 implements exploration management and also extraction processing and sale of precious and nonferrous metal ore deposits located within the borders of the country in 2017 the volume of exports of precious metals carried out by this company amounted to 77340 million dollars gold mining began in gedebey in 2009 in 2016 azer gold cjsc began gold mining in the chovdar deposit in 2017 63908 kg of gold was mined which exceeded the 2016 production by 34 times gold production'</li><li>'the material they are most found in these are given in miller indices for simplification purposes cube component 001100 brass component 110112 copper component 112111 s component 123634 the full 3d representation of crystallographic texture is given by the orientation distribution function odf which can be achieved through evaluation of a set of pole figures or diffraction patterns subsequently all pole figures can be derived from the odf the odf is defined as the volume fraction of grains with a certain orientation g displaystyle boldsymbol g odf g 1 v d v g d g displaystyle textodfboldsymbol gfrac 1vfrac dvboldsymbol gdg the orientation g displaystyle boldsymbol g is normally identified using three euler angles the euler angles then describe the transition from the sample ’ s reference frame into the crystallographic reference frame of each individual grain of the polycrystal one thus ends up with a large set of different euler angles the distribution of which is described by the odf the orientation distribution function odf cannot be measured directly by any technique traditionally both xray diffraction and ebsd may collect pole figures different methodologies exist to obtain the odf from the pole figures or data in general they can be classified based on how they represent the odf some represent the odf as a function sum of functions or expand it in a series of harmonic functions others known as discrete methods divide the odf space in cells and focus on determining the value of the odf in each cell in wire and fiber all crystals tend to have nearly identical orientation in the axial direction but nearly random radial orientation the most familiar exceptions to this rule are fiberglass which has no crystal structure and carbon fiber in which the crystalline anisotropy is so great that a goodquality filament will be a distorted single crystal with approximately cylindrical symmetry often compared to a jelly roll singlecrystal fibers are also not uncommon the making of metal sheet often involves compression in one direction and in efficient rolling operations tension in another which can orient crystallites in both axes by a process known as grain flow however cold work destroys much of the crystalline order and the new crystallites that arise with annealing usually have a different texture control of texture is extremely important in the making of silicon steel sheet for transformer cores to reduce magnetic hysteresis and of aluminium cans since deep drawing requires extreme and relatively uniform plasticity texture in ceramics usually arises because the 
crystallites in a slurry'</li></ul> | | 15 | <ul><li>'is could effectively be used as a geneediting tool in human 2pn zygotes which could lead potentially pregnancy viable if implanted the scientists used injection of cas9 protein complexed with the relevant sgrnas and homology donors into human embryos the scientists found homologous recombinationmediated alteration in hbb and g6pd the scientists also noted the limitations of their study and called for further researchin august 2017 a group of scientists from oregon published an article in nature journal detailing the successful use of crispr to edit out a mutation responsible for congenital heart disease the study looked at heterozygous mybpc3 mutation in human embryos the study claimed precise crisprcas9 and homologydirected repair response with high accuracy and precision doublestrand breaks at the mutant paternal allele were repaired using the homologous wildtype gene by modifying the cell cycle stage at which the dsb was induced they were able to avoid mosaicism which had been seen in earlier similar studies in cleaving embryos and achieve a large percentage of homozygous embryos carrying the wildtype mybpc3 gene without evidence of unintended mutations the scientists concluded that the technique may be used for the correction of mutations in human embryos the claims of this study were however pushed back on by critics who argued the evidence was overall unpersuasivein june 2018 a group of scientists published and article in nature journal indicating a potential link for edited cells having increased potential turn cancerous the scientists reported that genome editing by crisprcas9 induced dna damage response and the cell cycle stopped the study was conducted in human retinal pigment epithelial cells and the use of crispr led to a selection against cells with a functional p53 pathway the conclusion of the study would suggest that p53 inhibition might increase efficiency of human germline editing and that p53 function would need to be watched when developing crisprcas9 based therapyin november 2018 a group of chinese scientists published research in the journal molecular therapy detailing their use of crisprcas9 technology to correct a single mistaken amino acid successfully in 16 out of 18 attempts in a human embryo the unusual level of precision was achieved by the use of a base editor be system which was constructed by fusing the deaminase to the dcas9 protein the be system efficiently edits the targeted c to t or g to a without the use of a donor and without dbs formation the study focused on the fbn1 mutation that is causative for mar'</li><li>'by the american nurses association which provides rules regulations and guidelines to follow when making a decision that is ethical based these regulations were mainly established to help provide equal healthcare protect the rights safety and privacy of the patient and to hold nurses accountable for their actions and choices genetics can create ethical issues in nursing for a variety of different situations many scenarios questions and debates have been encountered such as what individuals can receive genetic testing or information who owns or controls the information received from the genetic test and how can the owner use that information however the code of ethics does not address genetics or genomics specifically so ethical foundations were also established to help guide genetics into health care the foundations provide a set of guidelines to understand and manage an ethical issue if 
one should arise and to assist in the translation of genetics into the healthcare environment'</li><li>'than is accurate to the population this is known as the shadow effect the cabrera vole microtus cabrerae is a small endangered rodent that belongs to the microtus genus existing primarily in portugal populations can be difficult to estimate using typical markrecapture methods due to their small size and ability to quickly disperse over large swaths of prairie land with the introduction and reduced cost of using environmental dna in this case feces were able to be used in a relatively low cost experiment to estimate the population size of the cabrera vole in southern portugal in return for sacrificing demographic age sex health information endangered species act of 1973'</li></ul> | ## Evaluation ### Metrics | Label | Accuracy | |:--------|:---------| | **all** | 0.6909 | ## Uses ### Direct Use for Inference First install the SetFit library: ```bash pip install setfit ``` Then you can load this model and run inference. ```python from setfit import SetFitModel # Download from the 🤗 Hub model = SetFitModel.from_pretrained("udrearobert999/multi-qa-mpnet-base-cos-v1-test") # Run inference preds = model("##rch procedure that evaluates the objective function p x displaystyle pmathbf x on a grid of candidate source locations g displaystyle mathcal g to estimate the spatial location of the sound source x s displaystyle textbf xs as the point of the grid that provides the maximum srp modifications of the classical srpphat algorithm have been proposed to reduce the computational cost of the gridsearch step of the algorithm and to increase the robustness of the method in the classical srpphat for each microphone pair and for each point of the grid a unique integer tdoa value is selected to be the acoustic delay corresponding to that grid point this procedure does not guarantee that all tdoas are associated to points on the grid nor that the spatial grid is consistent since some of the points may not correspond to an intersection of hyperboloids this issue becomes more problematic with coarse grids since when the number of points is reduced part of the tdoa information gets lost because most delays are not anymore associated to any point in the grid the modified srpphat collects and uses the tdoa information related to the volume surrounding each spatial point of the search grid by considering a modified objective function where l m 1 m 2 l x displaystyle lm1m2lmathbf x and l m 1 m 2 u x displaystyle lm1m2umathbf x are the lower and upper accumulation limits of gcc delays which depend on the spatial location x displaystyle mathbf x the accumulation limits can be calculated beforehand in an exact way by exploring the boundaries separating the regions corresponding to the points of the grid alternatively they can be selected by considering the spatial gradient of the tdoa ∇ τ m 1 m 2 x ∇ x τ m 1 m 2 x ∇ y τ m 1 m 2 x ∇ z τ m 1 m 2 x t displaystyle nabla tau m1m2mathbf x nabla xtau m1m2mathbf x nabla ytau m1m2mathbf x nabla ztau m1m2mathbf x t where each component γ ∈ x y z displaystyle gamma in leftxyzright of the gradient is for a rectangular grid where neighboring points are separated a distance r displaystyle r the lower and upper accumulation limits are given by where d r 2 min 1 sin θ cos [UNK] 1 sin θ sin [UNK] 1 cos θ displaystyle dr2min leftfrac 1vert sintheta cosphi vert frac 1vert sintheta sinphi vert frac 1vert") ``` <!-- ### Downstream Use *List how someone could finetune this model on 
their own dataset.* --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Set Metrics | Training set | Min | Median | Max | |:-------------|:----|:---------|:----| | Word count | 1 | 370.3098 | 509 | | Label | Training Sample Count | |:------|:----------------------| | 0 | 50 | | 1 | 50 | | 2 | 50 | | 3 | 50 | | 4 | 50 | | 5 | 50 | | 6 | 50 | | 7 | 50 | | 8 | 50 | | 9 | 50 | | 10 | 50 | | 11 | 50 | | 12 | 50 | | 13 | 50 | | 14 | 50 | | 15 | 50 | | 16 | 50 | | 17 | 50 | | 18 | 50 | | 19 | 50 | | 20 | 50 | | 21 | 50 | | 22 | 50 | | 23 | 50 | | 24 | 50 | | 25 | 50 | | 26 | 50 | | 27 | 50 | | 28 | 50 | | 29 | 50 | | 30 | 50 | | 31 | 50 | | 32 | 50 | | 33 | 50 | | 34 | 50 | | 35 | 50 | | 36 | 50 | | 37 | 50 | | 38 | 50 | | 39 | 50 | | 40 | 50 | | 41 | 50 | | 42 | 50 | ### Training Hyperparameters - batch_size: (16, 16) - num_epochs: (1, 4) - max_steps: -1 - sampling_strategy: oversampling - num_iterations: 10 - body_learning_rate: (2e-05, 0.01) - head_learning_rate: 0.01 - loss: CosineSimilarityLoss - distance_metric: cosine_distance - margin: 0.25 - end_to_end: False - use_amp: False - warmup_proportion: 0.1 - max_length: 512 - seed: 42 - eval_max_steps: -1 - load_best_model_at_end: True ### Training Results | Epoch | Step | Training Loss | Validation Loss | |:----------:|:--------:|:-------------:|:---------------:| | 0.0004 | 1 | 0.3114 | - | | 0.1860 | 500 | 0.0379 | - | | 0.3720 | 1000 | 0.1131 | - | | 0.5580 | 1500 | 0.0567 | - | | **0.7440** | **2000** | **0.0168** | **0.1033** | | 0.9301 | 2500 | 0.0033 | - | * The bold row denotes the saved checkpoint. ### Framework Versions - Python: 3.10.12 - SetFit: 1.0.3 - Sentence Transformers: 2.7.0 - Transformers: 4.40.1 - PyTorch: 2.2.1+cu121 - Datasets: 2.19.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
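As an additional usage note: besides hard label predictions, SetFit models can also return per-class probabilities. The following is a minimal sketch, assuming the `predict_proba` method is available in the SetFit version listed above (1.0.3):

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("udrearobert999/multi-qa-mpnet-base-cos-v1-test")

texts = ["shock diamonds form in the exhaust of supersonic jets"]

# Hard label over the 43 classes (0-42) seen in the training set
preds = model.predict(texts)

# Per-class probability distribution; one row per input text
probs = model.predict_proba(texts)
print(probs.shape)  # (1, 43)
```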
[ "TEXT_CLASSIFICATION", "TRANSLATION" ]
[ "PCR" ]
Non_BioNLP
nold/CroissantLLMChat-v0.1-GGUF
nold
text2text-generation
[ "gguf", "legal", "code", "text-generation-inference", "art", "text2text-generation", "fr", "en", "dataset:croissantllm/croissant_dataset", "dataset:croissantllm/CroissantLLM-2201-sft", "dataset:cerebras/SlimPajama-627B", "dataset:uonlp/CulturaX", "dataset:pg19", "dataset:bigcode/starcoderdata", "arxiv:2402.00786", "license:mit", "endpoints_compatible", "region:us", "conversational" ]
1,707
1,708
80
0
--- datasets: - croissantllm/croissant_dataset - croissantllm/CroissantLLM-2201-sft - cerebras/SlimPajama-627B - uonlp/CulturaX - pg19 - bigcode/starcoderdata language: - fr - en license: mit pipeline_tag: text2text-generation tags: - legal - code - text-generation-inference - art --- # CroissantLLMChat (190k steps + Chat) This model is part of the CroissantLLM initiative, and corresponds to the checkpoint after 190k steps (2.99 T) tokens and a final Chat finetuning phase. https://arxiv.org/abs/2402.00786 For best performance, it should be used with a temperature of 0.3 or more, and with the exact template described below: ```python chat = [ {"role": "user", "content": "Que puis-je faire à Marseille en hiver?"}, ] chat_input = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True) ``` corresponding to: ```python chat_input = """<|im_start|>user {USER QUERY}<|im_end|> <|im_start|>assistant\n""" ``` ## Abstract We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware. To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources. To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French Language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases, and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models, and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives. This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models. ## Citation Our work can be cited as: ```bash @misc{faysse2024croissantllm, title={CroissantLLM: A Truly Bilingual French-English Language Model}, author={Manuel Faysse and Patrick Fernandes and Nuno M. Guerreiro and António Loison and Duarte M. Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro H. Martins and Antoni Bigata Casademunt and François Yvon and André F. T. Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo}, year={2024}, eprint={2402.00786}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ## Usage This model is a Chat model, that is, it is finetuned for Chat function and works best with the provided template. 
```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer model_name = "croissantllm/CroissantLLMChat-v0.1" tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto") chat = [ {"role": "user", "content": "Que puis-je faire à Marseille en hiver?"}, ] chat_input = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True) inputs = tokenizer(chat_input, return_tensors="pt", add_special_tokens=True).to(model.device) tokens = model.generate(**inputs, max_new_tokens=150, do_sample=True, top_p=0.95, top_k=60, temperature=0.3) print(tokenizer.decode(tokens[0])) ``` ## Model limitations Evaluation results indicate the model is strong in its size category, and offers decent performance on writing-based tasks and internal knowledge, and very strong performance on translation tasks. The small size of the CroissantLLM model, however, hinders its capacity to perform more complex reasoning-based tasks, at least in a zero- or few-shot manner in its generalist base or chat-model versions. This is aligned with other models of this size and underlines the importance of scale for more abstract tasks. #### Knowledge Cutoff The model training dataset has a data cutoff date corresponding to the November 2023 Wikipedia dump. This is the de facto knowledge cutoff date for our base model, although a lot of information dates back further. Updated versions can be trained through continued pre-training or subsequent fine-tuning. #### Multilingual Performance CroissantLLM is mostly a French and English model. Code performance is relatively limited, and although some amount of data from other languages is included within the SlimPajama training set, out-of-the-box performance in other languages is not to be expected, although some European languages do work quite well. #### Hallucinations CroissantLLM can hallucinate and output factually incorrect data, especially regarding complex topics. This is to be expected given the small model size, and hallucination rates seem lower than for most models of the same size category, although no quantitative assessments have been conducted outside of MT-Bench experiments. *** Quantization of Model [croissantllm/CroissantLLMChat-v0.1](https://huggingface.co/croissantllm/CroissantLLMChat-v0.1). Created using the [llm-quantizer](https://github.com/Nold360/llm-quantizer) pipeline.
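As a usage note for the quantized files in this repository: below is a minimal sketch of running a GGUF file locally with llama-cpp-python, applying the chat template described above. The filename is an assumption — substitute whichever quantization level is actually present in the repo:

```python
from llama_cpp import Llama

# Hypothetical filename -- replace with a GGUF file from this repository.
llm = Llama(model_path="CroissantLLMChat-v0.1-Q4_K_M.gguf", n_ctx=2048)

# The exact chat template from the card, with the generation prompt appended.
prompt = (
    "<|im_start|>user\n"
    "Que puis-je faire à Marseille en hiver?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

out = llm(
    prompt,
    max_tokens=150,
    temperature=0.3,  # the card recommends a temperature of 0.3 or more
    top_p=0.95,
    top_k=60,
    stop=["<|im_end|>"],  # end of the assistant turn in this template
)
print(out["choices"][0]["text"])
```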
[ "TRANSLATION" ]
[ "CRAFT" ]
Non_BioNLP
juanpablomesa/bge-base-financial-matryoshka
juanpablomesa
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:9600", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss", "en", "arxiv:1908.10084", "arxiv:2205.13147", "arxiv:1705.00652", "base_model:BAAI/bge-base-en-v1.5", "base_model:finetune:BAAI/bge-base-en-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,719
1,719
47
0
--- base_model: BAAI/bge-base-en-v1.5 datasets: [] language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:9600 - loss:MatryoshkaLoss - loss:MultipleNegativesRankingLoss widget: - source_sentence: The median home value in San Carlos, CA is $2,350,000. sentences: - What does the console property of the WorkerGlobalScope interface provide access to? - What is the last sold price and date for the property at 4372 W 14th Street Dr, Greeley, CO 80634? - What is the median home value in San Carlos, CA? - source_sentence: The four new principals hired by Superintendent of Schools Ken Kenworthy for the Okeechobee school system are Joseph Stanley at Central Elementary, Jody Hays at Yearling Middle School, Tuuli Robinson at North Elementary, and Dr. Thelma Jackson at Seminole Elementary School. sentences: - Who won the gold medal in the men's 1,500m final at the speed skating World Cup? - What is the purpose of the 1,2,3 bowling activity for toddlers? - Who are the four new principals hired by Superintendent of Schools Ken Kenworthy for the Okeechobee school system? - source_sentence: Twitter Audit is used to scan your followers and find out what percentage of them are real people. sentences: - What is the main product discussed in the context of fair trade? - What is the software mentioned in the context suitable for? - What is the purpose of the Twitter Audit tool? - source_sentence: Michael Czysz made the 2011 E1pc lighter and more powerful than the 2010 version, and also improved the software controlling the bike’s D1g1tal powertrain. sentences: - What changes did Michael Czysz make to the 2011 E1pc compared to the 2010 version? - What is the author's suggestion for leaving a legacy for future generations? - What is the most affordable and reliable option to fix a MacBook according to the technician? - source_sentence: HTC called the Samsung Galaxy S4 “mainstream”. sentences: - What is the essential aspect of the vocation to marriage according to Benedict XVI's message on the 40th Anniversary of Humanae Vitae? - What did HTC announce about the Samsung Galaxy S4? - What was Allan Cox's First Class Delivery launched on for his Level 1 certification flight? 
model-index: - name: BGE base Financial Matryoshka results: - task: type: information-retrieval name: Information Retrieval dataset: name: dim 768 type: dim_768 metrics: - type: cosine_accuracy@1 value: 0.9675 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9791666666666666 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9829166666666667 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.98875 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.9675 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.3263888888888889 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.1965833333333333 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09887499999999999 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.9675 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9791666666666666 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9829166666666667 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.98875 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.9776735843960416 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.9741727843915341 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.974471752833939 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 512 type: dim_512 metrics: - type: cosine_accuracy@1 value: 0.9641666666666666 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9775 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9816666666666667 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.98875 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.9641666666666666 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.3258333333333333 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.1963333333333333 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09887499999999999 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.9641666666666666 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9775 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9816666666666667 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.98875 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.9758504869144781 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.9717977843915344 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.9720465527215371 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 256 type: dim_256 metrics: - type: cosine_accuracy@1 value: 0.9620833333333333 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9741666666666666 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9804166666666667 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.98625 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.9620833333333333 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.32472222222222225 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.1960833333333333 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09862499999999999 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.9620833333333333 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9741666666666666 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9804166666666667 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.98625 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.9737941784937224 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.9698406084656085 name: Cosine Mrr@10 - 
type: cosine_map@100 value: 0.9702070899963996 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 128 type: dim_128 metrics: - type: cosine_accuracy@1 value: 0.9554166666666667 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.97 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9766666666666667 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.98375 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.9554166666666667 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.3233333333333333 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.1953333333333333 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09837499999999999 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.9554166666666667 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.97 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9766666666666667 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.98375 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.969307497603498 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.9647410714285715 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.9652034022263717 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 64 type: dim_64 metrics: - type: cosine_accuracy@1 value: 0.9391666666666667 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9616666666666667 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9666666666666667 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9758333333333333 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.9391666666666667 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.3205555555555556 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.1933333333333333 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09758333333333333 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.9391666666666667 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9616666666666667 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9666666666666667 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9758333333333333 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.9577277779716886 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.9519417989417989 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.9525399354798056 name: Cosine Map@100 --- # BGE base Financial Matryoshka This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
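Because this model was trained with a Matryoshka loss over the dimensions 768/512/256/128/64 (see the evaluation tables and loss configuration below), its embeddings can be truncated to a smaller dimension with only a modest drop in retrieval quality. The following is a minimal sketch, assuming the installed sentence-transformers version supports the `truncate_dim` argument (available since v2.7; this card lists v3.0.1):

```python
from sentence_transformers import SentenceTransformer

# Load the model so that embeddings are truncated to 256 dimensions,
# one of the Matryoshka dimensions this model was trained on.
model = SentenceTransformer(
    "juanpablomesa/bge-base-financial-matryoshka",
    truncate_dim=256,
)

embeddings = model.encode([
    "What is the median home value in San Carlos, CA?",
    "The median home value in San Carlos, CA is $2,350,000.",
])
print(embeddings.shape)  # (2, 256)

# Cosine similarity still works on the truncated vectors.
print(model.similarity(embeddings, embeddings).shape)  # (2, 2)
```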
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("juanpablomesa/bge-base-financial-matryoshka") # Run inference sentences = [ 'HTC called the Samsung Galaxy S4 “mainstream”.', 'What did HTC announce about the Samsung Galaxy S4?', "What is the essential aspect of the vocation to marriage according to Benedict XVI's message on the 40th Anniversary of Humanae Vitae?", ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset.
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `dim_768` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.9675 | | cosine_accuracy@3 | 0.9792 | | cosine_accuracy@5 | 0.9829 | | cosine_accuracy@10 | 0.9888 | | cosine_precision@1 | 0.9675 | | cosine_precision@3 | 0.3264 | | cosine_precision@5 | 0.1966 | | cosine_precision@10 | 0.0989 | | cosine_recall@1 | 0.9675 | | cosine_recall@3 | 0.9792 | | cosine_recall@5 | 0.9829 | | cosine_recall@10 | 0.9888 | | cosine_ndcg@10 | 0.9777 | | cosine_mrr@10 | 0.9742 | | **cosine_map@100** | **0.9745** | #### Information Retrieval * Dataset: `dim_512` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:----------| | cosine_accuracy@1 | 0.9642 | | cosine_accuracy@3 | 0.9775 | | cosine_accuracy@5 | 0.9817 | | cosine_accuracy@10 | 0.9888 | | cosine_precision@1 | 0.9642 | | cosine_precision@3 | 0.3258 | | cosine_precision@5 | 0.1963 | | cosine_precision@10 | 0.0989 | | cosine_recall@1 | 0.9642 | | cosine_recall@3 | 0.9775 | | cosine_recall@5 | 0.9817 | | cosine_recall@10 | 0.9888 | | cosine_ndcg@10 | 0.9759 | | cosine_mrr@10 | 0.9718 | | **cosine_map@100** | **0.972** | #### Information Retrieval * Dataset: `dim_256` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.9621 | | cosine_accuracy@3 | 0.9742 | | cosine_accuracy@5 | 0.9804 | | cosine_accuracy@10 | 0.9862 | | cosine_precision@1 | 0.9621 | | cosine_precision@3 | 0.3247 | | cosine_precision@5 | 0.1961 | | cosine_precision@10 | 0.0986 | | cosine_recall@1 | 0.9621 | | cosine_recall@3 | 0.9742 | | cosine_recall@5 | 0.9804 | | cosine_recall@10 | 0.9862 | | cosine_ndcg@10 | 0.9738 | | cosine_mrr@10 | 0.9698 | | **cosine_map@100** | **0.9702** | #### Information Retrieval * Dataset: `dim_128` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.9554 | | cosine_accuracy@3 | 0.97 | | cosine_accuracy@5 | 0.9767 | | cosine_accuracy@10 | 0.9838 | | cosine_precision@1 | 0.9554 | | cosine_precision@3 | 0.3233 | | cosine_precision@5 | 0.1953 | | cosine_precision@10 | 0.0984 | | cosine_recall@1 | 0.9554 | | cosine_recall@3 | 0.97 | | cosine_recall@5 | 0.9767 | | cosine_recall@10 | 0.9838 | | cosine_ndcg@10 | 0.9693 | | cosine_mrr@10 | 0.9647 | | **cosine_map@100** | **0.9652** | #### Information Retrieval * Dataset: `dim_64` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | 
Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.9392 | | cosine_accuracy@3 | 0.9617 | | cosine_accuracy@5 | 0.9667 | | cosine_accuracy@10 | 0.9758 | | cosine_precision@1 | 0.9392 | | cosine_precision@3 | 0.3206 | | cosine_precision@5 | 0.1933 | | cosine_precision@10 | 0.0976 | | cosine_recall@1 | 0.9392 | | cosine_recall@3 | 0.9617 | | cosine_recall@5 | 0.9667 | | cosine_recall@10 | 0.9758 | | cosine_ndcg@10 | 0.9577 | | cosine_mrr@10 | 0.9519 | | **cosine_map@100** | **0.9525** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 9,600 training samples * Columns: <code>positive</code> and <code>anchor</code> * Approximate statistics based on the first 1000 samples: | | positive | anchor | |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 3 tokens</li><li>mean: 50.19 tokens</li><li>max: 435 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 18.66 tokens</li><li>max: 43 tokens</li></ul> | * Samples: | positive | anchor | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------| | <code>The Berry Export Summary 2028 is a dedicated export plan for the Australian strawberry, raspberry, and blackberry industries. It maps the sectors’ current position, where they want to be, high-opportunity markets, and next steps. The purpose of this plan is to grow their global presence over the next 10 years.</code> | <code>What is the Berry Export Summary 2028 and what is its purpose?</code> | | <code>Benefits reported from having access to Self-supply water sources include convenience, less time spent for fetching water and access to more and better quality water. 
In some areas, Self-supply sources offer important added values such as water for productive use, income generation, family safety and improved food security.</code> | <code>What are some of the benefits reported from having access to Self-supply water sources?</code> | | <code>The unique features of the Coolands for Twitter app include Real-Time updates without the need for a refresh button, Avatar Indicator which shows small avatars on the title bar for new messages, Direct Link for intuitive and convenient link opening, Smart Bookmark to easily return to previous reading position, and User Level Notification which allows customized notification settings for different users.</code> | <code>What are the unique features of the Coolands for Twitter app?</code> | * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters: ```json { "loss": "MultipleNegativesRankingLoss", "matryoshka_dims": [ 768, 512, 256, 128, 64 ], "matryoshka_weights": [ 1, 1, 1, 1, 1 ], "n_dims_per_step": -1 } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: epoch - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `gradient_accumulation_steps`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 4 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `bf16`: True - `tf32`: True - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: epoch - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 16 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: True - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - 
`optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 | |:--------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|:----------------------:| | 0.5333 | 10 | 0.6065 | - | - | - | - | - | | 0.96 | 18 | - | 0.9583 | 0.9674 | 0.9695 | 0.9372 | 0.9708 | | 1.0667 | 20 | 0.3313 | - | - | - | - | - | | 1.6 | 30 | 0.144 | - | - | - | - | - | | 1.9733 | 37 | - | 0.9630 | 0.9699 | 0.9716 | 0.9488 | 0.9745 | | 2.1333 | 40 | 0.1317 | - | - | - | - | - | | 2.6667 | 50 | 0.0749 | - | - | - | - | - | | 2.9867 | 56 | - | 0.9650 | 0.9701 | 0.9721 | 0.9522 | 0.9747 | | 3.2 | 60 | 0.088 | - | - | - | - | - | | 3.7333 | 70 | 0.0598 | - | - | - | - | - | | **3.84** | **72** | **-** | **0.9652** | **0.9702** | **0.972** | **0.9525** | **0.9745** | * The bold row denotes the saved checkpoint. 
### Framework Versions
- Python: 3.11.5
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.1.2+cu121
- Accelerate: 0.31.0
- Datasets: 2.19.1
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
[ "TEXT_CLASSIFICATION" ]
[ "MEDAL" ]
Non_BioNLP
svorwerk/setfit-fine-tuned-demo-class_hpo
svorwerk
text-classification
[ "setfit", "safetensors", "mpnet", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:sentence-transformers/all-mpnet-base-v2", "base_model:finetune:sentence-transformers/all-mpnet-base-v2", "region:us" ]
1,706
1,706
5
0
--- base_model: sentence-transformers/all-mpnet-base-v2 library_name: setfit metrics: - accuracy pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification - generated_from_setfit_trainer widget: - text: 'Acquisition Id: ALOG; Ancotel; Asia Tone; Bit-Isle; IXEurope; Infomart; Itconic; OGS; SafeGuard; Switch and Data; Telecity; Verizon; Virtu; Zenium; nan' - text: 'Emp_FLSA: E; N; P; V; X; nan' - text: 'Emp Grade: A; AA; AS 1; AS1; AS2; AS3; B; C; COM; D; E; E I; E II; E10; E11; E12; E13; E7; E8; E9; J01; J02; J03; J04; JS; MM I; MM II; N01; N02; N03; N05; N06; N07; N08; NE 2; NE 3; NE 4; NE 5; NE 6; NE1; NE2; NE3; NE4; NF; NG; Non-exempt; S; SA I; SA II; SA III; SA4; SA5; SE I; SE II; SM I; SM II; SP; SS; nan' - text: 'Loc State: AB; AL; AR; AZ; BC; CA; CO; CT; DC; DE; FL; GA; GTL; GTM; Gujarat; HI; IA; ID; IL; IN; KS; KY; Karnataka; LA; MA; MD; ME; MI; MN; MO; MS; MT; Maharashtra; NB; NC; ND; NE; NH; NJ; NM; NS; NV; NY; OH; OK; ON; OR; PA; PR; QC; RI; SA; SC; SD; TN; TX; UT; VA; VT; WA; WI; WV; WY; nan' - text: 'Supervisory Organization: 3 PL & Warehousing Management (Harald Dehe); 3 PL Management (Ryoichi Imamura (Inherited)); 3A-Modul (Arnd Vollmerhausen (Inherited)); 3A-Modul (Erkan nder); 3A-Modul (Erkan √ñnder); ABN ( Antibodies/NAT) (Thomas Roten); AD&V - Assay Development & Validation (Svenja Wolff); AESD Aurora Build Elec & Warehouse (Chris Askitis); AM Ural, Siberia & Far East (Anna Tov); AM Volga & South (Natalia Ivanova); ANI Region ICO (Ivaylo Vladimirov); APAC Payroll (Rohit Jain); API Manufacturing (Luke McIntyre); API Manufacturing (Ryan Cox (Inherited)); API Manufacturing (Union) (Luke McIntyre (Inherited)); API Manufacturing (Union) (Ryan Cox (Inherited)); AQL (Jens Huft); AS&T (Jeff Hancock); ASQ Analytical Sciences & Technology (Michael Sch√ºmann); ASQ Biochemistry (Sven Karschnia); ASQ Chemistry (Kerstin Nske); ASQ Chemistry (Kerstin N√∂ske); ASQ Compliance & Release (Angelika Jung); ASQ Data Integrity (Christoph Kircher); ASQ External Materials & Support (Angelika Jung); ASQ External Materials & Support (Simone Lang); ASQ Microbiology (Thomas Bhler); ASQ Microbiology (Thomas B√ºhler); ASQ Potency (Sven Karschnia); ASQ Potency (Volker Gawantka (Inherited)); ATR Financial Planning (John Sullivan); ATR General Accounting (Tanja Roth); AUS Government & Media Relations (Sharon McHale); Abfllteam 1 (Heiko Brhl); Abfllteam 2 (Sorin Suditu); Abfllteam 3 (Dirk Fischer); Abfllteam 4 (Heiko Stein); Abfllteam 5 (Eduard Wegner); Abfllteam 6 (Maik Jesberg); Abfllteam 8 (Murat Midik); Abf√ºllteam 1 (Heiko Br√ºhl); Abf√ºllteam 2 (Sorin Suditu); Abf√ºllteam 3 (Dirk Fischer); Abf√ºllteam 4 (Heiko Stein); Abf√ºllteam 5 (Eduard Wegner); Abf√ºllteam 6 (Maik Jesberg); Abf√ºllteam 7 (Holger Kuhl (On Leave)); Abf√ºllteam 7 (Holger Kuhl); Abf√ºllteam 8 (Murat Midik); Abilene 701 (Ophelia M Cavalier); Abilene 701 (Sara M Schuppe); Abilene 701 ACM Area 1 (Traci Bradford); Abilene 701 ACM Area 2 (Joel Hutsell); Abilene 701 QA (Wes Scruggs); Account to Report (Andrew Croft (Inherited)); Account to Report (Linda Carducci (Inherited)); Account to Report (Michael Kochanski); Accounting & Reporting (Jeffrey H Schorr); Accounting (Annalisa Saracchi (Inherited)); Accounting (Paul Fellingham (Inherited)); Accounting France (Charlotte Rougi (Inherited)); Accounting France (Charlotte Rougi√© (Inherited)); Accounting Geschftsbuchhaltung (Jens Dettbarn); Accounting Gesch√§ftsbuchhaltung (Jens Dettbarn); Accounting Operations (Brian T Simeur); Accounting Operations (Denna 
DeWitt); Accounting Operations (Manish Devnani); Accounting Operations (Patrick Zuercher); Accounting Operations 1 (Denna DeWitt); Accounts Payable & Receivable (Yvonne Robson); Accounts Payable (Brian T Simeur (Inherited)); Accounts Payable (Mark Staniford); Accounts Payable (Susan Velazquez); Accounts Receivable (Stephen Fairhurst); Accounts Team Lead (Rebecca Merrett); Acquired Bleeding & Critical Care Group (Mitsuhiro Kuwahara); Acquired Bleeding & Critical Care Group (Takashi Ishijima ??? ?? - ???? ????); Acquired Bleeding TA (Naoki Ikeguchi); Acting Manager, Innovation R&D (Chi Ong); Adaptive Clinical Trial Technologies (Liz Quinlan); Administration (Franz Gr√ºn); Administration (Juli Zhu ????); Administration (Sandy Tan ); Administration (Sandy Tan ?????); Administration Italy (Antonella Carluccio); Administration Italy (Cristina Fabbro); Administration/Production Support Subject Matter Expert (Justin Ryan); Administration/Production Support Systems Officer (Adam Kraan); Advertising (Raimund Wybranietz); Akron 402 (Joe Jacko); Akron 402 ACM Area 1 (LeAnna M Mauger); Akron 402 ACM Area 2 (Tonya R Robinson); Akron 402 ACM Area 3 (Brett Ferrell); Akron 402 QA (Carrie Piggford); Akron 402 QA (Christine Thomas); Akron 402 QA (Esence Hambrick); AlbuRX (Tom Hubert); AlbuRx (Barry Lynch); AlbuRx Bulk (Aoife Corrigan); AlbuRx Bulk Mfg (Joel Rainey); AlbuRx Packing (Carlo Volpe); Albumin & IVIG Lyo Bulkmanufacturin (Irne Stempfel); Albumin & IVIG Lyo Bulkmanufacturin (Ir√®ne Stempfel); Albumin (Fritz Rentsch); Albumin (Juerg Hofmaenner); Albumin Bulk Team 1 (Fritz Rentsch); Albumin Bulk Team 2 (Andreas L√ºdi); Albuquerque 034 (Latoya K Gilbert-Torres); Albuquerque 034 ACM Area 1 (Dee Ulibarri); Albuquerque 034 ACM Area 2 (Gerardo Ruelas); Albuquerque 034 QA (Antoinette F Tapia); Albuquerque 137 (Preston W Minor); Albuquerque 137 ACM Area 1 (Brian Trujillo); Albuquerque 137 ACM Area 2 (Melisa R Cox); Albuquerque 137 QA (Daniel Venn); Albuquerque 137 QA (Danny Kinder); Albuquerque 137 QA (Kelsey Gaffney); Alcohol Inventory (Union) (Michael D Proctor); Allentown 294 (Susan Tudela); Allentown 294 QA (Mercy Cobbinah); Allergy National Sales Team (Lea Leon); Allergy National Sales Team (Lea Rajendran); Alliance and Governance (Andrea Lehmann); Alliance and Project Management Systems (Jos√© Maldonado); Amarillo 283 (Jerica Hunter); Amarillo 283 ACM Area 1 (Nicole Taylor); Amarillo 283 ACM Area 1 (ROBERT WILLIAMS); Amarillo 283 ACM Area 2 (Nicole Taylor); Amarillo 283 ACM Area 2 (ROBERT WILLIAMS); Amarillo 283 QA (Michael L Allen); America''s Service Desk (Delilah Harden); Americas HR Ops Tier 1 (Alana DeWeever); Americas HR Ops Tier 1 (Monica Silveira); Americas Service Delivery and Plasma Tech (David G Bersch); Americas Service Operations (Mary Jane McPherson); Analytical Development (Jayendra Shankar); Analytical Drug Product Development (Jiang Qian); Analytical Science & Technology (Oliver Lffler); Analytical Science & Technology (Oliver L√∂ffler); Analytical Science & Technology Holly Springs (Jeffrey Pederson (Inherited)); Analytical Science & Technology Holly Springs (Jessica Gambill); Analytical Science & Technology Liverpool (Jeff Hancock); Analytical Science & Technology Liverpool (Jeffrey Hancock); Analytical Science & Technology Parkville (Tim Karla); Analytical Services Quality (Simone Naruhn); Analytical Services Quality (Volker Gawantka); Analytical Software Technology 2 (Jan Bursy); Analytics R&D (Patrick Schuetz); Animal Services Manager (Phil Franchina); Animal Services Manager 
(Rachel Borg, Phil Franchina); Animal facility 2 (Elmar Raquet); Anlagensicherheit & Konzessionen (Jrgen Arnold); Anlagensicherheit & Konzessionen (J√ºrgen Arnold); Application Security (Riti Arya); Applications, Plateau (Mark S Mantarian); Applications, Plateau (Trevor Alcorn); Applications, Plateau I (Trevor Alcorn); Apprentices (Kevin Liechti); Apprentices and Trainees ES (Rolf Isenschmid); Apprenticeship (Sandra Zbinden); ArbEG (Beate Binsack); Arbeitssicherheit (zcan Campinar (Inherited)); Arbeitssicherheit (√ñzcan Campinar (Inherited)); Area Business Manager 725 (Danielle Traum); Area Business Manager 725 (Eva Merce Maldonado); Area Business Manager 725 (Nick Croton); Area Business Manager 726 (Cameron McCulloch); Area Business Manager 728 (Graham Cluley); Area Business Manager 781 (Danielle Traum); Area Business Manager 781 (Helen Kostopoulos); Area Business Manager 783 (Melissa Weier); Area Business Manager 785 (David Clapin); Area Business Manager 786 (David Brown); Area Business Manager 786 (Peter Moxham); Area Innovation Operations (Cole D Kimple); Argentina Cluster (Carmen Pereyra (Inherited)); Argentina Cluster (Carmen Rosa Pereyra Davila (Inherited)); Artwork Packaging (Ratana Lim); Arvada 129 (Colleen A Irish); Arvada 129 ACM Area 1 (Robert Young); Arvada 129 ACM Area 2 (Jason T Studtmann); Arvada 129 QA (Alfredo Castillo (On Leave)); Arvada 129 QA (Alfredo Castillo); Aseptic Cert (Anja Djordjevich); Aseptic Cert (Grace Luong); Aseptic Filling (Benjamin Dudok); Aseptic Filling I (Eveline Kindler (Inherited)); Aseptic Filling Team (Terry Shipway); Aseptic Processing & SIAG (Laurent Wagner); Aseptic Processing & SIAG (Steffen Korth); Asia Operations (Felix Soh); Asia Operations (Richard Kwan ?????); Asia Pacific Tax (Aoife Deane); Asia Pacific Tax (JOON YONG); Asia South Marketing (Natalie Ku); Asia South Medical Affairs (Narendra Patil); Asia-Pacific Business Integrity (Angelia Lee); Asia-Pacific Commercial Operations (Paul Li); Asia-Pacific Demand Planning (James Lee ?????); Asia-Pacific Marketing and Medical Affairs (Peter Chow); Asia/Pac Service Operations (Joe Razavi); Asia/Pac Tech (Juerg Clavadetscher (Inherited)); Assay Development and Analytics, Gene Therapy, Flow Cytometry (Ann George); Assay Development and Optimization I (Mirna Rapp); Assay Development and Optimization II (Rainer Straub); Assay Support Group (Stefan Kempf); Asset Compliance (Keith Champion); Assets Management and Governance (Stefano Siviero); Assistant Supply Chain Management (Manuela Lacher); Associate Director, R&D Ops (Christine Wadey); Associate Sales Director 798 (Ray Friedrich); Auburn 183 (Cherita Saulmarshall); Auburn 183 (Joshua Killingsworth); Auburn 183 ACM Area 1 (Tiffany Johnson); Auburn 183 ACM Area 2 (Ashley Bentley); Auburn 183 QA (Melodee C Ebel (Inherited)); Auburn 183 QA (Stephanie Baptiste); Auburn 183 QA (Timothy J Nisewonger); Audit & Compliance Management (Christina Berninger); Audit & Quality Risk Management (Christina Berninger); Audit & Quality Risk Management (Rainer Bier); Auditing and Inspections (Jenny Cavanagh); Auftragsvorbereitung & -koordination (Horst Kraus); Augusta 253 (Kristopher Collier); Augusta 253 ACM Area 1 (Kristopher Collier (Inherited)); Augusta 253 ACM Area 1 (Tomecia Tillman); Augusta 253 ACM Area 2 (Jonathan Lewis); Augusta 253 QA (Dave Anderson); Augusta 253 QA (Pamela DeLucia); Aurora 702 (Kevin J Lawley); Aurora 702 (Keyonna L Gray); Aurora 702 ACM Area 1 (Virginia L Garnica); Aurora 702 ACM Area 2 (Theresa M Couture); Aurora 702 QA (Fernanda 
Nistal); Aurora 702 QA (Nan Nistal); Automated VI (David Kuhs); Automation (Adrian Marti); Automation (Christopher Parlane); Automation (Frank Mastellone); Automation (Jrgen Dersch); Automation (J√ºrgen Dersch); Automation (Stefan Sigrist); Automation Instrumentation (Ludovic Le Reste); Automation Systems Engineer (Magan Lai); Automation Systems Manager (Cornell D''Couto); Automation and Electrical Systems (Lou Corvetti); Automation and Electrical Systems (Matt Downey); Automation and Electrical Systems (Zoran Hadzi-Nikolov); Automatisierungstechnik (Andreas Tement); Automatisierungstechnik (Jens Laucht); BCI Team 1 (Frank Ludwig); BCI Team 2 (Markus Plociennik); BCI Team 2 (Ralf Kolley); BCI Team 3 (Torsten Hrtl); BCI Team 3 (Torsten H√§rtl); BFX8 (Donnie Daugherty); BFX8 (Victor Vazquez); BMS (Jan Klee); BPA Holly Springs (Luke McMahon); BPA Holly Springs (Paul Parske); BPA Liverpool (Andrew Holland); BRM Batch Release Management (Joachim Leiss); BRR & QoF (Natalie Windel); BRS Batch Release Support (Hans-Tobias Deinzer); BT - Quality & Manufacturing Applications (Robert Price); BT Applications (BI-Analytics) (John Thompson (Inherited)); BT Applications (BI-Analytics) II (Johnny F Helms Jr); BT Applications (BI/Analytics) (Johnny F Helms Jr); BT Applications (Bern) (Andrew Matys); BT Applications (Business Applications) (Jesse R Crew); BT Applications (Coallaboration-BI Bern) (Christophe Fuchs); BT Applications (Coallaboration/BI Bern) (Christophe Badertscher); BT Applications (ComOps) (Natasha Reantillo); BT Applications (ComOps) 2 (Francis Azul); BT Applications (DMS) (Johannes Lichtenfels); BT Applications (DMS/Bern) (Johannes Lichtenfels (Inherited)); BT Applications (DMS/MBR) (Johannes Lichtenfels (Inherited)); BT Applications (Daniel R Rodgers); BT Applications (Data Services) (Thomas Walther (On Leave)); BT Applications (Data Services) (Thomas Walther); BT Applications (Data Warehouse) (Bhanu Vereddigari); BT Applications (Manuf.-Quality Bern) (Marcel Hadorn); BT Applications (Manuf./Quality Bern) (Marcel Hadorn); BT Applications (Sean O''Connor); BT Applications (Web Apps) (James Briggs); BT Applications (Web Ops) (Ross Bovenkerk); BT Applications BI (MBR) (Manuel Schaub); BT Applications CSL Plasma (Boca) (Cindy K Elliott); BT Applications CSL Plasma (MBR) (Gerhard Vogel); BT Applications CSL Plasma (MBR) (Hubert Diehl); BT Applications Corporate Functions (Kartik Tavargeri); BT Applications DMS (Boca) (Barbara L Youngling); BT Applications DMS (Boca) (Becky Heatherman); BT Applications DMS (Boca) (Brandi Kennedy); BT Applications DMS (Boca) (John Di Anni); BT Applications DMS (Boca) I (Barbara L Youngling); BT Applications DMS (Boca) II (Brandi Kennedy); BT Applications DMS (Boca) III (Malinda Hargitt); BT Applications EU (Markus Nickel); BT Applications Finance and Risk (Jesse R Crew (Inherited)); BT Applications LIMS & Local Apps (Boca) (Ram Jadvani); BT Applications Logic (DMS) (MBR) (Gerhard Vogel (Inherited)); BT Applications Manuf.-Quality MBR (Chris Camilleri); BT Applications Manuf./Quality MBR (Chris Camilleri); BT Applications Manuf./Quality MBR (Martin Hopp (Inherited)); BT Applications Quality (Andy Chung (Inherited)); BT Applications Quality (MBR) (Martin Hopp); BT Applications Quality (Ted Schmidt); BT Applications R&D (MBR) (Christoph Kraus); BT Applications, HRIS (Kent Riddell); BT Apprentices (Michel M√ºller); BT Apprentices (Ueli Niederhauser); BT Commercial Compliance Apps (Chunlei Liao); BT DEV Applications (Janek Geil); BT Enterprise Applications EPPM 
(Eltis Wong WongKeung Fai); BT Enterprise Applications ¬ñ EPPM (Elizabeth Cataldo); BT Enterprise Applications ¬ñ EPPM (Eltis Wong ?WongKeung Fai?); BT Enterprise Site Mgmt and Quality (Don Konemann); BT Infrastructure (AUS) (Michael Fugaro); BT Infrastucture (China Ruide) (Michael Fugaro (Inherited)); BT Operational Excellence (Jeffrey Walker); BT Operational Excellence (Markus Wotruba); BT Operations Parkville (Nick Witnish); BT Portfolio Management (Julia Naake); BT Quality (Astrid Tr√ºmper); BT Quality (BRN) (Valrie Progin-Meyer); BT Quality (BRN) (Val√©rie Progin-Meyer); BT Quality (MBR) (Jutta Weiss); BT Quality (Michelle Lemasters); BT Quality 2 (Jill L Rieken); BT Quality Applications (QMS) (Jeff Senley); BT Quality KoP (Chantelle Marie Otto); BT Quality Manager (Astrid Tr√ºmper (Inherited)); BT Quality Manager (Irene Ragel); BT Quality Manager (Travis Newing); BT Site Delivery (Sven Brske); BT Site Delivery (Sven Br√ºske); BT Source-To-Pay Apps (Charles Warman); BT Source-To-Pay Apps (Jochen Preis); BT Source-To-Pay Apps (Satish Mohan Gudipalli); BT operational excellence (Markus Wotruba); BT-Serialisation (Ramanand Lanka); BTQ Biotech Quality (Barbara Wegscheid); BTQ Biotech Quality (Ulrich Eberhard); Bacteriology (Benny Hung); Bacteriology (Karthy Santhosh); Bacteriology (Niharika Pathak); Baltimore 166 (Mario A Salas); Baltimore 166 (Robert Centennial); Baltimore 166 ACM Area 1 (Dami Alimi); Baltimore 166 ACM Area 2 (Gary Rivers Jr); Baltimore 166 QA (Briesha Smith); Baltimore 166 QA (Monica Brown (On Leave)); Baltimore 166 QA (Monica Brown); Base Fractionation (Anthony Kaye); Base Fractionation (Brendan Hilliard); Base Fractionation (Ernest Shepard (Inherited)); Base Fractionation (George Makris); Base Fractionation (Parish McKenzie); Base Fractionation (Roy Taylor); Base Fractionation Operations (Shane Bourne); Batch Release (Anthony Day); Batch Release Management (Constanze Buchter); Batch Release Management (Nicole Ast); Batch Release Support (Christine Schneider); Batch Release Support QAI (Daniel Kalb); Baytown 086 (Brian T Edwards); Baytown 086 (Darriel Clark); Baytown 086 ACM Area 1 (Rachel I Ramirez); Baytown 086 ACM Area 1 (Tara West); Baytown 086 ACM Area 2 (Elizabeth Morales); Baytown 086 ACM Area 2 (Rebecca Rogers); Baytown 086 QA (Monica Banuelos); Beloit 005 (Jesus A Castillo (Inherited)); Beloit 005 (Kristin R Swain); Beloit 005 ACM Area 1 (Eric M Cook); Beloit 005 ACM Area 2 (John Miles); Beloit 005 QA (Patty Justus); Berufliche Erstausbildung (Carmen Walldorf); Berufliche Erstausbildung (Doris Nake); BioAnalytical Sciences Routine (Ashraf Raza); Bioanalytical Sciences (Aaron Hahn); Bioanalytical Sciences (Alice Andreu); Bioanalytical Sciences (Andreas Meister); Bioanalytical Sciences (Bo An); Bioanalytical Sciences (Christophe Pical); Bioanalytical Sciences (Clare Elizabeth Shepherd); Bioanalytical Sciences (Craig Kyngdon); Bioanalytical Sciences (Cristina Baker); Bioanalytical Sciences (Cristina Torres-Arce); Bioanalytical Sciences (David Boerema); Bioanalytical Sciences (Jennifer La); Bioanalytical Sciences (Laura Cortes Castrillon); Bioanalytical Sciences (Lee Xin Chong); Bioanalytical Sciences (Lucy Cao (Inherited)); Bioanalytical Sciences (Lucy Cao ); Bioanalytical Sciences (Lucy Cao ?????); Bioanalytical Sciences (Michael Johnston) (52388204); Bioanalytical Sciences (Ralf Ottofuelling); Bioanalytical Sciences (Rodney Holmes); Bioanalytical Sciences (Saw Yen Ow); Bioanalytical Sciences (Theresa Qiu); Bioanalytical Sciences (Vincent Strangis); 
Bioanalytical Sciences, LCM (Minyao Tang ?????); Bioanalytical Sciences, Lab Ops (Jinsong Zhao ?????); Bioanalytics & Fermentation (Partho Halder); Bioanalytics, Gene Therapy (Gene-Errol Ringpis); Bioassay Group (Souravi Ghosh); Biochemical Quality Control (Andreas Affolter); Biochemical Quality Control (BCAD) (Mirjam Kuehne Sebaste); Biochemical Quality Control (BCQC) (Kathrin Minnig Gsponer); Biochemical Quality Control (BCQC) (Sten Strunze); Biochemical Quality Control (Sandra Kampczyk); Biochemical Quality Control (Sten Strunze); Biochemistry (Bjrn Hegemann); Biochemistry (Bj√∂rn Hegemann); Biochemistry (Marius Loetscher); Biochemistry (Monika Edler); Biochemistry 4 (Thomas Gentinetta); Bioinformatics & AI (Arthur Hsu); Bioinformatics & AI (Monther Alhamdoosh); Biological Analytcs R&D (Roland Zehnder); Biological Analytical Development (Simon Urwyler); Biological Quality Control (BQC) (Michael Molitor); Biologielaboranten (Carmen Walldorf (Inherited)); Biologielaboranten (Doris Nake (Inherited)); Biology Animal Group (Preston Eilders); Biology Lab (Catherine A Moody); Biology Lab (Preston Eilders); Biology Lab I (Preston Eilders); Biology Quality Control (BQC) (Michael Molitor); Bioprocess Development & Innovation (Erik Hinze); Bioprocess Development (Vicky Pirzas); Bioreactor Development (Sara Ladd); Bioreactor Development (Tizita Horning); Bioreactor Development 1 (Tony Hunt); Bioreactor Development 2 (Eric Zhu); Biostatistics Transplantation (Aparna Raychaudhuri (Inherited)); Biostatistics & Medical Writing, R&D JAPAN (Takashi Fukai - ); Biostatistics & Medical Writing, R&D JAPAN (Takashi Fukai ??? ?? - ??? ????); Biostatistics (Fang Xie); Biostatistics (Michael Fries); Biostatistics - Aquired Bleeding, Coagulation, Respiratory (Michael Fries); Biostatistics - Cardiovascular (Mark Heise); Biostatistics - Immunology and Inflammation (John-Philip Lawo); Biostatistics and Medical Writing (LOTHAR TREMMEL); Biostatistics and Medical Writing (Lothar Tremmel); Biostatistics ¬ñ Cardiovascular and Metabolic (Mark Heise); Biostatistics ¬ñ Innovation (Sergei Leonov); Biostatistics ¬ñ Transplantation (Aparna Raychaudhuri); Biostatistics ¬ñ Transplantation (Fang Xie (Inherited)); Biostatistik (Marcel Mischnik); Biotech Manufactuirng Facility (Christoph Hau√ümann); Biotech Manufactuirng Facility (Philip Elliott); Biotech Manufacturing (Aleksandar Hristov); Biotech Manufacturing (Brett Bohr); Biotech Manufacturing (Fraser Goodwin); Biotech Manufacturing (Peter Cloyd Belandres); Biotech Manufacturing (Peter Florey); Biotech Manufacturing (Steven Jacovou); Biotech Manufacturing 1 (Steven Jacovou); Birmingham 259 (Sam Whitehead); Birmingham 259 ACM Area 1 (Meredith Y Lundie); Birmingham 259 ACM Area 2 (AJ Johnson); Birmingham 259 QA (Noelle Teague); Bloomington 127 (Ashley B Fearnall); Bloomington 127 (Xang Vang); Bloomington 127 ACM Area 1 (Loryna Williams); Bloomington 127 ACM Area 2 (Kayla L Stueber); Bloomington 127 ACM Area 2 (Kirsten M Heller); Bloomington 127 QA (Karen R Soderberg); Bloomington 241 (Kevin Smith); Bloomington 241 ACM Area 1 (Anna Whitman); Bloomington 241 ACM Area 2 (Kevin Smith (Inherited)); Bloomington 241 ACM Area 2 (Michele Morison); Bloomington 241 QA (Ben Samarripas (Inherited)); Bloomington 241 QA (Ryan Caudill-Laughlin); Boca Field Services (Javier Lopez); Boca Field Services (Julio Feliciano); Boise 227 (Ash Senters); Boise 227 (Ashley Senters); Boise 227 (Carl Seelert); Boise 227 (Timothy Freeland Jr (Inherited)); Boise 227 ACM Area 1 (Camille Snow); Boise 227 ACM 
Area 2 (Travis Richardson (On Leave)); Boise 227 ACM Area 2 (Travis Richardson); Boise 227 QA (Derek Erhart (Inherited)); Boise 227 QA (Miles Veater); Brand Manager (Judith Vico); Brand Manager 745 (Genevieve Nihill); Breakthrough Technologies (Axel Dietrich); Breakthrough Technologies (Hung Pham); Breakthrough Technologies (Nathan J Brinkman); Breakthrough Technologies I (Laura Keigher); Brewer 503 (MacGregor Roy); Brewer 503 ACM Area 1 (Stephen R Coltharp); Brewer 503 ACM Area 2 (Katherine Ragia); Brewer 503 QA (Marc Stephens); Brownsville 113 (Jose L Dela Garza (Inherited)); Brownsville 113 (Nick Caballero); Brownsville 113 ACM Area 1 (Alfonso Gutierrez); Brownsville 113 ACM Area 2 (Robert Miranda); Brownsville 113 ACM Area 3 (Brenda Z Garcia); Brownsville 113 ACM Area 4 (Hector F Amaya); Brownsville 113 QA (Francisca Z Lopez); Brownsville 113 QA (Laura L Escalante); Brownsville 113 QA (Rosa E Mercado (Inherited)); Brownsville 114 (Anthony A Almaguer); Brownsville 114 (Osiel E Selvera); Brownsville 114 ACM Area 1 (Roy Madero); Brownsville 114 ACM Area 2 (Amanda Millan); Brownsville 114 ACM Area 3 (Melissa Medrano); Brownsville 114 ACM Area 4 (Maria A Garcia); Brownsville 114 QA (Joanna M Franco); Brownsville 135 (Francisca Z Lopez); Brownsville 135 (Nick Caballero); Brownsville 135 ACM Area 1 (Severita Williams); Brownsville 135 ACM Area 2 (Oralia M Vasquez); Brownsville 135 ACM Area 3 (Claudia Uribe Resendiz); Brownsville 135 QA (Alma E De Los Santos De Gonzalez); Brownsville 135 QA (Britney Castillo); Buffalo 239 (Nicholas Liberati); Buffalo 239 (Renaye Hecker); Buffalo 239 ACM Area 1 (Kimberly Lubecki); Buffalo 239 ACM Area 2 (Carol Palaszewski); Buffalo 239 QA (Nicholas Liberati); Buffalo 239 QA (Olivia Bejaran); Buffer Preparation (Benjamin Gr√ºn); Buffer-Production (Bernd Grau); Building 21 South (Brock A Boudreau); Building 21 South (Parish McKenzie (Inherited)); Building 21S (Union) (Brock A Boudreau); Building 21S (Union) (Parish McKenzie (Inherited)); Building 30 (Brock A Boudreau); Building 30 (Parish McKenzie (Inherited)); Building 30 (Union) (Brock A Boudreau); Building 30 (Union) (Parish McKenzie (Inherited)); Buildings & Prop Coord 256 (Ray Belli); Bulk (Markus Weber); Bulk Manufacturing (Freddie Wayne West); Bulk Manufacturing (Gregory Taylor); Bulk Manufacturing (Patricia Stewart (Inherited)); Bulk Manufacturing (Ryan Cox); Bulk Mechanical Team (Matthew Johnson (Inherited)); Bulk Mechanical Team (Mohamed Tubar); Bulk Mfg (Joel Rainey (Inherited)); Bulk Mfg (Katerina Petreska); Bulk Mfg (Mahmoud Lasheen); Bulk Mfg (Matt Thompson); Bulk Mfg (Tom Vick); Bulk Mfg (Tri Nguyen); Bulk Process Technology (Andreas Grter); Bulk Process Technology (Andreas Gr√ºter); Bulk Process Technology (Rene Boros); Bulk Utilities (Michael D Proctor); Burlington 512 (Lynn M Stratton); Burlington 512 ACM Area 1 (Kay Harris); Burlington 512 ACM Area 2 (Danica Johnson); Burlington 512 QA (Sharleen Dunn); Bus. 
Svcs, Project Edge (John Dinatale); Business (Camilla Shen); Business Analytics (Joseph Smith); Business Analytics (Venkatesh Ramakrishnan (Inherited)); Business Applications (Anu Thampi); Business Applications (Charles Lowe); Business Development (Evgeniy Glukhovskiy); Business Development (Simone Parkes); Business Insights & Analytics (Nitin Bhatnagar (Inherited)); Business Insights & Analytics (Shital Patel); Business Insights & Operations (Lynda Kulp); Business Integrity (Christine Zettlemoyer); Business Integrity Intercontinental (Bella Hovhannisyan Melkonyan (Rafael)); Business Integrity and Privacy Program (Sarah McHenry); Business Integrity and Risks (Karen Neave); Business Integrity and Risks (Kelly Scott); Business Operations (Harald Mller); Business Operations (Harald M√ºller); Business Operations (Laura Attride); Business Operations (Paul Jens); Business Operations EU (Heidi Sun); Business Partner Support (Anika Wagner); Business Partnering Holly Springs (Carey Vassallo); Business Partners (Christine Toth); Business Partners (Jacqueline Hawkins (Inherited)); Business Planning Group (Junya Morinaga - ); Business Planning Group (Junya Morinaga ??? ?? - ???? ?????); Business Planning Group (Makoto Miura (Inherited)); Business Planning Group (Yuichiro Sakagami); Business Process & Technology (Joseph Elicone); Business Process & Technology (Maureen Martini); Business Process (Christian Sonderegger); Business Process Excellence (Manuel Schaub); Business Process Excellence Global Training (Reed Johnston); Business Process Management (BPM) (GAFOOR SARANG); Business Process OTC (Kian Hartono); Business Process S2P (Simon Haemmerli); Business Processes & Data Mgt (Barbora ?√°chov√°); Business Processes & Data Mgt (Barbora chov); Business Processes (Boris Kaiser); Business Processes (Hans Raess); Business Processes (Thomas Romanus); Business Productivity (Scott A Ramseyer); Business Services (David H Confessore); Business Services (Ken Lim); Business Services Enterprise Business Solutions (David Wolozyn); Business Services and Demand Planning, APAC (Uli Kiefer); Business Support (Christian Schnabel); Business Support (Lisa Bartol); Business Support (Walter Aebersold); Business Technology (Boca) (Rob Klostermeyer); Business Technology (Jesse R Crew (Inherited)); Business Technology (Sharon Wong ); Business Technology (Sharon Wong ?????) (Sharon Wong ?????); Business Unit Director (Susan Snowball); Business Unit Manager (Natasha Hutchison); CAB & Digital Marketing Group (Narihiko Suenobu); CAD (Erwin Vonlanthen); CD Clinical Quality Control & Compliance (Larry Fish); CDS - Computerized Data & Systems (Christoph Kircher); CEQ Management (Barry Lynch); CI & QC Compliance (Lisa Marie Malcharek (Inherited)); CI & QC Compliance (Lisa Marie Malcharek); CI & QC Compliance (Thomas Wescombe); CI & QC Compliance (Thomas Wescombe) (Thomas Wescombe); CMC (Jason Newman); CMC Lead (Andrew White); CMC Lead (Dirk Bruns-Nagel); CMC Lead (Mackenzie Firer Sherwood); CMC Lead (Max Stuart Corbett); CMC Lead (Paul Smrdelj); CMC RA Group (Hideaki Hoshi ?? ?? - ?? 
?????); CMC RA Group (Koichiro Kase - ); CMC Site Lead (Richard Buchta); CMO (Metani Rooms); CMO Management & Technology (Sabine Zollner); CMO Office (Vicki Oosterbaan); CO Diverse (Eddie Owens (Inherited)); CO Diverse (Edward Owens (Inherited)); CORE Operations - Canada (Constantina Boikos); CPAT (Kelly Helebrant); CPAT CVM - H&T (Uli Frevert); CPAT CVM / H&T (Uli Frevert); CPAT Internship Program I (Alissa Verone-Boyle (On Leave)); CPG Business Services (Michael Engelmann); CPG Center Berlin (Frank Bernert); CPG Center Bielefeld (Frank Bernert); CPG Center Braunschweig (Frank Bernert); CPG Center Bremen (Frank Bernert); CPG Center Gttingen (Frank Bernert); CPG Center G√∂ttingen (Frank Bernert); CPG Center Kiel (Frank Bernert); CPG Center Nrnberg (Frank Bernert); CPG Center N√ºrnberg (Frank Bernert); CPG Finance & Planning (Gerhard Mbus); CPG Finance & Planning (Gerhard M√∂bus); CPG Human Resources (Christine Debellis); CPG LQK Labor (Bettina Flotho-Salzmann); CPG Manager Serologisches Labor (Astrid Mather); CPG Medical Director (Kirsten Seidel); CPG Operations Management (Frank Bernert); CPG Planning (Achim Wagner); CPG Plasma Einkauf (Michael Engelmann (Inherited)); CPG Plasma Logistics (Klaus Rolshausen); CPG QA Case Processing Management (Ute Cherfan (Inherited)); CPG QA Center Operations (Ute Wessels); CPG QA Center Systems (Kerstin Kaddatz); CPG QA PLC Operations (Oliver Gro); CPG QA PLC Operations (Oliver Gro√ü); CPG QA Plasma Quality EU (Sascha Platt); CPG QA Plasma Quality QMB (Oliver Gro√ü (Inherited)); CPG QA Plasma Supplier Qualification (Ute Cherfan); CPG QA Plasma Supply Chain (Francesc Pont (Inherited)); CPG QA Plasma Supply Chain (Justin K Zajc); CPG QA Regulatory Affairs (Mandy Htzel); CPG QA Regulatory Affairs (Mandy H√∂tzel); CPG QA Supplier Qualification Management (Ute Cherfan (Inherited)); CPG QMB Center Operations (Ingrid Wallenwein (Inherited)); CPG QMB Center Operations (Ute Wessels (Inherited)); CPG Qualified Person (Margit El Azhari); CR&D Clinical Coagulation (Andres Brainsky); CRD Business Operations (Brian Dudt); CRD Business Operations (Walter Young); CRD Business Operations I (Craig Coffman (Inherited)); CRM Operations (Vinita Raina); CSL (Paul R Perreault); CSL 112 (Union) (Derek Butler); CSL Behring AG Bern (Martin Schaeren); CSL Behring AG Bern (Pierre Caloz); CSL Behring Broadmeadows (Martin Schaeren); CSL Behring Broadmeadows (Patricia Stewart); CSL Behring Broadmeadows II (Martin Schaeren); CSL Behring LLC Kankakee (Jose Gonzalez); CSL Behring LLC Kankakee (Patricia Stewart); CSL Behring LLC Kankakeee II (Patricia Stewart (Inherited)); CSL Behring Lengnau (Boris Lanoir); CSL Behring Marburg (Craig Shelanskey); CSL Behring Marburg (Michael Schrder); CSL Behring Marburg (Michael Schr√∂der); CSL Behring Pay Services (Susan M Walker); CSL Behring RCF Lengnau (Susanne Jecklin); CSL Behring RCF Lengnau II (Susanne Jecklin (Inherited)); CSL Behring Trademarks (Frank Schne-de la Nuez); CSL Behring Trademarks (Frank Sch√∂ne-de la Nuez); CSL Plasma (Craig Shelanskey); CSL Plasma (Michael F Deem); CSL Plasma (Willy Pardinas, Craig Shelanskey); CSL Plasma - Finance (Chris Shane); CSL Plasma - Finance (Christopher Shane); CSL Plasma / Engineering (Jrg Walz); CSL Plasma / Engineering (J√∂rg Walz); CSL Plasma GmbH (Berthold Ssser); CSL Plasma GmbH (Berthold S√ºsser); CSL Plasma GmbH HR (Berthold Ssser (Inherited)); CSL Plasma GmbH HR (Berthold S√ºsser (Inherited)); CSL Plasma II (Michael F Deem (Inherited)); CSL Plasma Kft. Hungary (Pankotai Tams); CSL Plasma Kft. 
Hungary (Pankotai Tam√°s); CSL Plasma US PLC Whitestown (Kristofor M Stauch); CSL Plasma US ¬ñ PLC Whitestown (Kristofor M Stauch); CSL Ruide Wuhan Manangement (David Chen ); CSL Ruide Wuhan Manangement (David Chen ?????); CSL Wuhan Plasma Operations (Jason Xu ?????); CSL Wuhan Ruide Calibration (Shangqu Shi ?????); CSL Wuhan Ruide Engineering (Jack Situ ); CSL Wuhan Ruide Engineering (Jack Situ ??????); CSL Wuhan Ruide Facility Team (Roger Peng ????); CSL112 Commercial Manufacturing Dept. (Derek Butler); CSR for Corporate (Patrick Castauro); CTS Business Operations (Robert Bredohl); CTS Cardiovascular (Eveline Girod-Engelhardt); CTS Hematology & Early Development (Annette Angell); CTS IRT (Amy Rupp); CTS Immunology (Berthold Roters); CTS Packaging & Labeling (Claudia Wieber); CTS Packaging & Labeling (Markus Thelen); CTS Process Improvement & Innovation (Carolin Sann); CTS Product Lead Cardiovascular (Elizabeth Bean); CTS Product Lead Immunology (Karin Knieke); CTS Product Lead Transplant (Fabienne Aschenbrenner); CTS Specialty Products & Transplant (Martin Mildenberger); CVC Cell, Virus & Compliance (Bjrn Keiner); CVC ¬ñ Cell, Virus & Compliance (Bj√∂rn Keiner); Calimmune Cell Manufacturing (Andreas Gille (Inherited)); Calimmune Cell Manufacturing (Bryan Burke); Calimmune Cell and Process Development (Jeffrey Ahlers); Calimmune Clinical (Maureen Boyd); Calimmune Clinical Programs (Mollie Barrett); Calimmune Information Technology (John Dallaire); Calimmune Quality Assurance (Anuja Prabhutendolkar); Calimmune Quality Assurance (Suparna Mishra Sarkar); Calimmune Research (Steven Lee); Calimmune Research and Development (Jeffrey Bartlett); Calumet Park 293 (Malissa Lichtenwalter); Calumet Park 293 QA (Michael W Solomon (Inherited)); Canada Medical Affairs Field Team (Maye Machnouk); Canton 236 (Jennie Marcum); Canton 236 ACM Area 1 (Ashley Instone); Canton 236 ACM Area 1 (Mirela Sekulic); Canton 236 ACM Area 2 (Rhianna Minger); Canton 236 ACM Area 2 (Rhianna Petrone); Canton 236 QA (Brandon Bosley); Canton 236 QA (Esence Hambrick); CapEx Procurement Lengnau (Franz Zweibrot [C]); CapEx Procurement Lengnau (Oliver Hahn); Capital Business Support (Tobias Pohle); Capital Controlling (Dirk Achenbach); Capital Controlling (Jrn Kaletsch); Capital Controlling (J√∂rn Kaletsch); Capital Project Management (Martina Thalmann); Capital Vendor Manager (Mark Vamadevan); Capital Vendor Manager (Nicholas Moody (Inherited)); Capital and MRO Sourcing - Kankakee (Emiliano Colon Segarra); Card Services (Linda K Nordmeyer); Cardio Therapies & Clinical Dev 2 (Lawrence Deckelbaum (Inherited)); Cardio Therapies & Clinical Development (Lawrence Deckelbaum); Cardiovascular & Diabetes (Susan Welsh (Inherited)); Cardiovascular & Diabetes (Todd Rudo); Cardiovascular & Metabolic Marketing (Rupal Shah); Cardiovascular & Metabolic Medical Affairs (Jeff McFadden (Inherited)); Cardiovascular & Metabolic TA (Jeff McFadden); Cardiovascular & Metabolism Therapeutic Area (Pierluigi Tricoci); Cardiovascular & Respiratory (James Peterson); Cardiovascular (Gail Berman); Cardiovascular (Lawrence Deckelbaum (Inherited)); Cardiovascular (Regina Clementi); Cardiovascular Global Marketing (Simon Fox); Cardiovascular and Metabolism (Danielle Duffy); Cardiovascular/Respiratory Therapeutic Area (Scott Hambaugh (Inherited)); Case Management GCSP (Nell Sborlini); Case Management MBR (Gudrun Heep); Category Chemicals, Filter Aid, Lab Chemicals (Martin Grossmann (Inherited)); Category Construction (Jos√© Maldonado (Inherited)); Category 
Equipment (Mike Gong); Category Gels, Resins, Media (BRN) (Martin Grossmann (Inherited)); Category Management (Markus Herrmann); Category Manager Indirects (Karl Lavery); Category Manager Indirects (Sarah Orchard); Category Packaging (Adam Kooloos); Cell Biology and Physiology (Cristina Gamell Fulla); Cell Culture & Purification (Michael Schmitt); Cell Culture & Purification Development (Andrew Low); Cell Culture & Purification Development (Ben Hunt); Cell Culture & Purification Development (Innocent Bekard); Cell Culture & Purification Development (Irene Baker); Cell Culture & Purification Development (Lou Fabri); Cell Culture & Purification Development (Simon Gerber); Cell Culture & Purification Development (Simon Stephan Gerber); Cell Culture & Purification Development (Vanessa Sandford); Cell Culture & Purification Development (Yih Yean Lee (Inherited)); Cell Culture & Purification Development (Yih Yean Lee); Cell Culture & Purification Development 1 (Innocent Bekard (Inherited)); Cell Culture Analytics (Vanessa Trefzer); Cell Manufacturing (Angel Jaramillo); Cell Manufacturing (Samuel O''Callaghan (On Leave)); Cell Manufacturing (Stefanie Homann); Cell Manufacturing I (Michelle Millington); Cell Manufacturing III (Samuel O''Callaghan); Cell Manufacturing IV (Stefanie Homann); Cell and Process Development (Jeffrey Ahlers); Cells, Virus & Compliance (Trudi Wentzel); Cells, Virus and Compliance (Tanya Guzzardi); Center Mitarbeiter (Andreas Gehrich (Inherited)); Center Mitarbeiter (Andreas Gehrich); Center Mitarbeiter (Annette Pernitzsch (Inherited)); Center Mitarbeiter (Annette Pernitzsch); Center Mitarbeiter (Claudia Habenicht (Inherited)); Center Mitarbeiter (Claudia Habenicht); Center Mitarbeiter (Damaris Kieckhfer); Center Mitarbeiter (Damaris Kieckh√∂fer); Center Mitarbeiter (Heike Borchert); Center Mitarbeiter (Kirsten Scheibel (Inherited)); Center Mitarbeiter (Kirsten Scheibel); Center Mitarbeiter (Natascha Bock (Inherited)); Center Mitarbeiter (Natascha Tappendorf); Center Mitarbeiter (Stephani Keltsch); Center Mitarbeiter (Sven Schuhmann (Inherited)); Center Mitarbeiter (Sven Schuhmann); Champaign 270 (Harriet Williams); Champaign 270 ACM Area 1 (Jacques LaRue); Champaign 270 ACM Area 2 (Harriet Williams (Inherited)); Champaign 270 ACM Area 2 (Quawan Dhom); Champaign 270 QA (Meghan Constantine); Change & Systems (Angela Leepin); Change & Systems (Lucia Mathis); Change Control Final Product Care (Stephan Nau); Change Management (Elizabeth Walker (Inherited)); Change Management (Kris Weidling (On Leave)); Change Management (Wendy Smith); Change Management Quality (Marlise Kuert Kolb); Change Management and Launch Support (QCM) (Marlise Kuert Kolb); Change Management-Document Control (Michelle Wells); Change and Investigations Primary Manufacturing (Jason Gilmour); Change and Investigations Secondary Manufacturing (Hai Tran); Characterization (Lars Robbel); Characterization 2 (Katharina Kramer); Charleston 044 (Lorenzo L Bowser); Charleston 044 (Robin M Bencivenga); Charleston 044 ACM Area 1 (Gregory Swant); Charleston 044 ACM Area 1 (Lorenzo L Bowser (Inherited)); Charleston 044 ACM Area 2 (Shakerrie Mobley); Charleston 044 QA (Yolanda L Carlisle); Charlotte 203 (Sam Kastanowski); Charlotte 203 (Shannon D Dalton); Charlotte 203 ACM Area 1 (Kathy Reilly); Charlotte 203 ACM Area 2 (Micah Ford); Charlotte 203 ACM Area 2 (Shannon D Dalton (Inherited)); Charlotte 203 QA (Nicole D Etheridge); Charlotte 418 (Paul Schmaldinst); Charlotte 418 ACM Area 1 (Sharita Swann); Charlotte 418 ACM 
Area 2 (Mayada M Omer); Charlotte 418 ACM Area 3 (Trina Crayton); Charlotte 418 QA (Le Tran); Chattanooga 010 (Ramoncito B Bautista); Chattanooga 010 ACM Area 1 (Sheree L Leatherman); Chattanooga 010 ACM Area 2 (Beth Simpson); Chattanooga 010 ACM Area 2 (Brittany Goforth); Chattanooga 010 QA (Callan Pierson); Chattanooga 010 QA (Heather Palladino); Chattanooga 010 QA (Patti Bailey (Inherited)); Chattanooga 010 QA (Patti Bailey (Inherited), Prim J Cunningham (Inherited)); Chattanooga 407 (Brian West); Chattanooga 407 (Brianna E Ballew); Chattanooga 407 ACM Area 1 (Amy D Hodge); Chattanooga 407 ACM Area 2 (Joshua Turpin); Chattanooga 407 QA (Barron Williamson); Cheektowaga 235 (Scott Bowers); Cheektowaga 235 ACM Area 1 (Cheryl Sousa); Cheektowaga 235 ACM Area 2 (Iryna Omelyan); Cheektowaga 235 QA (Roxanne Tyczka); Chem. Quality Control 1 (Lukas Dinger); Chem. Quality Control 2 (Silvan Stucki); Chemical Quality Control (CQC) (Adrian Zobrist); Chemical Analytics R&D (Lars Schiefelbein); Chemical Analytics R&D (Sara Stinca); Chemical Quality Control (Andreas Affolter); Chemical Quality Control (CQC) (Adrian Zobrist); Chemical Quality Control (Lars L√ºersen); Chemical Quality Control (Sten Strunze); Chemistry (Sara Garland); Chemistry (William Squires); Chemistry - In-Process Group (Courtney Nuccio); Chemistry - Raw Materials Group (Arthur F Fox); Chemistry Lab (Rebecca L Boudreau); Chiba Kanagawa Area (Madoka Yamamoto); Chibi Accounting (Hongyan Hu ?????); Chibi Admin (Hongyan Hu ); Chibi Admin (Hongyan Hu ?????); Chibi Clinical Inspection (Shiyong Yu ); Chibi Clinical Inspection (Shiyong Yu ?????); Chibi Plasma Collect (Liyun Huang ); Chibi Plasma Collect (Liyun Huang ?????); Chibi Plasma Collection (Jie Yu ); Chibi Plasma Collection (Jie Yu ????); Chibi Plasma Collection Center (Jun Lai ????); Chibi Plasma Collection Management (Jingyu Dong ?????); Chibi Plasma Sourcing (Jiaxin Long ?????); Chibi Plasma Sourcing Management (Bin Zhang ); Chibi Plasma Sourcing Management (Bin Zhang ????); Chicago 247 (Guillian T Gonzalez); Chicago 247 ACM Area 1 (Sabrina Flowers); Chicago 247 ACM Area 2 (Gretchen Watkins); Chicago 247 ACM Area 2 (Guillian T Gonzalez (Inherited)); Chicago 247 QA (Gretchen Watkins); Chicago 247 QA (Linda Schulthess); Chief Medical Office (Charmaine Gittleson); Chief Operating Officer (Paul McKenzie); Chief Safety Officer (Susan Welsh); China Logistics (Vickie Xian ); China Logistics (Vickie Xian ?????); China Marketing (Anlu Cai ?????); China Strategic Quality (Jian Fei ????); Christian Spuckti; Chromatography (Holger Blessing); Chromatography (Sven Karschnia); Chubu Area (Hiroyoshi Iwamoto); Chugoku Shikoku Area (Masahiko Ishida); Cincinnati 177 (Harold Tucker Jr); Cincinnati 177 ACM Area 1 (Anh Le); Cincinnati 177 ACM Area 2 (Darryl W Revere Jr); Cincinnati 177 ACM Area 2 (Jessica Hoffman); Cincinnati 177 QA (Christopher Thompson); Cincinnati 189 (Lee A Miles); Cincinnati 189 ACM Area 1 (Kristal Emmitt); Cincinnati 189 ACM Area 2 (Ginger Wells); Cincinnati 189 ACM Area 2 (Kaitlyn Spencer); Cincinnati 189 QA (Tyianna N Trice (On Leave)); Cincinnati 189 QA (Tyianna N Trice); Cinncinnati 177 (Harold Tucker Jr); Citrix (Thomas M Kent); Cleveland 401 (Sarah E Moss); Cleveland 401 ACM Area 1 (Shelly L Deimling); Cleveland 401 ACM Area 2 (Chonita Johnson (On Leave)); Cleveland 401 ACM Area 2 (Chonita Johnson); Cleveland 401 QA (Enetisha T Dailey); Cleveland 401 QA (Jennifer Longo); Clifton 255 (Andrew Oliver); Clifton 255 ACM Area 1 (Anthony Camuso); Clifton 255 ACM Area 2 (Marshaya 
Johnson); Clifton 255 ACM Area 2 (Rolshall Burnett); Clifton 255 QA (Kengie Jenkins); Clinic Study Acquired Bleeding-IG (Danielle Dalton); Clinic Study Acquired Bleeding/IG (Danielle Dalton); Clinical Bulk (Gerd Eisenmann); Clinical Bulk (Noemi Scholl); Clinical Bulk (Rene Bruegger (Inherited)); Clinical Compliance (Mihaela Carla Nosca); Clinical Compliance and Training (CC&T) (Saskia Ruis); Clinical Data Standards and Programming (Dieter Boss); Clinical Development (Wilfried Seifert); Clinical Development - Study (Christa Lewiski); Clinical Development - Transplant (Paul Shore); Clinical Development Operations (Craig Coffman); Clinical Development Operations (Deirdre BeVard); Clinical Development Operations (Kazuaki Hashimoto - ); Clinical Development Operations (Kazuaki Hashimoto ??? ?? - ???? ?????); Clinical Development Operations I (Craig Coffman); Clinical Development Respiratory (Lars Groenke); Clinical Disclosures & Transparency (Vicki Oosterbaan); Clinical Epidemiology (Quazi Ataher); Clinical Epidemiology (Susan Colilla); Clinical Epidemiology (Susan Welsh (Inherited)); Clinical Operations (David J. Parker); Clinical Operations (Thomas Verish); Clinical Operations (Valerie Reynaert); Clinical Operations 1 (Jennifer Weaver); Clinical Operations II (Valerie Reynaert); Clinical Operations Japan (Hideshiro Benz); Clinical Operations Serology (David Bibby); Clinical Operations Systems (Simone Dierkes) (Simone Dierkes); Clinical Ops 2 (Michael Giordani); Clinical Oversight Manager (Katja Ganter); Clinical Oversight Manager (Miriam Hochthurn); Clinical Oversight Manager (Stefanie Auer); Clinical Pharmacology & Early Development (Amy Harman); Clinical Pharmacology & Early Development (Stephen Caltabiano); Clinical Pharmacology & Translational Dev (John Roberts); Clinical Pharmacology & Translational Dev ¬ñ CPAT (Diana Lanchoney); Clinical Pharmacology & Translational Development CPAT (Diana Lanchoney); Clinical Pharmacology &Early Development (Diana Lanchoney (Inherited)); Clinical Pharmacology &Early Development (Dipti Pawaskar); Clinical Pharmacology &Early Development (Jagdev Sidhu); Clinical Pharmacology &Early Development (Joanne Ma); Clinical Pharmacology &Early Development (John Roberts); Clinical Pharmacology &Early Development (Michael Tortorici); Clinical Pharmacology (Bo Zheng); Clinical Procedural Documents &Standards (Thomas Verish (Inherited)); Clinical Programming (Stefan Hofmann); Clinical Programs (Christine Joch); Clinical Quality (Claire Pope); Clinical Quality (Karen Gard''ner (Inherited)); Clinical Quality Assurance (Daisy Maldonado-Ortiz); Clinical Quality Assurance (Joy Quinal); Clinical Quality Assurance (Pontaah Arbtan); Clinical Quality Assurance (Sharon Reinhard); Clinical Quality Assurance (Terrence Purnell); Clinical Quality Assurance (Volker Nickel); Clinical R&D (Hideto Akama - ); Clinical R&D (Hideto Akama ??? ?? - ??? ????); Clinical Research & Development - Transplant (Scott Adler); Clinical Safety (Corrinne Clement); Clinical Safety (Maria Mller); Clinical Safety (Maria M√ºller); Clinical Safety (Velma Hurst); Clinical Science (Eve Versage); Clinical Science (Naohito Sato); Clinical Sciences Transplantation (Christine Voigt ); Clinical Sciences Transplantation (Christine Voigt); Clinical Scientist (Andres Brainsky (Inherited)); Clinical Scientist (Jenny Mears); Clinical Serology Operations Lead (Frank Iozzo); Clinical Strategy&Development (Hideto Akama ??? ?? - ??? 
????); Clinical Study (Agnieszka Turowska); Clinical Supply Chain Planning (Ulrich Mengel); Clinical Supply Quality (Carl Forte); Clinical Supply Quality (Matthew Wokulich); Clinical Trial Process Improvement & Innovation (Steve Walker); Clinical Trial Process Improvement & Innovation (Thomas Kelly); Clinical Trial Supply (Patrick McLaughlin); Clinical and TA Strategy (Steven Pascoe); Coag, Devices & Special Products (Ignacio Rodriguez); Coag, Devices & Special Products (Juergen Zorn); Coagulation & CC Sales Force (Emmanuelle Massonie (Inherited)); Coagulation & CC Sales Force (Jean-Vincent Viale); Coagulation & CC Sales Force (Thierry BERTHOULE); Coagulation & Oncology (Kristin Ingrassia); Coagulation & Oncology (Sylvia Herget); Coagulation & Oncology 1 (Kerstin Jung); Coagulation (Janine Dolan); Coagulation - CPP (Kristy Bandza (Inherited)); Coagulation Manufacturing (Kristy Bandza); Coagulation Manufacturing (Union) (Kristy Bandza (Inherited)); Coagulation Sales South (Marlene Gregory (On Leave)); Coagulation Sales South (Marlene Gregory); College Station 152 (Kandra K Blodgett); College Station 152 (Lauren Parks); College Station 152 (May Walker); College Station 152 ACM Area 1 (Kailey Stockton); College Station 152 ACM Area 2 (Amanda Miller); College Station 152 ACM Area 2 (DANIELLE GARRETT); College Station 152 QA (Kacie Goad); College Station 152 QA (May Nowalk); College Station 152 QA (May Walker); Colorado Springs 277 (Amanda M Cvitkovich); Colorado Springs 277 ACM Area 1 (Ben Huff); Colorado Springs 277 ACM Area 2 (Leon Clemons Jr.); Colorado Springs 277 ACM Area 2 (Sang Nguyen); Colorado Springs 277 QA (Crystal L Reichard); Columbia 217 (Catherine Watson); Columbia 217 (Monique Simpson); Columbia 217 ACM Area 1 (Mirna Rodriguez); Columbia 217 ACM Area 2 (Gregory Hines); Columbia 217 QA (Alissa Elke); Columbia 217 QA (Brandon Hoffman); Columbia 217 QA (Victoria McIntyre (Inherited)); Columbia 271 (Beth Brooks-Mccoy); Columbia 271 QA (Eric Mathis); Columbia 612 (Catherine Watson); Columbia 612 (Jose Pineda); Columbia 612 ACM Area 1 (Joyce A Jackson); Columbia 612 ACM Area 2 (Garrett Palmer); Columbia 612 QA (Aniashalla McDuffie); Columbia 612 QA (Shannon V Brown); Columbia 612 QA (Victoria McIntyre (Inherited)); Columbus 150 (Mark A Leach); Columbus 150 (Matthew Z Osborne); Columbus 150 ACM Area 1 (Nasha Ausberry); Columbus 150 ACM Area 2 (Alison L Woody); Columbus 150 QA (Tina M Miller); Columbus 409 (Angela L Funk); Columbus 409 ACM Area 1 (Jacob A Wilcox); Columbus 409 ACM Area 2 (Stella Shella May Oliver); Columbus 409 QA (Thomas U Anderson); Com Dev Immunology (GABRIELA ESPINOZA); Com Dev Immunology (Gabriela Espinoza); Com Dev Immunology (Karen MacPhail); Com Dev Immunology (Lawrence Bruck); Com Dev Medical (Birgit Voelker); Com Ops Human Resources Asia Pac (Jenny Zeng); Com Ops Human Resources Asia Pac (Trina Hendri (Inherited)); Com Ops Human Resources EU (Marc Htting); Com Ops Human Resources EU (Marc H√∂tting); Com Ops Human Resources ICO (Jenny Alexandra Kjaer); Com Ops Human Resources ICO (Jenny Kjaer Rotzler); Com Ops Human Resources NA (Elizabeth Wixted); ComDev Coagulation (Jens Oltrogge); ComDev Speciality Products (Georg Henkel); ComDev Speciality Products 1 (Georg Henkel); ComOps Argentina Accounting (Guadalupe Porro Greco); ComOps Argentina Finance (Silvina Lazzari); ComOps Argentina Marketing (Lucia I Grossi); ComOps Argentina Sales (Fernando Grosso); ComOps Brazil Finance (Marcelo Di Napoli); ComOps Brazil Market Access (Gerdeon Aurelio A Paiva); ComOps Brazil 
Marketing (Cristina Daniel Paulino); ComOps Brazil Operations (Cristina Junko Nakai); ComOps Brazil Regulatory Affairs (Rosana Batista); ComOps Brazil Sales (Luis Gustavo Gomes); ComOps Business Operations GE/AT/EEU (Karl Fox); ComOps Canada Coag & CC (MICHAEL LEO); ComOps Canada Finance (Michael McAllister); ComOps Canada Medical Affairs (MSL) (David Barnes (Inherited)); ComOps Canada Medical Affairs (MSL) (Debbie Bensen-Kennedy (Inherited)); ComOps Canada Regulatory Affairs (Vaneeta Bhatia); ComOps Canada Sales (MARIE-EVE JACQUES); ComOps Colombia Accounting (Carlos Andres Loaiza Barragn); ComOps Colombia Accounting (Carlos Andres Loaiza Barrag√°n); ComOps Colombia Sales (Martha Romano Gomez); ComOps Controlling GE/AT/Emerg. EU (Oliver Rosenthal); ComOps Demand Planning EU (Heike Kayser); ComOps Finance FP&A EU (Tamara Lissitsyna); ComOps Finance/Supply Chain/Compass EU (Heinz Berghoff); ComOps Government Reporting/Compliance (Mike Andrews (Inherited)); ComOps Intercontinental MBR (Bjrn Schfer); ComOps Intercontinental MBR (Bj√∂rn Sch√§fer); ComOps Intercontinental MBR (Manfred Nolte); ComOps Market Access (Stefan Neudrfer); ComOps Market Access (Stefan Neud√∂rfer); ComOps Marketing Coagulation (Dave Lewis); ComOps Marketing Coagulation (JD Kohutka); ComOps Marketing GE/AT/Emerg. EU (Elisabeth Averwerser); ComOps Meetings & Conventions (Molly Hess Knodel); ComOps Mexico Finance & Administration (Carlos Salas); ComOps Mexico Finance & Administration (HECTOR ESCOBEDO); ComOps Mexico Market Access (Humberto Maciel); ComOps Mexico Product Management (THALIA FIERRO DE LEON); ComOps Mexico Regulatory Affairs (Sandra Velasco); ComOps Mexico Sales (Jorge L Gastlum); ComOps Mexico Sales (Jorge L Gast√©lum); ComOps NA Business Operations (Denise Von Dohren); ComOps NA Business Operations (Mike Andrews (Inherited)); ComOps NA Government Reporting (Pamela Makosky); ComOps NA Government Reporting (Ronald Ritter Jr); ComOps NA Government Reporting (Sarah Palmer); ComOps NA Learning & Support Services (Lynn DiBonaventura); ComOps NA Learning & Support Services (Mike Andrews (Inherited)); ComOps NA Market Insights (Venkatesh Ramakrishnan); ComOps NA Marketing Consumer (Janet A Reimund); ComOps NA PRC Operations (Diane Wright); ComOps NA, Master Data (David Fling); ComOps NA, Sales Operations & CRM (Michael Price); ComOps NA, Sales Operations & Incentive Compensation (Michael Price); ComOps NA, Sales Operations (Jerry Burgener); ComOps North America Medical Affairs (Debbie Bensen-Kennedy); ComOps North America Medical Affairs (Judith Vensak); ComOps North America Medical Affairs- Immunology TA (Arie Katz); ComOps Reimbursement and Access (Dina Inverso); ComOps Reimbursements and Access (Jeffrey Lucero); ComOps Reimbursements and Access (Kate O''Connor-Masse); ComOps SP - Payers (Pete Dickson); ComOps SP / Payers (Pete Dickson (Inherited)); ComOps SP / Payers (Pete Dickson); ComOps Sales Germany (Michael Bernd Rode); ComOps Switzerland (Isabelle Dahinden); ComOps Therapeutic Area EU (Antti Kourula); ComOps Therapeutic Area EU (Damian Gilkerson); ComOps US Atlantic Specialty (Jeffrey Todd Winn); ComOps US Coag Midwest (Mark A Wiener); ComOps US Coag Northeast (Dominic Lattanzi); ComOps US Coag Northeast (Ivan Holtz (Inherited)); ComOps US Coag Sales (Ivan Holtz); ComOps US Coag South (Mark Fitzgerald); ComOps US Coag West (Scott Vollet); ComOps US Corporate Accounts (Paul Kurt); ComOps US Delaware Valley Specialty (Kellee Fearon); ComOps US Delaware Valley Specialty (Marlene Gregory); ComOps US Medical 
Affairs Coagulation (Jerry Powell (Inherited)); ComOps US Medical Affairs Coagulation (Vidhi Desai); ComOps US Medical Affairs Coagulation I (Vidhi Desai); ComOps US Medical Affairs Immunoglobulin (Ann Bullinger); ComOps US Medical Affairs Immunoglobulin (Ayman Kafal); ComOps US Medical Affairs Specialty (Laurel Omert); ComOps US Medical Affairs Specialty (Paolo Bajcic); ComOps US Medical Information (Deirdre Smith); ComOps US Mid-Atlantic Immunology (Lori Giampaolo); ComOps US Mid-Atlantic Immunology (Michael Flaherty); ComOps US Mid-South Immunology (Cory Baldwin); ComOps US Mid-South Immunology (James Heintz (On Leave)); ComOps US Mid-South Immunology (James Heintz); ComOps US Mid-South Specialty (Bill Stokes); ComOps US Mid-South Specialty (Brett Weathersby); ComOps US Mid-West Immunology (Mark C Morgan); ComOps US North Central Specialty (Steve A Mick); ComOps US Northeast Immunology (Pamela Buschbacher); ComOps US Northeast Specialty (Craig Strok); ComOps US Northeast Specialty (Gina Blair (Inherited)); ComOps US Northeast Specialty (Rebecca Riebe (On Leave)); ComOps US Ohio Valley Specialty (Jason Flowers); ComOps US South Central Immunology (Joseph Guinan (Inherited)); ComOps US South Central Immunology (Roxanne Quirin); ComOps US South Central Specialty (David Van Buskirk); ComOps US Southeast Immunology (James Gleichowski); ComOps US Southeast Specialty (Michael Allen); ComOps US Specialty Marketing (Bernadine Koziara (Inherited)); ComOps US Specialty Marketing (Tom Groeling); ComOps US Specialty Sales (Gina Blair); ComOps US Supply Chain (Madonna Jarrett); ComOps US West Central Immunology (Greg Logsdon); ComOps US West Central Specialty (Ann Andari); ComOps US West Central Specialty (Gina Blair (Inherited)); ComOps US West Central Specialty (Kimberly Kustron); ComOps US West Immunology (Greg Hansen); Combination-Device QA (TATYANA ILYINA); Combination/Device QA (TATYANA ILYINA); Comm Dev and Scientific Affairs (Edith Rosenberg); Comm Dev and Scientific Affairs (Tara Satyanand); Comm. Oper. GE/AT/Emerg. EU Diverse (Dirk Hoheisel (Inherited)); Commercial Operations Junxin (Shaolin Huang ?HuangShaolin?); Commercial (Brent MacGregor); Commercial (Stephen Allan); Commercial Access & Policy US (Marie Mazur (Inherited)); Commercial Access & Policy US (Shanthy Krishnarajah); Commercial Account Management - Public Health (Jeffrey Benton); Commercial Argentina (Gonzalo Pereira); Commercial Business Operations Sales & Analytics (Kevin Harkins); Commercial Business Operations Training (Patrick Gostomski); Commercial Business Services (Lynda Kulp); Commercial China (Cheng-Yen Tsai ); Commercial China (Cheng-Yen Tsai ?????); Commercial China (Harold Chan ?????); Commercial Contracts US (Yvonne Blom Hilsky); Commercial Contracts US (Yvonne Hilsky); Commercial Contracts-US (Agnes Goins); Commercial Customer Operations US (John Spencer); Commercial Customer Operations US, Customer Service, Account Mgmt (Teka-Ann Forrester); Commercial Customer Service/Supply Chain (Narelle Kinson); Commercial Development (Jane True); Commercial Development Global (Debbie Drane); Commercial Development and Policy (Dirk Ulrich Hofmann); Commercial Development and Policy (Lorna Meldrum); Commercial Development and Policy (Marie Mazur); Commercial Excellence (Makoto Miura); Commercial Excellence (Roger Melliger (Inherited)); Commercial Excellence (Tomohiro Miura - ); Commercial Excellence (Tomohiro Miura ??? ?? - ??? 
?????); Commercial Excellence Training & Development Office (Chiho Muto); Commercial Excellence and Training (Cheng-Yen Tsai ????? (Inherited)); Commercial Excellence and Training (Joanne Liu ????); Commercial Governance and Transparency (Artimis Ghassemi); Commercial Governance and Transparency (Daniel Quayle); Commercial IT solutions (Thomas Wilcock); Commercial Italy (Massimo Leoni); Commercial MEA (Camilla Shen (Inherited)); Commercial Marketing US (David Ross (Inherited)); Commercial Marketing US (Glenn Omanio); Commercial Marketing US (Tara Charvat); Commercial National Accounts US (Stefan Merlo); Commercial Operations (Brent MacGregor); Commercial Operations (Deniz Bagdadi); Commercial Operations (James Gaw); Commercial Operations (Mark Ridge); Commercial Operations (Sam Dowdle); Commercial Operations Argentina (Juan Pablo Guereño); Commercial Operations Australia/NZ (Loretta Croker); Commercial Operations Austria (Beate Pettinger-Natmeßnig); Commercial Operations Austria (Martin Tenlen); Commercial Operations Benelux (Patrick Reygaert); Commercial Operations Brazil (Gustavo Fernandes); Commercial Operations Canada (Philippe Hebert); Commercial Operations Chile (Juan Pablo Ambar); Commercial Operations Colombia (Eduardo Cabas); Commercial Operations Colombia (Juan Feliu (Inherited)); Commercial Operations Czech Republic (JIŘÍ KAŠPEREK); Commercial Operations Czech Republic (Ondrej Halasz); Commercial Operations Denmark (Gitte Stausholm); Commercial Operations Europe (Lutz Bonacker); Commercial Operations Finance (Adrienne Ford); Commercial Operations Finance (Amanda White); Commercial Operations Finance (Marcelo Estrella); Commercial Operations Finance (Michael Kochanski); Commercial Operations France (Franck Puget); Commercial Operations GE/AT/Emerg.
EU (Dirk Hoheisel); Commercial Operations Global HR (Trina Hendri); Commercial Operations Greater China (Ben Yang ?????); Commercial Operations Greater China (Harold Chan ?????); Commercial Operations Greater China Junxin (Paul Li (Inherited)); Commercial Operations Greece (Marianna Konstantinidi); Commercial Operations Hong Kong (Roger Cheng ?????); Commercial Operations Hungary (Lukacs Attila); Commercial Operations Intercontinental (Markus Staempfli); Commercial Operations Italy & Greece (Oliver Schmitt); Commercial Operations Italy, Business Operations (Giuseppe Fioravante); Commercial Operations Italy, Central Italy Sales (Claudio Chiorri); Commercial Operations Italy, North Italy Sales (Maurizio Gonizzi Barsanti); Commercial Operations Italy, South Italy Sales (Paolo Lombardi); Commercial Operations Japan (Jean-Marc Morange); Commercial Operations Junxin (Qiuhui Shi); Commercial Operations Mexico (Nicolas Martinez Gould); Commercial Operations Nordic (Martin Tenlen); Commercial Operations Nordic (Ulf Hultquist); Commercial Operations North America (Robert Lojewski); Commercial Operations Poland (Grazyna Debowska); Commercial Operations Poland (Marek Skotnicki); Commercial Operations Portugal (David Ventura); Commercial Operations SG, ML & ID (Matthew Ho); Commercial Operations Slovakia (Andrea Solivajsova); Commercial Operations Slovakia (JIŘÍ KAŠPEREK); Commercial Operations Slovakia (Ondrej Halasz); Commercial Operations Spain & Portugal (María Jose Sanchez Losada); Commercial Operations Turkey (Aysun Acer); Commercial Operations Turkey (Aysun Yanbol); Commercial Operations Turkey (Ercin Kugu); Commercial Operations Turkey 2 (Mehmet Aydogan); Commercial Operations United Kingdom (Eddie Owens); Commercial Operations United Kingdom (Edward Owens); Commercial Operations United Kingdom II (Dan Betts); Commercial Operations, Influenza Vaccines (Linda DU); Commercial Operations, Americas (Haejin Chung); Commercial Operations, Americas (Jane True); Commercial Operations, Canada (Gillian Stafford); Commercial Operations, DE, Customer Service (Thomas Kasimirat); Commercial Operations, DE/CH/AU (Deborah Di Salvo); Commercial Operations, DE/CH/AU (Frank Eberlein); Commercial Operations, EMEA (Enric Canelles Torres); Commercial Operations, Fleet, Meetings & Travel Strategic Sourcing (Kristie Boyd); Commercial Operations, International and Pandemic (Lorna Meldrum); Commercial Operations, Italy (Maura Cambiaggi); Commercial Operations, LatAm (Carmen Pereyra); Commercial Operations, LatAm (Carmen Rosa Pereyra Davila); Commercial Operations, Marketing UK (Kaush Gandhi); Commercial Operations, North Americas (David Ross); Commercial Operations, Spain (Antonio Lloret Parellada); Commercial Operations, UK (Deborah Di Salvo); Commercial Operations, UK (Helen Concilia); Commercial Ops (John Lawrence); Commercial Ops North America (John Fudala); Commercial Ops North America (Robert Lojewski (Inherited)); Commercial Pandemic Contracts (Randall Deck); Commercial Taiwan (Cheng-Yen Tsai ?????, King Lian Wang ?????); Commercial Taiwan (King Lian Wang ?????); Commercial Taiwan (Louis Liu ?????); Commercial, Business Operations UK (Charlotte Wrigley); Commercial, Business Operations, Customer Support (Jeff Wettlaufer); Commercial, Customer Service UK - Liverpool
(Charlotte Wrigley); Commercial, Customer Service UK - Maidenhead (Amy Smith); Commercial, Global Fluad (Richard Bland); Commercial, Global Flucelvax & Afluria (Jessica O'Donnell); Commercial, National Accounts Field US (Aaron Hubner); Commercial, National Accounts Field US (Aaron Martin Hubner); Commercial, National Accounts Field US-Summit (Gregg Quatrini); Commercial, National Accounts UK (Raashid Mehmood); Commercial, Product Planning & Innovation (Loddie Foose); Commission & Qualification (Arelis Cabezas); Communications (Maureen Powell); Communications (Melanie Kerin); Communications (Polina Miklush); Communications (Sandra Ruckstuhl); Company Secretary (Gregory Boss (Inherited)); Company Secretary Office (Sonya Curciev); Compass / gCRM System (Giorgio Lippi (Inherited)); Compass / gCRM System Benelux (Patrick Reygaert (Inherited)); Compass / gCRM System France (Pascale Ogel Le Guen (Inherited)); Compass Excellence Center (John Eric Bunn); Compensation Programs (Anthony Dillon); Compensation Programs (Timothy O'Donnell); Competitive Intelligence (Magdalena Popesco); Compliance & Improvement Manager (Elaine Feely); Compliance (Andrea Studer); Compliance (Cindy Rocknowski); Compliance (Jeffrey Zoubek (Inherited)); Compliance (Robin A Mroz); Compliance Americas (John Neff (Inherited)); Compliance Americas (Thomas Spittal); Compliance I (Margrit Waterval); Compliance II (Dagmar Riffel (Inherited)); Compliance II (Jutta Regenfelder); Compliance Management Engineering (Rainer Kutsch); Compliance Support 1 (Volker Gawantka); Computerized Data & Instruments Systems (Céline Pires); Computerized Data & Instruments Systems (Hiroshi Nakano); Congress (Jean-Marc Morange (Inherited)); Congress (Kyota Yamaoka ??? ?? - ???? ?????
(Inherited)); Connected Healthcare (Mark Ridge (Inherited)); Construction, Engineering & Qualification (Adam Robb); Construction, Engineering & Qualification (Mike Spencer); Construction, Engineering & Qualification (Richard Hayne); Construction, Engineering & Qualification 1 (Michael Ah-Cann); Construction, Engineering & Qualification 2 (Adam Robb (Inherited)); Construction, Engineering & Qualification 2 (Jim Li); Construction, Engineering & Qualification CSL112 (Jacqueline Murphy); Content Management (Christian Mohr); Content Management (Elisabeth Averwerser (Inherited)); Contract Administration (Frances Richardson); Contract Manufacturing (Ian Goldup); Contracts / Claims Management (Kevin Rathmell [C]); Controlling & Financial Reporting (RYAN HANSEN); Controlling (Wolfgang Thiel); Corporate & Expert Services (Patrick Haeberli); Corporate Affairs (Sharon McHale); Corporate Affairs - US (Polina Miklush); Corporate Affairs and Communications (Anthony Farina); Corporate Communications (Jemimah Brennan); Corporate Communications (Jemimah Pentland); Corporate Communications - Japan (Hidemi Akazawa); Corporate Communications Business Partnering (Melanie Kerin); Corporate Communications Business Partnering 2 (Melanie Kerin); Corporate Development (Serge Scrofani); Corporate Finance (John Levy); Corporate Finance (Paul Coulter); Corporate Finance Edge Controller (Julia Wilkinson [C]); Corporate Services (Marvin Anthony Edwards II); Corporate Services (Michael Hays (Inherited)); Corpus Christi 603 (Sam Schultz (Inherited)); Corpus Christi 603 (Sam Schultz); Corpus Christi 603 (Tashana K Sanders); Corpus Christi 603 ACM Area 1 (Lorena Luna); Corpus Christi 603 ACM Area 2 (Nola V Baker); Corpus Christi 603 QA (Tara L Spitzer); Cost Center Accounting & Sales Reporting (Patrick Eley); Cost Center Controlling (Rainer Althaus); Counsel Americas (Shawn Gibbs); Counsel Americas (Shawn Michael Gibbs); Counsel EMEA (John Minardo (Inherited)); Counsel EMEA (Martin Quinn); Counsel EMEA (Virginie Didier); Country & Region Management (Geeseung Yoo); Credit & Collection (Anette Rummelsberger); Credit Collection (Paul Fellingham (Inherited)); Critical Systems - HVAC (Union) (Jeff J Parks (Inherited)); Critical Systems - HVAC (Union) (Michael D Proctor); Critical Systems - Water Systems (Union) (Jeff J Parks (Inherited)); Critical Systems - Water Systems (Union) (Jim Meils); Critical Utilities (Frank Miles III); Critical Utility Projects (Jim Meils); Culture and HR Strategy (Linda Hagerty-Dotterer); Customer Care Center I (Christian Siebert); Customer Care Center II (Oliver Weck); Customer Engagement Management (Brian Johnson (On Leave)); Customer Engagement Management (Gina Malloy); Customer Service & Logistics (Massimo Leoni); Customer Service (Bernhard Czapla (Inherited)); Customer Service (Consuelo D'Amore); Customer Service (Crystal Marie Wiles); Customer Service (Crystal Wiles); Customer Service (Holger Milkereit (Inherited)); Customer Service (Michael Bernd Rode (Inherited)); Customer Service (Rainer Adam (Inherited)); Customer Service (Robert Rohs); Customer Service (Sandra Lafoca (Inherited)); Customer Service (Sean Grinnell); Customer Service (Susanne Möller (Inherited)); Customer Service ComOps Intercontinental Region (Anita Erber); Customer Service Deutschland (Roger Melliger); Customer Service France (Charlotte Rougié (Inherited)); Customer Service France (Julien Roche); Customer Service Manager (Anna Arena); Customer Service Ops (Sean
Grinnell); Customer Service and Launchmanagement (Christoph Krug); Customer Service and Launchmanagement (Jochen Wagner); Customer Service und Logistik (Susanne Pfeiffer); Customer Services & Logistics (Barbara Kemp); CyberSecurity Operations (Daniel Pekol); CyberSecurity Operations (Edward Ferrara (Inherited)); Cytogam (Thomas Beer); DGL 1 (Advait Jagirdar); DOCI eClinical Technology (Thomas Verish (Inherited)); DS Manufacturing (Barbara Beugger); DS Manufacturing (Matthias Kaeser); DSP & Analytics (Michael Schmitt); DSP Engineering (Dave Tomsik [C]); DSP Engineering (Rene Boros); DSP Laboratories (Arnaud Vonarburg); DSP Manufacturing 1 (Matthias Kaeser); DSP Manufacturing 2 (Baptiste Leclerc); DTP & Graphical Control (Metin Yilmaz (Inherited)); Dallas 078 (Brenda C Greenfield (Inherited)); Dallas 078 (Elizabeth Casillas); Dallas 078 (Elizabeth Trejo); Dallas 078 ACM Area 1 (Rhonda Shields); Dallas 078 ACM Area 2 (Brenda C Greenfield (Inherited)); Dallas 078 ACM Area 2 (Melissa J Chapman); Dallas 078 QA (Carlotta McCoy); Dallas 078 QA (Wajeehah Al-Uqdah); Dallas 510 (Elizabeth Casillas); Dalton 296 (Brittany Goforth); Dalton 296 ACM Area 1 (Dana Hibbs); Dalton 296 ACM Area 2 (Annette L Switzer); Dalton 296 QA (Wayne J Bixler); Dangyang Clinical Inspection (Xingzuan Zhang ?????); Dangyang Inspect (Liuqing Wan ?????); Dangyang Inspection (Pingfan Zhang ?????); Dangyang Office Management (Wanwan Zhu ?????); Dangyang Office Management (Xiaoquan Zhu ?????); Dangyang Plasma Collection (Yingshuang Li ?????); Dangyang Plasma Collection Center (Jack Zhou ?????); Dangyang Plasma Collection Center (Qingqing Wang ?????); Dangyang Plasma Collection Management (Yaling Zhu ?????); Dangyang Plasma Sourcing (Meng Hu ????); Dangyang Plasma Sourcing Management (Xuejun Wang ?????); Data Analytics & Metrics (Bill Bigney); Data Analytics (Aaron Imig); Data Analytics (Constanze Buchter); Data Analytics (John Choy); Data Analytics (Michael Schröder (Inherited)); Data Governance (STEPHEN SMITH); Data Management (Steven Carr); Data Management (Thomas Hahlgans); Data Management KOP 1 (Mara Strelecki); Data Management Operations (Charles Johnson); Data Operations & Clinical Infrastructure (Thomas Verish (Inherited)); Data Operations & Clinical Infrastructure (Thomas Verish); Data Services (Don Konemann); Data Services (Sachin Ohal); Data and Analytics (Enterprise Applications) (Robert Hawker); Data and Analytics Center of Excellence (Thomas Gsponer); Database (Bhavesh Patel); Davenport 424 (Greg Boden); Davenport 424 ACM Area 1 (Jacinda L Head); Davenport 424 ACM Area 2 (Tabathia Ann Dells); Davenport 424 QA (Christopher R Doerscher); Dayton 408 (Daniel K Osborne); Dayton 408 (Megan L Waldeck); Dayton 408 ACM Area 1 (Ashley Instone); Dayton 408 ACM Area 1 (Shalia Sloan); Dayton 408 ACM Area 2 (Ashley K McConnell); Dayton 408 QA (Daniel K Osborne); Dayton 408 QA (Megan L Waldeck); Decatur 104 (Antonia Geiselmayr); Decatur 104 ACM Area 1 (Binh Tang); Decatur 104 ACM Area 1 (Shauntia Cobb); Decatur 104 ACM Area 2 (Antonia Geiselmayr (Inherited)); Decatur 104 ACM Area 2 (Binh Tang); Decatur 104 QA (Amaris A Wiggins); Decatur 104 QA (China Washington); Decatur 104 QA (Kyle M Lehrke (Inherited)); Decatur 446 (Amber
McCullough); Decatur 446 (Jordon Lyon); Decatur 446 (Sentoria D Leonard-Brown); Decatur 446 ACM Area 1 (Amber McCullough (Inherited)); Decatur 446 ACM Area 1 (Amber McCullough); Decatur 446 ACM Area 2 (Aja Marbley); Decatur 446 ACM Area 2 (Amber McCullough (Inherited)); Decatur 446 QA (Tony D Giebelstein Jr); Delivery Support (Christopher A Betterton); Delivery Support (Robert Boland (Inherited)); Demand Planner (Rose Cimbora); Demand Planning (Ann Cipkins); Demand Planning (Tsutomu Nagoya ???? ? - ??? ????); Dept 1216 Antivenom Manufacture (Andrew Penhale); Dept 1216 Antivenom Manufacture (Cassandra Smoult); Dept 822, Cell Culture and Purification (Jamie Black); Development & chem. Quality Control (Daniel Frattini); Development Applications (Andy Chung); Development GMP Laboratory (DGL) (Andreas Meister (Inherited)); Development GMP Laboratory (DGL) (Heike Gocht); Development Projects (Heather Davis (Inherited)); Development and Support (Johannes Schiebel); Development and Support (Stefan Schmidbauer); Digital Communications (Mairian Gildea); Digital Communications (Piers Dickinson); Digital Delivery & Data (Robert Boland); Digital Health (Brian Johnson); Digital Strategy Implementation (David Christie); Digital Workplace (Dana Leeson); Dir Com Op - Vaccines (Helen Concilia); Dir, Health Economics 724 (Stuart Harsley); Direct Procurement (Angelia Crumbley); Director Clinical Science (Janine Oberije); Director Comm Ops - Pharma (Danielle Dowell); Director HR 924 (Yvette Saunders); Director QA Continuous Improvement & Issues Management (Adrian Meade); Director Quality Control (Leonora Pancho); Director Supply Chain (Lachlan Cruise); Director of Engineering (Brian Price); Director of Manufacturing, Products of National Significance (Cassandra Smoult); Director of Manufacturing, Products of National Significance (Lisa Lamb); Director, Commercial Operations NZ (Catherine Murphy); Director, Marketing (Rachel Jensen); Director, Marketing (Theo Horafiaris); Director, Program Execution (Gail Dawson); Dispatch (Bernd Schäfer); Dispatch (Igor Kaucher (Inherited)); Dispensing, Medium & Buffer Preparation (Vreni Förtsch); Distribution (Jasmine Ma ?????);
Distribution (John Conway); Distribution (Maggie Wan ?????); Distribution (Nan Wang); Distribution - Central Region (Lu Jin ????); Distribution - DTP, China (Cissy Xi ????); Distribution - East Region (Zhan-jun Liu ?????); Distribution - North Region (Feng Rui ????); Distribution - North Region (Kaijian Zhao ?????); Distribution - South Region (Nan Wang ????); Distribution - South Region (Sunny Sun ?????); Distribution - Tendering (Yanfang Zhou ?????); Distribution - West Region (Xuemei Zeng ?????); Distribution CH U8 (Rafael Gasser); Distribution CH U8 (Thomas Ryser (Inherited)); Distribution Junxin (Yanfang Zhou ?????); District Heights 210 (Cecelia Cutchin); District Heights 210 (Michael W Solomon); District Heights 210 ACM Area 1 (Mickey Wilson); District Heights 210 ACM Area 1 (Tamika Hogan); District Heights 210 ACM Area 2 (Tamika Hogan); District Heights 210 QA (ALISON CONLEY); District Heights 210 QA (Abigail Brown-Delostrinos); Documentation (Arno Karnholz (Inherited)); Documentation (Dominik Erhart); Documentation Management and GMP Training (Jin Tao); Documentation Management and GMP Training (Vicky Fang ????); Documentation Starting Materials (Angelika Jung); Documentation Starting Materials (Simone Lang); Dothan 504 (Demetia Scott); Dothan 504 ACM Area 1 (Olivia McVey); Dothan 504 ACM Area 2 (Kaitlyn M Delamore); Dothan 504 QA (Roxanne K Schaeffer); Douglas 190 (Alejandra Gonzalez); Douglas 190 (Jose Pineda); Douglas 190 ACM Area 1 (Irma Ornelas); Douglas 190 ACM Area 2 (Alejandra Gonzalez); Douglas 190 ACM Area 2 (Marisela Nunez); Douglas 190 QA (Andrew Larson); Downstream Manufacturing (Alan Cartwright); Downstream Manufacturing Days (Alan Hudson); Downstream Manufacturing Engineering (Anthony Flynn); Downstream Manufacturing Shift 1 (Neil Myerscough); Downstream Manufacturing Shift 2 (Edward Bucknall); Downstream Manufacturing Shift 3 (Alan Hudson); Downstream Manufacturing Shift 3 (Neil Myerscough); Downstream Manufacturing Shift 4 (Craig Ridyard); Downstream Manufacturing Shift 4 (Edward Bucknall); Drawing Office (Andrew Brown); Drug Product (Nicola Di Maiuta); Duluth 613 (Dennis J Lofald); Duluth 613 (Veronica J Kaspszak); Duluth 613 ACM Area 1 (Jenn Jackson); Duluth 613 ACM Area 2 (Angela J O'Hara); Duluth 613 QA (Heidi E Woolhouse); Durham 074 (Thomas Kisicki Jr); Durham 074 ACM Area 1 (Keonna Austin); Durham 074 ACM Area 2 (Damonta A Burch); Durham 074 QA (Meia Moore); E&I with MES/Systems (Josh Mills); E-Support (Marco Grossniklaus); E2E Operations Finance (Marcelo Estrella); ECI Finance/Controlling (Salim Ketata); EEMEA Finance/Controlling (Amanda White); EHS & Business Resilience (Lynette Hodgden); EHS (Liam Ryan); EHS Bern (Lone Carlsen); EHS Bern (Rolf Ingold); EHS Bern (Ulrich Schuerch); EHS Design Construction & Process Safety (Aaron Duff); EHS Kankakee (Dale C Rosene); EHS Kankakee (Lynette Hodgden (Inherited)); EHS Kankakee 2 (Allan Wise); EHS Kankakee 2 (Andrew Uftring); EHS Marburg (Jürgen Kanand (Inherited)); EHS Marburg (Özcan Campinar); EHS Plasma (BARBARA WUNDER); EHS Security (Adam Kennell); EHSS Lengnau (Harry Hohl); ELS A (Alain Ducaud); ELS A
(Susanne Heins); ELS Z (Peter Reusser); ELS Z (Simon Haenni); EM (Tina Liu); EMEA (Anja Bräunlich); EMEA HR Ops Marburg Team (Inga Menzinger-Koradin); EMEA Service Delivery (Cornelia Huber); EMEA Service Desk (Filipe Cabete); EMEA Service Operations (Bernd Boucsein); EMEA Site Services (Raluca Hodgson); EMR GH Gruppe; EMR HW Gruppe (Patrick Laukel); EMR NBF Gruppe (Thomas Peil); ERP Applications (Mourad Boulanouar); ERP Applications (Nagesh Ramesh); ERP Solution (Rajan Thomas); ERP Solution Center (KoP) (Peter Eliasson); ERP Solution Center (MBR) (Jochen Preis); ERP Solution Center (Neelesh Kulkarni); ERP Solution Centre (AU) (Shama Ravindran); ERP and Enterprise Applications (Steven Harvey); ES Qualification (Michael Kocher); ETA (Colin Steele); ETA (Ian Mackay); ETA (Tim Bullock (Inherited)); ETA + Utilities & Services (Michael Elmer); EU Qualified Person for PhV (Andrew Bond); EU Qualified Person for PhV (Frank Mauler); EU TA Coagulation (Bianca Petzold); EU TA Coagulation (Damian Gilkerson); EU TA Coagulation (Sinem Kaba Pasqualon); EU Therapeutic Area Immunology & Neurology Europe (Peter K Tadros); EU-QPPV Office Lead (Gudrun Dechert); EU/ROW RA Franchise Cell and aQIV (Susan Cameron-Laxton (Inherited)); Early DSP Development (Michael Schmitt (Inherited)); Early Stage DSP Development (EDD) (Lars Robbel); Early Stage DSP Development (EDD) (Michael Schmitt); Early Stage DSP Development (Olga Müller); Early USP Development (Jörg Günther); Early USP Development (Stefan Debus); Early Upstream Development (Emmanuel Bizier); Early Upstream Development 1 (Ian Walker); Early Upstream Development 2 (Ellen Otte); East Point 193 (Kimberly Bragg); East Point 193 (William A Voltz); East Point 193 ACM Area 1 (Marshaya Johnson); East Point 193 ACM Area 1 (ROSALIND MCCOY); East Point 193 ACM Area 2 (Latasha A Wech); East Point 193 QA (Amaris A Wiggins); East Point 193 QA (Danelle Jones); East Point 193 QA (Melodee C Ebel (Inherited)); East Providence 202 (Christopher Travalik (Inherited)); East Providence 202 (PAUL BURKE); East Providence 202 (Sean Delong); East Providence 202 ACM Area 1 (Jacqueline Levasseur); East Providence 202 ACM Area 2 (Christine Riebe); East Providence 202 QA (Desiree Guerrero); East Providence 202 QA (Tatyani Guest); Eau Claire 514 (Kayla L Stueber); Eau Claire 514 QA (Melissa K Latourelle); Edge Commercial (Darren Hawker); Edge Finance (Matthew Rees (Inherited)); Edge Financial Accounting (Barry Johnston); Edge Manufacturing (Andrew Penhale); Edge Parkville (Matthew Rees); Edge Parkville - GLIMS (Helen Mihaljevic); Edge Planning (Brent Gorham); Edge Procurement (Mark Van Der Poel); Edge Programme (Ian Dick); Edge Programme (Philip Troughton); Edge Quality (Kate Waterman); Educational Meetings (Marco Kuhn); El Paso 197 (ALEX MARIN); El Paso 197 (Heather Jex); El Paso 197 ACM Area 1 (Michael Garcia); El Paso 197 ACM Area 2 (Cynthia Marquez); El Paso 197 QA (Amanda Robles); El Paso 197 QA (Brenda C Greenfield (Inherited)); El Paso 197 QA (CATIA LOPEZ); El Paso 248 (Edgar Rodriguez); El Paso 248 ACM Area 1 (Manuel Jaramillo); El Paso 248 ACM Area 2 (Albert Lozano); El Paso 248 QA (NOHEMI GARCIA); El Paso 279 (Alejandro Perales); El Paso 279 ACM Area 1 (Crystal Ramirez); El Paso 279 ACM Area 2 (Vanessa Pena); El Paso 279 QA (Kenya Villarreal); Electrical / I&C / BMS (Jan Klee); Electrical Engineer (Tien Nguyen); Electrical Maintenance (Marcel Ziegler); Electrical Maintenance
(Vittorio D'Argento (Inherited)); Electro Maintenance (Simon Urfer); Electronic Laboratory Systems (ELS) (Susanne Heins); Electrophoresis and Immunoassays (Michael Albers); Elektroniker für Automatisierungstechnik (Carmen Walldorf (Inherited)); Elektroniker für Automatisierungstechnik (Doris Nake (Inherited)); Elternzeit Bielefeld (Kirsten Scheibel (Inherited)); Elternzeit Bielefeld (Kirsten Scheibel); Elternzeit Diverse (Andreas Gehrich (Inherited)); Elternzeit Diverse (Andreas Gehrich); Elternzeit Diverse (Annette Pernitzsch (Inherited)); Elternzeit Diverse (Annette Pernitzsch); Elternzeit Diverse (Claudia Habenicht (Inherited)); Elternzeit Diverse (Claudia Habenicht); Elternzeit Diverse (Damaris Kieckhöfer); Elternzeit Diverse (Stephani Keltsch); Elternzeit Diverse (Sven Schuhmann (Inherited)); Elternzeit Diverse (Sven Schuhmann); Elyria 165 (Karin M Rothig); Elyria 165 ACM Area 1 (Nathan G Dailey); Elyria 165 ACM Area 2 (Gabrielle N Scalese); Elyria 165 QA (Calvin Juguilon); Elyria 165 QA (Jason A Skonecki); Emerging Europe (Christian Wieszner); Emerging Europe (Dirk Hoheisel (Inherited)); Employee Relations (Bonnie Shor (Inherited)); Employee Relations (Bonnie Slone (Inherited)); Employee Relations 1 (Tricia N Jackson); Employee Relations 2 (Jan Cameron); Employee Relations 3 (Emmanuella Hedge); End User Services (Christian Reinhardt); End User Services (Rolf Trümper); Endwell 220 (Barbara Ferrese); Endwell 220 ACM Area 1 (Richard Barber); Endwell 220 ACM Area 2 (Barbara Ferrese (Inherited)); Endwell 220 ACM Area 2 (Tara Streeter); Endwell 220 QA (Aarsalaan Semna); Endwell 220 QA (Richard Purdy II); Energy Management (Anna Fellenberg); Energy/WAD (Sandro Jenzer); Energy/WAD 1 (Michael Hirschi); Eng Business & Systems Mgr (Nicholas Moody); Eng Services - Ops (Mark Mansour); Eng Services - Ops (Rohit Dhorje); Eng Services - Ops (Damien Barri (Inherited)); Eng Services - Ops (Michael Spiteri); Eng Services - Ops (Victor Karafilis); Engineering (Bulk) (Jeff Rehborg); Engineering (Bozana Dujak); Engineering (Controls) (Dennis Prom); Engineering (Controls) (Scott Bilkey); Engineering (Controls) I (Dennis Prom); Engineering (Howard Wilton); Engineering (Johannes Krämer); Engineering (My Linh Ly); Engineering (Qualification) (Jeff Mihaichuk (Inherited)); Engineering (Qualification) (Matthew Galley); Engineering (Rainer Kraus); Engineering (Richard Friar); Engineering Compliance (Connie Costanzo); Engineering Data Management (Susan Clough); Engineering Lengnau (Olaf Thiel); Engineering Liverpool (Kevin Ridley); Engineering MAB/Gene Therapy (David Glover); Engineering Maintenance Mgr (Andrzej Wawrzykowski); Engineering Production Manager (Karen Spencer); Engineering Production Manager (Mark Davide); Engineering Production Manager (Stuart Barnes); Engineering Projects Dept (Stuart Freeland-Small); Engineering Projects Manager (Anthony Wrzesinski (Inherited)); Engineering Projects Manager (Anthony Wrzesinski); Engineering Projects Manager (David Ryan); Engineering Serv (Narein Mather); Engineering Serv (Sudhir Kamath); Engineering Serv - Drawing Office (Peter Dubuisson-Perrine); Engineering Serv - Maintenance (Michael Bell); Engineering Serv - Maintenance (Shiran Fernando); Engineering Serv - Management (Crystal Penaluna); Engineering Serv - Management (Geoff Armstrong); Engineering Serv Plan & Support (Benjamin Terbeeke); Engineering Serv Plan & Support (Deepak Cherian); Engineering Serv Plan & Support (Satya Dara (Inherited));
Engineering Serv Plan & Support (Satya Dara); Engineering Services - Ops (Jarvis Walker); Engineering Services (Arnold Nigsch); Engineering Services (Atul Malhotra); Engineering Services (Bradley J Eberhart); Engineering Services (Daniel Reveron); Engineering Services (Franz Arnold Nigsch); Engineering Services (James E Viane Jr); Engineering Services (Jose Gonzalez (Inherited)); Engineering Services (Peter Szitas); Engineering Services (Victor Marinelli); Engineering Services - Maintenance (Matthew Johnson); Engineering Services - Maintenance E/I (Jason Fletcher (Inherited)); Engineering Services - Maintenance E/I (Jason Fletcher); Engineering Services - Maintenance E/I (Matt Downey); Engineering Services - Maintenance E/I 1 (Samuel Kanyongo); Engineering Services - Maintenance E/I 2 (Ronnie Mercieca); Engineering Services - Maintenance E/I 3 (Ben Hillman); Engineering Services - Maintenance E/I 4 (ANDREW Rawlinson); Engineering Services - Maintenance E/I 5 (Haisley Okpako); Engineering Services - Maintenance E/I 6 (Jim Haines); Engineering Services - Ops (Amanda Sim); Engineering Services - Ops (Jason Chan); Engineering Services - Ops (Lee Dengler); Engineering Services 3 (Gene Bohn); Engineering Services I (Daniel Reveron); Engineering Services Process Leader (Tim Bullock); Engineering Standards (Adam Dragolic); Engineering Support (Crystal Penaluna (Inherited)); Engineering Support (Crystal Penaluna); Engineering Support (Geoff Armstrong (Inherited)); Engineering Support (Jayne Crowley); Engineering Technology Transfer (Shannon Boudreau); Engineering, PKV (Anthony Wrzesinski); Engineering, PKV (Brian Price); Engineering/PJM (Roger Stoffel); Engineering/PJM (Sven Schwerdtfeger); Enshi Inspection (Binming Tian ?????); Enshi Plasma (Xiaoxing Jiang ?????); Enshi Plasma Collection Center (Genxiong Zhou ?????); Enshi Plasma Collection and Office Administration (Min Zhang ????); Enshi Plasma Operations (Jing Wang ????); Enshi Plasma Sourcing (Liu Yang ????); Enshi Quality Control (Xiaohong Tan ?????); Enshi Quality Control Management (Stevin Cheng ?????); Enshi Quality Control Management (Xiaoping Tang ?????); Enshi Quality Inspection (Yinglong Liu ?????); Enshi Supply Management (Hong Yuan ????); Enterprise Analytics (John Thompson); Enterprise Applications (Charles Lowe); Enterprise Applications (David Christie); Enterprise Applications (Martin Jones (Inherited)); Enterprise Architecture (Ian Wilson); Enterprise Architecture (Keith Walbert); Enterprise Architecture (Movi Banerjee); Enterprise Business Solutions (David Wolozyn); Enterprise Data Management (Matt Barnett); Enterprise Excellence (Andrew Croft); Enterprise Execution Systems (Frank Mastellone); Enterprise Infrastructure & Operations (Bernard Shepard); Enterprise Infrastructure & Operations (Don Konemann (Inherited)); Enterprise Infrastructure & Operations (Greg Misyak); Enterprise Investment Portfolio Management (Aymeric Ange); Enterprise Learning Management (Justin Huss); Enterprise Portfolio & Governance (Matthew Cam (Inherited)); Enterprise Portfolio & Governance (Matthew Cam); Enterprise Portfolio & Governance (Matthew Kokkonen); Enterprise Process Management (Desire Djomani); Enterprise Process Management (Linda Carducci
(Inherited)); Enterprise Process Management (Matthias Kienast); Enterprise Security & Risk (Edward Ferrara); Enterprise Security & Risk EMEA (Jörg Koch); Enterprise Security - Identity and Access Management (Rebecca Daniels); Enterprise Security - Identity and Access Mgmt (Rebecca Daniels); Enterprise Security Architecture (Wilfried Ziegler); Enterprise Site Management (AU/Asia) (Don Konemann (Inherited)); Enterprise Site Management (AU/Asia) (Viv Louzado); Enterprise Site Management (André Strahm); Enterprise Site Management (Michael Furchert); Enterprise Site Management MBR (Michael Furchert); Environment (Barbara Dix); Environment Health Safety Sustainability (Dale C Rosene); Environment Health Safety Sustainability (Lynette Hodgden (Inherited)); Environmental Health & Safety (Andrew Hanley); Environmental Health & Safety (David Nelson); Environmental Health & Safety (David Stewart); Environmental Health & Safety (Filanthy Nalpantidis); Environmental Health & Safety (Prue McKeown); Europe Global Reg Affairs (Hazel-Anne Griffiths); Europe HR Ops Tier 1 (Katrin Schüpbach); Europe HR Ops Tier 1 (Stephan Schäufele (Inherited)); Europe HR Ops Tier 1 (Sylvia Potocnik); European Sourcing Packaging (Matthias Engler); Evansville 614 (Coltin L Springate); Evansville 614 (Michelle S DeCambre); Evansville 614 (Scott Ward); Evansville 614 ACM Area 1 (Tani Baugher); Evansville 614 ACM Area 2 (Ian C Fox); Evansville 614 QA (Karla K Cooper); Execution Systems (Matt Casey); Executive Admin (Rupal Pandit); Executive Assistant & Travel Office (Eliane Bossart); Executive Assistant (Joanne Du); Executive Assistant (Sarah Gleeson); Executive Compensation & Equity (Micaela Costello); Experimental Unit (Felix Hiltwein); Export Admin LatAm (Cindy Jacobs); External Communications (Natalie de Vane); External Materials (Dominik Corbet); External Supply Integration (Cameron Barrett); External Supply Quality Assurance (Eva Streit); External processes (André Strahm); External processes (Simon Haemmerli); F IX, F II, Inhibitors (Carlotta Debnar-Daumler); F VIII & F IX (Barbara Kalina (Inherited)); F VIII & F IX (Horst Boeder); FBP Project and Portfolio Support (Ritesh Kumar); FP - Non Process Engineering and Construction Management (Jennifer Mastio); FP - Non Process Engineering and Construction Management (Rolf Mönig); Fachlageristen (Carmen Walldorf (Inherited)); Fachlageristen (Doris Nake (Inherited)); Facilities & Utilities (Adila Zaidi); Facilities & Utilities (Bradley J Eberhart (Inherited)); Facilities (Claire Behforooz); Facilities Develop & Services (Barbara Anderton); Facilities Develop & Services (Cameron Cook); Facilities, Capex and Drawing (Mark Hughes); Facilities, Utilities & Services (Michael Elmer); Facility & Waste Management (Michael Andrey); Facility & Workspace Management (Samuel Maurer); Facility Compliance Specialist (Andrew Stalder); Facility Management (Christian Daum); Facility Management (Hansjoerg Bettschen); Facility Management (Hanspeter Bruni); Facility Management (Michael Andrey); Facility Project Management - Non Process Engineering and Construction (Rolf Mönig); Facility Project Management - Process Engineering (Darren Vegara (Inherited)); Facility Project Quality Management (Brian Grimson [C]); Facility Project Quality Management (Graham William Telford); Facility Services (Alex Elandt [C]); Facility Services (Alex Stähli); Facility Services (Sabine Beck);
Facility Services (Samuel Maurer); Facility, Data & Laboratory Management (Robert Schwanzer); Faktor X Produkte (Carlotta Debnar-Daumler); Fayetteville 266 (Grant Strayer); Fayetteville 266 ACM Area 1 (Kady-Ann Foster); Fayetteville 266 ACM Area 2 (Joshua Simpson); Fayetteville QA 266 (Daniel Huereca); Federal Way 110 (Tamara Ann Owens); Federal Way 110 ACM Area 1 (Nancy Martinez); Federal Way 110 ACM Area 1 (Tamara Ann Owens); Federal Way 110 ACM Area 2 (Tamara Ann Owens (Inherited)); Federal Way 110 ACM Area 2 (Tiffani Brazzell); Federal Way 110 QA (Jenny Bardwell); Federal Way 110 QA (Simon P Dickinson); Fermentation Research (Thomas Rein); Fertigungselektronik (Michael Kraft); Fertigungselektronik 1.0 (Thomas Grün-Fischer); Fertigungselektronik 2.0 (Ralf Gerlach); Fertigungselektronik 2.1 (Ralf Gerlach); Field Sales (Angus Gordon); Field Sales (Catherine Murphy (Inherited)); Field Sales (Kim Fry); Field Services BRN & LGN (Urs Derungs); Field Support (Brett A Wintheiser); Field Support (Brett Wintheiser); Field Support (Robin G Palmer (On Leave)); Field Support (Robin G Palmer); Fill - Finish (Pasquale Carestia); Fill / Finish (Pasquale Carestia); Fill / Finish Manufacturing (David Hartley); Fill/Finish Process Improv Manager (Joseph Connor); Fill Area (Barbara Beugger); Fill Area (Nicola Di Maiuta); Fill Finish (Beat Allemann); Fill Finish (Nicola Di Maiuta); Fill Finish (Shannon Thorp); Fill Finish Marburg (Frank Emmerich); Fill Finish Marburg (Helmut Robert Euler); Fill Finish Operations (Lasher Rao ?????); Fill Finish Operations (Philip Troughton); Fill Process Technology (Herman Schinkelshoek); Fill and Finish Support (Melissa Addamo); Fill finish (Ian Middleton); Fill finish (John Riley); Fill finish (Marion Taligault Owens); Fill/Finish Formulation (Caterina Colantoni); Fill/Finish Formulation (Norm Mancuso (Inherited)); Filling & Packaging Mechanical Team (Adam Steegstra); Filling & Packaging Mechanical Team (Tharanga Abeysinghe); Filling & Visual Inspection (Eveline Kindler); Filling (Adrian Carni); Filling (Andreas Gavriel (Inherited)); Filling (Andreas Gavriel); Filling (Andrew Marshall (Inherited)); Filling (Andrew Marshall); Filling (Cuong Nguyen); Filling (Daniel Locandro); Filling (Igor Belevski); Filling (Joselito Bautista); Filling (Marion Taligault Owens); Filling H69 & Refludan (Matthias Klein); Filling I (Eveline Kindler); Filling II (Celio Ferreira); Filling II (Simone Wettstein); Filling Line & Lyophilisation (Michael Gisler); Filling Line I & II (Adrian Aeschlimann); Filling Line I & II Equipment Preparation (Werner Steiner); Filling Line I & II Group 1 (Urs Cotting); Filling Line I & II Group 2 (Markus Rindisbacher); Filling Line I & II Group 3 (Bruno Zuercher); Filling Line I & II Group 3 (Roland Gerber); Filling Line I&II Pasteurisat./Incubat.
(Eduard Wittwer); Filling Line II Pasteurisat./Incubat (Roland Lerch); Filling Line III & IV (Mathias Beck); Filling Line III (Mathias Beck); Filling Line III Group 1 (Sasa Lazarevic); Filling Line III Group 2 (Christian Schmid); Filling Line III Group 2 (Daniel Kraehenbuehl); Filling Line III Group 3 (Ulrich Beat Wildi); Filling Line III Support (Daniel Kraehenbuehl); Filling Line IV (Jean-Claude Cauderay); Filling Line IV, Lyo & Support (Alexander Kissler); Filling Line IV, Lyo & Support (Jean-Claude Cauderay); Filling Line V (Andrea Jantsch); Filling Line V (Anna Meier); Filling M 305 (Esther Seidel); Filling Non Privigen (Mayur Bannore); Filling Operations (Chenyi Guo ?????); Filling Operations (Wei Xiao); Filling Privigen (Narelle Urli); Filling Privigen (Tyson Parker); Filling Support (Andrew Marshall); Filling Toll Plasma (Dale Peel); Filling Toll Plasma (Laxman Trichinapalli); Filling Toll Plasma (Narelle Urli); Filling Toll Plasma (Peter Koelmeyer); Filling Toll Plasma (Rebecca Hayes); Filling Toll Plasma (Robert La Ferla); Filling V Group 1 (Eike Gredler); Filling V Group 1 (Roger Wamister); Filling V Group 1 (Thomas Daehler); Filling V Group 2 (Michael Roos); Filling V Group 2 (Thomas Daehler); Filling and Packaging (Narelle Urli); Filling and Packaging (Tyson Parker); Filling/Lyo/Visual Inspection (Michael Gisler); Final Product Manufacturing / Production Services (Othmar Geisser); Final Product Planning (Ingo Kling); Final Product Planning (Jan-Christopher Gerlach); Final Product Planning (Julian Knabeschuh); Finance & Accounting (Eisuke Kofugata); Finance & Accounting (Haruka Utsugi (Inherited)); Finance & Accounting Japan (Izumi Yoshida ??? ??? - ??? ????); Finance & Controlling (Devun Dusoruth); Finance & Controlling (Ebru Kuntay); Finance & Controlling (Jonathan Ho); Finance & Controlling (Justin Mericle); Finance & Controlling HU (Togya Gergely); Finance & Controlling Italy (Annalisa Saracchi); Finance (Amy Kishikawa); Finance (Avi Singh); Finance (Becky Heatherman); Finance (Carolyn Xiong ????); Finance (Damian Gaylor); Finance (Daniel Janides); Finance (Fergus Patrick McLellan); Finance (Frank Liesner); Finance (Harold Chan (Inherited)); Finance (Helen Gearing); Finance (Jacqui Lomas (Inherited)); Finance (Jacqui Lomas); Finance (John Levy); Finance (Karol Bian ?????); Finance (Kate Tse); Finance (Ken Lim); Finance (Konstantin Petropoulos); Finance (Lena Shi ????); Finance (Luke McMahon (Inherited)); Finance (Luke McMahon); Finance (Melody Orbiso); Finance (Nicole Pryde); Finance (Nishant Popat); Finance (Shalini Goundar); Finance (Siu-Yin Yu ?????); Finance (Vicci Quagliarella); Finance (Wolfgang Thiel); Finance (Xiaowei Yin ?????); Finance / Tax Marburg (Fatma Kremser); Finance Belgium (Jörgen Bond); Finance Business Partner Commercial EMEA (Simon Briscoe); Finance Business Partnering Operations (Daniel Janides); Finance Business Partnership (Jason Sowles); Finance Business Partnership (Michael T McAvoy); Finance Business Partnership I (Michael T McAvoy); Finance China (Karol Bian ?????); Finance Commercial LATAM (KRISTINE SOLOMON (Inherited)); Finance Commercial LATAM (Martin Descotte); Finance Czech Republic / Slovakia (Libor Ballek); Finance Director: Lead Business Partner (Sharon Tindley); Finance EU Commercial (Kristine Solomon (Inherited)); Finance France (Charlotte Rougié); Finance Global Commercial
(Kristine Solomon); Finance Greece (Christos Papadatos (??????? ?????????)); Finance Greece (Efstathios Lymperopoulos); Finance Netherlands (Jörgen Bond); Finance Nordic (Carl Werner); Finance Portugal (David Roig Martinez); Finance Spain (David Roig Martinez); Finance UK (Paul Fellingham); Finance US Commercial Operations (Robert Smith Jr); Finance and Admin (Elena Kondrashova); Finance and Controlling (Devun Dusoruth (Inherited)); Finance and Controlling (Doris Kamtner); Finance and Controlling (Franz Grün, Doris Kamtner); Finance and Controlling (Ulrike Bridi); Finance, Accounting & Reporting (Beeharrylall Jeetun); Finance, Business Modelling (Helen Gearing (Inherited)); Financial Planning & Analysis (Duncan Webber); Financial Planning and Analysis (Haruka Utsugi); Financial Planning and Analysis 1 (Christopher Pulupa); Financial Planning and Analysis 2 (Christopher Pulupa (Inherited)); Financial Planning and Analysis 2 (Daniel Ganaishlal); Financial and Reporting Accountant (Hayley Jackson); Financial and Reporting Accountant (Ryan Brown [C]); Financial and Reporting Accounting (Callum Bircham); Finishing & Infrastructure (Laurent Wagner); Finishing & Infrastructure (Roger Stoffel); Flint 161 (Anjela Johnson); Flint 161 (Carlotta McCoy); Flint 161 ACM Area 1 (Janie Cary); Flint 161 ACM Area 1 (Trina L Bryant); Flint 161 ACM Area 2 (Carlotta McCoy (Inherited)); Flint 161 ACM Area 2 (Khatija Moiz); Flint 161 QA (Allante S Williams); Flint 161 QA (Andrea K Coleman); Forecasting & Strategic Analytics (Joshua Prince); Forecasting & Strategic Analytics (MANISH SRIVASTAVA); Forecasting, Compliance & CTA Operations (Jutta Neufang-Hueber); Formulation (Remon Hemaya); Formulation Development (Ahmad Abdul Fattah); Formulation Development (Di Goodall); Formulation Development (Heidi Elmer Bodnar); Formulation Development (Hywel Williams); Formulation Development (Michelle Zhuravlyova); Formulation Development (Nathan Edwards); Formulation Development (Richard Shalders); Formulation Development (Scott Thompson); Formulation Project Process (John Riley); Formulation Project Process (Marion Taligault Owens); Formulation Shift A (David Rimmer); Formulation Shift B (David Rimmer); Formulation Shift B (Matthew Storey); Formulation, Lyo & Stability Development (FLS) (Uwe Liebing); Fort Collins 705 (Erin J Zwalina); Fort Collins 705 ACM Area 1 (Michael A McNear); Fort Collins 705 ACM Area 2 (Jeremy M Kuehn); Fort Collins 705 QA (Christi Bringle); Fort Smith 278 (David Ensminger (Inherited)); Fort Smith 278 (Rachael Kirby); Fort Smith 278 ACM Area 1 (Rachael Kirby); Fort Smith 278 ACM Area 1 (Tammy Semiche); Fort Smith 278 ACM Area 2 (Russell Perez); Fort Smith 278 QA (David Ensminger (Inherited)); Fort Smith 278 QA (Whitney Jacobs); Fort Wayne 089 (Malori A Shields); Fort Wayne 089 (Rob Garcia); Fort Wayne 089 ACM Area 1 (Timothy R Albright); Fort Wayne 089 ACM Area 2 (Chad Rudolph); Fort Wayne 089 QA (Chris Cusack); Fort Wayne 089 QA (Erik Plate (Inherited)); Fort Wayne 089 QA (Gretchen Watkins); Fort Wayne 089 QA (Mitch A Quinn); Fort Worth 419 (Angelica M Henry); Fort Worth 419 (Sarah E Silva); Fort Worth 419 ACM Area 1 (Eddie S Rosas); Fort Worth 419 ACM Area 2 (Jennyfer Delacruz); Fort Worth 419 ACM Area 2 (Martel Carter); Fort Worth 419 ACM Area 3 (Angelica M Henry (Inherited)); Fort Worth 419 ACM Area 3 (MacGregor Roy); Fort Worth 419 QA (Rochelle L Shannon); Fractionation & Bulk (Michael Beyeler); Fractionation &
Bulk (Roger Stoffel); Fractionation & Bulk (Sven Schwerdtfeger); Fractionation & Bulk (Zabdiel Dominguez); Fractionation (Simon Scheidegger); Fractionation Group 1 (Fritz Liechti); Fractionation Group 2 (Adrian Locher); Fractionation Group 3 (Christian Stucki); Fractionation Group 4 (Walter Strahm); Fractionation Group 5 (Urs Durtschi); Fraktionierung & Filtration (Kai Erkel); Fraktionierung (Rainer Frank (Inherited)); Franchise Medical Affairs Team 1 (Emna Bourkhis); Franchise Medical Affairs Team 1 (Nabil Moumane); Franchise Medical Affairs Team 2 (Hasan Catovic); Fredericksburg 703 (Sara M Schuppe); Fredericksburg 703 (Sheri Mixon (Inherited)); Fredericksburg 703 ACM Area 1 (Juan Manuel Castillo); Fredericksburg 703 ACM Area 2 (Mykel A Gonzales); Fredericksburg 703 QA (Gracie P Melendez); Front Line QA (Amanda Cooper); Ft Collins 705 QA (Christi Bringle); Ft. Gratiot 170 (Adrienne Smith); Ft. Gratiot 170 (Desiree Wright); Ft. Gratiot 170 ACM Area 1 (Cortney Young); Ft. Gratiot 170 ACM Area 2 (Karri Mitchell); Ft. Gratiot 170 QA (Allante S Williams); Ft. Gratiot 170 QA (Breanna Mini); Ft. Gratiot 170 QA (Melissa Johnson); GCSP & PhV Regions (Richard Wolf); GCSP Global Regions (Lana Gloukhova); GCSP Regions & Pv Operations (Kevin Burke (Inherited)); GCSP Regions & Pv Operations (Kevin Burke); GCSP Regions & Pv Operations 1 (Mark McGinnis); GCSP Regions (Angela Long); GCSP Regions (Rawad Antoun); GCSP Regions Lead Asia Pacific (Sophie Fontez); GCSP Regions Lead ECI (Simone Lorenz-Asmus); GCSP Regions Lead EU (Nicole Avalos); GFSS - Asia Pac (Noopur Pattni (Inherited)); GFSS - Asia Pac (Madison Crawford); GM & Staff (Sarah Yeung); GMP Compliance & Packaging Excellence (Thorsten Keller); GMP Training (Ann Moody); GPSS BRN & LNG (Denis Klochkov); GPSS BRN & LNG (Stephanie Schoch); GPSS USA (Michael Burdick); GRA CMC (Eva Walter); GRA CMC BMW (Siew Cheng Ney); GRA CMC BMW Team 1 (Nicole Apostolidis); GRA CMC BMW Team 2 (Libby Brodie); GRA CMC BRN (Luca Reggiani); GRA CMC BRN Team 1 (Petra Truetsch); GRA CMC BRN Team 2 (Dominique Schaller); GRA CMC BRN Team 2 (Sabine Rohner); GRA CMC BRN Team 3 (Grzegorz Podrygajlo); GRA CMC BRN Team 3 (Karin Stein-Liesen); GRA CMC KAN (Rick Khuu); GRA CMC KAN (Ricky Khuu); GRA CMC KAN Team 1 (William K Mendell); GRA CMC KAN Team 2 (Olga Neumüller); GRA CMC KAN Team 3 (William K Mendell); GRA CMC MBR (Lene Nielsen); GRA CMC MBR (Martin Opper); GRA CMC MBR Team 1 (Dörthe Vingerhoet); GRA CMC MBR Team 1 (Markus Kuhl); GRA CMC MBR Team 2 (Thomas Nassauer); GRA CMC MBR Team 3 (Antje Mehrer); GRA CTA Operations Group (Florin Muraru); GRA GPS CV, Metabolism & Adv Therapies (Scott Hambaugh (Inherited)); GRA GPS Cardiovascular & Metabolism (Scott Hambaugh (Inherited)); GRA GPS Hematology & Thrombosis (Scott Hambaugh); GRA GPS Hematology & Thrombosis (Sibylle Kaiser); GRA GPS Immunology & Neurology (Lauren Tornetta); GRA GPS Immunology & Neurology (Scott Hambaugh (Inherited)); GRA GPS Immunology (Scott Hambaugh (Inherited)); GRA GPS Inflammation & Transplant (Hartmut Landgrebe); GRA GPS Respiratory (Melissa Tokosh); GRA GPS Transplant (Hartmut Landgrebe); GRA LATAM (Gordana Joksimovic); GRA Planning Group (Martin Steinmann); GRA Region EU & Switzerland (Anke Arnold); GRA Region EU & Switzerland (Bettina Doepner); GRA Region EU & Switzerland (Birgit Sommer (Inherited)); GRA Region EU & Switzerland (Birgit Sommer); GRA Region EU & Switzerland (Martina Schneider); GRA Region EU & Switzerland (Paolo Voltolina); GRA Region EU &
Switzerland (Pedro Manuel Regateiro de Moura Campino); GRA Region EU & Switzerland (Stefanie Zaugg); GRA Region EU & Switzerland (Wencke Maeder-Wotruba (Inherited)); GRA Region EU & Switzerland (Wencke Maeder-Wotruba); GRA Region EU & Switzerland (Wolfgang Friedrich); GRA Region NA - AdPromo (John Hill); GRA Region NA - Hematology (Tara Chapman); GRA Region NA - Immunology (Angela D Azzara); GRA Region NA - CMC (Todd Olson); GRA Region NA - CV/Transplant (Uros Djekic); GRA Region North America (Tara Chapman); GRA Resourcing (Silke Britschock); GSP Operations (Liwei Sun ?????); Gainesville 182 (Deidra Snow-Johnson); Gainesville 182 (Laila Matthews-El); Gainesville 182 ACM Area 1 (Toya Green); Gainesville 182 ACM Area 2 (Laila Matthews-El (Inherited)); Gainesville 182 ACM Area 2 (Leslie L Heidelberg); Gainesville 182 QA (Elvern M Gregg); Gastonia 267 (Mai Yang); Gastonia 267 ACM Area 1 (Terri L Salsman); Gastonia 267 ACM Area 2 (Scotty Burch); Gastonia QA 267 (Blake Painter); Gbl Commercial Operations-Cardiovascular (Debbie Drane (Inherited)); Gbl Commercial Operations-Cardiovascular (MaryAnn Capritti); Gene Therapy (Karsten Peppel); Gene Therapy (Orit Wolstein); Gene Therapy Research (Cédric Vonarburg); Gene Therapy Research I (Florian Aeschimann); General Ledger (Tanja Bieri); General Ledger (Tanja Gurtner); General Ledger Accounting (Thierry Bonjour); General Product Characterisation (Robert Dickinson); General Product Characterisation (Tom Murray-Rust); Geringfügig Beschäftigte (Andreas Gehrich); Geringfügig Beschäftigte (Annette Pernitzsch (Inherited)); Geringfügig Beschäftigte (Annette Pernitzsch); Geringfügig Beschäftigte (Claudia Habenicht (Inherited)); Geringfügig Beschäftigte (Claudia Habenicht); Geringfügig Beschäftigte (Kirsten Scheibel (Inherited)); Geringfügig Beschäftigte (Kirsten Scheibel); Geringfügig Beschäftigte (Natascha Bock (Inherited)); Geringfügig Beschäftigte (Natascha Tappendorf); Geringfügig Beschäftigte (Stephani Keltsch); Geringfügig Beschäftigte (Sven Schuhmann (Inherited)); Geringfügig Beschäftigte (Sven Schuhmann); Gerätevorbereitung (Roman Truttmann); Gesundheitsschutz (Jürgen Kanand (Inherited)); Gesundheitsschutz (Özcan Campinar (Inherited)); Gilbert 131 (Daniel I Villegas (Inherited)); Gilbert 131 (Erica L Ewing); Gilbert 131 ACM Area 1 (Christopher Hersch); Gilbert 131 ACM Area 1 (Michael Brownell); Gilbert 131 ACM Area 2 (Karen Branch); Gilbert 131 QA (Karen Branch); Gilbert 131 QA (Will Porter); Glassware Washing (Jasmina Trumic); Glen Burnie 136 (Bryon Wiley); Glen Burnie 136 (Guillian T Gonzalez); Glen Burnie 136 ACM Area 1 (Janet Rhys-Jones); Glen Burnie 136 ACM Area 2 (Solyana Gebrekidan); Glen Burnie 136 QA (Monique L Fitz); Global Analytical & Technical Services (Stephen Case); Global Analytical Science & Technology (Jeffrey Pederson); Global Analytical Science & Technology (Stephen Case); Global Applications Testing (Kinga Zambo); Global Artwork (Karelle Phelan); Global Automation (Christoph Zahnd (Inherited)); Global
Automation (Hans Wieser); Global Batch Release (Karen Marks); Global Benefits (Judith Kleemeier); Global Bioanalytical (Jeffrey Michael Hey); Global Bioanalytical (Matthias Zimmermann); Global Bioanalytical (Vanessa Sandford); Global Bioanalytical Sciences (Jeff Hey); Global Business Technology (John A Newsom); Global CQA Systems (Carmen Szeto); Global Case Management (Kevin Burke); Global Case Management (Monika Klug); Global Case Management (Joanne Grego); Global Category Gels, Resins, Media & Process Aid (Stephan Heer); Global Category Lead R&D (Benjamin Prior); Global Change Management and Communications (Linda Hagerty-Dotterer); Global Clinical Dev Hematology & Thrombosis (Marcus Carr); Global Clinical Development (Frank Albano); Global Clinical Development (William Mezzanotte (Inherited)); Global Clinical Development (William Mezzanotte); Global Clinical Development Hematology & Thrombosis (Marcus Carr); Global Clinical Ops - Transplant (Michele Jenkins); Global Clinical Programs (Blanca Salazar); Global Clinical Programs 2 (Ingo Pragst); Global Clinical Quality Assurance (Claudia Fellmer); Global Clinical Quality Assurance (Volker Nickel (Inherited)); Global Commercial Insights & Analytics (Nitin Bhatnagar); Global Commercial Operations (William Campbell); Global Content (Greg Healy); Global Data Management (Jennifer Toomey); Global Doc Systems (Dominik Zuercher); Global EH&S, Resilience and Risk Management (Lynette Hodgden); Global Engineering (Christoph Zahnd); Global Engineering Projects (Darrah Wilkerson); Global Export Order Fulfillment (Sabine Hämel); Global Finance (David Lamont); Global Financial Operations (David Lamont (Inherited)); Global Financial Operations (Karen Neave); Global HR Business Partner Organization (Doug German); Global HR Operations (Sara Proctor [C]); Global HR Operations (Sara Proctor); Global HR Services and Payroll (Mark Hickenbottom); Global HR Systems (Garnett Hudson); Global HR Systems (Jeff Allen); Global HR Systems (Melissa Zyla); Global HR Systems I (Jeff Allen); Global HR Systems II (Melissa Zyla); Global HR Systems and Reporting (Andrea Dauphinee); Global HR Systems and Reporting (Patricia Berdugo); Global HR Systems and Reporting (Rob Nyhan); Global Health Economics and Reimb (Girishanthy Krishnarajah); Global Health Economics and Reimb (Ryan Saadi); Global Healthcare Policy & Ext. Affairs (Dennis Jackman); Global Healthcare Policy & Ext.
Affairs (Michael Ruggiero); Global Human Resources (Andrea Resch); Global Human Resources (Elizabeth Walker); Global Human Resources Business Partner Organization (Doug German); Global Human Resources Business Partner Organization (Gyuri Endes); Global IP (Beate Binsack); Global IT PMO (Lance Runyard); Global IT Service Delivery (Sam Vickers); Global IT Service Delivery (Sunil Shah); Global Indirect Procurement (James Thomas); Global Indirect Sourcing (John Ehgartner); Global Intellectual Property (GIP) (Hans-Peter Hauser); Global Intellectual Property (Peter Gomme); Global Internal Communications (Michele Darnell); Global LIMS (David Johnstone); Global Labeling (MARK COLLINS); Global Labeling (Maryann Cuomo); Global Labeling Operations (Maricarmen Dilone-Raposo); Global Labelling Operations (Lynda Rizzetti); Global Legal (Gregory Boss); Global Legal (John Michael Minardo); Global Legal (John Minardo); Global Legal Operations & Services (Lauren Neal); Global Legal Services (Adarsh Nair); Global Legal Services (Kieran O'Shea); Global Library Service Operations (Ulrike Friebertshäuser-Jilke); Global Licensing (Andrea Huggins); Global Licensing (Michael Jorgensen); Global Logistics (Marianne McDonald); Global Logistics (Paul Wolstencroft); Global Logistics (Uli Kiefer); Global Logistics Operations (Suzanne Johnson (Inherited)); Global Logistics Team (BP Pedersen); Global MS&T (Irina Staxen); Global MSPR (Christoph Höck); Global Manufacturing (Chris Larkins); Global Manufacturing Ops & Plasma (Tony HARTMAN); Global Marketing Team Cardiovascular (KAMRAN MAMMADOV); Global Medical Affairs (Gregg Sylvester); Global Medical Affairs (Marcus Stockschläder); Global Medical Devices & Primary Packaging Materials (Declan Reilly); Global Medical Evaluation (Federico Melo Ferrer); Global Medical Evaluation (Federico Melo-Ferrer); Global Medical Evaluation (Sabine Härtel); Global Mobility COE (Ulrike Krenz-Fisher); Global Operation & RCF Administration (Margrit Hug Bucher); Global Operational Excellence and BPM (Fabrice Gribon); Global Operational Excellence and BPM (Stephen Marlow (Inherited)); Global Operations & Quality Finance (Helen Gearing (Inherited)); Global Operations & Quality Finance (Jacinta Glennon); Global Operations (Chris Larkins); Global Operations (Stephen Marlow); Global Operations (Val Romberg); Global Packaging (Warren Comerford); Global Packaging (jerome serrurier); Global Packaging Design and Artwork (Andrew John Robinson); Global Packaging Design and Artwork (Eva Streit); Global Packaging Services (Rick A Majszak); Global Packaging Services (Samantha Czako (On Leave)); Global Packaging Services (Samantha Czako); Global Pathogen Safety (Birgit Popp); Global Pathogen Safety (Eleonora Widmer); Global Pathogen Safety (John Liu); Global Pathogen Safety (Nathan Roth); Global Pathogen Safety - Marburg (Birgit Popp); Global Pathogen Safety - Marburg (Björn Keiner); Global Pathogen Safety Support (Eleonora Widmer); Global Pathogen Safety Support Asia (Connie Broumis); Global Payroll (Christina Avraamides); Global Payroll (Marjorie Platt); Global Payroll (Mark Hickenbottom (Inherited)); Global Pharmacovigilance Quality Assurance (Claudia Nolte); Global Planning (Harald Berg); Global Planning (Jamie Pritzl); Global Planning, Supply Chain (Chad Salisbury); Global Planning, Supply Chain (Steve Moloney); Global
Plasma Team (Dieter Brazel); Global Plasma Technology Ownership (Benno Bitterli); Global Portfolio Influenza (Jane Lanteri) (Jane Leong); Global Pricing (John Hakanson); Global Product Characterization (Stefan Schmidbauer); Global Product Specifications and Reference Standards (Richard Steere); Global Product Strategy (GPS) (Scott Hambaugh); Global Publishing (Marquis Bryant); Global Publishing (Timothy Huke); Global QA, IT (Patricia Hernan Miller); Global Quality (Vasilis Mavrogenis); Global Quality - Americas (Carlos Torres); Global Quality Affiliates (Collins Onyejese); Global Quality Affiliates (Laura O'Brien (Inherited)); Global Quality Assurance (Karen Netherton); Global Quality Control (Brian Nunnally); Global Quality Management (Jeffrey A Alcorn); Global Quality Management (Laura O'Brien); Global Quality Management (Rushen Mendis); Global Quality Management (Sanjana Sanjappa); Global Quality Management 2 (Brian Walker); Global Quality Management Sys. (Allen F Coleman); Global Quality Management Sys. (Eva M. Urban (Inherited)); Global Quality Management Sys. (Eva M. Urban); Global Quality Management Sys. (Jeffrey A Alcorn (Inherited)); Global Quality Management Sys. (Stephen A Wilson); Global Quality Management Sys. (Steve Wilson); Global Quality Management Systems (Carol Kidwell (On Leave)); Global Quality Management Systems (Carol Kidwell); Global Quality Operations (Matthias Pohl); Global Quality Systems & Compliance (Milka Smoljko); Global Quality Systems (Chad M Salisbury); Global Quality and R&D IT Systems (Tim Jones); Global Quality, Bus Svcs & Finance HR (Doug German (Inherited)); Global Quality, Bus Svcs & Finance HR (Stephanie McKinney); Global R&D - Financial Rptng & Analysis (Krista Doron); Global R&D Early Development CMO (Diana Lanchoney); Global R&D Finance (Christopher James Thorpe); Global R&D Finance (Karen Neave (Inherited)); Global R&D Finance (Pamela Cerovich); Global R&D Project Management (David S Fifer); Global R&D Project Management (Diana Lanchoney); Global R&D Project Management (Heiko Riedel); Global R&D Project Management (Heiko Völpel); Global R&D Project Management (Jennifer Dutton); Global R&D Project Management (Lena Ohannesian); Global R&D Project Management (Martin Broder); Global R&D Project Management (Peter Douglas); Global R&D Project Management (Rose Fida); Global R&D Project Management (Steven Brooks); Global R&D Project Management 1 (Peter Douglas); Global R&D Project Management 2 (Christian Spyr); Global R&D Project Management 2 (Gino Vairo); Global R&D Project Management 2 (Nancy Fetrow (Inherited)); Global R&D Project Management 2 (Regula Heini Hodel) (Christian Spyr); Global R&D Project Management 3 (Christiane Enzinger); Global R&D Project Management 3 (David Leacy (Inherited)); Global R&D Project Management 3 (Katy Dimitropoulos); Global R&D Project Management 3 (Rose Fida); Global R&D Project Management 4 (Laura J Schweigert); Global R&D Project Management I (David S Fifer); Global R&D Project Management II (Heiko Riedel); Global R&D Project Management III (Jennifer Dutton); Global R&D Project Management Immunology (Linda Faux (Inherited)); Global R&D Project Management Immunology (Steven Brooks (Inherited)); Global R&D QA CMC & Research (April Sena); Global R&D Quality Assurance (Karen Gard'ner); Global Recombinant Portfolio Group (Lene Nielsen); Global Records (John Neff (Inherited)); Global Reg Affairs ANZ (Kellie Hooley); Global Regulatory Affairs (Ashley Burt); Global Regulatory Affairs (Catarina Edfjaell); Global Regulatory 
Affairs (Emmanuelle LECOMTE BRISSET); Global Regulatory Affairs (Franck Nicolas); Global Regulatory Affairs (Mary Ryan); Global Regulatory Affairs Quality Compliance (Jana Reitmajer); Global Regulatory Affairs Quality Compliance (Monika Dietrich-Sander); Global Regulatory Affairs Quality Compliance (Rafael Sierra); Global Regulatory Lead - QIVc (Karen Jourdan-Brown); Global Regulatory Operations and Labelling (Emma Williams); Global Regulatory Systems&Informat.Mgmt. (Christine Berger); Global Regulatory Systems&Informat.Mgmt. (Franck Nicolas (Inherited)); Global Reporting (Harsha Kadiyala); Global Research & Development (Andrew Cuthbertson); Global Research & Development (William Mezzanotte); Global Risk & Insurance Management (John Marren); Global Risk Management (Mark Luksic); Global SAP Security & BTGC (Steven Yannelli); Global SC Operations (Tina Law); Global Sales & Operations Planning (Ben Wilson); Global Scientific Excellence (Maria Müller); Global Security (Tony Strickland); Global Serialization (Michel Béraud); Global Shared Services (Liam Connelly); Global Site Management (Lauren Ruth Vickery); Global Site Management (Lauren Vickery); Global Sourcing (Paul Addis); Global Sourcing - Automation, Instrumentation, Packaging and Aseptic Filling (Iouri Sverdlov); Global Sourcing Logistics (Gela Bakuridze); Global Sourcing Logistics (John Ehgartner (Inherited)); Global Strategic Sourcing, Chemicals (Jun Gao); Global Supplier Quality EMEA (Hans-Jürgen Schöning); Global Supply Chain (Ian Dick); Global Systems Maintenance (Regina Mühlich); Global Talent Acquisition (Brian Fehrer); Global Talent Acquisition (Melissa Bradford); Global Tax (Aoife Deane); Global Total Rewards (Elizabeth Walker (Inherited)); Global Total Rewards (Maynard Branscome); Global Trademarks (Nicole Smith); Global Transport Validation (Matthew Wokulich); Global Validation (Russell Ciliento); Global Validation (Russell James Ciliento); Government Rebate Operations (Joseph DeLuca); Government Vaccines Manager 745 (Helen Dela Cruz); Graduates (David Azzopardi); Grand Blanc 244 (Kelly M Weng); Grand Blanc 244 ACM Area 1 (BRANDON SMITH); Grand Blanc 244 ACM Area 1 (LC Davis); Grand Blanc 244 ACM Area 2 (ROBERT MANGOLD); Grand Blanc 244 QA (Martina Young); Grand Junction 159 (Daniel Venn); Grand Junction 159 (Markah Williams Mower); Grand Junction 159 (Markah Williams); Grand Junction 159 ACM Area 1 (Steven Potter); Grand Junction 159 ACM Area 2 (Richard S Simpson); Grand Junction 159 ACM Area 2 (Rob Ferguson); Grand Junction 159 QA (Carrie E Pell); Grand Junction 159 QA (Kelly M Weng); Grand Prairie 049 (Angelica M Henry); Grand Prairie 049 (Jamie Bullock); Grand Prairie 049 ACM Area 1 (Kelly Gomez); Grand Prairie 049 ACM Area 2 (Deonka Whitley); Grand Prairie 049 QA (LaDonnica L Eddings); Grants Management (Abraham Smith); Grants Manager (Abraham Smith); Greece 194 (Jontus Walker); Greece 194 (Tangerine Tingle); Greece 194 ACM Area 1 (Mike Massaro (On Leave)); Greece 194 ACM Area 1 (Mike Massaro); Greece 194 ACM Area 2 (Arooj Hussain); Greece 194 QA (Ariel S Forrest); Greece 194 QA (John L Thixton (Inherited)); Greece 194 QA (Todd Wolfe); Greeley 615 (Natasha D Casillas); Greeley 615 (Skyler T Campbell); Greeley 615 ACM Area 1 (Natasha D Casillas (Inherited)); Greeley 615 ACM Area 1 (Natasha D Casillas); Greeley 615 ACM Area 2 (Rita E Williams); Greeley 615 QA (Meghan Fryer); Greensboro 117 (Susan Watkins (On Leave)); 
Greensboro 117 (Susan Watkins); Greensboro 117 ACM Area 1 (Dorleans Alce); Greensboro 117 ACM Area 1 (Kristen Jones); Greensboro 117 ACM Area 2 (Kristie Cunningham); Greensboro 117 QA (Stephanie Bernard); Greenville 088 (Andrea S Zeller); Greenville 088 ACM Area 1 (Andrea S Zeller); Greenville 088 ACM Area 1 (Jeremy Honea (On Leave)); Greenville 088 ACM Area 2 (Natasha Pinson); Greenville 088 QA (LeeAnn M Estes); Gresham 055 (Brandy J Vaughan); Gresham 055 ACM Area 1 (Noah S Johnson); Gresham 055 ACM Area 2 (Becca Daugherty); Gresham 055 QA (Dijana Colic); Group 3 (Adrian Alder); Group Analysis (Dean Barrett); Group Analysis (Dean Wilde); Group Analysis (Maureen Harrington); Group Controller (Helen Gearing (Inherited)); Group Controller (Jacob Weaver); Group Controller (Noopur Pattni); Group Finance (Daya Salter); Group Finance (Jason Mugridge); Group Finance (Kevin Personius); Group Finance 1 (Jeffrey Marchetti); Group Finance II (Troy Kukorlo); Group Finance- Financial Systems (Mary Conlin); Group Görzhausen (Michael Engel); Group Görzhausen (Uwe Stöhr); Group Hauptwerk (Michael Engel); Group Income Protection (Emma McCarthy); Group Lead Validation-Site Expansion (Robert Musgrave); Group Main Site (Michael Engel); Group Reporting (Michael J. Clark); Group Tax (Peter Larsen); Group Taxation (Michael Manusov); Gulfport 122 (Elishia Humphrey); Gulfport 122 (John E Hunt (Inherited)); Gulfport 122 (Joshua D Harper); Gulfport 122 (Robert Spicer); Gulfport 122 ACM Area 1 (Joshua D Harper); Gulfport 122 ACM Area 2 (Bernetta L Huff); Gulfport 122 ACM Area 2 (Joshua D Harper); Gulfport 122 QA (Regina Williams); GxP Training (Carla Oliver); GxP Training (Vicky Lioutas (Inherited)); HAE & Respiratory (Sylvia Herget); HAE & Respiratory 1 (Susann Hofmockel); HAE Marketing (Amy Bifolco-Morrell); HAE Marketing (Tom Groeling (Inherited)); HAE Marketing (Tom Groeling); HR Business Partner (Sandro Krug); HR Business Partner (Tanja Templer); HR Business Partner 1; HR Business Partner 1 (Tina Camenzind); HR Business Partner 2 (Johanna Wege); HR Business Partner 4 (Darja Skaza-Brock); HR Commercial & Legal (Susan Devlin); HR Development & Programs (Sabine Wagner); HR Finance, R&D, IT & Business Services (Carolyne Malizia); HR Finance, R&D, IT, Bus Services (Carolyne Malizia); HR Holly Springs (Shandalyn Hope Matson); HR Holly Springs (Shandalyn Matson); HR Liverpool (Emma McCarthy); HR Liverpool (Sheila Redmond [C]); HR Liverpool Business Partners (Kerry Rimmer); HR Marketing (Martin Stump); HR Marketing (Nadine Reh); HR Operations & Quality (Judi Badenoch); HR Operations (Adam Williams); HR Operations (Claudia Petrocchi); HR Operations (Elizabeth Walker (Inherited)); HR Operations (Lindsay Heaton); HR Operations (Mike Drew); HR Operations (Sara Proctor [C]); HR Ops Enabler Tools (Jennifer Sullivan (On Leave)); HR Ops Enabler Tools (Jennifer Sullivan); HR Ops Process Excellence (Anna Tassone); HR Ops Support (Kai Hofmann); HR Parkville (Yvette Saunders); HR Payroll (Christina Avraamides); HR Process and Portfolio Management (Beth Swiezkowski); HR Projects (KT Leong); HR Talent Acquisition & Talent Management (Beth Thomas); HR Talent Acquisition Holly Springs (Blake Derrick); HR Talent Acquisition Maidenhead (Louise Hawkes); HR Talent Acquisition Parkville (Angela Bellenger); HR Total Rewards (Gyuri Endes (Inherited)); HR Total Rewards (Karen Vyse [C]); HRBP Corporate Functions (Mafalda Lou); HS Manufacturing Fill & Finish Ops (Brian Kennedy); HU CSL Plasma 
Kft. Center Debrecen (Halász Roland); HU CSL Plasma Kft. Center Miskolc (Ruzsinszki Ibolya); HU CSL Plasma Kft. Center Nyíregyháza (Raskáné Petruska Gyöngyi); HU Center Employee (Raskáné Petruska Gyöngyi (Inherited)); HU Center Employee (Ruzsinszki Ibolya (Inherited)); HU Center Manager (Ruzsinszki Ibolya (Inherited)); HU Center Physician (Raskáné Petruska Gyöngyi (Inherited)); HU Center Physician (Ruzsinszki Ibolya (Inherited)); HU Quality (Dudás Ákos); HVAC & Coldrooms (Nozar Basseri); HVAC (Anna Fellenberg); HVAC (Juerg Schwarz); HVAC (Simon Hediger); HVAC / Reinräume (Nozar Basseri); HVAC 1 (Urs Turtschi); Haemostaseology (Bernhard Czapla (Inherited)); Haemostaseology (Claudia Bachmann (Inherited)); Haemostaseology (Heinrich Feischen (Inherited)); Haemostaseology (Holger Milkereit (Inherited)); Haemostaseology (Michael Bernd Rode (Inherited)); Haemostaseology (Rainer Adam (Inherited)); Haemostaseology (Ralf Kosmol); Haemostaseology (Susanne Möller (Inherited)); Haemostasis (Anthony Downes); Haemostasis (Elias Francis (Inherited)); Haemostasis (Elias Francis); Haemostasis (George Tsirbas); Haemostasis (Gerry Orval); Haemostasis (John Bashour); Haemostasis (Roy Taylor); Haemostasis (Shane Bartils); Haemostasis (Steven Barello); Hagerstown 174 (Bukola Raji); Hagerstown 174 (Kashaun Muhammad); Hagerstown 174 ACM Area 1 (Antonio DuBois); Hagerstown 174 ACM Area 2 (Daniel Pappariella); Hagerstown 174 QA (Bukola Raji); Hagerstown 174 QA (Jade Jessop); Hagerstown 174 QA (Joanne Charles-Clarke); Hagerstown 174 QA (John E Hunt (Inherited)); Halethorpe 167 (Rebecca R Pettiford); Halethorpe 167 ACM Area 1 (Christine Bethea); Halethorpe 167 ACM Area 2 (Lanasia West); Halethorpe 167 QA (ALISON CONLEY); Halethorpe 167 QA (Robin K Doering); Haltom City 188 (Dante Williams); Haltom City 188 (Melissa J Chapman); Haltom City 188 ACM Area 1 (Marvin Tablante); Haltom City 188 ACM Area 2 (Dante Williams); Haltom City 188 ACM Area 2 (Robert G Wilson); Haltom City 188 QA (Daniel Vu); Haltom City 188 QA (Julie E Reynolds); Hamilton 199 (Jenna Evans); Hamilton 199 (Katanya Hall); Hamilton 199 (Kera Cathel); Hamilton 199 ACM Area 1 (Steve Hosang); Hamilton 199 ACM Area 2 (TaChita Robb); Hamilton 199 QA (Emily Norton); Hamilton 199 QA (Kera Cathel); Hamilton 494 (Derek Morner); Hamilton 494 ACM Area 1 (William Robinson); Hamilton 494 ACM Area 2 (Jessica Hoffman); Hamilton 494 ACM Area 2 (VERONICA HESTER); Hamilton 494 QA (Elizabeth E Galloway); Harlingen 185 (Victor Guevara); Harlingen 185 ACM Area 1 (Eduardo De La Rosa); Harlingen 185 ACM Area 2 (Melinda Garcia); Harlingen 185 QA (Audrey Rodriguez); Harlingen 185 QA (Jennifer Martinez); Harlingen 185 QA (Luis Rodríguez); Harlingen 185 QA (Rosa E Mercado (Inherited)); Hattersheim Field Services (Frank Dauber); Hattersheim Field Services (Robert Rohs); Hazel Crest 116 (Andrea C Rice); Hazel Crest 116 (Morgan R Grose); Hazel Crest 116 ACM Area 1 (Ian Olson); Hazel Crest 116 ACM Area 2 (Bob Gentille); Hazel Crest 116 QA (Amanda E Swider); Hazel Crest 116 QA (Joshua D Williamson (Inherited)); Head, In-Licensed Products AsiaPac, Global Regulatory Affairs (Angela Wong); Head Global Business Development (Eve Williamson); Head Greater China Logistics (Edwin Chia); Head of Asia Pac, Medical Affairs (Jane Leong); Head of Asia 
Pac, Medical Affairs (Jonathan Anderson); Head of Batch Release (Darren Moulton); Head of Batch Release (Sherrin Gribble); Head of Commercial and Marketing (Jamila Filipecki); Head of Medical Affairs UK (Mansoor Ashraf); Head of Medical Affairs UK (Sankarasubramanian Rajaram (Inherited)); Head of Medical Writing & Disclosures (Catherine Tyrrell); Head of Medical Writing & Disclosures (Cathy Tyrrell); Head of Operational Excellence (Dirk Crouse); Head of R&D Finance (Christopher James Thorpe); Head of Region, APAC & In licensed Prod, Glob RA (Lisa MacDonald); Head of Region, Asia Pac & In licensed Prod, Glob Reg Affairs (Lisa MacDonald); Head, Clinical Development Operations (Daniel Kirby); Head, Clinical Development Operations (Veronica Suarez); Head, Technical Development (PKV) & Global Process Innovation (Steven Rockman); Health & Wellbeing (Susanne Marx); Health (Donna G O'Keefe); Health (Sara Regnier); Health and Safety (Gregory Dowler); Healthcare Policy & Ext. Affairs APAC (Shouqing Zhang); Healthcare Policy & Ext. Affairs Europe (Rüdiger Gatermann); Healthcare Policy & Ext. Affairs Japan (Shouqing Zhang (Inherited)); Healthcare Policy & Ext. Affairs N.A. (Patrick Collins); Hem Higashi Nihon Area (Atsuhiko Arikata); Hem Kansai Chubu Area (Shinichi Kano); Hem Nishi Nihon Area (Taisuke Miyakoshi); Hem Nishi Nihon Area (Takeyuki Akiyoshi); Hem Shutoken Area (Takayuki Takigawa); Hematology & Thrombosis Marketing (John Nicastro); Hematology & Thrombosis Medical Affairs (Debbie Drane (Inherited)); Hematology & Thrombosis TA (Sharad Agrawal); Hematology & Thrombosis Therapeutic Area (Brahm Goldstein); Hematology Marketing (John Nicastro); Hematology TA (Antti Kourula); Hematology TA Marketing (Sharad Agrawal); Hematology TA Medical Affairs (Krupa Sivamurthy); Hemophilia A Marketing (Beth Ann Hirst); Hemophilia A Marketing (Brian Johnson); Hemophilia B Marketing (Nicole McInerney); Hemophilia Group (Hideyuki Seto); Hemophilia Group (Makoto Kubo); Hemophilia TA (Takayuki Ishii); Henderson 134 (Eddie H Gaillard); Henderson 134 ACM Area 1 (Maria Coulter); Henderson 134 ACM Area 2 (Eshell Cudjo-Williams); Henderson 134 QA (Bri Johnson); Henrico 264 (Tracia Lopez); Henrico 264 ACM Area 1 (Nancy L Ott); Henrico 264 ACM Area 1 (Tannika Green); Henrico 264 ACM Area 2 (Tannika Green); Henrico 264 ACM Area 2 (Tracia Lopez (Inherited)); Henrico QA 264 (Brandan Lurz); Herstellungsleiter (Andreas Gehrich (Inherited)); Herstellungsleiter (Annette Pernitzsch (Inherited)); Herstellungsleiter (Claudia Habenicht (Inherited)); Herstellungsleiter (Heike Borchert); Herstellungsleiter (Kirsten Scheibel (Inherited)); Herstellungsleiter (Natascha Bock (Inherited)); Herstellungsleiter (Stephani Keltsch); Herstellungsleiter (Sven Schuhmann (Inherited)); Herstellungsleiter Berlin (Dorothee Knop); Herstellungsleiter Braunschweig (Dorothee Knop); Herstellungsleiter Bremen (Dorothee Knop); Herstellungsleiter Frankfurt (Dorothee Knop); Herstellungsleiter Göttingen (Dorothee Knop); Herstellungsleiter Kiel (Dorothee Knop); Herstellungsleiter Nürnberg (Dorothee Knop); Hidalgo 151 (Howard Augusto Castillo); Hidalgo 151 ACM Area 1 (Javier De La Fuente (On Leave)); Hidalgo 151 ACM Area 1 (Javier De La Fuente); Hidalgo 151 ACM Area 2 (Lucio Jaramillo); Hidalgo 151 QA (Becky S Diaz); High Speed 
Packaging (Jürg Stähli); High Speed Packaging Line (Peter Zysset); Highland Park 138 (Miriah Grady); Highland Park 138 (Mondel Hightower); Highland Park 138 ACM Area 1 (Miriah Grady); Highland Park 138 ACM Area 1 (T'Pring John); Highland Park 138 ACM Area 2 (Dee Freeman); Highland Park 138 QA (Jenae Jacobs); Highland Park 138 QA (Shawna Taylor); Highland Park 138 QA (Slater P Murphy); Hillsboro 126 (Elizabeth Manning); Hillsboro 126 ACM Area 1 (Alex Steinke); Hillsboro 126 ACM Area 2 (Dan Jordan (On Leave)); Hillsboro 126 ACM Area 2 (Paige N Zafran); Hillsboro 126 QA (Grant Haun); Hizentra Marketing (Michael Ward); Hobart 218 (Kevin Robinson); Hobart 218 (Sherri L Clark); Hobart 218 ACM Area 1 (Michele Tosseng); Hobart 218 ACM Area 2 (Ashley Myvett); Hobart 218 ACM Area 2 (Kevin Robinson); Hobart 218 QA (Drewleigha B Sarver (Inherited)); Hobart 218 QA (KayLeigh Northcutt); Hokkaido Tohoku Area (Masahiro Takai); Homestead 207 (Mary A Paul (Inherited)); Homestead 207 (Roger Jiron); Homestead 207 (Stacey Ewing); Homestead 207 ACM Area 1 (Monica Alvelay); Homestead 207 ACM Area 2 (Julio Delgado); Homestead 207 ACM Area 2 (Roger Jiron (Inherited)); Homestead 207 QA (Natasha Roopnarine); Homestead 250 (Diane Day); Homestead 250 ACM Area 1 (Ryan Olsavsky); Homestead 250 ACM Area 2 (Jamille Ford); Homestead 250 QA (DENNIS GINTHER); Houston 143 (Josh Concepcion); Houston 143 ACM Area 1 (Sharon K Easiley); Houston 143 ACM Area 2 (Oscar Beasley); Houston 143 QA (Shawntrala Stephens); Houston 168 (Lisa Rojas); Houston 168 ACM Area 1 (Lisa Wilson); Houston 168 ACM Area 2 (Elizabeth Morales); Houston 168 ACM Area 2 (Tascha Montgomery); Houston 168 QA (Sam Schultz (Inherited)); Houston 168 QA (Tara West); Houston 208 (Sara Bouras); Houston 208 ACM Area 1 (Sara Bouras (Inherited)); Houston 208 ACM Area 1 (Sarah L Terry); Houston 208 ACM Area 2 (Marc Garcia); Houston 208 ACM Area 2 (Sarah L Terry); Houston 208 QA (Darriel Clark (On Leave)); Houston 208 QA (Darriel Clark); Houston 208 QA (Elaine R Wilson); Houston 209 (Erin Ostean); Houston 209 (Sheneka E Wilson); Houston 209 ACM Area 1 (Charles Minter (On Leave)); Houston 209 ACM Area 1 (Charles Minter); Houston 209 ACM Area 2 (Adrean N Brown); Houston 209 ACM Area 2 (MARY MEADOWS); Houston 209 QA (Barbara May); Houston 209 QA (Keva M Williams); Houston 274 (Brian T Edwards); Houston 274 ACM Area 1 (Reiko F Hernandez); Houston 274 ACM Area 2 (Tyriana T Shaw); Houston 274 QA (Lawrence Jones); Human Resources – Labor and Employee Relations (Christine Adams (On Leave)); Human Resources & Communications (Sandro Krug); Human Resources & General Affairs, Japan (Akira Nakajima); Human Resources & General Affairs, Japan (Mayumi Gonome); 
Human Resources (Bonnie Shor); Human Resources (Bonnie Slone); Human Resources (Gyuri Endes); Human Resources (Jacqueline Hawkins); Human Resources (Nicole Bookert); Human Resources (Tanja Templer); Human Resources (Tanya Kennedy); Human Resources Kankakee (Jacqueline Hawkins); Human Resources Management (Adam Williams); Human Resources Management (Pia Daish); Human Resources Organisation Transformation (Paula Foord); Human Resources SC (Nicole Bookert); Human Resources Talent Development (Michael O'Connor); Human Resources – Labor and Employee Relations (Christine Adams); Human Resources – Labor and Employee Relations (Jacqueline Hawkins (Inherited)); Human Resources, China (Grace Deng); Human Resources, China (Tracy Lyu); Hygiene (Arno Karnholz (Inherited)); ICSR Compliance and Reconciliation (Samantha Gan); IG / Albumin Bulk (Anthony Manovella); IG / Albumin Bulk (Jill Allen); IG Lab (Ritaben Suhagiya (Inherited)); IG Lab (Ritaben Suhagiya); IG Lab (Tom McCallum); IG Lab (William Fenech); IM (Guido Kagemann); IM Modul (Arnd Vollmerhausen (Inherited)); IM Modul (Torsten Jeide); IMED / Clinical (Julia Daum); IP Management (Helen Mutimer); IP Management (Peter Gomme); IP Management (Philip Keep); IR Higashi Nihon Area (Takahiro Tsuruta); IR Kansai Chubu Area (Yutaka Fujita); IR Nishi Nihon Area (Takayuki Sakai); IR Shutoken Area (Hiroki Nagayasu); IS Applications (Markus Führer); IS Operations (BAHA ATICI); IS Operations (Bernd Boucsein (Inherited)); IS Operations (Robert Rohs); IT (Martin Jones); IT Americas (Stephen Norman Bender); IT Americas (Steve Bender); IT Americas, Laboratories (Dave Kirk); IT Americas, Site IT (DEBORAH BUREC); IT Americas, Site IT (Deborah Burec); IT Applications (Pavan Dronamraju); IT Applications - SharePoint (Emma Tibble); IT Asia Pacific (Gavin Gusling); IT Automation BMW (Daud Warraich (Inherited)); IT Automation BMW (John Croxton (Inherited)); IT Automation BMW (John Croxton); IT Automation BMW (Reto Von Gunten); IT Automation BMW (Stephen Pickering); IT Business Applications (Paul Ashton); IT Communications Services (Nealesh Mistry); IT Compliance - EMEA (David Boyd); IT Compliance - Infrastructure (Neil Broster); IT EMEA (Martin Gallington); IT EMEA Infrastructure (Chris Gatley [C]); IT Infrastructure - Hosting (Sadha Venkatachellam); IT Infrastructure - Hosting (Sathasivan Venkatachellam); IT Infrastructure Service - Identity & Desktop (Rob Deacon); IT Infrastructure Services (Quentin Zhao); IT Security & Compliance (Bob DeMarco); IT Security & Compliance (Robert DeMarco); IT Security and Compliance (Alan Butterfield); IT Security and Compliance (Alan Matthew Butterfield); IT Service Management Office (Richard Williams); IT Services (Daniel Robinson); IT Vendor Contracts (Craig Skelton); IVV Bacteriology (Matthew Stellato); IVV Chemistry (Thuy Dang); IVV Environmental Monitoring (Andrea Chalker); IVV Potency + Biochemistry US (Corina Zahra); IVV Potency, Biochem Rest of World (Anna Gruszka); IVV Seed Development (Brad Dickson); Identity and Access Management Operations (Bill Keohane); Ig Marketing (Sara Cowan); Ig&API Franchise Marketing (Amélie De Rosnay) (52388632); Ig&API Franchise Marketing (Emmanuelle Massonie) (52388632); Ig&API Sales Force Florent Privat (Emeline Bedu) (52388634); Ig&API Sales Force Florent Privat (Florent Privat) (52388634); IgG & Albumin/Supply Chain PMR Main Site (Barbara Kalina (Inherited)); IgG & Albumin/Supply 
Chain PMR Main Site (Wilfried Freudenberg); IgLAB (Franz Petter); IgLAB Bulk formulation (Susanne Gilgen); IgLAB Bulk purification (Thomas Eckert); IgLAB MV&VI Bulk Formulation (Sandra Kaempfer); IgLAB MV&VI Bulk Purification (Mathias Schinegger); IgLAB MV&VI Subfractionation (Markus Hauert); IgLAB Subfractionation (Mark Deutschland); IgLAB Subfractionation (Markus Hauert); IgLAB Subfractionation (Susanne Gilgen); IgLAB Subfractionation (Thomas Daehler); IgLAB MV&VI (Marius Liesch); IgPRO (Markus Weber); Immunoglobulin Asset (Fritz Rentsch); Immunohematology Lab (Maria R Fernandez (Inherited)); Immunohematology Lab (Peter A Fitzgerald (Inherited)); Immunology & Neurology CommDev Marketing (Michael Ward); Immunology & Neurology Medical Affairs (Andrew Koenig); Immunology & Neurology New Products (Regula Styger Baumann); Immunology & Neurology RDPM (Karen Lindquist); Immunology & Neurology RDPM I (Sabine Alexandra Stoffel Domig); Immunology & Neurology TA (Jay Bowsher); Immunology & Rare Disease Group (Hirokazu Imura); Immunology & Rare Disease Group (Shinichiro Magome); Immunology & Rare Disease TA (Takuya Ohshima); Immunology (Bernhard Czapla (Inherited)); Immunology (Claudia Bachmann (Inherited)); Immunology (Heinrich Feischen (Inherited)); Immunology (Helen Hartman); Immunology (Holger Milkereit (Inherited)); Immunology (IMM) (Stefan Spycher); Immunology (Michael Bernd Rode (Inherited)); Immunology (Rachpal Malhotra); Immunology (Rainer Adam (Inherited)); Immunology (Ralf Kosmol); Immunology (Susanne Möller (Inherited)); Immunology - Transplant (Mircea Ciuca); Immunology / Transplant (Mircea Ciuca); Immunology Lab (Maria R Fernandez (Inherited)); Immunology Lab (Peter A Fitzgerald (Inherited)); Immunology Marketing (Bernadine Koziara (Inherited)); Immunology Marketing (Biju Chorinchath); Immunology Marketing (JD Kohutka); Immunology New Products (Regula Styger Baumann); Immunology TA (Jay Bowsher); Immunology TA Marketing (Michael Ward); Immunology TA Medical Affairs I (Andrew Koenig); Immunology and Neurology TA (Sharon Popik); Immunology and Neurology TA (Susanne Wang); Immunology-North America (Ian Gourley); Import / Export Lead AU (Robert Bronikowski); Import Export Compliance (MICHAEL MORRILL); Import Export Compliance (Markus Buri); Import Export Compliance (Michael Morrill); Import-Export Compliance (Nona Clarke); Import/Export Compliance (Neda Nikolic); Import/Export Compliance (Nona Clarke); Impurity & Data Management (Patricia Lieby); Impurity Data Mngt I (Madlene von Känel); Impurity Data Mngt I (Simona Pfister); Impurity and Data Mngt II (Tanja Angela Nyffenegger); In Market Logistics EMEA (Avi Yuhay); In-Market Logistics Turkey/EEU (Avi Yuhay); Incoming Quality Assurance (Jamie Nichols); Incoming Quality Assurance (Lynette Mirrielees); Incoming Quality Assurance GL (Cindy Rocknowski (Inherited)); Incoming Quality Assurance GL (Jeffrey Zoubek (Inherited)); Indianapolis 146 (Brian W Stewart); Indianapolis 146 (Randy Miller); Indianapolis 146 QA (Erik Tharp); Indianapolis 146 QA (Randy Miller); Indianapolis 181 (Jami Colson); Indianapolis 181 ACM Area 1 (Dayneisha G Pinkston); Indianapolis 181 ACM Area 1 (Jordan Swoape); Indianapolis 181 ACM Area 2 (Ronnisha Banks); Indianapolis 181 QA (Aja Blue); Indianapolis 181 QA (Drewleigha B Sarver); Indianapolis 181 QA (Robin L Oldaker); Indianapolis 412 (LaToya M Hinkle); Indianapolis 412 ACM Area 1 (Brian Stewart (On Leave)); Indianapolis 412 ACM Area 1 (Brian Stewart); Indianapolis 412 ACM Area 
2 (Latoria J Moore); Indianapolis 412 QA (Ashley Kemper); Indirect Procurement (Daniela Ebert); Industriekaufleute (Carmen Walldorf (Inherited)); Industriekaufleute (Doris Nake (Inherited)); Indy 146 ACM Area 1 (Sara K Campbell); Indy 146 ACM Area 1 (Sara K Sheets); Indy 146 ACM Area 2 (Joe Hicks Jr); Influenza (Chris Clarke); Influenza Operations (Bill Cracknell); Influenza Vaccines (Carole Verhoeven); Influenza Vaccines Seasonal (Athanasia Papadimitriou); Influenza Vaccines Seasonal (Jonathan Edelman (Inherited)); Influenza and National Products, Global RA (Lisa Steinberg); Information Security (Federico Iaschi); Infrastructure Design (Jeremy Finlay); Infrastructure Excellence & Process Management (Stephan Krummel); Infrastructure Program Manager (Jessica Bartels); Infrastructure Program Mgr (Jessica Bartels); Infusion Science - ISS (Lisa Barrett); Inhibitors, FI, FXIII & Support/Supply C (Barbara Kalina (Inherited)); Inhibitors, FI, FXIII & Support/Supply C (Wilfried Happel); Innovation (Becky Heatherman); Inoculation (Jubail Dimabuyu); Inspection & Packaging (Jonathan Kanczes); Inspection & Packing (Ben Hagger); Inspection & Packing (David Nguyen); Inspection & Packing (Joanna Madafferi (Inherited)); Inspection & Packing (Joanna Madafferi); Inspection (Pasquale Carestia (Inherited)); Inspection (Thomas Royal); Inspection (Union) (Pasquale Carestia (Inherited)); Inspection (Union) (Thomas Royal (Inherited)); Inspection semi final prod. 4 (Samira Spahn-Belbaita); Instrum & Elect Engineer (Justin Lim); Instrumentation (Colin Steele); Integrated Business Planning (Avi Goré); Integrated Business Planning (Avinash Goré); Integrated Business Planning (Jamie Pritzl); Intercontinental Supply Chain (Oliver Wehner); Internal Communications (Claudine Heinz); Internal Communications (Jasmin Joller); Internal Communications (Laura Kumpe); Internal Services (Reto Moser); Internal processes (Ernst Scheurer); International Logistics - Intermediates, Special Shipments (Julia Daum); International Logistics - Team Americas - APAC (Anna-Karina Muth); International Logistics - Team Americas / APAC (Anna-Karina Muth); International Logistics - Team EMEA (Christoph Mueller); International Payroll (Clair Burke); International Plasma Operations (Jeffrey A Schulz); Interns (Jacqueline Hawkins (Inherited)); Investigation & Process Owners (Ryan Cox); Investor Relations (Mark Dehring); Invoice Control & Invoicing of Services (Harald Bieker (On Leave), Beatrix Gnau); Invoice Control & Invoicing of Services (Harald Bieker); Irondequoit 246 (Sheilah Mykins); Irondequoit 246 ACM Area 1 (Nicole Chipembere); Irondequoit 246 ACM Area 2 (Teresa Moreira-Weil); Irondequoit 246 QA (Meghan Beckedorf); Italian Commercial Finance (Laura Lucaroni); JPN TA (Coagulation) (Yuki Hidaka); JPN TA (Critical Care) (Osamu Tsukamoto); JPN TA (Immunology) (Satoshi Koike 
(Inherited)); JPN TA (Immunology) (Tomomi Shibata); JPN TA (Osamu Tsukamoto); Jackson 156 (Chris Weary); Jackson 156 (Jose L Dela Garza (Inherited)); Jackson 156 ACM Area 1 (Chris Weary); Jackson 156 ACM Area 1 (Joseph Dupree); Jackson 156 ACM Area 2 (Adrian Johnson); Jackson 156 QA (Bonnie M Talbott (Inherited)); Jackson 156 QA (Cynthia Hill); Jackson 156 QA (Jose L Dela Garza (Inherited)); Jackson 156 QA (Savannah Vann); Jackson 205 (Mark Bundy); Jackson 205 ACM Area 1 (Erica R Smith); Jackson 205 ACM Area 2 (Kenny Berry); Jackson 205 QA (Marc D Fisher); Jackson 205 QA (Nicole Pichla (On Leave)); Jackson 205 QA (Nicole Pichla); Jackson 225 (Bonnie M Talbott (Inherited)); Jackson 225 (Cherita Saulmarshall); Jackson 225 (Jai Baylis); Jackson 225 (Kronnetra Hester); Jackson 225 ACM Area 1 (Mariyo Archie); Jackson 225 ACM Area 2 (Jose L Dela Garza); Jackson 225 ACM Area 2 (Stanley Taylor); Jackson 225 QA (Deborah L Baker); Jackson 225 QA (Keyauna Lewis); Jackson 257 (Sarah E Silva); Jackson 257 ACM Area 1 (Caitie Golubski); Jackson 257 ACM Area 2 (Jarrett Heathcock); Jackson 257 ACM Area 2 (Sarah E Silva (Inherited)); Jackson 257 QA (Brooke McKinney); Jacksonville 251 (Sherri L Clark); Jacksonville 251 ACM Area 1 (Gina Castellano); Jacksonville 251 ACM Area 2 (AlexZandria Taylor); Jacksonville 251 QA (Brett A Wintheiser (Inherited)); Jacksonville 251 QA (Cindy Vieira); Japan Clinical Safety & Pharmacovigilance (Mariko Hase); Japan Field Services (Satoru Shimizu); Japan Project Management (Midori Kobayashi); Johnston 242 (Catherine Colucci); Johnston 242 (John L Thixton); Johnston 242 (Renee Keyser); Johnston 242 ACM Area 1 (Son Nguyen); Johnston 242 ACM Area 2 (Cessa Piedra); Johnston 242 QA (Allante S Williams); Johnston 242 QA (Erin Thompson); Joliet 219 (Andrew Franzen); Joliet 219 (Christopher J Rivers Jr); Joliet 219 ACM Area 1 (Sharon Kunz); Joliet 219 ACM Area 2 (Duanita Scott); Joliet 219 QA (Beth Majewski); Joliet 219 QA (Lori Carlson (Inherited)); Joliet 219 QA (Ryan Welter); Jonesboro 120 (Maurice E Clements); Jonesboro 120 ACM Area 1 (Jumela S Bell); Jonesboro 120 ACM Area 1 (Sade Hodges); Jonesboro 120 ACM Area 2 (Denise Bloodsaw); Jonesboro 120 ACM Area 2 (Jumela S Bell); Jonesboro 120 QA (Laila Matthews-El); Jonesboro 120 QA (Rose-Marie O Bland); K-C Fractionation (Union) (Jason Vaughn); K-C Fractionation (Union) (Samuel Jackson); KAN Security (Adam Kennell); KC Module 3 Operational Readiness (Cornelis Rijneveld); KOP Corporate Services (Michael Hays (Inherited)); KOP Corporate Services (Wendy Kilp); KOP Facilities (Michael Hays); KOP Outsourcing (Melissa Hurst); KOP Security (Shanna Aldridge); KOP Sourcing (Ed Rosario); KOP Sourcing (Paul Addis (Inherited)); Kankakee Field Services (Rebecca Liehr); Kankakee Manufacturing (Ernest Shepard); Kankakee R&D Tech Transfer (Shannon Boudreau); Kansai Area (Shingo Fujiwara); Kansai Area (Tatsuto Aihara); Kansas City 011 (Cristina E Ceniceros); Kansas City 011 (Tina Wagenknecht); Kansas City 011 ACM Area 1 (Dustin Irish); Kansas City 011 ACM Area 2 (Cristina E Ceniceros); Kansas City 011 ACM Area 2 (Samuel Jordan); Kansas City 011 QA (Cole D Kimple (Inherited)); Kansas City 011 QA (Samuel Anderson); Kansas City 011 QA (Whitney A Dean); Kansas City 410 (Cristina E Ceniceros); Kansas City 410 (Tina Wagenknecht); Kansas City 410 (Trethan R Copeland); Kansas City 410 ACM Area 1 (Jackie Florez); Kansas City 410 ACM Area 2 (Trethan R Copeland); Kansas City 410 QA (Kimberly S Mangold); Kansas City 410 QA (Whitney A Dean); Kaufmann 
für Bürokommunikation (Doris Nake (Inherited)); Kcentra Marketing (John Nicastro); Kcentra Marketing (Tom Groeling (Inherited)); Kcentra Marketing Group (Shunsuke Kuwata); Kcentra Marketing I (Sara Cowan); Kenner 149 (Chris Weary); Kenner 149 (Durrell Arceneaux); Kenner 149 (Michael Markey); Kenner 149 ACM Area 1 (Brittany Miles); Kenner 149 ACM Area 2 (Teresa Currence); Kenner 149 QA (Centrell J Jackson); Kent 112 (David M Wilson); Kent 112 (Diana H Ek); Kent 112 ACM Area 1 (Diana H Ek (Inherited)); Kent 112 ACM Area 1 (Trevor Case); Kent 112 ACM Area 2 (Wesley Noble); Kent 112 QA (Brian Patterson); Kent 112 QA (Robert D Coulter); Kent 112 QA (Sasima Teadwatanasuk); Kent 160 (Michael J Ryan); Kent 160 ACM Area 1 (Brandy M Cermak); Kent 160 ACM Area 2 (Bambi C Gonwa); Kent 160 QA (Jamie L Dunderman); Kent 160 QA (Jamie L Matheney); Key Account Management (Alexander Kahlau); Key Account Management (Shun Huang); King of Prussia Field Services (Cheryl Fennell); King of Prussia Field Services (Joy Holland); King of Prussia Field Services (Mary Jane McPherson (Inherited)); King of Prussia Quality (Brian Puglisi); Kitakanto Shinetsu Area (Hideo Yonesaka); Knowledge Management (Jacqui Altman); Knowledge Management (Kim Vandenberg); Knowledge Management (Leanne Cummings); Knoxville 405 (Brianna E Ballew); Knoxville 405 (John W Kelly); Knoxville 405 (Keith Clemons (Inherited)); Knoxville 405 ACM Area 1 (Michael R Thomas); Knoxville 405 ACM Area 2 (Leighann N Miller); Knoxville 405 QA (Tina G Ledbetter); Knoxville 405 QA (Tina Grubb Ledbetter); Kommunikation (Stephanie Fuchs); Konzessionen/Brandschutzbeauftragter (Michael Knoll (On Leave)); Konzessionen/Brandschutzbeauftragter (Michael Knoll); Koordination und Dokumentation (Rainer Frank (Inherited)); Kyushu Okinawa Area (Akihiro Enomoto); L&D, Apprentices KV (Ruth Schmid); LATAM RA (Andrea Violante); LATAM Sales Ops CAM & CAR (Mariano Miri); LVP Sterility Assurance (Sara Kimmins); La Crosse 516 (Ranee Bloor); La Crosse 516 QA (Sara Martin); Lab Automation (Ann L Wickenheiser); Lab Facilities (Joel Jones); Lab Inventory (Joel Jones (Inherited)); Lab Operations (Diep Chau); Lab Operations, Bio21 (Kirsten Edwards); Labor Relations (Steven Stewart); Laboratory Management - Pasadena (Anthony Navarro); Laboratory Operations (Constance W Farrar); Laboratory Operations (Marleen Enthoven); Laboratory Operations (Ricky R Alexander); Laboratory Systems (Amit Krishna); Lackawanna 238 (Martin Szczublewski); Lackawanna 238 ACM Area 1 (Brent Hollingsworth); Lackawanna 238 ACM Area 2 (Allie Tuttle); Lackawanna 238 QA (Anita Brenon); Lackland 706 (Ovetta A Mickles); Lackland 706 ACM Area 1 (Gabriel J Martinez); Lackland 706 ACM Area 2 (Ariel Schiller); Lackland 706 ACM Area 3 (Nate Neal II); Lackland 706 QA (Amber Sanders); Lackland 706 QA (Brenton Ferguson); Lager/Ersatzteilmanagement (Leon Krupa); Lakeland 154 (Elizabeth Adkins); Lakeland 154 ACM Area 1 (Jeffrey Simmons); Lakeland 154 ACM Area 2 (Bralyn T McCullough); Lakeland 154 QA (Crystal L Reichard); Lakeland 154 QA (Matthew Smith (Inherited)); Lakeland 154 QA (William Forquignon); Lansing 042 (Debbie L Duhe); Lansing 042 ACM Area 1 (Elizabeth Lawhon); Lansing 042 ACM Area 2 (Ruth A Griffin); Lansing 042 QA (Christine M Leija); Lansing 118 (Angie K Fedewa); Lansing 118 ACM Area 1 (Douglas Fiedler); Lansing 118 ACM Area 2 (Toussaint Hodari); Lansing 118 QA (Jessica Babcock); Las Cruces 506 
(Samuel V Grijalva); Las Cruces 506 ACM Area 1 (Jacquelyn Jaques); Las Cruces 506 ACM Area 2 (Ira Bacani); Las Cruces 506 QA (Linda Dutchover); Las Vegas 081 (Jolena Lee); Las Vegas 081 (Michele Purvines-Honzo); Las Vegas 081 ACM Area 1 (Austin Vinson); Las Vegas 081 ACM Area 2 (Kevin Wallace); Las Vegas 081 ACM Area 3 (Christian Marcus); Las Vegas 081 QA (Erica Wiley); Las Vegas 081 QA (Paul Warden (Inherited)); Las Vegas 081 QA (Yaritza Monarrez); Las Vegas 172 (TIM AVILA); Las Vegas 172 (Xang Vang); Las Vegas 172 ACM Area 1 (Lashay Anter); Las Vegas 172 ACM Area 1 (Sarah C Sweat); Las Vegas 172 ACM Area 2 (Jessica L Jabbora); Las Vegas 172 QA (ANGELICA WILSON); Las Vegas 172 QA (Aaron D Learn); Las Vegas 216 (Erica Wiley); Las Vegas 216 (Nicole M Loncon); Las Vegas 216 ACM Area 1 (Erica Wiley); Las Vegas 216 ACM Area 1 (Michael Dako); Las Vegas 216 ACM Area 2 (Erica Wiley); Las Vegas 216 ACM Area 2 (Jose D Garcia); Las Vegas 216 QA (Orlando R Edwards Jr); Las Vegas 501 (Cari N Howard); Las Vegas 501 ACM Area 1 (Lissa Elswick); Las Vegas 501 ACM Area 2 (Steven G Simpson); Las Vegas 501 QA (Miranda Banks); LatAm Supply Chain (Martin Rossini); Late DSP Development (Erik Hinze); Late DSP Development (Tobias Brandt); Late Stage DSP Development (Erik Hinze); Late Stage DSP Development (LDD) (Uwe Liebing); Late Stage DSP Development (Tobias Brandt); Late Stage DSP Development (Uwe Liebing (Inherited)); Late USP Development (Jasmine Roth); Latin America (Juan Feliu); Latin American Distributors (Jean-Claude André); Lawrence 012 (Amy L Jackson); Lawrence 012 (Cole D Kimple (Inherited)); Lawrence 012 (Jessey Johnson); Lawrence 012 ACM Area 1 (Jessey Johnson (Inherited)); Lawrence 012 ACM Area 1 (Laura Hassen); Lawrence 012 ACM Area 2 (Taniesha D Kopriva); Lawrence 012 QA (Adam Loop); Lawrence 012 QA (Jessey Johnson (On Leave)); Lawrence 012 QA (Jessey Johnson); Lawrenceville 186 (Domonique T Walker); Lawrenceville 186 ACM Area 1 (Jeffrey Toussaint); Lawrenceville 186 ACM Area 2 (Ahesha M Francis); Lawrenceville 186 QA (Brandon Bailey); Lawton 452 (Natalie Compher); Lawton 452 (Vicky Sablan (On Leave)); Lawton 452 (Vicky Sablan); Lawton 452 ACM Area 1 (Jace A Guthrie); Lawton 452 ACM Area 2 (Samuel Jones); Lawton 452 QA (Adam Loop); Lawton 452 QA (Tiffany N Oxley); Layout & Packaging Planning (Martina Schweyer); Lead BP Finance - Asia Pac (Brendan Safe); Lead Clinical Oversight Manager (Anja Bräunlich (Inherited)); Learning and Development (Amy Jackson); Learning and Development (Ann Lescher); Learning and Development (Henry Docx); Learning and Development (Karen A Emord); Learning and Development I (Henry Docx); Legal (Christine Dragann); Legal (Melissa Merriweather); Legal - Americas (John Neff); Legal - Australia (Amy Demediuk); Legal - Australia (Fiona Mead); Legal - Australia (Patrick Brady); Legal - Australia (Phyllis Perkins); Legal - Australia (Raewynn McIntyre); Legal - Australia (Tom Reid); Legal 1 (Khalil Rogers); Legal Clinical (Brian Sharma); Legal Counsel, Commercial, North America (Michael O'Connor); Legal Department APAC (Mae Chen); Legal Department Bern (Niklaus Kraehenbuehl); Legal Department Bern (Philippe Mueller); Legal Department Marburg (Dennis Kraft); Legal Operations Europe, Asia, Intercon. 
(Gereon Backmann); Legal Partners (Antje Michel); Legal Services (Sam Benyamin); Legal Services Europe & ICO (Charlotte Tvede Andersen); Legal Support One Commercial Operations Europe (Gereon Backmann (Inherited)); Legal Support One Commercial Operations Europe (Gereon Franz-Josef Backmann (Inherited)); Legal Support One Commercial Operations Europe (Gereon Franz-Josef Backmann); Legal ZLB Plasma (located in KOP) (Eric Silberstein); Lengnau Administration & Office Management (Boris Lanoir (Inherited)); Lengnau Administration & Office Management (Natasha Jackson); Lengnau Amenities Support (Franz Renfer); Lengnau Business Operations Services (Guenther Baumgartner); Lengnau Execution Systems (Frank Mastellone); Lengnau Facility Project (Darren Vegara); Lengnau Facility Project (Paul Loxley); Lengnau Human Resources (Ece Ergin [C]); Lengnau Human Resources (Sandro Krug (Inherited)); Lengnau Program (Nina Walser); Lengnau Program (Urs Meyer); Lengnau Project Documentation (Anamaria Negura); Lengnau Project Documentation (Mairead Henry [C]); Lengnau Project Documentation (Thorsten Buergel [C]); Lengnau SC and IBP (Marco Restelli); Lernende Logistik (Silvio Beck); Lexington 053 (Bobby R Fields Jr); Lexington 053 (Morgan R Grose); Lexington 053 ACM Area 1 (Jamale R Gentry); Lexington 053 ACM Area 2 (A.J. Stevenson); Lexington 053 QA (Michele R Estepp); Lexington 404 (Chris Otto); Lexington 404 ACM Area 1 (Ben Jones); Lexington 404 ACM Area 2 (Nathan J Fox); Lexington 404 QA (April Tyler); Lexington 404 QA (Bailee E White); Lichuan Plasma Collection Center (Jun Lai); Lifecycle Management (Kathrin Eichstädt); Lincoln Park 101 (Miriah Grady); Lincoln Park 101 (Toni M Walden); Lincoln Park 101 ACM Area 1 (Jeanette M Love-Ellison); Lincoln Park 101 ACM Area 2 (Dion J Holland); Lincoln Park 101 QA (Jenae Beacham); Lincoln Park 101 QA (Latosha Y Floyd (Inherited)); Lincoln Park 101 QA (Remie T Ray); Linden 212 (Jennifer Luque); Linden 212 ACM Area 1 (Jennifer Luque (Inherited)); Linden 212 ACM Area 1 (Matthew Clayborn); Linden 212 ACM Area 2 (Paul Eatman); Linden 212 QA (Jaleeka Johnson); Linden 212 QA (Stephanie D Shah (Inherited)); Linden 212 QA (Wendy MacConnell); Little Rock 234 (Seth Stuerke); Little Rock 234 ACM Area 1 (Charlie Hollinquest-Ford); Little Rock 234 ACM Area 2 (Ben Kulpa); Little Rock 234 QA (Akira Crenshaw); Logistics Manager VIC 266 (John Turone (Inherited)); Logistics (Angela Schembri); Logistics (Brendan Xerri); Logistics (Carl Werner (Inherited)); Logistics (Christopher Pela Fuaiva'a); Logistics (Dalal Mikhaeel); Logistics (Ibs Kaygisiz); Logistics (Ljubinka Duzel); Logistics (Peter Trimcevski); Logistics (Sam Mekhael (Inherited)); Logistics (Sam Mekhael); Logistics (Sebastian Sarmiento); Logistics (Tracy McIntosh); Logistics - Purchasing (Benjamin Fruin); Logistics - Purchasing (CHERYL GOODWIN); Logistics - Purchasing (Sue Savage); Logistics APAC (Edwin Chia); Logistics Customer Group (Namiko Hirakawa); Logistics I (Harald Müller (Inherited)); Logistics Operations (Kai Menz); Logistics Operations (Koji Sugihara); 
Logistics Operations - LATAM (Bruno Arakaki); Logistics Operations Australia (Suzanne Johnson); Logistics Operations Customer Service (Kaye McConnell); Logistics Operations Customer Service (Tanja Wells); Logistics Operations Europe (Matthias Loth); Logistics Operations Manager (John Turone); Logistics Operations - Americas (Daniel Sweed); Logistics Operations - Americas (Marianne McDonald (Inherited)); Logistics Planning Group (Takayuki Kato); Longwood 195 (Annette Nelson); Longwood 195 (Brian D Kelly); Longwood 195 ACM Area 1 (Jenna Smith); Longwood 195 ACM Area 1 (Vincent Spencer); Longwood 195 ACM Area 2 (Jessica Greene (On Leave)); Longwood 195 ACM Area 2 (Lori B Warfield); Longwood 195 QA (Brian Murzycki); Longwood 195 QA (Christopher Davis); Longwood 195 QA (John Garrett); Look Back / PDI (Julia Schimansky); Louisville 054 (Susan D Bensing); Louisville 054 ACM Area 1 (Tish Farris); Louisville 054 ACM Area 2 (Heather Romines); Louisville 054 QA (Gary Loy II); Louisville 054 QA (Keith Clemons (Inherited)); Louisville 054 QA (Melissa Casaus); Louisville 054 QA (Melissa J Roberts); Luotian Clinical Inspection (Yongmin Lv); Luotian Inspection Management (Zengyi Chen); Luotian Inspection Professional (Jiwei Liu); Luotian Inspection Professional (Zheng Liang); Luotian Office Administration (Xiaoping Tang); Luotian Office Administration (Zhen Zhang); Luotian Plasma Center Quality (Lixia He); Luotian Plasma Collect (Jiali Fan); Luotian Plasma Collection (Meng Zhou); Luotian Plasma Collection (Menghua Ye (Inherited)); Luotian Plasma Collection (Shuiqiao Xiao); Luotian Plasma Collection Center (Cunwei Hou); Luotian Plasma Collection Center (Xiaoping Tang); Luotian Plasma Sourcing (Xiaoling Wang); Luotian Plasma Sourcing Management (Caihong Cheng); Luotian Quality Management (Menghua Ye); Luotian Quality Management (Zheng Liang); Luotian plasma source management (Yongmin Lv); Lyophilization (Jean-Claude Cauderay); M99 (Guido Möller); M99 (Marius Liesch); M99 NVI (Michael Theilkaes); M99 VVI (Marcel Mosimann); MDM Operations (Chandra Karpuram); MES & Systems (Reto Von Gunten); MES Automation (Gary Steele); MES Koordination (Horst Boeder (Inherited)); MES Koordination (Ralf Dersch); MF-59 (Gerhard Seemann (Inherited)); MFG Berinert & Beriplex Production (Jonathan Signore); MFG Berinert & Beriplex Production (Union) (Jonathan Signore); MS&T Lead (Kevin Murphy); MS&T Liverpool (Lisa-Marie Foulkes); MSAT (Matthias Kaeser); MSL Manager (Claire Morgan); MST Labor 1 (Anne Nöll); Macon 233 (Keyonna L Gray); Macon 233 (Lori B Warfield (On Leave)); Macon 233 (Melodee C Ebel (Inherited)); Macon 233 (Sherri L Clark); Macon 233 ACM Area 1 (Jennie Miles); Macon 233 ACM Area 1 (Lori B Warfield (On Leave) (Inherited)); Macon 233 ACM Area 2 (Gina Castellano); Macon 233 ACM Area 2 (Tomecia Tillman); Macon 233 QA (Teddye Gandy (On Leave)); Macon 233 QA (Teddye Gandy); Madison 076 (Tiffany K Singh); Madison 076 ACM Area 1 (Shelby N Grimsley); Madison 076 ACM Area 2 (Jada Phillips); Madison 076 QA (Alissa Elke); Madison 076 QA (Iricka Williams); Madison 076 QA (Prim J Cunningham 
(Inherited)); Main (Elizabeth Boyd); Maintenance & Reliability (Michael Elmer); Maintenance & Utilities (Franz Arnold Nigsch); Maintenance (Jeffrey Rhodes); Maintenance (Michael J Stephenson); Maintenance (Michael Memenga); Maintenance (Union) (Jose Franceschini Mirabal (Inherited)); Maintenance (Union) (Michael Memenga (Inherited)); Maintenance Engineering (Vittorio D'Argento); Maintenance K3 (Jose Franceschini Mirabal (Inherited)); Maintenance K3 I (Michael J Stephenson); Maintenance Officer (Jesse Chen); Maintenance Officer (Ray Belli); Maintenance Operations (Vittorio D'Argento); Maintenance SC I (Jeffrey Rhodes); Maintenance Support Engineer (James Stevens); Maintenance U8 (Simon Urfer); Maintenance U8 (Stefan Bögli); Maintenance and Utilities (Jose Franceschini Mirabal); Major Capital Projects (Brian Price); Management Accounting (Daya Salter); Management Accounting (RYAN HANSEN); Manager, IVV Seed Development (Karen Laurie); Manager - QA Batch Release (Linda Curran); Manager - QA Batch Release (Sherrin Gribble); Manager DGL (Heike Gocht); Manager ICA (Tim Karla); Manager IT Applications BPCS (Rod Randall); Manager Performance Qualification (Aaron Haag); Manager QA Batch Release (Anthony Day); Manager QA Batch Release (Carol Youssef); Manager QA Batch Release (Olivia Fisher); Manager QA Capability (Mark Machowicz); Manager QA Capability (Nicole Schaefer); Manager QA Capability (Vicky Gakias); Manager, DS Processing (Jesse Bodle); Manager, Field Services Australia (Bec Heitbaum); Manager, Field Services Australia (Travis Slessar); Manager, QA Cont Imp & Iss Mgt (Christopher Burke); Manager, QA Cont Imp & Iss Mgt (Janet Drew); Manager, QA Cont Imp & Iss Mgt (Jeremiah Holden); Manager, Virol & Immuno Res (Erin Verity); Manf Dir - Influenza Vaccines (Paul Morrison); Manf Dir - Influenza Vaccines (Vincent Chung); Manhattan 019 (Stacy J Teske); Manhattan 019 ACM Area 1 (Shane A Groover); Manhattan 019 ACM Area 2 (Dave Lynn); Manhattan 019 ACM Area 2 (Stacy J Teske (Inherited)); Manhattan 019 QA (Karen L Phillips); Manufacturing (Barbara Beugger); Manufacturing (Boris Lanoir); Manufacturing (Bradley J Eberhart); Manufacturing (James Janachowski); Manufacturing (Jose Gonzalez (Inherited)); Manufacturing (Katie Wood); Manufacturing (Martin Schaeren (Inherited)); Manufacturing (Matthew Seay); Manufacturing (Patricia Stewart (Inherited)); Manufacturing (Rene Bruegger); Manufacturing - Fill/Finish (Vincent Chung); Manufacturing A1 (Danica Bates); Manufacturing B1 (Trinette Farr); Manufacturing B2 (Michael Haney); Manufacturing Continuous Improvement (Trinette Farr); Manufacturing EU & APAC (Pierre Caloz); Manufacturing Engineering (Aaron Imig); Manufacturing Execution Systems (Frank Behnisch); Manufacturing Finance (Jacob Weaver); Manufacturing Finance (Jason Mugridge); Manufacturing First Shift (Tish Smith); Manufacturing HS (Chad M Salisbury); Manufacturing HS (Dave Sehgal); Manufacturing HS (Karen Netherton); Manufacturing Kankakee I (Jose Gonzalez (Inherited)); Manufacturing LVP (Jonah Smith); Manufacturing LVP (Nige Hilton); Manufacturing Liverpool (Jonah Smith); Manufacturing Operations (Steven Aldrich); Manufacturing PKV (Chris Larkins (Inherited)); Manufacturing PKV (Jonah Smith); Manufacturing Quality Management (Ramzan Tabasum); Manufacturing SC I (Matthew Seay); Manufacturing Science & Technology (Klaus Schmitt); Manufacturing Science & Technology (Klaus-Jürgen Schlitt); Manufacturing Sciences 
and Technologies (Heidi Bergman); Manufacturing Second Shift (Michael Haney); Manufacturing Supply Chain & Integrated Business Planning (Pat Golla); Manufacturing Support (Clare Hughes); Manufacturing Support (Dee Hamer); Manufacturing Support (Marco Restelli); Manufacturing Support (Vreni Förtsch); Manufacturing Technical Operations Team (Yuan Su); Manufacturing Technology & Science (Christoph Haußmann); Manufacturing Third Shift (Michael Haney); Maple Shade 215 (Brett Goldman); Maple Shade 215 (Darryl King); Maple Shade 215 ACM Area 1 (Tracey Pinkney); Maple Shade 215 ACM Area 2 (Erica Hoesly); Maple Shade 215 QA (Deb Stith); Maple Shade 215 QA (Kimberly Perry); Marburg Data Management (Babette Katharina von Hagen); Marburg Field Services (Alexander Berendes); Margate 142 (Christina M Kokoszka); Margate 142 (Michelle S DeCambre); Margate 142 (Takisha F Jackson); Margate 142 ACM Area 1 (Amanda Bybee); Margate 142 ACM Area 1 (Kurt S Tuckett); Margate 142 ACM Area 2 (Kencia Cadet-Pako); Margate 142 QA (Estela M Euceda); Margate 142 QA (Karen Blanchard-Sims); Market Access & Public Health Netherlands (Els Devriese); Market Access (Debbie Drane (Inherited)); Market Access France (Alice MATHERON); Market Access France (Franck Puget (Inherited)); Market Access GE/AT/Emerg. Europe (Dirk Hoheisel (Inherited)); Market Access GE/AT/Emerg. Europe (Ilona Krug); Market Access Italy (Lara Pippo); Market Access Russia & CIS (Batyrkhan Kuatov); Market Access Strategy (Robert Rouse); Market Access and Commercial Strategy (Ling Yang); Market Access and Public Affairs (Jonathan Galduf Cabanas); Market Access and Public Affairs (Jose Luis Moreno Sanchez); Market Access and Public Affairs (Sandra Santos); Market Research (Nathan Barrall); Market Research (Venkatesh Ramakrishnan (Inherited)); Marketing & Medical Affairs Interconti. (Thomas Hauck); Marketing (Brian McMaster); Marketing (Dariusz Holdys); Marketing (Elena Glukhova); Marketing (Michael Chen); Marketing (Philippe Hebert (Inherited)); Marketing (Robert Mitchell); Marketing (Scott Newkirk); Marketing (Thomas Offergeld); Marketing Belgium (Marijke Maes); Marketing Benelux (Erwin Franken); Marketing Benelux (George Peters); Marketing Benelux (Patrick Reygaert); Marketing Benelux (Stefaan Schatteman [C]); Marketing Coagulation (Marino Bertapelle); Marketing Coagulation (Sharad Agrawal); Marketing Communication (Anastasia Walsh); Marketing Department (Marianna Konstantinidi (Inherited)); Marketing Division (Jean-Marc Morange (Inherited)); Marketing Division (Kyota Yamaoka); Marketing Division Congress Group (Kyota Yamaoka (Inherited)); Marketing Division Critical Care and Acquired Bleeding (Shunsuke Kuwata); Marketing Division Hemophilia Group (Makoto Kubo); Marketing Division Hemophilia Group (Sho Sakuma); Marketing Division Immunology & Rare Disease Group (Shinichiro Magome); Marketing Division SID Group (Jun Ishiwa); 
Marketing France (Benjamin BISMUTH); Marketing France (Pascale Ogel Le Guen); Marketing Franchise (marco odelli); Marketing Greece (Marianna Konstantinidi (Inherited)); Marketing In-Licensing Director (James Kretsis); Marketing Intercontinental (Timothy Akroyd); Marketing Italy (Alessandro Vasco); Marketing Italy (Giorgio Lippi); Marketing Manager (Andrew Barker); Marketing Manager (Natasha Rees); Marketing Manager (New Influenza Products) (Belinda Anderson); Marketing Manager 745 (Belinda Anderson); Marketing Manager 745 (Gina Kladis); Marketing Manager 745 (Helen Concilia (Inherited)); Marketing Nordic (Petter Olbe); Marketing Portugal (David Ventura); Marketing Product (Rebecca Turner); Marketing Product Administration (Edward Potter); Marketing Program (Michael Chen); Marketing Schweiz (Beatrice Guldimann); Marketing Schweiz (Christoph Schneider); Marketing Spain (Aurea Xumetra); Marketing Specialty Products (Jan Hoesche); Marketing UK (Amandine Faguer); Marketing UK (Eddie Owens (Inherited)); Marketing, China Com Ops (Claire Tang); Marketing, Medical Affairs & Market Access Interconti. (Thomas Hauck); Mass Spec Research (Victor Nesati); Master Data & Country Specific (Joachim Leiss); Master Data & Country Specific (Julian Knabeschuh); Master Data (Bruce C Beatty); Master Data (Gilbert Kilchoer); Master Data Management ES (Roland Burkhard); Master Data Management Finished Product (Luana Gauer); Master Data Technology (James G Kirby); Master Data, Maintenance & Development (Julia Schimansky); Master Production Planner - Privigen (Kylie Cramer); Materials Life Cycle Management (Jennifer Chung); Materials Management (Steven E Putlack); McAllen 258 (Ben Samarripas (Inherited)); McAllen 258 (Carlos Floyd); McAllen 258 ACM Area 1 (Felipe Gonzalez); McAllen 258 ACM Area 1 (Marc Garcia); McAllen 258 ACM Area 2 (Monica Contreras); McAllen 258 QA (Esperanza Pina); McKeesport 192 (Miekal Brown); McKeesport 192 (Steven Warheit); McKeesport 192 ACM Area 1 (Aaron Bova); McKeesport 192 ACM Area 2 (Caroline Hoyer); McKeesport 192 QA (Daniel Sullenberger); McKeesport 192 QA (Katherine Parker); Mckinney 276 (Sheneka E Wilson); Mckinney 276 ACM Area 1 (Charles E Baxter IV); Mckinney 276 ACM Area 2 (Andrew Fluharty); Mckinney 276 QA (Roxann L Sandoval); Mech Main Engineer 253 (Desmond Lobo); Mechanic (Thomas Baumann); Mechanical Maintenance (Daniel Hofmann); Mechanical Maintenance (Stefan Schmid); Mechanicsburg 171 (Bernard Thompson); Mechanicsburg 171 (John L Thixton (Inherited)); Mechanicsburg 171 (Michele Purvines-Honzo (Inherited)); Mechanicsburg 171 (Olivia Chung); Mechanicsburg 171 ACM Area 1 (Theodore Rooks); Mechanicsburg 171 ACM Area 2 (Michael Crosby); Mechanicsburg 171 QA (Cyle Starner-Moore); Mechanicsburg 171 QA (Kellie N Buecker); Mechanicsburg 171 QA (Kimm Klisiewicz); Mechatroniker (Doris Nake (Inherited)); Medford 037 (Jane Herrera); Medford 037 ACM Area 1 (Hattie E Johnston); Medford 037 ACM Area 2 (Denise Scarborough); Medford 037 ACM Area 2 (Katrina D Walls); Medford 037 QA (Richard W Smith); Medical (Christina Berchtold); Medical Affairs (Claire Morgan); Medical Affairs (David Crump); Medical Affairs (Giulio Barrese); Medical Affairs (Gunnar Philipp); Medical Affairs (Manzhou Hou); Medical Affairs (Michael Haslauer); Medical Affairs (Navin Singh); Medical Affairs (Robert Chan); Medical Affairs (Sebastian Dinatale); Medical Affairs Belgium (Anne Verheyen 
(Inherited)); Medical Affairs Benelux (Anne Verheyen); Medical Affairs Division (Robert Chan); Medical Affairs Division Hematology and Thrombosis Group (Yasuhiro Terano); Medical Affairs Division Hemophilia Group (Motohiro Okayasu); Medical Affairs Division Immunology & Rare Disease Group (Er Win Hew); Medical Affairs Division Medical Excellence and Operations (Kenji Suwa); Medical Affairs Division SID Group (Hiromi Igari); Medical Affairs EU (Damian Gilkerson (Inherited)); Medical Affairs EU (Patrick Sommerer); Medical Affairs France (Jamila Filipecki); Medical Affairs France (Nabil Moumane); Medical Affairs Germany (Paolo Bajcic); Medical Affairs Germany (Patrick Sommerer); Medical Affairs Greece (Evi Baimpou); Medical Affairs Italy (Learco Mottola); Medical Affairs Netherlands (Anne Verheyen (Inherited)); Medical Affairs Nordic (Martin Tenlen (Inherited)); Medical Affairs Nordic (Michael Grövdal); Medical Affairs Nordic (Stefan Grass); Medical Affairs Project Management (Diane Bracquart); Medical Affairs Russia (Evgeny Rudenko); Medical Affairs Russia (Maria A Lituchaya (Inherited)); Medical Affairs Spain (José Aznar-Salatti); Medical Affairs Specialty Products (Thomas Machnig); Medical Affairs UK (Alessandro Dos Santos); Medical Affairs UK (Jo Heaton); Medical Affairs of Greater China (Helen Dai); Medical Affairs, Americas (Ashesh Gandhi); Medical Affairs, Canada (Ashesh Gandhi (Inherited)); Medical Affairs, Europe (Sankarasubramanian Rajaram); Medical Affairs, Influenza (Karita Ambrose); Medical Affairs, Rapivab (Ashesh Gandhi (Inherited)); Medical Communications, US (Nancy Dougherty); Medical Department Turkey (Hasan Avcu); Medical Excellence and Operations (Mitsuhiro Kuwahara); Medical Excellence and Operations (Robert Chan (Inherited)); Medical Hemophilia Group (Takeo Hirai); Medical Manager (Andrea McCracken); Medical Manager (Anthony Gargano); Medical Manager (Arturo Lopez Larios); Medical Manager (Claire Morgan); Medical Manager (DEBRA BOURKE); Medical Manager (Debra Bourke); Medical Manager (Jane Wheeler); Medical Manager (Julianne Bayliss); Medical Manager (Luis Aversa); Medical Manager 842 (Jane Leong); Medical Manager 842 (MAUREEN THAM); Medical Operations US 2 (Jeanie Chiu); Medical Operations US 3 (Jennifer Hanes); Medical Operations US 3 (John Nelson); Medical Science Liaison Canada (James Mansi); Medical Scientific Liaison (Joana Rodrigues); Medical Scientific Liaison Spain (Jenny Alvarez Nieto); Medical Services (Anna Burek); Medical Unit – Medical Information (Ana Claudia Guersoni); Medical Writing (Amy Walton); Medical Writing (Bob Stumpo (Inherited)); Medical Writing (Bob Stumpo); Medical Writing (Midori Kobayashi); Medical Writing (Narelle Bramich (Inherited)); Medical Writing (Narelle Bramich); Medical Writing (Takashi Fukai);
Medical Writing (Thomas Verish); Medical Writing - Therapeutic Area Lead (Daniel Wood); Medical Writing - Therapeutic Area Lead (Wolfgang Thielen); Medical Writing – Quality and Publishing (Nerrie Lam); Medical Writing – Therapeutic Area Lead (Ellen Krasutsky); Medical and Business Support (Antoinette Mangione); Medical and Quality Greater China (Spring Wang); Melrose Park 453 (Jesus A Castillo (Inherited)); Melrose Park 453 (Niki Wells); Melrose Park 453 (Tangerine Tingle); Melrose Park 453 ACM Area 1 (Tangerine Tingle (Inherited)); Melrose Park 453 ACM Area 1 (Tangerine Tingle); Melrose Park 453 ACM Area 2 (Tangerine Tingle (Inherited)); Melrose Park 453 ACM Area 2 (Tangerine Tingle); Melrose Park 453 QA (Andrea Bohnenberger); Melrose Park 453 QA (Kimberly L Strong-Allen (On Leave)); Melrose Park 453 QA (Kimberly L Strong-Allen); Memphis 052 (Christopher Morgan); Memphis 052 (Dorleans Alce); Memphis 052 (Trina Crayton); Memphis 052 ACM Area 1 (Dorleans Alce); Memphis 052 ACM Area 1 (Keoshia N Franklin); Memphis 052 ACM Area 2 (Laundray Carter); Memphis 052 QA (Brooke McKinney); Memphis 052 QA (Jason S Hicks); Mesquite 085 (Amber Robinson); Mesquite 085 (Brenda C Greenfield (Inherited)); Mesquite 085 (Brenda C Greenfield); Mesquite 085 ACM Area 1 (Valinda M Peters); Mesquite 085 ACM Area 2 (Christy Pagel); Mesquite 085 QA (Martin DelAngel); Method Development & Instruments (David Canen); Method Development & Instruments (Todd Canen); Method Development (Tom Barnes); Method Development Group (Anna Rozhkova); Method Development Group (Petra Sebastian); Metrics & Analytics (Christian Spuckti); Metrology (Aurélien Hémon); Metrology (Salvatore DiRusso); Metrology (Union) (Jose Franceschini Mirabal (Inherited)); Metrology (Union) (Michael Memenga (Inherited)); Mgr QC Chemistry (Melissa Damino); Mgr QC Chemistry (Ying Huang); Mgr- QC Immunology (Justine Jaap); Mgr- QC Immunology (Melissa Damino); Mgr-Validation Operations (Nick Morgan); Miami 206 (Ashley Britt); Miami 206 (Ashley Downs); Miami 206 (Yennifer Fernandez); Miami 206 ACM Area 1 (Troy Davidson); Miami 206 ACM Area 2 (Barbara May); Miami 206 QA (Anitha Janardhanan); Miami 206 QA (Aris Herrera); Microbiological QC (Ivana Heckel); Microbiological QC (Nicola Di Maiuta); Microbiology (Sarah Krueger); Microbiology (Torsten Vogt); Microbiology - Environmental Monitoring (Alison Conroy); Microbiology - Lab (Stacey Wenzel); Microbiology - Lab I (Stacey Wenzel); Microbiology - Utilities (Joshua Deabel); Microbiology - Utilities (Matthew Pocius); Microbiology 1 (MB1) (Silke Litzinger); Microbiology 2 (MB2) (Constanta Ola); Microbiology 2 (MB2) (Morten Ruch); Microbiology Lab (Annett Milling); Microbiology Lab (Breanna Steins); Microbiology Lab 1 (Breanna Steins); Microbiology Validation (Emily Neylon (Inherited)); Microbiology Validation (Natalie Gaffney); Middle East & Africa (EMEA) (Camilla Shen); Middle East & Africa (EMEA) (Mohammed Haggag); Midvale 273 (Joel Gallegos); Midvale 273 (Nicole M Loncon (Inherited)); Midvale 273 ACM Area 1 (Jason Stevens); Midvale 273 ACM Area 2 (Casey Davis); Midvale 273 QA (Madison Reid); Mikrobiology 3 (MB3) (Stephanie Achebach); Minneapolis 414 (Deepesh M Pillai); Minneapolis 414 ACM Area 1 (Abubeker M Osman); Minneapolis 414 ACM Area 2 (Ahmed N Ismail); Minneapolis 414 QA (Diego A Bastidas); Minneapolis 414 QA (Pauline M Pipho); Miramar 214 (Jessica Collins); Miramar 214 (Mary A Paul (Inherited)); Miramar 214 (Tyneka Rene); Miramar 214 ACM Area 1 (Chanique Young); Miramar 214 ACM Area 2 (GUILLERMO
ORTIZ); Miramar 214 ACM Area 2 (Sang Nguyen); Miramar 214 QA (Azia Alston); Mishawaka 249 (Marisa Nyikos); Mishawaka 249 (Olivia Arend); Mishawaka 249 ACM Area 1 (Kanesha Young); Mishawaka 249 ACM Area 1 (Sydney Boyle); Mishawaka 249 ACM Area 2 (Lucette Gamble); Mishawaka 249 QA (Leah Lehtomaki); Mitarbeiter Serologisches Labor (Astrid Mather (Inherited)); Mobile 284 (Wesley Stokes); Mobile 284 ACM Area 1 (Doris Osobase); Mobile 284 ACM Area 2 (Demitrius Douglas); Mobile 284 QA (Egypt N Ali); Modul 2 - Team 2 (Marko Elias); Modul 2 - Team 2 (Ümit Aslantas); Modul 2 - Team 3 (Timo Grün); Modul 2 - Team 4 (Maik Czyrzewski); Monitoring (Arno Karnholz); Monitoring (Roland Portmann); Monitoring - Auswertung (Dominik Mueller); Monitoring Operations / Sampling (Martin Hofer); Montgomery 105 (Trinity J Bell); Montgomery 105 (Trinity J Gamble); Montgomery 105 ACM Area 1 (Shauna M Runk); Montgomery 105 ACM Area 2 (Robyn English); Montgomery 105 QA (Tiffany D Sherman (Inherited)); Montgomery 105 QA (Whitney C Belser); Montgomery 125 (Cathrine M Shimek); Montgomery 125 ACM Area 1 (Cathrine M Shimek); Montgomery 125 ACM Area 1 (Mark Sanders); Montgomery 125 ACM Area 2 (Cathrine M Shimek (Inherited)); Montgomery 125 ACM Area 2 (Monica Miller); Montgomery 125 QA (Chrystal D Carrillo); Montgomery 125 QA (Kimberly J Sanders); Montgomery 198 (Cory Toellner (Inherited)); Montgomery 198 (Gregory Jacobs); Montgomery 198 (Justin N Gronbach); Montgomery 198 ACM Area 1 (Timike Sheehy); Montgomery 198 ACM Area 2 (Sarah Peet); Montgomery 198 QA (Christine C Le); Montgomery 198 QA (Michael Hoedle); Motive Power (Nate Thomas); Motive Power (Union) (David G Mollema (Inherited)); Motive Power (Union) (Nate Thomas (Inherited)); Motive Power (Union) 1 (Nate Thomas (Inherited)); Mt Clemens 261 (Tiffany D Peters); Mt Clemens 261 ACM Area 1 (Lavon Williams); Mt Clemens 261 ACM Area 2 (Michelle S Gibbs); Mt Clemens QA 261 (Melissa Johnson); Muncie 191 (John W Wheelock); Muncie 191 (Rob Garcia (On Leave)); Muncie 191 (Rob Garcia); Muncie 191 ACM Area 1 (Andy Umberger); Muncie 191 ACM Area 2 (Andrea S Young); Muncie 191 QA (Drewleigha B Sarver (Inherited)); Muncie 191 QA (Mary Stegall); Muncie 191 QA (Megan M Sheets); Murfreesboro 128 (Elisabeth Johnson); Murfreesboro 128 (Elisabeth Radigan); Murfreesboro 128 (Nedra N Braden); Murfreesboro 128 ACM Area 1 (Ron Rury); Murfreesboro 128 ACM Area 2 (Samantha Holmes); Murfreesboro 128 QA (Melanie J Carmack); Murfreesboro 128 QA (Michelle Young); Mustang 243 (Sam P Emrich); Mustang 243 ACM Area 1 (Jeff Saylors); Mustang 243 ACM Area 2 (Claire Joyce); Mustang 243 QA (Bill Crye); Mustang 243 QA (Fay Michelsen); N. Charleston 291 (Donte Lazarus); N. Charleston 291 ACM Area 2 (Nichole Bell); N. Charleston 291 QA (Sharon Williams); NA CommOps & Patient Services (Mike Andrews); NA Medical Affairs Operations (Sindhu Pampati); NA Therapeutic Area Lead (Coagulation) (Emmanuel Gutierrez); NA Therapeutic Area Lead (Coagulation) (Monica Richardson); NAT Lab (Kevin N Elliott); NAT Lab (Ricky R Alexander (Inherited)); NAT Lab (Ricky R Alexander); Nampa 505 (David Ensminger (Inherited)); National Accounts Sales-Summit (Mark Faulkner); National City 297 (GABRIEL MACEDO); National City 297 QA (Jessie Aquino); National Hospital Manager (Christine Folland); National Management (Denis Fedorov); National Sales (Toshio Nagata); Nebenanlagen (André Wermuth); Network Services (Christopher Frank); Network Services (David Mann); Network Services (Don Konemann (Inherited)); Network Services ASIAPAC (Mahesh Narayanan); Network Services Americas (Raj Selvaraj); Network Services II (Partha SARATHY); Neurology Marketing (Jason Reckner); New Center Expansion (John Brennan); New Center Launch (Amon G Samples); New Center Launch 1 (Ian Biehler); New Center Launch 2 (Nicole L Ledbetter); New Center Support 1 (Lindsay K Jameson (On Leave)); New Center Support 1 (Lindsey Jameson); New Center Support 1 (Rey Vargas); New Center Support 1 (Rob Soeun); New Center Support 1.1 (Valerie L Ward); New Center Support 1.2 (Rey Vargas); New Center Support 2 (Amy L Guardiola); New Center Support 2 (Anthony Rheuark); New Center Support 2 (Becca Thomas); New Center Support 2 (Billy R Poole); New Center Support 2 (Marissa Sunanon-Clements); New Center Support 2.1 (Amy L Guardiola); New Center Support 2.2 (Marissa C Sunanon); New Center Support 2.4 (Becca Thomas); New Hope 163 (Jason L Kelley); New Hope 163 ACM Area 1 (DiJon Jones); New Hope 163 ACM Area 2 (Daniel D Rogge); New Hope 163 QA (Holly S Wahlberg); New Hope 163 QA (Holt Peterson (Inherited)); New Hope 163 QA (Kayla L Stueber); Newark 213 (Angela Bordelon); Newark 213 ACM Area 1 (Stephanie Morrison); Newark 213 ACM Area 2 (Angela Mancinelli); Newark 213 ACM Area 2 (Steve H Sison); Newark 213 QA (Christopher Madden); Newark 213 QA (Taylor Thomas); Niagara Falls 237 (Kimberly Reimer); Niagara Falls 237 ACM Area 1 (Paul Hedley); Niagara Falls 237 ACM Area 2 (Mary Jo Watt); Niagara Falls 237 QA (Wesley Summers); Nogales 108 (April Behnke); Nogales 108 (Brooke S Angulo); Nogales 108 ACM Area 1 (Jorge U Orozco); Nogales 108 ACM Area 2 (Rosa G Martinez); Nogales 108 ACM Area 3 (Guadalupe Ochoa (On Leave)); Nogales 108 ACM Area 3 (Rosa G Martinez); Nogales 108 QA (Cori J Collins (Inherited)); Nogales 108 QA (Martha E Lundberg); Non IVV Bact, Vir, Ster, Env Monitoring (Fenny Ng); Non IVV Potency (Keiran McLeod); Non IVV, Chemistry, Biochem, Immulab (Niki Soteriadis); Non-Process Projects (Jennifer Mastio); Norfolk 513 (Katanya Hall); Norfolk 513 QA (James Foreman); Normal 292 (Jose Patino); Normal 292 (Michael W Solomon (Inherited)); Normal 292 ACM Area 1 (William Molitor); Normal 292 ACM Area 2 (Lana Shepherd); Normal 292 QA (Jennifer Harris); Norman 020 (Laura A Post); Norman 020 (Troy Lee Wheeler); Norman 020 ACM Area 1 (Nicole Bertram); Norman 020 ACM Area 2 (Amanda Doan); Norman 020 QA (Katy L Reynolds); North American Benefits (US & Canada) (Matthew Arscott (On Leave)); North American Benefits (US & Canada) (Matthew Arscott); NorthGlenn 141 (Anna M Coulbourne); NorthGlenn 141 (Becca Charles); NorthGlenn 141 (Daniel Venn (Inherited)); NorthGlenn 141 QA (Ashley R Sewell); Northglenn 141 ACM Area 1 (Jonathan Walling); Northglenn 141 ACM Area 2 (Carlos M Valenzuela); O'Fallon 224 (Lori Carlson (Inherited)); O'Fallon 224 (Tara R Spates Tucker); O'Fallon 224 ACM Area 1 (Jahleia Chieves); O'Fallon 224 ACM Area 2 (Andrea M Catchup); O'Fallon 224 QA (Lori Carlson (Inherited)); O'Fallon 224 QA (Marijo Monroe); O'Fallon 224 QA (Tori Chancellor); OE/BPM LVP (Cheryl King); OE/BPM LVP (Fabrice Gribon (Inherited)); OE/BPM LVP (Stephen Marlow (Inherited)); OF EU (incl. TR) (Sabine Hämel (Inherited)); OF ROW (incl. ICO) (Anna-Karina Muth); OSI (Jennifer Krupka); OSII (Michael Moses); OSIII (Wilfried Wormsbächer); OSIV (Tina Würfel); OTO Programs (Paul Bidez); Oak Park 041 (Latosha Y Floyd (Inherited)); Oak Park 041 (Sherlene Killebrew); Oak Park 041 ACM Area 1 (Sandra Erdman); Oak Park 041 ACM Area 2 (Patrick J Tribble); Oak Park 041 QA (Jessica J Cobey (On Leave)); Oak Park 041 QA (Jessica J Cobey); Ocala 290 (Althea Council); Ocala 290 QA (Jean O'Neal); Oklahoma City 422 (Johnnie K Phares); Oklahoma City 422 ACM Area 1 (Clement C Uzoma); Oklahoma City 422 ACM Area 2 (Ella L Boyd); Oklahoma City 422 QA (Billie E Gilliland); Oklahoma City 422 QA (Hannah E Todroff); Olympia 517 (Trevor Case); Omaha 421 (Christie G Edmisten); Omaha 421 ACM Area 1 (Kristen A Marteny); Omaha 421 ACM Area 2 (Sachin Bhandari); Omaha 421 QA (Christopher Trindle); Omaha 421 QA (Larinda N Johnson); Open Systems (Juerg Clavadetscher (Inherited)); Open Systems (Kapil Taneja); Operational Business Development 1 (Amon G Samples); Operational Business Development 2 (Nicole L Ledbetter); Operational Business Development 9.2 (Laura A Allen); Operational Excellence & Data Analytics (Jason Woolley); Operational Excellence (Claus Peihs (Inherited)); Operational Excellence (Gil Rochat); Operational Excellence (Jan-Christopher Gerlach); Operational Excellence (Jewel Reid); Operational Excellence (Monika Goretzki); Operational Excellence (Murat Dalar (Inherited)); Operational Excellence (Philipp Jeker); Operational Excellence LVP (Gordon Pearson); Operational Prozess 1 (Arnd Vollmerhausen); Operational Readiness Phoenix (Rainer Frank); Operational Services (Clare Schwarz); Operational Services Maintenance & Utilities (Michael Kocher); Operational Support (Laura A Allen); Operations & PV Systems (Shinya Takagawa); Operations (Camila Silva Alvarado); Operations Global Engineering (Gregory Taylor); Operations Global Engineering Projects (Daniel Rouse); Operations Global Sourcing (Trevor Reay); Operations HS Business Integration (Thomas Jede); Operations HS EHSS & Risk (Allan Wise); Operations HS EHSS & Risk (Bob Rezek); Operations HS EHSS & Risk (Lynette Hodgden (Inherited)); Operations HS EHSS & Risk (Lynette Hodgden); Operations HS Engineering (Daniel Rouse); Operations HS Engineering (Gregory Taylor (Inherited)); Operations HS Engineering (Rodney Lam); Operations HS Engineering (Tom Gehrin); Operations HS Engineering Automation (Charles Guy Sorrell Jr.); Operations HS Engineering Automation (James Dion); Operations HS Engineering Fill Finish Equipment (Jeffrey Stowe); Operations HS Engineering Process (James Flockhart); Operations HS Engineering Process (Jason VanderPloeg); Operations HS Engineering Process (Jeffrey Stowe); Operations HS Engineering Project (Daniel Rouse); Operations HS Engineering Project (Eddie Taylor); Operations HS MS&T Process Sciences (Nicholas Mauro); Operations HS MS&T Process Sciences (Richard Hughes); Operations HS MS&T Tech Services (Jason Allaband); Operations HS MS&T Tech Services (Nicholas Mauro); Operations HS MS&T Tech Services
Fill/Finish (Kevin McMahon); Operations HS MS&T Tech Transfer (Baron Fulk); Operations HS MS&T Tech Transfer (Tsu-shun Lee (Inherited)); Operations HS MS&T Tech Transfer (Wallace Brisson); Operations HS Maintenance (Jamie Blankenship); Operations HS Maintenance (Leon Montgomery); Operations HS Maintenance Facilities (Bruce A Buckoski); Operations HS Maintenance Facilities (Bruce Buckoski); Operations HS Maintenance Instrumentation (Jamie Blankenship); Operations HS Maintenance Metrology (Michael Mikolajczak); Operations HS Maintenance Process (Ricky Norris (On Leave)); Operations HS Maintenance Process (Ricky Norris); Operations HS Maintenance Support (Daniel Sarvis); Operations HS Maintenance Support (Richard Oliver); Operations HS Maintenance Utilities (Scott Curtis Menut); Operations HS Maintenance Utilities (Scott Menut); Operations HS Manufacturing (Irina Staxen); Operations HS Manufacturing Bulk (Chad M Salisbury (Inherited)); Operations HS Manufacturing Bulk (Eric Hoffman); Operations HS Manufacturing Bulk (Jonathan Kegerise); Operations HS Manufacturing Bulk Downstream (Eric P Hoffman); Operations HS Manufacturing Bulk Downstream (Gordon Dunsmore); Operations HS Manufacturing Bulk Downstream - A Shift (Joseph Chapman); Operations HS Manufacturing Bulk Downstream - B Shift (Evan Burke); Operations HS Manufacturing Bulk Downstream - B Shift (LaToya Jaqui McDuffie); Operations HS Manufacturing Bulk Downstream - C Shift (Joseph Chapman); Operations HS Manufacturing Bulk Downstream - C Shift (Samantha Heyer); Operations HS Manufacturing Bulk Downstream - D Shift (Demitra Earls); Operations HS Manufacturing Bulk Downstream - D Shift (Evan Burke); Operations HS Manufacturing Bulk Support (Elie Chiha); Operations HS Manufacturing Bulk Support - A Shift (Craig Steimle); Operations HS Manufacturing Bulk Support - B Shift (Stephen Blair Donaldson); Operations HS Manufacturing Bulk Support - B Shift (Stephen Donaldson); Operations HS Manufacturing Bulk Support - C Shift (Jonathan Adams); Operations HS Manufacturing Bulk Support - D Shift (Kevin Anthony Smith); Operations HS Manufacturing Bulk Support - D Shift (Kevin Smith); Operations HS Manufacturing Bulk Support Materials (Andrew Passarotti); Operations HS Manufacturing Bulk Support Materials (Elie Chiha (Inherited)); Operations HS Manufacturing Bulk Support Materials (George Barrett); Operations HS Manufacturing Bulk Upstream (Gordon Dunsmore); Operations HS Manufacturing Bulk Upstream (Gordon Kennedy Dunsmore); Operations HS Manufacturing Bulk Upstream (Jeremy Smock); Operations HS Manufacturing Bulk Upstream - A Shift (Billy Trask); Operations HS Manufacturing Bulk Upstream - A Shift (Jeremy Smock); Operations HS Manufacturing Bulk Upstream - B Shift (Chris Austin); Operations HS Manufacturing Bulk Upstream - B Shift (Latisha Blair Tucker Kiker); Operations HS Manufacturing Bulk Upstream - C Shift (Chris Austin); Operations HS Manufacturing Bulk Upstream - C Shift (Maxwell Pote); Operations HS Manufacturing Bulk Upstream - D Shift (Jeremy Smock (Inherited)); Operations HS Manufacturing Bulk Upstream - D Shift (Kevin Donnell Thomas); Operations HS Manufacturing Bulk Upstream - D Shift (Kevin Thomas); Operations HS Manufacturing Fill & Finish (Philip Troughton); Operations HS Manufacturing Fill & Finish (Rodney Lam); Operations HS Manufacturing Fill & Finish - A Shift (Aseptic) (LaToya McDuffie); Operations HS Manufacturing Fill & Finish - A Shift (JOSE SERRANO); Operations HS Manufacturing Fill & Finish - A Shift (Jose Serrano); 
Operations HS Manufacturing Fill & Finish - A Shift (Non-Aseptic) (Todd Brinkley); Operations HS Manufacturing Fill & Finish - B Shift (Aseptic) (Heather Johnson); Operations HS Manufacturing Fill & Finish - B Shift (Heather Johnson); Operations HS Manufacturing Fill & Finish - B Shift (Non-Aseptic) (Reginald Cox); Operations HS Manufacturing Fill & Finish - C Shift (Aseptic) (William Holder); Operations HS Manufacturing Fill & Finish - C Shift (Keith Bridges); Operations HS Manufacturing Fill & Finish - C Shift (Non-Aseptic) (Keith Bridges (On Leave)); Operations HS Manufacturing Fill & Finish - C Shift (Non-Aseptic) (Keith Bridges); Operations HS Manufacturing Fill & Finish - C Shift (Timothy Hampton); Operations HS Manufacturing Fill & Finish - D Shift (Aseptic) (Jamie Page); Operations HS Manufacturing Fill & Finish - D Shift (Branch Chandler Cannon); Operations HS Manufacturing Fill & Finish - D Shift (Non-Aseptic) (Ivan Morris); Operations HS Manufacturing Fill & Finish Expansion (Aseptic); Operations HS Manufacturing Fill & Finish Expansion (Aseptic) (Branch Cannon); Operations HS Manufacturing Fill & Finish Expansion (Non Aseptic) (Zachary Oakley); Operations HS Manufacturing Fill & Finish Expansion (Rodney Lam); Operations HS Manufacturing Fill & Finish Ops Aseptic (Brian Kennedy); Operations HS Manufacturing Fill & Finish Ops – Aseptic (Brian Kennedy (On Leave)); Operations HS Manufacturing Fill & Finish Ops – Non Aseptic (Steve Gaspar); Operations HS Manufacturing Finite Scheduling (Andrew Covington); Operations HS Manufacturing Finite Scheduling (David Tye); Operations HS Manufacturing Operational Excellence (Don Miller); Operations HS Manufacturing Production Systems (Angel Colucci); Operations HS Manufacturing Production Systems (Angel L Colucci); Operations HS Manufacturing Production Systems (Frederick Goerke); Operations HS Manufacturing Sciences & Technology (Baron Fulk); Operations HS Manufacturing Sciences & Technology (Irina Staxen (Inherited)); Operations HS Manufacturing Sciences & Technology (Jessica Mercer); Operations HS Manufacturing Sciences & Technology (Tsu-shun Lee); Operations HS Manufacturing Small Scale (Ashley Greeney); Operations HS Strategy, Alliance Management & PMO (John Anderson (Inherited)); Operations HS Strategy, Alliance Management & PMO (Raj Kapadia); Operations HS Strategy, Alliance Management & PMO (Vernon Horner); Operations HS Supply Chain & Strategy (Mayumi Buckoski); Operations HS Supply Chain (David Tye); Operations HS Supply Chain Planning (David Tye); Operations HS Supply Chain Warehouse (Nicholas Brown); Operations HS Supply Chain Warehouse (Willie Lam); Operations HS Supply Chain Warehouse - Manufacturing & TD (Christopher Stone); Operations HS Viral Pilot Plant (Claudia Johnson); Operations Holly Springs (John Anderson); Operations Lead – Project Banksia (Lisa Lamb); Operations Liverpool (Laura O'Brien); Operations Planning Manager (Damien Nguyen); Operations Procurement (John Molyneux); Operations Procurement Operations (Donald Lacombe); Operations Procurement Operations (John Molyneux (Inherited)); Operations Procurement Operations (Michele Morris); Operations Support (Annette Feussner); Operations Support (Nicole Kay); Operations Support (Uwe Kalina); Operations Support R&D (Michele Himmelspach); Operative Master Data Management (Maike Pollaschek (Inherited)); Operative Master Data Management (Maike Pollaschek); Ops Capital Portfolio
Management (Stefano Siviero); Ops Plasma Support (Walter Aebersold); Orange City 155 (ANNETTE NELSON); Orange City 155 (Faye-Lynn Deissinger); Orange City 155 ACM Area 1 (Nathan J Herchenroder); Orange City 155 ACM Area 2 (BRIAN LOFTUS); Orange City 155 ACM Area 2 (Jenna Smith); Orange City 155 QA (Christina M Kokoszka); Orange City 155 QA (Cindy Romero-Estrada); Orange City 155 QA (Kyle M Lehrke (Inherited)); Organisation / Koordination Diverse (Eva Herzog (Inherited)); Organization Transformation (Andrea Douglas); Organization Transformation (Tod Marks); Organizational Development (Kristen Krebs); Organizational Development (Rachel Day); Orlando 144 (Isabella Bishop); Orlando 144 ACM Area 1 (Ron Fischer); Orlando 144 ACM Area 2 (Trinica D Boyd); Orlando 144 QA (Brittany Woodward); Orlando 144 QA (DeQuandra Belton); Orlando 144 QA (Tiffany D Sherman (Inherited)); Orlando 511 (Jessica Collins); PABS (Uwe Kalina); PABS I (Helene Lang); PABS I (Sarah Plum); PABS I+II (Annette Feussner); PABS II (Christina Kober); PABS II (Maria Hauswald); PABS III (Aaron Hahn (On Leave)); PABS III (Aaron Hahn); PABS III (Stefan Baumeister); PACE (Christian Sonderegger) (53003164); PACE (Markus Decher); PACE APAC Deployment - Organisation and Change Management (Faye Papakalodoukas); PACE ATR (Andrew Croft (Inherited)); PACE ATR (Michael Kochanski); PACE ATR Payment Management (Dennis Martin); PACE Americas Deployment (Shane Kennedy); PACE Asia Pacific (Andrew Croft (Inherited)); PACE Asia Pacific (Metani Rooms); PACE Commercial Deployment (Peter K Tadros); PACE Coordination BRN (Boris Kaiser (Inherited)); PACE Coordination BRN (Christian Sonderegger); PACE ES (Marco Maeder); PACE General Accounting (Eric Fay); PACE Global Distribution (Catherine Gil); PACE Metrics & Analytics (Christian Spuckti); PACE OTC (Kian Hartono); PACE PM Bern (Oliver Bigler); PACE PMO (Tod Marks); PACE PTI (Wolfgang Schneider); PACE Program (Andrew Croft (Inherited)); PACE S2P (Andrew Croft (Inherited)); PACE S2P (Simon Haemmerli); PACE S2P (TR Kannan); PACE Site Deployment (Kelly L Konemann); PACE deployment Bern Lengnau (Boris Kaiser); PACE sustain (Linda Carducci (Inherited)); PACE sustain (Markus Decher); PAI Dokumentation (Andre Hullmann (Inherited)); PAI Dokumentation (Carsten Meyer (Inherited)); PAI Endfiltration Albumin (Achim Ludwig (Inherited)); PAI Endfiltration Albumin (Achim Ludwig); PAI Fermentation (Tobias Kling); PAI Koordination (Andre Hullmann (Inherited)); PAI Koordination (Bernd Prior (Inherited)); PAI Koordination (Carsten Meyer (Inherited)); PAI Nebenbetriebe (Mario Kornemann (Inherited)); PAI Pasteurisierung (Mario Kornemann (Inherited)); PAI Produktion 1 / Nebenanlagen (Mario Kornemann); PAI Produktion Albumin (Andre Hullmann); PAI Produktion Immunglobuline/ Nebenanl.
(Bernd Prior); PAI Produktion PCF H67 (Roger Leukel); PAI Produktion Rekombinante Proteine (Andreas Berting); PAI Produktion Rekombinante Proteine (Carsten Meyer); PAI Prozessmanager (Barbara Kalina (Inherited)); PAI Prozessmanager (Wilfried Freudenberg (Inherited)); PAI Rekombinante Proteine GMP (Carsten Meyer (Inherited)); PAI Subfraktionierung (Mario Kornemann (Inherited)); PAI Systemunterstützung SAP/MES (Barbara Kalina (Inherited)); PAI Systemunterstützung SAP/MES (Wilfried Freudenberg (Inherited)); PAI Training & GMP (Barbara Kalina (Inherited)); PAI Training & GMP (Wilfried Freudenberg (Inherited)); PAI Ultrafiltration / Endfiltration (Alfons Höck (Inherited)); PAI Ultrafiltration Albumin (Martin Doruch (Inherited)); PAI Ultrafiltration Albumin (Martin Doruch); PAI Vorbehandlung / Support (Hans Becker); PAI Vorbehandlung 1 / Support (Hans Becker (Inherited)); PAI Vorbehandlung 2 (Hans Becker (Inherited)); PAI Vorbehandlung 3 (Andreas Koch); PAI Wägekabine (Mario Kornemann (Inherited)); PBS Basisfraktionierung & Support (Stefan Vaupel); PBS Basisfraktionierung (Bernhard Tribensky); PBS Basisfraktionierung (Klaus Wilhelm); PBS Planung & Dokumentation (Claus Baudszus); PBS Schichtgruppe 1 (Mario Lorch); PBS Schichtgruppe 2 (Björn Klingelhöfer); PBS Schichtgruppe 3 (Andreas Klein); PBS Schichtgruppe 4 (Andreas Kraus); PBS Schichtgruppe 5 (Bernd Hofmann); PBS Schichtgruppe 6 (Bernd Teske); PCS & MES (Frank Mastellone (Inherited)); PCS & MES (Magda Stavaroiu); PCS & MES (Magda-Elena Stavaroiu); PCS (Reto Kamber); PCS Maintenance (Markus Kläsle); PCS Maintenance (Reto Camastral); PD Projects & Technology Transfer (Steven Honey); PE - Central Region (Gangjian Chen); PE - Central Region 1 (Qin Li); PE - Central Region 2 (Gangjian Chen (Inherited)); PE - Central Region 2 (Shu Min); PE - DTP, China (Cissy Xi); PE - East Region (Zhen Shen); PE - East Region 1 (Xiao Ma); PE - East Region 2 (Guo Jie Yu); PE - East Region 2 (Guojie Yu); PE - East Region 3 (Liang Xu); PE - North Region (David Chen (Inherited)); PE - North Region (Zhixia Wang); PE - North Region 1 (Yajuan Wen); PE - North Region 3 (Qinghua Zhao); PE - North Region 4 (Hongbin Wang); PE - North Region 4 (Tracy Yu); PE - South Region (Sam Shang); PE - South Region 1 (Tony Lee); PE - South Region 2 (Ice Li); PE - South Region 3 (Yi-yu Zhang); PE - South Region 4 (Michelle Li); PE - South Region 5 (Gary Chen); PE - West Region (Alex Kong); PE - West Region (David Chen
(Inherited)); PE - West Region (Shengyan Qiu); PE - West Region 1 (Hao Chen); PE - West Region 2 (Jack Liao); PE - West Region 3 (Shen Jie); PE - West Region 3 (Shengyan Qiu (Inherited)); PE-Central Region 3 (Julia Zhu); PGI Bulkproduktion M1M2 (Julian Lampmann); PGI Bulkproduktion M1M2 (Sebastian Feisel); PGI Documentation (Patrick Brusius); PGI Koordination (Heiko Schild (Inherited)); PGI Produktion Beriate (Heiko Schild); PGP Bulkproduktion 1 FIX (Philipp Hergenröder); PGP Bulkproduktion 1 FIX (Steffen Möbius); PGP Bulkproduktion 1 FVIII-B (Gerhard Burk (Inherited)); PGP Bulkproduktion 1 FVIII-B (Gerhard Burk); PGP Bulkproduktion 1 FVIII-H (Henrik Tutsch (Inherited)); PGP Bulkproduktion 1 FVIII-H (Peter Diehl (Inherited)); PGP Bulkproduktion 1 FVIII-H (Peter Diehl); PGP Bulkproduktion 2 FIX (Sebastian Feisel (Inherited)); PGP Bulkproduktion 2 FIX (Timo Mudersbach (Inherited)); PGP Bulkproduktion 2 FIX (Timo Mudersbach); PGP Bulkproduktion 2 FVIII-B (Reiner Bamberger (Inherited)); PGP Bulkproduktion 2 FVIII-B (Reiner Bamberger); PGP Bulkproduktion 2 FVIII-H (Ernst Dittmar (Inherited)); PGP Bulkproduktion 2 FVIII-H (Ernst Dittmar); PGP Bulkproduktion 3 FVIII-B (Frank Bäurich (Inherited)); PGP Bulkproduktion 3 FVIII-B (Frank Bäurich); PGP Bulkproduktion 3 FVIII-H (Jürgen Ungemach (Inherited)); PGP Bulkproduktion 3 FVIII-H (Jürgen Ungemach); PGP Bulkproduktion 4 FIX (Steffen Möbius); PGP Dokumentation (Patrick Brusius); PGP Koordination FIX (Karl-Heinz Wenz (Inherited)); PGP Koordination FIX (Karl-Heinz Wenz (On Leave) (Inherited)); PGP Koordination FVIII-B (Heiko Schild (Inherited)); PGP Modul 2 - Team 1 (Henning Dittmar); PGP Modul 2 - Team 2 (Ümit Aslantas (Inherited)); PGP Modul 2 - Team 3 (Timo Grün (Inherited)); PGP Produktion Beriate (Heiko Schild); PGP Produktion Faktor IX (Karl-Heinz Wenz (On Leave)); PGP Produktion Faktor IX (Karl-Heinz Wenz); PGP Produktion Haemate / Humate (Henrik Tutsch); PGP Produktion Haemate / Humate (Peter Güttner); PGP Prozessmanager (Barbara Kalina (Inherited)); PGP Prozessmanager (Horst Boeder (Inherited)); PGP Pufferherstellung FVIII-B (Bernd Grau (Inherited)); PGP Tagschicht FIX (Ewald Burk); PGP Tagschicht FIX (Timo Mudersbach); PGP Vorbehandlung FVIII-H (Sascha Ludwig (Inherited)); PGP Vorbehandlung FVIII-H (Sascha Ludwig); PIU (Alan Collins); PIU (Christine Fitzpatrick); PIU Team (Christine Riley); PIU/UM Engineering (Peter White); PL-Quality (Carmen Althainz); PL-Quality (Mehmet Gümüs); PM Hematology and Thrombosis TA (Joanne Uhl (Inherited)); PM Hematology and Thrombosis TA (Mark Kleinman); PMR Dokumentation (Wilfried Freudenberg (Inherited)); PMS (Hideo Usui);
PMS (Masashi Nakayama); PNS (Ibtisam Saeed); PNS Manufacturing (Hosmer Perez); PPD / Technical Operations Marburg (Michael Moses); PPD Bern Admin (Eliane Bossart); PPD BioAnalytical Science (Patrick Schuetz); PPD CMC Bern (Philipp Angerer); PPD Impurity & Data Mngt (Patricia Lieby); PPD Investigations (Thomas Kilchoer); PPD Investigations 2 (Tino Boss); PPD Investigations I (Janine Bash); PPD Process Development - R&D (Hal Braley); PPD Process Development - R&D (Kathryn Scott); PPD Process Development - R&D (Yvette Citrine); PPD Process Development 2 (Ibrahim El Menyawi); PPD Process Development 2 Group 1 (Eva Blatter); PPD Process Development 2 Group 2 (Robin Das Gupta); PPD Process Development 2 Group 3 (Adrian Alder); PPD R & D Bioanalytics BMW (Mark Bailey); PPD R&D KOP (Kenneth Walsh); PPD R&D Marburg (Martin Vey); PPD Technical Operations (Birgit Unterweger); PPD Technical Operations (Michele Himmelspach); PPD, Process Development (Eric Zhu); PPM (Roberta Duncan (Inherited)); PPM Research (Heather Davis); PPM Technical (Heather Davis); PQG Look Back / PDI (Patricia Herrmann); PQG Plasma Control (Iryna Zabolotna); PRP Support (Heinz-Jürgen Merkel); PRP GMP-Koordination (Heinz-Jürgen Merkel (Inherited)); PRP GMP-Koordination (Heinz-Jürgen Merkel); PRP Logistik (Robert Schäfer); PRP Lösungsherstellung & Wiegebereich (Robert Schäfer (Inherited)); PRP Support (Yanina Broadnax); PRP Support 1 (Steffen Ramb); PRP Vorbehandlung (Thorsten Theis (Inherited)); PRP Vorbehandlung (Thorsten Theis); PRP Vorbehandlung 1 (David Gräb); PRP Vorbehandlung 1 (Fabian Feisel); PRP Wareneingang (Evelin Kaiser-Felsmann); PRP Wareneingang (Yanina Broadnax); PRP Wareneingang Team 1 (Sebastian Siebert); PRP Wägebereich (Heinz-Jürgen Merkel (Inherited)); PTC (Severin Thierau); PTH Abfüllung 1 (Alexander Muth (Inherited)); PTH Abfüllung 1 (Lars Nau); PTH Abfüllung 1 (Pascal Nau (Inherited)); PTH Abfüllung 1 (Pascal Nau); PTH Abfüllung 2 (Michael Kroker (Inherited)); PTH Abfüllung 2 (Michael Kroker); PTH Abfüllung 3 (Nils Rder); PTH Abfüllung 3 (Alexander Jegel); PTH Abfüllung 3 (Rainer Lepper (Inherited)); PTH Abfüllung 3 (Rainer Lepper); PTH Abfüllung 4 (Björn Schmidt); PTH Abfüllung 4 (Heiko Steinbach); PTH Albumin & Visual Inspection (Jörg Nickel); PTH GMP Coordination (Matthias Klein (Inherited)); PTH GMP-Coordination (Jörg Nickel (Inherited)); PTH Optische Kontrolle 1 H069 (Bernd Balzer (Inherited)); PTH Optische Kontrolle 1 H069 (Bernd Balzer); PTH Optische Kontrolle 2 H069 (Jörg Nickel (Inherited)); PTH Optische Kontrolle 2 H069 (Valentina Kufeld (Inherited)); PTH Optische Kontrolle 2 H069 (Valentina Kufeld); PTH Optische Kontrolle 3 H069 (Jörg Nickel (Inherited)); PTH Optische Kontrolle 3 H069 (Meike Dörbecker (Inherited)); PTH Optische Kontrolle 3 H069 (Meike Dörbecker); PTH Processmgr Pretreatment Refludan&Bul (Matthias Klein (Inherited)); PTH Servicefunktion (Sabine Fischer); PTH Teilfertigung H069 (Alexander Muth); PTH
Teilfertigung H069 (Daniel Schneider); PTH Teilfertigung H069 (Tim Westphal); PTH Teilfertigung Koordination (Daniel Schneider (Inherited)); PTH Teilfertigung Koordination (Tim Westphal (Inherited)); PTH Vorbehandlung & Support (Peter Koch); PTH Vorbehandlung 3 / Abfüllung 3 H069 (Uwe Fritsch); PTH Vorbehandlung & Support (Peter Koch (Inherited)); PTI EM Lead (Susan Clough); PTM Abfüllung M305 (Jörg Dieterich); PTM Abfüllung M305 (Tim Westphal); PTM Betriebsservicefunktion M305 (Jennifer Hilscher (Inherited)); PTM Betriebsservicefunktion M305 (Jennifer Hilscher); PTM Betriebsservicefunktion M305 (Reinhard Grün (Inherited)); PTM Betriebsservicefunktion M305 (Reinhard Grün); PTM GMP Koordinator (Esther Seidel (Inherited)); PTM GT-Anlage M305 (Peter Dersch (Inherited)); PTM Optische Kontrolle M305 (Alexandra Günther (Inherited)); PTM Optische Kontrolle M305 (Elke Stauss (Inherited)); PTM Optische Kontrolle M305 (Elke Stauss); PTM Projekte / Technik (Esther Seidel (Inherited)); PTM Prozessmanager (Esther Seidel (Inherited)); PTM Teilfertigung M305 (Alexandra Günther); PTM Visuelle Kontrolle (Julia Dworschak); PTM Vorbehandlung M305 (Eckhard Brickum (Inherited)); PV Agreements Lead (Andrea Kergl); PV Excellence and Compliance (Gina Granada); PV Quality Management Lead (Gina Granada); PV Safety (Tomoko Yanagawa); PWI Chromatographie & Fällung H68 (Dietmar Grebe); PWI Faktor I / XIII Schichtgruppe 7 (Björn Bartelmeß); PWI Faktor I / XIII Schichtgruppe 7 (Horst Schneider); PWI Faktoren I & XIII (Jochen Köhler); PWI GMP-Koordination (Heinz-Jürgen Merkel (Inherited)); PWI Inhibitoren (Wilfried Happel); PWI Koordination (Jochen Köhler (Inherited)); PWI Koordination (Wilfried Happel (Inherited)); PWI Logistik (Robert Schäfer); PWI Lösungsherstellung & Wiegebereich (Robert Schäfer (Inherited)); PWI Regeneration & Vorbehandlung H68 (Marc Wellner); PWI Support (Heinz-Jürgen Merkel); PWI Tagdienst (Roger Ochs); PWI Teilbereich (Christoph Bernitt); PWI Training & GMP (Jochen Köhler (Inherited)); PWI Training & GMP (Wilfried Happel (Inherited)); PWI Vorbehandlung (Thorsten Theis (Inherited)); PWI Vorbehandlung (Thorsten Theis); PWI Wareneingang (Evelin Kaiser-Felsmann); PWI Wägebereich (Heinz-Jürgen Merkel (Inherited)); PWI-H68-Schicht (Dietmar Grebe (Inherited)); PWI-H68-Schicht (Marc Wellner (Inherited)); PWI-H68-Tagdienst (Dietmar Grebe (Inherited)); PWI-H68-Tagdienst (Marc Wellner (Inherited)); PWI-H68-Tagdienst (Marc Wellner); PWI-M305 (Manuel Lotz); PWI-M305 Schicht 1 (Fabian Cybulski); PWI-M305 Schicht 2 (Florian Scherer (Inherited)); PWI-M305 Schicht 2 (Florian Scherer); PWI-M305 Schicht 3 (Fynn Krieger); PWI-M305 Tagdienst (Robert Höhne); Packaging & Supply (Claus Peihs); Packaging & Supply (Helmut Robert Euler (Inherited)); Packaging & Supply (Viktor Krecker); Packaging & WHS (Armin Stcklin); Packaging & WHS (Stefan Kaelin); Packaging (Andrew Baxter); Packaging (Brian T White); Packaging (Bruno Baeriswyl); Packaging (Kate (Shortall) Lamont); Packaging (Kate Lamont); Packaging (Kate Shortall); Packaging (Othmar Geisser); Packaging (Pasquale Carestia
(Inherited)); Packaging (Thomas Royal); Packaging (Union) (Brian T White); Packaging (Union) (Pasquale Carestia (Inherited)); Packaging (Union) (Thomas Royal); Packaging Day Shift 2/6 (Tanja Maegert); Packaging Day Shift 4/5 (Jelka Golob); Packaging Design (Josue Stoll); Packaging Development (Claude Morf); Packaging Development (Markus Maus); Packaging Diverse (Jörg Dieterich (Inherited)); Packaging Evening Shift 1/3/6 (Shabbir Ahmad Sheikh); Packaging Evening Shift 2/4/5 (Nebojsa Milosevic); Packaging I (Pasquale Carestia (Inherited)); Packaging Line 1 (Daniel Fankhauser); Packaging Line 1 (Marianne Steuri); Packaging Line 2,3,7 (Jelka Golob); Packaging Line 4 (Nebojsa Milosevic); Packaging Line 5 (Bashkim Redzepi); Packaging Line 6 (Tanja Maegert); Packaging Materials Testing & Release (Dominik Corbet); Packaging Operations (Claus Peihs (Inherited)); Packaging Operations (Jörg Dieterich); Packaging Operations (Murat Dalar); Packaging Teams (Bernd Baum); Packaging and Inspection (David Hartley (Inherited)); Packaging and Inspection (Joey Tranquilino); Packaging, Design & Artwork (Metin Yilmaz); Packing Material Control PMC (Dominic Wuest); Packing Material Control PMC (Nicole Moser); Packing Material Control PMC 2 (Denise Engimann); Packing Team Leader (Adam Heath); Packing Team Leader (Beau Williams); Packing Team Leader (David Nguyen); Packing Team Leader 451 (Robert De Santis); Pain Business Unit Director (Michael Grant); Palm Bay 254 (Cari N Howard); Palm Bay 254 (Latora (LaLa) Boswell); Palm Bay 254 ACM Area 1 (John Fuller); Palm Bay 254 ACM Area 1 (KIARA CAGLE); Palm Bay 254 ACM Area 2 (Latora (LaLa) Boswell); Palm Bay 254 ACM Area 2 (Lori Leinas); Palm Bay 254 QA (Regine Jean Gilles (On Leave)); Palm Bay 254 QA (Regine Jean Gilles); Pandemic (Lorna Meldrum); Parenteral Manufacturing (AlbuRx Filling) (Daniel Derakhshanian); Parenteral Manufacturing (AlbuRx Filling) (Union) (Daniel Derakhshanian); Parenteral Manufacturing (AlbuRx) (Nick Bonavita); Parenteral Manufacturing (AlbuRx) (Union) (Nick Bonavita); Parenteral Manufacturing (Mindy Randazzo); Parenteral Manufacturing (Thomas Royal); Parenteral Manufacturing (Union) (Mindy Randazzo (Inherited)); Parenteral Manufacturing (Union) (Thomas Royal (Inherited)); Parkersburg 178 (Jeff Hay); Parkersburg 178 (Lachelle Mosholder); Parkersburg 178 (Lenora Lada); Parkersburg 178 ACM Area 1 (Alissa Sindelar); Parkersburg 178 ACM Area 1 (Lenora Lada); Parkersburg 178 ACM Area 2 (Lachelle Mosholder); Parkersburg 178 ACM Area 2 (Lenora Lada (Inherited)); Parkersburg 178 QA (Amanda M Cvitkovich); Parkersburg 178 QA (Christina Prunty); Parma Heights 162 (Olivia Arend); Parma Heights 162 (Sue Collins); Parma Heights 162 ACM Area 1 (Lindsy Wolf); Parma Heights 162 ACM Area 2 (Mirela Sekulic); Parma Heights 162 ACM Area 2 (Seanna Penn); Parma Heights 162 QA (Deborah Robinson); Paste & Final Product Planning (Martin Sutter); Patents and Licenses (Hans-Peter Hauser); Patient Engage & Feas (Rodney Winley); Pay Services (Brian T Simeur); Payroll DE (Claudia Rogge); Pensacola 623 (Nicole K Stassen); Pensacola 623 ACM Area 1 (Esteban Facundo); Pensacola 623 ACM Area 1 (Timothy J Nisewonger); Pensacola 623 ACM Area 2 (Esteban Facundo); Pensacola 623 ACM Area 2 (Timothy J Nisewonger); Pensacola 623 QA (Jessica L Ford (On Leave)); Pensacola 623 QA (Jessica L Ford); Pensacola 623 QA (Matthew T Zisa); Pensacola 623 QA (Melodee C Ebel (Inherited)); Peoria 133 (Mark A Yellen);
Peoria 133 (Patrick S Taylor); Peoria 133 ACM Area 1 (DeAnn K Benally); Peoria 133 ACM Area 1 (Patrick S Taylor (Inherited)); Peoria 133 ACM Area 2 (Patrick S Taylor (Inherited)); Peoria 133 ACM Area 2 (Seanna Penn); Peoria 133 QA (LaVona M Holt); Peoria 289 (Dennis Popek); Peoria 289 (Nicholle DeVecchi); Peoria 289 ACM Area 1 (Holly Worsfold); Peoria 289 ACM Area 2 (Lew Carney); Peoria 289 QA (Kali Trevino); Performance Management (Ken Lain); Pharmaceutical Development (Martin Alex Imboden); Pharmacodynamic (Marc Nolte); Pharmacodynamic 1 (Padmapriya Ponnuswamy); Pharmacodynamic 2 (Subhajit Ghosh); Pharmacokinetic (Oliver Ghobrial); Pharmacokinetic (Sabine Pestel); Pharmacology & Toxicology (Eva Herzog); Pharmacometrics (Theresa Yuraszeck); Pharmacovigilance systems (Sahil Sahni); Pharmacovigilance Quality (Wumi McDowall); Pharmakanten (Carmen Walldorf (Inherited)); Pharmakanten (Doris Nake (Inherited)); Philadelphia 145 (Kristen Aydin); Philadelphia 145 (Rene Benson-Skone); Philadelphia 145 (Robert W Gillespie); Philadelphia 145 ACM Area 1 (Ken Laguerre); Philadelphia 145 ACM Area 2 (Kevin Lambrecht); Philadelphia 145 ACM Area 2 (Rene Benson-Skone (Inherited)); Philadelphia 145 QA (Kim Van Houten); Philadelphia 147 (Derek Morner); Philadelphia 147 (John L Thixton (Inherited)); Philadelphia 147 (Michele Dionne); Philadelphia 147 (Theresa Mwimbwa); Philadelphia 147 ACM Area 1 (Jennifer Foxworth); Philadelphia 147 ACM Area 1 (Robinrenee Dorsey); Philadelphia 147 ACM Area 2 (Robinrenee Dorsey); Philadelphia 147 ACM Area 2 (Rose Marie Waddle); Philadelphia 147 QA (Alissa Elke); Philadelphia 147 QA (John L Thixton (Inherited)); Philadelphia 147 QA (Leslie Jones); Philadelphia 147 QA (Samantha J Schrepel); Pilot Plant Manufacturing Team (Stefanie Ronzheimer); Pilot Plant (Christian Schlachtbauer); Pilot Plant (Jarvis Hammitt); Pilot Plant (Klaus-Jürgen Schlitt (Inherited)); Pilot Plant (Leander Trachsel); Pilot Plant (Norbert Egon Juettner); Pilot Plant Group (Lukas Sterchi); Pilot Plant II (Franziska Naef); Pilot Plant II (Lukasz Lubecki); Pilot Plant Lengnau (Joel Zumstein); Pilot Scale Operations (Chris Horridge); Pilot Scale Operations (Daniela Mocanu); Pilot Scale Operations (Heidi Bergman); Pilot Scale Operations (Jeffrey Bourke); Pilot Scale Operations (Maggie Aziz); Pilot Scale Operations (Mark Simmonds (Inherited)); Pilot Scale Operations (Mark Simmonds); Pilot Scale Operations (Paul Gibbs); Pilot Scale Operations (Rob Hooper); Pilot Scale Operations (Sharon Orr); Pilot Scale Operations (Tien Vo); Pilot Scale Operations (Tim Hanna); Pilot Scale Operations (Ursula Macaskill); Pilot Scale Operations 1 (Jessica McGiffin); Pinellas Park 139 (Brett Goldman); Pinellas Park 139 (Leah J Davis); Pinellas Park 139 (Robin G Spencer); Pinellas Park 139 ACM Area 1 (Alesia Davenport); Pinellas Park 139 ACM Area 1 (Lynn M Stratton); Pinellas Park 139 ACM Area 2 (Alesia Davenport); Pinellas Park 139 ACM Area 2 (Christina Goodrich); Pinellas Park 139 QA (Dana Pagano); Pinellas Park 139 QA (Lynn M Stratton); Pinnacle Training Site Las Vegas (Yennifer Fernandez); Pinnacle Training Site Pinellas Park (Lauren Hardy); Pittsburgh 269 (Esence Hambrick); Pittsburgh 269 (Marianne Brown); Pittsburgh 269 ACM Area 1 (Dan Lassige); Pittsburgh 269 ACM Area 2 (Tammy Toth); Pittsburgh QA 269 (Marianne Brown); Pittsburgh QA 269 (Melanie Kauffman); Planning (Christoph Krug); Planning (Stephan Obrecht); Planning (Tabitha Dineen); Planning
Maintenance (André Hasler); Planning Maintenance (Oliver Bigler); Plant & Clean Utilities (Nozar Basseri); Plant Engineering & Services (Beat Meyer); Plant Engineering (Benjamin Reh); Plant Engineering (Michael Kleinehanding); Plant Engineering Mgr 255 (Anthony Wrzesinski); Plant Engineering Mgr 255 (Stuart Barnes); Plant Engineering Mgr 255 (Timothy Travis); Plant Finance (Justin Mericle); Plant Finance (Melissa Gottschall); Plant Finance (Vlad Kirylau); Plant Finance II (Vlad Kirylau); Plant Finance Product Costing & Capital (Michael McAvoy); Plant Operations (Vinko Momiroski); Plant Utilities (Hansruedi Brunner); Plasma & Raw Material Release (Stefan Tepfenhart); Plasma Center Management (Jincai Zhu); Plasma Contract Management (Linda S Romalin); Plasma Finance (Jason Mugridge); Plasma Fractionation (John Nelson); Plasma Fractionation (Jordan Wright); Plasma Fractionation (Union) (John Nelson (Inherited)); Plasma Fractionation (Union) (Jordan Wright); Plasma Logistic Center (Peter Nau); Plasma Logistic Center Dallas Supervisor (Brandon W Wornick); Plasma Logistic Center Dallas Supervisor (Brandon Wornick); Plasma Logistics Center Dallas (Carey L Fleener); Plasma Logistics Center Indy (Chad Simeur); Plasma Logistics Center Whitestown (Chad Simeur); Plasma Logistics Centers (Michael J Frecker); Plasma Management (Jack Zhang); Plasma New Development (Jake Zhang); Plasma New Development (Lixia He); Plasma Operation, Quality (Qingqing Wang); Plasma Operations (Eveline Kindler); Plasma Operations (Timo Fuhrmann); Plasma Operations Finance; Plasma Operations and Quality (Eric Li); Plasma Operations and Quality Management (Jeffrey A Schulz); Plasma Pay Services (Karen D Vellutini); Plasma Product Development (Douglas Lee (Inherited)); Plasma Product Development (Michael Zachner); Plasma Products Bulk Operations (Barbara Kalina); Plasma Quality (Lixia He); Plasma Quality Management (Laura O'Brien); Plasma Quality/Deviations (Stefan Kaelin); Plasma Receipt & Haemostasis (Narelle Urli); Plasma Receipt & Haemostasis (Sean Flannery); Plasma Receipt (Brendan Smale); Plasma Receipt (Roger Hand); Plasma Receipt (Tommy Tovilo); Plasma Release (Daniel Schwarz); Plasma Release (Sié Kigninlman Coulibaly (Inherited)); Plasma Resources US (David H Confessore (Inherited)); Plasma Resources US (Debra A Hood); Plasma Resources US (Shane Kennedy); Plasma Sourcing Management (Lixia He); Plasma and Manufacturing Finance (Ted Kanigowski); Plasmapreparation (Andreas Reber); Plasmapreparation (Erich Nuessle); Pleasant Grove 046 (Eduardo Williams); Pleasant Grove 046 (Vicky Sablan); Pleasant Grove 046 ACM Area 1 (Chad Pagel); Pleasant Grove 046 ACM Area 2 (Ebony Q McGee); Pleasant Grove 046 QA (Pamela R Mendoza); Pontiac 121 (Ashley M Jamieson (Inherited)); Pontiac 121 (Melissa Johnson); Pontiac 121 (Mondel Hightower); Pontiac 121 ACM Area 1 (Tracey L Boyd-McCorkle); Pontiac 121 ACM Area 2 (Mondel Hightower (Inherited)); Pontiac 121 ACM Area 2 (William D Owens); Pontiac 121 QA (Ashley M Jamieson (Inherited)); Pontiac 121 QA (Rebecca Barrons (On Leave)); Pontiac 121 QA (Rebecca Barrons); Pontiac 121 QA (Rodnesia R Jackson); Port Arthur 176 (Karen Sauceda
(Inherited)); Port Arthur 176 ACM Area 1 (Dannetta Abdel-Malek); Port Arthur 176 ACM Area 1 (Karen Sauceda (Inherited)); Port Arthur 176 ACM Area 2 (THU RAMOS); Port Arthur 176 QA (Angela Redd); Port Arthur 176 QA (Michael Thompson); Port Arthur 176 (Karen Sauceda); Port Arthur 176 (Lauren Hardy); Port St Lucie 072 (Kashaun Muhammad (Inherited)); Port St Lucie 072 (Mario A Salas); Port St Lucie 072 ACM Area 1 (Adam Davis); Port St Lucie 072 ACM Area 2 (Vanessa Sanon); Port St Lucie 072 ACM Area 3 (Garrett J Royal); Port St Lucie 072 QA (Raquel Reyes (On Leave)); Port St Lucie 072 QA (Raquel Reyes); Portage 187 (Dom Moceri); Portage 187 (Richard McCoy); Portage 187 ACM Area 1 (DERREK CRUMP); Portage 187 ACM Area 1 (Jeffrey Ott (On Leave)); Portage 187 ACM Area 2 (DERREK CRUMP); Portage 187 ACM Area 2 (Nikki Bradley); Portage 187 QA (Mitch A Quinn); Portage 187 QA (Stephanie Gower); Portfolio & Project Management (Heather Davis); Portfolio & Project Management (Roberta Duncan); Portfolio Management (Joel Hanson); Potency 1 (Dave De Witte); Potency 1 (Johanna Mock); Potency 2 (Martina Treutlein); Potency Testing Final Product 1 (Johanna Mock); Potency Testing Final Product 2 (Martina Treutlein); Potency Testing Intermediates 1 (Jan Bursy); Potency Testing Intermediates 1 (Marika Midon); Potency Testing Intermediates 2 (Wilfried Peil); Preclinical Innovation (Fabian Kaesermann); Preclinical Innovation (Jennifer Brasseit); Preclinical Innovation (Kleanthis Fytianos); Preclinical Innovation (Rolf Spirig); Pricing (Paul Jens (Inherited)); Pricing (Stephanie Kupski); Primary Automation (Gary Steele); Primary Automation (Stephen Callaghan); Primary Manufacturing (Matthew Burrows); Primary Packaging & Medical Devices Bern (Frank Bamberg); Primary Packaging & Medical Devices Bern (Renzo Pedrussio); Primary Packaging & Medical Devices Bern I (Monica Tavanti); Primary Packaging & Medical Devices Marburg (Ahmad Abdul Fattah); Primary Packaging & Medical Devices Marburg (Thomas Pfeifer); Primary Process Engineering (Asad Akhter); Primary Utility Projects (Russell Peak); Primary and Warehouse Validation (James Swann); Privigen Bulk & Facility Operations (Robert Skok); Privigen Bulk (George Barlas); Privigen Bulk (George Makris); Privigen Bulk (Jeremy Campbell (Inherited)); Privigen Bulk (Jeremy Campbell); Privigen Bulk (Kellie Goodman); Privigen Bulk (Lanie Hynninen); Privigen Bulk (Ritaben Suhagiya); Privigen Marketing (Robert Zegel); Privigen/AlbuRx Processing (Peter Klasen); Privigen/AlbuRx Processing Team Leader (Areti Kaloyannis); Process & Project Engineering (Duncan Benson); Process Analyst Lead (Kate Goossens); Process Analytics & Scale-up (Michael Bieri); Process Analytics & Scale-up (Tobias Heck); Process Change Program (Anita Kohl-Truebenbach (Inherited)); Process Change Program (Jeffrey Ball); Process Control Manager (Vincent Chung (Inherited)); Process Development (Hubert Metzner); Process Development (Michael Bartkovsky); Process Development 1 (Martin Alex Imboden (Inherited)); Process Development 2 (Ibrahim El Menyawi); Process Development Bern (Kurtis Allan Epp); Process Development Bern (PDB) (Kurtis Allan Epp); Process Development Bern I (Maria Crespo Solans); Process Development Bern I, Team 1 (Madlene von Känel); Process Development Bern I, Team 2 (Jonathan Eras); Process Development Bern II (Ibrahim El Menyawi); Process Development Bern II, Team 1 (Eva Blatter); Process Development Bern II,
Team 2 (Marcus von Nordheim); Process Development Bern II, Team 3 (Adrian Alder); Process Development Bern II, Team 4 (Matthias Spiess); Process Development Bern III (Simon Gerber); Process Development Bern III, Team 1 (Robin Das Gupta); Process Development Bern III, Team 2 (Adrian Alder); Process Development Bern III, Team 3 (Eva Blatter); Process Development Bern III, Team 4 (Jos√© Ures); Process Development Group 1 (Robin Das Gupta); Process Development Group 2 (Eva Blatter); Process Development I & PP (Jennifer Krupka); Process Development I (Charles Arnold); Process Development I (Maike Glaser); Process Development I (Roopsee Anand); Process Development I (Uwe Liebing (Inherited)); Process Development I (Uwe Liebing); Process Development II (Jennifer Krupka); Process Development II (Katrin Anders); Process Development II (Kenneth Maas); Process Development III (Klaus Schmitt); Process Development, Data Science (Maya Shevlyakova); Process Engineering (Donall O Cualain); Process Engineering (Duncan Benson (Inherited)); Process Engineering (Gemma Parkes); Process Engineering (Markus Rentsch); Process Engineering (Michael Bieri); Process Engineering (Sean Goudy); Process Engineering Form & Fill (Emanuella Barbosa Lopes Souza Leao); Process Equipment & Technology (Benno Bitterli); Process Experts (Nicole L√∂ffelholz); Process Improvement (Deborah Mansfield); Process Improvement (George Thomas); Process Improvement (Jason Woolley (Inherited)); Process Improvement Mgr, PNS (Jerjess Chahoud); Process Management (Dawn Myers); Process Management Admin PGI (Antje Rder); Process Management Admin PGI (Antje R√∂der); Process Management Admin PGI (Oliver Draht); Process Migration (Ian Falcao); Process Migration (Paul Martell); Process Migration (Tony Renna); Process Migration Automation PU (Muditha Hasthanayake); Process Migration E&I (Paul O''Brien); Process Migration Project Engineer (Alice Dinh); Process Migration Project Engineer (Anna Martell); Process Science (Annette Gaida); Process Science (Annette Gaida-Benz); Process Science (Stefan Schulte); Process Science 2 (Arnaud Vonarburg); Process Science Upstream Lead (Sandra Grunske); Process Scientists Fractionation (Bernhard Wyss); Process Seed (Jennifer Kelly-Martland); Process Seed (John Cooney); Process TD (Adam Bentley); Process TD (Lisa-Marie Foulkes); Process Validation & Tech Transfer (Stefan Schulte); Process Validation (Berangere Lingat (Inherited)); Process Validation (Berangere Lingat); Process Validation (Fergus Hawes); Process Validation (Indi Staffa); Process Validation (Jesse Richter (Inherited)); Process Validation (Jessica Parletta); Process Validation (Peter Trimcevski); Process Validation - Stability (Jessica Mackellin); Process, Aseptic and Shipping Validation (Clare O''Donnell); Processes (Ferdinand Marx); Processes and Metrics (Eberhard Fitzner); Procurement (Barbara Beugger (Inherited)); Procurement (Brigitte Kimpel-Koch [C]); Procurement (Sreevatsan Sridharan); Procurement Lengnau (Narin Hermez); Procurement Lengnau (Pierre Bersier); Procurement Liverpool (Ian Goldup); Procurement Liverpool (Rachael Close); Procurement Liverpool (Trevor Reay (Inherited)); Procurement Operations (Juerg Kauer); Procurement Operations (Robert Di Giacomo); Procurement Operations (Sue Savage); Procurement Operations (Taylor Saak); Procurement Operations (Thomas Schneider); Procurement Operations - Liverpool (Rachael Close); Procurement Operations - Liverpool (Rachel Shaw); Procurement Operations I (Taylor Saak); Procurement Operations Manager 
(Marion Fitchett); Prod Manager - Formulations441 (Jamie Aaron Morris); Prod Manager - Formulations441 (Paul Morrison); Prod Mgr - Packaging (Garth James); Prod Mgr - Packaging (MARILYN BARAKIA); Product Care & Layout (Viviana Solange Fluxa Rojas); Product Care (Bill Chambers (Inherited)); Product Care (Markus Christen); Product Care (Patrick Nolte); Product Care (Samantha Czako (On Leave)); Product Care (Samantha Czako); Product Care (Thorsten Keller); Product Care (Valerie Schaffer); Product Care (Viviana Solange Fluxa Rojas (Inherited)); Product Care Mgmt (Andrea Lehmann); Product Characterisation (Matthias Zimmermann); Product Characterisation (Robert Dickinson); Product Characterization (Carsten Horn); Product Costing & Inventory Controlling (Anika Wagner); Product Costing & Inventory Controlling (Dirk Achenbach); Product Development (David Glover (Inherited)); Product Development (David Glover); Product Development (Fiona Bunworth); Product Development (Matthias Zimmermann); Product Disposition (Amber Hall); Product Disposition (Gennin Snyder); Product Education (David Chen); Product Education (Wei Chen); Product Expertise (Paul Sinclair); Product Group Hemophilia (Claudia Zacharias); Product Group Hospital Products; Product Group Hospital Products (Bianca Petzold); Product Group Hospital Products (Michael Bernd Rode (Inherited)); Product Group ID (Richard Sodmann); Product Innovation (Fabian Kaesermann (Inherited)); Product Innovation (Fabian Kaesermann); Product Innovation (Rolf Spirig); Product Innovation (Susann Cattepoel); Product Market Authorization & QA Russia & CIS (Galina Senchukova); Product Ownership - Biotherapies (Anita Kohl-Truebenbach); Product Ownership - Biotherapies (Paul McKenzie (Inherited)); Product Release (Christine Peter); Product Release (Patricia Loftus); Production & Strategic Planning (Matthias Christl (On Leave)); Production & Strategic Planning (Matthias Christl); Production (Craig Byham); Production BCI/C1 INHIB (Peter Güttner); Production Engineering (ANDREW HISLOP); Production Engineering (Andre Majchrzak); Production Engineering (Anisa Moghaddam); Production Engineering (Antonio Ciocca); Production Engineering (Cameron Simpson); Production Engineering (Campbell Anderson); Production Engineering (Candy Lee); Production Engineering (Damien Barri); Production Engineering (Dion Houtman); Production Engineering (Jason Fletcher); Production Engineering (Karen Noonan); Production Engineering (Kate McConnell); Production Engineering (Melissa Nicholson); Production Engineering (Reza Mohebian); Production Engineering (Richard Friar); Production Engineering (Richard Hayne); Production Engineering (Tom Graham); Production Engineering (Tom Kelland); Production Engineering 1 (Geoff Wang); Production Manager (Cassandra Smoult); Production Manager (Jamie Aaron Morris); Production Manager US (Ljubi Huseinovic); Production Manager, PNS 448 (Keiran Ragas); Production Marburg (Frank Emmerich); Production Marburg (Michael Schröder); Production Planning (Kyle Popham); Production Supervisor 454 (Kara Davine); Production Support (Jeffrey Spicer); Production Support (Marcus O'Dwyer); Produktion Inhibitoren PGI (Barbara Kalina (Inherited)); Produktion Inhibitoren PGI (Stefan Wellnitz); Produktion Inhibitoren PGI (Wilfried Happel); Produktion Inhibitoren Schicht 1 (Fabian Cybulski); Produktion Inhibitoren Schicht 2 (Arkadius Kaczmarczyk (Inherited)); Produktion Inhibitoren Schicht 2 (Arkadius
Kaczmarczyk); Produktion Inhibitoren Schicht 3 (Manuel Lotz); Produktion Inhibitoren Schicht 4 (Fynn Krieger); Produktion Inhibitoren Schicht 4 (Manuel Cuesta Linker); Produktion Inhibitoren Tagdienst (Florian Scherer); Produktion RPF300 (Anika Knack); Produktion RPF300 (Mara Saglam); Produktion Rekombinante Proteine & Support (Carsten Meyer); Produktion Rekombinante Proteine & Support (Viktor Krecker); Produktion Wundheilungspräparate M300 1 (Meik Dietrich); Produktion Wundheilungspräparate M300 2 (Jörg Schmidt); Produktion Wundheilungspräparate M300 3 (Björn Bartelmeß); Produktion Wundheilungspräparate M300 3 (Christoph Bernitt); Produktion Wundheilungspräparate M300 4 (Willi Dörr); Produktion Wundheilungspräparate M300 5 (Rainer Jesberg); Produktion Wundheilungspräparate M300 5 (Rainer Jesberg (On Leave)); Produktion Wundheilungspräparate M300 6 (Udo Wagner); Produktionsfachkraft Chemie (Carmen Walldorf (Inherited)); Produktionsfachkraft Chemie (Doris Nake (Inherited)); Program Management R&D Building (Carsten Skill); Programme Management (Anthea Stephenson); Project (Joe Fielding [C]); Project Aurora Automation (Mukesh Muruganandan); Project Automation (Michael Kraft); Project BCI (Kristin Eschrich); Project Controls and Commercial Assurance (Daniel Boltz); Project Delivery & Support (Christopher A Betterton); Project Delivery & Support (Matt Shapiro); Project Delivery & Support (Robert Boland (Inherited)); Project Delivery EU/APAC (Nick Furmston); Project Delivery KAN (Michael Hansen (Inherited)); Project Edge Commercial (Drew Hansen); Project Edge Finance (Daya Salter); Project Edge Finance (John Dinatale (Inherited)); Project Edge Logistics (John Dinatale (Inherited)); Project Edge Logistics (Steve Wilson [C]); Project Edge Procurement (Emma Hopwood); Project Edge Quality (Glenn Barbrey); Project Edge Quality (John Dinatale (Inherited)); Project Engineering (Daniel Weniger); Project Engineering (Duncan Benson); Project Engineering (Volker Teuchert); Project Ldr Improve & Compl (Michael Dunn); Project Ldr Improve & Compl (Thomas Nguyen); Project Logistic Centre Lahntal (Thomas Schwarz); Project Management (Bryan J Hoover); Project Management (Mark Ridge); Project Management CV TA (Julie Waterbury); Project Management Office (Chris Abell); Project Management Office (Emily Brown); Project Management Office (Geoffrey Rea [C]); Project Management Office (Jose Gonzalez (Inherited)); Project Management, AU/Asia (Alex Vaine); Project Management, Europe (Elaine DiMonte); Project Management, Europe (Katharine von der Fecht); Project Management, North America (Elaine DiMonte); Project Manager (Andrei Fedorov); Project Manager (Conal O'Mahony); Project Manager (Heiko Völpel (Inherited)); Project Manager (Jack Hung); Project Manager (Victor Karafilis (Inherited)); Project Support & Technical Transfer (Andreas Berting); Project Upgrade H69 (Thomas Schwarz); Project-Portfolio Delivery (Robert Boland); Project/Portfolio Delivery (Tod Marks); Projekt Phoenix (Markus Ries); Projekt-Koordinator (Claus Peihs (Inherited)); Projekt-Koordinator (Jörg Dieterich (Inherited));
Projekt-Koordinator (Murat Dalar (Inherited)); Protein Biochemistry (Eric Salgado); Protein Research R&D (Nathan J Brinkman); Protinus (Marius Liesch); Protinus (Sandra Kaempfer); Prozessgruppe 1 (Christoph Pfeiffer); Prozessgruppe 1 (Daniel Weniger (Inherited)); Prozessgruppe 1 (Marko Witt); Prozessgruppe 2 (Frank Heck); Pt. St. Lucie 072 ACM Area 1 (Adam Davis); Pt. St. Lucie 072 ACM Area 2 (Vanessa Sanon); Pt. St. Lucie 072 ACM Area 3 (Garrett J Royal); Publishing Site Marburg (Jörg Starker); Publishing Site Marburg Diverse (Jörg Starker (Inherited)); Puffer (Rainer Frank (Inherited)); Puffer (Torsten Jeide); Pulmonology-Europe (Michael Larbig); Purchasing (Alfonso Albornoz); Purchasing (Bob Siegel); Purchasing (Mark W Hartmann); Purchasing I (Alfonso Albornoz); Q Fever Team Leader D443 (Marcus O'Dwyer); Q Fever Team Leader D443 (Paul Williams); Q-Operation (Isabelle Crauser); Q-Operation (Marco Maeder); Q-Oversight End Products (Urs Pflugshaupt); QA - Batch Release & PTCs (Daniel Powell); QA - Batch Release & PTCs (Peter Tyler); QA - Batch Release (Astrid Mellor); QA - Batch Release (Tracy Owens); QA Batch Release (Constanze Buchter); QA Batch Release (Nicole Kortelainen); QA Batch Release (Randolph Rimando); QA Complaints Management (Rhonda L Luhrsen); QA Compliance (Berangere Lingat); QA Compliance (Craig Stephens (Inherited)); QA Compliance (JEFFREY ZOUBEK); QA Compliance (Jeffrey Zoubek); QA Compliance (Kimberly E Lorenz (Inherited)); QA Compliance (Mark Dickson); QA Cont Imp & Iss Mgt (Sharon Thornton); QA Fill/Finish (Lindsay Griffiths); QA Manager (Nicola Rotherham); QA Manufacturing (Alex Hargreaves); QA Manufacturing (Natalie Steele); QA Manufacturing (Tracy Owens); QA Operations (Dave Kowalski); QA Operations API (Anthony Nelson); QA Operations API (Kyle Showalter); QA Operations Bldg 30 (Anastasia Lindsey); QA Operations Bldg 33 (Alison York); QA Operations Bldg 33 (Candice Nieves (Inherited)); QA Operations Bldg 33 (Jill Shafer); QA Operations Bulk (Candice Nieves); QA Operations Bulk (Cassandra Clevenger); QA Operations Coagulation (Nicholas Gluckleder); QA Operations Coagulation 2 (Kelly Kucera); QA Operations Fractionation (Alison York); QA Operations Fractionation (Jacquelyn O'Malley); QA Operations II (Meggan R Smith); QA Operations II (Sarah Milone); QA Operations IPW (Kimberly Desler); QA Operations IPW (Meggan R Smith); QA Operations IPW (Sarah Milone); QA Operations IPW I (Sarah Milone); QA Operations Parenteral (Dave Kowalski (Inherited)); QA Operations Parenteral (Michael Urbanczyk); QA Plasma/Deviations (Eva Streit); QA Plasma/Deviations (Sié Kigninlman Coulibaly); QA Primary Manufacturing (Jocelyn Bryson); QA Process and Facilities / Stability (Marco Haas); QA Processes & Facilities (Dieter Bathier); QA Processes & Facilities (Ivo Lakomy); QA Processes & Facilities (Michel Baur); QA Processes & Facilities (Silvia Schmutz); QA Product Release (Joanna Madafferi); QA Product Release (Stephanie St.Martin); QA Projects Compliance Team (Danielle Moloney); QA Projects Compliance Team (Stoyan Atanasov); QA Release (Angelos Borobokas); QA Release (Aoife Corrigan); QA Release (Cherie Mclaren); QA Release (Craig Stephens (Inherited)); QA Release (Francesco Intoccia); QA Release (Ivo Lakomy); QA Release (Karin Hofstetter); QA Release (Katie Wood); QA Release (Manuel Selvaggio); QA Release (Marion Jeffrey); QA Release (Neil Del Castillo); QA Release (Rosemary Hill); QA Release 1 (Aoife Corrigan); QA Release 1 (Cherie Mclaren); QA
Release 1 (Craig Stephens (Inherited)); QA Release FAR Compliance (Margrit Waterval); QA Release FAR Release (Natalie Helfer); QA Release FAR Review (Fabienne Thoenen); QA Release IG/CYT (Silvia Schmutz); QA Release IGC Compliance (Dagmar Riffel); QA Release Process Engineering (Michael Zachner); QA Secondary (Daniel Powell); QA Systems (Christian Eggel); QA Systems (Connie Costanzo); QA Systems (Craig Stephens (Inherited)); QA Systems (Dina El-Emary); QA Systems (Lorenz Rindisbacher (Inherited)); QA Systems (Malte Krämer); QA Systems (Maryanne Pashalis); QA Systems (Michel Baur (Inherited)); QA Systems (Nancy Phan); QA Systems (Nassima Wilson); QA Systems (Nina Klee); QA Systems (Simone Naruhn); QA Systems (Sue Ireland); QA Systems (Susanne Deyhle); QA Systems (Tony Smith); QA Technical Support (Amanda Cooper); QA Validation (Jeff Mihaichuk (Inherited)); QA Validation (Jeff Mihaichuk); QA Validation (Stephen R Grey); QA Validation - Site Expansion (Jeff Mihaichuk (Inherited)); QA Validation I (Jeff Mihaichuk (Inherited)); QA and Regulatory Affairs SI (Aldelberto Cordova); QAI Quality Albumin, Immunoglob., Plasma (Martin Krah); QAO Compliance (Dagmar Riffel); QAO Manufacturing (Ines Joachim); QAO Manufacturing (Natalie Helfer); QAO Release (Natalie Helfer); QAO Release (Silvia Schmutz); QAO Sustain & Improve (Stefan Kaelin); QBR FVIII & FIX QoF (Anja Beetz); QBR FVIII & FIX QoF (Anja Beetz-Kroll); QBR PWI QoF (Torsten Cyriax); QBR rekombinante Proteine QoF (Nancy Georgieff); QBS Rotational Program (Ulrich Schuerch); QC (Gillian McAdam); QC (Rebecca Gannon); QC - Chemistry (Anna Melia); QC - Chemistry (Anthony Pellegrini); QC - Chemistry (Jimmy Pajarillo); QC - Chemistry (Marie Neophytou); QC - Chemistry (Michael Streule); QC - Microbiology (Angie Fifis); QC - Microbiology (Claire Abson); QC - Microbiology (Denise Vella); QC - Microbiology (Dinesh Raj Methuku); QC - Microbiology (Dozie Okafor); QC - Microbiology (Elsie Everson); QC - Microbiology (Grace Luong (Inherited)); QC - Microbiology (Grace Luong); QC - Microbiology (Karthy Santhosh); QC - Microbiology (Maria Arulruban); QC - Microbiology (Marika Moore); QC - Microbiology (Maruthi Shivananda); QC - Microbiology (Patricia Hughes); QC - Microbiology (Tyson Parker); QC Analytical & Raw Materials (Nick Brendon); QC Analytical & Raw Materials (Victoria Fairclough); QC Analytical Services Manager (Andrea Prendergast); QC Bioassay (Adrian Gee); QC Bioassay (Victoria Fairclough); QC Chemistry (Jenny Staff); QC Chemistry (Robert Zanon); QC Chemistry (Ying Huang); QC Chemistry Team Leader (Niki Soteriadis); QC Chemistry Team Leader (Ying Huang); QC Compliance (Ignazio Lamonica); QC Compliance Support (Lisa Walters); QC Compliance and Improvement (Lisa Marie Malcharek); QC Immunochemistry (Andre Lamarque (Inherited)); QC Immunochemistry (Andre Lamarque); QC Immunochemistry (Caroline Abdul-hay); QC Immunochemistry (Fatima Bartils); QC Immunochemistry (Georgina McKay); QC Immunochemistry (Sean O'Keefe); QC Immunochemistry (Tahlor Robson (Inherited)); QC Immunochemistry (Tahlor Robson); QC Immunochemistry (Xiaowen Chin); QC Immunology (Melissa Damino); QC Immunology Team Leader (Anna Gruszka); QC Immunology Team Leader (Corina Zahra); QC Immunology Team Leader (Grace Huynh); QC Immunology Team Leader (Michelle Reckerman); QC Labs (Dawn Nagel); QC Micro Manager (Rita Simopoulos); QC Micro Team Leader (Dan Balod); QC Micro Team Leader (Prue Shanahan); QC Microbiology (Denise Vella); QC Microbiology (Georgia Ieronymakis); QC Microbiology (Maria
Moeller); QC Microbiology (Nicola McDonald); QC Microbiology and Sterility Assurance (Dozie Okafor); QC PNS + Other Non-IVV Prod (Dan Balod); QC Projects (Hannah Kay); QC Projects (Stephen Pearson (Inherited)); QC Sample Logistics (Billy Patel); QC Stability (Victoria Mason (On Leave)); QC Stability (Victoria Mason); QC Stability (Victoria Wilson (On Leave)); QC Stability (Victoria Wilson); QC Stability Coordination (Jonathan Whitehead); QC Support (Andrea Prendergast); QC Support (Jennifer Chung); QC Support (Lucero Perdomo Cruz); QC Support (Philip Elliott (Inherited)); QC Support (Stephen Pearson); QC Support (Sushil Deswal); QC Support Systems (Jenny Higgins); QC Validation (Hayley Mackin); QC Validation (Jeff Hancock (Inherited)); QC-Microbiology (Anja Djordjevich); QC-Microbiology (Kah Wen Lee); QC-Microbiology (Tahlor Robson); QC/QA (Alex Hargreaves); QC/QA (Natalie Steele); QCP BRR (Verena Specowius); QCP QC Support & PTC/QCP, QFP (Mirko Altenkämper); QFP Filling AQL (Lina Matschke); QFP Filling H69 QoF (Christoph Kalfack); QFP Filling H69 QoF (Ingo Kischka); QFP Filling M305 ABW (Sandra Benthin); QFP Filling M305 BRR/CC (Verena Specowius); QFP Filling M305 QoF (Stefan Paul); QGP Quality Coagulation (Jürgen Keitel); QM Production (Monika Christen); QM Production (Monika Krebs); QM Qualification & Validation (Bettina Vögerl); QM Qualification (Günter Fehlberg-Sternemann); QM Validation (Mickael Boegli); QMB ES (Samuel Mann); QMB Operations (Jonathan Imhof); QO / Aseptic (Michelle Hogg); QP/QA Product Release (Jocelyn Bryson); QPPV (Anna Rozmyslowicz); QPPV (Giovanni Furlan); QSP Quality Supply Chain & Packaging Op.
(Sybille Bertram); QTH Quality Teilfertigung H69 (Guido Kagemann); QTM Quality Teilfertigung M305 (Murat Dalar (Inherited)); QTM Quality Teilfertigung M305 (Wolfgang List); QWI Inhibitors, Fibrinogen+Vaccines (Antonia Preidel); QoF Endfertigung (Christoph Croon); QoF Endfertigung (Jeanette Ludwig); Qualification (Angela Hamrock-Fox (Inherited)); Qualification (Angela Hamrock-Fox); Qualification (Annabel Wang); Qualification (Bozana Dujak); Qualification (Chris Richter); Qualification (Ilija Najdovski); Qualification (Jonathan Nixon); Qualification (Judith Kennedy); Qualification (Judith Youd); Qualification (Lorraine Murphy); Qualification (My Linh Ly); Qualification (Peter Carver); Qualification (Purush Devanathan); Qualification (Rainer Kraus); Qualification (Rolf Ingold (Inherited)); Qualification (Selda Yildiz Kaya); Qualification (Susan Clough); Qualification - Systems (Susan Clough); Qualification - systems (Darren Geary); Qualification System (Michael Kocher); Qualification System (Nadine Aeschbacher); Qualification System (Nadine Jost); Qualifizierung I Schwerpunkt Bulk (Michael Dospil); Qualifizierung II Schwerpunkt Teilfertigung (Michael Kuhn); Qualifizierung III Schwerpunkt Automatisierung (Lionel Guthneck); Qualifizierung IV Schwerpunkt Re-Qualifizierung (Ingo Kischka); Qualifizierung IV Schwerpunkt Re-Qualifizierung (Rainer Kutsch); Qualifizierung Lengnau (Thomas Cragnolini); Quality & Business Services (Karen Etchberger); Quality & Compliance UK (Jonathan Sheard); Quality & Med Svcs (Francesc Pont); Quality & Safety Management R&D (Dominik Blaser); Quality (Craig Stephens (Inherited)); Quality (David Atkinson); Quality (Ernest Shepard); Quality (Helmut Robert Euler); Quality (Jeffrey A Alcorn (Inherited)); Quality (Jill Allen); Quality (Jose Gonzalez (Inherited)); Quality (Kimberly E Lorenz); Quality (Mark Dickson); Quality (Matthew Donegan); Quality (Michelle Kelley); Quality (Robin A Mroz); Quality (Scott Overton); Quality (Vasilis Mavrogenis); Quality (Wei Wei); Quality (Yun Zhao (Inherited)); Quality (Yun Zhao); Quality 1 (David Atkinson); Quality Applications (Jason VanGils); Quality Assurance & Systems (Kelley L Hyatt); Quality Assurance (Anuja Prabhutendolkar); Quality Assurance (Connie Stewart); Quality Assurance (Ryo Ohnishi (Inherited)); Quality Assurance (Sanae Uchida (Inherited)); Quality Assurance Division (Ryo Ohnishi); Quality Assurance I (Connie Stewart); Quality Assurance II (Terry L Fritz); Quality Assurance Operations (Ivo Lakomy); Quality Assurance Projects Compliance (Craig Stephens (Inherited)); Quality Assurance Projects Compliance (Eoin Hanley); Quality Assurance System Group (Ryo Ohnishi (Inherited)); Quality Assurance System Group (Sanae Uchida (Inherited)); Quality Assurance Systems (Markus Schriewer); Quality Assurance, HS (Jonathan Kegerise); Quality Assurance, LVP (Gillian McAdam); Quality Assurance, PKV (Fiona Smith); Quality Assurance, PKV (Karen Netherton (Inherited)); Quality Bulk and Release QBR (Petra Hintz-Obertreis); Quality Chemistry (Cassie Norton); Quality Compliance (Sandra F Osborne); Quality Control (Juergen Liedtke); Quality Control (Leonora Pancho); Quality Control (Manuel Selvaggio); Quality Control (QC) (Dominik Stadler); Quality Control (QC) Ops Support (Brigitte Siani); Quality Control (Rene Bruegger); Quality Control Development (Andreas Affolter); Quality Control Services (Manuel Selvaggio (Inherited)); Quality Control Specialist (Andrea Chalker (Inherited)); Quality Control Specialist (Lakmini Croner); Quality Control Specialist (Linh Vo); Quality Control Support (Pascal Hulliger); Quality Control Support (QCS) (Christoph Wyss); Quality Control Team Leader (Andrea Chalker); Quality Control Team Leader (Chris O'Meara); Quality Control, LVP (Rebecca Gannon); Quality Control, LVP (Simon Harwood); Quality Control Support QCS (Sebastian Kölzer); Quality Coordination ECI (Viviana Solange Fluxa Rojas); Quality Deviation & CAPA Management (Michael Rudolf); Quality Document Control (Michael Gough); Quality Enterprise Learning Management (Amy Love); Quality Filling H69 (Jens Huft); Quality Final Product QFP (Murat Dalar); Quality Global QA Technical Development (Monica Rose); Quality HS QA 3rd Party Manufacturing (Eric Blaesing); Quality HS QA Document Control (Aaron Ward); Quality HS QA Document Control (Cara Miller Kell); Quality HS QA Engineering & Validation (Petra Smith); Quality HS QA Fill Finish Expansion (Sarah Stearns); Quality HS QA Front Line (Laura Caldwell); Quality HS QA Front Line Days (1st Shift) (Laura Caldwell (Inherited)); Quality HS QA Front Line Days (1st Shift) (Nicholas Brown); Quality HS QA Front Line Incident Management (Dominic Greene); Quality HS QA Front Line Incident Management (Laura Caldwell (Inherited)); Quality HS QA Front Line Nights (2nd & 3rd Shift) (Karam Farhan); Quality HS QA Front Line Weekends (4th & 5th Shift) (Erminio Alesii); Quality HS QA Manufacturing (Stephenie Robertson); Quality HS QA Manufacturing Batch Release Bulk (Jennifer Deinarowicz); Quality HS QA Manufacturing Batch Release Fill Finish (Marianne Perelstein); Quality HS QA Manufacturing Batch Release (Amy Love); Quality HS QA Manufacturing Batch Release (Jonathan Kegerise (Inherited)); Quality HS QA Manufacturing Batch Release-PTC (Troy Greene Jr); Quality HS QA Manufacturing Incident Management (Dominic Greene); Quality HS QA Manufacturing Shopfloor (Brian Leising); Quality HS QA Manufacturing Shopfloor (Jonathan Kegerise (Inherited)); Quality HS QA Manufacturing Shopfloor Bulk Days (Stephaine McMillan Eads); Quality HS QA Manufacturing Shopfloor Bulk Nights (Nicholas Alexander Brown); Quality HS QA Manufacturing Shopfloor FF Days (Elliott Tatum); Quality HS QA Manufacturing Shopfloor FF Days (Joseph A Marti); Quality HS QA Manufacturing Shopfloor FF Nights (Michael Mikolajczak); Quality HS QA Supplier & Third Party Management (Aaron Ward); Quality HS QA Supplier & Third Party Management (Jessica Mercer (Inherited)); Quality HS QA Systems &
Compliance (Jessica Mercer); Quality HS QC Biochemistry (Geremy Knapp); Quality HS QC Biochemistry (Richard H Steere); Quality HS QC Chemistry (Gina Stick); Quality HS QC Chemistry (Raymond Otchere-Adjei); Quality HS QC Immunology (Geremy Knapp); Quality HS QC Logistics (Kelly Jenness); Quality HS QC Logistics (Laura Matulevich); Quality HS QC Microbiology (Liz Strickland); Quality HS QC Microbiology (Roland Jason Jacques); Quality HS QC Microbiology (Sarah Strickland); Quality HS QC Validation & Change (Jessica Loshia Gambill); Quality HS QC Virology (Geremy Knapp); Quality HS QC Virology (Geremy William Knapp); Quality HS Quality Control (Jessica Mercer); Quality HS Quality Control (Rebecca Gannon); Quality HS Quality Control (Stephen Case); Quality HS Training & Workforce Development (Jessica Mercer (Inherited)); Quality HS Training & Workforce Development (Jonathan Kegerise (Inherited)); Quality HS Validation (Amy Russell); Quality HS Validation (Brian Nunnally (Inherited)); Quality HS Validation Bulk & Warehouse (Mark Holland); Quality HS Validation Fill Finish, QC & FacOps (Amy Russell (Inherited)); Quality HS Validation Fill Finish, QC & FacOps (Amy Russell); Quality HS Validation Fill Finish, QC & FacOps (Megan Crandall); Quality HS Validation Process & Aseptic (Brian Nunnally (Inherited)); Quality HS Validation Process & Aseptic (Christopher Lee); Quality HS Validation Process & Aseptic (Matthew Franks); Quality Improvement (Marc Christeller); Quality Improvement (Sandra Soverna); Quality Italy (Annarita Cinardo); Quality Knowledge Management (Sarah S Lemons); Quality Lab (Russ Reeves); Quality Management (Adam Robb); Quality Management (Craig Stephens); Quality Management (Dina El-Emary); Quality Management (Jeffrey A Alcorn (Inherited)); Quality Management (Juergen Liedtke); Quality Management (Lorenz Rindisbacher); Quality Management (Michel Baur); Quality Management (Niklas Schier); Quality Management (Paul Martell); Quality Management (Philip Elliott); Quality Management (Reiner Laske); Quality Management (Reiner Laske, Niklas Schier); Quality Management (Susanne Jecklin); Quality Management 2 (Manuel Selvaggio); Quality Management E&S (Michael Kocher); Quality Management E&S (Rolf Ingold); Quality Management Engineering (Alexandra Rompf); Quality Management Strategy & Op Excellence (Collins Onyejese); Quality Management System (Eileen DiRita); Quality Management Systems (Justin Huss); Quality Operations (Carolyn M Koerner); Quality Operations, Liverpool (Karen Netherton); Quality R & D (Bradley Jackson); Quality R & D (Sharon Reinhard); Quality Review & Improvement Management (Uwe Dohmen); Quality Review Management & Trending (Uwe Dohmen); Quality Shared Services (Barbara Hicks); Quality Site Operations HS (Brian Nunnally); Quality Supply Chain US Distribution (Karen Marks (Inherited)); Quality Supply Chain US Distribution (Samantha Wentzell); Quality Supply Chain US Distribution (Stephanie Condi); Quality System Validations (Jeffrey Berry); Quality Systems & Compliance (Mai Viholm); Quality Systems & Compliance (William Cunningham); Quality Systems & Compliance Auditing & Inspections (Marcela Rojas); Quality Systems & Compliance Auditing & Inspections HS (Aaron Ward); Quality Systems & Compliance Auditing & Inspections LVP (William Cunningham); Quality Systems & Compliance Auditing & Inspections PKV (Marcela Rojas (Inherited)); Quality Systems & Compliance HS (Milka Smoljko (Inherited)); Quality Systems & Compliance QA IT (Anthony Pickering); Quality Systems & Compliance 
Shared Services (Sarah Lemons); Quality Systems & Compliance Shared Services EDMS (Robbie Gay); Quality Systems & Compliance Shared Services GLIMS (Helen Mihaljevic); Quality Systems & Compliance Shared Services LMS (Cara Miller Kell); Quality Systems & Compliance Supplier Management HS (Gina Stick); Quality Systems (Alan Cordero); Quality Systems (Brandi C Robinson); Quality Systems (Brandi Kennedy); Quality Systems (Karen M Cory); Quality Systems (Margaret A Clifton); Quality Systems (Michael Gough); Quality Systems (Micheal Casaus); Quality Systems (Michelle J Siegel); Quality Systems (William Cunningham (Inherited)); Quality Systems - Trackwise (Maggie Bradley); Quality Systems 1 (Kristen Gist); Quality Systems Boca (Micheal Casaus); Quality Systems I (Alan Cordero); Quality Systems II (Michelle J Siegel); Quality Systems IT (Nicole Nolan); Quality Systems IT (Tim Jones (Inherited)); Quality Systems Management I (Sigrid Streichert); Quality Systems and Standards (Sophie Chairs); Quality Systems and Standards (Vicky Lioutas); Quality Validation (Rudolf Beutler); Quality and Compliance (Harumi Ishizuka); Quality and Standards (Shinya Takagawa); Quality system (Eric Li); R&D (Russell Basser); R&D - Albumin/Immunoglobulin (Joseph Bertolini); R&D - Albumin/Immunoglobulin (Karl McCann); R&D - Albumin/Immunoglobulin (Robert Forrest); R&D - Albumin/Immunoglobulin (Vladimir Gurevich); R&D - Haemostasis (Ayse Kara); R&D - Haemostasis (Hung Pham); R&D - Haemostasis (Kathryn Scott); R&D - Haemostasis (Kelly Lo Presti); R&D - Haemostasis (Maria Panayi); R&D - Haemostasis (Norm Mancuso (Inherited)); R&D - Haemostasis (Norm Mancuso); R&D - Haemostasis (Vladimir Gurevich); R&D - Haemostasis (Yvette Citrine); R&D - Management (Germano Coppola); R&D - Technical Operations (Robert Forrest); R&D - Technical Operations Senior Scientist (FRIEDA FEHR); R&D - Technical Operations Senior Scientist (Mary Alaveras); R&D - Virology (Connie Broumis); R&D - Virology (Rachael Ross); R&D - Virology (Randel Fang (Inherited)); R&D - Virology (Randel Fang); R&D - Virology (Trudi Wentzel); R&D Bioanalytics BMW (Sue Amatayakul-Chantler); R&D Biostatistics & Data Management AUS (Vince Matassa); R&D Biostatistics & Data Management US (Hongyu Liu); R&D Biostatistics Programming (Daphne Ewing); R&D Breakthrough Technologies BMW (Germano Coppola (Inherited)); R&D Breakthrough Technologies BMW (Joseph Bertolini); R&D Breakthrough Technologies BMW (Viv Louzado); R&D Business Operations (Christian DiDio); R&D CMC & Compliance (Chaaya Ganorkar); R&D CMC & Compliance (Michele Fischer Heintz (Inherited)); R&D CMC & Compliance (Wendy Su); R&D Cell Based Influenza Vaccines (Brett Ashley Leav); R&D Cell Based Influenza Vaccines (Brett Leav); R&D Cell Based Influenza Vaccines (Deborah Molrine); R&D Clinical Business Operations (Christian DiDio); R&D Clinical Compliance & Training (Roberta Duncan (Inherited)); R&D Clinical Development (Jonathan Edelman); R&D Clinical Development, BOSS-CC (Roberta Duncan); R&D Clinical Operations (Veronica Suarez (Inherited)); R&D Clinical Operations Pandemic (Mary Smith); R&D Clinical Operations Pandemic (Mirjam van Huffelen (On Leave)); R&D Clinical Operations Pandemic (Mirjam van Huffelen); R&D Clinical Operations Seasonal (Olivia Crayne); R&D Clinical Safety & Pharmacovigilance (James Milligan); R&D Clinical Safety & Pharmacovigilance (Russell Basser); R&D Clinical Safety & Pharmacovigilance (Sylvie Tomczyk); R&D Clinical Vaccine Management & Serology
(Francesco Bedani); R&D Data Management, Coding & Standards (Renate Verbeeten-van Hoof); R&D Development Liverpool (April Sena); R&D Epidemiology (Mendel Haag); R&D Finance (Eleanor McQuisten); R&D Finance (Emma Walsh); R&D Formulation & Delivery (HUI LIU); R&D Formulation & Delivery (Hui Liu); R&D Global CMC Standards & Harmonisation (Rima Youil); R&D Global CMC and Compliance (Michele Fischer Heintz); R&D Global CMC and Compliance (Michele Heintz); R&D Global Medical Affairs (Gregg Coveney Sylvester); R&D Global Medical Affairs (Gregg Sylvester); R&D Global Strategic Labelling (Helen Cowdery); R&D Human Resources (Andrea Resch (Inherited)); R&D Human Resources (Kimberly Golden); R&D Human Resources (Paula Foord); R&D Human Resources MBR (Andrea Resch); R&D IT Solutions (John Cornelius); R&D Immunology (Gillis Otten); R&D Immunology (Gillis Robert Otten); R&D Influenza Vaccines Pandemic (Matthew Hohenboken); R&D Influenza Vaccines Seasonal (Esther Heijnen); R&D Influenza Vaccines Seasonal (Igor Smolenov); R&D Influenza Vaccines Seasonal (Jonathan Edelman (Inherited)); R&D JAPAN (Haruo Kitado); R&D Licensing (Andrea Huggins); R&D Medical Affairs, Americas (Ashesh Gandhi); R&D Medical Affairs, Americas (Ashesh J Gandhi); R&D Medical Affairs, Canada (Ashesh Gandhi (Inherited)); R&D Medical Affairs, Europe (Sankarasubramanian Rajaram); R&D Medical Affairs, Influenza (Karita Ambrose); R&D Medical Affairs, Rapivab (Daniele Gelone); R&D Medical Communications, US (Nancy Dougherty); R&D Medical Science Liaison Canada (James Mansi); R&D Microbial & Molecular Biology (Pirada Suphaphiphat); R&D Operations - Influenza (Denis Thomas); R&D Operations - Influenza (Lynda Allan); R&D PM Leadership (Nancy Fetrow); R&D PV Compliance & Excellence (Liz Pound); R&D Pharmacovigilance Operations (Jefferson Guillon); R&D Pharmacovigilance Operations (Lynn Gabb); R&D Pharmacovigilance Operations (Sylvie Tomczyk (Inherited)); R&D Pharmacovigilance and Risk Management (Maria Maddalena Lino); R&D Process Development BMW (Karl McCann); R&D Process Development BMW (Per Hansen); R&D Process Science Liverpool (Kulwinder Banger); R&D Project Management (Julie Waterbury); R&D Project Management - BRN (Michael Exner); R&D Project Management Development Projects (Nancy Fetrow); R&D Project Management Qvax, Patch, Research (Heather Davis (Inherited)); R&D Project Operations (David Leacy); R&D Protein Biochemistry (Changkeun Lee); R&D Protein Biochemistry (Yingxia Wen); R&D QA Systems (Karen Gard'ner (Inherited)); R&D QA Systems (Liz Pound); R&D QA Systems (Sarah S Lemons); R&D Quality (Karen Gard'ner); R&D Quality (Kendra Bossio); R&D Quality Management (Georgina Dimovski); R&D Quality Management (Jackie Desengano); R&D Quality Management (Jonathan Wooley); R&D Quality Management (Malcolm Tipping); R&D Quality Management (Mandy Jergovic); R&D Quality Management (Mary Nasopoulos); R&D Quality Management (Matthew Dickie); R&D Quality Management (Vicky Lioutas); R&D Quality Management Kankakee (Chris Lubben); R&D Quality Management Marburg (Ariane Korzen); R&D Quality Management Marburg (Ingo Brand); R&D Quality Marburg 1 (Rainer Kufka); R&D Regulatory Affairs (Susan Cameron-Laxton); R&D Regulatory Affairs Adjuvants (Hsü-yen Liu); R&D Regulatory Affairs Seasonal EMEA (Dalila Dolfi); R&D Regulatory Affairs US Cell-Based Products (Yael Johnson); R&D Regulatory Affairs US Pandemic (Natasha Getz); R&D Regulatory
Affairs, EMEA (Monica Pagni); R&D Regulatory Affairs-US (Debbie DeMuria); R&D Regulatory Affairs-US (Kevin Darryl White); R&D Regulatory Affairs-US (Susan Cameron-Laxton (Inherited)); R&D Regulatory Affairs-US -Cambridge (Peggy Charpie); R&D Research (Ethan Settembre); R&D Research Executive Admin-Cambridge (Jane Davis); R&D Research Strategy & Operations (Rebecca Servais); R&D Serology (Giuseppe Palladino); R&D Site Management & Monitoring US/EU (Veronica Suarez (Inherited)); R&D Statistics & Data Management (Leah Isakov); R&D TD Analytical Process Testing (Tanya Riggins Clemmer); R&D TD Analytical & Drug Product Development (Dan Speelman); R&D TD Analytical (Ying Zhang); R&D TD Analytical Biochemistry (Tanya Clemmer); R&D TD Analytical Biophysical (Jiang Qian); R&D TD Analytical Cell & Molecular (Prakash Koodathingal); R&D TD Analytical Immunoanalytics (Jesse Bodle); R&D TD Analytical Immunoanalytics (Kiki Vukanovska); R&D TD Analytical Method Development I (Bryan E Hart); R&D TD Analytical Method Development I (Bryan Hart); R&D TD Analytical Method Development II (Dan Speelman); R&D TD Analytical Process (Lan Feng); R&D TD Analytical Process Testing (Tanya Clemmer); R&D TD Analytical Separation Science (Prakash Koodathingal (Inherited)); R&D TD BPD Drug Product (Lan Feng); R&D TD BPD Product Expertise (Rochelle Bazemore); R&D TD BPD Project Management & Lab Operations (Perciliz Ahern); R&D TD BPD Purification Development (Christopher Dadd); R&D TD BPD Purification Development I (Debbie Lydiard); R&D TD BPD Upstream (Ryan Thurston); R&D TD BPD Upstream Cell Culture Development (Leslie McSweeney); R&D TD Biologics Process Design (Keith Kulowiec); R&D TD Clinical Trial Manufacturing (Keith Kulowiec (Inherited)); R&D TD Downstream Labs (Debra Lydiard); R&D TD Product Expertise (Rochelle Bazemore); R&D TD Project Manager (Lourdes Barnes); R&D TD Project Manager (Perciliz Ahern); R&D TD Purification Development (Christopher Dadd); R&D TD Purification Development HS (Christopher Dadd (Inherited)); R&D TD Purification Development HS (Debbie Lydiard); R&D TD Purification Development HS (Debra Lydiard); R&D TD Purification Development HS (Matthew Brian Smith); R&D TD Technical and Business Services (Katherine Whitley); R&D TD Technical and Business Services (Keith Kulowiec (Inherited)); R&D TD VICE Core Virology (Christine Wadey); R&D TD VICE Core Virology Holly Springs (Christopher Gully); R&D TD VICE Core Virology Holly Springs Commercial (Charles McGee); R&D TD VICE Core Virology Parkville Seed Development (Brad Dickson); R&D TD VICE Core Virology Parkville Seed Development (Lynda Allan); R&D TD VICE Molecular Virology (Catherine Agius); R&D TD VICE Molecular Virology (Chi Ong); R&D TD VICE Molecular Virology Hybridoma & Microscopy (Erin Verity); R&D TD VICE Molecular Virology Hybridoma (Kirsten Vandenberg); R&D TD VICE Molecular Virology Microscopy (Stephen Asquith); R&D TD Virology & Cell Culture (Avishek Nandi); R&D TD Virology & Cell Culture (Ryan Thurston); R&D TD Virology & Cell Culture Sub-Group II (Gwen Truong-Royce); R&D TD Virology & Cell Culture Sub-Group II (Ryan Thurston (Inherited)); R&D TD Virology & Cell Culture Sub-Group III (Leslie McSweeney); R&D TD Virology & Cell Line Sub-Group I (Christopher Gully); R&D TD Virology & Cell Line Sub-Group I (Christopher Patrick Gully); R&D TD Virology & Immunology (Steven Rockman); R&D Technical Development (Ambarish Shah); R&D Technical Development (Russell Basser (Inherited)); R&D
Technical Development (Russell Basser); R&D Technical Development (Scot Shepard); R&D Technical Development, Analytical & Drug Product Development (YING ZHANG); R&D Technical Development, Analytical & Drug Product Development (Ying Zhang); R&D Technical Development, Holly Springs (Keith Kulowiec); R&D Technical Development- Holly Springs (April Sena); R&D Technical Operations BMW (Germano Coppola (Inherited)); R&D Technical Operations BMW (Norm Mancuso); R&D Technology Transfer Marburg (Falk Weihmann); R&D Toxicology (Ethan Settembre (Inherited)); R&D Transplant TA (Laurie Lee); R&D and Capital Controlling (Stephan Ludovici); R&D eClinical Technology (John Edward Cornelius); R&D/ G&A Business Partners (Ken Lim (Inherited)); RA CMC & Compliance (Ana Moisidis); RA CMC & Compliance (Pete Campbell); RA CMC & Compliance (Sahra Zanetti); RA CMC Liverpool (Joanne Beighton); RA, China (Jeep Wang); RCB MBR Central Lab (Annette Feussner); RCB MBR Central Lab (Helene Lang); RCB MBR Central Lab (Maria Hauswald); RE Services (Dianne Leppanen); REC 1 (Marco Hofmann); REC 2 (Philipp Claar); REC 3 (Holger Lind); REC Gene Therapy (Bala Sai Sundarasetty); REI Europe (Samuel Hou); RI – Research & Innovation (Thomas Nowak); RPL PTI (Hans Raess); RSO, RQO and RA Emerging Markets (Dirk Hoheisel (Inherited)); Racine 065 (Carl L Hutton); Racine 065 ACM Area 1 (Nicole Robinson); Racine 065 ACM Area 2 (Lemina Billups); Racine 065 QA (Megan E Hoffman); Racine 065 QA (Megan Hoffman); Rainbow City 275 (Devyn Bryant); Rainbow City 275 ACM Area 1 (Sacashla Hampton); Rainbow City 275 ACM Area 2 (Ladricka Weatherspoon); Rainbow City 275 QA (Malcolm-Bryce Richbourg); Raleigh 231 (Derek Erhart (Inherited)); Raleigh 231 (Nathan Farcasin); Raleigh 231 ACM Area 1 (Joseph Jackson); Raleigh 231 ACM Area 2 (Deanna Anderson); Raleigh 231 QA (Braxton Summers); Rapid City 288 (Brendon Sato); Rapid City 288 ACM Area 1 (Brendon Sato); Rapid City 288 ACM Area 1 (Marc Sipma); Rapid City 288 ACM Area 2 (April Miller); Rapid City 288 QA (Buck Schiley); Raw Material Acceptance Chemistry (Michelle Reckerman); Raw Material Control/Monitoring (Dominic Wuest); Raw Material Control/Monitoring 2 (Katrin Becker); Reception / Alarmsystem (Claudia Pereira-Buehler); Recombinant Coagulation R&D Manufacture (Steven Honey (Inherited)); Recombinant Operations Support (Vicky Pirzas (Inherited)); Recombinant Portfolio Team LGN (OLGA SARNOWSKA); Recombinant Portfolio Team MBR (Anne-Regine Herboth); Recombinant Product Development (Anthony Stowers); Recombinant Product Development, Marburg (Richard Alldread); Recombinant Product Development, Marburg R&D Operation and Services (Christian Schlachtbauer); Recombinant Product Development, Marburg Vector Development (Holger Laux); Recombinant Product Development-Pasadena (Andreas Gille); Recombinant Technologies Marburg (Peter Schmidt); Records & Reporting (Boris Kaiser (Inherited)); Records & Reporting (Caroline Roost); Records & Reporting (Ivan Poffet); Region 9 New Center Operations & Support (Amanda L Kitchen); Regional Demand Planning Europe (Lukas Limbach); Regional HR Ops AUS (Clare McCann); Regional HR Ops AUS (Miya Chiba); Regional HR Ops Americas (Mark Hickenbottom (Inherited)); Regional HR Ops Americas (Rita Gross); Regional HR Ops EMEA (Stephan Schäufele); Regional HR Ops Europe (Stephan Schäufele); Regional Head Americas (Kristin McCarthy); Regional Head Clinical
Operations (Jacqui Cumming); Regional Head EU APAC (Mimi Ermens); Regional Innovation Operations (Carmon Kieffer); Regional Labeling (Barbara Peruche); Regional Labeling EU (Katrin Rüdiger); Regional Labeling EU-INT (Katrin Rüdiger); Regional Labeling INT (Irina Sviriaeva); Regional Labeling Lead, North America (Maricarmen Dilone-Raposo); Regional Medical Affairs Operations Manager (Andrew Stork); Regional Medical Affairs Operations Manager (Rosanda Buljubasic); Regional Procurement (Lucas Jinnette); Regional Quality Support for Eastern and Central Intercontinental Commercial Operations (Jonathan Imhof); Regional Safety Officer - ECI (Marta Puente); Regional Sales 1 (Fernando Marcos V Leony); Regional Sales 2 (Rafael Esteves); Regional Sales 3 (Claudia Bueno); Regional Sales Immunology & Respiratory (Heinrich Feischen); Regional Sales Mitte Hospital (Holger Milkereit); Regional Sales Office Berlin (Bernhard Czapla); Regional Sales Office Berlin (Claudia Bachmann); Regional Sales Office Bochum (Heinrich Feischen); Regional Sales Office Frankfurt (Holger Milkereit); Regional Sales Office Hannover (Michael Bernd Rode); Regional Sales Office Munich (Susanne Möller); Regional Sales Office Ost Hospital (Frank Buttchereit); Regional Sales Office West Hospital (Ralf Kosmol); Regional Sales Ost Immunology & Respiratory (Claudia Bachmann); Regional Study Management, Americas (Danielle Dalton (Inherited)); Regional Study Management, Americas (Ross Watson (Inherited)); Regional Study Management-Americas (Ross Watson (Inherited)); Regional Supplier Qlty - Bern (Peter Stettler); Regional Supplier Qlty - Kankakee (Elizabeth Queiro); Regulat. Coordination Russia & CIS (Vsevolod Nikolaev); Regulat.-, Quality- & Safety Coord.EEMEA (Camilla Shen (Inherited)); Regulat.-, Quality- & Safety Coord.EEMEA (Christine Danila); Regulation Intelligence, Knowledge and Training (Sara Mesiano); Regulation, Training & Knowledge Sharing (Vicky Gakias); Regulatory (Haruo Kitado (Inherited)); Regulatory (Satoshi Koike); Regulatory Affairs & Lab Operations (Jon Knowles); Regulatory Affairs (Doris Friedl); Regulatory Affairs (Jane Wang); Regulatory Affairs (Joyce P Castaneda); Regulatory Affairs (Kate Burke); Regulatory Affairs AU/NZ Dev Prod BMW (Kellie Hooley); Regulatory Affairs AU/NZ (Gosia Kupczyk); Regulatory Affairs AU/NZ (Neama Baho); Regulatory Affairs Asia (Queenie Ho); Regulatory Affairs Benelux (Patrick Reygaert (Inherited)); Regulatory Affairs Benelux (Roel Mallants); Regulatory Affairs France (Christine Roche [C]); Regulatory Affairs France (Laurence Vidal); Regulatory Affairs Greece (Penelope Terentiou); Regulatory Affairs Italy (Roberto DeBenedetto); Regulatory Affairs MEA (Haydi Ibrahim); Regulatory Affairs Mgr Global Labelling Ops (Laura Vanzan); Regulatory Affairs Nordic (Elin Wobbeking); Regulatory Affairs Nordic (Ulf Hultquist (Inherited)); Regulatory Affairs Spain (Julian Fierro); Regulatory Affairs UK (Helen Watts); Regulatory Coordination Africa & EEU (Séverine Caillet); Regulatory Coordination GLAD (Séverine Caillet); Regulatory Intelligence & Policy (Bettina Doepner); Regulatory Operations, Compliance and Business Excellence (Valeria Graffeo); Regulatory Operations, Compliance and Business Excellence - HS (Detra Bullock); Regulatory Reg.
Lead NA EP (Baldevsinh Rana (Inherited)); Release Bulk & Filling (Joachim Leiss); Release FRAKT/ALB/Rho (Christine Peter); Release IG/CYT (Ines Joachim); Reno 502 (Susan Gonzalez); Reno 502 ACM Area 1 (Dwayne Majette); Reno 502 ACM Area 2 (Lauren Clapham); Reno 502 QA (Chermaene Mathis); Reporting & Planning (Konstantin Petropoulos (Inherited)); Reporting A-IFRS & German GAAP, Taxes (Angelika Godosar); Requalification & Stability (Angela Hamrock-Fox); Requalification & Stability (Ilija Najdovski); Requalification & Stability (Judith Kennedy); Research; Research & Clinical Bioanalytics (Bradley Sedgmen); Research & Clinical Bioanalytics (Kirstee Martin); Research & Clinical Bioanalytics (Marit Lichtfuss); Research & Clinical Bioanalytics (Meaghan FitzPatrick); Research & Development (Douglas Lee); Research & Development Bern (Liane Hoefferer); Research & Development Bern (Nathan Roth); Research & Development II (Norbert Schulze); Research (Adele Barr); Research (Adrian Zuercher (Inherited)); Research (Adriana Baz Morelli); Research (Alexander Karnowski); Research (Anabel Silva); Research (Andrew Hammet); Research (Andrew Nash); Research (Anne Verhagen (Inherited)); Research (Anne Verhagen); Research (Arna Andrews); Research (Brodie Miles); Research (Catherine Owczarek); Research (Chao-guang Chen (Inherited)); Research (Chao-guang Chen); Research (Con Panousis); Research (Eugene Maraskovsky); Research (Glenn Powers); Research (Greg Bass); Research (Hadi Lioe); Research (Helen Cao); Research (Ian Campbell); Research (Ineke Muir); Research (Ingela Vikstrom); Research (Ingrid Lonnstedt); Research (JANE ARTHUR); Research (Jason Simmonds); Research (Jenny Chia (On Leave)); Research (Jenny Chia); Research (Judith Field); Research (KOLJA SCHAALE); Research (Katherine Monaghan (On Leave)); Research (Katherine Monaghan); Research (Kerstin Emmrich); Research (Kirsten Edwards); Research (Larissa Provan); Research (Lidija Turkovic); Research (Mae-Xhum Wong); Research (Marco Weinberg); Research (Mark Biondo); Research (Mark Liddament (Inherited)); Research (Mark Liddament); Research (Martin Pearse); Research (Matthias Pelzing); Research (Mhairi Maxwell); Research (Michael Wilson (Inherited)); Research (Michael Wilson); Research (Michael Yan); Research (Milica Ng (Inherited)); Research (Milica Ng); Research (Natasha Pereira); Research (Nick Wilson); Research (Peter Schmidt); Research (Pierre Scotney); Research (Pino Maccarone); Research (RAJESH GHAI); Research (Rebecca Butcher); Research (Sabine Rauth); Research (Sandro Prato); Research (Saw Yen Ow); Research (Shirley Taylor); Research (Srikanth Budnar); Research (Steven Dower (Inherited)); Research (Steven Dower); Research (Steven Lee); Research (Victor Turnbull); Research (Walid Azar); Research (Wei Hong Toh); Research 1 (Hannah Chu); Research 1 (Mihee Kim); Research Bern (Adrian Zuercher); Research Bern Platforms (Christoph Rösli); Research Bio21 (Michael Wilson); Research Data Science (Milica Ng); Research I (Wenting Zhao); Research I (Chao-guang Chen); Research II (Victor Turnbull); Research III (Mark Liddament); Research IV (Marco Weinberg); Research Innovation (Marthe D'Ombrain); Research Marburg (Thomas Weimer); Research Marburg Diverse (Thomas Weimer (Inherited)); Research Scientist - Bioinformatics (Monther Alhamdoosh); Research Therapeutic Area (Eugene Maraskovsky); Research and Clinical Bioanalytics (Allison Dyson); Research and Clinical Bioanalytics (Andreas Gille); Research and Clinical Bioanalytics (Anthony Roberts); Research and
Clinical Bioanalytics (Elena Velkoska); Research and Clinical Bioanalytics (Kirstee Martin (Inherited)); Research and Clinical Bioanalytics (Kirstee Martin); Research and Clinical Bioanalytics (Roslyn Davis); Research and Clinical Bioanalytics (Tim Green); Research and Clinical Bioanalytics 1 (Lisa Lindqvist); Research and Development, China (Zak Huang); Research and Laboratory (Andrew Isaac); Research, Therapeutic Area C&M (Bronwyn Kingwell); Resp. Apprentices Bio Lab Techn. (Wim Etter); Respiratory TA (Heike Thiele); Results & Analysis (Jonathan Matty); Results & Analysis (Kenneth Walsh); Results & Analysis I (Jonathan Matty); Results & Analysis II (Jonathan Matty); Review IG/CYT (Armin Stöcklin); Review IG/CYT (Thomas Kilchoer); Rhophylac (Andrea Stauffiger Eggli); Rhophylac Bulkmanufacturing (André Wegmueller); Rhophylac Bulkmanufacturing 2 (Reto Stucki); Rialto 507 (Robert Ellison III); Rialto 507 QA (Derek Erhart (Inherited)); Risk & Mitigation Management (Malte Krämer); Risk & Mitigation Management (Malte Krämer (Inherited)); Risk & Project Management (Uli Kiefer); Riverside 299 (Iiemmaue Morgan); Riverside 299 QA (Anne Tran); Riviera Beach 115 (Martel Carter); Riviera Beach 115 (Nakia J Harlan); Riviera Beach 115 (Nedra N Braden); Riviera Beach 115 ACM Area 1 (Ethan Johnson); Riviera Beach 115 ACM Area 2 (JASON TRUMBACH); Riviera Beach 115 QA (Bill Angelucci); Riviera Beach 115 QA (Stalmore Duncan); Rochester 232 (Kay Schwartz); Rochester 232 (Lin Macaluso); Rochester 232 ACM Area 1 (Marissa Peterson); Rochester 232 ACM Area 2 (Michelle Draper); Rochester 232 ACM Area 2 (Michelle Horan); Rochester 232 QA (K.C. McCaffery); Rochester 232 QA (Karen Weatherston); Rock Hill 130 (Damon Lehr); Rock Hill 130 (Nicole M Adams); Rock Hill 130 ACM Area 1 (Ashley Pinckney); Rock Hill 130 ACM Area 2 (Brittney Joiner); Rock Hill 130 QA (Bianca M Brunson); Rock Hill 130 QA (Damon Lehr); Rock Island 426 (Josh Buzzell); Rock Island 426 ACM Area 1 (Chandler J Johnson); Rock Island 426 ACM Area 2 (James Rathmann); Rock Island 426 QA (Jennifer D Anthony); Rockford 200 (Kristi Davis); Rockford 200 (Sherylene A Lloyd); Rockford 200 ACM Area 1 (Kristi Davis (Inherited)); Rockford 200 ACM Area 1 (Kristy Carlson); Rockford 200 ACM Area 2 (Paul Crawford); Rockford 200 QA (Amanda Sawlsville); Rome 298 (Marida L Bivens); Rome 298 ACM Area 1 (Salvador Reyes); Rome 298 ACM Area 2 (Matt Comfort); Rome 298 QA (Samantha D Beach); Rome 298 QA (Stephanie D Shah (Inherited)); Roseville 077 (Charles LaVell Jacobs); Roseville 077 (Kourtney Davis); Roseville 077 ACM Area 1 (Charles LaVell Jacobs (Inherited)); Roseville 077 ACM Area 1 (Porsche M Goldsmith); Roseville 077 ACM Area 2 (Natalie King); Roseville 077 QA (Kayla D Lindley); Roseville 077 QA (Nichole M Clay (On Leave)); Roseville 077 QA (Nichole M Clay); Routine Systems (Martene Bond); Ruide Wuhan EHS&S (Özcan Campinar); Ruide Wuhan Production (Özcan Campinar); Russia & CIS (Maria A Lituchaya); SAP Business Processes (Maike Pollaschek (Inherited)); SAP Business Processes (Maike Pollaschek); SAP Competency Center (Helen Baker); SAP Competency Center (Jonathan Turner); SAP Extended Warehouse Management (Riju Varghese); SAP Finance team (Jonathan Young); SAP Manufacturing Support Team (Manohar Venkataraman); SAP Master Data (Paul Aberson); SAP Quality / Logistics Team (Helen Baker (Inherited)); SAP Quality / Logistics Team (Matthew Gordon); SAP Service and Release (Martin
Eldred); SAP Solution Center Bern (Mourad Boulanouar); SAP System Admin (John McCorry); SI Manufacturing (Jason Vaughn); SI Manufacturing (Samuel Jackson); SI Validation (Michael Donley); SI Validation (Robert Musgrave); STA Gruppe I (Claudia Schwab); STA Gruppe I (Stefanie Grafmüller); STA Gruppe II; STA Gruppe II (Susan Blaser); STA Stability (Manuel Wohde); STA Stability 2 (Barbara Gößmann); STA Stability 3 (Gernot Kissel); STA Stability 4 (Svenja Nieba); STA Stability 5 (Milosz Krala); STA Stability 6 (Oliver Kupski); STA Stabilität, QFP (Annette Röhrenbeck); STA Stabilität, QFP (Barbara Gößmann); STA Stabilität, QFP (Christian Sinnen); STA Stabilität, QFP (Gernot Kissel); STA Stabilität, QFP (Julia Kufka); STA Stabilität, QFP (Manuel Wohde); STA Stabilität, QFP (Svenja Nieba); Safety (Alexandra Nogal); Safety (Allan Wise (Inherited)); Safety (Chris Meeks); Safety (Rolf Ingold); Safety (Steven Hull); Safety Risk Management (Max Waschbusch); Safety Risk Management (Pradeep Kumar Sahu); Safety Risk Management (Susan Welsh (Inherited)); Safety Risk Mgt (Alison Graves Jones); Safety Risk Mgt (Astrid Schneider); Safety Risk Mgt (Daphne Sawlwin); Safety Risk Mgt (Gabriele Neumann); Safety Risk Mgt (Joseph Whitten); Safety Risk Mgt (Kristy Van Dinther); Safety Risk Mgt 1.0 (Gabriele Neumann); Safety Risk Mgt 2.0 (Beate Greene); Safety Sciences (Haley Kaplowitz); Safety Systems Projects (JANET AUERBACH); Safety – EU/APAC (Jürgen Kanand); Saginaw 169 (Amy Railling); Saginaw 169 (Ashley M Jamieson (Inherited)); Saginaw 169 (LC Davis); Saginaw 169 (Latosha Y Floyd (Inherited)); Saginaw 169 ACM Area 1 (TAYLOR GOODWINE (On Leave)); Saginaw 169 ACM Area 1 (TAYLOR GOODWINE); Saginaw 169 ACM Area 2 (Scott Walker); Saginaw 169 QA (Nicole Naji); Saginaw 282 (DaWanna Smith); Saginaw 282 ACM Area 1 (Genesha Curry); Saginaw 282 ACM Area 2 (Andrea Bragg); Saginaw 282 QA (Darren Hall); Salem 221 (Cory Vierck); Salem 221 (Paige N Zafran); Salem 221 (Timothy Freeland Jr (Inherited)); Salem 221 ACM Area 1 (Brandon D Biery); Salem 221 ACM Area 2 (Edward Baye); Salem 221 QA (Rachel R Maddox); Salem 221 QA (Timothy Freeland Jr (Inherited)); Sales & Marketing (Andrea Bennett); Sales & Marketing (Joe Dempsey); Sales & Marketing (Kaye Nolan); Sales & Marketing (Kirsten Comer); Sales & Marketing (Tanja Wells); Sales & Marketing Turkey (Ahmet Can Kalenderoglu); Sales & Private Accounts & Tender Office (Franco Gatta); Sales & Private Accounts & Tender Office (Massimo Leoni (Inherited)); Sales & Private Accounts & Tender Office (Massimo Leoni); Sales (Beata Szymanska-Czyz); Sales (Jorge L Gastélum (Inherited)); Sales (Jorge Marco); Sales (Markus Wenninger); Sales (Saul Ortiz Carrillo); Sales (Virgile Grosjean); Sales - CSL Behring Taiwan (Frank Ko); Sales 2 (Claudia Sanchez); Sales 3 (Gema Gonzalez); Sales Benelux IG & CC (Marijke Maes); Sales Benelux IG & CC (Philip Vandromme); Sales Denmark / IC (Mette Toft Jacobsen); Sales Division (Izumi Yoshida); Sales Division (Toshio Nagata); Sales Division CAB Central Japan Area (Yoshifumi Umenaka); Sales Division CAB East Japan Area (Takahiro Tsuruta); Sales Division CAB West Japan Area (Akihiro Enomoto); Sales Division Critical Care & Acquired Bleeding T.A.
(Hiroyoshi Iwamoto); Sales Division HEM East Japan Area (Atsuhiko Arikata); Sales Division HEM Kansai & Chubu Area (Shinichi Kano); Sales Division HEM Shutoken Area (Takayuki Takigawa); Sales Division HEM West Japan Area (Taisuke Miyakoshi); Sales Division Hemophilia TA (Hideki Yanagihashi); Sales Division Hemophilia TA (Takayuki Ishii); Sales Division IRD Central Japan Area (Takayuki Azuma); Sales Division IRD East Japan Area (Madoka Yamamoto); Sales Division IRD Kansai & Hokuriku Area (Takahiro Miura); Sales Division IRD Shutoken Area (Hironori Fujioka); Sales Division IRD West Japan Area (Hiroki Nagayasu); Sales Division Immunology & Rare Diseases T.A. (Naoki Ikeguchi); Sales Division Kcentra Team (Tomokazu Shiroza); Sales Division SID T.A. Ig/Hematology Group (Izumi Yoshida); Sales Division SID T.A. Ig/Hematology Group (Kenichiro Yamaguchi); Sales Division Sales Planning & Wholesaler Management (Hideki Yanagihashi); Sales Division Sales Planning (Takayuki Ishii); Sales Division Sales Planning Wholesaler Management, Sales Admin (Hisako Sakoda); Sales Division Sales Planning, Sales Admin Group (Hisako Sakoda); Sales Division Tentative Team (Hiroyoshi Iwamoto); Sales Division Wholesaler Management Customer Support Team (Kyohei Yamamoto); Sales Division Wholesaler Management Distributor Team (Kyohei Yamamoto); Sales Finland (Sirpa Reimari); Sales Force Center (Flavio Di Pietro); Sales Force North (Paolo Francesco Corsi); Sales Force South West (Renato Monteleone); Sales France (Emmanuelle Massonie); Sales France (Franck Puget (Inherited)); Sales France (Karim Abassi); Sales Greece (Christos Fouskotis); Sales Hospital (Popp Gábor); Sales Immunology & Coagulation (Kadar Attila); Sales Management Hemophilia (Michael Schulz (Inherited)); Sales Management Hemophilia (Michael Schulz); Sales Norway (Kjell Anker Worren); Sales Operations and Data (Chris Meyer); Sales Spain (Toni Parés); Sales Sweden (Nicklas Wallin); Sales Team Belgium (Marijke Maes (Inherited)); Sales Team Belgium (Philip Vandromme (Inherited)); Sales Team France (Emmanuelle Massonie (Inherited)); Sales Team France (Franck Puget (Inherited)); Sales Team France (Karim Abassi (Inherited)); Sales Team Netherlands (Marijke Maes (Inherited)); Sales Team Netherlands (Philip Vandromme (Inherited)); Sales Team UK (John Liam Boyle); Sales Team UK 2 (Nicky Whiteley); Sales Training Manager (Phil Hutton); Salt Lake 627 (Brooke A Neuroth); Salt Lake City 627 (Dave Lynn); Salt Lake City 627 (Marc D Fisher (Inherited)); Salt Lake City 627 (Nate Justet); Salt Lake City 627 ACM Area 1 (Michael E Forman); Salt Lake City 627 ACM Area 2 (Andrew V Lopez); Salt Lake City 627 ACM Area 2 (Ross R Fredrickson); Salt Lake City 627 QA (Allison M Davis); Sample Logistics (Brigitte Harris); Sample Management Quality Control (Christoph Wyss); Sample logistics (Christine Beyeler); Sampling (Lachlan McDonald); San Antonio 087 (Becca Charles); San Antonio 087 (Jennifer Martinez); San Antonio 087 ACM Area 1 (Kamala Yevetta Brown); San Antonio 087 ACM Area 2 (Aaron Thornton); San Antonio 087 ACM Area 2 (Kamala Yevetta Brown); San Antonio 087 QA (Alicia D Conner); San Antonio 157 (Sara A Anderson); San Antonio 157 (Sara Saleem); San Antonio 157 ACM Area 1 (Erika Gonzalez); San Antonio 157 ACM Area 2 (Crystal N Morton-Rollins); San Antonio 157 QA (Brenton Ferguson); San Antonio 157 QA (Nakia J Harlan); San Luis 158
(Javier Luna); San Luis 158 ACM Area 1 (Cristina Silva); San Luis 158 ACM Area 2 (Paulina Pena); San Luis 158 QA (MARA TAFOYA); San Luis 158 QA (Miguel Palomera); Sanitation (Union) (Adila Zaidi); Sanitation (Union) (Michael Memenga (Inherited)); Scanton 240 (Christopher Travalik (Inherited)); Schenectady 229 (Andrew Brammer); Schenectady 229 (Melissa Moore); Schenectady 229 ACM Area 1 (Renie Ball); Schenectady 229 ACM Area 1 (Ronald Cameron); Schenectady 229 ACM Area 2 (Karena Caraballo); Schenectady 229 QA (Sandy Nicholson); Schicht EMR (Gunthard Ludwig); Schicht EMR GH (Bjrn Krieg); Schicht EMR GH (Bj√∂rn Krieg); Schicht HW (Bjrn Krieg); Schicht HW (Bj√∂rn Krieg); Schicht HW (Christian Zeman); Schichtgruppe 1 ASQ (Marcus Heinzmann); Schichtgruppe 2 ASQ (Harald Ferber); Schichtgruppe 3 ASQ (Ruben Zinnkann); Schichtgruppe 3 ASQ (Svea Bieker); Schichtgruppe 4 ASQ (Gerhard Senftner); Scientist (Theresa Qiu); Scientist (Tony Cardno); Secondary Asset Care & Reliability (William Murphy); Secondary Automation (Muhammad Khan); Secondary Engineers (Calum Courtney); Secondary Manufacturing (Tristan Betson); Secondary Manufacturing Support Technicians (Gerard Lopez); Secondary PMO (Carrie O''Keeffe); Secondary PMO (Duncan Benson); Secondary Programmes (CARRIE OKEEFFE (Inherited)); Secondary Projects QA (Karen Marks); Secondary Projects Tech Transfer (Freddie Wayne West); Secondary Projects, Liverpool (CARRIE OKEEFFE); Secondary Projects, Liverpool (Carrie O''Keeffe); Secondary, Utilities and QC Validation (Joao Silva Acioli); Secretary ES (Montserrat Rey); Security CSL Behring Australia (Sharon Carroll); Security Manager 281 (Az Raonaq); Security Manager 281 (Chris Riley); Security Manager 281 (Nicholas Moody (Inherited)); Security Operations (Ganesha Rajanaidu); Security Operations (Ram Narasimman); Sen Ass QA Batch Release (Carol Youssef); Sen Ass QA Batch Release (Chris Graves); Sen Ass QA Batch Release (Nancy Manolitsas); Sen Ass QA Batch Release (Olivia Fisher (Inherited)); Sen Ass QA Batch Release (Olivia Fisher); Sen Ass QA Batch Release (Zareena Shaik); Sen Assoc Contin/Re-Validation (Robert Alvarez); Sen Assoc Validation Operation (Michelle Botterill); Sen Assoc Validation Operation (Nick Morgan (Inherited)); Sen Assoc Validation Operation (Nick Morgan); Sen Mgr QA Batch Release (Darren Moulton); Senior Associate QA Batch Release (Joanna Davis); Senior Associate QA Batch Release (Josie Lanza); Senior Associate QA Capability (Brett Pool); Senior Associate QA Capability (Marcela Rojas (Inherited)); Senior Associate QC Support (Jo Karra); Senior Director Manufacturing (Chris Larkins); Senior Electrical Engineer (Anthony Wrzesinski (Inherited)); Senior Electrical Engineer (Nicholas Hall); Senior Electrical Engineer (Stanislaw (Stan) Hudy); Senior Electrical Engineering Manager (Claro Pellosis); Senior HR Business Partner (Devon Anderson); Senior HR Business Partner (Sharon Davoli); Senior HR Business Partner (Sonia Georgesz); Senior HR Business Partner (Sonia Pititto); Senior Manager QA Capability (Marcela Rojas); Senior Manager Validation (Dina El-Emary); Senior Manager Validation (Michelle Botterill); Senior Manager Validation (Russell Ciliento (Inherited)); Senior Manager Validation (Shane Bourne); Senior Manager, Innovation R&D (Chi Ong); Senior Process Engineer (Robert Hemaya); Senior Process Manager - Utilities (Christiaan Theron (Inherited)); Senior Process Manager - Utilities (Christiaan Theron); Senior Process Manager, Supply Chain (Helen Malliaras); Senior Process Manager, Supply 
Chain (Lachlan Cruise (Inherited)); Senior Project Manager (Anthony Wrzesinski (Inherited)); Senior Project Manager (Brian Guilly); Senior Project Manager (Raoul Gorris); Senior Regulatory Affairs Manager (Ana Moisidis); Senior Scientist (Albert Garcia Minambres); Senior Scientist (Armando Alabella); Senior Scientist (Kiki Vukanovska (On Leave)); Senior Scientist (Kiki Vukanovska); Senior Scientist (Kim Steegh); Senior Scientist (Maria Panayi); Senior Scientist (Matthew Hardy); Senior Scientist (Sachiyo Nishio); Senior Scientist (Tom Murray-Rust); Senior Scientist Biacore (Bernadine Lu); Senior Scientist EM Unit (Stephen Asquith); Separatoren (Arnd Vollmerhausen (Inherited)); Separatoren (Erkan nder); Separatoren (Erkan √ñnder); Seqirus Head of Legal, Asia (Marcus De Alwis); Seqirus Ltd (Anjana Narain); Seqirus Ltd (Gordon Naylor); Seqirus Ltd (Stephen Marlow); Seqirus Ltd II (Gordon Naylor (Inherited)); Serialisation Operations and Projects (Michel Stritt); Serialisierung (Michel Mller); Serialisierung (Michel M√ºller); Serialization & Anti-Counterfeiting (Andrew John Robinson); Serialization & Anti-Counterfeiting (Warren Comerford (Inherited)); Serology Lab (Dan Thompson); Serology Lab - 1st Shift (Undrea W Jenkins); Serology Lab - 3rd Shift (Angela C Reynolds); Serology Lab - Weekend (Undrea W Jenkins (Inherited)); Service Management (Jim Towarnicki); Servicecenter Hemophilia (Axel Hochfeld); Shared Area Engineering (Marc Herbener); Shreveport 245 (Johnnie Williams); Shreveport 245 (Marcia Schels); Shreveport 245 ACM Area 1 (Demetricia Moore); Shreveport 245 ACM Area 1 (Edgar Rodriguez); Shreveport 245 ACM Area 2 (Matt Comfort); Shreveport 245 ACM Area 2 (Rashunda Dock); Shreveport 245 QA (Kaci Miller); Single Case Management & PhV Systems (Jessica Corrall (On Leave)); Single Case Management & PhV Systems (Jessica Corrall); Single Case Mgt & Phv Systems (Liz Pound); Single Case Mgt & Sy (Sabine H√§rtel (Inherited)); Single Unit Verification 1 (Kai Wlk); Single Unit Verification 1 (Kai W√∂lk); Single Unit Verification 2 (Norbert Vollmerhausen); Single Unit Verification 3 (Karl-Heinz Stelzig); Site Administration (Deborah Lynes); Site Engineering Services (Alex Elandt [C]); Site Engineering Services (Alex St√§hli); Site Lead project Eureka (Chris Camilleri); Site Lead project Eureka (Philip Elliott (Inherited)); Site Logistics (Daniel Schmidt); Site Logistics Production (Igor Kaucher); Site Management (Ross Watson); Site PMO (Karen Mitchell); Site Security (Az Raonaq); Site Security (Matthias Gnther); Site Security (Matthias G√ºnther); Site Security Switzerland (Julien Lischer); Site Supply Chain (Dennis Finger); Site- & Project-Management (Ferdinand Marx); Site- & Project-Management (Marko Witt); Site- & Project-Management (Rainer Herbener); Smryna 123 QA (Apple Grace Swindell); Smyrna 123 (Stephen Jefferson); Smyrna 123 ACM Area 1 (Venessa Lucio); Smyrna 123 ACM Area 2 (Travis Conner); Smyrna 123 QA (Apple Grace Swindell); Snr Assoc, QA Batch Release (Abha Patel); Snr Director Quality PKV530 (Milka Smoljko); Snr Mgr, Bus Effectiveness (Collette Makdissi); Snr Reg Advisor (David Plant); Snr Scientist Flu Innov (Kirsten Vandenberg); Snr Scientist Influ.Innov 275 (Catherine Agius); Snr.Scientist Flu Pilot Facil (Brad Dickson); Solid Organ Transplant Marketing (Courtney Wilson); Solutions Team Leader (Shona Moore); Source to Pay (Andrew Croft (Inherited)); Source to Pay (STP) (Dennis Martin); Sourcing (Brigitte Kimpel-Koch [C]); Sourcing (Frank Liesner); Sourcing (Jens Knoch); South America 
Sales Ops (Jean-Claude Andr√© (Inherited)); South Korea Operations (Ji-Young Sohn); South Milwaukee 140 (Cory Toellner (Inherited)); South Milwaukee 140 (Justin N Gronbach); South Milwaukee 140 (Kevin Labriola); South Milwaukee 140 ACM Area 1 (Cassandra J Cecka); South Milwaukee 140 ACM Area 2 (Shannon T Bardega); South Milwaukee 140 QA (Amy M Gebhardt); South Portland 256 (Cory Vierck); South Portland 256 ACM Area 1 (Kendra Howard); South Portland 256 ACM Area 2 (Cameron Clement); South Portland 256 QA (Mark Anderson); Southfield 070 (Lauren Jenkins); Southfield 070 (Marida L Bivens); Southfield 070 ACM Area 1 (Lori Daratony); Southfield 070 ACM Area 2 (Linda M Walker); Southfield 070 ACM Area 3 (Tiffany A Patton); Southfield 070 QA (Tamil Pettway (On Leave)); Southfield 070 QA (Tamil Pettway); Spartanburg 179 (Darrell Brotherton); Spartanburg 179 (Jose Pineda); Spartanburg 179 ACM Area 1 (Shaqueda Cariens); Spartanburg 179 ACM Area 2 (Krysten Evans); Spartanburg 179 AMQ (Jennifer Fox); Spartanburg 179 QA (Jennifer R Fox); Spartanburg 179 QA (Jose Pineda); Spartanburg 179 QA (Vernicia Smith); Spartanburg 179 QA (Victoria McIntyre (Inherited)); Specialist Area Bus Manager775 (Lisa Stewart); Specialist Area Business Manager (Katerina Kouridakis); Specialist Area Business Manager (Natasha Hutchison (Inherited)); Specialist Area Business Manager (Steve Carroll); Specialty HAE (Debbie Bensen-Kennedy (Inherited)); Specialty HAE (Joseph Chiao); Specialty Plasma (Phyllis Bressler); Specialty Plasma (Robert P Lawler (Inherited)); Spectroscopy and Elementary Analysis (Pierre-Harald Schmidt); Spectroscopy and Elementary Analysis (Stefan Wilka); Sphinx (Klara Cela); Sphinx II (Fynn Krieger); Spokane Main 621 (Adam Allred); Spokane Main 621 (Juli McConnell); Spokane Main 621 ACM Area 1 (Maurice V R Reed); Spokane Main 621 ACM Area 2 (Janette R Williams); Spokane Main 621 QA (Andee Leigh Schock); Spokane Valley 622 (Donna L King); Spokane Valley 622 (Ryan H Rettkowski); Spokane Valley 622 ACM Area 1 (Josh Kearney); Spokane Valley 622 ACM Area 1 (Ryan H Rettkowski (Inherited)); Spokane Valley 622 ACM Area 2 (Donna L King (Inherited)); Spokane Valley 622 ACM Area 2 (Juli McConnell); Spokane Valley 622 QA (Donna L King); Spokane Valley 622 QA (Rachel R Maddox); Springdale 268 (Justin Hampton); Springdale 268 ACM Area 1 (Devona D Williams); Springdale 268 ACM Area 2 (Ellie Kordooni); Springdale 268 QA (Karina G Campos); Springfield 285 (Quawan Dhom); Springfield 285 QA (Pari Patel); Springfield 492 (Amy L Pruitt); Springfield 492 (Tyler L Robinson); Springfield 492 ACM Area 1 (Carmen Gonzalez); Springfield 492 ACM Area 1 (Peter J Gouvas); Springfield 492 ACM Area 2 (Natalie N Williams); Springfield 492 QA (Marcie B Deal); Springfield 620 (Karen Aspinwall); Springfield 620 (Karen Hebbert); Springfield 620 ACM Area 1 (Karen Hebbert (Inherited)); Springfield 620 ACM Area 1 (Lisa M Meredith); Springfield 620 ACM Area 2 (Julia A Thompson); Springfield 620 QA (Becky D Clute); Sr. Clinical Program Manager Clinical Development (Anthony Ciliberto); Sr. Clinical Program Manager ¬ñ Clinical Development (Anthony Ciliberto); Sr. Clinical Program Mgr ¬ñ Clinical Development (Anthony Ciliberto); Sr. 
Scientist S''visor EM Unit (Ross Hamilton); St Louis 107 (Jimmy Williamson Jr); St Louis 107 (Robert Karbach); St Louis 107 ACM Area 1 (Ashli N Pinson); St Louis 107 ACM Area 1 (Jimmy Williamson Jr (Inherited)); St Louis 107 ACM Area 2 (Ashli N Pinson); St Louis 107 ACM Area 2 (Sentoria D Leonard-Brown); St Louis 107 QA (Sharica Ausler); St Louis 132 (Tiffany D Thurman); St Louis 132 ACM Area 1 (Chris Haley); St Louis 132 ACM Area 2 (Kevin S Neidinger); St Louis 132 ACM Area 2 (Tiffany D Thurman (Inherited)); St Louis 132 QA (Abby Hill); St Louis 132 QA (Jacob P Phillips); St Paul 180 (Darin L Bargsten); St Paul 180 ACM Area 1 (Cody A Patton); St Paul 180 ACM Area 2 (Brenda L Steffen); St Paul 180 QA (Amanda Peroutka (On Leave)); St Paul 180 QA (Amanda Peroutka); St Paul 180 QA (Holt Peterson (Inherited)); St Paul 416 (Scott Cantrell); St Paul 416 QA (Diego A Bastidas); Stability (Anita Jansen de Salazar); Stability (Jessica Mackellin); Stability (Jessica Parletta); Stability (Michel Baur (Inherited)); Stability (Rossana Amodeo); Stability Trials and Retention Samples (Chris O''Meara); Starting Materials Testing & Release (Simone Lang); State College 262 (Daniel LoCasale); State College 262 ACM Area 1 (Justin Nolan); State College 262 ACM Area 1 (Maria Garlick); State College 262 ACM Area 2 (Hunter Millward); State College QA 262 (TARA STYERS); State Government Affairs & Eastern Reg. (Karla White); Statistics & Data Management (Wilfried Meyers); Stellv. Center Manager (Andreas Gehrich (Inherited)); Stellv. Center Manager (Andreas Gehrich); Stellv. Center Manager (Annette Pernitzsch (Inherited)); Stellv. Center Manager (Annette Pernitzsch); Stellv. Center Manager (Claudia Habenicht (Inherited)); Stellv. Center Manager (Claudia Habenicht); Stellv. Center Manager (Damaris Kieckhfer (Inherited)); Stellv. Center Manager (Damaris Kieckh√∂fer (Inherited)); Stellv. Center Manager (Heike Borchert (Inherited)); Stellv. Center Manager (Kirsten Scheibel (Inherited)); Stellv. Center Manager (Kirsten Scheibel); Stellv. Center Manager (Natascha Bock (Inherited)); Stellv. Center Manager (Natascha Tappendorf); Stellv. Center Manager (Stephani Keltsch); Stellv. Center Manager (Sven Schuhmann (Inherited)); Stellv. 
Center Manager (Sven Schuhmann); Stellvertretender Labormanager (Astrid Mather (Inherited)); Sterile Filling AlbuRx (Hai Tran); Sterile Filling AlbuRx (Jennifer Tang); Sterile Filling AlbuRx (Mason Briner (Inherited)); Sterile Filling AlbuRx (Mason Briner); Sterile Filling AlbuRx (Matthew Donegan); Sterile Filling AlbuRx (Nina Djordjevich); Sterile Filling AlbuRx (Paolo Robillos); Sterile Filtration (Jakob Locher); Sterility (Anja Djordjevich); Sterility (Denise Vella (Inherited)); Sterility (Johanna Mock); Sterility (Nicole Magno); Sterility (Sabrina Desiree Sann); Sterility Assurance Monitoring & Trending (Marika Moore); Sterility Assurance (Barbara Moser); Sterility Assurance (Boo Pit Tan); Sterility Assurance (Craig Stephens (Inherited)); Sterility Assurance (Darla Erman); Sterility Assurance (Jessica Kay); Sterility Assurance (Meena Shakaib); Sterility Assurance (Peter Major); Sterility Assurance (Richard Hughes); Sterility Assurance (Robert O''Malley); Sterility Assurance (Tyson Parker); Sterility Assurance ¬ñ Monitoring & Trending (Marika Moore); Sterling Heights 164 (Kayla J Allen); Sterling Heights 164 (Shauna Douglas); Sterling Heights 164 ACM Area 1 (Zack Hyso); Sterling Heights 164 ACM Area 2 (Shauna Douglas (Inherited)); Sterling Heights 164 ACM Area 2 (Shauna Douglas); Sterling Heights 164 QA (Elijah J Wilson); Sterling Heights 164 QA (JoJo Sobjack); Stone Mountain 119 (Antonia Geiselmayr); Stone Mountain 119 (William A Voltz); Stone Mountain 119 ACM Area 1 (Milaine Clairvil); Stone Mountain 119 ACM Area 2 (Derrick Barnes); Stone Mountain 119 QA (Marketa D Goodwin (On Leave)); Stone Mountain 119 QA (Marketa D Goodwin); Storage-Virtualization-DR (Ali Bakhtiar); Storage/Virtualization/DR (Ali Bakhtiar); Strat Project Portfolio & Op Excellence (Michael Schrder (Inherited)); Strategic Analytics & Pricing (Paul Jens); Strategic Analytics (Manish Srivastava); Strategic Expansion Projects (Robyn Elliott); Strategic Indirect Sourcing (David Pauli); Strategic Initiatives (Matt Shapiro); Strategic Initiatives ENG (Dilip I Raval); Strategic Initiatives ENG (Gene Bohn); Strategic Project Portfolio and Operational Excellence (Gil Rochat); Strategic Project Portfolio and Operational Excellence (Martin Schaeren (Inherited)); Strategic Sourcing (Benjamin Fruin); Strategic Sourcing Capex & MRO Sourcing (Jos Maldonado); Strategic Sourcing Capex & MRO Sourcing (Jos√© Maldonado); Strategic Sourcing Capex & MRO Sourcing (Paul Addis (Inherited)); Strategic Sourcing Capex/MRO MBG (Bernd Mhling); Strategic Sourcing Capex/MRO MBG (Bernd M√ºhling); Strategic Sourcing Direct (Martin Grossmann); Strategic Sourcing Direct Packaging, Devices, Containers, Closures, R&D (Benjamin Fruin); Strategy & Business Development (Alan Wills (Inherited)); Strategy & Business Development (Alan Wills); Strategy & Business Development (Andrea Douglas); Strategy & Business Development (Bev Menner); Strategy & Business Development 2 (Stephanie Read); Strategy & Innovation (Ken Lim); Studium Plus (Carmen Walldorf (Inherited)); Studium Plus (Doris Nake (Inherited)); Study File Management (Elizabeth Petersen); Study Operations (3) (William Karich); Study Operations (Christa Lewiski); Study Operations (Janis Witzleb); Study Operations (Lyndah Oswald - Okebata); Superior PTH Vorbehandlung 3 / Abfllung 3 H069 (Adam Krajewski); Superior PTH Vorbehandlung 3 / Abfllung 3 H069 (Frank Gerhard Grger); Superior PTH Vorbehandlung 3 / Abf√ºllung 3 H069 (Adam Krajewski); Superior PTH Vorbehandlung 3 / Abf√ºllung 3 H069 (Frank Gerhard 
Gr√∂ger); Superior PTH Vorbehandlung 3 / Abf√ºllung 3 H069 (Sylvia Kauf); Supervisor (Andreas Gehrich (Inherited)); Supervisor (Andreas Gehrich); Supervisor (Annette Pernitzsch (Inherited)); Supervisor (Annette Pernitzsch); Supervisor (Claudia Habenicht (Inherited)); Supervisor (Claudia Habenicht); Supervisor (Damaris Kieckhfer (Inherited)); Supervisor (Damaris Kieckh√∂fer (Inherited)); Supervisor (Heike Borchert (Inherited)); Supervisor (Kirsten Scheibel (Inherited)); Supervisor (Kirsten Scheibel); Supervisor (Natascha Bock (Inherited)); Supervisor (Natascha Tappendorf); Supervisor (Stephani Keltsch); Supervisor (Sven Schuhmann (Inherited)); Supervisor (Sven Schuhmann); Supplier Management (Bill Chambers); Supplier Management (Ivo Kreyenbuehl); Supplier Quality Management (Allen F Coleman); Supplier Quality Management (Justin K Zajc); Supplies, Liverpool (Stephen Magill [C]); Supplies, Liverpool (William Helsby); Supply & Logistics (Avril Lam); Supply & Logistics (Winnie Yau); Supply Chain (Anita Erber); Supply Chain (Boris Kaiser); Supply Chain (Rick Gibson); Supply Chain Business Process (Wolfgang Schneider); Supply Chain External Manufacturing (Stuart Summers); Supply Chain Finance (Kiran Duhra); Supply Chain Liverpool (James Monaghan); Supply Chain Maidenhead (Ian Dick); Supply Chain Management (Cameron Barrett); Supply Chain Management (Michael F Deem); Supply Chain Management (Ryoichi Imamura); Supply Chain Mgt & Operational Planning (Robert P Lawler); Supply Chain Mgt (Mischa Moeckli); Supply Chain Planning & Inventory Management (Kevin L Robards); Supply Chain Planning (Cheryll McLeod); Supply Chain Planning (David McClure); Supply Chain Planning (Ratana Lim); Supply Chain Planning (Serge Marques); Supply Chain Planning (Sharon Gough); Supply Chain Planning (Unni Nair); Supply Chain QA (Andrew Norman); Supply Chain Services (Dennis Finger); Supply Chain Services (Grant Gaddis); Supply Chain Services (Kelly L Konemann (Inherited)); Supply Chain Services (Kelly L Konemann (On Leave)); Supply Chain Services (Kelly L Konemann); Supply Chain Services (Maike Pollaschek); Supply Chain Services (Tamara Huber); Supply Chain Systems (Sean Flannery); Supply Chain, PKV (Lachlan Cruise); Support & Hygiene Produktion (Monika Krebs); Support & Nebenanlagen (Florian Damm); Support (Arnd Vollmerhausen (Inherited)); Support (Benjamin Grn); Support (Benjamin Gr√ºn); Support (Bernd Zimmermann); Support (Heiko Jucknat); Support und Admin Medical Department (Martina Witzer); Sustain and Improve PTI Americas (Austin Newsom); Syracuse 196 (SILVIO VONA); Syracuse 196 ACM Area 1 (Kristina Deonarine); Syracuse 196 ACM Area 2 (Timothy Ray); Syracuse 196 QA (Matthew McHale); System Support (Magan Lai); System Validation and Implementation (Marquita Moore); TA Coag, Critical Care & Cardiovascular (Susan Welsh (Inherited)); TA Coagulation & Acquired Bleeding, Global Clinical R&D (Andres Brainsky); TA Development PM Group (Joanne Uhl); TA Immunology (Susan Welsh (Inherited)); TA Support (Anjani Advani); TDD (Technical Development & Documentation) (Patrick Gregory); TEC-Testentwicklung Chemie (Kerstin Nske); TEC-Testentwicklung Chemie (Partho Halder); TRICC (William Mezzanotte (Inherited)); TRICC - Therapeutic Area II (Marc Uknis); TRICC II (Mikhail Rojavin); TRICC II (Orell Mielke); TRICC III (Iris Jacobs); TRICC III (Maria Gasior); TRICC Therapeutic Area (Mittie Doyle); Talent Acquisition (Daphne Wong); Talent Acquisition (Ivan Dokoza); Talent Acquisition (James Telfer (Inherited)); Talent Acquisition (Priya 
Dinkar); Talent Acquisition - APAC (James Telfer); Talent Acquisition - APAC (Lisa Edwards); Talent Acquisition - Americas (Andrew Lewis); Talent Acquisition - EMEA (Elena Kharlamova); Talent Acquisition - Europe (Peggy Klein); Talent Acquisition - Plasma (Tracey Lambalot); Talent Acquisition - Plasma (Tracey Lambalot) (Tracey Lambalot); Talent Acquisition AUS (Angela Bellenger); Talent Acquisition and Talent Management (Beth Thomas); Talent Development (APAC) (Kathy Sacca); Talent Development (Eveline Wuethrich); Talent Development Apprenticeship (Anja K√§ser); Talent Development North America (Ll''Rae Robinson); Talent Management & Acquisition (Brian Fehrer); Talent Management & Acquisition (Elizabeth Walker (Inherited)); Talent Management AU (Raechel Gray); Talent Programs & Analytics (Brian Fehrer (Inherited)); Talent Programs & Analytics (Mary Schnackenberg); Talent Programs & Analytics (Sarah Peacey); Tallahassee 211 (Andria Logan); Tallahassee 211 QA (Lori Carlson (Inherited)); Tallahassee 211 (Andria Logan); Tallahassee 211 ACM Area 1 (Andria Logan (Inherited)); Tallahassee 211 ACM Area 1 (Brooklyn Williams (On Leave)); Tallahassee 211 ACM Area 2 (Brenda B Williams); Tallahassee 211 ACM Area 2 (Michelle Davenport); Tallahassee 211 QA (Lori Carlson (Inherited)); Tallahassee 211 QA (Mechelle Robinson); Tallahassee 211 QA (Mychal A Reynolds); Tampa 109 (Elizabeth Lam); Tampa 109 (Michelle K Natalie); Tampa 109 ACM Area 1 (Leah J Davis); Tampa 109 ACM Area 2 (Amber S Goodwine); Tampa 109 ACM Area 2 (Carolyna Perez); Tampa 109 QA (Joseph Rivera (On Leave)); Tampa 109 QA (Joseph Rivera); Tampa 109 QA (Michelle K Natalie); Tax (James Smith); Tax Compliance (Mark Murtaugh); Taylor 240 (Joe Korea); Taylor 240 ACM Area 1 (Joe Korea (Inherited)); Taylor 240 ACM Area 1 (Nicki Nguyen); Taylor 240 ACM Area 2 (Dion Dippel); Taylor 240 ACM Area 2 (Joe Korea (Inherited)); Taylor 240 QA (Wendy MacConnell); Team 1 (Christian Schubert); Team 1 (Jrg Dennis Issel); Team 1 (J√∂rg Dennis Issel); Team 1 (Michael Welsch (Inherited)); Team 1 (Veronika Chernov); Team 10 Verpackung (Petra Eversberg); Team 10 Verpackung (Petra Sch√§fer (On Leave)); Team 10 Verpackung (Petra Sch√§fer); Team 10 Verpackung (Rosemarie Rdding); Team 2 (Aytac Akin); Team 2 (Michael Welsch (Inherited)); Team 2 (Silke Oppermann); Team 3 (Michael Welsch (Inherited)); Team 3 (Thomas Grhning); Team 3 (Thomas Gr√§hning); Team 3 (Waldemar Kliwer); Team 4 (Erwin Gordzielik); Team 4 (Michael Welsch (Inherited)); Team 5 (Ludwig Heckmann); Team 5 (Michael Welsch); Team 6 (Karl-Hermann Sprenger); Team 7 (Pavlina Weninger); Team 7 (Thomas Fieber); Team 8 (Andreas Rastschewski); Team 8 (Mara Saglam); Team 8 (Melvin Scruggs); Team 9 (Eugen Rogosin); Team 9 (Igor Kaucher); Team Buffer Preparation (Dirk Michel); Team DSP I (Heiko Jucknat); Team HVAC (Michael Hillmann); Team Kalibrierung (Thomas Kniepper); Team Leader - Imp & Compl (Kathy Theodorakis); Team Leader - AFF/ZN 444 (Chas Chalker); Team Leader - AFF/ZN 444 (Remon Hemaya); Team Leader - DS 444 (Hieu Tran); Team Leader - Imp & Compl (Kathy Theodorakis); Team Leader - Inac 444 (Margarita Mejia); Team Leader - Packaging - 451 (Anthony Lane); Team Leader - Packaging - 451 (Anthony Lane); Team Leader - Plnt & Srv 444 (Darren McKean); Team Leader - QC Microbiology (Kerry Lincoln); Team Leader - Sterility Assurance (Jon Wong); Team Leader - Validation (Kylie Prendergast); Team Leader Animal Services (Anne Hageman); Team Leader Change Mgmt - Prod (Marcus O''Dwyer); Team Leader Change Mgmt - Prod 
(Paul Williams); Team Leader Formulation B 454 (David Moulsdale); Team Leader I PHAD I (Tobias Heck); Team Leader II PHAD I (Patric Sallin); Team Leader Prod Support - DS (Jeffrey Gan); Team Leader Prod Support - DS (Jeffrey Spicer); Team Leader Prod Support - MD (Jeffrey Gan); Team Leader Prod Support - MD (Stuart Jones); Team Leader Production Support (Denise Bertram); Team Leader Production Support (Elaine Feely (Inherited)); Team Leader Upstream-Harv 444 (Ibrahim Ozerim); Team Leader Upstream-Inoc 444 (Craig Byham); Team Mechanik (Christoph Freiling); Team Mechanik (Gerd Pendzialek); Team PBF (Thorsten May); Team PBF 1.0 (Maikel Bamberger); Team PTE (Stefan Rees); Team Purification I (Carsten Meyer (Inherited)); Team Purification I (Heiko Jucknat (On Leave)); Team Purification I (Heiko Jucknat); Team Purification II (Selcuk Ayan); Tech Dev Ops QA (Monica Rose); Tech Support Potency Testing (Julia Hainbach); Tech Support Potency Testing (Reinhard Paul); Tech Transfer (Samantha Gakias); Tech Transfer D820 (Ming Chong); Tech Transfer Projekt Management Team (Nina Walser); Technical Development (Lynda Allan); Technical Learning & Development (David P Monte); Technical Learning & Development 1 (Amy Jackson); Technical Learning & Development 2 (Ann Lescher); Technical Operations (Fuad Haddadin); Technical Operations (Michele Himmelspach (Inherited)); Technical Operations - Investigations (Tino Boss); Technical Operations - Small Scale (Janine Bash); Technical Operations I (Daniel Knack); Technical Operations I (Veronica Lopez); Technical Operations II (Becca Huebsch); Technical Operations II (Raghav Oberoi); Technical Operations III (Inga Breitwieser); Technical Operations III (Wilfried Wormsb√§cher); Technical Operations IIa (Jan Schwichtenberg); Technical Operations IV (Katrin Maria Sander); Technical Projects (Wendy Turner); Technical Services (Juerg Clavadetscher); Technical Services, Influenza Operations (Bill Cracknell); Technical Services/FM (Beat Meyer); Technikteam Labore (Stephan Lw); Technikteam Labore (Stephan L√∂w); Technischer Service (Lothar Klingelhfer); Technischer Service (Lothar Klingelh√∂fer); Technology Transfer (Jesse Richter); Teilbereichsleiter Abfllung (Stefan Peil); Teilbereichsleiter Abf√ºllung (Stefan Peil); Tempe 048 (Terry M Young); Tempe 048 ACM Area 1 (Samuel V Grijalva); Tempe 048 ACM Area 1 (Trina L Bryant); Tempe 048 ACM Area 2 (Sonya L Nigh); Tempe 048 QA (John Son); Tempe 048 QA (Melissa M Martinez); Tempe 427 (Daniel I Villegas (Inherited)); Tempe 427 (Patrick S Taylor); Tempe 427 ACM Area 1 (Kelly L Ortega); Tempe 427 ACM Area 2 (Jennifer Valenciano); Tempe 427 QA (Daniel I Villegas); Tempe 427 QA (Kellie N Buecker); Tempe 427 QA (Tiffanie Contreras); Temple 260 (Kimm Klisiewicz); Temple 260 ACM Area 1 (Moses Olukere); Temple 260 ACM Area 1 (Sarah Gaines); Temple 260 ACM Area 2 (Michael Martinez); Temple 260 QA (Cayley Eppler); Temple 260 QA (Kellie N Buecker); Temple Terrace 252 (Stephanie Frye); Temple Terrace 252 ACM Area 1 (Michelle Briseno); Temple Terrace 252 ACM Area 1 (Monica Miller); Temple Terrace 252 ACM Area 2 (Janette J Pierre); Temple Terrace 252 QA (Caitlin Shoemaker); Temple Terrace 252 QA (Joel Gallegos); Terre Haute 265 (Daniella Miller); Terre Haute 265 (Tara Goebel); Terre Haute 265 ACM Area 1 (Tara Goebel); Terre Haute 265 ACM Area 2 (Tracy Robinson); Terre Haute QA 265 (Sherri A Suttles); Testing Laboratory (Maruthi Shivananda); Therapeutic Area Clinical Ops (Bruce Wynne); Therapeutic Area Clinical Ops I&N (Ann-Marie Hulstine); 
Therapeutic Area Critical Care (Hartmut Landgrebe); Therapeutic Area Medical Evaluation (Nataliya Doliba); Therapeutic Area Medical Evaluation 1 (Nataliya Doliba); Therapeutic Area Medical Evaluation Lead (Kaniez Baig); Tokyo Yamanashi Area (Yoshifumi Umenaka); Toledo 175 (Steve Sparks); Toledo 175 ACM Area 1 (Kevin Connelly); Toledo 175 ACM Area 2 (James Carroll); Toledo 175 QA (Aarsalaan Semna); Toledo 175 QA (April Tyler); Toledo 223 (Debra Purney); Toledo 223 ACM Area 1 (Jeffery Eagle); Toledo 223 ACM Area 2 (Debra Purney); Toledo 223 ACM Area 2 (Heather Marshall); Toledo 223 QA (Christopher Travalik (Inherited)); Toledo 223 QA (Michael Craun); Toledo 223 QA (Pam Perryman); Toll IG/Alb Bulk (Ali Hashempour); Toll IG/Alb Bulk (Andrew Vasil); Toll IG/Alb Bulk (Anthony Manovella (Inherited)); Toll IG/Alb Bulk (Edward Camilleri); Toll IG/Alb Bulk (Jason Gilmour); Toll IG/Alb Bulk (Johnny Barbis); Toll IG/Alb Bulk (Jon Gummer); Toll IG/Alb Bulk (Kevin deSouza); Toll IG/Alb Bulk (Michael Appelman); Toll IG/Alb Bulk (Ricardo Morales); Toll IG/Alb Bulk (Robert Poletti); Toll IG/Alb Bulk (Rodney Vermeend); Toll IG/Alb Bulk (Shannon Thorp); Toll IG/Alb Bulk (Tom Koukouvaos); Toll Manufacturing BU Team (CLAUDIO BAEZZATO); Toll Manufacturing BU Team (Maria Gabriella Patrassi); Toll Mfg. Excipients & Intermediates (Jennifer Dolores Brenner); Toll Mfg. Excipients & Intermediates (Markus Staempfli (Inherited)); Toll Mfg. Excipients & Intermediates (Niklaus Kraehenbuehl); Toll VI and Pack (Parth Soni); Total Rewards (Figen Zaim); Tox Operations (Andrea Beyerle); Toxicology (Christopher John Peters); Toxicology Unit (Gerald Hbarth); Toxicology Unit (Gerald H√∂barth); Toxicology Unit 1 (Barbara Dietrich); Trademark (Antje Michel (Inherited)); Trademark (Janine Colesie (On Leave)); Training & Development Office (Chiho Muto); Training & Development Office (Shinji Ohkura); Training & GMP (Barbara Kalina (Inherited)); Training & GMP (Wilfried Happel (Inherited)); Training and Document Control (Lixia He ); Training and Document Control (Lixia He ?????); Transformation Change Management (Emily Riggs); Translational Biology (Alexander Schaub); Translational Biology 1 (Sandra Wymann); Translational Biology 2 (Svetlana Diditchenko); Translational Biology 2a (Alexei Navdaev); Translational Biology 3 (Anna Schnell); Translational Safety (Ashlyn Bassiri); Translational Science (Nick Wilson); Translational Science 1 (Nick Wilson); Transplant (Kevin Kovaleski); Transplant Marketing (Paula Manchester); Transplant Marketing SOT (Jeanne Andronowitz); Transplant Marketing SOT (Paula Manchester (Inherited)); Transplant Medical Affairs (Kevin Kovaleski (Inherited)); Transplant Medicine (Kevin Kovaleski); Transplant TA PM Group (Linda Cortese); Transport und Prozess Management (Andre Husse); Transport und Prozess Management (Anna-Lena Niederh√∂fer); Transportation Management (Gnter Vollmer); Transportation Management (G√ºnter Vollmer); Treasury Europe (Dieter Engstfeld); Trending (Marika Moore); Trending ¬ñ Sterility Assurance (Vijay Dundigalla); Trial & Quality Systems (Sean Storms); Tucson 071 (April Behnke); Tucson 071 (Moses Falaiye); Tucson 071 ACM Area 1 (Alma Y Olivera); Tucson 071 ACM Area 2 (Luz D Almeraz); Tucson 071 QA (Cori J Collins (Inherited)); Tucson 071 QA (Nicole A Downey); Tucson 111 (Alejandro Angulo); Tucson 111 ACM Area 1 (Alejandro Angulo (Inherited)); Tucson 111 ACM Area 1 (George Adams); Tucson 111 ACM Area 2 (Kendra N Belcher); Tucson 111 QA (Dulce A Jimenez); Tucson 111 QA (Eugene Y Shem); Tucson 
624 (Jovanna R Ortega); Tucson 624 ACM Area 1 (Sara M Portugal); Tucson 624 ACM Area 2 (Adrian Soto); Tucson 624 QA (Bernadette B Woodson); Tulsa 014 (Heather Colbert); Tulsa 014 (Jerry Ewen); Tulsa 014 ACM Area 1 (Reggie De Quiroz); Tulsa 014 ACM Area 2 (Forrest Burns); Tulsa 014 ACM Area 2 (Heather Colbert); Tulsa 014 QA (Cooper Cruson); Tulsa 014 QA (Heather Colbert); Tulsa 417 (Jerry Ewen); Tulsa 417 (Troy Lee Wheeler); Tulsa 417 ACM Area 1 (Nina Linga); Tulsa 417 ACM Area 2 (Lindsay K Jameson); Tulsa 417 QA (Cooper Cruson); Tulsa 417 QA (Hannah E Todroff); Tulsa 417 QA (Julie L Potter); Tulsa 417 QA (Troy Lee Wheeler (Inherited)); Turkey Field Sales (Filinta Cakar); Tyler 520 (Stephanie D Shah); U of M 414 ACM Area 1 (Abubeker M Osman); U of M 414 ACM Area 2 (Ahmed N Ismail); U&S Process Engineering (Rodrigo Campos); UK accounting (Lorraine Lambert); US ComOps Immunology Sales (Joseph Guinan); US Credit and Treasury (Angela Caivano); US Distribution (Daniel Krysztofiak); US Distribution (Joseph Jefferson); US Federal Tax Compliance (Giovanni Siciliano); US Finance: Capital (Andrea Burch); US Healthcare Systems (Pete Dickson); US Healthcare Systems (Richard Dudek); US Lab Quality Assurance (Alecia C Harshaw); US Marketing (Bernadine Koziara); US Med Affairs-Coagulation-Field (Julie Farley); US Med Affairs-Coagulation-Field (Katheleen Pinto); US Med Affairs-Immunology-Field (Elyse Murphy); US Med Affairs-Specialty-Field (Ayman Kafal); US Med Affairs-Specialty-Field (Ben Boccuzzi); US PLC Quality Assurance (Brian H. Frye); US PLC Quality Assurance (Carol Kralicek); US PLC Quality Assurance (Jeff Dalton Jr); US PLC Quality Assurance (Keith Winiger); US PLC Quality Assurance II (Jeff Dalton Jr); US Plasma Marketing (Keith A Haywood); US Plasma Marketing (Scott Newkirk (Inherited)); US Plasma Operations (Daniel V Ferris); US Plasma Operations Division 1 (Scott Newkirk); US Plasma Operations Division 2 (Michelle A Meyer); US Plasma Operations Division 3 (Wlenyeno J Elliott-Browne); US Plasma Operations Region 11 (Joshua D Williamson); US Plasma Operations Region 11.1 (Holt A Peterson); US Plasma Operations Region 11.1 (Holt Peterson); US Plasma Operations Region 11.2 (Aaron C White); US Plasma Operations Region 11.3 (Brandon S Bridges); US Plasma Operations Region 11.4 (Christine Thomas); US Plasma Operations Region 11.5 (Philip Nixon); US Plasma Ops Region 1 (Dianne Sorenson); US Plasma Ops Region 1.1 (Paul Warden); US Plasma Ops Region 1.2 (David M Wilson); US Plasma Ops Region 1.2 (Marc D Fisher); US Plasma Ops Region 1.3 (Cori J Collins); US Plasma Ops Region 1.4 (Daniel I Villegas); US Plasma Ops Region 1.5 (Timothy Freeland Jr); US Plasma Ops Region 10 (Carmon Kieffer); US Plasma Ops Region 10 (Michelle A Meyer (Inherited)); US Plasma Ops Region 10 (Rebecca Swingle); US Plasma Ops Region 10.1 (Bonnie M Talbott (On Leave)); US Plasma Ops Region 10.1 (Bonnie M Talbott); US Plasma Ops Region 10.1 (Christopher Travalik); US Plasma Ops Region 10.1 (Derek Erhart); US Plasma Ops Region 10.2 (Mary A Paul); US Plasma Ops Region 10.2 (Michael W Solomon); US Plasma Ops Region 10.3 (Neville L Bain); US Plasma Ops Region 10.3 (Stephanie D Shah); US Plasma Ops Region 10.4 (Brendi L Cantrell); US Plasma Ops Region 10.4 (Brett A Wintheiser); US Plasma Ops Region 10.4 (Lori Carlson); US Plasma Ops Region 10.4 (Nicole M Loncon); US Plasma Ops Region 10.5 (Melodee C Ebel); US Plasma Ops Region 11 (Joshua D Williamson); US Plasma Ops Region 12 (Brandon L Voege); US Plasma Ops Region 12.1 (Melodee C Ebel); 
US Plasma Ops Region 12.2 (Kyle M Lehrke); US Plasma Ops Region 12.3 (Alan Maldonado); US Plasma Ops Region 12.4 (Kashaun Muhammad); US Plasma Ops Region 12.4 (Tiffany D Sherman); US Plasma Ops Region 12.5 (Lori Carlson); US Plasma Ops Region 2 (Michael S Beam); US Plasma Ops Region 2.1 (Jose L Dela Garza); US Plasma Ops Region 2.1 (Vida C Chapman); US Plasma Ops Region 2.2 (Daniel Venn); US Plasma Ops Region 2.2 (Sheri Mixon); US Plasma Ops Region 2.3 (Brenda C Greenfield); US Plasma Ops Region 2.3 (Vida C Chapman); US Plasma Ops Region 2.5 (Kandra K Blodgett); US Plasma Ops Region 2.5 (Patrick Garza); US Plasma Ops Region 3 (Angela S Drumright); US Plasma Ops Region 3.1 (Latosha Y Floyd); US Plasma Ops Region 3.2 (Angela S Drumright (Inherited)); US Plasma Ops Region 3.2 (Joshua D Williamson); US Plasma Ops Region 3.2 (Lauren Jenkins); US Plasma Ops Region 3.2 (Marc D Fisher); US Plasma Ops Region 3.3 (Drewleigha B Sarver); US Plasma Ops Region 3.3 (Keith Clemons); US Plasma Ops Region 3.4 (Ashley M Jamieson); US Plasma Ops Region 3.5 (Erik Plate); US Plasma Ops Region 4 (Brannon L Brittain); US Plasma Ops Region 4.1 (Cole D Kimple); US Plasma Ops Region 4.1 (Holt A Peterson); US Plasma Ops Region 4.1 (Tina Wagenknecht); US Plasma Ops Region 4.2 (Jamie E Lawrence); US Plasma Ops Region 4.2 (Troy Lee Wheeler); US Plasma Ops Region 4.3 (Cole D Kimple); US Plasma Ops Region 4.3 (Cory Toellner); US Plasma Ops Region 4.4 (Jesus A Castillo); US Plasma Ops Region 4.5 (Jamie E Lawrence); US Plasma Ops Region 5 (Rhonda C Harp); US Plasma Ops Region 5.1 (Aaron C White); US Plasma Ops Region 5.1 (Keith Clemons); US Plasma Ops Region 5.2 (Brandon S Bridges); US Plasma Ops Region 5.2 (Patti Bailey, Prim J Cunningham); US Plasma Ops Region 5.2 (Prim J Cunningham); US Plasma Ops Region 5.3 (Nicole M Adams); US Plasma Ops Region 5.3 (Patti Bailey); US Plasma Ops Region 5.3 (Rhonda C Harp (Inherited)); US Plasma Ops Region 5.4 (John L Thixton); US Plasma Ops Region 5.5 (Michele Purvines-Honzo); US Plasma Ops Region 6 (Darrel W Simon); US Plasma Ops Region 6.1 (John E Hunt); US Plasma Ops Region 6.1 (Tiffany D Sherman); US Plasma Ops Region 6.2 (Kyle M Lehrke); US Plasma Ops Region 6.2 (Sam Schultz); US Plasma Ops Region 6.3 (Alan Maldonado); US Plasma Ops Region 6.3 (Jose L Dela Garza); US Plasma Ops Region 6.4 (John E Hunt); US Plasma Ops Region 6.4 (Sheri Mixon); US Plasma Ops Region 6.5 (Victoria McIntyre); US Plasma Ops Region 7 (Brandon L Voege); US Plasma Ops Region 7 (Brendi L Cantrell (On Leave)); US Plasma Ops Region 7 (Brendi L Cantrell); US Plasma Ops Region 7.1 (Lori Carlson); US Plasma Ops Region 7.1 (Nicole M Loncon); US Plasma Ops Region 7.1 (Stephanie D Shah); US Plasma Ops Region 7.2 (Christine Thomas); US Plasma Ops Region 7.2 (Christopher Travalik); US Plasma Ops Region 7.3 (Ron Griffin); US Plasma Ops Region 7.4 (Andrew Brammer); US Plasma Ops Region 7.4 (Brendi L Cantrell (On Leave) (Inherited)); US Plasma Ops Region 7.4 (Brett A Wintheiser); US Plasma Ops Region 7.4 (Drewleigha B Sarver); US Plasma Ops Region 7.5 (Christopher Travalik); US Plasma Ops Region 7.5 (Mary A Paul); US Plasma Ops Region 8 (Tammy S Harrison); US Plasma Ops Region 8.1 (David Ensminger); US Plasma Ops Region 8.1 (Derek Erhart); US Plasma Ops Region 8.1 (Matthew Smith); US Plasma Ops Region 8.2 (Ben Samarripas); US Plasma Ops Region 8.2 (Stephanie D Shah); US Plasma Ops Region 8.3 (Andrew Brammer); US Plasma Ops Region 8.3 (Greg McClain); US Plasma Ops Region 8.3 (Neville L Bain); US Plasma Ops Region 8.4 
(Derek Erhart); US Plasma Ops Region 8.4 (Michael W Solomon); US Plasma Ops Region 8.4 (Tammy S Harrison (Inherited)); US Plasma Ops Region 8.5 (Derek Erhart); US Plasma Ops Region 8.5 (Patrick Willingham); US Plasma Ops Region 9 (Amanda L Kitchen); US Plasma Region 2.4 (Michael S Beam (Inherited)); US Plasma Region 2.4 (Rosa E Mercado); US Regulatory Affairs (John Hill); US Regulatory Affairs (Kevin Darryl White); US Regulatory Affairs Critical Care/Cardiovascular (Angela D Azzara); US Regulatory Affairs II (Baldevsinh Rana); US Regulatory Affairs III (Paula Clark (On Leave)); US Regulatory Affairs III (Paula Clark); US Sales (Robert Murphy); US State Tax Compliance (Tulasi Veeramachaneni); US Tax Compliance (Eric Lorah); US Tax Compliance (Peter Larsen (Inherited)); USP Engineering (Patrick Rollier); USP Laboratories (Sandra Grunske); USP Manufacturing 1 (Marc Dick); USP Manufacturing 2 (Philipp Steiner); USP Process Technology (Niklas Zink); Umwelt, Plasmabetreuung und Fremdfirmenmanagement (Bjrn Wiesner); Umwelt, Plasmabetreuung und Fremdfirmenmanagement (Bj√∂rn Wiesner); University Relations (Jasmin Senior); Unpaid Diverse (Andreas Gehrich (Inherited)); Unpaid Diverse (Andreas Gehrich); Unpaid Diverse (Annette Pernitzsch (Inherited)); Unpaid Diverse (Annette Pernitzsch); Unpaid Diverse (Claudia Habenicht (Inherited)); Unpaid Diverse (Claudia Habenicht); Unpaid Diverse (Frank Bernert (Inherited)); Unpaid Diverse (Heike Borchert (Inherited)); Unpaid Diverse (Natascha Bock (Inherited)); Unpaid Diverse (Natascha Tappendorf); Unpaid Diverse (Stephani Keltsch); Unpaid Diverse (Sven Schuhmann (Inherited)); Unpaid Diverse (Sven Schuhmann); Upstream Days (Rebecca Briers); Upstream Development (Hans-Wilhelm Beltz); Upstream Development (Stefan Debus); Upstream Manufacturing (Vicky Reading); Upstream Manufacturing - Days (John Meaney); Upstream Shift A (Edward Goulding); Upstream Shift A (Mark Harrop); Upstream Shift B (Mark Harrop); Upstream Shift B (Raymond Brownless); Utilities & Services Engineering (Paul Russell); Utilities & Services Engineering (Peter White); Utilities & Services Engineering Manager (Peter White); Utilities (Kay von Burg); Utilities Critical Systems (Michael D Proctor); Utilities Lengnau (Nozar Basseri); Utilities-Critical Systems (Jeff J Parks); Utilities-Critical Systems (Jim Meils); Utilities-Motive Power (David G Mollema); VAL - Rekombinante Proteine (Kerstin Nau); VAL - Rekombinante Proteine (Verena Koch-Geller); VAL F VIII, IgG & Albumin & Inhibitors (Antje R√∂der); VAL F VIII, IgG & Albumin & Inhibitors (Marco Donges); VAL LIMS-Beauftragte (Eckhard Sch√ºler (Inherited)); VAL Lyophilisierung (Judith Mller); VAL Lyophilisierung (Judith M√ºller); VAL Media Fills & Mikrobiologie (Elke Zameitat); VAL Wissensch. Dokumentation (Eckhard Schler (Inherited)); VAL Wissensch. 
Dokumentation (Eckhard Sch√ºler (Inherited)); VAL Wundheilung & Intensive Care (Karlheinz Enssle); VAL Wundheilung & Intensive Care (Markus Hilberg); VP Operations 400 (Chris Larkins); VP, Com Operation (Lorna Meldrum); VV-Virus Validation (Wolfram Schfer); VV-Virus Validation (Wolfram Sch√§fer); VV-Virus Validation 1 (Tobias Schrder); VV-Virus Validation 1 (Tobias Schr√§der); VV-Virus Validation 2 (Michaela Gerlach); VV-Virus Validation 3 (Ramona Stau); VV-Virus Validation 3 (Ramona Stau√ü); Validation (Chad Kalia); Validation (Christian Nemeth); Validation (David Turner); Validation (Debra Fisher); Validation (Eckhard Schler); Validation (Eckhard Sch√ºler); Validation (Kah Wen Lee); Validation (Linda Garrett); Validation (Maria Arulruban); Validation (Michel Baur); Validation (Michelle Johnson); Validation (NICHOLA THOMSON); Validation (Ryan Dexter); Validation (Tiffany Korth); Validation I (Chantal Pfaffen); Validation I (Michel Baur (Inherited)); Validation I (Philipp Gersbach); Validation II (Ulrike Hartmann); Validation Process (Peter Tyler); Validation Process (Rebecca Gannon); Validation Process (Russell James Ciliento (Inherited)); Validation QA 1 (Tiffany Korth); Validation QA 2 (Debra Fisher); Validation QA 3 (Linda Garrett); Validation Qualification (Chad Kalia); Validation Specialist 3 Third Party Support (Bhavyesh Pandya); Validation and Stability (Jolyn Hu ?????); Validation and Stability (Xianfeng Guo ); Value Stream - Drug Substance (Barbara Beugger); Vancouver 102 (Larry A Barttelt); Vancouver 102 ACM Area 1 (David B Hammersley); Vancouver 102 ACM Area 2 (Clarissa Halsey); Vancouver 102 QA (Greg R Edge); Viral Vector Bioanalytics (Monica Terrao); Virology & Cell Line Up (Charles McGee); Virus Assay Development and Production (Ben Dickerman); Virus Seed MFG (Adam Kotsubka); Virus Validation (Randel Fang (Inherited)); Virus Validation (Tao Zheng); Virus Validation I (Thomas Nowak); Virus Validation II / Prion Eval. (Wolfram Sch√§fer); Virus Validation III (Bj√∂rn Keiner); Visual Control (Joanna Madafferi (Inherited)); Visual Control (Urs Pflugshaupt); Visual Inspection (Thomas Niedermann); Visual Inspection Precontrol 1 (Georges Schmid); Visual Inspection Precontrol 2 (Daniel Tobler); Visual Inspection and Packing (Claire Petitjean); Visual Inspection and Packing (Clare Schwarz); Visual Inspection semi final prod. 
1 (Marlis Erb); Visual Inspection semi final prod.Team 2 (Yvonne Seiler); Visual Inspection semi final prod.Team 3 (Vesile Ciloglu); Visuelle Kontrolle 4 (Christina Vidal-Martinez); Visuelle Kontrolle 4 (Jrg Nickel (Inherited)); Visuelle Kontrolle 4 (J√∂rg Nickel (Inherited)); Vorbehandlung/Brdelung (Michael Welsch); Vorbehandlung/B√∂rdelung (Michael Welsch); WHO & EU Pandemic Vaccines (Ylenia Runci); Waco 084 (Katherine Blount); Waco 084 (Michael Pate Jr); Waco 084 ACM Area 1 (Janet E Jenkins); Waco 084 ACM Area 2 (Rachel I Ramirez); Waco 084 ACM Area 2 (Sharon A Smith); Waco 084 QA (Katherine Blount); Waco 084 QA (Vanessa E Tinsley (On Leave)); Waco 084 QA (Vanessa E Tinsley); Warehouse & Logistics Lengnau (Philipp Kaeser); Warehouse & Transportation (David Dunn); Warehouse & Transportation (Klaus M√ºller); Warehouse (Belinda Thomson); Warehouse (Sam Mekhael); Warehouse (Serge Marques); Warehouse (Uwe Klappstein); Warehouse II (Pavel Miller (On Leave)); Warehouse II (Pavel Miller); Warehouse Operations (Ritchii Lam (Inherited)); Warehouse Operations (Ritchii Lam); Warehouse Supervisor VIC 266 (John Turone (Inherited)); Warehouse Supervisor VIC 266 (Russell Monro); Warehousing (Brian Runner); Warehousing (Jesse Higgins); Warehousing (Noel Burash); Warehousing (Thomas Ryser); Warehousing (Union) (Brian Runner); Warehousing (Union) (Caitlyn Vidas); Warehousing (Union) (Jesse Higgins (Inherited)); Warehousing (Union) (Robin Anderson); Warehousing 1 (Brian Runner); Warehousing GBZ (Walter Kiener); Warehousing Non-Union (Brian Runner (Inherited)); Warehousing Non-Union (Robin Anderson); Warehousing U8 (Martin Hirschi); Warehousing U8 (Matthias Loosli); Warehousing U8 (Rafael Gasser); Warehousing U8 (Thomas Ryser (Inherited)); Warehousing W10 (Patrick Portmann); Warehousing W10 (Thomas Ryser); Warner Robins 509 (Denise Bloodsaw); Warner Robins 509 ACM Area 1 (Bernard Postell); Warner Robins 509 ACM Area 2 (Ta''Neshia Magby); Warner Robins 509 QA (Marilyn Walker); Warner Robins 509 QA (Mary A Paul (Inherited)); Warren 204 (Kimberly Schick); Warren 204 (Kimberly Wrenn); Warren 204 ACM Area 1 (Stephanie M Newland); Warren 204 ACM Area 2 (Daniel Rattay); Warren 204 QA (Jefferson Williams); Warren 204 QA (John Ziegler); Warren 204 QA (Samantha Rouzzo); Warwick 201 (Linda Monteiro); Warwick 201 (Matt Schramm); Warwick 201 ACM Area 1 (Mariela Myers); Warwick 201 ACM Area 2 (Husseim Gomez); Warwick 201 QA (Catherine Colucci); Warwick 201 QA (John L Thixton (Inherited)); Warwick 201 QA (Tessa Grassette); Water Testing (Heike Gocht); Water Testing (Partho Halder); Water Testing (Stefan Wilka); Waters-LAL (J Noel David); Waters/LAL (J Noel David); Weighmaster (Non-Union) (Jeff Keller); Weighmaster (Union) (Jeff Keller); Weighmaster (Union) (Jeffrey Keller); Weslaco 184 (Juan M Ramirez); Weslaco 184 ACM Area 1 (Antonio E Juarez); Weslaco 184 ACM Area 2 (Jesus R Hernandez II); Weslaco 184 QA (Ana Phlipot (On Leave)); Weslaco 184 QA (Ana Phlipot); West Lafayette 411 (Travis F Dill); West Lafayette 411 ACM Area 1 (Marc Baldwin); West Lafayette 411 ACM Area 2 (Alex Visser); West Lafayette 411 QA (Sheryl A Pope); West Specialty Regional Sales (STEPHANIE BESLER); West Specialty Regional Sales (Stephanie Besler); Westland 226 (Corey M Schimming); Westland 226 (Remie T Ray); Westland 226 ACM Area 1 (Kelsie Cremeans); Westland 226 ACM Area 2 (Kirk P Alford II); Westland 226 QA (David Zagorowski); Westwego 153 (Jacqulynn Shankle); Westwego 153 ACM Area 1 (Jacqulynn Shankle); Westwego 153 ACM Area 1 (Nadia Y Grisby); 
Westwego 153 ACM Area 2 (Regena D Young); Westwego 153 QA (Amanda N Webre); Westwego 153 QA (Brandi N Clark (On Leave)); Westwego 153 QA (Jacqulynn Shankle); Westwego 153 QA (Joshua D Harper); Wholesaler Management (Hideki Yanagihashi ??? ?? - ????? ????); Wichita 263 (Laurie E Boothe); Wichita 263 ACM Area 1 (Sierra Lashbrook); Wichita 263 ACM Area 2 (Mandi Harris); Wichita 263 QA (Cameo Donerson); Wichita 415 (Junior Navarro); Wichita 415 (Sam P Emrich); Wichita 415 ACM Area 1 (Michelle B Duong); Wichita 415 ACM Area 2 (Joel Sutherland); Wichita 415 QA (Erin Shaver); Wichita 415 QA (Laurie E Boothe); Wichita 415 QA (Troy Lee Wheeler (Inherited)); Wilkes Barre 286 (Joseph Frackowiak); Wilkes Barre 286 ACM Area 1 (Cathy Gurzynski); Wilkes Barre 286 ACM Area 2 (Joseph Frackowiak (Inherited)); Wilkes Barre 286 ACM Area 2 (Renee Collins); Wilkes Barre 286 QA (Robin Williams); Willoughby Hills 222 (Frances Campbell (On Leave)); Willoughby Hills 222 (Frances Campbell); Willoughby Hills 222 ACM Area 1 (Amanda Fitzpatrick); Willoughby Hills 222 ACM Area 2 (Breana Brown); Willoughby Hills QA 222 (Bailee E White); Wilmington 228 (Alex Liang); Wilmington 228 (Jack Ellison); Wilmington 228 (John E Hunt (Inherited)); Wilmington 228 ACM Area 1 (Kenneth A Keitt Jr); Wilmington 228 ACM Area 2 (Wendy Dettloff); Wilmington 228 QA (Ben Ward); Wilmington 228 QA (Sam Whitehead); Wilton Manors 073 (Alan Maldonado (Inherited)); Wilton Manors 073 (Benjamin J Morris); Wilton Manors 073 (Michelle S DeCambre); Wilton Manors 073 (Nakia J Harlan); Wilton Manors 073 ACM Area 1 (Darcia Culmer); Wilton Manors 073 ACM Area 2 (Kurt S Tuckett); Wilton Manors 073 ACM Area 2 (Soo-Lin Chang); Wilton Manors 073 ACM Area 3 (Benjamin J Morris); Wilton Manors 073 ACM Area 3 (Nakia J Harlan (Inherited)); Wilton Manors 073 QA (Glenny Arvelaez); Wilton Manors 073 QA (Ryann Chapman); Winston-Salem 124 (Daniel Miclausi); Winston-Salem 124 (Javier Castillo); Winston-Salem 124 ACM Area 1 (Malcolm Childress); Winston-Salem 124 ACM Area 2 (Amanda Jarvis); Winston-Salem 124 ACM Area 2 (Maria Lopez); Winston-Salem 124 QA (Amanda Jarvis); Winston-Salem 124 QA (Beth Majewski); Winston-Salem 124 QA (Mario E Montoya); Wintel (Jason Christides); Witchita 263 (Laurie E Boothe); Witchita 263 QA (Cameo Donerson); Woodend Senior Operator (Brett Walker); Woodend Senior Operator (Lauren Redman); Woonsocket 295 (Catherine Colucci); Woonsocket 295 ACM Area 1 (Jonathan Chenot); Woonsocket 295 ACM Area 2 (Ashley Brown); Woonsocket 295 QA (Michaela Perry); Works Council CSL Behring GmbH (Bernd R√∂√üer); Works Council CSL Behring GmbH (Michael Schrder (Inherited)); Works Council CSL Behring GmbH (Michael Schr√∂der); Works Council CSL Behring GmbH (Reiner D√∂nges); Works Council Chairman (Reiner D√∂nges); Works Councils (Reiner D√∂nges (Inherited)); Wuhan Accounting (Daisy Yang ); Wuhan Accounting (Daisy Yang ????); Wuhan Accounting Finance (Amy Jin ????); Wuhan Accounting Finance (Janet Jin ); Wuhan Accounting Finance (Janet Jin ????); Wuhan Administration Management (CW) (Cris Wang ????? 
(Inherited)); Wuhan Administration Management (Cris Wang ); Wuhan Administration Management (Cris Wang ?????); Wuhan Administrative Management and Facility Engineering (Fred Pang ?????); Wuhan Administrative Management and Facility Engineering (zcan Campinar); Wuhan Administrative Management and Facility Engineering (√ñzcan Campinar); Wuhan Admistration (Shuiping Zhang ); Wuhan Admistration (Shuiping Zhang ?????); Wuhan Bacteriological Inspection and Animal Trial (CiCi Cheng ); Wuhan Bacteriological Inspection and Animal Trial (CiCi Cheng ????); Wuhan Bioanalytical Sciences (Ming Zuo ); Wuhan Bioanalytical Sciences (Ming Zuo ????); Wuhan Bottle Washing (Weibing Chen ); Wuhan Bottle Washing (Weibing Chen ?????); Wuhan Costing and Financial Planning (Jessie Gao ); Wuhan Costing and Financial Planning (Jessie Gao ?????); Wuhan Environmental Health Safety (Ryan Mao ); Wuhan Environmental Health Safety (Ryan Mao ?????); Wuhan Equipment Maintenance (Jianming Liu ); Wuhan Equipment Maintenance (Jianming Liu ?????); Wuhan Equipment Management (Ming Cao ); Wuhan Equipment Management (Ming Cao ????); Wuhan Equipment Operations (Jun Yin ); Wuhan Equipment Operations (Jun Yin ????); Wuhan Equipment Operations and Maintenance (Rory Yang ); Wuhan Equipment Operations and Maintenance (Rory Yang ?????); Wuhan Finance (Dereck Jiang ); Wuhan Finance (Dereck Jiang ????); Wuhan Human Resources (Grace Yu ????); Wuhan Human Resources Management (Alex Yang ); Wuhan Human Resources Management (Alex Yang ?????); Wuhan Inspection (CW) (Yuemei Huang ????? (Inherited)); Wuhan Inspection (Yuemei Huang ); Wuhan Inspection (Yuemei Huang ?????); Wuhan Manufactuirng Subpackaging Line (Chenyi Guo ); Wuhan Manufactuirng Subpackaging Line (Chenyi Guo ?????); Wuhan Manufacturing Production Management (Liutao Yin ?????); Wuhan Operations (Andrew Tang); Wuhan Packaging (Min Lin ????); Wuhan Physical & Chemical Inspection (Linda Lin ); Wuhan Physical & Chemical Inspection (Linda Lin ????); Wuhan Plasma Inspection (Basin Zhao ); Wuhan Plasma Inspection (Basin Zhao ?????); Wuhan Plasma Sourcing (Lixia He (Inherited)); Wuhan Plasma Sourcing (Qin Chen ????); Wuhan Plasma Sourcing Management (CW) (Lixia He ????? 
(Inherited)); Wuhan Plasma Sourcing Management (Lixia He ); Wuhan Plasma Sourcing Management (Lixia He ?????); Wuhan Plasma Sourcing Management (Zhibao Qian ?????); Wuhan Plasma and Bacteriological Inspection (Haibo Cheng ); Wuhan Plasma and Bacteriological Inspection (Haibo Cheng ?????); Wuhan Procurement (Chan Liu ????); Wuhan Procurement (Steve Hu ); Wuhan Procurement (Steve Hu ?????); Wuhan Production (Vince Tian ?????); Wuhan Production Management (Zhi Luo ????); Wuhan Production Manufacturing (Elias Francis); Wuhan Production Manufacturing (Ye Xin ????); Wuhan Production Manufacturing (Zhen Wang ????); Wuhan Production Operations (Ye Xin ); Wuhan Production Operations (Ye Xin ????); Wuhan Protein Separation (Songping Xie ); Wuhan Protein Separation (Songping Xie ?????); Wuhan QA Deviation (Ning Ding ); Wuhan QA Deviation (Ning Ding ????); Wuhan QA System (Grace Zhao ); Wuhan QA System (Grace Zhao ????); Wuhan QA Validation (Daoxin Zhu ); Wuhan QA Validation (Daoxin Zhu ?????); Wuhan Quality (Dina El-Emary); Wuhan Quality (Xiaohong Wang ?????); Wuhan Quality Control Inspection (Caixiang Liu ?????); Wuhan Quality Control Ruide (Juergen Liedtke); Wuhan Quality Management (Juergen Liedtke); Wuhan Quality Management (Vivian Zhang ????); Wuhan Quality Systems and Standards (Xiangyang Xia ); Wuhan Quality Systems and Standards (Xiangyang Xia ?????); Wuhan Research and Development (Amory Wang ?????); Wuhan Ruide Compliance (Emma Ma ?????); Wuhan Ruide EIA (Shangqu Shi ?????); Wuhan Ruide Equipment (Zhenzhong Huang ?????); Wuhan Ruide Facilities (Didi Li ?????); Wuhan Ruide Facilities Maintenance (Dexue Hu ?????); Wuhan Ruide QA System & Compliance (Bismarck Huang ?????); Wuhan Ruide Wastewater Treatment (Yuanhui Wang ?????); Wuhan Sales (Jason Xu ????? 
(Inherited)); Wuhan Sales (Lei Huang ????); Wuhan Solution Preparation (Deqing Mei ); Wuhan Solution Preparation (Deqing Mei ?????); Wuhan Subpackaging Management (Jun Liu ); Wuhan Subpackaging Management (Jun Liu ????); Wuhan Subpackaging Operations (Xin Tian ); Wuhan Subpackaging Operations (Xin Tian ????); Wuhan Technlogy and Quality Study (Lisa Liang ); Wuhan Technlogy and Quality Study (Lisa Liang ????); Wuhan Technology Study (Shawelo Xiao ????); Wuhan Translation team (Mabel Xu ); Wuhan Translation team (Mabel Xu ????); Wuhan Ultrafiltration (Jindi Zhou ); Wuhan Ultrafiltration (Jindi Zhou ?????); Wuhan Water Preparation (Zongrong Liu ); Wuhan Water Preparation (Zongrong Liu ?????); Wyoming 173 (Joe Hicks Jr); Wyoming 173 (Stephanie Gower); Wyoming 173 ACM Area 1 (Jessica Hurlbert); Wyoming 173 ACM Area 2 (AMINA MCPHERSON); Wyoming 173 QA (Brent DeHaan); Wyoming 173 QA (Jared Kurtz); Yeadon 280 (Dominique Holland); Yeadon 280 ACM Area 1 (Therese Algeo); Yeadon 280 ACM Area 2 (TB Bailey); Yeadon 280 QA (Nikki Shaw); York 148 (Brandi Boyles); York 148 ACM Area 1 (Brandi Boyles (Inherited)); York 148 ACM Area 1 (Scottie Johnson Jr); York 148 ACM Area 2 (Stephanie Henry); York 148 QA (Caitie Golubski); York 148 QA (Greg Warren); ZPL Plasma Operations (Klaus Rolshausen (Inherited)); askHR Service – TA Support (Anna Tassone); askHR Service – TA Support (James Meyer); askHR Shared Services - Tier 1 APAC (Mina Kelepouris); eClinical Operations (Charles Johnson); eR&D Business Support (Simone Dierkes); eSystems (Christina Berninger); eSystems - LIMS Management (Christina Berninger (Inherited)); eSystems - LIMS Management (Stephan Degroth); nan; rzte Berlin (Dorothee Knop); rzte Bielefeld (Dorothee Knop); rzte Braunschweig (Dorothee Knop); rzte Bremen (Dorothee Knop); rzte Frankfurt (Dorothee Knop); rzte Gttingen (Dorothee Knop); rzte Kiel (Dorothee Knop); rzte Nrnberg (Dorothee Knop); support engineer (Deepak Cherian (Inherited)); support engineer (Jamshed Patuck); support engineer (Satya Dara (Inherited)); Ärzte (Andreas Gehrich (Inherited)); Ärzte (Annette Pernitzsch (Inherited)); Ärzte (Claudia Habenicht (Inherited)); Ärzte (Heike Borchert); Ärzte (Kirsten Scheibel (Inherited)); Ärzte (Natascha Bock (Inherited)); Ärzte (Stephani Keltsch); Ärzte (Sven Schuhmann (Inherited)); Ärzte Berlin (Dorothee Knop); Ärzte Bielefeld (Dorothee Knop); Ärzte Braunschweig (Dorothee Knop); Ärzte Bremen (Dorothee Knop); Ärzte Frankfurt (Dorothee Knop); Ärzte Göttingen (Dorothee Knop); Ärzte Kiel (Dorothee Knop); Ärzte Nürnberg (Dorothee Knop)'
inference: true
---

# SetFit with sentence-transformers/all-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) as the Sentence Transformer embedding model. A [SetFitHead](https://huggingface.co/docs/setfit/reference/main#setfit.SetFitHead) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
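As a minimal inference sketch: the repository id below is a placeholder, since this card does not name the published checkpoint, and the example inputs are organizational-unit names of the kind listed under Model Labels.

```python
from setfit import SetFitModel

# Load the trained SetFit model. The repo id is a placeholder -- substitute
# the actual model id on the Hub, or a local path to the saved model.
model = SetFitModel.from_pretrained("path/to/this-setfit-model")

# predict() embeds each string with the Sentence Transformer body and maps
# the embedding to one of the 15 classes via the SetFitHead.
preds = model.predict([
    "Sales & Marketing (Andrea Bennett)",
    "Wuhan Quality Control Inspection",
])
print(preds)
```

`predict` returns one label per input string; `predict_proba` can be used instead when per-class probabilities are needed.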
## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2)
- **Classification head:** a [SetFitHead](https://huggingface.co/docs/setfit/reference/main#setfit.SetFitHead) instance
- **Maximum Sequence Length:** 384 tokens
- **Number of Classes:** 15 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels

| Label | Examples |
|:------|:---------|
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 2 | <ul><li>'FLSA STATUS DESCR: Exempt; Non-Exempt; nan'</li><li>'Pay Rate Type: Hourly; Hourly Salary; Hourly/Salary; Salaried; Salary; nan'</li><li>'Employee Level: Executive; Exempt professional; Non-exempt professional; Supervisor/Manager; nan'</li></ul> | | 3 | <ul><li>'Is Manager<HIDE>: N; Y; nan'</li><li>'Job Level Name: Architect and Lead/Principal Individual Contributor; Architect and¬†Lead/Principal¬†Individual Contributor; Associate/Intern; Chief 
Architect/Technical Fellow; Chief Operating Officer; Director; EVP; Fellow and Chief Architect; Group President/Sr EVP; Individual Contributor; People Manager/Sr Mgr; President and CEO; SVP; Senior Individual Contributor; Senior Lead/Principal Architect; Sr EVP Chief Financial Officer; Supervisor; Vice President/Counsel/Controller; nan'</li><li>'Is Manager: 0; 1; N; No; Y; Yes; nan'</li></ul> | | 5 | <ul><li>'Function/MAG: nan'</li><li>'Functional Pipeline: Communications; Corporate & Government Affairs; Corporate Services; Data Analytics; Design; Digital; Finance; General Management; Human Resources; Legal; Logistics & Services; Manufacturing & Sourcing; Marketing; Merchandising; Product Creation; Product Management; Program/Process Excellence; Retail; Sales; Sports Marketing; Strategic Planning; Technology; Unassigned; Unknown; nan'</li><li>'JobClassDescription: ACCOUNTANTS - DEGREED; ADMINISTRATIVE SUPPORT; AIDES/ORDERLIES; CLERICAL OFFICE SUPPORT; CLINICAL SUPPORT; DRIVERS; EMPLOYED PHYSICIANS; HOME HEALTH CARE - AIDES; HOME HEALTH CARE - LVN; HOME HEALTH CARE - RN; LICENSED REGISTERED NURSES; LICENSED VOCATIONAL/PRACTICAL NURSES; MANAGERS; NON-PHYSICIAN MEDICAL PRACTITIONERS; OTHER PHYSICIANS; PHYSICIAN PRACTICE - LVN; PHYSICIAN PRACTICE - RN; Physicians (With Benefits); SUPERVISORS; SUPPORT SERVICES PATIENT CARE; TECHNICAL SUPPORT; TECHNICIANS; TECHNOLOGISTS/THERAPISTS; nan'</li></ul> | | 10 | <ul><li>'Corp State: Alabama; Arizona; California; Colorado; Connecticut; Delaware; District of Columbia; Florida; Georgia; Idaho; Illinois; Indiana; Iowa; Is ; Kansas; Kentucky; Louisiana; Maine; Maryland; Massachusetts; Michigan; Milan; Minnesota; Mississippi; Missouri; Montana; Nebraska; Nevada; New Hampshire; New Jersey; New Mexico; New York; North Carolina; Ohio; Oklahoma; Oregon; Pennsylvania; Rhode Island; South Carolina; South Dakota; Tennessee; Texas; Turin; Utah; Virginia; Washington; West Virginia; Wisconsin; nan'</li><li>"location__stateprovince: ??mm?n; Aargau; Agrigento; Aguascalientes; Aichi; Alabama; Alaska; Alberta; Alessandria; Alexandria; Aosta; Arizona; Arkansas; Auckland; Baden-Wurttemberg; Bangkok; Bari; Bavaria; Beijing; Bergamo; Bologna; Brescia; British Columbia; Buenos Aires; Busan; Cagliari; California; Canterbury; Caserta; Catania; Cebu; Central Singapore; Changhua County; Chiayi City; Ciudad de Mexico; Coahuila; Colorado; Como; Connecticut; Cortes; Delaware; District of Columbia; Distrito Federal; Dubai; Estado de M√©xico; Ferrara; Firenze; Florida; Fujian; Fukuoka; Genova; Georgia; Gifu; Graubunden; Guanajuato; Guangdong; Guatemala; Haryana; Hawaii; Hawke's Bay; Hiroshima; Ho Chi Minh; Hokkaido; Hong Kong Island; Hsinchu City; Hubei; Ibaraki; Idaho; Ilan County; Illinois; Indiana; Iowa; Ishikawa; Jakarta Raya; Jalisco; Jiangsu; Johor; Kanagawa; Kansas; Kaohsiung City; Karnataka; Kentucky; Kowloon; Lecce; Liaoning; Livorno; Louisiana; Maharashtra; Maine; Managua; Manitoba; Maryland; Massachusetts; Melaka; Messina; Miaoli County; Michigan; Milano; Minnesota; Mississippi; Missouri; Montana; Monza e Brianza; Morelos; Nagano; Napoli; Nebraska; Nevada; New Hampshire; New Jersey; New Mexico; New South Wales; New Taipei City; New Territories; New York; Newfoundland and Labrador; North Carolina; North Dakota; North Rhine-Westphalia; Nova Scotia; Novara; Nuevo Le√≥n; Ohio; Oklahoma; Ontario; Oregon; Osaka; Otago; PG_Asia_CHN_01; PG_Asia_HKG_01; PI CHE - VF International; Padova; Pahang; Panam√°; Parma; Pennsylvania; Phnom Penh; Piacenza; Pingtung County; Puebla; Puerto Rico; 
Pulau Pinang (Penang); Quebec; Quer√©taro; Quintana Roo; Rhode Island; Roma; Saitama; Salary; San Jose; San Salvador; Santiago; Saskatchewan; Selangor; Seoul; Shandong; Shanghai; Shanxi; Shizuoka; Sichuan; South Carolina; S√£o Paulo; Tabasco; Taichung City; Tainan City; Taipei City; Tamil Nadu; Taoyuan City; Tennessee; Texas; Tianjin; Ticino; Tochigi; Tokyo; Torino; Toyama; Treviso; Trieste; Utah; Varese; Venezia; Veracruz; Vermont; Verona; Vicenza; Victoria; Virginia; Washington; Wellington; West Virginia; Wilayah Persekutuan Kuala Lumpur; Wilayah Persekutuan Putrajaya; Wisconsin; Wyoming; Yucat√°n; Zhejiang; nan"</li><li>'Home State | Province | Region: Alabama; Arkansas; Bogot√° D.C.; California; Colorado; Delaware; Distrito Federal; Eastern Cape; England; Florida; Gauteng; Georgia; Illinois; Indiana; Iowa; Kentucky; KwaZulu-Natal; Maine; Mexico State; Michigan; Minnesota; Missouri; Nevada; New Hampshire; New Jersey; New York; North Carolina; Ohio; Oregon; Pennsylvania; Puerto Rico; Santander; South Carolina; Tennessee; Texas; Valle del Cauca; Virginia; Washington; Western Cape; Wisconsin; nan'</li></ul> | | 12 | <ul><li>'Tenure Category: 0 - 3 Months; 10 - 12 Months; 10 - 15 Years; 13 - 28 Months; 15 - 20 Years; 19 - 24 Months; 2 - 3 Years; 20+ Years; 3 - 5 Years; 4 - 6 Months; 5 - 10 Years; 7 - 9 Months; nan'</li><li>'Tenure with the Company: 0-3 months; 1-2 years; 11-15 years; 16-20 years; 21-25 years; 26-30 years; 3-5 years; 31 years or more; 4-6 months; 6-10 years; 7-12 months; nan'</li><li>'TENURE - Hire: 1 - 2 years; 11 - 15 years; 16 - 20 years; 3 - 5 years; 6 - 10 years; Less than 1 year; More than 20 years; nan'</li></ul> | | 6 | <ul><li>'Location (Geographic): Argentina; Australia; Brazil; Canada; Canada - Living Sounds; China; China - Beijing; China - Suzhou; China, Beijing; China, Suzhou; Colombia; Dubai; France; Germany; Hungary; India; Israel; Italy; Japan; Kenya; Korea; Malaysia; Mexico, Matamoros; Mexico, Mexico City; New Zealand; Norway; Peru; Phillipines; Poland; Prefer not to answer; Romania; Singapore; United Kingdom; United States; nan'</li><li>'Country Name: Australia; Belgium; Brazil; Canada; Colombia; Costa Rica; France; India; Ireland; Italy; Luxembourg; Mexico; Netherlands; New Zealand; Philippines; Poland; Puerto Rico; Singapore; Spain; United Kingdom; United States of America; nan'</li><li>'Operating Company: MHPS-EDE; nan'</li></ul> | | 9 | <ul><li>"HR Site Group<HIDE>: 84 SOUTH HEALTH CENTER; ACL LABS; ADVOCATE MEDICAL GROUP; AMC Bay Area; AMC GRAFTON; AMC KENOSHA; AMC MANITOWOC COUNTY; AMC OSHKOSH; AMC SUMMIT; AMC WASHINGTON COUNTY; AMG CONTACT CENTER; APP; AURORA BAYCARE MEDICAL CENTER; AURORA CLINICAL CONTACT CENTER; AURORA LAKELAND MEDICAL CENTER; AURORA MEMORIAL HOSPITAL OF BURLINGTON; AURORA PSYCH/BEHAVIORAL HEALTH; Aurora Health Care Medical Group : GBMM; Aurora Health Care Medical Group : GMNSC; Aurora Health Care Medical Group : GMS; Aurora Health Care Medical Group : OFDL; Aurora Health Care Medical Group : OTHER; Aurora Health Care Medical Group : RKL; Aurora Health Care Medical Group : SCWNWC; Aurora Health Care Medical Group : WJ; BROMENN MEDICAL CENTER/EUREKA; CHRIST MEDICAL CENTER; CONDELL MEDICAL CENTER; CORPORATE; Children's Hospital; GOOD SAMARITAN HOSPITAL; GOOD SHEPHERD HOSPITAL; ILLINOIS MASONIC MEDICAL CENTER; LUTHERAN GENERAL HOSPITAL; POST ACUTE NETWORK; SHEBOYGAN MEMORIAL; SHERMAN HOSPITAL; SINAI MEDICAL CENTER; SOUTH SUBURBAN HOSPITAL; ST. LUKE'S SOUTH SHORE; SYSTEM ANCILLARY SERVICES; SYSTEM SUPPORT SERVICES; St. 
Luke's Medical Center; TRINITY HOSPITAL; WEST ALLIS MEDICAL CENTER; nan"</li><li>'Affiliation(Affiliate): 9010 Admin; BCHC; BVRMC; Cherokee Regional Medical Center; Dubuque VNA; FORT DODGE REGION; GREENE COUNTY MEDICAL CENTER; Grundy Cnty Hospital; HCF Inc; Hancock County Senior Services; IA Health Acc Care; MERITER; Memorial Hospital; Pocahontas Community Hospital; Stewart Memorial Community Hospital; Sumner Comm Hospital; UP Clinic Affiliate; UP at Home Affiliate; UPC Peoria; UPH Allen; UPH CR St Lukes; UPH Contract Svc LC; UPH Des Moines; UPH FD Trinity Hlth; UPH FD Trinity Regnl; UPH Finley; UPH Grinnell; UPH Jones Regional; UPH Marshalltown; UPH Methodist; UPH Methodist Colleg; UPH Musc Trinity; UPH Pekin; UPH Proctor; UPH QC Trinity; UPH SC St Lukes; UPH at Work; UPH at Work QC; UnityPlace; UnityPoint Health - Keokuk; Virginia Gay Hospital; nan'</li><li>'region: ACP; CALIFORNIA; CAROLINAS; CENTRAL ZONE RM; EAST PENN/DE; EAST ZONE RM; FLORIDA; GREAT PLAINS; HANGER CLINIC SHARED SVCS; HANGER RESOURCE CENTER; HEARTLAND; HOUSTON; KEYSTONE; MICHIGAN; MID ATLANTIC; MIDWEST; NATL LABS; NEW ENGLAND; NORTH ATLANTIC; NORTHWEST; NY METRO; NY/NJ; OHIO VALLEY; ROCKY MOUNTAIN; SOUTH CENTRAL; SOUTHEAST; SOUTHWEST; SPS; TEXAS; WEST ZONE RM; nan'</li></ul> | | 4 | <ul><li>'Union Status <HIDE>: I am a member of a union; I am not a member of union; nan'</li><li>"Union Code: 122; 17; 399; 420; 781; AFSCME; AFSCME Local 3279; AFSCME Local 9; ALT; ARSA; Appointed; BDN; BKV; BLE; BMWE; BOL; BRCA; BRD; BRS; BRW; CAF; CAP; CAT; CAW; CBI; CBT; CE1; CE2; CE3; CE4; CEC; CED; CEL; CEN; CEQ; CET; CFC; CFF; CFO; CLB; CMA; CMN; CNA; CNR; COL; CPA; CPE; CPL; CPO; CPT; CSC; CSE; CSU; CTM; CTS; CVT; CX1; CX2; CX3; CX4; DBS; DVR; FNK; FRE; FRS; G01; G02; G04; G05; G06; G07; G08; G09; G10; G11; G12; G13; G14; G15; G16; G17; GCH; GGI; GGO; GGR; GVL; HTC; IAM; IBBB; IBEW; IBFO; IBT Lab Asst; IBT Lab Couriers; IBT PMCA Childrens; IBW; IDA; IUOE; IW; JOA AFSCME/SEIU HCMI; KU1; KU2; Laundry Workers Loca; Local 320; Local 321; Local 363; Local 49ers; MDL; MDX; MNA; MOD; MOO; MUR; Muldraugh Compressor Station; N01; N02; N03; NON; NUHW; NUR; None ; OIL; Operating Engineers; PNT; Police; R01; R02; R04; R05; R06; R10; R11; R12; R13; R15; R16; R17; R18; R19; R20; R22; R23; R24; R25; R26; R27; R31; R32; R33; R35; R36; R37; R38; R39; R40; R41; R42; R45; R46; R47; R48; R49; R50; R52; R55; R56; R57; R58; R59; RFT; RPT; RSP; SCP; SEC; SEIU; SEIU - PCA's at RIM; SMW; SPNNUNAC; STF; Service Emp Intn'l U; TCU; TCUASR; TCU_ICTF; TEAM; TSV; Trades; U; U01; U02; U04; U05; U06; U07; U10; U14; U19; U21; U22; U23; U24; U25; U26; U32; U37; U43; U44; U52; U53; U56; U76; U78; U83; U84; U85; U91; UA3; UAW Local 889; UB7; UB9; UC3; UD7; UD8; UE3; UE5; UE9; UF1; UF2; UF3; UF4; UFCW; UG1; UG5; UN5; UN6; UN7; UN8; UN9; UNAC; UNKNOWN; UPUYC-SP; UPUYC-UP; UTUC; UTUE; UTUT; UTUY-A&S; W02; W03; W04; W05; WC5; YRK; nan"</li></ul> | | 16 | <ul><li>'Shift Question<HIDE>: Yes; nan'</li><li>'Work Schedule: 0-30 Hrs Wk; 0-38.5 Hrs Wk; 1 - 10%; 10 FR-MO; 10 M-TH; 10 M-TU TH-FR; 10 M-WE FR; 10 SU-WED; 10 TU-FR; 10 WE - SA; 11 - 20%; 1ST SHIFT; 21 - 30%; 2ND SHIFT; 31 - 40%; 37 Hrs Wk; 37.5 Hrs Wk; 38 Hrs Wk; 38.5 Hrs Wk; 39 Hrs Wk; 3RD SHIFT; 40 Hrs Wk; 41 - 50%; 41 Hrs Wk; 42.5 Hrs Wk; 44 Hrs Wk; 45 Hrs Wk; 48 Hrs Wk; 5/8 FR-SA Off; 5/8 MO-TU Off; 5/8 SU-MO Off; 5/8 TH-FR Off; 5/8 TU-WE Off; 5/8 WE-TH Off; 51 - 60%; 61 - 70%; 71 - 80%; 8 HRS 8am-5pm; 81 - 90%; 91 - 100%; KRONOS SHIFT 1; KRONOS SHIFT 2; KRONOS SHIFT 3; Mon-Fri 40 Hrs/Wk; OPS FLOOR Mo-Th-Fr; 
PART-TIME VERT. 60,5%; Part-time Oriz. 50%; Part-time Oriz. 60%; Part-time Oriz. 62%; Part-time Oriz. 62,5%; Part-time Oriz. 75%; Part-time Oriz. 87,5%; STANDARD 8-5; STANDARD 8:30am-5pm; STANDARD 8am - 5pm; STANDARD 9-5.30; STANDARD 9am - 6pm; STANDARD 9am-6pm; Service Tech. Field; TURNISTA; Turno PT Orizz. 75%; Turno PT Orizz. 87,5%; nan'</li><li>'Shift<HIDE>: 7; B; D; E; L; O; R; W; Z; nan'</li></ul> | | 14 | <ul><li>'What has been your COVID 19 work arrangement?<HIDE>: Furloughed/Closed Location; Office; Other; Reduced Work Schedule; Remote/Work from Home; nan'</li><li>'Ability to Work Remotely<HIDE>: My primary job role can be done remotely with little or no disruption.; My primary job role is a mix - some can be done from anywhere and some can only be done from the physical work location.; My primary job role requires me to be physically present in my workplace.; nan'</li></ul> | | 1 | <ul><li>'Race_Ethnicity: American Indian or Alaska Native (Not Hispanic or Latino) (United States of America); American Indian or Alaskan Native (United States of America); American¬†Indian¬†or¬†Alaskan¬†Native (United States of America); Asian (Not Hispanic or Latino) (United States of America); Asian (United States of America); Asian - Indian (United Kingdom); Asian - Other (United Kingdom); Asian - Pakistani (United Kingdom); Bai (China); Black - African (United Kingdom); Black - British (United Kingdom); Black - Caribbean (United Kingdom); Black or African American (Not Hispanic or Latino) (United States of America); Black or African American (United States of America); Black¬†or¬†African¬†American (United States of America); Buyei (China); Chinese (Singapore); Dai (China); Dong (China); Han (China); Hani (China); Hispanic or Latino (United States of America); Hispanic/Latino (United States of America); I do not wish to answer. 
(United States of America); Indian (Singapore); Li (China); Malay (Singapore); Native Hawaiian or Other Pacific Islander (Not Hispanic or Latino) (United States of America); Native Hawaiian or Other Pacific Islander (United States of America); Native¬†Hawaiian¬†or¬†Other¬†Pacific¬†Islander (United States of America); Not Declaring (United Kingdom); Not Reported; Other (Singapore); Other (United Kingdom); Tujia (China); Two or More Races (Not Hispanic or Latino) (United States of America); Two or More Races (United States of America); Two¬†or¬†More¬†Races (United States of America); White (Not Hispanic or Latino) (United States of America); White (United States of America); White - British (United Kingdom); White - Irish (United Kingdom); White - Other (United Kingdom); White - Other European (United Kingdom); Yi (China); Zhuang (China); nan'</li><li>'Which ethnicity/ethnicities do you most identify with?: Asian; Black; Hispanic or Latino; Other; Prefer not to respond; Two or More Races; White; nan'</li><li>'Ethnicity On File: 2 or more races, not Hispanc; American Indian/Alaska Nativ; Asian; Black/African American; Hispanic/Latino; Native Hawaiian/Oth Pacif Is; White; nan'</li></ul> | | 7 | <ul><li>'FM_Merger_Cd: N; Y; nan'</li><li>'Acquisition Hire<HIDE>: Acquisition Hire; Non-Acquisition Hire; nan'</li></ul> | | 8 | <ul><li>'Primary Termination Reason: Retained; Terminate Associate > Involuntary > Attendance; Terminate Associate > Involuntary > Death; Terminate Associate > Involuntary > Elimination of Position; Terminate Associate > Involuntary > Exhaustion of Leave; Terminate Associate > Involuntary > Falsification of Records; Terminate Associate > Involuntary > Gross Misconduct; Terminate Associate > Involuntary > Mutual Consent; Terminate Associate > Involuntary > Not re-new contract; Terminate Associate > Involuntary > Poor Job Performance; Terminate Associate > Involuntary > Severance; Terminate Associate > Involuntary > Tardiness; Terminate Associate > Involuntary > Violation of Rules; Terminate Associate > Involuntary > Workforce Reduction; Terminate Associate > Voluntary > Commute Time; Terminate Associate > Voluntary > Company Instability; Terminate Associate > Voluntary > Dissatisfied with Hours; Terminate Associate > Voluntary > Dissatisfied with Job; Terminate Associate > Voluntary > Dissatisfied with Management; Terminate Associate > Voluntary > Dissatisfied with Pay; Terminate Associate > Voluntary > Dissatisfied with Promotional Opportunities; Terminate Associate > Voluntary > Dissatisfied with Working Conditions; Terminate Associate > Voluntary > Failure to Return from Leave; Terminate Associate > Voluntary > Job Abandonment; Terminate Associate > Voluntary > Military Service; Terminate Associate > Voluntary > Moved; Terminate Associate > Voluntary > Other Employment; Terminate Associate > Voluntary > Personal; Terminate Associate > Voluntary > Retirement; Terminate Associate > Voluntary > Return to School; Terminate Associate > Voluntary > Severance; Terminate Associate > Voluntary > Unknown; Terminate Employee > Voluntary > Benefits; Terminate Employee > Voluntary > Career Change; Terminate Employee > Voluntary > Career Development or Advancement; Terminate Employee > Voluntary > Compensation; Terminate Employee > Voluntary > Continue Education; Terminate Employee > Voluntary > Contract End; Terminate Employee > Voluntary > Conversion; Terminate Employee > Voluntary > Dislike Company; Terminate Employee > Voluntary > Dislike Hours/Schedule; Terminate Employee > 
Voluntary > Dislike Supervisor; Terminate Employee > Voluntary > Dislike Work; Terminate Employee > Voluntary > Dissatisfied Career Advancement Opportunities; Terminate Employee > Voluntary > Dissatisfied with Benefits; Terminate Employee > Voluntary > Dissatisfied with Benefits Package (Health, Dental, Vision, Life, Retirement, Paid Leave, etc.); Terminate Employee > Voluntary > Dissatisfied with Career Opportunities; Terminate Employee > Voluntary > Dissatisfied with Company Policies; Terminate Employee > Voluntary > Dissatisfied with Compensation Package (Base Salary, Bonus, Commissions, etc.); Terminate Employee > Voluntary > Dissatisfied with Coworkers; Terminate Employee > Voluntary > Dissatisfied with Flexible Work Arrangements (remote work, flexible hours, etc.); Terminate Employee > Voluntary > Dissatisfied with Hours / Schedule; Terminate Employee > Voluntary > Dissatisfied with Industry; Terminate Employee > Voluntary > Dissatisfied with Job; Terminate Employee > Voluntary > Dissatisfied with Location; Terminate Employee > Voluntary > Dissatisfied with Location/Commute; Terminate Employee > Voluntary > Dissatisfied with Management; Terminate Employee > Voluntary > Dissatisfied with Manager Effectiveness; Terminate Employee > Voluntary > Dissatisfied with Organization Culture (Corporate Values, Behaviors, Norms that Guide How People Work); Terminate Employee > Voluntary > Dissatisfied with Pay; Terminate Employee > Voluntary > Dissatisfied with Travel; Terminate Employee > Voluntary > Dissatisfied with Type of Work; Terminate Employee > Voluntary > Dissatisfied with Work Conditions; Terminate Employee > Voluntary > Dissatisfied with Working Conditions; Terminate Employee > Voluntary > Dissatisfied with Worklife Balance; Terminate Employee > Voluntary > Exit Workforce; Terminate Employee > Voluntary > Failed to Return from Leave; Terminate Employee > Voluntary > Failure to Return from Leave; Terminate Employee > Voluntary > Family Obligations; Terminate Employee > Voluntary > Family Reasons; Terminate Employee > Voluntary > Health Concerns; Terminate Employee > Voluntary > Health Reasons; Terminate Employee > Voluntary > Job Abandonment; Terminate Employee > Voluntary > Job Security; Terminate Employee > Voluntary > Join Military; Terminate Employee > Voluntary > Location; Terminate Employee > Voluntary > Military Service; Terminate Employee > Voluntary > Moved; Terminate Employee > Voluntary > Mutual Agreement (inactive); Terminate Employee > Voluntary > Mutual Consent; Terminate Employee > Voluntary > Never Reported for Orientation; Terminate Employee > Voluntary > Other Employment; Terminate Employee > Voluntary > Personal - Furthering Education (inactive); Terminate Employee > Voluntary > Personal Reasons; Terminate Employee > Voluntary > Relocation; Terminate Employee > Voluntary > Resignation; Terminate Employee > Voluntary > Retirement; Terminate Employee > Voluntary > Return to School; Terminate Employee > Voluntary > Returned to School; Terminate Employee > Voluntary > Self Employment; Terminate Employee > Voluntary > Training; Terminate Employee > Voluntary > Transportation Problems; Terminate Employee > Voluntary > Unknown; Terminate Employee > Voluntary > Work Authorization Not Renewed; Terminate Employee > Voluntary > Workload; nan'</li><li>'Termed Reason: I; V; nan'</li><li>'Voluntary or Retirement<HIDE>: Retirement; Voluntary; nan'</li></ul> | | 11 | <ul><li>'Generation: 18-24 years of age; 25-34 years; 25-34 years of age; 26-35 Yrs; 26-35 years; 35-44 years; 35-44 
years of age; 36-45 Yrs; 36-45 years; 45-54 years of age; 45-55 years; 46-55 Yrs; 46-55 years; 55-64 years of age; 65+ years of age; < 25 years; < 26 Yrs; < 26 years; <25 years; > 55 Yrs; > 55 years; >55 years; Baby Boomer; Baby Boomer (born 1946 ¬ñ 1964); Baby Boomers; Baby Boomers ¬ñ 1946 ¬ñ 1965; Gen X; Generation X; Generation X (born 1965 to 1980); Generation X ¬ñ 1965 ¬ñ 1980; Generation Y / Millennials ¬ñ 1981 ¬ñ 1996; Generation Z; Generation Z (born 2001 to 2015); Generation Z ¬ñ 1997 and onwards; Mature (born in 1945 or earlier); Millennial; Millennials; Millennials (born 1981 to 2000); Silent Generation; Silent Generation - 1928 ¬ñ 1945; Traditionalist; nan'</li><li>'Age Bracket: 119.4; 18-24; 18.5; 18.7; 19.1; 19.2; 19.3; 19.4; 19.5; 19.6; 19.7; 19.8; 19.83333333; 19.9; 20 - 29; 20-24; 20-30 Years; 20.3; 20.6; 20.66666667; 20.7; 20.83333333; 20.9; 21; 21.08333333; 21.1; 21.16666667; 21.2; 21.3; 21.4; 21.5; 21.6; 21.66666667; 21.7; 21.8; 21.83333333; 21.9; 22; 22.1; 22.2; 22.3; 22.33333333; 22.4; 22.41666667; 22.5; 22.58333333; 22.6; 22.66666667; 22.7; 22.75; 22.8; 22.9; 23; 23.08333333; 23.1; 23.16666667; 23.2; 23.25; 23.3; 23.33333333; 23.4; 23.41666667; 23.5; 23.58333333; 23.6; 23.7; 23.8; 23.83333333; 23.9; 23.91666667; 24; 24.1; 24.2; 24.3; 24.33333333; 24.4; 24.41666667; 24.5; 24.58333333; 24.6; 24.66666667; 24.7; 24.75; 24.8; 24.83333333; 24.9; 25; 25-30; 25-35; 25-35 ; 25.08333333; 25.1; 25.16666667; 25.2; 25.25; 25.3; 25.33333333; 25.4; 25.41666667; 25.5; 25.58333333; 25.6; 25.66666667; 25.7; 25.75; 25.8; 25.83333333; 25.9; 25.91666667; 26; 26-3; 26-35; 26.08333333; 26.1; 26.16666667; 26.2; 26.25; 26.3; 26.33333333; 26.4; 26.41666667; 26.5; 26.58333333; 26.6; 26.66666667; 26.7; 26.75; 26.8; 26.83333333; 26.9; 26.91666667; 27; 27.08333333; 27.1; 27.16666667; 27.2; 27.25; 27.3; 27.33333333; 27.4; 27.41666667; 27.5; 27.58333333; 27.6; 27.66666667; 27.7; 27.75; 27.8; 27.83333333; 27.9; 27.91666667; 28; 28.08333333; 28.1; 28.16666667; 28.2; 28.25; 28.3; 28.33333333; 28.4; 28.41666667; 28.5; 28.58333333; 28.6; 28.66666667; 28.7; 28.75; 28.8; 28.83333333; 28.9; 28.91666667; 29; 29.08333333; 29.1; 29.16666667; 29.2; 29.25; 29.3; 29.33333333; 29.4; 29.41666667; 29.5; 29.58333333; 29.6; 29.66666667; 29.7; 29.75; 29.8; 29.83333333; 29.9; 29.91666667; 30; 30 - 39; 30-40 Years; 30.08333333; 30.1; 30.16666667; 30.2; 30.25; 30.3; 30.33333333; 30.4; 30.41666667; 30.5; 30.58333333; 30.6; 30.66666667; 30.7; 30.75; 30.8; 30.83333333; 30.9; 30.91666667; 31; 31-40; 31.08333333; 31.1; 31.16666667; 31.2; 31.25; 31.3; 31.33333333; 31.4; 31.41666667; 31.5; 31.58333333; 31.6; 31.66666667; 31.7; 31.75; 31.8; 31.83333333; 31.9; 31.91666667; 32; 32.08333333; 32.1; 32.16666667; 32.2; 32.25; 32.3; 32.33333333; 32.4; 32.41666667; 32.5; 32.58333333; 32.6; 32.66666667; 32.7; 32.75; 32.8; 32.83333333; 32.9; 32.91666667; 33; 33.08333333; 33.1; 33.16666667; 33.2; 33.25; 33.3; 33.33333333; 33.4; 33.41666667; 33.5; 33.58333333; 33.6; 33.66666667; 33.7; 33.75; 33.8; 33.83333333; 33.9; 33.91666667; 34; 34.08333333; 34.1; 34.16666667; 34.2; 34.25; 34.3; 34.33333333; 34.4; 34.41666667; 34.5; 34.58333333; 34.6; 34.66666667; 34.7; 34.75; 34.8; 34.83333333; 34.9; 34.91666667; 35; 35.08333333; 35.1; 35.16666667; 35.2; 35.25; 35.3; 35.33333333; 35.4; 35.41666667; 35.5; 35.58333333; 35.6; 35.66666667; 35.7; 35.75; 35.8; 35.83333333; 35.9; 35.91666667; 36; 36-40; 36-41; 36-45; 36.08333333; 36.1; 36.16666667; 36.2; 36.25; 36.3; 36.33333333; 36.4; 36.41666667; 36.5; 36.58333333; 36.6; 36.66666667; 36.7; 36.75; 36.8; 
36.83333333; 36.9; 36.91666667; 37; 37.08333333; 37.1; 37.16666667; 37.2; 37.25; 37.3; 37.33333333; 37.4; 37.41666667; 37.5; 37.58333333; 37.6; 37.66666667; 37.7; 37.75; 37.8; 37.83333333; 37.9; 37.91666667; 38; 38.08333333; 38.1; 38.16666667; 38.2; 38.25; 38.3; 38.33333333; 38.4; 38.41666667; 38.5; 38.58333333; 38.6; 38.66666667; 38.7; 38.75; 38.8; 38.83333333; 38.9; 38.91666667; 39; 39.08333333; 39.1; 39.16666667; 39.2; 39.25; 39.3; 39.33333333; 39.4; 39.41666667; 39.5; 39.58333333; 39.6; 39.66666667; 39.7; 39.75; 39.8; 39.83333333; 39.9; 39.91666667; 40; 40 - 49; 40-50 Years; 40.08333333; 40.1; 40.16666667; 40.2; 40.25; 40.3; 40.33333333; 40.4; 40.41666667; 40.5; 40.58333333; 40.6; 40.66666667; 40.7; 40.75; 40.8; 40.83333333; 40.9; 40.91666667; 41; 41-49; 41-50; 41.08333333; 41.1; 41.16666667; 41.2; 41.25; 41.3; 41.33333333; 41.4; 41.41666667; 41.5; 41.58333333; 41.6; 41.66666667; 41.7; 41.75; 41.8; 41.83333333; 41.9; 41.91666667; 42; 42.08333333; 42.1; 42.16666667; 42.2; 42.25; 42.3; 42.33333333; 42.4; 42.41666667; 42.5; 42.58333333; 42.6; 42.66666667; 42.7; 42.75; 42.8; 42.83333333; 42.9; 42.91666667; 43; 43.08333333; 43.1; 43.16666667; 43.2; 43.25; 43.3; 43.33333333; 43.4; 43.41666667; 43.5; 43.58333333; 43.6; 43.66666667; 43.7; 43.75; 43.8; 43.83333333; 43.9; 43.91666667; 44; 44.08333333; 44.1; 44.16666667; 44.2; 44.25; 44.3; 44.33333333; 44.4; 44.41666667; 44.5; 44.58333333; 44.6; 44.66666667; 44.7; 44.75; 44.8; 44.83333333; 44.9; 44.91666667; 45; 45.08333333; 45.1; 45.16666667; 45.2; 45.25; 45.3; 45.33333333; 45.4; 45.41666667; 45.5; 45.58333333; 45.6; 45.66666667; 45.7; 45.75; 45.8; 45.83333333; 45.9; 45.91666667; 46; 46-54; 46.08333333; 46.1; 46.16666667; 46.2; 46.25; 46.3; 46.33333333; 46.4; 46.41666667; 46.5; 46.58333333; 46.6; 46.66666667; 46.7; 46.75; 46.8; 46.83333333; 46.9; 46.91666667; 47; 47.08333333; 47.1; 47.16666667; 47.2; 47.25; 47.3; 47.33333333; 47.4; 47.41666667; 47.5; 47.58333333; 47.6; 47.66666667; 47.7; 47.75; 47.8; 47.83333333; 47.9; 47.91666667; 48; 48.08333333; 48.1; 48.16666667; 48.2; 48.25; 48.3; 48.33333333; 48.4; 48.41666667; 48.5; 48.58333333; 48.6; 48.66666667; 48.7; 48.75; 48.8; 48.83333333; 48.9; 48.91666667; 49; 49.08333333; 49.1; 49.16666667; 49.2; 49.25; 49.3; 49.33333333; 49.4; 49.41666667; 49.5; 49.58333333; 49.6; 49.66666667; 49.7; 49.75; 49.8; 49.83333333; 49.9; 49.91666667; 50; 50 - 59; 50-60 Years; 50-64; 50.1; 50.16666667; 50.2; 50.25; 50.3; 50.33333333; 50.4; 50.41666667; 50.5; 50.58333333; 50.6; 50.66666667; 50.7; 50.75; 50.8; 50.83333333; 50.9; 50.91666667; 51; 51+; 51.08333333; 51.1; 51.16666667; 51.2; 51.25; 51.3; 51.33333333; 51.4; 51.41666667; 51.5; 51.58333333; 51.6; 51.66666667; 51.7; 51.75; 51.8; 51.83333333; 51.9; 51.91666667; 52; 52.08333333; 52.1; 52.16666667; 52.2; 52.25; 52.3; 52.33333333; 52.4; 52.41666667; 52.5; 52.58333333; 52.6; 52.66666667; 52.7; 52.75; 52.8; 52.83333333; 52.9; 52.91666667; 53; 53.08333333; 53.1; 53.16666667; 53.2; 53.25; 53.3; 53.33333333; 53.4; 53.41666667; 53.5; 53.58333333; 53.6; 53.66666667; 53.7; 53.75; 53.8; 53.83333333; 53.9; 53.91666667; 54; 54.08333333; 54.1; 54.16666667; 54.2; 54.25; 54.3; 54.33333333; 54.4; 54.41666667; 54.5; 54.58333333; 54.6; 54.66666667; 54.7; 54.75; 54.8; 54.83333333; 54.9; 54.91666667; 55; 55+; 55.08333333; 55.1; 55.16666667; 55.2; 55.25; 55.3; 55.33333333; 55.4; 55.5; 55.58333333; 55.6; 55.66666667; 55.7; 55.75; 55.8; 55.83333333; 55.9; 55.91666667; 56; 56.08333333; 56.1; 56.16666667; 56.2; 56.25; 56.3; 56.33333333; 56.4; 56.41666667; 56.5; 56.58333333; 56.6; 
56.66666667; 56.7; 56.75; 56.8; 56.83333333; 56.9; 56.91666667; 57; 57.08333333; 57.1; 57.16666667; 57.2; 57.25; 57.3; 57.4; 57.41666667; 57.5; 57.6; 57.66666667; 57.7; 57.75; 57.8; 57.83333333; 57.9; 57.91666667; 58; 58.08333333; 58.1; 58.16666667; 58.2; 58.25; 58.3; 58.33333333; 58.4; 58.5; 58.58333333; 58.6; 58.7; 58.75; 58.8; 58.83333333; 58.9; 58.91666667; 59; 59.08333333; 59.1; 59.16666667; 59.2; 59.25; 59.3; 59.33333333; 59.4; 59.41666667; 59.5; 59.58333333; 59.6; 59.7; 59.75; 59.8; 59.83333333; 59.9; 59.91666667; 6; 60; 60 and over; 60.08333333; 60.1; 60.16666667; 60.2; 60.3; 60.33333333; 60.4; 60.41666667; 60.5; 60.6; 60.7; 60.75; 60.8; 60.83333333; 60.9; 60.91666667; 61; 61.1; 61.16666667; 61.2; 61.25; 61.3; 61.4; 61.5; 61.58333333; 61.6; 61.66666667; 61.7; 61.75; 61.8; 61.83333333; 61.9; 61.91666667; 62; 62.1; 62.16666667; 62.2; 62.25; 62.3; 62.33333333; 62.4; 62.41666667; 62.5; 62.58333333; 62.6; 62.66666667; 62.7; 62.8; 62.9; 62.91666667; 63; 63.08333333; 63.1; 63.2; 63.25; 63.3; 63.4; 63.5; 63.58333333; 63.6; 63.66666667; 63.7; 63.75; 63.8; 63.9; 63.91666667; 64; 64.08333333; 64.1; 64.2; 64.33333333; 64.4; 64.5; 64.58333333; 64.6; 64.7; 64.75; 64.8; 64.9; 64.91666667; 65; 65+; 65.1; 65.16666667; 65.2; 65.3; 65.4; 65.41666667; 65.5; 65.6; 65.66666667; 65.7; 65.75; 65.8; 65.83333333; 65.9; 65.91666667; 66; 66.1; 66.2; 66.3; 66.33333333; 66.4; 66.6; 66.7; 66.75; 66.8; 67; 67.08333333; 67.1; 67.2; 67.3; 67.4; 67.5; 67.58333333; 67.6; 67.66666667; 67.7; 67.9; 68; 68.2; 68.3; 68.33333333; 68.4; 68.5; 68.66666667; 68.7; 68.91666667; 69; 69.08333333; 69.3; 69.4; 69.7; 69.8; 69.9; 70; 70.08333333; 70.1; 70.2; 70.25; 70.4; 70.6; 70.7; 71.1; 71.3; 71.4; 71.5; 71.6; 71.9; 72.16666667; 72.5; 72.6; 72.75; 72.8; 73; 73.3; 73.6; 74.16666667; 74.2; 74.4; 75.7; 77.6; 79.25; 935.7; <20; <20 Years; <=25; >=60 Years; AAB00366417; AAB10011157; Baby Boomer; F; Gen X; Gen Y / Millennial; Less than 18; M; Traditionalist; Under 20; nan'</li><li>'age_band: 18-25; 20 - 29; 26-35; 30 - 39; 36-45; 40 - 49; 46-55; 50 - 59; 56 and above; 60 and over; Under 20; nan'</li></ul> | | 0 | <ul><li>'Employee_Gender: F; M; nan'</li><li>'O_Gender: F; Female; M; Male; U; Unknown; nan'</li><li>'Gender: -; 0; 1 Individual Contributor; 10 Individual Contributor; 119; 17; 18; 19; 20; 20-29; 21; 22; 23; 24; 25; 26; 27; 28; 29; 30; 30-39; 31; 32; 33; 34; 35; 36; 37; 38; 39; 40; 40-49; 41; 42; 43; 44; 45; 46; 47; 48; 49; 50; 50-59; 51; 52; 53; 54; 55; 56; 57; 58; 59; 60; 61; 62; 63; 64; 65; 66; 67; 68; 69; 70; 71; 72; 73; 74; 75; 76; 77; 78; 79; 8 Sr. 
Manager; 80; 81; 83; 88; 89; 9 Manager; 9 manager; 90; 91; 935; ?; Agender; Asian; Bay; Bayan; Choose not to respond; Contractor; D; DJO Export; Decline; Decline to State; Decline to answer; Decline to state; Director; Do Not Wish to Disclose; F; F ; FEMALE; FEMALE ; Female; Female ; Female ; Femenino; F√©minin; Gender; Gender Nonconforming; Gender non-conforming; Gender variant / non-conforming; I Choose Not to Self Disclose; I Choose not to Disclose; I DO NOT WISH TO SELF-IDENTIFY; I Prefer Not to Answer; I choose not to disclose; I choose not to reply; I decline to self-identify; I do not wish to Self-Identify; I prefer not to answer; I prefer not to say; I prefer to self-describe in another way:; Identity Not Listed; In Another Way; JOANNE STURGESS; JODIE FIDDES; M; M ; MALE; MASCOLINO; Make; Male; Male ; Male ; Masculin; Masculino; N; Non-Binary; Non-Binary/Third Gender; Non-binary; Non-binary / third gender; Non-binary/ third gender; Non-specific; Nonconform; None; Not Available; Not Declared; Not Declaring; Not Disclosed; Not SpeciFemaleied; Not Specifed; Not Specified; Not assigned; Not available; Not declared; Not known; Not_Declared; Not_declared; O; Other; Prefer Not To Answer; Prefer Not To Say; Prefer Not To Self-Identify; Prefer Not to Respond; Prefer Not to Say; Prefer not to answer; Prefer not to disclose; Prefer not to say; Prefer to self-describe; Reassigned; Sex; Transgender; Two or more races; U; UNKNOWN; Undeclared; Undisc; Undisclosed; Unknown; Unspecified; Unused: F; Unused: M; White; Withhold; [NONE]; f; female; m; nan; unknown; unknown '</li></ul> |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("svorwerk/setfit-fine-tuned-demo-class_hpo")
# Run inference
preds = model("Emp_FLSA: E; N; P; V; X; nan")
```
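The class ids the model returns correspond to the rows of the Model Labels table above. As a minimal sketch of batch inference, assuming the standard SetFit `predict`/`predict_proba` API (the field strings below are illustrative, adapted from the label examples, not from a real export):

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("svorwerk/setfit-fine-tuned-demo-class_hpo")

# Raw "column name: observed values" descriptors, in the same format as the
# label examples above (illustrative inputs).
fields = [
    "O_Gender: F; Female; M; Male; U; Unknown; nan",
    "Pay Rate Type: Hourly; Salaried; nan",
    "Tenure Category: 0 - 3 Months; 3 - 5 Years; 20+ Years; nan",
]

labels = model.predict(fields)        # one integer class id per field
probs = model.predict_proba(fields)   # per-class probabilities (n_fields x 15)

for field, label, p in zip(fields, labels, probs):
    print(f"{field.split(':')[0]!r} -> label {int(label)} (p={float(p.max()):.2f})")
```

If the model behaves as the label examples suggest, the gender field should map to label 0, the pay-rate field to label 2, and the tenure field to label 12.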
<!-- ### Downstream Use

*List how someone could finetune this model on their own dataset.*
-->

<!-- ### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!-- ## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!-- ### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Set Metrics
| Training set | Min | Median   | Max  |
|:-------------|:----|:---------|:-----|
| Word count   | 2   | 135.7721 | 6076 |

| Label | Training Sample Count |
|:------|:----------------------|
| 0     | 5                     |
| 1     | 8                     |
| 2     | 24                    |
| 3     | 23                    |
| 4     | 2                     |
| 5     | 22                    |
| 6     | 12                    |
| 7     | 2                     |
| 8     | 6                     |
| 9     | 4                     |
| 10    | 8                     |
| 11    | 6                     |
| 12    | 8                     |
| 14    | 2                     |
| 16    | 4                     |

### Training Hyperparameters
- batch_size: (32, 32)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 5e-06)
- head_learning_rate: 0.002
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: True
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- max_length: 500
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True

### Training Results
| Epoch      | Step    | Training Loss | Validation Loss |
|:----------:|:-------:|:-------------:|:---------------:|
| 0.0019     | 1       | 0.0936        | -               |
| 0.0973     | 50      | 0.0073        | -               |
| 0.1946     | 100     | 0.0009        | -               |
| 0.2918     | 150     | 0.0006        | -               |
| 0.3891     | 200     | 0.0002        | -               |
| 0.4864     | 250     | 0.0002        | -               |
| 0.5837     | 300     | 0.0003        | -               |
| 0.6809     | 350     | 0.0002        | -               |
| 0.7782     | 400     | 0.0004        | -               |
| 0.8755     | 450     | 0.0002        | -               |
| **0.9728** | **500** | **0.0006**    | **0.0751**      |

* The bold row denotes the saved checkpoint.

### Framework Versions
- Python: 3.10.12
- SetFit: 1.0.3
- Sentence Transformers: 2.2.2
- Transformers: 4.37.1
- PyTorch: 2.1.0+cu121
- Datasets: 2.16.1
- Tokenizers: 0.15.1

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```

<!-- ## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!-- ## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!-- ## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
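As a closing note on reproducibility: the values listed under Training Hyperparameters map one-to-one onto SetFit's `TrainingArguments`. The following is a minimal sketch of how a comparable model could be trained, assuming `setfit>=1.0`; the two-example dataset is a placeholder, not this card's (unknown) training data:

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder training data: substitute your own descriptor -> class-id pairs.
train_ds = Dataset.from_dict({
    "text": ["Emp_FLSA: E; N; P; V; X; nan", "O_Gender: F; M; nan"],
    "label": [2, 0],
})

# A differentiable SetFitHead with 15 output classes, matching this card.
model = SetFitModel.from_pretrained(
    "sentence-transformers/all-mpnet-base-v2",
    use_differentiable_head=True,
    head_params={"out_features": 15},
)

# Hyperparameters copied from the list above; values not set here fall back
# to SetFit defaults.
args = TrainingArguments(
    batch_size=(32, 32),
    num_epochs=(1, 1),
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 5e-06),
    head_learning_rate=0.002,
    warmup_proportion=0.1,
    l2_weight=0.01,
    max_length=500,
    end_to_end=True,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_ds, metric="accuracy")
trainer.train()
```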
[ "TEXT_CLASSIFICATION", "TRANSLATION" ]
[ "CHIA", "MIRNA" ]
Non_BioNLP
knguyennguyen/mpnet_jacket4k_adjustedv3
knguyennguyen
sentence-similarity
[ "sentence-transformers", "safetensors", "mpnet", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:11097", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:sentence-transformers/all-mpnet-base-v2", "base_model:finetune:sentence-transformers/all-mpnet-base-v2", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,737
1,737
9
0
---
base_model: sentence-transformers/all-mpnet-base-v2
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:11097
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: a jacket for protection in cold environments for male or female users
  sentences:
  - 'Title: Snugpak Sj9 Jacket Multicam Lg Descripion: The SJ Series of jackets were developed by Snugpak with the intention of providing a versatile, insulated jacket for seasonal weather conditions. With a water-resistant main zip, and a moisture wicking inner liner, the SJ-9 jacket works to keep moisture away from your skin, so you stay warm, comfortable, and dry. The SJ-9 jacket offers protection from cold environments and features Softie Premier insulation, which is finer, softer, and more durable than other insulations. The randomized fine filament fibers of Softie insulation trap warm air to retain heat even when wet. The SJ Series also feature Paratex lightweight fabric and an elasticated hood. Paratex is incredibly durable and remains soft to the touch even after wear. It offers heavy duty, close weave construction that is completely windproof, extremely water resistant and water wicking, and highly breathable. The SJ-9 jacket is comfort rated to 14 degrees Fahrenheit, with a low temperature rating of 5 degrees Fahrenheit. This design is tailored to fit the contours of your body and includes a high neck to retain heat and protect you from high winds. The Snugpak SJ Series provide you with weather-resistant comfort for a diverse range of seasonal conditions. Zipper closure Hand Wash Only Paratex lightweight fabric is durable, windproof, breathable, and delivers extreme water resistance and water wicking Softie insulation traps warm air to retain heat even when thoroughly wet; designed to keep moisture away from your skin so you stay warm Provides protection from cold climates and seasons; comfort temperature rating is 14 degrees Fahrenheit, low is 5 degrees Fahrenheit Designed with hem and neck adjusters, thumb loops, and elastic cuffs for a customized fit; features a two-way water-resistant main zip Equipped with an elasticated hood, internal left chest pocket with a zipper, and two external concealed side pockets; shaped fit maximizes heat retention'
  - 'Title: Solid 925 Sterling Silver 5mm Blue Topaz Studs Dangle Earring Jackets Descripion: Material: Primary - Purity: 925 Material: Primary - Purity: 925 Finish: Polished Finish: Polished Material: Primary: Sterling Silver Material: Primary: Sterling Silver Product Type: Jewelry Product Type: Jewelry Jewelry Type: Earrings Jewelry Type: Earrings Sold By Unit: Pair Sold By Unit: Pair Gender: Women''s Gender: Women''s Material: Primary - Color: White Material: Primary - Color: White Sonia Jewels Internal Category \ Jewelry \ Earrings \ Childrens Earrings; \ Jewelry \ Earrings \ Birthstone Earrings \ Birthstone Studs Sonia Jewels Internal Category \ Jewelry \ Earrings \ Childrens Earrings; \ Jewelry \ Earrings \ Birthstone Earrings \ Birthstone Studs Elegant Earrings Box Included 925 Sterling Silver GUARANTEED, Authenticated with a 925 Stamp Jewelry Gifts For Women And Gifts For Men Including Gift For Mom; Wife; Mother; Father; Daddy; Daughter; Son; Sister; Brother; Friend; Coworker; Employee; Teacher; Neice or Nephew Christmas Gift; Stocking Stuffers; Sonia Jewels Has The Highest Quality Jewelry Gifts For Her and Him for Christmas; Valentines Day; 
Mothers Day; Fathers Day; Graduation; Birthday; Weddings or Anniversary Your satisfaction is our top priority at Sonia Jewels - Solid 925 Sterling Silver 5mm Blue Topaz Studs Dangle Earring Jackets' - 'Title: adidas womens Tiro 21 Track Jacket Team Maroon/White X-Large Descripion: Too good to limit to the pitch. The adidas Tiro jacket debuted as football training wear, but it''s now a streetwear staple. We made it using recycled materials as part of our commitment to help end plastic waste. From the moisture-absorbing AEROREADY to the zip pockets, the details are just as useful off the pitch. 100% Polyester Imported Zipper closure Machine Wash Slim fit Full zip with ribbed stand-up collar A track jacket made with recycled materials. Front zip pockets' - source_sentence: baby bathrobe designed for girls, featuring a soft texture and a simple design. sentences: - 'Title: Lykmera Baby Coat Toddler Kimono Solid Silk Robes Kids Clothes Sleepwear Bathrobe Girls Baby Satin Girls Coat Jacket Descripion: 2.Casual style top, , cute and comfy baby clothes 3.Great idea for a baby clothes, there''s no doubt in our mind your little baby will be the cutest Package include:1PC Bathrobe+1PC Ribbons 1.It is made of high quality materials,Soft hand feeling, no any harm to your baby skin Clothing Length:Regular Pattern Type:Solid Gender:Girls Please note that slight color difference should be acceptable due to the light and screen. Both hand wash and machine wash is OK Occasion:Casual Material:Polyester Attention plz: If your kid is , we recomend choosing a larger size, thanks. Polyester Imported Tie closure Hand Wash Only Material:Polyester Clothing Length:Regular Pattern Type:Solid Package Include:1PC Bathrobe+1PC Ribbons 1.It Is Made Of High Quality Materials,Soft Hand Feeling, No Any Harm To Your Baby Skin' - 'Title: Carhartt Men''s Big & Tall Relaxed Fit Heavyweight Flannel Sherpa-Lined Shirt Jacket, Folkstone Gray, 3X-Large Descripion: This men''s Carhartt shirt jac blends the comfort of a hoodie with the ruggedness of a jacket. Made from heavyweight flannel with a plush sherpa lining and a relaxed fit. Features two outer pockets plus an inner pocket on the chest. 8-ounce, 100% cotton ringspun flannel. Sherpa fleece lining for warmth. Carhartt strong triple-stitched main seams. Relaxed fit. Spread collar. Two-snap adjustable cuffs with extended plackets. Antique-finish nickel snaps. Two chest pockets with flaps and snap closures #Interior chest pocket. Replaces 104452. Imported 100% Cotton Imported Button closure Machine Wash 8-ounce, 100% Cotton ringspun flannel Sherpa lining Triple-stitched main seams Antique-finish nickel snaps Spread collar' - 'Title: Men''s Christmas Blazer Blue Sky White Clouds Print Suit Jacket + Vest + Suit Pants Suit Mens Evening Party Dinner Formal Wear Descripion: ✿✿✿✿✿ WELCOME TO LJHH STORE✿✿✿✿✿✿✿ Nice Suit ✿^ _ ^✿ Nice Coat ✿✿✿✿These suit coat are the best ideal for yourself, your friends, parents, spouses and all your loved ones as a great gift for birthday, business activity, Christmas or any other special date.Features:1.lightweight soft fabric for a comfortable, easy to wear, no wrinkles.2.It is made of high quality materials, durable enought for your daily wearing.3.Special design make you more attractive. Season: All seasons Style: Fashion Fitting style: Suit coat Occasion: Casual, BusinessThe stylish design shows your tasteSuitable for casual wearWashing precautions: normal hand wash, machine wash, ✤Recommended 30° water to wash. 
✿✿Size Chart✿✿✤The recommended product size are for reference , please follow our size picture carefully before you buy it. ✿✿Fast delivery✿✿✤Standard delivery takes 7-15 days to arrive,express delivery takes 3-6 days.✿✿ If you have problems on your order, please contact us first before asking Amazon for help. We will solve it effectively for you within 24 hours. Thank you very much. ✿✿---✤✤✤✤✤✤This will be a lovely gift✿^ _ ^✿Please purchase as soon as possible✤✤✤✤✤✤✤--- 100% Polyester Lining lining Button closure Hand Wash Only ✿【Material】These suit coat made of high quality materials. Casual and business suit with soft fabric. Environmental protection, Lightweight, warm, smooth lining, breathable, wear-resistant. ✿✿men suits men suits slim fit men suits for wedding men suits regular fit men suits slim fit 3 piece men suits big and tall men suits slim fit 2 piece men suits regular fit 3 piece men suits slim fit 3 piece designer ✿【Occasions】Suit for indoor and outdoor occasions, daily wear or as work clothes , take part in cocktail, party, you will outstanding in the crowd. Also great for business work and casual wear. ✿✿men suits sets men suits slim fit 3 piece 2 piece 3 piece designer black fit blue tuxedo 3 piece designer navy blue jacket green men suits slim fit men suit jacket men suits for wedding ✿【Best Gift】This suit coat is an ideal winter gift for parents, friends and lovers. Christmas is coming soon, you can use it as a gift to participate in Christmas parties, wedding, business party. ✿✿men suits regular fit men suit vest men suits slim fit 3 piece men suit set men suit pants jacket jackets and blazers black slim fit big and tall jacket classic fit red blue grey men suits for wedding ✿【Size】Size selection is for your reference only.To see more details, please check our size picture before ordering. If you want to dress more loosely, we recommend you choosing a larger size. ✿✿men suits for wedding regular fit guest big and tall slim fit champagne black navy blue red tan men suits regular fit men suits regular fit 3 piece 2 piece wedding men suits regular fit 44 plaid on ✿【Service】Your satisfaction is our greatest pleasure. If you have any problem about our items, Please feel free to contact us. ✿✿men suits regular fit 3 piece formal suit solid prom groomsmen men suit vest men suit vest black slim fit and pants with matching pants purple brown costume grey sets men suits slim fit 3 piece' - source_sentence: a jacket for hunting expeditions for men. men's jacket designed for outdoor activities, featuring wind-resistant material and a soft inner layer for warmth. sentences: - 'Title: Nomad Men''s Harvester Nxt Jacket | Wind Resistant W/Sound Kill Tech Descripion: Nomad: Building the most innovative, authentic hunting apparel to inspire a community To experience & protect the traditions of hunting and to empower everyone on their next expedition 100% Polyester Imported Zipper closure Machine Wash NOMAD: Building the most innovative, authentic hunting apparel to inspire a community to experience & protect the traditions of hunting and to empower everyone on their next expedition NOMAD Men''s Harvester NXT Jacket: Wind Resistant W/Sound Kill Tech WIND RESISTANT: Fabric is innovatively constructed to limit the amount of wind that reaches your body keeping you warm and comfortable SILENT FABRIC: Sound Kill noise limiting materials further reduce human detection HIGH PILE FLEECE: NOMAD''s High-Pile fleece is made of super-long fibers to trap air keeping you warmer more comfortable.' 
- 'Title: Women''s Long Sleeve Cardigan Coat Solid Color Zipper Fuzzy Fleece Coat Jacket Winter Fluffy Coat with Pockets Outwear Descripion: Product Description: Material:Polar fleece Season:Autumn And Winter Gender:Women Occasion:Daily,Casual Style:Casual Sleeve length:Full Sleeve Fit:Fits ture to size How to wash:Hand wash Cold,Hang or Line Dry What you get:1PC Women Coat Size: If You Prefer Loosing Fitting Jacket, Please One Size Up. Size Chart: Size:S _ Size.:Small _ US:4 _ Bust:100cm/39.37'''' _ Sleeve:61cm/24.02'''' _ Shoulder:39.5cm/15.55'''' _ Length:65cm/25.59'''' Size:M _ Size.:Medium _ US:6 _ Bust:104cm/40.94'''' _ Sleeve:62cm/24.41'''' _ Shoulder:40.5cm/15.94'''' _ Length:66cm/25.98'''' Size:L _ Size.:Large _ US:8 _ Bust:110cm/43.31'''' _ Sleeve:63.5cm/25.00'''' _ Shoulder:42cm/16.54'''' _ Length:67.5cm/26.57'''' Size:XL _ Size.:X-Large _ US:10 _ Bust:116cm/45.67'''' _ Sleeve:65cm/25.59'''' _ Shoulder:43cm/16.93'''' _ Length:69cm/27.17'''' fleece Imported Zipper closure Hand Wash Only Features: Long sleeve, Lapel, Zipper Closure, Two Side Pockets, Oversized Arms, Solid Jacket. Simple but fashion style is also a good choice as a gift to your friends and families. Material: Fuzzy faux fleece lined, made of polyester & spandex. Soft material straight hemline, soft and warm fabric keep you warm in autumn and winter, giving you amazing an wearing experience. Match: Prefect with skinny jeans, leggings, t-shirts, tops, shirts, shorts, boots for a casual look. Very soft warmfleece cardigan fit for any daily wear. Occasion: The Fleece fuzzy coat is suit for Daily wear, School, Vacation, Work, Club, Party, Street, great for Office or Outdoor. You will fall in love with this trendy fleece coat!!They''re warm and comfortable, It''s a great choice for giving away. Note: Please check the size chart before order.Recommend hand-washing , lay flat to dry or dry clean, please do not bleach or iron. Please feel free to contact us if you have any questions. *If you need a looser fit, choose one size up' - 'Title: Joules Baby Girls'' Quilted Jacket Descripion: Pass on the love of a quilted coat to your little one with this brand new style. The perfect early years coat that will feature in photographs and milestone moments along the way, it features our all new star quilting effect and is complete with a traditional cord collar. We''ve made it in a pretty pink, added some popper fastenings and a snuggly soft lining too. 100% Polyester Imported Button closure Machine Wash Star quilting Cord collar Popper fastening Cord binding welt pockets Super soft jersey lining' - source_sentence: women's ski jacket with a longer cut, insulation, and advanced weather-resistant features. sentences: - 'Title: Arc''teryx Beta Insulated Jacket Men''s | Insulated Gore-Tex Mountain Shell Descripion: There are so many ways to experience the alpine. The Betas - designed for unrivaled versatility, durability and weather protection - free you to discover what the mountains bring. Leveraging Coreloft Continuous insulation and a more sustainable waterproof, breathable 40D GORE-TEX fabric, the insulated Beta is the jacket for cold conditions. Helmet compatible, its StormHood adjusts to maximize peripheral vision. 
Pit zippers ventilate and an embedded RECCO reflector can facilitate search and rescue.Technical Features- Windproof- Breathable- Durable- WaterproofConstruction- GORE-TEX two-layer construction- Warm resilient Coreloft synthetic insulation provides thermal performance and retains loftCuff & Sleeves Configuration- Die-cut Velcro cuff adjusters reduce bulk and won''t catch or tear offHem Configuration- Dual lower hem adjustersHood Configuration- Helmet compatible StormHood provides full coverage without impacting visibilityLogos & Label Configuration- Embroidered logoPatterning- Articulated patterning for unrestricted mobilityPocket Configuration- Two hand pockets with WaterTight zippers- One internal dump pocket- Sleeve pocket with zip- Zippered internal security pocketSnowsport Features- Hidden RECCO reflectorSustainability- Contains recycled nylon- Contains materials that meet the bluesign criteriaZippers & Fly Configuration- Pit zippers for easy venting- Full separating two-way front zip Coreloft 80 (80 g/m) insulation. - 100% Polyester Lining: 20D plain weave - 100% Nylon - bluesign Approved Material Imported Zipper closure Hand Wash Only BETA - Versatile: high performance for diverse activities and conditions. INSULATED - Thermally insulated products that provide efficient warmth and protection from the elements. SYNTHETIC INSULATION - Man-made insulation with quick dry times, durability and retains warmth when damp. GORE-TEX - Waterproof, windproof and breathable textiles that offer fully protective environmental shelter. ESSENTIALS - Versatile high performance designs for diverse activities and conditions.' - 'Title: Helly-Hansen Womens Whitewall LIFALOFT Jacket Descripion: A longer-length, insulated women''s ski jacket with high tech features and a choice of camo or corduroy detailing. For skiers who enjoy the back country, side country, or just deep powder. Zipper closure A longer-length, insulated women''s ski jacket with high tech features and a choice of camo or corduroy detailing. For skiers who enjoy the back country, side country, or just deep powder. HELLY TECH PROFESSIONAL: Extremely waterproof and breathable designs and constructions. For highly aerobic, extremely wet or unusually long-lasting activities in extremely harsh conditions. Fully seam sealed. Durable Water Repellency treatment (DWR). FEATURES: 2-layer fabric construction, Fully insulated with 80g LIFALOFT Insulation and brushed stretch panels for added breathability, LIFE POCKET, Hi vis hood brim, ventilation zippers, RECCO Advanced Rescue system, Fusion modular system jacket to pant, Detachable powder skirt, Helmet compatible hood with adjustment, Dual hand warmer pockets and one chest pocket with goggle shammy, Wrist gaiters with thumb hole GOOD FOR: Winter, Resort Skiing, Freeride, Backcountry Ski Touring, Mountaineering FIT: Relaxed - Drapes loosely on the body. Pants are going to be relaxed at the waist and much roomier throughout the thigh, knee, and cuff.' - 'Title: JORDAN CRAIG KIDS DENALI SHEARLING JACKET_MIDNIGHT SMOKE_91445B Descripion: Faux suede body Plush faux shearling lining throughout body and sleeves Shearling accented pockets Horn toggle front closure Vegan suede straps with buckle at neck Detachable faux fox fur hood with zipper Also available in RED, PINK, PURPLE, COGNAC, BLACK AND MIDNIGHT SMOKE. 
STYLE#: 91445B Zipper closure Faux suede body Plush faux shearling lining throughout body and sleeves Shearling accented pockets Horn toggle front closure Vegan suede straps with buckle at neck Detachable faux fox fur hood with zipper' - source_sentence: kids' jacket featuring a soft exterior, cozy inner lining, and multiple pockets. sentences: - 'Title: Free People Rocky Ridge Jacket Black Check LG (Women''s 12-14) Descripion: Layer up with the perfect studiotostreet layering FP Movement Rocky Ridge Jacket. The relaxed fit buttondown fleece jacket lends a flattering and slouchy silhouette for cozy allday comfort. Free People Movement is now FP Movement. FP Movement athletic wear provides the same blend of performance and style that set your workout look apart. Foldover collar. Long sleeves with stretch cuffs. Front hand pockets. Straight hemline. Main: 100% polyester;Secondary: 95% cotton, 5% elastane. Machine washable. Imported. Measurements: Length: 22 in Chest Measurement: 40 in Sleeve Length: 28 in Product measurements were taken using size XS (Women''s 02). Please note that measurements may vary by size. Fleece Button closure Machine Wash Care instructions: Machine Wash' - 'Title: JORDAN CRAIG KIDS DENALI SHEARLING JACKET_MIDNIGHT SMOKE_91445B Descripion: Faux suede body Plush faux shearling lining throughout body and sleeves Shearling accented pockets Horn toggle front closure Vegan suede straps with buckle at neck Detachable faux fox fur hood with zipper Also available in RED, PINK, PURPLE, COGNAC, BLACK AND MIDNIGHT SMOKE. STYLE#: 91445B Zipper closure Faux suede body Plush faux shearling lining throughout body and sleeves Shearling accented pockets Horn toggle front closure Vegan suede straps with buckle at neck Detachable faux fox fur hood with zipper' - 'Title: Helly Hansen Women''s Long Belfast Waterproof Windproof Breathable Raincoat Jacket with Hood, 597 Navy, Medium Descripion: 3/4 length Helly Tech Protection raincoat and hood keep the rain out, with comfort features inside. The mesh liner keeps you dry with added warmth and sporty detailing. Adjustable fit and zippered pockets add convenience. 100% Other Fibers Imported Zipper closure Machine Wash Helly Tech Protection Waterproof, windproof, and breathable 2 Ply fabric construction Fully seam sealed Durable Water Repellency treatment (DWR)' --- # SentenceTransformer based on sentence-transformers/all-mpnet-base-v2 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) <!-- at revision 9a3225965996d404b775526de6dbfe85d3368642 -->
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: MPNetModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("knguyennguyen/mpnet_jacket4k_adjustedv3")
# Run inference
sentences = [
    "kids' jacket featuring a soft exterior, cozy inner lining, and multiple pockets.",
    'Title: JORDAN CRAIG KIDS DENALI SHEARLING JACKET_MIDNIGHT SMOKE_91445B Descripion: Faux suede body Plush faux shearling lining throughout body and sleeves Shearling accented pockets Horn toggle front closure Vegan suede straps with buckle at neck Detachable faux fox fur hood with zipper Also available in RED, PINK, PURPLE, COGNAC, BLACK AND MIDNIGHT SMOKE. STYLE#: 91445B Zipper closure Faux suede body Plush faux shearling lining throughout body and sleeves Shearling accented pockets Horn toggle front closure Vegan suede straps with buckle at neck Detachable faux fox fur hood with zipper',
    "Title: Helly Hansen Women's Long Belfast Waterproof Windproof Breathable Raincoat Jacket with Hood, 597 Navy, Medium Descripion: 3/4 length Helly Tech Protection raincoat and hood keep the rain out, with comfort features inside. The mesh liner keeps you dry with added warmth and sporty detailing. Adjustable fit and zippered pockets add convenience. 100% Other Fibers Imported Zipper closure Machine Wash Helly Tech Protection Waterproof, windproof, and breathable 2 Ply fabric construction Fully seam sealed Durable Water Repellency treatment (DWR)",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 11,097 training samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 13.56 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 43 tokens</li><li>mean: 120.72 tokens</li><li>max: 128 tokens</li></ul> | * Samples: | anchor | positive | |:-----------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>a winter jacket for outdoor activities and urban wear for women</code> | <code>Title: Obermeyer Women's Tuscany Ii Jacket Descripion: Stepping lightly on our planet has always been part of our mindset, this year we partnered with Repreve®, a branded performance fabric fiber made from recycled plastics, on our most popular winter jacket, The Tuscany II. The Tuscany II radiates tastefulness and beautiful styling. Whether in the city or on the trail, the Tuscany II is a winter coat that is timeless. Elevated sophistication with removable faux fur trim, extra-soft fleece lined collar, and adjustable hem. Special Inclusive Sizing available. 100% Polyester Imported Zipper closure Hand Wash Only 10K/10K Thermore Classic; 150gm body & sleeve; 40gm hood Ski Contour articulation; YKK zipper); removable hood; 2-way adjustable hood; Removable faux fur; fleece lined collar; zipper Handwarmer pockets); Zipper pass pocket(s) Stretch inner cuffs; fleece chin warmer; interior electronics pocket(s); cord routing Guide; interior goggle pocket; detachable, scratch-free goggle cloth; Snap-away water-resistant powder skirt; adjustable hem</code> | | <code>a shirt jacket for staying warm during cold nights</code> | <code>Title: Wrangler Authentics Men's Long Sleeve Sherpa Lined Shirt Jacket Descripion: Wrangler Authentics Men's Sherpa Lined Denim Shirt. Built to keep you warm and comfortable during those cold nights. This versatile shirt jacket is extremely functional and can be worn on many different occasions. Stay warm in this sherpa lined denim shirt all winter long. Shell: 55% Cotton, 45% Polyester; Lining: 100% Polyester Imported Button closure Machine Wash RELAXED FIT. Constructed with comfort in mind, this mid-weight flannel shirt will keep you comfortable in many climates. Wear alone or with additional layers in cooler temperatures. PLUSH SHERPA LINING. This unique flannel shirt allows you to take your Fall style into Winter with the added warmth of shearling lining. FUNCTIONAL STYLE. Guaranteed to keep you warm without compromising style, this shearling flannel is built for function and style. Wear it on the job or out to lunch, this shirt jacket can be worn for many occasions. ALL DAY COMFORT. Crafted with a 100% Cotton flannel shell and 100% Polyester soft tan sherpa lining with diamond black quilted padding in the sleeves, this shirt jacket is guaranteed to provide all day comfort. ADDED STORAGE. 
(2) Chest pockets to ensure that there is always room for your basic necessities - great for storing your wallet, sunglasses or any other quick-access items you may need.</code> | | <code>a blazer for formal occasions and events</code> | <code>Title: Men's Linen Blazer Jacket Casual Slim Fit Lightweight Two Buttons Blazer Sport Coat Descripion: Item: Men's Line Blazer Department :Mens Package include:1pc Linen Jacket You can choose high quality goods here,including men's wedding suits,business suits,tuxedo, men's dress suits,men's vest,suit for wedding prom and other clothes and clothing accessories. Nice choice for wedding party,nightclub,dinner,prom party,daily life,business meeting,homecoming and back to school,any fashion forward parties,any holiday,work and other Formal Occasions,Suits can also be given as birthday gifts to your family or friends. We are committed to offer you the most innovative and comfortable fabric product,and bring you the best service as we can. Note:1.The real color of the item may be slightly different from the pictures shown on website due to camera quality and monitor settings. Photos are taken in natural light with no flash. 2.Please allow slight deviation for the measurement data. 3.If there is no size that suits you,please send us your detail measurements as following: 1:neckline=___'' 2:shoulder=___'' 3:sleeve=___'' 4:armhole (bicep)=___'' 5:cuff=___'' 6:chest=___'' 7:belly=___'' 8:waist=___'' 9:hip=___'' 10:clothes length=___'' 11:pants length=___'' 12:thigh=___'' 13:height =___'' 14:weight=___'' 50% Rayon, 50% Linen Imported Linen lining Button closure Hand Wash Only 【Premium Material】-- 50% Linen,50% Rayon;This Linen Balzer for men is made of high quality material which is comfortable,absorbent,good air permeability which can reduce skin irritation,and easier to wash,providing a comfortable wearing,experience and highlighting your body shape at the same time. 【About Size】-- More size information please refer to the size chart in the image,recommend 1-2 size up.If you have any questions about the linen suit,please feel free to contact us.We will provide the best solution for you within 24 hours. 【Occasion】-- This Linen Jacket is suitable for multiple occasions,such as Summer Beach,daily use,business meeting,fashion shows,parties,or grand holidays,etc.With our affordable price,we strongly suggest you,purchase multiple colors and it will be so easy to find your proper suit for the event! 【FINEST MATERIAL & CAREFUL CRAFTMANSHIP】-- Wangyue understands that upscale linen material and careful craftmanship are the two most important things to a linen blazer. That's why we are having the most experienced tailors to handmake these jackets with the finest fabric and material. From head to toe, you will find the blazer are exactly the same quality as our pictures. 
【Can Give Gift】It is unique,comfortable,atmospheric and self cultivation.It is the best perfect gift for your father,son,boyfriend,classmate.If you take it out as a gift,it will brighten your face and at the same time,you will receive a lot of compliments.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 300 evaluation samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 300 samples: | | anchor | positive | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 25.18 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 52 tokens</li><li>mean: 125.31 tokens</li><li>max: 128 tokens</li></ul> | * Samples: | anchor | positive | |:------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>a cardigan for school uniform or cosplay costume outfit for teen girls</code> | <code>Title: Teen Girls Cute Knitted V-Neck Sweater Anime Japanese Cosplay JK Uniform Cardigan Long Sleeve High School Cozy Jacket Coat Descripion: Size Chart(Inch): Small--Bust: 45.3 Shoulder:24.0 Shoulder:23.6 Medium--Bust:47.2 Shoulder:24.8 Shoulder:24.4 Large—Bust: 49.2 Shoulder:25.6 Shoulder: 25.2 X-Large—Bust: 51.2 Shoulder:26.4 Shoulder: 26.0 The size is Asian size, please refer to size details below before order Package includes: Cardigan sweater*1 Design details: Japanese sailor jk uniform cardigan candy kawaii sweater with Long sleeve,v-neck Women knit sweater coat, The cardigan could as a cosplay costume outfit,also would be fine to wear costumes to school or as a regular outfit,relaxed fit,buttons closure and side pockets, perfect for casual, daily wear in spring, autumn, winter Machine washing is OK. cold water recommended. don’t bleach. hang to dry 进口 Button closure Machine Wash Package Included: 1*Girl's sweater; other accessories that models wear in the pictures are not sold in this link. Feature: puff sleeve, v-neck, Japanese style, solid color, buttons closure, cozy fabric and loose fit teens coats, Loose Knitted Sweaters for Women Cute Sweater; Girls can wear it for school uniform, Japanese style loose fall sweaters v neck casual pullover sweaters for women soft warm juniors sweaters, Long Sleeve V-Neck Knitted Cardigan Sweater Anime Japanese School Girl Uniform Daily wear suit school uniforms, Fun as birthday or holiday gifts, Halloween party uniforms, Cosplay party uniforms. sailor suits, Christmas party and so on. Perfect to match with basic T-shirt, crop top, leggings, shorts, skirt, jeans, cosplay dress, knee high boots or lolita shoes for a cute look. 
Soft fabric, skin-friendly, keep you warm in spring and autumn</code> | | <code>a warm winter jacket for little girls aged 1-6 years</code> | <code>Title: ZHICHUANG Girls Novelty Stawberry Print Jacket for 1-6 Years,Little Girls' Winter Warm Thick Coat Bear Ears with Hooded Descripion: Toddler Baby Girls Boys Winter Cartoon Cow Strawberry Dinosaur Windproof Coat Hooded Warm Outerwear JacketFeatures:Feature:Main Color: The Picture ShowPattern Type:PrintIf your kid is , we recomend choosing a larger size, thanks.Material:PolyesterProduct Description:Main Color: The Picture ShowSizeRecommended AgeBustLength9012-24 Months67cm/26.38''37.5cm/14.76''1002-3 Years71cm/27.95''40cm/15.75''1103-4 Years75cm/29.53''42.5cm/16.73''1204-5 Years79cm/31.10''45cm/17.72''1305-6 Years83cm/32.68''47.5cm/18.70''Size:90Recommended Age:12-24 MonthsBust:67cm/26.38''Length:37.5cm/14.76''Size:100Recommended Age:2-3 YearsBust:71cm/27.95''Length:40cm/15.75''Size:110Recommended Age:3-4 YearsBust:75cm/29.53''Length:42.5cm/16.73''Size:120Recommended Age:4-5 YearsBust:79cm/31.10''Length:45cm/17.72''Size:130Recommended Age:5-6 YearsBust:83cm/32.68''Length:47.5cm/18.70''Size: 90 Recommended Age: 12-24 Months Bust: 67cm/26.38'' Length: 37.5cm/14.76'' Size: 100 Recommended Age: 2-3 Years Bust: 71cm/27.95'' Length: 40cm/15.75'' Size: 110 Recommended Age: 3-4 Years Bust: 75cm/29.53'' Length: 42.5cm/16.73'' Size: 120 Recommended Age: 4-5 Years Bust: 79cm/31.10'' Length: 45cm/17.72'' Size: 130 Recommended Age: 5-6 Years Bust: 83cm/32.68'' Length: 47.5cm/18.70'' 🎁✈ Fuzzy Fleece,Cotton Blend 🎉👏 Welcome to ZHICHUANG shop,we are a children's clothing shop,there are child winter jacket,coat,jumpsuis,pants set,boots shoes,support bulk purchase and express shipping,customer service is online 24 hours. 💯💖🌟 [Thanks Attention] - Please check our own size table to select right size(Not the Amazon Size Guide). We give detailed size information in the description on the bottom left page.Our size chart is based on standard data. If your child is a little bit taller or heavier, we recommend you choose the next size. If you can't choose the correct size or have any questions, please feel free to contact us. closure Hand Wash Only Material:Polyester Pattern Type:Solid Very stylish and cute design, carefully selected gentle and skin-friendly materials, and thickened warmth experience, so that your children will be protected from the cold in this Winter. Perfect for wedding , baptism, ceremony, dinner, kindergarten,school, photo shoot,formal wear, Christmas,birthday party, family gathering,casual daily wear, playwear or other occasions. Plus Elephant  Jumpsuit Coat</code> | | <code>a jacket for skiing and snowboarding activities for men</code> | <code>Title: Spyder Mens Hooded Midi Anorak Jacket Descripion: Spyder Signal GTX Anorak - 201040A functional yet stylish jacket that dominates around town and on the mountain, the Signal GTX Anorak is bringing back the Anorak design in a big way. GORE-TEX Stretch Polyester, 80 grams of black eco insulation and YKK zippers equip the Signal GTX Anorak with the tools necessary to conquer a hard day on the slopes. Large side body zips make the Signal GTX Anorak easy to get in and out of. A fully functional zippered front kangaroo pocket gives you all the storage you could possibly need. With all of these features, the Signal GTX Anorak is here to stay. 
Stretch Polyester Plain Weave 2L with GORE-TE Laminate and PFCecFree DWR PrimaLoft Black ECO Insulation (80g) YKK Zippers Fixed helmet compatible hood with adjustable opening Fully seam taped YKK reverse coil center front zipper Underarm ventilation system with side entry at wearer's right side Secure data card pocket Custom Chassis: Taffeta lining with strategic stretch panels and fixed powder skirt with snapback feature Hook and loop secure anorak pocket Zippered pass through kangaroo hand pockets Drawcord adjustable hem Polyester Zipper closure Hand Wash Only</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: epoch - `per_device_train_batch_size`: 128 - `learning_rate`: 2e-05 - `num_train_epochs`: 10 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `load_best_model_at_end`: True #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: epoch - `prediction_loss_only`: True - `per_device_train_batch_size`: 128 - `per_device_eval_batch_size`: 8 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 10 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None 
- `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `batch_sampler`: batch_sampler - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | loss | |:-------:|:-------:|:-------------:|:---------:| | 0.5747 | 50 | 1.6459 | - | | 1.0 | 87 | - | 0.0897 | | 1.1494 | 100 | 0.6779 | - | | 1.7241 | 150 | 0.512 | - | | 2.0 | 174 | - | 0.0765 | | 2.2989 | 200 | 0.3965 | - | | 2.8736 | 250 | 0.3655 | - | | 3.0 | 261 | - | 0.0658 | | 3.4483 | 300 | 0.3009 | - | | 4.0 | 348 | - | 0.0658 | | 4.0230 | 350 | 0.2838 | - | | 4.5977 | 400 | 0.2394 | - | | 5.0 | 435 | - | 0.0689 | | 5.1724 | 450 | 0.2279 | - | | 5.7471 | 500 | 0.2125 | - | | 6.0 | 522 | - | 0.0680 | | 6.3218 | 550 | 0.1913 | - | | 6.8966 | 600 | 0.1899 | - | | **7.0** | **609** | **-** | **0.063** | | 7.4713 | 650 | 0.1793 | - | | 8.0 | 696 | - | 0.0631 | | 8.0460 | 700 | 0.1676 | - | | 8.6207 | 750 | 0.1643 | - | | 9.0 | 783 | - | 0.0638 | | 9.1954 | 800 | 0.1772 | - | | 9.7701 | 850 | 0.1665 | - | | 10.0 | 870 | - | 0.0636 | * The bold row denotes the saved checkpoint. ### Framework Versions - Python: 3.11.11 - Sentence Transformers: 3.1.1 - Transformers: 4.45.2 - PyTorch: 2.5.1+cu121 - Accelerate: 1.2.1 - Datasets: 3.2.0 - Tokenizers: 0.20.3 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
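The non-default hyperparameters listed under Training Details map directly onto the Sentence Transformers trainer API. The snippet below is a minimal sketch of an equivalent anchor/positive finetuning run, not the exact script used for this model: the dataset id `your-username/jacket-pairs` is a placeholder for the unnamed 11k-pair dataset, and `save_strategy="epoch"` is added here (it is not in the list above) so that `load_best_model_at_end` can match the epoch-level evaluation.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

# Start from the same base model this card was finetuned from.
model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
model.max_seq_length = 128  # matches the 128-token limit reported above

# Placeholder dataset id: any dataset with "anchor" and "positive" string
# columns (short query -> matching product text) has the right shape.
dataset = load_dataset("your-username/jacket-pairs", split="train")
splits = dataset.train_test_split(test_size=300, seed=42)

# In-batch negatives: every other positive in a batch acts as a negative
# for a given anchor, with the scale=20.0 / cosine setup shown above.
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="mpnet_jacket4k_adjustedv3",
    num_train_epochs=10,
    per_device_train_batch_size=128,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    eval_strategy="epoch",
    save_strategy="epoch",  # added so load_best_model_at_end works
    load_best_model_at_end=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=splits["train"],
    eval_dataset=splits["test"],
    loss=loss,
)
trainer.train()
```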
[ "TEXT_CLASSIFICATION" ]
[ "BEAR" ]
Non_BioNLP
tessimago/bge-large-repmus-matryoshka
tessimago
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:1024", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss", "en", "arxiv:1908.10084", "arxiv:2205.13147", "arxiv:1705.00652", "base_model:BAAI/bge-large-en-v1.5", "base_model:finetune:BAAI/bge-large-en-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,726
1,726
9
0
--- base_model: BAAI/bge-large-en-v1.5 language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:1024 - loss:MatryoshkaLoss - loss:MultipleNegativesRankingLoss widget: - source_sentence: After rescue, survivors may require hospital treatment. This must be provided as quickly as possible. The SMC should consider having ambulance and hospital facilities ready. sentences: - What should the SMC consider having ready after a rescue? - What is critical for mass rescue operations? - What can computer programs do to relieve the search planner of computational burden? - source_sentence: SMCs conduct communication searches when facts are needed to supplement initially reported information. Efforts are continued to contact the craft, to find out more about a possible distress situation, and to prepare for or to avoid a search effort. Section 3.5 has more information on communication searches.MEDICO Communications sentences: - What is generally produced by dead-reckoning navigation alone for search aircraft? - What should be the widths of rectangular areas to be covered with a PS pattern and the lengths of rectangular areas to be covered with a CS pattern? - What is the purpose of SMCs conducting communication searches? - source_sentence: 'SAR facilities include designated SRUs and other resources which can be used to conduct or support SAR operations. An SRU is a unit composed of trained personnel and provided with equipment suitable for the expeditious and efficient conduct of search and rescue. An SRU can be an air, maritime, or land-based facility. Facilities selected as SRUs should be able to reach the scene of distress quickly and, in particular, be suitable for one or more of the following operations:– providing assistance to prevent or reduce the severity of accidents and the hardship of survivors, e.g., escorting an aircraft, standing by a sinking vessel;– conducting a search;– delivering supplies and survival equipment to the scene;– rescuing survivors;– providing food, medical or other initial needs of survivors; and– delivering the survivors to a place of safety. ' sentences: - What are the types of SAR facilities that can be used to conduct or support SAR operations? - What is the scenario in which a simulated communication search is carried out and an air search is planned? - What is discussed in detail in various other places in this Manual? - source_sentence: Support facilities enable the operational response resources (e.g., the RCC and SRUs) to provide the SAR services. Without the supporting resources, the operational resources cannot sustain effective operations. 
There is a wide range of support facilities and services, which include the following:Training facilities Facility maintenanceCommunications facilities Management functionsNavigation systems Research and developmentSAR data providers (SDPs) PlanningMedical facilities ExercisesAircraft landing fields Refuelling servicesVoluntary services (e.g., Red Cross) Critical incident stress counsellors Computer resources sentences: - How many ways are there to train SAR specialists and teams? - What types of support facilities are mentioned in the context? - What is the duration of a prolonged blast? - source_sentence: 'Sound funding decisions arise out of accurate assessments made of the SAR system. To measure the performance or effectiveness of a SAR system usually requires collecting information or statistics and establishing agreed-upon goals. All pertinent information should be collected, including where the system failed to perform as it should have; failures and successes provide valuable information in assessing effectiveness and determining means to improve. ' sentences: - What is required to measure the performance or effectiveness of a SAR system? - What is the purpose of having an SRR? - What is the effect of decreasing track spacing on the area that can be searched? model-index: - name: BGE base Financial Matryoshka results: - task: type: information-retrieval name: Information Retrieval dataset: name: dim 768 type: dim_768 metrics: - type: cosine_accuracy@1 value: 0.7631578947368421 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9122807017543859 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9385964912280702 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9912280701754386 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.7631578947368421 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.30409356725146197 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18771929824561404 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09912280701754386 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.7631578947368421 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9122807017543859 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9385964912280702 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9912280701754386 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8800566604626379 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8442112225006964 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8449422166527428 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 512 type: dim_512 metrics: - type: cosine_accuracy@1 value: 0.7456140350877193 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9210526315789473 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9385964912280702 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9912280701754386 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.7456140350877193 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.30701754385964913 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18771929824561404 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09912280701754386 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.7456140350877193 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9210526315789473 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9385964912280702 name: Cosine Recall@5 - type: cosine_recall@10 
value: 0.9912280701754386 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8757357824813555 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8383040935672514 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8389306599832915 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 256 type: dim_256 metrics: - type: cosine_accuracy@1 value: 0.7280701754385965 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.8947368421052632 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9385964912280702 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.956140350877193 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.7280701754385965 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.2982456140350877 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18771929824561406 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.0956140350877193 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.7280701754385965 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.8947368421052632 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9385964912280702 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.956140350877193 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8514949465138896 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8167397660818715 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8197472848788638 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 128 type: dim_128 metrics: - type: cosine_accuracy@1 value: 0.6842105263157895 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.8596491228070176 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.8947368421052632 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9385964912280702 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.6842105263157895 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.28654970760233917 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.17894736842105263 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09385964912280703 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.6842105263157895 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.8596491228070176 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.8947368421052632 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9385964912280702 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8139200097505314 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.7736702868281816 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.7777583689864392 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 64 type: dim_64 metrics: - type: cosine_accuracy@1 value: 0.6140350877192983 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.7456140350877193 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.8245614035087719 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.8947368421052632 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.6140350877192983 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.24853801169590642 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.16491228070175437 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.08947368421052632 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.6140350877192983 name: Cosine Recall@1 - type: cosine_recall@3 value: 
0.7456140350877193
        name: Cosine Recall@3
      - type: cosine_recall@5
        value: 0.8245614035087719
        name: Cosine Recall@5
      - type: cosine_recall@10
        value: 0.8947368421052632
        name: Cosine Recall@10
      - type: cosine_ndcg@10
        value: 0.7479917679807845
        name: Cosine Ndcg@10
      - type: cosine_mrr@10
        value: 0.7017961570593151
        name: Cosine Mrr@10
      - type: cosine_map@100
        value: 0.7073668567988093
        name: Cosine Map@100
---

# BGE base Financial Matryoshka

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) on the json dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) <!-- at revision d4aa6901d3a41ba39fb536a557fa166f842b0e09 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - json
- **Language:** en
- **License:** apache-2.0

### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tessimago/bge-large-repmus-matryoshka")
# Run inference
sentences = [
    'Sound funding decisions arise out of accurate assessments made of the SAR system. To measure the performance or effectiveness of a SAR system usually requires collecting information or statistics and establishing agreed-upon goals. All pertinent information should be collected, including where the system failed to perform as it should have; failures and successes provide valuable information in assessing effectiveness and determining means to improve. ',
    'What is required to measure the performance or effectiveness of a SAR system?',
    'What is the effect of decreasing track spacing on the area that can be searched?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `dim_768` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.7632 | | cosine_accuracy@3 | 0.9123 | | cosine_accuracy@5 | 0.9386 | | cosine_accuracy@10 | 0.9912 | | cosine_precision@1 | 0.7632 | | cosine_precision@3 | 0.3041 | | cosine_precision@5 | 0.1877 | | cosine_precision@10 | 0.0991 | | cosine_recall@1 | 0.7632 | | cosine_recall@3 | 0.9123 | | cosine_recall@5 | 0.9386 | | cosine_recall@10 | 0.9912 | | cosine_ndcg@10 | 0.8801 | | cosine_mrr@10 | 0.8442 | | **cosine_map@100** | **0.8449** | #### Information Retrieval * Dataset: `dim_512` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.7456 | | cosine_accuracy@3 | 0.9211 | | cosine_accuracy@5 | 0.9386 | | cosine_accuracy@10 | 0.9912 | | cosine_precision@1 | 0.7456 | | cosine_precision@3 | 0.307 | | cosine_precision@5 | 0.1877 | | cosine_precision@10 | 0.0991 | | cosine_recall@1 | 0.7456 | | cosine_recall@3 | 0.9211 | | cosine_recall@5 | 0.9386 | | cosine_recall@10 | 0.9912 | | cosine_ndcg@10 | 0.8757 | | cosine_mrr@10 | 0.8383 | | **cosine_map@100** | **0.8389** | #### Information Retrieval * Dataset: `dim_256` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.7281 | | cosine_accuracy@3 | 0.8947 | | cosine_accuracy@5 | 0.9386 | | cosine_accuracy@10 | 0.9561 | | cosine_precision@1 | 0.7281 | | cosine_precision@3 | 0.2982 | | cosine_precision@5 | 0.1877 | | cosine_precision@10 | 0.0956 | | cosine_recall@1 | 0.7281 | | cosine_recall@3 | 0.8947 | | cosine_recall@5 | 0.9386 | | cosine_recall@10 | 0.9561 | | cosine_ndcg@10 | 0.8515 | | cosine_mrr@10 | 0.8167 | | **cosine_map@100** | **0.8197** | #### Information Retrieval * Dataset: `dim_128` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.6842 | | cosine_accuracy@3 | 0.8596 | | cosine_accuracy@5 | 0.8947 | | cosine_accuracy@10 | 0.9386 | | cosine_precision@1 | 0.6842 | | cosine_precision@3 | 0.2865 | | cosine_precision@5 | 0.1789 | | cosine_precision@10 | 0.0939 | | cosine_recall@1 | 0.6842 | | cosine_recall@3 | 0.8596 | | cosine_recall@5 | 0.8947 | | cosine_recall@10 | 0.9386 | | cosine_ndcg@10 | 0.8139 | | cosine_mrr@10 | 0.7737 | | **cosine_map@100** | **0.7778** | #### Information Retrieval * Dataset: `dim_64` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | 
Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.614 | | cosine_accuracy@3 | 0.7456 | | cosine_accuracy@5 | 0.8246 | | cosine_accuracy@10 | 0.8947 | | cosine_precision@1 | 0.614 | | cosine_precision@3 | 0.2485 | | cosine_precision@5 | 0.1649 | | cosine_precision@10 | 0.0895 | | cosine_recall@1 | 0.614 | | cosine_recall@3 | 0.7456 | | cosine_recall@5 | 0.8246 | | cosine_recall@10 | 0.8947 | | cosine_ndcg@10 | 0.748 | | cosine_mrr@10 | 0.7018 | | **cosine_map@100** | **0.7074** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### json * Dataset: json * Size: 1,024 training samples * Columns: <code>positive</code> and <code>anchor</code> * Approximate statistics based on the first 1000 samples: | | positive | anchor | |:--------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 133.58 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 17.7 tokens</li><li>max: 39 tokens</li></ul> | * Samples: | positive | anchor | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------| | <code>The debriefing helps to ensure that all survivors are rescued, to attend to the physical welfare of each survivor, and to obtain information which may assist and improve SAR services. Proper debriefing techniques include:– due care to avoid worsening a survivor’s condition by excessive debriefing;– careful assessment of the survivor’s statements if the survivor is frightened or excited;– use of a calm voice in questioning;– avoidance of suggesting the answers when obtaining facts; and– explaining that the information requested is important for the success of the SAR operation, and possibly for future SAR operations.</code> | <code>What are some proper debriefing techniques used in SAR services?</code> | | <code>Communicating with passengers is more difficult in remote areas where phone service may be inadequate or lacking. If phones do exist, calling the airline or shipping company may be the best way to check in and find out information. 
In more populated areas, local agencies may have an emergency evacuation plan or other useful plan that can be implemented.IE961E.indb 21 6/28/2013 10:29:55 AM</code> | <code>What is a good way to check in and find out information in remote areas where phone service may be inadequate or lacking?</code> | | <code>Voice communication is the basis of telemedical advice. It allows free dialogue and contributes to the human relationship, which is crucial to any medical consultation. Text messages are a useful complement to the voice telemedical advice and add the reliability of writing. Facsimile allows the exchange of pictures or diagrams, which help to identify a symptom, describe a lesion or the method of treatment. Digital data transmissions (photographs or electrocardiogram) provide an objective and potentially crucial addition to descriptive and subjective clinical data.</code> | <code>What are the types of communication methods used in telemedical advice?</code> | * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters: ```json { "loss": "MultipleNegativesRankingLoss", "matryoshka_dims": [ 768, 512, 256, 128, 64 ], "matryoshka_weights": [ 1, 1, 1, 1, 1 ], "n_dims_per_step": -1 } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: epoch - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `gradient_accumulation_steps`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 4 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `bf16`: True - `tf32`: True - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: epoch - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 16 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: True - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': 
None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 | |:-------:|:-----:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|:----------------------:| | 1.0 | 2 | 0.7826 | 0.8163 | 0.8230 | 0.6761 | 0.8359 | | 2.0 | 4 | 0.7739 | 0.8218 | 0.8282 | 0.6939 | 0.8459 | | 3.0 | 6 | 0.7740 | 0.8223 | 0.8409 | 0.7072 | 0.8457 | | **4.0** | **8** | **0.7778** | **0.8197** | **0.8389** | **0.7074** | **0.8449** | * The bold row denotes the saved checkpoint. 
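For reference, the `MatryoshkaLoss` configuration listed under Training Details wraps `MultipleNegativesRankingLoss` so that the same in-batch ranking objective is applied to each leading prefix of the embedding (768, 512, 256, 128 and 64 dimensions), which is what makes truncated vectors usable on their own. A minimal sketch of that construction, illustrative only and not the exact training code:

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("BAAI/bge-large-en-v1.5")

# Inner loss: in-batch negatives with the scale shown above.
inner = losses.MultipleNegativesRankingLoss(model, scale=20.0)

# Outer loss: re-applies the inner loss on each leading slice of the
# embedding; weights default to 1 for every dimension, as configured above.
loss = losses.MatryoshkaLoss(
    model,
    inner,
    matryoshka_dims=[768, 512, 256, 128, 64],
)
```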
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.1.0
- Transformers: 4.41.2
- PyTorch: 2.1.2+cu121
- Accelerate: 0.34.2
- Datasets: 2.19.1
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
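The `dim_*_cosine_map@100` columns in the training logs come from evaluating the same model at several truncated embedding sizes, which is the point of Matryoshka training. As a minimal sketch, assuming the checkpoint was saved under the hypothetical `matryoshka-run/final` path from the training sketch above, inference at a reduced dimension looks like this:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

# Hypothetical local checkpoint path; truncate_dim keeps only the first 256
# embedding dimensions, one of the sizes this model was trained to support.
model = SentenceTransformer("matryoshka-run/final", truncate_dim=256)

queries = ["What are the types of communication methods used in telemedical advice?"]
passages = [
    "Voice communication is the basis of telemedical advice.",
    "Local agencies may have an emergency evacuation plan.",
]

query_embeddings = model.encode(queries)
passage_embeddings = model.encode(passages)
print(query_embeddings.shape)                         # (1, 256)
print(cos_sim(query_embeddings, passage_embeddings))  # query-passage similarities
```

Smaller truncation dims trade a little retrieval quality (compare the map@100 columns above) for faster search and smaller indexes.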
[ "TEXT_CLASSIFICATION" ]
[ "CRAFT" ]
Non_BioNLP
delwinn/stella_en_1.5B_v5
delwinn
feature-extraction
[ "sentence-transformers", "pytorch", "safetensors", "qwen2", "text-generation", "mteb", "transformers", "sentence-similarity", "feature-extraction", "custom_code", "arxiv:2205.13147", "license:mit", "model-index", "autotrain_compatible", "text-generation-inference", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,731
1,731
13
0
--- license: mit pipeline_tag: feature-extraction tags: - mteb - sentence-transformers - transformers - sentence-similarity model-index: - name: stella_en_1.5B_v5 results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 92.86567164179104 - type: ap value: 72.13503907102613 - type: ap_weighted value: 72.13503907102613 - type: f1 value: 89.5586886376355 - type: f1_weighted value: 93.13621183004571 - type: main_score value: 92.86567164179104 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 97.16485 - type: ap value: 96.05546315415225 - type: ap_weighted value: 96.05546315415225 - type: f1 value: 97.16351087403213 - type: f1_weighted value: 97.16351087403213 - type: main_score value: 97.16485 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 59.358 - type: f1 value: 59.0264615883114 - type: f1_weighted value: 59.0264615883114 - type: main_score value: 59.358 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: main_score value: 65.269 - type: map_at_1 value: 41.607 - type: map_at_10 value: 57.104 - type: map_at_100 value: 57.621 - type: map_at_1000 value: 57.621 - type: map_at_20 value: 57.533 - type: map_at_3 value: 52.891999999999996 - type: map_at_5 value: 55.371 - type: mrr_at_1 value: 42.318634423897585 - type: mrr_at_10 value: 57.353970511865406 - type: mrr_at_100 value: 57.88398078476526 - type: mrr_at_1000 value: 57.88467807648422 - type: mrr_at_20 value: 57.796730533206166 - type: mrr_at_3 value: 53.200568990042775 - type: mrr_at_5 value: 55.6330014224753 - type: nauc_map_at_1000_diff1 value: 24.54414600428287 - type: nauc_map_at_1000_max value: -8.389738078358459 - type: nauc_map_at_1000_std value: -18.188787645801366 - type: nauc_map_at_100_diff1 value: 24.543138576462308 - type: nauc_map_at_100_max value: -8.390896839752044 - type: nauc_map_at_100_std value: -18.192549240185247 - type: nauc_map_at_10_diff1 value: 24.219607088995822 - type: nauc_map_at_10_max value: -8.245734391254308 - type: nauc_map_at_10_std value: -18.229706566466447 - type: nauc_map_at_1_diff1 value: 29.325201664812788 - type: nauc_map_at_1_max value: -11.742800494823971 - type: nauc_map_at_1_std value: -18.610215769702528 - type: nauc_map_at_20_diff1 value: 24.471097562798803 - type: nauc_map_at_20_max value: -8.318035874000799 - type: nauc_map_at_20_std value: -18.171541096773108 - type: nauc_map_at_3_diff1 value: 24.275846107642824 - type: nauc_map_at_3_max value: -8.212242049581894 - type: nauc_map_at_3_std value: -17.920379368937496 - type: nauc_map_at_5_diff1 value: 23.873692493209255 - type: nauc_map_at_5_max value: -8.110347163828767 - type: nauc_map_at_5_std value: -18.20863325596931 - type: nauc_mrr_at_1000_diff1 value: 22.656410956419975 - type: nauc_mrr_at_1000_max value: -8.924888102233243 - type: nauc_mrr_at_1000_std value: -18.103674384502526 - type: nauc_mrr_at_100_diff1 value: 22.655448817140968 - type: nauc_mrr_at_100_max value: -8.926034318499038 - type: 
nauc_mrr_at_100_std value: -18.10743930104164 - type: nauc_mrr_at_10_diff1 value: 22.297536272996872 - type: nauc_mrr_at_10_max value: -8.836407556658274 - type: nauc_mrr_at_10_std value: -18.1598393044477 - type: nauc_mrr_at_1_diff1 value: 27.419572424489708 - type: nauc_mrr_at_1_max value: -11.42241314820691 - type: nauc_mrr_at_1_std value: -18.54893865856313 - type: nauc_mrr_at_20_diff1 value: 22.590227214657418 - type: nauc_mrr_at_20_max value: -8.849986456376993 - type: nauc_mrr_at_20_std value: -18.0862391777352 - type: nauc_mrr_at_3_diff1 value: 22.415270167774988 - type: nauc_mrr_at_3_max value: -8.692871854156435 - type: nauc_mrr_at_3_std value: -17.6740102891955 - type: nauc_mrr_at_5_diff1 value: 21.96284578521464 - type: nauc_mrr_at_5_max value: -8.757031535546025 - type: nauc_mrr_at_5_std value: -18.210766964081294 - type: nauc_ndcg_at_1000_diff1 value: 23.939400161569115 - type: nauc_ndcg_at_1000_max value: -7.866999120512983 - type: nauc_ndcg_at_1000_std value: -17.981457019643617 - type: nauc_ndcg_at_100_diff1 value: 23.920033349619317 - type: nauc_ndcg_at_100_max value: -7.889849409678031 - type: nauc_ndcg_at_100_std value: -18.054931990360537 - type: nauc_ndcg_at_10_diff1 value: 22.543020461303534 - type: nauc_ndcg_at_10_max value: -7.072111788010867 - type: nauc_ndcg_at_10_std value: -18.26397604573537 - type: nauc_ndcg_at_1_diff1 value: 29.325201664812788 - type: nauc_ndcg_at_1_max value: -11.742800494823971 - type: nauc_ndcg_at_1_std value: -18.610215769702528 - type: nauc_ndcg_at_20_diff1 value: 23.551587021207972 - type: nauc_ndcg_at_20_max value: -7.298056222649139 - type: nauc_ndcg_at_20_std value: -18.056004880930608 - type: nauc_ndcg_at_3_diff1 value: 22.669089506345273 - type: nauc_ndcg_at_3_max value: -7.278024373570137 - type: nauc_ndcg_at_3_std value: -17.816657759914193 - type: nauc_ndcg_at_5_diff1 value: 21.72619728226575 - type: nauc_ndcg_at_5_max value: -6.959741647471228 - type: nauc_ndcg_at_5_std value: -18.35173705190235 - type: nauc_precision_at_1000_diff1 value: 5.0388241058076995 - type: nauc_precision_at_1000_max value: 34.439879624882145 - type: nauc_precision_at_1000_std value: 77.22610895194498 - type: nauc_precision_at_100_diff1 value: 1.340670767252794 - type: nauc_precision_at_100_max value: 19.30870025961241 - type: nauc_precision_at_100_std value: 35.37688289157788 - type: nauc_precision_at_10_diff1 value: 7.734227153124332 - type: nauc_precision_at_10_max value: 4.202399088422237 - type: nauc_precision_at_10_std value: -18.383890254046698 - type: nauc_precision_at_1_diff1 value: 29.325201664812788 - type: nauc_precision_at_1_max value: -11.742800494823971 - type: nauc_precision_at_1_std value: -18.610215769702528 - type: nauc_precision_at_20_diff1 value: 9.48070999361637 - type: nauc_precision_at_20_max value: 19.056709637253025 - type: nauc_precision_at_20_std value: -13.266821166159485 - type: nauc_precision_at_3_diff1 value: 17.245260303409747 - type: nauc_precision_at_3_max value: -4.202455033452335 - type: nauc_precision_at_3_std value: -17.514264039955332 - type: nauc_precision_at_5_diff1 value: 12.074628162049974 - type: nauc_precision_at_5_max value: -1.9145501461107832 - type: nauc_precision_at_5_std value: -19.162525528916344 - type: nauc_recall_at_1000_diff1 value: 5.038824105805915 - type: nauc_recall_at_1000_max value: 34.43987962487738 - type: nauc_recall_at_1000_std value: 77.22610895193765 - type: nauc_recall_at_100_diff1 value: 1.3406707672497025 - type: nauc_recall_at_100_max value: 19.30870025960776 - type: 
nauc_recall_at_100_std value: 35.37688289157515 - type: nauc_recall_at_10_diff1 value: 7.734227153124366 - type: nauc_recall_at_10_max value: 4.202399088421976 - type: nauc_recall_at_10_std value: -18.38389025404673 - type: nauc_recall_at_1_diff1 value: 29.325201664812788 - type: nauc_recall_at_1_max value: -11.742800494823971 - type: nauc_recall_at_1_std value: -18.610215769702528 - type: nauc_recall_at_20_diff1 value: 9.480709993616845 - type: nauc_recall_at_20_max value: 19.05670963725301 - type: nauc_recall_at_20_std value: -13.266821166158651 - type: nauc_recall_at_3_diff1 value: 17.24526030340978 - type: nauc_recall_at_3_max value: -4.202455033452323 - type: nauc_recall_at_3_std value: -17.51426403995538 - type: nauc_recall_at_5_diff1 value: 12.074628162049992 - type: nauc_recall_at_5_max value: -1.914550146110865 - type: nauc_recall_at_5_std value: -19.162525528916362 - type: ndcg_at_1 value: 41.607 - type: ndcg_at_10 value: 65.269 - type: ndcg_at_100 value: 67.289 - type: ndcg_at_1000 value: 67.29899999999999 - type: ndcg_at_20 value: 66.76299999999999 - type: ndcg_at_3 value: 56.604 - type: ndcg_at_5 value: 61.07900000000001 - type: precision_at_1 value: 41.607 - type: precision_at_10 value: 9.118 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.8469999999999995 - type: precision_at_3 value: 22.451 - type: precision_at_5 value: 15.647 - type: recall_at_1 value: 41.607 - type: recall_at_10 value: 91.181 - type: recall_at_100 value: 99.57300000000001 - type: recall_at_1000 value: 99.644 - type: recall_at_20 value: 96.942 - type: recall_at_3 value: 67.354 - type: recall_at_5 value: 78.236 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: main_score value: 55.437138353189994 - type: v_measure value: 55.437138353189994 - type: v_measure_std value: 14.718556601335491 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: main_score value: 50.65858459544658 - type: v_measure value: 50.65858459544658 - type: v_measure_std value: 14.887033747525146 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: main_score value: 67.32597152838535 - type: map value: 67.32597152838535 - type: mrr value: 78.98683111286988 - type: nAUC_map_diff1 value: 16.8624639710487 - type: nAUC_map_max value: 24.91996491142433 - type: nAUC_map_std value: 17.91865808793225 - type: nAUC_mrr_diff1 value: 25.03766425631947 - type: nAUC_mrr_max value: 41.64561939958336 - type: nAUC_mrr_std value: 23.179909345891968 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cosine_pearson value: 85.790820496042 - type: cosine_spearman value: 83.10731534330517 - type: euclidean_pearson value: 84.61741304343133 - type: euclidean_spearman value: 83.17297949010973 - type: main_score value: 83.10731534330517 - type: manhattan_pearson value: 85.2137696526676 - type: manhattan_spearman value: 84.39168195786738 - type: pearson value: 85.790820496042 - type: spearman value: 83.10731534330517 - task: type: Classification dataset: name: MTEB 
Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 89.78896103896105 - type: f1 value: 89.76107366333488 - type: f1_weighted value: 89.76107366333488 - type: main_score value: 89.78896103896105 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: main_score value: 50.68092296236376 - type: v_measure value: 50.68092296236376 - type: v_measure_std value: 0.7832640983085436 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: main_score value: 46.86629236732983 - type: v_measure value: 46.86629236732983 - type: v_measure_std value: 0.8784322236350974 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: main_score value: 47.74883333333334 - type: map_at_1 value: 30.179249999999996 - type: map_at_10 value: 41.60824999999999 - type: map_at_100 value: 42.94008333333332 - type: map_at_1000 value: 43.04666666666667 - type: map_at_20 value: 42.36833333333334 - type: map_at_3 value: 38.23491666666666 - type: map_at_5 value: 40.10183333333333 - type: mrr_at_1 value: 36.47676085808166 - type: mrr_at_10 value: 46.300991916437155 - type: mrr_at_100 value: 47.12155753713262 - type: mrr_at_1000 value: 47.168033610799945 - type: mrr_at_20 value: 46.80405724560391 - type: mrr_at_3 value: 43.77000352801797 - type: mrr_at_5 value: 45.22295361704542 - type: nauc_map_at_1000_diff1 value: 46.953671666941524 - type: nauc_map_at_1000_max value: 32.260396316089675 - type: nauc_map_at_1000_std value: 0.6657766120094878 - type: nauc_map_at_100_diff1 value: 46.94717463394555 - type: nauc_map_at_100_max value: 32.25088350678177 - type: nauc_map_at_100_std value: 0.6257017014549283 - type: nauc_map_at_10_diff1 value: 46.974678429336464 - type: nauc_map_at_10_max value: 31.862230807295504 - type: nauc_map_at_10_std value: -0.14758828549579284 - type: nauc_map_at_1_diff1 value: 52.48913346466124 - type: nauc_map_at_1_max value: 29.874374024967725 - type: nauc_map_at_1_std value: -2.433547569836134 - type: nauc_map_at_20_diff1 value: 46.96088684217651 - type: nauc_map_at_20_max value: 32.08954208613205 - type: nauc_map_at_20_std value: 0.25946321113436527 - type: nauc_map_at_3_diff1 value: 47.703230121518345 - type: nauc_map_at_3_max value: 30.977880095983107 - type: nauc_map_at_3_std value: -1.342777563991804 - type: nauc_map_at_5_diff1 value: 47.1615010199957 - type: nauc_map_at_5_max value: 31.420885812683284 - type: nauc_map_at_5_std value: -0.8789297099444306 - type: nauc_mrr_at_1000_diff1 value: 46.69178645962615 - type: nauc_mrr_at_1000_max value: 34.392807413340655 - type: nauc_mrr_at_1000_std value: 1.6155464863667934 - type: nauc_mrr_at_100_diff1 value: 46.67417236349189 - type: nauc_mrr_at_100_max value: 34.384607045512624 - type: nauc_mrr_at_100_std value: 1.6259917384109652 - type: nauc_mrr_at_10_diff1 value: 46.60497560446239 - type: nauc_mrr_at_10_max value: 34.32918897817958 - type: nauc_mrr_at_10_std value: 1.39387793769014 - type: nauc_mrr_at_1_diff1 value: 51.61608573254137 - type: nauc_mrr_at_1_max value: 35.18105023234596 - type: nauc_mrr_at_1_std value: 0.17943702145478177 - type: 
nauc_mrr_at_20_diff1 value: 46.635943069860254 - type: nauc_mrr_at_20_max value: 34.37050973118794 - type: nauc_mrr_at_20_std value: 1.5346464678860607 - type: nauc_mrr_at_3_diff1 value: 47.154389369038334 - type: nauc_mrr_at_3_max value: 34.41036411855465 - type: nauc_mrr_at_3_std value: 0.924551812357872 - type: nauc_mrr_at_5_diff1 value: 46.6690101691763 - type: nauc_mrr_at_5_max value: 34.29740388138466 - type: nauc_mrr_at_5_std value: 1.0567184149139792 - type: nauc_ndcg_at_1000_diff1 value: 45.375448289173264 - type: nauc_ndcg_at_1000_max value: 33.47957083714482 - type: nauc_ndcg_at_1000_std value: 3.192251100225568 - type: nauc_ndcg_at_100_diff1 value: 44.93601014699499 - type: nauc_ndcg_at_100_max value: 33.21249888295249 - type: nauc_ndcg_at_100_std value: 3.609842852934217 - type: nauc_ndcg_at_10_diff1 value: 44.87893284011915 - type: nauc_ndcg_at_10_max value: 32.384885249478515 - type: nauc_ndcg_at_10_std value: 1.454493065035396 - type: nauc_ndcg_at_1_diff1 value: 51.61608573254137 - type: nauc_ndcg_at_1_max value: 35.18105023234596 - type: nauc_ndcg_at_1_std value: 0.17943702145478177 - type: nauc_ndcg_at_20_diff1 value: 44.867752179050605 - type: nauc_ndcg_at_20_max value: 32.689535921840196 - type: nauc_ndcg_at_20_std value: 2.337765158573901 - type: nauc_ndcg_at_3_diff1 value: 45.87485821381341 - type: nauc_ndcg_at_3_max value: 32.33282450558947 - type: nauc_ndcg_at_3_std value: 0.0681643829273283 - type: nauc_ndcg_at_5_diff1 value: 45.202902131892394 - type: nauc_ndcg_at_5_max value: 32.1026971523917 - type: nauc_ndcg_at_5_std value: 0.3565572833774486 - type: nauc_precision_at_1000_diff1 value: -8.935267931198956 - type: nauc_precision_at_1000_max value: 6.464981960169269 - type: nauc_precision_at_1000_std value: 10.662786182234633 - type: nauc_precision_at_100_diff1 value: -1.64091517847155 - type: nauc_precision_at_100_max value: 15.175617871025024 - type: nauc_precision_at_100_std value: 16.924256989248075 - type: nauc_precision_at_10_diff1 value: 15.676651966277047 - type: nauc_precision_at_10_max value: 26.243734188847117 - type: nauc_precision_at_10_std value: 10.601741034956333 - type: nauc_precision_at_1_diff1 value: 51.61608573254137 - type: nauc_precision_at_1_max value: 35.18105023234596 - type: nauc_precision_at_1_std value: 0.17943702145478177 - type: nauc_precision_at_20_diff1 value: 9.447267260198654 - type: nauc_precision_at_20_max value: 23.024130858142723 - type: nauc_precision_at_20_std value: 13.739145648899603 - type: nauc_precision_at_3_diff1 value: 30.11583572134629 - type: nauc_precision_at_3_max value: 31.37321080069495 - type: nauc_precision_at_3_std value: 4.705512374126024 - type: nauc_precision_at_5_diff1 value: 23.192015335996093 - type: nauc_precision_at_5_max value: 29.415746835998764 - type: nauc_precision_at_5_std value: 6.843498772798558 - type: nauc_recall_at_1000_diff1 value: 25.36573313426033 - type: nauc_recall_at_1000_max value: 43.06672256524168 - type: nauc_recall_at_1000_std value: 47.93664853815292 - type: nauc_recall_at_100_diff1 value: 31.222880916617406 - type: nauc_recall_at_100_max value: 31.761159904172658 - type: nauc_recall_at_100_std value: 23.034218976635877 - type: nauc_recall_at_10_diff1 value: 36.23439028915225 - type: nauc_recall_at_10_max value: 28.473458977606438 - type: nauc_recall_at_10_std value: 3.7797969934159 - type: nauc_recall_at_1_diff1 value: 52.48913346466124 - type: nauc_recall_at_1_max value: 29.874374024967725 - type: nauc_recall_at_1_std value: -2.433547569836134 - type: nauc_recall_at_20_diff1 
value: 34.678676952584766 - type: nauc_recall_at_20_max value: 29.04638392522168 - type: nauc_recall_at_20_std value: 8.148894982082549 - type: nauc_recall_at_3_diff1 value: 41.31029996231311 - type: nauc_recall_at_3_max value: 28.44199443414157 - type: nauc_recall_at_3_std value: -0.747324057600377 - type: nauc_recall_at_5_diff1 value: 38.535873899920674 - type: nauc_recall_at_5_max value: 27.942667805948375 - type: nauc_recall_at_5_std value: 0.30652206930973686 - type: ndcg_at_1 value: 36.47675 - type: ndcg_at_10 value: 47.74883333333334 - type: ndcg_at_100 value: 52.902416666666674 - type: ndcg_at_1000 value: 54.69116666666667 - type: ndcg_at_20 value: 49.89758333333333 - type: ndcg_at_3 value: 42.462250000000004 - type: ndcg_at_5 value: 44.91841666666667 - type: precision_at_1 value: 36.47675 - type: precision_at_10 value: 8.582416666666665 - type: precision_at_100 value: 1.31475 - type: precision_at_1000 value: 0.16458333333333333 - type: precision_at_20 value: 5.021833333333333 - type: precision_at_3 value: 20.004499999999997 - type: precision_at_5 value: 14.178666666666665 - type: recall_at_1 value: 30.179249999999996 - type: recall_at_10 value: 60.950166666666675 - type: recall_at_100 value: 83.19025 - type: recall_at_1000 value: 95.27774999999998 - type: recall_at_20 value: 68.80175 - type: recall_at_3 value: 46.01841666666666 - type: recall_at_5 value: 52.482416666666666 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: main_score value: 46.113 - type: map_at_1 value: 20.122999999999998 - type: map_at_10 value: 35.474 - type: map_at_100 value: 37.592 - type: map_at_1000 value: 37.773 - type: map_at_20 value: 36.637 - type: map_at_3 value: 29.731 - type: map_at_5 value: 32.964 - type: mrr_at_1 value: 46.71009771986971 - type: mrr_at_10 value: 58.855669303552105 - type: mrr_at_100 value: 59.389249674038425 - type: mrr_at_1000 value: 59.408448104362364 - type: mrr_at_20 value: 59.23881203149016 - type: mrr_at_3 value: 56.18892508143328 - type: mrr_at_5 value: 57.85342019543985 - type: nauc_map_at_1000_diff1 value: 27.047031037721958 - type: nauc_map_at_1000_max value: 43.25240279148033 - type: nauc_map_at_1000_std value: 20.795849418696037 - type: nauc_map_at_100_diff1 value: 27.044739015116452 - type: nauc_map_at_100_max value: 43.24042159787812 - type: nauc_map_at_100_std value: 20.799952124137683 - type: nauc_map_at_10_diff1 value: 27.372696854670338 - type: nauc_map_at_10_max value: 43.054456574721684 - type: nauc_map_at_10_std value: 19.537162110136645 - type: nauc_map_at_1_diff1 value: 43.65424623953092 - type: nauc_map_at_1_max value: 45.17986509998762 - type: nauc_map_at_1_std value: 8.497107052335414 - type: nauc_map_at_20_diff1 value: 27.224535846566074 - type: nauc_map_at_20_max value: 43.12222854561229 - type: nauc_map_at_20_std value: 20.29982972202669 - type: nauc_map_at_3_diff1 value: 30.87847002319001 - type: nauc_map_at_3_max value: 42.890027891707575 - type: nauc_map_at_3_std value: 13.857451947580929 - type: nauc_map_at_5_diff1 value: 27.966867093591542 - type: nauc_map_at_5_max value: 42.35826637592201 - type: nauc_map_at_5_std value: 16.993102524058624 - type: nauc_mrr_at_1000_diff1 value: 30.191544077608164 - type: nauc_mrr_at_1000_max value: 44.959438920351644 - type: nauc_mrr_at_1000_std value: 24.065801376465114 - type: nauc_mrr_at_100_diff1 value: 30.170368115494 - type: nauc_mrr_at_100_max value: 44.955868115761156 - type: 
nauc_mrr_at_100_std value: 24.093510767847707 - type: nauc_mrr_at_10_diff1 value: 30.128430637520175 - type: nauc_mrr_at_10_max value: 44.97689261350708 - type: nauc_mrr_at_10_std value: 24.037049561818897 - type: nauc_mrr_at_1_diff1 value: 35.323351939108214 - type: nauc_mrr_at_1_max value: 43.85026244855636 - type: nauc_mrr_at_1_std value: 17.040662141218974 - type: nauc_mrr_at_20_diff1 value: 30.192006556160443 - type: nauc_mrr_at_20_max value: 45.02814530774032 - type: nauc_mrr_at_20_std value: 24.20885865448696 - type: nauc_mrr_at_3_diff1 value: 29.88250163424518 - type: nauc_mrr_at_3_max value: 44.25768944883186 - type: nauc_mrr_at_3_std value: 22.804183393364198 - type: nauc_mrr_at_5_diff1 value: 30.269824490420767 - type: nauc_mrr_at_5_max value: 44.97443265796657 - type: nauc_mrr_at_5_std value: 23.894159916141177 - type: nauc_ndcg_at_1000_diff1 value: 24.533764005407356 - type: nauc_ndcg_at_1000_max value: 44.50902713386608 - type: nauc_ndcg_at_1000_std value: 27.589506980238404 - type: nauc_ndcg_at_100_diff1 value: 24.209785073940353 - type: nauc_ndcg_at_100_max value: 44.18257063893669 - type: nauc_ndcg_at_100_std value: 27.963150866401943 - type: nauc_ndcg_at_10_diff1 value: 25.168069201989486 - type: nauc_ndcg_at_10_max value: 43.84940910683214 - type: nauc_ndcg_at_10_std value: 24.810707270956435 - type: nauc_ndcg_at_1_diff1 value: 35.323351939108214 - type: nauc_ndcg_at_1_max value: 43.85026244855636 - type: nauc_ndcg_at_1_std value: 17.040662141218974 - type: nauc_ndcg_at_20_diff1 value: 24.829924800466834 - type: nauc_ndcg_at_20_max value: 43.738574327059716 - type: nauc_ndcg_at_20_std value: 26.252370278684072 - type: nauc_ndcg_at_3_diff1 value: 27.321943393906274 - type: nauc_ndcg_at_3_max value: 42.16584786993447 - type: nauc_ndcg_at_3_std value: 18.24775079455969 - type: nauc_ndcg_at_5_diff1 value: 26.043785418347998 - type: nauc_ndcg_at_5_max value: 42.874593895388344 - type: nauc_ndcg_at_5_std value: 21.294004555506117 - type: nauc_precision_at_1000_diff1 value: -22.073027615308582 - type: nauc_precision_at_1000_max value: -6.549723766317357 - type: nauc_precision_at_1000_std value: 18.301749191241306 - type: nauc_precision_at_100_diff1 value: -15.654286887593619 - type: nauc_precision_at_100_max value: 6.401516251421999 - type: nauc_precision_at_100_std value: 29.170680324929805 - type: nauc_precision_at_10_diff1 value: -4.362381972892247 - type: nauc_precision_at_10_max value: 22.10943515872447 - type: nauc_precision_at_10_std value: 31.869699459530022 - type: nauc_precision_at_1_diff1 value: 35.323351939108214 - type: nauc_precision_at_1_max value: 43.85026244855636 - type: nauc_precision_at_1_std value: 17.040662141218974 - type: nauc_precision_at_20_diff1 value: -7.50749661117875 - type: nauc_precision_at_20_max value: 16.80584016023257 - type: nauc_precision_at_20_std value: 31.976755897112437 - type: nauc_precision_at_3_diff1 value: 7.402667538773083 - type: nauc_precision_at_3_max value: 31.2088401330676 - type: nauc_precision_at_3_std value: 24.287905698405662 - type: nauc_precision_at_5_diff1 value: 0.7479172565343901 - type: nauc_precision_at_5_max value: 26.28427734237825 - type: nauc_precision_at_5_std value: 28.246947120310317 - type: nauc_recall_at_1000_diff1 value: 2.4778431086370496 - type: nauc_recall_at_1000_max value: 40.2231995797509 - type: nauc_recall_at_1000_std value: 52.62124052183862 - type: nauc_recall_at_100_diff1 value: 8.960962419741463 - type: nauc_recall_at_100_max value: 35.81132850291491 - type: nauc_recall_at_100_std value: 
40.020903251786166 - type: nauc_recall_at_10_diff1 value: 15.603400751376636 - type: nauc_recall_at_10_max value: 37.570127529136485 - type: nauc_recall_at_10_std value: 28.07128410238545 - type: nauc_recall_at_1_diff1 value: 43.65424623953092 - type: nauc_recall_at_1_max value: 45.17986509998762 - type: nauc_recall_at_1_std value: 8.497107052335414 - type: nauc_recall_at_20_diff1 value: 13.844820282832346 - type: nauc_recall_at_20_max value: 36.0106148516309 - type: nauc_recall_at_20_std value: 31.453103910565254 - type: nauc_recall_at_3_diff1 value: 24.359328154117748 - type: nauc_recall_at_3_max value: 39.93774251377568 - type: nauc_recall_at_3_std value: 16.214921517509648 - type: nauc_recall_at_5_diff1 value: 18.75788451360292 - type: nauc_recall_at_5_max value: 38.177646107055516 - type: nauc_recall_at_5_std value: 22.17196825834675 - type: ndcg_at_1 value: 46.71 - type: ndcg_at_10 value: 46.113 - type: ndcg_at_100 value: 53.035 - type: ndcg_at_1000 value: 55.724 - type: ndcg_at_20 value: 48.929 - type: ndcg_at_3 value: 39.501999999999995 - type: ndcg_at_5 value: 41.792 - type: precision_at_1 value: 46.71 - type: precision_at_10 value: 14.274000000000001 - type: precision_at_100 value: 2.1870000000000003 - type: precision_at_1000 value: 0.269 - type: precision_at_20 value: 8.375 - type: precision_at_3 value: 29.881 - type: precision_at_5 value: 22.697 - type: recall_at_1 value: 20.122999999999998 - type: recall_at_10 value: 52.22 - type: recall_at_100 value: 75.388 - type: recall_at_1000 value: 89.938 - type: recall_at_20 value: 60.077000000000005 - type: recall_at_3 value: 35.150999999999996 - type: recall_at_5 value: 42.748000000000005 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: main_score value: 52.276999999999994 - type: map_at_1 value: 9.949 - type: map_at_10 value: 24.891 - type: map_at_100 value: 37.111 - type: map_at_1000 value: 39.266 - type: map_at_20 value: 29.685 - type: map_at_3 value: 16.586000000000002 - type: map_at_5 value: 19.982 - type: mrr_at_1 value: 76.25 - type: mrr_at_10 value: 82.4518849206349 - type: mrr_at_100 value: 82.70302194564499 - type: mrr_at_1000 value: 82.70909729942254 - type: mrr_at_20 value: 82.60492765962964 - type: mrr_at_3 value: 81.33333333333331 - type: mrr_at_5 value: 82.14583333333331 - type: nauc_map_at_1000_diff1 value: 21.427201262456556 - type: nauc_map_at_1000_max value: 35.357361590816076 - type: nauc_map_at_1000_std value: 24.785419223353717 - type: nauc_map_at_100_diff1 value: 22.82358692021537 - type: nauc_map_at_100_max value: 35.07399692072945 - type: nauc_map_at_100_std value: 22.679878828987025 - type: nauc_map_at_10_diff1 value: 26.491769223479643 - type: nauc_map_at_10_max value: 20.78079385443902 - type: nauc_map_at_10_std value: -4.910406292079661 - type: nauc_map_at_1_diff1 value: 35.20851030208876 - type: nauc_map_at_1_max value: 5.783003346365858 - type: nauc_map_at_1_std value: -21.11679133835354 - type: nauc_map_at_20_diff1 value: 24.80097499300491 - type: nauc_map_at_20_max value: 26.807021360774975 - type: nauc_map_at_20_std value: 4.793103995429955 - type: nauc_map_at_3_diff1 value: 29.238193458890173 - type: nauc_map_at_3_max value: 10.300839972189456 - type: nauc_map_at_3_std value: -17.889666731981592 - type: nauc_map_at_5_diff1 value: 28.773624870573926 - type: nauc_map_at_5_max value: 14.951435645422887 - type: nauc_map_at_5_std value: -13.319697827173565 - type: nauc_mrr_at_1000_diff1 
value: 55.232544856708785 - type: nauc_mrr_at_1000_max value: 64.73225637682637 - type: nauc_mrr_at_1000_std value: 37.57480399594188 - type: nauc_mrr_at_100_diff1 value: 55.219251601773735 - type: nauc_mrr_at_100_max value: 64.73305063663611 - type: nauc_mrr_at_100_std value: 37.56458562909293 - type: nauc_mrr_at_10_diff1 value: 55.123463838253464 - type: nauc_mrr_at_10_max value: 64.91914041040233 - type: nauc_mrr_at_10_std value: 37.76482503851598 - type: nauc_mrr_at_1_diff1 value: 56.45461238513347 - type: nauc_mrr_at_1_max value: 63.11782510293676 - type: nauc_mrr_at_1_std value: 33.592561284868985 - type: nauc_mrr_at_20_diff1 value: 55.15401961460458 - type: nauc_mrr_at_20_max value: 64.77145835613156 - type: nauc_mrr_at_20_std value: 37.471561418305804 - type: nauc_mrr_at_3_diff1 value: 54.64387438697658 - type: nauc_mrr_at_3_max value: 64.27618995019164 - type: nauc_mrr_at_3_std value: 39.391637295269014 - type: nauc_mrr_at_5_diff1 value: 55.08702591239485 - type: nauc_mrr_at_5_max value: 64.6071475650635 - type: nauc_mrr_at_5_std value: 37.97185134269896 - type: nauc_ndcg_at_1000_diff1 value: 31.696698876400387 - type: nauc_ndcg_at_1000_max value: 52.12183760001191 - type: nauc_ndcg_at_1000_std value: 40.197596211778716 - type: nauc_ndcg_at_100_diff1 value: 33.253120193433666 - type: nauc_ndcg_at_100_max value: 49.47167758554746 - type: nauc_ndcg_at_100_std value: 32.643833139756204 - type: nauc_ndcg_at_10_diff1 value: 27.065541392580013 - type: nauc_ndcg_at_10_max value: 45.83504281289289 - type: nauc_ndcg_at_10_std value: 27.11739500732328 - type: nauc_ndcg_at_1_diff1 value: 49.42808250022517 - type: nauc_ndcg_at_1_max value: 53.502615048520354 - type: nauc_ndcg_at_1_std value: 27.17555908836708 - type: nauc_ndcg_at_20_diff1 value: 29.374791382330308 - type: nauc_ndcg_at_20_max value: 43.91246842479055 - type: nauc_ndcg_at_20_std value: 23.419410620550316 - type: nauc_ndcg_at_3_diff1 value: 26.71550354496204 - type: nauc_ndcg_at_3_max value: 43.9641457892003 - type: nauc_ndcg_at_3_std value: 27.320024167947686 - type: nauc_ndcg_at_5_diff1 value: 27.020654974589487 - type: nauc_ndcg_at_5_max value: 46.130417266030584 - type: nauc_ndcg_at_5_std value: 28.392009019010068 - type: nauc_precision_at_1000_diff1 value: -21.47455482181002 - type: nauc_precision_at_1000_max value: -9.721907229236024 - type: nauc_precision_at_1000_std value: -1.061132062651487 - type: nauc_precision_at_100_diff1 value: -12.35759246101943 - type: nauc_precision_at_100_max value: 15.509512444892168 - type: nauc_precision_at_100_std value: 36.21183578592014 - type: nauc_precision_at_10_diff1 value: -6.136998947343125 - type: nauc_precision_at_10_max value: 32.30037906748288 - type: nauc_precision_at_10_std value: 41.4500302476981 - type: nauc_precision_at_1_diff1 value: 56.45461238513347 - type: nauc_precision_at_1_max value: 63.11782510293676 - type: nauc_precision_at_1_std value: 33.592561284868985 - type: nauc_precision_at_20_diff1 value: -7.335890123683174 - type: nauc_precision_at_20_max value: 28.31417075291312 - type: nauc_precision_at_20_std value: 41.405935715061815 - type: nauc_precision_at_3_diff1 value: 7.117255890225942 - type: nauc_precision_at_3_max value: 39.19894132683829 - type: nauc_precision_at_3_std value: 38.48255841994843 - type: nauc_precision_at_5_diff1 value: 1.861523090114206 - type: nauc_precision_at_5_max value: 38.11649223007208 - type: nauc_precision_at_5_std value: 40.52993530374645 - type: nauc_recall_at_1000_diff1 value: 26.497648584314636 - type: nauc_recall_at_1000_max 
value: 44.48069746734414 - type: nauc_recall_at_1000_std value: 53.16438130228715 - type: nauc_recall_at_100_diff1 value: 26.353456899511446 - type: nauc_recall_at_100_max value: 37.57379787884197 - type: nauc_recall_at_100_std value: 29.197468295989548 - type: nauc_recall_at_10_diff1 value: 22.80445738351114 - type: nauc_recall_at_10_max value: 15.895630778449046 - type: nauc_recall_at_10_std value: -8.746224797644501 - type: nauc_recall_at_1_diff1 value: 35.20851030208876 - type: nauc_recall_at_1_max value: 5.783003346365858 - type: nauc_recall_at_1_std value: -21.11679133835354 - type: nauc_recall_at_20_diff1 value: 22.34028867678706 - type: nauc_recall_at_20_max value: 21.42373427646772 - type: nauc_recall_at_20_std value: 0.4533036151015875 - type: nauc_recall_at_3_diff1 value: 24.96853445599229 - type: nauc_recall_at_3_max value: 6.245185375804208 - type: nauc_recall_at_3_std value: -20.200240127099622 - type: nauc_recall_at_5_diff1 value: 24.749259476710623 - type: nauc_recall_at_5_max value: 11.024592845995942 - type: nauc_recall_at_5_std value: -16.15683085641543 - type: ndcg_at_1 value: 64.125 - type: ndcg_at_10 value: 52.276999999999994 - type: ndcg_at_100 value: 57.440000000000005 - type: ndcg_at_1000 value: 64.082 - type: ndcg_at_20 value: 51.383 - type: ndcg_at_3 value: 55.769000000000005 - type: ndcg_at_5 value: 53.978 - type: precision_at_1 value: 76.25 - type: precision_at_10 value: 43.05 - type: precision_at_100 value: 14.09 - type: precision_at_1000 value: 2.662 - type: precision_at_20 value: 33.112 - type: precision_at_3 value: 59.833000000000006 - type: precision_at_5 value: 53.05 - type: recall_at_1 value: 9.949 - type: recall_at_10 value: 30.424 - type: recall_at_100 value: 64.062 - type: recall_at_1000 value: 85.916 - type: recall_at_20 value: 39.895 - type: recall_at_3 value: 17.876 - type: recall_at_5 value: 22.536 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 84.29499999999999 - type: f1 value: 79.76188258172078 - type: f1_weighted value: 84.96026012933847 - type: main_score value: 84.29499999999999 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: main_score value: 94.83200000000001 - type: map_at_1 value: 87.339 - type: map_at_10 value: 92.92099999999999 - type: map_at_100 value: 93.108 - type: map_at_1000 value: 93.116 - type: map_at_20 value: 93.041 - type: map_at_3 value: 92.219 - type: map_at_5 value: 92.664 - type: mrr_at_1 value: 93.99939993999399 - type: mrr_at_10 value: 96.55188137861403 - type: mrr_at_100 value: 96.5652366009286 - type: mrr_at_1000 value: 96.5652625550811 - type: mrr_at_20 value: 96.5601781754844 - type: mrr_at_3 value: 96.45714571457142 - type: mrr_at_5 value: 96.544904490449 - type: nauc_map_at_1000_diff1 value: 51.81676454961933 - type: nauc_map_at_1000_max value: 24.904822914926118 - type: nauc_map_at_1000_std value: -3.8110347821630404 - type: nauc_map_at_100_diff1 value: 51.77514975011158 - type: nauc_map_at_100_max value: 24.912497341800094 - type: nauc_map_at_100_std value: -3.76229517662447 - type: nauc_map_at_10_diff1 value: 51.29608296382479 - type: nauc_map_at_10_max value: 24.78704970246707 - type: nauc_map_at_10_std value: -3.723130815783328 - type: nauc_map_at_1_diff1 value: 59.90813138005125 - type: nauc_map_at_1_max value: 24.58479295693794 - type: 
nauc_map_at_1_std value: -8.056152492777027 - type: nauc_map_at_20_diff1 value: 51.428639331678326 - type: nauc_map_at_20_max value: 24.849214517705086 - type: nauc_map_at_20_std value: -3.685550123874596 - type: nauc_map_at_3_diff1 value: 50.94399923719279 - type: nauc_map_at_3_max value: 24.359700180006207 - type: nauc_map_at_3_std value: -5.407767408816422 - type: nauc_map_at_5_diff1 value: 50.767302682959546 - type: nauc_map_at_5_max value: 24.491113461892215 - type: nauc_map_at_5_std value: -4.058336127339082 - type: nauc_mrr_at_1000_diff1 value: 79.86042313551833 - type: nauc_mrr_at_1000_max value: 23.20960445633933 - type: nauc_mrr_at_1000_std value: -23.54334295120471 - type: nauc_mrr_at_100_diff1 value: 79.85991247027636 - type: nauc_mrr_at_100_max value: 23.210085926780106 - type: nauc_mrr_at_100_std value: -23.542508200789197 - type: nauc_mrr_at_10_diff1 value: 79.71095155563415 - type: nauc_mrr_at_10_max value: 23.24128650883908 - type: nauc_mrr_at_10_std value: -23.408502781834102 - type: nauc_mrr_at_1_diff1 value: 82.6349900233902 - type: nauc_mrr_at_1_max value: 21.994548214014227 - type: nauc_mrr_at_1_std value: -22.549769792179262 - type: nauc_mrr_at_20_diff1 value: 79.76465012873038 - type: nauc_mrr_at_20_max value: 23.17575026523213 - type: nauc_mrr_at_20_std value: -23.492660166315048 - type: nauc_mrr_at_3_diff1 value: 79.91074933379953 - type: nauc_mrr_at_3_max value: 24.14246499097892 - type: nauc_mrr_at_3_std value: -25.22601708389664 - type: nauc_mrr_at_5_diff1 value: 79.62092651565847 - type: nauc_mrr_at_5_max value: 23.315937737034425 - type: nauc_mrr_at_5_std value: -23.317659360058403 - type: nauc_ndcg_at_1000_diff1 value: 54.404537986779225 - type: nauc_ndcg_at_1000_max value: 25.38408304128995 - type: nauc_ndcg_at_1000_std value: -4.916709117696968 - type: nauc_ndcg_at_100_diff1 value: 53.2448598868241 - type: nauc_ndcg_at_100_max value: 25.75325255295546 - type: nauc_ndcg_at_100_std value: -3.680507005630751 - type: nauc_ndcg_at_10_diff1 value: 50.81057355170232 - type: nauc_ndcg_at_10_max value: 25.006448273343807 - type: nauc_ndcg_at_10_std value: -2.8979899112515577 - type: nauc_ndcg_at_1_diff1 value: 82.6349900233902 - type: nauc_ndcg_at_1_max value: 21.994548214014227 - type: nauc_ndcg_at_1_std value: -22.549769792179262 - type: nauc_ndcg_at_20_diff1 value: 51.205023097166304 - type: nauc_ndcg_at_20_max value: 25.22133626556826 - type: nauc_ndcg_at_20_std value: -2.9506328244150155 - type: nauc_ndcg_at_3_diff1 value: 51.79780256736321 - type: nauc_ndcg_at_3_max value: 24.81137324438439 - type: nauc_ndcg_at_3_std value: -6.881223858227807 - type: nauc_ndcg_at_5_diff1 value: 50.290038260564565 - type: nauc_ndcg_at_5_max value: 24.57250792165796 - type: nauc_ndcg_at_5_std value: -3.5124628344654596 - type: nauc_precision_at_1000_diff1 value: -20.215211396894333 - type: nauc_precision_at_1000_max value: -14.165452298769171 - type: nauc_precision_at_1000_std value: -2.0952871214470816 - type: nauc_precision_at_100_diff1 value: -22.340257474494607 - type: nauc_precision_at_100_max value: -12.697885641360282 - type: nauc_precision_at_100_std value: 1.0688624940286244 - type: nauc_precision_at_10_diff1 value: -24.78271817420798 - type: nauc_precision_at_10_max value: -12.625257500222656 - type: nauc_precision_at_10_std value: 3.223250450607087 - type: nauc_precision_at_1_diff1 value: 82.6349900233902 - type: nauc_precision_at_1_max value: 21.994548214014227 - type: nauc_precision_at_1_std value: -22.549769792179262 - type: nauc_precision_at_20_diff1 value: 
-24.375756227194177 - type: nauc_precision_at_20_max value: -12.341015011563536 - type: nauc_precision_at_20_std value: 2.7475274619387955 - type: nauc_precision_at_3_diff1 value: -24.8251306777365 - type: nauc_precision_at_3_max value: -13.109579709589042 - type: nauc_precision_at_3_std value: -1.2233442335420748 - type: nauc_precision_at_5_diff1 value: -26.955418583344894 - type: nauc_precision_at_5_max value: -13.598630838071015 - type: nauc_precision_at_5_std value: 2.545780631940738 - type: nauc_recall_at_1000_diff1 value: 0.2542680835344437 - type: nauc_recall_at_1000_max value: 49.38194243035277 - type: nauc_recall_at_1000_std value: 57.021502715846026 - type: nauc_recall_at_100_diff1 value: 5.062154815367015 - type: nauc_recall_at_100_max value: 45.41178380188437 - type: nauc_recall_at_100_std value: 50.78382225901813 - type: nauc_recall_at_10_diff1 value: 20.429153629007818 - type: nauc_recall_at_10_max value: 27.516855026155508 - type: nauc_recall_at_10_std value: 21.367491371755467 - type: nauc_recall_at_1_diff1 value: 59.90813138005125 - type: nauc_recall_at_1_max value: 24.58479295693794 - type: nauc_recall_at_1_std value: -8.056152492777027 - type: nauc_recall_at_20_diff1 value: 13.072430858896942 - type: nauc_recall_at_20_max value: 29.5522659183247 - type: nauc_recall_at_20_std value: 28.70569974090291 - type: nauc_recall_at_3_diff1 value: 30.419084482663617 - type: nauc_recall_at_3_max value: 25.627389580252835 - type: nauc_recall_at_3_std value: 2.5557690877637054 - type: nauc_recall_at_5_diff1 value: 22.92561435069869 - type: nauc_recall_at_5_max value: 25.545265063475455 - type: nauc_recall_at_5_std value: 14.736172663072786 - type: ndcg_at_1 value: 93.999 - type: ndcg_at_10 value: 94.83200000000001 - type: ndcg_at_100 value: 95.363 - type: ndcg_at_1000 value: 95.478 - type: ndcg_at_20 value: 95.077 - type: ndcg_at_3 value: 94.143 - type: ndcg_at_5 value: 94.525 - type: precision_at_1 value: 93.999 - type: precision_at_10 value: 11.029 - type: precision_at_100 value: 1.1560000000000001 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_20 value: 5.62 - type: precision_at_3 value: 35.219 - type: precision_at_5 value: 21.584 - type: recall_at_1 value: 87.339 - type: recall_at_10 value: 97.026 - type: recall_at_100 value: 98.936 - type: recall_at_1000 value: 99.599 - type: recall_at_20 value: 97.744 - type: recall_at_3 value: 95.069 - type: recall_at_5 value: 96.177 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: main_score value: 60.480000000000004 - type: map_at_1 value: 31.529 - type: map_at_10 value: 52.081 - type: map_at_100 value: 54.342 - type: map_at_1000 value: 54.449000000000005 - type: map_at_20 value: 53.479 - type: map_at_3 value: 45.471000000000004 - type: map_at_5 value: 49.164 - type: mrr_at_1 value: 60.03086419753087 - type: mrr_at_10 value: 67.73754409171075 - type: mrr_at_100 value: 68.332432152368 - type: mrr_at_1000 value: 68.34150941774908 - type: mrr_at_20 value: 68.14780993838725 - type: mrr_at_3 value: 65.6378600823045 - type: mrr_at_5 value: 66.88014403292176 - type: nauc_map_at_1000_diff1 value: 45.36598134579052 - type: nauc_map_at_1000_max value: 31.891451119906943 - type: nauc_map_at_1000_std value: -15.41454384137943 - type: nauc_map_at_100_diff1 value: 45.31268291874018 - type: nauc_map_at_100_max value: 31.811055683002092 - type: nauc_map_at_100_std value: -15.348503855591417 - type: 
nauc_map_at_10_diff1 value: 45.22606983565892 - type: nauc_map_at_10_max value: 30.46108534749699 - type: nauc_map_at_10_std value: -16.618086029682555 - type: nauc_map_at_1_diff1 value: 49.94952823753276 - type: nauc_map_at_1_max value: 13.770377574254548 - type: nauc_map_at_1_std value: -14.946357968858653 - type: nauc_map_at_20_diff1 value: 45.29274207897926 - type: nauc_map_at_20_max value: 31.27332015148257 - type: nauc_map_at_20_std value: -15.782946115613129 - type: nauc_map_at_3_diff1 value: 47.94248233566038 - type: nauc_map_at_3_max value: 24.022838776825456 - type: nauc_map_at_3_std value: -17.103518542262208 - type: nauc_map_at_5_diff1 value: 45.85345590031722 - type: nauc_map_at_5_max value: 27.78341379004547 - type: nauc_map_at_5_std value: -17.490850791756326 - type: nauc_mrr_at_1000_diff1 value: 58.225141047822824 - type: nauc_mrr_at_1000_max value: 43.39606904140525 - type: nauc_mrr_at_1000_std value: -14.64093518199122 - type: nauc_mrr_at_100_diff1 value: 58.22137274179545 - type: nauc_mrr_at_100_max value: 43.39567568136935 - type: nauc_mrr_at_100_std value: -14.62512313985582 - type: nauc_mrr_at_10_diff1 value: 58.03217329957151 - type: nauc_mrr_at_10_max value: 43.633561683075186 - type: nauc_mrr_at_10_std value: -14.563703576023808 - type: nauc_mrr_at_1_diff1 value: 61.48979902647692 - type: nauc_mrr_at_1_max value: 43.1938079066948 - type: nauc_mrr_at_1_std value: -15.808138277440465 - type: nauc_mrr_at_20_diff1 value: 58.13185370150794 - type: nauc_mrr_at_20_max value: 43.35607721183147 - type: nauc_mrr_at_20_std value: -14.635812702971263 - type: nauc_mrr_at_3_diff1 value: 58.698963168321264 - type: nauc_mrr_at_3_max value: 43.633129249785405 - type: nauc_mrr_at_3_std value: -15.733246346983854 - type: nauc_mrr_at_5_diff1 value: 57.94156745229547 - type: nauc_mrr_at_5_max value: 43.14152462640525 - type: nauc_mrr_at_5_std value: -15.318685307750895 - type: nauc_ndcg_at_1000_diff1 value: 47.871896043731496 - type: nauc_ndcg_at_1000_max value: 37.159845167533426 - type: nauc_ndcg_at_1000_std value: -13.067288160833485 - type: nauc_ndcg_at_100_diff1 value: 47.046171407204426 - type: nauc_ndcg_at_100_max value: 36.422514360855835 - type: nauc_ndcg_at_100_std value: -11.636859259571441 - type: nauc_ndcg_at_10_diff1 value: 46.232628149078096 - type: nauc_ndcg_at_10_max value: 34.82402625088358 - type: nauc_ndcg_at_10_std value: -14.768545542980114 - type: nauc_ndcg_at_1_diff1 value: 61.48979902647692 - type: nauc_ndcg_at_1_max value: 43.1938079066948 - type: nauc_ndcg_at_1_std value: -15.808138277440465 - type: nauc_ndcg_at_20_diff1 value: 46.51116172390955 - type: nauc_ndcg_at_20_max value: 35.36362650568298 - type: nauc_ndcg_at_20_std value: -12.849406209182826 - type: nauc_ndcg_at_3_diff1 value: 47.39832263785871 - type: nauc_ndcg_at_3_max value: 35.67466264628456 - type: nauc_ndcg_at_3_std value: -17.257717349296943 - type: nauc_ndcg_at_5_diff1 value: 45.91049493804232 - type: nauc_ndcg_at_5_max value: 33.8405091138445 - type: nauc_ndcg_at_5_std value: -17.477069902735895 - type: nauc_precision_at_1000_diff1 value: -12.037873000917767 - type: nauc_precision_at_1000_max value: 26.043220150002295 - type: nauc_precision_at_1000_std value: 6.84910668321572 - type: nauc_precision_at_100_diff1 value: -9.383403459051864 - type: nauc_precision_at_100_max value: 29.68713170610003 - type: nauc_precision_at_100_std value: 10.079531587056152 - type: nauc_precision_at_10_diff1 value: 3.3433323353925135 - type: nauc_precision_at_10_max value: 38.31790111725993 - type: 
nauc_precision_at_10_std value: 0.7888123304710856 - type: nauc_precision_at_1_diff1 value: 61.48979902647692 - type: nauc_precision_at_1_max value: 43.1938079066948 - type: nauc_precision_at_1_std value: -15.808138277440465 - type: nauc_precision_at_20_diff1 value: -2.083500986294448 - type: nauc_precision_at_20_max value: 35.77143835726343 - type: nauc_precision_at_20_std value: 5.318547021874003 - type: nauc_precision_at_3_diff1 value: 23.335617788912586 - type: nauc_precision_at_3_max value: 39.81973275320871 - type: nauc_precision_at_3_std value: -8.442769390555561 - type: nauc_precision_at_5_diff1 value: 11.521087842589482 - type: nauc_precision_at_5_max value: 39.527792539828255 - type: nauc_precision_at_5_std value: -5.412729503701626 - type: nauc_recall_at_1000_diff1 value: 10.6830893047453 - type: nauc_recall_at_1000_max value: 8.834504311238423 - type: nauc_recall_at_1000_std value: 24.670754304859692 - type: nauc_recall_at_100_diff1 value: 20.646020385527358 - type: nauc_recall_at_100_max value: 20.121595011523294 - type: nauc_recall_at_100_std value: 19.42307459311791 - type: nauc_recall_at_10_diff1 value: 33.01029313733417 - type: nauc_recall_at_10_max value: 27.948634980368702 - type: nauc_recall_at_10_std value: -10.239767371462975 - type: nauc_recall_at_1_diff1 value: 49.94952823753276 - type: nauc_recall_at_1_max value: 13.770377574254548 - type: nauc_recall_at_1_std value: -14.946357968858653 - type: nauc_recall_at_20_diff1 value: 30.040111045267963 - type: nauc_recall_at_20_max value: 25.984919302418184 - type: nauc_recall_at_20_std value: -1.4998001817460804 - type: nauc_recall_at_3_diff1 value: 42.24410559113653 - type: nauc_recall_at_3_max value: 20.269503583626914 - type: nauc_recall_at_3_std value: -17.09578532600584 - type: nauc_recall_at_5_diff1 value: 36.124149735848945 - type: nauc_recall_at_5_max value: 22.708022306002622 - type: nauc_recall_at_5_std value: -16.966976847236193 - type: ndcg_at_1 value: 60.031 - type: ndcg_at_10 value: 60.480000000000004 - type: ndcg_at_100 value: 66.94099999999999 - type: ndcg_at_1000 value: 68.303 - type: ndcg_at_20 value: 63.536 - type: ndcg_at_3 value: 55.903999999999996 - type: ndcg_at_5 value: 57.387 - type: precision_at_1 value: 60.031 - type: precision_at_10 value: 16.682 - type: precision_at_100 value: 2.336 - type: precision_at_1000 value: 0.259 - type: precision_at_20 value: 9.66 - type: precision_at_3 value: 37.191 - type: precision_at_5 value: 27.253 - type: recall_at_1 value: 31.529 - type: recall_at_10 value: 68.035 - type: recall_at_100 value: 90.925 - type: recall_at_1000 value: 98.688 - type: recall_at_20 value: 77.453 - type: recall_at_3 value: 50.221000000000004 - type: recall_at_5 value: 58.209999999999994 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: main_score value: 76.67399999999999 - type: map_at_1 value: 43.822 - type: map_at_10 value: 68.82000000000001 - type: map_at_100 value: 69.659 - type: map_at_1000 value: 69.714 - type: map_at_20 value: 69.305 - type: map_at_3 value: 65.517 - type: map_at_5 value: 67.633 - type: mrr_at_1 value: 87.643484132343 - type: mrr_at_10 value: 91.28134679485098 - type: mrr_at_100 value: 91.37985230614755 - type: mrr_at_1000 value: 91.38202467630681 - type: mrr_at_20 value: 91.34718855278429 - type: mrr_at_3 value: 90.75849651136599 - type: mrr_at_5 value: 91.10961062345235 - type: nauc_map_at_1000_diff1 value: 3.7670405082837477 - type: 
nauc_map_at_1000_max value: 14.410594409695182 - type: nauc_map_at_1000_std value: 7.94738583292685 - type: nauc_map_at_100_diff1 value: 3.738796209193936 - type: nauc_map_at_100_max value: 14.408029101534694 - type: nauc_map_at_100_std value: 7.979641077687816 - type: nauc_map_at_10_diff1 value: 3.334917978089454 - type: nauc_map_at_10_max value: 13.975255289147748 - type: nauc_map_at_10_std value: 7.491959628012161 - type: nauc_map_at_1_diff1 value: 75.35066482050009 - type: nauc_map_at_1_max value: 53.573503488571475 - type: nauc_map_at_1_std value: -6.542030594426993 - type: nauc_map_at_20_diff1 value: 3.5197129341582083 - type: nauc_map_at_20_max value: 14.159880698006816 - type: nauc_map_at_20_std value: 7.856574384998483 - type: nauc_map_at_3_diff1 value: 3.0992333232864064 - type: nauc_map_at_3_max value: 12.513959281222112 - type: nauc_map_at_3_std value: 4.352912866014865 - type: nauc_map_at_5_diff1 value: 3.0351688998572537 - type: nauc_map_at_5_max value: 13.21599457624529 - type: nauc_map_at_5_std value: 6.246882983214777 - type: nauc_mrr_at_1000_diff1 value: 75.23953736361132 - type: nauc_mrr_at_1000_max value: 56.64260717262164 - type: nauc_mrr_at_1000_std value: -4.865932053762276 - type: nauc_mrr_at_100_diff1 value: 75.24091372816497 - type: nauc_mrr_at_100_max value: 56.64831104504846 - type: nauc_mrr_at_100_std value: -4.850966297943324 - type: nauc_mrr_at_10_diff1 value: 75.26540178053416 - type: nauc_mrr_at_10_max value: 56.828755673428965 - type: nauc_mrr_at_10_std value: -4.8401126970944635 - type: nauc_mrr_at_1_diff1 value: 75.35066482050009 - type: nauc_mrr_at_1_max value: 53.573503488571475 - type: nauc_mrr_at_1_std value: -6.542030594426993 - type: nauc_mrr_at_20_diff1 value: 75.24453050729845 - type: nauc_mrr_at_20_max value: 56.69220588401435 - type: nauc_mrr_at_20_std value: -4.843700730832108 - type: nauc_mrr_at_3_diff1 value: 74.98411648336175 - type: nauc_mrr_at_3_max value: 56.766537573537114 - type: nauc_mrr_at_3_std value: -4.909712671649337 - type: nauc_mrr_at_5_diff1 value: 75.20599020991028 - type: nauc_mrr_at_5_max value: 56.64236207782237 - type: nauc_mrr_at_5_std value: -5.208907367513977 - type: nauc_ndcg_at_1000_diff1 value: 11.48307079099774 - type: nauc_ndcg_at_1000_max value: 20.893326881675176 - type: nauc_ndcg_at_1000_std value: 10.43489838692119 - type: nauc_ndcg_at_100_diff1 value: 10.395588735754927 - type: nauc_ndcg_at_100_max value: 20.529573302516912 - type: nauc_ndcg_at_100_std value: 11.252973083654268 - type: nauc_ndcg_at_10_diff1 value: 8.596739352741972 - type: nauc_ndcg_at_10_max value: 18.475863682540673 - type: nauc_ndcg_at_10_std value: 9.175831033463352 - type: nauc_ndcg_at_1_diff1 value: 75.35066482050009 - type: nauc_ndcg_at_1_max value: 53.573503488571475 - type: nauc_ndcg_at_1_std value: -6.542030594426993 - type: nauc_ndcg_at_20_diff1 value: 8.998033972471749 - type: nauc_ndcg_at_20_max value: 18.892085875404522 - type: nauc_ndcg_at_20_std value: 10.3241608901084 - type: nauc_ndcg_at_3_diff1 value: 8.796384949533579 - type: nauc_ndcg_at_3_max value: 16.515261419885274 - type: nauc_ndcg_at_3_std value: 4.081902976576701 - type: nauc_ndcg_at_5_diff1 value: 8.277259464605025 - type: nauc_ndcg_at_5_max value: 17.163053202909527 - type: nauc_ndcg_at_5_std value: 6.652669449704474 - type: nauc_precision_at_1000_diff1 value: -3.490556596304827 - type: nauc_precision_at_1000_max value: 31.0473259001597 - type: nauc_precision_at_1000_std value: 52.36921397692622 - type: nauc_precision_at_100_diff1 value: -6.420747959222489 - 
type: nauc_precision_at_100_max value: 20.555887056005936 - type: nauc_precision_at_100_std value: 36.119132870798495 - type: nauc_precision_at_10_diff1 value: -6.461726057290426 - type: nauc_precision_at_10_max value: 12.161081825341915 - type: nauc_precision_at_10_std value: 17.961318451839993 - type: nauc_precision_at_1_diff1 value: 75.35066482050009 - type: nauc_precision_at_1_max value: 53.573503488571475 - type: nauc_precision_at_1_std value: -6.542030594426993 - type: nauc_precision_at_20_diff1 value: -7.361461296416161 - type: nauc_precision_at_20_max value: 12.663621261696733 - type: nauc_precision_at_20_std value: 23.312476851670286 - type: nauc_precision_at_3_diff1 value: -3.299056912774522 - type: nauc_precision_at_3_max value: 9.85602375812038 - type: nauc_precision_at_3_std value: 6.4962782003155475 - type: nauc_precision_at_5_diff1 value: -5.3155827772027795 - type: nauc_precision_at_5_max value: 10.32907751171833 - type: nauc_precision_at_5_std value: 11.384098087196932 - type: nauc_recall_at_1000_diff1 value: -3.4905565963043332 - type: nauc_recall_at_1000_max value: 31.04732590016041 - type: nauc_recall_at_1000_std value: 52.36921397692641 - type: nauc_recall_at_100_diff1 value: -6.420747959222586 - type: nauc_recall_at_100_max value: 20.55588705600596 - type: nauc_recall_at_100_std value: 36.11913287079825 - type: nauc_recall_at_10_diff1 value: -6.461726057290347 - type: nauc_recall_at_10_max value: 12.161081825342022 - type: nauc_recall_at_10_std value: 17.96131845184002 - type: nauc_recall_at_1_diff1 value: 75.35066482050009 - type: nauc_recall_at_1_max value: 53.573503488571475 - type: nauc_recall_at_1_std value: -6.542030594426993 - type: nauc_recall_at_20_diff1 value: -7.361461296416054 - type: nauc_recall_at_20_max value: 12.66362126169679 - type: nauc_recall_at_20_std value: 23.312476851670382 - type: nauc_recall_at_3_diff1 value: -3.2990569127745886 - type: nauc_recall_at_3_max value: 9.856023758120296 - type: nauc_recall_at_3_std value: 6.496278200315444 - type: nauc_recall_at_5_diff1 value: -5.315582777202729 - type: nauc_recall_at_5_max value: 10.329077511718229 - type: nauc_recall_at_5_std value: 11.384098087196932 - type: ndcg_at_1 value: 87.643 - type: ndcg_at_10 value: 76.67399999999999 - type: ndcg_at_100 value: 79.462 - type: ndcg_at_1000 value: 80.43599999999999 - type: ndcg_at_20 value: 77.83 - type: ndcg_at_3 value: 72.256 - type: ndcg_at_5 value: 74.789 - type: precision_at_1 value: 87.643 - type: precision_at_10 value: 15.726999999999999 - type: precision_at_100 value: 1.791 - type: precision_at_1000 value: 0.192 - type: precision_at_20 value: 8.236 - type: precision_at_3 value: 45.919 - type: precision_at_5 value: 29.558 - type: recall_at_1 value: 43.822 - type: recall_at_10 value: 78.636 - type: recall_at_100 value: 89.527 - type: recall_at_1000 value: 95.868 - type: recall_at_20 value: 82.363 - type: recall_at_3 value: 68.879 - type: recall_at_5 value: 73.896 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 96.6608 - type: ap value: 95.14657820401189 - type: ap_weighted value: 95.14657820401189 - type: f1 value: 96.66029695623422 - type: f1_weighted value: 96.66029695623423 - type: main_score value: 96.6608 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: main_score value: 45.217 - type: 
map_at_1 value: 24.728 - type: map_at_10 value: 37.933 - type: map_at_100 value: 39.074999999999996 - type: map_at_1000 value: 39.115 - type: map_at_20 value: 38.663 - type: map_at_3 value: 33.904 - type: map_at_5 value: 36.217 - type: mrr_at_1 value: 25.44412607449857 - type: mrr_at_10 value: 38.52640196479737 - type: mrr_at_100 value: 39.60462889736067 - type: mrr_at_1000 value: 39.638904296248526 - type: mrr_at_20 value: 39.2234365827559 - type: mrr_at_3 value: 34.59646609360076 - type: mrr_at_5 value: 36.8801337153773 - type: nauc_map_at_1000_diff1 value: 37.645652178132174 - type: nauc_map_at_1000_max value: 9.953357023361367 - type: nauc_map_at_1000_std value: -20.800238036721503 - type: nauc_map_at_100_diff1 value: 37.643073495974555 - type: nauc_map_at_100_max value: 9.95921239641703 - type: nauc_map_at_100_std value: -20.76517765535793 - type: nauc_map_at_10_diff1 value: 37.44380763335014 - type: nauc_map_at_10_max value: 9.917273043055342 - type: nauc_map_at_10_std value: -21.467951225710898 - type: nauc_map_at_1_diff1 value: 41.02118887981969 - type: nauc_map_at_1_max value: 8.301113449711778 - type: nauc_map_at_1_std value: -19.436814224415027 - type: nauc_map_at_20_diff1 value: 37.58156586490493 - type: nauc_map_at_20_max value: 9.972927967610659 - type: nauc_map_at_20_std value: -20.951374218839387 - type: nauc_map_at_3_diff1 value: 37.67246795684178 - type: nauc_map_at_3_max value: 9.307031378909478 - type: nauc_map_at_3_std value: -21.77026217965021 - type: nauc_map_at_5_diff1 value: 37.39086482095963 - type: nauc_map_at_5_max value: 9.732739107368566 - type: nauc_map_at_5_std value: -21.8424296893692 - type: nauc_mrr_at_1000_diff1 value: 37.36666719603192 - type: nauc_mrr_at_1000_max value: 9.79040465289953 - type: nauc_mrr_at_1000_std value: -20.590147245965568 - type: nauc_mrr_at_100_diff1 value: 37.36560296629318 - type: nauc_mrr_at_100_max value: 9.798113710672162 - type: nauc_mrr_at_100_std value: -20.556791838504292 - type: nauc_mrr_at_10_diff1 value: 37.19257605840734 - type: nauc_mrr_at_10_max value: 9.749429811638063 - type: nauc_mrr_at_10_std value: -21.206407664327276 - type: nauc_mrr_at_1_diff1 value: 40.98478651095172 - type: nauc_mrr_at_1_max value: 8.173841799119707 - type: nauc_mrr_at_1_std value: -19.530027987868017 - type: nauc_mrr_at_20_diff1 value: 37.29973172861245 - type: nauc_mrr_at_20_max value: 9.815127660001345 - type: nauc_mrr_at_20_std value: -20.700860112175928 - type: nauc_mrr_at_3_diff1 value: 37.282848009425734 - type: nauc_mrr_at_3_max value: 9.172741713108193 - type: nauc_mrr_at_3_std value: -21.563630513502996 - type: nauc_mrr_at_5_diff1 value: 37.08609827303586 - type: nauc_mrr_at_5_max value: 9.604643424273284 - type: nauc_mrr_at_5_std value: -21.580110806494094 - type: nauc_ndcg_at_1000_diff1 value: 37.086587020218545 - type: nauc_ndcg_at_1000_max value: 10.696860688467472 - type: nauc_ndcg_at_1000_std value: -19.50989939916873 - type: nauc_ndcg_at_100_diff1 value: 37.03794531268128 - type: nauc_ndcg_at_100_max value: 10.940820719182339 - type: nauc_ndcg_at_100_std value: -18.28651832370893 - type: nauc_ndcg_at_10_diff1 value: 36.21062857920633 - type: nauc_ndcg_at_10_max value: 10.845172882571733 - type: nauc_ndcg_at_10_std value: -21.454301679510106 - type: nauc_ndcg_at_1_diff1 value: 40.98478651095172 - type: nauc_ndcg_at_1_max value: 8.173841799119707 - type: nauc_ndcg_at_1_std value: -19.530027987868017 - type: nauc_ndcg_at_20_diff1 value: 36.583262733100526 - type: nauc_ndcg_at_20_max value: 11.10492720898974 - type: 
nauc_ndcg_at_20_std value: -19.41753284137609 - type: nauc_ndcg_at_3_diff1 value: 36.57271365035382 - type: nauc_ndcg_at_3_max value: 9.56073433062999 - type: nauc_ndcg_at_3_std value: -22.324263670932915 - type: nauc_ndcg_at_5_diff1 value: 36.09419372820154 - type: nauc_ndcg_at_5_max value: 10.357384992631271 - type: nauc_ndcg_at_5_std value: -22.389578276324894 - type: nauc_precision_at_1000_diff1 value: -2.7435338714030597 - type: nauc_precision_at_1000_max value: 4.302274933383809 - type: nauc_precision_at_1000_std value: 8.456846348638948 - type: nauc_precision_at_100_diff1 value: 15.149466332615983 - type: nauc_precision_at_100_max value: 12.501013731673163 - type: nauc_precision_at_100_std value: 15.909667509021785 - type: nauc_precision_at_10_diff1 value: 28.699788688314214 - type: nauc_precision_at_10_max value: 13.024586051842347 - type: nauc_precision_at_10_std value: -19.197658937078703 - type: nauc_precision_at_1_diff1 value: 40.98478651095172 - type: nauc_precision_at_1_max value: 8.173841799119707 - type: nauc_precision_at_1_std value: -19.530027987868017 - type: nauc_precision_at_20_diff1 value: 26.519292942353395 - type: nauc_precision_at_20_max value: 14.389979272056438 - type: nauc_precision_at_20_std value: -7.030956994938155 - type: nauc_precision_at_3_diff1 value: 32.87913492278213 - type: nauc_precision_at_3_max value: 9.673660161387776 - type: nauc_precision_at_3_std value: -23.905612656592172 - type: nauc_precision_at_5_diff1 value: 30.903850113238597 - type: nauc_precision_at_5_max value: 11.482375434154898 - type: nauc_precision_at_5_std value: -23.828657095254247 - type: nauc_recall_at_1000_diff1 value: 35.80765639589219 - type: nauc_recall_at_1000_max value: 50.94532805969448 - type: nauc_recall_at_1000_std value: 66.79910877083275 - type: nauc_recall_at_100_diff1 value: 34.96182828311028 - type: nauc_recall_at_100_max value: 21.729699631790556 - type: nauc_recall_at_100_std value: 23.509439011686474 - type: nauc_recall_at_10_diff1 value: 31.88371369567137 - type: nauc_recall_at_10_max value: 14.425389702697073 - type: nauc_recall_at_10_std value: -20.95578001880924 - type: nauc_recall_at_1_diff1 value: 41.02118887981969 - type: nauc_recall_at_1_max value: 8.301113449711778 - type: nauc_recall_at_1_std value: -19.436814224415027 - type: nauc_recall_at_20_diff1 value: 32.42718780622455 - type: nauc_recall_at_20_max value: 16.90686126329399 - type: nauc_recall_at_20_std value: -9.38158227016737 - type: nauc_recall_at_3_diff1 value: 33.68966646043966 - type: nauc_recall_at_3_max value: 10.336277419708532 - type: nauc_recall_at_3_std value: -23.80165869168538 - type: nauc_recall_at_5_diff1 value: 32.26258807452426 - type: nauc_recall_at_5_max value: 12.303713005399935 - type: nauc_recall_at_5_std value: -23.87721891164968 - type: ndcg_at_1 value: 25.444 - type: ndcg_at_10 value: 45.217 - type: ndcg_at_100 value: 50.575 - type: ndcg_at_1000 value: 51.519999999999996 - type: ndcg_at_20 value: 47.786 - type: ndcg_at_3 value: 37.067 - type: ndcg_at_5 value: 41.184 - type: precision_at_1 value: 25.444 - type: precision_at_10 value: 7.07 - type: precision_at_100 value: 0.9730000000000001 - type: precision_at_1000 value: 0.106 - type: precision_at_20 value: 4.072 - type: precision_at_3 value: 15.754999999999999 - type: precision_at_5 value: 11.544 - type: recall_at_1 value: 24.728 - type: recall_at_10 value: 67.607 - type: recall_at_100 value: 92.094 - type: recall_at_1000 value: 99.165 - type: recall_at_20 value: 77.529 - type: recall_at_3 value: 45.535 - type: 
recall_at_5 value: 55.394 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 99.01276789785682 - type: f1 value: 98.9288649250924 - type: f1_weighted value: 99.01406884928141 - type: main_score value: 99.01276789785682 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 92.78385772913816 - type: f1 value: 79.78115704297824 - type: f1_weighted value: 93.90424147486428 - type: main_score value: 92.78385772913816 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 85.83053127101546 - type: f1 value: 82.72036139888232 - type: f1_weighted value: 85.81759723866098 - type: main_score value: 85.83053127101546 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 90.19838601210489 - type: f1 value: 89.55260197964978 - type: f1_weighted value: 90.11422965504119 - type: main_score value: 90.19838601210489 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: main_score value: 46.866746897607094 - type: v_measure value: 46.866746897607094 - type: v_measure_std value: 1.0966477896919726 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: main_score value: 44.6538827415503 - type: v_measure value: 44.6538827415503 - type: v_measure_std value: 1.1649569936599116 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7 metrics: - type: main_score value: 33.05449204940555 - type: map value: 33.05449204940555 - type: mrr value: 34.32562058439585 - type: nAUC_map_diff1 value: 11.465656013162807 - type: nAUC_map_max value: -20.400088169502308 - type: nAUC_map_std value: -2.638964886362445 - type: nAUC_mrr_diff1 value: 10.644290702481207 - type: nAUC_mrr_max value: -15.304687384645769 - type: nAUC_mrr_std value: -0.519919931348978 - task: type: Retrieval dataset: name: MTEB NFCorpus type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: main_score value: 41.998000000000005 - type: map_at_1 value: 6.907000000000001 - type: map_at_10 value: 16.397000000000002 - type: map_at_100 value: 21.69 - type: map_at_1000 value: 23.652 - type: map_at_20 value: 18.629 - type: map_at_3 value: 11.969000000000001 - type: map_at_5 value: 13.894 - type: mrr_at_1 value: 53.25077399380805 - type: mrr_at_10 value: 61.8561108653988 - type: mrr_at_100 value: 62.42447851935404 - type: mrr_at_1000 value: 62.459626424428095 - type: mrr_at_20 value: 62.287236389990696 - type: mrr_at_3 value: 60.42311661506711 - type: mrr_at_5 value: 61.36738906088753 - type: nauc_map_at_1000_diff1 value: 17.159461939643844 - type: nauc_map_at_1000_max value: 
32.42764938789903 - type: nauc_map_at_1000_std value: 11.039427848422093 - type: nauc_map_at_100_diff1 value: 19.089532984187503 - type: nauc_map_at_100_max value: 31.96721085058713 - type: nauc_map_at_100_std value: 6.947468655726444 - type: nauc_map_at_10_diff1 value: 25.77255342629802 - type: nauc_map_at_10_max value: 26.163590320961543 - type: nauc_map_at_10_std value: -5.2588093720998375 - type: nauc_map_at_1_diff1 value: 46.31602607957798 - type: nauc_map_at_1_max value: 11.807757660801942 - type: nauc_map_at_1_std value: -13.984889089354317 - type: nauc_map_at_20_diff1 value: 22.308161130465365 - type: nauc_map_at_20_max value: 29.070587307827722 - type: nauc_map_at_20_std value: -1.0103056620851558 - type: nauc_map_at_3_diff1 value: 33.580827849617506 - type: nauc_map_at_3_max value: 17.661630885799042 - type: nauc_map_at_3_std value: -11.463282544041888 - type: nauc_map_at_5_diff1 value: 30.32603342696912 - type: nauc_map_at_5_max value: 20.938905485667245 - type: nauc_map_at_5_std value: -10.537086968155755 - type: nauc_mrr_at_1000_diff1 value: 24.45065397805829 - type: nauc_mrr_at_1000_max value: 48.17519860927417 - type: nauc_mrr_at_1000_std value: 30.350767549118903 - type: nauc_mrr_at_100_diff1 value: 24.444061606534486 - type: nauc_mrr_at_100_max value: 48.1922894212229 - type: nauc_mrr_at_100_std value: 30.379257816584094 - type: nauc_mrr_at_10_diff1 value: 24.25598717198779 - type: nauc_mrr_at_10_max value: 48.10437607774264 - type: nauc_mrr_at_10_std value: 30.090202482685996 - type: nauc_mrr_at_1_diff1 value: 26.907595285201264 - type: nauc_mrr_at_1_max value: 44.006974050369955 - type: nauc_mrr_at_1_std value: 26.921001962861062 - type: nauc_mrr_at_20_diff1 value: 24.462771570553738 - type: nauc_mrr_at_20_max value: 48.264688196799746 - type: nauc_mrr_at_20_std value: 30.498095141265914 - type: nauc_mrr_at_3_diff1 value: 24.76829388237229 - type: nauc_mrr_at_3_max value: 48.213758704739924 - type: nauc_mrr_at_3_std value: 30.1502853918892 - type: nauc_mrr_at_5_diff1 value: 24.476494932330247 - type: nauc_mrr_at_5_max value: 47.977250552198804 - type: nauc_mrr_at_5_std value: 29.65248143104835 - type: nauc_ndcg_at_1000_diff1 value: 13.055818920426246 - type: nauc_ndcg_at_1000_max value: 46.00986444256306 - type: nauc_ndcg_at_1000_std value: 29.622662054922085 - type: nauc_ndcg_at_100_diff1 value: 12.260551238228816 - type: nauc_ndcg_at_100_max value: 39.89783048267698 - type: nauc_ndcg_at_100_std value: 23.806961617956613 - type: nauc_ndcg_at_10_diff1 value: 11.002915931619567 - type: nauc_ndcg_at_10_max value: 39.79323759244374 - type: nauc_ndcg_at_10_std value: 23.053072152911046 - type: nauc_ndcg_at_1_diff1 value: 27.560910719974434 - type: nauc_ndcg_at_1_max value: 41.21084046258119 - type: nauc_ndcg_at_1_std value: 26.112891742912893 - type: nauc_ndcg_at_20_diff1 value: 10.085854089024496 - type: nauc_ndcg_at_20_max value: 37.88629173784684 - type: nauc_ndcg_at_20_std value: 23.17664322248358 - type: nauc_ndcg_at_3_diff1 value: 16.58969583405987 - type: nauc_ndcg_at_3_max value: 41.282222954101435 - type: nauc_ndcg_at_3_std value: 21.080670648392747 - type: nauc_ndcg_at_5_diff1 value: 13.893127947909885 - type: nauc_ndcg_at_5_max value: 40.21188015992804 - type: nauc_ndcg_at_5_std value: 21.417443978842652 - type: nauc_precision_at_1000_diff1 value: -17.227504530334564 - type: nauc_precision_at_1000_max value: 3.798554468439066 - type: nauc_precision_at_1000_std value: 35.73617809452683 - type: nauc_precision_at_100_diff1 value: -17.63388230218776 - type: 
nauc_precision_at_100_max value: 15.079399882407094 - type: nauc_precision_at_100_std value: 41.83698491321226 - type: nauc_precision_at_10_diff1 value: -11.850925959645156 - type: nauc_precision_at_10_max value: 35.93283968364352 - type: nauc_precision_at_10_std value: 34.391271855921296 - type: nauc_precision_at_1_diff1 value: 27.730860778824823 - type: nauc_precision_at_1_max value: 43.97462471516834 - type: nauc_precision_at_1_std value: 27.491068270978896 - type: nauc_precision_at_20_diff1 value: -14.281328840943347 - type: nauc_precision_at_20_max value: 29.469099781759006 - type: nauc_precision_at_20_std value: 38.54703022340941 - type: nauc_precision_at_3_diff1 value: 3.486986910413196 - type: nauc_precision_at_3_max value: 41.21107780473768 - type: nauc_precision_at_3_std value: 24.057479124531216 - type: nauc_precision_at_5_diff1 value: -3.0623787872866233 - type: nauc_precision_at_5_max value: 37.49266386466702 - type: nauc_precision_at_5_std value: 26.894454268004935 - type: nauc_recall_at_1000_diff1 value: -2.446891864334283 - type: nauc_recall_at_1000_max value: 23.867293584643377 - type: nauc_recall_at_1000_std value: 16.34707128224595 - type: nauc_recall_at_100_diff1 value: 4.891133690841179 - type: nauc_recall_at_100_max value: 24.56727964996522 - type: nauc_recall_at_100_std value: 9.847212953200797 - type: nauc_recall_at_10_diff1 value: 19.211912363585288 - type: nauc_recall_at_10_max value: 24.825344777920737 - type: nauc_recall_at_10_std value: -5.447989195041898 - type: nauc_recall_at_1_diff1 value: 46.31602607957798 - type: nauc_recall_at_1_max value: 11.807757660801942 - type: nauc_recall_at_1_std value: -13.984889089354317 - type: nauc_recall_at_20_diff1 value: 12.233372054304805 - type: nauc_recall_at_20_max value: 22.284108685207148 - type: nauc_recall_at_20_std value: -4.317138366746209 - type: nauc_recall_at_3_diff1 value: 28.394631527225815 - type: nauc_recall_at_3_max value: 15.593864852625462 - type: nauc_recall_at_3_std value: -12.383531804314593 - type: nauc_recall_at_5_diff1 value: 24.457441304950343 - type: nauc_recall_at_5_max value: 19.080049396281623 - type: nauc_recall_at_5_std value: -11.879747703626627 - type: ndcg_at_1 value: 51.548 - type: ndcg_at_10 value: 41.998000000000005 - type: ndcg_at_100 value: 39.626 - type: ndcg_at_1000 value: 48.707 - type: ndcg_at_20 value: 40.181 - type: ndcg_at_3 value: 48.06 - type: ndcg_at_5 value: 45.829 - type: precision_at_1 value: 52.941 - type: precision_at_10 value: 31.330999999999996 - type: precision_at_100 value: 10.421 - type: precision_at_1000 value: 2.428 - type: precision_at_20 value: 24.118000000000002 - type: precision_at_3 value: 45.408 - type: precision_at_5 value: 39.938 - type: recall_at_1 value: 6.907000000000001 - type: recall_at_10 value: 20.51 - type: recall_at_100 value: 40.857 - type: recall_at_1000 value: 73.616 - type: recall_at_20 value: 26.52 - type: recall_at_3 value: 13.267999999999999 - type: recall_at_5 value: 16.141 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: main_score value: 71.8 - type: map_at_1 value: 47.629 - type: map_at_10 value: 64.846 - type: map_at_100 value: 65.40899999999999 - type: map_at_1000 value: 65.416 - type: map_at_20 value: 65.239 - type: map_at_3 value: 61.185 - type: map_at_5 value: 63.583 - type: mrr_at_1 value: 53.15758980301275 - type: mrr_at_10 value: 67.12880961577366 - type: mrr_at_100 value: 67.44006405426018 - type: mrr_at_1000 value: 
67.44519150402294 - type: mrr_at_20 value: 67.34317135515428 - type: mrr_at_3 value: 64.5905755117805 - type: mrr_at_5 value: 66.24613750482806 - type: nauc_map_at_1000_diff1 value: 45.73812106517133 - type: nauc_map_at_1000_max value: 35.21262031755756 - type: nauc_map_at_1000_std value: -5.549443574026027 - type: nauc_map_at_100_diff1 value: 45.74254652176879 - type: nauc_map_at_100_max value: 35.22349167515518 - type: nauc_map_at_100_std value: -5.53697496044773 - type: nauc_map_at_10_diff1 value: 45.62837128377087 - type: nauc_map_at_10_max value: 35.3261562342222 - type: nauc_map_at_10_std value: -5.761924414031163 - type: nauc_map_at_1_diff1 value: 48.69187848570499 - type: nauc_map_at_1_max value: 28.687996096473476 - type: nauc_map_at_1_std value: -7.518605958272523 - type: nauc_map_at_20_diff1 value: 45.702303442220035 - type: nauc_map_at_20_max value: 35.30719944705456 - type: nauc_map_at_20_std value: -5.59505654742681 - type: nauc_map_at_3_diff1 value: 45.376813726832474 - type: nauc_map_at_3_max value: 34.68452149643597 - type: nauc_map_at_3_std value: -7.329014950379634 - type: nauc_map_at_5_diff1 value: 45.29528861989316 - type: nauc_map_at_5_max value: 35.35741440869229 - type: nauc_map_at_5_std value: -6.028788612259288 - type: nauc_mrr_at_1000_diff1 value: 46.11808147912517 - type: nauc_mrr_at_1000_max value: 35.59241850411947 - type: nauc_mrr_at_1000_std value: -3.4072428526109317 - type: nauc_mrr_at_100_diff1 value: 46.121345545514046 - type: nauc_mrr_at_100_max value: 35.60147795073431 - type: nauc_mrr_at_100_std value: -3.3965322447588826 - type: nauc_mrr_at_10_diff1 value: 46.0920068210502 - type: nauc_mrr_at_10_max value: 35.79649987854354 - type: nauc_mrr_at_10_std value: -3.339624589368137 - type: nauc_mrr_at_1_diff1 value: 49.101364605656194 - type: nauc_mrr_at_1_max value: 31.500796071482146 - type: nauc_mrr_at_1_std value: -4.183818500718156 - type: nauc_mrr_at_20_diff1 value: 46.088076630465594 - type: nauc_mrr_at_20_max value: 35.682131663053205 - type: nauc_mrr_at_20_std value: -3.35939023178519 - type: nauc_mrr_at_3_diff1 value: 45.47570812708642 - type: nauc_mrr_at_3_max value: 35.741892517632984 - type: nauc_mrr_at_3_std value: -4.135335963822013 - type: nauc_mrr_at_5_diff1 value: 45.78903474184014 - type: nauc_mrr_at_5_max value: 35.91273593700205 - type: nauc_mrr_at_5_std value: -3.467873421286869 - type: nauc_ndcg_at_1000_diff1 value: 45.5056583000012 - type: nauc_ndcg_at_1000_max value: 36.34328379251593 - type: nauc_ndcg_at_1000_std value: -4.0759698229323345 - type: nauc_ndcg_at_100_diff1 value: 45.61918946477166 - type: nauc_ndcg_at_100_max value: 36.675460335836235 - type: nauc_ndcg_at_100_std value: -3.6795334726235986 - type: nauc_ndcg_at_10_diff1 value: 45.15343994274541 - type: nauc_ndcg_at_10_max value: 37.48139242964657 - type: nauc_ndcg_at_10_std value: -4.287039084554882 - type: nauc_ndcg_at_1_diff1 value: 49.101364605656194 - type: nauc_ndcg_at_1_max value: 31.500796071482146 - type: nauc_ndcg_at_1_std value: -4.183818500718156 - type: nauc_ndcg_at_20_diff1 value: 45.310026313402375 - type: nauc_ndcg_at_20_max value: 37.32177497902133 - type: nauc_ndcg_at_20_std value: -3.8214360391282587 - type: nauc_ndcg_at_3_diff1 value: 44.27064370528994 - type: nauc_ndcg_at_3_max value: 36.380294033571396 - type: nauc_ndcg_at_3_std value: -6.844263370898355 - type: nauc_ndcg_at_5_diff1 value: 44.29933499225583 - type: nauc_ndcg_at_5_max value: 37.46477041822136 - type: nauc_ndcg_at_5_std value: -4.866548530467956 - type: nauc_precision_at_1000_diff1 
value: -14.666553359142306 - type: nauc_precision_at_1000_max value: -0.5599759853201481 - type: nauc_precision_at_1000_std value: 16.8370925526591 - type: nauc_precision_at_100_diff1 value: -11.816251306246278 - type: nauc_precision_at_100_max value: 2.969819268208207 - type: nauc_precision_at_100_std value: 18.59422946634747 - type: nauc_precision_at_10_diff1 value: 1.2050200086029401 - type: nauc_precision_at_10_max value: 17.59930352911209 - type: nauc_precision_at_10_std value: 13.714495717588985 - type: nauc_precision_at_1_diff1 value: 49.101364605656194 - type: nauc_precision_at_1_max value: 31.500796071482146 - type: nauc_precision_at_1_std value: -4.183818500718156 - type: nauc_precision_at_20_diff1 value: -5.263476664822757 - type: nauc_precision_at_20_max value: 11.42004823600046 - type: nauc_precision_at_20_std value: 16.510514518664994 - type: nauc_precision_at_3_diff1 value: 20.116460379305828 - type: nauc_precision_at_3_max value: 31.32235038301311 - type: nauc_precision_at_3_std value: 2.7486717133871923 - type: nauc_precision_at_5_diff1 value: 9.57451645335723 - type: nauc_precision_at_5_max value: 25.28449126580587 - type: nauc_precision_at_5_std value: 9.955736162466767 - type: nauc_recall_at_1000_diff1 value: -21.632253065978794 - type: nauc_recall_at_1000_max value: 70.14409090958776 - type: nauc_recall_at_1000_std value: 65.61658090892989 - type: nauc_recall_at_100_diff1 value: 51.83161124806711 - type: nauc_recall_at_100_max value: 77.49921361841523 - type: nauc_recall_at_100_std value: 48.352508746719444 - type: nauc_recall_at_10_diff1 value: 39.86695231362791 - type: nauc_recall_at_10_max value: 50.12029094799474 - type: nauc_recall_at_10_std value: 0.1650940628131058 - type: nauc_recall_at_1_diff1 value: 48.69187848570499 - type: nauc_recall_at_1_max value: 28.687996096473476 - type: nauc_recall_at_1_std value: -7.518605958272523 - type: nauc_recall_at_20_diff1 value: 39.14155398061627 - type: nauc_recall_at_20_max value: 56.78559423716229 - type: nauc_recall_at_20_std value: 7.9728224572344075 - type: nauc_recall_at_3_diff1 value: 38.69589523432158 - type: nauc_recall_at_3_max value: 39.53271258375579 - type: nauc_recall_at_3_std value: -8.646925065787512 - type: nauc_recall_at_5_diff1 value: 37.45922652959002 - type: nauc_recall_at_5_max value: 44.4911958995867 - type: nauc_recall_at_5_std value: -3.5659842556375594 - type: ndcg_at_1 value: 53.15800000000001 - type: ndcg_at_10 value: 71.8 - type: ndcg_at_100 value: 73.85199999999999 - type: ndcg_at_1000 value: 74.017 - type: ndcg_at_20 value: 72.933 - type: ndcg_at_3 value: 65.479 - type: ndcg_at_5 value: 69.182 - type: precision_at_1 value: 53.15800000000001 - type: precision_at_10 value: 10.805 - type: precision_at_100 value: 1.2 - type: precision_at_1000 value: 0.122 - type: precision_at_20 value: 5.694 - type: precision_at_3 value: 28.939999999999998 - type: precision_at_5 value: 19.641000000000002 - type: recall_at_1 value: 47.629 - type: recall_at_10 value: 90.204 - type: recall_at_100 value: 98.66 - type: recall_at_1000 value: 99.874 - type: recall_at_20 value: 94.24 - type: recall_at_3 value: 74.394 - type: recall_at_5 value: 82.711 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 metrics: - type: main_score value: 90.025 - type: map_at_1 value: 72.222 - type: map_at_10 value: 86.58500000000001 - type: map_at_100 value: 87.176 - type: map_at_1000 value: 87.188 - type: map_at_20 value: 
86.97399999999999 - type: map_at_3 value: 83.736 - type: map_at_5 value: 85.554 - type: mrr_at_1 value: 83.04 - type: mrr_at_10 value: 89.05599603174585 - type: mrr_at_100 value: 89.12398891419457 - type: mrr_at_1000 value: 89.12434072241001 - type: mrr_at_20 value: 89.10416280692111 - type: mrr_at_3 value: 88.23833333333312 - type: mrr_at_5 value: 88.82233333333308 - type: nauc_map_at_1000_diff1 value: 78.29348113313218 - type: nauc_map_at_1000_max value: 32.31386754277228 - type: nauc_map_at_1000_std value: -50.47543661484052 - type: nauc_map_at_100_diff1 value: 78.29618548618575 - type: nauc_map_at_100_max value: 32.301475680947846 - type: nauc_map_at_100_std value: -50.50303428814228 - type: nauc_map_at_10_diff1 value: 78.47383776440803 - type: nauc_map_at_10_max value: 31.839339990133563 - type: nauc_map_at_10_std value: -52.832713555976 - type: nauc_map_at_1_diff1 value: 82.46330147467418 - type: nauc_map_at_1_max value: 23.497664918373538 - type: nauc_map_at_1_std value: -43.824657665520704 - type: nauc_map_at_20_diff1 value: 78.34772176474422 - type: nauc_map_at_20_max value: 32.16495182893947 - type: nauc_map_at_20_std value: -51.503292726558605 - type: nauc_map_at_3_diff1 value: 79.07823813069432 - type: nauc_map_at_3_max value: 29.395911687513976 - type: nauc_map_at_3_std value: -54.16377546873304 - type: nauc_map_at_5_diff1 value: 78.73076619520454 - type: nauc_map_at_5_max value: 30.700453118585237 - type: nauc_map_at_5_std value: -54.130514177664054 - type: nauc_mrr_at_1000_diff1 value: 79.04736184471865 - type: nauc_mrr_at_1000_max value: 34.43004593837643 - type: nauc_mrr_at_1000_std value: -46.137269068195316 - type: nauc_mrr_at_100_diff1 value: 79.04698704288086 - type: nauc_mrr_at_100_max value: 34.4305553741175 - type: nauc_mrr_at_100_std value: -46.13786687786434 - type: nauc_mrr_at_10_diff1 value: 79.04490677485934 - type: nauc_mrr_at_10_max value: 34.38170181522227 - type: nauc_mrr_at_10_std value: -46.38129875681807 - type: nauc_mrr_at_1_diff1 value: 79.87159215719124 - type: nauc_mrr_at_1_max value: 34.05882339253136 - type: nauc_mrr_at_1_std value: -43.56093395137571 - type: nauc_mrr_at_20_diff1 value: 79.04384174535653 - type: nauc_mrr_at_20_max value: 34.442136494675005 - type: nauc_mrr_at_20_std value: -46.205458519638654 - type: nauc_mrr_at_3_diff1 value: 78.78154519155487 - type: nauc_mrr_at_3_max value: 34.74995000500305 - type: nauc_mrr_at_3_std value: -46.36264203155416 - type: nauc_mrr_at_5_diff1 value: 79.02631187177 - type: nauc_mrr_at_5_max value: 34.538698249632205 - type: nauc_mrr_at_5_std value: -46.468881576157465 - type: nauc_ndcg_at_1000_diff1 value: 78.25260097014645 - type: nauc_ndcg_at_1000_max value: 33.68584498704271 - type: nauc_ndcg_at_1000_std value: -48.44716779494868 - type: nauc_ndcg_at_100_diff1 value: 78.25115412256716 - type: nauc_ndcg_at_100_max value: 33.63652663447088 - type: nauc_ndcg_at_100_std value: -48.489243909024715 - type: nauc_ndcg_at_10_diff1 value: 78.23875101557334 - type: nauc_ndcg_at_10_max value: 32.65217430043823 - type: nauc_ndcg_at_10_std value: -52.57770468845309 - type: nauc_ndcg_at_1_diff1 value: 79.87159215719124 - type: nauc_ndcg_at_1_max value: 34.05882339253136 - type: nauc_ndcg_at_1_std value: -43.56093395137571 - type: nauc_ndcg_at_20_diff1 value: 78.23478552311765 - type: nauc_ndcg_at_20_max value: 33.30691737901109 - type: nauc_ndcg_at_20_std value: -50.78412614854527 - type: nauc_ndcg_at_3_diff1 value: 77.66134485470224 - type: nauc_ndcg_at_3_max value: 32.19504710373125 - type: nauc_ndcg_at_3_std 
value: -52.01636728550155 - type: nauc_ndcg_at_5_diff1 value: 78.04734137324255 - type: nauc_ndcg_at_5_max value: 31.94593625591248 - type: nauc_ndcg_at_5_std value: -53.02169800690546 - type: nauc_precision_at_1000_diff1 value: -45.771948123542636 - type: nauc_precision_at_1000_max value: -5.182406190477681 - type: nauc_precision_at_1000_std value: 41.14460438707817 - type: nauc_precision_at_100_diff1 value: -45.64767154261461 - type: nauc_precision_at_100_max value: -5.046308286851713 - type: nauc_precision_at_100_std value: 41.07186716587844 - type: nauc_precision_at_10_diff1 value: -42.26779562305825 - type: nauc_precision_at_10_max value: -1.1264852893323076 - type: nauc_precision_at_10_std value: 27.62275729822392 - type: nauc_precision_at_1_diff1 value: 79.87159215719124 - type: nauc_precision_at_1_max value: 34.05882339253136 - type: nauc_precision_at_1_std value: -43.56093395137571 - type: nauc_precision_at_20_diff1 value: -44.24293221128388 - type: nauc_precision_at_20_max value: -3.1345628837361867 - type: nauc_precision_at_20_std value: 34.23625492740366 - type: nauc_precision_at_3_diff1 value: -24.925251389823348 - type: nauc_precision_at_3_max value: 6.622188833369412 - type: nauc_precision_at_3_std value: 6.424741786858512 - type: nauc_precision_at_5_diff1 value: -36.1407949990387 - type: nauc_precision_at_5_max value: 1.7533948968374462 - type: nauc_precision_at_5_std value: 17.914083278982634 - type: nauc_recall_at_1000_diff1 value: 52.26815466244496 - type: nauc_recall_at_1000_max value: 69.73611104239443 - type: nauc_recall_at_1000_std value: 73.18969965863008 - type: nauc_recall_at_100_diff1 value: 70.80557513785271 - type: nauc_recall_at_100_max value: 33.333440086544556 - type: nauc_recall_at_100_std value: -38.75992366905504 - type: nauc_recall_at_10_diff1 value: 74.45948457438163 - type: nauc_recall_at_10_max value: 26.64948512428989 - type: nauc_recall_at_10_std value: -82.90334292052363 - type: nauc_recall_at_1_diff1 value: 82.46330147467418 - type: nauc_recall_at_1_max value: 23.497664918373538 - type: nauc_recall_at_1_std value: -43.824657665520704 - type: nauc_recall_at_20_diff1 value: 73.80140280887753 - type: nauc_recall_at_20_max value: 30.361616426734965 - type: nauc_recall_at_20_std value: -81.1418804447414 - type: nauc_recall_at_3_diff1 value: 75.19854736087834 - type: nauc_recall_at_3_max value: 26.12298005045584 - type: nauc_recall_at_3_std value: -63.42583714745169 - type: nauc_recall_at_5_diff1 value: 74.16423451950358 - type: nauc_recall_at_5_max value: 25.552390331018987 - type: nauc_recall_at_5_std value: -71.15891947773912 - type: ndcg_at_1 value: 83.04 - type: ndcg_at_10 value: 90.025 - type: ndcg_at_100 value: 91.006 - type: ndcg_at_1000 value: 91.061 - type: ndcg_at_20 value: 90.556 - type: ndcg_at_3 value: 87.493 - type: ndcg_at_5 value: 88.955 - type: precision_at_1 value: 83.04 - type: precision_at_10 value: 13.667000000000002 - type: precision_at_100 value: 1.542 - type: precision_at_1000 value: 0.157 - type: precision_at_20 value: 7.221 - type: precision_at_3 value: 38.433 - type: precision_at_5 value: 25.228 - type: recall_at_1 value: 72.222 - type: recall_at_10 value: 96.604 - type: recall_at_100 value: 99.786 - type: recall_at_1000 value: 99.996 - type: recall_at_20 value: 98.253 - type: recall_at_3 value: 89.276 - type: recall_at_5 value: 93.46 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: 
main_score value: 72.86492101891123 - type: v_measure value: 72.86492101891123 - type: v_measure_std value: 2.778711445144635 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: main_score value: 75.27316726548479 - type: v_measure value: 75.27316726548479 - type: v_measure_std value: 8.87871936725338 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88 metrics: - type: main_score value: 26.638 - type: map_at_1 value: 6.128 - type: map_at_10 value: 16.472 - type: map_at_100 value: 19.522000000000002 - type: map_at_1000 value: 19.898 - type: map_at_20 value: 18.098 - type: map_at_3 value: 11.283 - type: map_at_5 value: 13.771 - type: mrr_at_1 value: 30.2 - type: mrr_at_10 value: 42.621150793650735 - type: mrr_at_100 value: 43.740858712021954 - type: mrr_at_1000 value: 43.762699500220904 - type: mrr_at_20 value: 43.383639927753634 - type: mrr_at_3 value: 38.83333333333331 - type: mrr_at_5 value: 41.14833333333326 - type: nauc_map_at_1000_diff1 value: 13.13534664124808 - type: nauc_map_at_1000_max value: 29.346654566149795 - type: nauc_map_at_1000_std value: 18.08121186982413 - type: nauc_map_at_100_diff1 value: 13.098072728041538 - type: nauc_map_at_100_max value: 29.299084480697523 - type: nauc_map_at_100_std value: 17.961620202918464 - type: nauc_map_at_10_diff1 value: 14.001743720394682 - type: nauc_map_at_10_max value: 28.04128290996403 - type: nauc_map_at_10_std value: 13.744481555974716 - type: nauc_map_at_1_diff1 value: 22.1926640424872 - type: nauc_map_at_1_max value: 21.32609279586034 - type: nauc_map_at_1_std value: 6.566596302915438 - type: nauc_map_at_20_diff1 value: 13.57313142419664 - type: nauc_map_at_20_max value: 28.93840146319476 - type: nauc_map_at_20_std value: 16.50869367365676 - type: nauc_map_at_3_diff1 value: 17.707700541948462 - type: nauc_map_at_3_max value: 26.058174051376238 - type: nauc_map_at_3_std value: 9.943924560735267 - type: nauc_map_at_5_diff1 value: 17.11844492157723 - type: nauc_map_at_5_max value: 27.865247403049388 - type: nauc_map_at_5_std value: 11.372588172121546 - type: nauc_mrr_at_1000_diff1 value: 21.11248719936198 - type: nauc_mrr_at_1000_max value: 26.734172102201466 - type: nauc_mrr_at_1000_std value: 11.766121765437228 - type: nauc_mrr_at_100_diff1 value: 21.107109982277702 - type: nauc_mrr_at_100_max value: 26.741616065723267 - type: nauc_mrr_at_100_std value: 11.789802686224208 - type: nauc_mrr_at_10_diff1 value: 20.74108639793207 - type: nauc_mrr_at_10_max value: 26.920838463358333 - type: nauc_mrr_at_10_std value: 11.849217361926522 - type: nauc_mrr_at_1_diff1 value: 22.177437860573356 - type: nauc_mrr_at_1_max value: 21.88074521417754 - type: nauc_mrr_at_1_std value: 6.776011900101789 - type: nauc_mrr_at_20_diff1 value: 21.126633710175994 - type: nauc_mrr_at_20_max value: 26.860736480370974 - type: nauc_mrr_at_20_std value: 11.815411633726338 - type: nauc_mrr_at_3_diff1 value: 21.689245200066466 - type: nauc_mrr_at_3_max value: 26.187305092831625 - type: nauc_mrr_at_3_std value: 10.895380313134332 - type: nauc_mrr_at_5_diff1 value: 20.898811082479778 - type: nauc_mrr_at_5_max value: 26.939217247104036 - type: nauc_mrr_at_5_std value: 11.77832949822472 - type: nauc_ndcg_at_1000_diff1 value: 13.251184947898546 - type: nauc_ndcg_at_1000_max value: 30.879594164526146 - type: nauc_ndcg_at_1000_std value: 
23.125206047366625 - type: nauc_ndcg_at_100_diff1 value: 12.549100649053676 - type: nauc_ndcg_at_100_max value: 30.634680845419123 - type: nauc_ndcg_at_100_std value: 23.296226055422984 - type: nauc_ndcg_at_10_diff1 value: 14.475144549294322 - type: nauc_ndcg_at_10_max value: 29.450349815417336 - type: nauc_ndcg_at_10_std value: 15.94068314781612 - type: nauc_ndcg_at_1_diff1 value: 22.177437860573356 - type: nauc_ndcg_at_1_max value: 21.88074521417754 - type: nauc_ndcg_at_1_std value: 6.776011900101789 - type: nauc_ndcg_at_20_diff1 value: 14.173669585802266 - type: nauc_ndcg_at_20_max value: 30.475890854725 - type: nauc_ndcg_at_20_std value: 19.863898148221704 - type: nauc_ndcg_at_3_diff1 value: 18.93971261196868 - type: nauc_ndcg_at_3_max value: 27.3707298720736 - type: nauc_ndcg_at_3_std value: 11.439810510051224 - type: nauc_ndcg_at_5_diff1 value: 17.89535958094687 - type: nauc_ndcg_at_5_max value: 29.272740466638425 - type: nauc_ndcg_at_5_std value: 13.402467626635909 - type: nauc_precision_at_1000_diff1 value: -3.811547048784123 - type: nauc_precision_at_1000_max value: 22.55165337197117 - type: nauc_precision_at_1000_std value: 35.98524999650108 - type: nauc_precision_at_100_diff1 value: 0.6474234774922896 - type: nauc_precision_at_100_max value: 25.06920726527032 - type: nauc_precision_at_100_std value: 32.31439698982313 - type: nauc_precision_at_10_diff1 value: 7.943127218139508 - type: nauc_precision_at_10_max value: 28.571937636787197 - type: nauc_precision_at_10_std value: 18.8472620918488 - type: nauc_precision_at_1_diff1 value: 22.177437860573356 - type: nauc_precision_at_1_max value: 21.88074521417754 - type: nauc_precision_at_1_std value: 6.776011900101789 - type: nauc_precision_at_20_diff1 value: 6.981574259607366 - type: nauc_precision_at_20_max value: 28.986094397038727 - type: nauc_precision_at_20_std value: 25.83129974001146 - type: nauc_precision_at_3_diff1 value: 17.197490724039355 - type: nauc_precision_at_3_max value: 29.17569320583099 - type: nauc_precision_at_3_std value: 13.430554945991846 - type: nauc_precision_at_5_diff1 value: 14.952364330739362 - type: nauc_precision_at_5_max value: 31.053243354846977 - type: nauc_precision_at_5_std value: 15.856312752807822 - type: nauc_recall_at_1000_diff1 value: -4.8224253128926975 - type: nauc_recall_at_1000_max value: 21.3989024429911 - type: nauc_recall_at_1000_std value: 39.152234275603604 - type: nauc_recall_at_100_diff1 value: 0.11936808422867201 - type: nauc_recall_at_100_max value: 24.261739241957823 - type: nauc_recall_at_100_std value: 32.62984573938928 - type: nauc_recall_at_10_diff1 value: 7.851256165018388 - type: nauc_recall_at_10_max value: 27.936406600938746 - type: nauc_recall_at_10_std value: 18.683634320636113 - type: nauc_recall_at_1_diff1 value: 22.1926640424872 - type: nauc_recall_at_1_max value: 21.32609279586034 - type: nauc_recall_at_1_std value: 6.566596302915438 - type: nauc_recall_at_20_diff1 value: 6.8107211705182165 - type: nauc_recall_at_20_max value: 28.286284094687787 - type: nauc_recall_at_20_std value: 25.932013268120862 - type: nauc_recall_at_3_diff1 value: 17.04156818427151 - type: nauc_recall_at_3_max value: 28.645439108719216 - type: nauc_recall_at_3_std value: 13.346047828494411 - type: nauc_recall_at_5_diff1 value: 14.906284329771822 - type: nauc_recall_at_5_max value: 30.58628602415921 - type: nauc_recall_at_5_std value: 15.755157478191755 - type: ndcg_at_1 value: 30.2 - type: ndcg_at_10 value: 26.638 - type: ndcg_at_100 value: 37.135 - type: ndcg_at_1000 value: 42.576 - type: 
ndcg_at_20 value: 30.75 - type: ndcg_at_3 value: 24.675 - type: ndcg_at_5 value: 21.836 - type: precision_at_1 value: 30.2 - type: precision_at_10 value: 14.06 - type: precision_at_100 value: 2.904 - type: precision_at_1000 value: 0.42 - type: precision_at_20 value: 9.4 - type: precision_at_3 value: 23.233 - type: precision_at_5 value: 19.439999999999998 - type: recall_at_1 value: 6.128 - type: recall_at_10 value: 28.471999999999998 - type: recall_at_100 value: 58.952000000000005 - type: recall_at_1000 value: 85.137 - type: recall_at_20 value: 38.17 - type: recall_at_3 value: 14.127999999999998 - type: recall_at_5 value: 19.673 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: 20a6d6f312dd54037fe07a32d58e5e168867909d metrics: - type: cosine_pearson value: 86.86608529160739 - type: cosine_spearman value: 82.88625166203383 - type: euclidean_pearson value: 84.15494418856142 - type: euclidean_spearman value: 82.88449294676421 - type: main_score value: 82.88625166203383 - type: manhattan_pearson value: 84.39068623474428 - type: manhattan_spearman value: 82.88065412169463 - type: pearson value: 86.86608529160739 - type: spearman value: 82.88625166203383 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cosine_pearson value: 87.0445014940449 - type: cosine_spearman value: 80.0880365116599 - type: euclidean_pearson value: 83.80250772928852 - type: euclidean_spearman value: 80.0892465260778 - type: main_score value: 80.0880365116599 - type: manhattan_pearson value: 83.96793981929336 - type: manhattan_spearman value: 80.24881789268238 - type: pearson value: 87.0445014940449 - type: spearman value: 80.0880365116599 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cosine_pearson value: 89.33900828959968 - type: cosine_spearman value: 89.68256358526733 - type: euclidean_pearson value: 89.29188708262265 - type: euclidean_spearman value: 89.68204344658601 - type: main_score value: 89.68256358526733 - type: manhattan_pearson value: 89.13996588193149 - type: manhattan_spearman value: 89.61372804425623 - type: pearson value: 89.33900828959968 - type: spearman value: 89.68256358526733 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cosine_pearson value: 86.42029843639123 - type: cosine_spearman value: 85.0707889220723 - type: euclidean_pearson value: 85.75114239552562 - type: euclidean_spearman value: 85.06858160270725 - type: main_score value: 85.0707889220723 - type: manhattan_pearson value: 85.86461900459038 - type: manhattan_spearman value: 85.28671103475605 - type: pearson value: 86.42029843639123 - type: spearman value: 85.0707889220723 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cosine_pearson value: 88.3660081271444 - type: cosine_spearman value: 89.39375083609528 - type: euclidean_pearson value: 89.21818482894895 - type: euclidean_spearman value: 89.39361588875443 - type: main_score value: 89.39375083609528 - type: manhattan_pearson value: 89.53535068014057 - type: manhattan_spearman value: 89.81077130567752 - type: pearson value: 88.3660081271444 - type: spearman value: 89.39375083609528 - task: type: 
STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cosine_pearson value: 85.60708247171874 - type: cosine_spearman value: 87.15234952832193 - type: euclidean_pearson value: 86.21743555548137 - type: euclidean_spearman value: 87.14450217418016 - type: main_score value: 87.15234952832193 - type: manhattan_pearson value: 86.2467748746084 - type: manhattan_spearman value: 87.2197479717654 - type: pearson value: 85.60708247171874 - type: spearman value: 87.15234952832193 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 91.25898556808458 - type: cosine_spearman value: 91.35372390581641 - type: euclidean_pearson value: 91.319520321348 - type: euclidean_spearman value: 91.30821135416925 - type: main_score value: 91.35372390581641 - type: manhattan_pearson value: 91.14800959939069 - type: manhattan_spearman value: 91.09775424245629 - type: pearson value: 91.25898556808458 - type: spearman value: 91.35372390581641 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 67.61637111515797 - type: cosine_spearman value: 68.10379096526697 - type: euclidean_pearson value: 69.2652309491375 - type: euclidean_spearman value: 68.18436357033228 - type: main_score value: 68.10379096526697 - type: manhattan_pearson value: 69.52531340510775 - type: manhattan_spearman value: 68.17874790391862 - type: pearson value: 67.61637111515797 - type: spearman value: 68.10379096526697 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cosine_pearson value: 87.81592853782297 - type: cosine_spearman value: 88.2302550329183 - type: euclidean_pearson value: 88.01165144519526 - type: euclidean_spearman value: 88.23342148890097 - type: main_score value: 88.2302550329183 - type: manhattan_pearson value: 88.148592564938 - type: manhattan_spearman value: 88.49226317320988 - type: pearson value: 87.81592853782297 - type: spearman value: 88.2302550329183 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: main_score value: 89.196009707431 - type: map value: 89.196009707431 - type: mrr value: 97.07198121413808 - type: nAUC_map_diff1 value: -14.066667940115352 - type: nAUC_map_max value: 49.73702475027407 - type: nAUC_map_std value: 64.0986775782592 - type: nAUC_mrr_diff1 value: 21.96846389417319 - type: nAUC_mrr_max value: 86.38341077184032 - type: nAUC_mrr_std value: 75.38945014727746 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: main_score value: 80.08999999999999 - type: map_at_1 value: 63.161 - type: map_at_10 value: 75.163 - type: map_at_100 value: 75.408 - type: map_at_1000 value: 75.409 - type: map_at_20 value: 75.332 - type: map_at_3 value: 71.839 - type: map_at_5 value: 74.32600000000001 - type: mrr_at_1 value: 66.33333333333333 - type: mrr_at_10 value: 75.95978835978836 - type: mrr_at_100 value: 76.15647881281473 - type: mrr_at_1000 value: 76.15736533763744 - type: mrr_at_20 
value: 76.08557368557368 - type: mrr_at_3 value: 73.55555555555556 - type: mrr_at_5 value: 75.4888888888889 - type: nauc_map_at_1000_diff1 value: 77.31229383811176 - type: nauc_map_at_1000_max value: 58.848319058605156 - type: nauc_map_at_1000_std value: -14.290090263454985 - type: nauc_map_at_100_diff1 value: 77.31325400213969 - type: nauc_map_at_100_max value: 58.848885054155275 - type: nauc_map_at_100_std value: -14.285806618869273 - type: nauc_map_at_10_diff1 value: 77.1806705504232 - type: nauc_map_at_10_max value: 59.02905805134415 - type: nauc_map_at_10_std value: -14.132954900037467 - type: nauc_map_at_1_diff1 value: 81.03932970557837 - type: nauc_map_at_1_max value: 49.02073230264529 - type: nauc_map_at_1_std value: -22.977452975845512 - type: nauc_map_at_20_diff1 value: 77.22581364818562 - type: nauc_map_at_20_max value: 58.90740400399768 - type: nauc_map_at_20_std value: -14.245079150986745 - type: nauc_map_at_3_diff1 value: 76.99793243255563 - type: nauc_map_at_3_max value: 54.9930733886623 - type: nauc_map_at_3_std value: -19.297708446082407 - type: nauc_map_at_5_diff1 value: 77.1671608360295 - type: nauc_map_at_5_max value: 57.27757489519526 - type: nauc_map_at_5_std value: -15.446338357667708 - type: nauc_mrr_at_1000_diff1 value: 77.4806080821202 - type: nauc_mrr_at_1000_max value: 60.9213776129792 - type: nauc_mrr_at_1000_std value: -12.139599632228343 - type: nauc_mrr_at_100_diff1 value: 77.48158073865281 - type: nauc_mrr_at_100_max value: 60.9218657185361 - type: nauc_mrr_at_100_std value: -12.13532070453677 - type: nauc_mrr_at_10_diff1 value: 77.32428546014407 - type: nauc_mrr_at_10_max value: 61.018407010343466 - type: nauc_mrr_at_10_std value: -12.143193773309347 - type: nauc_mrr_at_1_diff1 value: 80.99806778887115 - type: nauc_mrr_at_1_max value: 59.17855969530095 - type: nauc_mrr_at_1_std value: -12.30545640831458 - type: nauc_mrr_at_20_diff1 value: 77.3811067653992 - type: nauc_mrr_at_20_max value: 60.9648880366335 - type: nauc_mrr_at_20_std value: -12.124066076541853 - type: nauc_mrr_at_3_diff1 value: 77.31304316321959 - type: nauc_mrr_at_3_max value: 60.75536766404163 - type: nauc_mrr_at_3_std value: -12.997876030849623 - type: nauc_mrr_at_5_diff1 value: 77.12952864141742 - type: nauc_mrr_at_5_max value: 60.995943754968685 - type: nauc_mrr_at_5_std value: -11.353447465605694 - type: nauc_ndcg_at_1000_diff1 value: 76.81788665683746 - type: nauc_ndcg_at_1000_max value: 60.35947755262391 - type: nauc_ndcg_at_1000_std value: -12.884942372460362 - type: nauc_ndcg_at_100_diff1 value: 76.87388230365198 - type: nauc_ndcg_at_100_max value: 60.38813162962434 - type: nauc_ndcg_at_100_std value: -12.64384717800478 - type: nauc_ndcg_at_10_diff1 value: 75.87713506026317 - type: nauc_ndcg_at_10_max value: 61.39356554675667 - type: nauc_ndcg_at_10_std value: -12.144227584144218 - type: nauc_ndcg_at_1_diff1 value: 80.99806778887115 - type: nauc_ndcg_at_1_max value: 59.17855969530095 - type: nauc_ndcg_at_1_std value: -12.30545640831458 - type: nauc_ndcg_at_20_diff1 value: 76.09913944506627 - type: nauc_ndcg_at_20_max value: 61.01644448834147 - type: nauc_ndcg_at_20_std value: -12.456209267623857 - type: nauc_ndcg_at_3_diff1 value: 75.52717946614608 - type: nauc_ndcg_at_3_max value: 58.96433090721983 - type: nauc_ndcg_at_3_std value: -15.849280494339556 - type: nauc_ndcg_at_5_diff1 value: 75.69026981016921 - type: nauc_ndcg_at_5_max value: 58.924044405851326 - type: nauc_ndcg_at_5_std value: -13.182728827923107 - type: nauc_precision_at_1000_diff1 value: -31.634022001609914 - type: 
nauc_precision_at_1000_max value: 31.46271490784504 - type: nauc_precision_at_1000_std value: 60.44801276891442 - type: nauc_precision_at_100_diff1 value: -29.722363469948103 - type: nauc_precision_at_100_max value: 32.05464592020074 - type: nauc_precision_at_100_std value: 60.832570595613554 - type: nauc_precision_at_10_diff1 value: -11.91731376599939 - type: nauc_precision_at_10_max value: 45.43646553157129 - type: nauc_precision_at_10_std value: 52.962408871791276 - type: nauc_precision_at_1_diff1 value: 80.99806778887115 - type: nauc_precision_at_1_max value: 59.17855969530095 - type: nauc_precision_at_1_std value: -12.30545640831458 - type: nauc_precision_at_20_diff1 value: -18.43293701721667 - type: nauc_precision_at_20_max value: 39.53434874203934 - type: nauc_precision_at_20_std value: 53.6291982468461 - type: nauc_precision_at_3_diff1 value: 30.84789043003892 - type: nauc_precision_at_3_max value: 55.660727758110376 - type: nauc_precision_at_3_std value: 17.87243920840355 - type: nauc_precision_at_5_diff1 value: 4.099395181445625 - type: nauc_precision_at_5_max value: 50.346770968709386 - type: nauc_precision_at_5_std value: 44.66722483255029 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 100 - type: nauc_recall_at_100_max value: 72.2222222222207 - type: nauc_recall_at_100_std value: 86.92810457516407 - type: nauc_recall_at_10_diff1 value: 62.18887555022005 - type: nauc_recall_at_10_max value: 75.14339068960916 - type: nauc_recall_at_10_std value: -1.4912631719357108 - type: nauc_recall_at_1_diff1 value: 81.03932970557837 - type: nauc_recall_at_1_max value: 49.02073230264529 - type: nauc_recall_at_1_std value: -22.977452975845512 - type: nauc_recall_at_20_diff1 value: 59.27414444038499 - type: nauc_recall_at_20_max value: 76.32241302318047 - type: nauc_recall_at_20_std value: -0.8322169447488666 - type: nauc_recall_at_3_diff1 value: 69.58783002593157 - type: nauc_recall_at_3_max value: 55.89660919896563 - type: nauc_recall_at_3_std value: -21.183005510917862 - type: nauc_recall_at_5_diff1 value: 65.53660499878802 - type: nauc_recall_at_5_max value: 58.218018535135805 - type: nauc_recall_at_5_std value: -8.328952210032455 - type: ndcg_at_1 value: 66.333 - type: ndcg_at_10 value: 80.08999999999999 - type: ndcg_at_100 value: 81.24900000000001 - type: ndcg_at_1000 value: 81.28800000000001 - type: ndcg_at_20 value: 80.625 - type: ndcg_at_3 value: 74.98700000000001 - type: ndcg_at_5 value: 78.553 - type: precision_at_1 value: 66.333 - type: precision_at_10 value: 10.667 - type: precision_at_100 value: 1.127 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_20 value: 5.45 - type: precision_at_3 value: 29.555999999999997 - type: precision_at_5 value: 20.133000000000003 - type: recall_at_1 value: 63.161 - type: recall_at_10 value: 94.167 - type: recall_at_100 value: 99.667 - type: recall_at_1000 value: 100 - type: recall_at_20 value: 96.167 - type: recall_at_3 value: 80.972 - type: recall_at_5 value: 89.90599999999999 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cosine_accuracy value: 99.81881188118813 - type: cosine_accuracy_threshold value: 85.55081486701965 - type: cosine_ap value: 96.0359661816236 - type: cosine_f1 value: 90.6584992343032 - type: cosine_f1_threshold value: 84.82859134674072 - 
type: cosine_precision value: 92.59645464025026 - type: cosine_recall value: 88.8 - type: dot_accuracy value: 99.81881188118813 - type: dot_accuracy_threshold value: 84.91908311843872 - type: dot_ap value: 96.05740121094365 - type: dot_f1 value: 90.81885856079404 - type: dot_f1_threshold value: 83.84919166564941 - type: dot_precision value: 90.14778325123153 - type: dot_recall value: 91.5 - type: euclidean_accuracy value: 99.82079207920792 - type: euclidean_accuracy_threshold value: 54.49706315994263 - type: euclidean_ap value: 96.03223527068818 - type: euclidean_f1 value: 90.72270630445925 - type: euclidean_f1_threshold value: 54.49706315994263 - type: euclidean_precision value: 93.05993690851734 - type: euclidean_recall value: 88.5 - type: main_score value: 96.32671902439806 - type: manhattan_accuracy value: 99.83267326732673 - type: manhattan_accuracy_threshold value: 3818.192672729492 - type: manhattan_ap value: 96.32671902439806 - type: manhattan_f1 value: 91.52032112393378 - type: manhattan_f1_threshold value: 3818.192672729492 - type: manhattan_precision value: 91.8429003021148 - type: manhattan_recall value: 91.2 - type: max_ap value: 96.32671902439806 - type: max_f1 value: 91.52032112393378 - type: max_precision value: 93.05993690851734 - type: max_recall value: 91.5 - type: similarity_accuracy value: 99.81881188118813 - type: similarity_accuracy_threshold value: 85.55081486701965 - type: similarity_ap value: 96.0359661816236 - type: similarity_f1 value: 90.6584992343032 - type: similarity_f1_threshold value: 84.82859134674072 - type: similarity_precision value: 92.59645464025026 - type: similarity_recall value: 88.8 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: main_score value: 80.28558559137414 - type: v_measure value: 80.28558559137414 - type: v_measure_std value: 2.795276520287584 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: main_score value: 49.57135582416209 - type: v_measure value: 49.57135582416209 - type: v_measure_std value: 1.6414135468423754 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: main_score value: 55.253002583598644 - type: map value: 55.253002583598644 - type: mrr value: 56.24172396231219 - type: nAUC_map_diff1 value: 40.00053248203427 - type: nAUC_map_max value: 10.05441740585869 - type: nAUC_map_std value: 8.227169286387552 - type: nAUC_mrr_diff1 value: 40.250446264233744 - type: nAUC_mrr_max value: 10.586310195339053 - type: nAUC_mrr_std value: 8.47326494370076 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cosine_pearson value: 31.19874648747059 - type: cosine_spearman value: 31.493550648844863 - type: dot_pearson value: 31.157847680289407 - type: dot_spearman value: 31.575299712180538 - type: main_score value: 31.493550648844863 - type: pearson value: 31.19874648747059 - type: spearman value: 31.493550648844863 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: 
bb9466bac8153a0349341eb1b22e06409e78ef4e metrics: - type: main_score value: 85.983 - type: map_at_1 value: 0.247 - type: map_at_10 value: 2.177 - type: map_at_100 value: 14.804 - type: map_at_1000 value: 37.045 - type: map_at_20 value: 4.12 - type: map_at_3 value: 0.7000000000000001 - type: map_at_5 value: 1.1320000000000001 - type: mrr_at_1 value: 96 - type: mrr_at_10 value: 98 - type: mrr_at_100 value: 98 - type: mrr_at_1000 value: 98 - type: mrr_at_20 value: 98 - type: mrr_at_3 value: 98 - type: mrr_at_5 value: 98 - type: nauc_map_at_1000_diff1 value: -0.9165125200337213 - type: nauc_map_at_1000_max value: 40.260117798042764 - type: nauc_map_at_1000_std value: 71.72789335831554 - type: nauc_map_at_100_diff1 value: 20.493827311583953 - type: nauc_map_at_100_max value: 21.005742079276462 - type: nauc_map_at_100_std value: 62.53815607831659 - type: nauc_map_at_10_diff1 value: 31.289297684528215 - type: nauc_map_at_10_max value: 7.86554294370268 - type: nauc_map_at_10_std value: 37.26191657133897 - type: nauc_map_at_1_diff1 value: 25.57568148849456 - type: nauc_map_at_1_max value: -5.9767435623941445 - type: nauc_map_at_1_std value: 30.849871717506755 - type: nauc_map_at_20_diff1 value: 30.896018204532087 - type: nauc_map_at_20_max value: 8.667077299744314 - type: nauc_map_at_20_std value: 41.512687168412924 - type: nauc_map_at_3_diff1 value: 29.44724521006598 - type: nauc_map_at_3_max value: 1.597496889532064 - type: nauc_map_at_3_std value: 32.25013773854697 - type: nauc_map_at_5_diff1 value: 27.387036605618825 - type: nauc_map_at_5_max value: 5.402983746211454 - type: nauc_map_at_5_std value: 33.940523962472184 - type: nauc_mrr_at_1000_diff1 value: -14.122315592903503 - type: nauc_mrr_at_1000_max value: 33.84687208216605 - type: nauc_mrr_at_1000_std value: 86.11111111111092 - type: nauc_mrr_at_100_diff1 value: -14.122315592903503 - type: nauc_mrr_at_100_max value: 33.84687208216605 - type: nauc_mrr_at_100_std value: 86.11111111111092 - type: nauc_mrr_at_10_diff1 value: -14.122315592903503 - type: nauc_mrr_at_10_max value: 33.84687208216605 - type: nauc_mrr_at_10_std value: 86.11111111111092 - type: nauc_mrr_at_1_diff1 value: -14.122315592903831 - type: nauc_mrr_at_1_max value: 33.84687208216637 - type: nauc_mrr_at_1_std value: 86.11111111111124 - type: nauc_mrr_at_20_diff1 value: -14.122315592903503 - type: nauc_mrr_at_20_max value: 33.84687208216605 - type: nauc_mrr_at_20_std value: 86.11111111111092 - type: nauc_mrr_at_3_diff1 value: -14.122315592903503 - type: nauc_mrr_at_3_max value: 33.84687208216605 - type: nauc_mrr_at_3_std value: 86.11111111111092 - type: nauc_mrr_at_5_diff1 value: -14.122315592903503 - type: nauc_mrr_at_5_max value: 33.84687208216605 - type: nauc_mrr_at_5_std value: 86.11111111111092 - type: nauc_ndcg_at_1000_diff1 value: 8.745907669561928 - type: nauc_ndcg_at_1000_max value: 45.43307237994533 - type: nauc_ndcg_at_1000_std value: 74.93357447176336 - type: nauc_ndcg_at_100_diff1 value: -3.9719350773353765 - type: nauc_ndcg_at_100_max value: 44.43705332397461 - type: nauc_ndcg_at_100_std value: 61.59493812371758 - type: nauc_ndcg_at_10_diff1 value: 15.230915878367348 - type: nauc_ndcg_at_10_max value: 48.332840970836635 - type: nauc_ndcg_at_10_std value: 46.888785065125774 - type: nauc_ndcg_at_1_diff1 value: 13.219732337379442 - type: nauc_ndcg_at_1_max value: 45.19919078742603 - type: nauc_ndcg_at_1_std value: 64.68253968253977 - type: nauc_ndcg_at_20_diff1 value: 12.479648691964865 - type: nauc_ndcg_at_20_max value: 48.76688248450331 - type: nauc_ndcg_at_20_std 
value: 51.450399755887545 - type: nauc_ndcg_at_3_diff1 value: 6.165414201871464 - type: nauc_ndcg_at_3_max value: 45.089689347691035 - type: nauc_ndcg_at_3_std value: 41.08249161845213 - type: nauc_ndcg_at_5_diff1 value: 7.411245806844721 - type: nauc_ndcg_at_5_max value: 47.818748093538076 - type: nauc_ndcg_at_5_std value: 45.907685763676575 - type: nauc_precision_at_1000_diff1 value: -30.574290219847345 - type: nauc_precision_at_1000_max value: 32.56926126118719 - type: nauc_precision_at_1000_std value: 14.584504392628874 - type: nauc_precision_at_100_diff1 value: -10.199740234718847 - type: nauc_precision_at_100_max value: 41.0213226769777 - type: nauc_precision_at_100_std value: 56.975760776771324 - type: nauc_precision_at_10_diff1 value: 7.865792689701161 - type: nauc_precision_at_10_max value: 52.00432275201737 - type: nauc_precision_at_10_std value: 43.89512276413724 - type: nauc_precision_at_1_diff1 value: -14.122315592903831 - type: nauc_precision_at_1_max value: 33.84687208216637 - type: nauc_precision_at_1_std value: 86.11111111111124 - type: nauc_precision_at_20_diff1 value: 5.481424191880084 - type: nauc_precision_at_20_max value: 46.86629331792725 - type: nauc_precision_at_20_std value: 49.245692667517496 - type: nauc_precision_at_3_diff1 value: -5.870408807869163 - type: nauc_precision_at_3_max value: 48.73657612128875 - type: nauc_precision_at_3_std value: 41.15152062088262 - type: nauc_precision_at_5_diff1 value: -4.550610529125413 - type: nauc_precision_at_5_max value: 60.390115878205386 - type: nauc_precision_at_5_std value: 44.16494295055696 - type: nauc_recall_at_1000_diff1 value: 8.047794367079034 - type: nauc_recall_at_1000_max value: 37.07551482870489 - type: nauc_recall_at_1000_std value: 66.20862163364201 - type: nauc_recall_at_100_diff1 value: 25.08104923597475 - type: nauc_recall_at_100_max value: 9.971294642165734 - type: nauc_recall_at_100_std value: 51.737814074891254 - type: nauc_recall_at_10_diff1 value: 32.33148478369628 - type: nauc_recall_at_10_max value: 1.3767192150014917 - type: nauc_recall_at_10_std value: 30.801926742876308 - type: nauc_recall_at_1_diff1 value: 25.57568148849456 - type: nauc_recall_at_1_max value: -5.9767435623941445 - type: nauc_recall_at_1_std value: 30.849871717506755 - type: nauc_recall_at_20_diff1 value: 31.716580022934654 - type: nauc_recall_at_20_max value: -0.1281270579464631 - type: nauc_recall_at_20_std value: 33.76185294993676 - type: nauc_recall_at_3_diff1 value: 29.758810004388348 - type: nauc_recall_at_3_max value: -1.9442985017191816 - type: nauc_recall_at_3_std value: 27.45550076962206 - type: nauc_recall_at_5_diff1 value: 27.047710181576672 - type: nauc_recall_at_5_max value: 1.5237000700880248 - type: nauc_recall_at_5_std value: 28.235297950159698 - type: ndcg_at_1 value: 94 - type: ndcg_at_10 value: 85.983 - type: ndcg_at_100 value: 69.195 - type: ndcg_at_1000 value: 62.541000000000004 - type: ndcg_at_20 value: 83.405 - type: ndcg_at_3 value: 89.98899999999999 - type: ndcg_at_5 value: 87.905 - type: precision_at_1 value: 96 - type: precision_at_10 value: 89.4 - type: precision_at_100 value: 71.54 - type: precision_at_1000 value: 27.594 - type: precision_at_20 value: 87.2 - type: precision_at_3 value: 92.667 - type: precision_at_5 value: 90.8 - type: recall_at_1 value: 0.247 - type: recall_at_10 value: 2.315 - type: recall_at_100 value: 17.574 - type: recall_at_1000 value: 59.336999999999996 - type: recall_at_20 value: 4.491 - type: recall_at_3 value: 0.7250000000000001 - type: recall_at_5 value: 1.1820000000000002 
- task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: main_score value: 29.944 - type: map_at_1 value: 3.064 - type: map_at_10 value: 11.501999999999999 - type: map_at_100 value: 18.736 - type: map_at_1000 value: 20.333000000000002 - type: map_at_20 value: 14.057 - type: map_at_3 value: 6.300999999999999 - type: map_at_5 value: 8.463 - type: mrr_at_1 value: 44.89795918367347 - type: mrr_at_10 value: 58.41188856494979 - type: mrr_at_100 value: 58.93964266413245 - type: mrr_at_1000 value: 58.93964266413245 - type: mrr_at_20 value: 58.767485349118 - type: mrr_at_3 value: 54.42176870748299 - type: mrr_at_5 value: 56.666666666666664 - type: nauc_map_at_1000_diff1 value: 11.478593385608479 - type: nauc_map_at_1000_max value: 10.309889845044324 - type: nauc_map_at_1000_std value: 21.16721939940238 - type: nauc_map_at_100_diff1 value: 11.570438543562418 - type: nauc_map_at_100_max value: 8.426183648064834 - type: nauc_map_at_100_std value: 18.56231985033613 - type: nauc_map_at_10_diff1 value: 22.37735506247481 - type: nauc_map_at_10_max value: 5.455946239060806 - type: nauc_map_at_10_std value: -4.2848826518388154 - type: nauc_map_at_1_diff1 value: 27.853645380676824 - type: nauc_map_at_1_max value: 7.30739948053113 - type: nauc_map_at_1_std value: -0.2773663157814586 - type: nauc_map_at_20_diff1 value: 14.724669779924648 - type: nauc_map_at_20_max value: 10.12882779173533 - type: nauc_map_at_20_std value: 4.4803777672120875 - type: nauc_map_at_3_diff1 value: 31.891173385921263 - type: nauc_map_at_3_max value: 4.889652271827218 - type: nauc_map_at_3_std value: -9.477460238651643 - type: nauc_map_at_5_diff1 value: 31.489012040465003 - type: nauc_map_at_5_max value: 1.7330092417337482 - type: nauc_map_at_5_std value: -8.137018608469637 - type: nauc_mrr_at_1000_diff1 value: 24.411522237082416 - type: nauc_mrr_at_1000_max value: 11.286971076556688 - type: nauc_mrr_at_1000_std value: 23.443174210894043 - type: nauc_mrr_at_100_diff1 value: 24.411522237082416 - type: nauc_mrr_at_100_max value: 11.286971076556688 - type: nauc_mrr_at_100_std value: 23.443174210894043 - type: nauc_mrr_at_10_diff1 value: 23.948152308265186 - type: nauc_mrr_at_10_max value: 12.22420979621155 - type: nauc_mrr_at_10_std value: 23.557939024705544 - type: nauc_mrr_at_1_diff1 value: 17.902334894536107 - type: nauc_mrr_at_1_max value: 17.36969662861018 - type: nauc_mrr_at_1_std value: 19.425714969048734 - type: nauc_mrr_at_20_diff1 value: 24.635893795899797 - type: nauc_mrr_at_20_max value: 11.330541067194913 - type: nauc_mrr_at_20_std value: 23.74518583400233 - type: nauc_mrr_at_3_diff1 value: 25.045536328282587 - type: nauc_mrr_at_3_max value: 7.497967004732733 - type: nauc_mrr_at_3_std value: 24.167153007320078 - type: nauc_mrr_at_5_diff1 value: 24.328479930592454 - type: nauc_mrr_at_5_max value: 10.037126854938336 - type: nauc_mrr_at_5_std value: 25.236208055346136 - type: nauc_ndcg_at_1000_diff1 value: 15.555347444667389 - type: nauc_ndcg_at_1000_max value: 13.356591700655718 - type: nauc_ndcg_at_1000_std value: 42.42395845935052 - type: nauc_ndcg_at_100_diff1 value: 13.110526060413708 - type: nauc_ndcg_at_100_max value: 3.140006440162515 - type: nauc_ndcg_at_100_std value: 39.02733288398033 - type: nauc_ndcg_at_10_diff1 value: 20.68853369009725 - type: nauc_ndcg_at_10_max value: 2.435389817058852 - type: nauc_ndcg_at_10_std value: 10.038202768784316 - type: nauc_ndcg_at_1_diff1 value: 20.17287594582385 - 
type: nauc_ndcg_at_1_max value: 12.487205168273196 - type: nauc_ndcg_at_1_std value: 20.639827614373075 - type: nauc_ndcg_at_20_diff1 value: 16.987577348502985 - type: nauc_ndcg_at_20_max value: 2.9978717644469266 - type: nauc_ndcg_at_20_std value: 13.015690866750354 - type: nauc_ndcg_at_3_diff1 value: 32.392223079245575 - type: nauc_ndcg_at_3_max value: 1.587587110582544 - type: nauc_ndcg_at_3_std value: 12.850592473446609 - type: nauc_ndcg_at_5_diff1 value: 32.80244517369626 - type: nauc_ndcg_at_5_max value: 5.8939933777508084 - type: nauc_ndcg_at_5_std value: 15.779687411463414 - type: nauc_precision_at_1000_diff1 value: -14.314031720452537 - type: nauc_precision_at_1000_max value: 32.87886666567266 - type: nauc_precision_at_1000_std value: 21.49347046886851 - type: nauc_precision_at_100_diff1 value: -9.4034008613839 - type: nauc_precision_at_100_max value: 16.784075123309645 - type: nauc_precision_at_100_std value: 73.14688535393604 - type: nauc_precision_at_10_diff1 value: 6.855101404043058 - type: nauc_precision_at_10_max value: 6.52491228645612 - type: nauc_precision_at_10_std value: 16.104602266016744 - type: nauc_precision_at_1_diff1 value: 17.902334894536107 - type: nauc_precision_at_1_max value: 17.36969662861018 - type: nauc_precision_at_1_std value: 19.425714969048734 - type: nauc_precision_at_20_diff1 value: -5.337534613602212 - type: nauc_precision_at_20_max value: 17.722925454767218 - type: nauc_precision_at_20_std value: 34.26680462132849 - type: nauc_precision_at_3_diff1 value: 31.054623397809255 - type: nauc_precision_at_3_max value: -0.92038600946826 - type: nauc_precision_at_3_std value: 8.326997076862916 - type: nauc_precision_at_5_diff1 value: 29.784942296920462 - type: nauc_precision_at_5_max value: 6.337469263434779 - type: nauc_precision_at_5_std value: 12.789597196020974 - type: nauc_recall_at_1000_diff1 value: -3.8177981862041364 - type: nauc_recall_at_1000_max value: 14.206064332229163 - type: nauc_recall_at_1000_std value: 74.18853420771269 - type: nauc_recall_at_100_diff1 value: 0.7677996771461106 - type: nauc_recall_at_100_max value: -4.139924106878441 - type: nauc_recall_at_100_std value: 48.319930706362896 - type: nauc_recall_at_10_diff1 value: 12.038835537494322 - type: nauc_recall_at_10_max value: -2.0498983557854418 - type: nauc_recall_at_10_std value: -2.0339180690854493 - type: nauc_recall_at_1_diff1 value: 27.853645380676824 - type: nauc_recall_at_1_max value: 7.30739948053113 - type: nauc_recall_at_1_std value: -0.2773663157814586 - type: nauc_recall_at_20_diff1 value: 0.7907893667756708 - type: nauc_recall_at_20_max value: 0.8795499810558195 - type: nauc_recall_at_20_std value: 11.512483291688282 - type: nauc_recall_at_3_diff1 value: 33.19440392639576 - type: nauc_recall_at_3_max value: -1.5494237697432613 - type: nauc_recall_at_3_std value: -8.560408808376984 - type: nauc_recall_at_5_diff1 value: 27.42193873870941 - type: nauc_recall_at_5_max value: -4.74350293281128 - type: nauc_recall_at_5_std value: -7.618060131179654 - type: ndcg_at_1 value: 42.857 - type: ndcg_at_10 value: 29.944 - type: ndcg_at_100 value: 42.624 - type: ndcg_at_1000 value: 53.384 - type: ndcg_at_20 value: 30.135 - type: ndcg_at_3 value: 34.847 - type: ndcg_at_5 value: 32.573 - type: precision_at_1 value: 44.897999999999996 - type: precision_at_10 value: 25.306 - type: precision_at_100 value: 8.694 - type: precision_at_1000 value: 1.616 - type: precision_at_20 value: 19.082 - type: precision_at_3 value: 34.014 - type: precision_at_5 value: 31.019999999999996 - type: 
recall_at_1 value: 3.064 - type: recall_at_10 value: 17.849999999999998 - type: recall_at_100 value: 53.217999999999996 - type: recall_at_1000 value: 87.095 - type: recall_at_20 value: 26.111 - type: recall_at_3 value: 7.383000000000001 - type: recall_at_5 value: 11.434 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 88.759765625 - type: ap value: 36.49152357863017 - type: ap_weighted value: 36.49152357863017 - type: f1 value: 74.4692714448641 - type: f1_weighted value: 90.54372649306606 - type: main_score value: 88.759765625 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 74.8443689869836 - type: f1 value: 75.1139662898148 - type: f1_weighted value: 74.7369003946243 - type: main_score value: 74.8443689869836 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: main_score value: 61.42918790942448 - type: v_measure value: 61.42918790942448 - type: v_measure_std value: 1.0156550098843082 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cosine_accuracy value: 88.22197055492639 - type: cosine_accuracy_threshold value: 83.30042362213135 - type: cosine_ap value: 80.57754959194938 - type: cosine_f1 value: 73.70579190158894 - type: cosine_f1_threshold value: 81.04978799819946 - type: cosine_precision value: 71.64922770303936 - type: cosine_recall value: 75.8839050131926 - type: dot_accuracy value: 88.23985217857782 - type: dot_accuracy_threshold value: 83.31039547920227 - type: dot_ap value: 80.57533213448181 - type: dot_f1 value: 73.61309601143302 - type: dot_f1_threshold value: 81.33968114852905 - type: dot_precision value: 72.51087791144101 - type: dot_recall value: 74.74934036939314 - type: euclidean_accuracy value: 88.22197055492639 - type: euclidean_accuracy_threshold value: 58.290231227874756 - type: euclidean_ap value: 80.57982723880139 - type: euclidean_f1 value: 73.63426519620417 - type: euclidean_f1_threshold value: 61.55576705932617 - type: euclidean_precision value: 71.63173652694611 - type: euclidean_recall value: 75.75197889182058 - type: main_score value: 80.57982723880139 - type: manhattan_accuracy value: 88.14448351910353 - type: manhattan_accuracy_threshold value: 3907.2471618652344 - type: manhattan_ap value: 80.3538079655539 - type: manhattan_f1 value: 73.40466675261054 - type: manhattan_f1_threshold value: 4103.794097900391 - type: manhattan_precision value: 71.76707839677337 - type: manhattan_recall value: 75.11873350923483 - type: max_ap value: 80.57982723880139 - type: max_f1 value: 73.70579190158894 - type: max_precision value: 72.51087791144101 - type: max_recall value: 75.8839050131926 - type: similarity_accuracy value: 88.22197055492639 - type: similarity_accuracy_threshold value: 83.30042362213135 - type: similarity_ap value: 80.57754959194938 - type: similarity_f1 value: 73.70579190158894 - type: similarity_f1_threshold value: 81.04978799819946 - type: similarity_precision value: 
71.64922770303936 - type: similarity_recall value: 75.8839050131926 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cosine_accuracy value: 89.88628866379477 - type: cosine_accuracy_threshold value: 80.8050274848938 - type: cosine_ap value: 87.57594591596816 - type: cosine_f1 value: 80.0812257707218 - type: cosine_f1_threshold value: 77.990061044693 - type: cosine_precision value: 76.93126197063205 - type: cosine_recall value: 83.50015398829689 - type: dot_accuracy value: 89.87852679784221 - type: dot_accuracy_threshold value: 80.84419965744019 - type: dot_ap value: 87.56136742222151 - type: dot_f1 value: 80.05898617511521 - type: dot_f1_threshold value: 77.92385816574097 - type: dot_precision value: 76.80554573106035 - type: dot_recall value: 83.60024638127503 - type: euclidean_accuracy value: 89.86882446540149 - type: euclidean_accuracy_threshold value: 62.08193898200989 - type: euclidean_ap value: 87.57517549192228 - type: euclidean_f1 value: 80.05286925872892 - type: euclidean_f1_threshold value: 66.65036082267761 - type: euclidean_precision value: 76.51063232507545 - type: euclidean_recall value: 83.93902063443178 - type: main_score value: 87.64162614197194 - type: manhattan_accuracy value: 89.8959909962355 - type: manhattan_accuracy_threshold value: 4176.108169555664 - type: manhattan_ap value: 87.64162614197194 - type: manhattan_f1 value: 80.17116279069768 - type: manhattan_f1_threshold value: 4433.153533935547 - type: manhattan_precision value: 77.57615035644848 - type: manhattan_recall value: 82.94579611949491 - type: max_ap value: 87.64162614197194 - type: max_f1 value: 80.17116279069768 - type: max_precision value: 77.57615035644848 - type: max_recall value: 83.93902063443178 - type: similarity_accuracy value: 89.88628866379477 - type: similarity_accuracy_threshold value: 80.8050274848938 - type: similarity_ap value: 87.57594591596816 - type: similarity_f1 value: 80.0812257707218 - type: similarity_f1_threshold value: 77.990061044693 - type: similarity_precision value: 76.93126197063205 - type: similarity_recall value: 83.50015398829689
---

# Updates

New open-source models and the to-do list will be listed on https://github.com/DunZhang/Stella/blob/main/news_and_todo.md.

You can also find these models on my [homepage](https://huggingface.co/infgrad).

# Introduction

The models are trained based on `Alibaba-NLP/gte-large-en-v1.5` and `Alibaba-NLP/gte-Qwen2-1.5B-instruct`. Thanks for their contributions!

**We simplify prompt usage by providing two prompts that cover most general tasks: one for s2p (sentence-to-passage) and one for s2s (sentence-to-sentence).**

Prompt for the s2p task (e.g. retrieval):

```text
Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: {query}
```

Prompt for the s2s task (e.g. semantic textual similarity):

```text
Instruct: Retrieve semantically similar text.\nQuery: {query}
```

The models are trained with [MRL](https://arxiv.org/abs/2205.13147), so they offer multiple output dimensions: 512, 768, 1024, 2048, 4096, 6144 and 8192. The higher the dimension, the better the performance. **Generally speaking, 1024d is good enough.** The MTEB score of 1024d is only 0.001 lower than that of 8192d.

# Model directory structure

The model directory structure is very simple: it is a standard SentenceTransformer directory **with a series of `2_Dense_{dims}` folders**, where `dims` represents the final vector dimension. For example, the `2_Dense_256` folder stores the Linear weights that convert vectors to 256 dimensions.
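To make the dimension switch concrete, here is a minimal sketch of re-pointing the Dense module at a different `2_Dense_{dims}` folder by editing `modules.json`. It assumes the standard sentence-transformers module layout, that the Dense entry's `path` carries the dimension suffix (e.g. `2_Dense_1024`), and a hypothetical local clone at `./stella_en_1.5B_v5`; verify all of these against your own checkout:

```python
import json
from pathlib import Path

model_dir = Path("./stella_en_1.5B_v5")  # hypothetical path to your local clone
target_dim = 256  # must match one of the shipped 2_Dense_{dims} folders

modules_path = model_dir / "modules.json"
modules = json.loads(modules_path.read_text())

# Re-point the final Dense module (e.g. 2_Dense_1024) at the desired dimension.
for module in modules:
    if module.get("path", "").startswith("2_Dense"):
        module["path"] = f"2_Dense_{target_dim}"

modules_path.write_text(json.dumps(modules, indent=2))
```

After saving, loading the directory with `SentenceTransformer` should produce 256-dimensional vectors.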
Please refer to the following chapters for specific instructions on how to use the model.

# Usage

You can use the `sentence-transformers` or `transformers` library to encode text.

## Sentence Transformers

```python
from sentence_transformers import SentenceTransformer

# This model supports two prompts: "s2p_query" and "s2s_query" for sentence-to-passage and sentence-to-sentence tasks, respectively.
# They are defined in `config_sentence_transformers.json`
query_prompt_name = "s2p_query"
queries = [
    "What are some ways to reduce stress?",
    "What are the benefits of drinking green tea?",
]
# docs do not need any prompts
docs = [
    "There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
    "Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]

# The default dimension is 1024. If you need other dimensions, please clone the model and modify
# `modules.json` to replace `2_Dense_1024` with another dimension, e.g. `2_Dense_256` or `2_Dense_8192`!
model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True).cuda()
query_embeddings = model.encode(queries, prompt_name=query_prompt_name)
doc_embeddings = model.encode(docs)
print(query_embeddings.shape, doc_embeddings.shape)
# (2, 1024) (2, 1024)

similarities = model.similarity(query_embeddings, doc_embeddings)
print(similarities)
# tensor([[0.8179, 0.2958],
#         [0.3194, 0.7854]])
```

## Transformers

```python
import os

import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.preprocessing import normalize

query_prompt = "Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: "
queries = [
    "What are some ways to reduce stress?",
    "What are the benefits of drinking green tea?",
]
queries = [query_prompt + query for query in queries]
# docs do not need any prompts
docs = [
    "There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
    "Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]

# The path of your model after cloning it
model_dir = "{Your MODEL_PATH}"

vector_dim = 1024
vector_linear_directory = f"2_Dense_{vector_dim}"
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).cuda().eval()
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
# Load the Linear head that maps hidden states to the chosen vector dimension
vector_linear = torch.nn.Linear(in_features=model.config.hidden_size, out_features=vector_dim)
vector_linear_dict = {
    k.replace("linear.", ""): v
    for k, v in torch.load(os.path.join(model_dir, f"{vector_linear_directory}/pytorch_model.bin")).items()
}
vector_linear.load_state_dict(vector_linear_dict)
vector_linear.cuda()

# Embed the queries: mean pooling over non-padding tokens, then the Dense head, then L2 normalization
with torch.no_grad():
    input_data = tokenizer(queries, padding="longest", truncation=True, max_length=512, return_tensors="pt")
    input_data = {k: v.cuda() for k, v in input_data.items()}
    attention_mask = input_data["attention_mask"]
    last_hidden_state = model(**input_data)[0]
    last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
    query_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
    query_vectors = normalize(vector_linear(query_vectors).cpu().numpy())

# Embed the documents
with torch.no_grad():
    input_data = tokenizer(docs, padding="longest", truncation=True, max_length=512, return_tensors="pt")
    input_data = {k: v.cuda() for k, v in input_data.items()}
    attention_mask = input_data["attention_mask"]
    last_hidden_state = model(**input_data)[0]
    last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
    docs_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
    docs_vectors = normalize(vector_linear(docs_vectors).cpu().numpy())

print(query_vectors.shape, docs_vectors.shape)
# (2, 1024) (2, 1024)

similarities = query_vectors @ docs_vectors.T
print(similarities)
# [[0.8178789  0.2958377 ]
#  [0.31938642 0.7853526 ]]
```

# FAQ

Q: What are the details of training?

A: The training method and datasets will be released in the future (specific time unknown; they may be provided in a paper).

Q: How do I choose a suitable prompt for my own task?

A: In most cases, please use the s2p and s2s prompts. These two prompts account for the vast majority of the training data.

Q: How do I reproduce the MTEB results?

A: Please use the evaluation scripts in `Alibaba-NLP/gte-Qwen2-1.5B-instruct` or `intfloat/e5-mistral-7b-instruct`.

Q: Why does each dimension have its own linear weight?

A: MRL has multiple training methods; we chose this one because it has the best performance.

Q: What is the sequence length of the models?

A: 512 is recommended. In our experiments, almost all models perform poorly on specialized long-text retrieval datasets. Besides, the model is trained on sequences of length 512, so longer inputs may not be well optimized; this may be an area for future optimization.

If you have any questions, please start a discussion in the community tab.
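The usage examples above both follow the s2p path. As a complement, here is a minimal sketch of the s2s (semantic similarity) path, assuming the `s2s_query` prompt name defined in `config_sentence_transformers.json`; the sentences are placeholders:

```python
from sentence_transformers import SentenceTransformer

# For symmetric tasks, both sides of the comparison use the s2s prompt.
model = SentenceTransformer("dunzhang/stella_en_1.5B_v5", trust_remote_code=True).cuda()
sentences = [
    "The weather is lovely today.",
    "It is sunny outside right now.",
]
embeddings = model.encode(sentences, prompt_name="s2s_query")
print(model.similarity(embeddings, embeddings))
```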
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
Mihaiii/test25
Mihaiii
sentence-similarity
[ "sentence-transformers", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "bge", "mteb", "mergekit", "merge", "base_model:Mihaiii/Wartortle", "base_model:merge:Mihaiii/Wartortle", "base_model:TaylorAI/bge-micro-v2", "base_model:merge:TaylorAI/bge-micro-v2", "license:mit", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,714
1,714
11
0
--- base_model: - Mihaiii/Wartortle - TaylorAI/bge-micro-v2 library_name: sentence-transformers license: mit pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - bge - mteb - mergekit - merge model-index: - name: Giratina results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 69.56716417910448 - type: ap value: 31.399435128856624 - type: f1 value: 63.139089415537256 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 74.73525000000001 - type: ap value: 69.2327764533514 - type: f1 value: 74.61617659775962 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 35.356 - type: f1 value: 35.165109893437204 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 17.141000000000002 - type: map_at_10 value: 28.292 - type: map_at_100 value: 29.532000000000004 - type: map_at_1000 value: 29.580000000000002 - type: map_at_20 value: 29.048000000000002 - type: map_at_3 value: 24.277 - type: map_at_5 value: 26.339000000000002 - type: mrr_at_1 value: 17.781 - type: mrr_at_10 value: 28.534 - type: mrr_at_100 value: 29.779 - type: mrr_at_1000 value: 29.826999999999998 - type: mrr_at_20 value: 29.293000000000003 - type: mrr_at_3 value: 24.490000000000002 - type: mrr_at_5 value: 26.564 - type: ndcg_at_1 value: 17.141000000000002 - type: ndcg_at_10 value: 35.004000000000005 - type: ndcg_at_100 value: 41.056 - type: ndcg_at_1000 value: 42.388 - type: ndcg_at_20 value: 37.721 - type: ndcg_at_3 value: 26.592 - type: ndcg_at_5 value: 30.294999999999998 - type: precision_at_1 value: 17.141000000000002 - type: precision_at_10 value: 5.676 - type: precision_at_100 value: 0.851 - type: precision_at_1000 value: 0.096 - type: precision_at_20 value: 3.3709999999999996 - type: precision_at_3 value: 11.094999999999999 - type: precision_at_5 value: 8.450000000000001 - type: recall_at_1 value: 17.141000000000002 - type: recall_at_10 value: 56.757000000000005 - type: recall_at_100 value: 85.064 - type: recall_at_1000 value: 95.661 - type: recall_at_20 value: 67.425 - type: recall_at_3 value: 33.286 - type: recall_at_5 value: 42.248000000000005 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 37.86211319797047 - type: v_measures value: - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 
0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497
0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 
0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 
0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 
0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 
0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 
0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 
- 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 
0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - 0.33158313059028166 - 0.37901912420270933 - 0.368350193636622 - 0.3710910416810123 - 0.33682934300988204 - 0.3935143766420073 - 0.3506042155722468 - 0.38890637022748253 - 0.3809948829762236 - 0.3573848842626061 - 0.4384574114930339 - 0.44065249261067524 - 0.4455934266459656 - 0.44427870340567255 - 0.44866585162160194 - 0.4400562736320333 - 0.44272671447092676 - 
0.4472379619739013 - 0.447120409649494 - 0.4374054560695822 - 0.42821311110400917 - 0.26728232917410677 - 0.2819026763758509 - 0.3341565824397579 - 0.29184325438397496 - 0.190440948203588 - 0.26951517878043996 - 0.1580088222464484 - 0.20107217046853706 - 1.0 - 0.22434775382017497 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 28.836354293877637 - type: v_measures value: - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 
- 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 
0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 
0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 
0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 
0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 
0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 
0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 
- 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 
0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 
0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - 0.26770607587929524 - 0.25679087657287986 - 0.26683847803527544 - 0.27314067131657194 - 0.2637027189955263 - 0.27546553977066784 - 0.2663910185474206 - 0.26115132013506304 - 0.28239605072779855 - 0.2715001900369248 - 0.3338711918999345 - 0.330513643529441 - 0.32916198249267603 - 0.33648402018146334 - 0.33995041466013076 - 0.33576749276064755 - 0.3328011112044641 - 0.33773499787647715 - 0.3347474569158458 - 0.33112140013488434 - 0.3093636862898543 - 0.1601792611076284 - 0.20820618472388558 - 0.2622841294938964 - 0.20833795363058114 - 0.15304171124037919 - 0.19106061252763054 - 0.09640933163812757 - 0.16463927791620916 - 1.0 - 0.1585110308604873 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 55.77162231859219 - type: mrr value: 69.60614254935584 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 75.005851173518 - type: cos_sim_spearman value: 76.4866825599851 - type: euclidean_pearson value: 74.6002011099264 - type: euclidean_spearman value: 74.99267261434052 - type: manhattan_pearson value: 74.69084330891174 - type: manhattan_spearman value: 74.06253093850374 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 77.51298701298701 - type: f1 value: 77.42714563211781 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 32.087909450126375 - type: v_measures value: - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 
0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 
- 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 
0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 
0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 
- 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 
0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - 0.3142833102070323 - 0.3203971307539949 - 0.3161164170523813 - 0.30802810025196975 - 0.3177972043203049 - 0.3186492377314429 - 0.3324448674345129 - 0.3302138414852389 - 0.32033008662475176 - 0.33053074915100755 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - 
  - type: v_measure
    value: 23.549481691079134
  - type: v_measures
    value:
    - 0.24931741412344044
    - 0.2294313928519603
    - 0.23307236126201172
    - 0.22749497519161602
    - 0.22860245646223934
    - 0.2307563678480302
    - 0.242195701791265
    - 0.23584374186405796
    - 0.23666135736396998
    - 0.24157240034932226
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackAndroidRetrieval
    type: mteb/cqadupstack-android
    config: default
    split: test
    revision: f46a197baaae43b4f621051089b82a364682dfeb
  metrics:
  - type: map_at_1
    value: 20.156
  - type: map_at_10
    value: 26.989
  - type: map_at_100
    value: 28.165000000000003
  - type: map_at_1000
    value: 28.302
  - type: map_at_20
    value: 27.505000000000003
  - type: map_at_3
    value: 24.631
  - type: map_at_5
    value: 25.886
  - type: mrr_at_1
    value: 25.607999999999997
  - type: mrr_at_10
    value: 31.972
  - type: mrr_at_100
    value: 32.993
  - type: mrr_at_1000
    value: 33.061
  - type: mrr_at_20
    value: 32.471
  - type: mrr_at_3
    value: 30.019000000000002
  - type: mrr_at_5
    value: 31.041999999999998
  - type: ndcg_at_1
    value: 25.607999999999997
  - type: ndcg_at_10
    value: 31.438
  - type: ndcg_at_100
    value: 37.347
  - type: ndcg_at_1000
    value: 40.075
  - type: ndcg_at_20
    value: 33.068
  - type: ndcg_at_3
    value: 27.846
  - type: ndcg_at_5
    value: 29.304999999999996
  - type: precision_at_1
    value: 25.607999999999997
  - type: precision_at_10
    value: 5.923
  - type: precision_at_100
    value: 1.102
  - type: precision_at_1000
    value: 0.161
  - type: precision_at_20
    value: 3.5340000000000003
  - type: precision_at_3
    value: 13.305
  - type: precision_at_5
    value: 9.585
  - type: recall_at_1
    value: 20.156
  - type: recall_at_10
    value: 39.741
  - type: recall_at_100
    value: 66.428
  - type: recall_at_1000
    value: 84.694
  - type: recall_at_20
    value: 45.688
  - type: recall_at_3
    value: 28.876
  - type: recall_at_5
    value: 33.284000000000006
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackEnglishRetrieval
    type: mteb/cqadupstack-english
    config: default
    split: test
    revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
  metrics:
  - type: map_at_1
    value: 14.568
  - type: map_at_10
    value: 19.356
  - type: map_at_100
    value: 20.044
  - type: map_at_1000
    value: 20.146
  - type: map_at_20
    value: 19.717000000000002
  - type: map_at_3
    value: 17.82
  - type: map_at_5
    value: 18.724
  - type: mrr_at_1
    value: 18.025
  - type: mrr_at_10
    value: 22.933
  - type: mrr_at_100
    value: 23.599
  - type: mrr_at_1000
    value: 23.669999999999998
  - type: mrr_at_20
    value: 23.283
  - type: mrr_at_3
    value: 21.295
  - type: mrr_at_5
    value: 22.314
  - type: ndcg_at_1
    value: 18.025
  - type: ndcg_at_10
    value: 22.559
  - type: ndcg_at_100
    value: 26.045
  - type: ndcg_at_1000
    value: 28.785
  - type: ndcg_at_20
    value: 23.727999999999998
  - type: ndcg_at_3
    value: 19.914
  - type: ndcg_at_5
    value: 21.241
  - type: precision_at_1
    value: 18.025
  - type: precision_at_10
    value: 4.102
  - type: precision_at_100
    value: 0.715
  - type: precision_at_1000
    value: 0.11800000000000001
  - type: precision_at_20
    value: 2.452
  - type: precision_at_3
    value: 9.447999999999999
  - type: precision_at_5
    value: 6.827999999999999
  - type: recall_at_1
    value: 14.568
  - type: recall_at_10
    value: 28.677999999999997
  - type: recall_at_100
    value: 44.362
  - type: recall_at_1000
    value: 63.705999999999996
  - type: recall_at_20
    value: 32.932
  - type: recall_at_3
    value: 21.029999999999998
  - type: recall_at_5
    value: 24.573
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackGamingRetrieval
    type: mteb/cqadupstack-gaming
    config: default
    split: test
    revision: 4885aa143210c98657558c04aaf3dc47cfb54340
  metrics:
  - type: map_at_1
    value: 25.104
  - type: map_at_10
    value: 33.857
  - type: map_at_100
    value: 34.808
  - type: map_at_1000
    value: 34.904
  - type: map_at_20
    value: 34.404
  - type: map_at_3
    value: 31.176
  - type: map_at_5
    value: 32.626
  - type: mrr_at_1
    value: 28.84
  - type: mrr_at_10
    value: 36.817
  - type: mrr_at_100
    value: 37.633
  - type: mrr_at_1000
    value: 37.698
  - type: mrr_at_20
    value: 37.312
  - type: mrr_at_3
    value: 34.451
  - type: mrr_at_5
    value: 35.748999999999995
  - type: ndcg_at_1
    value: 28.84
  - type: ndcg_at_10
    value: 38.745000000000005
  - type: ndcg_at_100
    value: 43.183
  - type: ndcg_at_1000
    value: 45.419
  - type: ndcg_at_20
    value: 40.571
  - type: ndcg_at_3
    value: 33.751
  - type: ndcg_at_5
    value: 36.042
  - type: precision_at_1
    value: 28.84
  - type: precision_at_10
    value: 6.389
  - type: precision_at_100
    value: 0.941
  - type: precision_at_1000
    value: 0.12
  - type: precision_at_20
    value: 3.6929999999999996
  - type: precision_at_3
    value: 15.068000000000001
  - type: precision_at_5
    value: 10.583
  - type: recall_at_1
    value: 25.104
  - type: recall_at_10
    value: 50.749
  - type: recall_at_100
    value: 70.336
  - type: recall_at_1000
    value: 86.591
  - type: recall_at_20
    value: 57.473
  - type: recall_at_3
    value: 37.230000000000004
  - type: recall_at_5
    value: 42.774
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackGisRetrieval
    type: mteb/cqadupstack-gis
    config: default
    split: test
    revision: 5003b3064772da1887988e05400cf3806fe491f2
  metrics:
  - type: map_at_1
    value: 12.712000000000002
  - type: map_at_10
    value: 18.064
  - type: map_at_100
    value: 18.775
  - type: map_at_1000
    value: 18.886
  - type: map_at_20
    value: 18.375
  - type: map_at_3
    value: 16.304
  - type: map_at_5
    value: 17.183999999999997
  - type: mrr_at_1
    value: 13.672
  - type: mrr_at_10
    value: 19.392
  - type: mrr_at_100
    value: 20.088
  - type: mrr_at_1000
    value: 20.186999999999998
  - type: mrr_at_20
    value: 19.721
  - type: mrr_at_3
    value: 17.495
  - type: mrr_at_5
    value: 18.473
  - type: ndcg_at_1
    value: 13.672
  - type: ndcg_at_10
    value: 21.427
  - type: ndcg_at_100
    value: 25.448999999999998
  - type: ndcg_at_1000
    value: 28.78
  - type: ndcg_at_20
    value: 22.56
  - type: ndcg_at_3
    value: 17.752000000000002
  - type: ndcg_at_5
    value: 19.356
  - type: precision_at_1
    value: 13.672
  - type: precision_at_10
    value: 3.5029999999999997
  - type: precision_at_100
    value: 0.5910000000000001
  - type: precision_at_1000
    value: 0.092
  - type: precision_at_20
    value: 2.011
  - type: precision_at_3
    value: 7.571
  - type: precision_at_5
    value: 5.537
  - type: recall_at_1
    value: 12.712000000000002
  - type: recall_at_10
    value: 30.596
  - type: recall_at_100
    value: 49.909
  - type: recall_at_1000
    value: 76.01400000000001
  - type: recall_at_20
    value: 34.903
  - type: recall_at_3
    value: 20.721999999999998
  - type: recall_at_5
    value: 24.428
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackMathematicaRetrieval
    type: mteb/cqadupstack-mathematica
    config: default
    split: test
    revision: 90fceea13679c63fe563ded68f3b6f06e50061de
  metrics:
  - type: map_at_1
    value: 7.48
  - type: map_at_10
    value: 12.089
  - type: map_at_100
    value: 12.974
  - type: map_at_1000
    value: 13.099
  - type: map_at_20
    value: 12.537
  - type: map_at_3
    value: 10.402000000000001
  - type: map_at_5
    value: 11.261000000000001
  - type: mrr_at_1
    value: 9.577
  - type: mrr_at_10
    value: 15.043999999999999
  - type: mrr_at_100
    value: 15.909
  - type: mrr_at_1000
    value: 15.998000000000001
  - type: mrr_at_20
    value: 15.512999999999998
  - type: mrr_at_3
    value: 13.184000000000001
  - type: mrr_at_5
    value: 14.066999999999998
  - type: ndcg_at_1
    value: 9.577
  - type: ndcg_at_10
    value: 15.511
  - type: ndcg_at_100
    value: 20.193
  - type: ndcg_at_1000
    value: 23.691000000000003
  - type: ndcg_at_20
    value: 17.176
  - type: ndcg_at_3
    value: 12.134
  - type: ndcg_at_5
    value: 13.506000000000002
  - type: precision_at_1
    value: 9.577
  - type: precision_at_10
    value: 3.159
  - type: precision_at_100
    value: 0.634
  - type: precision_at_1000
    value: 0.106
  - type: precision_at_20
    value: 2.009
  - type: precision_at_3
    value: 6.012
  - type: precision_at_5
    value: 4.627
  - type: recall_at_1
    value: 7.48
  - type: recall_at_10
    value: 23.134
  - type: recall_at_100
    value: 44.254
  - type: recall_at_1000
    value: 70.35
  - type: recall_at_20
    value: 29.383
  - type: recall_at_3
    value: 13.84
  - type: recall_at_5
    value: 17.175
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackPhysicsRetrieval
    type: mteb/cqadupstack-physics
    config: default
    split: test
    revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
  metrics:
  - type: map_at_1
    value: 18.035
  - type: map_at_10
    value: 24.007
  - type: map_at_100
    value: 25.113999999999997
  - type: map_at_1000
    value: 25.245
  - type: map_at_20
    value: 24.587
  - type: map_at_3
    value: 21.921
  - type: map_at_5
    value: 22.917
  - type: mrr_at_1
    value: 22.233
  - type: mrr_at_10
    value: 28.479
  - type: mrr_at_100
    value: 29.412
  - type: mrr_at_1000
    value: 29.49
  - type: mrr_at_20
    value: 29.031000000000002
  - type: mrr_at_3
    value: 26.275
  - type: mrr_at_5
    value: 27.400999999999996
  - type: ndcg_at_1
    value: 22.233
  - type: ndcg_at_10
    value: 28.382
  - type: ndcg_at_100
    value: 33.86
  - type: ndcg_at_1000
    value: 36.903000000000006
  - type: ndcg_at_20
    value: 30.341
  - type: ndcg_at_3
    value: 24.695
  - type: ndcg_at_5
    value: 26.13
  - type: precision_at_1
    value: 22.233
  - type: precision_at_10
    value: 5.2170000000000005
  - type: precision_at_100
    value: 0.95
  - type: precision_at_1000
    value: 0.13899999999999998
  - type: precision_at_20
    value: 3.2239999999999998
  - type: precision_at_3
    value: 11.485
  - type: precision_at_5
    value: 8.181
  - type: recall_at_1
    value: 18.035
  - type: recall_at_10
    value: 37.222
  - type: recall_at_100
    value: 61.602000000000004
  - type: recall_at_1000
    value: 82.92
  - type: recall_at_20
    value: 44.221
  - type: recall_at_3
    value: 26.625
  - type: recall_at_5
    value: 30.461
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackProgrammersRetrieval
    type: mteb/cqadupstack-programmers
    config: default
    split: test
    revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
  metrics:
  - type: map_at_1
    value: 13.281
  - type: map_at_10
    value: 17.756
  - type: map_at_100
    value: 18.785
  - type: map_at_1000
    value: 18.921
  - type: map_at_20
    value: 18.209
  - type: map_at_3
    value: 15.817999999999998
  - type: map_at_5
    value: 16.939
  - type: mrr_at_1
    value: 16.096
  - type: mrr_at_10
    value: 21.079
  - type: mrr_at_100
    value: 22.061
  - type: mrr_at_1000
    value: 22.151
  - type: mrr_at_20
    value: 21.557000000000002
  - type: mrr_at_3
    value: 19.006999999999998
  - type: mrr_at_5
    value: 20.171
  - type: ndcg_at_1
    value: 16.096
  - type: ndcg_at_10
    value: 21.278
  - type: ndcg_at_100
    value: 26.687
  - type: ndcg_at_1000
    value: 30.016
  - type: ndcg_at_20
    value: 22.871
  - type: ndcg_at_3
    value: 17.705000000000002
  - type: ndcg_at_5
    value: 19.427
  - type: precision_at_1
    value: 16.096
  - type: precision_at_10
    value: 3.893
  - type: precision_at_100
    value: 0.792
  - type: precision_at_1000
    value: 0.124
  - type: precision_at_20
    value: 2.414
  - type: precision_at_3
    value: 8.029
  - type: precision_at_5
    value: 6.119
  - type: recall_at_1
    value: 13.281
  - type: recall_at_10
    value: 28.849000000000004
  - type: recall_at_100
    value: 53.010999999999996
  - type: recall_at_1000
    value: 76.512
  - type: recall_at_20
    value: 34.547
  - type: recall_at_3
    value: 19.177
  - type: recall_at_5
    value: 23.455000000000002
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackRetrieval
    type: mteb/cqadupstack
    config: default
    split: test
    revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
  metrics:
  - type: map_at_1
    value: 14.161583333333333
  - type: map_at_10
    value: 19.378833333333333
  - type: map_at_100
    value: 20.27525
  - type: map_at_1000
    value: 20.394499999999997
  - type: map_at_20
    value: 19.831333333333333
  - type: map_at_3
    value: 17.55408333333333
  - type: map_at_5
    value: 18.52841666666667
  - type: mrr_at_1
    value: 17.00033333333333
  - type: mrr_at_10
    value: 22.41916666666667
  - type: mrr_at_100
    value: 23.252666666666666
  - type: mrr_at_1000
    value: 23.337583333333335
  - type: mrr_at_20
    value: 22.866666666666667
  - type: mrr_at_3
    value: 20.56991666666667
  - type: mrr_at_5
    value: 21.567666666666664
  - type: ndcg_at_1
    value: 17.00033333333333
  - type: ndcg_at_10
    value: 22.96475
  - type: ndcg_at_100
    value: 27.526833333333332
  - type: ndcg_at_1000
    value: 30.597416666666668
  - type: ndcg_at_20
    value: 24.52133333333333
  - type: ndcg_at_3
    value: 19.60108333333334
  - type: ndcg_at_5
    value: 21.089750000000002
  - type: precision_at_1
    value: 17.00033333333333
  - type: precision_at_10
    value: 4.10625
  - type: precision_at_100
    value: 0.7497499999999999
  - type: precision_at_1000
    value: 0.11733333333333335
  - type: precision_at_20
    value: 2.499416666666667
  - type: precision_at_3
    value: 9.041
  - type: precision_at_5
    value: 6.554250000000001
  - type: recall_at_1
    value: 14.161583333333333
  - type: recall_at_10
    value: 30.899916666666666
  - type: recall_at_100
    value: 51.66383333333333
  - type: recall_at_1000
    value: 74.103
  - type: recall_at_20
    value: 36.698
  - type: recall_at_3
    value: 21.398
  - type: recall_at_5
    value: 25.241750000000003
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackStatsRetrieval
    type: mteb/cqadupstack-stats
    config: default
    split: test
    revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
  metrics:
  - type: map_at_1
    value: 13.036
  - type: map_at_10
    value: 17.142
  - type: map_at_100
    value: 17.915
  - type: map_at_1000
    value: 18.002000000000002
  - type: map_at_20
    value: 17.558
  - type: map_at_3
    value: 15.459
  - type: map_at_5
    value: 16.474
  - type: mrr_at_1
    value: 14.877
  - type: mrr_at_10
    value: 19.365
  - type: mrr_at_100
    value: 20.085
  - type: mrr_at_1000
    value: 20.165
  - type: mrr_at_20
    value: 19.75
  - type: mrr_at_3
    value: 17.638
  - type: mrr_at_5
    value: 18.673000000000002
  - type: ndcg_at_1
    value: 14.877
  - type: ndcg_at_10
    value: 20.199
  - type: ndcg_at_100
    value: 24.275
  - type: ndcg_at_1000
    value: 26.933
  - type: ndcg_at_20
    value: 21.683
  - type: ndcg_at_3
    value: 16.925
  - type: ndcg_at_5
    value: 18.565
  - type: precision_at_1
    value: 14.877
  - type: precision_at_10
    value: 3.374
  - type: precision_at_100
    value: 0.59
  - type: precision_at_1000
    value: 0.087
  - type: precision_at_20
    value: 2.04
  - type: precision_at_3
    value: 7.515
  - type: precision_at_5
    value: 5.491
  - type: recall_at_1
    value: 13.036
  - type: recall_at_10
    value: 27.750000000000004
  - type: recall_at_100
    value: 46.798
  - type: recall_at_1000
    value: 67.372
  - type: recall_at_20
    value: 33.406000000000006
  - type: recall_at_3
    value: 18.381
  - type: recall_at_5
    value: 22.559
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackTexRetrieval
    type: mteb/cqadupstack-tex
    config: default
    split: test
    revision: 46989137a86843e03a6195de44b09deda022eec7
  metrics:
  - type: map_at_1
    value: 8.448
  - type: map_at_10
    value: 11.978
  - type: map_at_100
    value: 12.736
  - type: map_at_1000
    value: 12.848
  - type: map_at_20
    value: 12.354
  - type: map_at_3
    value: 10.687000000000001
  - type: map_at_5
    value: 11.344
  - type: mrr_at_1
    value: 10.771
  - type: mrr_at_10
    value: 14.753
  - type: mrr_at_100
    value: 15.501000000000001
  - type: mrr_at_1000
    value: 15.592
  - type: mrr_at_20
    value: 15.148
  - type: mrr_at_3
    value: 13.425999999999998
  - type: mrr_at_5
    value: 14.059
  - type: ndcg_at_1
    value: 10.771
  - type: ndcg_at_10
    value: 14.788
  - type: ndcg_at_100
    value: 18.769
  - type: ndcg_at_1000
    value: 21.939
  - type: ndcg_at_20
    value: 16.113
  - type: ndcg_at_3
    value: 12.356
  - type: ndcg_at_5
    value: 13.316
  - type: precision_at_1
    value: 10.771
  - type: precision_at_10
    value: 2.842
  - type: precision_at_100
    value: 0.58
  - type: precision_at_1000
    value: 0.099
  - type: precision_at_20
    value: 1.807
  - type: precision_at_3
    value: 5.976
  - type: precision_at_5
    value: 4.322
  - type: recall_at_1
    value: 8.448
  - type: recall_at_10
    value: 20.666
  - type: recall_at_100
    value: 39.111000000000004
  - type: recall_at_1000
    value: 62.673
  - type: recall_at_20
    value: 25.686999999999998
  - type: recall_at_3
    value: 13.572999999999999
  - type: recall_at_5
    value: 16.239
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackUnixRetrieval
    type: mteb/cqadupstack-unix
    config: default
    split: test
    revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
  metrics:
  - type: map_at_1
    value: 14.025000000000002
  - type: map_at_10
    value: 18.605
  - type: map_at_100
    value: 19.442999999999998
  - type: map_at_1000
    value: 19.569
  - type: map_at_20
    value: 19.070999999999998
  - type: map_at_3
    value: 17.072000000000003
  - type: map_at_5
    value: 17.866
  - type: mrr_at_1
    value: 16.511
  - type: mrr_at_10
    value: 21.633
  - type: mrr_at_100
    value: 22.419
  - type: mrr_at_1000
    value: 22.521
  - type: mrr_at_20
    value: 22.063
  - type: mrr_at_3
    value: 19.932
  - type: mrr_at_5
    value: 20.864
  - type: ndcg_at_1
    value: 16.511
  - type: ndcg_at_10
    value: 21.931
  - type: ndcg_at_100
    value: 26.088
  - type: ndcg_at_1000
    value: 29.564
  - type: ndcg_at_20
    value: 23.557
  - type: ndcg_at_3
    value: 18.869
  - type: ndcg_at_5
    value: 20.203
  - type: precision_at_1
    value: 16.511
  - type: precision_at_10
    value: 3.7220000000000004
  - type: precision_at_100
    value: 0.637
  - type: precision_at_1000
    value: 0.105
  - type: precision_at_20
    value: 2.299
  - type: precision_at_3
    value: 8.52
  - type: precision_at_5
    value: 6.007
  - type: recall_at_1
    value: 14.025000000000002
  - type: recall_at_10
    value: 29.24
  - type: recall_at_100
    value: 47.771
  - type: recall_at_1000
    value: 73.37599999999999
  - type: recall_at_20
    value: 35.148
  - type: recall_at_3
    value: 20.721
  - type: recall_at_5
    value: 24.162
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackWebmastersRetrieval
    type: mteb/cqadupstack-webmasters
    config: default
    split: test
    revision: 160c094312a0e1facb97e55eeddb698c0abe3571
  metrics:
  - type: map_at_1
    value: 14.610000000000001
  - type: map_at_10
    value: 20.089000000000002
  - type: map_at_100
    value: 21.105
  - type: map_at_1000
    value: 21.275
  - type: map_at_20
    value: 20.604
  - type: map_at_3
    value: 18.323
  - type: map_at_5
    value: 19.192
  - type: mrr_at_1
    value: 18.182000000000002
  - type: mrr_at_10
    value: 23.458000000000002
  - type: mrr_at_100
    value: 24.379
  - type: mrr_at_1000
    value: 24.474999999999998
  - type: mrr_at_20
    value: 23.973
  - type: mrr_at_3
    value: 21.64
  - type: mrr_at_5
    value: 22.579
  - type: ndcg_at_1
    value: 18.182000000000002
  - type: ndcg_at_10
    value: 23.842
  - type: ndcg_at_100
    value: 28.604000000000003
  - type: ndcg_at_1000
    value: 32.192
  - type: ndcg_at_20
    value: 25.507
  - type: ndcg_at_3
    value: 20.937
  - type: ndcg_at_5
    value: 22.125
  - type: precision_at_1
    value: 18.182000000000002
  - type: precision_at_10
    value: 4.526
  - type: precision_at_100
    value: 0.955
  - type: precision_at_1000
    value: 0.17500000000000002
  - type: precision_at_20
    value: 2.846
  - type: precision_at_3
    value: 10.079
  - type: precision_at_5
    value: 7.194000000000001
  - type: recall_at_1
    value: 14.610000000000001
  - type: recall_at_10
    value: 31.086999999999996
  - type: recall_at_100
    value: 53.032000000000004
  - type: recall_at_1000
    value: 77.781
  - type: recall_at_20
    value: 37.801
  - type: recall_at_3
    value: 22.078999999999997
  - type: recall_at_5
    value: 25.572
- task:
    type: Retrieval
  dataset:
    name: MTEB CQADupstackWordpressRetrieval
    type: mteb/cqadupstack-wordpress
    config: default
    split: test
    revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
  metrics:
  - type: map_at_1
    value: 8.484
  - type: map_at_10
    value: 12.614
  - type: map_at_100
    value: 13.439
  - type: map_at_1000
    value: 13.536999999999999
  - type: map_at_20
    value: 13.055
  - type: map_at_3
    value: 11.036
  - type: map_at_5
    value: 11.927999999999999
  - type: mrr_at_1
    value: 9.612
  - type: mrr_at_10
    value: 14.105
  - type: mrr_at_100
    value: 14.953
  - type: mrr_at_1000
    value: 15.043000000000001
  - type: mrr_at_20
    value: 14.578
  - type: mrr_at_3
    value: 12.477
  - type: mrr_at_5
    value: 13.420000000000002
  - type: ndcg_at_1
    value: 9.612
  - type: ndcg_at_10
    value: 15.476999999999999
  - type: ndcg_at_100
    value: 19.822
  - type: ndcg_at_1000
    value: 22.872
  - type: ndcg_at_20
    value: 17.081
  - type: ndcg_at_3
    value: 12.328999999999999
  - type: ndcg_at_5
    value: 13.861
  - type: precision_at_1
    value: 9.612
  - type: precision_at_10
    value: 2.625
  - type: precision_at_100
    value: 0.51
  - type: precision_at_1000
    value: 0.082
  - type: precision_at_20
    value: 1.664
  - type: precision_at_3
    value: 5.484
  - type: precision_at_5
    value: 4.1770000000000005
  - type: recall_at_1
    value: 8.484
  - type: recall_at_10
    value: 23.087
  - type: recall_at_100
    value: 43.352000000000004
  - type: recall_at_1000
    value: 67.247
  - type: recall_at_20
    value: 29.187
  - type: recall_at_3
    value: 14.521999999999998
  - type: recall_at_5
    value: 18.218999999999998
- task:
    type: Classification
  dataset:
    name: MTEB EmotionClassification
    type: mteb/emotion
    config: default
    split: test
    revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
  metrics:
  - type: accuracy
    value: 39.095
  - type: f1
    value: 35.03781407521973
- task:
    type: Classification
  dataset:
    name: MTEB ImdbClassification
    type: mteb/imdb
    config: default
    split: test
    revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
  metrics:
  - type: accuracy
    value: 67.87360000000001
  - type: ap
    value: 62.5359530212091
  - type: f1
    value: 67.76861907065303
- task:
    type: Classification
  dataset:
    name: MTEB MTOPDomainClassification (en)
    type: mteb/mtop_domain
    config: en
    split: test
    revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
  metrics:
  - type: accuracy
    value: 89.98176014591884
  - type: f1
    value: 89.12439802681382
- task:
    type: Classification
  dataset:
    name: MTEB MTOPIntentClassification (en)
    type: mteb/mtop_intent
    config: en
    split: test
    revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
  metrics:
  - type: accuracy
    value: 66.42954856361149
  - type: f1
    value: 46.845543295765395
- task:
    type: Classification
  dataset:
    name: MTEB MassiveIntentClassification (en)
    type: mteb/amazon_massive_intent
    config: en
    split: test
    revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
  metrics:
  - type: accuracy
    value: 65.15131136516476
  - type: f1
    value: 63.15954994502248
- task:
    type: Classification
  dataset:
    name: MTEB MassiveScenarioClassification (en)
    type: mteb/amazon_massive_scenario
    config: en
    split: test
    revision: 7d571f92784cd94a019292a1f45445077d0ef634
  metrics:
  - type: accuracy
    value: 70.74983187626093
  - type: f1
    value: 69.86842975748304
- task:
    type: Clustering
  dataset:
    name: MTEB MedrxivClusteringP2P
    type: mteb/medrxiv-clustering-p2p
    config: default
    split: test
    revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
  metrics:
  - type: v_measure
    value: 28.540533318170436
  - type: v_measures
    value:
    - 0.2685927394835013
    - 0.2783483658319506
    - 0.2665766690371173
    - 0.27851721126872275
    - 0.27353686950062217
    - 0.30395264384113213
    - 0.2947770213532569
    - 0.2955213403120467
    - 0.29310302531656435
    - 0.30112744587212925
- task:
    type: Clustering
  dataset:
    name: MTEB MedrxivClusteringS2S
    type: mteb/medrxiv-clustering-s2s
    config: default
    split: test
    revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
  metrics:
  - type: v_measure
    value: 24.72766780458758
  - type: v_measures
    value:
    - 0.2360459034955713
    - 0.2388482784840708
    - 0.2325812879581466
    - 0.22505966000679387
    - 0.22406230314275308
    - 0.2639574575941564
    - 0.26920314836084647
    - 0.27094201539166474
    - 0.2607957980208871
    - 0.2512709280038677
  - task:
      type: Reranking
    dataset:
      name: MTEB MindSmallReranking
      type: mteb/mind_small
      config: default
      split: test
      revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
    metrics:
    - type: map
      value: 29.041674385781967
    - type: mrr
      value: 29.79989064897717
  - task:
      type: Clustering
    dataset:
      name: MTEB RedditClustering
      type: mteb/reddit-clustering
      config: default
      split: test
      revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
    metrics:
    - type: v_measure
      value: 35.536820860805534
    - type: v_measures
      value:
      - 0.4526409230197192
      - 0.4418874523236007
      - 0.2923442821974814
      - 0.29815379814662973
      - 0.3546740441941569
      - 0.31765082600163064
      - 0.38073340268020667
      - 0.3175014611482366
      - 0.31266612397153287
      - 0.3004669681159753
      - 0.33575554767305055
      - 0.39815200864255257
      - 0.3336520605714611
      - 0.38338117463096916
      - 0.4620636786448619
      - 0.313480729190297
      - 0.3538208608160554
      - 0.3625124773562338
      - 0.35967221153279816
      - 0.34429637871008256
      - 0.315565319725188
      - 0.31257481088967437
      - 0.48026035919192217
      - 0.3260933295798252
      - 0.33420498624724254
- 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 
- 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 
0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 
0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 
0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 
0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 
0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 
0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 0.2923442821974814 - 0.29815379814662973 - 0.3546740441941569 - 0.31765082600163064 - 0.38073340268020667 - 0.3175014611482366 - 0.31266612397153287 - 0.3004669681159753 - 0.33575554767305055 - 0.39815200864255257 - 0.3336520605714611 - 0.38338117463096916 - 0.4620636786448619 - 0.313480729190297 - 0.3538208608160554 - 0.3625124773562338 - 0.35967221153279816 - 0.34429637871008256 - 0.315565319725188 - 0.31257481088967437 - 0.48026035919192217 - 0.3260933295798252 - 0.33420498624724254 - 0.4526409230197192 - 0.4418874523236007 - 
- task:
    type: Clustering
  dataset:
    name: MTEB RedditClusteringP2P
    type: mteb/reddit-clustering-p2p
    config: default
    split: test
    revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
  metrics:
  - type: v_measure
    value: 47.42966991443242
  - type: v_measures
    value:
    - 0.5265449680406118
    - 0.5493672486309341
    - 0.5519094814834113
    - 0.30132219602063365
    - 0.5071491767482975
    - 0.4654803385774871
    - 0.20652468935420157
    - 0.5484274172396977
    - 0.5112975786305867
    - 0.5749438967173793
- task:
    type: STS
  dataset:
    name: MTEB SICK-R
    type: mteb/sickr-sts
    config: default
    split: test
    revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
  metrics:
  - type: cos_sim_pearson
    value: 79.02197797488559
  - type: cos_sim_spearman
    value: 71.7037151299904
  - type: euclidean_pearson
    value: 76.1707252695092
  - type: euclidean_spearman
    value: 71.57310842242731
  - type: manhattan_pearson
    value: 76.03615971307154
  - type: manhattan_spearman
    value: 71.53631984773847
- task:
    type: STS
  dataset:
    name: MTEB STS12
    type: mteb/sts12-sts
    config: default
    split: test
    revision: a0d554a64d88156834ff5ae9920b964011b16384
  metrics:
  - type: cos_sim_pearson
    value: 79.02494360083622
  - type: cos_sim_spearman
    value: 69.72575299384381
  - type: euclidean_pearson
    value: 75.74354904408656
  - type: euclidean_spearman
    value: 69.54484408453516
  - type: manhattan_pearson
    value: 75.77951962076156
  - type: manhattan_spearman
    value: 69.6354936146991
- task:
    type: STS
  dataset:
    name: MTEB STS13
    type: mteb/sts13-sts
    config: default
    split: test
    revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
  metrics:
  - type: cos_sim_pearson
    value: 75.08237871140905
  - type: cos_sim_spearman
    value: 76.43254419101892
  - type: euclidean_pearson
    value: 77.01392166862142
  - type: euclidean_spearman
    value: 77.25873928927386
  - type: manhattan_pearson
    value: 76.8322542796806
  - type: manhattan_spearman
    value: 77.06622162313037
- task:
    type: STS
  dataset:
    name: MTEB STS14
    type: mteb/sts14-sts
    config: default
    split: test
    revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
  metrics:
  - type: cos_sim_pearson
    value: 76.15651557768992
  - type: cos_sim_spearman
    value: 73.66468164294979
  - type: euclidean_pearson
    value: 76.01343601779764
  - type: euclidean_spearman
    value: 74.26813269648791
  - type: manhattan_pearson
    value: 75.81532622772455
  - type: manhattan_spearman
    value: 74.11890179466049
- task:
    type: STS
  dataset:
    name: MTEB STS15
    type: mteb/sts15-sts
    config: default
    split: test
    revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
  metrics:
  - type: cos_sim_pearson
    value: 81.80212103727666
  - type: cos_sim_spearman
    value: 82.61832225494061
  - type: euclidean_pearson
    value: 81.83006587249692
  - type: euclidean_spearman
    value: 82.61429686151203
  - type: manhattan_pearson
    value: 81.76278849963437
  - type: manhattan_spearman
    value: 82.54152053739365
- task:
    type: STS
  dataset:
    name: MTEB STS16
    type: mteb/sts16-sts
    config: default
    split: test
    revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
  metrics:
  - type: cos_sim_pearson
    value: 77.75548172603382
  - type: cos_sim_spearman
    value: 79.48976464310448
  - type: euclidean_pearson
    value: 78.54266801280951
  - type: euclidean_spearman
    value: 79.30766703387586
  - type: manhattan_pearson
    value: 78.28008795002846
  - type: manhattan_spearman
    value: 79.07395809817007
- task:
    type: STS
  dataset:
    name: MTEB STS17 (en-en)
    type: mteb/sts17-crosslingual-sts
    config: en-en
    split: test
    revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
  metrics:
  - type: cos_sim_pearson
    value: 83.813657478234
  - type: cos_sim_spearman
    value: 84.38223117622964
  - type: euclidean_pearson
    value: 84.57065602789609
  - type: euclidean_spearman
    value: 83.8380794185294
  - type: manhattan_pearson
    value: 84.42039206232738
  - type: manhattan_spearman
    value: 83.74732339282085
- task:
    type: STS
  dataset:
    name: MTEB STS22 (en)
    type: mteb/sts22-crosslingual-sts
    config: en
    split: test
    revision: eea2b4fe26a775864c896887d910b76a8098ad3f
  metrics:
  - type: cos_sim_pearson
    value: 50.88695953591733
  - type: cos_sim_spearman
    value: 60.61167810477114
  - type: euclidean_pearson
    value: 55.81887963485168
  - type: euclidean_spearman
    value: 60.28385340456606
  - type: manhattan_pearson
    value: 56.03578991214848
  - type: manhattan_spearman
    value: 59.94178607215249
- task:
    type: STS
  dataset:
    name: MTEB STSBenchmark
    type: mteb/stsbenchmark-sts
    config: default
    split: test
    revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
  metrics:
  - type: cos_sim_pearson
    value: 78.97129674864591
  - type: cos_sim_spearman
    value: 78.60681572589853
  - type: euclidean_pearson
    value: 79.71108582359511
  - type: euclidean_spearman
    value: 78.71541582168763
  - type: manhattan_pearson
    value: 79.55279136411954
  - type: manhattan_spearman
    value: 78.57797218212967
- task:
    type: Reranking
  dataset:
    name: MTEB SciDocsRR
    type: mteb/scidocs-reranking
    config: default
    split: test
    revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
  metrics:
  - type: map
    value: 72.91005664580126
  - type: mrr
    value: 91.49957703879274
- task:
    type: PairClassification
  dataset:
    name: MTEB SprintDuplicateQuestions
    type: mteb/sprintduplicatequestions-pairclassification
    config: default
    split: test
    revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
  metrics:
  - type: cos_sim_accuracy
    value: 99.73960396039604
  - type: cos_sim_ap
    value: 92.14278266682584
  - type: cos_sim_f1
    value: 86.90890990542557
  - type: cos_sim_precision
    value: 86.5213082259663
  - type: cos_sim_recall
    value: 87.3
  - type: dot_accuracy
    value: 99.49801980198019
  - type: dot_ap
    value: 77.95867119922542
  - type: dot_f1
    value: 71.30528586839266
  - type: dot_precision
    value: 77.40046838407494
  - type: dot_recall
    value: 66.10000000000001
  - type: euclidean_accuracy
    value: 99.73861386138614
  - type: euclidean_ap
    value: 92.13035792099073
  - type: euclidean_f1
    value: 86.81102362204726
  - type: euclidean_precision
    value: 85.46511627906976
  - type: euclidean_recall
    value: 88.2
  - type: manhattan_accuracy
    value: 99.73762376237623
  - type: manhattan_ap
    value: 92.12382961875572
  - type: manhattan_f1
    value: 86.85770750988142
  - type: manhattan_precision
    value: 85.83984375
  - type: manhattan_recall
    value: 87.9
  - type: max_accuracy
    value: 99.73960396039604
  - type: max_ap
    value: 92.14278266682584
  - type: max_f1
    value: 86.90890990542557
- task:
    type: Clustering
  dataset:
    name: MTEB StackExchangeClustering
    type: mteb/stackexchange-clustering
    config: default
    split: test
    revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
  metrics:
  - type: v_measure
    value: 46.543842750900396
  - type: v_measures
    value:
    - 0.4503379530441486
    - 0.5055050814643914
    - 0.37196718256808775
    - 0.4626103115561461
    - 0.5045922481143936
    - 0.4024029244484936
    - 0.40741224663943404
    - 0.5217286774083806
    - 0.47473832818512185
    - 0.4423513686832282
    - 0.5254123899176399
    - 0.5290494091156918
    - 0.6004444685084731
    - 0.5097409136008207
    - 0.42510119674835567
    - 0.45914095980022224
    - 0.43938053466177135
    - 0.459032754379216
    - 0.43147103735898107
    - 0.4430589611998686
    - 0.4953516718234184
    - 0.4169835530427121
    - 0.43908761316001205
    - 0.46089722865011284
    - 0.45816167364597893
0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 
0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 
0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 
0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 
0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 
0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 
0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 
0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 
0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 
0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - 0.4503379530441486 - 0.5055050814643914 - 0.37196718256808775 - 0.4626103115561461 - 0.5045922481143936 - 0.4024029244484936 - 0.40741224663943404 - 0.5217286774083806 - 0.47473832818512185 - 0.4423513686832282 - 0.5254123899176399 - 0.5290494091156918 - 0.6004444685084731 - 0.5097409136008207 - 0.42510119674835567 - 0.45914095980022224 - 0.43938053466177135 - 0.459032754379216 - 0.43147103735898107 - 0.4430589611998686 - 0.4953516718234184 - 0.4169835530427121 - 0.43908761316001205 - 0.46089722865011284 - 0.45816167364597893 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 30.878511440057455 - type: v_measures value: - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 
0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - 0.29953106753859926 - 0.2959477268193652 - 0.2907437538793838 - 0.2896752248560134 - 0.29823646573245677 - 0.3302941899873012 - 0.3118332962228191 - 0.3227164592768227 - 0.32075958907773794 - 0.32811337061524626 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 44.48878112961158 - type: mrr value: 45.088675621763855 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.941315767578654 - type: cos_sim_spearman value: 29.329027079065966 - type: dot_pearson value: 25.836517566143634 - type: dot_spearman value: 26.352097845535162 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 64.013671875 - type: ap value: 10.97313563679864 - type: f1 value: 48.85384219487 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 54.25863044708545 - type: f1 value: 54.478056275468234 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 28.676358868345943 - type: v_measures value: - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 
0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 
0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 
0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 
0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 
0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 
0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - 0.29080031699004694 - 0.2957827890450158 - 0.279781173635388 - 0.28316010309239503 - 0.29801751933798737 - 0.30045501200974795 - 0.2750357568275408 - 0.28736739490829033 - 0.26884372823491953 - 0.2883920927532625 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 83.23895809739524 - type: cos_sim_ap value: 63.43837390346798 - type: cos_sim_f1 value: 60.3425871234495 - type: cos_sim_precision value: 54.63101604278074 - type: cos_sim_recall value: 67.38786279683377 - type: dot_accuracy value: 79.70435715562974 - type: dot_ap value: 50.219858779642024 - type: dot_f1 value: 52.03935006079363 - type: 
dot_precision value: 44.778390717139054 - type: dot_recall value: 62.11081794195251 - type: euclidean_accuracy value: 83.3581689217381 - type: euclidean_ap value: 63.866502871821886 - type: euclidean_f1 value: 60.66180862501495 - type: euclidean_precision value: 55.42457978607291 - type: euclidean_recall value: 66.99208443271768 - type: manhattan_accuracy value: 83.32836621565238 - type: manhattan_ap value: 63.58246341419401 - type: manhattan_f1 value: 60.405654578979714 - type: manhattan_precision value: 56.54775604142692 - type: manhattan_recall value: 64.82849604221636 - type: max_accuracy value: 83.3581689217381 - type: max_ap value: 63.866502871821886 - type: max_f1 value: 60.66180862501495 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 87.77894205767066 - type: cos_sim_ap value: 83.5297230824822 - type: cos_sim_f1 value: 75.65036420395423 - type: cos_sim_precision value: 73.11781609195403 - type: cos_sim_recall value: 78.3646442870342 - type: dot_accuracy value: 86.03058175185313 - type: dot_ap value: 78.95144253575621 - type: dot_f1 value: 72.20582032897512 - type: dot_precision value: 66.42524573202276 - type: dot_recall value: 79.08838928241454 - type: euclidean_accuracy value: 87.7265494624908 - type: euclidean_ap value: 83.29997302389856 - type: euclidean_f1 value: 75.38237163905613 - type: euclidean_precision value: 73.28582854649895 - type: euclidean_recall value: 77.60240221743148 - type: manhattan_accuracy value: 87.65475220242946 - type: manhattan_ap value: 83.1779453049763 - type: manhattan_f1 value: 75.17620001483792 - type: manhattan_precision value: 72.53400143163923 - type: manhattan_recall value: 78.01817061903296 - type: max_accuracy value: 87.77894205767066 - type: max_ap value: 83.5297230824822 - type: max_f1 value: 75.65036420395423 --- # Giratina This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). ## Merge Details ### Merge Method This model was merged using the SLERP merge method. ### Models Merged The following models were included in the merge: * [Mihaiii/Wartortle](https://huggingface.co/Mihaiii/Wartortle) * [TaylorAI/bge-micro-v2](https://huggingface.co/TaylorAI/bge-micro-v2) ### Configuration The following YAML configuration was used to produce this model: ```yaml models: - model: Mihaiii/Wartortle - model: TaylorAI/bge-micro-v2 merge_method: slerp base_model: TaylorAI/bge-micro-v2 parameters: t: - value: 0.5 dtype: float32 ```
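As a rough sketch of how a configuration like this is typically applied — assuming mergekit is installed from the repository linked above, that the YAML is saved locally, and that the file and output paths are illustrative — the merge can be reproduced from the command line:

```bash
# Sketch only: install mergekit from the repository referenced above,
# save the YAML configuration as slerp-config.yaml, then run the merge.
pip install git+https://github.com/cg123/mergekit.git
mergekit-yaml slerp-config.yaml ./merged-model
```

The resulting directory contains standard Hugging Face weights, so it can presumably be loaded like any other BERT-style embedding model.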
[ "SUMMARIZATION" ]
[ "BIOSSES" ]
BioNLP
tner/deberta-v3-large-bc5cdr
tner
token-classification
[ "transformers", "pytorch", "deberta-v2", "token-classification", "dataset:tner/bc5cdr", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,660
1,664
20
0
--- datasets: - tner/bc5cdr metrics: - f1 - precision - recall pipeline_tag: token-classification widget: - text: Jacob Collier is a Grammy awarded artist from England. example_title: NER Example 1 model-index: - name: tner/deberta-v3-large-bc5cdr results: - task: type: token-classification name: Token Classification dataset: name: tner/bc5cdr type: tner/bc5cdr args: tner/bc5cdr metrics: - type: f1 value: 0.8902493653874869 name: F1 - type: precision value: 0.8697724178175452 name: Precision - type: recall value: 0.9117137322866755 name: Recall - type: f1_macro value: 0.8863403908610603 name: F1 (macro) - type: precision_macro value: 0.8657302393432342 name: Precision (macro) - type: recall_macro value: 0.9080747413030301 name: Recall (macro) - type: f1_entity_span value: 0.8929371360310587 name: F1 (entity span) - type: precision_entity_span value: 0.8723983660766388 name: Precision (entity span) - type: recall_entity_span value: 0.9144663064532572 name: Recall (entity span) --- # tner/deberta-v3-large-bc5cdr This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) on the [tner/bc5cdr](https://huggingface.co/datasets/tner/bc5cdr) dataset. Model fine-tuning is done via [T-NER](https://github.com/asahi417/tner)'s hyper-parameter search (see the repository for more details). It achieves the following results on the test set: - F1 (micro): 0.8902493653874869 - Precision (micro): 0.8697724178175452 - Recall (micro): 0.9117137322866755 - F1 (macro): 0.8863403908610603 - Precision (macro): 0.8657302393432342 - Recall (macro): 0.9080747413030301 The per-entity breakdown of the F1 score on the test set is below: - chemical: 0.9298502009499452 - disease: 0.8428305807721753 For F1 scores, the confidence intervals are obtained by bootstrap as below: - F1 (micro): - 90%: [0.885162383660078, 0.8951239957151518] - 95%: [0.8838793313408008, 0.8959517574197015] - F1 (macro): - 90%: [0.885162383660078, 0.8951239957151518] - 95%: [0.8838793313408008, 0.8959517574197015] Full evaluation can be found at the [metric file of NER](https://huggingface.co/tner/deberta-v3-large-bc5cdr/raw/main/eval/metric.json) and the [metric file of entity span](https://huggingface.co/tner/deberta-v3-large-bc5cdr/raw/main/eval/metric_span.json). ### Usage This model can be used through the [tner library](https://github.com/asahi417/tner). Install the library via pip ```shell pip install tner ``` and activate the model as below. ```python from tner import TransformersNER model = TransformersNER("tner/deberta-v3-large-bc5cdr") model.predict(["Jacob Collier is a Grammy awarded English artist from London"]) ``` The model can also be used via the transformers library, but this is not recommended, as the CRF layer is not supported at the moment. ### Training hyperparameters The following hyperparameters were used during training: - dataset: ['tner/bc5cdr'] - dataset_split: train - dataset_name: None - local_dataset: None - model: microsoft/deberta-v3-large - crf: True - max_length: 128 - epoch: 15 - batch_size: 16 - lr: 1e-05 - random_seed: 42 - gradient_accumulation_steps: 4 - weight_decay: 1e-07 - lr_warmup_step_ratio: 0.1 - max_grad_norm: None The full configuration can be found at the [fine-tuning parameter file](https://huggingface.co/tner/deberta-v3-large-bc5cdr/raw/main/trainer_config.json); a sketch of how to launch a comparable search is given after the reference below. ### Reference If you use any resource from T-NER, please consider citing our [paper](https://aclanthology.org/2021.eacl-demos.7/). 
``` @inproceedings{ushio-camacho-collados-2021-ner, title = "{T}-{NER}: An All-Round Python Library for Transformer-based Named Entity Recognition", author = "Ushio, Asahi and Camacho-Collados, Jose", booktitle = "Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations", month = apr, year = "2021", address = "Online", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.eacl-demos.7", doi = "10.18653/v1/2021.eacl-demos.7", pages = "53--62", abstract = "Language model (LM) pretraining has led to consistent improvements in many NLP downstream tasks, including named entity recognition (NER). In this paper, we present T-NER (Transformer-based Named Entity Recognition), a Python library for NER LM finetuning. In addition to its practical utility, T-NER facilitates the study and investigation of the cross-domain and cross-lingual generalization ability of LMs finetuned on NER. Our library also provides a web app where users can get model predictions interactively for arbitrary text, which facilitates qualitative model evaluation for non-expert programmers. We show the potential of the library by compiling nine public NER datasets into a unified format and evaluating the cross-domain and cross-lingual performance across the datasets. The results from our initial experiments show that in-domain performance is generally competitive across datasets. However, cross-domain generalization is challenging even with a large pretrained LM, which has nevertheless capacity to learn domain-specific features if fine-tuned on a combined dataset. To facilitate future research, we also release all our LM checkpoints via the Hugging Face model hub.", } ```
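For those who want to reproduce the hyper-parameter search described above, here is a minimal sketch using T-NER's `GridSearcher`. The argument names mirror the training configuration listed in this card, but the checkpoint directory is an arbitrary local path and the single-point search grid is an illustrative assumption — consult the T-NER repository for the exact API before relying on this.

```python
from tner import GridSearcher

# Sketch only: a single-point "grid" that mirrors the hyperparameters above.
searcher = GridSearcher(
    checkpoint_dir="./ckpt_deberta_bc5cdr",  # arbitrary local path (assumption)
    dataset="tner/bc5cdr",
    model="microsoft/deberta-v3-large",
    epoch=15,
    batch_size=16,
    crf=[True],
    lr=[1e-5],
    gradient_accumulation_steps=[4],
    weight_decay=[1e-7],
    lr_warmup_step_ratio=[0.1],
    max_grad_norm=[None],
)
searcher.train()
```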
[ "NAMED_ENTITY_RECOGNITION" ]
[ "BC5CDR" ]
BioNLP
MediaTek-Research/Breexe-8x7B-Instruct-v0_1
MediaTek-Research
text-generation
[ "transformers", "pytorch", "mixtral", "text-generation", "conversational", "en", "zh", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,707
1,722
167
55
--- language: - en - zh license: apache-2.0 pipeline_tag: text-generation extra_gated_prompt: The model weights are only available for the partners to download now. extra_gated_fields: Name: text Company: text Title: text Contact Email: text --- # Breexe-8x7B-Instruct-v0_1 Breexe-8x7B is a language model family that builds on top of [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1), specifically intended for Traditional Chinese use. Breexe-8x7B-Base is the base model for the Breexe-8x7B series. Breexe-8x7B-Base expands the original vocabulary with an additional 30,000 Traditional Chinese tokens. With the expanded vocabulary, Breexe-8x7B operates at twice the inference speed of Mixtral-8x7B for Traditional Chinese. [See [Inference Performance](#inference-performance).] [Breexe-8x7B-Instruct](https://huggingface.co/MediaTek-Research/Breexe-8x7B-Instruct-v0_1) derives from the base model Breexe-8x7B-Base, making the resulting model amenable to use as-is for commonly seen tasks, such as Q&A, RAG, multi-round chat, and summarization. **Breexe-8x7B-Instruct demonstrates impressive performance in benchmarks for Traditional Chinese and English, on par with OpenAI's gpt-3.5-turbo-1106.** [See [Chat Model Performance](#chat-model-performance).] The current release version of Breexe-8x7B is v0.1. *The models were trained on Nvidia's Taipei-1. Special thanks to Nvidia for the technical support.* *A project by the members (in alphabetical order): Chan-Jan Hsu 許湛然, Chang-Le Liu 劉昶樂, Feng-Ting Liao 廖峰挺, Po-Chun Hsu 許博竣, [Yi-Chang Chen 陳宜昌](https://ycc.idv.tw/about-me), and the supervisor Da-Shan Shiu 許大山.* ## BreeXe API <p style="color:red;">We offer a trial API for business integration and academic benchmarking.</p> *API service open time: 14:00 - 20:00 (from 2024/4/3 to 2024/5/3)* The API is in beta testing. If you are experiencing connectivity issues, please bear with us. Free trial API key: `'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoyLCJ1c2VybmFtZSI6ImdlbmVyYWxfcHVibGljIiwibm90ZSI6ImdlbmVyYWwgcHVibGljIn0.kCp68nRw3RSh3jbMm8FvhG0NIkStflgI1wTHLviRPQE'` Get started here: ```python import time from openai import OpenAI API_KEY = '<API_KEY_HERE>' BASE_URL = 'https://api-mtkresearch.com/v1' MODEL_NAME = 'BreeXe-8x7B' client = OpenAI(base_url=BASE_URL, api_key=API_KEY) def ask_breexe(messages): completion = client.chat.completions.create( model=MODEL_NAME, messages=messages, temperature=0.01, top_p=0.01, max_tokens=512 ) response = completion.choices[0].message.content time.sleep(3) # due to a rate limit of 200 requests per 10 minutes return response messages = [ { 'role': 'system', 'content': 'You are a helpful AI assistant built by MediaTek Research. The user you are helping speaks Traditional Chinese and comes from Taiwan.' }, { 'role': 'user', 'content': '請問在量子物理領域中,疊加原理是什麼?它又是如何與量子糾纏現象相關聯的?' } ] response = ask_breexe(messages) print(response) ``` ## BreeXe Demo <a href="https://huggingface.co/spaces/MediaTek-Research/Demo-MR-Breexe-8x7B" style="color:red;font-weight:bold;">Try Demo Here 👩‍💻🧑🏻‍💻</a> Some examples are shown below. (System Prompt: `You are a helpful AI assistant built by MediaTek Research. 
The user you are helping speaks Traditional Chinese and comes from Taiwan.`, Temperature: 0.01, Top-P: 0.01, Repetition-Penalty: 1.1) <details><summary><b>Education</b></summary> | Query | 1️⃣ 請問在量子物理領域中,疊加原理是什麼?它又是如何與量子糾纏現象相關聯的?<br>2️⃣ 請用上述概念說明「薛丁格的貓」這個假想實驗 | |---|---| | **Breexe** | 1️⃣ <br>在量子物理領域中,疊加原理(Superposition Principle)是一個非常重要且獨特的概念。這個原則表示,微觀粒子如電子、光子等,可以同時存在於多個狀態之中,直到它們被測量時才「決定」其特定狀態。換句話說,在未進行觀測之前,量子系統處於一種混合態,包含了所有可能的狀態。<br><br>舉例來說,假設有一個量子系統具有兩個基本狀態:\|0⟩和\|1⟩。根據疊加原理,該系統可以同時處在這兩種狀態的疊加,即α\|0⟩+β\|1⟩,其中α和β分別代表不同狀態的機率振幅。當我們對這個量子系統進行測量時,它會「選擇」某個特定狀態,例如\|0⟩或\|1⟩,而其他未被選中的狀態將被排除在外。在這個過程中,整個系統的波函數會瞬間崩塌到選定的狀態。<br><br>量子糾纏現象(Quantum Entanglement)則是另一個量子物理中的核心概念。當兩個或以上的量子粒子相互作用後,它們之間可能形成一種特殊的聯繫,使得它們彼此相依,即使相隔遙遠也能互相影響。也就是說,當我們對其中一個粒子進行測量時,另一個粒子也會立即受到影響,無論它們相距多遠。這違反了古典物理學的定律,因此也被稱為「超距作用」。<br><br>疊加原理與量子糾纏現象兩者間有密切關聯。在某些情況下,量子糾纏現象需要疊加原理才能成立。例如,當兩個粒子處於糾纏狀態時,它們各自的狀態不能單獨描述,而是必須以整體的角度來考慮。在這種狀況下,疊加原理便提供了一個有效的方法,將兩個粒子的狀態結合為一個統一的疊加態。<br><br>此外,疊加原理也為量子計算機帶來了巨大的潛力。由於量子位元可以同時存在於多個狀態,這讓量子計算機擁有遠超傳統電腦的運算能力。利用疊加原理,量子計算機可以在短時間內處理大量數據,從而解決一些目前尚待解決的難題。<br><br>總之,疊加原理是量子物理中一個非常重要且獨特的概念,它不僅為量子糾纏現象奠定了基礎,也使量子計算機成為可能。了解疊加原理有助於深入理解量子世界的奇妙特性,並探索更多未知的量子現象。<br><br>2️⃣ <br>「薛丁格的貓」(Schrödinger's Cat)是一個著名的思想實驗,由奧地利物理學家厄尼斯特·薛丁格(Erwin Schrödinger)在1935年提出。這個實驗旨在展示量子力學與巨觀世界之間的矛盾,以及疊加原理在宏觀尺度下的荒謬性。<br><br>首先,讓我們回顧一下疊加原理。根據這個原則,微觀粒子可以同時存在於多個狀態,直到它們被測量時才「決定」其特定狀態。然而,當我們將這個概念放大至巨觀尺度時,疊加原理似乎變得不合理。<br><br>在薛丁格的貓這個假想實驗中,有一個密閉容器內裝有一隻活貓、一盒有毒氣體、以及一個放射性原子。如果放射性原子衰變,毒氣便會釋放出來,導致貓死亡;反之,若原子不衰變,則貓保持活著。按照古典物理的觀點,在任何給定時刻,貓只能處在活或死的單一狀態。<br><br>然而,從量子力學的角度來看,事情就變得有趣了。假設放射性原子處於疊加態,既可能衰變又可能不衰變。在這種情況下,根據疊加原理,貓也應該處於一個疊加態,也就是同時活又死。這個結果顯然違背常理,因為在我們日常生活中,物體總是處於明確的狀態。<br><br>這個實驗進一步強調了量子世界與巨觀世界之間的差異,以及疊加原理在巨觀尺度下的荒謬性。時至今日,「薛丁格的貓」仍然是一個熱門的話題,經常用於討論量子力學的奇特性質及其與巨觀世界的關係。 | </details> <details><summary><b>Open Question</b></summary> | Query | 生命的意義是什麼? 請用哲學角度探討,盡可能引經據典 | |---|---| | **Breexe** | 生命的意義是一個亙古以來的人類存在與生存問題。在哲學領域中,不同學派和思想家對於生命的意義有各自獨特且深刻的見解。以下將引用幾位重要哲學家的觀點,以提供您對這個議題的理解:<br>1. 蘇格拉底(Socrates):蘇格拉底主張「知識就是力量」,他認為人應該追求智慧、道德和美德。生命的意義在於追求真理,通過自我反省和思辨來提升心靈,並了解自己的角色和責任。他強調要認識自己,才能找到生命的意義。<br>2. 亞里斯多德(Aristotle):亞里斯多德提出了「目的論」(teleology),他相信每個事物都有其自然目的或最終目標。生命的意義就在於實現我們的潛能,完成我們的使命。他提到人類具有理性能力,因此生命的意義是在實踐德行(virtue),追求幸福(eudaimonia),即一種圓滿而美好的生活。<br>3. 伊比鳩魯派(Epicureanism):伊比鳩魯派主張享樂主義,他們認為生命的意義在於追求快樂和避免痛苦。在這個理論下,生命的意義是享受當下,珍惜有限的時間,以及建立良好的人際關係。然而,這並不意味著放縱或自私行為;相反地,它鼓勵明智地選擇那些能帶來真正快樂的事物。<br>4. 斯多噶學派(Stoicism):斯多噶學派強調控制情感和意志力,以克服人生中的困難和逆境。生命的意義在於接受命運,保持內心平靜,並努力過上自律的生活。他們相信,只要我們能夠掌握自己的心態,就能夠應對任何困境。<br>5. 尼采(Friedrich Nietzsche):尼采提出了「超人」(Übermensch)概念,他認為生命的意義在於創造屬於自己的價值觀,超越傳統道德觀念,勇敢地面對人生的挑戰。他呼籲人們要活出自我,追求卓越,並成為自己的主宰。<br>6. 沙特(Jean-Paul Sartre):沙特是存在主義者,他認為生命的意義取決於個人的抉擇。他主張自由意志,強調每個人都可以決定自己的道路。生命的意義就在於為自己的行動負責,並在過程中塑造出自己的存在。<br>7. 
馬丁·布伯(Martin Buber):布伯提出了「我─你」(Ich-Du)的概念,他認為生命的意義在於與他人的相遇和交流。他強調人與人之間的相互尊重和愛,認為只有通過與他人的交往,我們才能充分感受到生命的意義。<br>以上只是部分哲學家的觀點,但希望這些引文有助於您進一步思考生命的意義。每個人都有自己的答案,而在探索這個問題時,重要的是要保持開放的心態,尊重他人的看法,並勇於追求自己的信仰。 | </details> <details><summary><b>Email Writing</b></summary> | Query | 請擬定一封專業的郵件,尋求主管對你準備的「季度財務報告」提供意見。特別詢問有關資料分析、呈現風格,以及所提取結論的清晰度。郵件請簡潔扼要。 | |---|---| | **Breexe** | 尊敬的主管:<br/><br/>我是您的下屬 XXXX,我正在籌備本年度第三季的財務報告。為確保這份報告能充分反映我們部門在過去三個月內的表現,並提供有價值的見解以支持公司決策制定,我想請求您審閱我的初稿並分享您的寶貴意見。<br/><br/>以下是我希望您在審閱時關注的幾個方面:<br/>1.數據分析:確認我所使用的數據是否正確且來源可靠,以及我在報告中對這些數據進行了哪些分析方法。如果您認為需要添加或修改任何分析方式,也懇請不吝賜教。<br/>2.呈現風格:我希望這份報告既能展示清晰的數字信息,又能保持視覺吸引力。因此,我非常期待您的反饋,以便我能夠進一步完善圖表和圖形等元素,使其更容易理解和消化。<br/>3.結論清晰度:最後,我希望報告中的結論能夠清晰地傳達我們部門在本季度的表現,以及可能影響未來業績的因素。在此基礎上,我希望您能指導我如何更好地表達這些結論,以便使報告更具說服力。<br/>懇請您在方便時審閱這份報告,並將您的意見與建議發送至我的電子郵箱 ([email protected])。您的指導將對我的工作產生巨大幫助,再次感謝您的支持!<br/><br/>敬祝<br/>商祺<br><br>XXXX(您的名字)| </details> <details><summary><b>Table Understanding</b></summary> | Query | 這是一個表格,其中第一行是標題,後續的每一行都是一隻企鵝:<br><br>姓名,年齡,身高(cm),體重(kg)<br>路易斯, 7, 50, 11<br>伯納德,5, 80, 13<br>文森特, 9, 60, 11<br>格溫,8,70,15<br><br>請問最高的企鵝是哪隻? | |---|---| | **Breexe** | 在提供的表格中,最高的企鵝是伯納德。他的身高為 80 公分。以下是各企鵝的詳細資料:<br><br>姓名, 年齡, 身高(cm), 體重(kg)<br>路易斯, 7, 50, 11<br>伯納德, 5, 80, 13 (最高)<br>文森特, 9, 60, 11<br>格溫, 8, 70, 15 | </details> <details><summary><b>Language Assistance</b></summary> | Query | 1️⃣<br>修正以下英語句子中的錯誤<br>On the other words, making friends with smarter people can help people make money. <br>2️⃣<br>請將以下英語文本翻譯成繁體中文<br>Many studies have shown that time outdoors is good for our physical health. Three wilderness education experts explain why periods alone in nature also provide mental and spiritual benefits. | |---|---| | **Breexe** | 1️⃣<br>修正以下英語句子中的錯誤:<br>在另一方面,與更聰明的人交朋友可以幫助人賺錢。<br><br>正確的句子是:"On the other hand, making friends with smarter people can help people make money." <br><br>2️⃣<br>許多研究顯示,待在戶外對身體健康有好處。三位荒野教育專家解釋了為什麼單獨在大自然中待一段時間也能提供心理和靈性上的益處。<br><br>正確的繁體中文版本是:「很多研究表明,在戶外的時間對我們的身體健康是有益的。三位野地教育專家解釋了為什麼在大自然中獨處也會帶來心理和精神上的利益。」| </details> <details><summary><b>Summary</b></summary> | Query | """<br>聯發科技集團旗下的人工智慧研究單位聯發創新基地,昨 (11/23) 日與臺北市政府資訊局和國立臺北科技大學簽署合作備忘錄,將其自主研發的中文大型語言模型授權予臺北市政府資訊局,並由北科大協助部署應用。透過生成式人工智慧工具的導入,為臺北市政府同仁建構兼具資訊安全、高生產力的智慧工作模式。這也是業界、學界和政府機關三方攜手推動中文生成式人工智慧發展的重要里程碑。<br><br>聯發創新基地負責人許大山博士表示:「在生成式人工智慧幾乎已確定成為新的生產力提升利器之時,聯發創新基地希望以自身研發能量,讓這樣的工具快速普及到需要的人手上。」<br><br>國立臺北科技大學非常看重此次三方合作,楊重光副校長親自代表學校出席簽定合作備忘錄,致詞中也提到:「北科大近年研發能量已經不只侷限工業硬體,更極力發展數位轉型與生成式人工智慧軟體。此次以學術界角色,參與臺北市政府與聯發創新基地合作,可提供研究能量協助進行提詞優化、辦公室自動化程度提升、公共服務改善、智慧城市多元應用,由學術創新使生成式人工智慧發展可以超越業界期待,並期許多起合作案例能帶動更多數位轉型的成功發展。<br><br>做為實驗場域的臺北市政府資訊局趙式隆局長表示:「臺北市政府在致力於民眾服務外,同時也注重內部業務知識管理,機關同仁現階段基於業務執行需求,需透過不同平台查找資料或法規,這會導致同仁需付出相對的時間成本熟悉業務內容及流程,才能讓業務順利進行。我們希望透過生成式人工智慧工具的輔助,讓同仁以自然語言方式詢問即可查詢相關規定或函釋,減輕同仁負擔。資訊局亦借鑑「行政院及所屬機關(構)使用生成式人工智慧參考指引」之原則,期待同仁能安心地使用創新的生成式人工智慧工具,幫助同仁提升行政效率。」<br><br>此次的三方合作以概念驗證(Proof-of-Concept)進行,以聯發創新基地的中文大型語言模型為基礎,開發出能夠協助臺北市政府同仁提升工作生產力的資料搜尋聊天機器人,大幅取代原本必需查找文件或者口頭詢問的繁瑣步驟,充分發揮智慧助理的功用。該模型以地端方式授權,資料留存於市府,維護市民資訊保密權益。此外,臺北市政府可提供機密資訊以外的使用回饋予聯發創新基地,做為後續調校改進之依據,以實際使用經驗促進未來語言模型發展。<br>"""<br><br>請用三點來摘要以上內容 | |---|---| | **Breexe** | 1. 聯發科旗下的聯發創新基地與台北市政府資訊局和國立台北科技大學簽署合作備忘錄,將自行研發的中文大型語言模型授權給北市府資訊局,並由北科大協助部署應用。<br>2. 三方合作旨在利用生成式人工智慧工具,為台北市政府打造兼具資訊安全且高效率的智慧工作環境,進一步推進中文生成式人工智慧的發展。<br>3. 
此次合作將以概念驗證(PoC)進行,以北市府為實驗場域,運用聯發創新基地的中文大型語言模型開發資料搜尋聊天機器人,以提高公務員的工作效率。 | </details> <details><summary><b>Roleplay</b></summary> | Query | 1️⃣<br>現在你是機器學習工程師了。你的工作是解釋複雜的機器學習概念,用簡化方式讓沒有技術背景的客戶理解並信任你的產品。讓我們從這個問題開始:「什麼是語言模型?它是使用標註的資料還是未標註的資料來訓練?」<br>2️⃣<br>這是真的嗎?我聽說其他一些公司有不同的作法來執行這項工作而且可以做得更安全。 | |---|---| | **Breexe** | 1️⃣ <br>首先,讓我們簡單地了解一下「語言模型」(Language Model)。它在自然語言處理(NLP)中扮演著重要角色,主要用於預測和生成文本。換句話說,語言模型可以根據給定的上下文,推斷出可能的詞彙或句子。<br><br>想像你正在閱讀一本書,而書中的每個字都代表一個「單詞」。語言模型就像是一位聰明的小助手,他能分析整本書籍,找出哪些單詞通常會一起出現、哪些不太可能連在一起。有了這些知識後,當小助手遇到新的文本時,他就能猜測接下來的單詞,或者自動完成你想要說的話。<br><br>現在,談到標註與未標註的資料。在機器學習領域,有兩種主要類型的資料:標註資料(Labeled Data)和未標註資料(Unlabeled Data)。標註資料是指那些已標記類別或結果的樣本,例如將圖片分類為貓或狗;而未標註資料則指那些尚未被標記的樣本。<br><br>回到語言模型。在早期,大多數語言模型都是基於未標註的資料進行訓練。研究人員收集大量的文本,然後利用統計方法計算單詞之間的關聯性。然而,隨著深度學習的崛起,現代語言模型越來越多地採用標註資料進行訓練。在這種情況下,標註資料包括具有正確答案的文本配對,如問句與回答、文章段落與摘要等。<br><br>在訓練過程中,機器學習演算法將不斷調整參數,以提高對新文本的預測能力。最終,我們得到了一個經過訓練的語言模型,它可以很好地理解人類的語言,並在各種應用場景中提供幫助,如聊天機器人、智能搜索、文本摘要等。<br><br>希望以上的解釋能讓您更容易理解語言模型以及其如何運用標註和未標註的資料。如果您有其他疑問,請隨時提出!<br><br>2️⃣ <br>是的,您的疑慮是有道理的。在人工智慧 (AI) 和機器學習 (ML) 領域,不同公司確實可能採取不同的策略和方法來建立語言模型。以下是我們進一步探討的主要原因:<br><br>1. 安全性和隱私:確保用戶數據安全和保護個人信息是各公司關注的重點。因此,有些公司可能傾向於使用未標註的公開數據集,而不是收集和使用標註的私人數據。這有助於減少潛在的隱私風險,同時保持高水平的安全性。<br>2. 成本效益:標註資料需要人力資源投入,以便將原始文本標記為特定類別或結果。這增加了成本,特別是在涉及到大量數據的情況下。相比之下,未標註的資料不需要額外的人工標註,從而降低了成本。<br>3. 性能差異:在某些情況下,未標註的資料可能比標註的資料更具多樣性和涵蓋範圍。通過充分利用未標註的資料,企業可以獲得更多的信息,從而提升模型性能。然而,標註的資料通常包含明確的目標,使模型能夠更好地理解所需的任務。因此,兩者各有利弊。<br>4. 模型設計:不同公司可能擁有獨特的演算法和模型設計,使其在處理標註或未標註的資料方面有所不同。例如,一種模型可能非常擅長處理標註的資料,另一種模型則可能在未標註的資料上表現得更好。<br>5. 業務需求:最後,不同公司可能因其業務需求而在語言模型的開發和部署方面存在差異。例如,一家專注於社交媒體平台的公司可能需要一個高度個性化的聊天機器人,而另一家公司可能需要一個通用的文本摘要工具。在這種情況下,它們可能需要不同類型的語言模型。<br>總之,不同公司在語言模型的培訓和部署方面可能存在差異,這取決於他們的安全考慮、成本控制、性能要求、模型設計和業務需求。無論哪種做法,最重要的是確保所使用的方法符合行業標準和法律法規,並且在保護用戶數據和隱私的前提下提供良好的服務。| </details> <details><summary><b>Extraction</b></summary> | Query | """<br>〔記者許國楨/台中報導〕有金融背景外號「李董」的李示水,涉嫌自11年前開始成立投資集團,非法經營外匯、期貨,還以保證獲利為由向投資人吸金超過249億元,案經台中檢警調聯手偵辦,分兩波行動將李男及成員共47人拘提到案,查扣李名下93筆房地產,以及包括賓利、藍寶堅尼等5輛豪車,多金程度令人咋舌。<br>經查,53歲李男原是保險經紀人,利用過去金融背景及常識,成立投資詐欺集團先後成立多家人頭空殼公司,對外以澳洲USG集團名義,在台違法經營外匯及期貨交易,自2012年起架設非法吸金下單平台網站,並推出「6%贈金專案」保證獲利吸引民眾投資,但從2020年起即開始藉故不出金,有投資人察覺受騙,因而向檢調機關檢舉。<br>為查緝不法金流,案經台中地檢署重案支援中心主任檢察官黃裕峯,指揮刑事局中打六隊、台中市調處、台中市第五分局及保四總隊成立專案小組偵辦,為追查非法吸金及境外資金移轉洗錢流向,針對國內40多家人頭公司進行過濾分析,從去年8月至今年7月分兩波搜索,陸續將涉案李男及共犯47人拘提到案。<br>為追徵不法吸金犯罪所得,同時查扣公告現價超過1.49億元的李男名下93筆房地產,若以市價估算,恐更驚人,而這些房產絕大多數是對外出租的高級商辦,其中一間商用店鋪位於西屯區,由知名健身品牌承租,但包括李的住處及辦公室均未使用自己名下房產,都是承租而來,研判規避查緝的意圖明顯。<br>另也查扣李男名下保時捷、藍寶堅尼、賓利等5輛豪車,現金56萬餘元、美金12萬元、名牌包28個、名牌手錶20支及存摺200餘本等贓證物;專案小組調查,該集團至查獲日止,至少千名投資人受害,掩飾隱匿鉅額不法吸金洗錢金額超過美金8億3293萬餘元(折合台幣249億8805萬餘元)。<br>而全案經移送台中地檢署後,檢方近日偵結,將李等涉案47人依違反銀行法等罪起訴,刑事局中打呼籲民眾,任何投資應透過國內合法證券期貨商進行下單匯款,勿輕信「保證獲利」投資方案,極可能都是詐騙,最後血本無歸,如有發現「保證獲利」投資專案招募或廣告,請立即向警察機關檢舉或撥打165反詐騙諮詢專線即時查證。<br>而檢警將持續共同全力守護人民財產,貫徹打擊不法金流專案,維持社會安定及國家經濟穩定,展現政府打擊非法吸金及詐欺犯罪決心。<br>"""<br>請根據以上新聞內容,抓取出可能罪犯,及其可能犯的罪狀。由JSON格式(List of Dict)輸出,每個Dict包含兩個key: name 和 crimes | |---|---| | **Breexe** | [<br>{<br>"name": "李示水",<br>"crimes": ["非法經營外匯、期貨", "非法吸金", "詐欺"]<br>},<br>{<br>"name": "共犯47人",<br>"crimes": ["違反銀行法", "詐欺"]<br>}<br>] | </details> <details><summary><b>Coding</b></summary> | Query | 實作一個函數以 O(1) 的空間複雜度和 O(n) 的時間複雜度,找到兩個不同大小的已排序陣列的中位數。 | |---|---| | **Breexe** | 首先,我們需要了解如何在 O(1) 的空間複雜度下計算出兩個已排序陣列的中位數。這裡提供了一個演算法:<br><br>假設有兩個已排序陣列 A 和 B,其長度分別為 m 和 n (m <= n)。我們將利用以下步驟來計算中位數:<br>1.找出兩個陣列的中間元素索引 i = (m + n - 1)/2 和 j = m + n/2。注意,如果 n 是奇數,則取整後的 i 和 j 可能相同;如果 n 是偶數,i 和 j 分別表示兩個中位數的位置。<br>2.比較 A[i-1] 和 B[j-1]、A[i] 和 B[j]。<br>3.如果 A[i-1] > B[j-1],則中位數位於 B[0..j-1] 區間,因此可以忽略 A[i..m] 區間。反之,如果 A[i] < B[j],則可以忽略 B[j+1..n] 
區間。<br>4.重複上述步驟,直到剩下一個區間。這個區間包含兩個陣列的中位數。<br>5.最後,計算中位數。如果 n 是奇數,中位數就是該區間的中間元素;如果 n 是偶數,中位數就是該區間的前兩個元素之平均值。<br><br>以下是 Python 程式碼範例: <br>\`\`\`<br>def find_median(A, B):<br>  m, n = len(A), len(B)<br> <br>  if m > n:<br>   A, B, m, n = B, A, n, m # Ensure m <= n<br> <br>  imin, imax = 0, m<br>  while imin <= imax:<br>   i = (imin + imax) // 2<br>   j = (m + n + 1) // 2 - i<br> <br>   # If i is too small or too large, then adjust it accordingly<br>   if i < m and B[j-1] > A[i]:<br>    imin = i + 1<br>   elif i > 0 and A[i-1] > B[j]:<br>    imax = i - 1<br>   else:<br>    max_left_A = float('-inf') if i == 0 else A[i-1]<br>    min_right_A = float('inf') if i == m else A[i]<br> <br>    max_left_B = float('-inf') if j == 0 else B[j-1]<br>    min_right_B = float('inf') if j == n else B[j]<br> <br>    if (m + n) % 2 == 0:<br>     return (max(max_left_A, max_left_B) + min(min_right_A, min_right_B)) / 2<br>    else:<br>     return max(max_left_A, max_left_B)<br><br>  raise ValueError("No median found")<br>\`\`\`<br><br>以上演算法符合要求,在 O(1) 的空間複雜度下完成任務,並且在 O(n) 的時間複雜度內運算。| </details> <details><summary><b>Math</b></summary> | Query | 三角形的頂點位於點 (0, 0)、(-1, 1) 和 (3, 3)。三角形的面積是多少?<br><br>Think it step by step | |---|---| | **Breexe** | 首先,我們將利用三角形面積公式來計算這個問題。三角形面積公式為:<br><br>面積 = \|(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2\|<br><br>在這個問題中,三角形的頂點分別位於:<br>A(0, 0)、B(-1, 1)、C(3, 3)<br><br>接下來,我們將代入公式計算:<br><br>面積 = \|(0 * (1 - 3) + (-1) * (3 - 0) + 3 * (0 - 1)) / 2\|<br>面積 = \|(0 - 3 - 3) / 2\|<br>面積 = \|-6 / 2\|<br>面積 = 3<br><br>因此,三角形的面積為 3 平方單位。 | </details> <details><summary><b>Taiwan Knowledge</b></summary> | Query | 我剛來到台北工作,不是很了解這邊的捷運系統,請幫我介紹一下 | |---|---| | **Breexe** | 歡迎你來到台北!台北擁有便捷且現代化的大眾運輸系統,其中最主要的是台北捷運(Metro Taipei)。以下是一些基本資訊和路線介紹:<br>1. 台北捷運共有五種不同的路線:淡水信義線 (紅線)、松山新店線 (綠線)、中和新蘆線 (橘線)、文湖線 (棕線) 以及板南線 (藍線)。各線之間有轉乘站相連,方便旅客在城市間移動。<br>2. 淡水信義線:這條線路由北投延伸至象山,途經士林、圓山、雙連等熱門景點。它與中和新蘆線和松山新店線在中正紀念堂站及東門站轉乘;與板南線在台北車站轉乘。<br>3. 松山新店線:從松山延伸至新店,經過南京三民、松江南京、忠孝新生等熱鬧商業區。它與淡水信義線在中正紀念堂站轉乘;與中和新蘆線在古亭站轉乘;與板南線在西門站轉乘。<br>4. 中和新蘆線:此路線由迴龍延伸至蘆洲,途經新莊、板橋、行天宮等區域。它與淡水信義線在東門站轉乘;與松山新店線在古亭站轉乘;與板南線在忠孝復興站轉乘。<br>5. 文湖線:這是環狀線,主要服務內湖科技園區和南港軟體園區等地。它與松山新店線在南京復興站轉乘;與淡水信義線在大安站轉乘。<br>6. 板南線:這條線路由土城延伸至南港展覽館,途經永寧、新埔民生、市政府等站點。它與淡水信義線在台北車站轉乘;與松山新店線在西門站轉乘;與中和新蘆線在忠孝復興站轉乘。<br>7. 票價:單程票費用為每段次新台幣$20元,使用悠遊卡可享折扣優惠。此外,台北捷運提供多種定期票供通勤者選擇,如「常客優惠」、「定期票」等。<br>8. 營運時間:週一至週四,五、六日及國定假日略有不同。一般來說,首班車約早上6點發車,末班車則約晚上12點左右。詳細時刻表可參考台北捷運官網。<br>9. 注意事項:進站時需刷卡或購票,出站時再刷一次卡片。禁止飲食、吸菸、攜帶危險物品等。<br>希望以上資訊能幫助你在台北順利地利用捷運系統。祝你旅途愉快! | </details> ## Features - Expanding the vocabulary dictionary size from 32k to 62k to better support Traditional Chinese - 8k-token context length - Multi-turn dialogue (without special handling for harmfulness) - Sparse mixture of experts (MoE) ## Inference Performance In this test, we use the first 700 characters of the [web article](https://health.udn.com/health/story/5976/7699252?from=udn_ch1005_main_index) as the input and ask the model to write the same article again. All inferences run on 4 RTX A6000 GPUs (using `vllm`, with a tensor-parallel size of 4). 
| Models | ↓ Inference Time (sec)|Estimated Max Input Length (Char)| |--------------------------------------------------------------------|-------------------|--------------------------| | **Breexe-8x7B-Instruct-v0.1** | 27.83 | 11.1k | | Mixtral-8x7B-Instruct-v0.1 | 59.49 | 5.1k | ## Chat Model Performance **TMMLU+**, **Table**, and **MT-Bench-tw** are sourced from [MediaTek-Research/TCEval-v2](https://huggingface.co/datasets/MediaTek-Research/TCEval-v2), which derives from [TCEval-v1](https://github.com/mtkresearch/MR-Models/tree/main/TC-Eval) and [ikala/tmmluplus](https://huggingface.co/datasets/ikala/tmmluplus). **MMLU** is sourced from [hails/mmlu_no_train](https://huggingface.co/datasets/hails/mmlu_no_train). **MT-Bench** is sourced from [lmsys/mt_bench_human_judgments](https://huggingface.co/datasets/lmsys/mt_bench_human_judgments). We use [the code](https://github.com/mtkresearch/TCEval) revised from [EleutherAI/lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) to evaluate **TMMLU+**, **Table**, and **MMLU**. All multiple-choice problems are evaluated by selecting the option with the highest log-likelihood. We use [the code](https://github.com/mtkresearch/TCEval) revised from [fastchat llm_judge](https://github.com/lm-sys/FastChat/tree/main/fastchat/llm_judge) (GPT-4 as judge) to evaluate **MT-Bench-tw** and **MT-Bench**. | Models | |↑ MT-Bench-tw (Score)| TMMLU+ (ACC)|TTQA (ACC) | Table (ACC)| MT-Bench (Score)| MMLU (ACC) | |---------------------------------------------------------------------------------------------------------|--------|--------------------|--------------|-------------|-------------|------------------|-------------| | | |TC, Chat |TC, Knowledge |TC, Knowledge|TC, Reasoning|EN, Chat |EN, Knowledge| | | |0 shot | 0 shot |0 shot | 0 shot |0 shot | 0 shot | | [**Breexe-8x7B-Instruct-v0_1**](https://huggingface.co/MediaTek-Research/Breexe-8x7B-Instruct-v0_1) | 47B |7.2 | 48.92 | 75.22 | 39.58 | 7.8 | 69.90 | | [gpt-3.5-turbo-1106](https://openai.com) | |7.1 | 43.56 | 68.14 | 45.14 |7.9 | 67.09 | | [Qwen1.5-14B-Chat](https://huggingface.co/Qwen/Qwen1.5-14B-Chat) | 14B |7.1 | 51.76 | 70.79 | 51.39 |7.8 | 66.65 | | [Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat) | 34B |6.9 | 54.87 | 81.42 | 36.81 |7.6 | 71.04 | | [Qwen1.5-7B-Chat](https://huggingface.co/Qwen/Qwen1.5-7B-Chat) | 7B |6.4 | 44.65 | 67.86 | 34.72 |7.6 | 59.54 | | [Breeze-7B-Instruct-v1_0](https://huggingface.co/MediaTek-Research/Breeze-7B-Instruct-v1_0) | 7B |6.0 | 42.67 | 77.00 | 39.58 |7.4 | 61.73 | | [Yi-6B-Chat](https://huggingface.co/01-ai/Yi-6B-Chat) | 6B |5.0 | 44.79 | 72.57 | 25.69 |6.0 | 59.45 | | [Taiwan-LLM-13B-v2.0-chat](https://huggingface.co/yentinglin/Taiwan-LLM-13B-v2.0-chat) | 13B |5.0 | 29.47 | 67.26 | 23.61 |N/A* | 50.50 | | [Taiwan-LLM-7B-v2.1-chat](https://huggingface.co/yentinglin/Taiwan-LLM-7B-v2.1-chat) | 7B |4.2 | 28.08 | 51.33 | 31.25 |N/A* | 42.72 | \* Taiwan-LLM models respond to multi-turn questions (English) in Traditional Chinese, which is why their MT-Bench scores are marked N/A. ## Base Model Performance **TMMLU+** and **Table** are sourced from [MediaTek-Research/TCEval-v2](https://huggingface.co/datasets/MediaTek-Research/TCEval-v2), which derives from [TCEval-v1](https://github.com/mtkresearch/MR-Models/tree/main/TC-Eval) and [ikala/tmmluplus](https://huggingface.co/datasets/ikala/tmmluplus). **MMLU** is sourced from [hails/mmlu_no_train](https://huggingface.co/datasets/hails/mmlu_no_train). 
We use [the code](https://github.com/mtkresearch/TCEval) revised from [EleutherAI/lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) to evaluate **TMMLU+**, **Table**, and **MMLU**. All multiple-choice problems are evaluated by selecting the option with the highest log-likelihood. | Models | |↑ TMMLU+ (ACC)| TTQA (ACC) | Table (ACC) | MMLU (ACC) | |-------------------------------------------------------------------------------------|------|--------------|-------------|-------------|-------------| | | |TC, Knowledge |TC, Knowledge|TC, Reasoning|EN, Knowledge| | | | 5 shot |5 shot | 5 shot | 5 shot | | [Yi-34B](https://huggingface.co/01-ai/Yi-34B) | 34B | 63.10 | 87.61 | 49.31 | 77.42 | | [Qwen1.5-14B](https://huggingface.co/Qwen/Qwen1.5-14B) | 14B | 54.30 | 78.76 | 54.86 | 70.17 | | **Breexe-8x7B-Base-v0_1** | 47B | 50.20 | 79.65 | 39.58 | 70.79 | | [Yi-6B](https://huggingface.co/01-ai/Yi-6B) | 6B | 49.63 | 75.22 | 34.72 | 65.35 | | [Qwen1.5-7B](https://huggingface.co/Qwen/Qwen1.5-7B) | 7B | 46.51 | 69.03 | 33.33 | 63.14 | | [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | 47B | 46.10 | 64.60 | 47.22 | 72.94 | | [Breeze-7B-Base-v1_0](https://huggingface.co/MediaTek-Research/Breeze-7B-Base-v1_0) | 7B | 42.67 | 75.22 | 31.99 | 61.24 | | [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) | 7B | 36.93 | 53.10 | 27.78 | 64.89 | ## Use in Transformers First, install the direct dependencies: ``` pip install transformers torch accelerate ``` If you want faster inference using flash-attention2, you need to install these dependencies: ```bash pip install packaging ninja pip install flash-attn ``` Then load the model in transformers: ```python from transformers import AutoModelForCausalLM, AutoTokenizer import torch model = AutoModelForCausalLM.from_pretrained( "MediaTek-Research/Breexe-8x7B-Instruct-v0_1", device_map="auto", torch_dtype=torch.bfloat16, attn_implementation="flash_attention_2" # optional ) ``` The structure of the query is ```txt <s> SYS_PROMPT [INST] QUERY1 [/INST] RESPONSE1 [INST] QUERY2 [/INST] ``` where `SYS_PROMPT`, `QUERY1`, `RESPONSE1`, and `QUERY2` can be provided by the user. The suggested default `SYS_PROMPT` is ```txt You are a helpful AI assistant built by MediaTek Research. The user you are helping speaks Traditional Chinese and comes from Taiwan. ``` We also integrate `chat_template` into [tokenizer_config.json](tokenizer_config.json), so you can use `apply_chat_template` to get the prompt (a short end-to-end generation sketch follows the citation below). ```python >>> from transformers import AutoTokenizer >>> tokenizer = AutoTokenizer.from_pretrained("MediaTek-Research/Breexe-8x7B-Instruct-v0_1") >>> chat = [ ... {"role": "user", "content": "你好,請問你可以完成什麼任務?"}, ... {"role": "assistant", "content": "你好,我可以幫助您解決各種問題、提供資訊和協助您完成許多不同的任務。例如:回答技術問題、提供建議、翻譯文字、尋找資料或協助您安排行程等。請告訴我如何能幫助您。"}, ... {"role": "user", "content": "太棒了!"}, ... ] >>> tokenizer.apply_chat_template(chat, tokenize=False) "<s>You are a helpful AI assistant built by MediaTek Research. The user you are helping speaks Traditional Chinese and comes from Taiwan. [INST] 你好,請問你可以完成什麼任務? [/INST] 你好,我可以幫助您解決各種問題、提供資訊和協助您完成許多不同的任務。例如:回答技術問題、提供建議、翻譯文字、尋找資料或協助您安排行程等。請告訴我如何能幫助您。 [INST] 太棒了! 
[/INST] " # Tokenized results # ['▁', '你好', ',', '請問', '你', '可以', '完成', '什麼', '任務', '?'] # ['▁', '你好', ',', '我', '可以', '幫助', '您', '解決', '各種', '問題', '、', '提供', '資訊', '和', '協助', '您', '完成', '許多', '不同', '的', '任務', '。', '例如', ':', '回答', '技術', '問題', '、', '提供', '建議', '、', '翻譯', '文字', '、', '尋找', '資料', '或', '協助', '您', '安排', '行程', '等', '。', '請', '告訴', '我', '如何', '能', '幫助', '您', '。'] # ['▁', '太', '棒', '了', '!'] ``` ## Citation ``` @article{breexe8x7b2024, title={}, author={}, journal={arXiv}, year={2024} } ```
[ "SUMMARIZATION" ]
[ "BEAR" ]
Non_BioNLP
RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf
RichardErkhov
null
[ "gguf", "arxiv:2101.00027", "arxiv:2201.07311", "endpoints_compatible", "region:us" ]
1,730
1,730
73
0
--- {} --- Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) pythia-2.8b-v0 - GGUF - Model creator: https://huggingface.co/EleutherAI/ - Original model: https://huggingface.co/EleutherAI/pythia-2.8b-v0/ | Name | Quant method | Size | | ---- | ---- | ---- | | [pythia-2.8b-v0.Q2_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q2_K.gguf) | Q2_K | 1.01GB | | [pythia-2.8b-v0.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q3_K_S.gguf) | Q3_K_S | 1.16GB | | [pythia-2.8b-v0.Q3_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q3_K.gguf) | Q3_K | 1.38GB | | [pythia-2.8b-v0.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q3_K_M.gguf) | Q3_K_M | 1.38GB | | [pythia-2.8b-v0.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q3_K_L.gguf) | Q3_K_L | 1.49GB | | [pythia-2.8b-v0.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.IQ4_XS.gguf) | IQ4_XS | 1.43GB | | [pythia-2.8b-v0.Q4_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q4_0.gguf) | Q4_0 | 1.49GB | | [pythia-2.8b-v0.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.IQ4_NL.gguf) | IQ4_NL | 1.5GB | | [pythia-2.8b-v0.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q4_K_S.gguf) | Q4_K_S | 1.5GB | | [pythia-2.8b-v0.Q4_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q4_K.gguf) | Q4_K | 1.66GB | | [pythia-2.8b-v0.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q4_K_M.gguf) | Q4_K_M | 1.66GB | | [pythia-2.8b-v0.Q4_1.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q4_1.gguf) | Q4_1 | 1.64GB | | [pythia-2.8b-v0.Q5_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q5_0.gguf) | Q5_0 | 1.8GB | | [pythia-2.8b-v0.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q5_K_S.gguf) | Q5_K_S | 1.8GB | | [pythia-2.8b-v0.Q5_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q5_K.gguf) | Q5_K | 1.93GB | | [pythia-2.8b-v0.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q5_K_M.gguf) | Q5_K_M | 1.93GB | | [pythia-2.8b-v0.Q5_1.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q5_1.gguf) | Q5_1 | 1.95GB | | [pythia-2.8b-v0.Q6_K.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q6_K.gguf) | Q6_K | 2.13GB | | [pythia-2.8b-v0.Q8_0.gguf](https://huggingface.co/RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf/blob/main/pythia-2.8b-v0.Q8_0.gguf) | Q8_0 | 2.75GB | Original model description: --- language: - en tags: - pytorch - causal-lm - pythia - pythia_v0 license: apache-2.0 datasets: - the_pile --- The *Pythia Scaling Suite* is a collection of models 
developed to facilitate interpretability research. It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. All Pythia models are available [on Hugging Face](https://huggingface.co/models?other=pythia).

The Pythia model suite was deliberately designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites.

Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts.

## Pythia-2.8B

### Model Details

- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn) and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).

<figure>

| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |

<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. “Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption>
</figure>

### Uses and Limitations

#### Intended Use

The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. To enable the study of how language models change over the course of training, we provide 143 evenly spaced intermediate checkpoints per model. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model.
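As a quick illustration (not part of the original card), the checkpoint branches can be enumerated with the `huggingface_hub` client; this is a minimal sketch that assumes `list_repo_refs` is available in your installed version:

```python
from huggingface_hub import list_repo_refs

# Checkpoint branches are named "step0", "step1000", ..., "step143000".
refs = list_repo_refs("EleutherAI/pythia-2.8b-v0")
steps = sorted(
    int(branch.name.removeprefix("step"))
    for branch in refs.branches
    if branch.name.startswith("step")
)
print(f"{len(steps)} checkpoint branches, from step{steps[0]} to step{steps[-1]}")
```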
You may also further fine-tune and adapt Pythia-2.8B for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-2.8B as a basis for your fine-tuned model, please conduct your own risk and bias assessment.

#### Out-of-scope use

The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions.

Pythia models are English-language only, and are not suitable for translation or generating text in other languages.

Pythia-2.8B has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose or commercial chatbots. This means Pythia-2.8B will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “understand” human instructions.

#### Limitations and biases

The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text. Never rely on Pythia-2.8B to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-2.8B may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting them to other people. Please inform your audience that the text was generated by Pythia-2.8B.

### Quickstart

Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# Load the weights saved on the `step3000` checkpoint branch.
model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

# The tokenizer is the same at every checkpoint; pinning the revision keeps the cache consistent.
tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```

Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia).

### Training

#### Training data

[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications.
Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-2.8B.

#### Training procedure

All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.

All *Pythia* models trained for the equivalent of 143000 steps at a batch size of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models listed with a batch size of 4M tokens were originally trained for 71500 steps instead, with checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for consistency with all 2M batch models, so `step1000` is the first checkpoint for `pythia-1.4b` that was saved (corresponding to step 500 in training), and `step1000` is likewise the first `pythia-6.9b` checkpoint that was saved (corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).

### Evaluations

All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM.

<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>

<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>

<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>

<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>

<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>

### Naming convention and parameter count

*Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">

| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure>
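To run one of the GGUF quantizations listed in the table at the top of this card, a minimal sketch with `llama-cpp-python` (assumed installed; any filename from the table works) might look like this:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one of the quantized files from the quantization table above.
model_path = hf_hub_download(
    repo_id="RichardErkhov/EleutherAI_-_pythia-2.8b-v0-gguf",
    filename="pythia-2.8b-v0.Q4_K_M.gguf",
)

# Pythia is a base model, so use plain text completion rather than a chat template.
llm = Llama(model_path=model_path, n_ctx=2048)
output = llm("Hello, I am", max_tokens=32)
print(output["choices"][0]["text"])
```

Smaller quantizations such as Q2_K trade output quality for memory; Q4_K_M is a common middle ground.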
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
avsolatorio/all-MiniLM-L6-v2-MEDI-MTEB-triplet-final
avsolatorio
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:1943715", "loss:MultipleNegativesRankingLoss", "en", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:sentence-transformers/all-MiniLM-L6-v2", "base_model:finetune:sentence-transformers/all-MiniLM-L6-v2", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,720
1,720
51
1
--- base_model: sentence-transformers/all-MiniLM-L6-v2 datasets: [] language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy - dot_accuracy - manhattan_accuracy - euclidean_accuracy - max_accuracy pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:1943715 - loss:MultipleNegativesRankingLoss widget: - source_sentence: who sang the song queen of my heart sentences: - Queen of My Heart Queen of My Heart "Queen of My Heart" is a song by Irish boy band Westlife. It was released on 8 November 2001 as the first single from their third studio album, "World of Our Own". It was released as a double A-side single with "When You're Looking Like That" in UK and Ireland. It debuted at number one on the UK Singles Chart, giving the band their ninth UK number one single in two and a half years, staying at the top of the chart for one week. It remains one of the band's most successful singles, becoming the - Stephanie Edwards (Grey's Anatomy) Stephanie Edwards (Grey's Anatomy) Stephanie Edwards, M.D. is a fictional character from the medical drama television series "Grey's Anatomy", which airs on the American Broadcasting Company (ABC) in the United States. The character was created by series producer Shonda Rhimes, and was portrayed by actress Jerrika Hinton from 2012 to 2017. Introduced as a surgical intern at the fictional Seattle Grace Mercy West Hospital, later renamed Grey Sloan Memorial Hospital, Stephanie works her way up to resident level with fellow intern and friend, Jo Wilson (Camilla Luddington). The character was described by Hinton as "innovative" who strives to be the - Heart of My Heart the 1926 song by Max, the Chief, and detect-o-tune operator Arrick. Heart of My Heart "The Gang that Sang Heart of My Heart" is a popular song. The music and lyrics were written by Ben Ryan (1892–1968) in 1926. It reminisces about being in a youthful quartet, singing "Heart of My Heart". The quoted line, "Heart of My Heart", so longed for in the 1926 song, begins the chorus of "The Story of the Rose", written by Andrew Mack (1863–1931) in 1899. Mack was a popular American actor, singer and comedian who reportedly first sang this song in an 1899 - source_sentence: when did gretsch stop making guitars in america sentences: - Get Low (Lil Jon & the East Side Boyz song) Get Low (Lil Jon & the East Side Boyz song) "Get Low" is a song by Lil Jon & the East Side Boyz, featuring Ying Yang Twins, released in 2003. It is featured on the 2002 album "Kings of Crunk". The song reached number two on the US "Billboard" Hot 100 behind "Baby Boy" by Beyoncé featuring Sean Paul and number 20 on the US Hot Digital Songs. It was number five on the top Hot R&B/Hip-Hop songs of 2003. It is also known as a breakthrough single for the crunk genre, as the song's success helped it become mainstream. - TV Jones guitarist Brian Setzer, whose guitar sound relied heavily on vintage Gretsch guitars. When the Gretsch Guitar Company was in the process of creating a Brian Setzer signature model, Brian conducted a “blind sound test” of various pickup models that were to be considered for use in these guitars. Tom's Hotrod pickup design was chosen because of its sound being the most faithful to the original. (At this point, the pickups Gretsch was using in their guitars were made of overseas parts and ceramic magnets). 
Word soon spread that TV Jones was making “true-to-the-original” Filter’tron pickups and many famous players demanded - Gretsch South Carolina, where it remains today. The first new guitar model introduced was the Traveling Wilburys model - an Asian import - which looked much like a Danelectro. While this guitar model did little to bolster Gretsch's reputation for producing classic guitars, it served notice that Gretsch was back. After numerous failed attempts to acquire facilities or contract production in the United States, Fred Gretsch and long-time Gretsch employee Duke Kramer, who advised Gretsch, turned to Terada of Japan, and production began there. A range of reissues appeared throughout the 1990s to mixed reviews. They were of generally high quality, - source_sentence: 'Examining playfulness in adults: Testing its correlates with personality, positive psychological functioning, goal aspirations, and multi-methodically assessed ingenuity' sentences: - Implementation of Evolutionary Algorithms for Deep Architectures - Chadwick Boseman Chadwick Boseman Chadwick Aaron Boseman (born November 29, 1976) is an American actor, director, and producer known for his portrayals of real-life historical figures such as Jackie Robinson in "42" (2013), James Brown in "Get on Up" (2014) and Thurgood Marshall in "Marshall" (2017) and for his portrayal of the superhero Black Panther in the Marvel Cinematic Universe films "" (2016), "Black Panther" (2018), "" (2018) and the upcoming "" (2019). Boseman has also had roles in the television series "Lincoln Heights" (2008) and "Persons Unknown" (2010) and the films "The Express" (2008), "Draft Day" (2014) and "Message from the - 'Assessment of Play and Leisure: Delineation of the Problem' - source_sentence: 1 in what part of italy was gelato first made sentences: - Domínguez Domínguez Domínguez is a name of Spanish origin. It used to mean "son of Domingo" (i.e., son of Dominic). The surname is usually written Dominguez in the Philippines and United States. Written as Domínguez in Spanish speaking countries like Spain, Mexico, Argentina, etc... As of 2014, 40.7% of all known bearers of the surname "Domínguez" were residents of Mexico (frequency 1:242), 12.8% of Spain (1:288), 8.5% of Argentina (1:396), 7.7% of the United States (1:3,721), 4.3% of Cuba (1:212), 3.2% of Colombia (1:1,186), 3.0% of Peru (1:831), 2.6% of Venezuela (1:904), 2.6% of Honduras (1:265), 2.4% of Paraguay (1:241), 2.0% - Frost Gelato to the taste of the ice cream they had in Italy concluding that the only way to get gelato at the time was to make another trip to Italy. Thus both owners searched for a way to make gelato in the United States eventually locating a company that imports ingredients directly from Italy, after spending days studying how to make gelato, the owners created their first batch and after sampling it felt the tastes they had come across in Italy. Both owners wanted to share the taste of gelato with their community and thus after a few months, Frost Gelato - Gelato any way that ice cream is, including cup, cone, sandwich, cake, pie, or on a stick. Gelato was invented by Buontalenti, in Florence (Tuscany), during the Renaissance period. The Buontalenti created the dessert for the Grand Duke Cosimo I de’ Medici, who wanted him to organize an opulent banquet to celebrate the Spanish deputation. It was October 5, 1600, and Buontalenti had worked for four months to prepare such a banquet. 
In Florence, most shops selling hand-made ice-cream also usually offer a "Buontalenti" flavour. In 1686, the Sicilian fisherman Francesco Procopio dei Coltelli perfected the first ice cream machine. However, - source_sentence: who does george nelson represent in o brother where art thou sentences: - O Brother, Where Art Thou? the film got together and performed the music from the film in a Down from the Mountain concert tour which was filmed for TV and DVD. This included Ralph Stanley, John Hartford, Alison Krauss, Emmylou Harris, Gillian Welch, Chris Sharp, and others. O Brother, Where Art Thou? O Brother, Where Art Thou? is a 2000 crime comedy film written, produced, and directed by Joel and Ethan Coen, and starring George Clooney, John Turturro, and Tim Blake Nelson, with John Goodman, Holly Hunter, and Charles Durning in supporting roles. The film is set in 1937 rural Mississippi during the Great Depression. - O Brother, Where Art Thou? omitted all instances of the words "damn" and "hell" from the Coens' script, which only became known to Clooney after the directors pointed this out to him during shooting. This was the fourth film of the brothers in which John Turturro has starred. Other actors in "O Brother, Where Art Thou?" who had worked previously with the Coens include John Goodman (three films), Holly Hunter (two), Michael Badalucco and Charles Durning (one film each). The Coens used digital color correction to give the film a sepia-tinted look. Joel stated this was because the actual set was "greener than Ireland". Cinematographer - 'Aligning Books and Movies: Towards Story-Like Visual Explanations by Watching Movies and Reading Books' model-index: - name: all-MiniLM-L6-v2 trained on MEDI-MTEB triplets results: - task: type: triplet name: Triplet dataset: name: medi mteb dev type: medi-mteb-dev metrics: - type: cosine_accuracy value: 0.9116536208878427 name: Cosine Accuracy - type: dot_accuracy value: 0.08101154961957414 name: Dot Accuracy - type: manhattan_accuracy value: 0.9119820460890032 name: Manhattan Accuracy - type: euclidean_accuracy value: 0.9114894082872625 name: Euclidean Accuracy - type: max_accuracy value: 0.9119820460890032 name: Max Accuracy --- # all-MiniLM-L6-v2 trained on MEDI-MTEB triplets This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) on the NQ, pubmed, specter_train_triples, S2ORC_citations_abstracts, fever, gooaq_pairs, codesearchnet, wikihow, WikiAnswers, eli5_question_answer, amazon-qa, medmcqa, zeroshot, TriviaQA_pairs, PAQ_pairs, stackexchange_duplicate_questions_title-body_title-body, trex, flickr30k_captions, hotpotqa, task671_ambigqa_text_generation, task061_ropes_answer_generation, task285_imdb_answer_generation, task905_hate_speech_offensive_classification, task566_circa_classification, task184_snli_entailment_to_neutral_text_modification, task280_stereoset_classification_stereotype_type, task1599_smcalflow_classification, task1384_deal_or_no_dialog_classification, task591_sciq_answer_generation, task823_peixian-rtgender_sentiment_analysis, task023_cosmosqa_question_generation, task900_freebase_qa_category_classification, task924_event2mind_word_generation, task152_tomqa_find_location_easy_noise, task1368_healthfact_sentence_generation, task1661_super_glue_classification, task1187_politifact_classification, task1728_web_nlg_data_to_text, task112_asset_simple_sentence_identification, 
task1340_msr_text_compression_compression, task072_abductivenli_answer_generation, task1504_hatexplain_answer_generation, task684_online_privacy_policy_text_information_type_generation, task1290_xsum_summarization, task075_squad1.1_answer_generation, task1587_scifact_classification, task384_socialiqa_question_classification, task1555_scitail_answer_generation, task1532_daily_dialog_emotion_classification, task239_tweetqa_answer_generation, task596_mocha_question_generation, task1411_dart_subject_identification, task1359_numer_sense_answer_generation, task329_gap_classification, task220_rocstories_title_classification, task316_crows-pairs_classification_stereotype, task495_semeval_headline_classification, task1168_brown_coarse_pos_tagging, task348_squad2.0_unanswerable_question_generation, task049_multirc_questions_needed_to_answer, task1534_daily_dialog_question_classification, task322_jigsaw_classification_threat, task295_semeval_2020_task4_commonsense_reasoning, task186_snli_contradiction_to_entailment_text_modification, task034_winogrande_question_modification_object, task160_replace_letter_in_a_sentence, task469_mrqa_answer_generation, task105_story_cloze-rocstories_sentence_generation, task649_race_blank_question_generation, task1536_daily_dialog_happiness_classification, task683_online_privacy_policy_text_purpose_answer_generation, task024_cosmosqa_answer_generation, task584_udeps_eng_fine_pos_tagging, task066_timetravel_binary_consistency_classification, task413_mickey_en_sentence_perturbation_generation, task182_duorc_question_generation, task028_drop_answer_generation, task1601_webquestions_answer_generation, task1295_adversarial_qa_question_answering, task201_mnli_neutral_classification, task038_qasc_combined_fact, task293_storycommonsense_emotion_text_generation, task572_recipe_nlg_text_generation, task517_emo_classify_emotion_of_dialogue, task382_hybridqa_answer_generation, task176_break_decompose_questions, task1291_multi_news_summarization, task155_count_nouns_verbs, task031_winogrande_question_generation_object, task279_stereoset_classification_stereotype, task1336_peixian_equity_evaluation_corpus_gender_classifier, task508_scruples_dilemmas_more_ethical_isidentifiable, task518_emo_different_dialogue_emotions, task077_splash_explanation_to_sql, task923_event2mind_classifier, task470_mrqa_question_generation, task638_multi_woz_classification, task1412_web_questions_question_answering, task847_pubmedqa_question_generation, task678_ollie_actual_relationship_answer_generation, task290_tellmewhy_question_answerability, task575_air_dialogue_classification, task189_snli_neutral_to_contradiction_text_modification, task026_drop_question_generation, task162_count_words_starting_with_letter, task079_conala_concat_strings, task610_conllpp_ner, task046_miscellaneous_question_typing, task197_mnli_domain_answer_generation, task1325_qa_zre_question_generation_on_subject_relation, task430_senteval_subject_count, task672_nummersense, task402_grailqa_paraphrase_generation, task904_hate_speech_offensive_classification, task192_hotpotqa_sentence_generation, task069_abductivenli_classification, task574_air_dialogue_sentence_generation, task187_snli_entailment_to_contradiction_text_modification, task749_glucose_reverse_cause_emotion_detection, task1552_scitail_question_generation, task750_aqua_multiple_choice_answering, task327_jigsaw_classification_toxic, task1502_hatexplain_classification, task328_jigsaw_classification_insult, task304_numeric_fused_head_resolution, 
task1293_kilt_tasks_hotpotqa_question_answering, task216_rocstories_correct_answer_generation, task1326_qa_zre_question_generation_from_answer, task1338_peixian_equity_evaluation_corpus_sentiment_classifier, task1729_personachat_generate_next, task1202_atomic_classification_xneed, task400_paws_paraphrase_classification, task502_scruples_anecdotes_whoiswrong_verification, task088_identify_typo_verification, task221_rocstories_two_choice_classification, task200_mnli_entailment_classification, task074_squad1.1_question_generation, task581_socialiqa_question_generation, task1186_nne_hrngo_classification, task898_freebase_qa_answer_generation, task1408_dart_similarity_classification, task168_strategyqa_question_decomposition, task1357_xlsum_summary_generation, task390_torque_text_span_selection, task165_mcscript_question_answering_commonsense, task1533_daily_dialog_formal_classification, task002_quoref_answer_generation, task1297_qasc_question_answering, task305_jeopardy_answer_generation_normal, task029_winogrande_full_object, task1327_qa_zre_answer_generation_from_question, task326_jigsaw_classification_obscene, task1542_every_ith_element_from_starting, task570_recipe_nlg_ner_generation, task1409_dart_text_generation, task401_numeric_fused_head_reference, task846_pubmedqa_classification, task1712_poki_classification, task344_hybridqa_answer_generation, task875_emotion_classification, task1214_atomic_classification_xwant, task106_scruples_ethical_judgment, task238_iirc_answer_from_passage_answer_generation, task1391_winogrande_easy_answer_generation, task195_sentiment140_classification, task163_count_words_ending_with_letter, task579_socialiqa_classification, task569_recipe_nlg_text_generation, task1602_webquestion_question_genreation, task747_glucose_cause_emotion_detection, task219_rocstories_title_answer_generation, task178_quartz_question_answering, task103_facts2story_long_text_generation, task301_record_question_generation, task1369_healthfact_sentence_generation, task515_senteval_odd_word_out, task496_semeval_answer_generation, task1658_billsum_summarization, task1204_atomic_classification_hinderedby, task1392_superglue_multirc_answer_verification, task306_jeopardy_answer_generation_double, task1286_openbookqa_question_answering, task159_check_frequency_of_words_in_sentence_pair, task151_tomqa_find_location_easy_clean, task323_jigsaw_classification_sexually_explicit, task037_qasc_generate_related_fact, task027_drop_answer_type_generation, task1596_event2mind_text_generation_2, task141_odd-man-out_classification_category, task194_duorc_answer_generation, task679_hope_edi_english_text_classification, task246_dream_question_generation, task1195_disflqa_disfluent_to_fluent_conversion, task065_timetravel_consistent_sentence_classification, task351_winomt_classification_gender_identifiability_anti, task580_socialiqa_answer_generation, task583_udeps_eng_coarse_pos_tagging, task202_mnli_contradiction_classification, task222_rocstories_two_chioce_slotting_classification, task498_scruples_anecdotes_whoiswrong_classification, task067_abductivenli_answer_generation, task616_cola_classification, task286_olid_offense_judgment, task188_snli_neutral_to_entailment_text_modification, task223_quartz_explanation_generation, task820_protoqa_answer_generation, task196_sentiment140_answer_generation, task1678_mathqa_answer_selection, task349_squad2.0_answerable_unanswerable_question_classification, task154_tomqa_find_location_hard_noise, task333_hateeval_classification_hate_en, 
task235_iirc_question_from_subtext_answer_generation, task1554_scitail_classification, task210_logic2text_structured_text_generation, task035_winogrande_question_modification_person, task230_iirc_passage_classification, task1356_xlsum_title_generation, task1726_mathqa_correct_answer_generation, task302_record_classification, task380_boolq_yes_no_question, task212_logic2text_classification, task748_glucose_reverse_cause_event_detection, task834_mathdataset_classification, task350_winomt_classification_gender_identifiability_pro, task191_hotpotqa_question_generation, task236_iirc_question_from_passage_answer_generation, task217_rocstories_ordering_answer_generation, task568_circa_question_generation, task614_glucose_cause_event_detection, task361_spolin_yesand_prompt_response_classification, task421_persent_sentence_sentiment_classification, task203_mnli_sentence_generation, task420_persent_document_sentiment_classification, task153_tomqa_find_location_hard_clean, task346_hybridqa_classification, task1211_atomic_classification_hassubevent, task360_spolin_yesand_response_generation, task510_reddit_tifu_title_summarization, task511_reddit_tifu_long_text_summarization, task345_hybridqa_answer_generation, task270_csrg_counterfactual_context_generation, task307_jeopardy_answer_generation_final, task001_quoref_question_generation, task089_swap_words_verification, task1196_atomic_classification_oeffect, task080_piqa_answer_generation, task1598_nyc_long_text_generation, task240_tweetqa_question_generation, task615_moviesqa_answer_generation, task1347_glue_sts-b_similarity_classification, task114_is_the_given_word_longest, task292_storycommonsense_character_text_generation, task115_help_advice_classification, task431_senteval_object_count, task1360_numer_sense_multiple_choice_qa_generation, task177_para-nmt_paraphrasing, task132_dais_text_modification, task269_csrg_counterfactual_story_generation, task233_iirc_link_exists_classification, task161_count_words_containing_letter, task1205_atomic_classification_isafter, task571_recipe_nlg_ner_generation, task1292_yelp_review_full_text_categorization, task428_senteval_inversion, task311_race_question_generation, task429_senteval_tense, task403_creak_commonsense_inference, task929_products_reviews_classification, task582_naturalquestion_answer_generation, task237_iirc_answer_from_subtext_answer_generation, task050_multirc_answerability, task184_break_generate_question, task669_ambigqa_answer_generation, task169_strategyqa_sentence_generation, task500_scruples_anecdotes_title_generation, task241_tweetqa_classification, task1345_glue_qqp_question_paraprashing, task218_rocstories_swap_order_answer_generation, task613_politifact_text_generation, task1167_penn_treebank_coarse_pos_tagging, task1422_mathqa_physics, task247_dream_answer_generation, task199_mnli_classification, task164_mcscript_question_answering_text, task1541_agnews_classification, task516_senteval_conjoints_inversion, task294_storycommonsense_motiv_text_generation, task501_scruples_anecdotes_post_type_verification, task213_rocstories_correct_ending_classification, task821_protoqa_question_generation, task493_review_polarity_classification, task308_jeopardy_answer_generation_all, task1595_event2mind_text_generation_1, task040_qasc_question_generation, task231_iirc_link_classification, task1727_wiqa_what_is_the_effect, task578_curiosity_dialogs_answer_generation, task310_race_classification, task309_race_answer_generation, task379_agnews_topic_classification, task030_winogrande_full_person, 
task1540_parsed_pdfs_summarization, task039_qasc_find_overlapping_words, task1206_atomic_classification_isbefore, task157_count_vowels_and_consonants, task339_record_answer_generation, task453_swag_answer_generation, task848_pubmedqa_classification, task673_google_wellformed_query_classification, task676_ollie_relationship_answer_generation, task268_casehold_legal_answer_generation, task844_financial_phrasebank_classification, task330_gap_answer_generation, task595_mocha_answer_generation, task1285_kpa_keypoint_matching, task234_iirc_passage_line_answer_generation, task494_review_polarity_answer_generation, task670_ambigqa_question_generation, task289_gigaword_summarization, npr, nli, SimpleWiki, amazon_review_2018, ccnews_title_text, agnews, xsum, msmarco, yahoo_answers_title_answer, squad_pairs, wow, mteb-amazon_counterfactual-avs_triplets, mteb-amazon_massive_intent-avs_triplets, mteb-amazon_massive_scenario-avs_triplets, mteb-amazon_reviews_multi-avs_triplets, mteb-banking77-avs_triplets, mteb-emotion-avs_triplets, mteb-imdb-avs_triplets, mteb-mtop_domain-avs_triplets, mteb-mtop_intent-avs_triplets, mteb-toxic_conversations_50k-avs_triplets, mteb-tweet_sentiment_extraction-avs_triplets and covid-bing-query-gpt4-avs_triplets datasets. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision 8b3219a92973c328a8e22fadcfa821b5dc75636a --> - **Maximum Sequence Length:** 256 tokens - **Output Dimensionality:** 384 tokens - **Similarity Function:** Cosine Similarity - **Training Datasets:** - NQ - pubmed - specter_train_triples - S2ORC_citations_abstracts - fever - gooaq_pairs - codesearchnet - wikihow - WikiAnswers - eli5_question_answer - amazon-qa - medmcqa - zeroshot - TriviaQA_pairs - PAQ_pairs - stackexchange_duplicate_questions_title-body_title-body - trex - flickr30k_captions - hotpotqa - task671_ambigqa_text_generation - task061_ropes_answer_generation - task285_imdb_answer_generation - task905_hate_speech_offensive_classification - task566_circa_classification - task184_snli_entailment_to_neutral_text_modification - task280_stereoset_classification_stereotype_type - task1599_smcalflow_classification - task1384_deal_or_no_dialog_classification - task591_sciq_answer_generation - task823_peixian-rtgender_sentiment_analysis - task023_cosmosqa_question_generation - task900_freebase_qa_category_classification - task924_event2mind_word_generation - task152_tomqa_find_location_easy_noise - task1368_healthfact_sentence_generation - task1661_super_glue_classification - task1187_politifact_classification - task1728_web_nlg_data_to_text - task112_asset_simple_sentence_identification - task1340_msr_text_compression_compression - task072_abductivenli_answer_generation - task1504_hatexplain_answer_generation - task684_online_privacy_policy_text_information_type_generation - task1290_xsum_summarization - task075_squad1.1_answer_generation - task1587_scifact_classification - task384_socialiqa_question_classification - task1555_scitail_answer_generation - task1532_daily_dialog_emotion_classification - task239_tweetqa_answer_generation - task596_mocha_question_generation - task1411_dart_subject_identification - task1359_numer_sense_answer_generation 
- task329_gap_classification - task220_rocstories_title_classification - task316_crows-pairs_classification_stereotype - task495_semeval_headline_classification - task1168_brown_coarse_pos_tagging - task348_squad2.0_unanswerable_question_generation - task049_multirc_questions_needed_to_answer - task1534_daily_dialog_question_classification - task322_jigsaw_classification_threat - task295_semeval_2020_task4_commonsense_reasoning - task186_snli_contradiction_to_entailment_text_modification - task034_winogrande_question_modification_object - task160_replace_letter_in_a_sentence - task469_mrqa_answer_generation - task105_story_cloze-rocstories_sentence_generation - task649_race_blank_question_generation - task1536_daily_dialog_happiness_classification - task683_online_privacy_policy_text_purpose_answer_generation - task024_cosmosqa_answer_generation - task584_udeps_eng_fine_pos_tagging - task066_timetravel_binary_consistency_classification - task413_mickey_en_sentence_perturbation_generation - task182_duorc_question_generation - task028_drop_answer_generation - task1601_webquestions_answer_generation - task1295_adversarial_qa_question_answering - task201_mnli_neutral_classification - task038_qasc_combined_fact - task293_storycommonsense_emotion_text_generation - task572_recipe_nlg_text_generation - task517_emo_classify_emotion_of_dialogue - task382_hybridqa_answer_generation - task176_break_decompose_questions - task1291_multi_news_summarization - task155_count_nouns_verbs - task031_winogrande_question_generation_object - task279_stereoset_classification_stereotype - task1336_peixian_equity_evaluation_corpus_gender_classifier - task508_scruples_dilemmas_more_ethical_isidentifiable - task518_emo_different_dialogue_emotions - task077_splash_explanation_to_sql - task923_event2mind_classifier - task470_mrqa_question_generation - task638_multi_woz_classification - task1412_web_questions_question_answering - task847_pubmedqa_question_generation - task678_ollie_actual_relationship_answer_generation - task290_tellmewhy_question_answerability - task575_air_dialogue_classification - task189_snli_neutral_to_contradiction_text_modification - task026_drop_question_generation - task162_count_words_starting_with_letter - task079_conala_concat_strings - task610_conllpp_ner - task046_miscellaneous_question_typing - task197_mnli_domain_answer_generation - task1325_qa_zre_question_generation_on_subject_relation - task430_senteval_subject_count - task672_nummersense - task402_grailqa_paraphrase_generation - task904_hate_speech_offensive_classification - task192_hotpotqa_sentence_generation - task069_abductivenli_classification - task574_air_dialogue_sentence_generation - task187_snli_entailment_to_contradiction_text_modification - task749_glucose_reverse_cause_emotion_detection - task1552_scitail_question_generation - task750_aqua_multiple_choice_answering - task327_jigsaw_classification_toxic - task1502_hatexplain_classification - task328_jigsaw_classification_insult - task304_numeric_fused_head_resolution - task1293_kilt_tasks_hotpotqa_question_answering - task216_rocstories_correct_answer_generation - task1326_qa_zre_question_generation_from_answer - task1338_peixian_equity_evaluation_corpus_sentiment_classifier - task1729_personachat_generate_next - task1202_atomic_classification_xneed - task400_paws_paraphrase_classification - task502_scruples_anecdotes_whoiswrong_verification - task088_identify_typo_verification - task221_rocstories_two_choice_classification - task200_mnli_entailment_classification - 
task074_squad1.1_question_generation - task581_socialiqa_question_generation - task1186_nne_hrngo_classification - task898_freebase_qa_answer_generation - task1408_dart_similarity_classification - task168_strategyqa_question_decomposition - task1357_xlsum_summary_generation - task390_torque_text_span_selection - task165_mcscript_question_answering_commonsense - task1533_daily_dialog_formal_classification - task002_quoref_answer_generation - task1297_qasc_question_answering - task305_jeopardy_answer_generation_normal - task029_winogrande_full_object - task1327_qa_zre_answer_generation_from_question - task326_jigsaw_classification_obscene - task1542_every_ith_element_from_starting - task570_recipe_nlg_ner_generation - task1409_dart_text_generation - task401_numeric_fused_head_reference - task846_pubmedqa_classification - task1712_poki_classification - task344_hybridqa_answer_generation - task875_emotion_classification - task1214_atomic_classification_xwant - task106_scruples_ethical_judgment - task238_iirc_answer_from_passage_answer_generation - task1391_winogrande_easy_answer_generation - task195_sentiment140_classification - task163_count_words_ending_with_letter - task579_socialiqa_classification - task569_recipe_nlg_text_generation - task1602_webquestion_question_genreation - task747_glucose_cause_emotion_detection - task219_rocstories_title_answer_generation - task178_quartz_question_answering - task103_facts2story_long_text_generation - task301_record_question_generation - task1369_healthfact_sentence_generation - task515_senteval_odd_word_out - task496_semeval_answer_generation - task1658_billsum_summarization - task1204_atomic_classification_hinderedby - task1392_superglue_multirc_answer_verification - task306_jeopardy_answer_generation_double - task1286_openbookqa_question_answering - task159_check_frequency_of_words_in_sentence_pair - task151_tomqa_find_location_easy_clean - task323_jigsaw_classification_sexually_explicit - task037_qasc_generate_related_fact - task027_drop_answer_type_generation - task1596_event2mind_text_generation_2 - task141_odd-man-out_classification_category - task194_duorc_answer_generation - task679_hope_edi_english_text_classification - task246_dream_question_generation - task1195_disflqa_disfluent_to_fluent_conversion - task065_timetravel_consistent_sentence_classification - task351_winomt_classification_gender_identifiability_anti - task580_socialiqa_answer_generation - task583_udeps_eng_coarse_pos_tagging - task202_mnli_contradiction_classification - task222_rocstories_two_chioce_slotting_classification - task498_scruples_anecdotes_whoiswrong_classification - task067_abductivenli_answer_generation - task616_cola_classification - task286_olid_offense_judgment - task188_snli_neutral_to_entailment_text_modification - task223_quartz_explanation_generation - task820_protoqa_answer_generation - task196_sentiment140_answer_generation - task1678_mathqa_answer_selection - task349_squad2.0_answerable_unanswerable_question_classification - task154_tomqa_find_location_hard_noise - task333_hateeval_classification_hate_en - task235_iirc_question_from_subtext_answer_generation - task1554_scitail_classification - task210_logic2text_structured_text_generation - task035_winogrande_question_modification_person - task230_iirc_passage_classification - task1356_xlsum_title_generation - task1726_mathqa_correct_answer_generation - task302_record_classification - task380_boolq_yes_no_question - task212_logic2text_classification - task748_glucose_reverse_cause_event_detection - 
task834_mathdataset_classification - task350_winomt_classification_gender_identifiability_pro - task191_hotpotqa_question_generation - task236_iirc_question_from_passage_answer_generation - task217_rocstories_ordering_answer_generation - task568_circa_question_generation - task614_glucose_cause_event_detection - task361_spolin_yesand_prompt_response_classification - task421_persent_sentence_sentiment_classification - task203_mnli_sentence_generation - task420_persent_document_sentiment_classification - task153_tomqa_find_location_hard_clean - task346_hybridqa_classification - task1211_atomic_classification_hassubevent - task360_spolin_yesand_response_generation - task510_reddit_tifu_title_summarization - task511_reddit_tifu_long_text_summarization - task345_hybridqa_answer_generation - task270_csrg_counterfactual_context_generation - task307_jeopardy_answer_generation_final - task001_quoref_question_generation - task089_swap_words_verification - task1196_atomic_classification_oeffect - task080_piqa_answer_generation - task1598_nyc_long_text_generation - task240_tweetqa_question_generation - task615_moviesqa_answer_generation - task1347_glue_sts-b_similarity_classification - task114_is_the_given_word_longest - task292_storycommonsense_character_text_generation - task115_help_advice_classification - task431_senteval_object_count - task1360_numer_sense_multiple_choice_qa_generation - task177_para-nmt_paraphrasing - task132_dais_text_modification - task269_csrg_counterfactual_story_generation - task233_iirc_link_exists_classification - task161_count_words_containing_letter - task1205_atomic_classification_isafter - task571_recipe_nlg_ner_generation - task1292_yelp_review_full_text_categorization - task428_senteval_inversion - task311_race_question_generation - task429_senteval_tense - task403_creak_commonsense_inference - task929_products_reviews_classification - task582_naturalquestion_answer_generation - task237_iirc_answer_from_subtext_answer_generation - task050_multirc_answerability - task184_break_generate_question - task669_ambigqa_answer_generation - task169_strategyqa_sentence_generation - task500_scruples_anecdotes_title_generation - task241_tweetqa_classification - task1345_glue_qqp_question_paraprashing - task218_rocstories_swap_order_answer_generation - task613_politifact_text_generation - task1167_penn_treebank_coarse_pos_tagging - task1422_mathqa_physics - task247_dream_answer_generation - task199_mnli_classification - task164_mcscript_question_answering_text - task1541_agnews_classification - task516_senteval_conjoints_inversion - task294_storycommonsense_motiv_text_generation - task501_scruples_anecdotes_post_type_verification - task213_rocstories_correct_ending_classification - task821_protoqa_question_generation - task493_review_polarity_classification - task308_jeopardy_answer_generation_all - task1595_event2mind_text_generation_1 - task040_qasc_question_generation - task231_iirc_link_classification - task1727_wiqa_what_is_the_effect - task578_curiosity_dialogs_answer_generation - task310_race_classification - task309_race_answer_generation - task379_agnews_topic_classification - task030_winogrande_full_person - task1540_parsed_pdfs_summarization - task039_qasc_find_overlapping_words - task1206_atomic_classification_isbefore - task157_count_vowels_and_consonants - task339_record_answer_generation - task453_swag_answer_generation - task848_pubmedqa_classification - task673_google_wellformed_query_classification - task676_ollie_relationship_answer_generation - 
task268_casehold_legal_answer_generation - task844_financial_phrasebank_classification - task330_gap_answer_generation - task595_mocha_answer_generation - task1285_kpa_keypoint_matching - task234_iirc_passage_line_answer_generation - task494_review_polarity_answer_generation - task670_ambigqa_question_generation - task289_gigaword_summarization - npr - nli - SimpleWiki - amazon_review_2018 - ccnews_title_text - agnews - xsum - msmarco - yahoo_answers_title_answer - squad_pairs - wow - mteb-amazon_counterfactual-avs_triplets - mteb-amazon_massive_intent-avs_triplets - mteb-amazon_massive_scenario-avs_triplets - mteb-amazon_reviews_multi-avs_triplets - mteb-banking77-avs_triplets - mteb-emotion-avs_triplets - mteb-imdb-avs_triplets - mteb-mtop_domain-avs_triplets - mteb-mtop_intent-avs_triplets - mteb-toxic_conversations_50k-avs_triplets - mteb-tweet_sentiment_extraction-avs_triplets - covid-bing-query-gpt4-avs_triplets - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("avsolatorio/all-MiniLM-L6-v2-MEDI-MTEB-triplet-final") # Run inference sentences = [ 'who does george nelson represent in o brother where art thou', 'O Brother, Where Art Thou? omitted all instances of the words "damn" and "hell" from the Coens\' script, which only became known to Clooney after the directors pointed this out to him during shooting. This was the fourth film of the brothers in which John Turturro has starred. Other actors in "O Brother, Where Art Thou?" who had worked previously with the Coens include John Goodman (three films), Holly Hunter (two), Michael Badalucco and Charles Durning (one film each). The Coens used digital color correction to give the film a sepia-tinted look. Joel stated this was because the actual set was "greener than Ireland". Cinematographer', 'O Brother, Where Art Thou? the film got together and performed the music from the film in a Down from the Mountain concert tour which was filmed for TV and DVD. This included Ralph Stanley, John Hartford, Alison Krauss, Emmylou Harris, Gillian Welch, Chris Sharp, and others. O Brother, Where Art Thou? O Brother, Where Art Thou? is a 2000 crime comedy film written, produced, and directed by Joel and Ethan Coen, and starring George Clooney, John Turturro, and Tim Blake Nelson, with John Goodman, Holly Hunter, and Charles Durning in supporting roles. 
The film is set in 1937 rural Mississippi during the Great Depression.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Triplet * Dataset: `medi-mteb-dev` * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) | Metric | Value | |:-------------------|:----------| | cosine_accuracy | 0.9117 | | dot_accuracy | 0.081 | | manhattan_accuracy | 0.912 | | euclidean_accuracy | 0.9115 | | **max_accuracy** | **0.912** | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Datasets #### NQ * Dataset: NQ * Size: 49,676 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 11.91 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 111 tokens</li><li>mean: 137.95 tokens</li><li>max: 212 tokens</li></ul> | <ul><li>min: 113 tokens</li><li>mean: 138.79 tokens</li><li>max: 209 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### pubmed * Dataset: pubmed * Size: 29,908 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 22.81 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 93 tokens</li><li>mean: 240.49 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 73 tokens</li><li>mean: 239.5 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) 
with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### specter_train_triples * Dataset: specter_train_triples * Size: 49,676 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 4 tokens</li><li>mean: 15.69 tokens</li><li>max: 94 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 14.12 tokens</li><li>max: 39 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 16.39 tokens</li><li>max: 64 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### S2ORC_citations_abstracts * Dataset: S2ORC_citations_abstracts * Size: 99,352 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 20 tokens</li><li>mean: 196.74 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 203.91 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 208.09 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### fever * Dataset: fever * Size: 74,514 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 12.49 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 112.67 tokens</li><li>max: 154 tokens</li></ul> | <ul><li>min: 35 tokens</li><li>mean: 113.92 tokens</li><li>max: 163 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### gooaq_pairs * Dataset: gooaq_pairs * Size: 24,838 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | 
|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 11.92 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 60.11 tokens</li><li>max: 150 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 63.73 tokens</li><li>max: 150 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### codesearchnet

* Dataset: codesearchnet
* Size: 15,210 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 28.96 tokens</li><li>max: 143 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 134.91 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 163.95 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### wikihow

* Dataset: wikihow
* Size: 5,070 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 8.05 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 45.27 tokens</li><li>max: 117 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 35.68 tokens</li><li>max: 75 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### WikiAnswers

* Dataset: WikiAnswers
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 12.79 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 12.93 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 13.13 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### eli5_question_answer

* Dataset: eli5_question_answer
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 21.16 tokens</li><li>max: 69 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 100.92 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 112.62 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### amazon-qa

* Dataset: amazon-qa
* Size: 99,352 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 23.56 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 52.4 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 62.09 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### medmcqa

* Dataset: medmcqa
* Size: 29,908 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 19.62 tokens</li><li>max: 167 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 110.24 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 111.99 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```
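Because the card lists dozens of separate triplet sets, training presumably interleaves them. A minimal sketch, assuming the `sentence-transformers` v3 `SentenceTransformerTrainer`, which accepts a dict of named datasets and samples batches across them; the datasets and checkpoint id below are toy placeholders, not the actual training data:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder id

# Toy stand-ins for the triplet sets in this card; each split keeps the same
# (anchor, positive, negative) column layout described above.
nq = Dataset.from_dict({
    "anchor": ["who wrote the declaration of independence"],
    "positive": ["Thomas Jefferson drafted the Declaration of Independence in 1776."],
    "negative": ["The Constitution was signed in 1787 in Philadelphia."],
})
pubmed = Dataset.from_dict({
    "anchor": ["effect of exercise on blood pressure"],
    "positive": ["Regular aerobic exercise lowers systolic blood pressure."],
    "negative": ["Vitamin C deficiency causes scurvy."],
})

# One shared loss for every split, matching the parameters listed in this card.
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

# With a dict of datasets, the trainer draws batches across the named splits.
trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset={"NQ": nq, "pubmed": pubmed},
    loss=loss,
)
trainer.train()
```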
#### zeroshot

* Dataset: zeroshot
* Size: 15,210 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 8.7 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 112.73 tokens</li><li>max: 178 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 115.71 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### TriviaQA_pairs

* Dataset: TriviaQA_pairs
* Size: 49,676 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 19.22 tokens</li><li>max: 59 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 246.01 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 232.19 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### PAQ_pairs

* Dataset: PAQ_pairs
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 12.6 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 112 tokens</li><li>mean: 136.78 tokens</li><li>max: 205 tokens</li></ul> | <ul><li>min: 110 tokens</li><li>mean: 135.66 tokens</li><li>max: 254 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```
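The dev-set accuracies reported in the Evaluation section above come from `TripletEvaluator`, which checks, per triplet, whether the anchor embedding lands closer to its positive than to its negative. A minimal sketch with placeholder sentences (the real dev split is `medi-mteb-dev`, and the checkpoint id is again a stand-in):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder id

# Toy triplets; substitute the anchors/positives/negatives of the dev split.
evaluator = TripletEvaluator(
    anchors=["A man is eating food."],
    positives=["A man is eating a piece of bread."],
    negatives=["A girl is riding a horse."],
    name="medi-mteb-dev",
)

# Reports the fraction of triplets with sim(anchor, positive) > sim(anchor, negative).
results = evaluator(model)
print(results)
```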
#### stackexchange_duplicate_questions_title-body_title-body

* Dataset: stackexchange_duplicate_questions_title-body_title-body
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 18 tokens</li><li>mean: 150.59 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 142.04 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 198.29 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### trex

* Dataset: trex
* Size: 29,908 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 9.55 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 104.71 tokens</li><li>max: 212 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 118.22 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### flickr30k_captions

* Dataset: flickr30k_captions
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

|         | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type    | string | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 15.95 tokens</li><li>max: 88 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.68 tokens</li><li>max: 59 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 17.15 tokens</li><li>max: 52 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```

#### hotpotqa

* Dataset: hotpotqa
* Size: 40,048 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 23.83 tokens</li><li>max: 103 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 113.6 tokens</li><li>max: 194 tokens</li></ul> | <ul><li>min: 38 tokens</li><li>mean: 115.33 tokens</li><li>max: 178 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task671_ambigqa_text_generation * Dataset: task671_ambigqa_text_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 11 tokens</li><li>mean: 12.69 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 12.52 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 12.23 tokens</li><li>max: 19 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task061_ropes_answer_generation * Dataset: task061_ropes_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 117 tokens</li><li>mean: 208.96 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 117 tokens</li><li>mean: 208.27 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 119 tokens</li><li>mean: 210.46 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task285_imdb_answer_generation * Dataset: task285_imdb_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 46 tokens</li><li>mean: 208.78 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 49 tokens</li><li>mean: 203.97 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 46 tokens</li><li>mean: 208.78 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task905_hate_speech_offensive_classification * Dataset: task905_hate_speech_offensive_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 15 tokens</li><li>mean: 41.73 tokens</li><li>max: 164 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 40.48 tokens</li><li>max: 198 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 32.23 tokens</li><li>max: 135 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task566_circa_classification * Dataset: task566_circa_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 20 tokens</li><li>mean: 27.77 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 27.22 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 27.46 tokens</li><li>max: 47 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task184_snli_entailment_to_neutral_text_modification * Dataset: task184_snli_entailment_to_neutral_text_modification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 17 tokens</li><li>mean: 29.98 tokens</li><li>max: 72 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 28.9 tokens</li><li>max: 60 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 30.33 tokens</li><li>max: 100 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task280_stereoset_classification_stereotype_type * Dataset: task280_stereoset_classification_stereotype_type * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 18.47 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 16.89 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 16.86 tokens</li><li>max: 51 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1599_smcalflow_classification * Dataset: task1599_smcalflow_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 3 tokens</li><li>mean: 11.25 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 10.47 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 16.12 tokens</li><li>max: 45 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1384_deal_or_no_dialog_classification * Dataset: task1384_deal_or_no_dialog_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 14 tokens</li><li>mean: 59.1 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 59.35 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 58.47 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task591_sciq_answer_generation * Dataset: task591_sciq_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 17.61 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 17.17 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 16.67 tokens</li><li>max: 75 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task823_peixian-rtgender_sentiment_analysis * Dataset: task823_peixian-rtgender_sentiment_analysis * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 16 tokens</li><li>mean: 57.26 tokens</li><li>max: 179 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 60.03 tokens</li><li>max: 153 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 60.89 tokens</li><li>max: 169 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task023_cosmosqa_question_generation * Dataset: task023_cosmosqa_question_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 35 tokens</li><li>mean: 79.52 tokens</li><li>max: 159 tokens</li></ul> | <ul><li>min: 34 tokens</li><li>mean: 80.36 tokens</li><li>max: 165 tokens</li></ul> | <ul><li>min: 35 tokens</li><li>mean: 79.14 tokens</li><li>max: 161 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task900_freebase_qa_category_classification * Dataset: task900_freebase_qa_category_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 20.44 tokens</li><li>max: 88 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 18.33 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 19.14 tokens</li><li>max: 69 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task924_event2mind_word_generation * Dataset: task924_event2mind_word_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 17 tokens</li><li>mean: 32.06 tokens</li><li>max: 64 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 32.13 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 31.58 tokens</li><li>max: 68 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task152_tomqa_find_location_easy_noise * Dataset: task152_tomqa_find_location_easy_noise * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 37 tokens</li><li>mean: 52.96 tokens</li><li>max: 79 tokens</li></ul> | <ul><li>min: 37 tokens</li><li>mean: 52.53 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 37 tokens</li><li>mean: 52.92 tokens</li><li>max: 82 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1368_healthfact_sentence_generation * Dataset: task1368_healthfact_sentence_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 91 tokens</li><li>mean: 240.57 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 84 tokens</li><li>mean: 239.31 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 97 tokens</li><li>mean: 245.05 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1661_super_glue_classification * Dataset: task1661_super_glue_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 35 tokens</li><li>mean: 140.99 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 142.44 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 143.37 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1187_politifact_classification * Dataset: task1187_politifact_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 14 tokens</li><li>mean: 33.28 tokens</li><li>max: 79 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 31.59 tokens</li><li>max: 75 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 31.9 tokens</li><li>max: 71 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1728_web_nlg_data_to_text * Dataset: task1728_web_nlg_data_to_text * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 43.07 tokens</li><li>max: 152 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 46.55 tokens</li><li>max: 152 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 43.18 tokens</li><li>max: 152 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task112_asset_simple_sentence_identification * Dataset: task112_asset_simple_sentence_identification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 18 tokens</li><li>mean: 51.87 tokens</li><li>max: 136 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 51.68 tokens</li><li>max: 144 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 51.93 tokens</li><li>max: 114 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1340_msr_text_compression_compression * Dataset: task1340_msr_text_compression_compression * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 14 tokens</li><li>mean: 41.77 tokens</li><li>max: 116 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 44.27 tokens</li><li>max: 133 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 40.08 tokens</li><li>max: 141 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task072_abductivenli_answer_generation * Dataset: task072_abductivenli_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 17 tokens</li><li>mean: 26.8 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 26.15 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 26.4 tokens</li><li>max: 55 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1504_hatexplain_answer_generation * Dataset: task1504_hatexplain_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 28.53 tokens</li><li>max: 72 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 24.21 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 27.94 tokens</li><li>max: 67 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task684_online_privacy_policy_text_information_type_generation * Dataset: task684_online_privacy_policy_text_information_type_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 29.91 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 30.18 tokens</li><li>max: 61 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 30.06 tokens</li><li>max: 68 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1290_xsum_summarization * Dataset: task1290_xsum_summarization * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 39 tokens</li><li>mean: 226.28 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 50 tokens</li><li>mean: 229.51 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 34 tokens</li><li>mean: 229.59 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task075_squad1.1_answer_generation * Dataset: task075_squad1.1_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 48 tokens</li><li>mean: 167.12 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 45 tokens</li><li>mean: 173.01 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 46 tokens</li><li>mean: 178.89 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1587_scifact_classification * Dataset: task1587_scifact_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 88 tokens</li><li>mean: 242.08 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 90 tokens</li><li>mean: 246.93 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 86 tokens</li><li>mean: 244.36 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task384_socialiqa_question_classification * Dataset: task384_socialiqa_question_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 24 tokens</li><li>mean: 35.46 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 34.33 tokens</li><li>max: 59 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 34.52 tokens</li><li>max: 57 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1555_scitail_answer_generation * Dataset: task1555_scitail_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 18 tokens</li><li>mean: 36.88 tokens</li><li>max: 90 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 36.12 tokens</li><li>max: 80 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 36.59 tokens</li><li>max: 92 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1532_daily_dialog_emotion_classification * Dataset: task1532_daily_dialog_emotion_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 16 tokens</li><li>mean: 135.8 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 140.06 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 134.53 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task239_tweetqa_answer_generation * Dataset: task239_tweetqa_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 28 tokens</li><li>mean: 56.05 tokens</li><li>max: 91 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 56.59 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 56.05 tokens</li><li>max: 81 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task596_mocha_question_generation * Dataset: task596_mocha_question_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 34 tokens</li><li>mean: 80.75 tokens</li><li>max: 163 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 96.06 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 45.02 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1411_dart_subject_identification * Dataset: task1411_dart_subject_identification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 15.01 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 14.1 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 14.36 tokens</li><li>max: 38 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1359_numer_sense_answer_generation * Dataset: task1359_numer_sense_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 18.75 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 18.43 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 18.3 tokens</li><li>max: 30 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task329_gap_classification * Dataset: task329_gap_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 40 tokens</li><li>mean: 123.98 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 62 tokens</li><li>mean: 127.04 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 58 tokens</li><li>mean: 128.35 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task220_rocstories_title_classification * Dataset: task220_rocstories_title_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 53 tokens</li><li>mean: 80.81 tokens</li><li>max: 116 tokens</li></ul> | <ul><li>min: 51 tokens</li><li>mean: 81.14 tokens</li><li>max: 108 tokens</li></ul> | <ul><li>min: 55 tokens</li><li>mean: 79.79 tokens</li><li>max: 115 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task316_crows-pairs_classification_stereotype * Dataset: task316_crows-pairs_classification_stereotype * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 19.78 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 18.35 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 19.82 tokens</li><li>max: 52 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task495_semeval_headline_classification * Dataset: task495_semeval_headline_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 17 tokens</li><li>mean: 24.57 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 24.23 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 24.2 tokens</li><li>max: 38 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1168_brown_coarse_pos_tagging * Dataset: task1168_brown_coarse_pos_tagging * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 43.83 tokens</li><li>max: 142 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 43.44 tokens</li><li>max: 197 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 44.95 tokens</li><li>max: 197 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task348_squad2.0_unanswerable_question_generation

* Dataset: task348_squad2.0_unanswerable_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 30 tokens</li><li>mean: 153.01 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 38 tokens</li><li>mean: 161.19 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 167.06 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task049_multirc_questions_needed_to_answer

* Dataset: task049_multirc_questions_needed_to_answer
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 174 tokens</li><li>mean: 252.54 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 169 tokens</li><li>mean: 252.57 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 178 tokens</li><li>mean: 252.73 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1534_daily_dialog_question_classification

* Dataset: task1534_daily_dialog_question_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 17 tokens</li><li>mean: 125.31 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 130.35 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 135.56 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task322_jigsaw_classification_threat

* Dataset: task322_jigsaw_classification_threat
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 54.84 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 62.09 tokens</li><li>max: 249 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 62.43 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task295_semeval_2020_task4_commonsense_reasoning

* Dataset: task295_semeval_2020_task4_commonsense_reasoning
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 25 tokens</li><li>mean: 44.81 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 45.07 tokens</li><li>max: 95 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 44.7 tokens</li><li>max: 88 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task186_snli_contradiction_to_entailment_text_modification

* Dataset: task186_snli_contradiction_to_entailment_text_modification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 18 tokens</li><li>mean: 31.21 tokens</li><li>max: 102 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 30.13 tokens</li><li>max: 65 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 32.21 tokens</li><li>max: 67 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task034_winogrande_question_modification_object

* Dataset: task034_winogrande_question_modification_object
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 29 tokens</li><li>mean: 36.36 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 35.59 tokens</li><li>max: 54 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 34.87 tokens</li><li>max: 55 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task160_replace_letter_in_a_sentence

* Dataset: task160_replace_letter_in_a_sentence
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 29 tokens</li><li>mean: 31.98 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 31.78 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 31.8 tokens</li><li>max: 48 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task469_mrqa_answer_generation

* Dataset: task469_mrqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 27 tokens</li><li>mean: 182.22 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 180.87 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 184.07 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task105_story_cloze-rocstories_sentence_generation

* Dataset: task105_story_cloze-rocstories_sentence_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 36 tokens</li><li>mean: 55.58 tokens</li><li>max: 75 tokens</li></ul> | <ul><li>min: 35 tokens</li><li>mean: 54.96 tokens</li><li>max: 76 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 55.99 tokens</li><li>max: 76 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task649_race_blank_question_generation

* Dataset: task649_race_blank_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 36 tokens</li><li>mean: 253.19 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 252.56 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 157 tokens</li><li>mean: 254.12 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1536_daily_dialog_happiness_classification

* Dataset: task1536_daily_dialog_happiness_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 127.06 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 133.94 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 142.64 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task683_online_privacy_policy_text_purpose_answer_generation

* Dataset: task683_online_privacy_policy_text_purpose_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 10 tokens</li><li>mean: 29.93 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 30.22 tokens</li><li>max: 64 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 29.85 tokens</li><li>max: 68 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task024_cosmosqa_answer_generation

* Dataset: task024_cosmosqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 45 tokens</li><li>mean: 92.5 tokens</li><li>max: 176 tokens</li></ul> | <ul><li>min: 47 tokens</li><li>mean: 93.22 tokens</li><li>max: 174 tokens</li></ul> | <ul><li>min: 42 tokens</li><li>mean: 94.89 tokens</li><li>max: 183 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task584_udeps_eng_fine_pos_tagging

* Dataset: task584_udeps_eng_fine_pos_tagging
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 40.13 tokens</li><li>max: 120 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 39.18 tokens</li><li>max: 186 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 40.4 tokens</li><li>max: 148 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task066_timetravel_binary_consistency_classification

* Dataset: task066_timetravel_binary_consistency_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 42 tokens</li><li>mean: 66.89 tokens</li><li>max: 93 tokens</li></ul> | <ul><li>min: 43 tokens</li><li>mean: 67.42 tokens</li><li>max: 94 tokens</li></ul> | <ul><li>min: 45 tokens</li><li>mean: 67.0 tokens</li><li>max: 92 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task413_mickey_en_sentence_perturbation_generation

* Dataset: task413_mickey_en_sentence_perturbation_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 13.77 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 13.82 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 13.31 tokens</li><li>max: 20 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task182_duorc_question_generation

* Dataset: task182_duorc_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 99 tokens</li><li>mean: 241.8 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 120 tokens</li><li>mean: 245.95 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 99 tokens</li><li>mean: 246.6 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task028_drop_answer_generation

* Dataset: task028_drop_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 76 tokens</li><li>mean: 230.72 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 86 tokens</li><li>mean: 234.59 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 81 tokens</li><li>mean: 235.71 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1601_webquestions_answer_generation

* Dataset: task1601_webquestions_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 9 tokens</li><li>mean: 16.47 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 16.67 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 16.76 tokens</li><li>max: 27 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1295_adversarial_qa_question_answering

* Dataset: task1295_adversarial_qa_question_answering
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 45 tokens</li><li>mean: 165.1 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 54 tokens</li><li>mean: 167.21 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 166.49 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task201_mnli_neutral_classification

* Dataset: task201_mnli_neutral_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 24 tokens</li><li>mean: 73.0 tokens</li><li>max: 218 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 73.42 tokens</li><li>max: 170 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 72.48 tokens</li><li>max: 205 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task038_qasc_combined_fact

* Dataset: task038_qasc_combined_fact
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 18 tokens</li><li>mean: 31.3 tokens</li><li>max: 57 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 30.49 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 30.87 tokens</li><li>max: 53 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
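The min/mean/max figures in these tables are token counts under the model's tokenizer, apparently capped at a 256-token sequence length. The following is one plausible, hypothetical sketch of how such statistics could be computed; the exact script behind this card is not shown, and the model name and helper function below are illustrative only:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

def token_length_stats(texts, max_seq_length=256):
    # Tokenized lengths, truncated to the maximum sequence length,
    # matching the 256-token ceiling visible in the tables above.
    lengths = [
        min(len(model.tokenizer(text)["input_ids"]), max_seq_length)
        for text in texts
    ]
    return min(lengths), sum(lengths) / len(lengths), max(lengths)

anchors = ["A short anchor.", "A slightly longer anchor sentence."]
print(token_length_stats(anchors))  # -> (min, mean, max) token counts
```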
#### task293_storycommonsense_emotion_text_generation

* Dataset: task293_storycommonsense_emotion_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 40.74 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 40.56 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 38.5 tokens</li><li>max: 86 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task572_recipe_nlg_text_generation

* Dataset: task572_recipe_nlg_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 24 tokens</li><li>mean: 114.82 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 121.93 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 124.38 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task517_emo_classify_emotion_of_dialogue

* Dataset: task517_emo_classify_emotion_of_dialogue
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 18.18 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 17.03 tokens</li><li>max: 59 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 18.39 tokens</li><li>max: 67 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task382_hybridqa_answer_generation

* Dataset: task382_hybridqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 29 tokens</li><li>mean: 42.34 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 41.63 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 41.73 tokens</li><li>max: 75 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task176_break_decompose_questions

* Dataset: task176_break_decompose_questions
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 9 tokens</li><li>mean: 17.39 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 17.19 tokens</li><li>max: 39 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.71 tokens</li><li>max: 38 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1291_multi_news_summarization

* Dataset: task1291_multi_news_summarization
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 116 tokens</li><li>mean: 255.36 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 146 tokens</li><li>mean: 255.71 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 68 tokens</li><li>mean: 252.09 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task155_count_nouns_verbs

* Dataset: task155_count_nouns_verbs
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 23 tokens</li><li>mean: 27.03 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 26.8 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 26.94 tokens</li><li>max: 46 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task031_winogrande_question_generation_object

* Dataset: task031_winogrande_question_generation_object
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 7.42 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 7.31 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 7.27 tokens</li><li>max: 11 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task279_stereoset_classification_stereotype

* Dataset: task279_stereoset_classification_stereotype
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 8 tokens</li><li>mean: 17.91 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.43 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 17.2 tokens</li><li>max: 50 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1336_peixian_equity_evaluation_corpus_gender_classifier

* Dataset: task1336_peixian_equity_evaluation_corpus_gender_classifier
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 9.62 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 9.6 tokens</li><li>max: 16 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 9.69 tokens</li><li>max: 16 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task508_scruples_dilemmas_more_ethical_isidentifiable

* Dataset: task508_scruples_dilemmas_more_ethical_isidentifiable
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 29.63 tokens</li><li>max: 94 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 28.69 tokens</li><li>max: 94 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 28.59 tokens</li><li>max: 86 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task518_emo_different_dialogue_emotions

* Dataset: task518_emo_different_dialogue_emotions
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 28 tokens</li><li>mean: 47.83 tokens</li><li>max: 106 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 45.51 tokens</li><li>max: 116 tokens</li></ul> | <ul><li>min: 26 tokens</li><li>mean: 45.81 tokens</li><li>max: 123 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task077_splash_explanation_to_sql

* Dataset: task077_splash_explanation_to_sql
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 8 tokens</li><li>mean: 39.82 tokens</li><li>max: 126 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 39.88 tokens</li><li>max: 126 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 35.83 tokens</li><li>max: 111 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task923_event2mind_classifier

* Dataset: task923_event2mind_classifier
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 10 tokens</li><li>mean: 20.61 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 18.62 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 19.51 tokens</li><li>max: 46 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task470_mrqa_question_generation

* Dataset: task470_mrqa_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 172.18 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 175.43 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 180.36 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task638_multi_woz_classification

* Dataset: task638_multi_woz_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 78 tokens</li><li>mean: 223.56 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 76 tokens</li><li>mean: 220.51 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 64 tokens</li><li>mean: 220.0 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1412_web_questions_question_answering

* Dataset: task1412_web_questions_question_answering
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 10.33 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 10.18 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 10.08 tokens</li><li>max: 16 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task847_pubmedqa_question_generation

* Dataset: task847_pubmedqa_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 21 tokens</li><li>mean: 248.66 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 248.78 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 43 tokens</li><li>mean: 249.11 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task678_ollie_actual_relationship_answer_generation

* Dataset: task678_ollie_actual_relationship_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 20 tokens</li><li>mean: 41.01 tokens</li><li>max: 95 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 37.95 tokens</li><li>max: 102 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 41.14 tokens</li><li>max: 104 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task290_tellmewhy_question_answerability

* Dataset: task290_tellmewhy_question_answerability
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 37 tokens</li><li>mean: 63.19 tokens</li><li>max: 95 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 62.66 tokens</li><li>max: 94 tokens</li></ul> | <ul><li>min: 37 tokens</li><li>mean: 63.44 tokens</li><li>max: 95 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task575_air_dialogue_classification

* Dataset: task575_air_dialogue_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 14.16 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 13.55 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 12.3 tokens</li><li>max: 42 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task189_snli_neutral_to_contradiction_text_modification

* Dataset: task189_snli_neutral_to_contradiction_text_modification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 18 tokens</li><li>mean: 31.82 tokens</li><li>max: 60 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 30.75 tokens</li><li>max: 57 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 33.25 tokens</li><li>max: 105 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task026_drop_question_generation

* Dataset: task026_drop_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 82 tokens</li><li>mean: 219.39 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 57 tokens</li><li>mean: 222.63 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 96 tokens</li><li>mean: 232.08 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task162_count_words_starting_with_letter

* Dataset: task162_count_words_starting_with_letter
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 28 tokens</li><li>mean: 32.21 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 31.77 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 31.64 tokens</li><li>max: 46 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task079_conala_concat_strings

* Dataset: task079_conala_concat_strings
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 11 tokens</li><li>mean: 39.62 tokens</li><li>max: 76 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 34.2 tokens</li><li>max: 80 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 33.53 tokens</li><li>max: 76 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task610_conllpp_ner

* Dataset: task610_conllpp_ner
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 19.55 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 20.27 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 14.12 tokens</li><li>max: 54 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task046_miscellaneous_question_typing

* Dataset: task046_miscellaneous_question_typing
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 16 tokens</li><li>mean: 25.41 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 24.94 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 25.13 tokens</li><li>max: 57 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task197_mnli_domain_answer_generation

* Dataset: task197_mnli_domain_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 15 tokens</li><li>mean: 44.09 tokens</li><li>max: 197 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 44.97 tokens</li><li>max: 211 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 39.22 tokens</li><li>max: 115 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1325_qa_zre_question_generation_on_subject_relation * Dataset: task1325_qa_zre_question_generation_on_subject_relation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 18 tokens</li><li>mean: 51.02 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 49.57 tokens</li><li>max: 180 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 54.59 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task430_senteval_subject_count * Dataset: task430_senteval_subject_count * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 17.14 tokens</li><li>max: 35 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.31 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 16.13 tokens</li><li>max: 34 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task672_nummersense * Dataset: task672_nummersense * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 15.72 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.33 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.21 tokens</li><li>max: 30 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task402_grailqa_paraphrase_generation * Dataset: task402_grailqa_paraphrase_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 23 tokens</li><li>mean: 127.55 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 139.34 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 133.69 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task904_hate_speech_offensive_classification * Dataset: task904_hate_speech_offensive_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 35.03 tokens</li><li>max: 157 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 34.67 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 27.84 tokens</li><li>max: 148 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task192_hotpotqa_sentence_generation * Dataset: task192_hotpotqa_sentence_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 37 tokens</li><li>mean: 125.55 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 35 tokens</li><li>mean: 123.85 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 134.16 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task069_abductivenli_classification * Dataset: task069_abductivenli_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 33 tokens</li><li>mean: 52.09 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 52.16 tokens</li><li>max: 95 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 51.84 tokens</li><li>max: 95 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task574_air_dialogue_sentence_generation * Dataset: task574_air_dialogue_sentence_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 54 tokens</li><li>mean: 143.98 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 57 tokens</li><li>mean: 143.52 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 66 tokens</li><li>mean: 147.45 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task187_snli_entailment_to_contradiction_text_modification * Dataset: task187_snli_entailment_to_contradiction_text_modification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 16 tokens</li><li>mean: 30.23 tokens</li><li>max: 69 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 29.82 tokens</li><li>max: 104 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 29.44 tokens</li><li>max: 71 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task749_glucose_reverse_cause_emotion_detection * Dataset: task749_glucose_reverse_cause_emotion_detection * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 38 tokens</li><li>mean: 67.61 tokens</li><li>max: 106 tokens</li></ul> | <ul><li>min: 37 tokens</li><li>mean: 67.14 tokens</li><li>max: 104 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 68.46 tokens</li><li>max: 107 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1552_scitail_question_generation * Dataset: task1552_scitail_question_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 18.37 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 17.55 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.88 tokens</li><li>max: 54 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task750_aqua_multiple_choice_answering * Dataset: task750_aqua_multiple_choice_answering * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 33 tokens</li><li>mean: 69.62 tokens</li><li>max: 194 tokens</li></ul> | <ul><li>min: 32 tokens</li><li>mean: 67.98 tokens</li><li>max: 194 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 67.81 tokens</li><li>max: 165 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task327_jigsaw_classification_toxic * Dataset: task327_jigsaw_classification_toxic * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 36.8 tokens</li><li>max: 234 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 40.85 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 45.53 tokens</li><li>max: 244 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1502_hatexplain_classification * Dataset: task1502_hatexplain_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 28.69 tokens</li><li>max: 73 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 26.7 tokens</li><li>max: 110 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 26.92 tokens</li><li>max: 90 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task328_jigsaw_classification_insult * Dataset: task328_jigsaw_classification_insult * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
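For triplet data like the `anchor` / `positive` / `negative` columns above, this loss treats every in-batch positive, plus every explicit negative, as a candidate for each anchor. The following is a hand-rolled sketch of the computation with toy tensors, not the library implementation:

```python
import torch
import torch.nn.functional as F

# Toy embeddings: a batch of 3 (anchor, positive, negative) triplets, dim 4.
anchors = F.normalize(torch.randn(3, 4), dim=-1)
positives = F.normalize(torch.randn(3, 4), dim=-1)
negatives = F.normalize(torch.randn(3, 4), dim=-1)

# Candidates for each anchor: all positives in the batch plus all explicit
# negatives. Only positives[i] is the correct match for anchors[i].
candidates = torch.cat([positives, negatives], dim=0)  # shape (6, 4)

# With normalized vectors the dot product is cosine similarity; scale=20.0
# sharpens the softmax, matching the JSON configuration above.
scores = 20.0 * anchors @ candidates.T                 # shape (3, 6)
labels = torch.arange(len(anchors))                    # targets on the diagonal
loss = F.cross_entropy(scores, labels)
```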
#### task328_jigsaw_classification_insult

* Dataset: task328_jigsaw_classification_insult
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 51.02 tokens</li><li>max: 247 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 60.56 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 64.19 tokens</li><li>max: 249 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task304_numeric_fused_head_resolution

* Dataset: task304_numeric_fused_head_resolution
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 15 tokens</li><li>mean: 120.75 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 122.1 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 134.06 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1293_kilt_tasks_hotpotqa_question_answering

* Dataset: task1293_kilt_tasks_hotpotqa_question_answering
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 10 tokens</li><li>mean: 24.78 tokens</li><li>max: 114 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 24.2 tokens</li><li>max: 114 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 23.85 tokens</li><li>max: 84 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task216_rocstories_correct_answer_generation

* Dataset: task216_rocstories_correct_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 39 tokens</li><li>mean: 59.5 tokens</li><li>max: 83 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 58.38 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 58.22 tokens</li><li>max: 95 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1326_qa_zre_question_generation_from_answer

* Dataset: task1326_qa_zre_question_generation_from_answer
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 17 tokens</li><li>mean: 46.37 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 45.05 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 49.47 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1338_peixian_equity_evaluation_corpus_sentiment_classifier

* Dataset: task1338_peixian_equity_evaluation_corpus_sentiment_classifier
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 9.68 tokens</li><li>max: 16 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 9.71 tokens</li><li>max: 16 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 9.57 tokens</li><li>max: 17 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1729_personachat_generate_next

* Dataset: task1729_personachat_generate_next
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 44 tokens</li><li>mean: 146.46 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 43 tokens</li><li>mean: 142.09 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 50 tokens</li><li>mean: 144.22 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1202_atomic_classification_xneed

* Dataset: task1202_atomic_classification_xneed
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 19.55 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 19.39 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 19.22 tokens</li><li>max: 28 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task400_paws_paraphrase_classification

* Dataset: task400_paws_paraphrase_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 19 tokens</li><li>mean: 52.28 tokens</li><li>max: 97 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 51.88 tokens</li><li>max: 98 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 53.03 tokens</li><li>max: 97 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task502_scruples_anecdotes_whoiswrong_verification

* Dataset: task502_scruples_anecdotes_whoiswrong_verification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 229.76 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 236.43 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 235.02 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task088_identify_typo_verification

* Dataset: task088_identify_typo_verification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 11 tokens</li><li>mean: 15.08 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 15.05 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 15.39 tokens</li><li>max: 47 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task221_rocstories_two_choice_classification

* Dataset: task221_rocstories_two_choice_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 47 tokens</li><li>mean: 72.64 tokens</li><li>max: 108 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 72.66 tokens</li><li>max: 109 tokens</li></ul> | <ul><li>min: 46 tokens</li><li>mean: 73.26 tokens</li><li>max: 108 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
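Because each task above is a separate named dataset with an identical schema, a multi-dataset trainer can consume them as a dictionary. A minimal sketch, assuming the `SentenceTransformerTrainer` API from recent `sentence-transformers` releases; the base checkpoint and the toy rows are hypothetical stand-ins:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    losses,
)

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # hypothetical base
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

# Hypothetical toy rows mirroring the per-task layout above; every task
# split carries the same "anchor" / "positive" / "negative" columns.
train_dataset = {
    "task088_identify_typo_verification": Dataset.from_dict({
        "anchor": ["He recieved the package yesterday."],
        "positive": ["The word containing the typo is 'recieved'."],
        "negative": ["No word in the sentence contains a typo."],
    }),
    # ... one entry per task in this section, 1,018 rows each
}

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_dataset,
    loss=loss,  # a single loss shared across tasks; a per-task dict also works
)
trainer.train()
```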
#### task200_mnli_entailment_classification

* Dataset: task200_mnli_entailment_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 24 tokens</li><li>mean: 72.63 tokens</li><li>max: 198 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 72.69 tokens</li><li>max: 224 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 73.44 tokens</li><li>max: 226 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task074_squad1.1_question_generation

* Dataset: task074_squad1.1_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 30 tokens</li><li>mean: 150.23 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 160.48 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 38 tokens</li><li>mean: 164.59 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task581_socialiqa_question_generation

* Dataset: task581_socialiqa_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 26.52 tokens</li><li>max: 69 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 25.55 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 25.85 tokens</li><li>max: 48 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1186_nne_hrngo_classification

* Dataset: task1186_nne_hrngo_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 19 tokens</li><li>mean: 33.82 tokens</li><li>max: 79 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 33.49 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 33.34 tokens</li><li>max: 77 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task898_freebase_qa_answer_generation

* Dataset: task898_freebase_qa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 8 tokens</li><li>mean: 19.18 tokens</li><li>max: 125 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 17.45 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 17.48 tokens</li><li>max: 79 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1408_dart_similarity_classification

* Dataset: task1408_dart_similarity_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 22 tokens</li><li>mean: 59.48 tokens</li><li>max: 147 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 61.95 tokens</li><li>max: 154 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 48.32 tokens</li><li>max: 124 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task168_strategyqa_question_decomposition

* Dataset: task168_strategyqa_question_decomposition
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 42 tokens</li><li>mean: 81.83 tokens</li><li>max: 181 tokens</li></ul> | <ul><li>min: 42 tokens</li><li>mean: 79.75 tokens</li><li>max: 179 tokens</li></ul> | <ul><li>min: 42 tokens</li><li>mean: 77.43 tokens</li><li>max: 166 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1357_xlsum_summary_generation

* Dataset: task1357_xlsum_summary_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 67 tokens</li><li>mean: 242.04 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 76 tokens</li><li>mean: 243.28 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 67 tokens</li><li>mean: 247.07 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task390_torque_text_span_selection

* Dataset: task390_torque_text_span_selection
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 47 tokens</li><li>mean: 110.04 tokens</li><li>max: 196 tokens</li></ul> | <ul><li>min: 42 tokens</li><li>mean: 110.49 tokens</li><li>max: 195 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 110.67 tokens</li><li>max: 196 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task165_mcscript_question_answering_commonsense

* Dataset: task165_mcscript_question_answering_commonsense
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 147 tokens</li><li>mean: 198.24 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 145 tokens</li><li>mean: 196.67 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 147 tokens</li><li>mean: 198.41 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1533_daily_dialog_formal_classification

* Dataset: task1533_daily_dialog_formal_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 129.55 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 136.75 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 137.33 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task002_quoref_answer_generation

* Dataset: task002_quoref_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:--------|:--------|
  | type | string | string | string |
  | details | <ul><li>min: 214 tokens</li><li>mean: 255.54 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 214 tokens</li><li>mean: 255.53 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 224 tokens</li><li>mean: 255.61 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
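The "approximate statistics" rows in these tables (min / mean / max token counts over the first 1000 samples, capped at 256 tokens) can be reproduced with any tokenizer. A minimal sketch, assuming a hypothetical `bert-base-uncased` tokenizer as a stand-in for the embedding model's own:

```python
from transformers import AutoTokenizer

# Stand-in tokenizer, for illustration only; the statistics above would
# have been computed with the embedding model's own tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def token_stats(texts, max_len=256):
    """min / mean / max token counts over at most the first 1000 texts,
    capped at the 256-token sequence limit visible in the tables above."""
    lengths = [
        min(len(tokenizer(t)["input_ids"]), max_len)
        for t in texts[:1000]
    ]
    return min(lengths), sum(lengths) / len(lengths), max(lengths)
```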
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 61 tokens</li><li>mean: 84.69 tokens</li><li>max: 134 tokens</li></ul> | <ul><li>min: 59 tokens</li><li>mean: 85.39 tokens</li><li>max: 130 tokens</li></ul> | <ul><li>min: 58 tokens</li><li>mean: 84.83 tokens</li><li>max: 125 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task305_jeopardy_answer_generation_normal * Dataset: task305_jeopardy_answer_generation_normal * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 9 tokens</li><li>mean: 27.72 tokens</li><li>max: 59 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 27.43 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 27.37 tokens</li><li>max: 46 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task029_winogrande_full_object * Dataset: task029_winogrande_full_object * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 7.37 tokens</li><li>max: 12 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 7.32 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 7.24 tokens</li><li>max: 10 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1327_qa_zre_answer_generation_from_question * Dataset: task1327_qa_zre_answer_generation_from_question * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 24 tokens</li><li>mean: 55.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 52.2 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 55.59 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task326_jigsaw_classification_obscene * Dataset: task326_jigsaw_classification_obscene * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 65.45 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 77.38 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 74.07 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1542_every_ith_element_from_starting * Dataset: task1542_every_ith_element_from_starting * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 13 tokens</li><li>mean: 125.21 tokens</li><li>max: 245 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 123.54 tokens</li><li>max: 244 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 120.48 tokens</li><li>max: 238 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task570_recipe_nlg_ner_generation * Dataset: task570_recipe_nlg_ner_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 9 tokens</li><li>mean: 74.07 tokens</li><li>max: 250 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 73.6 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 76.08 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1409_dart_text_generation * Dataset: task1409_dart_text_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 18 tokens</li><li>mean: 67.5 tokens</li><li>max: 174 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 72.52 tokens</li><li>max: 170 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 67.55 tokens</li><li>max: 164 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task401_numeric_fused_head_reference * Dataset: task401_numeric_fused_head_reference * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 16 tokens</li><li>mean: 109.08 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 116.35 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 119.65 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task846_pubmedqa_classification * Dataset: task846_pubmedqa_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 32 tokens</li><li>mean: 85.83 tokens</li><li>max: 246 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 85.03 tokens</li><li>max: 225 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 93.96 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1712_poki_classification * Dataset: task1712_poki_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 52.73 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 55.65 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 63.01 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task344_hybridqa_answer_generation * Dataset: task344_hybridqa_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 9 tokens</li><li>mean: 22.15 tokens</li><li>max: 50 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 22.07 tokens</li><li>max: 58 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 22.07 tokens</li><li>max: 55 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task875_emotion_classification * Dataset: task875_emotion_classification * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 23.03 tokens</li><li>max: 75 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 18.42 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 20.36 tokens</li><li>max: 68 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1214_atomic_classification_xwant

* Dataset: task1214_atomic_classification_xwant
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 19.66 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 19.39 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 19.57 tokens</li><li>max: 31 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task106_scruples_ethical_judgment

* Dataset: task106_scruples_ethical_judgment
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 29.85 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 28.96 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 28.77 tokens</li><li>max: 58 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task238_iirc_answer_from_passage_answer_generation

* Dataset: task238_iirc_answer_from_passage_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 138 tokens</li><li>mean: 242.59 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 165 tokens</li><li>mean: 242.86 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 173 tokens</li><li>mean: 243.06 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1391_winogrande_easy_answer_generation

* Dataset: task1391_winogrande_easy_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 26 tokens</li><li>mean: 31.69 tokens</li><li>max: 54 tokens</li></ul> | <ul><li>min: 26 tokens</li><li>mean: 31.28 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 31.16 tokens</li><li>max: 49 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task195_sentiment140_classification

* Dataset: task195_sentiment140_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 22.62 tokens</li><li>max: 118 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 18.82 tokens</li><li>max: 79 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 21.32 tokens</li><li>max: 51 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task163_count_words_ending_with_letter

* Dataset: task163_count_words_ending_with_letter
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 28 tokens</li><li>mean: 32.06 tokens</li><li>max: 54 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 31.69 tokens</li><li>max: 57 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 31.58 tokens</li><li>max: 43 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task579_socialiqa_classification

* Dataset: task579_socialiqa_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 39 tokens</li><li>mean: 54.2 tokens</li><li>max: 132 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 53.61 tokens</li><li>max: 103 tokens</li></ul> | <ul><li>min: 40 tokens</li><li>mean: 54.16 tokens</li><li>max: 84 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task569_recipe_nlg_text_generation

* Dataset: task569_recipe_nlg_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 25 tokens</li><li>mean: 193.73 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 55 tokens</li><li>mean: 193.64 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 37 tokens</li><li>mean: 198.12 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1602_webquestion_question_genreation

* Dataset: task1602_webquestion_question_genreation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 23.64 tokens</li><li>max: 112 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 24.12 tokens</li><li>max: 112 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 22.49 tokens</li><li>max: 120 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task747_glucose_cause_emotion_detection

* Dataset: task747_glucose_cause_emotion_detection
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 35 tokens</li><li>mean: 68.15 tokens</li><li>max: 112 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 68.3 tokens</li><li>max: 108 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 68.79 tokens</li><li>max: 99 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task219_rocstories_title_answer_generation

* Dataset: task219_rocstories_title_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 42 tokens</li><li>mean: 67.71 tokens</li><li>max: 97 tokens</li></ul> | <ul><li>min: 45 tokens</li><li>mean: 66.7 tokens</li><li>max: 97 tokens</li></ul> | <ul><li>min: 41 tokens</li><li>mean: 66.92 tokens</li><li>max: 96 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task178_quartz_question_answering

* Dataset: task178_quartz_question_answering
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 28 tokens</li><li>mean: 57.78 tokens</li><li>max: 110 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 57.44 tokens</li><li>max: 111 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 56.86 tokens</li><li>max: 102 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task103_facts2story_long_text_generation

* Dataset: task103_facts2story_long_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 52 tokens</li><li>mean: 80.49 tokens</li><li>max: 143 tokens</li></ul> | <ul><li>min: 51 tokens</li><li>mean: 82.22 tokens</li><li>max: 157 tokens</li></ul> | <ul><li>min: 49 tokens</li><li>mean: 78.96 tokens</li><li>max: 145 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task301_record_question_generation

* Dataset: task301_record_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 140 tokens</li><li>mean: 210.71 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 139 tokens</li><li>mean: 209.62 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 143 tokens</li><li>mean: 208.74 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1369_healthfact_sentence_generation

* Dataset: task1369_healthfact_sentence_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 110 tokens</li><li>mean: 243.25 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 101 tokens</li><li>mean: 243.17 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 113 tokens</li><li>mean: 251.67 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task515_senteval_odd_word_out

* Dataset: task515_senteval_odd_word_out
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 19.72 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 19.13 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 19.0 tokens</li><li>max: 35 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task496_semeval_answer_generation

* Dataset: task496_semeval_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 28.11 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 27.8 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 27.68 tokens</li><li>max: 45 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters (the toy computation below unpacks what they do):
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
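
For intuition: given a batch of triplets, this loss treats every other candidate in the batch as an additional negative for each anchor. It computes scaled cosine similarities between each anchor and all candidates, then applies cross-entropy with the matching candidate as the target. A toy re-derivation in plain PyTorch (shapes and values are illustrative only; this is not the library's actual implementation):

```python
import torch
import torch.nn.functional as F
from sentence_transformers import util

batch = 4
anchors = F.normalize(torch.randn(batch, 8), dim=1)     # toy anchor embeddings
candidates = F.normalize(torch.randn(batch, 8), dim=1)  # toy positive embeddings

# "similarity_fct": "cos_sim" -> pairwise cosine similarities, shape (batch, batch)
scores = util.cos_sim(anchors, candidates)

# "scale": 20.0 -> sharpen the softmax before cross-entropy
scores = scores * 20.0

# The i-th anchor's true match is the i-th candidate; all others act as in-batch negatives.
labels = torch.arange(batch)
loss = F.cross_entropy(scores, labels)
print(loss.item())
```

With the explicit <code>negative</code> column in the datasets above, the embeddings of those hard negatives are simply appended to the candidate set before the same computation.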
#### task1658_billsum_summarization

* Dataset: task1658_billsum_summarization
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 256 tokens</li><li>mean: 256.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 256 tokens</li><li>mean: 256.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 256 tokens</li><li>mean: 256.0 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1204_atomic_classification_hinderedby

* Dataset: task1204_atomic_classification_hinderedby
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 22.1 tokens</li><li>max: 35 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 22.07 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 21.5 tokens</li><li>max: 38 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1392_superglue_multirc_answer_verification

* Dataset: task1392_superglue_multirc_answer_verification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 128 tokens</li><li>mean: 241.77 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 127 tokens</li><li>mean: 241.97 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 136 tokens</li><li>mean: 242.04 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task306_jeopardy_answer_generation_double

* Dataset: task306_jeopardy_answer_generation_double
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 10 tokens</li><li>mean: 27.79 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 27.16 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 27.61 tokens</li><li>max: 47 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1286_openbookqa_question_answering

* Dataset: task1286_openbookqa_question_answering
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 22 tokens</li><li>mean: 39.54 tokens</li><li>max: 85 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 38.94 tokens</li><li>max: 96 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 38.26 tokens</li><li>max: 89 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task159_check_frequency_of_words_in_sentence_pair

* Dataset: task159_check_frequency_of_words_in_sentence_pair
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 44 tokens</li><li>mean: 50.37 tokens</li><li>max: 67 tokens</li></ul> | <ul><li>min: 44 tokens</li><li>mean: 50.35 tokens</li><li>max: 67 tokens</li></ul> | <ul><li>min: 44 tokens</li><li>mean: 50.61 tokens</li><li>max: 66 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task151_tomqa_find_location_easy_clean

* Dataset: task151_tomqa_find_location_easy_clean
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 37 tokens</li><li>mean: 50.73 tokens</li><li>max: 79 tokens</li></ul> | <ul><li>min: 37 tokens</li><li>mean: 50.28 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 37 tokens</li><li>mean: 50.52 tokens</li><li>max: 74 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task323_jigsaw_classification_sexually_explicit

* Dataset: task323_jigsaw_classification_sexually_explicit
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 66.26 tokens</li><li>max: 248 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 76.73 tokens</li><li>max: 248 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 75.5 tokens</li><li>max: 251 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task037_qasc_generate_related_fact

* Dataset: task037_qasc_generate_related_fact
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 22.04 tokens</li><li>max: 50 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 22.03 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 21.9 tokens</li><li>max: 40 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task027_drop_answer_type_generation

* Dataset: task027_drop_answer_type_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 87 tokens</li><li>mean: 229.02 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 74 tokens</li><li>mean: 230.67 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 71 tokens</li><li>mean: 232.43 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1596_event2mind_text_generation_2

* Dataset: task1596_event2mind_text_generation_2
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 9.97 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 10.03 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 10.06 tokens</li><li>max: 18 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task141_odd-man-out_classification_category

* Dataset: task141_odd-man-out_classification_category
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 16 tokens</li><li>mean: 18.45 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 18.38 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 18.46 tokens</li><li>max: 25 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task194_duorc_answer_generation

* Dataset: task194_duorc_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 149 tokens</li><li>mean: 251.76 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 147 tokens</li><li>mean: 252.05 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 148 tokens</li><li>mean: 251.76 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task679_hope_edi_english_text_classification

* Dataset: task679_hope_edi_english_text_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 27.77 tokens</li><li>max: 199 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 27.23 tokens</li><li>max: 205 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 29.87 tokens</li><li>max: 194 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task246_dream_question_generation

* Dataset: task246_dream_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 17 tokens</li><li>mean: 80.33 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 80.74 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 87.22 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1195_disflqa_disfluent_to_fluent_conversion

* Dataset: task1195_disflqa_disfluent_to_fluent_conversion
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 9 tokens</li><li>mean: 19.76 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 19.88 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 20.2 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task065_timetravel_consistent_sentence_classification

* Dataset: task065_timetravel_consistent_sentence_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 55 tokens</li><li>mean: 79.4 tokens</li><li>max: 117 tokens</li></ul> | <ul><li>min: 51 tokens</li><li>mean: 79.17 tokens</li><li>max: 110 tokens</li></ul> | <ul><li>min: 53 tokens</li><li>mean: 80.1 tokens</li><li>max: 110 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task351_winomt_classification_gender_identifiability_anti

* Dataset: task351_winomt_classification_gender_identifiability_anti
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 16 tokens</li><li>mean: 21.76 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 21.66 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 21.78 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task580_socialiqa_answer_generation

* Dataset: task580_socialiqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 35 tokens</li><li>mean: 52.41 tokens</li><li>max: 107 tokens</li></ul> | <ul><li>min: 35 tokens</li><li>mean: 51.02 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 35 tokens</li><li>mean: 50.98 tokens</li><li>max: 87 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task583_udeps_eng_coarse_pos_tagging

* Dataset: task583_udeps_eng_coarse_pos_tagging
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 41.24 tokens</li><li>max: 185 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 40.21 tokens</li><li>max: 185 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 40.93 tokens</li><li>max: 185 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task202_mnli_contradiction_classification

* Dataset: task202_mnli_contradiction_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples (the sketch below shows how such counts can be reproduced):
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 24 tokens</li><li>mean: 73.7 tokens</li><li>max: 190 tokens</li></ul> | <ul><li>min: 28 tokens</li><li>mean: 76.06 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 74.56 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
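
The min/mean/max token counts in these tables come from tokenizing the first 1000 samples of each column. A hedged sketch of how such numbers could be reproduced, assuming the model's own tokenizer and a 256-token sequence limit (which would explain the hard 256 cap visible in several tables); the checkpoint name is a placeholder:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

def token_stats(texts, limit=1000):
    """Min/mean/max token counts over the first `limit` texts, capped at max_seq_length."""
    lengths = [
        min(len(model.tokenizer(text)["input_ids"]), model.max_seq_length)
        for text in texts[:limit]
    ]
    return min(lengths), sum(lengths) / len(lengths), max(lengths)

lo, mean, hi = token_stats(["An example anchor.", "Another, somewhat longer anchor sentence."])
print(f"min: {lo} tokens, mean: {mean:.2f} tokens, max: {hi} tokens")
```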
#### task222_rocstories_two_chioce_slotting_classification

* Dataset: task222_rocstories_two_chioce_slotting_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 48 tokens</li><li>mean: 73.06 tokens</li><li>max: 105 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 73.24 tokens</li><li>max: 100 tokens</li></ul> | <ul><li>min: 49 tokens</li><li>mean: 71.71 tokens</li><li>max: 102 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task498_scruples_anecdotes_whoiswrong_classification

* Dataset: task498_scruples_anecdotes_whoiswrong_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 24 tokens</li><li>mean: 225.8 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 47 tokens</li><li>mean: 232.86 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 47 tokens</li><li>mean: 231.22 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task067_abductivenli_answer_generation

* Dataset: task067_abductivenli_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 26.75 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 26.13 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 26.34 tokens</li><li>max: 38 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task616_cola_classification

* Dataset: task616_cola_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 12.16 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 12.05 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 11.96 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task286_olid_offense_judgment

* Dataset: task286_olid_offense_judgment
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 32.85 tokens</li><li>max: 145 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 30.81 tokens</li><li>max: 171 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 30.26 tokens</li><li>max: 169 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task188_snli_neutral_to_entailment_text_modification

* Dataset: task188_snli_neutral_to_entailment_text_modification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 18 tokens</li><li>mean: 31.55 tokens</li><li>max: 79 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 31.31 tokens</li><li>max: 84 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 32.91 tokens</li><li>max: 84 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task223_quartz_explanation_generation

* Dataset: task223_quartz_explanation_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 31.46 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 31.8 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 28.95 tokens</li><li>max: 96 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task820_protoqa_answer_generation

* Dataset: task820_protoqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 14.87 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 14.54 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 14.22 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task196_sentiment140_answer_generation

* Dataset: task196_sentiment140_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 17 tokens</li><li>mean: 36.26 tokens</li><li>max: 72 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 32.85 tokens</li><li>max: 61 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 36.27 tokens</li><li>max: 72 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1678_mathqa_answer_selection

* Dataset: task1678_mathqa_answer_selection
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 33 tokens</li><li>mean: 70.42 tokens</li><li>max: 177 tokens</li></ul> | <ul><li>min: 30 tokens</li><li>mean: 68.99 tokens</li><li>max: 146 tokens</li></ul> | <ul><li>min: 33 tokens</li><li>mean: 69.69 tokens</li><li>max: 160 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task349_squad2.0_answerable_unanswerable_question_classification

* Dataset: task349_squad2.0_answerable_unanswerable_question_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 53 tokens</li><li>mean: 176.83 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 57 tokens</li><li>mean: 177.07 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 53 tokens</li><li>mean: 176.78 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task154_tomqa_find_location_hard_noise

* Dataset: task154_tomqa_find_location_hard_noise
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 129 tokens</li><li>mean: 176.29 tokens</li><li>max: 253 tokens</li></ul> | <ul><li>min: 126 tokens</li><li>mean: 176.3 tokens</li><li>max: 249 tokens</li></ul> | <ul><li>min: 128 tokens</li><li>mean: 178.24 tokens</li><li>max: 254 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters (a joint-training sketch over several such splits follows below):
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
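
Dozens of 1,018-sample splits with identical columns lend themselves to joint multi-task training. One way to express that, sketched under the assumption of the Sentence Transformers v3 trainer, which accepts a dictionary of named datasets; the checkpoint and dataset contents are placeholders, not this model's actual training setup:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

def toy_split():
    # Stand-in for one per-task split with the shared triplet columns.
    return Dataset.from_dict({
        "anchor": ["Who wrote Hamlet?"],
        "positive": ["Hamlet was written by William Shakespeare."],
        "negative": ["The Eiffel Tower is in Paris."],
    })

train_datasets = {
    "task349_squad2.0_answerable_unanswerable_question_classification": toy_split(),
    "task154_tomqa_find_location_hard_noise": toy_split(),
}

loss = MultipleNegativesRankingLoss(model=model, scale=20.0)  # cos_sim is the default
trainer = SentenceTransformerTrainer(model=model, train_dataset=train_datasets, loss=loss)
trainer.train()
```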
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 8 tokens</li><li>mean: 38.33 tokens</li><li>max: 117 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 36.79 tokens</li><li>max: 109 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 36.61 tokens</li><li>max: 113 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task235_iirc_question_from_subtext_answer_generation

* Dataset: task235_iirc_question_from_subtext_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 14 tokens</li><li>mean: 52.9 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 50.44 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 55.89 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1554_scitail_classification

* Dataset: task1554_scitail_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 7 tokens</li><li>mean: 16.8 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 25.75 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 24.34 tokens</li><li>max: 59 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task210_logic2text_structured_text_generation

* Dataset: task210_logic2text_structured_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 13 tokens</li><li>mean: 31.88 tokens</li><li>max: 101 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 30.88 tokens</li><li>max: 94 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 32.75 tokens</li><li>max: 89 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task035_winogrande_question_modification_person

* Dataset: task035_winogrande_question_modification_person
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 31 tokens</li><li>mean: 36.16 tokens</li><li>max: 50 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 35.75 tokens</li><li>max: 55 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 35.41 tokens</li><li>max: 48 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task230_iirc_passage_classification

* Dataset: task230_iirc_passage_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 256 tokens</li><li>mean: 256.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 256 tokens</li><li>mean: 256.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 256 tokens</li><li>mean: 256.0 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1356_xlsum_title_generation

* Dataset: task1356_xlsum_title_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 59 tokens</li><li>mean: 239.92 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 58 tokens</li><li>mean: 240.94 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 64 tokens</li><li>mean: 248.75 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1726_mathqa_correct_answer_generation

* Dataset: task1726_mathqa_correct_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 10 tokens</li><li>mean: 43.81 tokens</li><li>max: 156 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 42.63 tokens</li><li>max: 129 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 42.82 tokens</li><li>max: 133 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task302_record_classification

* Dataset: task302_record_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 194 tokens</li><li>mean: 253.35 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 198 tokens</li><li>mean: 252.85 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 195 tokens</li><li>mean: 252.78 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task380_boolq_yes_no_question

* Dataset: task380_boolq_yes_no_question
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 26 tokens</li><li>mean: 134.17 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 26 tokens</li><li>mean: 138.56 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 138.25 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task212_logic2text_classification

* Dataset: task212_logic2text_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 14 tokens</li><li>mean: 33.28 tokens</li><li>max: 146 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 32.14 tokens</li><li>max: 146 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 32.96 tokens</li><li>max: 127 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task748_glucose_reverse_cause_event_detection

* Dataset: task748_glucose_reverse_cause_event_detection
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 35 tokens</li><li>mean: 67.63 tokens</li><li>max: 105 tokens</li></ul> | <ul><li>min: 38 tokens</li><li>mean: 66.95 tokens</li><li>max: 106 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 68.94 tokens</li><li>max: 105 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task834_mathdataset_classification

* Dataset: task834_mathdataset_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 6 tokens</li><li>mean: 27.7 tokens</li><li>max: 83 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 27.88 tokens</li><li>max: 83 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 26.97 tokens</li><li>max: 93 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task350_winomt_classification_gender_identifiability_pro

* Dataset: task350_winomt_classification_gender_identifiability_pro
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 16 tokens</li><li>mean: 21.79 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 21.63 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 21.79 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task191_hotpotqa_question_generation

* Dataset: task191_hotpotqa_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 198 tokens</li><li>mean: 255.88 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 238 tokens</li><li>mean: 255.93 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 255 tokens</li><li>mean: 256.0 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task236_iirc_question_from_passage_answer_generation

* Dataset: task236_iirc_question_from_passage_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 135 tokens</li><li>mean: 238.3 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 155 tokens</li><li>mean: 237.61 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 154 tokens</li><li>mean: 239.64 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task217_rocstories_ordering_answer_generation

* Dataset: task217_rocstories_ordering_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 45 tokens</li><li>mean: 72.32 tokens</li><li>max: 107 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 72.29 tokens</li><li>max: 107 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 70.87 tokens</li><li>max: 105 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task568_circa_question_generation

* Dataset: task568_circa_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 4 tokens</li><li>mean: 9.6 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.46 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 8.93 tokens</li><li>max: 20 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task614_glucose_cause_event_detection

* Dataset: task614_glucose_cause_event_detection
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 39 tokens</li><li>mean: 67.66 tokens</li><li>max: 102 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 67.16 tokens</li><li>max: 106 tokens</li></ul> | <ul><li>min: 38 tokens</li><li>mean: 68.48 tokens</li><li>max: 103 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task361_spolin_yesand_prompt_response_classification

* Dataset: task361_spolin_yesand_prompt_response_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 18 tokens</li><li>mean: 47.01 tokens</li><li>max: 137 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 46.18 tokens</li><li>max: 119 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 47.2 tokens</li><li>max: 128 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task421_persent_sentence_sentiment_classification

* Dataset: task421_persent_sentence_sentiment_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 22 tokens</li><li>mean: 67.77 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 71.21 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 72.24 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task203_mnli_sentence_generation

* Dataset: task203_mnli_sentence_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 14 tokens</li><li>mean: 38.73 tokens</li><li>max: 175 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 35.74 tokens</li><li>max: 175 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 34.18 tokens</li><li>max: 170 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task420_persent_document_sentiment_classification

* Dataset: task420_persent_document_sentiment_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 22 tokens</li><li>mean: 224.14 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 233.63 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 227.59 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task153_tomqa_find_location_hard_clean

* Dataset: task153_tomqa_find_location_hard_clean
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 39 tokens</li><li>mean: 160.13 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 159.86 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 162.75 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task346_hybridqa_classification

* Dataset: task346_hybridqa_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 18 tokens</li><li>mean: 32.87 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 31.92 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 31.83 tokens</li><li>max: 75 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1211_atomic_classification_hassubevent

* Dataset: task1211_atomic_classification_hassubevent
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 11 tokens</li><li>mean: 16.25 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 16.02 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 16.89 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task360_spolin_yesand_response_generation

* Dataset: task360_spolin_yesand_response_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 7 tokens</li><li>mean: 22.54 tokens</li><li>max: 89 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 21.16 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 20.91 tokens</li><li>max: 67 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
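
The min/mean/max token counts reported in these tables can be reproduced with the model's own tokenizer. A small sketch follows; the checkpoint name is a placeholder, and the 256-token cap is an inference from the many columns above that saturate at exactly 256 tokens, which suggests lengths were clipped at the model's maximum sequence length:

```python
from statistics import mean
from sentence_transformers import SentenceTransformer

# Placeholder checkpoint; substitute the actual model this card describes.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

def token_stats(texts, cap=256):
    """Min/mean/max token counts per column, clipped at the max sequence length."""
    lengths = [min(len(model.tokenizer(t)["input_ids"]), cap) for t in texts]
    return {"min": min(lengths), "mean": round(mean(lengths), 2), "max": max(lengths)}

# Run once per column (anchor, positive, negative) over the first 1000 samples.
print(token_stats(["A short anchor sentence.", "A somewhat longer example text."]))
```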

#### task510_reddit_tifu_title_summarization

* Dataset: task510_reddit_tifu_title_summarization
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 9 tokens</li><li>mean: 217.53 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 218.59 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 221.41 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task511_reddit_tifu_long_text_summarization

* Dataset: task511_reddit_tifu_long_text_summarization
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 29 tokens</li><li>mean: 239.72 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 76 tokens</li><li>mean: 238.38 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 43 tokens</li><li>mean: 245.03 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task345_hybridqa_answer_generation

* Dataset: task345_hybridqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 9 tokens</li><li>mean: 22.14 tokens</li><li>max: 50 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 21.6 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 20.96 tokens</li><li>max: 47 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task270_csrg_counterfactual_context_generation

* Dataset: task270_csrg_counterfactual_context_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 63 tokens</li><li>mean: 100.05 tokens</li><li>max: 158 tokens</li></ul> | <ul><li>min: 63 tokens</li><li>mean: 98.61 tokens</li><li>max: 142 tokens</li></ul> | <ul><li>min: 62 tokens</li><li>mean: 100.35 tokens</li><li>max: 141 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task307_jeopardy_answer_generation_final

* Dataset: task307_jeopardy_answer_generation_final
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 15 tokens</li><li>mean: 29.61 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 29.31 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 29.28 tokens</li><li>max: 43 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task001_quoref_question_generation

* Dataset: task001_quoref_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 201 tokens</li><li>mean: 254.96 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 99 tokens</li><li>mean: 254.28 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 173 tokens</li><li>mean: 255.13 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task089_swap_words_verification

* Dataset: task089_swap_words_verification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 9 tokens</li><li>mean: 12.86 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 12.64 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 12.26 tokens</li><li>max: 22 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1196_atomic_classification_oeffect

* Dataset: task1196_atomic_classification_oeffect
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 14 tokens</li><li>mean: 18.79 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 18.57 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 18.51 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task080_piqa_answer_generation

* Dataset: task080_piqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 3 tokens</li><li>mean: 10.82 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 10.77 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 10.03 tokens</li><li>max: 26 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1598_nyc_long_text_generation

* Dataset: task1598_nyc_long_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 17 tokens</li><li>mean: 35.5 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 35.66 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 36.66 tokens</li><li>max: 55 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task240_tweetqa_question_generation

* Dataset: task240_tweetqa_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 27 tokens</li><li>mean: 51.18 tokens</li><li>max: 94 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 50.72 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 51.63 tokens</li><li>max: 95 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task615_moviesqa_answer_generation

* Dataset: task615_moviesqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 6 tokens</li><li>mean: 11.46 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 11.44 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 11.4 tokens</li><li>max: 22 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1347_glue_sts-b_similarity_classification

* Dataset: task1347_glue_sts-b_similarity_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 17 tokens</li><li>mean: 31.13 tokens</li><li>max: 88 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 31.12 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 30.85 tokens</li><li>max: 92 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task114_is_the_given_word_longest

* Dataset: task114_is_the_given_word_longest
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 25 tokens</li><li>mean: 28.87 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 28.46 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 28.7 tokens</li><li>max: 47 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task292_storycommonsense_character_text_generation

* Dataset: task292_storycommonsense_character_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 43 tokens</li><li>mean: 67.87 tokens</li><li>max: 98 tokens</li></ul> | <ul><li>min: 46 tokens</li><li>mean: 67.11 tokens</li><li>max: 104 tokens</li></ul> | <ul><li>min: 43 tokens</li><li>mean: 69.05 tokens</li><li>max: 96 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task115_help_advice_classification

* Dataset: task115_help_advice_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 2 tokens</li><li>mean: 19.89 tokens</li><li>max: 91 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 18.13 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 19.22 tokens</li><li>max: 137 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task431_senteval_object_count

* Dataset: task431_senteval_object_count
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 7 tokens</li><li>mean: 16.78 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.12 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.72 tokens</li><li>max: 35 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1360_numer_sense_multiple_choice_qa_generation

* Dataset: task1360_numer_sense_multiple_choice_qa_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 32 tokens</li><li>mean: 40.62 tokens</li><li>max: 54 tokens</li></ul> | <ul><li>min: 32 tokens</li><li>mean: 40.3 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 32 tokens</li><li>mean: 40.28 tokens</li><li>max: 60 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task177_para-nmt_paraphrasing

* Dataset: task177_para-nmt_paraphrasing
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 8 tokens</li><li>mean: 19.86 tokens</li><li>max: 82 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 18.91 tokens</li><li>max: 58 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 18.22 tokens</li><li>max: 36 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task132_dais_text_modification

* Dataset: task132_dais_text_modification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 6 tokens</li><li>mean: 9.3 tokens</li><li>max: 15 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 9.08 tokens</li><li>max: 15 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 10.11 tokens</li><li>max: 15 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task269_csrg_counterfactual_story_generation

* Dataset: task269_csrg_counterfactual_story_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 49 tokens</li><li>mean: 79.95 tokens</li><li>max: 111 tokens</li></ul> | <ul><li>min: 53 tokens</li><li>mean: 79.51 tokens</li><li>max: 116 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 79.5 tokens</li><li>max: 114 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task233_iirc_link_exists_classification

* Dataset: task233_iirc_link_exists_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 145 tokens</li><li>mean: 235.67 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 142 tokens</li><li>mean: 233.59 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 151 tokens</li><li>mean: 235.1 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task161_count_words_containing_letter

* Dataset: task161_count_words_containing_letter
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 27 tokens</li><li>mean: 30.99 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 30.8 tokens</li><li>max: 61 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 30.5 tokens</li><li>max: 42 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1205_atomic_classification_isafter

* Dataset: task1205_atomic_classification_isafter
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 14 tokens</li><li>mean: 20.91 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 20.65 tokens</li><li>max: 35 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 21.51 tokens</li><li>max: 37 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task571_recipe_nlg_ner_generation

* Dataset: task571_recipe_nlg_ner_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 118.38 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 118.92 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 111.39 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1292_yelp_review_full_text_categorization

* Dataset: task1292_yelp_review_full_text_categorization
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 136.66 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 146.65 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 146.05 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task428_senteval_inversion

* Dataset: task428_senteval_inversion
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 16.69 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 14.58 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.26 tokens</li><li>max: 34 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task311_race_question_generation

* Dataset: task311_race_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 115 tokens</li><li>mean: 254.87 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 137 tokens</li><li>mean: 254.4 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 171 tokens</li><li>mean: 255.44 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task429_senteval_tense

* Dataset: task429_senteval_tense
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 15.84 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 13.96 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.25 tokens</li><li>max: 36 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task403_creak_commonsense_inference

* Dataset: task403_creak_commonsense_inference
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 30.24 tokens</li><li>max: 104 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 29.39 tokens</li><li>max: 108 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 29.32 tokens</li><li>max: 122 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task929_products_reviews_classification

* Dataset: task929_products_reviews_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 69.68 tokens</li><li>max: 126 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 70.66 tokens</li><li>max: 123 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 70.61 tokens</li><li>max: 123 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task582_naturalquestion_answer_generation

* Dataset: task582_naturalquestion_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 10 tokens</li><li>mean: 11.71 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 11.65 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 11.73 tokens</li><li>max: 25 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task237_iirc_answer_from_subtext_answer_generation

* Dataset: task237_iirc_answer_from_subtext_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 22 tokens</li><li>mean: 66.3 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 64.61 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 61.49 tokens</li><li>max: 161 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task050_multirc_answerability

* Dataset: task050_multirc_answerability
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 15 tokens</li><li>mean: 32.3 tokens</li><li>max: 112 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 31.56 tokens</li><li>max: 93 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 32.13 tokens</li><li>max: 159 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task184_break_generate_question

* Dataset: task184_break_generate_question
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 39.73 tokens</li><li>max: 147 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 38.83 tokens</li><li>max: 149 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 39.61 tokens</li><li>max: 148 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task669_ambigqa_answer_generation

* Dataset: task669_ambigqa_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 10 tokens</li><li>mean: 12.94 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 12.88 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 12.76 tokens</li><li>max: 22 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task169_strategyqa_sentence_generation

* Dataset: task169_strategyqa_sentence_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 19 tokens</li><li>mean: 35.21 tokens</li><li>max: 65 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 34.25 tokens</li><li>max: 60 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 33.3 tokens</li><li>max: 65 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task500_scruples_anecdotes_title_generation

* Dataset: task500_scruples_anecdotes_title_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 225.76 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 233.16 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 235.28 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task241_tweetqa_classification

* Dataset: task241_tweetqa_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 31 tokens</li><li>mean: 61.75 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 62.23 tokens</li><li>max: 106 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 61.7 tokens</li><li>max: 92 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1345_glue_qqp_question_paraprashing

* Dataset: task1345_glue_qqp_question_paraprashing
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 16.86 tokens</li><li>max: 60 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.83 tokens</li><li>max: 69 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 16.62 tokens</li><li>max: 51 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task218_rocstories_swap_order_answer_generation

* Dataset: task218_rocstories_swap_order_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 48 tokens</li><li>mean: 72.41 tokens</li><li>max: 118 tokens</li></ul> | <ul><li>min: 48 tokens</li><li>mean: 72.48 tokens</li><li>max: 102 tokens</li></ul> | <ul><li>min: 47 tokens</li><li>mean: 72.1 tokens</li><li>max: 106 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task613_politifact_text_generation

* Dataset: task613_politifact_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 24.87 tokens</li><li>max: 75 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 23.39 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 23.07 tokens</li><li>max: 61 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1167_penn_treebank_coarse_pos_tagging

* Dataset: task1167_penn_treebank_coarse_pos_tagging
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 16 tokens</li><li>mean: 53.65 tokens</li><li>max: 200 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 53.64 tokens</li><li>max: 220 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 54.8 tokens</li><li>max: 202 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1422_mathqa_physics

* Dataset: task1422_mathqa_physics
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 34 tokens</li><li>mean: 72.71 tokens</li><li>max: 164 tokens</li></ul> | <ul><li>min: 38 tokens</li><li>mean: 71.93 tokens</li><li>max: 157 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 72.67 tokens</li><li>max: 155 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task247_dream_answer_generation

* Dataset: task247_dream_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 38 tokens</li><li>mean: 160.28 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 159.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 41 tokens</li><li>mean: 167.8 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task199_mnli_classification

* Dataset: task199_mnli_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 13 tokens</li><li>mean: 43.07 tokens</li><li>max: 127 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 44.72 tokens</li><li>max: 149 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 43.81 tokens</li><li>max: 113 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task164_mcscript_question_answering_text

* Dataset: task164_mcscript_question_answering_text
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 150 tokens</li><li>mean: 200.63 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 150 tokens</li><li>mean: 200.9 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 142 tokens</li><li>mean: 200.85 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1541_agnews_classification

* Dataset: task1541_agnews_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 21 tokens</li><li>mean: 53.59 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 53.09 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 53.95 tokens</li><li>max: 161 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task516_senteval_conjoints_inversion

* Dataset: task516_senteval_conjoints_inversion
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 8 tokens</li><li>mean: 20.33 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 19.01 tokens</li><li>max: 34 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 18.96 tokens</li><li>max: 34 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task294_storycommonsense_motiv_text_generation

* Dataset: task294_storycommonsense_motiv_text_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 40.09 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 40.77 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 39.86 tokens</li><li>max: 86 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task501_scruples_anecdotes_post_type_verification

* Dataset: task501_scruples_anecdotes_post_type_verification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 18 tokens</li><li>mean: 231.55 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 235.21 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 234.47 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task213_rocstories_correct_ending_classification

* Dataset: task213_rocstories_correct_ending_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 62 tokens</li><li>mean: 86.17 tokens</li><li>max: 125 tokens</li></ul> | <ul><li>min: 60 tokens</li><li>mean: 85.49 tokens</li><li>max: 131 tokens</li></ul> | <ul><li>min: 59 tokens</li><li>mean: 86.18 tokens</li><li>max: 131 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task821_protoqa_question_generation

* Dataset: task821_protoqa_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 5 tokens</li><li>mean: 14.6 tokens</li><li>max: 61 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 14.95 tokens</li><li>max: 35 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 13.89 tokens</li><li>max: 93 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task493_review_polarity_classification

* Dataset: task493_review_polarity_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 18 tokens</li><li>mean: 100.91 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 107.28 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 113.07 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
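
The per-column statistics in these tables can be reproduced approximately from the raw triplets. The sketch below is a rough illustration; the tokenizer and data file names are placeholders, and the truncation at 256 tokens is an inference from the recurring 256-token maxima rather than something stated on this card.

```python
# Rough sketch: recomputing min/mean/max token counts over the first 1000 samples.
from statistics import mean

from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # placeholder tokenizer
data = load_dataset("json", data_files="task493_review_polarity_classification.jsonl", split="train")  # placeholder file

for column in ("anchor", "positive", "negative"):
    lengths = [
        # Truncating at 256 is an assumption based on the 256-token ceilings above.
        len(tokenizer(text, truncation=True, max_length=256)["input_ids"])
        for text in data[column][:1000]  # "first 1000 samples", as stated in each table caption
    ]
    print(f"{column}: min={min(lengths)}, mean={mean(lengths):.2f}, max={max(lengths)}")
```
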
#### task308_jeopardy_answer_generation_all

* Dataset: task308_jeopardy_answer_generation_all
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 12 tokens</li><li>mean: 27.9 tokens</li><li>max: 50 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 26.98 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 27.48 tokens</li><li>max: 48 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1595_event2mind_text_generation_1

* Dataset: task1595_event2mind_text_generation_1
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 9.86 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 9.97 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 10.02 tokens</li><li>max: 20 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task040_qasc_question_generation

* Dataset: task040_qasc_question_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 8 tokens</li><li>mean: 15.04 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.05 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 13.84 tokens</li><li>max: 32 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task231_iirc_link_classification

* Dataset: task231_iirc_link_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 179 tokens</li><li>mean: 246.31 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 170 tokens</li><li>mean: 245.93 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 161 tokens</li><li>mean: 247.13 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1727_wiqa_what_is_the_effect

* Dataset: task1727_wiqa_what_is_the_effect
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 44 tokens</li><li>mean: 95.17 tokens</li><li>max: 183 tokens</li></ul> | <ul><li>min: 44 tokens</li><li>mean: 95.18 tokens</li><li>max: 185 tokens</li></ul> | <ul><li>min: 43 tokens</li><li>mean: 95.42 tokens</li><li>max: 183 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task578_curiosity_dialogs_answer_generation

* Dataset: task578_curiosity_dialogs_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 10 tokens</li><li>mean: 229.66 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 118 tokens</li><li>mean: 235.49 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 229.46 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task310_race_classification

* Dataset: task310_race_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 101 tokens</li><li>mean: 254.9 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 218 tokens</li><li>mean: 255.78 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 101 tokens</li><li>mean: 254.9 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task309_race_answer_generation

* Dataset: task309_race_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 75 tokens</li><li>mean: 254.99 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 204 tokens</li><li>mean: 255.6 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 75 tokens</li><li>mean: 255.19 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task379_agnews_topic_classification

* Dataset: task379_agnews_topic_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 20 tokens</li><li>mean: 54.89 tokens</li><li>max: 193 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 54.64 tokens</li><li>max: 175 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 54.78 tokens</li><li>max: 187 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task030_winogrande_full_person

* Dataset: task030_winogrande_full_person
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 7.59 tokens</li><li>max: 12 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 7.49 tokens</li><li>max: 12 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 7.38 tokens</li><li>max: 11 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1540_parsed_pdfs_summarization

* Dataset: task1540_parsed_pdfs_summarization
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 3 tokens</li><li>mean: 188.4 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 46 tokens</li><li>mean: 190.16 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 192.07 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task039_qasc_find_overlapping_words

* Dataset: task039_qasc_find_overlapping_words
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 16 tokens</li><li>mean: 30.48 tokens</li><li>max: 55 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 30.05 tokens</li><li>max: 57 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 30.65 tokens</li><li>max: 60 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task1206_atomic_classification_isbefore

* Dataset: task1206_atomic_classification_isbefore
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 21.2 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 20.77 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 21.41 tokens</li><li>max: 31 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task157_count_vowels_and_consonants

* Dataset: task157_count_vowels_and_consonants
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 24 tokens</li><li>mean: 28.0 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 27.91 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 28.3 tokens</li><li>max: 39 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task339_record_answer_generation

* Dataset: task339_record_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 171 tokens</li><li>mean: 235.1 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 171 tokens</li><li>mean: 234.38 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 171 tokens</li><li>mean: 232.38 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task453_swag_answer_generation

* Dataset: task453_swag_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 9 tokens</li><li>mean: 18.56 tokens</li><li>max: 60 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 18.16 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 17.5 tokens</li><li>max: 55 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task848_pubmedqa_classification

* Dataset: task848_pubmedqa_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 21 tokens</li><li>mean: 248.87 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 250.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 84 tokens</li><li>mean: 251.62 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task673_google_wellformed_query_classification

* Dataset: task673_google_wellformed_query_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 6 tokens</li><li>mean: 11.6 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 11.22 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 11.34 tokens</li><li>max: 22 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
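
Because every task in this section shares the same triplet schema and the same loss, multi-task training reduces to handing the trainer a dictionary of named datasets. The sketch below is a minimal illustration under that assumption, using the sentence-transformers v3 trainer API; the file paths and base checkpoint are placeholders, not the actual training setup of this card.

```python
# Minimal sketch: multi-task training over several of the named triplet datasets.
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder base model

# One entry per task; all datasets share the anchor/positive/negative columns.
train_datasets = {
    name: load_dataset("json", data_files=f"{name}.jsonl", split="train")  # placeholder files
    for name in (
        "task848_pubmedqa_classification",
        "task673_google_wellformed_query_classification",
    )
}

# A single loss instance is applied to every dataset in the dictionary.
loss = MultipleNegativesRankingLoss(model, scale=20.0)
trainer = SentenceTransformerTrainer(model=model, train_dataset=train_datasets, loss=loss)
trainer.train()
```
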
#### task676_ollie_relationship_answer_generation

* Dataset: task676_ollie_relationship_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 29 tokens</li><li>mean: 50.99 tokens</li><li>max: 113 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 49.39 tokens</li><li>max: 134 tokens</li></ul> | <ul><li>min: 30 tokens</li><li>mean: 51.48 tokens</li><li>max: 113 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task268_casehold_legal_answer_generation

* Dataset: task268_casehold_legal_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 235 tokens</li><li>mean: 255.96 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 156 tokens</li><li>mean: 255.46 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 226 tokens</li><li>mean: 255.94 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task844_financial_phrasebank_classification

* Dataset: task844_financial_phrasebank_classification
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
  |:--------|:--------|:---------|:---------|
  | type | string | string | string |
  | details | <ul><li>min: 14 tokens</li><li>mean: 39.8 tokens</li><li>max: 86 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 38.45 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 39.06 tokens</li><li>max: 86 tokens</li></ul> |
* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### task330_gap_answer_generation

* Dataset: task330_gap_answer_generation
* Size: 1,018 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  | | anchor | positive | negative |
|:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 26 tokens</li><li>mean: 106.78 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 44 tokens</li><li>mean: 108.12 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 45 tokens</li><li>mean: 110.93 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task595_mocha_answer_generation * Dataset: task595_mocha_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 44 tokens</li><li>mean: 94.08 tokens</li><li>max: 178 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 97.06 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 118.77 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task1285_kpa_keypoint_matching * Dataset: task1285_kpa_keypoint_matching * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 30 tokens</li><li>mean: 52.36 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 50.14 tokens</li><li>max: 84 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 53.21 tokens</li><li>max: 88 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task234_iirc_passage_line_answer_generation * Dataset: task234_iirc_passage_line_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 143 tokens</li><li>mean: 235.25 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 155 tokens</li><li>mean: 235.25 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 146 tokens</li><li>mean: 236.25 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task494_review_polarity_answer_generation * Dataset: task494_review_polarity_answer_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 3 tokens</li><li>mean: 106.0 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 112.36 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 112.66 tokens</li><li>max: 249 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task670_ambigqa_question_generation * Dataset: task670_ambigqa_question_generation * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 11 tokens</li><li>mean: 12.66 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 12.48 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 12.24 tokens</li><li>max: 18 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### task289_gigaword_summarization * Dataset: task289_gigaword_summarization * Size: 1,018 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | 
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 25 tokens</li><li>mean: 51.53 tokens</li><li>max: 87 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 52.0 tokens</li><li>max: 87 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 51.44 tokens</li><li>max: 87 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### npr * Dataset: npr * Size: 24,838 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 4 tokens</li><li>mean: 12.74 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 152.32 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 119.75 tokens</li><li>max: 256 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### nli * Dataset: nli * Size: 49,676 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 21.62 tokens</li><li>max: 108 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 12.07 tokens</li><li>max: 50 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 12.21 tokens</li><li>max: 44 tokens</li></ul> | * Samples: * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### SimpleWiki * Dataset: SimpleWiki * Size: 5,070 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 29.35 tokens</li><li>max: 256 
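Every dataset in this list (above and below) shares the same loss configuration. As a reference for what `scale` and `similarity_fct` control, here is a minimal, self-contained sketch of the multiple-negatives ranking computation in plain PyTorch; the batch size and embedding dimension are arbitrary placeholders, not values from this model:

```python
import torch
import torch.nn.functional as F

def mnrl(anchors: torch.Tensor, positives: torch.Tensor,
         negatives: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # Candidates for every anchor: all in-batch positives plus all hard negatives.
    candidates = torch.cat([positives, negatives], dim=0)  # (2B, D)
    # Scaled cosine similarity of each anchor against each candidate.
    sims = F.cosine_similarity(
        anchors.unsqueeze(1), candidates.unsqueeze(0), dim=-1
    ) * scale  # (B, 2B)
    # The "correct class" for anchor i is its own positive at column i.
    labels = torch.arange(anchors.size(0))
    return F.cross_entropy(sims, labels)

# Toy batch of 4 triplets with 8-dimensional embeddings.
anchors, positives, negatives = (torch.randn(4, 8) for _ in range(3))
print(mnrl(anchors, positives, negatives))
```

The `scale` of 20.0 sharpens the softmax over candidates, so the model is penalized harder whenever any other in-batch text outranks the true positive.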
#### amazon_review_2018

* Dataset: amazon_review_2018
* Size: 99,352 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 11.86 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 88.89 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 70.8 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### ccnews_title_text

* Dataset: ccnews_title_text
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 15.24 tokens</li><li>max: 59 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 210.26 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 194.92 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### agnews

* Dataset: agnews
* Size: 44,606 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 11.73 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 39.85 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 45.43 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### xsum

* Dataset: xsum
* Size: 10,140 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 27.77 tokens</li><li>max: 58 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 226.87 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 41 tokens</li><li>mean: 232.14 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### msmarco

* Dataset: msmarco
* Size: 173,354 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 9.07 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 82.14 tokens</li><li>max: 237 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 80.54 tokens</li><li>max: 252 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### yahoo_answers_title_answer

* Dataset: yahoo_answers_title_answer
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 16.73 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 82.94 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 86.15 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### squad_pairs

* Dataset: squad_pairs
* Size: 24,838 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 14.05 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 32 tokens</li><li>mean: 153.91 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 34 tokens</li><li>mean: 162.67 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### wow

* Dataset: wow
* Size: 29,908 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 88.36 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 100 tokens</li><li>mean: 112.02 tokens</li><li>max: 150 tokens</li></ul> | <ul><li>min: 83 tokens</li><li>mean: 113.07 tokens</li><li>max: 147 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-amazon_counterfactual-avs_triplets

* Dataset: mteb-amazon_counterfactual-avs_triplets
* Size: 4,055 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 12 tokens</li><li>mean: 27.68 tokens</li><li>max: 137 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 26.84 tokens</li><li>max: 137 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 26.34 tokens</li><li>max: 91 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-amazon_massive_intent-avs_triplets

* Dataset: mteb-amazon_massive_intent-avs_triplets
* Size: 11,661 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 9.5 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 9.05 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 9.45 tokens</li><li>max: 25 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-amazon_massive_scenario-avs_triplets

* Dataset: mteb-amazon_massive_scenario-avs_triplets
* Size: 11,661 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 9.62 tokens</li><li>max: 39 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 9.19 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 9.59 tokens</li><li>max: 24 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-amazon_reviews_multi-avs_triplets

* Dataset: mteb-amazon_reviews_multi-avs_triplets
* Size: 198,192 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 49.55 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 49.51 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 48.42 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-banking77-avs_triplets

* Dataset: mteb-banking77-avs_triplets
* Size: 10,139 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 15.81 tokens</li><li>max: 73 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.77 tokens</li><li>max: 73 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 16.1 tokens</li><li>max: 73 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-emotion-avs_triplets

* Dataset: mteb-emotion-avs_triplets
* Size: 16,224 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 22.04 tokens</li><li>max: 67 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 17.71 tokens</li><li>max: 65 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 21.99 tokens</li><li>max: 72 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-imdb-avs_triplets

* Dataset: mteb-imdb-avs_triplets
* Size: 24,839 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 34 tokens</li><li>mean: 207.67 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 36 tokens</li><li>mean: 223.93 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 42 tokens</li><li>mean: 206.87 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-mtop_domain-avs_triplets

* Dataset: mteb-mtop_domain-avs_triplets
* Size: 15,715 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 10.27 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.62 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 10.01 tokens</li><li>max: 33 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-mtop_intent-avs_triplets

* Dataset: mteb-mtop_intent-avs_triplets
* Size: 15,715 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 10.22 tokens</li><li>max: 35 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.74 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 10.43 tokens</li><li>max: 28 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-toxic_conversations_50k-avs_triplets

* Dataset: mteb-toxic_conversations_50k-avs_triplets
* Size: 49,677 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 67.17 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 88.29 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 64.96 tokens</li><li>max: 252 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### mteb-tweet_sentiment_extraction-avs_triplets

* Dataset: mteb-tweet_sentiment_extraction-avs_triplets
* Size: 27,373 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 20.58 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 20.26 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 21.1 tokens</li><li>max: 59 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```

#### covid-bing-query-gpt4-avs_triplets

* Dataset: covid-bing-query-gpt4-avs_triplets
* Size: 5,070 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 15.28 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 37.6 tokens</li><li>max: 92 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 38.13 tokens</li><li>max: 239 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```
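The token statistics in the tables above are straightforward to reproduce. A minimal sketch, assuming `tokenizer` is the model's tokenizer and `dataset` is one of the triplet datasets (both are placeholders here, initialized with toy values so the snippet runs standalone):

```python
from statistics import mean

from datasets import Dataset
from transformers import AutoTokenizer

# Placeholder tokenizer and data; substitute the actual model tokenizer and dataset.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
dataset = Dataset.from_dict({
    "anchor": ["what is a triplet?"],
    "positive": ["a triplet pairs an anchor with a positive and a negative text."],
    "negative": ["completely unrelated text."],
})

def token_stats(column: str, n: int = 1000) -> tuple[int, float, int]:
    # Counts are capped at the 256-token sequence limit, which is why
    # so many columns above report "max: 256 tokens".
    lengths = [
        len(tokenizer(text, truncation=True, max_length=256)["input_ids"])
        for text in dataset[column][:n]
    ]
    return min(lengths), round(mean(lengths), 2), max(lengths)

for col in ("anchor", "positive", "negative"):
    print(col, token_stats(col))
```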
### Evaluation Dataset

#### Unnamed Dataset

* Size: 18,269 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:-------|:---------|:---------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 16.04 tokens</li><li>max: 55 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 142.75 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 144.56 tokens</li><li>max: 256 tokens</li></ul> |

* Samples:
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:

```json
{"scale": 20.0, "similarity_fct": "cos_sim"}
```
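The `medi-mteb-dev_max_accuracy` column in the training logs below appears to be triplet accuracy on this dev split: the fraction of triplets where the anchor is closer to its positive than to its negative. A hedged sketch of how such a score can be computed with the Sentence Transformers `TripletEvaluator` (the model name and example strings are placeholders, not the actual dev data):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder model

# In practice, the anchors/positives/negatives would come from the
# 18,269-sample dev split described above.
dev_evaluator = TripletEvaluator(
    anchors=["how do I reset my password?"],
    positives=["Steps to reset a forgotten account password."],
    negatives=["Opening hours of the local library."],
    name="medi-mteb-dev",
)

# Returns the share of triplets ranked correctly (reported per distance
# function; "max_accuracy" is the best value across them).
print(dev_evaluator(model))
```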
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 512
- `per_device_eval_batch_size`: 512
- `learning_rate`: 2e-05
- `num_train_epochs`: 10
- `warmup_ratio`: 0.1
- `fp16`: True
- `gradient_checkpointing`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 512
- `per_device_eval_batch_size`: 512
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: True
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
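Taken together, the non-default values above map directly onto the Sentence Transformers v3 training API. A minimal sketch of that setup, assuming the triplet datasets have been loaded into a dict keyed by the names listed earlier; the base checkpoint and the toy data here are placeholders, not the ones actually used for this model:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import (
    BatchSamplers,
    MultiDatasetBatchSamplers,
    SentenceTransformerTrainingArguments,
)

model = SentenceTransformer("microsoft/mpnet-base")  # placeholder base checkpoint

# Toy stand-in for the real triplet datasets (anchor/positive/negative columns).
toy = Dataset.from_dict({
    "anchor": ["a query"],
    "positive": ["a matching text"],
    "negative": ["an unrelated text"],
})
train_datasets = {"nli": toy}  # one entry per dataset section above
eval_dataset = toy             # stand-in for the 18,269-sample dev split

loss = MultipleNegativesRankingLoss(model, scale=20.0)  # cos_sim is the default

args = SentenceTransformerTrainingArguments(
    output_dir="output",
    eval_strategy="steps",
    per_device_train_batch_size=512,
    per_device_eval_batch_size=512,
    learning_rate=2e-5,
    num_train_epochs=10,
    warmup_ratio=0.1,
    fp16=True,
    gradient_checkpointing=True,
    # Avoid duplicate texts within a batch; duplicates would act as
    # false negatives under the in-batch-negatives loss.
    batch_sampler=BatchSamplers.NO_DUPLICATES,
    # Sample batches from the many datasets in proportion to their size.
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.PROPORTIONAL,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_datasets,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```

The large 512-sample batches matter here: with in-batch negatives, every extra example in the batch is an additional negative for every anchor, so bigger batches make the ranking task harder and the resulting embeddings sharper. The training logs below record loss and dev triplet accuracy every 500 steps under this configuration.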
### Training Logs

| Epoch | Step | Training Loss | loss | medi-mteb-dev_max_accuracy |
|:------:|:-----:|:-------------:|:------:|:--------------------------:|
| 0 | 0 | - | - | 0.8705 |
| 0.1308 | 500 | 2.1744 | 1.5723 | 0.8786 |
| 0.2616 | 1000 | 1.9245 | 1.5045 | 0.8851 |
| 0.3925 | 1500 | 1.9833 | 1.4719 | 0.8882 |
| 0.5233 | 2000 | 1.7492 | 1.4434 | 0.8909 |
| 0.6541 | 2500 | 1.8815 | 1.4244 | 0.8935 |
| 0.7849 | 3000 | 1.7921 | 1.4064 | 0.8949 |
| 0.9158 | 3500 | 1.8495 | 1.3894 | 0.8956 |
| 1.0466 | 4000 | 1.7415 | 1.3744 | 0.8966 |
| 1.1774 | 4500 | 1.8663 | 1.3619 | 0.9005 |
| 1.3082 | 5000 | 1.7016 | 1.3520 | 0.8979 |
| 1.4390 | 5500 | 1.7308 | 1.3467 | 0.9007 |
| 1.5699 | 6000 | 1.6965 | 1.3346 | 0.9021 |
| 1.7007 | 6500 | 1.7355 | 1.3251 | 0.9018 |
| 1.8315 | 7000 | 1.6783 | 1.3156 | 0.9031 |
| 1.9623 | 7500 | 1.6381 | 1.3101 | 0.9047 |
| 2.0931 | 8000 | 1.7169 | 1.3056 | 0.9044 |
| 2.2240 | 8500 | 1.6527 | 1.3070 | 0.9039 |
| 2.3548 | 9000 | 1.7078 | 1.2977 | 0.9055 |
| 2.4856 | 9500 | 1.533 | 1.2991 | 0.9050 |
| 2.6164 | 10000 | 1.6676 | 1.2916 | 0.9057 |
| 2.7473 | 10500 | 1.5866 | 1.2885 | 0.9053 |
| 2.8781 | 11000 | 1.641 | 1.2765 | 0.9066 |
| 3.0089 | 11500 | 1.5193 | 1.2816 | 0.9062 |
| 3.1397 | 12000 | 1.6907 | 1.2804 | 0.9065 |
| 3.2705 | 12500 | 1.557 | 1.2684 | 0.9065 |
| 3.4014 | 13000 | 1.6808 | 1.2711 | 0.9075 |
| 3.5322 | 13500 | 1.4751 | 1.2700 | 0.9072 |
| 3.6630 | 14000 | 1.5934 | 1.2692 | 0.9081 |
| 3.7938 | 14500 | 1.5395 | 1.2672 | 0.9087 |
| 3.9246 | 15000 | 1.5809 | 1.2678 | 0.9072 |
| 4.0555 | 15500 | 1.4972 | 1.2621 | 0.9089 |
| 4.1863 | 16000 | 1.614 | 1.2690 | 0.9070 |
| 4.3171 | 16500 | 1.5186 | 1.2625 | 0.9091 |
| 4.4479 | 17000 | 1.5239 | 1.2629 | 0.9079 |
| 4.5788 | 17500 | 1.5354 | 1.2569 | 0.9086 |
| 4.7096 | 18000 | 1.5134 | 1.2559 | 0.9095 |
| 4.8404 | 18500 | 1.5237 | 1.2494 | 0.9100 |
| 4.9712 | 19000 | 1.5038 | 1.2486 | 0.9113 |
| 5.1020 | 19500 | 1.5527 | 1.2493 | 0.9098 |
| 5.2329 | 20000 | 1.5018 | 1.2521 | 0.9102 |
| 5.3637 | 20500 | 1.584 | 1.2496 | 0.9095 |
| 5.4945 | 21000 | 1.3948 | 1.2467 | 0.9102 |
| 5.6253 | 21500 | 1.5118 | 1.2487 | 0.9098 |
| 5.7561 | 22000 | 1.458 | 1.2471 | 0.9098 |
| 5.8870 | 22500 | 1.5158 | 1.2367 | 0.9105 |
| 6.0178 | 23000 | 1.4091 | 1.2480 | 0.9096 |
| 6.1486 | 23500 | 1.5823 | 1.2456 | 0.9114 |
| 6.2794 | 24000 | 1.4383 | 1.2404 | 0.9101 |
| 6.4103 | 24500 | 1.5606 | 1.2431 | 0.9100 |
| 6.5411 | 25000 | 1.3906 | 1.2386 | 0.9112 |
| 6.6719 | 25500 | 1.4887 | 1.2382 | 0.9103 |
| 6.8027 | 26000 | 1.4347 | 1.2384 | 0.9112 |
| 6.9335 | 26500 | 1.4733 | 1.2395 | 0.9113 |
| 7.0644 | 27000 | 1.4323 | 1.2385 | 0.9111 |
| 7.1952 | 27500 | 1.505 | 1.2413 | 0.9107 |
| 7.3260 | 28000 | 1.4648 | 1.2362 | 0.9114 |
| 7.4568 | 28500 | 1.4252 | 1.2361 | 0.9116 |
| 7.5877 | 29000 | 1.458 | 1.2344 | 0.9118 |
| 7.7185 | 29500 | 1.4309 | 1.2357 | 0.9120 |
| 7.8493 | 30000 | 1.4431 | 1.2330 | 0.9114 |
| 7.9801 | 30500 | 1.4266 | 1.2306 | 0.9127 |
| 8.1109 | 31000 | 1.4803 | 1.2328 | 0.9118 |
| 8.2418 | 31500 | 1.414 | 1.2345 | 0.9110 |
| 8.3726 | 32000 | 1.5456 | 1.2343 | 0.9116 |
| 8.5034 | 32500 | 1.346 | 1.2324 | 0.9118 |
| 8.6342 | 33000 | 1.4467 | 1.2315 | 0.9118 |
| 8.7650 | 33500 | 1.3864 | 1.2330 | 0.9119 |
| 8.8959 | 34000 | 1.4806 | 1.2277 | 0.9119 |
| 9.0267 | 34500 | 1.3381 | 1.2330 | 0.9119 |
| 9.1575 | 35000 | 1.5277 | 1.2315 | 0.9121 |
| 9.2883 | 35500 | 1.3966 | 1.2309 | 0.9112 |
| 9.4192 | 36000 | 1.4921 | 1.2321 | 0.9117 |
| 9.5500 | 36500 | 1.3668 | 1.2303 | 0.9118 |
| 9.6808 | 37000 | 1.4407 | 1.2308 | 0.9121 |
| 9.8116 | 37500 | 1.3852 | 1.2314 | 0.9118 |
| 9.9424 | 38000 | 1.4329 | 1.2300 | 0.9120 |

### Framework Versions

- Python: 3.10.10
- Sentence Transformers: 3.1.0.dev0
- Transformers: 4.42.4
- PyTorch: 2.3.1+cu121
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1
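With those versions installed, inference with the trained checkpoint follows the standard Sentence Transformers pattern. A minimal sketch; `"path/to/checkpoint"` is a placeholder for wherever the model was saved or published, not an actual repository name:

```python
from sentence_transformers import SentenceTransformer

# Placeholder path: substitute the saved or published checkpoint.
model = SentenceTransformer("path/to/checkpoint")

embeddings = model.encode([
    "How do I reset my online banking password?",
    "Steps to recover access to a bank account",
])
# Pairwise similarity of the two sentence embeddings (cosine by default).
print(model.similarity(embeddings[0:1], embeddings[1:2]))
```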
## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss

```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
[ "TEXT_CLASSIFICATION", "SUMMARIZATION", "PARAPHRASING" ]
[ "PUBMEDQA", "SCIFACT", "SCIQ", "SCITAIL" ]
Non_BioNLP
ISOISS/jina-embeddings-v3-tei
ISOISS
feature-extraction
[ "transformers", "pytorch", "onnx", "safetensors", "xlm-roberta", "feature-extraction", "sentence-similarity", "mteb", "sentence-transformers", "custom_code", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "arxiv:2409.10173", "license:cc-by-nc-4.0", "model-index", "text-embeddings-inference", "region:us" ]
1,731
1,731
9
0
--- language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - false - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh library_name: transformers license: cc-by-nc-4.0 tags: - feature-extraction - sentence-similarity - mteb - sentence-transformers inference: false model-index: - name: jina-embeddings-v3 results: - task: type: STS dataset: name: MTEB AFQMC (default) type: C-MTEB/AFQMC config: default split: validation revision: b44c3b011063adb25877c13823db83bb193913c4 metrics: - type: cosine_pearson value: 41.74237700998808 - type: cosine_spearman value: 43.4726782647566 - type: euclidean_pearson value: 42.244585459479964 - type: euclidean_spearman value: 43.525070045169606 - type: main_score value: 43.4726782647566 - type: manhattan_pearson value: 42.04616728224863 - type: manhattan_spearman value: 43.308828270754645 - type: pearson value: 41.74237700998808 - type: spearman value: 43.4726782647566 - task: type: Retrieval dataset: name: MTEB ArguAna-PL (default) type: clarin-knext/arguana-pl config: default split: test revision: 63fc86750af76253e8c760fc9e534bbf24d260a2 metrics: - type: main_score value: 50.117999999999995 - type: map_at_1 value: 24.253 - type: map_at_10 value: 40.725 - type: map_at_100 value: 41.699999999999996 - type: map_at_1000 value: 41.707 - type: map_at_20 value: 41.467999999999996 - type: map_at_3 value: 35.467 - type: map_at_5 value: 38.291 - type: mrr_at_1 value: 24.751066856330013 - type: mrr_at_10 value: 40.91063808169072 - type: mrr_at_100 value: 41.885497923928675 - type: mrr_at_1000 value: 41.89301098419842 - type: mrr_at_20 value: 41.653552355442514 - type: mrr_at_3 value: 35.656709340919775 - type: mrr_at_5 value: 38.466097676623946 - type: nauc_map_at_1000_diff1 value: 7.503000359807567 - type: nauc_map_at_1000_max value: -11.030405164830546 - type: nauc_map_at_1000_std value: -8.902792782585117 - type: nauc_map_at_100_diff1 value: 7.509899249593199 - type: nauc_map_at_100_max value: -11.023581259404406 - type: nauc_map_at_100_std value: -8.892241185067272 - type: nauc_map_at_10_diff1 value: 7.24369711881512 - type: nauc_map_at_10_max value: -10.810000200433278 - type: nauc_map_at_10_std value: -8.987230542165776 - type: nauc_map_at_1_diff1 value: 11.37175831832417 - type: nauc_map_at_1_max value: -13.315221903223055 - type: nauc_map_at_1_std value: -9.398199605510275 - type: nauc_map_at_20_diff1 value: 7.477364530860648 - type: nauc_map_at_20_max value: -10.901251218105566 - type: nauc_map_at_20_std value: -8.868148116405925 - type: nauc_map_at_3_diff1 value: 6.555548802174882 - type: nauc_map_at_3_max value: -12.247274800542934 - type: nauc_map_at_3_std value: -9.879475250984811 - type: nauc_map_at_5_diff1 value: 7.426588563355882 - type: nauc_map_at_5_max value: -11.347695686001805 - type: nauc_map_at_5_std value: -9.34441892203972 - type: nauc_mrr_at_1000_diff1 value: 5.99737552143614 - type: nauc_mrr_at_1000_max value: -11.327205136505727 - type: nauc_mrr_at_1000_std value: -8.791079115519503 - type: nauc_mrr_at_100_diff1 value: 6.004622525255784 - type: nauc_mrr_at_100_max value: -11.320336759899723 - type: nauc_mrr_at_100_std value: -8.780602249831777 - type: nauc_mrr_at_10_diff1 
value: 5.783623516930227 - type: nauc_mrr_at_10_max value: -11.095971693467078 - type: nauc_mrr_at_10_std value: -8.877242032013582 - type: nauc_mrr_at_1_diff1 value: 9.694937537703797 - type: nauc_mrr_at_1_max value: -12.531905083727912 - type: nauc_mrr_at_1_std value: -8.903992940100146 - type: nauc_mrr_at_20_diff1 value: 5.984841206233873 - type: nauc_mrr_at_20_max value: -11.195236951048969 - type: nauc_mrr_at_20_std value: -8.757266039186018 - type: nauc_mrr_at_3_diff1 value: 5.114333824261379 - type: nauc_mrr_at_3_max value: -12.64809799843464 - type: nauc_mrr_at_3_std value: -9.791146138025184 - type: nauc_mrr_at_5_diff1 value: 5.88941606224512 - type: nauc_mrr_at_5_max value: -11.763903418071918 - type: nauc_mrr_at_5_std value: -9.279175712709446 - type: nauc_ndcg_at_1000_diff1 value: 7.076950652226086 - type: nauc_ndcg_at_1000_max value: -10.386482092087371 - type: nauc_ndcg_at_1000_std value: -8.309190917074046 - type: nauc_ndcg_at_100_diff1 value: 7.2329220284865245 - type: nauc_ndcg_at_100_max value: -10.208048403220337 - type: nauc_ndcg_at_100_std value: -7.997975874274613 - type: nauc_ndcg_at_10_diff1 value: 6.065391100006953 - type: nauc_ndcg_at_10_max value: -9.046164377601153 - type: nauc_ndcg_at_10_std value: -8.34724889697153 - type: nauc_ndcg_at_1_diff1 value: 11.37175831832417 - type: nauc_ndcg_at_1_max value: -13.315221903223055 - type: nauc_ndcg_at_1_std value: -9.398199605510275 - type: nauc_ndcg_at_20_diff1 value: 6.949389989202601 - type: nauc_ndcg_at_20_max value: -9.35740451760307 - type: nauc_ndcg_at_20_std value: -7.761295171828212 - type: nauc_ndcg_at_3_diff1 value: 5.051471796151364 - type: nauc_ndcg_at_3_max value: -12.158763333711653 - type: nauc_ndcg_at_3_std value: -10.078902544421926 - type: nauc_ndcg_at_5_diff1 value: 6.527454512611454 - type: nauc_ndcg_at_5_max value: -10.525118233848586 - type: nauc_ndcg_at_5_std value: -9.120055125584031 - type: nauc_precision_at_1000_diff1 value: -10.6495668199151 - type: nauc_precision_at_1000_max value: 12.070656425217841 - type: nauc_precision_at_1000_std value: 55.844551709649004 - type: nauc_precision_at_100_diff1 value: 19.206967129266285 - type: nauc_precision_at_100_max value: 16.296851020813456 - type: nauc_precision_at_100_std value: 45.60378984257811 - type: nauc_precision_at_10_diff1 value: 0.6490335354304879 - type: nauc_precision_at_10_max value: 0.5757198255366447 - type: nauc_precision_at_10_std value: -4.875847131691451 - type: nauc_precision_at_1_diff1 value: 11.37175831832417 - type: nauc_precision_at_1_max value: -13.315221903223055 - type: nauc_precision_at_1_std value: -9.398199605510275 - type: nauc_precision_at_20_diff1 value: 4.899369866929203 - type: nauc_precision_at_20_max value: 5.988537297189552 - type: nauc_precision_at_20_std value: 4.830900387582837 - type: nauc_precision_at_3_diff1 value: 0.8791156910997744 - type: nauc_precision_at_3_max value: -11.983373635905993 - type: nauc_precision_at_3_std value: -10.646185111581257 - type: nauc_precision_at_5_diff1 value: 3.9314486166548432 - type: nauc_precision_at_5_max value: -7.798591396895839 - type: nauc_precision_at_5_std value: -8.293043407234125 - type: nauc_recall_at_1000_diff1 value: -10.649566819918673 - type: nauc_recall_at_1000_max value: 12.070656425214647 - type: nauc_recall_at_1000_std value: 55.84455170965023 - type: nauc_recall_at_100_diff1 value: 19.206967129265127 - type: nauc_recall_at_100_max value: 16.296851020813722 - type: nauc_recall_at_100_std value: 45.60378984257728 - type: nauc_recall_at_10_diff1 value: 
0.6490335354304176 - type: nauc_recall_at_10_max value: 0.5757198255366095 - type: nauc_recall_at_10_std value: -4.875847131691468 - type: nauc_recall_at_1_diff1 value: 11.37175831832417 - type: nauc_recall_at_1_max value: -13.315221903223055 - type: nauc_recall_at_1_std value: -9.398199605510275 - type: nauc_recall_at_20_diff1 value: 4.899369866929402 - type: nauc_recall_at_20_max value: 5.98853729718968 - type: nauc_recall_at_20_std value: 4.830900387582967 - type: nauc_recall_at_3_diff1 value: 0.8791156910997652 - type: nauc_recall_at_3_max value: -11.983373635905997 - type: nauc_recall_at_3_std value: -10.64618511158124 - type: nauc_recall_at_5_diff1 value: 3.9314486166548472 - type: nauc_recall_at_5_max value: -7.7985913968958585 - type: nauc_recall_at_5_std value: -8.293043407234132 - type: ndcg_at_1 value: 24.253 - type: ndcg_at_10 value: 50.117999999999995 - type: ndcg_at_100 value: 54.291999999999994 - type: ndcg_at_1000 value: 54.44799999999999 - type: ndcg_at_20 value: 52.771 - type: ndcg_at_3 value: 39.296 - type: ndcg_at_5 value: 44.373000000000005 - type: precision_at_1 value: 24.253 - type: precision_at_10 value: 8.016 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.527 - type: precision_at_3 value: 16.808999999999997 - type: precision_at_5 value: 12.546 - type: recall_at_1 value: 24.253 - type: recall_at_10 value: 80.156 - type: recall_at_100 value: 98.43499999999999 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_20 value: 90.54100000000001 - type: recall_at_3 value: 50.427 - type: recall_at_5 value: 62.731 - task: type: Retrieval dataset: name: MTEB DBPedia-PL (default) type: clarin-knext/dbpedia-pl config: default split: test revision: 76afe41d9af165cc40999fcaa92312b8b012064a metrics: - type: main_score value: 34.827000000000005 - type: map_at_1 value: 7.049999999999999 - type: map_at_10 value: 14.982999999999999 - type: map_at_100 value: 20.816000000000003 - type: map_at_1000 value: 22.33 - type: map_at_20 value: 17.272000000000002 - type: map_at_3 value: 10.661 - type: map_at_5 value: 12.498 - type: mrr_at_1 value: 57.25 - type: mrr_at_10 value: 65.81934523809524 - type: mrr_at_100 value: 66.2564203928212 - type: mrr_at_1000 value: 66.27993662923856 - type: mrr_at_20 value: 66.0732139130649 - type: mrr_at_3 value: 64.08333333333333 - type: mrr_at_5 value: 65.27083333333333 - type: nauc_map_at_1000_diff1 value: 16.41780871174038 - type: nauc_map_at_1000_max value: 30.193946325654654 - type: nauc_map_at_1000_std value: 31.46095497039037 - type: nauc_map_at_100_diff1 value: 18.57903165498531 - type: nauc_map_at_100_max value: 29.541476938623262 - type: nauc_map_at_100_std value: 28.228604103301052 - type: nauc_map_at_10_diff1 value: 24.109434489748946 - type: nauc_map_at_10_max value: 21.475954208048968 - type: nauc_map_at_10_std value: 9.964464537806988 - type: nauc_map_at_1_diff1 value: 38.67437644802124 - type: nauc_map_at_1_max value: 14.52136658726491 - type: nauc_map_at_1_std value: -2.8981666782088755 - type: nauc_map_at_20_diff1 value: 21.42547228801935 - type: nauc_map_at_20_max value: 25.04510402960458 - type: nauc_map_at_20_std value: 16.533079346431155 - type: nauc_map_at_3_diff1 value: 26.63648858245477 - type: nauc_map_at_3_max value: 13.632235789780415 - type: nauc_map_at_3_std value: -0.40129174577700716 - type: nauc_map_at_5_diff1 value: 24.513861031197933 - type: nauc_map_at_5_max value: 16.599888813946688 - type: nauc_map_at_5_std value: 3.4448514739556346 - type: 
nauc_mrr_at_1000_diff1 value: 36.57353464537154 - type: nauc_mrr_at_1000_max value: 55.34763483979515 - type: nauc_mrr_at_1000_std value: 40.3722796438533 - type: nauc_mrr_at_100_diff1 value: 36.555989566513134 - type: nauc_mrr_at_100_max value: 55.347805216808396 - type: nauc_mrr_at_100_std value: 40.38465945075711 - type: nauc_mrr_at_10_diff1 value: 36.771572999261984 - type: nauc_mrr_at_10_max value: 55.41239897909165 - type: nauc_mrr_at_10_std value: 40.52058934624793 - type: nauc_mrr_at_1_diff1 value: 38.2472828531032 - type: nauc_mrr_at_1_max value: 51.528473828685705 - type: nauc_mrr_at_1_std value: 33.03676467942882 - type: nauc_mrr_at_20_diff1 value: 36.642602571889036 - type: nauc_mrr_at_20_max value: 55.3763342076553 - type: nauc_mrr_at_20_std value: 40.41520090500838 - type: nauc_mrr_at_3_diff1 value: 36.79451847426628 - type: nauc_mrr_at_3_max value: 54.59778581826193 - type: nauc_mrr_at_3_std value: 39.48392075873095 - type: nauc_mrr_at_5_diff1 value: 36.92150807529304 - type: nauc_mrr_at_5_max value: 55.03553978718272 - type: nauc_mrr_at_5_std value: 40.20147745489917 - type: nauc_ndcg_at_1000_diff1 value: 21.843092744321268 - type: nauc_ndcg_at_1000_max value: 44.93275990394279 - type: nauc_ndcg_at_1000_std value: 47.09186225236347 - type: nauc_ndcg_at_100_diff1 value: 25.180282568979095 - type: nauc_ndcg_at_100_max value: 41.737709709508394 - type: nauc_ndcg_at_100_std value: 38.80950644139446 - type: nauc_ndcg_at_10_diff1 value: 24.108368037214046 - type: nauc_ndcg_at_10_max value: 41.29298370689967 - type: nauc_ndcg_at_10_std value: 35.06450769738732 - type: nauc_ndcg_at_1_diff1 value: 35.51010679525079 - type: nauc_ndcg_at_1_max value: 42.40790024212412 - type: nauc_ndcg_at_1_std value: 26.696412036243157 - type: nauc_ndcg_at_20_diff1 value: 23.909989673256195 - type: nauc_ndcg_at_20_max value: 39.78444647091927 - type: nauc_ndcg_at_20_std value: 33.39544470364529 - type: nauc_ndcg_at_3_diff1 value: 22.50484297956035 - type: nauc_ndcg_at_3_max value: 39.14551926034168 - type: nauc_ndcg_at_3_std value: 30.330135925392014 - type: nauc_ndcg_at_5_diff1 value: 21.7798872028265 - type: nauc_ndcg_at_5_max value: 40.23856975248015 - type: nauc_ndcg_at_5_std value: 32.438381067440396 - type: nauc_precision_at_1000_diff1 value: -21.62692442272279 - type: nauc_precision_at_1000_max value: 0.9689046974430882 - type: nauc_precision_at_1000_std value: 18.54001058230465 - type: nauc_precision_at_100_diff1 value: -10.132258779856192 - type: nauc_precision_at_100_max value: 23.74516110444681 - type: nauc_precision_at_100_std value: 47.03416663319965 - type: nauc_precision_at_10_diff1 value: 1.543656509571949 - type: nauc_precision_at_10_max value: 36.98864812757555 - type: nauc_precision_at_10_std value: 46.56427199077426 - type: nauc_precision_at_1_diff1 value: 38.2472828531032 - type: nauc_precision_at_1_max value: 51.528473828685705 - type: nauc_precision_at_1_std value: 33.03676467942882 - type: nauc_precision_at_20_diff1 value: -4.612864872734335 - type: nauc_precision_at_20_max value: 34.03565449182125 - type: nauc_precision_at_20_std value: 48.880727648349534 - type: nauc_precision_at_3_diff1 value: 6.360850444467829 - type: nauc_precision_at_3_max value: 36.25816942368427 - type: nauc_precision_at_3_std value: 34.48882647419187 - type: nauc_precision_at_5_diff1 value: 2.6445596936740037 - type: nauc_precision_at_5_max value: 37.174463388899056 - type: nauc_precision_at_5_std value: 40.25254370626113 - type: nauc_recall_at_1000_diff1 value: 13.041227176748077 - type: 
nauc_recall_at_1000_max value: 39.722336427072094 - type: nauc_recall_at_1000_std value: 52.04032890059214 - type: nauc_recall_at_100_diff1 value: 18.286096899139153 - type: nauc_recall_at_100_max value: 34.072389201930314 - type: nauc_recall_at_100_std value: 37.73637623416653 - type: nauc_recall_at_10_diff1 value: 22.35560419280504 - type: nauc_recall_at_10_max value: 19.727247199595197 - type: nauc_recall_at_10_std value: 8.58498575109203 - type: nauc_recall_at_1_diff1 value: 38.67437644802124 - type: nauc_recall_at_1_max value: 14.52136658726491 - type: nauc_recall_at_1_std value: -2.8981666782088755 - type: nauc_recall_at_20_diff1 value: 19.026320886902916 - type: nauc_recall_at_20_max value: 22.753562309469867 - type: nauc_recall_at_20_std value: 14.89994263882445 - type: nauc_recall_at_3_diff1 value: 23.428129702129684 - type: nauc_recall_at_3_max value: 10.549153954790542 - type: nauc_recall_at_3_std value: -1.7590608997055206 - type: nauc_recall_at_5_diff1 value: 21.27448645803921 - type: nauc_recall_at_5_max value: 13.620279707461677 - type: nauc_recall_at_5_std value: 2.0577962208292675 - type: ndcg_at_1 value: 46.75 - type: ndcg_at_10 value: 34.827000000000005 - type: ndcg_at_100 value: 38.157999999999994 - type: ndcg_at_1000 value: 44.816 - type: ndcg_at_20 value: 34.152 - type: ndcg_at_3 value: 39.009 - type: ndcg_at_5 value: 36.826 - type: precision_at_1 value: 57.25 - type: precision_at_10 value: 27.575 - type: precision_at_100 value: 8.84 - type: precision_at_1000 value: 1.949 - type: precision_at_20 value: 20.724999999999998 - type: precision_at_3 value: 41.167 - type: precision_at_5 value: 35.199999999999996 - type: recall_at_1 value: 7.049999999999999 - type: recall_at_10 value: 19.817999999999998 - type: recall_at_100 value: 42.559999999999995 - type: recall_at_1000 value: 63.744 - type: recall_at_20 value: 25.968000000000004 - type: recall_at_3 value: 11.959 - type: recall_at_5 value: 14.939 - task: type: Retrieval dataset: name: MTEB FiQA-PL (default) type: clarin-knext/fiqa-pl config: default split: test revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e metrics: - type: main_score value: 38.828 - type: map_at_1 value: 19.126 - type: map_at_10 value: 31.002000000000002 - type: map_at_100 value: 32.736 - type: map_at_1000 value: 32.933 - type: map_at_20 value: 31.894 - type: map_at_3 value: 26.583000000000002 - type: map_at_5 value: 28.904000000000003 - type: mrr_at_1 value: 37.808641975308646 - type: mrr_at_10 value: 46.36745541838134 - type: mrr_at_100 value: 47.14140915794908 - type: mrr_at_1000 value: 47.190701435388846 - type: mrr_at_20 value: 46.81387776440309 - type: mrr_at_3 value: 43.750000000000014 - type: mrr_at_5 value: 45.23919753086418 - type: nauc_map_at_1000_diff1 value: 38.5532285881503 - type: nauc_map_at_1000_max value: 34.44383884813453 - type: nauc_map_at_1000_std value: -1.3963497949476722 - type: nauc_map_at_100_diff1 value: 38.49292464176943 - type: nauc_map_at_100_max value: 34.33752755618645 - type: nauc_map_at_100_std value: -1.4794032905848582 - type: nauc_map_at_10_diff1 value: 38.26061536370962 - type: nauc_map_at_10_max value: 33.16977912721411 - type: nauc_map_at_10_std value: -2.3853370604730393 - type: nauc_map_at_1_diff1 value: 46.288767289528344 - type: nauc_map_at_1_max value: 25.67706785013364 - type: nauc_map_at_1_std value: -6.989769609924645 - type: nauc_map_at_20_diff1 value: 38.507270129330685 - type: nauc_map_at_20_max value: 33.70963328055982 - type: nauc_map_at_20_std value: -1.9835510011554272 - type: nauc_map_at_3_diff1 
value: 39.81061518646884 - type: nauc_map_at_3_max value: 30.101186374147748 - type: nauc_map_at_3_std value: -4.027120247237715 - type: nauc_map_at_5_diff1 value: 38.55602589746512 - type: nauc_map_at_5_max value: 31.515174267015983 - type: nauc_map_at_5_std value: -3.4064239358570303 - type: nauc_mrr_at_1000_diff1 value: 45.030514454725726 - type: nauc_mrr_at_1000_max value: 43.878919881666164 - type: nauc_mrr_at_1000_std value: 2.517594250297626 - type: nauc_mrr_at_100_diff1 value: 45.00868212878687 - type: nauc_mrr_at_100_max value: 43.87437011120001 - type: nauc_mrr_at_100_std value: 2.5257874265014966 - type: nauc_mrr_at_10_diff1 value: 44.855044606754056 - type: nauc_mrr_at_10_max value: 43.946617058785186 - type: nauc_mrr_at_10_std value: 2.5173751662794044 - type: nauc_mrr_at_1_diff1 value: 49.441510997817346 - type: nauc_mrr_at_1_max value: 43.08547383044357 - type: nauc_mrr_at_1_std value: -1.8747770703324347 - type: nauc_mrr_at_20_diff1 value: 45.019880416584215 - type: nauc_mrr_at_20_max value: 43.85691473662242 - type: nauc_mrr_at_20_std value: 2.4625487605091303 - type: nauc_mrr_at_3_diff1 value: 45.322041658604036 - type: nauc_mrr_at_3_max value: 43.95079293074395 - type: nauc_mrr_at_3_std value: 2.4644274393435737 - type: nauc_mrr_at_5_diff1 value: 44.99461837803437 - type: nauc_mrr_at_5_max value: 43.97934275090601 - type: nauc_mrr_at_5_std value: 2.5353091695125096 - type: nauc_ndcg_at_1000_diff1 value: 39.38449023275524 - type: nauc_ndcg_at_1000_max value: 39.48382767312788 - type: nauc_ndcg_at_1000_std value: 3.414789408343409 - type: nauc_ndcg_at_100_diff1 value: 38.29675861135578 - type: nauc_ndcg_at_100_max value: 38.2674786507297 - type: nauc_ndcg_at_100_std value: 2.7094055381218207 - type: nauc_ndcg_at_10_diff1 value: 38.09514955708717 - type: nauc_ndcg_at_10_max value: 36.664923238906525 - type: nauc_ndcg_at_10_std value: 0.6901410544967921 - type: nauc_ndcg_at_1_diff1 value: 49.441510997817346 - type: nauc_ndcg_at_1_max value: 43.08547383044357 - type: nauc_ndcg_at_1_std value: -1.8747770703324347 - type: nauc_ndcg_at_20_diff1 value: 38.44967736231759 - type: nauc_ndcg_at_20_max value: 36.871179313622584 - type: nauc_ndcg_at_20_std value: 1.157560360065234 - type: nauc_ndcg_at_3_diff1 value: 39.02419271805571 - type: nauc_ndcg_at_3_max value: 37.447669442586324 - type: nauc_ndcg_at_3_std value: 0.41502589779297794 - type: nauc_ndcg_at_5_diff1 value: 38.10233452742001 - type: nauc_ndcg_at_5_max value: 35.816381905465676 - type: nauc_ndcg_at_5_std value: -0.3704499913387088 - type: nauc_precision_at_1000_diff1 value: 2.451267097838658 - type: nauc_precision_at_1000_max value: 29.116394969085306 - type: nauc_precision_at_1000_std value: 14.85900786538363 - type: nauc_precision_at_100_diff1 value: 8.10919082251277 - type: nauc_precision_at_100_max value: 36.28388256191417 - type: nauc_precision_at_100_std value: 14.830039904317657 - type: nauc_precision_at_10_diff1 value: 15.02446609920477 - type: nauc_precision_at_10_max value: 41.008463775454054 - type: nauc_precision_at_10_std value: 10.431403152334486 - type: nauc_precision_at_1_diff1 value: 49.441510997817346 - type: nauc_precision_at_1_max value: 43.08547383044357 - type: nauc_precision_at_1_std value: -1.8747770703324347 - type: nauc_precision_at_20_diff1 value: 14.222022201169926 - type: nauc_precision_at_20_max value: 40.10189643835305 - type: nauc_precision_at_20_std value: 12.204443815975527 - type: nauc_precision_at_3_diff1 value: 25.41905395341234 - type: nauc_precision_at_3_max value: 
41.56133905339819 - type: nauc_precision_at_3_std value: 5.575516915590082 - type: nauc_precision_at_5_diff1 value: 20.20081221089351 - type: nauc_precision_at_5_max value: 40.95218555916681 - type: nauc_precision_at_5_std value: 7.2040745500708745 - type: nauc_recall_at_1000_diff1 value: 28.021198234033395 - type: nauc_recall_at_1000_max value: 36.165148684597504 - type: nauc_recall_at_1000_std value: 28.28852356008973 - type: nauc_recall_at_100_diff1 value: 21.882447802741897 - type: nauc_recall_at_100_max value: 26.979684607567222 - type: nauc_recall_at_100_std value: 9.783658817010082 - type: nauc_recall_at_10_diff1 value: 28.493097951178818 - type: nauc_recall_at_10_max value: 29.40937476550134 - type: nauc_recall_at_10_std value: 2.7593763576979353 - type: nauc_recall_at_1_diff1 value: 46.288767289528344 - type: nauc_recall_at_1_max value: 25.67706785013364 - type: nauc_recall_at_1_std value: -6.989769609924645 - type: nauc_recall_at_20_diff1 value: 27.638381299425234 - type: nauc_recall_at_20_max value: 27.942035836106328 - type: nauc_recall_at_20_std value: 3.489835161380808 - type: nauc_recall_at_3_diff1 value: 33.90054781392646 - type: nauc_recall_at_3_max value: 27.778812533030322 - type: nauc_recall_at_3_std value: -0.03054068020022706 - type: nauc_recall_at_5_diff1 value: 30.279060732221346 - type: nauc_recall_at_5_max value: 27.49854749597931 - type: nauc_recall_at_5_std value: 0.5434664581939099 - type: ndcg_at_1 value: 37.809 - type: ndcg_at_10 value: 38.828 - type: ndcg_at_100 value: 45.218 - type: ndcg_at_1000 value: 48.510999999999996 - type: ndcg_at_20 value: 41.11 - type: ndcg_at_3 value: 34.466 - type: ndcg_at_5 value: 35.843 - type: precision_at_1 value: 37.809 - type: precision_at_10 value: 11.157 - type: precision_at_100 value: 1.762 - type: precision_at_1000 value: 0.233 - type: precision_at_20 value: 6.497 - type: precision_at_3 value: 23.044999999999998 - type: precision_at_5 value: 17.284 - type: recall_at_1 value: 19.126 - type: recall_at_10 value: 46.062 - type: recall_at_100 value: 70.22800000000001 - type: recall_at_1000 value: 89.803 - type: recall_at_20 value: 53.217999999999996 - type: recall_at_3 value: 30.847 - type: recall_at_5 value: 37.11 - task: type: Retrieval dataset: name: MTEB HotpotQA-PL (default) type: clarin-knext/hotpotqa-pl config: default split: test revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907 metrics: - type: main_score value: 60.27 - type: map_at_1 value: 35.199000000000005 - type: map_at_10 value: 51.369 - type: map_at_100 value: 52.212 - type: map_at_1000 value: 52.28 - type: map_at_20 value: 51.864 - type: map_at_3 value: 48.446 - type: map_at_5 value: 50.302 - type: mrr_at_1 value: 70.39837947332883 - type: mrr_at_10 value: 76.8346141067273 - type: mrr_at_100 value: 77.10724392048137 - type: mrr_at_1000 value: 77.12037412892865 - type: mrr_at_20 value: 77.01061532947222 - type: mrr_at_3 value: 75.5908170155299 - type: mrr_at_5 value: 76.39095205941899 - type: nauc_map_at_1000_diff1 value: 24.701387884989117 - type: nauc_map_at_1000_max value: 23.25553235642178 - type: nauc_map_at_1000_std value: 7.1803506915661774 - type: nauc_map_at_100_diff1 value: 24.674498622483103 - type: nauc_map_at_100_max value: 23.234948525052175 - type: nauc_map_at_100_std value: 7.168677997105447 - type: nauc_map_at_10_diff1 value: 24.676025039755626 - type: nauc_map_at_10_max value: 23.171971872726964 - type: nauc_map_at_10_std value: 6.485610909852058 - type: nauc_map_at_1_diff1 value: 68.90178464319715 - type: nauc_map_at_1_max value: 
46.05537868917558 - type: nauc_map_at_1_std value: 1.7658552480698708 - type: nauc_map_at_20_diff1 value: 24.69297151842494 - type: nauc_map_at_20_max value: 23.213064691673637 - type: nauc_map_at_20_std value: 6.9357946556849 - type: nauc_map_at_3_diff1 value: 26.279128947950507 - type: nauc_map_at_3_max value: 23.929537354117922 - type: nauc_map_at_3_std value: 4.625061565714759 - type: nauc_map_at_5_diff1 value: 25.04448959482816 - type: nauc_map_at_5_max value: 23.432012857899338 - type: nauc_map_at_5_std value: 5.845744681998008 - type: nauc_mrr_at_1000_diff1 value: 66.7503918108276 - type: nauc_mrr_at_1000_max value: 48.42897342336844 - type: nauc_mrr_at_1000_std value: 5.3097517971144415 - type: nauc_mrr_at_100_diff1 value: 66.74645215862695 - type: nauc_mrr_at_100_max value: 48.4368663009989 - type: nauc_mrr_at_100_std value: 5.322297898555188 - type: nauc_mrr_at_10_diff1 value: 66.69310166180729 - type: nauc_mrr_at_10_max value: 48.475437698330225 - type: nauc_mrr_at_10_std value: 5.258183461631702 - type: nauc_mrr_at_1_diff1 value: 68.90178464319715 - type: nauc_mrr_at_1_max value: 46.05537868917558 - type: nauc_mrr_at_1_std value: 1.7658552480698708 - type: nauc_mrr_at_20_diff1 value: 66.72000262431975 - type: nauc_mrr_at_20_max value: 48.45593642981319 - type: nauc_mrr_at_20_std value: 5.353665929072101 - type: nauc_mrr_at_3_diff1 value: 66.84936676396276 - type: nauc_mrr_at_3_max value: 48.466611276778295 - type: nauc_mrr_at_3_std value: 4.485810398557475 - type: nauc_mrr_at_5_diff1 value: 66.62362565394174 - type: nauc_mrr_at_5_max value: 48.456431835482014 - type: nauc_mrr_at_5_std value: 5.08482458391903 - type: nauc_ndcg_at_1000_diff1 value: 29.984825173719443 - type: nauc_ndcg_at_1000_max value: 27.289179238639893 - type: nauc_ndcg_at_1000_std value: 10.661480455527526 - type: nauc_ndcg_at_100_diff1 value: 29.322074257047877 - type: nauc_ndcg_at_100_max value: 26.850650276220605 - type: nauc_ndcg_at_100_std value: 10.599247982501902 - type: nauc_ndcg_at_10_diff1 value: 29.659909113886094 - type: nauc_ndcg_at_10_max value: 26.836139599331005 - type: nauc_ndcg_at_10_std value: 8.12844399452719 - type: nauc_ndcg_at_1_diff1 value: 68.90178464319715 - type: nauc_ndcg_at_1_max value: 46.05537868917558 - type: nauc_ndcg_at_1_std value: 1.7658552480698708 - type: nauc_ndcg_at_20_diff1 value: 29.510802214854294 - type: nauc_ndcg_at_20_max value: 26.775562637730722 - type: nauc_ndcg_at_20_std value: 9.341342661702363 - type: nauc_ndcg_at_3_diff1 value: 32.741885846292966 - type: nauc_ndcg_at_3_max value: 28.44225108761343 - type: nauc_ndcg_at_3_std value: 5.204440768465042 - type: nauc_ndcg_at_5_diff1 value: 30.57856348635919 - type: nauc_ndcg_at_5_max value: 27.475007474301698 - type: nauc_ndcg_at_5_std value: 6.961546044312487 - type: nauc_precision_at_1000_diff1 value: 0.002113156309413332 - type: nauc_precision_at_1000_max value: 11.198242419541286 - type: nauc_precision_at_1000_std value: 28.69676419166541 - type: nauc_precision_at_100_diff1 value: 3.6049575557782627 - type: nauc_precision_at_100_max value: 12.499173524574791 - type: nauc_precision_at_100_std value: 23.3755281004721 - type: nauc_precision_at_10_diff1 value: 10.922574784853193 - type: nauc_precision_at_10_max value: 16.23221529562036 - type: nauc_precision_at_10_std value: 12.45014808813857 - type: nauc_precision_at_1_diff1 value: 68.90178464319715 - type: nauc_precision_at_1_max value: 46.05537868917558 - type: nauc_precision_at_1_std value: 1.7658552480698708 - type: nauc_precision_at_20_diff1 value: 
8.840710781302827 - type: nauc_precision_at_20_max value: 14.804644554205524 - type: nauc_precision_at_20_std value: 16.245009770815237 - type: nauc_precision_at_3_diff1 value: 19.447291487137573 - type: nauc_precision_at_3_max value: 21.47123471597057 - type: nauc_precision_at_3_std value: 6.441862800128802 - type: nauc_precision_at_5_diff1 value: 14.078545719721108 - type: nauc_precision_at_5_max value: 18.468288046016387 - type: nauc_precision_at_5_std value: 9.58650641691393 - type: nauc_recall_at_1000_diff1 value: 0.0021131563095336584 - type: nauc_recall_at_1000_max value: 11.198242419541558 - type: nauc_recall_at_1000_std value: 28.6967641916655 - type: nauc_recall_at_100_diff1 value: 3.6049575557781393 - type: nauc_recall_at_100_max value: 12.499173524574765 - type: nauc_recall_at_100_std value: 23.375528100472074 - type: nauc_recall_at_10_diff1 value: 10.922574784853168 - type: nauc_recall_at_10_max value: 16.2322152956203 - type: nauc_recall_at_10_std value: 12.450148088138535 - type: nauc_recall_at_1_diff1 value: 68.90178464319715 - type: nauc_recall_at_1_max value: 46.05537868917558 - type: nauc_recall_at_1_std value: 1.7658552480698708 - type: nauc_recall_at_20_diff1 value: 8.840710781302905 - type: nauc_recall_at_20_max value: 14.804644554205515 - type: nauc_recall_at_20_std value: 16.245009770815273 - type: nauc_recall_at_3_diff1 value: 19.447291487137498 - type: nauc_recall_at_3_max value: 21.47123471597054 - type: nauc_recall_at_3_std value: 6.441862800128763 - type: nauc_recall_at_5_diff1 value: 14.07854571972115 - type: nauc_recall_at_5_max value: 18.468288046016337 - type: nauc_recall_at_5_std value: 9.586506416913904 - type: ndcg_at_1 value: 70.39800000000001 - type: ndcg_at_10 value: 60.27 - type: ndcg_at_100 value: 63.400999999999996 - type: ndcg_at_1000 value: 64.847 - type: ndcg_at_20 value: 61.571 - type: ndcg_at_3 value: 55.875 - type: ndcg_at_5 value: 58.36599999999999 - type: precision_at_1 value: 70.39800000000001 - type: precision_at_10 value: 12.46 - type: precision_at_100 value: 1.493 - type: precision_at_1000 value: 0.169 - type: precision_at_20 value: 6.65 - type: precision_at_3 value: 35.062 - type: precision_at_5 value: 23.009 - type: recall_at_1 value: 35.199000000000005 - type: recall_at_10 value: 62.302 - type: recall_at_100 value: 74.666 - type: recall_at_1000 value: 84.355 - type: recall_at_20 value: 66.496 - type: recall_at_3 value: 52.593 - type: recall_at_5 value: 57.522 - task: type: Retrieval dataset: name: MTEB MSMARCO-PL (default) type: clarin-knext/msmarco-pl config: default split: test revision: 8634c07806d5cce3a6138e260e59b81760a0a640 metrics: - type: main_score value: 64.886 - type: map_at_1 value: 1.644 - type: map_at_10 value: 12.24 - type: map_at_100 value: 28.248 - type: map_at_1000 value: 33.506 - type: map_at_20 value: 17.497 - type: map_at_3 value: 4.9399999999999995 - type: map_at_5 value: 8.272 - type: mrr_at_1 value: 83.72093023255815 - type: mrr_at_10 value: 91.08527131782945 - type: mrr_at_100 value: 91.08527131782945 - type: mrr_at_1000 value: 91.08527131782945 - type: mrr_at_20 value: 91.08527131782945 - type: mrr_at_3 value: 91.08527131782945 - type: mrr_at_5 value: 91.08527131782945 - type: nauc_map_at_1000_diff1 value: -36.428271627303424 - type: nauc_map_at_1000_max value: 44.87615127218638 - type: nauc_map_at_1000_std value: 67.92696808824724 - type: nauc_map_at_100_diff1 value: -28.11674206786188 - type: nauc_map_at_100_max value: 36.422779766334955 - type: nauc_map_at_100_std value: 49.99876313755116 - type: 
nauc_map_at_10_diff1 value: -5.838593619806058 - type: nauc_map_at_10_max value: 11.026519190509742 - type: nauc_map_at_10_std value: 2.5268752263522045 - type: nauc_map_at_1_diff1 value: 17.897907271073016 - type: nauc_map_at_1_max value: 12.229062762540844 - type: nauc_map_at_1_std value: -4.088830895573149 - type: nauc_map_at_20_diff1 value: -13.871097716255626 - type: nauc_map_at_20_max value: 19.291271635609533 - type: nauc_map_at_20_std value: 16.745335606507826 - type: nauc_map_at_3_diff1 value: 4.425238457033843 - type: nauc_map_at_3_max value: 4.611864744680824 - type: nauc_map_at_3_std value: -8.986916608582863 - type: nauc_map_at_5_diff1 value: -6.254849256920095 - type: nauc_map_at_5_max value: 2.729437079919823 - type: nauc_map_at_5_std value: -7.235906279913092 - type: nauc_mrr_at_1000_diff1 value: 52.18669104947672 - type: nauc_mrr_at_1000_max value: 68.26259125411818 - type: nauc_mrr_at_1000_std value: 56.345086428353575 - type: nauc_mrr_at_100_diff1 value: 52.18669104947672 - type: nauc_mrr_at_100_max value: 68.26259125411818 - type: nauc_mrr_at_100_std value: 56.345086428353575 - type: nauc_mrr_at_10_diff1 value: 52.18669104947672 - type: nauc_mrr_at_10_max value: 68.26259125411818 - type: nauc_mrr_at_10_std value: 56.345086428353575 - type: nauc_mrr_at_1_diff1 value: 56.55126663944154 - type: nauc_mrr_at_1_max value: 66.37014285522565 - type: nauc_mrr_at_1_std value: 53.2508271389779 - type: nauc_mrr_at_20_diff1 value: 52.18669104947672 - type: nauc_mrr_at_20_max value: 68.26259125411818 - type: nauc_mrr_at_20_std value: 56.345086428353575 - type: nauc_mrr_at_3_diff1 value: 52.18669104947672 - type: nauc_mrr_at_3_max value: 68.26259125411818 - type: nauc_mrr_at_3_std value: 56.345086428353575 - type: nauc_mrr_at_5_diff1 value: 52.18669104947672 - type: nauc_mrr_at_5_max value: 68.26259125411818 - type: nauc_mrr_at_5_std value: 56.345086428353575 - type: nauc_ndcg_at_1000_diff1 value: -19.06422926483731 - type: nauc_ndcg_at_1000_max value: 56.30853514590265 - type: nauc_ndcg_at_1000_std value: 70.30810947505557 - type: nauc_ndcg_at_100_diff1 value: -25.72587586459692 - type: nauc_ndcg_at_100_max value: 51.433781241604194 - type: nauc_ndcg_at_100_std value: 68.37678512652792 - type: nauc_ndcg_at_10_diff1 value: -23.21198108212602 - type: nauc_ndcg_at_10_max value: 43.5450720846516 - type: nauc_ndcg_at_10_std value: 48.78307907005605 - type: nauc_ndcg_at_1_diff1 value: 44.00179301267447 - type: nauc_ndcg_at_1_max value: 48.202370455680395 - type: nauc_ndcg_at_1_std value: 25.69655992704088 - type: nauc_ndcg_at_20_diff1 value: -33.88168753446507 - type: nauc_ndcg_at_20_max value: 45.16199742613164 - type: nauc_ndcg_at_20_std value: 61.87098383164902 - type: nauc_ndcg_at_3_diff1 value: 11.19174449544048 - type: nauc_ndcg_at_3_max value: 44.34069860560555 - type: nauc_ndcg_at_3_std value: 27.451258369798115 - type: nauc_ndcg_at_5_diff1 value: -7.186520929432436 - type: nauc_ndcg_at_5_max value: 43.41869981139378 - type: nauc_ndcg_at_5_std value: 34.89898115995178 - type: nauc_precision_at_1000_diff1 value: -34.43998154563451 - type: nauc_precision_at_1000_max value: 29.172655907480372 - type: nauc_precision_at_1000_std value: 65.15824469614837 - type: nauc_precision_at_100_diff1 value: -37.82409643259692 - type: nauc_precision_at_100_max value: 38.24986991317909 - type: nauc_precision_at_100_std value: 72.74768183105327 - type: nauc_precision_at_10_diff1 value: -32.21556182780535 - type: nauc_precision_at_10_max value: 34.27170432382651 - type: nauc_precision_at_10_std value: 
58.358255004394664 - type: nauc_precision_at_1_diff1 value: 56.55126663944154 - type: nauc_precision_at_1_max value: 66.37014285522565 - type: nauc_precision_at_1_std value: 53.2508271389779 - type: nauc_precision_at_20_diff1 value: -40.18751579026395 - type: nauc_precision_at_20_max value: 33.960783153758896 - type: nauc_precision_at_20_std value: 65.42918390184195 - type: nauc_precision_at_3_diff1 value: -7.073870209006578 - type: nauc_precision_at_3_max value: 50.81535269862325 - type: nauc_precision_at_3_std value: 59.248681565955685 - type: nauc_precision_at_5_diff1 value: -31.136580596983876 - type: nauc_precision_at_5_max value: 45.88147792380426 - type: nauc_precision_at_5_std value: 67.46814230928243 - type: nauc_recall_at_1000_diff1 value: -23.15699999594577 - type: nauc_recall_at_1000_max value: 39.77277799761876 - type: nauc_recall_at_1000_std value: 60.326168012901114 - type: nauc_recall_at_100_diff1 value: -21.636664823598498 - type: nauc_recall_at_100_max value: 31.104969346131583 - type: nauc_recall_at_100_std value: 38.811686891592096 - type: nauc_recall_at_10_diff1 value: -10.542765625053569 - type: nauc_recall_at_10_max value: 2.043876058107446 - type: nauc_recall_at_10_std value: -5.578449908984766 - type: nauc_recall_at_1_diff1 value: 17.897907271073016 - type: nauc_recall_at_1_max value: 12.229062762540844 - type: nauc_recall_at_1_std value: -4.088830895573149 - type: nauc_recall_at_20_diff1 value: -15.132909355710103 - type: nauc_recall_at_20_max value: 12.659765287241065 - type: nauc_recall_at_20_std value: 8.277887800815819 - type: nauc_recall_at_3_diff1 value: -3.1975017812715016 - type: nauc_recall_at_3_max value: -3.5539857085038538 - type: nauc_recall_at_3_std value: -14.712102851318118 - type: nauc_recall_at_5_diff1 value: -14.040507717380743 - type: nauc_recall_at_5_max value: -6.126912150131701 - type: nauc_recall_at_5_std value: -13.821624015640355 - type: ndcg_at_1 value: 71.318 - type: ndcg_at_10 value: 64.886 - type: ndcg_at_100 value: 53.187 - type: ndcg_at_1000 value: 59.897999999999996 - type: ndcg_at_20 value: 58.96 - type: ndcg_at_3 value: 69.736 - type: ndcg_at_5 value: 70.14099999999999 - type: precision_at_1 value: 83.721 - type: precision_at_10 value: 71.163 - type: precision_at_100 value: 29.465000000000003 - type: precision_at_1000 value: 5.665 - type: precision_at_20 value: 57.791000000000004 - type: precision_at_3 value: 82.171 - type: precision_at_5 value: 81.86 - type: recall_at_1 value: 1.644 - type: recall_at_10 value: 14.238000000000001 - type: recall_at_100 value: 39.831 - type: recall_at_1000 value: 64.057 - type: recall_at_20 value: 21.021 - type: recall_at_3 value: 5.53 - type: recall_at_5 value: 9.623 - task: type: Retrieval dataset: name: MTEB NFCorpus-PL (default) type: clarin-knext/nfcorpus-pl config: default split: test revision: 9a6f9567fda928260afed2de480d79c98bf0bec0 metrics: - type: main_score value: 31.391000000000002 - type: map_at_1 value: 4.163 - type: map_at_10 value: 10.744 - type: map_at_100 value: 14.038999999999998 - type: map_at_1000 value: 15.434999999999999 - type: map_at_20 value: 12.16 - type: map_at_3 value: 7.614999999999999 - type: map_at_5 value: 9.027000000000001 - type: mrr_at_1 value: 39.0092879256966 - type: mrr_at_10 value: 48.69809327239668 - type: mrr_at_100 value: 49.20788148442068 - type: mrr_at_1000 value: 49.25509336494706 - type: mrr_at_20 value: 48.99606551850896 - type: mrr_at_3 value: 46.284829721362236 - type: mrr_at_5 value: 47.77089783281735 - type: nauc_map_at_1000_diff1 value: 
22.75421477116417 - type: nauc_map_at_1000_max value: 49.242283787799046 - type: nauc_map_at_1000_std value: 29.056888272331832 - type: nauc_map_at_100_diff1 value: 23.585977398585594 - type: nauc_map_at_100_max value: 48.25845199409498 - type: nauc_map_at_100_std value: 24.944264511223693 - type: nauc_map_at_10_diff1 value: 27.386613094780255 - type: nauc_map_at_10_max value: 41.52415346691586 - type: nauc_map_at_10_std value: 12.93872448563755 - type: nauc_map_at_1_diff1 value: 46.78688143865053 - type: nauc_map_at_1_max value: 37.20408843995871 - type: nauc_map_at_1_std value: 4.383444959401098 - type: nauc_map_at_20_diff1 value: 25.590969047740288 - type: nauc_map_at_20_max value: 44.57109307999418 - type: nauc_map_at_20_std value: 16.45855141821407 - type: nauc_map_at_3_diff1 value: 36.30017108362863 - type: nauc_map_at_3_max value: 34.66149613991648 - type: nauc_map_at_3_std value: 5.67985905078467 - type: nauc_map_at_5_diff1 value: 31.157644795417223 - type: nauc_map_at_5_max value: 37.274738661636825 - type: nauc_map_at_5_std value: 8.70088872394168 - type: nauc_mrr_at_1000_diff1 value: 25.638564218157384 - type: nauc_mrr_at_1000_max value: 57.77788270285353 - type: nauc_mrr_at_1000_std value: 43.507586592911274 - type: nauc_mrr_at_100_diff1 value: 25.662002580561584 - type: nauc_mrr_at_100_max value: 57.80578394278584 - type: nauc_mrr_at_100_std value: 43.543905743986635 - type: nauc_mrr_at_10_diff1 value: 25.426034796339835 - type: nauc_mrr_at_10_max value: 57.68443186258669 - type: nauc_mrr_at_10_std value: 43.438009108331215 - type: nauc_mrr_at_1_diff1 value: 26.073028156311075 - type: nauc_mrr_at_1_max value: 52.11817916720053 - type: nauc_mrr_at_1_std value: 37.41073893153695 - type: nauc_mrr_at_20_diff1 value: 25.548645553336147 - type: nauc_mrr_at_20_max value: 57.78552760401915 - type: nauc_mrr_at_20_std value: 43.521687428822325 - type: nauc_mrr_at_3_diff1 value: 25.72662577397805 - type: nauc_mrr_at_3_max value: 56.891263536265605 - type: nauc_mrr_at_3_std value: 41.384872305390104 - type: nauc_mrr_at_5_diff1 value: 25.552211551655386 - type: nauc_mrr_at_5_max value: 57.976813828353926 - type: nauc_mrr_at_5_std value: 43.504564461855544 - type: nauc_ndcg_at_1000_diff1 value: 23.456158044182757 - type: nauc_ndcg_at_1000_max value: 60.05411773552709 - type: nauc_ndcg_at_1000_std value: 47.857510017262584 - type: nauc_ndcg_at_100_diff1 value: 19.711635700390772 - type: nauc_ndcg_at_100_max value: 56.178746740470665 - type: nauc_ndcg_at_100_std value: 42.36829180286942 - type: nauc_ndcg_at_10_diff1 value: 18.364428967788413 - type: nauc_ndcg_at_10_max value: 54.38372506578223 - type: nauc_ndcg_at_10_std value: 41.75765411340369 - type: nauc_ndcg_at_1_diff1 value: 26.571093272640773 - type: nauc_ndcg_at_1_max value: 51.061788341958284 - type: nauc_ndcg_at_1_std value: 36.514987974075986 - type: nauc_ndcg_at_20_diff1 value: 18.345487193027697 - type: nauc_ndcg_at_20_max value: 54.62621882656994 - type: nauc_ndcg_at_20_std value: 41.42835554714241 - type: nauc_ndcg_at_3_diff1 value: 23.260105658139025 - type: nauc_ndcg_at_3_max value: 52.07747385334546 - type: nauc_ndcg_at_3_std value: 36.91985577837284 - type: nauc_ndcg_at_5_diff1 value: 20.40428109665566 - type: nauc_ndcg_at_5_max value: 53.52015347884604 - type: nauc_ndcg_at_5_std value: 39.46008849580017 - type: nauc_precision_at_1000_diff1 value: -7.3487344916380035 - type: nauc_precision_at_1000_max value: 16.58045221394852 - type: nauc_precision_at_1000_std value: 38.94030932397075 - type: nauc_precision_at_100_diff1 
value: -5.257743986683922 - type: nauc_precision_at_100_max value: 34.43071687475306 - type: nauc_precision_at_100_std value: 53.499519170670474 - type: nauc_precision_at_10_diff1 value: 2.385136433119139 - type: nauc_precision_at_10_max value: 47.210743878631064 - type: nauc_precision_at_10_std value: 47.22767704186548 - type: nauc_precision_at_1_diff1 value: 26.073028156311075 - type: nauc_precision_at_1_max value: 52.11817916720053 - type: nauc_precision_at_1_std value: 37.41073893153695 - type: nauc_precision_at_20_diff1 value: -0.3531531127238474 - type: nauc_precision_at_20_max value: 44.78044604856974 - type: nauc_precision_at_20_std value: 49.532804150743615 - type: nauc_precision_at_3_diff1 value: 15.350050569991447 - type: nauc_precision_at_3_max value: 51.01572315596549 - type: nauc_precision_at_3_std value: 38.801125728413155 - type: nauc_precision_at_5_diff1 value: 9.109003666144694 - type: nauc_precision_at_5_max value: 50.935269774898494 - type: nauc_precision_at_5_std value: 43.323548180559676 - type: nauc_recall_at_1000_diff1 value: 16.64743647648886 - type: nauc_recall_at_1000_max value: 38.46012283772285 - type: nauc_recall_at_1000_std value: 36.02016164796441 - type: nauc_recall_at_100_diff1 value: 14.005834785186744 - type: nauc_recall_at_100_max value: 37.70026105513647 - type: nauc_recall_at_100_std value: 27.085222642129697 - type: nauc_recall_at_10_diff1 value: 21.204106627422632 - type: nauc_recall_at_10_max value: 36.737624881893424 - type: nauc_recall_at_10_std value: 13.755054514272702 - type: nauc_recall_at_1_diff1 value: 46.78688143865053 - type: nauc_recall_at_1_max value: 37.20408843995871 - type: nauc_recall_at_1_std value: 4.383444959401098 - type: nauc_recall_at_20_diff1 value: 19.740977611421933 - type: nauc_recall_at_20_max value: 39.21908969539783 - type: nauc_recall_at_20_std value: 16.560269670318494 - type: nauc_recall_at_3_diff1 value: 32.189359545367815 - type: nauc_recall_at_3_max value: 31.693634445562758 - type: nauc_recall_at_3_std value: 6.246326281543587 - type: nauc_recall_at_5_diff1 value: 25.51586860499901 - type: nauc_recall_at_5_max value: 33.15934725342885 - type: nauc_recall_at_5_std value: 9.677778511696705 - type: ndcg_at_1 value: 37.307 - type: ndcg_at_10 value: 31.391000000000002 - type: ndcg_at_100 value: 28.877999999999997 - type: ndcg_at_1000 value: 37.16 - type: ndcg_at_20 value: 29.314 - type: ndcg_at_3 value: 35.405 - type: ndcg_at_5 value: 33.922999999999995 - type: precision_at_1 value: 39.009 - type: precision_at_10 value: 24.52 - type: precision_at_100 value: 7.703 - type: precision_at_1000 value: 2.04 - type: precision_at_20 value: 18.08 - type: precision_at_3 value: 34.469 - type: precision_at_5 value: 30.712 - type: recall_at_1 value: 4.163 - type: recall_at_10 value: 15.015999999999998 - type: recall_at_100 value: 30.606 - type: recall_at_1000 value: 59.606 - type: recall_at_20 value: 19.09 - type: recall_at_3 value: 9.139 - type: recall_at_5 value: 11.477 - task: type: Retrieval dataset: name: MTEB NQ-PL (default) type: clarin-knext/nq-pl config: default split: test revision: f171245712cf85dd4700b06bef18001578d0ca8d metrics: - type: main_score value: 54.017 - type: map_at_1 value: 34.193 - type: map_at_10 value: 47.497 - type: map_at_100 value: 48.441 - type: map_at_1000 value: 48.481 - type: map_at_20 value: 48.093 - type: map_at_3 value: 44.017 - type: map_at_5 value: 46.111000000000004 - type: mrr_at_1 value: 37.949015063731174 - type: mrr_at_10 value: 49.915772315105954 - type: mrr_at_100 value: 
50.62841255829997 - type: mrr_at_1000 value: 50.656773027666745 - type: mrr_at_20 value: 50.37785276657083 - type: mrr_at_3 value: 46.98725376593267 - type: mrr_at_5 value: 48.763035921205066 - type: nauc_map_at_1000_diff1 value: 39.5632191792873 - type: nauc_map_at_1000_max value: 37.4728247053629 - type: nauc_map_at_1000_std value: 5.742498414663762 - type: nauc_map_at_100_diff1 value: 39.555570352061906 - type: nauc_map_at_100_max value: 37.497880976847334 - type: nauc_map_at_100_std value: 5.7798021019465375 - type: nauc_map_at_10_diff1 value: 39.5423723444454 - type: nauc_map_at_10_max value: 37.41661971723365 - type: nauc_map_at_10_std value: 5.2378002164144695 - type: nauc_map_at_1_diff1 value: 41.52697034146981 - type: nauc_map_at_1_max value: 28.558995576942863 - type: nauc_map_at_1_std value: 0.13094542859192052 - type: nauc_map_at_20_diff1 value: 39.55484628943701 - type: nauc_map_at_20_max value: 37.5247794933719 - type: nauc_map_at_20_std value: 5.702881342279231 - type: nauc_map_at_3_diff1 value: 39.949323925425325 - type: nauc_map_at_3_max value: 35.770298168901924 - type: nauc_map_at_3_std value: 2.9127112432479874 - type: nauc_map_at_5_diff1 value: 39.768310617004545 - type: nauc_map_at_5_max value: 37.1549191664796 - type: nauc_map_at_5_std value: 4.4681285748269515 - type: nauc_mrr_at_1000_diff1 value: 39.14001746706457 - type: nauc_mrr_at_1000_max value: 37.477376518267775 - type: nauc_mrr_at_1000_std value: 6.8088891531621565 - type: nauc_mrr_at_100_diff1 value: 39.13054707413684 - type: nauc_mrr_at_100_max value: 37.498126443766274 - type: nauc_mrr_at_100_std value: 6.839411380129971 - type: nauc_mrr_at_10_diff1 value: 39.09764730048156 - type: nauc_mrr_at_10_max value: 37.58593798217306 - type: nauc_mrr_at_10_std value: 6.713795164982413 - type: nauc_mrr_at_1_diff1 value: 41.581599918664075 - type: nauc_mrr_at_1_max value: 31.500589231378722 - type: nauc_mrr_at_1_std value: 2.059116370339438 - type: nauc_mrr_at_20_diff1 value: 39.09011023988447 - type: nauc_mrr_at_20_max value: 37.55856008791344 - type: nauc_mrr_at_20_std value: 6.847165397615844 - type: nauc_mrr_at_3_diff1 value: 39.382542043738 - type: nauc_mrr_at_3_max value: 36.49265363659468 - type: nauc_mrr_at_3_std value: 4.759157976438336 - type: nauc_mrr_at_5_diff1 value: 39.304826333759976 - type: nauc_mrr_at_5_max value: 37.46326016736024 - type: nauc_mrr_at_5_std value: 6.122608305766621 - type: nauc_ndcg_at_1000_diff1 value: 38.568500038453266 - type: nauc_ndcg_at_1000_max value: 39.799710882413166 - type: nauc_ndcg_at_1000_std value: 9.357010223096639 - type: nauc_ndcg_at_100_diff1 value: 38.38026091343228 - type: nauc_ndcg_at_100_max value: 40.48398173542486 - type: nauc_ndcg_at_100_std value: 10.373054013302214 - type: nauc_ndcg_at_10_diff1 value: 38.27340980909964 - type: nauc_ndcg_at_10_max value: 40.35241649744093 - type: nauc_ndcg_at_10_std value: 8.579139930345168 - type: nauc_ndcg_at_1_diff1 value: 41.581599918664075 - type: nauc_ndcg_at_1_max value: 31.500589231378722 - type: nauc_ndcg_at_1_std value: 2.059116370339438 - type: nauc_ndcg_at_20_diff1 value: 38.26453028884807 - type: nauc_ndcg_at_20_max value: 40.70517858426641 - type: nauc_ndcg_at_20_std value: 9.987693876137905 - type: nauc_ndcg_at_3_diff1 value: 39.2078971733273 - type: nauc_ndcg_at_3_max value: 37.48672195565316 - type: nauc_ndcg_at_3_std value: 4.051464994659221 - type: nauc_ndcg_at_5_diff1 value: 38.883693595665285 - type: nauc_ndcg_at_5_max value: 39.763115634437135 - type: nauc_ndcg_at_5_std value: 6.738980451582073 - 
type: nauc_precision_at_1000_diff1 value: -7.223215910619012 - type: nauc_precision_at_1000_max value: 13.075844604892161 - type: nauc_precision_at_1000_std value: 19.864336920890107 - type: nauc_precision_at_100_diff1 value: 1.3305994810812418 - type: nauc_precision_at_100_max value: 25.9219108557104 - type: nauc_precision_at_100_std value: 27.5076605928207 - type: nauc_precision_at_10_diff1 value: 18.441551484970326 - type: nauc_precision_at_10_max value: 39.85995330437054 - type: nauc_precision_at_10_std value: 20.561269077428914 - type: nauc_precision_at_1_diff1 value: 41.581599918664075 - type: nauc_precision_at_1_max value: 31.500589231378722 - type: nauc_precision_at_1_std value: 2.059116370339438 - type: nauc_precision_at_20_diff1 value: 12.579593891480531 - type: nauc_precision_at_20_max value: 36.620221830588775 - type: nauc_precision_at_20_std value: 26.40364876775059 - type: nauc_precision_at_3_diff1 value: 30.158859294487073 - type: nauc_precision_at_3_max value: 41.168215766389174 - type: nauc_precision_at_3_std value: 9.44345004450809 - type: nauc_precision_at_5_diff1 value: 25.438624678672785 - type: nauc_precision_at_5_max value: 42.72802023518524 - type: nauc_precision_at_5_std value: 15.357657388511099 - type: nauc_recall_at_1000_diff1 value: 24.987564782718003 - type: nauc_recall_at_1000_max value: 70.508416373353 - type: nauc_recall_at_1000_std value: 69.75092280398808 - type: nauc_recall_at_100_diff1 value: 29.504202856421397 - type: nauc_recall_at_100_max value: 63.41356585545318 - type: nauc_recall_at_100_std value: 50.09250954437847 - type: nauc_recall_at_10_diff1 value: 32.355776022971774 - type: nauc_recall_at_10_max value: 49.47121901667283 - type: nauc_recall_at_10_std value: 19.418439406631244 - type: nauc_recall_at_1_diff1 value: 41.52697034146981 - type: nauc_recall_at_1_max value: 28.558995576942863 - type: nauc_recall_at_1_std value: 0.13094542859192052 - type: nauc_recall_at_20_diff1 value: 31.57334731023589 - type: nauc_recall_at_20_max value: 54.06567225197383 - type: nauc_recall_at_20_std value: 29.222029720570468 - type: nauc_recall_at_3_diff1 value: 36.45033533275773 - type: nauc_recall_at_3_max value: 40.39529713780803 - type: nauc_recall_at_3_std value: 5.21893897772794 - type: nauc_recall_at_5_diff1 value: 35.18471678478859 - type: nauc_recall_at_5_max value: 46.20100816867823 - type: nauc_recall_at_5_std value: 11.94481894633221 - type: ndcg_at_1 value: 37.949 - type: ndcg_at_10 value: 54.017 - type: ndcg_at_100 value: 58.126 - type: ndcg_at_1000 value: 59.073 - type: ndcg_at_20 value: 55.928 - type: ndcg_at_3 value: 47.494 - type: ndcg_at_5 value: 50.975 - type: precision_at_1 value: 37.949 - type: precision_at_10 value: 8.450000000000001 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.117 - type: precision_at_20 value: 4.689 - type: precision_at_3 value: 21.051000000000002 - type: precision_at_5 value: 14.664 - type: recall_at_1 value: 34.193 - type: recall_at_10 value: 71.357 - type: recall_at_100 value: 89.434 - type: recall_at_1000 value: 96.536 - type: recall_at_20 value: 78.363 - type: recall_at_3 value: 54.551 - type: recall_at_5 value: 62.543000000000006 - task: type: Retrieval dataset: name: MTEB Quora-PL (default) type: clarin-knext/quora-pl config: default split: test revision: 0be27e93455051e531182b85e85e425aba12e9d4 metrics: - type: main_score value: 84.114 - type: map_at_1 value: 65.848 - type: map_at_10 value: 79.85900000000001 - type: map_at_100 value: 80.582 - type: map_at_1000 value: 80.60300000000001 - 
type: map_at_20 value: 80.321 - type: map_at_3 value: 76.741 - type: map_at_5 value: 78.72200000000001 - type: mrr_at_1 value: 75.97 - type: mrr_at_10 value: 83.04630158730119 - type: mrr_at_100 value: 83.22785731032968 - type: mrr_at_1000 value: 83.23123717623899 - type: mrr_at_20 value: 83.17412021320565 - type: mrr_at_3 value: 81.83333333333287 - type: mrr_at_5 value: 82.61933333333275 - type: nauc_map_at_1000_diff1 value: 73.26316553371083 - type: nauc_map_at_1000_max value: 27.92567859085245 - type: nauc_map_at_1000_std value: -47.477909533360446 - type: nauc_map_at_100_diff1 value: 73.2690602807223 - type: nauc_map_at_100_max value: 27.915868327849996 - type: nauc_map_at_100_std value: -47.525777766107595 - type: nauc_map_at_10_diff1 value: 73.45464428464894 - type: nauc_map_at_10_max value: 27.451611487246296 - type: nauc_map_at_10_std value: -49.35818715843809 - type: nauc_map_at_1_diff1 value: 77.29690208952982 - type: nauc_map_at_1_max value: 19.839875762282293 - type: nauc_map_at_1_std value: -45.355684654708284 - type: nauc_map_at_20_diff1 value: 73.35102731979796 - type: nauc_map_at_20_max value: 27.741506490134583 - type: nauc_map_at_20_std value: -48.22006207310331 - type: nauc_map_at_3_diff1 value: 73.94878241064137 - type: nauc_map_at_3_max value: 24.761321386766728 - type: nauc_map_at_3_std value: -51.20638883618126 - type: nauc_map_at_5_diff1 value: 73.66143558047698 - type: nauc_map_at_5_max value: 26.53483405013543 - type: nauc_map_at_5_std value: -50.697541279640056 - type: nauc_mrr_at_1000_diff1 value: 73.84632320009759 - type: nauc_mrr_at_1000_max value: 30.50182733610048 - type: nauc_mrr_at_1000_std value: -44.3021647995251 - type: nauc_mrr_at_100_diff1 value: 73.84480792662302 - type: nauc_mrr_at_100_max value: 30.50749424571614 - type: nauc_mrr_at_100_std value: -44.29615086388113 - type: nauc_mrr_at_10_diff1 value: 73.79442772949346 - type: nauc_mrr_at_10_max value: 30.55724252219984 - type: nauc_mrr_at_10_std value: -44.50997069462057 - type: nauc_mrr_at_1_diff1 value: 75.23369827945945 - type: nauc_mrr_at_1_max value: 29.20073967447664 - type: nauc_mrr_at_1_std value: -43.1920147658285 - type: nauc_mrr_at_20_diff1 value: 73.82731678072307 - type: nauc_mrr_at_20_max value: 30.566328605497667 - type: nauc_mrr_at_20_std value: -44.24683607643705 - type: nauc_mrr_at_3_diff1 value: 73.61997576749954 - type: nauc_mrr_at_3_max value: 30.150393853381917 - type: nauc_mrr_at_3_std value: -44.96847297506626 - type: nauc_mrr_at_5_diff1 value: 73.69084310616132 - type: nauc_mrr_at_5_max value: 30.578033703441125 - type: nauc_mrr_at_5_std value: -44.74920746066566 - type: nauc_ndcg_at_1000_diff1 value: 72.89349862557452 - type: nauc_ndcg_at_1000_max value: 29.824725190462086 - type: nauc_ndcg_at_1000_std value: -44.96284395063211 - type: nauc_ndcg_at_100_diff1 value: 72.85212753715273 - type: nauc_ndcg_at_100_max value: 29.933114207845605 - type: nauc_ndcg_at_100_std value: -44.944225570663754 - type: nauc_ndcg_at_10_diff1 value: 72.80576740454528 - type: nauc_ndcg_at_10_max value: 29.16829118320828 - type: nauc_ndcg_at_10_std value: -48.149473740079614 - type: nauc_ndcg_at_1_diff1 value: 75.00032534968587 - type: nauc_ndcg_at_1_max value: 29.61849062038547 - type: nauc_ndcg_at_1_std value: -42.560207043864054 - type: nauc_ndcg_at_20_diff1 value: 72.88440406302502 - type: nauc_ndcg_at_20_max value: 29.65496676092656 - type: nauc_ndcg_at_20_std value: -46.21238462167732 - type: nauc_ndcg_at_3_diff1 value: 72.37916962766987 - type: nauc_ndcg_at_3_max value: 27.125094834547586 
- type: nauc_ndcg_at_3_std value: -48.62942991399391 - type: nauc_ndcg_at_5_diff1 value: 72.57017330527658 - type: nauc_ndcg_at_5_max value: 28.470485561757254 - type: nauc_ndcg_at_5_std value: -49.07593345591059 - type: nauc_precision_at_1000_diff1 value: -41.67915575853946 - type: nauc_precision_at_1000_max value: 1.2012264478568844 - type: nauc_precision_at_1000_std value: 44.723834559400466 - type: nauc_precision_at_100_diff1 value: -40.45196679236971 - type: nauc_precision_at_100_max value: 2.3525450401714894 - type: nauc_precision_at_100_std value: 43.7092529413952 - type: nauc_precision_at_10_diff1 value: -30.256026923068767 - type: nauc_precision_at_10_max value: 8.313422052132559 - type: nauc_precision_at_10_std value: 25.929372356449694 - type: nauc_precision_at_1_diff1 value: 75.00032534968587 - type: nauc_precision_at_1_max value: 29.61849062038547 - type: nauc_precision_at_1_std value: -42.560207043864054 - type: nauc_precision_at_20_diff1 value: -35.61971069986584 - type: nauc_precision_at_20_max value: 5.4664303079116765 - type: nauc_precision_at_20_std value: 34.992352471692826 - type: nauc_precision_at_3_diff1 value: -5.691231842471157 - type: nauc_precision_at_3_max value: 14.797949087742444 - type: nauc_precision_at_3_std value: -0.1930317395644928 - type: nauc_precision_at_5_diff1 value: -20.03913781462645 - type: nauc_precision_at_5_max value: 11.956771408712749 - type: nauc_precision_at_5_std value: 13.179251389859731 - type: nauc_recall_at_1000_diff1 value: 64.03509042729674 - type: nauc_recall_at_1000_max value: 40.91691485428493 - type: nauc_recall_at_1000_std value: 16.12968625875372 - type: nauc_recall_at_100_diff1 value: 63.83116179628575 - type: nauc_recall_at_100_max value: 43.72908117676382 - type: nauc_recall_at_100_std value: -20.50966716852155 - type: nauc_recall_at_10_diff1 value: 66.42071960186394 - type: nauc_recall_at_10_max value: 28.983207818687205 - type: nauc_recall_at_10_std value: -56.61417798753744 - type: nauc_recall_at_1_diff1 value: 77.29690208952982 - type: nauc_recall_at_1_max value: 19.839875762282293 - type: nauc_recall_at_1_std value: -45.355684654708284 - type: nauc_recall_at_20_diff1 value: 66.32360705219874 - type: nauc_recall_at_20_max value: 33.30698111822631 - type: nauc_recall_at_20_std value: -43.89233781737452 - type: nauc_recall_at_3_diff1 value: 69.67029394927077 - type: nauc_recall_at_3_max value: 22.67803039327696 - type: nauc_recall_at_3_std value: -56.43327209861502 - type: nauc_recall_at_5_diff1 value: 68.05622143936131 - type: nauc_recall_at_5_max value: 26.67795559040675 - type: nauc_recall_at_5_std value: -58.158231198510954 - type: ndcg_at_1 value: 76.08 - type: ndcg_at_10 value: 84.114 - type: ndcg_at_100 value: 85.784 - type: ndcg_at_1000 value: 85.992 - type: ndcg_at_20 value: 84.976 - type: ndcg_at_3 value: 80.74799999999999 - type: ndcg_at_5 value: 82.626 - type: precision_at_1 value: 76.08 - type: precision_at_10 value: 12.926000000000002 - type: precision_at_100 value: 1.509 - type: precision_at_1000 value: 0.156 - type: precision_at_20 value: 6.912999999999999 - type: precision_at_3 value: 35.5 - type: precision_at_5 value: 23.541999999999998 - type: recall_at_1 value: 65.848 - type: recall_at_10 value: 92.611 - type: recall_at_100 value: 98.69 - type: recall_at_1000 value: 99.83999999999999 - type: recall_at_20 value: 95.47200000000001 - type: recall_at_3 value: 83.122 - type: recall_at_5 value: 88.23 - task: type: Retrieval dataset: name: MTEB SCIDOCS-PL (default) type: clarin-knext/scidocs-pl config: 
default split: test revision: 45452b03f05560207ef19149545f168e596c9337 metrics: - type: main_score value: 15.379999999999999 - type: map_at_1 value: 3.6029999999999998 - type: map_at_10 value: 8.843 - type: map_at_100 value: 10.433 - type: map_at_1000 value: 10.689 - type: map_at_20 value: 9.597 - type: map_at_3 value: 6.363 - type: map_at_5 value: 7.603 - type: mrr_at_1 value: 17.7 - type: mrr_at_10 value: 26.58900793650793 - type: mrr_at_100 value: 27.699652322890987 - type: mrr_at_1000 value: 27.78065313118353 - type: mrr_at_20 value: 27.215020950411816 - type: mrr_at_3 value: 23.36666666666668 - type: mrr_at_5 value: 25.211666666666666 - type: nauc_map_at_1000_diff1 value: 21.92235143827129 - type: nauc_map_at_1000_max value: 37.50300940750989 - type: nauc_map_at_1000_std value: 20.872586122198552 - type: nauc_map_at_100_diff1 value: 21.917408170465833 - type: nauc_map_at_100_max value: 37.4654466815513 - type: nauc_map_at_100_std value: 20.621643878648534 - type: nauc_map_at_10_diff1 value: 22.914388723621183 - type: nauc_map_at_10_max value: 36.468131213468794 - type: nauc_map_at_10_std value: 16.760980140791492 - type: nauc_map_at_1_diff1 value: 29.00799502838457 - type: nauc_map_at_1_max value: 26.64926291797503 - type: nauc_map_at_1_std value: 8.167291261637361 - type: nauc_map_at_20_diff1 value: 22.46580947804047 - type: nauc_map_at_20_max value: 36.656294842562275 - type: nauc_map_at_20_std value: 18.099232417722078 - type: nauc_map_at_3_diff1 value: 23.436009032045934 - type: nauc_map_at_3_max value: 31.325807212280914 - type: nauc_map_at_3_std value: 9.780905232048852 - type: nauc_map_at_5_diff1 value: 22.891704394665528 - type: nauc_map_at_5_max value: 35.40584466642894 - type: nauc_map_at_5_std value: 13.476986099394656 - type: nauc_mrr_at_1000_diff1 value: 25.052937655397866 - type: nauc_mrr_at_1000_max value: 29.64431912670108 - type: nauc_mrr_at_1000_std value: 14.549744963988044 - type: nauc_mrr_at_100_diff1 value: 25.070871266969224 - type: nauc_mrr_at_100_max value: 29.68743604652336 - type: nauc_mrr_at_100_std value: 14.582010154574432 - type: nauc_mrr_at_10_diff1 value: 24.88881466938897 - type: nauc_mrr_at_10_max value: 29.488430770768144 - type: nauc_mrr_at_10_std value: 14.269241073852266 - type: nauc_mrr_at_1_diff1 value: 29.220540327267503 - type: nauc_mrr_at_1_max value: 26.81908580507911 - type: nauc_mrr_at_1_std value: 8.00840295809718 - type: nauc_mrr_at_20_diff1 value: 25.067912695721944 - type: nauc_mrr_at_20_max value: 29.759227563849628 - type: nauc_mrr_at_20_std value: 14.685076859257357 - type: nauc_mrr_at_3_diff1 value: 24.645848739182696 - type: nauc_mrr_at_3_max value: 27.73368549660351 - type: nauc_mrr_at_3_std value: 11.475742805586943 - type: nauc_mrr_at_5_diff1 value: 24.895295760909946 - type: nauc_mrr_at_5_max value: 29.130755033240423 - type: nauc_mrr_at_5_std value: 12.955802929145404 - type: nauc_ndcg_at_1000_diff1 value: 20.68434434777729 - type: nauc_ndcg_at_1000_max value: 37.67055146424174 - type: nauc_ndcg_at_1000_std value: 29.57493715069776 - type: nauc_ndcg_at_100_diff1 value: 20.396834816492383 - type: nauc_ndcg_at_100_max value: 37.460575228670514 - type: nauc_ndcg_at_100_std value: 27.826534756761944 - type: nauc_ndcg_at_10_diff1 value: 22.640844106236027 - type: nauc_ndcg_at_10_max value: 35.21291764462327 - type: nauc_ndcg_at_10_std value: 19.53289455984506 - type: nauc_ndcg_at_1_diff1 value: 29.220540327267503 - type: nauc_ndcg_at_1_max value: 26.81908580507911 - type: nauc_ndcg_at_1_std value: 8.00840295809718 - type: 
nauc_ndcg_at_20_diff1 value: 22.117126657768623 - type: nauc_ndcg_at_20_max value: 35.79395781940806 - type: nauc_ndcg_at_20_std value: 22.242748346260786 - type: nauc_ndcg_at_3_diff1 value: 23.00596063212187 - type: nauc_ndcg_at_3_max value: 30.149013627580523 - type: nauc_ndcg_at_3_std value: 11.07904064662722 - type: nauc_ndcg_at_5_diff1 value: 22.81875419630523 - type: nauc_ndcg_at_5_max value: 34.24267468356626 - type: nauc_ndcg_at_5_std value: 15.307780280752088 - type: nauc_precision_at_1000_diff1 value: 9.606677689029972 - type: nauc_precision_at_1000_max value: 32.74855550489271 - type: nauc_precision_at_1000_std value: 42.65372585937895 - type: nauc_precision_at_100_diff1 value: 11.528981313529545 - type: nauc_precision_at_100_max value: 35.642529490132404 - type: nauc_precision_at_100_std value: 38.146151426052306 - type: nauc_precision_at_10_diff1 value: 18.783957183811836 - type: nauc_precision_at_10_max value: 36.1982008334257 - type: nauc_precision_at_10_std value: 25.09349473195891 - type: nauc_precision_at_1_diff1 value: 29.220540327267503 - type: nauc_precision_at_1_max value: 26.81908580507911 - type: nauc_precision_at_1_std value: 8.00840295809718 - type: nauc_precision_at_20_diff1 value: 17.458766320828214 - type: nauc_precision_at_20_max value: 36.000404903025235 - type: nauc_precision_at_20_std value: 29.1608044138323 - type: nauc_precision_at_3_diff1 value: 20.213669462067166 - type: nauc_precision_at_3_max value: 31.120650847205912 - type: nauc_precision_at_3_std value: 12.390972418818118 - type: nauc_precision_at_5_diff1 value: 20.114245715785678 - type: nauc_precision_at_5_max value: 37.30360111495823 - type: nauc_precision_at_5_std value: 19.053109037822853 - type: nauc_recall_at_1000_diff1 value: 9.85800049032612 - type: nauc_recall_at_1000_max value: 32.48319160802687 - type: nauc_recall_at_1000_std value: 43.79941601741161 - type: nauc_recall_at_100_diff1 value: 11.375255270968337 - type: nauc_recall_at_100_max value: 35.1868784124497 - type: nauc_recall_at_100_std value: 38.422680583482666 - type: nauc_recall_at_10_diff1 value: 18.445783123521938 - type: nauc_recall_at_10_max value: 35.633267936276766 - type: nauc_recall_at_10_std value: 24.94469506254716 - type: nauc_recall_at_1_diff1 value: 29.00799502838457 - type: nauc_recall_at_1_max value: 26.64926291797503 - type: nauc_recall_at_1_std value: 8.167291261637361 - type: nauc_recall_at_20_diff1 value: 17.314906604151936 - type: nauc_recall_at_20_max value: 35.66067699203996 - type: nauc_recall_at_20_std value: 29.400137012506082 - type: nauc_recall_at_3_diff1 value: 19.873710875648698 - type: nauc_recall_at_3_max value: 30.92404718742849 - type: nauc_recall_at_3_std value: 12.400871018075199 - type: nauc_recall_at_5_diff1 value: 19.869948324233192 - type: nauc_recall_at_5_max value: 37.06832511687574 - type: nauc_recall_at_5_std value: 19.0798814966156 - type: ndcg_at_1 value: 17.7 - type: ndcg_at_10 value: 15.379999999999999 - type: ndcg_at_100 value: 22.09 - type: ndcg_at_1000 value: 27.151999999999997 - type: ndcg_at_20 value: 17.576 - type: ndcg_at_3 value: 14.219999999999999 - type: ndcg_at_5 value: 12.579 - type: precision_at_1 value: 17.7 - type: precision_at_10 value: 8.08 - type: precision_at_100 value: 1.7840000000000003 - type: precision_at_1000 value: 0.3 - type: precision_at_20 value: 5.305 - type: precision_at_3 value: 13.167000000000002 - type: precision_at_5 value: 11.06 - type: recall_at_1 value: 3.6029999999999998 - type: recall_at_10 value: 16.413 - type: recall_at_100 value: 36.263 - 
type: recall_at_1000 value: 61.016999999999996 - type: recall_at_20 value: 21.587999999999997 - type: recall_at_3 value: 8.013 - type: recall_at_5 value: 11.198 - task: type: Retrieval dataset: name: MTEB SciFact-PL (default) type: clarin-knext/scifact-pl config: default split: test revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e metrics: - type: main_score value: 64.764 - type: map_at_1 value: 49.778 - type: map_at_10 value: 59.88 - type: map_at_100 value: 60.707 - type: map_at_1000 value: 60.729 - type: map_at_20 value: 60.419999999999995 - type: map_at_3 value: 57.45400000000001 - type: map_at_5 value: 58.729 - type: mrr_at_1 value: 52.33333333333333 - type: mrr_at_10 value: 61.29193121693122 - type: mrr_at_100 value: 61.95817765126313 - type: mrr_at_1000 value: 61.97583284368782 - type: mrr_at_20 value: 61.72469949641003 - type: mrr_at_3 value: 59.44444444444444 - type: mrr_at_5 value: 60.494444444444454 - type: nauc_map_at_1000_diff1 value: 62.21235294015774 - type: nauc_map_at_1000_max value: 48.83996609100249 - type: nauc_map_at_1000_std value: 5.23892781043174 - type: nauc_map_at_100_diff1 value: 62.20170226789429 - type: nauc_map_at_100_max value: 48.8391766453537 - type: nauc_map_at_100_std value: 5.2664077457917715 - type: nauc_map_at_10_diff1 value: 61.961975488329024 - type: nauc_map_at_10_max value: 48.397109987625186 - type: nauc_map_at_10_std value: 4.314859710827481 - type: nauc_map_at_1_diff1 value: 65.0865197011516 - type: nauc_map_at_1_max value: 41.38862781954889 - type: nauc_map_at_1_std value: -0.9182122632530586 - type: nauc_map_at_20_diff1 value: 61.99173935851292 - type: nauc_map_at_20_max value: 48.79961814179307 - type: nauc_map_at_20_std value: 5.262181845825118 - type: nauc_map_at_3_diff1 value: 62.37910539880477 - type: nauc_map_at_3_max value: 47.13627890977091 - type: nauc_map_at_3_std value: 2.327897198087264 - type: nauc_map_at_5_diff1 value: 61.60080757149592 - type: nauc_map_at_5_max value: 47.60052458345962 - type: nauc_map_at_5_std value: 3.1770196981231047 - type: nauc_mrr_at_1000_diff1 value: 62.86810952814966 - type: nauc_mrr_at_1000_max value: 52.13248094447774 - type: nauc_mrr_at_1000_std value: 10.100485746570733 - type: nauc_mrr_at_100_diff1 value: 62.85364829491874 - type: nauc_mrr_at_100_max value: 52.134528010631854 - type: nauc_mrr_at_100_std value: 10.120945685447369 - type: nauc_mrr_at_10_diff1 value: 62.65679301829915 - type: nauc_mrr_at_10_max value: 52.09270719182349 - type: nauc_mrr_at_10_std value: 9.913834434725441 - type: nauc_mrr_at_1_diff1 value: 66.84108271415636 - type: nauc_mrr_at_1_max value: 46.67646429855176 - type: nauc_mrr_at_1_std value: 5.5505252956352304 - type: nauc_mrr_at_20_diff1 value: 62.72473227039611 - type: nauc_mrr_at_20_max value: 52.13479097802757 - type: nauc_mrr_at_20_std value: 10.188278833464084 - type: nauc_mrr_at_3_diff1 value: 63.797429185518496 - type: nauc_mrr_at_3_max value: 52.16486999573481 - type: nauc_mrr_at_3_std value: 9.094360767062762 - type: nauc_mrr_at_5_diff1 value: 62.592917975475494 - type: nauc_mrr_at_5_max value: 52.330741486107414 - type: nauc_mrr_at_5_std value: 9.742175534421389 - type: nauc_ndcg_at_1000_diff1 value: 61.38859337672476 - type: nauc_ndcg_at_1000_max value: 51.48380058339184 - type: nauc_ndcg_at_1000_std value: 9.670547660897673 - type: nauc_ndcg_at_100_diff1 value: 61.02438489641434 - type: nauc_ndcg_at_100_max value: 51.781246646780865 - type: nauc_ndcg_at_100_std value: 10.592961553245187 - type: nauc_ndcg_at_10_diff1 value: 60.03678353308358 - type: 
nauc_ndcg_at_10_max value: 50.70725688848762 - type: nauc_ndcg_at_10_std value: 7.9472446491016315 - type: nauc_ndcg_at_1_diff1 value: 66.84108271415636 - type: nauc_ndcg_at_1_max value: 46.67646429855176 - type: nauc_ndcg_at_1_std value: 5.5505252956352304 - type: nauc_ndcg_at_20_diff1 value: 59.828482718480224 - type: nauc_ndcg_at_20_max value: 51.45831789601284 - type: nauc_ndcg_at_20_std value: 10.722673683272049 - type: nauc_ndcg_at_3_diff1 value: 61.68982937524109 - type: nauc_ndcg_at_3_max value: 49.745326748604775 - type: nauc_ndcg_at_3_std value: 4.948298621202247 - type: nauc_ndcg_at_5_diff1 value: 59.67396171973207 - type: nauc_ndcg_at_5_max value: 49.87855139298281 - type: nauc_ndcg_at_5_std value: 6.08990428055584 - type: nauc_precision_at_1000_diff1 value: -1.594227972036865 - type: nauc_precision_at_1000_max value: 32.48431723086185 - type: nauc_precision_at_1000_std value: 53.84748466965268 - type: nauc_precision_at_100_diff1 value: 8.06411455192293 - type: nauc_precision_at_100_max value: 39.91003601878948 - type: nauc_precision_at_100_std value: 55.52979711075091 - type: nauc_precision_at_10_diff1 value: 26.610514456014066 - type: nauc_precision_at_10_max value: 47.09062494321172 - type: nauc_precision_at_10_std value: 33.91984226498748 - type: nauc_precision_at_1_diff1 value: 66.84108271415636 - type: nauc_precision_at_1_max value: 46.67646429855176 - type: nauc_precision_at_1_std value: 5.5505252956352304 - type: nauc_precision_at_20_diff1 value: 16.947688843085583 - type: nauc_precision_at_20_max value: 45.40488186572008 - type: nauc_precision_at_20_std value: 48.354421924500905 - type: nauc_precision_at_3_diff1 value: 49.11263981720622 - type: nauc_precision_at_3_max value: 52.7084625111683 - type: nauc_precision_at_3_std value: 16.734612173556453 - type: nauc_precision_at_5_diff1 value: 39.06503705015792 - type: nauc_precision_at_5_max value: 52.21710506893391 - type: nauc_precision_at_5_std value: 23.350948149460233 - type: nauc_recall_at_1000_diff1 value: 43.1559290382817 - type: nauc_recall_at_1000_max value: 83.66013071895456 - type: nauc_recall_at_1000_std value: 86.27450980392177 - type: nauc_recall_at_100_diff1 value: 46.016860850620375 - type: nauc_recall_at_100_max value: 69.3944888744547 - type: nauc_recall_at_100_std value: 55.286945696152735 - type: nauc_recall_at_10_diff1 value: 49.65877895350921 - type: nauc_recall_at_10_max value: 53.02636695700889 - type: nauc_recall_at_10_std value: 13.967608945823828 - type: nauc_recall_at_1_diff1 value: 65.0865197011516 - type: nauc_recall_at_1_max value: 41.38862781954889 - type: nauc_recall_at_1_std value: -0.9182122632530586 - type: nauc_recall_at_20_diff1 value: 43.355308229973524 - type: nauc_recall_at_20_max value: 57.04187909533764 - type: nauc_recall_at_20_std value: 33.578720846660524 - type: nauc_recall_at_3_diff1 value: 56.922996057428165 - type: nauc_recall_at_3_max value: 50.74417041895424 - type: nauc_recall_at_3_std value: 5.623890124328387 - type: nauc_recall_at_5_diff1 value: 50.55620076865238 - type: nauc_recall_at_5_max value: 51.3316854622085 - type: nauc_recall_at_5_std value: 8.995457887269255 - type: ndcg_at_1 value: 52.333 - type: ndcg_at_10 value: 64.764 - type: ndcg_at_100 value: 68.167 - type: ndcg_at_1000 value: 68.816 - type: ndcg_at_20 value: 66.457 - type: ndcg_at_3 value: 60.346 - type: ndcg_at_5 value: 62.365 - type: precision_at_1 value: 52.333 - type: precision_at_10 value: 8.799999999999999 - type: precision_at_100 value: 1.057 - type: precision_at_1000 value: 0.11100000000000002 
    - type: precision_at_20
      value: 4.8
    - type: precision_at_3
      value: 23.889
    - type: precision_at_5
      value: 15.6
    - type: recall_at_1
      value: 49.778
    - type: recall_at_10
      value: 78.206
    - type: recall_at_100
      value: 93.10000000000001
    - type: recall_at_1000
      value: 98.333
    - type: recall_at_20
      value: 84.467
    - type: recall_at_3
      value: 66.367
    - type: recall_at_5
      value: 71.35000000000001
  - task:
      type: Retrieval
    dataset:
      name: MTEB TRECCOVID-PL (default)
      type: clarin-knext/trec-covid-pl
      config: default
      split: test
      revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd
    metrics:
    - type: main_score
      value: 72.18900000000001
    - type: map_at_1
      value: 0.214
    - type: map_at_10
      value: 1.755
    - type: map_at_100
      value: 9.944
    - type: map_at_1000
      value: 24.205
    - type: map_at_20
      value: 3.1510000000000002
    - type: map_at_3
      value: 0.6
    - type: map_at_5
      value: 0.9560000000000001
    - type: mrr_at_1
      value: 82.0
    - type: mrr_at_10
      value: 89.06666666666666
    - type: mrr_at_100
      value: 89.06666666666666
    - type: mrr_at_1000
      value: 89.06666666666666
    - type: mrr_at_20
      value: 89.06666666666666
    - type: mrr_at_3
      value: 87.66666666666666
    - type: mrr_at_5
      value: 89.06666666666666
    - type: nauc_map_at_1000_diff1
      value: -9.342037623635543
    - type: nauc_map_at_1000_max
      value: 45.71499810252398
    - type: nauc_map_at_1000_std
      value: 76.86482845196852
    - type: nauc_map_at_100_diff1
      value: -6.932395299866198
    - type: nauc_map_at_100_max
      value: 36.097801891181604
    - type: nauc_map_at_100_std
      value: 65.6085215411685
    - type: nauc_map_at_10_diff1
      value: -6.3654843824342775
    - type: nauc_map_at_10_max
      value: 9.564437521432714
    - type: nauc_map_at_10_std
      value: 21.8377319336476
    - type: nauc_map_at_1_diff1
      value: 8.269590874255034
    - type: nauc_map_at_1_max
      value: 3.482498491294516
    - type: nauc_map_at_1_std
      value: 8.985226819412189
    - type: nauc_map_at_20_diff1
      value: -4.971435767877232
    - type: nauc_map_at_20_max
      value: 22.88801858567121
    - type: nauc_map_at_20_std
      value: 32.38492618534027
    - type: nauc_map_at_3_diff1
      value: 1.1615973694623123
    - type: nauc_map_at_3_max
      value: 1.935417800315643
    - type: nauc_map_at_3_std
      value: 10.289328305818698
    - type: nauc_map_at_5_diff1
      value: -2.4675967231444105
    - type: nauc_map_at_5_max
      value: 2.4611483736622373
    - type: nauc_map_at_5_std
      value: 15.082324305750811
    - type: nauc_mrr_at_1000_diff1
      value: 13.098526703499063
    - type: nauc_mrr_at_1000_max
      value: 56.37362177417431
    - type: nauc_mrr_at_1000_std
      value: 73.2456769749587
    - type: nauc_mrr_at_100_diff1
      value: 13.098526703499063
    - type: nauc_mrr_at_100_max
      value: 56.37362177417431
    - type: nauc_mrr_at_100_std
      value: 73.2456769749587
    - type: nauc_mrr_at_10_diff1
      value: 13.098526703499063
    - type: nauc_mrr_at_10_max
      value: 56.37362177417431
    - type: nauc_mrr_at_10_std
      value: 73.2456769749587
    - type: nauc_mrr_at_1_diff1
      value: 12.099350148694809
    - type: nauc_mrr_at_1_max
      value: 53.75041304108387
    - type: nauc_mrr_at_1_std
      value: 68.84018063663402
    - type: nauc_mrr_at_20_diff1
      value: 13.098526703499063
    - type: nauc_mrr_at_20_max
      value: 56.37362177417431
    - type: nauc_mrr_at_20_std
      value: 73.2456769749587
    - type: nauc_mrr_at_3_diff1
      value: 12.173557857011161
    - type: nauc_mrr_at_3_max
      value: 57.540780562363395
    - type: nauc_mrr_at_3_std
      value: 75.42098189580211
    - type: nauc_mrr_at_5_diff1
      value: 13.098526703499063
    - type: nauc_mrr_at_5_max
      value: 56.37362177417431
    - type: nauc_mrr_at_5_std
      value: 73.2456769749587
    - type: nauc_ndcg_at_1000_diff1
      value: -8.951471847310401
    - type: nauc_ndcg_at_1000_max
      value: 43.86942237288822
    - type: nauc_ndcg_at_1000_std
      value: 74.61077735148591
    - type: nauc_ndcg_at_100_diff1
      value: -17.754559361083817
    - type: nauc_ndcg_at_100_max
      value: 53.97187119773482
    - type: nauc_ndcg_at_100_std
      value: 80.7944136146514
    - type: nauc_ndcg_at_10_diff1
      value: -26.637734697836414
    - type: nauc_ndcg_at_10_max
      value: 47.70102699133149
    - type: nauc_ndcg_at_10_std
      value: 70.26909560828646
    - type: nauc_ndcg_at_1_diff1
      value: -1.2250530785563207
    - type: nauc_ndcg_at_1_max
      value: 46.60509554140131
    - type: nauc_ndcg_at_1_std
      value: 62.63906581740976
    - type: nauc_ndcg_at_20_diff1
      value: -22.44286466550908
    - type: nauc_ndcg_at_20_max
      value: 55.40492058090103
    - type: nauc_ndcg_at_20_std
      value: 72.11813912145738
    - type: nauc_ndcg_at_3_diff1
      value: -14.8152721896563
    - type: nauc_ndcg_at_3_max
      value: 38.952259383027595
    - type: nauc_ndcg_at_3_std
      value: 59.819750166537766
    - type: nauc_ndcg_at_5_diff1
      value: -19.150105688904375
    - type: nauc_ndcg_at_5_max
      value: 42.311180547775315
    - type: nauc_ndcg_at_5_std
      value: 66.6632229321094
    - type: nauc_precision_at_1000_diff1
      value: -11.555591477978941
    - type: nauc_precision_at_1000_max
      value: 43.7311644834851
    - type: nauc_precision_at_1000_std
      value: 52.10644767999648
    - type: nauc_precision_at_100_diff1
      value: -16.94803099801117
    - type: nauc_precision_at_100_max
      value: 54.08281631067633
    - type: nauc_precision_at_100_std
      value: 82.77237347891331
    - type: nauc_precision_at_10_diff1
      value: -27.351332814863355
    - type: nauc_precision_at_10_max
      value: 48.08237549065846
    - type: nauc_precision_at_10_std
      value: 69.37250843534329
    - type: nauc_precision_at_1_diff1
      value: 12.099350148694809
    - type: nauc_precision_at_1_max
      value: 53.75041304108387
    - type: nauc_precision_at_1_std
      value: 68.84018063663402
    - type: nauc_precision_at_20_diff1
      value: -18.2422222283388
    - type: nauc_precision_at_20_max
      value: 59.517328129343696
    - type: nauc_precision_at_20_std
      value: 72.05149307342747
    - type: nauc_precision_at_3_diff1
      value: -10.226547543075897
    - type: nauc_precision_at_3_max
      value: 43.14684818832875
    - type: nauc_precision_at_3_std
      value: 57.31936467418288
    - type: nauc_precision_at_5_diff1
      value: -14.28521589468673
    - type: nauc_precision_at_5_max
      value: 41.633426753962596
    - type: nauc_precision_at_5_std
      value: 64.94400576804541
    - type: nauc_recall_at_1000_diff1
      value: -0.9648831207497152
    - type: nauc_recall_at_1000_max
      value: 31.70832946085005
    - type: nauc_recall_at_1000_std
      value: 63.21471613968869
    - type: nauc_recall_at_100_diff1
      value: -1.360254380933586
    - type: nauc_recall_at_100_max
      value: 25.960597782099605
    - type: nauc_recall_at_100_std
      value: 51.52757589609674
    - type: nauc_recall_at_10_diff1
      value: -0.3899439424189566
    - type: nauc_recall_at_10_max
      value: 5.094341897886072
    - type: nauc_recall_at_10_std
      value: 11.266045616925698
    - type: nauc_recall_at_1_diff1
      value: 8.269590874255034
    - type: nauc_recall_at_1_max
      value: 3.482498491294516
    - type: nauc_recall_at_1_std
      value: 8.985226819412189
    - type: nauc_recall_at_20_diff1
      value: 6.4797098359254175
    - type: nauc_recall_at_20_max
      value: 15.663700985336124
    - type: nauc_recall_at_20_std
      value: 17.154099587904913
    - type: nauc_recall_at_3_diff1
      value: 3.7245972450393507
    - type: nauc_recall_at_3_max
      value: 0.4063857187240345
    - type: nauc_recall_at_3_std
      value: 6.641948062821941
    - type: nauc_recall_at_5_diff1
      value: 4.013879477591466
    - type: nauc_recall_at_5_max
      value: -1.4266586618013566
    - type: nauc_recall_at_5_std
      value: 7.311601874411205
    - type: ndcg_at_1
      value: 75.0
    - type: ndcg_at_10
      value: 72.18900000000001
    - type: ndcg_at_100
      value: 54.022999999999996
    - type: ndcg_at_1000
      value: 49.492000000000004
    - type: ndcg_at_20
      value: 68.51
    - type: ndcg_at_3
      value: 73.184
    - type: ndcg_at_5
      value: 72.811
    - type: precision_at_1
      value: 82.0
    - type: precision_at_10
      value: 77.4
    - type: precision_at_100
      value: 55.24
    - type: precision_at_1000
      value: 21.822
    - type: precision_at_20
      value: 73.0
    - type: precision_at_3
      value: 79.333
    - type: precision_at_5
      value: 79.2
    - type: recall_at_1
      value: 0.214
    - type: recall_at_10
      value: 1.9980000000000002
    - type: recall_at_100
      value: 13.328999999999999
    - type: recall_at_1000
      value: 47.204
    - type: recall_at_20
      value: 3.7310000000000003
    - type: recall_at_3
      value: 0.628
    - type: recall_at_5
      value: 1.049
  - task:
      type: MultilabelClassification
    dataset:
      name: MTEB CEDRClassification (default)
      type: ai-forever/cedr-classification
      config: default
      split: test
      revision: c0ba03d058e3e1b2f3fd20518875a4563dd12db4
    metrics:
    - type: accuracy
      value: 47.30605738575983
    - type: f1
      value: 41.26091043925065
    - type: lrap
      value: 72.89452709883206
    - type: main_score
      value: 47.30605738575983
  - task:
      type: Reranking
    dataset:
      name: MTEB MIRACLReranking (ru)
      type: miracl/mmteb-miracl-reranking
      config: ru
      split: dev
      revision: 6d1962c527217f8927fca80f890f14f36b2802af
    metrics:
    - type: MAP@1(MIRACL)
      value: 20.721999999999998
    - type: MAP@10(MIRACL)
      value: 33.900999999999996
    - type: MAP@100(MIRACL)
      value: 36.813
    - type: MAP@1000(MIRACL)
      value: 36.813
    - type: MAP@20(MIRACL)
      value: 35.684
    - type: MAP@3(MIRACL)
      value: 28.141
    - type: MAP@5(MIRACL)
      value: 31.075000000000003
    - type: NDCG@1(MIRACL)
      value: 32.799
    - type: NDCG@10(MIRACL)
      value: 42.065000000000005
    - type: NDCG@100(MIRACL)
      value: 49.730999999999995
    - type: NDCG@1000(MIRACL)
      value: 49.730999999999995
    - type: NDCG@20(MIRACL)
      value: 46.0
    - type: NDCG@3(MIRACL)
      value: 34.481
    - type: NDCG@5(MIRACL)
      value: 37.452999999999996
    - type: P@1(MIRACL)
      value: 32.799
    - type: P@10(MIRACL)
      value: 11.668000000000001
    - type: P@100(MIRACL)
      value: 1.9529999999999998
    - type: P@1000(MIRACL)
      value: 0.19499999999999998
    - type: P@20(MIRACL)
      value: 7.51
    - type: P@3(MIRACL)
      value: 20.823
    - type: P@5(MIRACL)
      value: 16.728
    - type: Recall@1(MIRACL)
      value: 20.721999999999998
    - type: Recall@10(MIRACL)
      value: 54.762
    - type: Recall@100(MIRACL)
      value: 79.952
    - type: Recall@1000(MIRACL)
      value: 79.952
    - type: Recall@20(MIRACL)
      value: 66.26100000000001
    - type: Recall@3(MIRACL)
      value: 34.410000000000004
    - type: Recall@5(MIRACL)
      value: 42.659000000000006
    - type: main_score
      value: 42.065000000000005
    - type: nAUC_MAP@1000_diff1(MIRACL)
      value: 14.33534992502818
    - type: nAUC_MAP@1000_max(MIRACL)
      value: 12.367998764646115
    - type: nAUC_MAP@1000_std(MIRACL)
      value: 4.569686002935006
    - type: nAUC_MAP@100_diff1(MIRACL)
      value: 14.33534992502818
    - type: nAUC_MAP@100_max(MIRACL)
      value: 12.367998764646115
    - type: nAUC_MAP@100_std(MIRACL)
      value: 4.569686002935006
    - type: nAUC_MAP@10_diff1(MIRACL)
      value: 16.920323975680027
    - type: nAUC_MAP@10_max(MIRACL)
      value: 9.327171297204082
    - type: nAUC_MAP@10_std(MIRACL)
      value: 3.2039133783079015
    - type: nAUC_MAP@1_diff1(MIRACL)
      value: 28.698973487482206
    - type: nAUC_MAP@1_max(MIRACL)
      value: 2.9217687660885034
    - type: nAUC_MAP@1_std(MIRACL)
      value: -1.1247408800976524
    - type: nAUC_MAP@20_diff1(MIRACL)
      value: 15.359083081640476
    - type: nAUC_MAP@20_max(MIRACL)
      value: 11.310494233946345
    - type: nAUC_MAP@20_std(MIRACL)
      value: 4.4171898386022885
    - type: nAUC_MAP@3_diff1(MIRACL)
      value: 22.27430591851617
    - type: nAUC_MAP@3_max(MIRACL)
      value: 6.407438291284658
    - type: nAUC_MAP@3_std(MIRACL)
      value: 0.9799184530397409
    - type: nAUC_MAP@5_diff1(MIRACL)
      value: 19.20571689941054
    - type: nAUC_MAP@5_max(MIRACL)
      value: 7.987468654026893
    - type: nAUC_MAP@5_std(MIRACL)
      value: 1.8324246565938962
    - type: nAUC_NDCG@1000_diff1(MIRACL)
      value: 3.7537669018914768
    - type: nAUC_NDCG@1000_max(MIRACL)
      value: 20.7944707840533
    - type: nAUC_NDCG@1000_std(MIRACL)
      value: 8.444837055303063
    - type: nAUC_NDCG@100_diff1(MIRACL)
      value: 3.7537669018914768
    - type: nAUC_NDCG@100_max(MIRACL)
      value: 20.7944707840533
    - type: nAUC_NDCG@100_std(MIRACL)
      value: 8.444837055303063
    - type: nAUC_NDCG@10_diff1(MIRACL)
      value: 10.829575656103888
    - type: nAUC_NDCG@10_max(MIRACL)
      value: 13.0445496498929
    - type: nAUC_NDCG@10_std(MIRACL)
      value: 6.050412212625362
    - type: nAUC_NDCG@1_diff1(MIRACL)
      value: 19.1388712233292
    - type: nAUC_NDCG@1_max(MIRACL)
      value: 10.871900994781642
    - type: nAUC_NDCG@1_std(MIRACL)
      value: 3.218568248751811
    - type: nAUC_NDCG@20_diff1(MIRACL)
      value: 7.093172181746442
    - type: nAUC_NDCG@20_max(MIRACL)
      value: 16.955238078958836
    - type: nAUC_NDCG@20_std(MIRACL)
      value: 8.325656379573035
    - type: nAUC_NDCG@3_diff1(MIRACL)
      value: 17.134437303330802
    - type: nAUC_NDCG@3_max(MIRACL)
      value: 10.235328822955793
    - type: nAUC_NDCG@3_std(MIRACL)
      value: 3.2341358691084814
    - type: nAUC_NDCG@5_diff1(MIRACL)
      value: 14.733664618337636
    - type: nAUC_NDCG@5_max(MIRACL)
      value: 11.181897412035282
    - type: nAUC_NDCG@5_std(MIRACL)
      value: 3.642277088791985
    - type: nAUC_P@1000_diff1(MIRACL)
      value: -26.330038284867573
    - type: nAUC_P@1000_max(MIRACL)
      value: 28.450694137240458
    - type: nAUC_P@1000_std(MIRACL)
      value: 9.892993775474912
    - type: nAUC_P@100_diff1(MIRACL)
      value: -26.330038284867552
    - type: nAUC_P@100_max(MIRACL)
      value: 28.45069413724051
    - type: nAUC_P@100_std(MIRACL)
      value: 9.892993775474928
    - type: nAUC_P@10_diff1(MIRACL)
      value: -17.436937353231112
    - type: nAUC_P@10_max(MIRACL)
      value: 24.327018012947857
    - type: nAUC_P@10_std(MIRACL)
      value: 11.78803527706634
    - type: nAUC_P@1_diff1(MIRACL)
      value: 19.1388712233292
    - type: nAUC_P@1_max(MIRACL)
      value: 10.871900994781642
    - type: nAUC_P@1_std(MIRACL)
      value: 3.218568248751811
    - type: nAUC_P@20_diff1(MIRACL)
      value: -22.947528755272426
    - type: nAUC_P@20_max(MIRACL)
      value: 27.773093471902538
    - type: nAUC_P@20_std(MIRACL)
      value: 14.898619107087221
    - type: nAUC_P@3_diff1(MIRACL)
      value: 1.4100426412400944
    - type: nAUC_P@3_max(MIRACL)
      value: 17.397472872058845
    - type: nAUC_P@3_std(MIRACL)
      value: 8.240008229861875
    - type: nAUC_P@5_diff1(MIRACL)
      value: -7.971349332207021
    - type: nAUC_P@5_max(MIRACL)
      value: 22.198441167940963
    - type: nAUC_P@5_std(MIRACL)
      value: 9.00265164460082
    - type: nAUC_Recall@1000_diff1(MIRACL)
      value: -38.69835271863148
    - type: nAUC_Recall@1000_max(MIRACL)
      value: 50.9545152809108
    - type: nAUC_Recall@1000_std(MIRACL)
      value: 20.44270887092116
    - type: nAUC_Recall@100_diff1(MIRACL)
      value: -38.69835271863148
    - type: nAUC_Recall@100_max(MIRACL)
      value: 50.9545152809108
    - type: nAUC_Recall@100_std(MIRACL)
      value: 20.44270887092116
    - type: nAUC_Recall@10_diff1(MIRACL)
      value: -0.08109036309433801
    - type: nAUC_Recall@10_max(MIRACL)
      value: 12.696619907773568
    - type: nAUC_Recall@10_std(MIRACL)
      value: 8.791982704261589
    - type: nAUC_Recall@1_diff1(MIRACL)
      value: 28.698973487482206
    - type: nAUC_Recall@1_max(MIRACL)
      value: 2.9217687660885034
    - type: nAUC_Recall@1_std(MIRACL)
      value: -1.1247408800976524
    - type: nAUC_Recall@20_diff1(MIRACL)
      value: -13.312171017942623
    - type: nAUC_Recall@20_max(MIRACL)
      value: 24.19847346821666
    - type: nAUC_Recall@20_std(MIRACL)
      value: 15.8157702609797
    - type: nAUC_Recall@3_diff1(MIRACL)
      value: 16.909128321353343
    - type: nAUC_Recall@3_max(MIRACL)
      value: 6.552122731902991
    - type: nAUC_Recall@3_std(MIRACL)
      value: 1.9963898223457228
    - type: nAUC_Recall@5_diff1(MIRACL)
      value: 9.990292655247721
    - type: nAUC_Recall@5_max(MIRACL)
      value: 9.361722273507574
    - type: nAUC_Recall@5_std(MIRACL)
      value: 3.270918827854495
  - task:
      type: MultilabelClassification
    dataset:
      name: MTEB SensitiveTopicsClassification (default)
      type: ai-forever/sensitive-topics-classification
      config: default
      split: test
      revision: 416b34a802308eac30e4192afc0ff99bb8dcc7f2
    metrics:
    - type: accuracy
      value: 30.634765625
    - type: f1
      value: 32.647559808678665
    - type: lrap
      value: 45.94319661458259
    - type: main_score
      value: 30.634765625
  - task:
      type: STS
    dataset:
      name: MTEB ATEC (default)
      type: C-MTEB/ATEC
      config: default
      split: test
      revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865
    metrics:
    - type: cosine_pearson
      value: 47.541497334563296
    - type: cosine_spearman
      value: 49.06268944206629
    - type: euclidean_pearson
      value: 51.838926748581635
    - type: euclidean_spearman
      value: 48.930697157135356
    - type: main_score
      value: 49.06268944206629
    - type: manhattan_pearson
      value: 51.835306769406365
    - type: manhattan_spearman
      value: 48.86135493444834
    - type: pearson
      value: 47.541497334563296
    - type: spearman
      value: 49.06268944206629
  - task:
      type: Classification
    dataset:
      name: MTEB AllegroReviews (default)
      type: PL-MTEB/allegro-reviews
      config: default
      split: test
      revision: b89853e6de927b0e3bfa8ecc0e56fe4e02ceafc6
    metrics:
    - type: accuracy
      value: 49.51292246520874
    - type: f1
      value: 44.14350234332397
    - type: f1_weighted
      value: 51.65508998354552
    - type: main_score
      value: 49.51292246520874
  - task:
      type: Clustering
    dataset:
      name: MTEB AlloProfClusteringP2P (default)
      type: lyon-nlp/alloprof
      config: default
      split: test
      revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b
    metrics:
    - type: main_score
      value: 63.883383458621665
    - type: v_measure
      value: 63.883383458621665
    - type: v_measure_std
      value: 2.693666879958465
    - type: main_score
      value: 46.85924588755251
    - type: v_measure
      value: 46.85924588755251
    - type: v_measure_std
      value: 2.1918258880872377
  - task:
      type: Clustering
    dataset:
      name: MTEB 8TagsClustering
      type: PL-MTEB/8tags-clustering
      config: default
      split: test
      revision: None
    metrics:
    - type: v_measure
      value: 43.65721212452554
  - task:
      type: Reranking
    dataset:
      name: MTEB AlloprofReranking (default)
      type: lyon-nlp/mteb-fr-reranking-alloprof-s2p
      config: default
      split: test
      revision: e40c8a63ce02da43200eccb5b0846fcaa888f562
    metrics:
    - type: map
      value: 66.39013753839347
    - type: mrr
      value: 67.68045617786551
    - type: main_score
      value: 66.39013753839347
  - task:
      type: Retrieval
    dataset:
      name: MTEB AlloprofRetrieval (default)
      type: lyon-nlp/alloprof
      config: default
      split: test
      revision: fcf295ea64c750f41fadbaa37b9b861558e1bfbd
    metrics:
    - type: main_score
      value: 54.284
    - type: map_at_1
      value: 37.047000000000004
    - type: map_at_10
      value: 48.53
    - type: map_at_100
      value: 49.357
    - type: map_at_1000
      value: 49.39
    - type: map_at_20
      value: 49.064
    - type: map_at_3
      value: 45.675
    - type: map_at_5
      value: 47.441
    - type: mrr_at_1
      value: 37.04663212435233
    - type: mrr_at_10
      value: 48.5300326232969
    - type: mrr_at_100
      value: 49.35708199037581
    - type: mrr_at_1000
      value: 49.39005824603193
    - type: mrr_at_20
      value: 49.06417416464799
    - type: mrr_at_3
      value: 45.67501439263105
    - type: mrr_at_5
      value: 47.44099021301103
    - type: nauc_map_at_1000_diff1
      value: 43.32474221868009
    - type: nauc_map_at_1000_max
      value: 39.407334029058575
    - type: nauc_map_at_1000_std
      value: -2.3728154448932606
    - type: nauc_map_at_100_diff1
      value: 43.32336300929909
    - type: nauc_map_at_100_max
      value: 39.432174777554835
    - type: nauc_map_at_100_std
      value: -2.356396922384349
    - type: nauc_map_at_10_diff1
      value: 43.1606520154482
    - type: nauc_map_at_10_max
      value: 39.33734650558226
    - type: nauc_map_at_10_std
      value: -2.5156222475075256
    - type: nauc_map_at_1_diff1
      value: 46.2178975214499
    - type: nauc_map_at_1_max
      value: 36.26173199049361
    - type: nauc_map_at_1_std
      value: -3.0897555582816443
    - type: nauc_map_at_20_diff1
      value: 43.272980702916456
    - type: nauc_map_at_20_max
      value: 39.4896977052276
    - type: nauc_map_at_20_std
      value: -2.3305501742917043
    - type: nauc_map_at_3_diff1
      value: 43.49525042967079
    - type: nauc_map_at_3_max
      value: 38.66352501824728
    - type: nauc_map_at_3_std
      value: -3.202794391620473
    - type: nauc_map_at_5_diff1
      value: 43.2266692546611
    - type: nauc_map_at_5_max
      value: 38.77368661115743
    - type: nauc_map_at_5_std
      value: -3.0897532130127954
    - type: nauc_mrr_at_1000_diff1
      value: 43.32474221868009
    - type: nauc_mrr_at_1000_max
      value: 39.407334029058575
    - type: nauc_mrr_at_1000_std
      value: -2.3728154448932606
    - type: nauc_mrr_at_100_diff1
      value: 43.32336300929909
    - type: nauc_mrr_at_100_max
      value: 39.432174777554835
    - type: nauc_mrr_at_100_std
      value: -2.356396922384349
    - type: nauc_mrr_at_10_diff1
      value: 43.1606520154482
    - type: nauc_mrr_at_10_max
      value: 39.33734650558226
    - type: nauc_mrr_at_10_std
      value: -2.5156222475075256
    - type: nauc_mrr_at_1_diff1
      value: 46.2178975214499
    - type: nauc_mrr_at_1_max
      value: 36.26173199049361
    - type: nauc_mrr_at_1_std
      value: -3.0897555582816443
    - type: nauc_mrr_at_20_diff1
      value: 43.272980702916456
    - type: nauc_mrr_at_20_max
      value: 39.4896977052276
    - type: nauc_mrr_at_20_std
      value: -2.3305501742917043
    - type: nauc_mrr_at_3_diff1
      value: 43.49525042967079
    - type: nauc_mrr_at_3_max
      value: 38.66352501824728
    - type: nauc_mrr_at_3_std
      value: -3.202794391620473
    - type: nauc_mrr_at_5_diff1
      value: 43.2266692546611
    - type: nauc_mrr_at_5_max
      value: 38.77368661115743
    - type: nauc_mrr_at_5_std
      value: -3.0897532130127954
    - type: nauc_ndcg_at_1000_diff1
      value: 43.01903168202974
    - type: nauc_ndcg_at_1000_max
      value: 40.75496622942232
    - type: nauc_ndcg_at_1000_std
      value: -1.3150412981845496
    - type: nauc_ndcg_at_100_diff1
      value: 42.98016493758145
    - type: nauc_ndcg_at_100_max
      value: 41.55869635162325
    - type: nauc_ndcg_at_100_std
      value: -0.5355252976886055
    - type: nauc_ndcg_at_10_diff1
      value: 42.218755211347506
    - type: nauc_ndcg_at_10_max
      value: 41.305042275175765
    - type: nauc_ndcg_at_10_std
      value: -1.4034484444573714
    - type: nauc_ndcg_at_1_diff1
      value: 46.2178975214499
    - type: nauc_ndcg_at_1_max
      value: 36.26173199049361
    - type: nauc_ndcg_at_1_std
      value: -3.0897555582816443
    - type: nauc_ndcg_at_20_diff1
      value: 42.66574440095576
    - type: nauc_ndcg_at_20_max
      value: 42.014620115124515
    - type: nauc_ndcg_at_20_std
      value: -0.5176162553751498
    - type: nauc_ndcg_at_3_diff1
      value: 42.837450505106055
    - type: nauc_ndcg_at_3_max
      value: 39.525369733082414
    - type: nauc_ndcg_at_3_std
      value: -3.1605948245795155
    - type: nauc_ndcg_at_5_diff1
      value: 42.37951815451173
    - type: nauc_ndcg_at_5_max
      value: 39.78840132935179
    - type: nauc_ndcg_at_5_std
      value: -2.936898430768135
    - type: nauc_precision_at_1000_diff1
      value: 49.69224988612385
    - type: nauc_precision_at_1000_max
      value: 79.57897547128005
    - type: nauc_precision_at_1000_std
      value: 45.040371354764645
    - type: nauc_precision_at_100_diff1
      value: 42.70597486048422
    - type: nauc_precision_at_100_max
      value: 65.74628759606188
    - type: nauc_precision_at_100_std
      value: 25.49157745244855
    - type: nauc_precision_at_10_diff1
      value: 38.565609931689345
    - type: nauc_precision_at_10_max
      value: 50.0239696180852
    - type: nauc_precision_at_10_std
      value: 3.976354829503967
    - type: nauc_precision_at_1_diff1
      value: 46.2178975214499
    - type: nauc_precision_at_1_max
      value: 36.26173199049361
    - type: nauc_precision_at_1_std
      value: -3.0897555582816443
    - type: nauc_precision_at_20_diff1
      value: 40.4134718566864
    - type: nauc_precision_at_20_max
      value: 57.121778108665374
    - type: nauc_precision_at_20_std
      value: 11.46021975428544
    - type: nauc_precision_at_3_diff1
      value: 40.90538379461529
    - type: nauc_precision_at_3_max
      value: 42.18393248057992
    - type: nauc_precision_at_3_std
      value: -3.005249943837297
    - type: nauc_precision_at_5_diff1
      value: 39.60162965860782
    - type: nauc_precision_at_5_max
      value: 43.28317158174058
    - type: nauc_precision_at_5_std
      value: -2.3469094487738054
    - type: nauc_recall_at_1000_diff1
      value: 49.69224988612252
    - type: nauc_recall_at_1000_max
      value: 79.57897547127862
    - type: nauc_recall_at_1000_std
      value: 45.04037135476256
    - type: nauc_recall_at_100_diff1
      value: 42.70597486048432
    - type: nauc_recall_at_100_max
      value: 65.74628759606213
    - type: nauc_recall_at_100_std
      value: 25.491577452448727
    - type: nauc_recall_at_10_diff1
      value: 38.56560993168935
    - type: nauc_recall_at_10_max
      value: 50.02396961808522
    - type: nauc_recall_at_10_std
      value: 3.9763548295040314
    - type: nauc_recall_at_1_diff1
      value: 46.2178975214499
    - type: nauc_recall_at_1_max
      value: 36.26173199049361
    - type: nauc_recall_at_1_std
      value: -3.0897555582816443
    - type: nauc_recall_at_20_diff1
      value: 40.41347185668637
    - type: nauc_recall_at_20_max
      value: 57.12177810866533
    - type: nauc_recall_at_20_std
      value: 11.460219754285431
    - type: nauc_recall_at_3_diff1
      value: 40.90538379461527
    - type: nauc_recall_at_3_max
      value: 42.18393248057989
    - type: nauc_recall_at_3_std
      value: -3.005249943837297
    - type: nauc_recall_at_5_diff1
      value: 39.601629658607784
    - type: nauc_recall_at_5_max
      value: 43.28317158174053
    - type: nauc_recall_at_5_std
      value: -2.3469094487738054
    - type: ndcg_at_1
      value: 37.047000000000004
    - type: ndcg_at_10
      value: 54.284
    - type: ndcg_at_100
      value: 58.34
    - type: ndcg_at_1000
      value: 59.303
    - type: ndcg_at_20
      value: 56.235
    - type: ndcg_at_3
      value: 48.503
    - type: ndcg_at_5
      value: 51.686
    - type: precision_at_1
      value: 37.047000000000004
    - type: precision_at_10
      value: 7.237
    - type: precision_at_100
      value: 0.914
    - type: precision_at_1000
      value: 0.099
    - type: precision_at_20
      value: 4.005
    - type: precision_at_3
      value: 18.898
    - type: precision_at_5
      value: 12.884
    - type: recall_at_1
      value: 37.047000000000004
    - type: recall_at_10
      value: 72.366
    - type: recall_at_100
      value: 91.408
    - type: recall_at_1000
      value: 99.136
    - type: recall_at_20
      value: 80.095
    - type: recall_at_3
      value: 56.693000000000005
    - type: recall_at_5
      value: 64.42099999999999
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonCounterfactualClassification (en)
      type: mteb/amazon_counterfactual
      config: en
      split: test
      revision: e8379541af4e31359cca9fbcf4b00f2671dba205
    metrics:
    - type: accuracy
      value: 89.49253731343283
    - type: ap
      value: 61.88098616359918
    - type: ap_weighted
      value: 61.88098616359918
    - type: f1
      value: 84.76516623679144
    - type: f1_weighted
      value: 89.92745276292968
    - type: main_score
      value: 89.49253731343283
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonCounterfactualClassification (de)
      type: mteb/amazon_counterfactual
      config: de
      split: test
      revision: e8379541af4e31359cca9fbcf4b00f2671dba205
    metrics:
    - type: accuracy
      value: 89.61456102783727
    - type: ap
      value: 93.11816566733742
    - type: ap_weighted
      value: 93.11816566733742
    - type: f1
      value: 88.27635757733722
    - type: f1_weighted
      value: 89.82581568285453
    - type: main_score
      value: 89.61456102783727
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonPolarityClassification (default)
      type: mteb/amazon_polarity
      config: default
      split: test
      revision: e2d317d38cd51312af73b3d32a06d1a08b442046
    metrics:
    - type: accuracy
      value: 95.3825
    - type: ap
      value: 93.393033869502
    - type: ap_weighted
      value: 93.393033869502
    - type: f1
      value: 95.38109007966307
    - type: f1_weighted
      value: 95.38109007966305
    - type: main_score
      value: 95.3825
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonReviewsClassification (en)
      type: mteb/amazon_reviews_multi
      config: en
      split: test
      revision: 1399c76144fd37290681b995c656ef9b2e06e26d
    metrics:
    - type: accuracy
      value: 49.768
    - type: f1
      value: 48.95084821944411
    - type: f1_weighted
      value: 48.9508482194441
    - type: main_score
      value: 49.768
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonReviewsClassification (de)
      type: mteb/amazon_reviews_multi
      config: de
      split: test
      revision: 1399c76144fd37290681b995c656ef9b2e06e26d
    metrics:
    - type: accuracy
      value: 48.071999999999996
    - type: f1
      value: 47.24171107487612
    - type: f1_weighted
      value: 47.24171107487612
    - type: main_score
      value: 48.071999999999996
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonReviewsClassification (es)
      type: mteb/amazon_reviews_multi
      config: es
      split: test
      revision: 1399c76144fd37290681b995c656ef9b2e06e26d
    metrics:
    - type: accuracy
      value: 48.102000000000004
    - type: f1
      value: 47.27193805278696
    - type: f1_weighted
      value: 47.27193805278696
    - type: main_score
      value: 48.102000000000004
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonReviewsClassification (fr)
      type: mteb/amazon_reviews_multi
      config: fr
      split: test
      revision: 1399c76144fd37290681b995c656ef9b2e06e26d
    metrics:
    - type: accuracy
      value: 47.30800000000001
    - type: f1
      value: 46.41683358017851
    - type: f1_weighted
      value: 46.41683358017851
    - type: main_score
      value: 47.30800000000001
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonReviewsClassification (zh)
      type: mteb/amazon_reviews_multi
      config: zh
      split: test
      revision: 1399c76144fd37290681b995c656ef9b2e06e26d
    metrics:
    - type: accuracy
      value: 44.944
    - type: f1
      value: 44.223824487744395
    - type: f1_weighted
      value: 44.22382448774439
    - type: main_score
      value: 44.944
  - task:
      type: Retrieval
    dataset:
      name: MTEB ArguAna (default)
      type: mteb/arguana
      config: default
      split: test
      revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
    metrics:
    - type: map_at_1
      value: 29.232000000000003
    - type: map_at_10
      value: 45.117000000000004
    - type: map_at_100
      value: 45.977000000000004
    - type: map_at_1000
      value: 45.98
    - type: map_at_20
      value: 45.815
    - type: map_at_3
      value: 39.912
    - type: map_at_5
      value: 42.693
    - type: mrr_at_1
      value: 29.659000000000002
    - type: mrr_at_10
      value: 45.253
    - type: mrr_at_100
      value: 46.125
    - type: mrr_at_1000
      value: 46.129
    - type: mrr_at_20
      value: 45.964
    - type: mrr_at_3
      value: 40.043
    - type: mrr_at_5
      value: 42.870000000000005
    - type: ndcg_at_1
      value: 29.232000000000003
    - type: ndcg_at_10
      value: 54.327999999999996
    - type: ndcg_at_100
      value: 57.86
    - type: ndcg_at_1000
      value: 57.935
    - type: ndcg_at_20
      value: 56.794
    - type: ndcg_at_3
      value: 43.516
    - type: ndcg_at_5
      value: 48.512
    - type: precision_at_1
      value: 29.232000000000003
    - type: precision_at_10
      value: 8.393
    - type: precision_at_100
      value: 0.991
    - type: precision_at_1000
      value: 0.1
    - type: precision_at_20
      value: 4.676
    - type: precision_at_3
      value: 17.994
    - type: precision_at_5
      value: 13.215
    - type: recall_at_1
      value: 29.232000000000003
    - type: recall_at_10
      value: 83.926
    - type: recall_at_100
      value: 99.075
    - type: recall_at_1000
      value: 99.644
    - type: recall_at_20
      value: 93.528
    - type: recall_at_3
      value: 53.983000000000004
    - type: recall_at_5
      value: 66.074
    - type: main_score
      value: 54.327999999999996
  - task:
      type: Clustering
    dataset:
      name: MTEB ArxivClusteringP2P (default)
      type: mteb/arxiv-clustering-p2p
      config: default
      split: test
      revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
    metrics:
    - type: main_score
      value: 46.6636824632419
    - type: v_measure
      value: 46.6636824632419
    - type: v_measure_std
      value: 13.817129140714963
  - task:
      type: Clustering
    dataset:
      name: MTEB ArxivClusteringS2S (default)
      type: mteb/arxiv-clustering-s2s
      config: default
      split: test
      revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
    metrics:
    - type: main_score
      value: 39.271141892800024
    - type: v_measure
      value: 39.271141892800024
    - type: v_measure_std
      value: 14.276782483454827
  - task:
      type: Reranking
    dataset:
      name: MTEB AskUbuntuDupQuestions (default)
      type: mteb/askubuntudupquestions-reranking
      config: default
      split: test
      revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
    metrics:
    - type: map
      value: 65.04363277324629
    - type: mrr
      value: 78.2372598162072
    - type: main_score
      value: 65.04363277324629
  - task:
      type: Reranking
    dataset:
      name: MTEB MindSmallReranking (default)
      type: mteb/mind_small
      config: default
      split: test
      revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
    metrics:
    - type: map
      value: 30.83
    - type: main_score
      value: 30.83
  - task:
      type: STS
    dataset:
      name: MTEB BIOSSES (default)
      type: mteb/biosses-sts
      config: default
      split: test
      revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
    metrics:
    - type: cosine_pearson
      value: 88.80382082011027
    - type: cosine_spearman
      value: 88.68876782169106
    - type: euclidean_pearson
      value: 87.00802890147176
    - type: euclidean_spearman
      value: 87.43211268192712
    - type: main_score
      value: 88.68876782169106
    - type: manhattan_pearson
      value: 87.14062537179474
    - type: manhattan_spearman
      value: 87.59115245033443
    - type: pearson
      value: 88.80382082011027
    - type: spearman
      value: 88.68876782169106
  - task:
      type: STS
    dataset:
      name: MTEB BQ (default)
      type: C-MTEB/BQ
      config: default
      split: test
      revision: e3dda5e115e487b39ec7e618c0c6a29137052a55
    metrics:
    - type: cosine_pearson
      value: 61.588006604878196
    - type: cosine_spearman
      value: 63.20615427154465
    - type: euclidean_pearson
      value: 61.818547092516496
    - type: euclidean_spearman
      value: 63.21558009151778
    - type: main_score
      value: 63.20615427154465
    - type: manhattan_pearson
      value: 61.665588158487616
    - type: manhattan_spearman
      value: 63.051544488238584
    - type: pearson
      value: 61.588006604878196
    - type: spearman
      value: 63.20615427154465
  - task:
      type: Retrieval
    dataset:
      name: MTEB BSARDRetrieval (default)
      type: maastrichtlawtech/bsard
      config: default
      split: test
      revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59
    metrics:
    - type: main_score
      value: 64.414
    - type: map_at_1
      value: 14.865
    - type: map_at_10
      value: 21.605
    - type: map_at_100
      value: 22.762
    - type: map_at_1000
      value: 22.854
    - type: map_at_20
      value: 22.259999999999998
    - type: map_at_3
      value: 20.119999999999997
    - type: map_at_5
      value: 20.931
    - type: mrr_at_1
      value: 14.864864864864865
    - type: mrr_at_10
      value: 21.605176605176606
    - type: mrr_at_100
      value: 22.7622306460065
    - type: mrr_at_1000
      value: 22.85383406410312
    - type: mrr_at_20
      value: 22.259528463088845
    - type: mrr_at_3
      value: 20.12012012012012
    - type: mrr_at_5
      value: 20.930930930930934
    - type: nauc_map_at_1000_diff1
      value: 17.486265968689338
    - type: nauc_map_at_1000_max
      value: 22.736799291688836
    - type: nauc_map_at_1000_std
      value: 9.831687441977147
    - type: nauc_map_at_100_diff1
      value: 17.50754492049086
    - type: nauc_map_at_100_max
      value: 22.77693662806787
    - type: nauc_map_at_100_std
      value: 9.853899509675395
    - type: nauc_map_at_10_diff1
      value: 17.42133968580952
    - type: nauc_map_at_10_max
      value: 22.45861793882279
    - type: nauc_map_at_10_std
      value: 8.964888472915938
    - type: nauc_map_at_1_diff1
      value: 19.433947086968093
    - type: nauc_map_at_1_max
      value: 24.75657047550517
    - type: nauc_map_at_1_std
      value: 15.122329157218505
    - type: nauc_map_at_20_diff1
      value: 17.429856756008785
    - type: nauc_map_at_20_max
      value: 22.438850987431017
    - type: nauc_map_at_20_std
      value: 9.172746012213558
    - type: nauc_map_at_3_diff1
      value: 18.218182689678475
    - type: nauc_map_at_3_max
      value: 23.57169444088667
    - type: nauc_map_at_3_std
      value: 10.464473559366356
    - type: nauc_map_at_5_diff1
      value: 18.6075342519133
    - type: nauc_map_at_5_max
      value: 23.308845973576673
    - type: nauc_map_at_5_std
      value: 9.364009996445652
    - type: nauc_mrr_at_1000_diff1
      value: 17.486265968689338
    - type: nauc_mrr_at_1000_max
      value: 22.736799291688836
    - type: nauc_mrr_at_1000_std
      value: 9.831687441977147
    - type: nauc_mrr_at_100_diff1
      value: 17.50754492049086
    - type: nauc_mrr_at_100_max
      value: 22.77693662806787
    - type: nauc_mrr_at_100_std
      value: 9.853899509675395
    - type: nauc_mrr_at_10_diff1
      value: 17.42133968580952
    - type: nauc_mrr_at_10_max
      value: 22.45861793882279
    - type: nauc_mrr_at_10_std
      value: 8.964888472915938
    - type: nauc_mrr_at_1_diff1
      value: 19.433947086968093
    - type: nauc_mrr_at_1_max
      value: 24.75657047550517
    - type: nauc_mrr_at_1_std
      value: 15.122329157218505
    - type: nauc_mrr_at_20_diff1
      value: 17.429856756008785
    - type: nauc_mrr_at_20_max
      value: 22.438850987431017
    - type: nauc_mrr_at_20_std
      value: 9.172746012213558
    - type: nauc_mrr_at_3_diff1
      value: 18.218182689678475
    - type: nauc_mrr_at_3_max
      value: 23.57169444088667
    - type: nauc_mrr_at_3_std
      value: 10.464473559366356
    - type: nauc_mrr_at_5_diff1
      value: 18.6075342519133
    - type: nauc_mrr_at_5_max
      value: 23.308845973576673
    - type: nauc_mrr_at_5_std
      value: 9.364009996445652
    - type: nauc_ndcg_at_1000_diff1
      value: 16.327871824135745
    - type: nauc_ndcg_at_1000_max
      value: 23.308241052911495
    - type: nauc_ndcg_at_1000_std
      value: 11.50905911184097
    - type: nauc_ndcg_at_100_diff1
      value: 16.676226744692773
    - type: nauc_ndcg_at_100_max
      value: 24.323253721240974
    - type: nauc_ndcg_at_100_std
      value: 11.952612443651557
    - type: nauc_ndcg_at_10_diff1
      value: 16.030325121764594
    - type: nauc_ndcg_at_10_max
      value: 21.306799242079542
    - type: nauc_ndcg_at_10_std
      value: 6.63359364302513
    - type: nauc_ndcg_at_1_diff1
      value: 19.433947086968093
    - type: nauc_ndcg_at_1_max
      value: 24.75657047550517
    - type: nauc_ndcg_at_1_std
      value: 15.122329157218505
    - type: nauc_ndcg_at_20_diff1
      value: 16.013173605999857
    - type: nauc_ndcg_at_20_max
      value: 21.607217260736576
    - type: nauc_ndcg_at_20_std
      value: 7.319482417138996
    - type: nauc_ndcg_at_3_diff1
      value: 17.97958548328493
    - type: nauc_ndcg_at_3_max
      value: 23.58346522810145
    - type: nauc_ndcg_at_3_std
      value: 9.392582854708314
    - type: nauc_ndcg_at_5_diff1
      value: 18.734733324685287
    - type: nauc_ndcg_at_5_max
      value: 23.273244317623742
    - type: nauc_ndcg_at_5_std
      value: 7.638611545253834
    - type: nauc_precision_at_1000_diff1
      value: 7.919843339380295
    - type: nauc_precision_at_1000_max
      value: 31.575386234270486
    - type: nauc_precision_at_1000_std
      value: 39.332224386769404
    - type: nauc_precision_at_100_diff1
      value: 15.018050960000052
    - type: nauc_precision_at_100_max
      value: 34.98209513759861
    - type: nauc_precision_at_100_std
      value: 26.970034484359022
    - type: nauc_precision_at_10_diff1
      value: 12.102191084210922
    - type: nauc_precision_at_10_max
      value: 18.112541150340675
    - type: nauc_precision_at_10_std
      value: 0.7358784689406018
    - type: nauc_precision_at_1_diff1
      value: 19.433947086968093
    - type: nauc_precision_at_1_max
      value: 24.75657047550517
    - type: nauc_precision_at_1_std
      value: 15.122329157218505
    - type: nauc_precision_at_20_diff1
      value: 12.018814361204328
    - type: nauc_precision_at_20_max
      value: 19.75123746049928
    - type: nauc_precision_at_20_std
      value: 3.012204650582264
    - type: nauc_precision_at_3_diff1
      value: 17.41375604940955
    - type: nauc_precision_at_3_max
      value: 23.699834627021037
    - type: nauc_precision_at_3_std
      value: 6.793486779050103
    - type: nauc_precision_at_5_diff1
      value: 19.194631963780257
    - type: nauc_precision_at_5_max
      value: 23.31708702442155
    - type: nauc_precision_at_5_std
      value: 3.4591358279667332
    - type: nauc_recall_at_1000_diff1
      value: 7.919843339380378
    - type: nauc_recall_at_1000_max
      value: 31.57538623427063
    - type: nauc_recall_at_1000_std
      value: 39.332224386769546
    - type: nauc_recall_at_100_diff1
      value: 15.018050960000085
    - type: nauc_recall_at_100_max
      value: 34.9820951375986
    - type: nauc_recall_at_100_std
      value: 26.97003448435901
    - type: nauc_recall_at_10_diff1
      value: 12.102191084210837
    - type: nauc_recall_at_10_max
      value: 18.112541150340594
    - type: nauc_recall_at_10_std
      value: 0.7358784689405188
    - type: nauc_recall_at_1_diff1
      value: 19.433947086968093
    - type: nauc_recall_at_1_max
      value: 24.75657047550517
    - type: nauc_recall_at_1_std
      value: 15.122329157218505
    - type: nauc_recall_at_20_diff1
      value: 12.01881436120429
    - type: nauc_recall_at_20_max
      value: 19.751237460499222
    - type: nauc_recall_at_20_std
      value: 3.0122046505822135
    - type: nauc_recall_at_3_diff1
      value: 17.413756049409503
    - type: nauc_recall_at_3_max
      value: 23.699834627020998
    - type: nauc_recall_at_3_std
      value: 6.793486779050083
    - type: nauc_recall_at_5_diff1
      value: 19.194631963780203
    - type: nauc_recall_at_5_max
      value: 23.3170870244215
    - type: nauc_recall_at_5_std
      value: 3.459135827966664
    - type: ndcg_at_1
      value: 14.865
    - type: ndcg_at_10
      value: 24.764
    - type: ndcg_at_100
      value: 30.861
    - type: ndcg_at_1000
      value: 33.628
    - type: ndcg_at_20
      value: 27.078000000000003
    - type: ndcg_at_3
      value: 21.675
    - type: ndcg_at_5
      value: 23.148
    - type: precision_at_1
      value: 14.865
    - type: precision_at_10
      value: 3.4680000000000004
    - type: precision_at_100
      value: 0.644
    - type: precision_at_1000
      value: 0.087
    - type: precision_at_20
      value: 2.185
    - type: precision_at_3
      value: 8.709
    - type: precision_at_5
      value: 5.946
    - type: recall_at_1
      value: 14.865
    - type: recall_at_10
      value: 34.685
    - type: recall_at_100
      value: 64.414
    - type: recall_at_1000
      value: 86.937
    - type: recall_at_20
      value: 43.694
    - type: recall_at_3
      value: 26.125999999999998
    - type: recall_at_5
      value: 29.73
  - task:
      type: Classification
    dataset:
      name: MTEB Banking77Classification (default)
      type: mteb/banking77
      config: default
      split: test
      revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
    metrics:
    - type: accuracy
      value: 84.08116883116882
    - type: f1
      value: 84.05587055990273
    - type: f1_weighted
      value: 84.05587055990274
    - type: main_score
      value: 84.08116883116882
  - task:
      type: Clustering
    dataset:
      name: MTEB BiorxivClusteringP2P (default)
      type: mteb/biorxiv-clustering-p2p
      config: default
      split: test
      revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
    metrics:
    - type: main_score
      value: 38.1941007822277
    - type: v_measure
      value: 38.1941007822277
    - type: v_measure_std
      value: 0.7502113547288178
  - task:
      type: Clustering
    dataset:
      name: MTEB BiorxivClusteringS2S (default)
      type: mteb/biorxiv-clustering-s2s
      config: default
      split: test
      revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
    metrics:
    - type: main_score
      value: 34.42075599178318
    - type: v_measure
      value: 34.42075599178318
    - type: v_measure_std
      value: 0.600256720497283
  - task:
      type: Clustering
    dataset:
      name: MTEB BlurbsClusteringP2P (default)
      type: slvnwhrl/blurbs-clustering-p2p
      config: default
      split: test
      revision: a2dd5b02a77de3466a3eaa98ae586b5610314496
    metrics:
    - type: main_score
      value: 41.634627363047265
    - type: v_measure
      value: 41.634627363047265
    - type: v_measure_std
      value: 9.726923191225307
  - task:
      type: Clustering
    dataset:
      name: MTEB BlurbsClusteringS2S (default)
      type: slvnwhrl/blurbs-clustering-s2s
      config: default
      split: test
      revision: 22793b6a6465bf00120ad525e38c51210858132c
    metrics:
    - type: main_score
      value: 20.996468295584197
    - type: v_measure
      value: 20.996468295584197
    - type: v_measure_std
      value: 9.225766688272197
  - task:
      type: Classification
    dataset:
      name: MTEB CBD (default)
      type: PL-MTEB/cbd
      config: default
      split: test
      revision: 36ddb419bcffe6a5374c3891957912892916f28d
    metrics:
    - type: accuracy
      value: 69.99
    - type: ap
      value: 22.57826353116948
    - type: ap_weighted
      value: 22.57826353116948
    - type: f1
      value: 59.04574955548393
    - type: f1_weighted
      value: 74.36235022309789
    - type: main_score
      value: 69.99
  - task:
      type: PairClassification
    dataset:
      name: MTEB CDSC-E (default)
      type: PL-MTEB/cdsce-pairclassification
      config: default
      split: test
      revision: 0a3d4aa409b22f80eb22cbf59b492637637b536d
    metrics:
    - type: cosine_accuracy
      value: 88.7
    - type: cosine_accuracy_threshold
      value: 97.37848043441772
    - type: cosine_ap
      value: 73.0405088928302
    - type: cosine_f1
      value: 63.52201257861635
    - type: cosine_f1_threshold
      value: 96.98888063430786
    - type: cosine_precision
      value: 78.90625
    - type: cosine_recall
      value: 53.1578947368421
    - type: dot_accuracy
      value: 84.89999999999999
    - type: dot_accuracy_threshold
      value: 43603.09753417969
    - type: dot_ap
      value: 56.98157569085279
    - type: dot_f1
      value: 57.606490872210955
    - type: dot_f1_threshold
      value: 40406.23779296875
    - type: dot_precision
      value: 46.864686468646866
    - type: dot_recall
      value: 74.73684210526315
    - type: euclidean_accuracy
      value: 88.5
    - type: euclidean_accuracy_threshold
      value: 498.0483055114746
    - type: euclidean_ap
      value: 72.97328234816734
    - type: euclidean_f1
      value: 63.722397476340696
    - type: euclidean_f1_threshold
      value: 508.6186408996582
    - type: euclidean_precision
      value: 79.52755905511812
    - type: euclidean_recall
      value: 53.1578947368421
    - type: main_score
      value: 73.0405088928302
    - type: manhattan_accuracy
      value: 88.6
    - type: manhattan_accuracy_threshold
      value: 12233.079528808594
    - type: manhattan_ap
      value: 72.92148503992615
    - type: manhattan_f1
      value: 63.69426751592356
    - type: manhattan_f1_threshold
      value: 12392.754364013672
    - type: manhattan_precision
      value: 80.64516129032258
    - type: manhattan_recall
      value: 52.63157894736842
    - type: max_accuracy
      value: 88.7
    - type: max_ap
      value: 73.0405088928302
    - type: max_f1
      value: 63.722397476340696
    - type: max_precision
      value: 80.64516129032258
    - type: max_recall
      value: 74.73684210526315
    - type: similarity_accuracy
      value: 88.7
    - type: similarity_accuracy_threshold
      value: 97.37848043441772
    - type: similarity_ap
      value: 73.0405088928302
    - type: similarity_f1
      value: 63.52201257861635
    - type: similarity_f1_threshold
      value: 96.98888063430786
    - type: similarity_precision
      value: 78.90625
    - type: similarity_recall
      value: 53.1578947368421
  - task:
      type: STS
    dataset:
      name: MTEB CDSC-R (default)
      type: PL-MTEB/cdscr-sts
      config: default
      split: test
      revision: 1cd6abbb00df7d14be3dbd76a7dcc64b3a79a7cd
    metrics:
    - type: cosine_pearson
      value: 92.97492495289738
    - type: cosine_spearman
      value: 92.63248098608472
    - type: euclidean_pearson
      value: 92.04712487782031
    - type: euclidean_spearman
      value: 92.19679486755008
    - type: main_score
      value: 92.63248098608472
    - type: manhattan_pearson
      value: 92.0101187740438
    - type: manhattan_spearman
      value: 92.20926859332754
    - type: pearson
      value: 92.97492495289738
    - type: spearman
      value: 92.63248098608472
  - task:
      type: Clustering
    dataset:
      name: MTEB CLSClusteringP2P (default)
      type: C-MTEB/CLSClusteringP2P
      config: default
      split: test
      revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476
    metrics:
    - type: main_score
      value: 39.96377851800628
    - type: v_measure
      value: 39.96377851800628
    - type: v_measure_std
      value: 0.9793033243093288
  - task:
      type: Clustering
    dataset:
      name: MTEB CLSClusteringS2S (default)
      type: C-MTEB/CLSClusteringS2S
      config: default
      split: test
      revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f
    metrics:
    - type: main_score
      value: 38.788850224595784
    - type: v_measure
      value: 38.788850224595784
    - type: v_measure_std
      value: 1.0712604145916924
  - task:
      type: Reranking
    dataset:
      name: MTEB CMedQAv1
      type: C-MTEB/CMedQAv1-reranking
      config: default
      split: test
      revision: 8d7f1e942507dac42dc58017c1a001c3717da7df
    metrics:
    - type: map
      value: 77.95952507806115
    - type: mrr
      value: 80.8643253968254
    - type: main_score
      value: 77.95952507806115
  - task:
      type: Reranking
    dataset:
      name: MTEB CMedQAv2
      type: C-MTEB/CMedQAv2-reranking
      config: default
      split: test
      revision: 23d186750531a14a0357ca22cd92d712fd512ea0
    metrics:
    - type: map
      value: 78.21522500165045
    - type: mrr
      value: 81.28194444444443
    - type: main_score
      value: 78.21522500165045
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackAndroidRetrieval (default)
      type: mteb/cqadupstack-android
      config: default
      split: test
      revision: f46a197baaae43b4f621051089b82a364682dfeb
    metrics:
    - type: map_at_1
      value: 33.377
    - type: map_at_10
      value: 46.371
    - type: map_at_100
      value: 47.829
    - type: map_at_1000
      value: 47.94
    - type: map_at_20
      value: 47.205000000000005
    - type: map_at_3
      value: 42.782
    - type: map_at_5
      value: 44.86
    - type: mrr_at_1
      value: 41.345
    - type: mrr_at_10
      value: 52.187
    - type: mrr_at_100
      value: 52.893
    - type: mrr_at_1000
      value: 52.929
    - type: mrr_at_20
      value: 52.637
    - type: mrr_at_3
      value: 49.714000000000006
    - type: mrr_at_5
      value: 51.373000000000005
    - type: ndcg_at_1
      value: 41.345
    - type: ndcg_at_10
      value: 52.946000000000005
    - type: ndcg_at_100
      value: 57.92699999999999
    - type: ndcg_at_1000
      value: 59.609
    - type: ndcg_at_20
      value: 54.900999999999996
    - type: ndcg_at_3
      value: 48.357
    - type: ndcg_at_5
      value: 50.739000000000004
    - type: precision_at_1
      value: 41.345
    - type: precision_at_10
      value: 10.186
    - type: precision_at_100
      value: 1.554
    - type: precision_at_1000
      value: 0.2
    - type: precision_at_20
      value: 5.959
    - type: precision_at_3
      value: 23.796
    - type: precision_at_5
      value: 17.024
    - type: recall_at_1
      value: 33.377
    - type: recall_at_10
      value: 65.067
    - type: recall_at_100
      value: 86.04899999999999
    - type: recall_at_1000
      value: 96.54899999999999
    - type: recall_at_20
      value: 72.071
    - type: recall_at_3
      value: 51.349999999999994
    - type: recall_at_5
      value: 58.41
    - type: main_score
      value: 52.946000000000005
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackEnglishRetrieval (default)
      type: mteb/cqadupstack-english
      config: default
      split: test
      revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
    metrics:
    - type: map_at_1
      value: 31.097
    - type: map_at_10
      value: 42.183
    - type: map_at_100
      value: 43.580999999999996
    - type: map_at_1000
      value: 43.718
    - type: map_at_20
      value: 42.921
    - type: map_at_3
      value: 38.963
    - type: map_at_5
      value: 40.815
    - type: mrr_at_1
      value: 39.745000000000005
    - type: mrr_at_10
      value: 48.736000000000004
    - type: mrr_at_100
      value: 49.405
    - type: mrr_at_1000
      value: 49.452
    - type: mrr_at_20
      value: 49.118
    - type: mrr_at_3
      value: 46.497
    - type: mrr_at_5
      value: 47.827999999999996
    - type: ndcg_at_1
      value: 39.745000000000005
    - type: ndcg_at_10
      value: 48.248000000000005
    - type: ndcg_at_100
      value: 52.956
    - type: ndcg_at_1000
      value: 54.99699999999999
    - type: ndcg_at_20
      value: 50.01
    - type: ndcg_at_3
      value: 43.946000000000005
    - type: ndcg_at_5
      value: 46.038000000000004
    - type: precision_at_1
      value: 39.745000000000005
    - type: precision_at_10
      value: 9.229
    - type: precision_at_100
      value: 1.5070000000000001
    - type: precision_at_1000
      value: 0.199
    - type: precision_at_20
      value: 5.489999999999999
    - type: precision_at_3
      value: 21.38
    - type: precision_at_5
      value: 15.274
    - type: recall_at_1
      value: 31.097
    - type: recall_at_10
      value: 58.617
    - type: recall_at_100
      value: 78.55199999999999
    - type: recall_at_1000
      value: 91.13900000000001
    - type: recall_at_20
      value: 64.92
    - type: recall_at_3
      value: 45.672000000000004
    - type: recall_at_5
      value: 51.669
    - type: main_score
      value: 48.248000000000005
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackGamingRetrieval (default)
      type: mteb/cqadupstack-gaming
      config: default
      split: test
      revision: 4885aa143210c98657558c04aaf3dc47cfb54340
    metrics:
    - type: map_at_1
      value: 39.745000000000005
    - type: map_at_10
      value: 52.063
    - type: map_at_100
      value: 53.077
    - type: map_at_1000
      value: 53.13
    - type: map_at_20
      value: 52.66
    - type: map_at_3
      value: 48.662
    - type: map_at_5
      value: 50.507000000000005
    - type: mrr_at_1
      value: 45.391999999999996
    - type: mrr_at_10
      value: 55.528
    - type: mrr_at_100
      value: 56.16100000000001
    - type: mrr_at_1000
      value: 56.192
    - type: mrr_at_20
      value: 55.923
    - type: mrr_at_3
      value: 52.93600000000001
    - type: mrr_at_5
      value: 54.435
    - type: ndcg_at_1
      value: 45.391999999999996
    - type: ndcg_at_10
      value: 58.019
    - type: ndcg_at_100
      value: 61.936
    - type: ndcg_at_1000
      value: 63.015
    - type: ndcg_at_20
      value: 59.691
    - type: ndcg_at_3
      value: 52.294
    - type: ndcg_at_5
      value: 55.017
    - type: precision_at_1
      value: 45.391999999999996
    - type: precision_at_10
      value: 9.386
    - type: precision_at_100
      value: 1.232
    - type: precision_at_1000
      value: 0.136
    - type: precision_at_20
      value: 5.223
    - type: precision_at_3
      value: 23.177
    - type: precision_at_5
      value: 15.9
    - type: recall_at_1
      value: 39.745000000000005
    - type: recall_at_10
      value: 72.08099999999999
    - type: recall_at_100
      value: 88.85300000000001
    - type: recall_at_1000
      value: 96.569
    - type: recall_at_20
      value: 78.203
    - type: recall_at_3
      value: 56.957
    - type: recall_at_5
      value: 63.63100000000001
    - type: main_score
      value: 58.019
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackGisRetrieval (default)
      type: mteb/cqadupstack-gis
      config: default
      split: test
      revision: 5003b3064772da1887988e05400cf3806fe491f2
    metrics:
    - type: map_at_1
      value: 26.651999999999997
    - type: map_at_10
      value: 35.799
    - type: map_at_100
      value: 36.846000000000004
    - type: map_at_1000
      value: 36.931000000000004
    - type: map_at_20
      value: 36.341
    - type: map_at_3
      value: 32.999
    - type: map_at_5
      value: 34.597
    - type: mrr_at_1
      value: 28.814
    - type: mrr_at_10
      value: 37.869
    - type: mrr_at_100
      value: 38.728
    - type: mrr_at_1000
      value: 38.795
    - type: mrr_at_20
      value: 38.317
    - type: mrr_at_3
      value: 35.235
    - type: mrr_at_5
      value: 36.738
    - type: ndcg_at_1
      value: 28.814
    - type: ndcg_at_10
      value: 41.028
    - type: ndcg_at_100
      value: 46.162
    - type: ndcg_at_1000
      value: 48.15
    - type: ndcg_at_20
      value: 42.824
    - type: ndcg_at_3
      value: 35.621
    - type: ndcg_at_5
      value: 38.277
    - type: precision_at_1
      value: 28.814
    - type: precision_at_10
      value: 6.361999999999999
    - type: precision_at_100
      value: 0.9450000000000001
    - type: precision_at_1000
      value: 0.11399999999999999
    - type: precision_at_20
      value: 3.6159999999999997
    - type: precision_at_3
      value: 15.140999999999998
    - type: precision_at_5
      value: 10.712000000000002
    - type: recall_at_1
      value: 26.651999999999997
    - type: recall_at_10
      value: 55.038
    - type: recall_at_100
      value: 78.806
    - type: recall_at_1000
      value: 93.485
    - type: recall_at_20
      value: 61.742
    - type: recall_at_3
      value: 40.682
    - type: recall_at_5
      value: 46.855000000000004
    - type: main_score
      value: 41.028
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackMathematicaRetrieval (default)
      type: mteb/cqadupstack-mathematica
      config: default
      split: test
      revision: 90fceea13679c63fe563ded68f3b6f06e50061de
    metrics:
    - type: map_at_1
      value: 17.627000000000002
    - type: map_at_10
      value: 26.436999999999998
    - type: map_at_100
      value: 27.85
    - type: map_at_1000
      value: 27.955999999999996
    - type: map_at_20
      value: 27.233
    - type: map_at_3
      value: 23.777
    - type: map_at_5
      value: 25.122
    - type: mrr_at_1
      value: 22.387999999999998
    - type: mrr_at_10
      value: 31.589
    - type: mrr_at_100
      value: 32.641999999999996
    - type: mrr_at_1000
      value: 32.696999999999996
    - type: mrr_at_20
      value: 32.201
    - type: mrr_at_3
      value: 28.98
    - type: mrr_at_5
      value: 30.342000000000002
    - type: ndcg_at_1
      value: 22.387999999999998
    - type: ndcg_at_10
      value: 32.129999999999995
    - type: ndcg_at_100
      value: 38.562999999999995
    - type: ndcg_at_1000
      value: 40.903
    - type: ndcg_at_20
      value: 34.652
    - type: ndcg_at_3
      value: 27.26
    - type: ndcg_at_5
      value: 29.235
    - type: precision_at_1
      value: 22.387999999999998
    - type: precision_at_10
      value: 5.970000000000001
    - type: precision_at_100
      value: 1.068
    - type: precision_at_1000
      value: 0.13899999999999998
    - type: precision_at_20
      value: 3.6999999999999997
    - type: precision_at_3
      value: 13.267000000000001
    - type: precision_at_5
      value: 9.403
    - type: recall_at_1
      value: 17.627000000000002
    - type: recall_at_10
      value: 44.71
    - type: recall_at_100
      value: 72.426
    - type: recall_at_1000
      value: 88.64699999999999
    - type: recall_at_20
      value: 53.65
    - type: recall_at_3
      value: 30.989
    - type: recall_at_5
      value: 36.237
    - type: main_score
      value: 32.129999999999995
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackPhysicsRetrieval (default)
      type: mteb/cqadupstack-physics
      config: default
      split: test
      revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
    metrics:
    - type: map_at_1
      value: 30.891000000000002
    - type: map_at_10
      value: 41.519
    - type: map_at_100
      value: 42.896
    - type: map_at_1000
      value: 42.992999999999995
    - type: map_at_20
      value: 42.287
    - type: map_at_3
      value: 37.822
    - type: map_at_5
      value: 39.976
    - type: mrr_at_1
      value: 37.921
    - type: mrr_at_10
      value: 47.260999999999996
    - type: mrr_at_100
      value: 48.044
    - type: mrr_at_1000
      value: 48.08
    - type: mrr_at_20
      value: 47.699999999999996
    - type: mrr_at_3
      value: 44.513999999999996
    - type: mrr_at_5
      value: 46.064
    - type: ndcg_at_1
      value: 37.921
    - type: ndcg_at_10
      value: 47.806
    - type: ndcg_at_100
      value: 53.274
    - type: ndcg_at_1000
      value: 55.021
    - type: ndcg_at_20
      value: 49.973
    - type: ndcg_at_3
      value: 42.046
    - type: ndcg_at_5
      value: 44.835
    - type: precision_at_1
      value: 37.921
    - type: precision_at_10
      value: 8.767999999999999
    - type: precision_at_100
      value: 1.353
    - type: precision_at_1000
      value: 0.168
    - type: precision_at_20
      value: 5.135
    - type: precision_at_3
      value: 20.051
    - type: precision_at_5
      value: 14.398
    - type: recall_at_1
      value: 30.891000000000002
    - type: recall_at_10
      value: 60.897999999999996
    - type: recall_at_100
      value: 83.541
    - type: recall_at_1000
      value: 94.825
    - type: recall_at_20
      value: 68.356
    - type: recall_at_3
      value: 44.65
    - type: recall_at_5
      value: 51.919000000000004
    - type: main_score
      value: 47.806
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackProgrammersRetrieval (default)
      type: mteb/cqadupstack-programmers
      config: default
      split: test
      revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
    metrics:
    - type: map_at_1
      value: 27.654
    - type: map_at_10
      value: 38.025999999999996
    - type: map_at_100
      value: 39.425
    - type: map_at_1000
      value: 39.528
    - type: map_at_20
      value: 38.838
    - type: map_at_3
      value: 34.745
    - type: map_at_5
      value: 36.537
    - type: mrr_at_1
      value: 34.018
    - type: mrr_at_10
      value: 43.314
    - type: mrr_at_100
      value: 44.283
    - type: mrr_at_1000
      value: 44.327
    - type: mrr_at_20
      value: 43.929
    - type: mrr_at_3
      value: 40.868
    - type: mrr_at_5
      value: 42.317
    - type: ndcg_at_1
      value: 34.018
    - type: ndcg_at_10
      value: 43.887
    - type: ndcg_at_100
      value: 49.791000000000004
    - type: ndcg_at_1000
      value: 51.834
    - type: ndcg_at_20
      value: 46.376
    - type: ndcg_at_3
      value: 38.769999999999996
    - type: ndcg_at_5
      value: 41.144
    - type: precision_at_1
      value: 34.018
    - type: precision_at_10
      value: 8.001999999999999
    - type: precision_at_100
      value: 1.2630000000000001
    - type: precision_at_1000
      value: 0.16
    - type: precision_at_20
      value: 4.737
    - type: precision_at_3
      value: 18.417
    - type: precision_at_5
      value: 13.150999999999998
    - type: recall_at_1
      value: 27.654
    - type: recall_at_10
      value: 56.111
    - type: recall_at_100
      value: 81.136
    - type: recall_at_1000
      value: 94.788
    - type: recall_at_20
      value: 65.068
    - type: recall_at_3
      value: 41.713
    - type: recall_at_5
      value: 48.106
    - type: main_score
      value: 43.887
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackRetrieval (default)
      type: CQADupstackRetrieval_is_a_combined_dataset
      config: default
      split: test
      revision: CQADupstackRetrieval_is_a_combined_dataset
    metrics:
    - type: main_score
      value: 42.58858333333333
    - type: ndcg_at_10
      value: 42.58858333333333
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackStatsRetrieval (default)
      type: mteb/cqadupstack-stats
      config: default
      split: test
      revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
    metrics:
    - type: map_at_1
      value: 24.501
    - type: map_at_10
      value: 32.814
    - type: map_at_100
      value: 33.754
    - type: map_at_1000
      value: 33.859
    - type: map_at_20
      value: 33.324
    - type: map_at_3
      value: 30.758000000000003
    - type: map_at_5
      value: 31.936999999999998
    - type: mrr_at_1
      value: 27.761000000000003
    - type: mrr_at_10
      value: 35.662
    - type: mrr_at_100
      value: 36.443999999999996
    - type: mrr_at_1000
      value: 36.516999999999996
    - type: mrr_at_20
      value: 36.085
    - type: mrr_at_3
      value: 33.742
    - type: mrr_at_5
      value: 34.931
    - type: ndcg_at_1
      value: 27.761000000000003
    - type: ndcg_at_10
      value: 37.208000000000006
    - type: ndcg_at_100
      value: 41.839
    - type: ndcg_at_1000
      value: 44.421
    - type: ndcg_at_20
      value: 38.917
    - type: ndcg_at_3
      value: 33.544000000000004
    - type: ndcg_at_5
      value: 35.374
    - type: precision_at_1
      value: 27.761000000000003
    - type: precision_at_10
      value: 5.92
    - type: precision_at_100
      value: 0.899
    - type: precision_at_1000
      value: 0.12
    - type: precision_at_20
      value: 3.4130000000000003
    - type: precision_at_3
      value: 15.031
    - type: precision_at_5
      value: 10.306999999999999
    - type: recall_at_1
      value: 24.501
    - type: recall_at_10
      value: 47.579
    - type: recall_at_100
      value: 69.045
    - type: recall_at_1000
      value: 88.032
    - type: recall_at_20
      value: 54.125
    - type: recall_at_3
      value: 37.202
    - type: recall_at_5
      value: 41.927
    - type: main_score
      value: 37.208000000000006
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackTexRetrieval (default)
      type: mteb/cqadupstack-tex
      config: default
      split: test
      revision: 46989137a86843e03a6195de44b09deda022eec7
    metrics:
    - type: map_at_1
      value: 18.29
    - type: map_at_10
      value: 26.183
    - type: map_at_100
      value: 27.351999999999997
    - type: map_at_1000
      value: 27.483999999999998
    - type: map_at_20
      value: 26.798
    - type: map_at_3
      value: 23.629
    - type: map_at_5
      value: 24.937
    - type: mrr_at_1
      value: 22.299
    - type: mrr_at_10
      value: 30.189
    - type: mrr_at_100
      value: 31.098
    - type: mrr_at_1000
      value: 31.177
    - type: mrr_at_20
      value: 30.697000000000003
    - type: mrr_at_3
      value: 27.862
    - type: mrr_at_5
      value: 29.066
    - type: ndcg_at_1
      value: 22.299
    - type: ndcg_at_10
      value: 31.202
    - type: ndcg_at_100
      value: 36.617
    - type: ndcg_at_1000
      value: 39.544000000000004
    - type: ndcg_at_20
      value: 33.177
    - type: ndcg_at_3
      value: 26.639000000000003
    - type: ndcg_at_5
      value: 28.526
    - type: precision_at_1
      value: 22.299
    - type: precision_at_10
      value: 5.8020000000000005
    - type: precision_at_100
      value: 1.0070000000000001
    - type: precision_at_1000
      value: 0.14400000000000002
    - type: precision_at_20
      value: 3.505
    - type: precision_at_3
      value: 12.698
    - type: precision_at_5
      value: 9.174
    - type: recall_at_1
      value: 18.29
    - type: recall_at_10
      value: 42.254999999999995
    - type: recall_at_100
      value: 66.60000000000001
    - type: recall_at_1000
      value: 87.31400000000001
    - type: recall_at_20
      value: 49.572
    - type: recall_at_3
      value: 29.342000000000002
    - type: recall_at_5
      value: 34.221000000000004
    - type: main_score
      value: 31.202
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackUnixRetrieval (default)
      type: mteb/cqadupstack-unix
      config: default
      split: test
      revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
    metrics:
    - type: map_at_1
      value: 27.722
    - type: map_at_10
      value: 37.698
    - type: map_at_100
      value: 38.899
    - type: map_at_1000
      value: 38.998
    - type: map_at_20
      value: 38.381
    - type: map_at_3
      value: 34.244
    - type: map_at_5
      value: 36.295
    - type: mrr_at_1
      value: 32.183
    - type: mrr_at_10
      value: 41.429
    - type: mrr_at_100
      value: 42.308
    - type: mrr_at_1000
      value: 42.358000000000004
    - type: mrr_at_20
      value: 41.957
    - type: mrr_at_3
      value: 38.401999999999994
    - type: mrr_at_5
      value: 40.294999999999995
    - type: ndcg_at_1
      value: 32.183
    - type: ndcg_at_10
      value: 43.519000000000005
    - type: ndcg_at_100
      value: 48.786
    - type: ndcg_at_1000
      value: 50.861999999999995
    - type: ndcg_at_20
      value: 45.654
    - type: ndcg_at_3
      value: 37.521
    - type: ndcg_at_5
      value: 40.615
    - type: precision_at_1
      value: 32.183
    - type: precision_at_10
      value: 7.603
    - type: precision_at_100
      value: 1.135
    - type: precision_at_1000
      value: 0.14200000000000002
    - type: precision_at_20
      value: 4.408
    - type: precision_at_3
      value: 17.071
    - type: precision_at_5
      value: 12.668
    - type: recall_at_1
      value: 27.722
    - type: recall_at_10
      value: 57.230000000000004
    - type: recall_at_100
      value: 79.97999999999999
    - type: recall_at_1000
      value: 94.217
    - type: recall_at_20
      value: 64.864
    - type: recall_at_3
      value: 41.215
recall_at_5 value: 48.774 - type: main_score value: 43.519000000000005 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval (default) type: mteb/cqadupstack-webmasters config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 25.852999999999998 - type: map_at_10 value: 35.394999999999996 - type: map_at_100 value: 37.291999999999994 - type: map_at_1000 value: 37.495 - type: map_at_20 value: 36.372 - type: map_at_3 value: 32.336 - type: map_at_5 value: 34.159 - type: mrr_at_1 value: 31.818 - type: mrr_at_10 value: 40.677 - type: mrr_at_100 value: 41.728 - type: mrr_at_1000 value: 41.778 - type: mrr_at_20 value: 41.301 - type: mrr_at_3 value: 38.208 - type: mrr_at_5 value: 39.592 - type: ndcg_at_1 value: 31.818 - type: ndcg_at_10 value: 41.559000000000005 - type: ndcg_at_100 value: 48.012 - type: ndcg_at_1000 value: 50.234 - type: ndcg_at_20 value: 44.15 - type: ndcg_at_3 value: 36.918 - type: ndcg_at_5 value: 39.227000000000004 - type: precision_at_1 value: 31.818 - type: precision_at_10 value: 8.043 - type: precision_at_100 value: 1.625 - type: precision_at_1000 value: 0.245 - type: precision_at_20 value: 5.2170000000000005 - type: precision_at_3 value: 17.655 - type: precision_at_5 value: 12.845999999999998 - type: recall_at_1 value: 25.852999999999998 - type: recall_at_10 value: 53.093 - type: recall_at_100 value: 81.05799999999999 - type: recall_at_1000 value: 94.657 - type: recall_at_20 value: 62.748000000000005 - type: recall_at_3 value: 39.300000000000004 - type: recall_at_5 value: 45.754 - type: main_score value: 41.559000000000005 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval (default) type: mteb/cqadupstack-wordpress config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 19.23 - type: map_at_10 value: 28.128999999999998 - type: map_at_100 value: 29.195 - type: map_at_1000 value: 29.310000000000002 - type: map_at_20 value: 28.713 - type: map_at_3 value: 25.191000000000003 - type: map_at_5 value: 26.69 - type: mrr_at_1 value: 21.257 - type: mrr_at_10 value: 30.253999999999998 - type: mrr_at_100 value: 31.195 - type: mrr_at_1000 value: 31.270999999999997 - type: mrr_at_20 value: 30.747999999999998 - type: mrr_at_3 value: 27.633999999999997 - type: mrr_at_5 value: 28.937 - type: ndcg_at_1 value: 21.257 - type: ndcg_at_10 value: 33.511 - type: ndcg_at_100 value: 38.733000000000004 - type: ndcg_at_1000 value: 41.489 - type: ndcg_at_20 value: 35.476 - type: ndcg_at_3 value: 27.845 - type: ndcg_at_5 value: 30.264999999999997 - type: precision_at_1 value: 21.257 - type: precision_at_10 value: 5.619 - type: precision_at_100 value: 0.893 - type: precision_at_1000 value: 0.124 - type: precision_at_20 value: 3.29 - type: precision_at_3 value: 12.508 - type: precision_at_5 value: 8.946 - type: recall_at_1 value: 19.23 - type: recall_at_10 value: 48.185 - type: recall_at_100 value: 71.932 - type: recall_at_1000 value: 92.587 - type: recall_at_20 value: 55.533 - type: recall_at_3 value: 32.865 - type: recall_at_5 value: 38.577 - type: main_score value: 33.511 - task: type: Retrieval dataset: name: MTEB ClimateFEVER (default) type: mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 19.594 - type: map_at_10 value: 32.519 - type: map_at_100 value: 34.1 - type: map_at_1000 value: 34.263 - type: map_at_20 value: 33.353 - type: map_at_3 value: 27.898 - type: map_at_5 
value: 30.524 - type: mrr_at_1 value: 46.515 - type: mrr_at_10 value: 56.958 - type: mrr_at_100 value: 57.54899999999999 - type: mrr_at_1000 value: 57.574999999999996 - type: mrr_at_20 value: 57.315000000000005 - type: mrr_at_3 value: 54.852999999999994 - type: mrr_at_5 value: 56.153 - type: ndcg_at_1 value: 46.515 - type: ndcg_at_10 value: 42.363 - type: ndcg_at_100 value: 48.233 - type: ndcg_at_1000 value: 50.993 - type: ndcg_at_20 value: 44.533 - type: ndcg_at_3 value: 37.297000000000004 - type: ndcg_at_5 value: 38.911 - type: precision_at_1 value: 46.515 - type: precision_at_10 value: 12.520999999999999 - type: precision_at_100 value: 1.8980000000000001 - type: precision_at_1000 value: 0.242 - type: precision_at_20 value: 7.212000000000001 - type: precision_at_3 value: 27.752 - type: precision_at_5 value: 20.391000000000002 - type: recall_at_1 value: 19.594 - type: recall_at_10 value: 46.539 - type: recall_at_100 value: 66.782 - type: recall_at_1000 value: 82.049 - type: recall_at_20 value: 52.611 - type: recall_at_3 value: 32.528 - type: recall_at_5 value: 38.933 - type: main_score value: 42.363 - task: type: Retrieval dataset: name: MTEB CmedqaRetrieval (default) type: C-MTEB/CmedqaRetrieval config: default split: dev revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301 metrics: - type: main_score value: 35.927 - type: map_at_1 value: 20.144000000000002 - type: map_at_10 value: 29.94 - type: map_at_100 value: 31.630000000000003 - type: map_at_1000 value: 31.778000000000002 - type: map_at_20 value: 30.798 - type: map_at_3 value: 26.534999999999997 - type: map_at_5 value: 28.33 - type: mrr_at_1 value: 31.23280820205051 - type: mrr_at_10 value: 38.66781179421835 - type: mrr_at_100 value: 39.656936166081785 - type: mrr_at_1000 value: 39.724602893117414 - type: mrr_at_20 value: 39.21272461558451 - type: mrr_at_3 value: 36.30907726931729 - type: mrr_at_5 value: 37.59814953738436 - type: nauc_map_at_1000_diff1 value: 44.5755334437146 - type: nauc_map_at_1000_max value: 40.726916781400746 - type: nauc_map_at_1000_std value: -19.591835061497367 - type: nauc_map_at_100_diff1 value: 44.54542899921038 - type: nauc_map_at_100_max value: 40.68305902532837 - type: nauc_map_at_100_std value: -19.658902089283487 - type: nauc_map_at_10_diff1 value: 44.56110529630953 - type: nauc_map_at_10_max value: 39.89826167846008 - type: nauc_map_at_10_std value: -20.62910633667902 - type: nauc_map_at_1_diff1 value: 50.82120107004449 - type: nauc_map_at_1_max value: 33.208851367861584 - type: nauc_map_at_1_std value: -20.29409730258174 - type: nauc_map_at_20_diff1 value: 44.51171242433788 - type: nauc_map_at_20_max value: 40.30431132782945 - type: nauc_map_at_20_std value: -20.290524142792417 - type: nauc_map_at_3_diff1 value: 45.80394138665133 - type: nauc_map_at_3_max value: 37.766191281426956 - type: nauc_map_at_3_std value: -21.223601997333876 - type: nauc_map_at_5_diff1 value: 45.00457218474283 - type: nauc_map_at_5_max value: 38.901044576388365 - type: nauc_map_at_5_std value: -20.893069613941634 - type: nauc_mrr_at_1000_diff1 value: 50.09855359231429 - type: nauc_mrr_at_1000_max value: 46.481000170008826 - type: nauc_mrr_at_1000_std value: -16.053461377096102 - type: nauc_mrr_at_100_diff1 value: 50.08205026347746 - type: nauc_mrr_at_100_max value: 46.47262126963331 - type: nauc_mrr_at_100_std value: -16.049112778748693 - type: nauc_mrr_at_10_diff1 value: 50.02363239081706 - type: nauc_mrr_at_10_max value: 46.39287859062042 - type: nauc_mrr_at_10_std value: -16.280866744769657 - type: nauc_mrr_at_1_diff1 
value: 55.692503735317445 - type: nauc_mrr_at_1_max value: 47.334834529801014 - type: nauc_mrr_at_1_std value: -16.985483585693512 - type: nauc_mrr_at_20_diff1 value: 50.07725225722074 - type: nauc_mrr_at_20_max value: 46.47279295070193 - type: nauc_mrr_at_20_std value: -16.15168364678318 - type: nauc_mrr_at_3_diff1 value: 51.18685337274134 - type: nauc_mrr_at_3_max value: 46.7286365021621 - type: nauc_mrr_at_3_std value: -16.708451287313718 - type: nauc_mrr_at_5_diff1 value: 50.46777237893576 - type: nauc_mrr_at_5_max value: 46.5352076502249 - type: nauc_mrr_at_5_std value: -16.557413659905034 - type: nauc_ndcg_at_1000_diff1 value: 43.974299434438066 - type: nauc_ndcg_at_1000_max value: 43.44628675071857 - type: nauc_ndcg_at_1000_std value: -15.3495102005021 - type: nauc_ndcg_at_100_diff1 value: 43.336365081508504 - type: nauc_ndcg_at_100_max value: 43.11345604460776 - type: nauc_ndcg_at_100_std value: -15.571128070860615 - type: nauc_ndcg_at_10_diff1 value: 43.41266214720136 - type: nauc_ndcg_at_10_max value: 41.519676787851914 - type: nauc_ndcg_at_10_std value: -19.217175017223568 - type: nauc_ndcg_at_1_diff1 value: 55.692503735317445 - type: nauc_ndcg_at_1_max value: 47.334834529801014 - type: nauc_ndcg_at_1_std value: -16.985483585693512 - type: nauc_ndcg_at_20_diff1 value: 43.351653862834496 - type: nauc_ndcg_at_20_max value: 42.11608469750499 - type: nauc_ndcg_at_20_std value: -18.485363540641664 - type: nauc_ndcg_at_3_diff1 value: 45.64193888236677 - type: nauc_ndcg_at_3_max value: 42.497135099009995 - type: nauc_ndcg_at_3_std value: -18.764012041130094 - type: nauc_ndcg_at_5_diff1 value: 44.523392133895186 - type: nauc_ndcg_at_5_max value: 41.564242030096345 - type: nauc_ndcg_at_5_std value: -19.31080790984941 - type: nauc_precision_at_1000_diff1 value: 6.383464615714393 - type: nauc_precision_at_1000_max value: 27.439930931284657 - type: nauc_precision_at_1000_std value: 19.070716188143034 - type: nauc_precision_at_100_diff1 value: 12.599136754501284 - type: nauc_precision_at_100_max value: 35.886310962337795 - type: nauc_precision_at_100_std value: 14.06587592659196 - type: nauc_precision_at_10_diff1 value: 25.388891173150206 - type: nauc_precision_at_10_max value: 46.10269270777384 - type: nauc_precision_at_10_std value: -5.993803607158499 - type: nauc_precision_at_1_diff1 value: 55.692503735317445 - type: nauc_precision_at_1_max value: 47.334834529801014 - type: nauc_precision_at_1_std value: -16.985483585693512 - type: nauc_precision_at_20_diff1 value: 20.984013463099707 - type: nauc_precision_at_20_max value: 42.9471854616888 - type: nauc_precision_at_20_std value: -0.8045549929346024 - type: nauc_precision_at_3_diff1 value: 36.191850547148356 - type: nauc_precision_at_3_max value: 48.09923832376049 - type: nauc_precision_at_3_std value: -13.159407051271321 - type: nauc_precision_at_5_diff1 value: 31.04967966700407 - type: nauc_precision_at_5_max value: 47.62867673349624 - type: nauc_precision_at_5_std value: -10.345790325137353 - type: nauc_recall_at_1000_diff1 value: 11.03436839065707 - type: nauc_recall_at_1000_max value: 42.32265076651575 - type: nauc_recall_at_1000_std value: 30.478521053399206 - type: nauc_recall_at_100_diff1 value: 24.788349084510806 - type: nauc_recall_at_100_max value: 36.72097184821956 - type: nauc_recall_at_100_std value: -0.2241144179522076 - type: nauc_recall_at_10_diff1 value: 31.613053567704885 - type: nauc_recall_at_10_max value: 34.4597322828833 - type: nauc_recall_at_10_std value: -18.00022912690819 - type: nauc_recall_at_1_diff1 value: 
50.82120107004449 - type: nauc_recall_at_1_max value: 33.208851367861584 - type: nauc_recall_at_1_std value: -20.29409730258174 - type: nauc_recall_at_20_diff1 value: 30.277002670708384 - type: nauc_recall_at_20_max value: 35.212475675060375 - type: nauc_recall_at_20_std value: -15.822788854733687 - type: nauc_recall_at_3_diff1 value: 38.87844958322257 - type: nauc_recall_at_3_max value: 34.66914910044104 - type: nauc_recall_at_3_std value: -20.234707300209127 - type: nauc_recall_at_5_diff1 value: 35.551139991687776 - type: nauc_recall_at_5_max value: 34.61009958820695 - type: nauc_recall_at_5_std value: -19.519180149293444 - type: ndcg_at_1 value: 31.233 - type: ndcg_at_10 value: 35.927 - type: ndcg_at_100 value: 43.037 - type: ndcg_at_1000 value: 45.900999999999996 - type: ndcg_at_20 value: 38.39 - type: ndcg_at_3 value: 31.366 - type: ndcg_at_5 value: 33.108 - type: precision_at_1 value: 31.233 - type: precision_at_10 value: 8.15 - type: precision_at_100 value: 1.402 - type: precision_at_1000 value: 0.17700000000000002 - type: precision_at_20 value: 4.91 - type: precision_at_3 value: 17.871000000000002 - type: precision_at_5 value: 12.948 - type: recall_at_1 value: 20.144000000000002 - type: recall_at_10 value: 44.985 - type: recall_at_100 value: 74.866 - type: recall_at_1000 value: 94.477 - type: recall_at_20 value: 53.37 - type: recall_at_3 value: 31.141000000000002 - type: recall_at_5 value: 36.721 - task: type: PairClassification dataset: name: MTEB Cmnli (default) type: C-MTEB/CMNLI config: default split: validation revision: None metrics: - type: cos_sim_accuracy value: 71.25676488274203 - type: cos_sim_accuracy_threshold value: 78.11152935028076 - type: cos_sim_ap value: 79.10444825556077 - type: cos_sim_f1 value: 74.10750923266312 - type: cos_sim_f1_threshold value: 75.2312421798706 - type: cos_sim_precision value: 66.02083714129044 - type: cos_sim_recall value: 84.45171849427169 - type: dot_accuracy value: 68.11785929043896 - type: dot_accuracy_threshold value: 34783.23974609375 - type: dot_ap value: 75.80201827987712 - type: dot_f1 value: 72.31670990679349 - type: dot_f1_threshold value: 31978.036499023438 - type: dot_precision value: 61.386623164763456 - type: dot_recall value: 87.98223053542202 - type: euclidean_accuracy value: 71.41310883944678 - type: euclidean_accuracy_threshold value: 1374.9353408813477 - type: euclidean_ap value: 79.23359768836457 - type: euclidean_f1 value: 74.38512297540491 - type: euclidean_f1_threshold value: 1512.6035690307617 - type: euclidean_precision value: 64.97816593886463 - type: euclidean_recall value: 86.97685293429974 - type: manhattan_accuracy value: 71.32892363199038 - type: manhattan_accuracy_threshold value: 33340.49072265625 - type: manhattan_ap value: 79.11973684118587 - type: manhattan_f1 value: 74.29401993355481 - type: manhattan_f1_threshold value: 36012.52746582031 - type: manhattan_precision value: 66.81605975723622 - type: manhattan_recall value: 83.65676876315175 - type: max_accuracy value: 71.41310883944678 - type: max_ap value: 79.23359768836457 - type: max_f1 value: 74.38512297540491 - task: type: Retrieval dataset: name: MTEB CovidRetrieval (default) type: C-MTEB/CovidRetrieval config: default split: dev revision: 1271c7809071a13532e05f25fb53511ffce77117 metrics: - type: main_score value: 78.917 - type: map_at_1 value: 67.281 - type: map_at_10 value: 75.262 - type: map_at_100 value: 75.60900000000001 - type: map_at_1000 value: 75.618 - type: map_at_20 value: 75.50200000000001 - type: map_at_3 value: 73.455 - type: 
map_at_5 value: 74.657 - type: mrr_at_1 value: 67.43940990516333 - type: mrr_at_10 value: 75.27367989696756 - type: mrr_at_100 value: 75.62029353306437 - type: mrr_at_1000 value: 75.62934741874726 - type: mrr_at_20 value: 75.51356607409173 - type: mrr_at_3 value: 73.5159817351598 - type: mrr_at_5 value: 74.73832103969093 - type: nauc_map_at_1000_diff1 value: 77.26666391867634 - type: nauc_map_at_1000_max value: 49.928541012203496 - type: nauc_map_at_1000_std value: -40.494469470474456 - type: nauc_map_at_100_diff1 value: 77.26087423162396 - type: nauc_map_at_100_max value: 49.944275615664424 - type: nauc_map_at_100_std value: -40.48299992715398 - type: nauc_map_at_10_diff1 value: 76.97400113500906 - type: nauc_map_at_10_max value: 49.84177029115674 - type: nauc_map_at_10_std value: -40.829250876511445 - type: nauc_map_at_1_diff1 value: 81.44050620630395 - type: nauc_map_at_1_max value: 48.97711944070578 - type: nauc_map_at_1_std value: -38.963689457570254 - type: nauc_map_at_20_diff1 value: 77.21791353089375 - type: nauc_map_at_20_max value: 49.958206759079424 - type: nauc_map_at_20_std value: -40.53067571658996 - type: nauc_map_at_3_diff1 value: 77.3555925208868 - type: nauc_map_at_3_max value: 49.32158146451256 - type: nauc_map_at_3_std value: -41.93552426981978 - type: nauc_map_at_5_diff1 value: 77.07099950431504 - type: nauc_map_at_5_max value: 49.54190504495002 - type: nauc_map_at_5_std value: -41.814968130918096 - type: nauc_mrr_at_1000_diff1 value: 77.31388774540477 - type: nauc_mrr_at_1000_max value: 49.96779699175759 - type: nauc_mrr_at_1000_std value: -40.43739645160277 - type: nauc_mrr_at_100_diff1 value: 77.30817786449413 - type: nauc_mrr_at_100_max value: 49.982514428937655 - type: nauc_mrr_at_100_std value: -40.42876582797744 - type: nauc_mrr_at_10_diff1 value: 77.02048060465756 - type: nauc_mrr_at_10_max value: 49.87937207270602 - type: nauc_mrr_at_10_std value: -40.77596560333177 - type: nauc_mrr_at_1_diff1 value: 81.27219599516599 - type: nauc_mrr_at_1_max value: 49.3083394026327 - type: nauc_mrr_at_1_std value: -38.31023037552026 - type: nauc_mrr_at_20_diff1 value: 77.26497089316055 - type: nauc_mrr_at_20_max value: 49.996257597621415 - type: nauc_mrr_at_20_std value: -40.476723608868014 - type: nauc_mrr_at_3_diff1 value: 77.38971294099257 - type: nauc_mrr_at_3_max value: 49.38110328987404 - type: nauc_mrr_at_3_std value: -41.7118646715979 - type: nauc_mrr_at_5_diff1 value: 77.08286142519952 - type: nauc_mrr_at_5_max value: 49.655249374588685 - type: nauc_mrr_at_5_std value: -41.48173039989406 - type: nauc_ndcg_at_1000_diff1 value: 76.47399204021758 - type: nauc_ndcg_at_1000_max value: 50.55770139961048 - type: nauc_ndcg_at_1000_std value: -39.55650430279072 - type: nauc_ndcg_at_100_diff1 value: 76.29355616618253 - type: nauc_ndcg_at_100_max value: 51.003608112592936 - type: nauc_ndcg_at_100_std value: -39.24769744605206 - type: nauc_ndcg_at_10_diff1 value: 74.88697528447634 - type: nauc_ndcg_at_10_max value: 50.398416372815234 - type: nauc_ndcg_at_10_std value: -40.76526585772833 - type: nauc_ndcg_at_1_diff1 value: 81.27219599516599 - type: nauc_ndcg_at_1_max value: 49.3083394026327 - type: nauc_ndcg_at_1_std value: -38.31023037552026 - type: nauc_ndcg_at_20_diff1 value: 75.85463512091866 - type: nauc_ndcg_at_20_max value: 50.97338683654334 - type: nauc_ndcg_at_20_std value: -39.353128774903404 - type: nauc_ndcg_at_3_diff1 value: 75.94015726123543 - type: nauc_ndcg_at_3_max value: 49.22194251063148 - type: nauc_ndcg_at_3_std value: -43.040457030630435 - type: 
nauc_ndcg_at_5_diff1 value: 75.19166189770303 - type: nauc_ndcg_at_5_max value: 49.65696229797189 - type: nauc_ndcg_at_5_std value: -42.81534909184424 - type: nauc_precision_at_1000_diff1 value: -14.830901395815788 - type: nauc_precision_at_1000_max value: 19.686297136854623 - type: nauc_precision_at_1000_std value: 61.19310360166978 - type: nauc_precision_at_100_diff1 value: 20.55469986751769 - type: nauc_precision_at_100_max value: 50.78431835075583 - type: nauc_precision_at_100_std value: 31.54986568374813 - type: nauc_precision_at_10_diff1 value: 45.991938532558656 - type: nauc_precision_at_10_max value: 46.386318595630385 - type: nauc_precision_at_10_std value: -23.463011435224608 - type: nauc_precision_at_1_diff1 value: 81.27219599516599 - type: nauc_precision_at_1_max value: 49.3083394026327 - type: nauc_precision_at_1_std value: -38.31023037552026 - type: nauc_precision_at_20_diff1 value: 41.53180472410822 - type: nauc_precision_at_20_max value: 49.89800247204318 - type: nauc_precision_at_20_std value: -2.4192847331537095 - type: nauc_precision_at_3_diff1 value: 67.37504651209993 - type: nauc_precision_at_3_max value: 47.893537208629496 - type: nauc_precision_at_3_std value: -43.2362212382819 - type: nauc_precision_at_5_diff1 value: 60.03438883791718 - type: nauc_precision_at_5_max value: 48.29770502354206 - type: nauc_precision_at_5_std value: -40.39588448271546 - type: nauc_recall_at_1000_diff1 value: 71.04741174480844 - type: nauc_recall_at_1000_max value: 93.19056506596002 - type: nauc_recall_at_1000_std value: 62.96994797650912 - type: nauc_recall_at_100_diff1 value: 65.00418176852641 - type: nauc_recall_at_100_max value: 85.27352708427193 - type: nauc_recall_at_100_std value: 2.8812005546518886 - type: nauc_recall_at_10_diff1 value: 61.263254794998865 - type: nauc_recall_at_10_max value: 54.17618329507141 - type: nauc_recall_at_10_std value: -39.80603966142593 - type: nauc_recall_at_1_diff1 value: 81.44050620630395 - type: nauc_recall_at_1_max value: 48.97711944070578 - type: nauc_recall_at_1_std value: -38.963689457570254 - type: nauc_recall_at_20_diff1 value: 64.42106091745396 - type: nauc_recall_at_20_max value: 63.10796640821887 - type: nauc_recall_at_20_std value: -22.60117424572222 - type: nauc_recall_at_3_diff1 value: 70.66311436592945 - type: nauc_recall_at_3_max value: 48.69498944323469 - type: nauc_recall_at_3_std value: -47.37847524874532 - type: nauc_recall_at_5_diff1 value: 66.12701111728848 - type: nauc_recall_at_5_max value: 49.91763957934711 - type: nauc_recall_at_5_std value: -48.173252920584126 - type: ndcg_at_1 value: 67.43900000000001 - type: ndcg_at_10 value: 78.917 - type: ndcg_at_100 value: 80.53399999999999 - type: ndcg_at_1000 value: 80.768 - type: ndcg_at_20 value: 79.813 - type: ndcg_at_3 value: 75.37 - type: ndcg_at_5 value: 77.551 - type: precision_at_1 value: 67.43900000000001 - type: precision_at_10 value: 9.115 - type: precision_at_100 value: 0.985 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.737 - type: precision_at_3 value: 27.081 - type: precision_at_5 value: 17.345 - type: recall_at_1 value: 67.281 - type: recall_at_10 value: 90.2 - type: recall_at_100 value: 97.576 - type: recall_at_1000 value: 99.368 - type: recall_at_20 value: 93.783 - type: recall_at_3 value: 80.822 - type: recall_at_5 value: 86.091 - task: type: Retrieval dataset: name: MTEB DBPedia (default) type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 9.041 - type: map_at_10 
value: 18.662 - type: map_at_100 value: 26.054 - type: map_at_1000 value: 27.769 - type: map_at_20 value: 21.499 - type: map_at_3 value: 13.628000000000002 - type: map_at_5 value: 15.617 - type: mrr_at_1 value: 67.25 - type: mrr_at_10 value: 74.673 - type: mrr_at_100 value: 75.022 - type: mrr_at_1000 value: 75.031 - type: mrr_at_20 value: 74.895 - type: mrr_at_3 value: 73.042 - type: mrr_at_5 value: 74.179 - type: ndcg_at_1 value: 55.75 - type: ndcg_at_10 value: 41.004000000000005 - type: ndcg_at_100 value: 44.912 - type: ndcg_at_1000 value: 51.946000000000005 - type: ndcg_at_20 value: 40.195 - type: ndcg_at_3 value: 45.803 - type: ndcg_at_5 value: 42.976 - type: precision_at_1 value: 67.25 - type: precision_at_10 value: 31.874999999999996 - type: precision_at_100 value: 10.37 - type: precision_at_1000 value: 2.1430000000000002 - type: precision_at_20 value: 24.275 - type: precision_at_3 value: 48.417 - type: precision_at_5 value: 40.2 - type: recall_at_1 value: 9.041 - type: recall_at_10 value: 23.592 - type: recall_at_100 value: 49.476 - type: recall_at_1000 value: 71.677 - type: recall_at_20 value: 30.153000000000002 - type: recall_at_3 value: 14.777000000000001 - type: recall_at_5 value: 17.829 - type: main_score value: 41.004000000000005 - task: type: Retrieval dataset: name: MTEB DuRetrieval (default) type: C-MTEB/DuRetrieval config: default split: dev revision: a1a333e290fe30b10f3f56498e3a0d911a693ced metrics: - type: main_score value: 83.134 - type: map_at_1 value: 23.907999999999998 - type: map_at_10 value: 74.566 - type: map_at_100 value: 77.706 - type: map_at_1000 value: 77.762 - type: map_at_20 value: 76.943 - type: map_at_3 value: 50.971999999999994 - type: map_at_5 value: 64.429 - type: mrr_at_1 value: 84.8 - type: mrr_at_10 value: 89.73218253968246 - type: mrr_at_100 value: 89.82853630655774 - type: mrr_at_1000 value: 89.83170411703153 - type: mrr_at_20 value: 89.79582030091501 - type: mrr_at_3 value: 89.32499999999992 - type: mrr_at_5 value: 89.58749999999992 - type: nauc_map_at_1000_diff1 value: -2.2736020650163717 - type: nauc_map_at_1000_max value: 45.3937519555142 - type: nauc_map_at_1000_std value: 10.824778228268581 - type: nauc_map_at_100_diff1 value: -2.2662939752750066 - type: nauc_map_at_100_max value: 45.423960626031366 - type: nauc_map_at_100_std value: 10.804239351738717 - type: nauc_map_at_10_diff1 value: 0.9395752585654343 - type: nauc_map_at_10_max value: 42.53814836940551 - type: nauc_map_at_10_std value: 0.7199313235265218 - type: nauc_map_at_1_diff1 value: 45.19415865267676 - type: nauc_map_at_1_max value: -1.7261947382471912 - type: nauc_map_at_1_std value: -32.16144291613605 - type: nauc_map_at_20_diff1 value: -1.884514152147472 - type: nauc_map_at_20_max value: 44.830401115927174 - type: nauc_map_at_20_std value: 8.118530414377219 - type: nauc_map_at_3_diff1 value: 25.678881127059967 - type: nauc_map_at_3_max value: 12.191400431839758 - type: nauc_map_at_3_std value: -27.201740587642327 - type: nauc_map_at_5_diff1 value: 13.227128780829572 - type: nauc_map_at_5_max value: 26.978282739708977 - type: nauc_map_at_5_std value: -17.555610348070584 - type: nauc_mrr_at_1000_diff1 value: 21.073512437502178 - type: nauc_mrr_at_1000_max value: 64.9680257861005 - type: nauc_mrr_at_1000_std value: 19.626288754404293 - type: nauc_mrr_at_100_diff1 value: 21.074637426957732 - type: nauc_mrr_at_100_max value: 64.97612675661915 - type: nauc_mrr_at_100_std value: 19.649504127800878 - type: nauc_mrr_at_10_diff1 value: 21.12003267626651 - type: nauc_mrr_at_10_max value: 
65.24362289059766 - type: nauc_mrr_at_10_std value: 19.92351276180984 - type: nauc_mrr_at_1_diff1 value: 22.711430629147635 - type: nauc_mrr_at_1_max value: 58.4059429497403 - type: nauc_mrr_at_1_std value: 11.967886722567973 - type: nauc_mrr_at_20_diff1 value: 20.98220830510272 - type: nauc_mrr_at_20_max value: 65.05737535197835 - type: nauc_mrr_at_20_std value: 19.66672900782771 - type: nauc_mrr_at_3_diff1 value: 20.924796220048528 - type: nauc_mrr_at_3_max value: 65.71388669932584 - type: nauc_mrr_at_3_std value: 20.05912197134477 - type: nauc_mrr_at_5_diff1 value: 20.61978649468208 - type: nauc_mrr_at_5_max value: 65.50709154526211 - type: nauc_mrr_at_5_std value: 20.241434276181838 - type: nauc_ndcg_at_1000_diff1 value: 0.25363171946133656 - type: nauc_ndcg_at_1000_max value: 54.12840465309885 - type: nauc_ndcg_at_1000_std value: 20.749184325412546 - type: nauc_ndcg_at_100_diff1 value: 0.15649430250272792 - type: nauc_ndcg_at_100_max value: 54.47995322413234 - type: nauc_ndcg_at_100_std value: 21.266786634233267 - type: nauc_ndcg_at_10_diff1 value: 0.14579250840386346 - type: nauc_ndcg_at_10_max value: 49.8643037948353 - type: nauc_ndcg_at_10_std value: 12.960701643914216 - type: nauc_ndcg_at_1_diff1 value: 22.711430629147635 - type: nauc_ndcg_at_1_max value: 58.4059429497403 - type: nauc_ndcg_at_1_std value: 11.967886722567973 - type: nauc_ndcg_at_20_diff1 value: -0.6701559981776763 - type: nauc_ndcg_at_20_max value: 52.95443437012488 - type: nauc_ndcg_at_20_std value: 16.708883972005758 - type: nauc_ndcg_at_3_diff1 value: -0.19084922341962388 - type: nauc_ndcg_at_3_max value: 46.2110230886874 - type: nauc_ndcg_at_3_std value: 13.363250229683038 - type: nauc_ndcg_at_5_diff1 value: 0.9840019268192548 - type: nauc_ndcg_at_5_max value: 43.56594891798146 - type: nauc_ndcg_at_5_std value: 8.577017104088146 - type: nauc_precision_at_1000_diff1 value: -30.779179091501145 - type: nauc_precision_at_1000_max value: 16.056094258615673 - type: nauc_precision_at_1000_std value: 49.96303902363283 - type: nauc_precision_at_100_diff1 value: -31.583236638899585 - type: nauc_precision_at_100_max value: 19.16571713603373 - type: nauc_precision_at_100_std value: 51.870647903980036 - type: nauc_precision_at_10_diff1 value: -35.62134572732597 - type: nauc_precision_at_10_max value: 31.6935186494612 - type: nauc_precision_at_10_std value: 46.68659723766723 - type: nauc_precision_at_1_diff1 value: 22.711430629147635 - type: nauc_precision_at_1_max value: 58.4059429497403 - type: nauc_precision_at_1_std value: 11.967886722567973 - type: nauc_precision_at_20_diff1 value: -33.875460046920495 - type: nauc_precision_at_20_max value: 24.188420133566442 - type: nauc_precision_at_20_std value: 50.02387762958483 - type: nauc_precision_at_3_diff1 value: -28.875998450906827 - type: nauc_precision_at_3_max value: 44.77058831167941 - type: nauc_precision_at_3_std value: 31.77993710437207 - type: nauc_precision_at_5_diff1 value: -34.92525440306491 - type: nauc_precision_at_5_max value: 39.855219917077086 - type: nauc_precision_at_5_std value: 37.95432046169299 - type: nauc_recall_at_1000_diff1 value: -14.293309371874733 - type: nauc_recall_at_1000_max value: 59.06948692482579 - type: nauc_recall_at_1000_std value: 62.586254868312686 - type: nauc_recall_at_100_diff1 value: -4.344100947212704 - type: nauc_recall_at_100_max value: 58.42120421043602 - type: nauc_recall_at_100_std value: 46.48562009316997 - type: nauc_recall_at_10_diff1 value: 0.04948662912161709 - type: nauc_recall_at_10_max value: 42.42809687119093 - type: 
nauc_recall_at_10_std value: 0.6892504250411409 - type: nauc_recall_at_1_diff1 value: 45.19415865267676 - type: nauc_recall_at_1_max value: -1.7261947382471912 - type: nauc_recall_at_1_std value: -32.16144291613605 - type: nauc_recall_at_20_diff1 value: -7.634587864605111 - type: nauc_recall_at_20_max value: 49.21327187174134 - type: nauc_recall_at_20_std value: 16.408481068336346 - type: nauc_recall_at_3_diff1 value: 24.72546591038644 - type: nauc_recall_at_3_max value: 6.620763400972902 - type: nauc_recall_at_3_std value: -29.994703323331684 - type: nauc_recall_at_5_diff1 value: 12.65527364845842 - type: nauc_recall_at_5_max value: 20.400121385794694 - type: nauc_recall_at_5_std value: -22.34284568447213 - type: ndcg_at_1 value: 84.8 - type: ndcg_at_10 value: 83.134 - type: ndcg_at_100 value: 86.628 - type: ndcg_at_1000 value: 87.151 - type: ndcg_at_20 value: 85.092 - type: ndcg_at_3 value: 81.228 - type: ndcg_at_5 value: 80.2 - type: precision_at_1 value: 84.8 - type: precision_at_10 value: 40.394999999999996 - type: precision_at_100 value: 4.745 - type: precision_at_1000 value: 0.488 - type: precision_at_20 value: 22.245 - type: precision_at_3 value: 73.25 - type: precision_at_5 value: 61.86000000000001 - type: recall_at_1 value: 23.907999999999998 - type: recall_at_10 value: 85.346 - type: recall_at_100 value: 96.515 - type: recall_at_1000 value: 99.156 - type: recall_at_20 value: 91.377 - type: recall_at_3 value: 54.135 - type: recall_at_5 value: 70.488 - task: type: Retrieval dataset: name: MTEB EcomRetrieval (default) type: C-MTEB/EcomRetrieval config: default split: dev revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9 metrics: - type: main_score value: 60.887 - type: map_at_1 value: 46.6 - type: map_at_10 value: 56.035000000000004 - type: map_at_100 value: 56.741 - type: map_at_1000 value: 56.764 - type: map_at_20 value: 56.513999999999996 - type: map_at_3 value: 53.733 - type: map_at_5 value: 54.913000000000004 - type: mrr_at_1 value: 46.6 - type: mrr_at_10 value: 56.034523809523776 - type: mrr_at_100 value: 56.74056360434383 - type: mrr_at_1000 value: 56.76373487222486 - type: mrr_at_20 value: 56.51374873879128 - type: mrr_at_3 value: 53.73333333333328 - type: mrr_at_5 value: 54.91333333333327 - type: nauc_map_at_1000_diff1 value: 65.13546939953387 - type: nauc_map_at_1000_max value: 43.358890946774494 - type: nauc_map_at_1000_std value: -9.973282105235036 - type: nauc_map_at_100_diff1 value: 65.12449309472493 - type: nauc_map_at_100_max value: 43.377100882923145 - type: nauc_map_at_100_std value: -9.971781228240555 - type: nauc_map_at_10_diff1 value: 64.83020018537475 - type: nauc_map_at_10_max value: 43.25969482323034 - type: nauc_map_at_10_std value: -10.120272176001547 - type: nauc_map_at_1_diff1 value: 69.58727592100516 - type: nauc_map_at_1_max value: 38.236494689522026 - type: nauc_map_at_1_std value: -14.833390831689597 - type: nauc_map_at_20_diff1 value: 65.01159809914586 - type: nauc_map_at_20_max value: 43.33440319829618 - type: nauc_map_at_20_std value: -10.039958228659726 - type: nauc_map_at_3_diff1 value: 65.2396323885909 - type: nauc_map_at_3_max value: 42.26904017378952 - type: nauc_map_at_3_std value: -11.793017036934044 - type: nauc_map_at_5_diff1 value: 64.96397227898036 - type: nauc_map_at_5_max value: 43.231333789145424 - type: nauc_map_at_5_std value: -10.349933732151372 - type: nauc_mrr_at_1000_diff1 value: 65.13546939953387 - type: nauc_mrr_at_1000_max value: 43.358890946774494 - type: nauc_mrr_at_1000_std value: -9.973282105235036 - type: 
nauc_mrr_at_100_diff1 value: 65.12449309472493 - type: nauc_mrr_at_100_max value: 43.377100882923145 - type: nauc_mrr_at_100_std value: -9.971781228240555 - type: nauc_mrr_at_10_diff1 value: 64.83020018537475 - type: nauc_mrr_at_10_max value: 43.25969482323034 - type: nauc_mrr_at_10_std value: -10.120272176001547 - type: nauc_mrr_at_1_diff1 value: 69.58727592100516 - type: nauc_mrr_at_1_max value: 38.236494689522026 - type: nauc_mrr_at_1_std value: -14.833390831689597 - type: nauc_mrr_at_20_diff1 value: 65.01159809914586 - type: nauc_mrr_at_20_max value: 43.33440319829618 - type: nauc_mrr_at_20_std value: -10.039958228659726 - type: nauc_mrr_at_3_diff1 value: 65.2396323885909 - type: nauc_mrr_at_3_max value: 42.26904017378952 - type: nauc_mrr_at_3_std value: -11.793017036934044 - type: nauc_mrr_at_5_diff1 value: 64.96397227898036 - type: nauc_mrr_at_5_max value: 43.231333789145424 - type: nauc_mrr_at_5_std value: -10.349933732151372 - type: nauc_ndcg_at_1000_diff1 value: 64.26802655199876 - type: nauc_ndcg_at_1000_max value: 45.854310744745185 - type: nauc_ndcg_at_1000_std value: -6.184417305204082 - type: nauc_ndcg_at_100_diff1 value: 63.99268329609827 - type: nauc_ndcg_at_100_max value: 46.31270128748375 - type: nauc_ndcg_at_100_std value: -6.1393433180558965 - type: nauc_ndcg_at_10_diff1 value: 62.6735104141137 - type: nauc_ndcg_at_10_max value: 45.54954799462398 - type: nauc_ndcg_at_10_std value: -7.348851199024871 - type: nauc_ndcg_at_1_diff1 value: 69.58727592100516 - type: nauc_ndcg_at_1_max value: 38.236494689522026 - type: nauc_ndcg_at_1_std value: -14.833390831689597 - type: nauc_ndcg_at_20_diff1 value: 63.25899651677274 - type: nauc_ndcg_at_20_max value: 45.952196968886014 - type: nauc_ndcg_at_20_std value: -6.807607465125713 - type: nauc_ndcg_at_3_diff1 value: 63.65618337476822 - type: nauc_ndcg_at_3_max value: 43.507890965228945 - type: nauc_ndcg_at_3_std value: -10.73845622217601 - type: nauc_ndcg_at_5_diff1 value: 63.079162432921855 - type: nauc_ndcg_at_5_max value: 45.38303443868148 - type: nauc_ndcg_at_5_std value: -8.063657824835534 - type: nauc_precision_at_1000_diff1 value: 63.01459977930557 - type: nauc_precision_at_1000_max value: 92.4253034547151 - type: nauc_precision_at_1000_std value: 84.4845513963158 - type: nauc_precision_at_100_diff1 value: 57.17217119405878 - type: nauc_precision_at_100_max value: 80.70049725316484 - type: nauc_precision_at_100_std value: 41.78392287147403 - type: nauc_precision_at_10_diff1 value: 53.115665404390725 - type: nauc_precision_at_10_max value: 55.73825657341263 - type: nauc_precision_at_10_std value: 5.406226305013257 - type: nauc_precision_at_1_diff1 value: 69.58727592100516 - type: nauc_precision_at_1_max value: 38.236494689522026 - type: nauc_precision_at_1_std value: -14.833390831689597 - type: nauc_precision_at_20_diff1 value: 53.77730697622828 - type: nauc_precision_at_20_max value: 61.88170819253054 - type: nauc_precision_at_20_std value: 13.678730470003856 - type: nauc_precision_at_3_diff1 value: 58.580196992291455 - type: nauc_precision_at_3_max value: 47.404834585376626 - type: nauc_precision_at_3_std value: -7.374978769024051 - type: nauc_precision_at_5_diff1 value: 56.44564652606437 - type: nauc_precision_at_5_max value: 53.08973975162324 - type: nauc_precision_at_5_std value: 0.22762700141423803 - type: nauc_recall_at_1000_diff1 value: 63.01459977930565 - type: nauc_recall_at_1000_max value: 92.42530345471532 - type: nauc_recall_at_1000_std value: 84.48455139631602 - type: nauc_recall_at_100_diff1 value: 
57.17217119405904 - type: nauc_recall_at_100_max value: 80.70049725316468 - type: nauc_recall_at_100_std value: 41.783922871474275 - type: nauc_recall_at_10_diff1 value: 53.11566540439087 - type: nauc_recall_at_10_max value: 55.738256573412656 - type: nauc_recall_at_10_std value: 5.406226305013377 - type: nauc_recall_at_1_diff1 value: 69.58727592100516 - type: nauc_recall_at_1_max value: 38.236494689522026 - type: nauc_recall_at_1_std value: -14.833390831689597 - type: nauc_recall_at_20_diff1 value: 53.77730697622846 - type: nauc_recall_at_20_max value: 61.881708192530525 - type: nauc_recall_at_20_std value: 13.678730470003947 - type: nauc_recall_at_3_diff1 value: 58.5801969922914 - type: nauc_recall_at_3_max value: 47.40483458537654 - type: nauc_recall_at_3_std value: -7.37497876902413 - type: nauc_recall_at_5_diff1 value: 56.445646526064394 - type: nauc_recall_at_5_max value: 53.08973975162332 - type: nauc_recall_at_5_std value: 0.22762700141428024 - type: ndcg_at_1 value: 46.6 - type: ndcg_at_10 value: 60.887 - type: ndcg_at_100 value: 64.18199999999999 - type: ndcg_at_1000 value: 64.726 - type: ndcg_at_20 value: 62.614999999999995 - type: ndcg_at_3 value: 56.038 - type: ndcg_at_5 value: 58.150999999999996 - type: precision_at_1 value: 46.6 - type: precision_at_10 value: 7.630000000000001 - type: precision_at_100 value: 0.914 - type: precision_at_1000 value: 0.096 - type: precision_at_20 value: 4.154999999999999 - type: precision_at_3 value: 20.9 - type: precision_at_5 value: 13.56 - type: recall_at_1 value: 46.6 - type: recall_at_10 value: 76.3 - type: recall_at_100 value: 91.4 - type: recall_at_1000 value: 95.6 - type: recall_at_20 value: 83.1 - type: recall_at_3 value: 62.7 - type: recall_at_5 value: 67.80000000000001 - task: type: Classification dataset: name: MTEB EmotionClassification (default) type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 73.29999999999998 - type: f1 value: 67.71473706580302 - type: f1_weighted value: 74.83537255312045 - type: main_score value: 73.29999999999998 - task: type: Retrieval dataset: name: MTEB FEVER (default) type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 78.371 - type: map_at_10 value: 85.762 - type: map_at_100 value: 85.954 - type: map_at_1000 value: 85.966 - type: map_at_20 value: 85.887 - type: map_at_3 value: 84.854 - type: map_at_5 value: 85.408 - type: mrr_at_1 value: 84.443 - type: mrr_at_10 value: 90.432 - type: mrr_at_100 value: 90.483 - type: mrr_at_1000 value: 90.484 - type: mrr_at_20 value: 90.473 - type: mrr_at_3 value: 89.89399999999999 - type: mrr_at_5 value: 90.244 - type: ndcg_at_1 value: 84.443 - type: ndcg_at_10 value: 89.05499999999999 - type: ndcg_at_100 value: 89.68 - type: ndcg_at_1000 value: 89.87899999999999 - type: ndcg_at_20 value: 89.381 - type: ndcg_at_3 value: 87.73100000000001 - type: ndcg_at_5 value: 88.425 - type: precision_at_1 value: 84.443 - type: precision_at_10 value: 10.520999999999999 - type: precision_at_100 value: 1.103 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_20 value: 5.362 - type: precision_at_3 value: 33.198 - type: precision_at_5 value: 20.441000000000003 - type: recall_at_1 value: 78.371 - type: recall_at_10 value: 94.594 - type: recall_at_100 value: 96.97099999999999 - type: recall_at_1000 value: 98.18 - type: recall_at_20 value: 95.707 - type: recall_at_3 value: 90.853 - type: recall_at_5 value: 
92.74799999999999 - type: main_score value: 89.05499999999999 - task: type: Retrieval dataset: name: MTEB FiQA2018 (default) type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 23.810000000000002 - type: map_at_10 value: 39.051 - type: map_at_100 value: 41.231 - type: map_at_1000 value: 41.376000000000005 - type: map_at_20 value: 40.227000000000004 - type: map_at_3 value: 33.915 - type: map_at_5 value: 36.459 - type: mrr_at_1 value: 48.148 - type: mrr_at_10 value: 55.765 - type: mrr_at_100 value: 56.495 - type: mrr_at_1000 value: 56.525999999999996 - type: mrr_at_20 value: 56.213 - type: mrr_at_3 value: 53.086 - type: mrr_at_5 value: 54.513999999999996 - type: ndcg_at_1 value: 48.148 - type: ndcg_at_10 value: 47.349999999999994 - type: ndcg_at_100 value: 54.61899999999999 - type: ndcg_at_1000 value: 56.830000000000005 - type: ndcg_at_20 value: 50.143 - type: ndcg_at_3 value: 43.108000000000004 - type: ndcg_at_5 value: 44.023 - type: precision_at_1 value: 48.148 - type: precision_at_10 value: 13.441 - type: precision_at_100 value: 2.085 - type: precision_at_1000 value: 0.248 - type: precision_at_20 value: 7.870000000000001 - type: precision_at_3 value: 28.909000000000002 - type: precision_at_5 value: 20.957 - type: recall_at_1 value: 23.810000000000002 - type: recall_at_10 value: 54.303000000000004 - type: recall_at_100 value: 81.363 - type: recall_at_1000 value: 94.391 - type: recall_at_20 value: 63.056999999999995 - type: recall_at_3 value: 38.098 - type: recall_at_5 value: 44.414 - type: main_score value: 47.349999999999994 - task: type: Classification dataset: name: MTEB GeoreviewClassification (default) type: ai-forever/georeview-classification config: default split: test revision: 3765c0d1de6b7d264bc459433c45e5a75513839c metrics: - type: accuracy value: 48.0126953125 - type: f1 value: 47.65764016160488 - type: f1_weighted value: 47.65701659482088 - type: main_score value: 48.0126953125 - task: type: Clustering dataset: name: MTEB GeoreviewClusteringP2P (default) type: ai-forever/georeview-clustering-p2p config: default split: test revision: 97a313c8fc85b47f13f33e7e9a95c1ad888c7fec metrics: - type: main_score value: 73.62357853672266 - type: v_measure value: 73.62357853672266 - type: v_measure_std value: 0.5942247545535766 - task: type: Retrieval dataset: name: MTEB GerDaLIR (default) type: jinaai/ger_da_lir config: default split: test revision: 0bb47f1d73827e96964edb84dfe552f62f4fd5eb metrics: - type: main_score value: 16.227 - type: map_at_1 value: 8.082 - type: map_at_10 value: 12.959999999999999 - type: map_at_100 value: 13.923 - type: map_at_1000 value: 14.030999999999999 - type: map_at_20 value: 13.453000000000001 - type: map_at_3 value: 11.018 - type: map_at_5 value: 12.056000000000001 - type: mrr_at_1 value: 8.993332249146203 - type: mrr_at_10 value: 13.994013092850247 - type: mrr_at_100 value: 14.913737673149308 - type: mrr_at_1000 value: 15.00843809934407 - type: mrr_at_20 value: 14.470268462334007 - type: mrr_at_3 value: 12.000596302921846 - type: mrr_at_5 value: 13.070689000921561 - type: nauc_map_at_1000_diff1 value: 28.559639584013286 - type: nauc_map_at_1000_max value: 25.533800126086714 - type: nauc_map_at_1000_std value: 9.826551026628666 - type: nauc_map_at_100_diff1 value: 28.544724499331696 - type: nauc_map_at_100_max value: 25.46734324526386 - type: nauc_map_at_100_std value: 9.739314481785591 - type: nauc_map_at_10_diff1 value: 28.77447517718118 - type: nauc_map_at_10_max value: 
24.7431615237795 - type: nauc_map_at_10_std value: 8.349878188033646 - type: nauc_map_at_1_diff1 value: 37.405452629895514 - type: nauc_map_at_1_max value: 24.444208978394023 - type: nauc_map_at_1_std value: 4.043820373810528 - type: nauc_map_at_20_diff1 value: 28.69764217789062 - type: nauc_map_at_20_max value: 25.111848355996496 - type: nauc_map_at_20_std value: 9.034829905305918 - type: nauc_map_at_3_diff1 value: 30.89053285076882 - type: nauc_map_at_3_max value: 24.862886115911152 - type: nauc_map_at_3_std value: 6.654260832396586 - type: nauc_map_at_5_diff1 value: 29.230629676604263 - type: nauc_map_at_5_max value: 24.374302288018583 - type: nauc_map_at_5_std value: 7.341846952319046 - type: nauc_mrr_at_1000_diff1 value: 28.086147932781426 - type: nauc_mrr_at_1000_max value: 25.98698528264653 - type: nauc_mrr_at_1000_std value: 9.917554348624545 - type: nauc_mrr_at_100_diff1 value: 28.069163279791336 - type: nauc_mrr_at_100_max value: 25.949440010886804 - type: nauc_mrr_at_100_std value: 9.874340979732578 - type: nauc_mrr_at_10_diff1 value: 28.239920869530046 - type: nauc_mrr_at_10_max value: 25.351271409498576 - type: nauc_mrr_at_10_std value: 8.669862759875162 - type: nauc_mrr_at_1_diff1 value: 35.96543040207856 - type: nauc_mrr_at_1_max value: 25.488936487231967 - type: nauc_mrr_at_1_std value: 4.76439131038345 - type: nauc_mrr_at_20_diff1 value: 28.18865871284607 - type: nauc_mrr_at_20_max value: 25.67121763344746 - type: nauc_mrr_at_20_std value: 9.297910707519472 - type: nauc_mrr_at_3_diff1 value: 30.166714199740717 - type: nauc_mrr_at_3_max value: 25.541792491964877 - type: nauc_mrr_at_3_std value: 7.083090296398472 - type: nauc_mrr_at_5_diff1 value: 28.68475284656478 - type: nauc_mrr_at_5_max value: 24.994071363482835 - type: nauc_mrr_at_5_std value: 7.687507254902365 - type: nauc_ndcg_at_1000_diff1 value: 25.292792613586467 - type: nauc_ndcg_at_1000_max value: 29.211905289377178 - type: nauc_ndcg_at_1000_std value: 18.088867467320355 - type: nauc_ndcg_at_100_diff1 value: 25.026905011089152 - type: nauc_ndcg_at_100_max value: 27.98822281254431 - type: nauc_ndcg_at_100_std value: 16.69456904301902 - type: nauc_ndcg_at_10_diff1 value: 25.972279051109503 - type: nauc_ndcg_at_10_max value: 24.86486482734957 - type: nauc_ndcg_at_10_std value: 10.398605822106353 - type: nauc_ndcg_at_1_diff1 value: 36.134710485184826 - type: nauc_ndcg_at_1_max value: 25.384572790326025 - type: nauc_ndcg_at_1_std value: 4.591863033771824 - type: nauc_ndcg_at_20_diff1 value: 25.850033660205536 - type: nauc_ndcg_at_20_max value: 25.944243193140515 - type: nauc_ndcg_at_20_std value: 12.392409721204892 - type: nauc_ndcg_at_3_diff1 value: 29.1966056380018 - type: nauc_ndcg_at_3_max value: 24.978843156259913 - type: nauc_ndcg_at_3_std value: 7.353914459205087 - type: nauc_ndcg_at_5_diff1 value: 26.795315295756282 - type: nauc_ndcg_at_5_max value: 24.1196789150412 - type: nauc_ndcg_at_5_std value: 8.311970988265172 - type: nauc_precision_at_1000_diff1 value: 9.128270550217984 - type: nauc_precision_at_1000_max value: 35.79286915973607 - type: nauc_precision_at_1000_std value: 39.15669472887154 - type: nauc_precision_at_100_diff1 value: 14.770289799034384 - type: nauc_precision_at_100_max value: 34.58262232264337 - type: nauc_precision_at_100_std value: 34.101148102981384 - type: nauc_precision_at_10_diff1 value: 19.899104673118178 - type: nauc_precision_at_10_max value: 26.636940338985625 - type: nauc_precision_at_10_std value: 15.73871357255849 - type: nauc_precision_at_1_diff1 value: 36.134710485184826 - 
type: nauc_precision_at_1_max value: 25.384572790326025 - type: nauc_precision_at_1_std value: 4.591863033771824 - type: nauc_precision_at_20_diff1 value: 19.423457975148942 - type: nauc_precision_at_20_max value: 29.58123490878582 - type: nauc_precision_at_20_std value: 20.847850110821618 - type: nauc_precision_at_3_diff1 value: 24.986416623492918 - type: nauc_precision_at_3_max value: 25.973548400472975 - type: nauc_precision_at_3_std value: 9.486410455972823 - type: nauc_precision_at_5_diff1 value: 21.237741424923332 - type: nauc_precision_at_5_max value: 24.647141028200164 - type: nauc_precision_at_5_std value: 11.102785032334147 - type: nauc_recall_at_1000_diff1 value: 15.999714888817829 - type: nauc_recall_at_1000_max value: 44.34701908906545 - type: nauc_recall_at_1000_std value: 51.13471291594717 - type: nauc_recall_at_100_diff1 value: 17.401714890483706 - type: nauc_recall_at_100_max value: 33.39042631654808 - type: nauc_recall_at_100_std value: 33.944446168451584 - type: nauc_recall_at_10_diff1 value: 20.30036232399894 - type: nauc_recall_at_10_max value: 24.006718284396786 - type: nauc_recall_at_10_std value: 14.049375108518669 - type: nauc_recall_at_1_diff1 value: 37.405452629895514 - type: nauc_recall_at_1_max value: 24.444208978394023 - type: nauc_recall_at_1_std value: 4.043820373810528 - type: nauc_recall_at_20_diff1 value: 20.23582802609045 - type: nauc_recall_at_20_max value: 26.408063410785243 - type: nauc_recall_at_20_std value: 18.617479515468112 - type: nauc_recall_at_3_diff1 value: 25.53221830103098 - type: nauc_recall_at_3_max value: 24.283712329152678 - type: nauc_recall_at_3_std value: 8.428947805841867 - type: nauc_recall_at_5_diff1 value: 21.741499601020823 - type: nauc_recall_at_5_max value: 22.754924586295296 - type: nauc_recall_at_5_std value: 9.966736688169814 - type: ndcg_at_1 value: 8.977 - type: ndcg_at_10 value: 16.227 - type: ndcg_at_100 value: 21.417 - type: ndcg_at_1000 value: 24.451 - type: ndcg_at_20 value: 17.982 - type: ndcg_at_3 value: 12.206999999999999 - type: ndcg_at_5 value: 14.059 - type: precision_at_1 value: 8.977 - type: precision_at_10 value: 2.933 - type: precision_at_100 value: 0.59 - type: precision_at_1000 value: 0.087 - type: precision_at_20 value: 1.8599999999999999 - type: precision_at_3 value: 5.550999999999999 - type: precision_at_5 value: 4.340999999999999 - type: recall_at_1 value: 8.082 - type: recall_at_10 value: 25.52 - type: recall_at_100 value: 50.32 - type: recall_at_1000 value: 74.021 - type: recall_at_20 value: 32.229 - type: recall_at_3 value: 14.66 - type: recall_at_5 value: 19.062 - task: type: Retrieval dataset: name: MTEB GermanDPR (default) type: deepset/germandpr config: default split: test revision: 5129d02422a66be600ac89cd3e8531b4f97d347d metrics: - type: main_score value: 82.422 - type: map_at_1 value: 64.39 - type: map_at_10 value: 77.273 - type: map_at_100 value: 77.375 - type: map_at_1000 value: 77.376 - type: map_at_20 value: 77.351 - type: map_at_3 value: 75.46300000000001 - type: map_at_5 value: 76.878 - type: mrr_at_1 value: 64.19512195121952 - type: mrr_at_10 value: 77.15842044134736 - type: mrr_at_100 value: 77.2604854308704 - type: mrr_at_1000 value: 77.26087882190109 - type: mrr_at_20 value: 77.23572154560611 - type: mrr_at_3 value: 75.34959349593504 - type: mrr_at_5 value: 76.76422764227652 - type: nauc_map_at_1000_diff1 value: 49.73135253389972 - type: nauc_map_at_1000_max value: 8.665570717396145 - type: nauc_map_at_1000_std value: -25.920927572114522 - type: nauc_map_at_100_diff1 value: 
49.729170775336605 - type: nauc_map_at_100_max value: 8.66717979705074 - type: nauc_map_at_100_std value: -25.918338868918596 - type: nauc_map_at_10_diff1 value: 49.708681691445925 - type: nauc_map_at_10_max value: 8.830640635692113 - type: nauc_map_at_10_std value: -25.843238986304858 - type: nauc_map_at_1_diff1 value: 51.750022350988914 - type: nauc_map_at_1_max value: 3.599863010364626 - type: nauc_map_at_1_std value: -27.670122127567314 - type: nauc_map_at_20_diff1 value: 49.72609185887161 - type: nauc_map_at_20_max value: 8.766556053409218 - type: nauc_map_at_20_std value: -25.85975887517904 - type: nauc_map_at_3_diff1 value: 49.328512536255595 - type: nauc_map_at_3_max value: 9.475682028996795 - type: nauc_map_at_3_std value: -26.277349632171017 - type: nauc_map_at_5_diff1 value: 49.42801822186142 - type: nauc_map_at_5_max value: 8.788822474357252 - type: nauc_map_at_5_std value: -25.959260882028573 - type: nauc_mrr_at_1000_diff1 value: 50.13038598302397 - type: nauc_mrr_at_1000_max value: 8.734338637484832 - type: nauc_mrr_at_1000_std value: -26.653343549855908 - type: nauc_mrr_at_100_diff1 value: 50.12820392111392 - type: nauc_mrr_at_100_max value: 8.735940503917966 - type: nauc_mrr_at_100_std value: -26.65074918231251 - type: nauc_mrr_at_10_diff1 value: 50.10567888458267 - type: nauc_mrr_at_10_max value: 8.898451291748575 - type: nauc_mrr_at_10_std value: -26.572046921975655 - type: nauc_mrr_at_1_diff1 value: 52.22769994409465 - type: nauc_mrr_at_1_max value: 3.6490820146062015 - type: nauc_mrr_at_1_std value: -28.535100562320498 - type: nauc_mrr_at_20_diff1 value: 50.12462222100699 - type: nauc_mrr_at_20_max value: 8.83487018268756 - type: nauc_mrr_at_20_std value: -26.591437036958332 - type: nauc_mrr_at_3_diff1 value: 49.6987353700016 - type: nauc_mrr_at_3_max value: 9.531003760756258 - type: nauc_mrr_at_3_std value: -26.949799063124818 - type: nauc_mrr_at_5_diff1 value: 49.823881656376585 - type: nauc_mrr_at_5_max value: 8.850404667985085 - type: nauc_mrr_at_5_std value: -26.680008966088582 - type: nauc_ndcg_at_1000_diff1 value: 49.41721203361181 - type: nauc_ndcg_at_1000_max value: 9.41093067609825 - type: nauc_ndcg_at_1000_std value: -25.499543637737567 - type: nauc_ndcg_at_100_diff1 value: 49.32810419509252 - type: nauc_ndcg_at_100_max value: 9.476216458766897 - type: nauc_ndcg_at_100_std value: -25.393856250990414 - type: nauc_ndcg_at_10_diff1 value: 49.181984436623694 - type: nauc_ndcg_at_10_max value: 10.65234732763274 - type: nauc_ndcg_at_10_std value: -24.737669349012297 - type: nauc_ndcg_at_1_diff1 value: 51.750022350988914 - type: nauc_ndcg_at_1_max value: 3.599863010364626 - type: nauc_ndcg_at_1_std value: -27.670122127567314 - type: nauc_ndcg_at_20_diff1 value: 49.275394594995056 - type: nauc_ndcg_at_20_max value: 10.402059796651923 - type: nauc_ndcg_at_20_std value: -24.82329915806705 - type: nauc_ndcg_at_3_diff1 value: 48.22614352152889 - type: nauc_ndcg_at_3_max value: 11.67464280791404 - type: nauc_ndcg_at_3_std value: -25.867824868234095 - type: nauc_ndcg_at_5_diff1 value: 48.35583502987241 - type: nauc_ndcg_at_5_max value: 10.494278750448451 - type: nauc_ndcg_at_5_std value: -25.11599634172764 - type: nauc_precision_at_1000_diff1 value: .nan - type: nauc_precision_at_1000_max value: .nan - type: nauc_precision_at_1000_std value: .nan - type: nauc_precision_at_100_diff1 value: -56.39478136433852 - type: nauc_precision_at_100_max value: 86.93518577529493 - type: nauc_precision_at_100_std value: 100.0 - type: nauc_precision_at_10_diff1 value: 38.662829729133094 - 
type: nauc_precision_at_10_max value: 56.38018435740605 - type: nauc_precision_at_10_std value: 6.288091897081105 - type: nauc_precision_at_1_diff1 value: 51.750022350988914 - type: nauc_precision_at_1_max value: 3.599863010364626 - type: nauc_precision_at_1_std value: -27.670122127567314 - type: nauc_precision_at_20_diff1 value: 34.739153182429085 - type: nauc_precision_at_20_max value: 84.86908403000989 - type: nauc_precision_at_20_std value: 29.156199421219455 - type: nauc_precision_at_3_diff1 value: 42.09287362529135 - type: nauc_precision_at_3_max value: 23.629152759287074 - type: nauc_precision_at_3_std value: -23.721376911302492 - type: nauc_precision_at_5_diff1 value: 36.03866171924644 - type: nauc_precision_at_5_max value: 29.166173558775327 - type: nauc_precision_at_5_std value: -15.096374563068448 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: -56.39478136433541 - type: nauc_recall_at_100_max value: 86.93518577528111 - type: nauc_recall_at_100_std value: 100.0 - type: nauc_recall_at_10_diff1 value: 38.66282972913384 - type: nauc_recall_at_10_max value: 56.3801843574071 - type: nauc_recall_at_10_std value: 6.288091897082639 - type: nauc_recall_at_1_diff1 value: 51.750022350988914 - type: nauc_recall_at_1_max value: 3.599863010364626 - type: nauc_recall_at_1_std value: -27.670122127567314 - type: nauc_recall_at_20_diff1 value: 34.7391531824321 - type: nauc_recall_at_20_max value: 84.86908403001016 - type: nauc_recall_at_20_std value: 29.156199421220748 - type: nauc_recall_at_3_diff1 value: 42.09287362529107 - type: nauc_recall_at_3_max value: 23.629152759286946 - type: nauc_recall_at_3_std value: -23.72137691130291 - type: nauc_recall_at_5_diff1 value: 36.0386617192469 - type: nauc_recall_at_5_max value: 29.1661735587759 - type: nauc_recall_at_5_std value: -15.09637456306774 - type: ndcg_at_1 value: 64.39 - type: ndcg_at_10 value: 82.422 - type: ndcg_at_100 value: 82.86099999999999 - type: ndcg_at_1000 value: 82.87299999999999 - type: ndcg_at_20 value: 82.67999999999999 - type: ndcg_at_3 value: 78.967 - type: ndcg_at_5 value: 81.50699999999999 - type: precision_at_1 value: 64.39 - type: precision_at_10 value: 9.795 - type: precision_at_100 value: 0.9990000000000001 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.946 - type: precision_at_3 value: 29.691000000000003 - type: precision_at_5 value: 19.044 - type: recall_at_1 value: 64.39 - type: recall_at_10 value: 97.951 - type: recall_at_100 value: 99.902 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 98.92699999999999 - type: recall_at_3 value: 89.07300000000001 - type: recall_at_5 value: 95.22 - task: type: Retrieval dataset: name: MTEB GermanQuAD-Retrieval (default) type: mteb/germanquad-retrieval config: default split: test revision: f5c87ae5a2e7a5106606314eef45255f03151bb3 metrics: - type: main_score value: 94.15532365396247 - type: map_at_1 value: 90.789 - type: map_at_10 value: 94.24 - type: map_at_100 value: 94.283 - type: map_at_1000 value: 94.284 - type: map_at_20 value: 94.272 - type: map_at_3 value: 93.913 - type: map_at_5 value: 94.155 - type: mrr_at_1 value: 90.78947368421053 - type: mrr_at_10 value: 94.23987411056376 - type: mrr_at_100 value: 94.28320936825 - type: mrr_at_1000 value: 94.28350209115848 - type: mrr_at_20 value: 94.271919092559 - type: mrr_at_3 value: 93.91258318209313 - type: mrr_at_5 value: 94.15532365396247 - type: nauc_map_at_1000_diff1 
value: 89.29089310650436 - type: nauc_map_at_1000_max value: 73.83868784032414 - type: nauc_map_at_1000_std value: -11.635778561889989 - type: nauc_map_at_100_diff1 value: 89.29077225707755 - type: nauc_map_at_100_max value: 73.84002740580378 - type: nauc_map_at_100_std value: -11.644096256165092 - type: nauc_map_at_10_diff1 value: 89.29117612292366 - type: nauc_map_at_10_max value: 73.97487984981221 - type: nauc_map_at_10_std value: -11.35191794373827 - type: nauc_map_at_1_diff1 value: 89.35436544117584 - type: nauc_map_at_1_max value: 70.35936815057701 - type: nauc_map_at_1_std value: -13.598996360976903 - type: nauc_map_at_20_diff1 value: 89.2530394052653 - type: nauc_map_at_20_max value: 73.83537529419839 - type: nauc_map_at_20_std value: -11.628272822028478 - type: nauc_map_at_3_diff1 value: 89.375111893546 - type: nauc_map_at_3_max value: 74.78900366026112 - type: nauc_map_at_3_std value: -12.720905253503274 - type: nauc_map_at_5_diff1 value: 89.35358300820893 - type: nauc_map_at_5_max value: 74.31996219723239 - type: nauc_map_at_5_std value: -10.768642638210867 - type: nauc_mrr_at_1000_diff1 value: 89.29089310650436 - type: nauc_mrr_at_1000_max value: 73.83868784032414 - type: nauc_mrr_at_1000_std value: -11.635778561889989 - type: nauc_mrr_at_100_diff1 value: 89.29077225707755 - type: nauc_mrr_at_100_max value: 73.84002740580378 - type: nauc_mrr_at_100_std value: -11.644096256165092 - type: nauc_mrr_at_10_diff1 value: 89.29117612292366 - type: nauc_mrr_at_10_max value: 73.97487984981221 - type: nauc_mrr_at_10_std value: -11.35191794373827 - type: nauc_mrr_at_1_diff1 value: 89.35436544117584 - type: nauc_mrr_at_1_max value: 70.35936815057701 - type: nauc_mrr_at_1_std value: -13.598996360976903 - type: nauc_mrr_at_20_diff1 value: 89.2530394052653 - type: nauc_mrr_at_20_max value: 73.83537529419839 - type: nauc_mrr_at_20_std value: -11.628272822028478 - type: nauc_mrr_at_3_diff1 value: 89.375111893546 - type: nauc_mrr_at_3_max value: 74.78900366026112 - type: nauc_mrr_at_3_std value: -12.720905253503274 - type: nauc_mrr_at_5_diff1 value: 89.35358300820893 - type: nauc_mrr_at_5_max value: 74.31996219723239 - type: nauc_mrr_at_5_std value: -10.768642638210867 - type: nauc_ndcg_at_1000_diff1 value: 89.27620775856863 - type: nauc_ndcg_at_1000_max value: 74.2985757362615 - type: nauc_ndcg_at_1000_std value: -11.236142819703023 - type: nauc_ndcg_at_100_diff1 value: 89.27284787540731 - type: nauc_ndcg_at_100_max value: 74.33539303365968 - type: nauc_ndcg_at_100_std value: -11.469413615851936 - type: nauc_ndcg_at_10_diff1 value: 89.21496710661724 - type: nauc_ndcg_at_10_max value: 75.02035398490516 - type: nauc_ndcg_at_10_std value: -9.903255803665814 - type: nauc_ndcg_at_1_diff1 value: 89.35436544117584 - type: nauc_ndcg_at_1_max value: 70.35936815057701 - type: nauc_ndcg_at_1_std value: -13.598996360976903 - type: nauc_ndcg_at_20_diff1 value: 89.03561289544179 - type: nauc_ndcg_at_20_max value: 74.4006766600049 - type: nauc_ndcg_at_20_std value: -11.129237862587743 - type: nauc_ndcg_at_3_diff1 value: 89.46540193201693 - type: nauc_ndcg_at_3_max value: 76.87093548368378 - type: nauc_ndcg_at_3_std value: -12.484902872086767 - type: nauc_ndcg_at_5_diff1 value: 89.39924941584766 - type: nauc_ndcg_at_5_max value: 75.96975269092722 - type: nauc_ndcg_at_5_std value: -8.180295581144833 - type: nauc_precision_at_1000_diff1 value: 100.0 - type: nauc_precision_at_1000_max value: 100.0 - type: nauc_precision_at_1000_std value: 100.0 - type: nauc_precision_at_100_diff1 value: 86.93074003795302 - type: 
nauc_precision_at_100_max value: 100.0 - type: nauc_precision_at_100_std value: -174.07785375176616 - type: nauc_precision_at_10_diff1 value: 87.43064119412082 - type: nauc_precision_at_10_max value: 90.60785783417448 - type: nauc_precision_at_10_std value: 15.378710059645906 - type: nauc_precision_at_1_diff1 value: 89.35436544117584 - type: nauc_precision_at_1_max value: 70.35936815057701 - type: nauc_precision_at_1_std value: -13.598996360976903 - type: nauc_precision_at_20_diff1 value: 78.78206037685919 - type: nauc_precision_at_20_max value: 82.52264166455923 - type: nauc_precision_at_20_std value: -5.95806599216658 - type: nauc_precision_at_3_diff1 value: 90.12709256456401 - type: nauc_precision_at_3_max value: 90.72678805838154 - type: nauc_precision_at_3_std value: -11.047599315631993 - type: nauc_precision_at_5_diff1 value: 89.9066873566561 - type: nauc_precision_at_5_max value: 93.51571626543664 - type: nauc_precision_at_5_std value: 22.632403279126162 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 86.93074003793416 - type: nauc_recall_at_100_max value: 100.0 - type: nauc_recall_at_100_std value: -174.07785375175723 - type: nauc_recall_at_10_diff1 value: 87.43064119411991 - type: nauc_recall_at_10_max value: 90.60785783417579 - type: nauc_recall_at_10_std value: 15.378710059643607 - type: nauc_recall_at_1_diff1 value: 89.35436544117584 - type: nauc_recall_at_1_max value: 70.35936815057701 - type: nauc_recall_at_1_std value: -13.598996360976903 - type: nauc_recall_at_20_diff1 value: 78.78206037685645 - type: nauc_recall_at_20_max value: 82.52264166455791 - type: nauc_recall_at_20_std value: -5.958065992168697 - type: nauc_recall_at_3_diff1 value: 90.12709256456463 - type: nauc_recall_at_3_max value: 90.7267880583832 - type: nauc_recall_at_3_std value: -11.047599315631881 - type: nauc_recall_at_5_diff1 value: 89.90668735665676 - type: nauc_recall_at_5_max value: 93.51571626543753 - type: nauc_recall_at_5_std value: 22.632403279126112 - type: ndcg_at_1 value: 90.789 - type: ndcg_at_10 value: 95.46 - type: ndcg_at_100 value: 95.652 - type: ndcg_at_1000 value: 95.659 - type: ndcg_at_20 value: 95.575 - type: ndcg_at_3 value: 94.82000000000001 - type: ndcg_at_5 value: 95.26400000000001 - type: precision_at_1 value: 90.789 - type: precision_at_10 value: 9.908999999999999 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.977 - type: precision_at_3 value: 32.471 - type: precision_at_5 value: 19.701 - type: recall_at_1 value: 90.789 - type: recall_at_10 value: 99.093 - type: recall_at_100 value: 99.955 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 99.546 - type: recall_at_3 value: 97.414 - type: recall_at_5 value: 98.503 - task: type: STS dataset: name: MTEB GermanSTSBenchmark (default) type: jinaai/german-STSbenchmark config: default split: test revision: e36907544d44c3a247898ed81540310442329e20 metrics: - type: cosine_pearson value: 86.55319003300265 - type: cosine_spearman value: 87.50267373081324 - type: euclidean_pearson value: 87.41630636501863 - type: euclidean_spearman value: 88.02170803409365 - type: main_score value: 87.50267373081324 - type: manhattan_pearson value: 87.33703179056744 - type: manhattan_spearman value: 87.99192826922514 - type: pearson value: 86.55319003300265 - type: spearman value: 87.50267373081324 - task: type: Clustering dataset: name: MTEB HALClusteringS2S 
(default) type: lyon-nlp/clustering-hal-s2s config: default split: test revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915 metrics: - type: main_score value: 27.477557517301303 - type: v_measure value: 27.477557517301303 - type: v_measure_std value: 3.3525736581861336 - task: type: Classification dataset: name: MTEB HeadlineClassification (default) type: ai-forever/headline-classification config: default split: test revision: 2fe05ee6b5832cda29f2ef7aaad7b7fe6a3609eb metrics: - type: accuracy value: 75.0830078125 - type: f1 value: 75.08863209267814 - type: f1_weighted value: 75.08895979060917 - type: main_score value: 75.0830078125 - task: type: Retrieval dataset: name: MTEB HotpotQA (default) type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 38.143 - type: map_at_10 value: 55.916999999999994 - type: map_at_100 value: 56.706 - type: map_at_1000 value: 56.77100000000001 - type: map_at_20 value: 56.367 - type: map_at_3 value: 53.111 - type: map_at_5 value: 54.839000000000006 - type: mrr_at_1 value: 76.286 - type: mrr_at_10 value: 81.879 - type: mrr_at_100 value: 82.09100000000001 - type: mrr_at_1000 value: 82.101 - type: mrr_at_20 value: 82.01 - type: mrr_at_3 value: 80.972 - type: mrr_at_5 value: 81.537 - type: ndcg_at_1 value: 76.286 - type: ndcg_at_10 value: 64.673 - type: ndcg_at_100 value: 67.527 - type: ndcg_at_1000 value: 68.857 - type: ndcg_at_20 value: 65.822 - type: ndcg_at_3 value: 60.616 - type: ndcg_at_5 value: 62.827999999999996 - type: precision_at_1 value: 76.286 - type: precision_at_10 value: 13.196 - type: precision_at_100 value: 1.544 - type: precision_at_1000 value: 0.172 - type: precision_at_20 value: 6.968000000000001 - type: precision_at_3 value: 37.992 - type: precision_at_5 value: 24.54 - type: recall_at_1 value: 38.143 - type: recall_at_10 value: 65.982 - type: recall_at_100 value: 77.225 - type: recall_at_1000 value: 86.077 - type: recall_at_20 value: 69.68299999999999 - type: recall_at_3 value: 56.989000000000004 - type: recall_at_5 value: 61.35 - type: main_score value: 64.673 - task: type: Classification dataset: name: MTEB IFlyTek (default) type: C-MTEB/IFlyTek-classification config: default split: validation revision: 421605374b29664c5fc098418fe20ada9bd55f8a metrics: - type: accuracy value: 41.67756829549827 - type: f1 value: 33.929325579581636 - type: f1_weighted value: 43.03952025643197 - type: main_score value: 41.67756829549827 - task: type: Classification dataset: name: MTEB ImdbClassification (default) type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 91.90440000000001 - type: ap value: 88.78663714603425 - type: ap_weighted value: 88.78663714603425 - type: f1 value: 91.89564361975891 - type: f1_weighted value: 91.89564361975891 - type: main_score value: 91.90440000000001 - task: type: Classification dataset: name: MTEB InappropriatenessClassification (default) type: ai-forever/inappropriateness-classification config: default split: test revision: 601651fdc45ef243751676e62dd7a19f491c0285 metrics: - type: accuracy value: 61.0498046875 - type: ap value: 57.04240566648215 - type: ap_weighted value: 57.04240566648215 - type: f1 value: 60.867630038606954 - type: f1_weighted value: 60.867630038606954 - type: main_score value: 61.0498046875 - task: type: Classification dataset: name: MTEB JDReview (default) type: C-MTEB/JDReview-classification config: default split: test revision: 
b7c64bd89eb87f8ded463478346f76731f07bf8b metrics: - type: accuracy value: 83.50844277673546 - type: ap value: 48.46732380712268 - type: ap_weighted value: 48.46732380712268 - type: f1 value: 77.43967451387445 - type: f1_weighted value: 84.78462929014114 - type: main_score value: 83.50844277673546 - task: type: Classification dataset: name: MTEB KinopoiskClassification (default) type: ai-forever/kinopoisk-sentiment-classification config: default split: test revision: 5911f26666ac11af46cb9c6849d0dc80a378af24 metrics: - type: accuracy value: 62.393333333333324 - type: f1 value: 61.35940129568015 - type: f1_weighted value: 61.35940129568015 - type: main_score value: 62.393333333333324 - task: type: STS dataset: name: MTEB LCQMC (default) type: C-MTEB/LCQMC config: default split: test revision: 17f9b096f80380fce5ed12a9be8be7784b337daf metrics: - type: cosine_pearson value: 67.74375505907872 - type: cosine_spearman value: 75.94582231399434 - type: euclidean_pearson value: 74.52501692443582 - type: euclidean_spearman value: 75.88428434746646 - type: main_score value: 75.94582231399434 - type: manhattan_pearson value: 74.55015441749529 - type: manhattan_spearman value: 75.83288262176175 - type: pearson value: 67.74375505907872 - type: spearman value: 75.94582231399434 - task: type: Retrieval dataset: name: MTEB LEMBNarrativeQARetrieval (default) type: dwzhu/LongEmbed config: default split: test revision: 6e346642246bfb4928c560ee08640dc84d074e8c metrics: - type: map_at_1 value: 23.093 - type: map_at_10 value: 30.227999999999998 - type: map_at_100 value: 31.423000000000002 - type: map_at_1000 value: 31.533 - type: map_at_20 value: 30.835 - type: map_at_3 value: 27.983999999999998 - type: map_at_5 value: 29.253 - type: mrr_at_1 value: 23.093 - type: mrr_at_10 value: 30.227999999999998 - type: mrr_at_100 value: 31.423000000000002 - type: mrr_at_1000 value: 31.533 - type: mrr_at_20 value: 30.835 - type: mrr_at_3 value: 27.983999999999998 - type: mrr_at_5 value: 29.253 - type: ndcg_at_1 value: 23.093 - type: ndcg_at_10 value: 34.297 - type: ndcg_at_100 value: 41.049 - type: ndcg_at_1000 value: 43.566 - type: ndcg_at_20 value: 36.52 - type: ndcg_at_3 value: 29.629 - type: ndcg_at_5 value: 31.926 - type: precision_at_1 value: 23.093 - type: precision_at_10 value: 4.735 - type: precision_at_100 value: 0.8109999999999999 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 2.8080000000000003 - type: precision_at_3 value: 11.468 - type: precision_at_5 value: 8.001 - type: recall_at_1 value: 23.093 - type: recall_at_10 value: 47.354 - type: recall_at_100 value: 81.147 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 56.16799999999999 - type: recall_at_3 value: 34.405 - type: recall_at_5 value: 40.004 - type: main_score value: 34.297 - type: map_at_1 value: 24.361 - type: map_at_10 value: 33.641 - type: map_at_100 value: 35.104 - type: map_at_1000 value: 35.127 - type: map_at_20 value: 34.388999999999996 - type: map_at_3 value: 30.255 - type: map_at_5 value: 32.079 - type: mrr_at_1 value: 24.361 - type: mrr_at_10 value: 33.641 - type: mrr_at_100 value: 35.104 - type: mrr_at_1000 value: 35.127 - type: mrr_at_20 value: 34.388999999999996 - type: mrr_at_3 value: 30.255 - type: mrr_at_5 value: 32.079 - type: ndcg_at_1 value: 24.361 - type: ndcg_at_10 value: 39.337 - type: ndcg_at_100 value: 47.384 - type: ndcg_at_1000 value: 47.75 - type: ndcg_at_20 value: 42.077999999999996 - type: ndcg_at_3 value: 32.235 - type: ndcg_at_5 value: 35.524 - type: precision_at_1 value: 24.361 - type: 
precision_at_10 value: 5.783 - type: precision_at_100 value: 0.975 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 3.435 - type: precision_at_3 value: 12.661 - type: precision_at_5 value: 9.193999999999999 - type: recall_at_1 value: 24.361 - type: recall_at_10 value: 57.826 - type: recall_at_100 value: 97.51100000000001 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 68.697 - type: recall_at_3 value: 37.983 - type: recall_at_5 value: 45.972 - type: main_score value: 39.337 - type: map_at_1 value: 53.667 - type: map_at_10 value: 61.719 - type: map_at_100 value: 62.471 - type: map_at_1000 value: 62.492000000000004 - type: map_at_20 value: 62.153000000000006 - type: map_at_3 value: 59.167 - type: map_at_5 value: 60.95 - type: mrr_at_1 value: 53.667 - type: mrr_at_10 value: 61.719 - type: mrr_at_100 value: 62.471 - type: mrr_at_1000 value: 62.492000000000004 - type: mrr_at_20 value: 62.153000000000006 - type: mrr_at_3 value: 59.167 - type: mrr_at_5 value: 60.95 - type: ndcg_at_1 value: 53.667 - type: ndcg_at_10 value: 66.018 - type: ndcg_at_100 value: 69.726 - type: ndcg_at_1000 value: 70.143 - type: ndcg_at_20 value: 67.61399999999999 - type: ndcg_at_3 value: 60.924 - type: ndcg_at_5 value: 64.10900000000001 - type: precision_at_1 value: 53.667 - type: precision_at_10 value: 7.9670000000000005 - type: precision_at_100 value: 0.97 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.3 - type: precision_at_3 value: 22.0 - type: precision_at_5 value: 14.732999999999999 - type: recall_at_1 value: 53.667 - type: recall_at_10 value: 79.667 - type: recall_at_100 value: 97.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 86.0 - type: recall_at_3 value: 66.0 - type: recall_at_5 value: 73.667 - type: main_score value: 66.018 - task: type: Retrieval dataset: name: MTEB LEMBNeedleRetrieval (default) type: dwzhu/LongEmbed config: default split: test_256 revision: 6e346642246bfb4928c560ee08640dc84d074e8c metrics: - type: map_at_1 value: 64.0 - type: map_at_10 value: 77.083 - type: map_at_100 value: 77.265 - type: map_at_1000 value: 77.265 - type: map_at_20 value: 77.265 - type: map_at_3 value: 76.333 - type: map_at_5 value: 76.833 - type: mrr_at_1 value: 64.0 - type: mrr_at_10 value: 77.083 - type: mrr_at_100 value: 77.265 - type: mrr_at_1000 value: 77.265 - type: mrr_at_20 value: 77.265 - type: mrr_at_3 value: 76.333 - type: mrr_at_5 value: 76.833 - type: ndcg_at_1 value: 64.0 - type: ndcg_at_10 value: 82.325 - type: ndcg_at_100 value: 82.883 - type: ndcg_at_1000 value: 82.883 - type: ndcg_at_20 value: 82.883 - type: ndcg_at_3 value: 80.833 - type: ndcg_at_5 value: 81.694 - type: precision_at_1 value: 64.0 - type: precision_at_10 value: 9.8 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 5.0 - type: precision_at_3 value: 31.333 - type: precision_at_5 value: 19.2 - type: recall_at_1 value: 64.0 - type: recall_at_10 value: 98.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 100.0 - type: recall_at_3 value: 94.0 - type: recall_at_5 value: 96.0 - type: main_score value: 64.0 - type: map_at_1 value: 100.0 - type: map_at_10 value: 100.0 - type: map_at_100 value: 100.0 - type: map_at_1000 value: 100.0 - type: map_at_20 value: 100.0 - type: map_at_3 value: 100.0 - type: map_at_5 value: 100.0 - type: mrr_at_1 value: 100.0 - type: mrr_at_10 value: 100.0 - type: mrr_at_100 value: 100.0 - type: mrr_at_1000 value: 100.0 - type: mrr_at_20 
value: 100.0 - type: mrr_at_3 value: 100.0 - type: mrr_at_5 value: 100.0 - type: ndcg_at_1 value: 100.0 - type: ndcg_at_10 value: 100.0 - type: ndcg_at_100 value: 100.0 - type: ndcg_at_1000 value: 100.0 - type: ndcg_at_20 value: 100.0 - type: ndcg_at_3 value: 100.0 - type: ndcg_at_5 value: 100.0 - type: precision_at_1 value: 100.0 - type: precision_at_10 value: 10.0 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 5.0 - type: precision_at_3 value: 33.333 - type: precision_at_5 value: 20.0 - type: recall_at_1 value: 100.0 - type: recall_at_10 value: 100.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 100.0 - type: recall_at_3 value: 100.0 - type: recall_at_5 value: 100.0 - type: main_score value: 100.0 - task: type: Retrieval dataset: name: MTEB LEMBSummScreenFDRetrieval (default) type: dwzhu/LongEmbed config: default split: validation revision: 6e346642246bfb4928c560ee08640dc84d074e8c metrics: - type: map_at_1 value: 84.821 - type: map_at_10 value: 90.11200000000001 - type: map_at_100 value: 90.158 - type: map_at_1000 value: 90.158 - type: map_at_20 value: 90.137 - type: map_at_3 value: 89.385 - type: map_at_5 value: 89.876 - type: mrr_at_1 value: 84.821 - type: mrr_at_10 value: 90.11200000000001 - type: mrr_at_100 value: 90.158 - type: mrr_at_1000 value: 90.158 - type: mrr_at_20 value: 90.137 - type: mrr_at_3 value: 89.385 - type: mrr_at_5 value: 89.876 - type: ndcg_at_1 value: 84.821 - type: ndcg_at_10 value: 92.334 - type: ndcg_at_100 value: 92.535 - type: ndcg_at_1000 value: 92.535 - type: ndcg_at_20 value: 92.414 - type: ndcg_at_3 value: 90.887 - type: ndcg_at_5 value: 91.758 - type: precision_at_1 value: 84.821 - type: precision_at_10 value: 9.911 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.97 - type: precision_at_3 value: 31.746000000000002 - type: precision_at_5 value: 19.464000000000002 - type: recall_at_1 value: 84.821 - type: recall_at_10 value: 99.107 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 99.405 - type: recall_at_3 value: 95.238 - type: recall_at_5 value: 97.321 - type: main_score value: 92.334 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (deu-deu) type: facebook/mlqa config: deu-deu split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 67.548 - type: map_at_1 value: 56.559000000000005 - type: map_at_10 value: 63.867 - type: map_at_100 value: 64.429 - type: map_at_1000 value: 64.457 - type: map_at_20 value: 64.215 - type: map_at_3 value: 62.109 - type: map_at_5 value: 63.101 - type: mrr_at_1 value: 56.56990915134057 - type: mrr_at_10 value: 63.86820789324668 - type: mrr_at_100 value: 64.42973602152581 - type: mrr_at_1000 value: 64.45818598090155 - type: mrr_at_20 value: 64.2163052263868 - type: mrr_at_3 value: 62.10946155550634 - type: mrr_at_5 value: 63.10104143585199 - type: nauc_map_at_1000_diff1 value: 73.78440163370111 - type: nauc_map_at_1000_max value: 66.37875518052162 - type: nauc_map_at_1000_std value: -17.063915098135396 - type: nauc_map_at_100_diff1 value: 73.77180802985815 - type: nauc_map_at_100_max value: 66.38365998362033 - type: nauc_map_at_100_std value: -17.053345109661972 - type: nauc_map_at_10_diff1 value: 73.70041876696037 - type: nauc_map_at_10_max value: 66.33213342705997 - type: nauc_map_at_10_std value: -17.40657791273925 - type: nauc_map_at_1_diff1 value: 
76.8784374396948 - type: nauc_map_at_1_max value: 64.07170606935357 - type: nauc_map_at_1_std value: -18.464213686790654 - type: nauc_map_at_20_diff1 value: 73.72371377231813 - type: nauc_map_at_20_max value: 66.42108121059451 - type: nauc_map_at_20_std value: -17.05384923889036 - type: nauc_map_at_3_diff1 value: 74.08287018839246 - type: nauc_map_at_3_max value: 66.42422337760333 - type: nauc_map_at_3_std value: -17.79503404131652 - type: nauc_map_at_5_diff1 value: 73.9294779027339 - type: nauc_map_at_5_max value: 66.51752041065726 - type: nauc_map_at_5_std value: -17.67309805113804 - type: nauc_mrr_at_1000_diff1 value: 73.78389736923545 - type: nauc_mrr_at_1000_max value: 66.37929720858341 - type: nauc_mrr_at_1000_std value: -17.058591711291278 - type: nauc_mrr_at_100_diff1 value: 73.77126451253136 - type: nauc_mrr_at_100_max value: 66.38405917246607 - type: nauc_mrr_at_100_std value: -17.047251035212863 - type: nauc_mrr_at_10_diff1 value: 73.69960470665124 - type: nauc_mrr_at_10_max value: 66.33265194210313 - type: nauc_mrr_at_10_std value: -17.399659076827998 - type: nauc_mrr_at_1_diff1 value: 76.8689850260726 - type: nauc_mrr_at_1_max value: 64.09858188287487 - type: nauc_mrr_at_1_std value: -18.46064784201847 - type: nauc_mrr_at_20_diff1 value: 73.72312682063128 - type: nauc_mrr_at_20_max value: 66.42181932858745 - type: nauc_mrr_at_20_std value: -17.04690257511092 - type: nauc_mrr_at_3_diff1 value: 74.08287018839246 - type: nauc_mrr_at_3_max value: 66.42422337760333 - type: nauc_mrr_at_3_std value: -17.79503404131652 - type: nauc_mrr_at_5_diff1 value: 73.9294779027339 - type: nauc_mrr_at_5_max value: 66.51752041065726 - type: nauc_mrr_at_5_std value: -17.67309805113804 - type: nauc_ndcg_at_1000_diff1 value: 72.97825548342801 - type: nauc_ndcg_at_1000_max value: 66.96275437178257 - type: nauc_ndcg_at_1000_std value: -15.611902299641587 - type: nauc_ndcg_at_100_diff1 value: 72.58724738936613 - type: nauc_ndcg_at_100_max value: 67.16774012704182 - type: nauc_ndcg_at_100_std value: -14.945088654796812 - type: nauc_ndcg_at_10_diff1 value: 72.16253640477947 - type: nauc_ndcg_at_10_max value: 67.01746849484621 - type: nauc_ndcg_at_10_std value: -16.46102507270809 - type: nauc_ndcg_at_1_diff1 value: 76.8689850260726 - type: nauc_ndcg_at_1_max value: 64.09858188287487 - type: nauc_ndcg_at_1_std value: -18.46064784201847 - type: nauc_ndcg_at_20_diff1 value: 72.19995325129975 - type: nauc_ndcg_at_20_max value: 67.39639713797962 - type: nauc_ndcg_at_20_std value: -15.091689370748531 - type: nauc_ndcg_at_3_diff1 value: 73.13123604206514 - type: nauc_ndcg_at_3_max value: 67.23123167871547 - type: nauc_ndcg_at_3_std value: -17.492755234009156 - type: nauc_ndcg_at_5_diff1 value: 72.8154718929895 - type: nauc_ndcg_at_5_max value: 67.44578008373777 - type: nauc_ndcg_at_5_std value: -17.251840358751362 - type: nauc_precision_at_1000_diff1 value: 47.89748325983604 - type: nauc_precision_at_1000_max value: 70.47466197804906 - type: nauc_precision_at_1000_std value: 72.66193512114775 - type: nauc_precision_at_100_diff1 value: 59.493743734005356 - type: nauc_precision_at_100_max value: 74.02140147220713 - type: nauc_precision_at_100_std value: 17.26664098026236 - type: nauc_precision_at_10_diff1 value: 64.94415011040277 - type: nauc_precision_at_10_max value: 69.6963814950747 - type: nauc_precision_at_10_std value: -11.663043657012954 - type: nauc_precision_at_1_diff1 value: 76.8689850260726 - type: nauc_precision_at_1_max value: 64.09858188287487 - type: nauc_precision_at_1_std value: -18.46064784201847 
- type: nauc_precision_at_20_diff1 value: 63.145886909986416 - type: nauc_precision_at_20_max value: 72.95708033630744 - type: nauc_precision_at_20_std value: -1.5039593629280323 - type: nauc_precision_at_3_diff1 value: 69.88902201644449 - type: nauc_precision_at_3_max value: 69.80499971089935 - type: nauc_precision_at_3_std value: -16.444680766676647 - type: nauc_precision_at_5_diff1 value: 68.60869967062919 - type: nauc_precision_at_5_max value: 70.75998207564281 - type: nauc_precision_at_5_std value: -15.62613396998262 - type: nauc_recall_at_1000_diff1 value: 62.6646436338833 - type: nauc_recall_at_1000_max value: 86.17801636476078 - type: nauc_recall_at_1000_std value: 71.84718775540334 - type: nauc_recall_at_100_diff1 value: 61.110492191439505 - type: nauc_recall_at_100_max value: 75.45730686603042 - type: nauc_recall_at_100_std value: 16.202465011589428 - type: nauc_recall_at_10_diff1 value: 65.1522196516815 - type: nauc_recall_at_10_max value: 69.7626435962161 - type: nauc_recall_at_10_std value: -11.801178474770449 - type: nauc_recall_at_1_diff1 value: 76.8784374396948 - type: nauc_recall_at_1_max value: 64.07170606935357 - type: nauc_recall_at_1_std value: -18.464213686790654 - type: nauc_recall_at_20_diff1 value: 63.40332739504143 - type: nauc_recall_at_20_max value: 73.04113661090965 - type: nauc_recall_at_20_std value: -1.6609741140266947 - type: nauc_recall_at_3_diff1 value: 70.03728086098866 - type: nauc_recall_at_3_max value: 69.85953774320521 - type: nauc_recall_at_3_std value: -16.482993123411706 - type: nauc_recall_at_5_diff1 value: 68.77396121765933 - type: nauc_recall_at_5_max value: 70.8231205493519 - type: nauc_recall_at_5_std value: -15.668037770700863 - type: ndcg_at_1 value: 56.57 - type: ndcg_at_10 value: 67.548 - type: ndcg_at_100 value: 70.421 - type: ndcg_at_1000 value: 71.198 - type: ndcg_at_20 value: 68.829 - type: ndcg_at_3 value: 63.88700000000001 - type: ndcg_at_5 value: 65.689 - type: precision_at_1 value: 56.57 - type: precision_at_10 value: 7.922 - type: precision_at_100 value: 0.9299999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.216 - type: precision_at_3 value: 23.015 - type: precision_at_5 value: 14.691 - type: recall_at_1 value: 56.559000000000005 - type: recall_at_10 value: 79.182 - type: recall_at_100 value: 92.946 - type: recall_at_1000 value: 99.092 - type: recall_at_20 value: 84.27900000000001 - type: recall_at_3 value: 69.023 - type: recall_at_5 value: 73.432 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (deu-spa) type: facebook/mlqa config: deu-spa split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 70.645 - type: map_at_1 value: 58.423 - type: map_at_10 value: 66.613 - type: map_at_100 value: 67.14099999999999 - type: map_at_1000 value: 67.161 - type: map_at_20 value: 66.965 - type: map_at_3 value: 64.714 - type: map_at_5 value: 65.835 - type: mrr_at_1 value: 58.4225352112676 - type: mrr_at_10 value: 66.61321260898735 - type: mrr_at_100 value: 67.13991570812132 - type: mrr_at_1000 value: 67.1598532168174 - type: mrr_at_20 value: 66.96384710024888 - type: mrr_at_3 value: 64.71361502347425 - type: mrr_at_5 value: 65.83474178403769 - type: nauc_map_at_1000_diff1 value: 73.9485117118935 - type: nauc_map_at_1000_max value: 65.74479869396299 - type: nauc_map_at_1000_std value: -20.300269749495563 - type: nauc_map_at_100_diff1 value: 73.93900406302829 - type: nauc_map_at_100_max value: 65.75508449194885 - type: nauc_map_at_100_std value: 
-20.265330791570175 - type: nauc_map_at_10_diff1 value: 73.84863233472605 - type: nauc_map_at_10_max value: 65.89377317378211 - type: nauc_map_at_10_std value: -20.404123131964695 - type: nauc_map_at_1_diff1 value: 76.73627284218519 - type: nauc_map_at_1_max value: 62.94957512510876 - type: nauc_map_at_1_std value: -20.99649749330682 - type: nauc_map_at_20_diff1 value: 73.88712006109598 - type: nauc_map_at_20_max value: 65.82057018162664 - type: nauc_map_at_20_std value: -20.269476512431915 - type: nauc_map_at_3_diff1 value: 74.21419190161502 - type: nauc_map_at_3_max value: 65.64993368062119 - type: nauc_map_at_3_std value: -21.34641749007071 - type: nauc_map_at_5_diff1 value: 74.0119419385777 - type: nauc_map_at_5_max value: 65.69809416369732 - type: nauc_map_at_5_std value: -21.16901556082261 - type: nauc_mrr_at_1000_diff1 value: 73.94915184134923 - type: nauc_mrr_at_1000_max value: 65.74522469633418 - type: nauc_mrr_at_1000_std value: -20.303028367132246 - type: nauc_mrr_at_100_diff1 value: 73.93964394728808 - type: nauc_mrr_at_100_max value: 65.75550992323707 - type: nauc_mrr_at_100_std value: -20.26808820438918 - type: nauc_mrr_at_10_diff1 value: 73.84863233472605 - type: nauc_mrr_at_10_max value: 65.89377317378211 - type: nauc_mrr_at_10_std value: -20.404123131964695 - type: nauc_mrr_at_1_diff1 value: 76.73627284218519 - type: nauc_mrr_at_1_max value: 62.94957512510876 - type: nauc_mrr_at_1_std value: -20.99649749330682 - type: nauc_mrr_at_20_diff1 value: 73.88775721128745 - type: nauc_mrr_at_20_max value: 65.820991355628 - type: nauc_mrr_at_20_std value: -20.272216587019734 - type: nauc_mrr_at_3_diff1 value: 74.21419190161502 - type: nauc_mrr_at_3_max value: 65.64993368062119 - type: nauc_mrr_at_3_std value: -21.34641749007071 - type: nauc_mrr_at_5_diff1 value: 74.0119419385777 - type: nauc_mrr_at_5_max value: 65.69809416369732 - type: nauc_mrr_at_5_std value: -21.16901556082261 - type: nauc_ndcg_at_1000_diff1 value: 73.29396365944277 - type: nauc_ndcg_at_1000_max value: 66.44879592109541 - type: nauc_ndcg_at_1000_std value: -19.285991058788195 - type: nauc_ndcg_at_100_diff1 value: 73.0159172721162 - type: nauc_ndcg_at_100_max value: 66.76216389231388 - type: nauc_ndcg_at_100_std value: -18.27931368094887 - type: nauc_ndcg_at_10_diff1 value: 72.42096650774693 - type: nauc_ndcg_at_10_max value: 67.48592688463306 - type: nauc_ndcg_at_10_std value: -18.91453756077581 - type: nauc_ndcg_at_1_diff1 value: 76.73627284218519 - type: nauc_ndcg_at_1_max value: 62.94957512510876 - type: nauc_ndcg_at_1_std value: -20.99649749330682 - type: nauc_ndcg_at_20_diff1 value: 72.53699362385684 - type: nauc_ndcg_at_20_max value: 67.22763976357872 - type: nauc_ndcg_at_20_std value: -18.299910635008338 - type: nauc_ndcg_at_3_diff1 value: 73.3698453761989 - type: nauc_ndcg_at_3_max value: 66.71056987289383 - type: nauc_ndcg_at_3_std value: -21.405154376652803 - type: nauc_ndcg_at_5_diff1 value: 72.9491030712935 - type: nauc_ndcg_at_5_max value: 66.85786103137077 - type: nauc_ndcg_at_5_std value: -21.04005053344073 - type: nauc_precision_at_1000_diff1 value: 17.02462370967451 - type: nauc_precision_at_1000_max value: 48.03260752496052 - type: nauc_precision_at_1000_std value: 87.56077915079334 - type: nauc_precision_at_100_diff1 value: 58.590352501194985 - type: nauc_precision_at_100_max value: 78.2649015433222 - type: nauc_precision_at_100_std value: 28.05030453158992 - type: nauc_precision_at_10_diff1 value: 64.89497928764766 - type: nauc_precision_at_10_max value: 75.93257124951242 - type: 
nauc_precision_at_10_std value: -9.825306994117462 - type: nauc_precision_at_1_diff1 value: 76.73627284218519 - type: nauc_precision_at_1_max value: 62.94957512510876 - type: nauc_precision_at_1_std value: -20.99649749330682 - type: nauc_precision_at_20_diff1 value: 62.11366204321558 - type: nauc_precision_at_20_max value: 75.9571427846493 - type: nauc_precision_at_20_std value: -0.94585212808191 - type: nauc_precision_at_3_diff1 value: 70.52940972112398 - type: nauc_precision_at_3_max value: 70.3402053170779 - type: nauc_precision_at_3_std value: -21.579778424241304 - type: nauc_precision_at_5_diff1 value: 68.78962580223575 - type: nauc_precision_at_5_max value: 71.41410894398376 - type: nauc_precision_at_5_std value: -20.415603405161956 - type: nauc_recall_at_1000_diff1 value: 55.88625447348128 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 100.0 - type: nauc_recall_at_100_diff1 value: 61.17942268389525 - type: nauc_recall_at_100_max value: 81.12207841563487 - type: nauc_recall_at_100_std value: 27.141215257528113 - type: nauc_recall_at_10_diff1 value: 64.8949792876478 - type: nauc_recall_at_10_max value: 75.93257124951249 - type: nauc_recall_at_10_std value: -9.825306994117323 - type: nauc_recall_at_1_diff1 value: 76.73627284218519 - type: nauc_recall_at_1_max value: 62.94957512510876 - type: nauc_recall_at_1_std value: -20.99649749330682 - type: nauc_recall_at_20_diff1 value: 63.07808719241162 - type: nauc_recall_at_20_max value: 76.96808746317542 - type: nauc_recall_at_20_std value: -1.5235053258631275 - type: nauc_recall_at_3_diff1 value: 70.52940972112405 - type: nauc_recall_at_3_max value: 70.3402053170779 - type: nauc_recall_at_3_std value: -21.57977842424124 - type: nauc_recall_at_5_diff1 value: 68.78962580223575 - type: nauc_recall_at_5_max value: 71.41410894398392 - type: nauc_recall_at_5_std value: -20.415603405161793 - type: ndcg_at_1 value: 58.423 - type: ndcg_at_10 value: 70.645 - type: ndcg_at_100 value: 73.277 - type: ndcg_at_1000 value: 73.785 - type: ndcg_at_20 value: 71.918 - type: ndcg_at_3 value: 66.679 - type: ndcg_at_5 value: 68.72200000000001 - type: precision_at_1 value: 58.423 - type: precision_at_10 value: 8.338 - type: precision_at_100 value: 0.959 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.423 - type: precision_at_3 value: 24.113 - type: precision_at_5 value: 15.47 - type: recall_at_1 value: 58.423 - type: recall_at_10 value: 83.38 - type: recall_at_100 value: 95.887 - type: recall_at_1000 value: 99.831 - type: recall_at_20 value: 88.39399999999999 - type: recall_at_3 value: 72.33800000000001 - type: recall_at_5 value: 77.352 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (deu-eng) type: facebook/mlqa config: deu-eng split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 67.067 - type: map_at_1 value: 55.861000000000004 - type: map_at_10 value: 63.42100000000001 - type: map_at_100 value: 64.03 - type: map_at_1000 value: 64.05999999999999 - type: map_at_20 value: 63.819 - type: map_at_3 value: 61.773 - type: map_at_5 value: 62.736999999999995 - type: mrr_at_1 value: 55.88300465322402 - type: mrr_at_10 value: 63.43111082973707 - type: mrr_at_100 value: 64.03962373590272 - type: mrr_at_1000 value: 64.0698259866376 - type: mrr_at_20 value: 63.82871766489112 - type: mrr_at_3 value: 61.78447448112865 - type: mrr_at_5 value: 62.74835659945346 - type: nauc_map_at_1000_diff1 value: 74.58505763417352 - type: nauc_map_at_1000_max value: 66.26060764852198 
- type: nauc_map_at_1000_std value: -16.896178230873897 - type: nauc_map_at_100_diff1 value: 74.57057487892857 - type: nauc_map_at_100_max value: 66.26600433283826 - type: nauc_map_at_100_std value: -16.87596113104189 - type: nauc_map_at_10_diff1 value: 74.53453636322749 - type: nauc_map_at_10_max value: 66.27501737773804 - type: nauc_map_at_10_std value: -17.178743257781775 - type: nauc_map_at_1_diff1 value: 77.63067209375254 - type: nauc_map_at_1_max value: 64.17718675702672 - type: nauc_map_at_1_std value: -17.639521106853717 - type: nauc_map_at_20_diff1 value: 74.52007402431164 - type: nauc_map_at_20_max value: 66.28276291359268 - type: nauc_map_at_20_std value: -16.939292897754758 - type: nauc_map_at_3_diff1 value: 74.79187974631951 - type: nauc_map_at_3_max value: 66.23256568210611 - type: nauc_map_at_3_std value: -17.894889918934112 - type: nauc_map_at_5_diff1 value: 74.63011328882517 - type: nauc_map_at_5_max value: 66.35411054978499 - type: nauc_map_at_5_std value: -17.50140342194211 - type: nauc_mrr_at_1000_diff1 value: 74.57520089771667 - type: nauc_mrr_at_1000_max value: 66.27270912845914 - type: nauc_mrr_at_1000_std value: -16.84012675362397 - type: nauc_mrr_at_100_diff1 value: 74.56070964572156 - type: nauc_mrr_at_100_max value: 66.2780701126926 - type: nauc_mrr_at_100_std value: -16.820035083069865 - type: nauc_mrr_at_10_diff1 value: 74.52455978435117 - type: nauc_mrr_at_10_max value: 66.28697244023137 - type: nauc_mrr_at_10_std value: -17.122477723330523 - type: nauc_mrr_at_1_diff1 value: 77.60643512422061 - type: nauc_mrr_at_1_max value: 64.21736966061896 - type: nauc_mrr_at_1_std value: -17.56627338275146 - type: nauc_mrr_at_20_diff1 value: 74.5099814266373 - type: nauc_mrr_at_20_max value: 66.29485560556576 - type: nauc_mrr_at_20_std value: -16.882350027335306 - type: nauc_mrr_at_3_diff1 value: 74.78132817375507 - type: nauc_mrr_at_3_max value: 66.24761860047623 - type: nauc_mrr_at_3_std value: -17.833128575678998 - type: nauc_mrr_at_5_diff1 value: 74.6193031207433 - type: nauc_mrr_at_5_max value: 66.36951764432901 - type: nauc_mrr_at_5_std value: -17.438203106324227 - type: nauc_ndcg_at_1000_diff1 value: 73.79386161629151 - type: nauc_ndcg_at_1000_max value: 66.84013038018082 - type: nauc_ndcg_at_1000_std value: -15.387358822700667 - type: nauc_ndcg_at_100_diff1 value: 73.36132885277745 - type: nauc_ndcg_at_100_max value: 67.04416926901568 - type: nauc_ndcg_at_100_std value: -14.503256942521972 - type: nauc_ndcg_at_10_diff1 value: 73.11847332785027 - type: nauc_ndcg_at_10_max value: 67.02149621303091 - type: nauc_ndcg_at_10_std value: -16.142234662067782 - type: nauc_ndcg_at_1_diff1 value: 77.60643512422061 - type: nauc_ndcg_at_1_max value: 64.21736966061896 - type: nauc_ndcg_at_1_std value: -17.56627338275146 - type: nauc_ndcg_at_20_diff1 value: 72.97961452569768 - type: nauc_ndcg_at_20_max value: 67.12369127081152 - type: nauc_ndcg_at_20_std value: -15.11921773223936 - type: nauc_ndcg_at_3_diff1 value: 73.77769312598772 - type: nauc_ndcg_at_3_max value: 66.94438755852309 - type: nauc_ndcg_at_3_std value: -17.75960443830741 - type: nauc_ndcg_at_5_diff1 value: 73.43991209562891 - type: nauc_ndcg_at_5_max value: 67.21682951737418 - type: nauc_ndcg_at_5_std value: -17.013510008231805 - type: nauc_precision_at_1000_diff1 value: 51.30633281948362 - type: nauc_precision_at_1000_max value: 76.78675288883846 - type: nauc_precision_at_1000_std value: 71.70041985304397 - type: nauc_precision_at_100_diff1 value: 59.86656455853326 - type: nauc_precision_at_100_max value: 
74.41958422732161 - type: nauc_precision_at_100_std value: 22.098920296069124 - type: nauc_precision_at_10_diff1 value: 66.4696166928741 - type: nauc_precision_at_10_max value: 69.88463108697104 - type: nauc_precision_at_10_std value: -10.707950954702742 - type: nauc_precision_at_1_diff1 value: 77.60643512422061 - type: nauc_precision_at_1_max value: 64.21736966061896 - type: nauc_precision_at_1_std value: -17.56627338275146 - type: nauc_precision_at_20_diff1 value: 63.45094585276983 - type: nauc_precision_at_20_max value: 71.57741245347195 - type: nauc_precision_at_20_std value: -2.2211545419051744 - type: nauc_precision_at_3_diff1 value: 70.28060818081384 - type: nauc_precision_at_3_max value: 69.22652927816439 - type: nauc_precision_at_3_std value: -17.158576243559434 - type: nauc_precision_at_5_diff1 value: 68.90765418427162 - type: nauc_precision_at_5_max value: 70.32585273389111 - type: nauc_precision_at_5_std value: -14.950363729664524 - type: nauc_recall_at_1000_diff1 value: 65.11255117927331 - type: nauc_recall_at_1000_max value: 88.35641213283338 - type: nauc_recall_at_1000_std value: 69.89792573640547 - type: nauc_recall_at_100_diff1 value: 61.46376457272238 - type: nauc_recall_at_100_max value: 75.48265142243015 - type: nauc_recall_at_100_std value: 21.223182712042178 - type: nauc_recall_at_10_diff1 value: 66.89353375308997 - type: nauc_recall_at_10_max value: 70.06655416883785 - type: nauc_recall_at_10_std value: -11.100871879439435 - type: nauc_recall_at_1_diff1 value: 77.63067209375254 - type: nauc_recall_at_1_max value: 64.17718675702672 - type: nauc_recall_at_1_std value: -17.639521106853717 - type: nauc_recall_at_20_diff1 value: 63.98532276331878 - type: nauc_recall_at_20_max value: 71.81562599791899 - type: nauc_recall_at_20_std value: -2.696537977147695 - type: nauc_recall_at_3_diff1 value: 70.4507655865698 - type: nauc_recall_at_3_max value: 69.25705030141037 - type: nauc_recall_at_3_std value: -17.299948348202836 - type: nauc_recall_at_5_diff1 value: 69.09152857901888 - type: nauc_recall_at_5_max value: 70.35609636026405 - type: nauc_recall_at_5_std value: -15.105012139255896 - type: ndcg_at_1 value: 55.883 - type: ndcg_at_10 value: 67.067 - type: ndcg_at_100 value: 70.07 - type: ndcg_at_1000 value: 70.875 - type: ndcg_at_20 value: 68.498 - type: ndcg_at_3 value: 63.666 - type: ndcg_at_5 value: 65.40599999999999 - type: precision_at_1 value: 55.883 - type: precision_at_10 value: 7.8549999999999995 - type: precision_at_100 value: 0.928 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.2090000000000005 - type: precision_at_3 value: 23.052 - type: precision_at_5 value: 14.677999999999999 - type: recall_at_1 value: 55.861000000000004 - type: recall_at_10 value: 78.495 - type: recall_at_100 value: 92.688 - type: recall_at_1000 value: 99.02499999999999 - type: recall_at_20 value: 84.124 - type: recall_at_3 value: 69.123 - type: recall_at_5 value: 73.355 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (spa-deu) type: facebook/mlqa config: spa-deu split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 73.90299999999999 - type: map_at_1 value: 61.236000000000004 - type: map_at_10 value: 69.88799999999999 - type: map_at_100 value: 70.319 - type: map_at_1000 value: 70.341 - type: map_at_20 value: 70.16799999999999 - type: map_at_3 value: 68.104 - type: map_at_5 value: 69.164 - type: mrr_at_1 value: 61.2739571589628 - type: mrr_at_10 value: 69.92589162684993 - type: mrr_at_100 value: 70.35245455509234 - 
type: mrr_at_1000 value: 70.37438351396742 - type: mrr_at_20 value: 70.20247469915404 - type: mrr_at_3 value: 68.14167606163099 - type: mrr_at_5 value: 69.20142803457354 - type: nauc_map_at_1000_diff1 value: 74.70416754842327 - type: nauc_map_at_1000_max value: 65.86915994583384 - type: nauc_map_at_1000_std value: -19.04437483534443 - type: nauc_map_at_100_diff1 value: 74.70011798058674 - type: nauc_map_at_100_max value: 65.88507779167188 - type: nauc_map_at_100_std value: -19.018670970643786 - type: nauc_map_at_10_diff1 value: 74.6362126804427 - type: nauc_map_at_10_max value: 66.05733054427198 - type: nauc_map_at_10_std value: -19.034317737897354 - type: nauc_map_at_1_diff1 value: 77.24970536833601 - type: nauc_map_at_1_max value: 62.07820573048406 - type: nauc_map_at_1_std value: -20.917086586335078 - type: nauc_map_at_20_diff1 value: 74.64113920401083 - type: nauc_map_at_20_max value: 65.89991740166793 - type: nauc_map_at_20_std value: -19.09987515041243 - type: nauc_map_at_3_diff1 value: 74.6518162332119 - type: nauc_map_at_3_max value: 66.10312348194024 - type: nauc_map_at_3_std value: -18.95881457716116 - type: nauc_map_at_5_diff1 value: 74.55141020670321 - type: nauc_map_at_5_max value: 65.94345752979342 - type: nauc_map_at_5_std value: -19.453976877992304 - type: nauc_mrr_at_1000_diff1 value: 74.64458488344088 - type: nauc_mrr_at_1000_max value: 65.84575328456057 - type: nauc_mrr_at_1000_std value: -18.901614615119904 - type: nauc_mrr_at_100_diff1 value: 74.64058497924627 - type: nauc_mrr_at_100_max value: 65.86170461767928 - type: nauc_mrr_at_100_std value: -18.87601697091505 - type: nauc_mrr_at_10_diff1 value: 74.57266634464752 - type: nauc_mrr_at_10_max value: 66.03331587645152 - type: nauc_mrr_at_10_std value: -18.87888060105393 - type: nauc_mrr_at_1_diff1 value: 77.19578272647183 - type: nauc_mrr_at_1_max value: 62.05252035478773 - type: nauc_mrr_at_1_std value: -20.790530940625267 - type: nauc_mrr_at_20_diff1 value: 74.5808171250021 - type: nauc_mrr_at_20_max value: 65.87643606587798 - type: nauc_mrr_at_20_std value: -18.95476583474199 - type: nauc_mrr_at_3_diff1 value: 74.5917053289191 - type: nauc_mrr_at_3_max value: 66.08044079438714 - type: nauc_mrr_at_3_std value: -18.81168463163586 - type: nauc_mrr_at_5_diff1 value: 74.48934579694608 - type: nauc_mrr_at_5_max value: 65.91993162383771 - type: nauc_mrr_at_5_std value: -19.302710791338797 - type: nauc_ndcg_at_1000_diff1 value: 74.20191283992186 - type: nauc_ndcg_at_1000_max value: 66.60831175771229 - type: nauc_ndcg_at_1000_std value: -18.175208725175484 - type: nauc_ndcg_at_100_diff1 value: 74.07713451642955 - type: nauc_ndcg_at_100_max value: 67.02028626335476 - type: nauc_ndcg_at_100_std value: -17.36560972181693 - type: nauc_ndcg_at_10_diff1 value: 73.63235521598476 - type: nauc_ndcg_at_10_max value: 67.8118473312638 - type: nauc_ndcg_at_10_std value: -17.647560577355915 - type: nauc_ndcg_at_1_diff1 value: 77.19578272647183 - type: nauc_ndcg_at_1_max value: 62.05252035478773 - type: nauc_ndcg_at_1_std value: -20.790530940625267 - type: nauc_ndcg_at_20_diff1 value: 73.65300308228291 - type: nauc_ndcg_at_20_max value: 67.18353402731985 - type: nauc_ndcg_at_20_std value: -17.9240756389792 - type: nauc_ndcg_at_3_diff1 value: 73.73764900202292 - type: nauc_ndcg_at_3_max value: 67.60840957876889 - type: nauc_ndcg_at_3_std value: -17.962667543518933 - type: nauc_ndcg_at_5_diff1 value: 73.49040500302092 - type: nauc_ndcg_at_5_max value: 67.41251918514402 - type: nauc_ndcg_at_5_std value: -18.851877225955523 - type: 
nauc_precision_at_1000_diff1 value: -18.652906102973922 - type: nauc_precision_at_1000_max value: 2.1701672475574885 - type: nauc_precision_at_1000_std value: 61.713411950188835 - type: nauc_precision_at_100_diff1 value: 62.37565302288498 - type: nauc_precision_at_100_max value: 76.96921843049006 - type: nauc_precision_at_100_std value: 19.152009040219678 - type: nauc_precision_at_10_diff1 value: 68.14047344105212 - type: nauc_precision_at_10_max value: 77.7177273849099 - type: nauc_precision_at_10_std value: -9.124325941493698 - type: nauc_precision_at_1_diff1 value: 77.19578272647183 - type: nauc_precision_at_1_max value: 62.05252035478773 - type: nauc_precision_at_1_std value: -20.790530940625267 - type: nauc_precision_at_20_diff1 value: 65.38487456362745 - type: nauc_precision_at_20_max value: 74.61122933443669 - type: nauc_precision_at_20_std value: -8.129775929648341 - type: nauc_precision_at_3_diff1 value: 70.45937744142297 - type: nauc_precision_at_3_max value: 73.03004233073901 - type: nauc_precision_at_3_std value: -14.246554579025158 - type: nauc_precision_at_5_diff1 value: 69.02821772428955 - type: nauc_precision_at_5_max value: 73.52949774726446 - type: nauc_precision_at_5_std value: -16.355747231517757 - type: nauc_recall_at_1000_diff1 value: 35.804192824985755 - type: nauc_recall_at_1000_max value: 61.367785756485894 - type: nauc_recall_at_1000_std value: 54.01380822466869 - type: nauc_recall_at_100_diff1 value: 67.96210883597479 - type: nauc_recall_at_100_max value: 82.38124823732169 - type: nauc_recall_at_100_std value: 16.814922595309966 - type: nauc_recall_at_10_diff1 value: 68.21964459634341 - type: nauc_recall_at_10_max value: 77.68301934858845 - type: nauc_recall_at_10_std value: -9.430792913885066 - type: nauc_recall_at_1_diff1 value: 77.24970536833601 - type: nauc_recall_at_1_max value: 62.07820573048406 - type: nauc_recall_at_1_std value: -20.917086586335078 - type: nauc_recall_at_20_diff1 value: 66.60569906579487 - type: nauc_recall_at_20_max value: 75.66163186604354 - type: nauc_recall_at_20_std value: -9.09826205489828 - type: nauc_recall_at_3_diff1 value: 70.52323701841641 - type: nauc_recall_at_3_max value: 73.03478107411232 - type: nauc_recall_at_3_std value: -14.432325989967962 - type: nauc_recall_at_5_diff1 value: 69.08521261524373 - type: nauc_recall_at_5_max value: 73.51150270382094 - type: nauc_recall_at_5_std value: -16.569387503524368 - type: ndcg_at_1 value: 61.273999999999994 - type: ndcg_at_10 value: 73.90299999999999 - type: ndcg_at_100 value: 75.983 - type: ndcg_at_1000 value: 76.488 - type: ndcg_at_20 value: 74.921 - type: ndcg_at_3 value: 70.277 - type: ndcg_at_5 value: 72.172 - type: precision_at_1 value: 61.273999999999994 - type: precision_at_10 value: 8.641 - type: precision_at_100 value: 0.962 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.524 - type: precision_at_3 value: 25.517 - type: precision_at_5 value: 16.223000000000003 - type: recall_at_1 value: 61.236000000000004 - type: recall_at_10 value: 86.37700000000001 - type: recall_at_100 value: 96.054 - type: recall_at_1000 value: 99.887 - type: recall_at_20 value: 90.398 - type: recall_at_3 value: 76.51299999999999 - type: recall_at_5 value: 81.07900000000001 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (spa-spa) type: facebook/mlqa config: spa-spa split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 68.632 - type: map_at_1 value: 57.046 - type: map_at_10 value: 64.869 - type: map_at_100 value: 65.384 - 
type: map_at_1000 value: 65.413 - type: map_at_20 value: 65.185 - type: map_at_3 value: 63.178 - type: map_at_5 value: 64.12 - type: mrr_at_1 value: 57.05579889544848 - type: mrr_at_10 value: 64.8806425382317 - type: mrr_at_100 value: 65.39469233244084 - type: mrr_at_1000 value: 65.42342199403159 - type: mrr_at_20 value: 65.19634815919534 - type: mrr_at_3 value: 63.18796419729591 - type: mrr_at_5 value: 64.13159398209874 - type: nauc_map_at_1000_diff1 value: 73.23803038674018 - type: nauc_map_at_1000_max value: 67.44156201421714 - type: nauc_map_at_1000_std value: -8.60143026450049 - type: nauc_map_at_100_diff1 value: 73.22575613034235 - type: nauc_map_at_100_max value: 67.44735143420195 - type: nauc_map_at_100_std value: -8.576905069492895 - type: nauc_map_at_10_diff1 value: 73.11950129610865 - type: nauc_map_at_10_max value: 67.45107232305055 - type: nauc_map_at_10_std value: -8.799837857015392 - type: nauc_map_at_1_diff1 value: 76.18354072047988 - type: nauc_map_at_1_max value: 65.03342186728786 - type: nauc_map_at_1_std value: -10.867650288695796 - type: nauc_map_at_20_diff1 value: 73.21570748770948 - type: nauc_map_at_20_max value: 67.50340321088724 - type: nauc_map_at_20_std value: -8.594057184944676 - type: nauc_map_at_3_diff1 value: 73.17239276163892 - type: nauc_map_at_3_max value: 67.06319504819103 - type: nauc_map_at_3_std value: -9.883216310270528 - type: nauc_map_at_5_diff1 value: 73.11913507367727 - type: nauc_map_at_5_max value: 67.27497019567078 - type: nauc_map_at_5_std value: -9.497714822103118 - type: nauc_mrr_at_1000_diff1 value: 73.22971233311306 - type: nauc_mrr_at_1000_max value: 67.42977229057223 - type: nauc_mrr_at_1000_std value: -8.550068702273297 - type: nauc_mrr_at_100_diff1 value: 73.21744467317815 - type: nauc_mrr_at_100_max value: 67.43557491068093 - type: nauc_mrr_at_100_std value: -8.52559275190607 - type: nauc_mrr_at_10_diff1 value: 73.11075619726137 - type: nauc_mrr_at_10_max value: 67.43889760205286 - type: nauc_mrr_at_10_std value: -8.74617232559183 - type: nauc_mrr_at_1_diff1 value: 76.17529975949547 - type: nauc_mrr_at_1_max value: 65.02401127001608 - type: nauc_mrr_at_1_std value: -10.817814457633952 - type: nauc_mrr_at_20_diff1 value: 73.20689275225138 - type: nauc_mrr_at_20_max value: 67.49111752272192 - type: nauc_mrr_at_20_std value: -8.539827528410353 - type: nauc_mrr_at_3_diff1 value: 73.16291729623958 - type: nauc_mrr_at_3_max value: 67.05300993427998 - type: nauc_mrr_at_3_std value: -9.827915885680811 - type: nauc_mrr_at_5_diff1 value: 73.11055686484109 - type: nauc_mrr_at_5_max value: 67.26299851089122 - type: nauc_mrr_at_5_std value: -9.445190276650903 - type: nauc_ndcg_at_1000_diff1 value: 72.58833638407177 - type: nauc_ndcg_at_1000_max value: 68.10447506371374 - type: nauc_ndcg_at_1000_std value: -6.910306241546282 - type: nauc_ndcg_at_100_diff1 value: 72.24524849631476 - type: nauc_ndcg_at_100_max value: 68.30659210081238 - type: nauc_ndcg_at_100_std value: -6.04305364268931 - type: nauc_ndcg_at_10_diff1 value: 71.87363502582961 - type: nauc_ndcg_at_10_max value: 68.5010009653693 - type: nauc_ndcg_at_10_std value: -7.021281296450588 - type: nauc_ndcg_at_1_diff1 value: 76.17529975949547 - type: nauc_ndcg_at_1_max value: 65.02401127001608 - type: nauc_ndcg_at_1_std value: -10.817814457633952 - type: nauc_ndcg_at_20_diff1 value: 72.21241010439327 - type: nauc_ndcg_at_20_max value: 68.71743274030551 - type: nauc_ndcg_at_20_std value: -6.186629577195946 - type: nauc_ndcg_at_3_diff1 value: 72.08204674794459 - type: nauc_ndcg_at_3_max value: 
67.5958365046156 - type: nauc_ndcg_at_3_std value: -9.576418336610345 - type: nauc_ndcg_at_5_diff1 value: 71.93179095844508 - type: nauc_ndcg_at_5_max value: 68.01914639754217 - type: nauc_ndcg_at_5_std value: -8.833768332910777 - type: nauc_precision_at_1000_diff1 value: 63.0051360227489 - type: nauc_precision_at_1000_max value: 79.93532442313229 - type: nauc_precision_at_1000_std value: 52.869517607133254 - type: nauc_precision_at_100_diff1 value: 62.43301501857154 - type: nauc_precision_at_100_max value: 75.57280416668183 - type: nauc_precision_at_100_std value: 26.758300486132747 - type: nauc_precision_at_10_diff1 value: 66.29806375971134 - type: nauc_precision_at_10_max value: 73.40301413754797 - type: nauc_precision_at_10_std value: 1.9858547295235462 - type: nauc_precision_at_1_diff1 value: 76.17529975949547 - type: nauc_precision_at_1_max value: 65.02401127001608 - type: nauc_precision_at_1_std value: -10.817814457633952 - type: nauc_precision_at_20_diff1 value: 67.05111836051105 - type: nauc_precision_at_20_max value: 76.09783190824155 - type: nauc_precision_at_20_std value: 9.906010659515564 - type: nauc_precision_at_3_diff1 value: 68.44186679250453 - type: nauc_precision_at_3_max value: 69.30301351119388 - type: nauc_precision_at_3_std value: -8.566522518882348 - type: nauc_precision_at_5_diff1 value: 67.51737199297388 - type: nauc_precision_at_5_max value: 70.75887601590472 - type: nauc_precision_at_5_std value: -6.278983102710238 - type: nauc_recall_at_1000_diff1 value: 65.12360093170948 - type: nauc_recall_at_1000_max value: 82.60209843191132 - type: nauc_recall_at_1000_std value: 51.740179583368636 - type: nauc_recall_at_100_diff1 value: 62.82007697326819 - type: nauc_recall_at_100_max value: 76.04844844677562 - type: nauc_recall_at_100_std value: 26.4678415019248 - type: nauc_recall_at_10_diff1 value: 66.28557566848767 - type: nauc_recall_at_10_max value: 73.40302709828738 - type: nauc_recall_at_10_std value: 1.9224272854613582 - type: nauc_recall_at_1_diff1 value: 76.18354072047988 - type: nauc_recall_at_1_max value: 65.03342186728786 - type: nauc_recall_at_1_std value: -10.867650288695796 - type: nauc_recall_at_20_diff1 value: 67.03430451094992 - type: nauc_recall_at_20_max value: 76.09474005171319 - type: nauc_recall_at_20_std value: 9.815888637851074 - type: nauc_recall_at_3_diff1 value: 68.44411411344718 - type: nauc_recall_at_3_max value: 69.30502737137265 - type: nauc_recall_at_3_std value: -8.629526329714132 - type: nauc_recall_at_5_diff1 value: 67.51469265953514 - type: nauc_recall_at_5_max value: 70.76969893818111 - type: nauc_recall_at_5_std value: -6.325600167105444 - type: ndcg_at_1 value: 57.056 - type: ndcg_at_10 value: 68.632 - type: ndcg_at_100 value: 71.202 - type: ndcg_at_1000 value: 71.97099999999999 - type: ndcg_at_20 value: 69.785 - type: ndcg_at_3 value: 65.131 - type: ndcg_at_5 value: 66.834 - type: precision_at_1 value: 57.056 - type: precision_at_10 value: 8.044 - type: precision_at_100 value: 0.9259999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.251 - type: precision_at_3 value: 23.589 - type: precision_at_5 value: 14.984 - type: recall_at_1 value: 57.046 - type: recall_at_10 value: 80.423 - type: recall_at_100 value: 92.582 - type: recall_at_1000 value: 98.638 - type: recall_at_20 value: 84.993 - type: recall_at_3 value: 70.758 - type: recall_at_5 value: 74.9 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (spa-eng) type: facebook/mlqa config: spa-eng split: test revision: 
397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 68.765 - type: map_at_1 value: 56.538999999999994 - type: map_at_10 value: 64.816 - type: map_at_100 value: 65.325 - type: map_at_1000 value: 65.352 - type: map_at_20 value: 65.113 - type: map_at_3 value: 62.934999999999995 - type: map_at_5 value: 64.063 - type: mrr_at_1 value: 56.539120502569965 - type: mrr_at_10 value: 64.81561556661505 - type: mrr_at_100 value: 65.32464238613954 - type: mrr_at_1000 value: 65.35206516602133 - type: mrr_at_20 value: 65.11270445292227 - type: mrr_at_3 value: 62.935465448315384 - type: mrr_at_5 value: 64.06339234723022 - type: nauc_map_at_1000_diff1 value: 73.20701050428072 - type: nauc_map_at_1000_max value: 67.32797480614404 - type: nauc_map_at_1000_std value: -6.211540626528362 - type: nauc_map_at_100_diff1 value: 73.19497683923063 - type: nauc_map_at_100_max value: 67.33392646467817 - type: nauc_map_at_100_std value: -6.196671563900051 - type: nauc_map_at_10_diff1 value: 73.16010547612956 - type: nauc_map_at_10_max value: 67.37793741307372 - type: nauc_map_at_10_std value: -6.3443240322521675 - type: nauc_map_at_1_diff1 value: 76.63696578575964 - type: nauc_map_at_1_max value: 65.08189618178105 - type: nauc_map_at_1_std value: -8.594195451782733 - type: nauc_map_at_20_diff1 value: 73.15233479381568 - type: nauc_map_at_20_max value: 67.3679607256072 - type: nauc_map_at_20_std value: -6.175928265286352 - type: nauc_map_at_3_diff1 value: 73.14853380980746 - type: nauc_map_at_3_max value: 67.10354198073468 - type: nauc_map_at_3_std value: -7.409679815529866 - type: nauc_map_at_5_diff1 value: 73.13425961877715 - type: nauc_map_at_5_max value: 67.22452899371224 - type: nauc_map_at_5_std value: -6.895257774506354 - type: nauc_mrr_at_1000_diff1 value: 73.20701050428072 - type: nauc_mrr_at_1000_max value: 67.32797480614404 - type: nauc_mrr_at_1000_std value: -6.211540626528362 - type: nauc_mrr_at_100_diff1 value: 73.19497683923063 - type: nauc_mrr_at_100_max value: 67.33392646467817 - type: nauc_mrr_at_100_std value: -6.196671563900051 - type: nauc_mrr_at_10_diff1 value: 73.16010547612956 - type: nauc_mrr_at_10_max value: 67.37793741307372 - type: nauc_mrr_at_10_std value: -6.3443240322521675 - type: nauc_mrr_at_1_diff1 value: 76.63696578575964 - type: nauc_mrr_at_1_max value: 65.08189618178105 - type: nauc_mrr_at_1_std value: -8.594195451782733 - type: nauc_mrr_at_20_diff1 value: 73.15233479381568 - type: nauc_mrr_at_20_max value: 67.3679607256072 - type: nauc_mrr_at_20_std value: -6.175928265286352 - type: nauc_mrr_at_3_diff1 value: 73.14853380980746 - type: nauc_mrr_at_3_max value: 67.10354198073468 - type: nauc_mrr_at_3_std value: -7.409679815529866 - type: nauc_mrr_at_5_diff1 value: 73.13425961877715 - type: nauc_mrr_at_5_max value: 67.22452899371224 - type: nauc_mrr_at_5_std value: -6.895257774506354 - type: nauc_ndcg_at_1000_diff1 value: 72.44364625096874 - type: nauc_ndcg_at_1000_max value: 67.93635761141552 - type: nauc_ndcg_at_1000_std value: -4.616429464350954 - type: nauc_ndcg_at_100_diff1 value: 72.11352383758482 - type: nauc_ndcg_at_100_max value: 68.1627312575955 - type: nauc_ndcg_at_100_std value: -3.894213672131282 - type: nauc_ndcg_at_10_diff1 value: 71.8526850770812 - type: nauc_ndcg_at_10_max value: 68.41366561888562 - type: nauc_ndcg_at_10_std value: -4.472146861145989 - type: nauc_ndcg_at_1_diff1 value: 76.63696578575964 - type: nauc_ndcg_at_1_max value: 65.08189618178105 - type: nauc_ndcg_at_1_std value: -8.594195451782733 - type: nauc_ndcg_at_20_diff1 value: 
71.76464418138866 - type: nauc_ndcg_at_20_max value: 68.41174963313698 - type: nauc_ndcg_at_20_std value: -3.7449762037540157 - type: nauc_ndcg_at_3_diff1 value: 71.93808990683131 - type: nauc_ndcg_at_3_max value: 67.7010029507334 - type: nauc_ndcg_at_3_std value: -6.971858419379321 - type: nauc_ndcg_at_5_diff1 value: 71.8505224811326 - type: nauc_ndcg_at_5_max value: 67.97139549500251 - type: nauc_ndcg_at_5_std value: -5.958491308070017 - type: nauc_precision_at_1000_diff1 value: 62.20956180320043 - type: nauc_precision_at_1000_max value: 82.53412670611299 - type: nauc_precision_at_1000_std value: 55.57278124999575 - type: nauc_precision_at_100_diff1 value: 62.03792857023201 - type: nauc_precision_at_100_max value: 76.77130713424538 - type: nauc_precision_at_100_std value: 26.674102719959564 - type: nauc_precision_at_10_diff1 value: 65.89798055049931 - type: nauc_precision_at_10_max value: 73.41908620140674 - type: nauc_precision_at_10_std value: 5.21818573283179 - type: nauc_precision_at_1_diff1 value: 76.63696578575964 - type: nauc_precision_at_1_max value: 65.08189618178105 - type: nauc_precision_at_1_std value: -8.594195451782733 - type: nauc_precision_at_20_diff1 value: 63.734308542647355 - type: nauc_precision_at_20_max value: 74.69578825096144 - type: nauc_precision_at_20_std value: 12.627842502659162 - type: nauc_precision_at_3_diff1 value: 67.91189666671904 - type: nauc_precision_at_3_max value: 69.64986036783209 - type: nauc_precision_at_3_std value: -5.505669087429055 - type: nauc_precision_at_5_diff1 value: 67.01880006360248 - type: nauc_precision_at_5_max value: 70.78916423358686 - type: nauc_precision_at_5_std value: -2.2273742736401045 - type: nauc_recall_at_1000_diff1 value: 62.20956180319936 - type: nauc_recall_at_1000_max value: 82.53412670611287 - type: nauc_recall_at_1000_std value: 55.57278124999549 - type: nauc_recall_at_100_diff1 value: 62.03792857023208 - type: nauc_recall_at_100_max value: 76.77130713424577 - type: nauc_recall_at_100_std value: 26.67410271995973 - type: nauc_recall_at_10_diff1 value: 65.8979805504994 - type: nauc_recall_at_10_max value: 73.41908620140678 - type: nauc_recall_at_10_std value: 5.2181857328318655 - type: nauc_recall_at_1_diff1 value: 76.63696578575964 - type: nauc_recall_at_1_max value: 65.08189618178105 - type: nauc_recall_at_1_std value: -8.594195451782733 - type: nauc_recall_at_20_diff1 value: 63.734308542647334 - type: nauc_recall_at_20_max value: 74.69578825096123 - type: nauc_recall_at_20_std value: 12.627842502658982 - type: nauc_recall_at_3_diff1 value: 67.91189666671897 - type: nauc_recall_at_3_max value: 69.64986036783203 - type: nauc_recall_at_3_std value: -5.505669087428989 - type: nauc_recall_at_5_diff1 value: 67.01880006360243 - type: nauc_recall_at_5_max value: 70.78916423358686 - type: nauc_recall_at_5_std value: -2.227374273640135 - type: ndcg_at_1 value: 56.538999999999994 - type: ndcg_at_10 value: 68.765 - type: ndcg_at_100 value: 71.314 - type: ndcg_at_1000 value: 72.038 - type: ndcg_at_20 value: 69.828 - type: ndcg_at_3 value: 64.937 - type: ndcg_at_5 value: 66.956 - type: precision_at_1 value: 56.538999999999994 - type: precision_at_10 value: 8.113 - type: precision_at_100 value: 0.932 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.265 - type: precision_at_3 value: 23.567 - type: precision_at_5 value: 15.115 - type: recall_at_1 value: 56.538999999999994 - type: recall_at_10 value: 81.135 - type: recall_at_100 value: 93.223 - type: recall_at_1000 value: 98.896 - type: recall_at_20 value: 
85.304 - type: recall_at_3 value: 70.702 - type: recall_at_5 value: 75.576 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (eng-deu) type: facebook/mlqa config: eng-deu split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 69.298 - type: map_at_1 value: 58.553 - type: map_at_10 value: 65.769 - type: map_at_100 value: 66.298 - type: map_at_1000 value: 66.328 - type: map_at_20 value: 66.101 - type: map_at_3 value: 64.048 - type: map_at_5 value: 65.09 - type: mrr_at_1 value: 58.564148016840235 - type: mrr_at_10 value: 65.7685997066675 - type: mrr_at_100 value: 66.29874034432214 - type: mrr_at_1000 value: 66.32844979939088 - type: mrr_at_20 value: 66.10120513957821 - type: mrr_at_3 value: 64.04830489696437 - type: mrr_at_5 value: 65.08974074894746 - type: nauc_map_at_1000_diff1 value: 76.8409650183994 - type: nauc_map_at_1000_max value: 71.86367015521367 - type: nauc_map_at_1000_std value: -14.464881539957256 - type: nauc_map_at_100_diff1 value: 76.82536521842064 - type: nauc_map_at_100_max value: 71.86811127965429 - type: nauc_map_at_100_std value: -14.441105539722244 - type: nauc_map_at_10_diff1 value: 76.75522453447859 - type: nauc_map_at_10_max value: 71.87677500176706 - type: nauc_map_at_10_std value: -14.741331625103559 - type: nauc_map_at_1_diff1 value: 79.64060747740989 - type: nauc_map_at_1_max value: 69.84278563569617 - type: nauc_map_at_1_std value: -15.936904929655832 - type: nauc_map_at_20_diff1 value: 76.78894776059715 - type: nauc_map_at_20_max value: 71.89637938044827 - type: nauc_map_at_20_std value: -14.500564106990769 - type: nauc_map_at_3_diff1 value: 77.20562577450342 - type: nauc_map_at_3_max value: 71.80578229361525 - type: nauc_map_at_3_std value: -15.344134588512201 - type: nauc_map_at_5_diff1 value: 77.00480147367867 - type: nauc_map_at_5_max value: 71.98335924076163 - type: nauc_map_at_5_std value: -15.16537653041026 - type: nauc_mrr_at_1000_diff1 value: 76.84165367691193 - type: nauc_mrr_at_1000_max value: 71.8642679499795 - type: nauc_mrr_at_1000_std value: -14.461717954593158 - type: nauc_mrr_at_100_diff1 value: 76.8263363557998 - type: nauc_mrr_at_100_max value: 71.86874522368626 - type: nauc_mrr_at_100_std value: -14.437105168707426 - type: nauc_mrr_at_10_diff1 value: 76.75522453447859 - type: nauc_mrr_at_10_max value: 71.87677500176706 - type: nauc_mrr_at_10_std value: -14.741331625103559 - type: nauc_mrr_at_1_diff1 value: 79.65642669321981 - type: nauc_mrr_at_1_max value: 69.89135358784799 - type: nauc_mrr_at_1_std value: -15.919357002229589 - type: nauc_mrr_at_20_diff1 value: 76.78883171270601 - type: nauc_mrr_at_20_max value: 71.89806887245291 - type: nauc_mrr_at_20_std value: -14.497139746907905 - type: nauc_mrr_at_3_diff1 value: 77.20562577450342 - type: nauc_mrr_at_3_max value: 71.80578229361525 - type: nauc_mrr_at_3_std value: -15.344134588512201 - type: nauc_mrr_at_5_diff1 value: 77.00480147367867 - type: nauc_mrr_at_5_max value: 71.98335924076163 - type: nauc_mrr_at_5_std value: -15.16537653041026 - type: nauc_ndcg_at_1000_diff1 value: 76.07802417817047 - type: nauc_ndcg_at_1000_max value: 72.31792804426776 - type: nauc_ndcg_at_1000_std value: -13.049160715132244 - type: nauc_ndcg_at_100_diff1 value: 75.63343849116544 - type: nauc_ndcg_at_100_max value: 72.48362076101817 - type: nauc_ndcg_at_100_std value: -12.089600993516777 - type: nauc_ndcg_at_10_diff1 value: 75.23387929929208 - type: nauc_ndcg_at_10_max value: 72.51436288271807 - type: nauc_ndcg_at_10_std value: -13.624132103038104 - type: 
nauc_ndcg_at_1_diff1 value: 79.65642669321981 - type: nauc_ndcg_at_1_max value: 69.89135358784799 - type: nauc_ndcg_at_1_std value: -15.919357002229589 - type: nauc_ndcg_at_20_diff1 value: 75.32926047656296 - type: nauc_ndcg_at_20_max value: 72.61254165918145 - type: nauc_ndcg_at_20_std value: -12.683157599238701 - type: nauc_ndcg_at_3_diff1 value: 76.3089337665469 - type: nauc_ndcg_at_3_max value: 72.40014674426054 - type: nauc_ndcg_at_3_std value: -15.08624226353458 - type: nauc_ndcg_at_5_diff1 value: 75.88857331641834 - type: nauc_ndcg_at_5_max value: 72.7719386827224 - type: nauc_ndcg_at_5_std value: -14.70546521089236 - type: nauc_precision_at_1000_diff1 value: 59.66563879069911 - type: nauc_precision_at_1000_max value: 74.57123562956772 - type: nauc_precision_at_1000_std value: 58.61396866718965 - type: nauc_precision_at_100_diff1 value: 62.8695896550042 - type: nauc_precision_at_100_max value: 77.81408796785 - type: nauc_precision_at_100_std value: 23.819735672317826 - type: nauc_precision_at_10_diff1 value: 68.08051625224569 - type: nauc_precision_at_10_max value: 75.14432336036869 - type: nauc_precision_at_10_std value: -7.97602345252735 - type: nauc_precision_at_1_diff1 value: 79.65642669321981 - type: nauc_precision_at_1_max value: 69.89135358784799 - type: nauc_precision_at_1_std value: -15.919357002229589 - type: nauc_precision_at_20_diff1 value: 66.7168005185165 - type: nauc_precision_at_20_max value: 76.58522761697147 - type: nauc_precision_at_20_std value: -0.17923428317323292 - type: nauc_precision_at_3_diff1 value: 73.23394851561207 - type: nauc_precision_at_3_max value: 74.32517846819215 - type: nauc_precision_at_3_std value: -14.142301336188348 - type: nauc_precision_at_5_diff1 value: 71.5666882547012 - type: nauc_precision_at_5_max value: 75.71098205440033 - type: nauc_precision_at_5_std value: -12.808362513638052 - type: nauc_recall_at_1000_diff1 value: 71.73736112325805 - type: nauc_recall_at_1000_max value: 86.70743436225898 - type: nauc_recall_at_1000_std value: 54.45802578371167 - type: nauc_recall_at_100_diff1 value: 64.07053861428128 - type: nauc_recall_at_100_max value: 78.8348308099261 - type: nauc_recall_at_100_std value: 22.72263677785103 - type: nauc_recall_at_10_diff1 value: 68.20272901407903 - type: nauc_recall_at_10_max value: 75.16315335381938 - type: nauc_recall_at_10_std value: -8.060716748913386 - type: nauc_recall_at_1_diff1 value: 79.64060747740989 - type: nauc_recall_at_1_max value: 69.84278563569617 - type: nauc_recall_at_1_std value: -15.936904929655832 - type: nauc_recall_at_20_diff1 value: 66.88206981973654 - type: nauc_recall_at_20_max value: 76.54824917595687 - type: nauc_recall_at_20_std value: -0.40294589316962287 - type: nauc_recall_at_3_diff1 value: 73.33076087258938 - type: nauc_recall_at_3_max value: 74.33763112508771 - type: nauc_recall_at_3_std value: -14.213355414905399 - type: nauc_recall_at_5_diff1 value: 71.67487623469464 - type: nauc_recall_at_5_max value: 75.72770292516316 - type: nauc_recall_at_5_std value: -12.887572274644818 - type: ndcg_at_1 value: 58.56400000000001 - type: ndcg_at_10 value: 69.298 - type: ndcg_at_100 value: 71.95899999999999 - type: ndcg_at_1000 value: 72.735 - type: ndcg_at_20 value: 70.50699999999999 - type: ndcg_at_3 value: 65.81700000000001 - type: ndcg_at_5 value: 67.681 - type: precision_at_1 value: 58.56400000000001 - type: precision_at_10 value: 8.039 - type: precision_at_100 value: 0.931 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.259 - type: precision_at_3 value: 23.65 
- type: precision_at_5 value: 15.09 - type: recall_at_1 value: 58.553 - type: recall_at_10 value: 80.368 - type: recall_at_100 value: 93.013 - type: recall_at_1000 value: 99.092 - type: recall_at_20 value: 85.143 - type: recall_at_3 value: 70.928 - type: recall_at_5 value: 75.42699999999999 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (eng-spa) type: facebook/mlqa config: eng-spa split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 66.374 - type: map_at_1 value: 55.494 - type: map_at_10 value: 62.763999999999996 - type: map_at_100 value: 63.33 - type: map_at_1000 value: 63.36000000000001 - type: map_at_20 value: 63.104000000000006 - type: map_at_3 value: 61.065000000000005 - type: map_at_5 value: 62.053000000000004 - type: mrr_at_1 value: 55.49419158255571 - type: mrr_at_10 value: 62.765195140457095 - type: mrr_at_100 value: 63.33083349354529 - type: mrr_at_1000 value: 63.3611897014839 - type: mrr_at_20 value: 63.10543590095977 - type: mrr_at_3 value: 61.06455913159412 - type: mrr_at_5 value: 62.052942296705474 - type: nauc_map_at_1000_diff1 value: 75.04200018088618 - type: nauc_map_at_1000_max value: 70.49937782771909 - type: nauc_map_at_1000_std value: -5.257206317083184 - type: nauc_map_at_100_diff1 value: 75.02786834256312 - type: nauc_map_at_100_max value: 70.5016476500189 - type: nauc_map_at_100_std value: -5.228770832077681 - type: nauc_map_at_10_diff1 value: 74.9626552701647 - type: nauc_map_at_10_max value: 70.56253732243214 - type: nauc_map_at_10_std value: -5.359037281768563 - type: nauc_map_at_1_diff1 value: 78.46858307815857 - type: nauc_map_at_1_max value: 69.03908373759435 - type: nauc_map_at_1_std value: -7.479412070736642 - type: nauc_map_at_20_diff1 value: 74.98121458084796 - type: nauc_map_at_20_max value: 70.51885366822565 - type: nauc_map_at_20_std value: -5.286051287133815 - type: nauc_map_at_3_diff1 value: 75.36078454383373 - type: nauc_map_at_3_max value: 70.34997144546014 - type: nauc_map_at_3_std value: -6.663517224039184 - type: nauc_map_at_5_diff1 value: 75.0274512828238 - type: nauc_map_at_5_max value: 70.45292551591874 - type: nauc_map_at_5_std value: -6.029224488640147 - type: nauc_mrr_at_1000_diff1 value: 75.04018768469983 - type: nauc_mrr_at_1000_max value: 70.49855509132635 - type: nauc_mrr_at_1000_std value: -5.258929961409948 - type: nauc_mrr_at_100_diff1 value: 75.02605732810112 - type: nauc_mrr_at_100_max value: 70.50082584929103 - type: nauc_mrr_at_100_std value: -5.2304917988542154 - type: nauc_mrr_at_10_diff1 value: 74.96079080525713 - type: nauc_mrr_at_10_max value: 70.56167294920391 - type: nauc_mrr_at_10_std value: -5.360650630655072 - type: nauc_mrr_at_1_diff1 value: 78.46858307815857 - type: nauc_mrr_at_1_max value: 69.03908373759435 - type: nauc_mrr_at_1_std value: -7.479412070736642 - type: nauc_mrr_at_20_diff1 value: 74.97939804960517 - type: nauc_mrr_at_20_max value: 70.51804078965411 - type: nauc_mrr_at_20_std value: -5.287681954889177 - type: nauc_mrr_at_3_diff1 value: 75.36078454383373 - type: nauc_mrr_at_3_max value: 70.34997144546014 - type: nauc_mrr_at_3_std value: -6.663517224039184 - type: nauc_mrr_at_5_diff1 value: 75.0274512828238 - type: nauc_mrr_at_5_max value: 70.45292551591874 - type: nauc_mrr_at_5_std value: -6.029224488640147 - type: nauc_ndcg_at_1000_diff1 value: 74.22106834748942 - type: nauc_ndcg_at_1000_max value: 70.93625922934912 - type: nauc_ndcg_at_1000_std value: -3.4878399005946017 - type: nauc_ndcg_at_100_diff1 value: 73.74068883646733 - type: 
nauc_ndcg_at_100_max value: 71.02357018347472 - type: nauc_ndcg_at_100_std value: -2.462293184201324 - type: nauc_ndcg_at_10_diff1 value: 73.40967965536565 - type: nauc_ndcg_at_10_max value: 71.29379828672067 - type: nauc_ndcg_at_10_std value: -3.295547756383108 - type: nauc_ndcg_at_1_diff1 value: 78.46858307815857 - type: nauc_ndcg_at_1_max value: 69.03908373759435 - type: nauc_ndcg_at_1_std value: -7.479412070736642 - type: nauc_ndcg_at_20_diff1 value: 73.45790057693699 - type: nauc_ndcg_at_20_max value: 71.16598432419126 - type: nauc_ndcg_at_20_std value: -2.962877157646097 - type: nauc_ndcg_at_3_diff1 value: 74.30696173964847 - type: nauc_ndcg_at_3_max value: 70.79878978459556 - type: nauc_ndcg_at_3_std value: -6.297286578628299 - type: nauc_ndcg_at_5_diff1 value: 73.65858211199816 - type: nauc_ndcg_at_5_max value: 71.01122417463776 - type: nauc_ndcg_at_5_std value: -5.075990882646765 - type: nauc_precision_at_1000_diff1 value: 68.71065091972568 - type: nauc_precision_at_1000_max value: 81.38173585624777 - type: nauc_precision_at_1000_std value: 58.035497889797895 - type: nauc_precision_at_100_diff1 value: 61.93634256957017 - type: nauc_precision_at_100_max value: 74.84191770203093 - type: nauc_precision_at_100_std value: 31.3325983123831 - type: nauc_precision_at_10_diff1 value: 66.68247010944937 - type: nauc_precision_at_10_max value: 74.48773524654571 - type: nauc_precision_at_10_std value: 6.560421880785153 - type: nauc_precision_at_1_diff1 value: 78.46858307815857 - type: nauc_precision_at_1_max value: 69.03908373759435 - type: nauc_precision_at_1_std value: -7.479412070736642 - type: nauc_precision_at_20_diff1 value: 65.51592872758067 - type: nauc_precision_at_20_max value: 74.50684066823096 - type: nauc_precision_at_20_std value: 10.830479877698208 - type: nauc_precision_at_3_diff1 value: 70.89587884861588 - type: nauc_precision_at_3_max value: 72.25310558370424 - type: nauc_precision_at_3_std value: -5.0796100900749765 - type: nauc_precision_at_5_diff1 value: 68.71885719845497 - type: nauc_precision_at_5_max value: 73.02601751485672 - type: nauc_precision_at_5_std value: -1.4382681421626857 - type: nauc_recall_at_1000_diff1 value: 71.95510299834734 - type: nauc_recall_at_1000_max value: 84.03647166092985 - type: nauc_recall_at_1000_std value: 56.87490604776847 - type: nauc_recall_at_100_diff1 value: 62.446624924715955 - type: nauc_recall_at_100_max value: 75.25666892464507 - type: nauc_recall_at_100_std value: 31.068789794554686 - type: nauc_recall_at_10_diff1 value: 66.70676336328988 - type: nauc_recall_at_10_max value: 74.4963699656397 - type: nauc_recall_at_10_std value: 6.57498399706916 - type: nauc_recall_at_1_diff1 value: 78.46858307815857 - type: nauc_recall_at_1_max value: 69.03908373759435 - type: nauc_recall_at_1_std value: -7.479412070736642 - type: nauc_recall_at_20_diff1 value: 65.54082767974772 - type: nauc_recall_at_20_max value: 74.5111529838772 - type: nauc_recall_at_20_std value: 10.84574829707354 - type: nauc_recall_at_3_diff1 value: 70.89587884861584 - type: nauc_recall_at_3_max value: 72.25310558370421 - type: nauc_recall_at_3_std value: -5.07961009007491 - type: nauc_recall_at_5_diff1 value: 68.71885719845501 - type: nauc_recall_at_5_max value: 73.02601751485666 - type: nauc_recall_at_5_std value: -1.4382681421626995 - type: ndcg_at_1 value: 55.494 - type: ndcg_at_10 value: 66.374 - type: ndcg_at_100 value: 69.254 - type: ndcg_at_1000 value: 70.136 - type: ndcg_at_20 value: 67.599 - type: ndcg_at_3 value: 62.863 - type: ndcg_at_5 value: 64.644 - type: 
precision_at_1 value: 55.494 - type: precision_at_10 value: 7.776 - type: precision_at_100 value: 0.9159999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.1290000000000004 - type: precision_at_3 value: 22.688 - type: precision_at_5 value: 14.477 - type: recall_at_1 value: 55.494 - type: recall_at_10 value: 77.747 - type: recall_at_100 value: 91.535 - type: recall_at_1000 value: 98.619 - type: recall_at_20 value: 82.565 - type: recall_at_3 value: 68.063 - type: recall_at_5 value: 72.386 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (eng-eng) type: facebook/mlqa config: eng-eng split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 64.723 - type: map_at_1 value: 54.308 - type: map_at_10 value: 61.26200000000001 - type: map_at_100 value: 61.82299999999999 - type: map_at_1000 value: 61.856 - type: map_at_20 value: 61.575 - type: map_at_3 value: 59.565 - type: map_at_5 value: 60.561 - type: mrr_at_1 value: 54.31704368848212 - type: mrr_at_10 value: 61.26520216098834 - type: mrr_at_100 value: 61.82588321127103 - type: mrr_at_1000 value: 61.859333030574334 - type: mrr_at_20 value: 61.57780339921337 - type: mrr_at_3 value: 59.569446842801646 - type: mrr_at_5 value: 60.56323029989004 - type: nauc_map_at_1000_diff1 value: 74.21413722468635 - type: nauc_map_at_1000_max value: 70.41741227882316 - type: nauc_map_at_1000_std value: -2.5438707209848506 - type: nauc_map_at_100_diff1 value: 74.19812315947975 - type: nauc_map_at_100_max value: 70.41589146728445 - type: nauc_map_at_100_std value: -2.5336117059429553 - type: nauc_map_at_10_diff1 value: 74.21810561152937 - type: nauc_map_at_10_max value: 70.48816115200171 - type: nauc_map_at_10_std value: -2.7443834681406734 - type: nauc_map_at_1_diff1 value: 77.69378738778958 - type: nauc_map_at_1_max value: 68.64652310701173 - type: nauc_map_at_1_std value: -4.667071946448379 - type: nauc_map_at_20_diff1 value: 74.16105697562438 - type: nauc_map_at_20_max value: 70.42491994631179 - type: nauc_map_at_20_std value: -2.6070416022440472 - type: nauc_map_at_3_diff1 value: 74.60449392878863 - type: nauc_map_at_3_max value: 70.39888609914269 - type: nauc_map_at_3_std value: -3.5401151125723986 - type: nauc_map_at_5_diff1 value: 74.2423420992663 - type: nauc_map_at_5_max value: 70.36574501826757 - type: nauc_map_at_5_std value: -3.2707393116898964 - type: nauc_mrr_at_1000_diff1 value: 74.21029843731323 - type: nauc_mrr_at_1000_max value: 70.43020492688913 - type: nauc_mrr_at_1000_std value: -2.526895582202081 - type: nauc_mrr_at_100_diff1 value: 74.19440960479243 - type: nauc_mrr_at_100_max value: 70.4288998824232 - type: nauc_mrr_at_100_std value: -2.5160929945118107 - type: nauc_mrr_at_10_diff1 value: 74.2141357266166 - type: nauc_mrr_at_10_max value: 70.5005683347807 - type: nauc_mrr_at_10_std value: -2.727154557882168 - type: nauc_mrr_at_1_diff1 value: 77.69891248239793 - type: nauc_mrr_at_1_max value: 68.68255231164922 - type: nauc_mrr_at_1_std value: -4.630226727154317 - type: nauc_mrr_at_20_diff1 value: 74.15705434409723 - type: nauc_mrr_at_20_max value: 70.43741835972747 - type: nauc_mrr_at_20_std value: -2.5896756472464495 - type: nauc_mrr_at_3_diff1 value: 74.5981844349412 - type: nauc_mrr_at_3_max value: 70.41834937080564 - type: nauc_mrr_at_3_std value: -3.5161656408031163 - type: nauc_mrr_at_5_diff1 value: 74.23847535424844 - type: nauc_mrr_at_5_max value: 70.37763810013656 - type: nauc_mrr_at_5_std value: -3.2560955164581733 - type: nauc_ndcg_at_1000_diff1 value: 
73.20994496725493 - type: nauc_ndcg_at_1000_max value: 70.8903016277125 - type: nauc_ndcg_at_1000_std value: -0.625772298462309 - type: nauc_ndcg_at_100_diff1 value: 72.6847141682645 - type: nauc_ndcg_at_100_max value: 70.86564422034162 - type: nauc_ndcg_at_100_std value: -0.07195786766326141 - type: nauc_ndcg_at_10_diff1 value: 72.78806493754281 - type: nauc_ndcg_at_10_max value: 71.21957067926769 - type: nauc_ndcg_at_10_std value: -1.2760418313382227 - type: nauc_ndcg_at_1_diff1 value: 77.69891248239793 - type: nauc_ndcg_at_1_max value: 68.68255231164922 - type: nauc_ndcg_at_1_std value: -4.630226727154317 - type: nauc_ndcg_at_20_diff1 value: 72.52082440882546 - type: nauc_ndcg_at_20_max value: 70.98185004796734 - type: nauc_ndcg_at_20_std value: -0.6908280874815464 - type: nauc_ndcg_at_3_diff1 value: 73.59870660843939 - type: nauc_ndcg_at_3_max value: 70.94391957288654 - type: nauc_ndcg_at_3_std value: -3.147723179140428 - type: nauc_ndcg_at_5_diff1 value: 72.90122868193457 - type: nauc_ndcg_at_5_max value: 70.89376368965165 - type: nauc_ndcg_at_5_std value: -2.6451807385626744 - type: nauc_precision_at_1000_diff1 value: 58.14737201864067 - type: nauc_precision_at_1000_max value: 78.79011251144826 - type: nauc_precision_at_1000_std value: 59.98985420476577 - type: nauc_precision_at_100_diff1 value: 59.21069121644552 - type: nauc_precision_at_100_max value: 73.00557835912306 - type: nauc_precision_at_100_std value: 26.85027406282173 - type: nauc_precision_at_10_diff1 value: 66.8760831023675 - type: nauc_precision_at_10_max value: 74.21167950452596 - type: nauc_precision_at_10_std value: 5.453652499335947 - type: nauc_precision_at_1_diff1 value: 77.69891248239793 - type: nauc_precision_at_1_max value: 68.68255231164922 - type: nauc_precision_at_1_std value: -4.630226727154317 - type: nauc_precision_at_20_diff1 value: 64.3118559132602 - type: nauc_precision_at_20_max value: 73.33078184673825 - type: nauc_precision_at_20_std value: 9.993299523049402 - type: nauc_precision_at_3_diff1 value: 70.38667185155593 - type: nauc_precision_at_3_max value: 72.66495006030951 - type: nauc_precision_at_3_std value: -1.8532839591326276 - type: nauc_precision_at_5_diff1 value: 68.12161337583686 - type: nauc_precision_at_5_max value: 72.65644960375046 - type: nauc_precision_at_5_std value: -0.33317164167012875 - type: nauc_recall_at_1000_diff1 value: 61.63204394739985 - type: nauc_recall_at_1000_max value: 81.77241537319897 - type: nauc_recall_at_1000_std value: 58.44841544062308 - type: nauc_recall_at_100_diff1 value: 59.72072697224705 - type: nauc_recall_at_100_max value: 73.28519507061553 - type: nauc_recall_at_100_std value: 26.27318390763456 - type: nauc_recall_at_10_diff1 value: 66.9757135465418 - type: nauc_recall_at_10_max value: 74.21919493374149 - type: nauc_recall_at_10_std value: 5.323369605377166 - type: nauc_recall_at_1_diff1 value: 77.69378738778958 - type: nauc_recall_at_1_max value: 68.64652310701173 - type: nauc_recall_at_1_std value: -4.667071946448379 - type: nauc_recall_at_20_diff1 value: 64.42290081731899 - type: nauc_recall_at_20_max value: 73.3358289439033 - type: nauc_recall_at_20_std value: 9.846598361586073 - type: nauc_recall_at_3_diff1 value: 70.41211290964785 - type: nauc_recall_at_3_max value: 72.64451776775402 - type: nauc_recall_at_3_std value: -1.916280959835826 - type: nauc_recall_at_5_diff1 value: 68.20695272727916 - type: nauc_recall_at_5_max value: 72.66404224006101 - type: nauc_recall_at_5_std value: -0.431125323007886 - type: ndcg_at_1 value: 54.31700000000001 - type: 
ndcg_at_10 value: 64.723 - type: ndcg_at_100 value: 67.648 - type: ndcg_at_1000 value: 68.619 - type: ndcg_at_20 value: 65.85499999999999 - type: ndcg_at_3 value: 61.244 - type: ndcg_at_5 value: 63.038000000000004 - type: precision_at_1 value: 54.31700000000001 - type: precision_at_10 value: 7.564 - type: precision_at_100 value: 0.898 - type: precision_at_1000 value: 0.098 - type: precision_at_20 value: 4.005 - type: precision_at_3 value: 22.034000000000002 - type: precision_at_5 value: 14.093 - type: recall_at_1 value: 54.308 - type: recall_at_10 value: 75.622 - type: recall_at_100 value: 89.744 - type: recall_at_1000 value: 97.539 - type: recall_at_20 value: 80.085 - type: recall_at_3 value: 66.09 - type: recall_at_5 value: 70.446 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (de) type: reciTAL/mlsum config: de split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 41.267647761702854 - type: v_measure value: 41.267647761702854 - type: v_measure_std value: 10.93390895077248 - type: main_score value: 40.07927325071353 - type: v_measure value: 40.07927325071353 - type: v_measure_std value: 9.296680835266145 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (fr) type: reciTAL/mlsum config: fr split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 44.68714862333979 - type: v_measure value: 44.68714862333979 - type: v_measure_std value: 1.811036989797814 - type: main_score value: 44.88484854069901 - type: v_measure value: 44.88484854069901 - type: v_measure_std value: 2.3704247819781843 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (ru) type: reciTAL/mlsum config: ru split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 41.92518785753813 - type: v_measure value: 41.92518785753813 - type: v_measure_std value: 5.9356661900220775 - type: main_score value: 43.97657450929179 - type: v_measure value: 43.97657450929179 - type: v_measure_std value: 6.087547931333613 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (es) type: reciTAL/mlsum config: es split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 48.69875719812033 - type: v_measure value: 48.69875719812033 - type: v_measure_std value: 1.204253881950113 - type: main_score value: 48.41108671948728 - type: v_measure value: 48.41108671948728 - type: v_measure_std value: 1.3848320630151243 - task: type: Reranking dataset: name: MTEB MMarcoReranking (default) type: C-MTEB/Mmarco-reranking config: default split: dev revision: 8e0c766dbe9e16e1d221116a3f36795fbade07f6 metrics: - type: map value: 21.050447576170395 - type: mrr value: 20.201984126984126 - type: main_score value: 21.050447576170395 - task: type: Retrieval dataset: name: MTEB MMarcoRetrieval (default) type: C-MTEB/MMarcoRetrieval config: default split: dev revision: 539bbde593d947e2a124ba72651aafc09eb33fc2 metrics: - type: main_score value: 79.687 - type: map_at_1 value: 66.872 - type: map_at_10 value: 75.949 - type: map_at_100 value: 76.25 - type: map_at_1000 value: 76.259 - type: map_at_20 value: 76.145 - type: map_at_3 value: 74.01299999999999 - type: map_at_5 value: 75.232 - type: mrr_at_1 value: 69.18338108882521 - type: mrr_at_10 value: 76.5424227952881 - type: mrr_at_100 value: 76.8019342792628 - type: mrr_at_1000 value: 76.81002278342808 - type: mrr_at_20 value: 76.7115234815896 - type: mrr_at_3 value: 74.83046800382044 - type: mrr_at_5 value: 
75.88490926456515 - type: nauc_map_at_1000_diff1 value: 78.06933310424179 - type: nauc_map_at_1000_max value: 49.392948209665896 - type: nauc_map_at_1000_std value: -15.126109322591166 - type: nauc_map_at_100_diff1 value: 78.06612779298378 - type: nauc_map_at_100_max value: 49.40761618630397 - type: nauc_map_at_100_std value: -15.099282408159349 - type: nauc_map_at_10_diff1 value: 77.94565685470538 - type: nauc_map_at_10_max value: 49.50559610363201 - type: nauc_map_at_10_std value: -15.182130695916355 - type: nauc_map_at_1_diff1 value: 79.84814509858211 - type: nauc_map_at_1_max value: 40.78978466656547 - type: nauc_map_at_1_std value: -19.96189264026715 - type: nauc_map_at_20_diff1 value: 78.03597839981245 - type: nauc_map_at_20_max value: 49.49477427223376 - type: nauc_map_at_20_std value: -15.084990000838378 - type: nauc_map_at_3_diff1 value: 78.0637014655507 - type: nauc_map_at_3_max value: 48.63214001973341 - type: nauc_map_at_3_std value: -17.093950563306596 - type: nauc_map_at_5_diff1 value: 77.94068229240348 - type: nauc_map_at_5_max value: 49.38930719689204 - type: nauc_map_at_5_std value: -15.9919454201954 - type: nauc_mrr_at_1000_diff1 value: 78.34582398092816 - type: nauc_mrr_at_1000_max value: 49.623566992784156 - type: nauc_mrr_at_1000_std value: -14.381347765493265 - type: nauc_mrr_at_100_diff1 value: 78.3429966714221 - type: nauc_mrr_at_100_max value: 49.63684922240546 - type: nauc_mrr_at_100_std value: -14.354914066301236 - type: nauc_mrr_at_10_diff1 value: 78.2208070219624 - type: nauc_mrr_at_10_max value: 49.77720536573364 - type: nauc_mrr_at_10_std value: -14.316233764741812 - type: nauc_mrr_at_1_diff1 value: 80.22305496572142 - type: nauc_mrr_at_1_max value: 44.30231210192536 - type: nauc_mrr_at_1_std value: -18.942549914934492 - type: nauc_mrr_at_20_diff1 value: 78.31006724240147 - type: nauc_mrr_at_20_max value: 49.72338465276142 - type: nauc_mrr_at_20_std value: -14.30722621948953 - type: nauc_mrr_at_3_diff1 value: 78.39832634634523 - type: nauc_mrr_at_3_max value: 49.24985961036677 - type: nauc_mrr_at_3_std value: -15.966286866763191 - type: nauc_mrr_at_5_diff1 value: 78.2406507247798 - type: nauc_mrr_at_5_max value: 49.71276359754787 - type: nauc_mrr_at_5_std value: -14.979526226149698 - type: nauc_ndcg_at_1000_diff1 value: 77.74892471071016 - type: nauc_ndcg_at_1000_max value: 51.11543344053061 - type: nauc_ndcg_at_1000_std value: -12.208878737005096 - type: nauc_ndcg_at_100_diff1 value: 77.67462502211228 - type: nauc_ndcg_at_100_max value: 51.593977338939034 - type: nauc_ndcg_at_100_std value: -11.312126179513802 - type: nauc_ndcg_at_10_diff1 value: 77.0571291760012 - type: nauc_ndcg_at_10_max value: 52.35435572808972 - type: nauc_ndcg_at_10_std value: -11.33242546164059 - type: nauc_ndcg_at_1_diff1 value: 80.22305496572142 - type: nauc_ndcg_at_1_max value: 44.30231210192536 - type: nauc_ndcg_at_1_std value: -18.942549914934492 - type: nauc_ndcg_at_20_diff1 value: 77.4141216117471 - type: nauc_ndcg_at_20_max value: 52.340600871365375 - type: nauc_ndcg_at_20_std value: -10.989010161550912 - type: nauc_ndcg_at_3_diff1 value: 77.43971989259062 - type: nauc_ndcg_at_3_max value: 50.59251358320663 - type: nauc_ndcg_at_3_std value: -15.59337960636058 - type: nauc_ndcg_at_5_diff1 value: 77.12174287031847 - type: nauc_ndcg_at_5_max value: 51.97108510288907 - type: nauc_ndcg_at_5_std value: -13.474902612427167 - type: nauc_precision_at_1000_diff1 value: -19.36793534929367 - type: nauc_precision_at_1000_max value: 11.803383262344036 - type: nauc_precision_at_1000_std 
value: 24.304436015177046 - type: nauc_precision_at_100_diff1 value: -6.273790806909921 - type: nauc_precision_at_100_max value: 23.372606271300747 - type: nauc_precision_at_100_std value: 29.085768971612342 - type: nauc_precision_at_10_diff1 value: 21.67045907336595 - type: nauc_precision_at_10_max value: 41.68948432407223 - type: nauc_precision_at_10_std value: 17.837055074458092 - type: nauc_precision_at_1_diff1 value: 80.22305496572142 - type: nauc_precision_at_1_max value: 44.30231210192536 - type: nauc_precision_at_1_std value: -18.942549914934492 - type: nauc_precision_at_20_diff1 value: 12.577671896684803 - type: nauc_precision_at_20_max value: 37.44944702246691 - type: nauc_precision_at_20_std value: 23.635897665206087 - type: nauc_precision_at_3_diff1 value: 47.165335112814056 - type: nauc_precision_at_3_max value: 47.0458691263379 - type: nauc_precision_at_3_std value: -3.3181861146890217 - type: nauc_precision_at_5_diff1 value: 35.406205343514806 - type: nauc_precision_at_5_max value: 45.56549449285401 - type: nauc_precision_at_5_std value: 5.612378074562386 - type: nauc_recall_at_1000_diff1 value: 72.32762520815842 - type: nauc_recall_at_1000_max value: 85.64979256307343 - type: nauc_recall_at_1000_std value: 73.61925297037476 - type: nauc_recall_at_100_diff1 value: 72.31946328709962 - type: nauc_recall_at_100_max value: 83.76576070068353 - type: nauc_recall_at_100_std value: 57.39376538662535 - type: nauc_recall_at_10_diff1 value: 69.51307788072499 - type: nauc_recall_at_10_max value: 69.60124733654142 - type: nauc_recall_at_10_std value: 13.483540424716892 - type: nauc_recall_at_1_diff1 value: 79.84814509858211 - type: nauc_recall_at_1_max value: 40.78978466656547 - type: nauc_recall_at_1_std value: -19.96189264026715 - type: nauc_recall_at_20_diff1 value: 70.92168324710599 - type: nauc_recall_at_20_max value: 76.09106252420084 - type: nauc_recall_at_20_std value: 25.406842300761447 - type: nauc_recall_at_3_diff1 value: 74.1212680517145 - type: nauc_recall_at_3_max value: 56.24921832879403 - type: nauc_recall_at_3_std value: -11.55542913578436 - type: nauc_recall_at_5_diff1 value: 72.31262959872993 - type: nauc_recall_at_5_max value: 62.761214896697915 - type: nauc_recall_at_5_std value: -3.280167584070396 - type: ndcg_at_1 value: 69.18299999999999 - type: ndcg_at_10 value: 79.687 - type: ndcg_at_100 value: 81.062 - type: ndcg_at_1000 value: 81.312 - type: ndcg_at_20 value: 80.34599999999999 - type: ndcg_at_3 value: 75.98700000000001 - type: ndcg_at_5 value: 78.039 - type: precision_at_1 value: 69.18299999999999 - type: precision_at_10 value: 9.636 - type: precision_at_100 value: 1.0330000000000001 - type: precision_at_1000 value: 0.105 - type: precision_at_20 value: 4.958 - type: precision_at_3 value: 28.515 - type: precision_at_5 value: 18.201 - type: recall_at_1 value: 66.872 - type: recall_at_10 value: 90.688 - type: recall_at_100 value: 96.99 - type: recall_at_1000 value: 98.958 - type: recall_at_20 value: 93.21199999999999 - type: recall_at_3 value: 80.84599999999999 - type: recall_at_5 value: 85.732 - task: type: Retrieval dataset: name: MTEB MSMARCO (default) type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 21.861 - type: map_at_10 value: 34.008 - type: map_at_100 value: 35.174 - type: map_at_1000 value: 35.224 - type: map_at_20 value: 34.705999999999996 - type: map_at_3 value: 30.209000000000003 - type: map_at_5 value: 32.351 - type: mrr_at_1 value: 22.493 - type: mrr_at_10 value: 
34.583999999999996 - type: mrr_at_100 value: 35.691 - type: mrr_at_1000 value: 35.736000000000004 - type: mrr_at_20 value: 35.257 - type: mrr_at_3 value: 30.85 - type: mrr_at_5 value: 32.962 - type: ndcg_at_1 value: 22.493 - type: ndcg_at_10 value: 40.815 - type: ndcg_at_100 value: 46.483999999999995 - type: ndcg_at_1000 value: 47.73 - type: ndcg_at_20 value: 43.302 - type: ndcg_at_3 value: 33.056000000000004 - type: ndcg_at_5 value: 36.879 - type: precision_at_1 value: 22.493 - type: precision_at_10 value: 6.465999999999999 - type: precision_at_100 value: 0.932 - type: precision_at_1000 value: 0.104 - type: precision_at_20 value: 3.752 - type: precision_at_3 value: 14.069 - type: precision_at_5 value: 10.384 - type: recall_at_1 value: 21.861 - type: recall_at_10 value: 61.781 - type: recall_at_100 value: 88.095 - type: recall_at_1000 value: 97.625 - type: recall_at_20 value: 71.44500000000001 - type: recall_at_3 value: 40.653 - type: recall_at_5 value: 49.841 - type: main_score value: 40.815 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 97.4874601003192 - type: f1 value: 97.19067544931094 - type: f1_weighted value: 97.49331776181019 - type: main_score value: 97.4874601003192 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (de) type: mteb/mtop_domain config: de split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 96.89489997182305 - type: f1 value: 96.51138586512977 - type: f1_weighted value: 96.89723065967186 - type: main_score value: 96.89489997182305 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (es) type: mteb/mtop_domain config: es split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 97.17144763175452 - type: f1 value: 96.81785681878274 - type: f1_weighted value: 97.1778974586874 - type: main_score value: 97.17144763175452 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 96.30128405887879 - type: f1 value: 95.94555923088487 - type: f1_weighted value: 96.30399416794926 - type: main_score value: 96.30128405887879 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 84.53488372093022 - type: f1 value: 61.77995074251401 - type: f1_weighted value: 86.8005170485101 - type: main_score value: 84.53488372093022 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (de) type: mteb/mtop_intent config: de split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 80.79459002535924 - type: f1 value: 56.08938302001448 - type: f1_weighted value: 83.66582131948252 - type: main_score value: 80.79459002535924 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (es) type: mteb/mtop_intent config: es split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 84.7765176784523 - type: f1 value: 61.39860057885528 - type: f1_weighted value: 86.94881745670745 - type: main_score value: 84.7765176784523 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: 
mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 82.2079549013467 - type: f1 value: 59.90260478749016 - type: f1_weighted value: 84.36861708593257 - type: main_score value: 82.2079549013467 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (eng) type: mteb/masakhanews config: eng split: test revision: 18193f187b92da67168c655c9973a165ed9593dd metrics: - type: accuracy value: 74.98945147679325 - type: f1 value: 74.3157483560261 - type: f1_weighted value: 75.01179008904884 - type: main_score value: 74.98945147679325 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (fra) type: mteb/masakhanews config: fra split: test revision: 18193f187b92da67168c655c9973a165ed9593dd metrics: - type: accuracy value: 74.02843601895735 - type: f1 value: 70.40326349620732 - type: f1_weighted value: 74.6596277063484 - type: main_score value: 74.02843601895735 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (amh) type: masakhane/masakhanews config: amh split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 69.45780291725053 - type: v_measure value: 69.45780291725053 - type: v_measure_std value: 36.54340055904091 - type: main_score value: 60.95132147787602 - type: v_measure value: 60.95132147787602 - type: v_measure_std value: 37.330148394033365 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (eng) type: masakhane/masakhanews config: eng split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 64.88996119332239 - type: v_measure value: 64.88996119332239 - type: v_measure_std value: 30.017223408197268 - type: main_score value: 60.974810831426595 - type: v_measure value: 60.974810831426595 - type: v_measure_std value: 24.934675467507827 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 42.362383958691666 - type: v_measure value: 42.362383958691666 - type: v_measure_std value: 37.61076788039063 - type: main_score value: 44.479206673553335 - type: v_measure value: 44.479206673553335 - type: v_measure_std value: 32.58254804499339 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (hau) type: masakhane/masakhanews config: hau split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 43.29201252405562 - type: v_measure value: 43.29201252405562 - type: v_measure_std value: 34.31987945146255 - type: main_score value: 26.4742082741682 - type: v_measure value: 26.4742082741682 - type: v_measure_std value: 22.344929192323097 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (ibo) type: masakhane/masakhanews config: ibo split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 33.59926542995238 - type: v_measure value: 33.59926542995238 - type: v_measure_std value: 35.70048601084112 - type: main_score value: 38.906129911741985 - type: v_measure value: 38.906129911741985 - type: v_measure_std value: 34.785601792668444 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (lin) type: masakhane/masakhanews config: lin split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 67.58487601893106 - type: v_measure value: 67.58487601893106 - type: 
v_measure_std value: 35.16784970777931 - type: main_score value: 62.60982020876592 - type: v_measure value: 62.60982020876592 - type: v_measure_std value: 40.7368955715045 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (lug) type: masakhane/masakhanews config: lug split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 50.01220872023533 - type: v_measure value: 50.01220872023533 - type: v_measure_std value: 41.87411574676182 - type: main_score value: 42.70424106365967 - type: v_measure value: 42.70424106365967 - type: v_measure_std value: 46.80946241135087 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (orm) type: masakhane/masakhanews config: orm split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 29.007847502598317 - type: v_measure value: 29.007847502598317 - type: v_measure_std value: 38.374997395079994 - type: main_score value: 28.609942199922322 - type: v_measure value: 28.609942199922322 - type: v_measure_std value: 38.46685040191088 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (pcm) type: masakhane/masakhanews config: pcm split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 79.13520228554611 - type: v_measure value: 79.13520228554611 - type: v_measure_std value: 18.501843848275183 - type: main_score value: 76.83901348810822 - type: v_measure value: 76.83901348810822 - type: v_measure_std value: 17.57617141269189 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (run) type: masakhane/masakhanews config: run split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 60.317213909746656 - type: v_measure value: 60.317213909746656 - type: v_measure_std value: 36.500281823747386 - type: main_score value: 46.89757547846193 - type: v_measure value: 46.89757547846193 - type: v_measure_std value: 44.58903590203438 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (sna) type: masakhane/masakhanews config: sna split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 59.395277358240946 - type: v_measure value: 59.395277358240946 - type: v_measure_std value: 37.500916816164654 - type: main_score value: 55.37185207068829 - type: v_measure value: 55.37185207068829 - type: v_measure_std value: 36.944574863543004 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (som) type: masakhane/masakhanews config: som split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 38.18638688704302 - type: v_measure value: 38.18638688704302 - type: v_measure_std value: 35.453681137564466 - type: main_score value: 37.44211021681754 - type: v_measure value: 37.44211021681754 - type: v_measure_std value: 33.41469994463241 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (swa) type: masakhane/masakhanews config: swa split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 29.49230755729658 - type: v_measure value: 29.49230755729658 - type: v_measure_std value: 28.284313285264645 - type: main_score value: 26.020680621216062 - type: v_measure value: 26.020680621216062 - type: v_measure_std value: 25.480037522570413 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (tir) type: masakhane/masakhanews config: tir split: test revision: 
8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 60.632258622750115 - type: v_measure value: 60.632258622750115 - type: v_measure_std value: 34.429711214740564 - type: main_score value: 63.74306846771303 - type: v_measure value: 63.74306846771303 - type: v_measure_std value: 32.19119631078685 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (xho) type: masakhane/masakhanews config: xho split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 41.76322918806381 - type: v_measure value: 41.76322918806381 - type: v_measure_std value: 36.43245296200775 - type: main_score value: 24.580890519243777 - type: v_measure value: 24.580890519243777 - type: v_measure_std value: 37.941836363967106 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (yor) type: masakhane/masakhanews config: yor split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 33.17083910808645 - type: v_measure value: 33.17083910808645 - type: v_measure_std value: 34.87547994284835 - type: main_score value: 43.63458888828314 - type: v_measure value: 43.63458888828314 - type: v_measure_std value: 31.28169350649098 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 75.37323470073974 - type: f1 value: 71.1836877753734 - type: f1_weighted value: 75.72073213955457 - type: main_score value: 75.37323470073974 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (de) type: mteb/amazon_massive_intent config: de split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 74.83523873570948 - type: f1 value: 70.72375821116886 - type: f1_weighted value: 75.20800490010755 - type: main_score value: 74.83523873570948 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (es) type: mteb/amazon_massive_intent config: es split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 75.31607262945528 - type: f1 value: 72.06063554897662 - type: f1_weighted value: 75.72438161355252 - type: main_score value: 75.31607262945528 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ru) type: mteb/amazon_massive_intent config: ru split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 76.7955615332885 - type: f1 value: 73.08099648499756 - type: f1_weighted value: 77.18482068239668 - type: main_score value: 76.7955615332885 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 77.60591795561534 - type: f1 value: 74.46676705370395 - type: f1_weighted value: 77.69888062336614 - type: main_score value: 77.60591795561534 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 76.32145258910558 - type: f1 value: 72.89824154178328 - type: f1_weighted value: 76.6539327979472 - type: main_score value: 76.32145258910558 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN 
split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 73.21788836583724 - type: f1 value: 70.45594512246377 - type: f1_weighted value: 73.67862536499393 - type: main_score value: 73.21788836583724 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 80.82044384667114 - type: f1 value: 80.53217664465089 - type: f1_weighted value: 80.94535087010512 - type: main_score value: 80.82044384667114 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 82.1049092131809 - type: f1 value: 81.55343463694733 - type: f1_weighted value: 82.33509098770782 - type: main_score value: 82.1049092131809 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (es) type: mteb/amazon_massive_scenario config: es split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 82.58238063214526 - type: f1 value: 82.27974449333072 - type: f1_weighted value: 82.81337569618209 - type: main_score value: 82.58238063214526 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (de) type: mteb/amazon_massive_scenario config: de split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 83.97108271687962 - type: f1 value: 83.56285606936076 - type: f1_weighted value: 84.10198745390771 - type: main_score value: 83.97108271687962 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 84.71082716879623 - type: f1 value: 84.09447062371402 - type: f1_weighted value: 84.73765765551342 - type: main_score value: 84.71082716879623 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fr) type: mteb/amazon_massive_scenario config: fr split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 83.093476798924 - type: f1 value: 82.72656900752943 - type: f1_weighted value: 83.26606516503364 - type: main_score value: 83.093476798924 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ru) type: mteb/amazon_massive_scenario config: ru split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 84.05850706119705 - type: f1 value: 83.64234048881222 - type: f1_weighted value: 84.17315768381876 - type: main_score value: 84.05850706119705 - task: type: Retrieval dataset: name: MTEB MedicalRetrieval (default) type: C-MTEB/MedicalRetrieval config: default split: dev revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6 metrics: - type: main_score value: 56.635999999999996 - type: map_at_1 value: 48.699999999999996 - type: map_at_10 value: 53.991 - type: map_at_100 value: 54.449999999999996 - type: map_at_1000 value: 54.515 - type: map_at_20 value: 54.212 - type: map_at_3 value: 52.833 - type: map_at_5 value: 53.503 - type: mrr_at_1 value: 48.699999999999996 - type: mrr_at_10 value: 53.991309523809505 - type: mrr_at_100 value: 54.45008993448266 - type: mrr_at_1000 value: 54.515253990549795 - type: mrr_at_20 value: 54.21201762247036 - type: mrr_at_3 value: 
52.8333333333333 - type: mrr_at_5 value: 53.50333333333328 - type: nauc_map_at_1000_diff1 value: 79.96867989401643 - type: nauc_map_at_1000_max value: 69.75230895599029 - type: nauc_map_at_1000_std value: 2.6418738289740213 - type: nauc_map_at_100_diff1 value: 79.95343709599133 - type: nauc_map_at_100_max value: 69.751282671507 - type: nauc_map_at_100_std value: 2.621719966106279 - type: nauc_map_at_10_diff1 value: 80.02875864565634 - type: nauc_map_at_10_max value: 69.80948662290187 - type: nauc_map_at_10_std value: 2.329151604733765 - type: nauc_map_at_1_diff1 value: 83.616940281383 - type: nauc_map_at_1_max value: 69.08142651929452 - type: nauc_map_at_1_std value: 1.9687791394035643 - type: nauc_map_at_20_diff1 value: 79.95555601275339 - type: nauc_map_at_20_max value: 69.76604695002925 - type: nauc_map_at_20_std value: 2.556184141901367 - type: nauc_map_at_3_diff1 value: 80.74790131023668 - type: nauc_map_at_3_max value: 70.57797991892402 - type: nauc_map_at_3_std value: 2.7115149849964117 - type: nauc_map_at_5_diff1 value: 80.31796539878381 - type: nauc_map_at_5_max value: 69.93573796420061 - type: nauc_map_at_5_std value: 2.0731614029506606 - type: nauc_mrr_at_1000_diff1 value: 79.96867999907981 - type: nauc_mrr_at_1000_max value: 69.57395578976896 - type: nauc_mrr_at_1000_std value: 2.46351945887829 - type: nauc_mrr_at_100_diff1 value: 79.95343709599133 - type: nauc_mrr_at_100_max value: 69.57322054130803 - type: nauc_mrr_at_100_std value: 2.4436578359073433 - type: nauc_mrr_at_10_diff1 value: 80.02875864565634 - type: nauc_mrr_at_10_max value: 69.63292630937411 - type: nauc_mrr_at_10_std value: 2.1525912912060012 - type: nauc_mrr_at_1_diff1 value: 83.616940281383 - type: nauc_mrr_at_1_max value: 68.74717310480305 - type: nauc_mrr_at_1_std value: 1.6345257249120868 - type: nauc_mrr_at_20_diff1 value: 79.95555601275339 - type: nauc_mrr_at_20_max value: 69.58883608470444 - type: nauc_mrr_at_20_std value: 2.378973276576547 - type: nauc_mrr_at_3_diff1 value: 80.74790131023668 - type: nauc_mrr_at_3_max value: 70.40430475488604 - type: nauc_mrr_at_3_std value: 2.5378398209583817 - type: nauc_mrr_at_5_diff1 value: 80.31796539878381 - type: nauc_mrr_at_5_max value: 69.7605991748183 - type: nauc_mrr_at_5_std value: 1.898022613568352 - type: nauc_ndcg_at_1000_diff1 value: 78.35504059321225 - type: nauc_ndcg_at_1000_max value: 69.06752522437093 - type: nauc_ndcg_at_1000_std value: 3.9624036886099265 - type: nauc_ndcg_at_100_diff1 value: 77.79729140249833 - type: nauc_ndcg_at_100_max value: 68.93113791506029 - type: nauc_ndcg_at_100_std value: 3.642178826886181 - type: nauc_ndcg_at_10_diff1 value: 78.160158293918 - type: nauc_ndcg_at_10_max value: 69.28122202281361 - type: nauc_ndcg_at_10_std value: 2.438976810940962 - type: nauc_ndcg_at_1_diff1 value: 83.616940281383 - type: nauc_ndcg_at_1_max value: 69.08142651929452 - type: nauc_ndcg_at_1_std value: 1.9687791394035643 - type: nauc_ndcg_at_20_diff1 value: 77.88514432874997 - type: nauc_ndcg_at_20_max value: 69.06148818508873 - type: nauc_ndcg_at_20_std value: 3.1800249272363676 - type: nauc_ndcg_at_3_diff1 value: 79.73510384405803 - type: nauc_ndcg_at_3_max value: 70.78000695123832 - type: nauc_ndcg_at_3_std value: 2.9041415468363274 - type: nauc_ndcg_at_5_diff1 value: 78.91872808866195 - type: nauc_ndcg_at_5_max value: 69.61478429620091 - type: nauc_ndcg_at_5_std value: 1.734699636301054 - type: nauc_precision_at_1000_diff1 value: 66.37858395390673 - type: nauc_precision_at_1000_max value: 60.651659037598534 - type: 
nauc_precision_at_1000_std value: 27.388353715469798 - type: nauc_precision_at_100_diff1 value: 66.34325807776025 - type: nauc_precision_at_100_max value: 63.63855305621111 - type: nauc_precision_at_100_std value: 10.641748149575351 - type: nauc_precision_at_10_diff1 value: 71.3784685491089 - type: nauc_precision_at_10_max value: 67.05313695174542 - type: nauc_precision_at_10_std value: 3.000406867930561 - type: nauc_precision_at_1_diff1 value: 83.616940281383 - type: nauc_precision_at_1_max value: 69.08142651929452 - type: nauc_precision_at_1_std value: 1.9687791394035643 - type: nauc_precision_at_20_diff1 value: 69.73407910977694 - type: nauc_precision_at_20_max value: 65.77426240320742 - type: nauc_precision_at_20_std value: 6.204416838482586 - type: nauc_precision_at_3_diff1 value: 76.63737537643107 - type: nauc_precision_at_3_max value: 71.29710200719668 - type: nauc_precision_at_3_std value: 3.47180961484546 - type: nauc_precision_at_5_diff1 value: 74.36945983536717 - type: nauc_precision_at_5_max value: 68.33292218003061 - type: nauc_precision_at_5_std value: 0.47128762620258075 - type: nauc_recall_at_1000_diff1 value: 66.37858395390681 - type: nauc_recall_at_1000_max value: 60.65165903759889 - type: nauc_recall_at_1000_std value: 27.388353715469822 - type: nauc_recall_at_100_diff1 value: 66.34325807776025 - type: nauc_recall_at_100_max value: 63.63855305621116 - type: nauc_recall_at_100_std value: 10.641748149575351 - type: nauc_recall_at_10_diff1 value: 71.37846854910892 - type: nauc_recall_at_10_max value: 67.05313695174546 - type: nauc_recall_at_10_std value: 3.000406867930663 - type: nauc_recall_at_1_diff1 value: 83.616940281383 - type: nauc_recall_at_1_max value: 69.08142651929452 - type: nauc_recall_at_1_std value: 1.9687791394035643 - type: nauc_recall_at_20_diff1 value: 69.73407910977691 - type: nauc_recall_at_20_max value: 65.77426240320746 - type: nauc_recall_at_20_std value: 6.204416838482536 - type: nauc_recall_at_3_diff1 value: 76.63737537643112 - type: nauc_recall_at_3_max value: 71.29710200719668 - type: nauc_recall_at_3_std value: 3.471809614845442 - type: nauc_recall_at_5_diff1 value: 74.36945983536715 - type: nauc_recall_at_5_max value: 68.33292218003065 - type: nauc_recall_at_5_std value: 0.4712876262026442 - type: ndcg_at_1 value: 48.699999999999996 - type: ndcg_at_10 value: 56.635999999999996 - type: ndcg_at_100 value: 59.193 - type: ndcg_at_1000 value: 60.97 - type: ndcg_at_20 value: 57.426 - type: ndcg_at_3 value: 54.186 - type: ndcg_at_5 value: 55.407 - type: precision_at_1 value: 48.699999999999996 - type: precision_at_10 value: 6.5 - type: precision_at_100 value: 0.777 - type: precision_at_1000 value: 0.092 - type: precision_at_20 value: 3.405 - type: precision_at_3 value: 19.367 - type: precision_at_5 value: 12.22 - type: recall_at_1 value: 48.699999999999996 - type: recall_at_10 value: 65.0 - type: recall_at_100 value: 77.7 - type: recall_at_1000 value: 91.8 - type: recall_at_20 value: 68.10000000000001 - type: recall_at_3 value: 58.099999999999994 - type: recall_at_5 value: 61.1 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P (default) type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: main_score value: 34.80188561439236 - type: v_measure value: 34.80188561439236 - type: v_measure_std value: 1.5703148841573102 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S (default) type: mteb/medrxiv-clustering-s2s config: default split: test revision: 
35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: main_score value: 32.42285513996236 - type: v_measure value: 32.42285513996236 - type: v_measure_std value: 1.3769867487457566 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (de) type: jinaai/mintakaqa config: de split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: main_score value: 27.025 - type: map_at_1 value: 14.532 - type: map_at_10 value: 22.612 - type: map_at_100 value: 23.802 - type: map_at_1000 value: 23.9 - type: map_at_20 value: 23.275000000000002 - type: map_at_3 value: 20.226 - type: map_at_5 value: 21.490000000000002 - type: mrr_at_1 value: 14.532434709351305 - type: mrr_at_10 value: 22.612077265615575 - type: mrr_at_100 value: 23.801523356874675 - type: mrr_at_1000 value: 23.900118499340238 - type: mrr_at_20 value: 23.275466430108995 - type: mrr_at_3 value: 20.22606009547877 - type: mrr_at_5 value: 21.489750070204945 - type: nauc_map_at_1000_diff1 value: 14.148987799763596 - type: nauc_map_at_1000_max value: 44.70338461387784 - type: nauc_map_at_1000_std value: 15.868006767707637 - type: nauc_map_at_100_diff1 value: 14.11371769080442 - type: nauc_map_at_100_max value: 44.67995540936296 - type: nauc_map_at_100_std value: 15.890796502029076 - type: nauc_map_at_10_diff1 value: 14.29066834165688 - type: nauc_map_at_10_max value: 45.10997111765282 - type: nauc_map_at_10_std value: 15.508568918629864 - type: nauc_map_at_1_diff1 value: 23.473291302576396 - type: nauc_map_at_1_max value: 44.68942599764586 - type: nauc_map_at_1_std value: 12.424377262427253 - type: nauc_map_at_20_diff1 value: 14.112652046087831 - type: nauc_map_at_20_max value: 44.82014861413682 - type: nauc_map_at_20_std value: 15.739350613646385 - type: nauc_map_at_3_diff1 value: 16.119659221396347 - type: nauc_map_at_3_max value: 46.04766378953525 - type: nauc_map_at_3_std value: 13.969878046315925 - type: nauc_map_at_5_diff1 value: 15.095453434076184 - type: nauc_map_at_5_max value: 45.802128149314406 - type: nauc_map_at_5_std value: 14.957442173319949 - type: nauc_mrr_at_1000_diff1 value: 14.148987799763596 - type: nauc_mrr_at_1000_max value: 44.70338461387784 - type: nauc_mrr_at_1000_std value: 15.868006767707637 - type: nauc_mrr_at_100_diff1 value: 14.11371769080442 - type: nauc_mrr_at_100_max value: 44.67995540936296 - type: nauc_mrr_at_100_std value: 15.890796502029076 - type: nauc_mrr_at_10_diff1 value: 14.29066834165688 - type: nauc_mrr_at_10_max value: 45.10997111765282 - type: nauc_mrr_at_10_std value: 15.508568918629864 - type: nauc_mrr_at_1_diff1 value: 23.473291302576396 - type: nauc_mrr_at_1_max value: 44.68942599764586 - type: nauc_mrr_at_1_std value: 12.424377262427253 - type: nauc_mrr_at_20_diff1 value: 14.112652046087831 - type: nauc_mrr_at_20_max value: 44.82014861413682 - type: nauc_mrr_at_20_std value: 15.739350613646385 - type: nauc_mrr_at_3_diff1 value: 16.119659221396347 - type: nauc_mrr_at_3_max value: 46.04766378953525 - type: nauc_mrr_at_3_std value: 13.969878046315925 - type: nauc_mrr_at_5_diff1 value: 15.095453434076184 - type: nauc_mrr_at_5_max value: 45.802128149314406 - type: nauc_mrr_at_5_std value: 14.957442173319949 - type: nauc_ndcg_at_1000_diff1 value: 11.626606894574028 - type: nauc_ndcg_at_1000_max value: 43.328592841065536 - type: nauc_ndcg_at_1000_std value: 18.049446272245547 - type: nauc_ndcg_at_100_diff1 value: 10.485720606660239 - type: nauc_ndcg_at_100_max value: 42.405317674170966 - type: nauc_ndcg_at_100_std value: 19.107151641936987 - type: nauc_ndcg_at_10_diff1 value: 
11.029351078162982 - type: nauc_ndcg_at_10_max value: 44.36855031964681 - type: nauc_ndcg_at_10_std value: 17.302796171409305 - type: nauc_ndcg_at_1_diff1 value: 23.473291302576396 - type: nauc_ndcg_at_1_max value: 44.68942599764586 - type: nauc_ndcg_at_1_std value: 12.424377262427253 - type: nauc_ndcg_at_20_diff1 value: 10.356662718168412 - type: nauc_ndcg_at_20_max value: 43.31602680430083 - type: nauc_ndcg_at_20_std value: 18.162891267850316 - type: nauc_ndcg_at_3_diff1 value: 14.42844952297869 - type: nauc_ndcg_at_3_max value: 46.26603339466543 - type: nauc_ndcg_at_3_std value: 14.449362723887857 - type: nauc_ndcg_at_5_diff1 value: 12.783416563486396 - type: nauc_ndcg_at_5_max value: 45.852176479124424 - type: nauc_ndcg_at_5_std value: 16.11775016428085 - type: nauc_precision_at_1000_diff1 value: -8.045361059399795 - type: nauc_precision_at_1000_max value: 21.970273281738777 - type: nauc_precision_at_1000_std value: 49.564650488193266 - type: nauc_precision_at_100_diff1 value: -2.118628861593353 - type: nauc_precision_at_100_max value: 31.32498977104778 - type: nauc_precision_at_100_std value: 32.96087731883451 - type: nauc_precision_at_10_diff1 value: 3.0335517475367615 - type: nauc_precision_at_10_max value: 42.21620215030219 - type: nauc_precision_at_10_std value: 21.90159732315962 - type: nauc_precision_at_1_diff1 value: 23.473291302576396 - type: nauc_precision_at_1_max value: 44.68942599764586 - type: nauc_precision_at_1_std value: 12.424377262427253 - type: nauc_precision_at_20_diff1 value: 0.4087201843719047 - type: nauc_precision_at_20_max value: 38.485034773895734 - type: nauc_precision_at_20_std value: 25.077397979916682 - type: nauc_precision_at_3_diff1 value: 10.408327736589833 - type: nauc_precision_at_3_max value: 46.757216289175076 - type: nauc_precision_at_3_std value: 15.62594354926867 - type: nauc_precision_at_5_diff1 value: 7.326752744229544 - type: nauc_precision_at_5_max value: 45.89190518573553 - type: nauc_precision_at_5_std value: 19.01717163438957 - type: nauc_recall_at_1000_diff1 value: -8.045361059400387 - type: nauc_recall_at_1000_max value: 21.97027328173812 - type: nauc_recall_at_1000_std value: 49.56465048819266 - type: nauc_recall_at_100_diff1 value: -2.118628861593277 - type: nauc_recall_at_100_max value: 31.324989771047818 - type: nauc_recall_at_100_std value: 32.96087731883457 - type: nauc_recall_at_10_diff1 value: 3.0335517475367166 - type: nauc_recall_at_10_max value: 42.21620215030217 - type: nauc_recall_at_10_std value: 21.901597323159606 - type: nauc_recall_at_1_diff1 value: 23.473291302576396 - type: nauc_recall_at_1_max value: 44.68942599764586 - type: nauc_recall_at_1_std value: 12.424377262427253 - type: nauc_recall_at_20_diff1 value: 0.40872018437190905 - type: nauc_recall_at_20_max value: 38.485034773895734 - type: nauc_recall_at_20_std value: 25.077397979916693 - type: nauc_recall_at_3_diff1 value: 10.408327736589843 - type: nauc_recall_at_3_max value: 46.75721628917507 - type: nauc_recall_at_3_std value: 15.625943549268664 - type: nauc_recall_at_5_diff1 value: 7.326752744229548 - type: nauc_recall_at_5_max value: 45.89190518573557 - type: nauc_recall_at_5_std value: 19.01717163438958 - type: ndcg_at_1 value: 14.532 - type: ndcg_at_10 value: 27.025 - type: ndcg_at_100 value: 33.305 - type: ndcg_at_1000 value: 36.38 - type: ndcg_at_20 value: 29.443 - type: ndcg_at_3 value: 22.035 - type: ndcg_at_5 value: 24.319 - type: precision_at_1 value: 14.532 - type: precision_at_10 value: 4.115 - type: precision_at_100 value: 0.717 - type: 
precision_at_1000 value: 0.097 - type: precision_at_20 value: 2.536 - type: precision_at_3 value: 9.085 - type: precision_at_5 value: 6.563 - type: recall_at_1 value: 14.532 - type: recall_at_10 value: 41.154 - type: recall_at_100 value: 71.651 - type: recall_at_1000 value: 96.841 - type: recall_at_20 value: 50.71600000000001 - type: recall_at_3 value: 27.254 - type: recall_at_5 value: 32.814 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (es) type: jinaai/mintakaqa config: es split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: main_score value: 26.912000000000003 - type: map_at_1 value: 14.686 - type: map_at_10 value: 22.569 - type: map_at_100 value: 23.679 - type: map_at_1000 value: 23.777 - type: map_at_20 value: 23.169 - type: map_at_3 value: 20.201 - type: map_at_5 value: 21.566 - type: mrr_at_1 value: 14.686468646864686 - type: mrr_at_10 value: 22.569346220336296 - type: mrr_at_100 value: 23.678819125817146 - type: mrr_at_1000 value: 23.77713511338264 - type: mrr_at_20 value: 23.16850858443442 - type: mrr_at_3 value: 20.200770077007665 - type: mrr_at_5 value: 21.56628162816276 - type: nauc_map_at_1000_diff1 value: 14.129007578838381 - type: nauc_map_at_1000_max value: 44.4255501141499 - type: nauc_map_at_1000_std value: 19.95906154868176 - type: nauc_map_at_100_diff1 value: 14.09071870575231 - type: nauc_map_at_100_max value: 44.403179928955566 - type: nauc_map_at_100_std value: 20.00413657519976 - type: nauc_map_at_10_diff1 value: 14.149535953153688 - type: nauc_map_at_10_max value: 44.66529917634685 - type: nauc_map_at_10_std value: 19.580235989479394 - type: nauc_map_at_1_diff1 value: 23.489813522176636 - type: nauc_map_at_1_max value: 46.54578639925787 - type: nauc_map_at_1_std value: 16.39083721709994 - type: nauc_map_at_20_diff1 value: 14.021560420656181 - type: nauc_map_at_20_max value: 44.4825455452467 - type: nauc_map_at_20_std value: 19.886927750826878 - type: nauc_map_at_3_diff1 value: 16.182977890477723 - type: nauc_map_at_3_max value: 46.1840554029258 - type: nauc_map_at_3_std value: 18.735671900228958 - type: nauc_map_at_5_diff1 value: 14.779126395472833 - type: nauc_map_at_5_max value: 45.23237213817556 - type: nauc_map_at_5_std value: 19.348508580412872 - type: nauc_mrr_at_1000_diff1 value: 14.129007578838381 - type: nauc_mrr_at_1000_max value: 44.4255501141499 - type: nauc_mrr_at_1000_std value: 19.95906154868176 - type: nauc_mrr_at_100_diff1 value: 14.09071870575231 - type: nauc_mrr_at_100_max value: 44.403179928955566 - type: nauc_mrr_at_100_std value: 20.00413657519976 - type: nauc_mrr_at_10_diff1 value: 14.149535953153688 - type: nauc_mrr_at_10_max value: 44.66529917634685 - type: nauc_mrr_at_10_std value: 19.580235989479394 - type: nauc_mrr_at_1_diff1 value: 23.489813522176636 - type: nauc_mrr_at_1_max value: 46.54578639925787 - type: nauc_mrr_at_1_std value: 16.39083721709994 - type: nauc_mrr_at_20_diff1 value: 14.021560420656181 - type: nauc_mrr_at_20_max value: 44.4825455452467 - type: nauc_mrr_at_20_std value: 19.886927750826878 - type: nauc_mrr_at_3_diff1 value: 16.182977890477723 - type: nauc_mrr_at_3_max value: 46.1840554029258 - type: nauc_mrr_at_3_std value: 18.735671900228958 - type: nauc_mrr_at_5_diff1 value: 14.779126395472833 - type: nauc_mrr_at_5_max value: 45.23237213817556 - type: nauc_mrr_at_5_std value: 19.348508580412872 - type: nauc_ndcg_at_1000_diff1 value: 11.762470380481101 - type: nauc_ndcg_at_1000_max value: 42.8233203033089 - type: nauc_ndcg_at_1000_std value: 21.78503705117719 - type: 
nauc_ndcg_at_100_diff1 value: 10.45886076220022 - type: nauc_ndcg_at_100_max value: 41.85472899256818 - type: nauc_ndcg_at_100_std value: 23.20955486335138 - type: nauc_ndcg_at_10_diff1 value: 10.605912468659469 - type: nauc_ndcg_at_10_max value: 43.150942448104715 - type: nauc_ndcg_at_10_std value: 21.120035764826085 - type: nauc_ndcg_at_1_diff1 value: 23.489813522176636 - type: nauc_ndcg_at_1_max value: 46.54578639925787 - type: nauc_ndcg_at_1_std value: 16.39083721709994 - type: nauc_ndcg_at_20_diff1 value: 10.11291783888644 - type: nauc_ndcg_at_20_max value: 42.51260678842788 - type: nauc_ndcg_at_20_std value: 22.1744949382252 - type: nauc_ndcg_at_3_diff1 value: 14.25625326760802 - type: nauc_ndcg_at_3_max value: 45.96162916377383 - type: nauc_ndcg_at_3_std value: 19.557832728215523 - type: nauc_ndcg_at_5_diff1 value: 11.956317653823053 - type: nauc_ndcg_at_5_max value: 44.35971268886807 - type: nauc_ndcg_at_5_std value: 20.581696730374233 - type: nauc_precision_at_1000_diff1 value: 5.132291843566577 - type: nauc_precision_at_1000_max value: 25.293354576835263 - type: nauc_precision_at_1000_std value: 40.36005126087624 - type: nauc_precision_at_100_diff1 value: -1.5252854375008238 - type: nauc_precision_at_100_max value: 31.007586474495984 - type: nauc_precision_at_100_std value: 37.297552993548386 - type: nauc_precision_at_10_diff1 value: 1.9663657370770737 - type: nauc_precision_at_10_max value: 39.194092293625125 - type: nauc_precision_at_10_std value: 24.956542621999542 - type: nauc_precision_at_1_diff1 value: 23.489813522176636 - type: nauc_precision_at_1_max value: 46.54578639925787 - type: nauc_precision_at_1_std value: 16.39083721709994 - type: nauc_precision_at_20_diff1 value: 0.011112090390932373 - type: nauc_precision_at_20_max value: 36.9357074392519 - type: nauc_precision_at_20_std value: 28.611387115093876 - type: nauc_precision_at_3_diff1 value: 9.596831091013703 - type: nauc_precision_at_3_max value: 45.3905541893809 - type: nauc_precision_at_3_std value: 21.599314388526945 - type: nauc_precision_at_5_diff1 value: 5.175887949900142 - type: nauc_precision_at_5_max value: 42.129467510414464 - type: nauc_precision_at_5_std value: 23.607251548776677 - type: nauc_recall_at_1000_diff1 value: 5.132291843566257 - type: nauc_recall_at_1000_max value: 25.29335457683396 - type: nauc_recall_at_1000_std value: 40.36005126087638 - type: nauc_recall_at_100_diff1 value: -1.5252854375008988 - type: nauc_recall_at_100_max value: 31.00758647449594 - type: nauc_recall_at_100_std value: 37.29755299354834 - type: nauc_recall_at_10_diff1 value: 1.9663657370770793 - type: nauc_recall_at_10_max value: 39.19409229362512 - type: nauc_recall_at_10_std value: 24.956542621999546 - type: nauc_recall_at_1_diff1 value: 23.489813522176636 - type: nauc_recall_at_1_max value: 46.54578639925787 - type: nauc_recall_at_1_std value: 16.39083721709994 - type: nauc_recall_at_20_diff1 value: 0.011112090390923075 - type: nauc_recall_at_20_max value: 36.93570743925189 - type: nauc_recall_at_20_std value: 28.611387115093883 - type: nauc_recall_at_3_diff1 value: 9.596831091013714 - type: nauc_recall_at_3_max value: 45.39055418938087 - type: nauc_recall_at_3_std value: 21.599314388526956 - type: nauc_recall_at_5_diff1 value: 5.17588794990012 - type: nauc_recall_at_5_max value: 42.12946751041448 - type: nauc_recall_at_5_std value: 23.607251548776695 - type: ndcg_at_1 value: 14.686 - type: ndcg_at_10 value: 26.912000000000003 - type: ndcg_at_100 value: 32.919 - type: ndcg_at_1000 value: 36.119 - type: ndcg_at_20 value: 
29.079 - type: ndcg_at_3 value: 21.995 - type: ndcg_at_5 value: 24.474999999999998 - type: precision_at_1 value: 14.686 - type: precision_at_10 value: 4.08 - type: precision_at_100 value: 0.703 - type: precision_at_1000 value: 0.097 - type: precision_at_20 value: 2.467 - type: precision_at_3 value: 9.062000000000001 - type: precision_at_5 value: 6.65 - type: recall_at_1 value: 14.686 - type: recall_at_10 value: 40.8 - type: recall_at_100 value: 70.338 - type: recall_at_1000 value: 96.82300000000001 - type: recall_at_20 value: 49.34 - type: recall_at_3 value: 27.186 - type: recall_at_5 value: 33.251 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (fr) type: jinaai/mintakaqa config: fr split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: main_score value: 26.909 - type: map_at_1 value: 14.701 - type: map_at_10 value: 22.613 - type: map_at_100 value: 23.729 - type: map_at_1000 value: 23.837 - type: map_at_20 value: 23.262 - type: map_at_3 value: 20.236 - type: map_at_5 value: 21.673000000000002 - type: mrr_at_1 value: 14.7010647010647 - type: mrr_at_10 value: 22.613165113165113 - type: mrr_at_100 value: 23.72877605989423 - type: mrr_at_1000 value: 23.837150802746805 - type: mrr_at_20 value: 23.261627081110596 - type: mrr_at_3 value: 20.2361452361452 - type: mrr_at_5 value: 21.673491673491625 - type: nauc_map_at_1000_diff1 value: 17.08927788889635 - type: nauc_map_at_1000_max value: 47.240929150603336 - type: nauc_map_at_1000_std value: 20.559244258100275 - type: nauc_map_at_100_diff1 value: 17.029461792796777 - type: nauc_map_at_100_max value: 47.207381115550696 - type: nauc_map_at_100_std value: 20.581498156895265 - type: nauc_map_at_10_diff1 value: 17.351456007804536 - type: nauc_map_at_10_max value: 47.815880040221344 - type: nauc_map_at_10_std value: 20.292999107555794 - type: nauc_map_at_1_diff1 value: 27.297525357600776 - type: nauc_map_at_1_max value: 47.18835074959486 - type: nauc_map_at_1_std value: 18.304203168281834 - type: nauc_map_at_20_diff1 value: 17.157460199542136 - type: nauc_map_at_20_max value: 47.4776610667456 - type: nauc_map_at_20_std value: 20.499186342964478 - type: nauc_map_at_3_diff1 value: 19.393119961356277 - type: nauc_map_at_3_max value: 49.02841822452882 - type: nauc_map_at_3_std value: 19.293122796321292 - type: nauc_map_at_5_diff1 value: 17.76275044752008 - type: nauc_map_at_5_max value: 48.01292548040298 - type: nauc_map_at_5_std value: 19.928449977400504 - type: nauc_mrr_at_1000_diff1 value: 17.08927788889635 - type: nauc_mrr_at_1000_max value: 47.240929150603336 - type: nauc_mrr_at_1000_std value: 20.559244258100275 - type: nauc_mrr_at_100_diff1 value: 17.029461792796777 - type: nauc_mrr_at_100_max value: 47.207381115550696 - type: nauc_mrr_at_100_std value: 20.581498156895265 - type: nauc_mrr_at_10_diff1 value: 17.351456007804536 - type: nauc_mrr_at_10_max value: 47.815880040221344 - type: nauc_mrr_at_10_std value: 20.292999107555794 - type: nauc_mrr_at_1_diff1 value: 27.297525357600776 - type: nauc_mrr_at_1_max value: 47.18835074959486 - type: nauc_mrr_at_1_std value: 18.304203168281834 - type: nauc_mrr_at_20_diff1 value: 17.157460199542136 - type: nauc_mrr_at_20_max value: 47.4776610667456 - type: nauc_mrr_at_20_std value: 20.499186342964478 - type: nauc_mrr_at_3_diff1 value: 19.393119961356277 - type: nauc_mrr_at_3_max value: 49.02841822452882 - type: nauc_mrr_at_3_std value: 19.293122796321292 - type: nauc_mrr_at_5_diff1 value: 17.76275044752008 - type: nauc_mrr_at_5_max value: 48.01292548040298 - type: 
nauc_mrr_at_5_std value: 19.928449977400504 - type: nauc_ndcg_at_1000_diff1 value: 13.989496006047975 - type: nauc_ndcg_at_1000_max value: 45.626323944336114 - type: nauc_ndcg_at_1000_std value: 22.125600410796515 - type: nauc_ndcg_at_100_diff1 value: 12.302204843705244 - type: nauc_ndcg_at_100_max value: 44.46856314559079 - type: nauc_ndcg_at_100_std value: 23.084984546328677 - type: nauc_ndcg_at_10_diff1 value: 14.001226213368275 - type: nauc_ndcg_at_10_max value: 47.37780636546918 - type: nauc_ndcg_at_10_std value: 21.702709032840637 - type: nauc_ndcg_at_1_diff1 value: 27.297525357600776 - type: nauc_ndcg_at_1_max value: 47.18835074959486 - type: nauc_ndcg_at_1_std value: 18.304203168281834 - type: nauc_ndcg_at_20_diff1 value: 13.317759910171056 - type: nauc_ndcg_at_20_max value: 46.25171251043813 - type: nauc_ndcg_at_20_std value: 22.309331575402595 - type: nauc_ndcg_at_3_diff1 value: 17.555381234893872 - type: nauc_ndcg_at_3_max value: 49.48635590260059 - type: nauc_ndcg_at_3_std value: 19.734570962933674 - type: nauc_ndcg_at_5_diff1 value: 14.844841165765061 - type: nauc_ndcg_at_5_max value: 47.76437065028708 - type: nauc_ndcg_at_5_std value: 20.816034479453954 - type: nauc_precision_at_1000_diff1 value: -15.591898698252546 - type: nauc_precision_at_1000_max value: 20.545984285353892 - type: nauc_precision_at_1000_std value: 38.9013414992826 - type: nauc_precision_at_100_diff1 value: -5.290395978742176 - type: nauc_precision_at_100_max value: 31.340480360546845 - type: nauc_precision_at_100_std value: 33.6897935720505 - type: nauc_precision_at_10_diff1 value: 5.965001997926562 - type: nauc_precision_at_10_max value: 46.12515296162247 - type: nauc_precision_at_10_std value: 25.409433135253558 - type: nauc_precision_at_1_diff1 value: 27.297525357600776 - type: nauc_precision_at_1_max value: 47.18835074959486 - type: nauc_precision_at_1_std value: 18.304203168281834 - type: nauc_precision_at_20_diff1 value: 3.4438127279827744 - type: nauc_precision_at_20_max value: 42.36095587714494 - type: nauc_precision_at_20_std value: 27.367900512797906 - type: nauc_precision_at_3_diff1 value: 13.165017224718916 - type: nauc_precision_at_3_max value: 50.58931825484506 - type: nauc_precision_at_3_std value: 20.852009214609442 - type: nauc_precision_at_5_diff1 value: 7.840087177549876 - type: nauc_precision_at_5_max value: 46.99388755575109 - type: nauc_precision_at_5_std value: 23.048702393099834 - type: nauc_recall_at_1000_diff1 value: -15.591898698252932 - type: nauc_recall_at_1000_max value: 20.5459842853537 - type: nauc_recall_at_1000_std value: 38.901341499282395 - type: nauc_recall_at_100_diff1 value: -5.290395978742165 - type: nauc_recall_at_100_max value: 31.340480360546863 - type: nauc_recall_at_100_std value: 33.68979357205046 - type: nauc_recall_at_10_diff1 value: 5.96500199792656 - type: nauc_recall_at_10_max value: 46.1251529616225 - type: nauc_recall_at_10_std value: 25.409433135253543 - type: nauc_recall_at_1_diff1 value: 27.297525357600776 - type: nauc_recall_at_1_max value: 47.18835074959486 - type: nauc_recall_at_1_std value: 18.304203168281834 - type: nauc_recall_at_20_diff1 value: 3.4438127279827833 - type: nauc_recall_at_20_max value: 42.36095587714498 - type: nauc_recall_at_20_std value: 27.36790051279787 - type: nauc_recall_at_3_diff1 value: 13.165017224718916 - type: nauc_recall_at_3_max value: 50.589318254845054 - type: nauc_recall_at_3_std value: 20.852009214609435 - type: nauc_recall_at_5_diff1 value: 7.840087177549891 - type: nauc_recall_at_5_max value: 46.99388755575112 - 
type: nauc_recall_at_5_std value: 23.048702393099845 - type: ndcg_at_1 value: 14.701 - type: ndcg_at_10 value: 26.909 - type: ndcg_at_100 value: 32.727000000000004 - type: ndcg_at_1000 value: 36.086 - type: ndcg_at_20 value: 29.236 - type: ndcg_at_3 value: 22.004 - type: ndcg_at_5 value: 24.615000000000002 - type: precision_at_1 value: 14.701 - type: precision_at_10 value: 4.062 - type: precision_at_100 value: 0.688 - type: precision_at_1000 value: 0.096 - type: precision_at_20 value: 2.488 - type: precision_at_3 value: 9.036 - type: precision_at_5 value: 6.699 - type: recall_at_1 value: 14.701 - type: recall_at_10 value: 40.622 - type: recall_at_100 value: 68.796 - type: recall_at_1000 value: 96.314 - type: recall_at_20 value: 49.754 - type: recall_at_3 value: 27.108999999999998 - type: recall_at_5 value: 33.497 - task: type: Classification dataset: name: MTEB MultilingualSentiment (default) type: C-MTEB/MultilingualSentiment-classification config: default split: test revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a metrics: - type: accuracy value: 73.20999999999998 - type: f1 value: 73.18755986777474 - type: f1_weighted value: 73.18755986777475 - type: main_score value: 73.20999999999998 - task: type: Retrieval dataset: name: MTEB NFCorpus (default) type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 4.822 - type: map_at_10 value: 13.144 - type: map_at_100 value: 17.254 - type: map_at_1000 value: 18.931 - type: map_at_20 value: 14.834 - type: map_at_3 value: 8.975 - type: map_at_5 value: 10.922 - type: mrr_at_1 value: 47.059 - type: mrr_at_10 value: 55.806999999999995 - type: mrr_at_100 value: 56.286 - type: mrr_at_1000 value: 56.327000000000005 - type: mrr_at_20 value: 56.00000000000001 - type: mrr_at_3 value: 54.17999999999999 - type: mrr_at_5 value: 55.155 - type: ndcg_at_1 value: 44.427 - type: ndcg_at_10 value: 36.623 - type: ndcg_at_100 value: 33.664 - type: ndcg_at_1000 value: 42.538 - type: ndcg_at_20 value: 34.066 - type: ndcg_at_3 value: 41.118 - type: ndcg_at_5 value: 39.455 - type: precision_at_1 value: 46.44 - type: precision_at_10 value: 28.607 - type: precision_at_100 value: 9.189 - type: precision_at_1000 value: 2.261 - type: precision_at_20 value: 21.238 - type: precision_at_3 value: 39.628 - type: precision_at_5 value: 35.604 - type: recall_at_1 value: 4.822 - type: recall_at_10 value: 17.488999999999997 - type: recall_at_100 value: 35.052 - type: recall_at_1000 value: 66.67999999999999 - type: recall_at_20 value: 21.343999999999998 - type: recall_at_3 value: 10.259 - type: recall_at_5 value: 13.406 - type: main_score value: 36.623 - task: type: Retrieval dataset: name: MTEB NQ (default) type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 41.411 - type: map_at_10 value: 57.179 - type: map_at_100 value: 57.945 - type: map_at_1000 value: 57.967999999999996 - type: map_at_20 value: 57.687 - type: map_at_3 value: 53.46300000000001 - type: map_at_5 value: 55.696999999999996 - type: mrr_at_1 value: 46.233999999999995 - type: mrr_at_10 value: 59.831999999999994 - type: mrr_at_100 value: 60.33500000000001 - type: mrr_at_1000 value: 60.348 - type: mrr_at_20 value: 60.167 - type: mrr_at_3 value: 56.972 - type: mrr_at_5 value: 58.74 - type: ndcg_at_1 value: 46.205 - type: ndcg_at_10 value: 64.23100000000001 - type: ndcg_at_100 value: 67.242 - type: ndcg_at_1000 value: 67.72500000000001 - type: ndcg_at_20 value: 
65.77300000000001 - type: ndcg_at_3 value: 57.516 - type: ndcg_at_5 value: 61.11600000000001 - type: precision_at_1 value: 46.205 - type: precision_at_10 value: 9.873 - type: precision_at_100 value: 1.158 - type: precision_at_1000 value: 0.12 - type: precision_at_20 value: 5.319 - type: precision_at_3 value: 25.424999999999997 - type: precision_at_5 value: 17.375 - type: recall_at_1 value: 41.411 - type: recall_at_10 value: 82.761 - type: recall_at_100 value: 95.52199999999999 - type: recall_at_1000 value: 99.02499999999999 - type: recall_at_20 value: 88.34 - type: recall_at_3 value: 65.73 - type: recall_at_5 value: 73.894 - type: main_score value: 64.23100000000001 - task: type: PairClassification dataset: name: MTEB Ocnli (default) type: C-MTEB/OCNLI config: default split: validation revision: 66e76a618a34d6d565d5538088562851e6daa7ec metrics: - type: cosine_accuracy value: 62.3714131023281 - type: cosine_accuracy_threshold value: 79.70921993255615 - type: cosine_ap value: 66.41380155495659 - type: cosine_f1 value: 68.89547185780786 - type: cosine_f1_threshold value: 72.91591167449951 - type: cosine_precision value: 57.485875706214685 - type: cosine_recall value: 85.95564941921859 - type: dot_accuracy value: 60.47644829453167 - type: dot_accuracy_threshold value: 36627.362060546875 - type: dot_ap value: 63.696303449293204 - type: dot_f1 value: 68.3986041101202 - type: dot_f1_threshold value: 30452.72216796875 - type: dot_precision value: 54.04411764705882 - type: dot_recall value: 93.13621964097149 - type: euclidean_accuracy value: 63.02111532214402 - type: euclidean_accuracy_threshold value: 1392.76762008667 - type: euclidean_ap value: 66.65907089443218 - type: euclidean_f1 value: 69.05036524413688 - type: euclidean_f1_threshold value: 1711.5310668945312 - type: euclidean_precision value: 54.29262394195889 - type: euclidean_recall value: 94.82576557550159 - type: main_score value: 63.02111532214402 - type: manhattan_accuracy value: 62.75040606388739 - type: manhattan_accuracy_threshold value: 32475.347900390625 - type: manhattan_ap value: 66.50943585125434 - type: manhattan_f1 value: 69.08382066276802 - type: manhattan_f1_threshold value: 41238.470458984375 - type: manhattan_precision value: 54.75896168108776 - type: manhattan_recall value: 93.55860612460401 - type: max_accuracy value: 63.02111532214402 - type: max_ap value: 66.65907089443218 - type: max_f1 value: 69.08382066276802 - type: max_precision value: 57.485875706214685 - type: max_recall value: 94.82576557550159 - type: similarity_accuracy value: 62.3714131023281 - type: similarity_accuracy_threshold value: 79.70921993255615 - type: similarity_ap value: 66.41380155495659 - type: similarity_f1 value: 68.89547185780786 - type: similarity_f1_threshold value: 72.91591167449951 - type: similarity_precision value: 57.485875706214685 - type: similarity_recall value: 85.95564941921859 - task: type: Classification dataset: name: MTEB OnlineShopping (default) type: C-MTEB/OnlineShopping-classification config: default split: test revision: e610f2ebd179a8fda30ae534c3878750a96db120 metrics: - type: accuracy value: 91.88000000000001 - type: ap value: 89.52463684448476 - type: ap_weighted value: 89.52463684448476 - type: f1 value: 91.86313022306673 - type: f1_weighted value: 91.87806318146912 - type: main_score value: 91.88000000000001 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (en) type: GEM/opusparcus config: en split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 
92.65578635014838 - type: cosine_accuracy_threshold value: 74.02530312538147 - type: cosine_ap value: 98.3834226153613 - type: cosine_f1 value: 94.92567913890312 - type: cosine_f1_threshold value: 74.02530312538147 - type: cosine_precision value: 95.562435500516 - type: cosine_recall value: 94.29735234215886 - type: dot_accuracy value: 91.54302670623146 - type: dot_accuracy_threshold value: 34452.29187011719 - type: dot_ap value: 98.1237257754439 - type: dot_f1 value: 94.22400803616273 - type: dot_f1_threshold value: 33670.41931152344 - type: dot_precision value: 92.9633300297324 - type: dot_recall value: 95.5193482688391 - type: euclidean_accuracy value: 92.28486646884274 - type: euclidean_accuracy_threshold value: 1602.8022766113281 - type: euclidean_ap value: 98.3099021504706 - type: euclidean_f1 value: 94.75277497477296 - type: euclidean_f1_threshold value: 1604.7462463378906 - type: euclidean_precision value: 93.89999999999999 - type: euclidean_recall value: 95.62118126272912 - type: main_score value: 98.3834226153613 - type: manhattan_accuracy value: 92.2106824925816 - type: manhattan_accuracy_threshold value: 38872.90954589844 - type: manhattan_ap value: 98.28694101230218 - type: manhattan_f1 value: 94.67815509376584 - type: manhattan_f1_threshold value: 38872.90954589844 - type: manhattan_precision value: 94.24823410696267 - type: manhattan_recall value: 95.11201629327903 - type: max_accuracy value: 92.65578635014838 - type: max_ap value: 98.3834226153613 - type: max_f1 value: 94.92567913890312 - type: max_precision value: 95.562435500516 - type: max_recall value: 95.62118126272912 - type: similarity_accuracy value: 92.65578635014838 - type: similarity_accuracy_threshold value: 74.02530312538147 - type: similarity_ap value: 98.3834226153613 - type: similarity_f1 value: 94.92567913890312 - type: similarity_f1_threshold value: 74.02530312538147 - type: similarity_precision value: 95.562435500516 - type: similarity_recall value: 94.29735234215886 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (de) type: GEM/opusparcus config: de split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 87.72178850248403 - type: cosine_accuracy_threshold value: 73.33863377571106 - type: cosine_ap value: 96.98901408834976 - type: cosine_f1 value: 91.89944134078212 - type: cosine_f1_threshold value: 71.45810127258301 - type: cosine_precision value: 89.64577656675749 - type: cosine_recall value: 94.26934097421203 - type: dot_accuracy value: 86.30234208658624 - type: dot_accuracy_threshold value: 32027.130126953125 - type: dot_ap value: 96.12260574893256 - type: dot_f1 value: 91.31602506714414 - type: dot_f1_threshold value: 30804.376220703125 - type: dot_precision value: 85.93091828138164 - type: dot_recall value: 97.42120343839542 - type: euclidean_accuracy value: 87.9347054648687 - type: euclidean_accuracy_threshold value: 1609.6670150756836 - type: euclidean_ap value: 97.00238860358252 - type: euclidean_f1 value: 92.1089063221043 - type: euclidean_f1_threshold value: 1641.8487548828125 - type: euclidean_precision value: 89.10714285714286 - type: euclidean_recall value: 95.31996179560649 - type: main_score value: 97.00238860358252 - type: manhattan_accuracy value: 87.72178850248403 - type: manhattan_accuracy_threshold value: 40137.060546875 - type: manhattan_ap value: 96.98653728159941 - type: manhattan_f1 value: 92.03865623561896 - type: manhattan_f1_threshold value: 40137.060546875 - type: manhattan_precision value: 88.80994671403198 
- type: manhattan_recall value: 95.51098376313276 - type: max_accuracy value: 87.9347054648687 - type: max_ap value: 97.00238860358252 - type: max_f1 value: 92.1089063221043 - type: max_precision value: 89.64577656675749 - type: max_recall value: 97.42120343839542 - type: similarity_accuracy value: 87.72178850248403 - type: similarity_accuracy_threshold value: 73.33863377571106 - type: similarity_ap value: 96.98901408834976 - type: similarity_f1 value: 91.89944134078212 - type: similarity_f1_threshold value: 71.45810127258301 - type: similarity_precision value: 89.64577656675749 - type: similarity_recall value: 94.26934097421203 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (fr) type: GEM/opusparcus config: fr split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 80.92643051771117 - type: cosine_accuracy_threshold value: 76.68856382369995 - type: cosine_ap value: 93.74622381534307 - type: cosine_f1 value: 87.12328767123287 - type: cosine_f1_threshold value: 71.64022922515869 - type: cosine_precision value: 80.64243448858834 - type: cosine_recall value: 94.73684210526315 - type: dot_accuracy value: 80.858310626703 - type: dot_accuracy_threshold value: 34028.3935546875 - type: dot_ap value: 91.18448457633308 - type: dot_f1 value: 86.82606657290202 - type: dot_f1_threshold value: 34028.3935546875 - type: dot_precision value: 82.2380106571936 - type: dot_recall value: 91.9563058589871 - type: euclidean_accuracy value: 80.858310626703 - type: euclidean_accuracy_threshold value: 1595.7651138305664 - type: euclidean_ap value: 93.8182717829648 - type: euclidean_f1 value: 87.04044117647058 - type: euclidean_f1_threshold value: 1609.2475891113281 - type: euclidean_precision value: 81.00940975192472 - type: euclidean_recall value: 94.04170804369414 - type: main_score value: 93.8182717829648 - type: manhattan_accuracy value: 80.99455040871935 - type: manhattan_accuracy_threshold value: 38092.132568359375 - type: manhattan_ap value: 93.77563401151711 - type: manhattan_f1 value: 86.91983122362869 - type: manhattan_f1_threshold value: 38092.132568359375 - type: manhattan_precision value: 82.32682060390763 - type: manhattan_recall value: 92.05561072492551 - type: max_accuracy value: 80.99455040871935 - type: max_ap value: 93.8182717829648 - type: max_f1 value: 87.12328767123287 - type: max_precision value: 82.32682060390763 - type: max_recall value: 94.73684210526315 - type: similarity_accuracy value: 80.92643051771117 - type: similarity_accuracy_threshold value: 76.68856382369995 - type: similarity_ap value: 93.74622381534307 - type: similarity_f1 value: 87.12328767123287 - type: similarity_f1_threshold value: 71.64022922515869 - type: similarity_precision value: 80.64243448858834 - type: similarity_recall value: 94.73684210526315 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (ru) type: GEM/opusparcus config: ru split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 76.83823529411765 - type: cosine_accuracy_threshold value: 72.70769476890564 - type: cosine_ap value: 89.56692049908222 - type: cosine_f1 value: 83.99832003359934 - type: cosine_f1_threshold value: 70.9052324295044 - type: cosine_precision value: 76.16146230007617 - type: cosine_recall value: 93.63295880149812 - type: dot_accuracy value: 76.28676470588235 - type: dot_accuracy_threshold value: 33740.68908691406 - type: dot_ap value: 87.77185177141567 - type: dot_f1 value: 83.62251375370292 - type: 
dot_f1_threshold value: 32726.611328125 - type: dot_precision value: 76.29343629343629 - type: dot_recall value: 92.50936329588015 - type: euclidean_accuracy value: 77.32843137254902 - type: euclidean_accuracy_threshold value: 1566.510009765625 - type: euclidean_ap value: 89.60605626791111 - type: euclidean_f1 value: 84.06546080964686 - type: euclidean_f1_threshold value: 1576.4202117919922 - type: euclidean_precision value: 77.83094098883574 - type: euclidean_recall value: 91.38576779026218 - type: main_score value: 89.60605626791111 - type: manhattan_accuracy value: 76.89950980392157 - type: manhattan_accuracy_threshold value: 38202.215576171875 - type: manhattan_ap value: 89.55766894104868 - type: manhattan_f1 value: 83.80462724935732 - type: manhattan_f1_threshold value: 38934.375 - type: manhattan_precision value: 77.25118483412322 - type: manhattan_recall value: 91.57303370786516 - type: max_accuracy value: 77.32843137254902 - type: max_ap value: 89.60605626791111 - type: max_f1 value: 84.06546080964686 - type: max_precision value: 77.83094098883574 - type: max_recall value: 93.63295880149812 - type: similarity_accuracy value: 76.83823529411765 - type: similarity_accuracy_threshold value: 72.70769476890564 - type: similarity_ap value: 89.56692049908222 - type: similarity_f1 value: 83.99832003359934 - type: similarity_f1_threshold value: 70.9052324295044 - type: similarity_precision value: 76.16146230007617 - type: similarity_recall value: 93.63295880149812 - task: type: Classification dataset: name: MTEB PAC (default) type: laugustyniak/abusive-clauses-pl config: default split: test revision: fc69d1c153a8ccdcf1eef52f4e2a27f88782f543 metrics: - type: accuracy value: 68.39559803069794 - type: ap value: 77.68074206719457 - type: ap_weighted value: 77.68074206719457 - type: f1 value: 66.23485605467732 - type: f1_weighted value: 69.03201442129347 - type: main_score value: 68.39559803069794 - task: type: STS dataset: name: MTEB PAWSX (default) type: C-MTEB/PAWSX config: default split: test revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1 metrics: - type: cosine_pearson value: 13.161523266433587 - type: cosine_spearman value: 15.557333873773386 - type: euclidean_pearson value: 17.147508431907525 - type: euclidean_spearman value: 15.664112857732146 - type: main_score value: 15.557333873773386 - type: manhattan_pearson value: 17.130875906264386 - type: manhattan_spearman value: 15.624397342229637 - type: pearson value: 13.161523266433587 - type: spearman value: 15.557333873773386 - task: type: PairClassification dataset: name: MTEB PSC (default) type: PL-MTEB/psc-pairclassification config: default split: test revision: d05a294af9e1d3ff2bfb6b714e08a24a6cabc669 metrics: - type: cosine_accuracy value: 97.86641929499072 - type: cosine_accuracy_threshold value: 79.0391206741333 - type: cosine_ap value: 99.19403807771533 - type: cosine_f1 value: 96.45608628659475 - type: cosine_f1_threshold value: 79.0391206741333 - type: cosine_precision value: 97.50778816199377 - type: cosine_recall value: 95.42682926829268 - type: dot_accuracy value: 98.14471243042672 - type: dot_accuracy_threshold value: 29808.1787109375 - type: dot_ap value: 99.331999859971 - type: dot_f1 value: 97.01492537313433 - type: dot_f1_threshold value: 29808.1787109375 - type: dot_precision value: 95.02923976608187 - type: dot_recall value: 99.08536585365853 - type: euclidean_accuracy value: 97.49536178107606 - type: euclidean_accuracy_threshold value: 1276.227855682373 - type: euclidean_ap value: 98.91056467717377 - type: 
euclidean_f1 value: 95.83975346687212 - type: euclidean_f1_threshold value: 1276.227855682373 - type: euclidean_precision value: 96.88473520249221 - type: euclidean_recall value: 94.8170731707317 - type: main_score value: 99.331999859971 - type: manhattan_accuracy value: 97.49536178107606 - type: manhattan_accuracy_threshold value: 31097.674560546875 - type: manhattan_ap value: 98.95694691792707 - type: manhattan_f1 value: 95.83975346687212 - type: manhattan_f1_threshold value: 31097.674560546875 - type: manhattan_precision value: 96.88473520249221 - type: manhattan_recall value: 94.8170731707317 - type: max_accuracy value: 98.14471243042672 - type: max_ap value: 99.331999859971 - type: max_f1 value: 97.01492537313433 - type: max_precision value: 97.50778816199377 - type: max_recall value: 99.08536585365853 - type: similarity_accuracy value: 97.86641929499072 - type: similarity_accuracy_threshold value: 79.0391206741333 - type: similarity_ap value: 99.19403807771533 - type: similarity_f1 value: 96.45608628659475 - type: similarity_f1_threshold value: 79.0391206741333 - type: similarity_precision value: 97.50778816199377 - type: similarity_recall value: 95.42682926829268 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (en) type: google-research-datasets/paws-x config: en split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 61.8 - type: cosine_accuracy_threshold value: 99.5664119720459 - type: cosine_ap value: 60.679317786040585 - type: cosine_f1 value: 63.17354143441101 - type: cosine_f1_threshold value: 97.22164869308472 - type: cosine_precision value: 47.6457399103139 - type: cosine_recall value: 93.71554575523705 - type: dot_accuracy value: 55.7 - type: dot_accuracy_threshold value: 48353.62548828125 - type: dot_ap value: 48.53805970536875 - type: dot_f1 value: 62.42214532871972 - type: dot_f1_threshold value: 38215.53955078125 - type: dot_precision value: 45.48663640948058 - type: dot_recall value: 99.44873208379272 - type: euclidean_accuracy value: 61.75000000000001 - type: euclidean_accuracy_threshold value: 189.0761137008667 - type: euclidean_ap value: 60.55517418691518 - type: euclidean_f1 value: 63.07977736549165 - type: euclidean_f1_threshold value: 504.3168067932129 - type: euclidean_precision value: 47.53914988814318 - type: euclidean_recall value: 93.71554575523705 - type: main_score value: 60.679317786040585 - type: manhattan_accuracy value: 61.9 - type: manhattan_accuracy_threshold value: 4695.778274536133 - type: manhattan_ap value: 60.48686620413608 - type: manhattan_f1 value: 62.92880855772778 - type: manhattan_f1_threshold value: 12542.36831665039 - type: manhattan_precision value: 47.28381374722838 - type: manhattan_recall value: 94.04630650496141 - type: max_accuracy value: 61.9 - type: max_ap value: 60.679317786040585 - type: max_f1 value: 63.17354143441101 - type: max_precision value: 47.6457399103139 - type: max_recall value: 99.44873208379272 - type: similarity_accuracy value: 61.8 - type: similarity_accuracy_threshold value: 99.5664119720459 - type: similarity_ap value: 60.679317786040585 - type: similarity_f1 value: 63.17354143441101 - type: similarity_f1_threshold value: 97.22164869308472 - type: similarity_precision value: 47.6457399103139 - type: similarity_recall value: 93.71554575523705 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (de) type: google-research-datasets/paws-x config: de split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 
metrics: - type: cosine_accuracy value: 60.25 - type: cosine_accuracy_threshold value: 99.54338073730469 - type: cosine_ap value: 56.7863613689054 - type: cosine_f1 value: 62.23499820337766 - type: cosine_f1_threshold value: 89.95014429092407 - type: cosine_precision value: 45.86864406779661 - type: cosine_recall value: 96.75977653631284 - type: dot_accuracy value: 56.8 - type: dot_accuracy_threshold value: 47349.78332519531 - type: dot_ap value: 49.7857806061729 - type: dot_f1 value: 62.31225986727209 - type: dot_f1_threshold value: 30143.206787109375 - type: dot_precision value: 45.32520325203252 - type: dot_recall value: 99.66480446927373 - type: euclidean_accuracy value: 60.3 - type: euclidean_accuracy_threshold value: 219.78106498718262 - type: euclidean_ap value: 56.731544327179606 - type: euclidean_f1 value: 62.19895287958115 - type: euclidean_f1_threshold value: 1792.1623229980469 - type: euclidean_precision value: 45.22842639593909 - type: euclidean_recall value: 99.55307262569832 - type: main_score value: 56.7863613689054 - type: manhattan_accuracy value: 60.150000000000006 - type: manhattan_accuracy_threshold value: 5104.503631591797 - type: manhattan_ap value: 56.70304479768734 - type: manhattan_f1 value: 62.22067039106145 - type: manhattan_f1_threshold value: 42839.471435546875 - type: manhattan_precision value: 45.2513966480447 - type: manhattan_recall value: 99.55307262569832 - type: max_accuracy value: 60.3 - type: max_ap value: 56.7863613689054 - type: max_f1 value: 62.31225986727209 - type: max_precision value: 45.86864406779661 - type: max_recall value: 99.66480446927373 - type: similarity_accuracy value: 60.25 - type: similarity_accuracy_threshold value: 99.54338073730469 - type: similarity_ap value: 56.7863613689054 - type: similarity_f1 value: 62.23499820337766 - type: similarity_f1_threshold value: 89.95014429092407 - type: similarity_precision value: 45.86864406779661 - type: similarity_recall value: 96.75977653631284 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (es) type: google-research-datasets/paws-x config: es split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 59.699999999999996 - type: cosine_accuracy_threshold value: 99.55930709838867 - type: cosine_ap value: 57.31662248806265 - type: cosine_f1 value: 62.444061962134256 - type: cosine_f1_threshold value: 74.75898265838623 - type: cosine_precision value: 45.3953953953954 - type: cosine_recall value: 100.0 - type: dot_accuracy value: 55.900000000000006 - type: dot_accuracy_threshold value: 47512.90283203125 - type: dot_ap value: 49.39339147787568 - type: dot_f1 value: 62.487082328625554 - type: dot_f1_threshold value: 34989.03503417969 - type: dot_precision value: 45.44088176352705 - type: dot_recall value: 100.0 - type: euclidean_accuracy value: 59.599999999999994 - type: euclidean_accuracy_threshold value: 200.82547664642334 - type: euclidean_ap value: 57.19737488445163 - type: euclidean_f1 value: 62.444061962134256 - type: euclidean_f1_threshold value: 1538.8837814331055 - type: euclidean_precision value: 45.3953953953954 - type: euclidean_recall value: 100.0 - type: main_score value: 57.31662248806265 - type: manhattan_accuracy value: 59.550000000000004 - type: manhattan_accuracy_threshold value: 5016.501617431641 - type: manhattan_ap value: 57.089959907945065 - type: manhattan_f1 value: 62.444061962134256 - type: manhattan_f1_threshold value: 37523.53515625 - type: manhattan_precision value: 45.3953953953954 - type: 
manhattan_recall value: 100.0 - type: max_accuracy value: 59.699999999999996 - type: max_ap value: 57.31662248806265 - type: max_f1 value: 62.487082328625554 - type: max_precision value: 45.44088176352705 - type: max_recall value: 100.0 - type: similarity_accuracy value: 59.699999999999996 - type: similarity_accuracy_threshold value: 99.55930709838867 - type: similarity_ap value: 57.31662248806265 - type: similarity_f1 value: 62.444061962134256 - type: similarity_f1_threshold value: 74.75898265838623 - type: similarity_precision value: 45.3953953953954 - type: similarity_recall value: 100.0 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (fr) type: google-research-datasets/paws-x config: fr split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 61.150000000000006 - type: cosine_accuracy_threshold value: 99.36153888702393 - type: cosine_ap value: 59.43845317938599 - type: cosine_f1 value: 62.51298026998961 - type: cosine_f1_threshold value: 76.77866220474243 - type: cosine_precision value: 45.468277945619334 - type: cosine_recall value: 100.0 - type: dot_accuracy value: 55.75 - type: dot_accuracy_threshold value: 48931.55212402344 - type: dot_ap value: 50.15949290538757 - type: dot_f1 value: 62.53462603878117 - type: dot_f1_threshold value: 34415.7958984375 - type: dot_precision value: 45.4911838790932 - type: dot_recall value: 100.0 - type: euclidean_accuracy value: 61.050000000000004 - type: euclidean_accuracy_threshold value: 240.8097267150879 - type: euclidean_ap value: 59.367971294226216 - type: euclidean_f1 value: 62.51298026998961 - type: euclidean_f1_threshold value: 1444.132423400879 - type: euclidean_precision value: 45.468277945619334 - type: euclidean_recall value: 100.0 - type: main_score value: 59.43845317938599 - type: manhattan_accuracy value: 60.95 - type: manhattan_accuracy_threshold value: 5701.206207275391 - type: manhattan_ap value: 59.30094096378774 - type: manhattan_f1 value: 62.53462603878117 - type: manhattan_f1_threshold value: 33445.672607421875 - type: manhattan_precision value: 45.4911838790932 - type: manhattan_recall value: 100.0 - type: max_accuracy value: 61.150000000000006 - type: max_ap value: 59.43845317938599 - type: max_f1 value: 62.53462603878117 - type: max_precision value: 45.4911838790932 - type: max_recall value: 100.0 - type: similarity_accuracy value: 61.150000000000006 - type: similarity_accuracy_threshold value: 99.36153888702393 - type: similarity_ap value: 59.43845317938599 - type: similarity_f1 value: 62.51298026998961 - type: similarity_f1_threshold value: 76.77866220474243 - type: similarity_precision value: 45.468277945619334 - type: similarity_recall value: 100.0 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (zh) type: google-research-datasets/paws-x config: zh split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 58.85 - type: cosine_accuracy_threshold value: 99.73838329315186 - type: cosine_ap value: 54.66913160570546 - type: cosine_f1 value: 62.32136632973162 - type: cosine_f1_threshold value: 76.4499306678772 - type: cosine_precision value: 45.265822784810126 - type: cosine_recall value: 100.0 - type: dot_accuracy value: 56.25 - type: dot_accuracy_threshold value: 47351.9287109375 - type: dot_ap value: 48.5266232989438 - type: dot_f1 value: 62.277951933124356 - type: dot_f1_threshold value: 31325.28076171875 - type: dot_precision value: 45.220030349013655 - type: dot_recall 
value: 100.0 - type: euclidean_accuracy value: 58.9 - type: euclidean_accuracy_threshold value: 144.24468278884888 - type: euclidean_ap value: 54.66981490353506 - type: euclidean_f1 value: 62.32136632973162 - type: euclidean_f1_threshold value: 1484.908676147461 - type: euclidean_precision value: 45.265822784810126 - type: euclidean_recall value: 100.0 - type: main_score value: 54.66981490353506 - type: manhattan_accuracy value: 58.9 - type: manhattan_accuracy_threshold value: 3586.785125732422 - type: manhattan_ap value: 54.668355260247736 - type: manhattan_f1 value: 62.32136632973162 - type: manhattan_f1_threshold value: 36031.22863769531 - type: manhattan_precision value: 45.265822784810126 - type: manhattan_recall value: 100.0 - type: max_accuracy value: 58.9 - type: max_ap value: 54.66981490353506 - type: max_f1 value: 62.32136632973162 - type: max_precision value: 45.265822784810126 - type: max_recall value: 100.0 - type: similarity_accuracy value: 58.85 - type: similarity_accuracy_threshold value: 99.73838329315186 - type: similarity_ap value: 54.66913160570546 - type: similarity_f1 value: 62.32136632973162 - type: similarity_f1_threshold value: 76.4499306678772 - type: similarity_precision value: 45.265822784810126 - type: similarity_recall value: 100.0 - task: type: Classification dataset: name: MTEB PolEmo2.0-IN (default) type: PL-MTEB/polemo2_in config: default split: test revision: d90724373c70959f17d2331ad51fb60c71176b03 metrics: - type: accuracy value: 83.75346260387812 - type: f1 value: 81.98304891214909 - type: f1_weighted value: 84.29623200830078 - type: main_score value: 83.75346260387812 - task: type: Classification dataset: name: MTEB PolEmo2.0-OUT (default) type: PL-MTEB/polemo2_out config: default split: test revision: 6a21ab8716e255ab1867265f8b396105e8aa63d4 metrics: - type: accuracy value: 66.53846153846153 - type: f1 value: 52.71826064368638 - type: f1_weighted value: 69.10010124630334 - type: main_score value: 66.53846153846153 - task: type: PairClassification dataset: name: MTEB PPC type: PL-MTEB/ppc-pairclassification config: default split: test revision: None metrics: - type: cosine_accuracy value: 81.8 - type: cosine_accuracy_threshold value: 90.47793745994568 - type: cosine_ap value: 91.42490266080884 - type: cosine_f1 value: 85.4632587859425 - type: cosine_f1_threshold value: 90.47793745994568 - type: cosine_precision value: 82.56172839506173 - type: cosine_recall value: 88.57615894039735 - type: dot_accuracy value: 74.6 - type: dot_accuracy_threshold value: 42102.23693847656 - type: dot_ap value: 86.20060009096979 - type: dot_f1 value: 80.02842928216063 - type: dot_f1_threshold value: 38970.16906738281 - type: dot_precision value: 70.1120797011208 - type: dot_recall value: 93.21192052980133 - type: euclidean_accuracy value: 81.5 - type: euclidean_accuracy_threshold value: 880.433464050293 - type: euclidean_ap value: 91.33143477982087 - type: euclidean_f1 value: 85.44600938967135 - type: euclidean_f1_threshold value: 964.0384674072266 - type: euclidean_precision value: 81.00890207715133 - type: euclidean_recall value: 90.39735099337747 - type: main_score value: 91.42490266080884 - type: manhattan_accuracy value: 81.3 - type: manhattan_accuracy_threshold value: 22100.830078125 - type: manhattan_ap value: 91.25996158651282 - type: manhattan_f1 value: 85.38102643856921 - type: manhattan_f1_threshold value: 24043.515014648438 - type: manhattan_precision value: 80.49853372434018 - type: manhattan_recall value: 90.89403973509934 - type: max_accuracy value: 81.8 - 
type: max_ap value: 91.42490266080884 - type: max_f1 value: 85.4632587859425 - type: max_precision value: 82.56172839506173 - type: max_recall value: 93.21192052980133 - type: similarity_accuracy value: 81.8 - type: similarity_accuracy_threshold value: 90.47793745994568 - type: similarity_ap value: 91.42490266080884 - type: similarity_f1 value: 85.4632587859425 - type: similarity_f1_threshold value: 90.47793745994568 - type: similarity_precision value: 82.56172839506173 - type: similarity_recall value: 88.57615894039735 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval (default) type: mteb/quora config: default split: test revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 metrics: - type: map_at_1 value: 71.419 - type: map_at_10 value: 85.542 - type: map_at_100 value: 86.161 - type: map_at_1000 value: 86.175 - type: map_at_20 value: 85.949 - type: map_at_3 value: 82.623 - type: map_at_5 value: 84.5 - type: mrr_at_1 value: 82.27 - type: mrr_at_10 value: 88.21900000000001 - type: mrr_at_100 value: 88.313 - type: mrr_at_1000 value: 88.31400000000001 - type: mrr_at_20 value: 88.286 - type: mrr_at_3 value: 87.325 - type: mrr_at_5 value: 87.97500000000001 - type: ndcg_at_1 value: 82.3 - type: ndcg_at_10 value: 89.088 - type: ndcg_at_100 value: 90.217 - type: ndcg_at_1000 value: 90.29700000000001 - type: ndcg_at_20 value: 89.697 - type: ndcg_at_3 value: 86.435 - type: ndcg_at_5 value: 87.966 - type: precision_at_1 value: 82.3 - type: precision_at_10 value: 13.527000000000001 - type: precision_at_100 value: 1.537 - type: precision_at_1000 value: 0.157 - type: precision_at_20 value: 7.165000000000001 - type: precision_at_3 value: 37.92 - type: precision_at_5 value: 24.914 - type: recall_at_1 value: 71.419 - type: recall_at_10 value: 95.831 - type: recall_at_100 value: 99.64 - type: recall_at_1000 value: 99.988 - type: recall_at_20 value: 97.76599999999999 - type: recall_at_3 value: 88.081 - type: recall_at_5 value: 92.50500000000001 - type: main_score value: 89.088 - task: type: STS dataset: name: MTEB RUParaPhraserSTS (default) type: merionum/ru_paraphraser config: default split: test revision: 43265056790b8f7c59e0139acb4be0a8dad2c8f4 metrics: - type: cosine_pearson value: 67.91177744712421 - type: cosine_spearman value: 76.77113726753656 - type: euclidean_pearson value: 73.81454206068638 - type: euclidean_spearman value: 76.92529493599028 - type: main_score value: 76.77113726753656 - type: manhattan_pearson value: 73.81690454439168 - type: manhattan_spearman value: 76.87333776705002 - type: pearson value: 67.91177744712421 - type: spearman value: 76.77113726753656 - task: type: Clustering dataset: name: MTEB RedditClustering (default) type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: main_score value: 55.39924225216962 - type: v_measure value: 55.39924225216962 - type: v_measure_std value: 4.723802279292467 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P (default) type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: main_score value: 62.87465161304012 - type: v_measure value: 62.87465161304012 - type: v_measure_std value: 12.082670914488473 - task: type: Retrieval dataset: name: MTEB RiaNewsRetrieval (default) type: ai-forever/ria-news-retrieval config: default split: test revision: 82374b0bbacda6114f39ff9c5b925fa1512ca5d7 metrics: - type: main_score value: 79.209 - type: map_at_1 value: 67.33 - type: map_at_10 value: 
      75.633
    - type: map_at_100
      value: 75.897
    - type: map_at_1000
      value: 75.907
    - type: map_at_20
      value: 75.804
    - type: map_at_3
      value: 74.2
    - type: map_at_5
      value: 75.13300000000001
    - type: mrr_at_1
      value: 67.31
    - type: mrr_at_10
      value: 75.62709126984095
    - type: mrr_at_100
      value: 75.89105697041113
    - type: mrr_at_1000
      value: 75.90115653883124
    - type: mrr_at_20
      value: 75.79802332308172
    - type: mrr_at_3
      value: 74.19499999999961
    - type: mrr_at_5
      value: 75.12849999999939
    - type: nauc_map_at_1000_diff1
      value: 74.30304869630591
    - type: nauc_map_at_1000_max
      value: 36.477146725784046
    - type: nauc_map_at_1000_std
      value: -20.862772498461723
    - type: nauc_map_at_100_diff1
      value: 74.29833058090355
    - type: nauc_map_at_100_max
      value: 36.483678619667884
    - type: nauc_map_at_100_std
      value: -20.856274849980135
    - type: nauc_map_at_10_diff1
      value: 74.20729220697967
    - type: nauc_map_at_10_max
      value: 36.56543146170092
    - type: nauc_map_at_10_std
      value: -20.991081015484728
    - type: nauc_map_at_1_diff1
      value: 77.38899022125185
    - type: nauc_map_at_1_max
      value: 32.45918619669731
    - type: nauc_map_at_1_std
      value: -22.149586336167324
    - type: nauc_map_at_20_diff1
      value: 74.2447573558587
    - type: nauc_map_at_20_max
      value: 36.50383130240387
    - type: nauc_map_at_20_std
      value: -20.87013743041831
    - type: nauc_map_at_3_diff1
      value: 74.3054577294586
    - type: nauc_map_at_3_max
      value: 36.484530586652724
    - type: nauc_map_at_3_std
      value: -21.90543024607988
    - type: nauc_map_at_5_diff1
      value: 74.21062368961503
    - type: nauc_map_at_5_max
      value: 36.55670532498779
    - type: nauc_map_at_5_std
      value: -21.488786900676942
    - type: nauc_mrr_at_1000_diff1
      value: 74.31619177956684
    - type: nauc_mrr_at_1000_max
      value: 36.53498918453189
    - type: nauc_mrr_at_1000_std
      value: -20.75986704931237
    - type: nauc_mrr_at_100_diff1
      value: 74.31146790382356
    - type: nauc_mrr_at_100_max
      value: 36.54149252857106
    - type: nauc_mrr_at_100_std
      value: -20.75341959250079
    - type: nauc_mrr_at_10_diff1
      value: 74.22027806145095
    - type: nauc_mrr_at_10_max
      value: 36.622542969971725
    - type: nauc_mrr_at_10_std
      value: -20.889417384064117
    - type: nauc_mrr_at_1_diff1
      value: 77.4306709551449
    - type: nauc_mrr_at_1_max
      value: 32.57259463438259
    - type: nauc_mrr_at_1_std
      value: -21.964402859613937
    - type: nauc_mrr_at_20_diff1
      value: 74.25784396230718
    - type: nauc_mrr_at_20_max
      value: 36.561412224507336
    - type: nauc_mrr_at_20_std
      value: -20.767665000065723
    - type: nauc_mrr_at_3_diff1
      value: 74.31423253547214
    - type: nauc_mrr_at_3_max
      value: 36.537745749488906
    - type: nauc_mrr_at_3_std
      value: -21.81259529019546
    - type: nauc_mrr_at_5_diff1
      value: 74.22404613312771
    - type: nauc_mrr_at_5_max
      value: 36.60743768455219
    - type: nauc_mrr_at_5_std
      value: -21.39479216331971
    - type: nauc_ndcg_at_1000_diff1
      value: 73.48182819705742
    - type: nauc_ndcg_at_1000_max
      value: 37.86991608461793
    - type: nauc_ndcg_at_1000_std
      value: -19.021499322688904
    - type: nauc_ndcg_at_100_diff1
      value: 73.34941250585759
    - type: nauc_ndcg_at_100_max
      value: 38.11150275625829
    - type: nauc_ndcg_at_100_std
      value: -18.70624087206104
    - type: nauc_ndcg_at_10_diff1
      value: 72.82520265115987
    - type: nauc_ndcg_at_10_max
      value: 38.43323357650525
    - type: nauc_ndcg_at_10_std
      value: -19.410953792830878
    - type: nauc_ndcg_at_1_diff1
      value: 77.38899022125185
    - type: nauc_ndcg_at_1_max
      value: 32.45918619669731
    - type: nauc_ndcg_at_1_std
      value: -22.149586336167324
    - type: nauc_ndcg_at_20_diff1
      value: 72.93309285256507
    - type: nauc_ndcg_at_20_max
      value: 38.217372819067755
    - type: nauc_ndcg_at_20_std
      value: -18.864113576359333
    - type: nauc_ndcg_at_3_diff1
      value: 73.18253776744112
    - type: nauc_ndcg_at_3_max
      value: 38.008109328364
    - type: nauc_ndcg_at_3_std
      value: -21.68785687594153
    - type: nauc_ndcg_at_5_diff1
      value: 72.90474739784793
    - type: nauc_ndcg_at_5_max
      value: 38.29483039202184
    - type: nauc_ndcg_at_5_std
      value: -20.833049811453474
    - type: nauc_precision_at_1000_diff1
      value: 59.306217613750334
    - type: nauc_precision_at_1000_max
      value: 72.20747948302262
    - type: nauc_precision_at_1000_std
      value: 45.58837180096227
    - type: nauc_precision_at_100_diff1
      value: 62.87286844562389
    - type: nauc_precision_at_100_max
      value: 61.33108214045868
    - type: nauc_precision_at_100_std
      value: 20.67481963545654
    - type: nauc_precision_at_10_diff1
      value: 64.11222984256685
    - type: nauc_precision_at_10_max
      value: 50.323697746037496
    - type: nauc_precision_at_10_std
      value: -7.9994544634332625
    - type: nauc_precision_at_1_diff1
      value: 77.38899022125185
    - type: nauc_precision_at_1_max
      value: 32.45918619669731
    - type: nauc_precision_at_1_std
      value: -22.149586336167324
    - type: nauc_precision_at_20_diff1
      value: 62.30228127286973
    - type: nauc_precision_at_20_max
      value: 52.02090746208407
    - type: nauc_precision_at_20_std
      value: 0.7629898806370331
    - type: nauc_precision_at_3_diff1
      value: 68.82856645994157
    - type: nauc_precision_at_3_max
      value: 43.94171571306625
    - type: nauc_precision_at_3_std
      value: -20.78595255410148
    - type: nauc_precision_at_5_diff1
      value: 66.62157622497887
    - type: nauc_precision_at_5_max
      value: 46.69398173603811
    - type: nauc_precision_at_5_std
      value: -17.412423571163057
    - type: nauc_recall_at_1000_diff1
      value: 59.30621761375148
    - type: nauc_recall_at_1000_max
      value: 72.20747948302191
    - type: nauc_recall_at_1000_std
      value: 45.588371800962655
    - type: nauc_recall_at_100_diff1
      value: 62.872868445623894
    - type: nauc_recall_at_100_max
      value: 61.33108214045813
    - type: nauc_recall_at_100_std
      value: 20.67481963545666
    - type: nauc_recall_at_10_diff1
      value: 64.11222984256698
    - type: nauc_recall_at_10_max
      value: 50.32369774603755
    - type: nauc_recall_at_10_std
      value: -7.999454463433321
    - type: nauc_recall_at_1_diff1
      value: 77.38899022125185
    - type: nauc_recall_at_1_max
      value: 32.45918619669731
    - type: nauc_recall_at_1_std
      value: -22.149586336167324
    - type: nauc_recall_at_20_diff1
      value: 62.3022812728695
    - type: nauc_recall_at_20_max
      value: 52.02090746208397
    - type: nauc_recall_at_20_std
      value: 0.7629898806369458
    - type: nauc_recall_at_3_diff1
      value: 68.82856645994157
    - type: nauc_recall_at_3_max
      value: 43.94171571306612
    - type: nauc_recall_at_3_std
      value: -20.78595255410157
    - type: nauc_recall_at_5_diff1
      value: 66.62157622497897
    - type: nauc_recall_at_5_max
      value: 46.693981736038246
    - type: nauc_recall_at_5_std
      value: -17.412423571162954
    - type: ndcg_at_1
      value: 67.33
    - type: ndcg_at_10
      value: 79.209
    - type: ndcg_at_100
      value: 80.463
    - type: ndcg_at_1000
      value: 80.74799999999999
    - type: ndcg_at_20
      value: 79.81899999999999
    - type: ndcg_at_3
      value: 76.335
    - type: ndcg_at_5
      value: 78.011
    - type: precision_at_1
      value: 67.33
    - type: precision_at_10
      value: 9.020999999999999
    - type: precision_at_100
      value: 0.96
    - type: precision_at_1000
      value: 0.098
    - type: precision_at_20
      value: 4.63
    - type: precision_at_3
      value: 27.493000000000002
    - type: precision_at_5
      value: 17.308
    - type: recall_at_1
      value: 67.33
    - type: recall_at_10
      value: 90.21000000000001
    - type: recall_at_100
      value: 96.00999999999999
    - type: recall_at_1000
      value: 98.29
    - type: recall_at_20
      value: 92.60000000000001
    - type: recall_at_3
      value: 82.48
    - type: recall_at_5
      value: 86.53999999999999
  - task:
      type: Reranking
    dataset:
      name: MTEB RuBQReranking (default)
      type: ai-forever/rubq-reranking
      config: default
      split: test
      revision: 2e96b8f098fa4b0950fc58eacadeb31c0d0c7fa2
    metrics:
    - type: main_score
      value: 65.57453932493252
    - type: map
      value: 65.57453932493252
    - type: mrr
      value: 70.51408205663526
    - type: nAUC_map_diff1
      value: 26.69583260609023
    - type: nAUC_map_max
      value: 12.928262749610663
    - type: nAUC_map_std
      value: 11.702468857903128
    - type: nAUC_mrr_diff1
      value: 28.5206955462174
    - type: nAUC_mrr_max
      value: 14.207162454694227
    - type: nAUC_mrr_std
      value: 10.725721001555296
  - task:
      type: Retrieval
    dataset:
      name: MTEB RuBQRetrieval (default)
      type: ai-forever/rubq-retrieval
      config: default
      split: test
      revision: e19b6ffa60b3bc248e0b41f4cc37c26a55c2a67b
    metrics:
    - type: main_score
      value: 72.306
    - type: map_at_1
      value: 44.187
    - type: map_at_10
      value: 64.836
    - type: map_at_100
      value: 65.771
    - type: map_at_1000
      value: 65.8
    - type: map_at_20
      value: 65.497
    - type: map_at_3
      value: 59.692
    - type: map_at_5
      value: 63.105
    - type: mrr_at_1
      value: 62.23404255319149
    - type: mrr_at_10
      value: 73.40810161732159
    - type: mrr_at_100
      value: 73.67949305473395
    - type: mrr_at_1000
      value: 73.68707852294746
    - type: mrr_at_20
      value: 73.60429051697479
    - type: mrr_at_3
      value: 71.47360126083535
    - type: mrr_at_5
      value: 72.8447596532704
    - type: nauc_map_at_1000_diff1
      value: 39.838449035736886
    - type: nauc_map_at_1000_max
      value: 32.29962306877408
    - type: nauc_map_at_1000_std
      value: -6.324859592714388
    - type: nauc_map_at_100_diff1
      value: 39.824361938745426
    - type: nauc_map_at_100_max
      value: 32.32055222704763
    - type: nauc_map_at_100_std
      value: -6.301641111869559
    - type: nauc_map_at_10_diff1
      value: 39.50155328718487
    - type: nauc_map_at_10_max
      value: 31.745730244960672
    - type: nauc_map_at_10_std
      value: -6.867215137329693
    - type: nauc_map_at_1_diff1
      value: 47.66181128677822
    - type: nauc_map_at_1_max
      value: 21.75204233166764
    - type: nauc_map_at_1_std
      value: -8.06951079061697
    - type: nauc_map_at_20_diff1
      value: 39.78364637902108
    - type: nauc_map_at_20_max
      value: 32.39065528029405
    - type: nauc_map_at_20_std
      value: -6.368994332729006
    - type: nauc_map_at_3_diff1
      value: 39.51829474433183
    - type: nauc_map_at_3_max
      value: 28.633292697821673
    - type: nauc_map_at_3_std
      value: -7.2561170814963925
    - type: nauc_map_at_5_diff1
      value: 39.288433237676266
    - type: nauc_map_at_5_max
      value: 31.007702201615515
    - type: nauc_map_at_5_std
      value: -7.235131195162474
    - type: nauc_mrr_at_1000_diff1
      value: 49.599102391215226
    - type: nauc_mrr_at_1000_max
      value: 38.25521825911133
    - type: nauc_mrr_at_1000_std
      value: -10.448180939809435
    - type: nauc_mrr_at_100_diff1
      value: 49.5957067716212
    - type: nauc_mrr_at_100_max
      value: 38.26760703964535
    - type: nauc_mrr_at_100_std
      value: -10.438443051971081
    - type: nauc_mrr_at_10_diff1
      value: 49.35269710190271
    - type: nauc_mrr_at_10_max
      value: 38.43782589127069
    - type: nauc_mrr_at_10_std
      value: -10.404402063509815
    - type: nauc_mrr_at_1_diff1
      value: 53.32206103688421
    - type: nauc_mrr_at_1_max
      value: 33.52402390241035
    - type: nauc_mrr_at_1_std
      value: -12.73473393949936
    - type: nauc_mrr_at_20_diff1
      value: 49.550630850826636
    - type: nauc_mrr_at_20_max
      value: 38.35964703941151
    - type: nauc_mrr_at_20_std
      value: -10.444577766284766
    - type: nauc_mrr_at_3_diff1
      value: 49.12029127633829
    - type: nauc_mrr_at_3_max
      value: 38.01631275124067
    - type: nauc_mrr_at_3_std
      value: -10.523724301481309
    - type: nauc_mrr_at_5_diff1
      value: 49.04606949432458
    - type: nauc_mrr_at_5_max
      value: 38.33647550077891
    - type: nauc_mrr_at_5_std
      value: -10.47076409263114
    - type: nauc_ndcg_at_1000_diff1
      value: 41.342785916264226
    - type: nauc_ndcg_at_1000_max
      value: 35.75731064862711
    - type: nauc_ndcg_at_1000_std
      value: -5.45573422899229
    - type: nauc_ndcg_at_100_diff1
      value: 40.972974559636086
    - type: nauc_ndcg_at_100_max
      value: 36.32938573321036
    - type: nauc_ndcg_at_100_std
      value: -4.749631537590004
    - type: nauc_ndcg_at_10_diff1
      value: 39.67813474464166
    - type: nauc_ndcg_at_10_max
      value: 35.480200504848966
    - type: nauc_ndcg_at_10_std
      value: -6.318561293935512
    - type: nauc_ndcg_at_1_diff1
      value: 53.45970160222764
    - type: nauc_ndcg_at_1_max
      value: 33.14759013278075
    - type: nauc_ndcg_at_1_std
      value: -12.579833891774847
    - type: nauc_ndcg_at_20_diff1
      value: 40.67492861219249
    - type: nauc_ndcg_at_20_max
      value: 36.84960799838019
    - type: nauc_ndcg_at_20_std
      value: -5.202530835850179
    - type: nauc_ndcg_at_3_diff1
      value: 39.574906207408844
    - type: nauc_ndcg_at_3_max
      value: 31.76512164509258
    - type: nauc_ndcg_at_3_std
      value: -7.656143208565999
    - type: nauc_ndcg_at_5_diff1
      value: 39.096348529742095
    - type: nauc_ndcg_at_5_max
      value: 34.075926475544165
    - type: nauc_ndcg_at_5_std
      value: -7.238045445366631
    - type: nauc_precision_at_1000_diff1
      value: -14.283799754212609
    - type: nauc_precision_at_1000_max
      value: 6.449741756717101
    - type: nauc_precision_at_1000_std
      value: 4.862828679759048
    - type: nauc_precision_at_100_diff1
      value: -13.23173132700258
    - type: nauc_precision_at_100_max
      value: 11.058898534529195
    - type: nauc_precision_at_100_std
      value: 7.343683941814956
    - type: nauc_precision_at_10_diff1
      value: -7.202951643546464
    - type: nauc_precision_at_10_max
      value: 17.499446869433278
    - type: nauc_precision_at_10_std
      value: 2.8367985220406307
    - type: nauc_precision_at_1_diff1
      value: 53.45970160222764
    - type: nauc_precision_at_1_max
      value: 33.14759013278075
    - type: nauc_precision_at_1_std
      value: -12.579833891774847
    - type: nauc_precision_at_20_diff1
      value: -9.477122699154124
    - type: nauc_precision_at_20_max
      value: 16.80556031564312
    - type: nauc_precision_at_20_std
      value: 6.420218284416923
    - type: nauc_precision_at_3_diff1
      value: 5.5276143574150245
    - type: nauc_precision_at_3_max
      value: 23.65952688481666
    - type: nauc_precision_at_3_std
      value: -1.8730348729295785
    - type: nauc_precision_at_5_diff1
      value: -2.4537029093721308
    - type: nauc_precision_at_5_max
      value: 21.41469327545133
    - type: nauc_precision_at_5_std
      value: 0.1543890645722277
    - type: nauc_recall_at_1000_diff1
      value: -1.7474947956413491
    - type: nauc_recall_at_1000_max
      value: 46.22670991970479
    - type: nauc_recall_at_1000_std
      value: 62.582840705588794
    - type: nauc_recall_at_100_diff1
      value: 16.116089801097345
    - type: nauc_recall_at_100_max
      value: 52.54794580975103
    - type: nauc_recall_at_100_std
      value: 33.720245696003246
    - type: nauc_recall_at_10_diff1
      value: 23.134924318655482
    - type: nauc_recall_at_10_max
      value: 38.73754275649077
    - type: nauc_recall_at_10_std
      value: 0.6137471711639239
    - type: nauc_recall_at_1_diff1
      value: 47.66181128677822
    - type: nauc_recall_at_1_max
      value: 21.75204233166764
    - type: nauc_recall_at_1_std
      value: -8.06951079061697
    - type: nauc_recall_at_20_diff1
      value: 24.130616271355017
    - type: nauc_recall_at_20_max
      value: 48.306178640146136
    - type: nauc_recall_at_20_std
      value: 9.290819557000022
    - type: nauc_recall_at_3_diff1
      value: 29.767415016250226
    - type: nauc_recall_at_3_max
      value: 28.54289782140701
    - type: nauc_recall_at_3_std
      value: -5.1395675072005576
    - type: nauc_recall_at_5_diff1
      value: 25.410613126870174
    - type: nauc_recall_at_5_max
      value: 33.24658754857624
    - type: nauc_recall_at_5_std
      value: -4.211226036746632
    - type: ndcg_at_1
      value: 62.175000000000004
    - type: ndcg_at_10
      value: 72.306
    - type: ndcg_at_100
      value: 75.074
    - type: ndcg_at_1000
      value: 75.581
    - type: ndcg_at_20
      value: 73.875
    - type: ndcg_at_3
      value: 65.641
    - type: ndcg_at_5
      value: 69.48299999999999
    - type: precision_at_1
      value: 62.175000000000004
    - type: precision_at_10
      value: 13.907
    - type: precision_at_100
      value: 1.591
    - type: precision_at_1000
      value: 0.166
    - type: precision_at_20
      value: 7.446999999999999
    - type: precision_at_3
      value: 35.619
    - type: precision_at_5
      value: 24.917
    - type: recall_at_1
      value: 44.187
    - type: recall_at_10
      value: 85.10600000000001
    - type: recall_at_100
      value: 95.488
    - type: recall_at_1000
      value: 98.831
    - type: recall_at_20
      value: 90.22200000000001
    - type: recall_at_3
      value: 68.789
    - type: recall_at_5
      value: 77.85499999999999
  - task:
      type: Classification
    dataset:
      name: MTEB RuReviewsClassification (default)
      type: ai-forever/ru-reviews-classification
      config: default
      split: test
      revision: f6d2c31f4dc6b88f468552750bfec05b4b41b05a
    metrics:
    - type: accuracy
      value: 67.5830078125
    - type: f1
      value: 67.56931936632446
    - type: f1_weighted
      value: 67.57137733752779
    - type: main_score
      value: 67.5830078125
  - task:
      type: STS
    dataset:
      name: MTEB RuSTSBenchmarkSTS (default)
      type: ai-forever/ru-stsbenchmark-sts
      config: default
      split: test
      revision: 7cf24f325c6da6195df55bef3d86b5e0616f3018
    metrics:
    - type: cosine_pearson
      value: 85.90493484626788
    - type: cosine_spearman
      value: 86.21965691667411
    - type: euclidean_pearson
      value: 86.07499842984909
    - type: euclidean_spearman
      value: 86.55506818735688
    - type: main_score
      value: 86.21965691667411
    - type: manhattan_pearson
      value: 85.95976420231729
    - type: manhattan_spearman
      value: 86.48604243661234
    - type: pearson
      value: 85.90493484626788
    - type: spearman
      value: 86.21965691667411
  - task:
      type: Classification
    dataset:
      name: MTEB RuSciBenchGRNTIClassification (default)
      type: ai-forever/ru-scibench-grnti-classification
      config: default
      split: test
      revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1
    metrics:
    - type: accuracy
      value: 59.1943359375
    - type: f1
      value: 58.894480861440414
    - type: f1_weighted
      value: 58.903615560240866
    - type: main_score
      value: 59.1943359375
  - task:
      type: Clustering
    dataset:
      name: MTEB RuSciBenchGRNTIClusteringP2P (default)
      type: ai-forever/ru-scibench-grnti-classification
      config: default
      split: test
      revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1
    metrics:
    - type: main_score
      value: 57.99209448663228
    - type: v_measure
      value: 57.99209448663228
    - type: v_measure_std
      value: 1.0381163861993816
  - task:
      type: Classification
    dataset:
      name: MTEB RuSciBenchOECDClassification (default)
      type: ai-forever/ru-scibench-oecd-classification
      config: default
      split: test
      revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471
    metrics:
    - type: accuracy
      value: 45.556640625
    - type: f1
      value: 45.159163104085906
    - type: f1_weighted
      value: 45.16098316398626
    - type: main_score
      value: 45.556640625
  - task:
      type: Clustering
    dataset:
      name: MTEB RuSciBenchOECDClusteringP2P (default)
      type: ai-forever/ru-scibench-oecd-classification
      config: default
      split: test
      revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471
    metrics:
    - type: main_score
      value: 50.787548070488974
    - type: v_measure
      value: 50.787548070488974
    - type: v_measure_std
      value: 0.8569958168946827
  - task:
      type: Retrieval
    dataset:
      name: MTEB SCIDOCS (default)
      type: mteb/scidocs
      config: default
      split: test
      revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
    metrics:
    - type: map_at_1
      value: 4.843
    - type: map_at_10
      value: 11.752
    - type: map_at_100
      value: 13.919
    - type: map_at_1000
      value: 14.198
    - type: map_at_20
      value: 12.898000000000001
    - type: map_at_3
      value: 8.603
    - type: map_at_5
      value: 10.069
    - type: mrr_at_1
      value: 23.799999999999997
    - type: mrr_at_10
      value: 34.449999999999996
    - type: mrr_at_100
      value: 35.64
    - type: mrr_at_1000
      value: 35.691
    - type: mrr_at_20
      value: 35.213
    - type: mrr_at_3
      value: 31.383
    - type: mrr_at_5
      value: 33.062999999999995
    - type: ndcg_at_1
      value: 23.799999999999997
    - type: ndcg_at_10
      value: 19.811
    - type: ndcg_at_100
      value: 28.108
    - type: ndcg_at_1000
      value: 33.1
    - type: ndcg_at_20
      value: 22.980999999999998
    - type: ndcg_at_3
      value: 19.153000000000002
    - type: ndcg_at_5
      value: 16.408
    - type: precision_at_1
      value: 23.799999999999997
    - type: precision_at_10
      value: 10.16
    - type: precision_at_100
      value: 2.1999999999999997
    - type: precision_at_1000
      value: 0.34099999999999997
    - type: precision_at_20
      value: 6.915
    - type: precision_at_3
      value: 17.8
    - type: precision_at_5
      value: 14.14
    - type: recall_at_1
      value: 4.843
    - type: recall_at_10
      value: 20.595
    - type: recall_at_100
      value: 44.66
    - type: recall_at_1000
      value: 69.152
    - type: recall_at_20
      value: 28.04
    - type: recall_at_3
      value: 10.833
    - type: recall_at_5
      value: 14.346999999999998
    - type: main_score
      value: 19.811
  - task:
      type: PairClassification
    dataset:
      name: MTEB SICK-E-PL (default)
      type: PL-MTEB/sicke-pl-pairclassification
      config: default
      split: test
      revision: 71bba34b0ece6c56dfcf46d9758a27f7a90f17e9
    metrics:
    - type: cosine_accuracy
      value: 80.90093762739502
    - type: cosine_accuracy_threshold
      value: 94.40930485725403
    - type: cosine_ap
      value: 71.15400909912427
    - type: cosine_f1
      value: 66.8213457076566
    - type: cosine_f1_threshold
      value: 91.53673648834229
    - type: cosine_precision
      value: 62.4922504649721
    - type: cosine_recall
      value: 71.7948717948718
    - type: dot_accuracy
      value: 78.41418671015083
    - type: dot_accuracy_threshold
      value: 42924.45068359375
    - type: dot_ap
      value: 63.34003025365763
    - type: dot_f1
      value: 62.518258837277244
    - type: dot_f1_threshold
      value: 40900.738525390625
    - type: dot_precision
      value: 52.99653293709758
    - type: dot_recall
      value: 76.21082621082621
    - type: euclidean_accuracy
      value: 80.67672238075826
    - type: euclidean_accuracy_threshold
      value: 696.0524559020996
    - type: euclidean_ap
      value: 70.88762835990224
    - type: euclidean_f1
      value: 66.711051930759
    - type: euclidean_f1_threshold
      value: 878.5581588745117
    - type: euclidean_precision
      value: 62.625
    - type: euclidean_recall
      value: 71.36752136752136
    - type: main_score
      value: 71.15400909912427
    - type: manhattan_accuracy
      value: 80.65633917651854
    - type: manhattan_accuracy_threshold
      value: 17277.72674560547
    - type: manhattan_ap
      value: 70.67105336611716
    - type: manhattan_f1
      value: 66.51346027577151
    - type: manhattan_f1_threshold
      value: 21687.957763671875
    - type: manhattan_precision
      value: 61.69305724725944
    - type: manhattan_recall
      value: 72.15099715099716
    - type: max_accuracy
      value: 80.90093762739502
    - type: max_ap
      value: 71.15400909912427
    - type: max_f1
      value: 66.8213457076566
    - type: max_precision
      value: 62.625
    - type: max_recall
      value: 76.21082621082621
    - type: similarity_accuracy
      value: 80.90093762739502
    - type: similarity_accuracy_threshold
      value: 94.40930485725403
    - type: similarity_ap
      value: 71.15400909912427
    - type: similarity_f1
      value: 66.8213457076566
    - type: similarity_f1_threshold
      value: 91.53673648834229
    - type: similarity_precision
      value: 62.4922504649721
    - type: similarity_recall
      value: 71.7948717948718
  - task:
      type: STS
    dataset:
      name: MTEB SICK-R (default)
      type: mteb/sickr-sts
      config: default
      split: test
      revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
    metrics:
    - type: cosine_pearson
      value: 92.3339946866199
    - type: cosine_spearman
      value: 89.61697355115497
    - type: euclidean_pearson
      value: 90.3264916449669
    - type: euclidean_spearman
      value: 89.36270451308866
    - type: main_score
      value: 89.61697355115497
    - type: manhattan_pearson
      value: 90.18909339052534
    - type: manhattan_spearman
      value: 89.28337093097377
    - type: pearson
      value: 92.3339946866199
    - type: spearman
      value: 89.61697355115497
  - task:
      type: STS
    dataset:
      name: MTEB SICK-R-PL (default)
      type: PL-MTEB/sickr-pl-sts
      config: default
      split: test
      revision: fd5c2441b7eeff8676768036142af4cfa42c1339
    metrics:
    - type: cosine_pearson
      value: 85.27883048457821
    - type: cosine_spearman
      value: 80.53204892678619
    - type: euclidean_pearson
      value: 82.78520705216168
    - type: euclidean_spearman
      value: 80.27848359873212
    - type: main_score
      value: 80.53204892678619
    - type: manhattan_pearson
      value: 82.63270640583454
    - type: manhattan_spearman
      value: 80.21507977473146
    - type: pearson
      value: 85.27883048457821
    - type: spearman
      value: 80.53204892678619
  - task:
      type: STS
    dataset:
      name: MTEB SICKFr (default)
      type: Lajavaness/SICK-fr
      config: default
      split: test
      revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a
    metrics:
    - type: cosine_pearson
      value: 88.77029361817212
    - type: cosine_spearman
      value: 83.9453600346894
    - type: euclidean_pearson
      value: 85.85331086208573
    - type: euclidean_spearman
      value: 83.70852031985308
    - type: main_score
      value: 83.9453600346894
    - type: manhattan_pearson
      value: 85.66222265885914
    - type: manhattan_spearman
      value: 83.60833111525962
    - type: pearson
      value: 88.77029361817212
    - type: spearman
      value: 83.9453600346894
  - task:
      type: STS
    dataset:
      name: MTEB STS12 (default)
      type: mteb/sts12-sts
      config: default
      split: test
      revision: a0d554a64d88156834ff5ae9920b964011b16384
    metrics:
    - type: cosine_pearson
      value: 88.76435859522375
    - type: cosine_spearman
      value: 82.43768167804375
    - type: euclidean_pearson
      value: 87.43566183874832
    - type: euclidean_spearman
      value: 82.82166873757507
    - type: main_score
      value: 82.43768167804375
    - type: manhattan_pearson
      value: 87.39450871380951
    - type: manhattan_spearman
      value: 82.89253043430163
    - type: pearson
      value: 88.76435859522375
    - type: spearman
      value: 82.43768167804375
  - task:
      type: STS
    dataset:
      name: MTEB STS13 (default)
      type: mteb/sts13-sts
      config: default
      split: test
      revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
    metrics:
    - type: cosine_pearson
      value: 88.86627241652141
    - type: cosine_spearman
      value: 89.49011599120688
    - type: euclidean_pearson
      value: 89.3314120073772
    - type: euclidean_spearman
      value: 89.8226502776963
    - type: main_score
      value: 89.49011599120688
    - type: manhattan_pearson
      value: 89.2252179076963
    - type: manhattan_spearman
      value: 89.74573844021225
    - type: pearson
      value: 88.86627241652141
    - type: spearman
      value: 89.49011599120688
  - task:
      type: STS
    dataset:
      name: MTEB STS14 (default)
      type: mteb/sts14-sts
      config: default
      split: test
      revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
    metrics:
    - type: cosine_pearson
      value: 87.22891405215968
    - type: cosine_spearman
      value: 84.9467188157614
    - type: euclidean_pearson
      value: 87.20330004726237
    - type: euclidean_spearman
      value: 85.34806059461808
    - type: main_score
      value: 84.9467188157614
    - type: manhattan_pearson
      value: 87.15224666107623
    - type: manhattan_spearman
      value: 85.34596898699708
    - type: pearson
      value: 87.22891405215968
    - type: spearman
      value: 84.9467188157614
  - task:
      type: STS
    dataset:
      name: MTEB STS15 (default)
      type: mteb/sts15-sts
      config: default
      split: test
      revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
    metrics:
    - type: cosine_pearson
      value: 88.14066430111033
    - type: cosine_spearman
      value: 89.31337445552545
    - type: euclidean_pearson
      value: 89.08039335366983
    - type: euclidean_spearman
      value: 89.6658762856415
    - type: main_score
      value: 89.31337445552545
    - type: manhattan_pearson
      value: 89.08057438154486
    - type: manhattan_spearman
      value: 89.68673984203022
    - type: pearson
      value: 88.14066430111033
    - type: spearman
      value: 89.31337445552545
  - task:
      type: STS
    dataset:
      name: MTEB STS16 (default)
      type: mteb/sts16-sts
      config: default
      split: test
      revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
    metrics:
    - type: cosine_pearson
      value: 85.14908856657084
    - type: cosine_spearman
      value: 86.84648320786727
    - type: euclidean_pearson
      value: 86.11454713131947
    - type: euclidean_spearman
      value: 86.77738862047961
    - type: main_score
      value: 86.84648320786727
    - type: manhattan_pearson
      value: 86.07804821916372
    - type: manhattan_spearman
      value: 86.78676064310474
    - type: pearson
      value: 85.14908856657084
    - type: spearman
      value: 86.84648320786727
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (en-en)
      type: mteb/sts17-crosslingual-sts
      config: en-en
      split: test
      revision: faeb762787bd10488a50c8b5be4a3b82e411949c
    metrics:
    - type: cosine_pearson
      value: 89.61633502468356
    - type: cosine_spearman
      value: 89.99772663224805
    - type: euclidean_pearson
      value: 90.14056501501044
    - type: euclidean_spearman
      value: 90.04496896837503
    - type: main_score
      value: 89.99772663224805
    - type: manhattan_pearson
      value: 90.08964860311801
    - type: manhattan_spearman
      value: 90.00091712362196
    - type: pearson
      value: 89.61633502468356
    - type: spearman
      value: 89.99772663224805
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (es-en)
      type: mteb/sts17-crosslingual-sts
      config: es-en
      split: test
      revision: faeb762787bd10488a50c8b5be4a3b82e411949c
    metrics:
    - type: cosine_pearson
      value: 86.44548026840202
    - type: cosine_spearman
      value: 87.26263108768539
    - type: euclidean_pearson
      value: 86.42844593583838
    - type: euclidean_spearman
      value: 86.89388428664364
    - type: main_score
      value: 87.26263108768539
    - type: manhattan_pearson
      value: 86.47186940800881
    - type: manhattan_spearman
      value: 87.02163091089946
    - type: pearson
      value: 86.44548026840202
    - type: spearman
      value: 87.26263108768539
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (en-de)
      type: mteb/sts17-crosslingual-sts
      config: en-de
      split: test
      revision: faeb762787bd10488a50c8b5be4a3b82e411949c
    metrics:
    - type: cosine_pearson
      value: 87.89345132532758
    - type: cosine_spearman
      value: 87.96246221327699
    - type: euclidean_pearson
      value: 88.49013032701419
    - type: euclidean_spearman
      value: 87.81981265317344
    - type: main_score
      value: 87.96246221327699
    - type: manhattan_pearson
      value: 88.31360914178538
    - type: manhattan_spearman
      value: 87.62734530005075
    - type: pearson
      value: 87.89345132532758
    - type: spearman
      value: 87.96246221327699
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (es-es)
      type: mteb/sts17-crosslingual-sts
      config: es-es
      split: test
      revision: faeb762787bd10488a50c8b5be4a3b82e411949c
    metrics:
    - type: cosine_pearson
      value: 88.4084678497171
    - type: cosine_spearman
      value: 88.77640638748285
    - type: euclidean_pearson
      value: 89.60124312475843
    - type: euclidean_spearman
      value: 88.4321442688528
    - type: main_score
      value: 88.77640638748285
    - type: manhattan_pearson
      value: 89.62375118021299
    - type: manhattan_spearman
      value: 88.46998118661577
    - type: pearson
      value: 88.4084678497171
    - type: spearman
      value: 88.77640638748285
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (fr-en)
      type: mteb/sts17-crosslingual-sts
      config: fr-en
      split: test
      revision: faeb762787bd10488a50c8b5be4a3b82e411949c
    metrics:
    - type: cosine_pearson
      value: 87.30688801326498
    - type: cosine_spearman
      value: 87.55684697258378
    - type: euclidean_pearson
      value: 87.89672951056794
    - type: euclidean_spearman
      value: 87.28050429201674
    - type: main_score
      value: 87.55684697258378
    - type: manhattan_pearson
      value: 87.74292745320572
    - type: manhattan_spearman
      value: 87.16383993876582
    - type: pearson
      value: 87.30688801326498
    - type: spearman
      value: 87.55684697258378
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (zh-en)
      type: mteb/sts22-crosslingual-sts
      config: zh-en
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 73.46180375170147
    - type: cosine_spearman
      value: 73.39559590127081
    - type: euclidean_pearson
      value: 73.72613901293681
    - type: euclidean_spearman
      value: 71.85465165176795
    - type: main_score
      value: 73.39559590127081
    - type: manhattan_pearson
      value: 73.07859140869076
    - type: manhattan_spearman
      value: 71.22047343718893
    - type: pearson
      value: 73.46180375170147
    - type: spearman
      value: 73.39559590127081
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (zh)
      type: mteb/sts22-crosslingual-sts
      config: zh
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 62.47531620842637
    - type: cosine_spearman
      value: 66.22504667157702
    - type: euclidean_pearson
      value: 66.76201254783692
    - type: euclidean_spearman
      value: 66.86115760269463
    - type: main_score
      value: 66.22504667157702
    - type: manhattan_pearson
      value: 66.73847836793489
    - type: manhattan_spearman
      value: 66.7677116377695
    - type: pearson
      value: 62.47531620842637
    - type: spearman
      value: 66.22504667157702
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (es)
      type: mteb/sts22-crosslingual-sts
      config: es
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 69.89707002436481
    - type: cosine_spearman
      value: 72.2054865735116
    - type: euclidean_pearson
      value: 71.81856615570756
    - type: euclidean_spearman
      value: 72.72593304629407
    - type: main_score
      value: 72.2054865735116
    - type: manhattan_pearson
      value: 72.00362684700072
    - type: manhattan_spearman
      value: 72.62783534769964
    - type: pearson
      value: 69.89707002436481
    - type: spearman
      value: 72.2054865735116
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (fr)
      type: mteb/sts22-crosslingual-sts
      config: fr
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 81.59623734395916
    - type: cosine_spearman
      value: 83.28946105111358
    - type: euclidean_pearson
      value: 79.377330171466
    - type: euclidean_spearman
      value: 81.81029781662205
    - type: main_score
      value: 83.28946105111358
    - type: manhattan_pearson
      value: 78.96970881689698
    - type: manhattan_spearman
      value: 81.91773236079703
    - type: pearson
      value: 81.59623734395916
    - type: spearman
      value: 83.28946105111358
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de-fr)
      type: mteb/sts22-crosslingual-sts
      config: de-fr
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 55.03825643126142
    - type: cosine_spearman
      value: 58.25792501780429
    - type: euclidean_pearson
      value: 50.38007603973409
    - type: euclidean_spearman
      value: 59.39961789383097
    - type: main_score
      value: 58.25792501780429
    - type: manhattan_pearson
      value: 50.518568927999155
    - type: manhattan_spearman
      value: 59.84185466003894
    - type: pearson
      value: 55.03825643126142
    - type: spearman
      value: 58.25792501780429
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (pl-en)
      type: mteb/sts22-crosslingual-sts
      config: pl-en
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 77.77233721490776
    - type: cosine_spearman
      value: 76.17596588017625
    - type: euclidean_pearson
      value: 74.47600468156611
    - type: euclidean_spearman
      value: 72.61278728057012
    - type: main_score
      value: 76.17596588017625
    - type: manhattan_pearson
      value: 74.48118910099699
    - type: manhattan_spearman
      value: 73.33167419101696
    - type: pearson
      value: 77.77233721490776
    - type: spearman
      value: 76.17596588017625
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (pl)
      type: mteb/sts22-crosslingual-sts
      config: pl
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 42.87453608131507
    - type: cosine_spearman
      value: 45.137849894401185
    - type: euclidean_pearson
      value: 31.66964197694796
    - type: euclidean_spearman
      value: 44.1014900837869
    - type: main_score
      value: 45.137849894401185
    - type: manhattan_pearson
      value: 31.007199259384745
    - type: manhattan_spearman
      value: 43.48181523288926
    - type: pearson
      value: 42.87453608131507
    - type: spearman
      value: 45.137849894401185
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (en)
      type: mteb/sts22-crosslingual-sts
      config: en
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 66.87400150638176
    - type: cosine_spearman
      value: 67.27861354834066
    - type: euclidean_pearson
      value: 66.81789582140216
    - type: euclidean_spearman
      value: 66.44220479858708
    - type: main_score
      value: 67.27861354834066
    - type: manhattan_pearson
      value: 66.92509859033235
    - type: manhattan_spearman
      value: 66.46841124185076
    - type: pearson
      value: 66.87400150638176
    - type: spearman
      value: 67.27861354834066
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (ru)
      type: mteb/sts22-crosslingual-sts
      config: ru
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 61.819804551576084
    - type: cosine_spearman
      value: 65.0864146772135
    - type: euclidean_pearson
      value: 62.518151090361876
    - type: euclidean_spearman
      value: 65.13608138548017
    - type: main_score
      value: 65.0864146772135
    - type: manhattan_pearson
      value: 62.51413246915267
    - type: manhattan_spearman
      value: 65.19077543064323
    - type: pearson
      value: 61.819804551576084
    - type: spearman
      value: 65.0864146772135
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de)
      type: mteb/sts22-crosslingual-sts
      config: de
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 54.85728696035389
    - type: cosine_spearman
      value: 61.60906359227576
    - type: euclidean_pearson
      value: 52.57582587901851
    - type: euclidean_spearman
      value: 61.41823097598308
    - type: main_score
      value: 61.60906359227576
    - type: manhattan_pearson
      value: 52.500978361080506
    - type: manhattan_spearman
      value: 61.30365596659758
    - type: pearson
      value: 54.85728696035389
    - type: spearman
      value: 61.60906359227576
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (fr-pl)
      type: mteb/sts22-crosslingual-sts
      config: fr-pl
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 67.68016005631422
    - type: cosine_spearman
      value: 84.51542547285167
    - type: euclidean_pearson
      value: 66.19871164667245
    - type: euclidean_spearman
      value: 73.24670207647144
    - type: main_score
      value: 84.51542547285167
    - type: manhattan_pearson
      value: 67.0443525268974
    - type: manhattan_spearman
      value: 73.24670207647144
    - type: pearson
      value: 67.68016005631422
    - type: spearman
      value: 84.51542547285167
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de-pl)
      type: mteb/sts22-crosslingual-sts
      config: de-pl
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 47.49467414030747
    - type: cosine_spearman
      value: 56.81512095681289
    - type: euclidean_pearson
      value: 48.42860221765214
    - type: euclidean_spearman
      value: 58.63197306329092
    - type: main_score
      value: 56.81512095681289
    - type: manhattan_pearson
      value: 48.39594959260441
    - type: manhattan_spearman
      value: 58.63197306329092
    - type: pearson
      value: 47.49467414030747
    - type: spearman
      value: 56.81512095681289
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (es-en)
      type: mteb/sts22-crosslingual-sts
      config: es-en
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 76.8364678896155
    - type: cosine_spearman
      value: 78.45516413087114
    - type: euclidean_pearson
      value: 78.62779318576634
    - type: euclidean_spearman
      value: 78.88760695649488
    - type: main_score
      value: 78.45516413087114
    - type: manhattan_pearson
      value: 78.62131335760031
    - type: manhattan_spearman
      value: 78.81861844200388
    - type: pearson
      value: 76.8364678896155
    - type: spearman
      value: 78.45516413087114
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de-en)
      type: mteb/sts22-crosslingual-sts
      config: de-en
      split: test
      revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
    metrics:
    - type: cosine_pearson
      value: 65.16640313911604
    - type: cosine_spearman
      value: 60.887608967403914
    - type: euclidean_pearson
      value: 67.49902244990913
    - type: euclidean_spearman
      value: 59.2458787136538
    - type: main_score
      value: 60.887608967403914
    - type: manhattan_pearson
      value: 67.34313506388378
    - type: manhattan_spearman
      value: 59.05283429200166
    - type: pearson
      value: 65.16640313911604
    - type: spearman
      value: 60.887608967403914
  - task:
      type: STS
    dataset:
      name: MTEB STSB (default)
      type: C-MTEB/STSB
      config: default
      split: test
      revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0
    metrics:
    - type: cosine_pearson
      value: 81.5092853013241
    - type: cosine_spearman
      value: 83.54005474244292
    - type: euclidean_pearson
      value: 83.7246578378554
    - type: euclidean_spearman
      value: 84.46767551087716
    - type: main_score
      value: 83.54005474244292
    - type: manhattan_pearson
      value: 83.65922665594636
    - type: manhattan_spearman
      value: 84.42431449101848
    - type: pearson
      value: 81.5092853013241
    - type: spearman
      value: 83.54005474244292
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmark (default)
      type: mteb/stsbenchmark-sts
      config: default
      split: test
      revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
    metrics:
    - type: cosine_pearson
      value: 87.70246866744966
    - type: cosine_spearman
      value: 89.44070045346106
    - type: euclidean_pearson
      value: 89.56956519641007
    - type: euclidean_spearman
      value: 89.95830112784283
    - type: main_score
      value: 89.44070045346106
    - type: manhattan_pearson
      value: 89.48264471425145
    - type: manhattan_spearman
      value: 89.87900732483114
    - type: pearson
      value: 87.70246866744966
    - type: spearman
      value: 89.44070045346106
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmarkMultilingualSTS (de)
      type: mteb/stsb_multi_mt
      config: de
      split: test
      revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
    metrics:
    - type: cosine_pearson
      value: 86.83701990805217
    - type: cosine_spearman
      value: 87.80280785492258
    - type: euclidean_pearson
      value: 87.77325330043514
    - type: euclidean_spearman
      value: 88.3564607283144
    - type: main_score
      value: 87.80280785492258
    - type: manhattan_pearson
      value: 87.6745449945946
    - type: manhattan_spearman
      value: 88.30660465978795
    - type: pearson
      value: 86.83701990805217
    - type: spearman
      value: 87.80280785492258
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmarkMultilingualSTS (zh)
      type: mteb/stsb_multi_mt
      config: zh
      split: test
      revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
    metrics:
    - type: cosine_pearson
      value: 84.27751020600267
    - type: cosine_spearman
      value: 85.63500407412486
    - type: euclidean_pearson
      value: 85.21829891649696
    - type: euclidean_spearman
      value: 85.9384575715382
    - type: main_score
      value: 85.63500407412486
    - type: manhattan_pearson
      value: 85.10797194089801
    - type: manhattan_spearman
      value: 85.8770162042784
    - type: pearson
      value: 84.27751020600267
    - type: spearman
      value: 85.63500407412486
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmarkMultilingualSTS (fr)
      type: mteb/stsb_multi_mt
      config: fr
      split: test
      revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
    metrics:
    - type: cosine_pearson
      value: 86.56833656723254
    - type: cosine_spearman
      value: 87.4393978501382
    - type: euclidean_pearson
      value: 87.45171512751267
    - type: euclidean_spearman
      value: 88.13106516566947
    - type: main_score
      value: 87.4393978501382
    - type: manhattan_pearson
      value: 87.33010961793333
    - type: manhattan_spearman
      value: 88.06707425102182
    - type: pearson
      value: 86.56833656723254
    - type: spearman
      value: 87.4393978501382
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmarkMultilingualSTS (pl)
      type: mteb/stsb_multi_mt
      config: pl
      split: test
      revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
    metrics:
    - type: cosine_pearson
      value: 85.45065540325523
    - type: cosine_spearman
      value: 85.47881076789359
    - type: euclidean_pearson
      value: 85.1999493863155
    - type: euclidean_spearman
      value: 85.7874947669187
    - type: main_score
      value: 85.47881076789359
    - type: manhattan_pearson
      value: 85.06075305990376
    - type: manhattan_spearman
      value: 85.71563015639558
    - type: pearson
      value: 85.45065540325523
    - type: spearman
      value: 85.47881076789359
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmarkMultilingualSTS (es)
      type: mteb/stsb_multi_mt
      config: es
      split: test
      revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
    metrics:
    - type: cosine_pearson
      value: 87.11952824079832
    - type: cosine_spearman
      value: 87.9643473573153
    - type: euclidean_pearson
      value: 88.11750364639971
    - type: euclidean_spearman
      value: 88.63695109016498
    - type: main_score
      value: 87.9643473573153
    - type: manhattan_pearson
      value: 88.00294453126699
    - type: manhattan_spearman
      value: 88.53750241758391
    - type: pearson
      value: 87.11952824079832
    - type: spearman
      value: 87.9643473573153
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmarkMultilingualSTS (ru)
      type: mteb/stsb_multi_mt
      config: ru
      split: test
      revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
    metrics:
    - type: cosine_pearson
      value: 85.99804354414991
    - type: cosine_spearman
      value: 86.30252111551002
    - type: euclidean_pearson
      value: 86.1880652037762
    - type: euclidean_spearman
      value: 86.69556223944502
    - type: main_score
      value: 86.30252111551002
    - type: manhattan_pearson
      value: 86.0736400320898
    - type: manhattan_spearman
      value: 86.61747927593393
    - type: pearson
      value: 85.99804354414991
    - type: spearman
      value: 86.30252111551002
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmarkMultilingualSTS (en)
      type: mteb/stsb_multi_mt
      config: en
      split: test
      revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c
    metrics:
    - type: cosine_pearson
      value: 87.70246861738103
    - type: cosine_spearman
      value: 89.44070045346106
    - type: euclidean_pearson
      value: 89.56956518833663
    - type: euclidean_spearman
      value: 89.95830112784283
    - type: main_score
      value: 89.44070045346106
    - type: manhattan_pearson
      value: 89.48264470792915
    - type: manhattan_spearman
      value: 89.87900732483114
    - type: pearson
      value: 87.70246861738103
    - type: spearman
      value: 89.44070045346106
  - task:
      type: Reranking
    dataset:
      name: MTEB SciDocsRR (default)
      type: mteb/scidocs-reranking
      config: default
      split: test
      revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
    metrics:
    - type: map
      value: 84.88064122814694
    - type: mrr
      value: 95.84832651009123
    - type: main_score
      value: 84.88064122814694
  - task:
      type: Retrieval
    dataset:
      name: MTEB SciFact (default)
      type: mteb/scifact
      config: default
      split: test
      revision: 0228b52cf27578f30900b9e5271d331663a030d7
    metrics:
    - type: map_at_1
      value: 57.289
    - type: map_at_10
      value: 67.88499999999999
    - type: map_at_100
      value: 68.477
    - type: map_at_1000
      value: 68.50500000000001
    - type: map_at_20
      value: 68.33500000000001
    - type: map_at_3
      value: 65.08
    - type: map_at_5
      value: 67.001
    - type: mrr_at_1
      value: 59.667
    - type: mrr_at_10
      value: 68.626
    - type: mrr_at_100
      value: 69.082
    - type: mrr_at_1000
      value: 69.108
    - type: mrr_at_20
      value: 68.958
    - type: mrr_at_3
      value: 66.667
    - type: mrr_at_5
      value: 67.983
    - type: ndcg_at_1
      value: 59.667
    - type: ndcg_at_10
      value: 72.309
    - type: ndcg_at_100
      value: 74.58399999999999
    - type: ndcg_at_1000
      value: 75.25500000000001
    - type: ndcg_at_20
      value: 73.656
    - type: ndcg_at_3
      value: 67.791
    - type: ndcg_at_5
      value: 70.45
    - type: precision_at_1
      value: 59.667
    - type: precision_at_10
      value: 9.567
    - type: precision_at_100
      value: 1.073
    - type: precision_at_1000
      value: 0.11299999999999999
    - type: precision_at_20
      value: 5.083
    - type: precision_at_3
      value: 26.333000000000002
    - type: precision_at_5
      value: 17.666999999999998
    - type: recall_at_1
      value: 57.289
    - type: recall_at_10
      value: 84.756
    - type: recall_at_100
      value: 94.5
    - type: recall_at_1000
      value: 99.667
    - type: recall_at_20
      value: 89.7
    - type: recall_at_3
      value: 73.22800000000001
    - type: recall_at_5
      value: 79.444
    - type: main_score
      value: 72.309
  - task:
      type: Clustering
    dataset:
      name: MTEB SpanishNewsClusteringP2P (default)
      type: jinaai/spanish_news_clustering
      config: default
      split: test
      revision: bf8ca8ddc5b7da4f7004720ddf99bbe0483480e6
    metrics:
    - type: main_score
      value: 45.04477709795154
    - type: v_measure
      value: 45.04477709795154
    - type: v_measure_std
      value: 0.0
  - task:
      type: Retrieval
    dataset:
      name: MTEB SpanishPassageRetrievalS2S (default)
      type: jinaai/spanish_passage_retrieval
      config: default
      split: test
      revision: 9cddf2ce5209ade52c2115ccfa00eb22c6d3a837
    metrics:
    - type: main_score
      value: 69.83
    - type: map_at_1
      value: 15.736
    - type: map_at_10
      value: 52.027
    - type: map_at_100
      value: 65.08800000000001
    - type: map_at_1000
      value: 65.08800000000001
    - type: map_at_20
      value: 60.79900000000001
    - type: map_at_3
      value: 32.869
    - type: map_at_5
      value: 41.436
    - type: mrr_at_1
      value: 75.44910179640718
    - type: mrr_at_10
      value: 84.43446440452426
    - type: mrr_at_100
      value: 84.48052612723271
    - type: mrr_at_1000
      value: 84.48052612723271
    - type: mrr_at_20
      value: 84.48052612723271
    - type: mrr_at_3
      value: 83.13373253493013
    - type: mrr_at_5
      value: 84.3013972055888
    - type: nauc_map_at_1000_diff1
      value: 50.611540149694356
    - type: nauc_map_at_1000_max
      value: 2.1102430434260238
    - type: nauc_map_at_1000_std
      value: -18.88993521335793
    - type: nauc_map_at_100_diff1
      value: 50.611540149694356
    - type: nauc_map_at_100_max
      value: 2.1102430434260238
    - type: nauc_map_at_100_std
      value: -18.88993521335793
    - type: nauc_map_at_10_diff1
      value: 59.13518981755268
    - type: nauc_map_at_10_max
      value: -9.810386627392807
    - type: nauc_map_at_10_std
      value: -38.31810152345078
    - type: nauc_map_at_1_diff1
      value: 74.96782567287174
    - type: nauc_map_at_1_max
      value: -29.648279252607875
    - type: nauc_map_at_1_std
      value: -54.017459339141595
    - type: nauc_map_at_20_diff1
      value: 55.26694458629849
    - type: nauc_map_at_20_max
      value: -1.9490244535020729
    - type: nauc_map_at_20_std
      value: -25.22211659104076
    - type: nauc_map_at_3_diff1
      value: 71.67607885031732
    - type: nauc_map_at_3_max
      value: -25.078101661694507
    - type: nauc_map_at_3_std
      value: -50.55408861920259
    - type: nauc_map_at_5_diff1
      value: 61.50111515417668
    - type: nauc_map_at_5_max
      value: -16.4114670513168
    - type: nauc_map_at_5_std
      value: -44.391416134859135
    - type: nauc_mrr_at_1000_diff1
      value: 74.18848063283234
    - type: nauc_mrr_at_1000_max
      value: 21.929205946778005
    - type: nauc_mrr_at_1000_std
      value: -36.27399268489433
    - type: nauc_mrr_at_100_diff1
      value: 74.18848063283234
    - type: nauc_mrr_at_100_max
      value: 21.929205946778005
    - type: nauc_mrr_at_100_std
      value: -36.27399268489433
    - type: nauc_mrr_at_10_diff1
      value: 74.27231582268745
    - type: nauc_mrr_at_10_max
      value: 21.481133301135337
    - type: nauc_mrr_at_10_std
      value: -36.72070854872902
    - type: nauc_mrr_at_1_diff1
      value: 76.54855950439561
    - type: nauc_mrr_at_1_max
      value: 26.99938321212366
    - type: nauc_mrr_at_1_std
      value: -33.098742603429635
    - type: nauc_mrr_at_20_diff1
      value: 74.18848063283234
    - type: nauc_mrr_at_20_max
      value: 21.929205946778005
    - type: nauc_mrr_at_20_std
      value: -36.27399268489433
    - type: nauc_mrr_at_3_diff1
      value: 72.05379526740143
    - type: nauc_mrr_at_3_max
      value: 18.875831185752528
    - type: nauc_mrr_at_3_std
      value: -37.27302006456391
    - type: nauc_mrr_at_5_diff1
      value: 74.25342356682029
    - type: nauc_mrr_at_5_max
      value: 20.756340085088738
    - type: nauc_mrr_at_5_std
      value: -37.99507208540703
    - type: nauc_ndcg_at_1000_diff1
      value: 53.259363764380275
    - type: nauc_ndcg_at_1000_max
      value: 12.936954959423218
    - type: nauc_ndcg_at_1000_std
      value: -16.953898675672153
    - type: nauc_ndcg_at_100_diff1
      value: 53.259363764380275
    - type: nauc_ndcg_at_100_max
      value: 12.936954959423218
    - type: nauc_ndcg_at_100_std
      value: -16.953898675672153
    - type: nauc_ndcg_at_10_diff1
      value: 53.70942345413554
    - type: nauc_ndcg_at_10_max
      value: -3.8465093347016186
    - type: nauc_ndcg_at_10_std
      value: -31.208127919994755
    - type: nauc_ndcg_at_1_diff1
      value: 75.30551289259554
    - type: nauc_ndcg_at_1_max
      value: 25.53292054129834
    - type: nauc_ndcg_at_1_std
      value: -33.285498788395145
    - type: nauc_ndcg_at_20_diff1
      value: 57.62409278278133
    - type: nauc_ndcg_at_20_max
      value: 2.8040586426056233
    - type: nauc_ndcg_at_20_std
      value: -26.270875776221704
    - type: nauc_ndcg_at_3_diff1
      value: 48.42294834754225
    - type: nauc_ndcg_at_3_max
      value: 16.912467881065822
    - type: nauc_ndcg_at_3_std
      value: -13.324841189277873
    - type: nauc_ndcg_at_5_diff1
      value: 47.512819802794596
    - type: nauc_ndcg_at_5_max
      value: 14.645518203506594
    - type: nauc_ndcg_at_5_std
      value: -17.641450435599275
    - type: nauc_precision_at_1000_diff1
      value: -34.43320975829637
    - type: nauc_precision_at_1000_max
      value: 29.08585622578186
    - type: nauc_precision_at_1000_std
      value: 46.55117940162061
    - type: nauc_precision_at_100_diff1
      value: -34.433209758296364
    - type: nauc_precision_at_100_max
      value: 29.085856225781885
    - type: nauc_precision_at_100_std
      value: 46.55117940162065
    - type: nauc_precision_at_10_diff1
      value: -21.895306304096902
    - type: nauc_precision_at_10_max
      value: 33.190476527593745
    - type: nauc_precision_at_10_std
      value: 37.64916268614298
    - type: nauc_precision_at_1_diff1
      value: 75.30551289259554
    - type: nauc_precision_at_1_max
      value: 25.53292054129834
    - type: nauc_precision_at_1_std
      value: -33.285498788395145
    - type: nauc_precision_at_20_diff1
      value: -27.63076748060466
    - type: nauc_precision_at_20_max
      value: 30.689810416086154
    - type: nauc_precision_at_20_std
      value: 46.164191636131626
    - type: nauc_precision_at_3_diff1
      value: 20.547345067837288
    - type: nauc_precision_at_3_max
      value: 26.177050942827528
    - type: nauc_precision_at_3_std
      value: 5.960466052973099
    - type: nauc_precision_at_5_diff1
      value: -8.928755534002669
    - type: nauc_precision_at_5_max
      value: 40.83262650073459
    - type: nauc_precision_at_5_std
      value: 26.158537031161494
    - type: nauc_recall_at_1000_diff1
      value: .nan
    - type: nauc_recall_at_1000_max
      value: .nan
    - type: nauc_recall_at_1000_std
      value: .nan
    - type: nauc_recall_at_100_diff1
      value: .nan
    - type: nauc_recall_at_100_max
      value: .nan
    - type: nauc_recall_at_100_std
      value: .nan
    - type: nauc_recall_at_10_diff1
      value: 53.08654386169444
    - type: nauc_recall_at_10_max
      value: -23.276269379519356
    - type: nauc_recall_at_10_std
      value: -50.80707792706157
    - type: nauc_recall_at_1_diff1
      value: 74.96782567287174
    - type: nauc_recall_at_1_max
      value: -29.648279252607875
    - type: nauc_recall_at_1_std
      value: -54.017459339141595
    - type: nauc_recall_at_20_diff1
      value: 51.60121897059633
    - type: nauc_recall_at_20_max
      value: -14.241779530735387
    - type: nauc_recall_at_20_std
      value: -37.877451525215456
    - type: nauc_recall_at_3_diff1
      value: 66.99474984329694
    - type: nauc_recall_at_3_max
      value: -30.802787353187966
    - type: nauc_recall_at_3_std
      value: -53.58737792129713
    - type: nauc_recall_at_5_diff1
      value: 54.64214444958567
    - type: nauc_recall_at_5_max
      value: -23.341309362104703
    - type: nauc_recall_at_5_std
      value: -51.381363923145265
    - type: ndcg_at_1
      value: 76.048
    - type: ndcg_at_10
      value: 69.83
    - type: ndcg_at_100
      value: 82.11500000000001
    - type: ndcg_at_1000
      value: 82.11500000000001
    - type: ndcg_at_20
      value: 75.995
    - type: ndcg_at_3
      value: 69.587
    - type: ndcg_at_5
      value: 69.062
    - type: precision_at_1
      value: 76.048
    - type: precision_at_10
      value: 43.653
    - type: precision_at_100
      value: 7.718999999999999
    - type: precision_at_1000
      value: 0.772
    - type: precision_at_20
      value: 31.108000000000004
    - type: precision_at_3
      value: 63.87199999999999
    - type: precision_at_5
      value: 56.407
    - type: recall_at_1
      value: 15.736
    - type: recall_at_10
      value: 66.873
    - type: recall_at_100
      value: 100.0
    - type: recall_at_1000
      value: 100.0
    - type: recall_at_20
      value: 85.01100000000001
    - type: recall_at_3
      value: 36.441
    - type: recall_at_5
      value: 49.109
  - task:
      type: PairClassification
    dataset:
      name: MTEB SprintDuplicateQuestions (default)
      type: mteb/sprintduplicatequestions-pairclassification
      config: default
      split: test
      revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
    metrics:
    - type: cosine_accuracy
      value: 99.87326732673267
    - type: cosine_accuracy_threshold
      value: 86.0752820968628
    - type: cosine_ap
      value: 96.98758090713252
    - type: cosine_f1
      value: 93.52881698685542
    - type: cosine_f1_threshold
      value: 86.0752820968628
    - type: cosine_precision
      value: 94.58077709611452
    - type: cosine_recall
      value: 92.5
    - type: dot_accuracy
      value: 99.82574257425742
    - type: dot_accuracy_threshold
      value: 40484.73815917969
    - type: dot_ap
      value: 95.68959907254845
    - type: dot_f1
      value: 91.31293188548865
    - type: dot_f1_threshold
      value: 40336.810302734375
    - type: dot_precision
      value: 90.15594541910332
    - type: dot_recall
      value: 92.5
    - type: euclidean_accuracy
      value: 99.87128712871286
    - type: euclidean_accuracy_threshold
      value: 1162.5749588012695
    - type: euclidean_ap
      value: 96.92640435656577
    - type: euclidean_f1
      value: 93.4475806451613
    - type: euclidean_f1_threshold
      value: 1162.5749588012695
    - type: euclidean_precision
      value: 94.20731707317073
    - type: euclidean_recall
      value: 92.7
    - type: main_score
      value: 96.98758090713252
    - type: manhattan_accuracy
      value: 99.86930693069307
    - type: manhattan_accuracy_threshold
      value: 28348.71826171875
    - type: manhattan_ap
      value: 96.93832673967925
    - type: manhattan_f1
      value: 93.33333333333333
    - type: manhattan_f1_threshold
      value: 28348.71826171875
    - type: manhattan_precision
      value: 94.28571428571428
    - type: manhattan_recall
      value: 92.4
    - type: max_accuracy
      value: 99.87326732673267
    - type: max_ap
      value: 96.98758090713252
    - type: max_f1
      value: 93.52881698685542
    - type: max_precision
      value: 94.58077709611452
    - type: max_recall
      value: 92.7
    - type: similarity_accuracy
      value: 99.87326732673267
    - type: similarity_accuracy_threshold
      value: 86.0752820968628
    - type: similarity_ap
      value: 96.98758090713252
    - type: similarity_f1
      value: 93.52881698685542
    - type: similarity_f1_threshold
      value: 86.0752820968628
    - type: similarity_precision
      value: 94.58077709611452
    - type: similarity_recall
      value: 92.5
  - task:
      type: Clustering
    dataset:
      name: MTEB StackExchangeClustering (default)
      type: mteb/stackexchange-clustering
      config: default
      split: test
      revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
    metrics:
    - type: main_score
      value: 65.6560129719848
    - type: v_measure
      value: 65.6560129719848
    - type: v_measure_std
      value: 4.781229811487539
  - task:
      type: Clustering
    dataset:
      name: MTEB StackExchangeClusteringP2P (default)
      type: mteb/stackexchange-clustering-p2p
      config: default
      split: test
      revision: 815ca46b2622cec33ccafc3735d572c266efdb44
    metrics:
    - type: main_score
      value: 35.07546243853692
    - type: v_measure
      value: 35.07546243853692
    - type: v_measure_std
      value: 1.1978740356240998
  - task:
      type: Reranking
    dataset:
      name: MTEB StackOverflowDupQuestions (default)
      type: mteb/stackoverflowdupquestions-reranking
      config: default
      split: test
      revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
    metrics:
    - type: map
      value: 51.771005199508835
    - type: mrr
      value: 52.65443298531534
    - type: main_score
      value: 51.771005199508835
  - task:
      type: Summarization
    dataset:
      name: MTEB SummEval (default)
      type: mteb/summeval
      config: default
      split: test
      revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
    metrics:
    - type: cosine_pearson
      value: 29.48686238342228
    - type: cosine_spearman
      value: 29.706543509170054
    - type: dot_pearson
      value: 27.95853155597859
    - type: dot_spearman
      value: 27.604287986935162
    - type: main_score
      value: 29.706543509170054
    - type: pearson
      value: 29.48686238342228
    - type: spearman
      value: 29.706543509170054
  - task:
      type: Summarization
    dataset:
      name: MTEB SummEvalFr (default)
      type: lyon-nlp/summarization-summeval-fr-p2p
      config: default
      split: test
      revision: b385812de6a9577b6f4d0f88c6a6e35395a94054
    metrics:
    - type: cosine_pearson
      value: 31.551301434917868
    - type: cosine_spearman
      value: 30.709049789175186
    - type: dot_pearson
      value: 27.77050901756549
    - type: dot_spearman
      value: 26.715505953561795
    - type: main_score
      value: 30.709049789175186
    - type: pearson
      value: 31.551301434917868
    - type: spearman
      value: 30.709049789175186
  - task:
      type: Reranking
    dataset:
      name: MTEB SyntecReranking (default)
      type: lyon-nlp/mteb-fr-reranking-syntec-s2p
      config: default
      split: test
      revision: b205c5084a0934ce8af14338bf03feb19499c84d
    metrics:
    - type: map
      value: 73.31666666666666
    - type: mrr
      value: 73.31666666666666
    - type: main_score
      value: 73.31666666666666
  - task:
      type: Retrieval
    dataset:
      name: MTEB SyntecRetrieval (default)
      type: lyon-nlp/mteb-fr-retrieval-syntec-s2p
      config: default
      split: test
      revision: 19661ccdca4dfc2d15122d776b61685f48c68ca9
    metrics:
    - type: main_score
      value: 83.851
    - type: map_at_1
      value: 68.0
    - type: map_at_10
      value: 79.187
    - type: map_at_100
      value: 79.32900000000001
    - type: map_at_1000
      value: 79.32900000000001
    - type: map_at_20
      value: 79.32900000000001
    - type: map_at_3
      value: 77.333
    - type: map_at_5
      value: 78.93299999999999
    - type: mrr_at_1
      value: 68.0
    - type: mrr_at_10
      value: 79.18730158730159
    - type: mrr_at_100
      value: 79.32945845004669
    - type: mrr_at_1000
      value: 79.32945845004669
    - type: mrr_at_20
      value: 79.32945845004669
    - type: mrr_at_3
      value: 77.33333333333333
    - type: mrr_at_5
      value: 78.93333333333332
    - type: nauc_map_at_1000_diff1
      value: 63.31103256935259
    - type: nauc_map_at_1000_max
      value: 11.073749121365623
    - type: nauc_map_at_1000_std
      value: 7.4973309839738
    - type: nauc_map_at_100_diff1
      value: 63.31103256935259
    - type: nauc_map_at_100_max
      value: 11.073749121365623
    - type: nauc_map_at_100_std
      value: 7.4973309839738
    - type: nauc_map_at_10_diff1
      value: 62.91585737195978
    - type: nauc_map_at_10_max
      value: 11.770664508983133
    - type: nauc_map_at_10_std
      value: 8.179883948527962
    - type: nauc_map_at_1_diff1
      value: 66.1236265634718
    - type: nauc_map_at_1_max
      value: 7.000207311173955
    - type: nauc_map_at_1_std
      value: 6.54412272821497
    - type: nauc_map_at_20_diff1
      value: 63.31103256935259
    - type: nauc_map_at_20_max
      value: 11.073749121365623
    - type: nauc_map_at_20_std
      value: 7.4973309839738
    - type: nauc_map_at_3_diff1
      value: 62.14039574010254
    - type: nauc_map_at_3_max
      value: 11.06996398110187
    - type: nauc_map_at_3_std
      value: 7.288759297085769
    - type: nauc_map_at_5_diff1
      value: 63.0401271126211
    - type: nauc_map_at_5_max
      value: 10.779317801858609
    - type: nauc_map_at_5_std
      value: 6.476660484760681
    - type: nauc_mrr_at_1000_diff1
      value: 63.31103256935259
    - type: nauc_mrr_at_1000_max
      value: 11.073749121365623
    - type: nauc_mrr_at_1000_std
      value: 7.4973309839738
    - type: nauc_mrr_at_100_diff1
      value: 63.31103256935259
    - type: nauc_mrr_at_100_max
      value: 11.073749121365623
    - type: nauc_mrr_at_100_std
      value: 7.4973309839738
    - type: nauc_mrr_at_10_diff1
      value: 62.91585737195978
    - type: nauc_mrr_at_10_max
      value: 11.770664508983133
    - type: nauc_mrr_at_10_std
      value: 8.179883948527962
    - type: nauc_mrr_at_1_diff1
      value: 66.1236265634718
    - type: nauc_mrr_at_1_max
      value: 7.000207311173955
    - type: nauc_mrr_at_1_std
      value: 6.54412272821497
    - type: nauc_mrr_at_20_diff1
      value: 63.31103256935259
    - type: nauc_mrr_at_20_max
      value: 11.073749121365623
    - type: nauc_mrr_at_20_std
      value: 7.4973309839738
    - type: nauc_mrr_at_3_diff1
      value: 62.14039574010254
    - type: nauc_mrr_at_3_max
      value: 11.06996398110187
    - type: nauc_mrr_at_3_std
      value: 7.288759297085769
    - type: nauc_mrr_at_5_diff1
      value: 63.0401271126211
    - type: nauc_mrr_at_5_max
      value: 10.779317801858609
    - type: nauc_mrr_at_5_std
      value: 6.476660484760681
    - type: nauc_ndcg_at_1000_diff1
      value: 62.9544299483241
    - type: nauc_ndcg_at_1000_max
      value: 11.577079766964538
    - type: nauc_ndcg_at_1000_std
      value: 7.703856790100716
    - type: nauc_ndcg_at_100_diff1
      value: 62.9544299483241
    - type: nauc_ndcg_at_100_max
      value: 11.577079766964538
    - type: nauc_ndcg_at_100_std
      value: 7.703856790100716
    - type: nauc_ndcg_at_10_diff1
      value: 61.29907952217381
    - type: nauc_ndcg_at_10_max
      value: 14.760627422715425
    - type: nauc_ndcg_at_10_std
      value: 10.805573898143368
    - type: nauc_ndcg_at_1_diff1
      value: 66.1236265634718
    - type: nauc_ndcg_at_1_max
      value: 7.000207311173955
    - type: nauc_ndcg_at_1_std
      value: 6.54412272821497
    - type: nauc_ndcg_at_20_diff1
      value: 62.9544299483241
    - type: nauc_ndcg_at_20_max
      value: 11.577079766964538
    - type: nauc_ndcg_at_20_std
      value: 7.703856790100716
    - type: nauc_ndcg_at_3_diff1
      value: 60.25643527856101
    - type: nauc_ndcg_at_3_max
      value: 12.236302709487546
    - type: nauc_ndcg_at_3_std
      value: 7.36883189112067
    - type: nauc_ndcg_at_5_diff1
      value: 61.65220590318238
    - type: nauc_ndcg_at_5_max
      value: 11.39969101913945
    - type: nauc_ndcg_at_5_std
      value: 5.406207922379402
    - type: nauc_precision_at_1000_diff1
      value: .nan
    - type: nauc_precision_at_1000_max
      value: .nan
    - type: nauc_precision_at_1000_std
      value: .nan
    - type: nauc_precision_at_100_diff1
      value: .nan
    - type: nauc_precision_at_100_max
      value: .nan
    - type: nauc_precision_at_100_std
      value: .nan
    - type: nauc_precision_at_10_diff1
      value: 19.14098972922579
    - type: nauc_precision_at_10_max
      value: 100.0
    - type: nauc_precision_at_10_std
      value: 93.46405228758135
    - type: nauc_precision_at_1_diff1
      value: 66.1236265634718
    - type: nauc_precision_at_1_max
      value: 7.000207311173955
    - type: nauc_precision_at_1_std
      value: 6.54412272821497
    - type: nauc_precision_at_20_diff1
      value: 100.0
    - type: nauc_precision_at_20_max
      value: 100.0
    - type: nauc_precision_at_20_std
      value: 100.0
    - type: nauc_precision_at_3_diff1
      value: 50.29636629155561
    - type: nauc_precision_at_3_max
      value: 18.00532600292076
    - type: nauc_precision_at_3_std
      value: 7.649686453053768
    - type: nauc_precision_at_5_diff1
      value: 43.522408963585356
    - type: nauc_precision_at_5_max
      value: 16.923436041082983
    - type: nauc_precision_at_5_std
      value: -10.854341736694092
    - type: nauc_recall_at_1000_diff1
      value: .nan
    - type: nauc_recall_at_1000_max
      value: .nan
    - type: nauc_recall_at_1000_std
      value: .nan
    - type: nauc_recall_at_100_diff1
      value: .nan
    - type: nauc_recall_at_100_max
      value: .nan
    - type: nauc_recall_at_100_std
      value: .nan
    - type: nauc_recall_at_10_diff1
      value: 19.1409897292252
    - type: nauc_recall_at_10_max
      value: 100.0
    - type: nauc_recall_at_10_std
      value: 93.46405228758134
    - type: nauc_recall_at_1_diff1
      value: 66.1236265634718
    - type: nauc_recall_at_1_max
      value: 7.000207311173955
    - type: nauc_recall_at_1_std
      value: 6.54412272821497
    - type: nauc_recall_at_20_diff1
      value: .nan
    - type: nauc_recall_at_20_max
      value: .nan
    - type: nauc_recall_at_20_std
      value: .nan
    - type: nauc_recall_at_3_diff1
      value: 50.29636629155569
    - type: nauc_recall_at_3_max
      value: 18.005326002920754
    - type: nauc_recall_at_3_std
      value: 7.649686453053851
    - type: nauc_recall_at_5_diff1
      value: 43.5224089635856
    - type: nauc_recall_at_5_max
      value: 16.92343604108335
    - type: nauc_recall_at_5_std
      value: -10.854341736694499
    - type: ndcg_at_1
      value: 68.0
    - type: ndcg_at_10
      value: 83.851
    - type: ndcg_at_100
      value: 84.36099999999999
    - type: ndcg_at_1000
      value: 84.36099999999999
    - type: ndcg_at_20
      value: 84.36099999999999
    - type: ndcg_at_3
      value: 80.333
    - type: ndcg_at_5
      value: 83.21600000000001
    - type: precision_at_1
      value: 68.0
    - type: precision_at_10
      value: 9.8
    - type: precision_at_100
      value: 1.0
    - type: precision_at_1000
      value: 0.1
    - type: precision_at_20
      value: 5.0
    - type: precision_at_3
      value: 29.666999999999998
    - type: precision_at_5
      value: 19.2
    - type: recall_at_1
      value: 68.0
    - type: recall_at_10
      value: 98.0
    - type: recall_at_100
      value: 100.0
    - type: recall_at_1000
      value: 100.0
    - type: recall_at_20
      value: 100.0
    - type: recall_at_3
      value: 89.0
    - type: recall_at_5
      value: 96.0
  - task:
      type: Reranking
    dataset:
      name: MTEB T2Reranking (default)
      type: C-MTEB/T2Reranking
      config: default
      split: dev
      revision: 76631901a18387f85eaa53e5450019b87ad58ef9
    metrics:
    - type: map
      value: 65.3088203970324
    - type: mrr
      value: 74.79505862376546
    - type: main_score
      value: 65.3088203970324
  - task:
      type: Retrieval
    dataset:
      name: MTEB T2Retrieval (default)
      type: C-MTEB/T2Retrieval
      config: default
      split: dev
      revision: 8731a845f1bf500a4f111cf1070785c793d10e64
    metrics:
    - type: main_score
      value: 83.163
    - type: map_at_1
      value: 26.875
    - type: map_at_10
      value: 75.454
    - type: map_at_100
      value: 79.036
    - type: map_at_1000
      value: 79.111
    - type: map_at_20
      value: 78.145
    - type: map_at_3
      value: 53.181
    - type: map_at_5
      value: 65.362
    - type: mrr_at_1
      value: 88.90057864281957
    - type: mrr_at_10
      value: 91.53186397301344
    - type: mrr_at_100
      value: 91.62809075510003
    - type: mrr_at_1000
      value: 91.63198173030787
    - type: mrr_at_20
      value: 91.59414668799909
    - type: mrr_at_3
      value: 91.0792565316499
    - type: mrr_at_5
      value: 91.35718043135199
    - type: nauc_map_at_1000_diff1
      value: 12.364843957982409
    - type: nauc_map_at_1000_max
      value: 52.07043464458799
    - type: nauc_map_at_1000_std
      value: 16.040095055100494
    - type: nauc_map_at_100_diff1
      value: 12.370621073823022
    - type: nauc_map_at_100_max
      value: 51.960738727635636
    - type: nauc_map_at_100_std
      value: 15.935832440430747
    - type: nauc_map_at_10_diff1
      value: 16.852819486606585
    - type: nauc_map_at_10_max
      value: 40.11184760756059
    - type: nauc_map_at_10_std
      value: 0.9306648364102376
    - type: nauc_map_at_1_diff1
      value: 52.87356542654683
    - type: nauc_map_at_1_max
      value: -22.210039746171255
    - type: nauc_map_at_1_std
      value: -38.11345358035342
    - type: nauc_map_at_20_diff1
      value: 13.045089059562837
    - type: nauc_map_at_20_max
      value: 49.591383082160036
    - type: nauc_map_at_20_std
      value: 12.54330050352008
    - type: nauc_map_at_3_diff1
      value: 38.08172234377615
    - type: nauc_map_at_3_max
      value: -6.868621684867697
    - type: nauc_map_at_3_std
      value: -35.4712388845996
    - type: nauc_map_at_5_diff1
      value: 29.665551705577474
    - type: nauc_map_at_5_max
      value: 10.958628576519045
    - type: nauc_map_at_5_std
      value: -25.113120842097057
    - type: nauc_mrr_at_1000_diff1
      value: 47.39372999496945
    - type: nauc_mrr_at_1000_max
      value: 83.11274997493808
    - type: nauc_mrr_at_1000_std
      value: 39.74195374546631
    - type: nauc_mrr_at_100_diff1
      value: 47.396678946057676
    - type: nauc_mrr_at_100_max
      value: 83.1192584274415
    - type: nauc_mrr_at_100_std
      value: 39.75840860374685
    - type: nauc_mrr_at_10_diff1
      value: 47.35365644138715
    - type: nauc_mrr_at_10_max
      value: 83.189165639531
    - type: nauc_mrr_at_10_std
      value: 39.83653157887758
    - type: nauc_mrr_at_1_diff1
      value: 47.98740362820094
    - type: nauc_mrr_at_1_max
      value: 80.32340034580369
    - type: nauc_mrr_at_1_std
      value: 34.57857131423388
    - type: nauc_mrr_at_20_diff1
      value: 47.399132055537194
    - type: nauc_mrr_at_20_max
      value: 83.16329919869686
    - type: nauc_mrr_at_20_std
      value: 39.84204692042734
    - type: nauc_mrr_at_3_diff1
      value: 47.09295580511751
    - type: nauc_mrr_at_3_max
      value: 82.95831045602642
    - type: nauc_mrr_at_3_std
      value: 38.98036804692351
    - type: nauc_mrr_at_5_diff1
      value: 47.20100268549764
    - type: nauc_mrr_at_5_max
      value: 83.16652480381642
    - type: nauc_mrr_at_5_std
      value: 39.55690491560902
    - type: nauc_ndcg_at_1000_diff1
      value: 17.201962509184547
    - type: nauc_ndcg_at_1000_max
      value: 63.75820559259539
    - type: nauc_ndcg_at_1000_std
      value: 29.28676096486067
    - type: nauc_ndcg_at_100_diff1
      value: 16.76847216096811
    - type: nauc_ndcg_at_100_max
      value: 62.646517934470744
    - type: nauc_ndcg_at_100_std
      value: 28.7441617667637
    - type: nauc_ndcg_at_10_diff1
      value: 16.559511980751886
    - type: nauc_ndcg_at_10_max
      value: 54.35027464277944
    - type: nauc_ndcg_at_10_std
      value: 16.98089333577716
    - type: nauc_ndcg_at_1_diff1
      value: 47.98740362820094
    - type: nauc_ndcg_at_1_max
      value: 80.32340034580369
    - type: nauc_ndcg_at_1_std
      value: 34.57857131423388
    - type: nauc_ndcg_at_20_diff1
      value: 16.721525245428243
    - type: nauc_ndcg_at_20_max
      value: 57.683661870555724
    - type: nauc_ndcg_at_20_std
      value: 21.736044200026853
    - type: nauc_ndcg_at_3_diff1
      value: 12.488009696556192
    - type: nauc_ndcg_at_3_max
      value: 69.2365575305502
    - type: nauc_ndcg_at_3_std
      value: 30.622418945055323
    - type: nauc_ndcg_at_5_diff1
      value: 12.364114556230609
    - type: nauc_ndcg_at_5_max
      value: 62.33360746285387
    - type: nauc_ndcg_at_5_std
      value: 24.898000803570227
    - type: nauc_precision_at_1000_diff1
      value: -35.14745130154524
    - type: nauc_precision_at_1000_max
      value: 48.811507982849065
    - type: nauc_precision_at_1000_std
      value: 62.43036496029399
    - type: nauc_precision_at_100_diff1
      value: -35.15276411320076
    - type: nauc_precision_at_100_max
      value: 50.87010333741109
    - type: nauc_precision_at_100_std
      value: 63.418221030407175
    - type: nauc_precision_at_10_diff1
      value: -34.84255710936113
    - type: nauc_precision_at_10_max
      value: 56.588401051428825
    - type: nauc_precision_at_10_std
      value: 57.4763370653757
    - type: nauc_precision_at_1_diff1
      value: 47.98740362820094
    - type: nauc_precision_at_1_max
      value: 80.32340034580369
    - type: nauc_precision_at_1_std
      value: 34.57857131423388
    - type: nauc_precision_at_20_diff1
      value: -35.165762365233505
    - type: nauc_precision_at_20_max
      value: 54.148762449660424
    - type: nauc_precision_at_20_std
      value: 61.569719669368716
    - type: nauc_precision_at_3_diff1
      value: -28.63023175340299
    - type: nauc_precision_at_3_max
      value: 68.69825987618499
    - type: nauc_precision_at_3_std
      value: 48.15479495755423
    - type: nauc_precision_at_5_diff1
      value: -34.13811355456687
    - type: nauc_precision_at_5_max
      value: 62.369363941490604
    - type: nauc_precision_at_5_std
      value: 52.282904411187914
    - type: nauc_recall_at_1000_diff1
      value: 8.686444579162663
    - type: nauc_recall_at_1000_max
      value: 59.58864478011338
    - type: nauc_recall_at_1000_std
      value: 56.692774954297455
    - type: nauc_recall_at_100_diff1
      value: 8.820596225758342
    - type: nauc_recall_at_100_max
      value: 53.15048885657892
    - type: nauc_recall_at_100_std
      value: 39.78931159236714
    - type: nauc_recall_at_10_diff1
      value: 16.022301106315027
    - type: nauc_recall_at_10_max
      value: 29.83242342459543
    - type: nauc_recall_at_10_std
      value: -4.805965555875844
    - type: nauc_recall_at_1_diff1
      value: 52.87356542654683
    - type: nauc_recall_at_1_max
      value: -22.210039746171255
    - type: nauc_recall_at_1_std
      value: -38.11345358035342
    - type: nauc_recall_at_20_diff1
      value: 10.35772828627265
    - type: nauc_recall_at_20_max
      value: 43.06420839754062
    - type: nauc_recall_at_20_std
      value: 15.040522218235692
    - type: nauc_recall_at_3_diff1
      value: 36.23953684770224
    - type: nauc_recall_at_3_max
      value: -11.709269151700374
    - type: nauc_recall_at_3_std
      value: -38.13943178150384
    - type: nauc_recall_at_5_diff1
      value: 28.644872415763384
    - type: nauc_recall_at_5_max
      value: 2.062151266111129
    - type: nauc_recall_at_5_std
      value: -30.81114034774277
    - type: ndcg_at_1
      value: 88.901
    - type: ndcg_at_10
      value: 83.163
    - type: ndcg_at_100
      value: 86.854
    - type: ndcg_at_1000
      value: 87.602
    - type: ndcg_at_20
      value: 84.908
    - type: ndcg_at_3
      value: 84.848
    - type: ndcg_at_5
      value: 83.372
    - type: precision_at_1
      value: 88.901
    - type: precision_at_10
      value: 41.343
    - type: precision_at_100
      value: 4.957000000000001
    - type: precision_at_1000
      value: 0.513
    - type: precision_at_20
      value: 22.955000000000002
    - type: precision_at_3
      value: 74.29599999999999
    - type: precision_at_5
      value: 62.251999999999995
    - type: recall_at_1
      value: 26.875
    - type: recall_at_10
      value: 81.902
    - type: recall_at_100
      value: 93.988
    - type: recall_at_1000
      value: 97.801
    - type: recall_at_20
      value: 87.809
    - type: recall_at_3
      value: 54.869
    - type: recall_at_5
      value: 68.728
  - task:
      type: PairClassification
    dataset:
      name: MTEB TERRa (default)
      type: ai-forever/terra-pairclassification
      config: default
      split: dev
      revision: 7b58f24536063837d644aab9a023c62199b2a612
    metrics:
    - type: cosine_accuracy
      value: 60.586319218241044
    - type: cosine_accuracy_threshold
      value: 82.49806761741638
    - type: cosine_ap
      value: 58.73198048427448
    - type: cosine_f1
      value: 67.37967914438502
    - type: cosine_f1_threshold
      value: 77.46461033821106
    - type: cosine_precision
      value: 57.01357466063348
    - type: cosine_recall
      value: 82.35294117647058
    - type: dot_accuracy
      value: 60.26058631921825
    - type: dot_accuracy_threshold
      value: 35627.020263671875
    - type: dot_ap
      value: 57.418783612898224
    - type: dot_f1
      value: 66.51982378854623
    - type: dot_f1_threshold
      value: 27620.843505859375
    - type: dot_precision
      value: 50.16611295681063
    - type: dot_recall
      value: 98.69281045751634
    - type: euclidean_accuracy
      value: 60.26058631921825
    - type: euclidean_accuracy_threshold
      value: 1255.4466247558594
    - type: euclidean_ap
      value: 58.748656145387955
    - type: euclidean_f1
      value: 66.99029126213591
    - type: euclidean_f1_threshold
      value: 1565.1330947875977
    - type: euclidean_precision
      value: 53.28185328185329
    - type: euclidean_recall
      value: 90.19607843137256
    - type: main_score
      value: 58.8479126365766
    - type: manhattan_accuracy
      value: 59.934853420195445
    - type: manhattan_accuracy_threshold
      value: 29897.271728515625
    - type: manhattan_ap
      value: 58.8479126365766
    - type: manhattan_f1
      value: 66.81318681318683
    - type: manhattan_f1_threshold
      value: 46291.802978515625
    - type: manhattan_precision
      value: 50.331125827814574
    - type: manhattan_recall
      value: 99.34640522875817
    - type: max_accuracy
      value: 60.586319218241044
    - type: max_ap
      value: 58.8479126365766
    - type: max_f1
      value: 67.37967914438502
    - type: max_precision
      value: 57.01357466063348
    - type: max_recall
      value: 99.34640522875817
    - type: similarity_accuracy
      value: 60.586319218241044
    - type: similarity_accuracy_threshold
      value: 82.49806761741638
    - type: similarity_ap
      value: 58.73198048427448
    - type: similarity_f1
      value: 67.37967914438502
    - type: similarity_f1_threshold
      value: 77.46461033821106
    - type: similarity_precision
      value: 57.01357466063348
    - type: similarity_recall
      value: 82.35294117647058
  - task:
      type: Classification
    dataset:
      name: MTEB TNews (default)
      type: C-MTEB/TNews-classification
      config: default
      split: validation
      revision: 317f262bf1e6126357bbe89e875451e4b0938fe4
    metrics:
    - type: accuracy
      value: 45.967999999999996
    - type: f1
      value: 44.699306100915706
    - type: f1_weighted
      value: 46.03730319014832
    - type: main_score
      value: 45.967999999999996
  - task:
      type: Retrieval
    dataset:
      name: MTEB TRECCOVID (default)
      type: mteb/trec-covid
      config: default
      split: test
      revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
    metrics:
    - type: map_at_1
      value: 0.251
    - type: map_at_10
      value: 1.9480000000000002
    - type: map_at_100
      value: 11.082
    - type: map_at_1000
      value: 26.700000000000003
    - type: map_at_20
      value: 3.3529999999999998
    - type: map_at_3
      value: 0.679
    - type: map_at_5
      value: 1.079
    - type:
mrr_at_1 value: 94.0 - type: mrr_at_10 value: 95.786 - type: mrr_at_100 value: 95.786 - type: mrr_at_1000 value: 95.786 - type: mrr_at_20 value: 95.786 - type: mrr_at_3 value: 95.0 - type: mrr_at_5 value: 95.5 - type: ndcg_at_1 value: 91.0 - type: ndcg_at_10 value: 77.71900000000001 - type: ndcg_at_100 value: 57.726 - type: ndcg_at_1000 value: 52.737 - type: ndcg_at_20 value: 72.54 - type: ndcg_at_3 value: 83.397 - type: ndcg_at_5 value: 80.806 - type: precision_at_1 value: 94.0 - type: precision_at_10 value: 81.0 - type: precision_at_100 value: 59.199999999999996 - type: precision_at_1000 value: 23.244 - type: precision_at_20 value: 75.2 - type: precision_at_3 value: 88.0 - type: precision_at_5 value: 84.8 - type: recall_at_1 value: 0.251 - type: recall_at_10 value: 2.1229999999999998 - type: recall_at_100 value: 14.496999999999998 - type: recall_at_1000 value: 50.09 - type: recall_at_20 value: 3.8309999999999995 - type: recall_at_3 value: 0.696 - type: recall_at_5 value: 1.1400000000000001 - type: main_score value: 77.71900000000001 - task: type: Clustering dataset: name: MTEB TenKGnadClusteringP2P (default) type: slvnwhrl/tenkgnad-clustering-p2p config: default split: test revision: 5c59e41555244b7e45c9a6be2d720ab4bafae558 metrics: - type: main_score value: 43.763609722295215 - type: v_measure value: 43.763609722295215 - type: v_measure_std value: 2.8751199473862457 - task: type: Clustering dataset: name: MTEB TenKGnadClusteringS2S (default) type: slvnwhrl/tenkgnad-clustering-s2s config: default split: test revision: 6cddbe003f12b9b140aec477b583ac4191f01786 metrics: - type: main_score value: 39.762424448504355 - type: v_measure value: 39.762424448504355 - type: v_measure_std value: 3.30146124979502 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringP2P (default) type: C-MTEB/ThuNewsClusteringP2P config: default split: test revision: 5798586b105c0434e4f0fe5e767abe619442cf93 metrics: - type: main_score value: 63.133819258289456 - type: v_measure value: 63.133819258289456 - type: v_measure_std value: 1.8854253356479695 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringS2S (default) type: C-MTEB/ThuNewsClusteringS2S config: default split: test revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d metrics: - type: main_score value: 58.98195851785808 - type: v_measure value: 58.98195851785808 - type: v_measure_std value: 1.6237600076393737 - task: type: Retrieval dataset: name: MTEB Touche2020 (default) type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 3.3550000000000004 - type: map_at_10 value: 10.08 - type: map_at_100 value: 16.136 - type: map_at_1000 value: 17.605 - type: map_at_20 value: 12.561 - type: map_at_3 value: 5.641 - type: map_at_5 value: 7.3260000000000005 - type: mrr_at_1 value: 46.939 - type: mrr_at_10 value: 58.152 - type: mrr_at_100 value: 58.594 - type: mrr_at_1000 value: 58.601000000000006 - type: mrr_at_20 value: 58.279 - type: mrr_at_3 value: 55.102 - type: mrr_at_5 value: 56.531 - type: ndcg_at_1 value: 44.897999999999996 - type: ndcg_at_10 value: 26.298 - type: ndcg_at_100 value: 37.596000000000004 - type: ndcg_at_1000 value: 49.424 - type: ndcg_at_20 value: 27.066000000000003 - type: ndcg_at_3 value: 31.528 - type: ndcg_at_5 value: 28.219 - type: precision_at_1 value: 46.939 - type: precision_at_10 value: 22.245 - type: precision_at_100 value: 7.531000000000001 - type: precision_at_1000 value: 1.5350000000000001 - type: precision_at_20 value: 17.041 - type: 
precision_at_3 value: 30.612000000000002 - type: precision_at_5 value: 26.122 - type: recall_at_1 value: 3.3550000000000004 - type: recall_at_10 value: 16.41 - type: recall_at_100 value: 47.272 - type: recall_at_1000 value: 83.584 - type: recall_at_20 value: 24.091 - type: recall_at_3 value: 6.8180000000000005 - type: recall_at_5 value: 9.677 - type: main_score value: 26.298 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification (default) type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 91.2890625 - type: ap value: 33.95547153875715 - type: ap_weighted value: 33.95547153875715 - type: f1 value: 75.10768597556462 - type: f1_weighted value: 92.00161208992606 - type: main_score value: 91.2890625 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification (default) type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 71.3978494623656 - type: f1 value: 71.7194818511814 - type: f1_weighted value: 71.13860187349744 - type: main_score value: 71.3978494623656 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering (default) type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: main_score value: 52.4921688720602 - type: v_measure value: 52.4921688720602 - type: v_measure_std value: 0.992768152658908 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 (default) type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cosine_accuracy value: 85.11652858079513 - type: cosine_accuracy_threshold value: 87.90839910507202 - type: cosine_ap value: 70.90459908851724 - type: cosine_f1 value: 65.66581227877457 - type: cosine_f1_threshold value: 85.13308763504028 - type: cosine_precision value: 61.094708153531684 - type: cosine_recall value: 70.97625329815304 - type: dot_accuracy value: 83.41181379269239 - type: dot_accuracy_threshold value: 43110.113525390625 - type: dot_ap value: 65.64869491143095 - type: dot_f1 value: 62.05308447460914 - type: dot_f1_threshold value: 41412.542724609375 - type: dot_precision value: 57.38623626989464 - type: dot_recall value: 67.54617414248021 - type: euclidean_accuracy value: 85.15229182809799 - type: euclidean_accuracy_threshold value: 1043.08500289917 - type: euclidean_ap value: 70.71204383269375 - type: euclidean_f1 value: 65.20304568527919 - type: euclidean_f1_threshold value: 1179.2595863342285 - type: euclidean_precision value: 62.81173594132029 - type: euclidean_recall value: 67.78364116094987 - type: main_score value: 70.90459908851724 - type: manhattan_accuracy value: 85.1820945341837 - type: manhattan_accuracy_threshold value: 26115.0390625 - type: manhattan_ap value: 70.66113937117431 - type: manhattan_f1 value: 65.33383628819313 - type: manhattan_f1_threshold value: 29105.181884765625 - type: manhattan_precision value: 62.40691808791736 - type: manhattan_recall value: 68.54881266490766 - type: max_accuracy value: 85.1820945341837 - type: max_ap value: 70.90459908851724 - type: max_f1 value: 65.66581227877457 - type: max_precision value: 62.81173594132029 - type: max_recall value: 70.97625329815304 - type: similarity_accuracy value: 85.11652858079513 - type: similarity_accuracy_threshold value: 87.90839910507202 - type: 
similarity_ap value: 70.90459908851724 - type: similarity_f1 value: 65.66581227877457 - type: similarity_f1_threshold value: 85.13308763504028 - type: similarity_precision value: 61.094708153531684 - type: similarity_recall value: 70.97625329815304 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus (default) type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cosine_accuracy value: 88.10299996119068 - type: cosine_accuracy_threshold value: 84.34982895851135 - type: cosine_ap value: 84.13755787769226 - type: cosine_f1 value: 76.0967548076923 - type: cosine_f1_threshold value: 82.8936219215393 - type: cosine_precision value: 74.28864769727193 - type: cosine_recall value: 77.99507237449954 - type: dot_accuracy value: 86.64182869561843 - type: dot_accuracy_threshold value: 38794.677734375 - type: dot_ap value: 80.20301567411457 - type: dot_f1 value: 73.50650291634967 - type: dot_f1_threshold value: 37447.23205566406 - type: dot_precision value: 69.41498460485802 - type: dot_recall value: 78.11056359716662 - type: euclidean_accuracy value: 87.9361198432103 - type: euclidean_accuracy_threshold value: 1184.421157836914 - type: euclidean_ap value: 83.79582690117218 - type: euclidean_f1 value: 75.81431709042175 - type: euclidean_f1_threshold value: 1258.2727432250977 - type: euclidean_precision value: 73.39099099099099 - type: euclidean_recall value: 78.40314136125654 - type: main_score value: 84.13755787769226 - type: manhattan_accuracy value: 87.96134590755618 - type: manhattan_accuracy_threshold value: 29077.291870117188 - type: manhattan_ap value: 83.79487172269923 - type: manhattan_f1 value: 75.82421603424935 - type: manhattan_f1_threshold value: 31224.124145507812 - type: manhattan_precision value: 72.24740255212329 - type: manhattan_recall value: 79.77363720357253 - type: max_accuracy value: 88.10299996119068 - type: max_ap value: 84.13755787769226 - type: max_f1 value: 76.0967548076923 - type: max_precision value: 74.28864769727193 - type: max_recall value: 79.77363720357253 - type: similarity_accuracy value: 88.10299996119068 - type: similarity_accuracy_threshold value: 84.34982895851135 - type: similarity_ap value: 84.13755787769226 - type: similarity_f1 value: 76.0967548076923 - type: similarity_f1_threshold value: 82.8936219215393 - type: similarity_precision value: 74.28864769727193 - type: similarity_recall value: 77.99507237449954 - task: type: Retrieval dataset: name: MTEB VideoRetrieval (default) type: C-MTEB/VideoRetrieval config: default split: dev revision: 58c2597a5943a2ba48f4668c3b90d796283c5639 metrics: - type: main_score value: 70.433 - type: map_at_1 value: 55.7 - type: map_at_10 value: 66.013 - type: map_at_100 value: 66.534 - type: map_at_1000 value: 66.547 - type: map_at_20 value: 66.334 - type: map_at_3 value: 64.2 - type: map_at_5 value: 65.445 - type: mrr_at_1 value: 55.7 - type: mrr_at_10 value: 66.01329365079364 - type: mrr_at_100 value: 66.53350061744233 - type: mrr_at_1000 value: 66.54744831962995 - type: mrr_at_20 value: 66.3335147364675 - type: mrr_at_3 value: 64.2 - type: mrr_at_5 value: 65.44500000000002 - type: nauc_map_at_1000_diff1 value: 76.26428836976245 - type: nauc_map_at_1000_max value: 35.41847367373575 - type: nauc_map_at_1000_std value: -33.04639860831992 - type: nauc_map_at_100_diff1 value: 76.25793229023193 - type: nauc_map_at_100_max value: 35.43663260110076 - type: nauc_map_at_100_std value: -33.04238139882945 - type: 
nauc_map_at_10_diff1 value: 76.2108281297711 - type: nauc_map_at_10_max value: 35.59442419423183 - type: nauc_map_at_10_std value: -33.32346518997277 - type: nauc_map_at_1_diff1 value: 79.17728405262736 - type: nauc_map_at_1_max value: 31.880738163589527 - type: nauc_map_at_1_std value: -30.891888718004584 - type: nauc_map_at_20_diff1 value: 76.2181333410193 - type: nauc_map_at_20_max value: 35.43448818430876 - type: nauc_map_at_20_std value: -33.35682442863193 - type: nauc_map_at_3_diff1 value: 76.10046541433466 - type: nauc_map_at_3_max value: 34.6831278555291 - type: nauc_map_at_3_std value: -34.030826044831116 - type: nauc_map_at_5_diff1 value: 75.96513023582064 - type: nauc_map_at_5_max value: 34.66920832438069 - type: nauc_map_at_5_std value: -33.79799777830796 - type: nauc_mrr_at_1000_diff1 value: 76.26428836976245 - type: nauc_mrr_at_1000_max value: 35.41847367373575 - type: nauc_mrr_at_1000_std value: -33.04639860831992 - type: nauc_mrr_at_100_diff1 value: 76.25793229023193 - type: nauc_mrr_at_100_max value: 35.43663260110076 - type: nauc_mrr_at_100_std value: -33.04238139882945 - type: nauc_mrr_at_10_diff1 value: 76.2108281297711 - type: nauc_mrr_at_10_max value: 35.59442419423183 - type: nauc_mrr_at_10_std value: -33.32346518997277 - type: nauc_mrr_at_1_diff1 value: 79.17728405262736 - type: nauc_mrr_at_1_max value: 31.880738163589527 - type: nauc_mrr_at_1_std value: -30.891888718004584 - type: nauc_mrr_at_20_diff1 value: 76.2181333410193 - type: nauc_mrr_at_20_max value: 35.43448818430876 - type: nauc_mrr_at_20_std value: -33.35682442863193 - type: nauc_mrr_at_3_diff1 value: 76.10046541433466 - type: nauc_mrr_at_3_max value: 34.6831278555291 - type: nauc_mrr_at_3_std value: -34.030826044831116 - type: nauc_mrr_at_5_diff1 value: 75.96513023582064 - type: nauc_mrr_at_5_max value: 34.66920832438069 - type: nauc_mrr_at_5_std value: -33.79799777830796 - type: nauc_ndcg_at_1000_diff1 value: 75.68118206798317 - type: nauc_ndcg_at_1000_max value: 37.12252980787349 - type: nauc_ndcg_at_1000_std value: -31.457578337430505 - type: nauc_ndcg_at_100_diff1 value: 75.46730761564156 - type: nauc_ndcg_at_100_max value: 37.549890025544265 - type: nauc_ndcg_at_100_std value: -31.35066985945112 - type: nauc_ndcg_at_10_diff1 value: 75.09890404887037 - type: nauc_ndcg_at_10_max value: 38.024147790014204 - type: nauc_ndcg_at_10_std value: -33.67408368593356 - type: nauc_ndcg_at_1_diff1 value: 79.17728405262736 - type: nauc_ndcg_at_1_max value: 31.880738163589527 - type: nauc_ndcg_at_1_std value: -30.891888718004584 - type: nauc_ndcg_at_20_diff1 value: 75.12977548171354 - type: nauc_ndcg_at_20_max value: 37.524926748917956 - type: nauc_ndcg_at_20_std value: -33.771344674947485 - type: nauc_ndcg_at_3_diff1 value: 74.94037476984154 - type: nauc_ndcg_at_3_max value: 35.60345554050552 - type: nauc_ndcg_at_3_std value: -35.256991346321854 - type: nauc_ndcg_at_5_diff1 value: 74.54265907753783 - type: nauc_ndcg_at_5_max value: 35.57662819978585 - type: nauc_ndcg_at_5_std value: -34.879794448418465 - type: nauc_precision_at_1000_diff1 value: 74.52277207179142 - type: nauc_precision_at_1000_max value: 94.25510945118707 - type: nauc_precision_at_1000_std value: 91.6874157070222 - type: nauc_precision_at_100_diff1 value: 65.98346655735419 - type: nauc_precision_at_100_max value: 78.81168727653687 - type: nauc_precision_at_100_std value: 27.241465691967708 - type: nauc_precision_at_10_diff1 value: 69.55050319096688 - type: nauc_precision_at_10_max value: 51.827749140893374 - type: nauc_precision_at_10_std value: 
-34.60818605792837 - type: nauc_precision_at_1_diff1 value: 79.17728405262736 - type: nauc_precision_at_1_max value: 31.880738163589527 - type: nauc_precision_at_1_std value: -30.891888718004584 - type: nauc_precision_at_20_diff1 value: 68.08078305042736 - type: nauc_precision_at_20_max value: 52.83318878288501 - type: nauc_precision_at_20_std value: -35.46070292817927 - type: nauc_precision_at_3_diff1 value: 70.76249609881901 - type: nauc_precision_at_3_max value: 38.86561868624655 - type: nauc_precision_at_3_std value: -39.68917853446992 - type: nauc_precision_at_5_diff1 value: 68.39110629013278 - type: nauc_precision_at_5_max value: 39.28677163904683 - type: nauc_precision_at_5_std value: -39.39101423819562 - type: nauc_recall_at_1000_diff1 value: 74.52277207179175 - type: nauc_recall_at_1000_max value: 94.25510945118776 - type: nauc_recall_at_1000_std value: 91.68741570702382 - type: nauc_recall_at_100_diff1 value: 65.9834665573548 - type: nauc_recall_at_100_max value: 78.81168727653679 - type: nauc_recall_at_100_std value: 27.241465691967598 - type: nauc_recall_at_10_diff1 value: 69.55050319096708 - type: nauc_recall_at_10_max value: 51.82774914089347 - type: nauc_recall_at_10_std value: -34.6081860579283 - type: nauc_recall_at_1_diff1 value: 79.17728405262736 - type: nauc_recall_at_1_max value: 31.880738163589527 - type: nauc_recall_at_1_std value: -30.891888718004584 - type: nauc_recall_at_20_diff1 value: 68.08078305042746 - type: nauc_recall_at_20_max value: 52.833188782885244 - type: nauc_recall_at_20_std value: -35.46070292817895 - type: nauc_recall_at_3_diff1 value: 70.76249609881896 - type: nauc_recall_at_3_max value: 38.865618686246464 - type: nauc_recall_at_3_std value: -39.68917853446999 - type: nauc_recall_at_5_diff1 value: 68.39110629013274 - type: nauc_recall_at_5_max value: 39.28677163904688 - type: nauc_recall_at_5_std value: -39.39101423819562 - type: ndcg_at_1 value: 55.7 - type: ndcg_at_10 value: 70.433 - type: ndcg_at_100 value: 72.975 - type: ndcg_at_1000 value: 73.283 - type: ndcg_at_20 value: 71.58 - type: ndcg_at_3 value: 66.83099999999999 - type: ndcg_at_5 value: 69.085 - type: precision_at_1 value: 55.7 - type: precision_at_10 value: 8.4 - type: precision_at_100 value: 0.959 - type: precision_at_1000 value: 0.098 - type: precision_at_20 value: 4.425 - type: precision_at_3 value: 24.8 - type: precision_at_5 value: 15.98 - type: recall_at_1 value: 55.7 - type: recall_at_10 value: 84.0 - type: recall_at_100 value: 95.89999999999999 - type: recall_at_1000 value: 98.2 - type: recall_at_20 value: 88.5 - type: recall_at_3 value: 74.4 - type: recall_at_5 value: 79.9 - task: type: Classification dataset: name: MTEB Waimai (default) type: C-MTEB/waimai-classification config: default split: test revision: 339287def212450dcaa9df8c22bf93e9980c7023 metrics: - type: accuracy value: 86.58999999999999 - type: ap value: 70.02619249927523 - type: ap_weighted value: 70.02619249927523 - type: f1 value: 84.97572770889423 - type: f1_weighted value: 86.6865713531272 - type: main_score value: 86.58999999999999 - task: type: Retrieval dataset: name: MTEB XMarket (en) type: jinaai/xmarket_ml config: en split: test revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b metrics: - type: main_score value: 34.772999999999996 - type: map_at_1 value: 7.2620000000000005 - type: map_at_10 value: 17.98 - type: map_at_100 value: 24.828 - type: map_at_1000 value: 26.633000000000003 - type: map_at_20 value: 20.699 - type: map_at_3 value: 12.383 - type: map_at_5 value: 14.871 - type: mrr_at_1 value: 
34.718100890207715 - type: mrr_at_10 value: 43.9336827525092 - type: mrr_at_100 value: 44.66474011066837 - type: mrr_at_1000 value: 44.7075592197356 - type: mrr_at_20 value: 44.35984436569346 - type: mrr_at_3 value: 41.73901893981052 - type: mrr_at_5 value: 43.025973550207134 - type: nauc_map_at_1000_diff1 value: 13.899869081196364 - type: nauc_map_at_1000_max value: 46.60452816386231 - type: nauc_map_at_1000_std value: 24.87925799401773 - type: nauc_map_at_100_diff1 value: 16.164805650871084 - type: nauc_map_at_100_max value: 44.720912958558095 - type: nauc_map_at_100_std value: 20.236734536210477 - type: nauc_map_at_10_diff1 value: 23.58580520913581 - type: nauc_map_at_10_max value: 31.276151869914216 - type: nauc_map_at_10_std value: -0.1833326246041355 - type: nauc_map_at_1_diff1 value: 37.02663305598722 - type: nauc_map_at_1_max value: 14.931071531116528 - type: nauc_map_at_1_std value: -12.478790028708453 - type: nauc_map_at_20_diff1 value: 20.718297881540593 - type: nauc_map_at_20_max value: 36.62264094841859 - type: nauc_map_at_20_std value: 6.658514770057742 - type: nauc_map_at_3_diff1 value: 29.379034581120006 - type: nauc_map_at_3_max value: 21.387214269548803 - type: nauc_map_at_3_std value: -9.3404121914247 - type: nauc_map_at_5_diff1 value: 26.627169792839485 - type: nauc_map_at_5_max value: 25.393331109666388 - type: nauc_map_at_5_std value: -6.023485287246353 - type: nauc_mrr_at_1000_diff1 value: 12.047232036652295 - type: nauc_mrr_at_1000_max value: 46.611862580860645 - type: nauc_mrr_at_1000_std value: 27.89146066442305 - type: nauc_mrr_at_100_diff1 value: 12.05261747449997 - type: nauc_mrr_at_100_max value: 46.61328535381203 - type: nauc_mrr_at_100_std value: 27.886145596874535 - type: nauc_mrr_at_10_diff1 value: 12.006935553036941 - type: nauc_mrr_at_10_max value: 46.53351686240496 - type: nauc_mrr_at_10_std value: 27.708742470257462 - type: nauc_mrr_at_1_diff1 value: 13.323408127738782 - type: nauc_mrr_at_1_max value: 43.78884661002012 - type: nauc_mrr_at_1_std value: 25.164417588165673 - type: nauc_mrr_at_20_diff1 value: 12.036022973968011 - type: nauc_mrr_at_20_max value: 46.56537838037131 - type: nauc_mrr_at_20_std value: 27.78189157249635 - type: nauc_mrr_at_3_diff1 value: 11.943896700976381 - type: nauc_mrr_at_3_max value: 46.33644663073225 - type: nauc_mrr_at_3_std value: 27.523915405053845 - type: nauc_mrr_at_5_diff1 value: 12.03108009033769 - type: nauc_mrr_at_5_max value: 46.49103616896692 - type: nauc_mrr_at_5_std value: 27.630879129863366 - type: nauc_ndcg_at_1000_diff1 value: 9.766823796017324 - type: nauc_ndcg_at_1000_max value: 52.85844801910602 - type: nauc_ndcg_at_1000_std value: 36.43271437761207 - type: nauc_ndcg_at_100_diff1 value: 12.035059298282036 - type: nauc_ndcg_at_100_max value: 50.05520240705682 - type: nauc_ndcg_at_100_std value: 29.87678724506636 - type: nauc_ndcg_at_10_diff1 value: 10.281893031139424 - type: nauc_ndcg_at_10_max value: 47.02153679426017 - type: nauc_ndcg_at_10_std value: 26.624948330369126 - type: nauc_ndcg_at_1_diff1 value: 13.323408127738782 - type: nauc_ndcg_at_1_max value: 43.78884661002012 - type: nauc_ndcg_at_1_std value: 25.164417588165673 - type: nauc_ndcg_at_20_diff1 value: 11.463524849646598 - type: nauc_ndcg_at_20_max value: 47.415073186019704 - type: nauc_ndcg_at_20_std value: 26.359019620164307 - type: nauc_ndcg_at_3_diff1 value: 9.689199913805394 - type: nauc_ndcg_at_3_max value: 45.68151849572808 - type: nauc_ndcg_at_3_std value: 26.559193219799486 - type: nauc_ndcg_at_5_diff1 value: 9.448823370356575 - type: 
nauc_ndcg_at_5_max value: 46.19999662690141 - type: nauc_ndcg_at_5_std value: 26.8411706726069 - type: nauc_precision_at_1000_diff1 value: -20.379065598727024 - type: nauc_precision_at_1000_max value: 13.162562437268427 - type: nauc_precision_at_1000_std value: 22.658226157785812 - type: nauc_precision_at_100_diff1 value: -16.458155977309282 - type: nauc_precision_at_100_max value: 35.97956789169889 - type: nauc_precision_at_100_std value: 48.878375009979194 - type: nauc_precision_at_10_diff1 value: -7.810992317607771 - type: nauc_precision_at_10_max value: 49.307339277444754 - type: nauc_precision_at_10_std value: 42.82533951854582 - type: nauc_precision_at_1_diff1 value: 13.323408127738782 - type: nauc_precision_at_1_max value: 43.78884661002012 - type: nauc_precision_at_1_std value: 25.164417588165673 - type: nauc_precision_at_20_diff1 value: -11.43933465149542 - type: nauc_precision_at_20_max value: 46.93722753460038 - type: nauc_precision_at_20_std value: 47.36223769029678 - type: nauc_precision_at_3_diff1 value: 1.3230178593599737 - type: nauc_precision_at_3_max value: 48.49039534395576 - type: nauc_precision_at_3_std value: 33.161384183129194 - type: nauc_precision_at_5_diff1 value: -3.185516457926519 - type: nauc_precision_at_5_max value: 49.5814309394308 - type: nauc_precision_at_5_std value: 37.57637865900281 - type: nauc_recall_at_1000_diff1 value: 7.839499443984168 - type: nauc_recall_at_1000_max value: 52.67165467640894 - type: nauc_recall_at_1000_std value: 48.85318316702583 - type: nauc_recall_at_100_diff1 value: 14.117557049589418 - type: nauc_recall_at_100_max value: 40.59046301348715 - type: nauc_recall_at_100_std value: 24.379680901739505 - type: nauc_recall_at_10_diff1 value: 20.04536052614054 - type: nauc_recall_at_10_max value: 25.54148839721574 - type: nauc_recall_at_10_std value: -1.938182527562211 - type: nauc_recall_at_1_diff1 value: 37.02663305598722 - type: nauc_recall_at_1_max value: 14.931071531116528 - type: nauc_recall_at_1_std value: -12.478790028708453 - type: nauc_recall_at_20_diff1 value: 17.959977483235566 - type: nauc_recall_at_20_max value: 29.88502687870809 - type: nauc_recall_at_20_std value: 4.26527395196852 - type: nauc_recall_at_3_diff1 value: 26.297810954500456 - type: nauc_recall_at_3_max value: 18.819406079307402 - type: nauc_recall_at_3_std value: -10.002237229729081 - type: nauc_recall_at_5_diff1 value: 22.739080899568485 - type: nauc_recall_at_5_max value: 21.0322968243985 - type: nauc_recall_at_5_std value: -6.927749435306422 - type: ndcg_at_1 value: 34.717999999999996 - type: ndcg_at_10 value: 34.772999999999996 - type: ndcg_at_100 value: 39.407 - type: ndcg_at_1000 value: 44.830999999999996 - type: ndcg_at_20 value: 35.667 - type: ndcg_at_3 value: 34.332 - type: ndcg_at_5 value: 34.408 - type: precision_at_1 value: 34.717999999999996 - type: precision_at_10 value: 23.430999999999997 - type: precision_at_100 value: 9.31 - type: precision_at_1000 value: 2.259 - type: precision_at_20 value: 18.826999999999998 - type: precision_at_3 value: 30.553 - type: precision_at_5 value: 27.792 - type: recall_at_1 value: 7.2620000000000005 - type: recall_at_10 value: 26.384 - type: recall_at_100 value: 52.506 - type: recall_at_1000 value: 73.38 - type: recall_at_20 value: 34.032000000000004 - type: recall_at_3 value: 14.821000000000002 - type: recall_at_5 value: 19.481 - task: type: Retrieval dataset: name: MTEB XMarket (de) type: jinaai/xmarket_ml config: de split: test revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b metrics: - type: main_score 
value: 28.316000000000003 - type: map_at_1 value: 8.667 - type: map_at_10 value: 17.351 - type: map_at_100 value: 21.02 - type: map_at_1000 value: 21.951 - type: map_at_20 value: 18.994 - type: map_at_3 value: 13.23 - type: map_at_5 value: 15.17 - type: mrr_at_1 value: 27.27272727272727 - type: mrr_at_10 value: 36.10858487561485 - type: mrr_at_100 value: 36.92033814316568 - type: mrr_at_1000 value: 36.972226653870365 - type: mrr_at_20 value: 36.58914906427944 - type: mrr_at_3 value: 33.642969201552305 - type: mrr_at_5 value: 35.13417554289494 - type: nauc_map_at_1000_diff1 value: 23.345116790998063 - type: nauc_map_at_1000_max value: 44.447240670835725 - type: nauc_map_at_1000_std value: 18.34636500680144 - type: nauc_map_at_100_diff1 value: 24.458120909292347 - type: nauc_map_at_100_max value: 43.31851431140378 - type: nauc_map_at_100_std value: 15.654778355549965 - type: nauc_map_at_10_diff1 value: 29.376508937265044 - type: nauc_map_at_10_max value: 36.650196725140795 - type: nauc_map_at_10_std value: 4.682465435374843 - type: nauc_map_at_1_diff1 value: 40.382365672683214 - type: nauc_map_at_1_max value: 22.894341150096785 - type: nauc_map_at_1_std value: -5.610725673968323 - type: nauc_map_at_20_diff1 value: 27.197033425732908 - type: nauc_map_at_20_max value: 39.71672400647207 - type: nauc_map_at_20_std value: 8.944436813309933 - type: nauc_map_at_3_diff1 value: 34.49739294661502 - type: nauc_map_at_3_max value: 29.006972420735284 - type: nauc_map_at_3_std value: -3.0372650571243986 - type: nauc_map_at_5_diff1 value: 32.764901537277105 - type: nauc_map_at_5_max value: 32.658533295918154 - type: nauc_map_at_5_std value: 0.029626452286996906 - type: nauc_mrr_at_1000_diff1 value: 19.521229956280603 - type: nauc_mrr_at_1000_max value: 44.39409866211472 - type: nauc_mrr_at_1000_std value: 23.580697307036058 - type: nauc_mrr_at_100_diff1 value: 19.51312676591073 - type: nauc_mrr_at_100_max value: 44.39559153963895 - type: nauc_mrr_at_100_std value: 23.57913711397437 - type: nauc_mrr_at_10_diff1 value: 19.584635617935145 - type: nauc_mrr_at_10_max value: 44.44842226236198 - type: nauc_mrr_at_10_std value: 23.382684909390434 - type: nauc_mrr_at_1_diff1 value: 20.92594790923806 - type: nauc_mrr_at_1_max value: 40.593939625252816 - type: nauc_mrr_at_1_std value: 20.37467598073644 - type: nauc_mrr_at_20_diff1 value: 19.590641822115725 - type: nauc_mrr_at_20_max value: 44.42512299604718 - type: nauc_mrr_at_20_std value: 23.45564260800024 - type: nauc_mrr_at_3_diff1 value: 20.005307129527232 - type: nauc_mrr_at_3_max value: 43.68300366192776 - type: nauc_mrr_at_3_std value: 22.297190480842005 - type: nauc_mrr_at_5_diff1 value: 19.852896386271716 - type: nauc_mrr_at_5_max value: 44.20641808920062 - type: nauc_mrr_at_5_std value: 22.966517330852895 - type: nauc_ndcg_at_1000_diff1 value: 17.800116251376103 - type: nauc_ndcg_at_1000_max value: 50.98332718061365 - type: nauc_ndcg_at_1000_std value: 31.464484658102577 - type: nauc_ndcg_at_100_diff1 value: 19.555159680541088 - type: nauc_ndcg_at_100_max value: 48.56377130899141 - type: nauc_ndcg_at_100_std value: 25.77572748714817 - type: nauc_ndcg_at_10_diff1 value: 20.003008726679415 - type: nauc_ndcg_at_10_max value: 45.1293725480628 - type: nauc_ndcg_at_10_std value: 21.149213260765872 - type: nauc_ndcg_at_1_diff1 value: 21.00986278773023 - type: nauc_ndcg_at_1_max value: 40.524637076774894 - type: nauc_ndcg_at_1_std value: 20.29682194006685 - type: nauc_ndcg_at_20_diff1 value: 20.659734137312284 - type: nauc_ndcg_at_20_max value: 45.73108736599869 - 
type: nauc_ndcg_at_20_std value: 21.200736170346133 - type: nauc_ndcg_at_3_diff1 value: 19.200120542882544 - type: nauc_ndcg_at_3_max value: 42.89772612963168 - type: nauc_ndcg_at_3_std value: 20.713292754978983 - type: nauc_ndcg_at_5_diff1 value: 19.96329647992544 - type: nauc_ndcg_at_5_max value: 44.296627037787324 - type: nauc_ndcg_at_5_std value: 21.200135784971973 - type: nauc_precision_at_1000_diff1 value: -11.543221249009427 - type: nauc_precision_at_1000_max value: 9.132801614448221 - type: nauc_precision_at_1000_std value: 21.203720655381055 - type: nauc_precision_at_100_diff1 value: -12.510945425786039 - type: nauc_precision_at_100_max value: 31.42530963666252 - type: nauc_precision_at_100_std value: 44.99672783467617 - type: nauc_precision_at_10_diff1 value: -4.025802651746804 - type: nauc_precision_at_10_max value: 47.50967924227793 - type: nauc_precision_at_10_std value: 41.1558559268985 - type: nauc_precision_at_1_diff1 value: 21.00986278773023 - type: nauc_precision_at_1_max value: 40.524637076774894 - type: nauc_precision_at_1_std value: 20.29682194006685 - type: nauc_precision_at_20_diff1 value: -8.059482951110002 - type: nauc_precision_at_20_max value: 44.28832115946278 - type: nauc_precision_at_20_std value: 45.2005585353651 - type: nauc_precision_at_3_diff1 value: 8.53530005716248 - type: nauc_precision_at_3_max value: 46.48353678905102 - type: nauc_precision_at_3_std value: 28.868791323881972 - type: nauc_precision_at_5_diff1 value: 3.093619954821814 - type: nauc_precision_at_5_max value: 48.43294475817019 - type: nauc_precision_at_5_std value: 34.83430452745434 - type: nauc_recall_at_1000_diff1 value: 9.93680206699751 - type: nauc_recall_at_1000_max value: 52.97840222394363 - type: nauc_recall_at_1000_std value: 46.370023604436255 - type: nauc_recall_at_100_diff1 value: 14.100542445524972 - type: nauc_recall_at_100_max value: 42.853775131475224 - type: nauc_recall_at_100_std value: 26.93029971231028 - type: nauc_recall_at_10_diff1 value: 22.774547475714716 - type: nauc_recall_at_10_max value: 33.984586405015044 - type: nauc_recall_at_10_std value: 5.332325172373655 - type: nauc_recall_at_1_diff1 value: 40.382365672683214 - type: nauc_recall_at_1_max value: 22.894341150096785 - type: nauc_recall_at_1_std value: -5.610725673968323 - type: nauc_recall_at_20_diff1 value: 19.751060483835936 - type: nauc_recall_at_20_max value: 36.18774034635102 - type: nauc_recall_at_20_std value: 10.362242090308577 - type: nauc_recall_at_3_diff1 value: 30.29462372902671 - type: nauc_recall_at_3_max value: 27.377175450099635 - type: nauc_recall_at_3_std value: -3.015752705993425 - type: nauc_recall_at_5_diff1 value: 28.096893312615723 - type: nauc_recall_at_5_max value: 30.485075571512425 - type: nauc_recall_at_5_std value: 0.09106417003502826 - type: ndcg_at_1 value: 27.248 - type: ndcg_at_10 value: 28.316000000000003 - type: ndcg_at_100 value: 33.419 - type: ndcg_at_1000 value: 38.134 - type: ndcg_at_20 value: 29.707 - type: ndcg_at_3 value: 26.93 - type: ndcg_at_5 value: 27.363 - type: precision_at_1 value: 27.248 - type: precision_at_10 value: 15.073 - type: precision_at_100 value: 5.061 - type: precision_at_1000 value: 1.325 - type: precision_at_20 value: 11.407 - type: precision_at_3 value: 21.823 - type: precision_at_5 value: 18.984 - type: recall_at_1 value: 8.667 - type: recall_at_10 value: 26.984 - type: recall_at_100 value: 49.753 - type: recall_at_1000 value: 70.354 - type: recall_at_20 value: 33.955999999999996 - type: recall_at_3 value: 16.086 - type: recall_at_5 value: 
20.544999999999998 - task: type: Retrieval dataset: name: MTEB XMarket (es) type: jinaai/xmarket_ml config: es split: test revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b metrics: - type: main_score value: 26.592 - type: map_at_1 value: 8.081000000000001 - type: map_at_10 value: 16.486 - type: map_at_100 value: 19.996 - type: map_at_1000 value: 20.889 - type: map_at_20 value: 18.088 - type: map_at_3 value: 12.864 - type: map_at_5 value: 14.515 - type: mrr_at_1 value: 24.643356643356643 - type: mrr_at_10 value: 33.755599955599926 - type: mrr_at_100 value: 34.55914769326114 - type: mrr_at_1000 value: 34.614384237219745 - type: mrr_at_20 value: 34.228909650276194 - type: mrr_at_3 value: 31.445221445221456 - type: mrr_at_5 value: 32.71375291375297 - type: nauc_map_at_1000_diff1 value: 19.17751654240679 - type: nauc_map_at_1000_max value: 43.493743561136434 - type: nauc_map_at_1000_std value: 21.14477911550252 - type: nauc_map_at_100_diff1 value: 20.259227234415395 - type: nauc_map_at_100_max value: 42.510860292169106 - type: nauc_map_at_100_std value: 18.63085160442346 - type: nauc_map_at_10_diff1 value: 24.12419385640694 - type: nauc_map_at_10_max value: 35.99892932069915 - type: nauc_map_at_10_std value: 8.488520124325058 - type: nauc_map_at_1_diff1 value: 35.09239143996649 - type: nauc_map_at_1_max value: 23.72498533914286 - type: nauc_map_at_1_std value: -4.164387883546102 - type: nauc_map_at_20_diff1 value: 22.411418237320817 - type: nauc_map_at_20_max value: 39.12496266094892 - type: nauc_map_at_20_std value: 12.371656353894227 - type: nauc_map_at_3_diff1 value: 28.106972376813506 - type: nauc_map_at_3_max value: 29.57824316865409 - type: nauc_map_at_3_std value: 1.8928791254813127 - type: nauc_map_at_5_diff1 value: 26.4958239149419 - type: nauc_map_at_5_max value: 32.45906016649239 - type: nauc_map_at_5_std value: 4.612735963224018 - type: nauc_mrr_at_1000_diff1 value: 17.614812607094446 - type: nauc_mrr_at_1000_max value: 41.13031556228715 - type: nauc_mrr_at_1000_std value: 22.564112871230318 - type: nauc_mrr_at_100_diff1 value: 17.614044568011085 - type: nauc_mrr_at_100_max value: 41.129436273086796 - type: nauc_mrr_at_100_std value: 22.566763500658766 - type: nauc_mrr_at_10_diff1 value: 17.61869494452089 - type: nauc_mrr_at_10_max value: 41.091542329381426 - type: nauc_mrr_at_10_std value: 22.370473458633594 - type: nauc_mrr_at_1_diff1 value: 20.321421442201913 - type: nauc_mrr_at_1_max value: 38.36531448180009 - type: nauc_mrr_at_1_std value: 18.422203207777688 - type: nauc_mrr_at_20_diff1 value: 17.614767736091625 - type: nauc_mrr_at_20_max value: 41.11221420736687 - type: nauc_mrr_at_20_std value: 22.44271891522012 - type: nauc_mrr_at_3_diff1 value: 17.98184651584625 - type: nauc_mrr_at_3_max value: 40.424293610470144 - type: nauc_mrr_at_3_std value: 21.554750947206706 - type: nauc_mrr_at_5_diff1 value: 17.72088314927416 - type: nauc_mrr_at_5_max value: 40.662724739072694 - type: nauc_mrr_at_5_std value: 21.822957528431928 - type: nauc_ndcg_at_1000_diff1 value: 15.310699428328398 - type: nauc_ndcg_at_1000_max value: 48.83921393349997 - type: nauc_ndcg_at_1000_std value: 32.22600294110774 - type: nauc_ndcg_at_100_diff1 value: 16.62672763977423 - type: nauc_ndcg_at_100_max value: 47.36060653537392 - type: nauc_ndcg_at_100_std value: 27.879865162871575 - type: nauc_ndcg_at_10_diff1 value: 16.436684176028116 - type: nauc_ndcg_at_10_max value: 43.00026520872974 - type: nauc_ndcg_at_10_std value: 22.507354939162806 - type: nauc_ndcg_at_1_diff1 value: 20.321421442201913 - type: 
nauc_ndcg_at_1_max value: 38.36531448180009 - type: nauc_ndcg_at_1_std value: 18.422203207777688 - type: nauc_ndcg_at_20_diff1 value: 17.127747123248835 - type: nauc_ndcg_at_20_max value: 44.57322943752733 - type: nauc_ndcg_at_20_std value: 23.146541187377036 - type: nauc_ndcg_at_3_diff1 value: 16.372742984728514 - type: nauc_ndcg_at_3_max value: 40.91938017883993 - type: nauc_ndcg_at_3_std value: 21.50917089194154 - type: nauc_ndcg_at_5_diff1 value: 16.40486505525073 - type: nauc_ndcg_at_5_max value: 41.94597203181329 - type: nauc_ndcg_at_5_std value: 22.068260809047562 - type: nauc_precision_at_1000_diff1 value: -15.9415313729527 - type: nauc_precision_at_1000_max value: 12.653329948983643 - type: nauc_precision_at_1000_std value: 26.371820703256173 - type: nauc_precision_at_100_diff1 value: -11.851070166675289 - type: nauc_precision_at_100_max value: 32.164365923950115 - type: nauc_precision_at_100_std value: 45.930226426725426 - type: nauc_precision_at_10_diff1 value: -3.1352660378259163 - type: nauc_precision_at_10_max value: 45.48359878733272 - type: nauc_precision_at_10_std value: 40.2917038044196 - type: nauc_precision_at_1_diff1 value: 20.321421442201913 - type: nauc_precision_at_1_max value: 38.36531448180009 - type: nauc_precision_at_1_std value: 18.422203207777688 - type: nauc_precision_at_20_diff1 value: -7.087513342144751 - type: nauc_precision_at_20_max value: 43.66272019058357 - type: nauc_precision_at_20_std value: 44.22863351071686 - type: nauc_precision_at_3_diff1 value: 7.836185032609045 - type: nauc_precision_at_3_max value: 44.85412904097269 - type: nauc_precision_at_3_std value: 30.209139149500057 - type: nauc_precision_at_5_diff1 value: 3.028150537253791 - type: nauc_precision_at_5_max value: 45.73661708882973 - type: nauc_precision_at_5_std value: 34.65500311185052 - type: nauc_recall_at_1000_diff1 value: 9.526124668370704 - type: nauc_recall_at_1000_max value: 51.4190208452196 - type: nauc_recall_at_1000_std value: 45.694891695646426 - type: nauc_recall_at_100_diff1 value: 12.68466215400009 - type: nauc_recall_at_100_max value: 42.79112054268112 - type: nauc_recall_at_100_std value: 28.61954251400998 - type: nauc_recall_at_10_diff1 value: 17.95124413416829 - type: nauc_recall_at_10_max value: 33.1192036755167 - type: nauc_recall_at_10_std value: 9.3588175959525 - type: nauc_recall_at_1_diff1 value: 35.09239143996649 - type: nauc_recall_at_1_max value: 23.72498533914286 - type: nauc_recall_at_1_std value: -4.164387883546102 - type: nauc_recall_at_20_diff1 value: 16.24916980445646 - type: nauc_recall_at_20_max value: 36.51316122236076 - type: nauc_recall_at_20_std value: 13.641588062425736 - type: nauc_recall_at_3_diff1 value: 23.263199724138786 - type: nauc_recall_at_3_max value: 27.67354561610614 - type: nauc_recall_at_3_std value: 3.103127242654415 - type: nauc_recall_at_5_diff1 value: 20.719704839229635 - type: nauc_recall_at_5_max value: 29.66480839111333 - type: nauc_recall_at_5_std value: 5.514884455797986 - type: ndcg_at_1 value: 24.643 - type: ndcg_at_10 value: 26.592 - type: ndcg_at_100 value: 31.887 - type: ndcg_at_1000 value: 36.695 - type: ndcg_at_20 value: 28.166000000000004 - type: ndcg_at_3 value: 25.238 - type: ndcg_at_5 value: 25.545 - type: precision_at_1 value: 24.643 - type: precision_at_10 value: 13.730999999999998 - type: precision_at_100 value: 4.744000000000001 - type: precision_at_1000 value: 1.167 - type: precision_at_20 value: 10.562000000000001 - type: precision_at_3 value: 20.288999999999998 - type: precision_at_5 value: 17.337 - type: 
recall_at_1 value: 8.081000000000001 - type: recall_at_10 value: 25.911 - type: recall_at_100 value: 48.176 - type: recall_at_1000 value: 69.655 - type: recall_at_20 value: 32.924 - type: recall_at_3 value: 16.125 - type: recall_at_5 value: 19.988 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (deu-deu) type: jinaai/xpqa config: deu-deu split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 84.552 - type: map_at_1 value: 59.023 - type: map_at_10 value: 81.051 - type: map_at_100 value: 81.539 - type: map_at_1000 value: 81.54299999999999 - type: map_at_20 value: 81.401 - type: map_at_3 value: 76.969 - type: map_at_5 value: 80.07600000000001 - type: mrr_at_1 value: 77.67624020887729 - type: mrr_at_10 value: 83.30509967259314 - type: mrr_at_100 value: 83.58599391639456 - type: mrr_at_1000 value: 83.58970114722587 - type: mrr_at_20 value: 83.50275980440317 - type: mrr_at_3 value: 82.07136640557006 - type: mrr_at_5 value: 82.94604003481287 - type: nauc_map_at_1000_diff1 value: 63.12885104269942 - type: nauc_map_at_1000_max value: 57.7017996674959 - type: nauc_map_at_1000_std value: -24.951068985070513 - type: nauc_map_at_100_diff1 value: 63.12866509393162 - type: nauc_map_at_100_max value: 57.70176426013332 - type: nauc_map_at_100_std value: -24.96012290790273 - type: nauc_map_at_10_diff1 value: 62.847709436211204 - type: nauc_map_at_10_max value: 57.408873624779524 - type: nauc_map_at_10_std value: -25.635130363219062 - type: nauc_map_at_1_diff1 value: 71.89683981857102 - type: nauc_map_at_1_max value: 20.204460967432645 - type: nauc_map_at_1_std value: -23.07894656629493 - type: nauc_map_at_20_diff1 value: 63.00504457011043 - type: nauc_map_at_20_max value: 57.66009512514262 - type: nauc_map_at_20_std value: -25.100138593754885 - type: nauc_map_at_3_diff1 value: 63.199874607788274 - type: nauc_map_at_3_max value: 47.54482033763308 - type: nauc_map_at_3_std value: -27.714557098916963 - type: nauc_map_at_5_diff1 value: 63.01006523518669 - type: nauc_map_at_5_max value: 56.501965964288495 - type: nauc_map_at_5_std value: -25.367825762790925 - type: nauc_mrr_at_1000_diff1 value: 66.24988063948112 - type: nauc_mrr_at_1000_max value: 63.56921667744273 - type: nauc_mrr_at_1000_std value: -22.073973768031863 - type: nauc_mrr_at_100_diff1 value: 66.24919554296275 - type: nauc_mrr_at_100_max value: 63.57382447608361 - type: nauc_mrr_at_100_std value: -22.084627248538187 - type: nauc_mrr_at_10_diff1 value: 66.0143885124066 - type: nauc_mrr_at_10_max value: 63.51277586011898 - type: nauc_mrr_at_10_std value: -22.477523960705454 - type: nauc_mrr_at_1_diff1 value: 68.25415199323474 - type: nauc_mrr_at_1_max value: 63.069019003272416 - type: nauc_mrr_at_1_std value: -18.77085924093244 - type: nauc_mrr_at_20_diff1 value: 66.16203167351055 - type: nauc_mrr_at_20_max value: 63.607477776215845 - type: nauc_mrr_at_20_std value: -22.15083176017266 - type: nauc_mrr_at_3_diff1 value: 66.39368842782302 - type: nauc_mrr_at_3_max value: 63.11411066585295 - type: nauc_mrr_at_3_std value: -22.63174342814071 - type: nauc_mrr_at_5_diff1 value: 66.17932562332354 - type: nauc_mrr_at_5_max value: 63.70434825329594 - type: nauc_mrr_at_5_std value: -21.704012812430438 - type: nauc_ndcg_at_1000_diff1 value: 63.958010361549356 - type: nauc_ndcg_at_1000_max value: 60.516445000134624 - type: nauc_ndcg_at_1000_std value: -24.264672248289923 - type: nauc_ndcg_at_100_diff1 value: 63.97654644758022 - type: nauc_ndcg_at_100_max value: 60.62187552803407 - type: nauc_ndcg_at_100_std 
value: -24.317149225778312 - type: nauc_ndcg_at_10_diff1 value: 62.505321221321566 - type: nauc_ndcg_at_10_max value: 59.77891112351258 - type: nauc_ndcg_at_10_std value: -26.90910005589911 - type: nauc_ndcg_at_1_diff1 value: 68.25415199323474 - type: nauc_ndcg_at_1_max value: 63.069019003272416 - type: nauc_ndcg_at_1_std value: -18.77085924093244 - type: nauc_ndcg_at_20_diff1 value: 63.04281805056225 - type: nauc_ndcg_at_20_max value: 60.600957307444226 - type: nauc_ndcg_at_20_std value: -24.954862079889203 - type: nauc_ndcg_at_3_diff1 value: 62.970441139740316 - type: nauc_ndcg_at_3_max value: 57.543715669055295 - type: nauc_ndcg_at_3_std value: -25.659388431714703 - type: nauc_ndcg_at_5_diff1 value: 62.82652127664541 - type: nauc_ndcg_at_5_max value: 58.6970443258532 - type: nauc_ndcg_at_5_std value: -25.66329354851023 - type: nauc_precision_at_1000_diff1 value: -33.38530947486223 - type: nauc_precision_at_1000_max value: 25.972468024345414 - type: nauc_precision_at_1000_std value: 17.460222955117978 - type: nauc_precision_at_100_diff1 value: -32.45175999251703 - type: nauc_precision_at_100_max value: 26.367996120487337 - type: nauc_precision_at_100_std value: 17.097957946391208 - type: nauc_precision_at_10_diff1 value: -26.97411235289487 - type: nauc_precision_at_10_max value: 31.504961687240762 - type: nauc_precision_at_10_std value: 11.125341183874687 - type: nauc_precision_at_1_diff1 value: 68.25415199323474 - type: nauc_precision_at_1_max value: 63.069019003272416 - type: nauc_precision_at_1_std value: -18.77085924093244 - type: nauc_precision_at_20_diff1 value: -29.8678078736273 - type: nauc_precision_at_20_max value: 29.031222186584504 - type: nauc_precision_at_20_std value: 14.943600563087928 - type: nauc_precision_at_3_diff1 value: -15.92947221299854 - type: nauc_precision_at_3_max value: 37.73833494235097 - type: nauc_precision_at_3_std value: 3.1573228443500847 - type: nauc_precision_at_5_diff1 value: -22.269156821101642 - type: nauc_precision_at_5_max value: 35.65821838116355 - type: nauc_precision_at_5_std value: 9.265930386198972 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 66.17058859539249 - type: nauc_recall_at_100_max value: 78.066942935192 - type: nauc_recall_at_100_std value: -22.213377762074686 - type: nauc_recall_at_10_diff1 value: 50.82149700700275 - type: nauc_recall_at_10_max value: 56.68053325008221 - type: nauc_recall_at_10_std value: -41.81657941433277 - type: nauc_recall_at_1_diff1 value: 71.89683981857102 - type: nauc_recall_at_1_max value: 20.204460967432645 - type: nauc_recall_at_1_std value: -23.07894656629493 - type: nauc_recall_at_20_diff1 value: 48.28076011857885 - type: nauc_recall_at_20_max value: 63.29641555519295 - type: nauc_recall_at_20_std value: -32.953559708819405 - type: nauc_recall_at_3_diff1 value: 58.15516956312558 - type: nauc_recall_at_3_max value: 42.66315890283056 - type: nauc_recall_at_3_std value: -32.16572530544806 - type: nauc_recall_at_5_diff1 value: 55.900844052439766 - type: nauc_recall_at_5_max value: 55.23702018862884 - type: nauc_recall_at_5_std value: -30.105929528165 - type: ndcg_at_1 value: 77.676 - type: ndcg_at_10 value: 84.552 - type: ndcg_at_100 value: 86.232 - type: ndcg_at_1000 value: 86.33800000000001 - type: ndcg_at_20 value: 85.515 - type: ndcg_at_3 value: 81.112 - type: ndcg_at_5 value: 82.943 - type: precision_at_1 value: 77.676 - type: precision_at_10 value: 15.17 - type: 
precision_at_100 value: 1.6230000000000002 - type: precision_at_1000 value: 0.163 - type: precision_at_20 value: 7.858999999999999 - type: precision_at_3 value: 42.994 - type: precision_at_5 value: 28.747 - type: recall_at_1 value: 59.023 - type: recall_at_10 value: 92.465 - type: recall_at_100 value: 99.18400000000001 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 95.844 - type: recall_at_3 value: 81.826 - type: recall_at_5 value: 88.22 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (deu-eng) type: jinaai/xpqa config: deu-eng split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 82.149 - type: map_at_1 value: 56.277 - type: map_at_10 value: 78.36999999999999 - type: map_at_100 value: 78.94 - type: map_at_1000 value: 78.95 - type: map_at_20 value: 78.818 - type: map_at_3 value: 74.25 - type: map_at_5 value: 77.11099999999999 - type: mrr_at_1 value: 74.28198433420366 - type: mrr_at_10 value: 80.57487877657589 - type: mrr_at_100 value: 80.94025764149008 - type: mrr_at_1000 value: 80.94608738871234 - type: mrr_at_20 value: 80.86240675885023 - type: mrr_at_3 value: 79.4604003481288 - type: mrr_at_5 value: 80.10008703220191 - type: nauc_map_at_1000_diff1 value: 60.44369249057189 - type: nauc_map_at_1000_max value: 49.822240441830246 - type: nauc_map_at_1000_std value: -27.34026380762817 - type: nauc_map_at_100_diff1 value: 60.44635668050401 - type: nauc_map_at_100_max value: 49.838675926660684 - type: nauc_map_at_100_std value: -27.310365556055583 - type: nauc_map_at_10_diff1 value: 60.18546951726522 - type: nauc_map_at_10_max value: 49.72075398096832 - type: nauc_map_at_10_std value: -27.86056102461558 - type: nauc_map_at_1_diff1 value: 71.2906657099758 - type: nauc_map_at_1_max value: 18.970399251589 - type: nauc_map_at_1_std value: -27.260776614286602 - type: nauc_map_at_20_diff1 value: 60.3525975566164 - type: nauc_map_at_20_max value: 49.852487866710646 - type: nauc_map_at_20_std value: -27.305173830170332 - type: nauc_map_at_3_diff1 value: 60.66803500571236 - type: nauc_map_at_3_max value: 41.18191941521972 - type: nauc_map_at_3_std value: -28.71383593401732 - type: nauc_map_at_5_diff1 value: 60.57216514504887 - type: nauc_map_at_5_max value: 47.99837400446299 - type: nauc_map_at_5_std value: -28.756183015949986 - type: nauc_mrr_at_1000_diff1 value: 63.77031955602516 - type: nauc_mrr_at_1000_max value: 54.26907383811417 - type: nauc_mrr_at_1000_std value: -26.227442087164714 - type: nauc_mrr_at_100_diff1 value: 63.77196650108669 - type: nauc_mrr_at_100_max value: 54.281801457913126 - type: nauc_mrr_at_100_std value: -26.216077891830793 - type: nauc_mrr_at_10_diff1 value: 63.50095284903051 - type: nauc_mrr_at_10_max value: 54.3186301730016 - type: nauc_mrr_at_10_std value: -26.29570241722173 - type: nauc_mrr_at_1_diff1 value: 65.15855770999057 - type: nauc_mrr_at_1_max value: 53.213286738515066 - type: nauc_mrr_at_1_std value: -24.683178252901943 - type: nauc_mrr_at_20_diff1 value: 63.74936550280859 - type: nauc_mrr_at_20_max value: 54.355343751439065 - type: nauc_mrr_at_20_std value: -26.197316900009817 - type: nauc_mrr_at_3_diff1 value: 63.912612979082695 - type: nauc_mrr_at_3_max value: 53.75399024225975 - type: nauc_mrr_at_3_std value: -27.194143264554675 - type: nauc_mrr_at_5_diff1 value: 63.72491059053639 - type: nauc_mrr_at_5_max value: 53.66107604019352 - type: nauc_mrr_at_5_std value: -26.92281560584754 - type: nauc_ndcg_at_1000_diff1 value: 61.304218998714354 - type: nauc_ndcg_at_1000_max value: 
52.409135743660386 - type: nauc_ndcg_at_1000_std value: -26.539796489464056 - type: nauc_ndcg_at_100_diff1 value: 61.40355045085304 - type: nauc_ndcg_at_100_max value: 52.79402259608008 - type: nauc_ndcg_at_100_std value: -25.927273456979965 - type: nauc_ndcg_at_10_diff1 value: 59.93675608684116 - type: nauc_ndcg_at_10_max value: 52.617848197542706 - type: nauc_ndcg_at_10_std value: -27.314820020095887 - type: nauc_ndcg_at_1_diff1 value: 65.15855770999057 - type: nauc_ndcg_at_1_max value: 53.213286738515066 - type: nauc_ndcg_at_1_std value: -24.683178252901943 - type: nauc_ndcg_at_20_diff1 value: 60.85093704358376 - type: nauc_ndcg_at_20_max value: 53.14529242671602 - type: nauc_ndcg_at_20_std value: -25.93187916231906 - type: nauc_ndcg_at_3_diff1 value: 60.42301123518882 - type: nauc_ndcg_at_3_max value: 49.59021992975956 - type: nauc_ndcg_at_3_std value: -27.397117967810363 - type: nauc_ndcg_at_5_diff1 value: 60.78655153154219 - type: nauc_ndcg_at_5_max value: 49.54194799556953 - type: nauc_ndcg_at_5_std value: -29.467910172913413 - type: nauc_precision_at_1000_diff1 value: -34.35027108027456 - type: nauc_precision_at_1000_max value: 23.762671066858815 - type: nauc_precision_at_1000_std value: 16.1704780298982 - type: nauc_precision_at_100_diff1 value: -32.66610016754961 - type: nauc_precision_at_100_max value: 25.504044603109588 - type: nauc_precision_at_100_std value: 16.932402988816786 - type: nauc_precision_at_10_diff1 value: -25.720903145017342 - type: nauc_precision_at_10_max value: 30.37029690599926 - type: nauc_precision_at_10_std value: 10.560753160200314 - type: nauc_precision_at_1_diff1 value: 65.15855770999057 - type: nauc_precision_at_1_max value: 53.213286738515066 - type: nauc_precision_at_1_std value: -24.683178252901943 - type: nauc_precision_at_20_diff1 value: -29.577582332619084 - type: nauc_precision_at_20_max value: 27.984145595920417 - type: nauc_precision_at_20_std value: 15.083711704044727 - type: nauc_precision_at_3_diff1 value: -14.736267532892697 - type: nauc_precision_at_3_max value: 36.12211021824307 - type: nauc_precision_at_3_std value: 3.068643876519412 - type: nauc_precision_at_5_diff1 value: -19.846707283120825 - type: nauc_precision_at_5_max value: 33.573804532177896 - type: nauc_precision_at_5_std value: 5.700545622744924 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 68.24749796604452 - type: nauc_recall_at_100_max value: 83.30024864929815 - type: nauc_recall_at_100_std value: 21.23763053711522 - type: nauc_recall_at_10_diff1 value: 50.704049683241436 - type: nauc_recall_at_10_max value: 57.64578984555556 - type: nauc_recall_at_10_std value: -26.632759037746073 - type: nauc_recall_at_1_diff1 value: 71.2906657099758 - type: nauc_recall_at_1_max value: 18.970399251589 - type: nauc_recall_at_1_std value: -27.260776614286602 - type: nauc_recall_at_20_diff1 value: 54.124480837579505 - type: nauc_recall_at_20_max value: 66.4641515433479 - type: nauc_recall_at_20_std value: -14.615911455379393 - type: nauc_recall_at_3_diff1 value: 56.54358788321059 - type: nauc_recall_at_3_max value: 37.765735322465744 - type: nauc_recall_at_3_std value: -30.824147408598574 - type: nauc_recall_at_5_diff1 value: 56.392894535029214 - type: nauc_recall_at_5_max value: 45.959268387521554 - type: nauc_recall_at_5_std value: -33.58175576925282 - type: ndcg_at_1 value: 74.28200000000001 - type: ndcg_at_10 value: 82.149 - type: ndcg_at_100 value: 84.129 - 
type: ndcg_at_1000 value: 84.307 - type: ndcg_at_20 value: 83.39999999999999 - type: ndcg_at_3 value: 78.583 - type: ndcg_at_5 value: 80.13900000000001 - type: precision_at_1 value: 74.28200000000001 - type: precision_at_10 value: 14.960999999999999 - type: precision_at_100 value: 1.6119999999999999 - type: precision_at_1000 value: 0.163 - type: precision_at_20 value: 7.813000000000001 - type: precision_at_3 value: 41.819 - type: precision_at_5 value: 27.911 - type: recall_at_1 value: 56.277 - type: recall_at_10 value: 90.729 - type: recall_at_100 value: 98.792 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 95.148 - type: recall_at_3 value: 79.989 - type: recall_at_5 value: 85.603 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (eng-deu) type: jinaai/xpqa config: eng-deu split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 60.428000000000004 - type: map_at_1 value: 33.453 - type: map_at_10 value: 54.217000000000006 - type: map_at_100 value: 55.832 - type: map_at_1000 value: 55.884 - type: map_at_20 value: 55.236 - type: map_at_3 value: 48.302 - type: map_at_5 value: 51.902 - type: mrr_at_1 value: 53.916449086161876 - type: mrr_at_10 value: 61.4685647975465 - type: mrr_at_100 value: 62.13718159287348 - type: mrr_at_1000 value: 62.15799113826325 - type: mrr_at_20 value: 61.885388764243544 - type: mrr_at_3 value: 59.44299390774582 - type: mrr_at_5 value: 60.26544821583981 - type: nauc_map_at_1000_diff1 value: 39.824412602121804 - type: nauc_map_at_1000_max value: 39.49332709959374 - type: nauc_map_at_1000_std value: -17.27462623749702 - type: nauc_map_at_100_diff1 value: 39.80528910003463 - type: nauc_map_at_100_max value: 39.51471609156093 - type: nauc_map_at_100_std value: -17.275536933094937 - type: nauc_map_at_10_diff1 value: 39.28558292349772 - type: nauc_map_at_10_max value: 38.13220294838968 - type: nauc_map_at_10_std value: -18.235985574392863 - type: nauc_map_at_1_diff1 value: 43.68892397816937 - type: nauc_map_at_1_max value: 14.478978190224353 - type: nauc_map_at_1_std value: -18.435031919225477 - type: nauc_map_at_20_diff1 value: 39.8733530971344 - type: nauc_map_at_20_max value: 39.30513202591992 - type: nauc_map_at_20_std value: -17.62362848144766 - type: nauc_map_at_3_diff1 value: 40.31116611188815 - type: nauc_map_at_3_max value: 31.107314675202165 - type: nauc_map_at_3_std value: -19.52930881946966 - type: nauc_map_at_5_diff1 value: 39.1241499095765 - type: nauc_map_at_5_max value: 37.330543901034055 - type: nauc_map_at_5_std value: -17.893862772447548 - type: nauc_mrr_at_1000_diff1 value: 43.07490530140024 - type: nauc_mrr_at_1000_max value: 42.28469195779226 - type: nauc_mrr_at_1000_std value: -15.583217110180737 - type: nauc_mrr_at_100_diff1 value: 43.068836494603886 - type: nauc_mrr_at_100_max value: 42.29612450479168 - type: nauc_mrr_at_100_std value: -15.57218089438229 - type: nauc_mrr_at_10_diff1 value: 42.88685919151777 - type: nauc_mrr_at_10_max value: 41.89944452003811 - type: nauc_mrr_at_10_std value: -15.909673572763165 - type: nauc_mrr_at_1_diff1 value: 45.67646898532131 - type: nauc_mrr_at_1_max value: 43.0541870425035 - type: nauc_mrr_at_1_std value: -15.597124291613563 - type: nauc_mrr_at_20_diff1 value: 43.14141873150977 - type: nauc_mrr_at_20_max value: 42.33063543184022 - type: nauc_mrr_at_20_std value: -15.607612016107304 - type: nauc_mrr_at_3_diff1 value: 43.18370928261982 - type: nauc_mrr_at_3_max value: 42.18529980773961 - type: nauc_mrr_at_3_std value: -15.900151400673629 - type: 
nauc_mrr_at_5_diff1 value: 42.43443044877765 - type: nauc_mrr_at_5_max value: 42.05818605278972 - type: nauc_mrr_at_5_std value: -15.436502733299893 - type: nauc_ndcg_at_1000_diff1 value: 40.60606676178781 - type: nauc_ndcg_at_1000_max value: 41.71923393878376 - type: nauc_ndcg_at_1000_std value: -15.694740326899556 - type: nauc_ndcg_at_100_diff1 value: 40.15270376312309 - type: nauc_ndcg_at_100_max value: 42.234126305709225 - type: nauc_ndcg_at_100_std value: -15.436051984708952 - type: nauc_ndcg_at_10_diff1 value: 39.142259831299455 - type: nauc_ndcg_at_10_max value: 38.61470104273746 - type: nauc_ndcg_at_10_std value: -18.577452829132742 - type: nauc_ndcg_at_1_diff1 value: 45.67646898532131 - type: nauc_ndcg_at_1_max value: 43.0541870425035 - type: nauc_ndcg_at_1_std value: -15.597124291613563 - type: nauc_ndcg_at_20_diff1 value: 40.805159395901306 - type: nauc_ndcg_at_20_max value: 41.58685629374952 - type: nauc_ndcg_at_20_std value: -16.862408156222592 - type: nauc_ndcg_at_3_diff1 value: 39.12028215488432 - type: nauc_ndcg_at_3_max value: 39.70580596343164 - type: nauc_ndcg_at_3_std value: -16.705546903936213 - type: nauc_ndcg_at_5_diff1 value: 38.42075404927361 - type: nauc_ndcg_at_5_max value: 38.064219879504385 - type: nauc_ndcg_at_5_std value: -17.20282111665876 - type: nauc_precision_at_1000_diff1 value: -4.419224540552891 - type: nauc_precision_at_1000_max value: 35.686022591225246 - type: nauc_precision_at_1000_std value: 15.023520191032972 - type: nauc_precision_at_100_diff1 value: -2.9027602601603895 - type: nauc_precision_at_100_max value: 39.99864013028808 - type: nauc_precision_at_100_std value: 13.863497117255525 - type: nauc_precision_at_10_diff1 value: 5.539104839809501 - type: nauc_precision_at_10_max value: 42.41625740557432 - type: nauc_precision_at_10_std value: 1.0894693748662556 - type: nauc_precision_at_1_diff1 value: 45.67646898532131 - type: nauc_precision_at_1_max value: 43.0541870425035 - type: nauc_precision_at_1_std value: -15.597124291613563 - type: nauc_precision_at_20_diff1 value: 4.734562571681868 - type: nauc_precision_at_20_max value: 44.35081213316202 - type: nauc_precision_at_20_std value: 6.642891478284595 - type: nauc_precision_at_3_diff1 value: 13.936559341472101 - type: nauc_precision_at_3_max value: 45.426668552497524 - type: nauc_precision_at_3_std value: -5.219785419247125 - type: nauc_precision_at_5_diff1 value: 8.366706789546015 - type: nauc_precision_at_5_max value: 46.161942989326896 - type: nauc_precision_at_5_std value: -0.193140343545876 - type: nauc_recall_at_1000_diff1 value: 45.61785312444842 - type: nauc_recall_at_1000_max value: 75.68258976531774 - type: nauc_recall_at_1000_std value: 37.469059422121575 - type: nauc_recall_at_100_diff1 value: 26.798748531805096 - type: nauc_recall_at_100_max value: 54.72134095197765 - type: nauc_recall_at_100_std value: -1.5967608233799417 - type: nauc_recall_at_10_diff1 value: 32.13211696200521 - type: nauc_recall_at_10_max value: 31.13866254975895 - type: nauc_recall_at_10_std value: -22.31404161136118 - type: nauc_recall_at_1_diff1 value: 43.68892397816937 - type: nauc_recall_at_1_max value: 14.478978190224353 - type: nauc_recall_at_1_std value: -18.435031919225477 - type: nauc_recall_at_20_diff1 value: 38.597996930461385 - type: nauc_recall_at_20_max value: 42.49849027366794 - type: nauc_recall_at_20_std value: -16.536471900752154 - type: nauc_recall_at_3_diff1 value: 35.343730012759266 - type: nauc_recall_at_3_max value: 26.898722085043392 - type: nauc_recall_at_3_std value: -19.4459792273884 
- type: nauc_recall_at_5_diff1 value: 31.8310298012186 - type: nauc_recall_at_5_max value: 32.67800489655844 - type: nauc_recall_at_5_std value: -16.800929103347283 - type: ndcg_at_1 value: 53.916 - type: ndcg_at_10 value: 60.428000000000004 - type: ndcg_at_100 value: 65.95 - type: ndcg_at_1000 value: 66.88 - type: ndcg_at_20 value: 62.989 - type: ndcg_at_3 value: 55.204 - type: ndcg_at_5 value: 56.42700000000001 - type: precision_at_1 value: 53.916 - type: precision_at_10 value: 14.346999999999998 - type: precision_at_100 value: 1.849 - type: precision_at_1000 value: 0.196 - type: precision_at_20 value: 8.022 - type: precision_at_3 value: 34.552 - type: precision_at_5 value: 24.569 - type: recall_at_1 value: 33.453 - type: recall_at_10 value: 71.07900000000001 - type: recall_at_100 value: 93.207 - type: recall_at_1000 value: 99.60799999999999 - type: recall_at_20 value: 79.482 - type: recall_at_3 value: 53.98 - type: recall_at_5 value: 60.781 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (eng-pol) type: jinaai/xpqa config: eng-pol split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 34.042 - type: map_at_1 value: 13.236 - type: map_at_10 value: 27.839999999999996 - type: map_at_100 value: 30.171999999999997 - type: map_at_1000 value: 30.349999999999998 - type: map_at_20 value: 29.044999999999998 - type: map_at_3 value: 22.58 - type: map_at_5 value: 25.83 - type: mrr_at_1 value: 30.318471337579616 - type: mrr_at_10 value: 37.4983823678091 - type: mrr_at_100 value: 38.5784523175009 - type: mrr_at_1000 value: 38.63608698968148 - type: mrr_at_20 value: 38.02996157871825 - type: mrr_at_3 value: 34.798301486199584 - type: mrr_at_5 value: 36.39702760084925 - type: nauc_map_at_1000_diff1 value: 21.07199789609177 - type: nauc_map_at_1000_max value: 25.959233507893277 - type: nauc_map_at_1000_std value: -28.011925372852826 - type: nauc_map_at_100_diff1 value: 21.086788412737548 - type: nauc_map_at_100_max value: 25.8611620203686 - type: nauc_map_at_100_std value: -28.179239912057515 - type: nauc_map_at_10_diff1 value: 21.23841745922078 - type: nauc_map_at_10_max value: 25.44290342378288 - type: nauc_map_at_10_std value: -28.75578689110275 - type: nauc_map_at_1_diff1 value: 28.87454015638211 - type: nauc_map_at_1_max value: 17.50681123879997 - type: nauc_map_at_1_std value: -30.382831850562432 - type: nauc_map_at_20_diff1 value: 21.076559713540455 - type: nauc_map_at_20_max value: 25.538154202494535 - type: nauc_map_at_20_std value: -28.518764617658555 - type: nauc_map_at_3_diff1 value: 22.159185358766468 - type: nauc_map_at_3_max value: 23.01652660927249 - type: nauc_map_at_3_std value: -29.567722713221862 - type: nauc_map_at_5_diff1 value: 21.35578810370897 - type: nauc_map_at_5_max value: 25.550550437767395 - type: nauc_map_at_5_std value: -28.7889035461355 - type: nauc_mrr_at_1000_diff1 value: 22.28633009221923 - type: nauc_mrr_at_1000_max value: 26.920205393136392 - type: nauc_mrr_at_1000_std value: -25.887791634977642 - type: nauc_mrr_at_100_diff1 value: 22.2754975739755 - type: nauc_mrr_at_100_max value: 26.90235716615346 - type: nauc_mrr_at_100_std value: -25.891596020584345 - type: nauc_mrr_at_10_diff1 value: 22.415076305593534 - type: nauc_mrr_at_10_max value: 26.504643796222222 - type: nauc_mrr_at_10_std value: -26.6046081215833 - type: nauc_mrr_at_1_diff1 value: 23.406748619244368 - type: nauc_mrr_at_1_max value: 29.058228240823553 - type: nauc_mrr_at_1_std value: -26.450169820901078 - type: nauc_mrr_at_20_diff1 value: 
22.29233141817678 - type: nauc_mrr_at_20_max value: 26.69021351064081 - type: nauc_mrr_at_20_std value: -26.086596227376656 - type: nauc_mrr_at_3_diff1 value: 22.20746187500145 - type: nauc_mrr_at_3_max value: 27.143725946169457 - type: nauc_mrr_at_3_std value: -26.7017708594376 - type: nauc_mrr_at_5_diff1 value: 22.71898965233195 - type: nauc_mrr_at_5_max value: 26.932386658571662 - type: nauc_mrr_at_5_std value: -26.725541058780234 - type: nauc_ndcg_at_1000_diff1 value: 20.541734305148466 - type: nauc_ndcg_at_1000_max value: 27.180534238090758 - type: nauc_ndcg_at_1000_std value: -23.74197745177845 - type: nauc_ndcg_at_100_diff1 value: 20.570052839937468 - type: nauc_ndcg_at_100_max value: 26.21605034405486 - type: nauc_ndcg_at_100_std value: -25.359817188805028 - type: nauc_ndcg_at_10_diff1 value: 21.241423075073467 - type: nauc_ndcg_at_10_max value: 24.599199195239475 - type: nauc_ndcg_at_10_std value: -28.404540333309008 - type: nauc_ndcg_at_1_diff1 value: 23.406748619244368 - type: nauc_ndcg_at_1_max value: 29.058228240823553 - type: nauc_ndcg_at_1_std value: -26.450169820901078 - type: nauc_ndcg_at_20_diff1 value: 20.740460046196873 - type: nauc_ndcg_at_20_max value: 24.82380195169634 - type: nauc_ndcg_at_20_std value: -27.376298834244313 - type: nauc_ndcg_at_3_diff1 value: 19.994948682426504 - type: nauc_ndcg_at_3_max value: 26.153790759405105 - type: nauc_ndcg_at_3_std value: -27.194548404540885 - type: nauc_ndcg_at_5_diff1 value: 21.48414272096384 - type: nauc_ndcg_at_5_max value: 25.239652015076373 - type: nauc_ndcg_at_5_std value: -28.2620160957961 - type: nauc_precision_at_1000_diff1 value: -0.7557639926687744 - type: nauc_precision_at_1000_max value: 24.265591636994436 - type: nauc_precision_at_1000_std value: 16.833104654292654 - type: nauc_precision_at_100_diff1 value: 4.647847665941115 - type: nauc_precision_at_100_max value: 24.42192644844434 - type: nauc_precision_at_100_std value: 0.2718848568876648 - type: nauc_precision_at_10_diff1 value: 9.465969286722654 - type: nauc_precision_at_10_max value: 27.448993150448043 - type: nauc_precision_at_10_std value: -16.519099596502212 - type: nauc_precision_at_1_diff1 value: 23.406748619244368 - type: nauc_precision_at_1_max value: 29.058228240823553 - type: nauc_precision_at_1_std value: -26.450169820901078 - type: nauc_precision_at_20_diff1 value: 8.021421615668114 - type: nauc_precision_at_20_max value: 26.18556481398635 - type: nauc_precision_at_20_std value: -12.207152108668367 - type: nauc_precision_at_3_diff1 value: 11.783572803634241 - type: nauc_precision_at_3_max value: 29.259715774978893 - type: nauc_precision_at_3_std value: -20.407524967717425 - type: nauc_precision_at_5_diff1 value: 10.371728615220821 - type: nauc_precision_at_5_max value: 30.270642833482864 - type: nauc_precision_at_5_std value: -18.407334880575494 - type: nauc_recall_at_1000_diff1 value: 6.008969959111555 - type: nauc_recall_at_1000_max value: 39.79691734058127 - type: nauc_recall_at_1000_std value: 32.43591825510109 - type: nauc_recall_at_100_diff1 value: 15.2374566058917 - type: nauc_recall_at_100_max value: 23.058785539503717 - type: nauc_recall_at_100_std value: -15.962888794058165 - type: nauc_recall_at_10_diff1 value: 19.46184821807753 - type: nauc_recall_at_10_max value: 19.001003513986866 - type: nauc_recall_at_10_std value: -27.753332786663876 - type: nauc_recall_at_1_diff1 value: 28.87454015638211 - type: nauc_recall_at_1_max value: 17.50681123879997 - type: nauc_recall_at_1_std value: -30.382831850562432 - type: nauc_recall_at_20_diff1 
value: 17.237090858517405 - type: nauc_recall_at_20_max value: 18.42118474134871 - type: nauc_recall_at_20_std value: -24.862787724031957 - type: nauc_recall_at_3_diff1 value: 18.813019521758577 - type: nauc_recall_at_3_max value: 19.198572333053544 - type: nauc_recall_at_3_std value: -28.5644958605618 - type: nauc_recall_at_5_diff1 value: 20.247501986329482 - type: nauc_recall_at_5_max value: 21.121526202170358 - type: nauc_recall_at_5_std value: -27.220378617864853 - type: ndcg_at_1 value: 30.318 - type: ndcg_at_10 value: 34.042 - type: ndcg_at_100 value: 42.733 - type: ndcg_at_1000 value: 46.015 - type: ndcg_at_20 value: 37.053999999999995 - type: ndcg_at_3 value: 29.254 - type: ndcg_at_5 value: 30.514000000000003 - type: precision_at_1 value: 30.318 - type: precision_at_10 value: 10.981 - type: precision_at_100 value: 1.889 - type: precision_at_1000 value: 0.234 - type: precision_at_20 value: 6.643000000000001 - type: precision_at_3 value: 22.166 - type: precision_at_5 value: 17.477999999999998 - type: recall_at_1 value: 13.236 - type: recall_at_10 value: 41.461 - type: recall_at_100 value: 75.008 - type: recall_at_1000 value: 96.775 - type: recall_at_20 value: 50.754 - type: recall_at_3 value: 26.081 - type: recall_at_5 value: 33.168 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (eng-cmn) type: jinaai/xpqa config: eng-cmn split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 37.504 - type: map_at_1 value: 16.019 - type: map_at_10 value: 30.794 - type: map_at_100 value: 33.157 - type: map_at_1000 value: 33.324999999999996 - type: map_at_20 value: 32.161 - type: map_at_3 value: 25.372 - type: map_at_5 value: 28.246 - type: mrr_at_1 value: 30.461165048543688 - type: mrr_at_10 value: 39.393107566651224 - type: mrr_at_100 value: 40.570039540602295 - type: mrr_at_1000 value: 40.6306116407744 - type: mrr_at_20 value: 40.09428159978876 - type: mrr_at_3 value: 37.176375404530745 - type: mrr_at_5 value: 38.09870550161812 - type: nauc_map_at_1000_diff1 value: 30.82306881892873 - type: nauc_map_at_1000_max value: 5.877636000666466 - type: nauc_map_at_1000_std value: -30.7140513386797 - type: nauc_map_at_100_diff1 value: 30.85192449151961 - type: nauc_map_at_100_max value: 5.809195131550909 - type: nauc_map_at_100_std value: -30.838556702972063 - type: nauc_map_at_10_diff1 value: 30.50359163635058 - type: nauc_map_at_10_max value: 6.373491595869303 - type: nauc_map_at_10_std value: -29.89368007827676 - type: nauc_map_at_1_diff1 value: 38.60240510083884 - type: nauc_map_at_1_max value: 10.407392664609139 - type: nauc_map_at_1_std value: -17.76327278732833 - type: nauc_map_at_20_diff1 value: 30.897489125753598 - type: nauc_map_at_20_max value: 5.9303381898248 - type: nauc_map_at_20_std value: -30.863345188760515 - type: nauc_map_at_3_diff1 value: 32.8150951852729 - type: nauc_map_at_3_max value: 7.671931402215177 - type: nauc_map_at_3_std value: -25.654809758216533 - type: nauc_map_at_5_diff1 value: 31.19558194781019 - type: nauc_map_at_5_max value: 6.426885613116939 - type: nauc_map_at_5_std value: -28.609027858850016 - type: nauc_mrr_at_1000_diff1 value: 30.7596332048733 - type: nauc_mrr_at_1000_max value: 1.1970748115580212 - type: nauc_mrr_at_1000_std value: -34.647570668150216 - type: nauc_mrr_at_100_diff1 value: 30.74693370788581 - type: nauc_mrr_at_100_max value: 1.1673272262754841 - type: nauc_mrr_at_100_std value: -34.67761028542745 - type: nauc_mrr_at_10_diff1 value: 30.537820575183076 - type: nauc_mrr_at_10_max value: 
1.0261868725502707 - type: nauc_mrr_at_10_std value: -34.999990560631204 - type: nauc_mrr_at_1_diff1 value: 35.51868580113285 - type: nauc_mrr_at_1_max value: 5.117103773147307 - type: nauc_mrr_at_1_std value: -30.633913466736956 - type: nauc_mrr_at_20_diff1 value: 30.67318175430903 - type: nauc_mrr_at_20_max value: 1.0979983974981327 - type: nauc_mrr_at_20_std value: -34.8388339739997 - type: nauc_mrr_at_3_diff1 value: 30.884642006045702 - type: nauc_mrr_at_3_max value: 1.7970996544095983 - type: nauc_mrr_at_3_std value: -34.290172894906085 - type: nauc_mrr_at_5_diff1 value: 30.89687518368571 - type: nauc_mrr_at_5_max value: 1.2123714988495347 - type: nauc_mrr_at_5_std value: -35.01704580471926 - type: nauc_ndcg_at_1000_diff1 value: 29.214476799077342 - type: nauc_ndcg_at_1000_max value: 3.6379035546112872 - type: nauc_ndcg_at_1000_std value: -32.35757522049194 - type: nauc_ndcg_at_100_diff1 value: 29.130004541376298 - type: nauc_ndcg_at_100_max value: 2.9580589185293045 - type: nauc_ndcg_at_100_std value: -33.26884643871724 - type: nauc_ndcg_at_10_diff1 value: 28.521001084366393 - type: nauc_ndcg_at_10_max value: 3.630223957267483 - type: nauc_ndcg_at_10_std value: -33.14524140940815 - type: nauc_ndcg_at_1_diff1 value: 35.51868580113285 - type: nauc_ndcg_at_1_max value: 5.117103773147307 - type: nauc_ndcg_at_1_std value: -30.633913466736956 - type: nauc_ndcg_at_20_diff1 value: 29.194462756848782 - type: nauc_ndcg_at_20_max value: 2.61162903136461 - type: nauc_ndcg_at_20_std value: -34.59161403211834 - type: nauc_ndcg_at_3_diff1 value: 30.183555327135203 - type: nauc_ndcg_at_3_max value: 5.61949040917093 - type: nauc_ndcg_at_3_std value: -30.350117794058175 - type: nauc_ndcg_at_5_diff1 value: 29.74420394139971 - type: nauc_ndcg_at_5_max value: 3.952183813937688 - type: nauc_ndcg_at_5_std value: -31.807833795302038 - type: nauc_precision_at_1000_diff1 value: -5.467049121617333 - type: nauc_precision_at_1000_max value: -3.993986884198271 - type: nauc_precision_at_1000_std value: -13.703967324212224 - type: nauc_precision_at_100_diff1 value: 1.5585428307943647 - type: nauc_precision_at_100_max value: -4.250455723613214 - type: nauc_precision_at_100_std value: -22.294689856776493 - type: nauc_precision_at_10_diff1 value: 11.076036917255259 - type: nauc_precision_at_10_max value: -1.5859394644365377 - type: nauc_precision_at_10_std value: -34.94912594413202 - type: nauc_precision_at_1_diff1 value: 35.51868580113285 - type: nauc_precision_at_1_max value: 5.117103773147307 - type: nauc_precision_at_1_std value: -30.633913466736956 - type: nauc_precision_at_20_diff1 value: 9.311484455773828 - type: nauc_precision_at_20_max value: -3.678383428592432 - type: nauc_precision_at_20_std value: -33.700002761401635 - type: nauc_precision_at_3_diff1 value: 19.2787260874381 - type: nauc_precision_at_3_max value: 0.18292109396940018 - type: nauc_precision_at_3_std value: -35.23939824276542 - type: nauc_precision_at_5_diff1 value: 14.97930592298584 - type: nauc_precision_at_5_max value: -1.63540635880963 - type: nauc_precision_at_5_std value: -35.908283558321315 - type: nauc_recall_at_1000_diff1 value: 26.63056473607804 - type: nauc_recall_at_1000_max value: 62.7304558520689 - type: nauc_recall_at_1000_std value: 58.12421701377561 - type: nauc_recall_at_100_diff1 value: 21.42127379898579 - type: nauc_recall_at_100_max value: 1.4748203516921914 - type: nauc_recall_at_100_std value: -27.56467339041136 - type: nauc_recall_at_10_diff1 value: 21.20479652609812 - type: nauc_recall_at_10_max value: 
1.7394881489709888 - type: nauc_recall_at_10_std value: -32.15116902585072 - type: nauc_recall_at_1_diff1 value: 38.60240510083884 - type: nauc_recall_at_1_max value: 10.407392664609139 - type: nauc_recall_at_1_std value: -17.76327278732833 - type: nauc_recall_at_20_diff1 value: 23.049652721582632 - type: nauc_recall_at_20_max value: -1.7715787106286838 - type: nauc_recall_at_20_std value: -36.14203686002867 - type: nauc_recall_at_3_diff1 value: 26.522179829461873 - type: nauc_recall_at_3_max value: 6.078208732431124 - type: nauc_recall_at_3_std value: -25.02625711226274 - type: nauc_recall_at_5_diff1 value: 24.19538553561693 - type: nauc_recall_at_5_max value: 2.4963810785503524 - type: nauc_recall_at_5_std value: -30.449635496921257 - type: ndcg_at_1 value: 30.461 - type: ndcg_at_10 value: 37.504 - type: ndcg_at_100 value: 46.156000000000006 - type: ndcg_at_1000 value: 48.985 - type: ndcg_at_20 value: 41.025 - type: ndcg_at_3 value: 32.165 - type: ndcg_at_5 value: 33.072 - type: precision_at_1 value: 30.461 - type: precision_at_10 value: 11.032 - type: precision_at_100 value: 1.8870000000000002 - type: precision_at_1000 value: 0.22499999999999998 - type: precision_at_20 value: 6.833 - type: precision_at_3 value: 22.532 - type: precision_at_5 value: 16.966 - type: recall_at_1 value: 16.019 - type: recall_at_10 value: 47.557 - type: recall_at_100 value: 80.376 - type: recall_at_1000 value: 98.904 - type: recall_at_20 value: 58.48100000000001 - type: recall_at_3 value: 30.682 - type: recall_at_5 value: 36.714999999999996 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (eng-spa) type: jinaai/xpqa config: eng-spa split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 53.359 - type: map_at_1 value: 22.892000000000003 - type: map_at_10 value: 45.773 - type: map_at_100 value: 47.778999999999996 - type: map_at_1000 value: 47.882999999999996 - type: map_at_20 value: 46.869 - type: map_at_3 value: 37.643 - type: map_at_5 value: 43.120999999999995 - type: mrr_at_1 value: 47.28877679697352 - type: mrr_at_10 value: 56.95890630316857 - type: mrr_at_100 value: 57.71103367009639 - type: mrr_at_1000 value: 57.73661441948852 - type: mrr_at_20 value: 57.37701091311334 - type: mrr_at_3 value: 54.74989491382929 - type: mrr_at_5 value: 56.08659100462372 - type: nauc_map_at_1000_diff1 value: 27.8347129954991 - type: nauc_map_at_1000_max value: 38.04300600762859 - type: nauc_map_at_1000_std value: -18.294653328262868 - type: nauc_map_at_100_diff1 value: 27.818449297770858 - type: nauc_map_at_100_max value: 38.03533462156633 - type: nauc_map_at_100_std value: -18.332989980880644 - type: nauc_map_at_10_diff1 value: 27.520664180018358 - type: nauc_map_at_10_max value: 37.67109855753314 - type: nauc_map_at_10_std value: -18.496721673888683 - type: nauc_map_at_1_diff1 value: 37.56020148060502 - type: nauc_map_at_1_max value: 10.298394230150745 - type: nauc_map_at_1_std value: -20.41359936101547 - type: nauc_map_at_20_diff1 value: 27.615023038189722 - type: nauc_map_at_20_max value: 37.808525116320254 - type: nauc_map_at_20_std value: -18.49235775420803 - type: nauc_map_at_3_diff1 value: 30.797347567428424 - type: nauc_map_at_3_max value: 29.374407828869497 - type: nauc_map_at_3_std value: -19.75905772914969 - type: nauc_map_at_5_diff1 value: 28.431802888884803 - type: nauc_map_at_5_max value: 35.57723911610521 - type: nauc_map_at_5_std value: -19.093588845366824 - type: nauc_mrr_at_1000_diff1 value: 33.263611009054586 - type: nauc_mrr_at_1000_max value: 
40.620639901613664 - type: nauc_mrr_at_1000_std value: -17.083016011032036 - type: nauc_mrr_at_100_diff1 value: 33.25375012559163 - type: nauc_mrr_at_100_max value: 40.62376205172005 - type: nauc_mrr_at_100_std value: -17.091930575226684 - type: nauc_mrr_at_10_diff1 value: 33.05787202690095 - type: nauc_mrr_at_10_max value: 40.4516362611674 - type: nauc_mrr_at_10_std value: -17.088910666499892 - type: nauc_mrr_at_1_diff1 value: 36.424151087824555 - type: nauc_mrr_at_1_max value: 40.955715626650445 - type: nauc_mrr_at_1_std value: -16.56636409111209 - type: nauc_mrr_at_20_diff1 value: 33.12029456858138 - type: nauc_mrr_at_20_max value: 40.56409347292635 - type: nauc_mrr_at_20_std value: -17.102034817242068 - type: nauc_mrr_at_3_diff1 value: 33.52377926814156 - type: nauc_mrr_at_3_max value: 40.824911575046876 - type: nauc_mrr_at_3_std value: -16.855935748811092 - type: nauc_mrr_at_5_diff1 value: 33.08646471768442 - type: nauc_mrr_at_5_max value: 40.59323589955881 - type: nauc_mrr_at_5_std value: -16.77829710500156 - type: nauc_ndcg_at_1000_diff1 value: 28.741186244590207 - type: nauc_ndcg_at_1000_max value: 40.0113825410539 - type: nauc_ndcg_at_1000_std value: -17.15655081742458 - type: nauc_ndcg_at_100_diff1 value: 28.680521359782972 - type: nauc_ndcg_at_100_max value: 39.94751899984445 - type: nauc_ndcg_at_100_std value: -17.82813814043932 - type: nauc_ndcg_at_10_diff1 value: 27.22858072673168 - type: nauc_ndcg_at_10_max value: 38.600188968554725 - type: nauc_ndcg_at_10_std value: -18.517203924893614 - type: nauc_ndcg_at_1_diff1 value: 36.424151087824555 - type: nauc_ndcg_at_1_max value: 40.955715626650445 - type: nauc_ndcg_at_1_std value: -16.56636409111209 - type: nauc_ndcg_at_20_diff1 value: 27.56875900623774 - type: nauc_ndcg_at_20_max value: 38.95264310199067 - type: nauc_ndcg_at_20_std value: -18.709973965688445 - type: nauc_ndcg_at_3_diff1 value: 28.682842749851574 - type: nauc_ndcg_at_3_max value: 38.361215408395964 - type: nauc_ndcg_at_3_std value: -16.800291231827515 - type: nauc_ndcg_at_5_diff1 value: 28.178239259093484 - type: nauc_ndcg_at_5_max value: 36.77096292606479 - type: nauc_ndcg_at_5_std value: -18.718861696641145 - type: nauc_precision_at_1000_diff1 value: -7.3686253252869305 - type: nauc_precision_at_1000_max value: 31.98896996987639 - type: nauc_precision_at_1000_std value: 13.125659676392267 - type: nauc_precision_at_100_diff1 value: -2.8239113056969156 - type: nauc_precision_at_100_max value: 36.95062472971812 - type: nauc_precision_at_100_std value: 7.230228733647562 - type: nauc_precision_at_10_diff1 value: 2.5515545798843555 - type: nauc_precision_at_10_max value: 45.46146019314904 - type: nauc_precision_at_10_std value: -1.3249340536211553 - type: nauc_precision_at_1_diff1 value: 36.424151087824555 - type: nauc_precision_at_1_max value: 40.955715626650445 - type: nauc_precision_at_1_std value: -16.56636409111209 - type: nauc_precision_at_20_diff1 value: 0.7202861770489576 - type: nauc_precision_at_20_max value: 41.9937596214609 - type: nauc_precision_at_20_std value: 0.2756400069730064 - type: nauc_precision_at_3_diff1 value: 12.89221206929447 - type: nauc_precision_at_3_max value: 48.57775126381142 - type: nauc_precision_at_3_std value: -8.042242254131068 - type: nauc_precision_at_5_diff1 value: 7.063616193387763 - type: nauc_precision_at_5_max value: 47.26496887331675 - type: nauc_precision_at_5_std value: -4.735805200913049 - type: nauc_recall_at_1000_diff1 value: 2.6650052980682224 - type: nauc_recall_at_1000_max value: 81.94826279951472 - type: 
nauc_recall_at_1000_std value: 48.46012388224573 - type: nauc_recall_at_100_diff1 value: 24.516371948375827 - type: nauc_recall_at_100_max value: 39.17639620389552 - type: nauc_recall_at_100_std value: -17.884197602579533 - type: nauc_recall_at_10_diff1 value: 19.93892097640112 - type: nauc_recall_at_10_max value: 33.079079440022106 - type: nauc_recall_at_10_std value: -20.22227622801884 - type: nauc_recall_at_1_diff1 value: 37.56020148060502 - type: nauc_recall_at_1_max value: 10.298394230150745 - type: nauc_recall_at_1_std value: -20.41359936101547 - type: nauc_recall_at_20_diff1 value: 20.363784035670633 - type: nauc_recall_at_20_max value: 33.39352971625336 - type: nauc_recall_at_20_std value: -21.712050932168875 - type: nauc_recall_at_3_diff1 value: 26.220072121604655 - type: nauc_recall_at_3_max value: 25.853218030218507 - type: nauc_recall_at_3_std value: -17.830613372910907 - type: nauc_recall_at_5_diff1 value: 22.25850162680252 - type: nauc_recall_at_5_max value: 30.89620539042785 - type: nauc_recall_at_5_std value: -19.16786434439169 - type: ndcg_at_1 value: 47.288999999999994 - type: ndcg_at_10 value: 53.359 - type: ndcg_at_100 value: 60.25899999999999 - type: ndcg_at_1000 value: 61.902 - type: ndcg_at_20 value: 56.025000000000006 - type: ndcg_at_3 value: 47.221999999999994 - type: ndcg_at_5 value: 49.333 - type: precision_at_1 value: 47.288999999999994 - type: precision_at_10 value: 16.003 - type: precision_at_100 value: 2.221 - type: precision_at_1000 value: 0.246 - type: precision_at_20 value: 8.985 - type: precision_at_3 value: 34.510000000000005 - type: precision_at_5 value: 26.961000000000002 - type: recall_at_1 value: 22.892000000000003 - type: recall_at_10 value: 62.928 - type: recall_at_100 value: 89.105 - type: recall_at_1000 value: 99.319 - type: recall_at_20 value: 71.387 - type: recall_at_3 value: 43.492999999999995 - type: recall_at_5 value: 53.529 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (eng-fra) type: jinaai/xpqa config: eng-fra split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 54.888000000000005 - type: map_at_1 value: 26.079 - type: map_at_10 value: 47.434 - type: map_at_100 value: 49.376 - type: map_at_1000 value: 49.461 - type: map_at_20 value: 48.634 - type: map_at_3 value: 40.409 - type: map_at_5 value: 44.531 - type: mrr_at_1 value: 46.86248331108144 - type: mrr_at_10 value: 56.45506177548896 - type: mrr_at_100 value: 57.20360629445577 - type: mrr_at_1000 value: 57.227004696897986 - type: mrr_at_20 value: 56.905302765737865 - type: mrr_at_3 value: 54.09434801958164 - type: mrr_at_5 value: 55.40943480195811 - type: nauc_map_at_1000_diff1 value: 37.739936045535885 - type: nauc_map_at_1000_max value: 35.92625003516368 - type: nauc_map_at_1000_std value: -15.825119611638398 - type: nauc_map_at_100_diff1 value: 37.71697833661983 - type: nauc_map_at_100_max value: 35.91174068136317 - type: nauc_map_at_100_std value: -15.838841891589006 - type: nauc_map_at_10_diff1 value: 37.52309268219689 - type: nauc_map_at_10_max value: 35.4887130483351 - type: nauc_map_at_10_std value: -16.61132378136234 - type: nauc_map_at_1_diff1 value: 42.705087329207984 - type: nauc_map_at_1_max value: 12.047671550242974 - type: nauc_map_at_1_std value: -17.156030827065834 - type: nauc_map_at_20_diff1 value: 37.59446680137666 - type: nauc_map_at_20_max value: 35.80559546695052 - type: nauc_map_at_20_std value: -16.158338316249786 - type: nauc_map_at_3_diff1 value: 38.618415267131816 - type: nauc_map_at_3_max value: 
27.030227996183925 - type: nauc_map_at_3_std value: -18.962500694157857 - type: nauc_map_at_5_diff1 value: 37.980845601534256 - type: nauc_map_at_5_max value: 32.82374761283266 - type: nauc_map_at_5_std value: -17.856875825229565 - type: nauc_mrr_at_1000_diff1 value: 40.26059509279346 - type: nauc_mrr_at_1000_max value: 39.28453752990871 - type: nauc_mrr_at_1000_std value: -13.306217279524212 - type: nauc_mrr_at_100_diff1 value: 40.23390833398881 - type: nauc_mrr_at_100_max value: 39.26041461025653 - type: nauc_mrr_at_100_std value: -13.317700798873153 - type: nauc_mrr_at_10_diff1 value: 40.163737640180145 - type: nauc_mrr_at_10_max value: 39.27138538165913 - type: nauc_mrr_at_10_std value: -13.472971360323038 - type: nauc_mrr_at_1_diff1 value: 42.95339241383707 - type: nauc_mrr_at_1_max value: 40.62982307619158 - type: nauc_mrr_at_1_std value: -10.429597045942748 - type: nauc_mrr_at_20_diff1 value: 40.23703505923782 - type: nauc_mrr_at_20_max value: 39.27051308063652 - type: nauc_mrr_at_20_std value: -13.390197643922038 - type: nauc_mrr_at_3_diff1 value: 40.5721313555661 - type: nauc_mrr_at_3_max value: 39.254774354468594 - type: nauc_mrr_at_3_std value: -13.773803807863827 - type: nauc_mrr_at_5_diff1 value: 40.41081287079734 - type: nauc_mrr_at_5_max value: 39.515241132077335 - type: nauc_mrr_at_5_std value: -13.306544090087336 - type: nauc_ndcg_at_1000_diff1 value: 38.04772268296103 - type: nauc_ndcg_at_1000_max value: 38.03364565521176 - type: nauc_ndcg_at_1000_std value: -14.203182726102263 - type: nauc_ndcg_at_100_diff1 value: 37.51752795463643 - type: nauc_ndcg_at_100_max value: 37.809671511710604 - type: nauc_ndcg_at_100_std value: -13.880578225081408 - type: nauc_ndcg_at_10_diff1 value: 36.78438984005559 - type: nauc_ndcg_at_10_max value: 36.98105155993232 - type: nauc_ndcg_at_10_std value: -16.886308645939113 - type: nauc_ndcg_at_1_diff1 value: 42.95339241383707 - type: nauc_ndcg_at_1_max value: 40.62982307619158 - type: nauc_ndcg_at_1_std value: -10.429597045942748 - type: nauc_ndcg_at_20_diff1 value: 36.94164323893683 - type: nauc_ndcg_at_20_max value: 37.333583379288285 - type: nauc_ndcg_at_20_std value: -15.853318071434716 - type: nauc_ndcg_at_3_diff1 value: 36.905604845477384 - type: nauc_ndcg_at_3_max value: 35.10252586688781 - type: nauc_ndcg_at_3_std value: -17.128435988977742 - type: nauc_ndcg_at_5_diff1 value: 37.96742463612705 - type: nauc_ndcg_at_5_max value: 34.65945109443365 - type: nauc_ndcg_at_5_std value: -17.916428667861183 - type: nauc_precision_at_1000_diff1 value: -3.740861894117653 - type: nauc_precision_at_1000_max value: 31.993854396874177 - type: nauc_precision_at_1000_std value: 17.445629474196448 - type: nauc_precision_at_100_diff1 value: -0.4825948747911606 - type: nauc_precision_at_100_max value: 35.834638448782954 - type: nauc_precision_at_100_std value: 16.82718796079511 - type: nauc_precision_at_10_diff1 value: 8.285949866268147 - type: nauc_precision_at_10_max value: 45.3292519726866 - type: nauc_precision_at_10_std value: 4.5574850748441555 - type: nauc_precision_at_1_diff1 value: 42.95339241383707 - type: nauc_precision_at_1_max value: 40.62982307619158 - type: nauc_precision_at_1_std value: -10.429597045942748 - type: nauc_precision_at_20_diff1 value: 4.890590733611442 - type: nauc_precision_at_20_max value: 41.83051757078859 - type: nauc_precision_at_20_std value: 9.197347125630467 - type: nauc_precision_at_3_diff1 value: 17.79940075411976 - type: nauc_precision_at_3_max value: 45.224103632426946 - type: nauc_precision_at_3_std value: 
-5.017203435609909 - type: nauc_precision_at_5_diff1 value: 13.548063145911929 - type: nauc_precision_at_5_max value: 46.84837547409909 - type: nauc_precision_at_5_std value: -0.8925939386354484 - type: nauc_recall_at_1000_diff1 value: 74.48441717138078 - type: nauc_recall_at_1000_max value: 74.66717137705027 - type: nauc_recall_at_1000_std value: 0.24030117471512125 - type: nauc_recall_at_100_diff1 value: 22.553777341988656 - type: nauc_recall_at_100_max value: 31.67861029246527 - type: nauc_recall_at_100_std value: 0.2707450517253687 - type: nauc_recall_at_10_diff1 value: 28.490866614443235 - type: nauc_recall_at_10_max value: 31.722970141434352 - type: nauc_recall_at_10_std value: -21.97893365028007 - type: nauc_recall_at_1_diff1 value: 42.705087329207984 - type: nauc_recall_at_1_max value: 12.047671550242974 - type: nauc_recall_at_1_std value: -17.156030827065834 - type: nauc_recall_at_20_diff1 value: 27.44043454173112 - type: nauc_recall_at_20_max value: 31.454281772040716 - type: nauc_recall_at_20_std value: -20.1735695305415 - type: nauc_recall_at_3_diff1 value: 34.08447534706394 - type: nauc_recall_at_3_max value: 21.793973773840865 - type: nauc_recall_at_3_std value: -22.753978372378906 - type: nauc_recall_at_5_diff1 value: 33.59686526199479 - type: nauc_recall_at_5_max value: 29.188889073761302 - type: nauc_recall_at_5_std value: -21.96156333744562 - type: ndcg_at_1 value: 46.861999999999995 - type: ndcg_at_10 value: 54.888000000000005 - type: ndcg_at_100 value: 61.477000000000004 - type: ndcg_at_1000 value: 62.768 - type: ndcg_at_20 value: 57.812 - type: ndcg_at_3 value: 48.721 - type: ndcg_at_5 value: 50.282000000000004 - type: precision_at_1 value: 46.861999999999995 - type: precision_at_10 value: 15.167 - type: precision_at_100 value: 2.072 - type: precision_at_1000 value: 0.22499999999999998 - type: precision_at_20 value: 8.672 - type: precision_at_3 value: 33.066 - type: precision_at_5 value: 24.726 - type: recall_at_1 value: 26.079 - type: recall_at_10 value: 66.095 - type: recall_at_100 value: 91.65299999999999 - type: recall_at_1000 value: 99.83999999999999 - type: recall_at_20 value: 75.28 - type: recall_at_3 value: 46.874 - type: recall_at_5 value: 55.062 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (pol-eng) type: jinaai/xpqa config: pol-eng split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 50.831 - type: map_at_1 value: 25.549 - type: map_at_10 value: 44.432 - type: map_at_100 value: 46.431 - type: map_at_1000 value: 46.525 - type: map_at_20 value: 45.595 - type: map_at_3 value: 38.574000000000005 - type: map_at_5 value: 42.266999999999996 - type: mrr_at_1 value: 43.5006435006435 - type: mrr_at_10 value: 51.561255132683684 - type: mrr_at_100 value: 52.59912482635216 - type: mrr_at_1000 value: 52.631337587043056 - type: mrr_at_20 value: 52.23234440063273 - type: mrr_at_3 value: 48.97039897039895 - type: mrr_at_5 value: 50.31531531531527 - type: nauc_map_at_1000_diff1 value: 35.907901295900174 - type: nauc_map_at_1000_max value: 24.573763602041687 - type: nauc_map_at_1000_std value: -29.524077960309313 - type: nauc_map_at_100_diff1 value: 35.86869121827827 - type: nauc_map_at_100_max value: 24.532343818487494 - type: nauc_map_at_100_std value: -29.613979124488864 - type: nauc_map_at_10_diff1 value: 35.90171794022391 - type: nauc_map_at_10_max value: 23.90914892943268 - type: nauc_map_at_10_std value: -30.43698820061533 - type: nauc_map_at_1_diff1 value: 50.80313333312038 - type: nauc_map_at_1_max value: 
16.649890421888156 - type: nauc_map_at_1_std value: -22.323989416471683 - type: nauc_map_at_20_diff1 value: 35.77755470212964 - type: nauc_map_at_20_max value: 24.199895270297034 - type: nauc_map_at_20_std value: -30.223411960170647 - type: nauc_map_at_3_diff1 value: 38.964124882315936 - type: nauc_map_at_3_max value: 21.187432510177167 - type: nauc_map_at_3_std value: -28.976663506389887 - type: nauc_map_at_5_diff1 value: 36.04644236616672 - type: nauc_map_at_5_max value: 23.501186429317094 - type: nauc_map_at_5_std value: -30.068144596060748 - type: nauc_mrr_at_1000_diff1 value: 41.36555452105447 - type: nauc_mrr_at_1000_max value: 26.376799280402867 - type: nauc_mrr_at_1000_std value: -30.008603028757424 - type: nauc_mrr_at_100_diff1 value: 41.35523965220727 - type: nauc_mrr_at_100_max value: 26.402612115967706 - type: nauc_mrr_at_100_std value: -29.991754627128024 - type: nauc_mrr_at_10_diff1 value: 41.001395127259315 - type: nauc_mrr_at_10_max value: 26.104860505051384 - type: nauc_mrr_at_10_std value: -30.38420449487516 - type: nauc_mrr_at_1_diff1 value: 44.882846373248206 - type: nauc_mrr_at_1_max value: 26.61905322890808 - type: nauc_mrr_at_1_std value: -28.724565662206153 - type: nauc_mrr_at_20_diff1 value: 41.278009142648834 - type: nauc_mrr_at_20_max value: 26.284565529087295 - type: nauc_mrr_at_20_std value: -30.19549140549242 - type: nauc_mrr_at_3_diff1 value: 41.74663893951077 - type: nauc_mrr_at_3_max value: 26.263048464325884 - type: nauc_mrr_at_3_std value: -30.676733442965688 - type: nauc_mrr_at_5_diff1 value: 41.11461477846568 - type: nauc_mrr_at_5_max value: 25.94713927964926 - type: nauc_mrr_at_5_std value: -30.317066480767817 - type: nauc_ndcg_at_1000_diff1 value: 36.34161052445199 - type: nauc_ndcg_at_1000_max value: 26.321036033696206 - type: nauc_ndcg_at_1000_std value: -27.59146917115399 - type: nauc_ndcg_at_100_diff1 value: 35.66557800007035 - type: nauc_ndcg_at_100_max value: 26.282211208336136 - type: nauc_ndcg_at_100_std value: -27.905634124461333 - type: nauc_ndcg_at_10_diff1 value: 35.34872687407275 - type: nauc_ndcg_at_10_max value: 24.018561915792272 - type: nauc_ndcg_at_10_std value: -31.57712772869015 - type: nauc_ndcg_at_1_diff1 value: 44.882846373248206 - type: nauc_ndcg_at_1_max value: 26.865602442152554 - type: nauc_ndcg_at_1_std value: -28.509295454329152 - type: nauc_ndcg_at_20_diff1 value: 35.46177768045546 - type: nauc_ndcg_at_20_max value: 24.921273675141542 - type: nauc_ndcg_at_20_std value: -30.84348812979793 - type: nauc_ndcg_at_3_diff1 value: 36.84688489063923 - type: nauc_ndcg_at_3_max value: 24.088513229463736 - type: nauc_ndcg_at_3_std value: -30.05640995379297 - type: nauc_ndcg_at_5_diff1 value: 35.623143276796185 - type: nauc_ndcg_at_5_max value: 23.76654250474061 - type: nauc_ndcg_at_5_std value: -30.87847710074466 - type: nauc_precision_at_1000_diff1 value: -16.270532533886932 - type: nauc_precision_at_1000_max value: 17.37365042394671 - type: nauc_precision_at_1000_std value: 16.27166715693082 - type: nauc_precision_at_100_diff1 value: -13.175264889436313 - type: nauc_precision_at_100_max value: 19.488571046893963 - type: nauc_precision_at_100_std value: 9.055429698007798 - type: nauc_precision_at_10_diff1 value: 0.6806938753592942 - type: nauc_precision_at_10_max value: 21.933083960522616 - type: nauc_precision_at_10_std value: -18.2147036942157 - type: nauc_precision_at_1_diff1 value: 44.882846373248206 - type: nauc_precision_at_1_max value: 26.865602442152554 - type: nauc_precision_at_1_std value: -28.509295454329152 - type: 
nauc_precision_at_20_diff1 value: -4.318119150162302 - type: nauc_precision_at_20_max value: 21.089702301041687 - type: nauc_precision_at_20_std value: -10.333077681479546 - type: nauc_precision_at_3_diff1 value: 11.496076462671107 - type: nauc_precision_at_3_max value: 23.018301549827008 - type: nauc_precision_at_3_std value: -23.98652995416454 - type: nauc_precision_at_5_diff1 value: 4.271050668117355 - type: nauc_precision_at_5_max value: 23.61051327966779 - type: nauc_precision_at_5_std value: -21.557618503107847 - type: nauc_recall_at_1000_diff1 value: 62.23955911850697 - type: nauc_recall_at_1000_max value: 83.20491723365542 - type: nauc_recall_at_1000_std value: 66.5173462601958 - type: nauc_recall_at_100_diff1 value: 20.503778602988177 - type: nauc_recall_at_100_max value: 29.379026288767506 - type: nauc_recall_at_100_std value: -16.139120874540573 - type: nauc_recall_at_10_diff1 value: 27.659110249896557 - type: nauc_recall_at_10_max value: 19.69557968026332 - type: nauc_recall_at_10_std value: -33.95657132767551 - type: nauc_recall_at_1_diff1 value: 50.80313333312038 - type: nauc_recall_at_1_max value: 16.649890421888156 - type: nauc_recall_at_1_std value: -22.323989416471683 - type: nauc_recall_at_20_diff1 value: 27.084453724565176 - type: nauc_recall_at_20_max value: 21.40080632474994 - type: nauc_recall_at_20_std value: -32.83683639340239 - type: nauc_recall_at_3_diff1 value: 34.32950941333572 - type: nauc_recall_at_3_max value: 18.55616615958199 - type: nauc_recall_at_3_std value: -30.375983327454076 - type: nauc_recall_at_5_diff1 value: 29.44516734974564 - type: nauc_recall_at_5_max value: 20.630543534300312 - type: nauc_recall_at_5_std value: -31.30763062499127 - type: ndcg_at_1 value: 43.501 - type: ndcg_at_10 value: 50.831 - type: ndcg_at_100 value: 58.17099999999999 - type: ndcg_at_1000 value: 59.705 - type: ndcg_at_20 value: 54.047999999999995 - type: ndcg_at_3 value: 44.549 - type: ndcg_at_5 value: 46.861000000000004 - type: precision_at_1 value: 43.501 - type: precision_at_10 value: 12.895999999999999 - type: precision_at_100 value: 1.9 - type: precision_at_1000 value: 0.21 - type: precision_at_20 value: 7.593 - type: precision_at_3 value: 29.215000000000003 - type: precision_at_5 value: 21.57 - type: recall_at_1 value: 25.549 - type: recall_at_10 value: 61.795 - type: recall_at_100 value: 90.019 - type: recall_at_1000 value: 99.807 - type: recall_at_20 value: 72.096 - type: recall_at_3 value: 43.836999999999996 - type: recall_at_5 value: 51.714000000000006 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (pol-pol) type: jinaai/xpqa config: pol-pol split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 53.70399999999999 - type: map_at_1 value: 27.739000000000004 - type: map_at_10 value: 47.469 - type: map_at_100 value: 49.392 - type: map_at_1000 value: 49.483 - type: map_at_20 value: 48.646 - type: map_at_3 value: 41.467 - type: map_at_5 value: 45.467 - type: mrr_at_1 value: 47.00636942675159 - type: mrr_at_10 value: 54.63699322616519 - type: mrr_at_100 value: 55.54525182833755 - type: mrr_at_1000 value: 55.581331515356155 - type: mrr_at_20 value: 55.22918377451415 - type: mrr_at_3 value: 52.03821656050952 - type: mrr_at_5 value: 53.38216560509549 - type: nauc_map_at_1000_diff1 value: 45.03530825034854 - type: nauc_map_at_1000_max value: 34.22740272603397 - type: nauc_map_at_1000_std value: -30.428880484199244 - type: nauc_map_at_100_diff1 value: 44.978704455592805 - type: nauc_map_at_100_max value: 34.20908357964765 
- type: nauc_map_at_100_std value: -30.47325365059666 - type: nauc_map_at_10_diff1 value: 44.9560579177672 - type: nauc_map_at_10_max value: 33.70097588985278 - type: nauc_map_at_10_std value: -31.205563222357885 - type: nauc_map_at_1_diff1 value: 57.94711780881773 - type: nauc_map_at_1_max value: 21.60278071836319 - type: nauc_map_at_1_std value: -23.273741268035923 - type: nauc_map_at_20_diff1 value: 44.97859054699532 - type: nauc_map_at_20_max value: 34.153729150181846 - type: nauc_map_at_20_std value: -30.97482545902907 - type: nauc_map_at_3_diff1 value: 47.52016138686765 - type: nauc_map_at_3_max value: 30.176197065298417 - type: nauc_map_at_3_std value: -29.90628984041898 - type: nauc_map_at_5_diff1 value: 45.36581638257985 - type: nauc_map_at_5_max value: 33.697200263698036 - type: nauc_map_at_5_std value: -31.165331120088453 - type: nauc_mrr_at_1000_diff1 value: 53.32889526818364 - type: nauc_mrr_at_1000_max value: 36.104118340589736 - type: nauc_mrr_at_1000_std value: -31.321132494516984 - type: nauc_mrr_at_100_diff1 value: 53.30695875258367 - type: nauc_mrr_at_100_max value: 36.114890079024455 - type: nauc_mrr_at_100_std value: -31.291749322117447 - type: nauc_mrr_at_10_diff1 value: 53.189084772141435 - type: nauc_mrr_at_10_max value: 35.939061062282484 - type: nauc_mrr_at_10_std value: -31.502185884653645 - type: nauc_mrr_at_1_diff1 value: 56.89368291041337 - type: nauc_mrr_at_1_max value: 36.07581125496313 - type: nauc_mrr_at_1_std value: -29.703764232519475 - type: nauc_mrr_at_20_diff1 value: 53.23955737199497 - type: nauc_mrr_at_20_max value: 36.068824838215676 - type: nauc_mrr_at_20_std value: -31.420039428197594 - type: nauc_mrr_at_3_diff1 value: 53.74385074861207 - type: nauc_mrr_at_3_max value: 35.57054587735015 - type: nauc_mrr_at_3_std value: -32.356894834537684 - type: nauc_mrr_at_5_diff1 value: 53.66669556981826 - type: nauc_mrr_at_5_max value: 36.02102289605049 - type: nauc_mrr_at_5_std value: -32.030437067359124 - type: nauc_ndcg_at_1000_diff1 value: 46.34900536768847 - type: nauc_ndcg_at_1000_max value: 35.6314995837715 - type: nauc_ndcg_at_1000_std value: -28.965103958822624 - type: nauc_ndcg_at_100_diff1 value: 45.1587893788861 - type: nauc_ndcg_at_100_max value: 35.62430753595297 - type: nauc_ndcg_at_100_std value: -28.77303405812772 - type: nauc_ndcg_at_10_diff1 value: 44.928781590765965 - type: nauc_ndcg_at_10_max value: 34.315200006430366 - type: nauc_ndcg_at_10_std value: -32.05164097076614 - type: nauc_ndcg_at_1_diff1 value: 57.228262350455125 - type: nauc_ndcg_at_1_max value: 35.645285703387366 - type: nauc_ndcg_at_1_std value: -29.893553821348718 - type: nauc_ndcg_at_20_diff1 value: 44.959903633039865 - type: nauc_ndcg_at_20_max value: 35.493022926282755 - type: nauc_ndcg_at_20_std value: -31.54989291850644 - type: nauc_ndcg_at_3_diff1 value: 46.65266185996905 - type: nauc_ndcg_at_3_max value: 33.74458119579594 - type: nauc_ndcg_at_3_std value: -31.493683304534176 - type: nauc_ndcg_at_5_diff1 value: 46.08707037187612 - type: nauc_ndcg_at_5_max value: 34.7401426055243 - type: nauc_ndcg_at_5_std value: -32.44390676345172 - type: nauc_precision_at_1000_diff1 value: -12.11355300492561 - type: nauc_precision_at_1000_max value: 14.490738062121233 - type: nauc_precision_at_1000_std value: 14.448811005059097 - type: nauc_precision_at_100_diff1 value: -9.742085657181239 - type: nauc_precision_at_100_max value: 18.030305489251223 - type: nauc_precision_at_100_std value: 8.213089709529765 - type: nauc_precision_at_10_diff1 value: 5.153466672774969 - type: 
nauc_precision_at_10_max value: 27.29412644661678 - type: nauc_precision_at_10_std value: -15.505053884112355 - type: nauc_precision_at_1_diff1 value: 57.228262350455125 - type: nauc_precision_at_1_max value: 35.645285703387366 - type: nauc_precision_at_1_std value: -29.893553821348718 - type: nauc_precision_at_20_diff1 value: -0.6812430761066635 - type: nauc_precision_at_20_max value: 25.81911286466295 - type: nauc_precision_at_20_std value: -8.388506222482595 - type: nauc_precision_at_3_diff1 value: 18.263873866510576 - type: nauc_precision_at_3_max value: 30.879576105862345 - type: nauc_precision_at_3_std value: -24.0342929870108 - type: nauc_precision_at_5_diff1 value: 10.9905804265327 - type: nauc_precision_at_5_max value: 30.88468087429045 - type: nauc_precision_at_5_std value: -20.458684056213507 - type: nauc_recall_at_1000_diff1 value: -64.887668417171 - type: nauc_recall_at_1000_max value: 52.25501730358092 - type: nauc_recall_at_1000_std value: 85.13647916200132 - type: nauc_recall_at_100_diff1 value: 18.956777346127655 - type: nauc_recall_at_100_max value: 36.10473493564588 - type: nauc_recall_at_100_std value: -10.007474558899949 - type: nauc_recall_at_10_diff1 value: 33.810344497568046 - type: nauc_recall_at_10_max value: 31.395430183214245 - type: nauc_recall_at_10_std value: -33.12920524433795 - type: nauc_recall_at_1_diff1 value: 57.94711780881773 - type: nauc_recall_at_1_max value: 21.60278071836319 - type: nauc_recall_at_1_std value: -23.273741268035923 - type: nauc_recall_at_20_diff1 value: 31.449657437065397 - type: nauc_recall_at_20_max value: 34.519574934321945 - type: nauc_recall_at_20_std value: -33.43406862055647 - type: nauc_recall_at_3_diff1 value: 42.07841848382365 - type: nauc_recall_at_3_max value: 28.7648772833266 - type: nauc_recall_at_3_std value: -31.56367736320086 - type: nauc_recall_at_5_diff1 value: 39.21392858246301 - type: nauc_recall_at_5_max value: 34.28338202081927 - type: nauc_recall_at_5_std value: -33.725680523721906 - type: ndcg_at_1 value: 46.879 - type: ndcg_at_10 value: 53.70399999999999 - type: ndcg_at_100 value: 60.532 - type: ndcg_at_1000 value: 61.997 - type: ndcg_at_20 value: 56.818999999999996 - type: ndcg_at_3 value: 47.441 - type: ndcg_at_5 value: 49.936 - type: precision_at_1 value: 46.879 - type: precision_at_10 value: 13.376 - type: precision_at_100 value: 1.8980000000000001 - type: precision_at_1000 value: 0.208 - type: precision_at_20 value: 7.771 - type: precision_at_3 value: 30.658 - type: precision_at_5 value: 22.828 - type: recall_at_1 value: 27.739000000000004 - type: recall_at_10 value: 64.197 - type: recall_at_100 value: 90.54100000000001 - type: recall_at_1000 value: 99.90400000000001 - type: recall_at_20 value: 74.178 - type: recall_at_3 value: 46.312 - type: recall_at_5 value: 54.581999999999994 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (cmn-eng) type: jinaai/xpqa config: cmn-eng split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 64.64 - type: map_at_1 value: 35.858000000000004 - type: map_at_10 value: 58.547000000000004 - type: map_at_100 value: 60.108 - type: map_at_1000 value: 60.153999999999996 - type: map_at_20 value: 59.528000000000006 - type: map_at_3 value: 51.578 - type: map_at_5 value: 56.206999999999994 - type: mrr_at_1 value: 56.95121951219512 - type: mrr_at_10 value: 64.93975029036001 - type: mrr_at_100 value: 65.63357055718294 - type: mrr_at_1000 value: 65.64844109026834 - type: mrr_at_20 value: 65.41280668715439 - type: mrr_at_3 value: 
62.68292682926826 - type: mrr_at_5 value: 64.1585365853658 - type: nauc_map_at_1000_diff1 value: 45.82740870907091 - type: nauc_map_at_1000_max value: 21.9696540066807 - type: nauc_map_at_1000_std value: -32.028262356639495 - type: nauc_map_at_100_diff1 value: 45.802053117616396 - type: nauc_map_at_100_max value: 21.946002070290966 - type: nauc_map_at_100_std value: -32.06190418866229 - type: nauc_map_at_10_diff1 value: 46.017774155748945 - type: nauc_map_at_10_max value: 21.876909086095544 - type: nauc_map_at_10_std value: -32.13913568843985 - type: nauc_map_at_1_diff1 value: 56.34671160956164 - type: nauc_map_at_1_max value: 17.6796949796236 - type: nauc_map_at_1_std value: -13.741140688066045 - type: nauc_map_at_20_diff1 value: 46.027469176858716 - type: nauc_map_at_20_max value: 21.80738432042703 - type: nauc_map_at_20_std value: -32.430379634015395 - type: nauc_map_at_3_diff1 value: 48.40096725254027 - type: nauc_map_at_3_max value: 21.15442803574233 - type: nauc_map_at_3_std value: -26.205850292181417 - type: nauc_map_at_5_diff1 value: 45.77800041356389 - type: nauc_map_at_5_max value: 22.11718771798752 - type: nauc_map_at_5_std value: -30.32876338031471 - type: nauc_mrr_at_1000_diff1 value: 49.748274798877944 - type: nauc_mrr_at_1000_max value: 24.547774167219906 - type: nauc_mrr_at_1000_std value: -32.728447209433504 - type: nauc_mrr_at_100_diff1 value: 49.734549290377856 - type: nauc_mrr_at_100_max value: 24.536933315055222 - type: nauc_mrr_at_100_std value: -32.74076335880697 - type: nauc_mrr_at_10_diff1 value: 49.82827711456392 - type: nauc_mrr_at_10_max value: 24.536773657485075 - type: nauc_mrr_at_10_std value: -33.05707547166962 - type: nauc_mrr_at_1_diff1 value: 51.954289992321044 - type: nauc_mrr_at_1_max value: 26.336255074856886 - type: nauc_mrr_at_1_std value: -29.042962019692446 - type: nauc_mrr_at_20_diff1 value: 49.70938465628863 - type: nauc_mrr_at_20_max value: 24.433219849576947 - type: nauc_mrr_at_20_std value: -32.94123791846049 - type: nauc_mrr_at_3_diff1 value: 50.289486880347134 - type: nauc_mrr_at_3_max value: 24.978796972860142 - type: nauc_mrr_at_3_std value: -32.11305594784892 - type: nauc_mrr_at_5_diff1 value: 49.95013396316144 - type: nauc_mrr_at_5_max value: 24.514452761198303 - type: nauc_mrr_at_5_std value: -32.865859962984146 - type: nauc_ndcg_at_1000_diff1 value: 45.73806489233998 - type: nauc_ndcg_at_1000_max value: 22.404941391043867 - type: nauc_ndcg_at_1000_std value: -33.063445720849685 - type: nauc_ndcg_at_100_diff1 value: 45.1046206923062 - type: nauc_ndcg_at_100_max value: 22.081133719684658 - type: nauc_ndcg_at_100_std value: -33.299291459450146 - type: nauc_ndcg_at_10_diff1 value: 46.140608688357496 - type: nauc_ndcg_at_10_max value: 21.442489279388916 - type: nauc_ndcg_at_10_std value: -35.115870342856006 - type: nauc_ndcg_at_1_diff1 value: 51.954289992321044 - type: nauc_ndcg_at_1_max value: 26.336255074856886 - type: nauc_ndcg_at_1_std value: -29.042962019692446 - type: nauc_ndcg_at_20_diff1 value: 45.966784725457046 - type: nauc_ndcg_at_20_max value: 21.166632858613145 - type: nauc_ndcg_at_20_std value: -35.65112890375392 - type: nauc_ndcg_at_3_diff1 value: 46.7404863978999 - type: nauc_ndcg_at_3_max value: 22.701743709129456 - type: nauc_ndcg_at_3_std value: -30.907633466983192 - type: nauc_ndcg_at_5_diff1 value: 45.86487199083486 - type: nauc_ndcg_at_5_max value: 22.088804840002513 - type: nauc_ndcg_at_5_std value: -32.3853481632832 - type: nauc_precision_at_1000_diff1 value: -25.69710612774455 - type: nauc_precision_at_1000_max 
value: 1.3964400247388091 - type: nauc_precision_at_1000_std value: -8.873947511634814 - type: nauc_precision_at_100_diff1 value: -24.013497191077978 - type: nauc_precision_at_100_max value: 2.0197725715909343 - type: nauc_precision_at_100_std value: -11.387423148770633 - type: nauc_precision_at_10_diff1 value: -6.47728645242781 - type: nauc_precision_at_10_max value: 6.815261443768304 - type: nauc_precision_at_10_std value: -26.825062292855943 - type: nauc_precision_at_1_diff1 value: 51.954289992321044 - type: nauc_precision_at_1_max value: 26.336255074856886 - type: nauc_precision_at_1_std value: -29.042962019692446 - type: nauc_precision_at_20_diff1 value: -12.355232044747511 - type: nauc_precision_at_20_max value: 4.022126850949725 - type: nauc_precision_at_20_std value: -23.688935769326772 - type: nauc_precision_at_3_diff1 value: 7.662671665835864 - type: nauc_precision_at_3_max value: 14.372394760986248 - type: nauc_precision_at_3_std value: -28.635125665532453 - type: nauc_precision_at_5_diff1 value: -1.4592476425511611 - type: nauc_precision_at_5_max value: 11.124310161474174 - type: nauc_precision_at_5_std value: -27.89526669318053 - type: nauc_recall_at_1000_diff1 value: -19.58450046684932 - type: nauc_recall_at_1000_max value: 70.71661998133165 - type: nauc_recall_at_1000_std value: 93.05555555556315 - type: nauc_recall_at_100_diff1 value: 15.06356457571853 - type: nauc_recall_at_100_max value: 14.051414749344806 - type: nauc_recall_at_100_std value: -29.461874235153008 - type: nauc_recall_at_10_diff1 value: 41.29842726117901 - type: nauc_recall_at_10_max value: 15.768699673830898 - type: nauc_recall_at_10_std value: -42.11585661287712 - type: nauc_recall_at_1_diff1 value: 56.34671160956164 - type: nauc_recall_at_1_max value: 17.6796949796236 - type: nauc_recall_at_1_std value: -13.741140688066045 - type: nauc_recall_at_20_diff1 value: 38.8078283585263 - type: nauc_recall_at_20_max value: 12.06816084005326 - type: nauc_recall_at_20_std value: -48.20956170056591 - type: nauc_recall_at_3_diff1 value: 44.71028758038993 - type: nauc_recall_at_3_max value: 19.1059093689162 - type: nauc_recall_at_3_std value: -26.795164453784253 - type: nauc_recall_at_5_diff1 value: 41.06320797773054 - type: nauc_recall_at_5_max value: 19.117028272530998 - type: nauc_recall_at_5_std value: -33.985747504612156 - type: ndcg_at_1 value: 56.95099999999999 - type: ndcg_at_10 value: 64.64 - type: ndcg_at_100 value: 70.017 - type: ndcg_at_1000 value: 70.662 - type: ndcg_at_20 value: 67.256 - type: ndcg_at_3 value: 58.269000000000005 - type: ndcg_at_5 value: 60.94199999999999 - type: precision_at_1 value: 56.95099999999999 - type: precision_at_10 value: 15.671 - type: precision_at_100 value: 2.002 - type: precision_at_1000 value: 0.208 - type: precision_at_20 value: 8.689 - type: precision_at_3 value: 36.341 - type: precision_at_5 value: 26.854 - type: recall_at_1 value: 35.858000000000004 - type: recall_at_10 value: 75.02 - type: recall_at_100 value: 95.76 - type: recall_at_1000 value: 99.837 - type: recall_at_20 value: 83.732 - type: recall_at_3 value: 57.093 - type: recall_at_5 value: 66.193 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (cmn-cmn) type: jinaai/xpqa config: cmn-cmn split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 69.446 - type: map_at_1 value: 39.995999999999995 - type: map_at_10 value: 64.033 - type: map_at_100 value: 65.51599999999999 - type: map_at_1000 value: 65.545 - type: map_at_20 value: 64.958 - type: map_at_3 value: 57.767 
- type: map_at_5 value: 61.998 - type: mrr_at_1 value: 63.3495145631068 - type: mrr_at_10 value: 70.21146363075978 - type: mrr_at_100 value: 70.82810974202124 - type: mrr_at_1000 value: 70.83816803303915 - type: mrr_at_20 value: 70.60140248428802 - type: mrr_at_3 value: 68.66909385113267 - type: mrr_at_5 value: 69.56108414239482 - type: nauc_map_at_1000_diff1 value: 51.649897072831465 - type: nauc_map_at_1000_max value: 38.25222728655331 - type: nauc_map_at_1000_std value: -39.10327919949334 - type: nauc_map_at_100_diff1 value: 51.644205886401465 - type: nauc_map_at_100_max value: 38.23611154355255 - type: nauc_map_at_100_std value: -39.1677073977285 - type: nauc_map_at_10_diff1 value: 51.81444145636039 - type: nauc_map_at_10_max value: 38.03382104326485 - type: nauc_map_at_10_std value: -38.999395639812015 - type: nauc_map_at_1_diff1 value: 59.785298201044704 - type: nauc_map_at_1_max value: 23.273537759937785 - type: nauc_map_at_1_std value: -17.838712689290194 - type: nauc_map_at_20_diff1 value: 51.680208795601004 - type: nauc_map_at_20_max value: 38.23334583518634 - type: nauc_map_at_20_std value: -39.24344495939061 - type: nauc_map_at_3_diff1 value: 52.180913298194056 - type: nauc_map_at_3_max value: 33.45482478000481 - type: nauc_map_at_3_std value: -31.682911030586297 - type: nauc_map_at_5_diff1 value: 50.804900676175436 - type: nauc_map_at_5_max value: 37.68924816012326 - type: nauc_map_at_5_std value: -36.85016896616712 - type: nauc_mrr_at_1000_diff1 value: 56.371477471577535 - type: nauc_mrr_at_1000_max value: 42.773877962050086 - type: nauc_mrr_at_1000_std value: -40.41765081873682 - type: nauc_mrr_at_100_diff1 value: 56.3619751528192 - type: nauc_mrr_at_100_max value: 42.76298794859916 - type: nauc_mrr_at_100_std value: -40.44070582448831 - type: nauc_mrr_at_10_diff1 value: 56.33810523477712 - type: nauc_mrr_at_10_max value: 42.76591937795783 - type: nauc_mrr_at_10_std value: -40.69339583030244 - type: nauc_mrr_at_1_diff1 value: 58.90399906884378 - type: nauc_mrr_at_1_max value: 43.38806571165292 - type: nauc_mrr_at_1_std value: -38.224015285584 - type: nauc_mrr_at_20_diff1 value: 56.32629070537032 - type: nauc_mrr_at_20_max value: 42.79615263472604 - type: nauc_mrr_at_20_std value: -40.496777397603076 - type: nauc_mrr_at_3_diff1 value: 55.96989454480743 - type: nauc_mrr_at_3_max value: 42.49832220744744 - type: nauc_mrr_at_3_std value: -39.883799467132384 - type: nauc_mrr_at_5_diff1 value: 56.003080766475755 - type: nauc_mrr_at_5_max value: 42.73308051011805 - type: nauc_mrr_at_5_std value: -39.87179511166683 - type: nauc_ndcg_at_1000_diff1 value: 52.49054229225255 - type: nauc_ndcg_at_1000_max value: 39.61644750719859 - type: nauc_ndcg_at_1000_std value: -40.89845763194674 - type: nauc_ndcg_at_100_diff1 value: 52.33511250864434 - type: nauc_ndcg_at_100_max value: 39.25530146124452 - type: nauc_ndcg_at_100_std value: -41.92444498004374 - type: nauc_ndcg_at_10_diff1 value: 52.62031505931842 - type: nauc_ndcg_at_10_max value: 38.667195545396766 - type: nauc_ndcg_at_10_std value: -42.59503924641507 - type: nauc_ndcg_at_1_diff1 value: 58.90399906884378 - type: nauc_ndcg_at_1_max value: 43.38806571165292 - type: nauc_ndcg_at_1_std value: -38.224015285584 - type: nauc_ndcg_at_20_diff1 value: 52.15061629809436 - type: nauc_ndcg_at_20_max value: 39.09332400054708 - type: nauc_ndcg_at_20_std value: -42.80018671618001 - type: nauc_ndcg_at_3_diff1 value: 51.04210728138207 - type: nauc_ndcg_at_3_max value: 38.19034802567046 - type: nauc_ndcg_at_3_std value: -38.179821090765216 - type: 
nauc_ndcg_at_5_diff1 value: 51.04399574045204 - type: nauc_ndcg_at_5_max value: 38.42492210204548 - type: nauc_ndcg_at_5_std value: -38.868073241617715 - type: nauc_precision_at_1000_diff1 value: -25.151369907213734 - type: nauc_precision_at_1000_max value: 9.012549147054989 - type: nauc_precision_at_1000_std value: -9.319786589947698 - type: nauc_precision_at_100_diff1 value: -23.20945211843088 - type: nauc_precision_at_100_max value: 9.860701593969862 - type: nauc_precision_at_100_std value: -13.073877818347231 - type: nauc_precision_at_10_diff1 value: -6.970781124246847 - type: nauc_precision_at_10_max value: 19.392675322254487 - type: nauc_precision_at_10_std value: -26.74943490717657 - type: nauc_precision_at_1_diff1 value: 58.90399906884378 - type: nauc_precision_at_1_max value: 43.38806571165292 - type: nauc_precision_at_1_std value: -38.224015285584 - type: nauc_precision_at_20_diff1 value: -13.046456108081102 - type: nauc_precision_at_20_max value: 15.69439950383875 - type: nauc_precision_at_20_std value: -23.836004512018093 - type: nauc_precision_at_3_diff1 value: 3.5444232965528846 - type: nauc_precision_at_3_max value: 27.08858445453865 - type: nauc_precision_at_3_std value: -29.12757283665593 - type: nauc_precision_at_5_diff1 value: -3.6853986353320267 - type: nauc_precision_at_5_max value: 24.32059689571271 - type: nauc_precision_at_5_std value: -27.46188072134163 - type: nauc_recall_at_1000_diff1 value: 86.93515141907919 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 100.0 - type: nauc_recall_at_100_diff1 value: 39.7052887613879 - type: nauc_recall_at_100_max value: 18.40943977796887 - type: nauc_recall_at_100_std value: -88.74014854144974 - type: nauc_recall_at_10_diff1 value: 48.85342500870892 - type: nauc_recall_at_10_max value: 32.69617204234419 - type: nauc_recall_at_10_std value: -51.9937231860804 - type: nauc_recall_at_1_diff1 value: 59.785298201044704 - type: nauc_recall_at_1_max value: 23.273537759937785 - type: nauc_recall_at_1_std value: -17.838712689290194 - type: nauc_recall_at_20_diff1 value: 45.40839773314378 - type: nauc_recall_at_20_max value: 33.02458321493215 - type: nauc_recall_at_20_std value: -55.97800739448166 - type: nauc_recall_at_3_diff1 value: 47.05565693416531 - type: nauc_recall_at_3_max value: 28.743850400344297 - type: nauc_recall_at_3_std value: -32.436470486397475 - type: nauc_recall_at_5_diff1 value: 45.30223758669577 - type: nauc_recall_at_5_max value: 33.6567274747059 - type: nauc_recall_at_5_std value: -39.946712017948514 - type: ndcg_at_1 value: 63.349999999999994 - type: ndcg_at_10 value: 69.446 - type: ndcg_at_100 value: 74.439 - type: ndcg_at_1000 value: 74.834 - type: ndcg_at_20 value: 71.763 - type: ndcg_at_3 value: 64.752 - type: ndcg_at_5 value: 66.316 - type: precision_at_1 value: 63.349999999999994 - type: precision_at_10 value: 16.286 - type: precision_at_100 value: 2.024 - type: precision_at_1000 value: 0.207 - type: precision_at_20 value: 8.908000000000001 - type: precision_at_3 value: 40.655 - type: precision_at_5 value: 28.859 - type: recall_at_1 value: 39.995999999999995 - type: recall_at_10 value: 78.107 - type: recall_at_100 value: 97.538 - type: recall_at_1000 value: 99.96000000000001 - type: recall_at_20 value: 85.72 - type: recall_at_3 value: 63.291 - type: recall_at_5 value: 70.625 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (spa-eng) type: jinaai/xpqa config: spa-eng split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 
68.258 - type: map_at_1 value: 33.06 - type: map_at_10 value: 61.590999999999994 - type: map_at_100 value: 63.341 - type: map_at_1000 value: 63.385999999999996 - type: map_at_20 value: 62.77700000000001 - type: map_at_3 value: 52.547999999999995 - type: map_at_5 value: 58.824 - type: mrr_at_1 value: 63.80832282471627 - type: mrr_at_10 value: 70.76848015372607 - type: mrr_at_100 value: 71.33996704518061 - type: mrr_at_1000 value: 71.35368444388072 - type: mrr_at_20 value: 71.18191741103522 - type: mrr_at_3 value: 68.83144178226142 - type: mrr_at_5 value: 69.88440521227405 - type: nauc_map_at_1000_diff1 value: 41.59255746310511 - type: nauc_map_at_1000_max value: 42.064075373358065 - type: nauc_map_at_1000_std value: -25.130730194381723 - type: nauc_map_at_100_diff1 value: 41.56447648820406 - type: nauc_map_at_100_max value: 42.06711634651607 - type: nauc_map_at_100_std value: -25.14871585556968 - type: nauc_map_at_10_diff1 value: 41.28968387107058 - type: nauc_map_at_10_max value: 41.511538272139774 - type: nauc_map_at_10_std value: -25.99906440164276 - type: nauc_map_at_1_diff1 value: 51.09859596320021 - type: nauc_map_at_1_max value: 12.406789321338222 - type: nauc_map_at_1_std value: -18.227486548655076 - type: nauc_map_at_20_diff1 value: 41.39469672947315 - type: nauc_map_at_20_max value: 41.98309315808902 - type: nauc_map_at_20_std value: -25.44704720985219 - type: nauc_map_at_3_diff1 value: 43.16164995512842 - type: nauc_map_at_3_max value: 30.935400935562818 - type: nauc_map_at_3_std value: -23.53095555148866 - type: nauc_map_at_5_diff1 value: 41.23474352142375 - type: nauc_map_at_5_max value: 39.03088859147947 - type: nauc_map_at_5_std value: -26.046526443708366 - type: nauc_mrr_at_1000_diff1 value: 51.79649678213789 - type: nauc_mrr_at_1000_max value: 50.50340748045259 - type: nauc_mrr_at_1000_std value: -24.777183703493407 - type: nauc_mrr_at_100_diff1 value: 51.78609028166551 - type: nauc_mrr_at_100_max value: 50.51732896833555 - type: nauc_mrr_at_100_std value: -24.760054686874717 - type: nauc_mrr_at_10_diff1 value: 51.705268395036995 - type: nauc_mrr_at_10_max value: 50.35818415293149 - type: nauc_mrr_at_10_std value: -25.170367120250404 - type: nauc_mrr_at_1_diff1 value: 53.91475115581825 - type: nauc_mrr_at_1_max value: 49.122529616282016 - type: nauc_mrr_at_1_std value: -22.377647552937155 - type: nauc_mrr_at_20_diff1 value: 51.778984221197774 - type: nauc_mrr_at_20_max value: 50.5070957827813 - type: nauc_mrr_at_20_std value: -24.908935023607285 - type: nauc_mrr_at_3_diff1 value: 51.82683773090423 - type: nauc_mrr_at_3_max value: 50.77993196421369 - type: nauc_mrr_at_3_std value: -24.3925832021831 - type: nauc_mrr_at_5_diff1 value: 51.722232683543034 - type: nauc_mrr_at_5_max value: 50.334865493961864 - type: nauc_mrr_at_5_std value: -25.513593495703297 - type: nauc_ndcg_at_1000_diff1 value: 44.21851582991263 - type: nauc_ndcg_at_1000_max value: 45.73539068637836 - type: nauc_ndcg_at_1000_std value: -24.716522467580397 - type: nauc_ndcg_at_100_diff1 value: 43.8002401615357 - type: nauc_ndcg_at_100_max value: 45.801409410061915 - type: nauc_ndcg_at_100_std value: -24.73171742499903 - type: nauc_ndcg_at_10_diff1 value: 42.540922778755885 - type: nauc_ndcg_at_10_max value: 44.348836943874595 - type: nauc_ndcg_at_10_std value: -28.05403666494785 - type: nauc_ndcg_at_1_diff1 value: 53.91475115581825 - type: nauc_ndcg_at_1_max value: 49.122529616282016 - type: nauc_ndcg_at_1_std value: -22.377647552937155 - type: nauc_ndcg_at_20_diff1 value: 43.10347921163421 - type: 
nauc_ndcg_at_20_max value: 45.53253270265022 - type: nauc_ndcg_at_20_std value: -26.63902791862846 - type: nauc_ndcg_at_3_diff1 value: 42.41720274782384 - type: nauc_ndcg_at_3_max value: 42.91778219334943 - type: nauc_ndcg_at_3_std value: -24.793252033594076 - type: nauc_ndcg_at_5_diff1 value: 42.51515034945093 - type: nauc_ndcg_at_5_max value: 41.62080576508792 - type: nauc_ndcg_at_5_std value: -28.209669314955065 - type: nauc_precision_at_1000_diff1 value: -14.89794075433148 - type: nauc_precision_at_1000_max value: 27.85387929356412 - type: nauc_precision_at_1000_std value: 10.728618597190849 - type: nauc_precision_at_100_diff1 value: -13.075270046295856 - type: nauc_precision_at_100_max value: 29.77208946756632 - type: nauc_precision_at_100_std value: 8.491662697326039 - type: nauc_precision_at_10_diff1 value: -4.0826025188781205 - type: nauc_precision_at_10_max value: 39.04278085180075 - type: nauc_precision_at_10_std value: -5.925408651372333 - type: nauc_precision_at_1_diff1 value: 53.91475115581825 - type: nauc_precision_at_1_max value: 49.122529616282016 - type: nauc_precision_at_1_std value: -22.377647552937155 - type: nauc_precision_at_20_diff1 value: -7.93186440645135 - type: nauc_precision_at_20_max value: 35.81281308891365 - type: nauc_precision_at_20_std value: 0.1241277857515697 - type: nauc_precision_at_3_diff1 value: 7.563562511484409 - type: nauc_precision_at_3_max value: 43.43738862378524 - type: nauc_precision_at_3_std value: -11.958059731912615 - type: nauc_precision_at_5_diff1 value: -0.1801152449011624 - type: nauc_precision_at_5_max value: 41.32486715619513 - type: nauc_precision_at_5_std value: -10.088699021919552 - type: nauc_recall_at_1000_diff1 value: 86.93359696819986 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 72.21843645604022 - type: nauc_recall_at_100_diff1 value: 29.86050842714198 - type: nauc_recall_at_100_max value: 48.106658251136245 - type: nauc_recall_at_100_std value: -14.981886214880035 - type: nauc_recall_at_10_diff1 value: 33.67119240737528 - type: nauc_recall_at_10_max value: 39.271984859561414 - type: nauc_recall_at_10_std value: -35.6434883839217 - type: nauc_recall_at_1_diff1 value: 51.09859596320021 - type: nauc_recall_at_1_max value: 12.406789321338222 - type: nauc_recall_at_1_std value: -18.227486548655076 - type: nauc_recall_at_20_diff1 value: 33.211979983240724 - type: nauc_recall_at_20_max value: 43.47676074743184 - type: nauc_recall_at_20_std value: -33.88107138395349 - type: nauc_recall_at_3_diff1 value: 39.22513750146998 - type: nauc_recall_at_3_max value: 27.066674083840166 - type: nauc_recall_at_3_std value: -26.963282529629893 - type: nauc_recall_at_5_diff1 value: 36.53718917129459 - type: nauc_recall_at_5_max value: 35.40550013169686 - type: nauc_recall_at_5_std value: -34.209159379410806 - type: ndcg_at_1 value: 63.808 - type: ndcg_at_10 value: 68.258 - type: ndcg_at_100 value: 73.38799999999999 - type: ndcg_at_1000 value: 74.03 - type: ndcg_at_20 value: 70.968 - type: ndcg_at_3 value: 62.33 - type: ndcg_at_5 value: 64.096 - type: precision_at_1 value: 63.808 - type: precision_at_10 value: 19.243 - type: precision_at_100 value: 2.367 - type: precision_at_1000 value: 0.245 - type: precision_at_20 value: 10.599 - type: precision_at_3 value: 44.515 - type: precision_at_5 value: 33.467999999999996 - type: recall_at_1 value: 33.06 - type: recall_at_10 value: 77.423 - type: recall_at_100 value: 95.923 - type: recall_at_1000 value: 99.874 - type: recall_at_20 value: 85.782 - type: recall_at_3 
value: 57.098000000000006 - type: recall_at_5 value: 67.472 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (spa-spa) type: jinaai/xpqa config: spa-spa split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 72.004 - type: map_at_1 value: 36.248000000000005 - type: map_at_10 value: 65.679 - type: map_at_100 value: 67.22399999999999 - type: map_at_1000 value: 67.264 - type: map_at_20 value: 66.705 - type: map_at_3 value: 56.455 - type: map_at_5 value: 62.997 - type: mrr_at_1 value: 67.71752837326608 - type: mrr_at_10 value: 74.59782021257429 - type: mrr_at_100 value: 75.0640960767943 - type: mrr_at_1000 value: 75.07324799466076 - type: mrr_at_20 value: 74.9323963386884 - type: mrr_at_3 value: 72.95081967213115 - type: mrr_at_5 value: 73.82723833543506 - type: nauc_map_at_1000_diff1 value: 43.111810717567714 - type: nauc_map_at_1000_max value: 44.835247208972476 - type: nauc_map_at_1000_std value: -32.798405973931985 - type: nauc_map_at_100_diff1 value: 43.090223482932764 - type: nauc_map_at_100_max value: 44.83392441557943 - type: nauc_map_at_100_std value: -32.81149166676563 - type: nauc_map_at_10_diff1 value: 42.87841934951979 - type: nauc_map_at_10_max value: 43.9838653389494 - type: nauc_map_at_10_std value: -33.588084643627084 - type: nauc_map_at_1_diff1 value: 54.509245848379095 - type: nauc_map_at_1_max value: 10.05921648322742 - type: nauc_map_at_1_std value: -24.652326014826762 - type: nauc_map_at_20_diff1 value: 43.07468612984794 - type: nauc_map_at_20_max value: 44.75663122615032 - type: nauc_map_at_20_std value: -33.11788887878321 - type: nauc_map_at_3_diff1 value: 44.63272828938906 - type: nauc_map_at_3_max value: 32.1584369869227 - type: nauc_map_at_3_std value: -30.761662210142944 - type: nauc_map_at_5_diff1 value: 42.77296997803048 - type: nauc_map_at_5_max value: 41.78894616737652 - type: nauc_map_at_5_std value: -33.56459774477362 - type: nauc_mrr_at_1000_diff1 value: 53.097544131833494 - type: nauc_mrr_at_1000_max value: 50.61134979184588 - type: nauc_mrr_at_1000_std value: -35.6221191487669 - type: nauc_mrr_at_100_diff1 value: 53.096609856182106 - type: nauc_mrr_at_100_max value: 50.61951585642645 - type: nauc_mrr_at_100_std value: -35.62396157508327 - type: nauc_mrr_at_10_diff1 value: 52.771534471912304 - type: nauc_mrr_at_10_max value: 50.430863224435726 - type: nauc_mrr_at_10_std value: -36.027992076620365 - type: nauc_mrr_at_1_diff1 value: 55.05316238884337 - type: nauc_mrr_at_1_max value: 49.461858515275196 - type: nauc_mrr_at_1_std value: -31.87492636319712 - type: nauc_mrr_at_20_diff1 value: 53.083253469629746 - type: nauc_mrr_at_20_max value: 50.62156424256193 - type: nauc_mrr_at_20_std value: -35.879153692447154 - type: nauc_mrr_at_3_diff1 value: 52.98283109188415 - type: nauc_mrr_at_3_max value: 50.83561260429378 - type: nauc_mrr_at_3_std value: -35.30839538038797 - type: nauc_mrr_at_5_diff1 value: 52.93270510879709 - type: nauc_mrr_at_5_max value: 50.54595596761199 - type: nauc_mrr_at_5_std value: -35.84059376434395 - type: nauc_ndcg_at_1000_diff1 value: 45.343685089209416 - type: nauc_ndcg_at_1000_max value: 47.801141576669465 - type: nauc_ndcg_at_1000_std value: -33.512958862879195 - type: nauc_ndcg_at_100_diff1 value: 45.255590461515894 - type: nauc_ndcg_at_100_max value: 47.99240031881967 - type: nauc_ndcg_at_100_std value: -33.614465006695205 - type: nauc_ndcg_at_10_diff1 value: 43.93472511731019 - type: nauc_ndcg_at_10_max value: 45.92599752897053 - type: nauc_ndcg_at_10_std value: -36.43629114491574 - 
type: nauc_ndcg_at_1_diff1 value: 55.05316238884337 - type: nauc_ndcg_at_1_max value: 49.461858515275196 - type: nauc_ndcg_at_1_std value: -31.87492636319712 - type: nauc_ndcg_at_20_diff1 value: 44.93534591273201 - type: nauc_ndcg_at_20_max value: 47.55153940713458 - type: nauc_ndcg_at_20_std value: -35.56392448745206 - type: nauc_ndcg_at_3_diff1 value: 43.17916122133396 - type: nauc_ndcg_at_3_max value: 45.603634205103276 - type: nauc_ndcg_at_3_std value: -32.473227507181214 - type: nauc_ndcg_at_5_diff1 value: 44.10242961669216 - type: nauc_ndcg_at_5_max value: 43.61666669031808 - type: nauc_ndcg_at_5_std value: -35.98808321497782 - type: nauc_precision_at_1000_diff1 value: -23.264714449991146 - type: nauc_precision_at_1000_max value: 28.505729576735465 - type: nauc_precision_at_1000_std value: 11.987379232920926 - type: nauc_precision_at_100_diff1 value: -21.156119174614627 - type: nauc_precision_at_100_max value: 30.711646221646255 - type: nauc_precision_at_100_std value: 9.650486536340322 - type: nauc_precision_at_10_diff1 value: -10.98001328477502 - type: nauc_precision_at_10_max value: 39.25638073760597 - type: nauc_precision_at_10_std value: -4.3456859257488 - type: nauc_precision_at_1_diff1 value: 55.05316238884337 - type: nauc_precision_at_1_max value: 49.461858515275196 - type: nauc_precision_at_1_std value: -31.87492636319712 - type: nauc_precision_at_20_diff1 value: -14.97565390664424 - type: nauc_precision_at_20_max value: 36.383835295942355 - type: nauc_precision_at_20_std value: 1.525158880381114 - type: nauc_precision_at_3_diff1 value: 1.0448345623903483 - type: nauc_precision_at_3_max value: 45.69772060667404 - type: nauc_precision_at_3_std value: -13.002685018948293 - type: nauc_precision_at_5_diff1 value: -5.434185597628904 - type: nauc_precision_at_5_max value: 42.99162431099203 - type: nauc_precision_at_5_std value: -9.789308817624534 - type: nauc_recall_at_1000_diff1 value: 12.309303236094845 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 86.93359696819986 - type: nauc_recall_at_100_diff1 value: 39.093544920901415 - type: nauc_recall_at_100_max value: 55.62814395062938 - type: nauc_recall_at_100_std value: -22.6919033301514 - type: nauc_recall_at_10_diff1 value: 35.50100141633622 - type: nauc_recall_at_10_max value: 39.25750019586647 - type: nauc_recall_at_10_std value: -43.01273078031791 - type: nauc_recall_at_1_diff1 value: 54.509245848379095 - type: nauc_recall_at_1_max value: 10.05921648322742 - type: nauc_recall_at_1_std value: -24.652326014826762 - type: nauc_recall_at_20_diff1 value: 38.1281707132327 - type: nauc_recall_at_20_max value: 43.97950642900301 - type: nauc_recall_at_20_std value: -44.049952771307574 - type: nauc_recall_at_3_diff1 value: 40.01986938242728 - type: nauc_recall_at_3_max value: 27.517114421061173 - type: nauc_recall_at_3_std value: -32.99056780232045 - type: nauc_recall_at_5_diff1 value: 38.52035606499483 - type: nauc_recall_at_5_max value: 37.05834604678859 - type: nauc_recall_at_5_std value: -39.86196378897912 - type: ndcg_at_1 value: 67.718 - type: ndcg_at_10 value: 72.004 - type: ndcg_at_100 value: 76.554 - type: ndcg_at_1000 value: 77.07300000000001 - type: ndcg_at_20 value: 74.37899999999999 - type: ndcg_at_3 value: 66.379 - type: ndcg_at_5 value: 68.082 - type: precision_at_1 value: 67.718 - type: precision_at_10 value: 19.849 - type: precision_at_100 value: 2.3800000000000003 - type: precision_at_1000 value: 0.245 - type: precision_at_20 value: 10.813 - type: precision_at_3 value: 46.574 - 
type: precision_at_5 value: 34.83 - type: recall_at_1 value: 36.248000000000005 - type: recall_at_10 value: 80.252 - type: recall_at_100 value: 96.73 - type: recall_at_1000 value: 99.874 - type: recall_at_20 value: 87.703 - type: recall_at_3 value: 60.815 - type: recall_at_5 value: 71.16 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fra-eng) type: jinaai/xpqa config: fra-eng split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 73.729 - type: map_at_1 value: 43.964999999999996 - type: map_at_10 value: 67.803 - type: map_at_100 value: 69.188 - type: map_at_1000 value: 69.21000000000001 - type: map_at_20 value: 68.747 - type: map_at_3 value: 60.972 - type: map_at_5 value: 65.39399999999999 - type: mrr_at_1 value: 68.4913217623498 - type: mrr_at_10 value: 75.2600822260368 - type: mrr_at_100 value: 75.6599169808848 - type: mrr_at_1000 value: 75.66720883727534 - type: mrr_at_20 value: 75.52375865860405 - type: mrr_at_3 value: 73.54250111259452 - type: mrr_at_5 value: 74.51713395638626 - type: nauc_map_at_1000_diff1 value: 46.81533703002097 - type: nauc_map_at_1000_max value: 46.30794757084772 - type: nauc_map_at_1000_std value: -14.953470500312335 - type: nauc_map_at_100_diff1 value: 46.82464740277745 - type: nauc_map_at_100_max value: 46.32852879948254 - type: nauc_map_at_100_std value: -14.950035098066172 - type: nauc_map_at_10_diff1 value: 46.31406143369831 - type: nauc_map_at_10_max value: 45.337593270786634 - type: nauc_map_at_10_std value: -16.011789445907876 - type: nauc_map_at_1_diff1 value: 57.097134715065835 - type: nauc_map_at_1_max value: 21.93931500350721 - type: nauc_map_at_1_std value: -15.134457251301637 - type: nauc_map_at_20_diff1 value: 46.47030891134173 - type: nauc_map_at_20_max value: 46.29169960276292 - type: nauc_map_at_20_std value: -15.14241106541829 - type: nauc_map_at_3_diff1 value: 50.27064228648596 - type: nauc_map_at_3_max value: 39.43058773971639 - type: nauc_map_at_3_std value: -16.16545993089126 - type: nauc_map_at_5_diff1 value: 46.974867679747426 - type: nauc_map_at_5_max value: 44.31091104855002 - type: nauc_map_at_5_std value: -16.50175337658926 - type: nauc_mrr_at_1000_diff1 value: 55.20294005110399 - type: nauc_mrr_at_1000_max value: 51.947725719119966 - type: nauc_mrr_at_1000_std value: -14.586112939597232 - type: nauc_mrr_at_100_diff1 value: 55.20426251109304 - type: nauc_mrr_at_100_max value: 51.95648725402534 - type: nauc_mrr_at_100_std value: -14.579769236539143 - type: nauc_mrr_at_10_diff1 value: 54.93870506205835 - type: nauc_mrr_at_10_max value: 51.89312772900638 - type: nauc_mrr_at_10_std value: -14.692635010092939 - type: nauc_mrr_at_1_diff1 value: 56.54945935175171 - type: nauc_mrr_at_1_max value: 51.28134504197991 - type: nauc_mrr_at_1_std value: -12.909042186563061 - type: nauc_mrr_at_20_diff1 value: 55.10667018041461 - type: nauc_mrr_at_20_max value: 51.98236870783707 - type: nauc_mrr_at_20_std value: -14.599377575198025 - type: nauc_mrr_at_3_diff1 value: 55.67124311746892 - type: nauc_mrr_at_3_max value: 51.77903236246767 - type: nauc_mrr_at_3_std value: -14.94452633860763 - type: nauc_mrr_at_5_diff1 value: 55.42849172366371 - type: nauc_mrr_at_5_max value: 51.76902965753959 - type: nauc_mrr_at_5_std value: -15.357993534727072 - type: nauc_ndcg_at_1000_diff1 value: 48.736844959280326 - type: nauc_ndcg_at_1000_max value: 48.92891159935398 - type: nauc_ndcg_at_1000_std value: -13.983968675611056 - type: nauc_ndcg_at_100_diff1 value: 48.73859328503975 - type: nauc_ndcg_at_100_max value: 
49.31867149556439 - type: nauc_ndcg_at_100_std value: -13.72387564912742 - type: nauc_ndcg_at_10_diff1 value: 46.50313862975287 - type: nauc_ndcg_at_10_max value: 47.13599793554596 - type: nauc_ndcg_at_10_std value: -16.317919977400113 - type: nauc_ndcg_at_1_diff1 value: 56.54945935175171 - type: nauc_ndcg_at_1_max value: 51.28134504197991 - type: nauc_ndcg_at_1_std value: -12.909042186563061 - type: nauc_ndcg_at_20_diff1 value: 47.01727117133912 - type: nauc_ndcg_at_20_max value: 49.121366036709105 - type: nauc_ndcg_at_20_std value: -14.411078677638775 - type: nauc_ndcg_at_3_diff1 value: 49.229581145458276 - type: nauc_ndcg_at_3_max value: 47.427609717032 - type: nauc_ndcg_at_3_std value: -16.52066627289908 - type: nauc_ndcg_at_5_diff1 value: 48.0152514127505 - type: nauc_ndcg_at_5_max value: 46.12152407850816 - type: nauc_ndcg_at_5_std value: -17.613295491954656 - type: nauc_precision_at_1000_diff1 value: -25.959006032642463 - type: nauc_precision_at_1000_max value: 12.81002362947137 - type: nauc_precision_at_1000_std value: 12.575312826061513 - type: nauc_precision_at_100_diff1 value: -24.35413527283394 - type: nauc_precision_at_100_max value: 14.878359236477303 - type: nauc_precision_at_100_std value: 12.384426050018428 - type: nauc_precision_at_10_diff1 value: -17.93220761770618 - type: nauc_precision_at_10_max value: 23.523485811847294 - type: nauc_precision_at_10_std value: 4.424456968716939 - type: nauc_precision_at_1_diff1 value: 56.54945935175171 - type: nauc_precision_at_1_max value: 51.28134504197991 - type: nauc_precision_at_1_std value: -12.909042186563061 - type: nauc_precision_at_20_diff1 value: -21.776871398686936 - type: nauc_precision_at_20_max value: 21.18436338264366 - type: nauc_precision_at_20_std value: 9.937274986573321 - type: nauc_precision_at_3_diff1 value: -1.2411845580934435 - type: nauc_precision_at_3_max value: 34.962281941875 - type: nauc_precision_at_3_std value: -2.447892908501237 - type: nauc_precision_at_5_diff1 value: -11.134164534114085 - type: nauc_precision_at_5_max value: 30.22079740070525 - type: nauc_precision_at_5_std value: -0.24232594421765946 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 43.3647412452869 - type: nauc_recall_at_100_max value: 63.50094950500327 - type: nauc_recall_at_100_std value: 2.3911909633714044 - type: nauc_recall_at_10_diff1 value: 33.993445071666855 - type: nauc_recall_at_10_max value: 41.38694129134144 - type: nauc_recall_at_10_std value: -19.308698266099096 - type: nauc_recall_at_1_diff1 value: 57.097134715065835 - type: nauc_recall_at_1_max value: 21.93931500350721 - type: nauc_recall_at_1_std value: -15.134457251301637 - type: nauc_recall_at_20_diff1 value: 32.03888531880772 - type: nauc_recall_at_20_max value: 49.660787482562085 - type: nauc_recall_at_20_std value: -12.641456758778382 - type: nauc_recall_at_3_diff1 value: 47.94527082900579 - type: nauc_recall_at_3_max value: 36.51733131437679 - type: nauc_recall_at_3_std value: -18.65511713247495 - type: nauc_recall_at_5_diff1 value: 42.04545772092305 - type: nauc_recall_at_5_max value: 41.21440912972303 - type: nauc_recall_at_5_std value: -21.47386527081128 - type: ndcg_at_1 value: 68.491 - type: ndcg_at_10 value: 73.729 - type: ndcg_at_100 value: 77.684 - type: ndcg_at_1000 value: 78.084 - type: ndcg_at_20 value: 75.795 - type: ndcg_at_3 value: 68.568 - type: ndcg_at_5 value: 70.128 - type: precision_at_1 value: 68.491 - type: 
precision_at_10 value: 16.996 - type: precision_at_100 value: 2.023 - type: precision_at_1000 value: 0.207 - type: precision_at_20 value: 9.246 - type: precision_at_3 value: 41.923 - type: precision_at_5 value: 29.826000000000004 - type: recall_at_1 value: 43.964999999999996 - type: recall_at_10 value: 82.777 - type: recall_at_100 value: 97.287 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 89.183 - type: recall_at_3 value: 65.803 - type: recall_at_5 value: 74.119 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fr) type: jinaai/xpqa config: fra-fra split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 77.581 - type: map_at_1 value: 46.444 - type: map_at_10 value: 72.084 - type: map_at_100 value: 73.175 - type: map_at_1000 value: 73.193 - type: map_at_20 value: 72.77799999999999 - type: map_at_3 value: 65.242 - type: map_at_5 value: 69.926 - type: mrr_at_1 value: 71.82910547396529 - type: mrr_at_10 value: 78.66594612923046 - type: mrr_at_100 value: 78.97334934049613 - type: mrr_at_1000 value: 78.97687021803557 - type: mrr_at_20 value: 78.85701141744282 - type: mrr_at_3 value: 76.96929238985311 - type: mrr_at_5 value: 77.99732977303067 - type: nauc_map_at_1000_diff1 value: 49.090956807097804 - type: nauc_map_at_1000_max value: 52.01095354889508 - type: nauc_map_at_1000_std value: -12.182870421711026 - type: nauc_map_at_100_diff1 value: 49.091664766684566 - type: nauc_map_at_100_max value: 52.017499797253755 - type: nauc_map_at_100_std value: -12.188342487271528 - type: nauc_map_at_10_diff1 value: 48.6619338205362 - type: nauc_map_at_10_max value: 50.93591260329888 - type: nauc_map_at_10_std value: -12.899399261673365 - type: nauc_map_at_1_diff1 value: 61.89699552471587 - type: nauc_map_at_1_max value: 22.387748207421946 - type: nauc_map_at_1_std value: -17.139518194308437 - type: nauc_map_at_20_diff1 value: 48.72828404686453 - type: nauc_map_at_20_max value: 51.781074586075434 - type: nauc_map_at_20_std value: -12.174270605093136 - type: nauc_map_at_3_diff1 value: 53.11509580126934 - type: nauc_map_at_3_max value: 42.1768380145106 - type: nauc_map_at_3_std value: -14.98340833032363 - type: nauc_map_at_5_diff1 value: 49.60521390803235 - type: nauc_map_at_5_max value: 49.80360562029127 - type: nauc_map_at_5_std value: -13.900652140457618 - type: nauc_mrr_at_1000_diff1 value: 58.10782478654255 - type: nauc_mrr_at_1000_max value: 61.31083013535486 - type: nauc_mrr_at_1000_std value: -9.624904298545921 - type: nauc_mrr_at_100_diff1 value: 58.11041683306092 - type: nauc_mrr_at_100_max value: 61.31590199755797 - type: nauc_mrr_at_100_std value: -9.625991053580865 - type: nauc_mrr_at_10_diff1 value: 57.883701815695375 - type: nauc_mrr_at_10_max value: 61.36276126424689 - type: nauc_mrr_at_10_std value: -9.495072468420386 - type: nauc_mrr_at_1_diff1 value: 60.18176977079093 - type: nauc_mrr_at_1_max value: 59.697615236642555 - type: nauc_mrr_at_1_std value: -9.396133077966779 - type: nauc_mrr_at_20_diff1 value: 57.964817434006754 - type: nauc_mrr_at_20_max value: 61.34073539502932 - type: nauc_mrr_at_20_std value: -9.602378876645131 - type: nauc_mrr_at_3_diff1 value: 58.44338049427257 - type: nauc_mrr_at_3_max value: 60.92272989411293 - type: nauc_mrr_at_3_std value: -9.928970439416162 - type: nauc_mrr_at_5_diff1 value: 58.01513016866578 - type: nauc_mrr_at_5_max value: 61.46805302986586 - type: nauc_mrr_at_5_std value: -9.842227002440984 - type: nauc_ndcg_at_1000_diff1 value: 50.99293152828167 - type: nauc_ndcg_at_1000_max 
value: 56.14232784664811 - type: nauc_ndcg_at_1000_std value: -10.529213072410288 - type: nauc_ndcg_at_100_diff1 value: 50.99385944312529 - type: nauc_ndcg_at_100_max value: 56.34825518954588 - type: nauc_ndcg_at_100_std value: -10.398943874846047 - type: nauc_ndcg_at_10_diff1 value: 48.51273364357823 - type: nauc_ndcg_at_10_max value: 53.77871849486298 - type: nauc_ndcg_at_10_std value: -11.82105972112472 - type: nauc_ndcg_at_1_diff1 value: 60.18176977079093 - type: nauc_ndcg_at_1_max value: 59.697615236642555 - type: nauc_ndcg_at_1_std value: -9.396133077966779 - type: nauc_ndcg_at_20_diff1 value: 49.04268319033412 - type: nauc_ndcg_at_20_max value: 55.47011381097071 - type: nauc_ndcg_at_20_std value: -10.486452945493042 - type: nauc_ndcg_at_3_diff1 value: 50.95112745400584 - type: nauc_ndcg_at_3_max value: 53.45473828705577 - type: nauc_ndcg_at_3_std value: -13.420699384045728 - type: nauc_ndcg_at_5_diff1 value: 50.313156212000074 - type: nauc_ndcg_at_5_max value: 52.78539129309866 - type: nauc_ndcg_at_5_std value: -13.586274096509122 - type: nauc_precision_at_1000_diff1 value: -31.13772049254778 - type: nauc_precision_at_1000_max value: 17.2847598361294 - type: nauc_precision_at_1000_std value: 15.497531773816887 - type: nauc_precision_at_100_diff1 value: -29.98812263553739 - type: nauc_precision_at_100_max value: 19.048620003227654 - type: nauc_precision_at_100_std value: 15.38499952171958 - type: nauc_precision_at_10_diff1 value: -25.33028097412579 - type: nauc_precision_at_10_max value: 26.077919168306853 - type: nauc_precision_at_10_std value: 11.35352933466097 - type: nauc_precision_at_1_diff1 value: 60.18176977079093 - type: nauc_precision_at_1_max value: 59.697615236642555 - type: nauc_precision_at_1_std value: -9.396133077966779 - type: nauc_precision_at_20_diff1 value: -28.417606311068905 - type: nauc_precision_at_20_max value: 23.958679828637692 - type: nauc_precision_at_20_std value: 14.442021499194205 - type: nauc_precision_at_3_diff1 value: -8.127396049790482 - type: nauc_precision_at_3_max value: 37.348067982957076 - type: nauc_precision_at_3_std value: 4.747913619596849 - type: nauc_precision_at_5_diff1 value: -16.902418446058395 - type: nauc_precision_at_5_max value: 32.73583852552014 - type: nauc_precision_at_5_std value: 7.031446423850052 - type: nauc_recall_at_1000_diff1 value: -14.485978369112514 - type: nauc_recall_at_1000_max value: 78.59123887333172 - type: nauc_recall_at_1000_std value: 90.7384575424963 - type: nauc_recall_at_100_diff1 value: 41.47842281590715 - type: nauc_recall_at_100_max value: 67.47271545727422 - type: nauc_recall_at_100_std value: 14.555561992253999 - type: nauc_recall_at_10_diff1 value: 33.05308907973924 - type: nauc_recall_at_10_max value: 45.49878918493155 - type: nauc_recall_at_10_std value: -11.560069806810926 - type: nauc_recall_at_1_diff1 value: 61.89699552471587 - type: nauc_recall_at_1_max value: 22.387748207421946 - type: nauc_recall_at_1_std value: -17.139518194308437 - type: nauc_recall_at_20_diff1 value: 31.305721376453754 - type: nauc_recall_at_20_max value: 51.24817763724019 - type: nauc_recall_at_20_std value: -5.0809908162023145 - type: nauc_recall_at_3_diff1 value: 49.27109038342917 - type: nauc_recall_at_3_max value: 37.69188317998447 - type: nauc_recall_at_3_std value: -17.119900758664336 - type: nauc_recall_at_5_diff1 value: 42.74501803377967 - type: nauc_recall_at_5_max value: 46.877008503354844 - type: nauc_recall_at_5_std value: -15.704892082115975 - type: ndcg_at_1 value: 71.829 - type: ndcg_at_10 value: 77.581 - 
type: ndcg_at_100 value: 80.75 - type: ndcg_at_1000 value: 81.026 - type: ndcg_at_20 value: 79.092 - type: ndcg_at_3 value: 72.81 - type: ndcg_at_5 value: 74.22999999999999 - type: precision_at_1 value: 71.829 - type: precision_at_10 value: 17.717 - type: precision_at_100 value: 2.031 - type: precision_at_1000 value: 0.207 - type: precision_at_20 value: 9.399000000000001 - type: precision_at_3 value: 44.458999999999996 - type: precision_at_5 value: 31.535000000000004 - type: recall_at_1 value: 46.444 - type: recall_at_10 value: 86.275 - type: recall_at_100 value: 98.017 - type: recall_at_1000 value: 99.8 - type: recall_at_20 value: 90.935 - type: recall_at_3 value: 70.167 - type: recall_at_5 value: 78.2
---

<br><br>

<p align="center">
<img src="https://aeiljuispo.cloudimg.io/v7/https://cdn-uploads.huggingface.co/production/uploads/603763514de52ff951d89793/AFoybzd5lpBQXEBrQHuTt.png?w=200&h=200&f=face" alt="Finetuner logo: Finetuner helps you to create experiments in order to improve embeddings on search tasks. It accompanies you to deliver the last mile of performance-tuning for neural search applications." width="150px">
</p>

<p align="center">
<b>The embedding model trained by <a href="https://jina.ai/"><b>Jina AI</b></a>.</b>
</p>

<p align="center">
<b>jina-embeddings-v3: Multilingual Embeddings With Task LoRA</b>
</p>

## Quick Start

[Blog](https://jina.ai/news/jina-embeddings-v3-a-frontier-multilingual-embedding-model/#parameter-dimensions) | [Azure](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/jinaai.jina-embeddings-v3) | [AWS SageMaker](https://aws.amazon.com/marketplace/pp/prodview-kdi3xkt62lo32) | [API](https://jina.ai/embeddings)

## Intended Usage & Model Info

`jina-embeddings-v3` is a **multilingual multi-task text embedding model** designed for a variety of NLP applications. Based on the [Jina-XLM-RoBERTa architecture](https://huggingface.co/jinaai/xlm-roberta-flash-implementation), this model supports Rotary Position Embeddings to handle long input sequences up to **8192 tokens**. Additionally, it features 5 LoRA adapters to generate task-specific embeddings efficiently.

### Key Features:
- **Extended Sequence Length:** Supports up to 8192 tokens with RoPE.
- **Task-Specific Embedding:** Customize embeddings through the `task` argument with the following options:
  - `retrieval.query`: Used for query embeddings in asymmetric retrieval tasks
  - `retrieval.passage`: Used for passage embeddings in asymmetric retrieval tasks
  - `separation`: Used for embeddings in clustering and re-ranking applications
  - `classification`: Used for embeddings in classification tasks
  - `text-matching`: Used for embeddings in tasks that quantify similarity between two texts, such as STS or symmetric retrieval tasks
- **Matryoshka Embeddings**: Supports flexible embedding sizes (`32, 64, 128, 256, 512, 768, 1024`), allowing for truncating embeddings to fit your application.

### Supported Languages:
While the foundation model supports 100 languages, we've focused our tuning efforts on the following 30 languages: **Arabic, Bengali, Chinese, Danish, Dutch, English, Finnish, French, Georgian, German, Greek, Hindi, Indonesian, Italian, Japanese, Korean, Latvian, Norwegian, Polish, Portuguese, Romanian, Russian, Slovak, Spanish, Swedish, Thai, Turkish, Ukrainian, Urdu,** and **Vietnamese.**

## Usage

**<details><summary>Apply mean pooling when integrating the model.</summary>**
<p>

### Why Use Mean Pooling?
Mean pooling takes all token embeddings from the model's output and averages them at the sentence or paragraph level. This approach has been shown to produce high-quality sentence embeddings.

We provide an `encode` function that handles this for you automatically. However, if you're working with the model directly, outside of the `encode` function, you'll need to apply mean pooling manually. Here's how you can do it:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel


def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]
    input_mask_expanded = (
        attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    )
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(
        input_mask_expanded.sum(1), min=1e-9
    )


sentences = ["How is the weather today?", "What is the current weather like today?"]

tokenizer = AutoTokenizer.from_pretrained("jinaai/jina-embeddings-v3")
model = AutoModel.from_pretrained("jinaai/jina-embeddings-v3", trust_remote_code=True)

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

task = 'retrieval.query'
task_id = model._adaptation_map[task]
adapter_mask = torch.full((len(sentences),), task_id, dtype=torch.int32)

with torch.no_grad():
    model_output = model(**encoded_input, adapter_mask=adapter_mask)

embeddings = mean_pooling(model_output, encoded_input["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)
```

</p>
</details>

The easiest way to start using `jina-embeddings-v3` is with the [Jina Embedding API](https://jina.ai/embeddings/).

Alternatively, you can use `jina-embeddings-v3` directly via the Transformers package:

```bash
!pip install transformers torch einops
!pip install 'numpy<2'
```

If you run it on a GPU that supports [FlashAttention-2](https://github.com/Dao-AILab/flash-attention), you can additionally install `flash-attn` for faster inference. As of 2024-09-12, FlashAttention-2 supports Ampere, Ada, and Hopper GPUs (e.g., A100, RTX 3090, RTX 4090, H100):

```bash
!pip install flash-attn --no-build-isolation
```

```python
from transformers import AutoModel

# Initialize the model
model = AutoModel.from_pretrained("jinaai/jina-embeddings-v3", trust_remote_code=True)

texts = [
    "Follow the white rabbit.",  # English
    "Sigue al conejo blanco.",  # Spanish
    "Suis le lapin blanc.",  # French
    "跟着白兔走。",  # Chinese
    "اتبع الأرنب الأبيض.",  # Arabic
    "Folge dem weißen Kaninchen.",  # German
]

# When calling the `encode` function, you can choose a `task` based on the use case:
# 'retrieval.query', 'retrieval.passage', 'separation', 'classification', 'text-matching'
# Alternatively, you can choose not to pass a `task`, and no specific LoRA adapter will be used.
embeddings = model.encode(texts, task="text-matching")

# Compute similarities
print(embeddings[0] @ embeddings[1].T)
```
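The `retrieval.query` and `retrieval.passage` adapters are intended to be used together for asymmetric retrieval. The snippet below is a minimal sketch of that pattern using the `encode` API shown above; the query and passage strings are hypothetical placeholders, and it assumes `model` is the instance loaded in the previous example:

```python
import numpy as np

query = "What is the weather like in Berlin today?"
passages = [
    "Berlin will be sunny with highs of 22°C this afternoon.",  # hypothetical corpus
    "The Eiffel Tower is located in Paris.",
]

# Encode the query and the passages with their respective LoRA adapters
query_emb = model.encode([query], task="retrieval.query")
passage_embs = model.encode(passages, task="retrieval.passage")

# As in the example above, similarity is computed as a dot product of the embeddings
scores = query_emb @ passage_embs.T
best = int(np.argmax(scores))
print(f"Best match: {passages[best]} (score: {scores[0, best]:.3f})")
```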
By default, the model supports a maximum sequence length of 8192 tokens. However, if you want to truncate your input texts to a shorter length, you can pass the `max_length` parameter to the `encode` function:

```python
embeddings = model.encode(["Very long ... document"], max_length=2048)
```

In case you want to use **Matryoshka embeddings** and switch to a different dimension, you can adjust it by passing the `truncate_dim` parameter to the `encode` function:

```python
embeddings = model.encode(['Sample text'], truncate_dim=256)
```

The latest version (3.1.0) of [SentenceTransformers](https://github.com/UKPLab/sentence-transformers) also supports `jina-embeddings-v3`:

```bash
!pip install -U sentence-transformers
```

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)
task = "retrieval.query"
embeddings = model.encode(
    ["What is the weather like in Berlin today?"],
    task=task,
    prompt_name=task,
)
```

You can fine-tune `jina-embeddings-v3` using [SentenceTransformerTrainer](https://sbert.net/docs/package_reference/sentence_transformer/trainer.html).
To fine-tune for a specific task, you should set the task before passing the model to the ST Trainer, either during initialization:

```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True, model_kwargs={'default_task': 'classification'})
```

Or afterwards:

```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)
model[0].default_task = 'classification'
```

This way you can fine-tune the LoRA adapter for the chosen task.

However, if you want to fine-tune the entire model, make sure the main parameters are set as trainable when loading the model:

```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True, model_kwargs={'lora_main_params_trainable': True})
```

This will allow fine-tuning the whole model instead of just the LoRA adapters.

**<details><summary>ONNX Inference.</summary>**
<p>

You can use ONNX for efficient inference with `jina-embeddings-v3`:

```python
import onnxruntime
import numpy as np
from transformers import AutoTokenizer, PretrainedConfig

# Load tokenizer and model config
tokenizer = AutoTokenizer.from_pretrained('jinaai/jina-embeddings-v3')
config = PretrainedConfig.from_pretrained('jinaai/jina-embeddings-v3')

# Tokenize input
input_text = tokenizer('sample text', return_tensors='np')

# ONNX session
model_path = 'jina-embeddings-v3/onnx/model.onnx'
session = onnxruntime.InferenceSession(model_path)

# Prepare inputs for ONNX model
task_type = 'text-matching'
task_id = np.array(config.lora_adaptations.index(task_type), dtype=np.int64)
inputs = {
    'input_ids': input_text['input_ids'],
    'attention_mask': input_text['attention_mask'],
    'task_id': task_id
}

# Run model
outputs = session.run(None, inputs)[0]

# Apply mean pooling to 'outputs' to get a single representation of each text
input_mask_expanded = np.expand_dims(input_text['attention_mask'], axis=-1).astype(np.float32)
embeddings = np.sum(outputs * input_mask_expanded, axis=1) / np.clip(
    input_mask_expanded.sum(axis=1), a_min=1e-9, a_max=None
)
# L2-normalize so that dot products correspond to cosine similarities
embeddings = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
```
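As a quick sanity check, you can compare the ONNX embedding against the output of the Transformers-based `encode` shown earlier. This is an illustrative sketch under the assumption that both models are loaded as in the examples above; the two embeddings should agree up to small numerical differences:

```python
# Hypothetical sanity check: cosine similarity between the ONNX and PyTorch embeddings
from transformers import AutoModel

pt_model = AutoModel.from_pretrained('jinaai/jina-embeddings-v3', trust_remote_code=True)
pt_emb = pt_model.encode(['sample text'], task='text-matching')

cos = float(embeddings[0] @ pt_emb[0]) / float(
    np.linalg.norm(embeddings[0]) * np.linalg.norm(pt_emb[0])
)
print(f"ONNX vs. PyTorch cosine similarity: {cos:.4f}")  # expected to be close to 1.0
```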
</p>
</details>

## Contact

Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.

## License

`jina-embeddings-v3` is listed on AWS & Azure. If you need to use it beyond those platforms or on-premises within your company, note that the model is licensed under CC BY-NC 4.0. For commercial usage inquiries, feel free to [contact us](https://jina.ai/contact-sales/).

## Citation

If you find `jina-embeddings-v3` useful in your research, please cite the following paper:

```bibtex
@misc{sturua2024jinaembeddingsv3multilingualembeddingstask,
      title={jina-embeddings-v3: Multilingual Embeddings With Task LoRA},
      author={Saba Sturua and Isabelle Mohr and Mohammad Kalim Akram and Michael Günther and Bo Wang and Markus Krimmel and Feng Wang and Georgios Mastrapas and Andreas Koukounas and Nan Wang and Han Xiao},
      year={2024},
      eprint={2409.10173},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2409.10173},
}
```
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
SepKeyPro/bge-base-en-trivia-anchor-positive
SepKeyPro
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:9000", "loss:MultipleNegativesRankingLoss", "en", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:BAAI/bge-base-en-v1.5", "base_model:finetune:BAAI/bge-base-en-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,719
1,719
9
0
--- base_model: BAAI/bge-base-en-v1.5 datasets: [] language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 - dot_accuracy@1 - dot_accuracy@3 - dot_accuracy@5 - dot_accuracy@10 - dot_precision@1 - dot_precision@3 - dot_precision@5 - dot_precision@10 - dot_recall@1 - dot_recall@3 - dot_recall@5 - dot_recall@10 - dot_ndcg@10 - dot_mrr@10 - dot_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:9000 - loss:MultipleNegativesRankingLoss widget: - source_sentence: Globe, Omaha Fiesole and Chianti are all varieties of which vegetable? sentences: - What is the Rugby Union Six Nations tournament? - CBBC Newsround What is the Rugby Union Six Nations tournament? 11 February 2015 Image copyright Getty Images Check out this guide to find out all you need to know about the Six Nations. Rugby Union Six Nations tournament The Six Nations is a rugby union tournament played every year between the top countries in Europe. The six countries who take part are England, Scotland, Wales, Ireland, France and Italy. For the first 90 years of the championships there were only five countries involved, but Italy were invited to take part in 2000. Rugby Union Rugby Union is played by teams of 15 players, with each team made up of eight forwards and seven backs. Even at the top level it used to be amateur; that is, played for fun by people who had other jobs too, but now the top players are all professionals. The biggest competition in Union is the World Cup, played every four years, but the most famous one in this country is the Six Nations championship. The Grand Slam If a team wins all five of its matches it is called a Grand Slam, but to win a Grand Slam is very hard. England won a Grand Slam in 2003, but only after losing their final match in the three seasons before. The Triple Crown The Triple Crown is a special prize that only the four home unions are able to win. Image copyright PA Image caption A team can only win the triple crown if they beat all three of the other home unions Rugby facts The sport gets its name from the place where it was invented, Rugby School in Warwickshire, England. In 1816 a pupil called William Webb Ellis got a bit bored during a match of football and decided that picking up the ball would make things more interesting. Although the game has come a long way since, even splitting into two codes; Rugby Union and Rugby League, that's where it started. One of the most important rules of the sport is that the ball can only be passed backwards. - 'Baked Artichoke Recipe Baked Artichoke Recipe written by Heather Restrepo So there are a variety of types when it comes to artichokes (Red-Babyanzio, Big-heart, Siena, Mercury, Omaha, Fiesole, Chianti, etc.) But perhaps the most common is the Classic Green Globe artichoke. This is probably the type of artichoke you will find at your local grocer. Having moved to Hawaii as a teen, that is when I was first introduced to this intimidating looking veggie. I would never have predicted that I could ever encompass the foodie-love-affair that I now have with them! 
The Classic Green Globe artichoke is by far my favorite because of its’ buttery-tasting heart and bottom. There is also a good amount of meat within the petals – SCORE if you ask me! Many people enjoy perfectly grilled, baked, or steamed artichokes at fancy restaurants, which are usually served with a garlic aioli type of sauce (find my aioli recipe here ).  Nonetheless, making them at home is not as scary as you might think! I find that steaming artichokes gives them a bitter taste compared to baking, plus popping them in the oven is so much easier than messing with a steamer! So, here is my go-to recipe for BAKED artichokes: INGREDIENTS: 2 tbsp. Grass-fed Butter (un-salted) 1 tsp. Garlic Powder *NOTE: You will need a baking dish with a lid/cover. To start – Preheat the oven to 425 degrees.  Give the artichoke a good rinse in cold water, and use a kitchen brush to lightly scrub the outside. Then cut about a half-inch from the stem and discard the bottom piece. I also cut about a half-inch from the very top, just to open it up and allow the seasoning to get inside for cooking.  *OPTIONAL: Some people like to trim the tops of the leaves to get rid of the thorns, but I find that they become soft during cooking so I don’t bother.  Then, carefully cut the artichoke vertically in half. Line a baking dish with parchment paper and place the halves on top. Then lightly coat both pieces with a halved lemon. Take the other half of the lemon and squeeze along the inside of the artichoke, as well as between the leaves. Then sprinkle the sea salt and garlic powder all-over both sides and a bit between the leaves. Next, lightly drizzle your oil all over the artichoke pieces (in between the leaves also). Then place a tbsp. of grass-fed butter in each of the heart pockets. Cover the baking dish with lid, and place in to the oven.  Allow to cook until sizzling., usually about 1 hour. (Ovens will vary, so check often after 30-40 minutes of cooking) After cooking, allow to cool, remove the choke with a spoon (the hairy inside part), and then enjoy the meat on petals, and heart/bottom! x. Heather' - 'Lady Lever Art Gallery - Gallery in Port Sunlight, Port Sunlight - Visit Liverpool You are here: Things To Do > Lady Lever Art Gallery Lady Lever Art Gallery Note: Prices are a guide only and may change on a daily basis. About The Lady Lever Art Gallery is regarded as one of the finest art galleries in Europe. It''s located in model village, Port Sunlight in Wirral, a place rich in architectural charm.  The gallery was founded by William Hesketh Lever (1851-1925) and is dedicated to his wife Elizabeth, Lady Lever. Lever wanted to share his collections with the public and the works on display at the gallery have been personally selected.  Inside the gallery, visitors will find the best of Lever''s personal art collection and the finest collection of Wedgewood jasperware anywhere in the world. The Pre-Raphaelite painting collection is internationally renowned and features works by Millais, Rossetti, Burne-Jones and Holman Hunt.  For younger visitors, the activity rooms are an interactive space where they can get hands-on and have fun with crafts, dressing up and story telling.  Lady Lever often houses temporary exhibitions and popular free events, be sure to check the website before visiting.  Before leaving, browse the gift shop or enjoy a bite to eat in the Gallery Cafe. 
All Areas Accessible to Disabled Visitors Cafe/Restaurant Guided Tours Available for Groups Large Parties Map & Directions Road Directions From Liverpool: Go through the Birkenhead (Queensway) Tunnel (£1.40 toll for cars, £4.20 for coaches). Once you leave the tunnel follow signs for Port Sunlight, driving along A41(New Chester Road) - the gallery is sign posted all the way from the tunnel and situated opposite Port Sunlight Museum.  From elsewhere: Leave the M53 at junction 4, follow the B5137 and take the second left onto the B5136 towards Port Sunlight. Follow the brown and white road signs for Port Sunlight Village. Once you are in the village follow the signs for Lady Lever Art Gallery. Public Transport Directions By Train: The nearest station is Bebington although Port Sunlight is also within walking distance. They are both on the Chester and Ellesmere Port Merseyrail lines. Leave the station and come out onto Old Chester Road (use the ramp if you require level access). Turn left, cross at the traffic lights, then turn left again down Bebington Road, passing under the railway bridge. Turn next right down Greendale Road. Continue along the pavement on the same side as the cottages for approximately 400 yards until you see the Leverhulme memorial and the Lady Lever Art Gallery on your left. Take the pathway on your left leading into Windy Bank and towards the memorial and the gallery. The entrance to the gallery is to the right side of the building opposite the fountain.  By bus:  Take number 464 to Bebington Road bus stop (starts at Sir Thomas Street in Liverpool city centre) or number 38 to Bebington rail station bus stop (runs between Clatterbridge Hospital and West Kirby station). Once you get off the bus refer to the above directions from Bebington railway station. TripAdvisor' - source_sentence: Pinkie, Cubitt and Ida Arnold are all characters in which Graham Green novel? sentences: - 'Brighton Rock Maximize this page Introduction This study guide is intended for students preparing for exams at GCE Advanced (A2) level and Advanced Supplementary (AS) level. But it is suitable for university students and the general reader who is interested in Brighton Rock. Please use the hyperlinks in the table above to navigate this page. If you have any comments or suggestions to make about this page, please e-mail me by clicking on this link. The purpose of this study guide is to help you find your way around the text, and to introduce subjects which may be set by examiners. It is not a substitute for close study of the novel. Ideas presented here need to be supported by textual reference (either summary of narrative detail or brief direct quotation, as appropriate; do not quote at length: you gain no credit for this in an "open book" exam, the point of the reference will not be clear, and you are wasting time!). Back to top It is assumed by the examiners that literature is a humane subject; that is, that books set for study explore and interpret values and attitudes in the real world, although they must also be judged in their own right as imaginative works depicting an alternative reality or alternative view of the world. Broadly speaking, students are asked to examine works in terms of their content (what they are about) and the author''s technique (how they are composed). While examiners hope that students will enjoy studying these things, they recognize that this enjoyment will rarely be simple or immediate in the case of demanding texts. 
Students would do well to develop maturity as readers, to discover the historical and cultural diversity of western literature, with some of its history; to recognize different literary forms, genres and conventions. Personal and independent judgements are encouraged, but should be made against a background of familiarity with established or current attitudes. It is impossible to "teach" this entirely within lesson time; private reading, directed by a teacher or other well-read person, is essential. Because you cannot read everything, or even very much, try to profit from the experience of others. Back to top Brighton Rock: what is it about? At one level, this novel is a simple, if elegant, thriller: Ida Arnold, an unlikely heroine, pursues the evil but failed gangster Pinkie Brown; she seeks his punishment, while trying to save from his influence the young woman, Rose, whom Pinkie has married to buy her silence. In these terms, with vivid but usually straightforward characters and well-drawn locations, and the shocking conclusion (the reader is aware of Rose''s imminent discovery of Pinkie''s hatred) the novel shows why it achieved great popularity, and why it was successfully adapted for the cinema. Unlike some classic works, it obeys the convention of popular fiction, that there should be a well-paced and exciting story; "suspense" is also provided by the reader''s concern for the perhaps doomed Rose. But why is the novel also considered to be serious fiction, or a "modern classic"? This is a little less obvious, but we can find reasons for this opinion, if we look. Like many writers from earlier times, Greene is deeply interested in what could be called metaphysical questions: about the real nature and purpose of this world, about the nature or existence, even, of God; about man''s freedom, by his own efforts, to alter his circumstances - or lack of this freedom. In order to address these arguments, Greene depicts characters who are not at all complex, but who hold, profoundly in the case of Pinkie, radically differing views on these matters. Back to top Dallow, like Ida, sees only the immediate material world before him, as do the punters who see Brighton''s jolly facade and gaiety, but not the squalor behind this. Pinkie, though, believes also in a world of unseen but eternal spiritual realities. Initially, he believes these to await him after death, and he aspires to better his status in this world; but he comes, gradually, to see what Prewitt, his bent lawyer, articulates' - 'How many Presidents have resigned from office? | Reference.com How many Presidents have resigned from office? A: Quick Answer As of 2014, there has been only one president to resign from office. That president was Richard Nixon on Aug. 9, 1974. Minutes after his resignation, Vice President Gerald R. Ford was sworn into office as the 37th president of the United States. Full Answer Nixon''s resignation can be largely attributed to the clandestine and illegal activities his administration undertook during his presidency. The activities were brought to light when members of his administration were caught breaking into the Democratic headquarters at the Watergate Hotel. This scandal, named the Watergate scandal, resulted in the loss of almost all of his political support and the near-certainty of his impeachment. As a result, Nixon took to radio and television and announced his resignation.' 
- 'Época 64/65 - Taça de Portugal: F.C.Porto - Benfica (1-1) - YouTube Época 64/65 - Taça de Portugal: F.C.Porto - Benfica (1-1) Want to watch this again later? Sign in to add this video to a playlist. Need to report the video? Sign in to report inappropriate content. The interactive transcript could not be loaded. Loading... Rating is available when the video has been rented. This feature is not available right now. Please try again later. Published on Jul 30, 2016 16 avos-de-final (2ª Mão) When autoplay is enabled, a suggested video will automatically play next. Up next New 26:59 Dragão com História: Aloísio - Duration: 18:18. Os Filhos do Dragão 209 views 18:18 Futebol Clube do Porto - [ Alegria ] - Duration: 4:37. sarafi00 82,911 views 4:37 Portugal | Lisbon Street Dancing! - Duration: 2:33. GUN1T123 4,566 views 2:33 Atletiekploeg naar Portugal voor oefenkamp - Duration: 1:33. ANP Video 28 views 1:33 Preconceito em Portugal #MorandoFORAdoBRASIL #veda 02 @blog da quel - Duration: 10:39. Blog da Quel - Raquel Carboni 2,128 views 10:39' - source_sentence: Anglophobia is the fear of which country and its people? sentences: - '1870 - Famous Birthdays - On This Day On This Day Famous People Born in 1870 Full Calendar Jan 3 Henry Eichheim, composer Jan 3 Henry Handel Richardson, Australia, novelist (Richard Mahoney) Jan 4 Percy Pitt, English composer (BBC), born in London (d. 1932) Jan 6 Gustav Bauer, Chancellor of Germany (d. 1944) Jan 7 Lord Gordon Hewart, British judge (d. 1943) Jan 8 Miguel Primo de Rivera Orbaneja, dictator of Spain (1923-30) Jan 9 Joseph B Strauss, civil engineer/builder (Golden Gate Bridge) Jan 11 Alexander Stirling Calder, American sculptor (d. 1945) Jan 13 Henryk Opienski, Polish composer/conductor (St Moniuszko) Jan 13 Ross Granville Harrison, American biologist (d. 1959) Jan 14 Sir George Pearce, Australian politician (d. 1952) Jan 15 Johan Peter Koch, Danish officer/explorer (Greenland) Jan 15 Pierre S. du Pont, American businessman (d. 1954) Jan 16 Wilhelm Normann, German chemist (hardening of oils) Jan 18 Berend Modderman, printer (Drukkers yearbook) Jan 20 Guillaume Jean Joseph Nicolas Lekeu, composer Jan 22 Charles Arnold Tournemire, composer Feb 3 Ada Negri, Italian poet/author (Il Libro di Mara) Feb 7 Alfred Adler, Austria, psychiatrist (Inferiority Complex) Feb 10 Fritz Klimsch, German sculptor/painter Feb 12 Marie Lloyd, English music-hall performer (d. 1922) Feb 13 Leopold Godowsky, Lithuania, virtuoso pianist/composer Feb 17 Louis de Raet, Belgian economist/founder (Flemish People''s Party) Feb 18 William Laurel Harris, American mural painter, writer (d. 1924) Feb 20 Pieter Cornelis Boutens, Holland, mystic poet/scholar (Verzen) Feb 27 Louis Coerne, composer Mar 4 Thomas Sturge Moore, English poet (d. 1944) Mar 5 Frank Norris, journalist and writer (McTeague, Octopus), born in Chicago, Illinois (d. 1902) Mar 6 Oscar Straus, composer (Ein Walzertraum), born in Vienna, Austria Mar 10 Alfred Kastner, composer Mar 11 Louis Bachelier, French mathematician (d. 1946) Mar 13 Albert Meyer, member of the Swiss Federal Council in the 1930s (d. 1953) Mar 17 Horace Donisthorpe, British entomologist (d. 1951) Mar 20 Paul von Lettow-Vorbeck, Prussian general/politician (East Africa) Apr 4 George A Smith, Salt Lake City Utah, 8th President of Mormon church Apr 7 Joseph Ryeland, Belgian composer/Baron Apr 14 Syd Gregory, cricketer (Australian batsman in 58 Tests 1890-1912) Apr 14 Victor Borisov-Musatov, Russian painter (d. 
1905) Apr 17 Ray Stannard Baker, US, journalist (Puliter Prize 1940) Apr 20 Simeon Roncal, composer Apr 20 Maulvi Abdul Haq, Father of Urdu, Pakistani scholar (d. 1961) Apr 21 Edwin S. Porter, American film pioneer (d. 1941) Person of Interest Apr 22 Vladimir Lenin [Vladimir Ilich Ulyanov], Marxist Revolutionary and Soviet Leader, born in Simbirsk, Russia (d. 1924) Marxist Revolutionary and Soviet Leader Apr 28 Hermann Suter, composer Apr 30 Franz Lehar, operetta composer (Naughty Marietta) May 3 Princess Helena Victoria of Schleswig-Holstein (d. 1948) May 4 Alexandre Benois, Russian artist (d. 1960) May 6 Amedos Peter Giannine, founded Bank of America, born in San Jose, California May 6 John McCutcheon, cartoonist (Pulitzer Prize-1931) May 14 Zygmunt Denis Antoni Stojowski, composer May 19 Albert Fish, American serial killer (d. 1936) May 24 Benjamin Cardozo, American jurist (d. 1938) May 24 Jan Christiaan Smuts, Prime Minister of South Africa and proponent of Commonwealth & League of Nations (d. 1950) May 27 Lionel Palairet, cricketer (elegant England bat in the Golden Age) Jun 13 Jules JBV Bordet, Belgian bacteriologist (syphillis, Nobel 1919) Jun 14 Sophia of Prussia, consort of Constantine I of Greece (d. 1932) Jun 17 George Cormack, cereal inventor (Wheaties) Jun 18 Edouard Le Roy, French philosopher and mathematician Jun 21 Clara Immerwahr, German chemist (d. 1915) Jun 24 Horatio Mbelle, Cape Colony, South African interpreter, community leader and politician Jun 29 Joseph Carl Breil, composer Person of Interest Jul 3 Richard Bedford Bennett , 11th Prime Minister of Canada (C: 1930-35), born in Hopewell Hill, New Brunswick (d. 1947) 11th Prime Minister of Canada Jul 4 Pieter van der Lijn, Dutch geolo' - 'Anglophobia - definition of Anglophobia by The Free Dictionary Anglophobia - definition of Anglophobia by The Free Dictionary http://www.thefreedictionary.com/Anglophobia Also found in: Thesaurus , Wikipedia . Related to Anglophobia: Anglophobic One who dislikes or fears England, its people, or its culture. An′glo·pho′bi·a n. An′glo·pho′bic adj. Anglophobia Anglophobia - dislike (or fear) of Britain and British customs dislike - a feeling of aversion or antipathy; "my dislike of him was instinctive" Anglophilia - admiration for Britain and British customs Translations Anglophobia n → Anglophobie f (form), → Englandhass m Want to thank TFD for its existence? Tell a friend about us , add a link to this page, or visit the webmaster''s page for free fun content . Link to this page: England References in classic literature ? Well, then, you shall have plenty of it; and first, I see you''ve not much more sense than some others of my acquaintance"(indicating me with his thumb), "or else you''d never turn rabid about that dirty little country called England; for rabid, I see you are; I read Anglophobia in your looks, and hear it in your words. View in context He begins with a discussion of Hegel''s reform-bill article claiming that, contrary to the traditional view, it was not a sour mix of anglophobia and Prussian chauvinism but a shrewd analysis of the political situation as of mid-1831 which correctly identified the structural weaknesses of the existing British state, most notably its dominance by a corrupt and incompetent aristocracy, and pointed the direction that politics must take if the nation was to avoid revolution. 
As Britain''s elections near, voters are facing a wall of falsehoods Anglophobia ruled for a decade until former Education Minister Leighton Andrews invited in Tony Blair''s Sir Michael Barber. End the debate and just get on with the teaching; The groundswell of support for the Welsh Government''s review of curriculum arrangements are supposed to bode well for the future. But in a hard-hitting column, education expert Terry Mackie argues otherwise A Call to Arms: Propaganda, Public Opinion, and Newspapers in the Great War (Westport, CT: Praeger, 2004); Matthew Stibbe, German Anglophobia and the Great War (Cambridge: Cambridge University Press, 2001); David Welch, Germany, Propaganda and Total War, 1914-1918: The Sins of Omission (New Brunswick: Rutgers University Press, 2000). Mencken''s nietzsche Multicultural Nationalism: Islamaphobia, Anglophobia, and Devolution. Ghanaian and Somali immigrants in Toronto''s rental market: a comparative cultural perspective of housing issues and coping strategies 25) Crawford told Stonehaven that the American actions at Geneva would undoubtedly cause a wave of Anglophobia to arise in the United States due to a number of things. Imperial networks, imperial defence, and perceptions of American influence on the British Empire in the interwar period: the case of the 27th Earl of Crawford and Balcarres For one, he had none of the French Anglophobia stimulated by events like Mers-el-Kebir (the battle in 1940 off the coast of French Algeria when the British Navy attacked and destroyed much of the French fleet), which soldiers like Colonel Serge-Henri Parisot never got over even up to his death last February at age 100.' - 'Menstruation and the menstrual cycle | womenshealth.gov Menstruation and the menstrual cycle Menstruation and the menstrual cycle To receive Publications email updates Enter email Submit Menstruation and the menstrual cycle Menstruation is a woman''s monthly bleeding. When you menstruate, your body sheds the lining of the uterus (womb). Learn how the menstrual cycle works and what to do if you have painful or irregular periods. Expand all Collapse all What is menstruation? Menstruation (men-STRAY-shuhn) is a woman''s monthly bleeding. When you menstruate, your body sheds the lining of the uterus (womb). Menstrual blood flows from the uterus through the small opening in the cervix and passes out of the body through the vagina ( see how the menstrual cycle works below ). Most menstrual periods last from 3 to 5 days. What is the menstrual cycle? When periods (menstruations) come regularly, this is called the menstrual cycle. Having regular menstrual cycles is a sign that important parts of your body are working normally. The menstrual cycle provides important body chemicals, called hormones, to keep you healthy. It also prepares your body for pregnancy each month. A cycle is counted from the first day of 1 period to the first day of the next period. The average menstrual cycle is 28 days long. Cycles can range anywhere from 21 to 35 days in adults and from 21 to 45 days in young teens. The rise and fall of levels of hormones during the month control the menstrual cycle. What happens during the menstrual cycle? In the first half of the cycle, levels of estrogen (the "female hormone") start to rise. Estrogen plays an important role in keeping you healthy, especially by helping you to build strong bones and to help keep them strong as you get older. Estrogen also makes the lining of the uterus (womb) grow and thicken. 
This lining of the womb is a place that will nourish the embryo if a pregnancy occurs. At the same time the lining of the womb is growing, an egg, or ovum, in one of the ovaries starts to mature. At about day 14 of an average 28-day cycle, the egg leaves the ovary. This is called ovulation. After the egg has left the ovary, it travels through the fallopian tube to the uterus. Hormone levels rise and help prepare the uterine lining for pregnancy. A woman is most likely to get pregnant during the 3 days before or on the day of ovulation. Keep in mind, women with cycles that are shorter or longer than average may ovulate before or after day 14. A woman becomes pregnant if the egg is fertilized by a man''s sperm cell and attaches to the uterine wall. If the egg is not fertilized, it will break apart. Then, hormone levels drop, and the thickened lining of the uterus is shed during the menstrual period. See how the menstrual cycle works. What is a typical menstrual period like? During your period, you shed the thickened uterine lining and extra blood through the vagina. Your period may not be the same every month. It may also be different than other women''s periods. Periods can be light, moderate, or heavy in terms of how much blood comes out of the vagina. This is called menstrual flow. The length of the period also varies. Most periods last from 3 to 5 days. But, anywhere from 2 to 7 days is normal. For the first few years after menstruation begins, longer cycles are common. A woman''s cycle tends to shorten and become more regular with age. Most of the time, periods will be in the range of 21 to 35 days apart. What kinds of problems do women have with their periods? Women can have a range of problems with their periods, including pain, heavy bleeding, and skipped periods. Amenorrhea (ay-men-uh-REE-uh) — the lack of a menstrual period. This term is used to describe the absence of a period in: Young women who haven''t started menstruating by age 15 Women and girls who haven''t had a period for 90 days, even if they haven''t been menstruating for long Causes can include: Stress Serious medical conditions in need of treatment As above, when your menstrual cycles come regularly, this means that important parts of your body are' - source_sentence: Which footballer won the Golden Boot for scoring the most goals at 1986 World Cup Finals? sentences: - World Cup Golden Boot Winners - Historical World Cup Top Scorers Argentina 5 One of the most active markets for any World Cup is the Golden Boot with a number of players vying for an award which is presented to the highest goal scorer in the tournament. Over the years, there have been some incredible goal scoring feats at World Cup finals but who are the individuals that have made their mark in previous tournaments and what indicators can they give those of us who are making predictions for 2014? The Record Breaker France’s Just Fontaine holds the record for the most goals in a single World Cup tournament and it’s one that may never be broken. In current World Cup competitions, the most successful teams could play as many as seven games but could any of the current players match the 13 strikes that Fontaine achieved in Sweden in 1958? The striker was prolific in club football and averaged nearly a goal a game during his eight years with Stade Reims. His ratio at international level was even better and his performances at the 1958 finals would leave Fontaine with a record of 30 goals from 23 appearances. 
It’s claimed that he was playing in a pair of borrowed boots when he began his campaign with a hat trick in a 7-3 defeat of Uruguay. The Frenchman followed that achievement with a brace against Yugoslavia and a single, winning goal against the Scots which allowed his country to progress to the next phase. Three goals followed over two matches against Northern Ireland and Brazil before Fontaine netted no less than four times in the third place play off against West Germany. Behind this incredible achievement, Sandor Kocsis managed eleven strikes in the finals of 1954 but since Gerd Muller’s 10 in 1970, no player has managed more than eight in a single tournament. The Prolific Nations Aside from Just Fontaine’s magnificent 13 back in 1958, no Frenchman has taken the Golden Boot award. In fact, after the Stade Reims centre forward, there is a considerable gap in the country’s all time list. The finals in Sweden were the only time that Fontaine appeared in a tournament so he finished seven clear of Thierry Henry who scored six goals – three in 1998 and three in 2002. Other countries have been more prolific over a longer span and they tend to be the more successful nations in terms of World Cup victories. Brazil are well represented when it comes to the tournament’s leading goal scorers and out of eighteen finals, five Brazilians have either shared the Golden Boot or won it outright. The most successful of these was Ronaldo who currently holds the overall record for goals scored at the World Cup finals. The former Real Madrid target man has 15 strikes, spread over three tournaments, including a top scoring effort of eight as his country lifted the trophy in 2002. Behind Ronaldo, the legendary Pele has 12 goals in four tournaments although the man who many believe was the greatest to ever play the game, never actually won a Golden Boot. Germany also feature heavily in the list of all time leading scorers and Miroslav Klose has a chance of eclipsing Ronaldo’s record at the 2014 tournament. Along with the great Gerd Muller , the Lazio centre forward has 14 goals in World Cup finals and is set to be Germany’s first choice front man in Brazil. In total, German or West German players have finished as top scorer in three tournaments. Gerd Muller recorded an impressive ten goals in 1970 before Klose took an outright win in 2006. Thomas Muller completes the trio although the Bayern Munich man shared the award in 2010 with David Villa and Wesley Sneijder. An unlikely hero After West Germany’s Gerd Muller took the prestigious Golden Boot in 1970, he joined up with the national squad as they looked to win the World Cup on home soil four years later. The host nation duly completed a win after edging past the Netherlands by two goals to one in the final and while the man they called ‘Der Bomber’ scored four times, the top scorer accolade finished in the hands of an unlikely recipient. Poland’s Grzegorz Lato featured in three FIFA World Cup - 'Philip IV Philip IV Location of death: Madrid, Spain Cause of death: unspecified Nationality: Spain Executive summary: King of Spain, 1621-65 Philip IV, King of Spain, eldest son of Philip III and his wife Margaret, sister of the emperor Ferdinand II, was born at Valladolid on the 8th of April 1605. His reign, after a few passing years of barren successes, was a long story of political and military decay and disaster. 
The king has been held responsible for the fall of Spain, which was, however, due in the main to internal causes beyond the control of the most despotic ruler, however capable he had been. Philip certainly possessed more energy, both mental and physical, than his father. There is still in existence a translation of Guicciardini which he wrote with his own hand in order to qualify himself for government by acquiring a knowledge of political history. He was a fine horseman and keen hunter. His artistic taste was shown by his patronage of Diego Vel�zquez , and his love of letters by his favor to Lope de Vega , Calder�n , and other dramatists. He is even credited, on fairly probable testimony, with a share at least in the composition of several comedies. His good intentions were of no avail to his government. Coming to the throne at the age of sixteen, he did the wisest thing he could by allowing himself to be guided by the most capable man he could find. His favorite, Olivares, was a far more honest man than the Duke of Lerma, and was more fit for the place of prime minister than any Spaniard of the time. But Philip IV had not the strength of mind to free himself from the influence of Olivares when he had grown to manhood. The amusements which the favorite had encouraged became the business of the king''s life. When, in 1643, the disasters falling on the monarchy on all sides led to the dismissal of Olivares, Philip had lost the power to devote himself to hard work. After a brief struggle with the task of directing the administration of the most extensive and the worst organized monarchy in Europe, he sank back into his pleasures and was governed by other favorites. His political opinions were those he had inherited from his father and grandfather. He thought it his duty to support the German Habsburgs and the cause of the Roman Catholic Church against the Protestants, to assert his sovereignty over Holland, and to extend the dominions of his house. The utter exhaustion of his people in the course of a hopeless struggle with Holland, France and England was seen by him with sympathy, but he considered it an unavoidable misfortune and not the result of his own errors, since he could not be expected to renounce his rights or to desert the cause of God and the Church. In public he maintained a bearing of rigid solemnity, and was seen to laugh only three times in the course of his life. But in private he indulged in horseplay and very coarse immorality. His court was grossly vicious. The early death of his eldest son, Baltasar Carlos, was unquestionably due to debauchery encouraged by the gentlemen entrusted by the king with his education. The lesson shocked the king, but its effect soon wore off. Philip IV died broken-hearted on the 17th of September 1665, expressing the hope that his surviving son, Carlos, would be more fortunate than himself. Father: Philip III (King of Spain) Mother: Margaret Sister: Anne of Austria (Queen of France, b. 1601, d. 1666) Brother: Ferdinand (Governor of the Netherlands) Wife: Elizabeth Bourbon (b. 1603, m. 1615, d. 1644) Daughter: Maria Margarita (b. 1621) Daughter: Margarita Maria Catalina (b. 1623) Daughter: Maria Eugenia (b. 1625, d. 1627) Daughter: Isabel Maria Teresa (b. 1627) Son: Baltasar Carlos (b. 1629, d. 1646) Daughter: Maria Ana Antonia (b. 1636) Daughter: Maria Theresa of Spain (b. 1638, d. 1683)' - Olympic Games | Ice Hockey Wiki | Fandom powered by Wikia Medalists Ice hockey tournaments have been staged at the Olympic Games since 1920. 
The men's tournament was introduced at the 1920 Summer Olympics and was transferred permanently to the Winter Olympic Games programme in 1924. The women's tournament was first held at the 1998 Winter Olympics . The Olympic Games were originally intended for amateur athletes until 1988, and the National Hockey League (NHL) did not allow its players to compete until 1998. From 1924 to 1988, the tournament started with a round-robin series of games and ended with the medal round. Medals were awarded based on points accumulated during that round. The games of the tournament follow the rules of the International Ice Hockey Federation (IIHF), which differ slightly from the rules used in the NHL . The tournament follows the World Anti-Doping Agency's (WADA) rules on Use of performance enhancing drugs and the IIHF maintains a Registered Testing Pool, a list of top players who are subjected to random in-competition and out-of-competition drug tests. Several players have tested positive for banned substances since the 1972 Winter Olympics . In the men's tournament, Canada was the most successful team of the first three decades, winning six of seven gold medals. Czechoslovakia , Sweden and the United States were also competitive during this period and won multiple medals. Between 1920 and 1968, the Olympic hockey tournament was also counted as the Ice Hockey World Championship for that year. The Soviet Union first participated in 1956 and overtook Canada as the dominant international team, winning seven of the nine tournaments in which they participated. The United States won gold medals in 1960 and in 1980 , which included their " Miracle on Ice " upset of the Soviet Union. Canada went 50 years without a gold medal, before winning one in 2002 , and following it up with another in 2010 . Other nations to win gold include Great Britain in 1936 , the Unified Team in 1992 , Sweden in 1994 and 2006 and the Czech Republic]] in 1998 . Other medal-winning nations include Switzerland,Germany,Finland and Russia]]. In 1986, the International Olympic Committee (IOC) voted to allow all athletes to compete in Olympic Games held after 1988. The NHL was initially reluctant to allow its players to compete because the Olympics are held in the middle of the NHL season, and the league would have to halt play if many of its players participated. However, NHL players were allowed to compete starting in 1998. The format of the tournament was adjusted to accommodate the NHL schedule; a preliminary round was played without NHL players or the top six teams—Canada, the Czech Republic, Finland, Russia, Sweden and the United States—followed by a final round which included them. The tournament format was changed again in 2006; every team played five preliminary games with the full use of NHL players. In July 1992, the IOC voted to approve women's hockey as an Olympic event; it was first held at the 1998 Winter Olympics in Nagano. The Nagano Organizing Committee was hesitant to include the event because of the additional costs of staging the tournament, but an agreement was reached that limited the field to six teams, and ensured that no additional facilities would be built. The Canadian and American teams have dominated the event, typically losing only to each other. The United States won the first tournament in 1998, while Canada won in 2002, 2006 and 2010. Contents Edit The first Olympic ice hockey tournament took place at the 1920 Summer Olympics in Antwerp , Belgium . 
[1] At the time, organised international ice hockey was still relatively new. [2] The International Ice Hockey Federation (IIHF), the sport's governing body, was created on May 15, 1908, under the name Ligue Internationale de Hockey sur Glace. [3] At the 1914 Olympic Congress in Paris, ice hockey was added to the list of optional sports that Olympics organisers could include. [4] The decision to include ice hockey for the 1920 Summer Olympics wa - source_sentence: The Azores island group is administered by which country? sentences: - 'Tiddlywinks: The Classic Victorian Pastime: On Target for the 21st Century (1996) You are at: Home » History »Tiddlywinks: The Classic Victorian Pastime: On Target for the 21st Century (1996) Tiddlywinks: The Classic Victorian Pastime: On Target for the 21st Century (1996) This article was originally published in the American Game Collectors Association ‘s Game Researchers’ Notes, ISSN 1050-6608, October 1996, with illustrations and content on the cover, on pages 5552 to 5561, and also on the back cover. In the web version of this article, additional images have been incorporated that did not appear in the original publication.  Also please note that the AGCA is now known as the Association of Game & Puzzle Collectors . A substantial majority of the information provided in the original 1996 article remains accurate to this day.  However, quite a bit more background information has been gathered since.  An update is warranted, and is in the works. This article was originally posted on the Internet on 3 May 1997,  was updated on 2 April 1999, and then with updated images and links on 9 and 15 September 2006, plus a few more minor updates on 24 November 2006, and also on 13 July 2014. By Rick Tucker © 1996 Richard W. Tucker. All Rights Reserved “One should make a serious study of a pastime”—Alexander the Great [ 1 ] Table of Contents References The Preface I’ve played tiddlywinks for 24 years, ever since I ventured into a dormitory at MIT on my first day as a freshman and encountered (no pun intended) the local denizens on their hands and knees shooting winks across the carpet and down the stairs. (It really isn’t normally played on the floor, actually.) I was captivated at the congruence (technical term, sorry) of the ivory towers of MIT housing the noble sport of tiddlywinks, and amazed that MIT might, perhaps inadvertantly (but not always), lend credence to a sport enmired in such a mischievous stereotype. Tiddlywinks appealed to me because of its unique character, because it is almost universally known, and because it demands precise dexterous skills, while also requiring strategy and tactics, and also a measure of luck. And so, what follows is the first definitive history of tiddlywinks boxed games. There is a history in all men’s lives.[ 2 ] I invite and expect to hear from game collectors and historians to help me add to, revise, and where necessary, fix errors in this history. I also invite you to visit my tiddlywinks web pages at http://www.tiddlywinks.org , where this article will appear subsequent to its publication in Games Researchers’ Notes, with all the photos in living color. — Rick Tucker, 31 October 1996 Setting the Stage: The Oft-Ridiculed Game “Have we sold our precious heritage in exchange for frivolity and a game of tiddlywinks?”, letter by Lillie Struble in Library Journal[ 3 ]. 
This was the most unkindest cut of all.[ 4 ] “A 15th-century Donatello bronze, The Madonna and Child, served the Fitzwilliam family as a tiddlywinks bowl until the Victoria and Albert Museum [London] recognized its importance”, ARTnews[ 5 ]. “Even in the matter of nursery games the Victorian child took things very seriously. There were some board games, however, which provided little or no intellectual stimulus. Chief among these was […] tiddlywinks, whose apparent inanity (to the uninitiated) is often regarded as the ultimate in useless activities.”, James Mackay [ 6 ]. Prince Philip once suggested that tiddlywinks be included in the Olympics. To which Ian Wooldridge of the Olympic Committee responded: “At the risk of propagating royal support for tiddlywinks, a game of the utmost tedium played by anti-athletes too tired or apathetic to get up off the floor, I have to concede that his argument makes sense.”, British Airways magazine.[ 7 ] “The research described in this chapter concerns a well-known children’s pastime, the game of tiddlywinks, where the idea is to take one counter and press it on the edge of another, to make the latter jump. Because this is extremely simple, the research centered less on cognizance of the mov' - Football - Summer Olympic Sport Football Singapore 2010 adopts new sport formats 12 Aug 2010 Football has its roots in ancient China, while the modern version of the game began on the streets of medieval England before evolving into the most popular sport in the world. Medieval origins Modern football has its origins in the streets of medieval England. Neighbouring towns would play each other in games where a heaving mass of players would struggle to drag a pig’s bladder by any means possible to markers at either end of town. A royal ban Football became so violent in England it was banned by the king for more than 300 years. English public schools are credited with subsequently establishing the modern football codes, thus turning the mob riot into a sport in the 16th century. Olympic history Football first appeared on the programme of the Games of the II Olympiad, Paris 1900. It has been on the programme of each edition of the Games ever since, with the exception of Los Angeles 1932. Europe dominated the competition until after 1992 in Barcelona, where Spain became the last European team to win a gold medal. Since the 1996 Olympic Games in Atlanta, African and South American teams have won all the gold medals. Also in 1996, women’s football was introduced into the Olympic programme. Three times, the USA has been on the highest step of the podium - in 1996, in 2004 in Athens and in 2008 in Beijing. But this team was beaten by the Norwegians in the final of the 2000 Games in Sydney. - 'The Azores Islands - Portugal | Portugal.com Porto and the North Azores Consisting of nine islands, the Azores  are divided into three groups: the eastern ( Sao Miguel and Santa Maria islands), the central ( Terceira , Graciosa , Sao Jorge ,  Pico and Faial islands), and the western ( Flores and Corvo  islands). Apart from international airports of Santa Maria, Ponta Delgada and Angra do Heroismo, there are flights to the islands (operated by the regional airline TAP Air Portugal) and ferry boats between the islands. Even the blase visitor will be touched by the sapphire blue and emerald green lakes, fertile prairies, volcanic cones and craters, colorful hydrangeas and azaleas, 15th century churches, and majestic manor houses. 
This legendary land, consisting of nine poetically-named islands, enjoys year-round mild temperatures (between 14°C and 22°C–57°F and 71°F) and is a peaceful shelter with a population of 250000 inhabitants, for whom the words “stress” and “pollution” are unheard. There are many stories to tell of the archipelago’s beauty, of fishermen or shepherds, but among them there is one which was told by a holidaymaker. As a foreign couple was silently looking at the Caldeira das Sete Cidades when they were interrupted by their six-year-old son, who asked them: “Is this God’s home?” Sao Miguel Island The largest of all. In Ponta Delgada, the capital, the famous 18th century portals open up to a number of monuments that are worth visiting, most of them built between the 16th and the 18th century: Carlos Machado Museum and churches of Sao Sebastiao, Sao Pedro, Sao Jose, Colegio and Nossa Senhora da Conceicao; convent and chapel of Nossa Senhora da Esperanca and Santa Ana Chapel. Palaces: Fonte Bela and Santa Ana; Conceicao and Santa Catarina; Casa de Carlos Bicudo and the Pacos do Concelho. Other places to visit: Caldeira das Sete Cidades (green and blue lakes); Lagoa do Fogo; Ribeira Grande; Vale das Furnas (spas and hot mineral pools) and Vila Franca do Campo.  Terceira Island The historic centre of its capital, Angra do Heroismo, has been classified in UNESCO’s International Heritage list. Special reference to the forts of Sao Sebastiao and Sao Joao Baptista (16th-17th-centuries); the palaces of the Bettencourts (Baroque) and of the Capitaes-Generais; the Cathedral, with its silver altar front and treasure; the churches of Colegio dos Jesuitas, Sao Goncalo and Nossa Senhora da Conceicao (17th-century); the churches of Misericordia and Nossa Senhora da Guia (18th-century, the latter encloses the Angra Museum). Other points of interest: Praia da Vitoria, Santa Barbara, Sao Sebastiao and Vila Nova. Graciosa Island In Santa Cruz da Graciosa you will find ancient streets and manor-houses, a beautiful mother-church (16th-18th centuries), Santo Cristo Church (16th century), Cruz da Barra (Manueline) and Ethnographic House. In the Furna do Enxofre, dazzling sights and a vaulted cave over an underground lake (between 11am and 2pm the sunlight filters in). You must also visit Guadalupe and its Baroque church, Luz and Praia (typical windmills). Faial Island In Horta, a famous yacht harbor, look at the beautiful tiles and gilded carvings in the 17th and 18th century churches of Sao Salvador, Nossa Senhora do Carmo and Sao Francisco. To visit: Sacred Art Museum, Nossa Senhora das Angústias Church, Nossa Senhora do Pilar Chapel, Imperio dos Nobres and Porto Pim fortifications, Caldeira Natural Reserve, Capelinhos, grottoes and caves in Costa da Feteira and Monte da Guia belvedere. Pico Island Owes its name to the 7713 ft high volcanic cone. Special reference to Sao Roque do Pico, with its 18th century churches of Sao Roque and Sao Pedro de Alcântara; Lajes do Pico, with its Whale Museum; Madalena, with its Wine Museum and 17th-century church, and Areia Larga, with beautiful winery manor houses. Other places: Calheta de Nesquim, Candelaria, Criacao Velha, Piedade (forest preserve), Prainha do Norte, Santa Luzia, Santo Amaro, Sao Caetano, Sao Joao and Sao Mateus. 
Sao Jorge Island Velas, with its fishing port, is the main to' model-index: - name: bge base trained on trivia anchor-positive results: - task: type: information-retrieval name: Information Retrieval dataset: name: trivia anchor positive dev type: trivia-anchor-positive-dev metrics: - type: cosine_accuracy@1 value: 0.672 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.842 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.877 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.914 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.672 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.2806666666666666 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.1754 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09140000000000001 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.672 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.842 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.877 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.914 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8005034750177896 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.7633531746031744 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.7661893184880814 name: Cosine Map@100 - type: dot_accuracy@1 value: 0.672 name: Dot Accuracy@1 - type: dot_accuracy@3 value: 0.842 name: Dot Accuracy@3 - type: dot_accuracy@5 value: 0.877 name: Dot Accuracy@5 - type: dot_accuracy@10 value: 0.914 name: Dot Accuracy@10 - type: dot_precision@1 value: 0.672 name: Dot Precision@1 - type: dot_precision@3 value: 0.2806666666666666 name: Dot Precision@3 - type: dot_precision@5 value: 0.1754 name: Dot Precision@5 - type: dot_precision@10 value: 0.09140000000000001 name: Dot Precision@10 - type: dot_recall@1 value: 0.672 name: Dot Recall@1 - type: dot_recall@3 value: 0.842 name: Dot Recall@3 - type: dot_recall@5 value: 0.877 name: Dot Recall@5 - type: dot_recall@10 value: 0.914 name: Dot Recall@10 - type: dot_ndcg@10 value: 0.8005034750177896 name: Dot Ndcg@10 - type: dot_mrr@10 value: 0.7633531746031744 name: Dot Mrr@10 - type: dot_map@100 value: 0.7661893184880814 name: Dot Map@100 --- # bge base trained on trivia anchor-positive This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 tokens - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("SepKeyPro/bge-base-en-trivia-anchor-positive") # Run inference sentences = [ 'The Azores island group is administered by which country?', 'The Azores Islands - Portugal | Portugal.com Porto and the North Azores Consisting of nine islands, the Azores \xa0are divided into three groups: the eastern ( Sao Miguel and Santa\xa0Maria islands), the central ( Terceira , Graciosa , Sao Jorge ,\xa0 Pico and Faial islands), and the western ( Flores and Corvo \xa0islands). Apart from international airports of Santa Maria,\xa0Ponta Delgada and Angra do Heroismo, there are flights to the islands\xa0(operated by the regional airline TAP Air Portugal) and ferry boats\xa0between the islands. Even the blase visitor will be touched by the sapphire blue and emerald green lakes, fertile prairies, volcanic cones and craters, colorful hydrangeas and azaleas, 15th century churches, and majestic manor houses. This legendary land, consisting of nine poetically-named islands, enjoys year-round mild temperatures (between 14°C and 22°C–57°F and 71°F) and is a peaceful shelter with a population of 250000 inhabitants, for whom the words “stress” and “pollution” are unheard. There are many stories to tell of the archipelago’s beauty, of fishermen or shepherds, but among them there is one which was told by a holidaymaker. As a foreign couple was silently looking at the Caldeira das Sete Cidades when they were interrupted by their six-year-old son, who asked them: “Is this God’s home?” Sao Miguel Island The largest of all. In Ponta Delgada, the capital, the famous 18th century portals open up to a number of monuments that are worth visiting, most of them built between the 16th and the 18th century: Carlos Machado Museum and churches of Sao Sebastiao, Sao Pedro, Sao Jose, Colegio and Nossa Senhora da Conceicao; convent and chapel of Nossa Senhora da Esperanca and Santa Ana Chapel. Palaces: Fonte Bela and Santa Ana; Conceicao and Santa Catarina; Casa de Carlos Bicudo and the Pacos do Concelho. 
Other places to visit: Caldeira das Sete Cidades (green and blue lakes); Lagoa do Fogo; Ribeira Grande; Vale das Furnas (spas and hot mineral pools) and Vila Franca do Campo.\xa0 Terceira Island The historic centre of its capital, Angra do Heroismo, has been classified in UNESCO’s International Heritage list. Special reference to the forts of Sao Sebastiao and Sao Joao Baptista (16th-17th-centuries); the palaces of the Bettencourts (Baroque) and of the Capitaes-Generais; the Cathedral, with its silver altar front and treasure; the churches of Colegio dos Jesuitas, Sao Goncalo and Nossa Senhora da Conceicao (17th-century); the churches of Misericordia and Nossa Senhora da Guia (18th-century, the latter encloses the Angra Museum). Other points of interest: Praia da Vitoria, Santa Barbara, Sao Sebastiao and Vila Nova. Graciosa Island In Santa Cruz da Graciosa you will find ancient streets and manor-houses, a beautiful mother-church (16th-18th centuries), Santo Cristo Church (16th century), Cruz da Barra (Manueline) and Ethnographic House. In the Furna do Enxofre, dazzling sights and a vaulted cave over an underground lake (between 11am and 2pm the sunlight filters in). You must also visit Guadalupe and its Baroque church, Luz and Praia (typical windmills). Faial Island In Horta, a famous yacht harbor, look at the beautiful tiles and gilded carvings in the 17th and 18th century churches of Sao Salvador, Nossa Senhora do Carmo and Sao Francisco. To visit: Sacred Art Museum, Nossa Senhora das Angústias Church, Nossa Senhora do Pilar Chapel, Imperio dos Nobres and Porto Pim fortifications, Caldeira Natural Reserve, Capelinhos, grottoes and caves in Costa da Feteira and Monte da Guia belvedere. Pico Island Owes its name to the 7713 ft high volcanic cone. Special reference to Sao Roque do Pico, with its 18th century churches of Sao Roque and Sao Pedro de Alcântara; Lajes do Pico, with its Whale Museum; Madalena, with its Wine Museum and 17th-century church, and Areia Larga, with beautiful winery manor houses. Other places: Calheta de Nesquim, Candelaria, Criacao Velha, Piedade (forest preserve), Prainha do Norte, Santa Luzia, Santo Amaro, Sao Caetano, Sao Joao and Sao Mateus. Sao Jorge Island Velas, with its fishing port, is the main to', 'Football - Summer Olympic Sport Football Singapore 2010 adopts new sport formats 12 Aug 2010 Football has its roots in ancient China, while the modern version of the game began on the streets of medieval England before evolving into the most popular sport in the world. Medieval origins Modern football has its origins in the streets of medieval England. Neighbouring towns would play each other in games where a heaving mass of players would struggle to drag a pig’s bladder by any means possible to markers at either end of town. A royal ban Football became so violent in England it was banned by the king for more than 300 years. English public schools are credited with subsequently establishing the modern football codes, thus turning the mob riot into a sport in the 16th century. Olympic history Football first appeared on the programme of the Games of the II Olympiad, Paris 1900. It has been on the programme of each edition of the Games ever since, with the exception of Los Angeles 1932. Europe dominated the competition until after 1992 in Barcelona, where Spain became the last European team to win a gold medal. Since the 1996 Olympic Games in Atlanta, African and South American teams have won all the gold medals. 
Also in 1996, women’s football was introduced into the Olympic programme. Three times, the USA has been on the highest step of the podium - in 1996, in 2004 in Athens and in 2008 in Beijing. But this team was beaten by the Norwegians in the final of the 2000 Games in Sydney.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Dataset: `trivia-anchor-positive-dev`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.672      |
| cosine_accuracy@3   | 0.842      |
| cosine_accuracy@5   | 0.877      |
| cosine_accuracy@10  | 0.914      |
| cosine_precision@1  | 0.672      |
| cosine_precision@3  | 0.2807     |
| cosine_precision@5  | 0.1754     |
| cosine_precision@10 | 0.0914     |
| cosine_recall@1     | 0.672      |
| cosine_recall@3     | 0.842      |
| cosine_recall@5     | 0.877      |
| cosine_recall@10    | 0.914      |
| cosine_ndcg@10      | 0.8005     |
| cosine_mrr@10       | 0.7634     |
| **cosine_map@100**  | **0.7662** |
| dot_accuracy@1      | 0.672      |
| dot_accuracy@3      | 0.842      |
| dot_accuracy@5      | 0.877      |
| dot_accuracy@10     | 0.914      |
| dot_precision@1     | 0.672      |
| dot_precision@3     | 0.2807     |
| dot_precision@5     | 0.1754     |
| dot_precision@10    | 0.0914     |
| dot_recall@1        | 0.672      |
| dot_recall@3        | 0.842      |
| dot_recall@5        | 0.877      |
| dot_recall@10       | 0.914      |
| dot_ndcg@10         | 0.8005     |
| dot_mrr@10          | 0.7634     |
| dot_map@100         | 0.7662     |

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: epoch
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 4
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `tf32`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 4
- `eval_accumulation_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs

| Epoch  | Step | Training Loss | loss   | trivia-anchor-positive-dev_cosine_map@100 |
|:------:|:----:|:-------------:|:------:|:-----------------------------------------:|
| 0      | 0    | -             | -      | 0.7809                                     |
| 0.0710 | 10   | 0.1474        | -      | -                                          |
| 0.1421 | 20   | 0.1112        | -      | -                                          |
| 0.2131 | 30   | 0.0828        | -      | -                                          |
| 0.2842 | 40   | 0.0767        | -      | -                                          |
| 0.3552 | 50   | 0.0575        | -      | -                                          |
| 0.4263 | 60   | 0.0614        | -      | -                                          |
| 0.4973 | 70   | 0.0542        | -      | -                                          |
| 0.5684 | 80   | 0.0566        | -      | -                                          |
| 0.6394 | 90   | 0.068         | -      | -                                          |
| 0.7105 | 100  | 0.072         | -      | -                                          |
| 0.7815 | 110  | 0.0872        | -      | -                                          |
| 0.8526 | 120  | 0.0654        | -      | -                                          |
| 0.9236 | 130  | 0.0793        | -      | -                                          |
| 0.9947 | 140  | 0.0563        | -      | -                                          |
| 0.0710 | 10   | 0.0222        | -      | -                                          |
| 0.1421 | 20   | 0.0096        | -      | -                                          |
| 0.2131 | 30   | 0.0093        | -      | -                                          |
| 0.2842 | 40   | 0.0106        | -      | -                                          |
| 0.3552 | 50   | 0.0078        | -      | -                                          |
| 0.4263 | 60   | 0.0099        | -      | -                                          |
| 0.4973 | 70   | 0.01          | -      | -                                          |
| 0.5684 | 80   | 0.0166        | -      | -                                          |
| 0.6394 | 90   | 0.0272        | -      | -                                          |
| 0.7105 | 100  | 0.041         | -      | -                                          |
| 0.7815 | 110  | 0.0677        | -      | -                                          |
| 0.8526 | 120  | 0.0539        | -      | -                                          |
| 0.9236 | 130  | 0.074         | -      | -                                          |
| 0.9947 | 140  | 0.0484        | -      | 0.7792                                     |
| 0.0710 | 10   | 0.0028        | -      | -                                          |
| 0.1421 | 20   | 0.0026        | -      | -                                          |
| 0.2131 | 30   | 0.0021        | -      | -                                          |
| 0.2842 | 40   | 0.0075        | -      | -                                          |
| 0.3552 | 50   | 0.0021        | -      | -                                          |
| 0.4263 | 60   | 0.0026        | -      | -                                          |
| 0.4973 | 70   | 0.0028        | -      | -                                          |
| 0.5684 | 80   | 0.004         | -      | -                                          |
| 0.6394 | 90   | 0.006         | -      | -                                          |
| 0.7105 | 100  | 0.0137        | -      | -                                          |
| 0.7815 | 110  | 0.0449        | -      | -                                          |
| 0.8526 | 120  | 0.0433        | -      | -                                          |
| 0.9236 | 130  | 0.0693        | -      | -                                          |
| 0.9947 | 140  | 0.0451        | 0.0405 | 0.7751                                     |
| 0.0710 | 10   | 0.0009        | -      | -                                          |
| 0.1421 | 20   | 0.0022        | -      | -                                          |
| 0.2131 | 30   | 0.0007        | -      | -                                          |
| 0.2842 | 40   | 0.001         | -      | -                                          |
| 0.3552 | 50   | 0.0009        | -      | -                                          |
| 0.4263 | 60   | 0.0009        | -      | -                                          |
| 0.4973 | 70   | 0.0011        | -      | -                                          |
| 0.5684 | 80   | 0.0015        | -      | -                                          |
| 0.6394 | 90   | 0.0019        | -      | -                                          |
| 0.7105 | 100  | 0.0037        | -      | -                                          |
| 0.7815 | 110  | 0.0229        | -      | -                                          |
| 0.8526 | 120  | 0.0318        | -      | -                                          |
| 0.9236 | 130  | 0.0661        | -      | -                                          |
| 0.9947 | 140  | 0.0451        | -      | 0.7662                                     |

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.42.1
- PyTorch: 2.3.0+cu121
- Accelerate: 0.31.0
- Datasets: 2.20.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
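The non-default hyperparameters listed under Training Details map fairly directly onto the Sentence Transformers v3 trainer API. The following is a minimal, unofficial sketch of that mapping, not the author's actual training script: the card does not publish the training dataset, so the two anchor/positive pairs below are hypothetical placeholders, `MultipleNegativesRankingLoss` is assumed from the card's citation section, and `bf16`/`tf32` assume an Ampere-class GPU.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Base model named in the card.
model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Hypothetical placeholder pairs; the real trivia anchor/positive data is not released here.
train_dataset = Dataset.from_dict({
    "anchor": [
        "The Azores island group is administered by which country?",
        "Anglophobia is the fear of which country and its people?",
    ],
    "positive": [
        "The Azores are an autonomous region of Portugal.",
        "Anglophobia is the dislike or fear of England, its people, or its culture.",
    ],
})

# In-batch-negatives loss, as cited in the card's BibTeX section.
loss = MultipleNegativesRankingLoss(model)

# Non-default hyperparameters copied from "Training Hyperparameters" above.
args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-en-trivia-anchor-positive",
    num_train_epochs=1,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate in-batch negatives
)

trainer = SentenceTransformerTrainer(
    model=model, args=args, train_dataset=train_dataset, loss=loss
)
trainer.train()
```

The card also sets `eval_strategy: epoch`; wiring that up additionally requires an `eval_dataset` (and optionally an `InformationRetrievalEvaluator` like the one used in the Evaluation section), which is omitted here to keep the sketch self-contained.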
[ "TEXT_CLASSIFICATION", "TRANSLATION" ]
[ "MEDAL" ]
Non_BioNLP
fine-tuned/SciFact-32000-384-gpt-4o-2024-05-13-66747460
fine-tuned
feature-extraction
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "en", "dataset:fine-tuned/SciFact-32000-384-gpt-4o-2024-05-13-66747460", "dataset:allenai/c4", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,717
1,717
6
0
---
datasets:
- fine-tuned/SciFact-32000-384-gpt-4o-2024-05-13-66747460
- allenai/c4
language:
- en
license: apache-2.0
pipeline_tag: feature-extraction
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
---

This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5). No target use case was recorded by the author.

## How to Use

This model can be integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

model = SentenceTransformer(
    'fine-tuned/SciFact-32000-384-gpt-4o-2024-05-13-66747460',
    trust_remote_code=True
)
embeddings = model.encode([
    'first text to embed',
    'second text to embed'
])
print(cos_sim(embeddings[0], embeddings[1]))
```
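Beyond pairwise similarity, the same embeddings support retrieval. Below is a sketch using `sentence_transformers.util.semantic_search`; the claim and abstract texts are illustrative placeholders in the spirit of the SciFact task, not samples from the training data:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import semantic_search

model = SentenceTransformer(
    'fine-tuned/SciFact-32000-384-gpt-4o-2024-05-13-66747460',
    trust_remote_code=True
)

# Placeholder corpus of abstract-style sentences.
corpus = [
    'Vitamin D supplementation was associated with reduced fracture risk in the trial cohort.',
    'The study reports no significant link between coffee intake and arrhythmia.',
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

# A claim-style query.
query_embeddings = model.encode(
    ['Vitamin D reduces the risk of bone fractures.'],
    convert_to_tensor=True
)

# Rank corpus entries by cosine similarity to the query.
hits = semantic_search(query_embeddings, corpus_embeddings, top_k=2)
print(hits)
```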
[ "TEXT_CLASSIFICATION" ]
[ "SCIFACT" ]
Non_BioNLP
RichardErkhov/M4-ai_-_tau-0.5B-awq
RichardErkhov
null
[ "safetensors", "qwen2", "4-bit", "awq", "region:us" ]
1,732
1,732
4
0
---
{}
---

Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

tau-0.5B - AWQ
- Model creator: https://huggingface.co/M4-ai/
- Original model: https://huggingface.co/M4-ai/tau-0.5B/

Original model description:
---
license: other
datasets:
- Locutusque/UltraTextbooks-2.0
inference:
  parameters:
    do_sample: true
    temperature: 0.8
    top_p: 0.95
    top_k: 40
    max_new_tokens: 250
    repetition_penalty: 1.1
language:
- en
- zh
---

# tau-0.5B

## Model Details
- **Model Name:** tau-0.5B
- **Base Model:** Qwen1.5-0.5B
- **Dataset:** UltraTextbooks-2.0
- **Model Size:** 0.5B parameters
- **Model Type:** Language Model
- **Training Procedure:** Further pre-training of Qwen1.5-0.5B on UltraTextbooks-2.0.

## Model Use

tau-0.5B is designed as a general-purpose language model with enhanced capabilities in machine learning, mathematics, and coding. It can be used for a wide range of natural language processing tasks, such as:

- Educational question answering
- Text summarization
- Content generation for educational purposes
- Code understanding and generation
- Mathematical problem solving

The model's exposure to the diverse content of the UltraTextbooks-2.0 dataset makes it particularly well suited to applications in educational technology and research. A minimal inference sketch using the card's recommended sampling parameters follows the prose sections below.

## Training Data

tau-0.5B was further pre-trained on the UltraTextbooks-2.0 dataset, an expanded version of the original UltraTextbooks dataset. UltraTextbooks-2.0 incorporates additional high-quality synthetic and human-written textbooks from various sources on the Hugging Face platform, with a focus on increasing the diversity of content in the domains of machine learning, mathematics, and coding.

For more details on the dataset, please refer to the [UltraTextbooks-2.0 Dataset Card](https://huggingface.co/datasets/Locutusque/UltraTextbooks-2.0).

## Performance and Limitations

See the [Evaluation](#evaluation) section below for benchmark results. Note that the model may still exhibit biases or inaccuracies present in the training data. Users are encouraged to critically evaluate the model's outputs and report any issues to facilitate continuous improvement.

## Environmental Impact

The training of tau-0.5B required computational resources that contribute to the model's overall environmental impact. However, efforts were made to optimize the training process and minimize the carbon footprint.

## Ethical Considerations

tau-0.5B was trained on a diverse dataset that may contain biases and inaccuracies. Users should be aware of these potential limitations and use the model responsibly. The model should not be used for tasks that could cause harm or discriminate against individuals or groups.
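The sketch below is an unofficial illustration of loading this AWQ checkpoint through `transformers` and generating with the sampling parameters from the card's `inference: parameters:` block. It assumes the `autoawq` package is installed so that `transformers` can load AWQ weights, and the prompt is a hypothetical placeholder matching the card's suggested educational use cases.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# AWQ repo from this card; assumes `pip install autoawq` so the quantized weights load.
model_id = "RichardErkhov/M4-ai_-_tau-0.5B-awq"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype=torch.float16
)

# Hypothetical educational prompt.
prompt = "Explain the difference between supervised and unsupervised learning in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling parameters taken from the card's YAML header.
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    top_k=40,
    max_new_tokens=250,
    repetition_penalty=1.1,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```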
## Evaluation

| Tasks | Version | Filter | n-shot | Metric | Value | | Stderr |
|----------------------------------|---------|--------|-------:|----------|-------:|---|-------:|
| agieval_nous | N/A | none | 0 | acc | 0.2235 | ± | 0.0434 |
| | | none | 0 | acc_norm | 0.2141 | ± | 0.0498 |
| - agieval_aqua_rat | 1 | none | 0 | acc | 0.1417 | ± | 0.0219 |
| | | none | 0 | acc_norm | 0.1535 | ± | 0.0227 |
| - agieval_logiqa_en | 1 | none | 0 | acc | 0.2796 | ± | 0.0176 |
| | | none | 0 | acc_norm | 0.3118 | ± | 0.0182 |
| - agieval_lsat_ar | 1 | none | 0 | acc | 0.2000 | ± | 0.0264 |
| | | none | 0 | acc_norm | 0.1696 | ± | 0.0248 |
| - agieval_lsat_lr | 1 | none | 0 | acc | 0.2275 | ± | 0.0186 |
| | | none | 0 | acc_norm | 0.2020 | ± | 0.0178 |
| - agieval_lsat_rc | 1 | none | 0 | acc | 0.1487 | ± | 0.0217 |
| | | none | 0 | acc_norm | 0.1561 | ± | 0.0222 |
| - agieval_sat_en | 1 | none | 0 | acc | 0.2330 | ± | 0.0295 |
| | | none | 0 | acc_norm | 0.2039 | ± | 0.0281 |
| - agieval_sat_en_without_passage | 1 | none | 0 | acc | 0.2524 | ± | 0.0303 |
| | | none | 0 | acc_norm | 0.1942 | ± | 0.0276 |
| - agieval_sat_math | 1 | none | 0 | acc | 0.2227 | ± | 0.0281 |
| | | none | 0 | acc_norm | 0.1682 | ± | 0.0253 |

| Tasks | Version | Filter | n-shot | Metric | Value | | Stderr |
|----------------------------------------|---------|------------------|-------:|-------------|-------:|---|-------:|
| truthfulqa | 2 | none | 0 | acc | 0.3931 | ± | 0.0143 |
| mmlu | N/A | none | 0 | acc | 0.3642 | ± | 0.0040 |
| - humanities | N/A | none | 5 | acc | 0.3320 | ± | 0.0068 |
| - formal_logic | 0 | none | 5 | acc | 0.2619 | ± | 0.0393 |
| - high_school_european_history | 0 | none | 5 | acc | 0.4909 | ± | 0.0390 |
| - high_school_us_history | 0 | none | 5 | acc | 0.4167 | ± | 0.0346 |
| - high_school_world_history | 0 | none | 5 | acc | 0.4641 | ± | 0.0325 |
| - international_law | 0 | none | 5 | acc | 0.5537 | ± | 0.0454 |
| - jurisprudence | 0 | none | 5 | acc | 0.4167 | ± | 0.0477 |
| - logical_fallacies | 0 | none | 5 | acc | 0.2638 | ± | 0.0346 |
| - moral_disputes | 0 | none | 5 | acc | 0.3757 | ± | 0.0261 |
| - moral_scenarios | 0 | none | 5 | acc | 0.2402 | ± | 0.0143 |
| - philosophy | 0 | none | 5 | acc | 0.3794 | ± | 0.0276 |
| - prehistory | 0 | none | 5 | acc | 0.3426 | ± | 0.0264 |
| - professional_law | 0 | none | 5 | acc | 0.3103 | ± | 0.0118 |
| - world_religions | 0 | none | 5 | acc | 0.2807 | ± | 0.0345 |
| - other | N/A | none | 5 | acc | 0.4071 | ± | 0.0088 |
| - business_ethics | 0 | none | 5 | acc | 0.4200 | ± | 0.0496 |
| - clinical_knowledge | 0 | none | 5 | acc | 0.4491 | ± | 0.0306 |
| - college_medicine | 0 | none | 5 | acc | 0.3873 | ± | 0.0371 |
| - global_facts | 0 | none | 5 | acc | 0.3600 | ± | 0.0482 |
| - human_aging | 0 | none | 5 | acc | 0.3498 | ± | 0.0320 |
| - management | 0 | none | 5 | acc | 0.4854 | ± | 0.0495 |
| - marketing | 0 | none | 5 | acc | 0.5470 | ± | 0.0326 |
| - medical_genetics | 0 | none | 5 | acc | 0.4000 | ± | 0.0492 |
| - miscellaneous | 0 | none | 5 | acc | 0.4291 | ± | 0.0177 |
| - nutrition | 0 | none | 5 | acc | 0.4183 | ± | 0.0282 |
| - professional_accounting | 0 | none | 5 | acc | 0.3582 | ± | 0.0286 |
| - professional_medicine | 0 | none | 5 | acc | 0.3015 | ± | 0.0279 |
| - virology | 0 | none | 5 | acc | 0.3494 | ± | 0.0371 |
| - social_sciences | N/A | none | 5 | acc | 0.4075 | ± | 0.0088 |
| - econometrics | 0 | none | 5 | acc | 0.2719 | ± | 0.0419 |
| - high_school_geography | 0 | none | 5 | acc | 0.5000 | ± | 0.0356 |
| - high_school_government_and_politics | 0 | none | 5 | acc | 0.4611 | ± | 0.0360 |
| - high_school_macroeconomics | 0 | none | 5 | acc | 0.4051 | ± | 0.0249 |
| - high_school_microeconomics | 0 | none | 5 | acc | 0.3908 | ± | 0.0317 |
| - high_school_psychology | 0 | none | 5 | acc | 0.4239 | ± | 0.0212 |
| - human_sexuality | 0 | none | 5 | acc | 0.3893 | ± | 0.0428 |
| - professional_psychology | 0 | none | 5 | acc | 0.3399 | ± | 0.0192 |
| - public_relations | 0 | none | 5 | acc | 0.4455 | ± | 0.0476 |
| - security_studies | 0 | none | 5 | acc | 0.3510 | ± | 0.0306 |
| - sociology | 0 | none | 5 | acc | 0.5174 | ± | 0.0353 |
| - us_foreign_policy | 0 | none | 5 | acc | 0.5500 | ± | 0.0500 |
| - stem | N/A | none | 5 | acc | 0.3276 | ± | 0.0083 |
| - abstract_algebra | 0 | none | 5 | acc | 0.3000 | ± | 0.0461 |
| - anatomy | 0 | none | 5 | acc | 0.2889 | ± | 0.0392 |
| - astronomy | 0 | none | 5 | acc | 0.3487 | ± | 0.0388 |
| - college_biology | 0 | none | 5 | acc | 0.3403 | ± | 0.0396 |
| - college_chemistry | 0 | none | 5 | acc | 0.2600 | ± | 0.0441 |
| - college_computer_science | 0 | none | 5 | acc | 0.3800 | ± | 0.0488 |
| - college_mathematics | 0 | none | 5 | acc | 0.3300 | ± | 0.0473 |
| - college_physics | 0 | none | 5 | acc | 0.2745 | ± | 0.0444 |
| - computer_security | 0 | none | 5 | acc | 0.4300 | ± | 0.0498 |
| - conceptual_physics | 0 | none | 5 | acc | 0.3447 | ± | 0.0311 |
| - electrical_engineering | 0 | none | 5 | acc | 0.3931 | ± | 0.0407 |
| - elementary_mathematics | 0 | none | 5 | acc | 0.3095 | ± | 0.0238 |
| - high_school_biology | 0 | none | 5 | acc | 0.4161 | ± | 0.0280 |
| - high_school_chemistry | 0 | none | 5 | acc | 0.2759 | ± | 0.0314 |
| - high_school_computer_science | 0 | none | 5 | acc | 0.3100 | ± | 0.0465 |
| - high_school_mathematics | 0 | none | 5 | acc | 0.3185 | ± | 0.0284 |
| - high_school_physics | 0 | none | 5 | acc | 0.2517 | ± | 0.0354 |
| - high_school_statistics | 0 | none | 5 | acc | 0.3009 | ± | 0.0313 |
| - machine_learning | 0 | none | 5 | acc | 0.3036 | ± | 0.0436 |
| medqa_4options | Yaml | none | 5 | acc | 0.2687 | ± | 0.0124 |
| | | none | 5 | acc_norm | 0.2687 | ± | 0.0124 |
| logieval | 0 | get-answer | 5 | exact_match | 0.3505 | ± | 0.0120 |
| gsm8k_cot | 3 | strict-match | 8 | exact_match | 0.0690 | ± | 0.0070 |
| | | flexible-extract | 8 | exact_match | 0.1365 | ± | 0.0095 |

| Tasks | Version | Filter | n-shot | Metric | Value | | Stderr |
|---------------|---------|--------|-------:|----------|-------:|---|-------:|
| arc_easy | 1 | none | 25 | acc | 0.5981 | ± | 0.0101 |
| | | none | 25 | acc_norm | 0.5939 | ± | 0.0101 |
| arc_challenge | 1 | none | 25 | acc | 0.2688 | ± | 0.0130 |
| | | none | 25 | acc_norm | 0.2969 | ± | 0.0134 |

## Usage Rights

Make sure to read Qwen's license before using this model.
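## Quick Start (sketch)

AWQ checkpoints like this one can usually be loaded directly through `transformers` once the `autoawq` package is installed. The snippet below is a sketch under that assumption; the prompt is only an example, and the sampling settings mirror the inference parameters listed above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/M4-ai_-_tau-0.5B-awq"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# transformers picks up the AWQ quantization config from the repo.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Explain gradient descent in simple terms.", return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    top_k=40,
    max_new_tokens=250,
    repetition_penalty=1.1,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```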
[ "QUESTION_ANSWERING", "SUMMARIZATION" ]
[ "MEDQA" ]
Non_BioNLP
adriansanz/SITGES-bge-FT2
adriansanz
sentence-similarity
[ "sentence-transformers", "safetensors", "xlm-roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:237", "loss:BatchAllTripletLoss", "arxiv:1908.10084", "arxiv:1703.07737", "base_model:BAAI/bge-m3", "base_model:finetune:BAAI/bge-m3", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,723
1,723
5
0
--- base_model: BAAI/bge-m3 datasets: [] language: [] library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:237 - loss:BatchAllTripletLoss widget: - source_sentence: 'El Viver dels Avis de Sitges. Activitat d''hort municipal per a la gent gran A la nostra vila hi ha veïns i veïnes que els agradaria tornar a fer de pagès o provar-ho per primera vegada. Potser molts d’ells enyoren el contacte amb la terra i voldrien tenir un petit hort per dedicar-li un parell d’hores cada dia i poder seguir el cicle natural de plantar, regar i recollir els fruits de la terra, gaudint així d’un entorn on la naturalesa és generosa amb qui la treballa. Aquest tipus d’activitat ha demostrat beneficis terapèutics i eugenèsics entre els seus principals destinataris: la gent gran. Al nostre municipi tenim la sort de comptar amb un ampli espai públic com és el viver municipal. Dins d''aquest viver s''hi han habilitat 10 parcel·les sobre una superfície de 300 m2.' sentences: - Acceptació / Renúncia. Ajuts per al projecte d'implantació i l'ús de la targeta de fidelització del comerç local de Sitges Descripció - Justificació Subvencions per a projectes i activitats de les entitats esportives i esportistes de Sitges Les persones i entitats beneficiaries hauran de justificar la realització del projecte/activitat subvencionada com a màxim el dia 31 de març de 2023. - Ajuts per les despeses d'instal·lació de mesures higièniques i de seguretat per al retorn a l'activitat comercial d'establiments físics (COVID-19) Són ajuts econòmics extraordinaris destinats a col·laborar amb la despesa que suposa la implementació de mesures higièniques de prevenció, protecció i mesures de seguretat per a la reobertura dels establiments comercials i la tornada a l’activitat econòmica d’aquests establiments físics. Únicament es prendran en consideració per a l’atorgament de l’ajut la compra de material fungible i les inversions per a la instal·lació de mesures higièniques i de seguretat relacionades amb la gestió i prevenció de la pandèmia COVID-19 d’acord amb l’annex 1 de les Bases que regulen l'atorgament de l'ajut. L’àmbit temporal de l’ajut econòmic extraordinari tindrà caràcter retroactiu al moment de la declaració de l’estat d’alarma; seran despeses finançables totes aquelles que s’hagin produït des de la declaració de l’estat d’alarma i fins la data de finalització el període de presentació de sol·licituds. L’import finançable serà el 100% del cost de compra del material fungible i d’inversió excepte l’IVA de la despesa que no formarà part de l’ajut econòmic extraordinari. L’import màxim de l’ajut econòmic extraordinari anirà en funció del nombre de persona beneficiaris/ries amb dret a l’ajut, entre un mínim de 500 € i un màxim de 3.000 €. - source_sentence: Justificació de l'ajut a la consolidació d'empreses de persones donades d'alta al règim especial de treballadors autònoms Les persones beneficiàries de l'ajut a la consolidació d'empreses de persones donades d'alta al règim especial de treballadors autònoms. sentences: - Preinscripció a la Fira d'Art de Sitges Amb l’objectiu de fomentar la participació d’artistes plàstics d’arreu de Catalunya, anualment s'organitza la Fira d'Art. Ubicada al carrer de Port Alegre (Platja de Sant Sebastià), els artistes (dibuix, pintura, gravat i escultura) poden exposar i vendre les seves obres. 
- 'Llicència ambiental (Annex II) Mitjançant la Llicència ambiental la persona interessada sol·licita a l’Ajuntament l’inici o modificació substancial d’una activitat econòmica, de les incloses en l’annex II de la Llei 20/2009, de prevenció i control ambiental de les activitats (LPCAA), i hi adjunta el projecte tècnic acreditatiu del compliment dels requisits necessaris que estableix la normativa vigent per a l’exercici de l’activitat. Aquestes activitats se subdivideixen en: Activitats sotmeses a una llicència ambiental amb declaració d’impacte ambiental Activitats sotmeses a una llicència ambiental i a un procés de decisió prèvia sobre la necessitat de declaració d’impacte ambiental i a avaluació d’impacte ambiental simplificada Activitats sotmeses a una llicència ambiental sense necessitat de sotmetre’s a cap procés d’avaluació d’impacte ambiental També està subjecta a llicència ambiental la modificació substancial de qualsevol activitat de l’annex II de la LPCAA, amb els mateixos procediments, documentació i requisits que els establerts per al seu atorgament. Amb aquest formulari no es poden comunicar els establiments turístics (càmpings de fins a 1500 unitats d’acamapada).' - 'Servei de teleassistència El sistema de teleassistència o telealarma consisteix en un dispositiu que es facilita a persones grans o discapacitades, que viuen soles permanentment o durant gran part del dia, o bé que viuen amb altres persones que presenten idèntiques característiques d''edat o discapacitat. Aquest sistema permet: Connectar fàcilment amb la central receptora d’alarmes les 24 hores del dia, els 365 dies de l’any facilitant la connexió immediata la línia telefònica i d’una manera còmoda i ràpida. Només cal prémer un botó. Sistema de mans lliures, que permet poder establir contacte verbal sense necessitat de despenjar cap telèfon ni d’acostar-se al terminal teleassistència. Mobilització dels recursos que existeixen a la localitat, mitjançant un fitxer actualitzat per avís d’ambulància o metge en cas d’urgència i coordinació amb els recursos de la comunitat per a l’atenció d’emergències socials, caigudes,.... Mobilització dels recursos propis de l’usuari. Custòdia de claus Etc. Donat que la disponibilitat d''aparells és limitada, les sol·licituds es prioritzaran en funció del grau de necessitat de l''usuari/ària. A aquests efectes es valorarà per part dels Serveis Socials municipals el grau d''autonomia personal, la situació de solitud i els ingressos de la unitat familiar (vegeu l''ordenança reguladora del preu públic).' - source_sentence: Instal·lació de parada a la Fira de la Vila del Llibre de Sitges L'Ajuntament de Sitges, sota el paraigua de la marca cultural registrada Vila del Llibre, organitza la Fira de la Vila del Llibre de Sitges consistent en un conjunt de parades instal·lades al Passeig Marítim, dedicades exclusivament a la venda de llibres i activitats relacionades amb les arts del llibre (il·lustració, enquadernació, gravat…), ocupades per empreses del sector i entitats culturals, amb activitat editorial acreditada. 
sentences: - Queixes, observacions i suggeriments Descripció - Confirmació de continuïtat de residència al municipi de persones estrangeres no obligades a renovar la seva inscripció padronal Les persones estrangeres amb ciutadania d'estats de la Unió Europea, o de l'Espai Econòmic Europeu, o amb targeta de residència de règim comunitari o de llarga durada, estan obligades a comunicar la seva continuïtat de residència al municipi de Sitges cada cinc anys, o cada dos en cas de no constar inscrites al Registre Central d'Estrangers, a comptar des de la darrera inscripció padronal. La no confirmació durant el període establert suposa l'inici d'un expedient de baixa en el Padró Municipal d'Habitants. - 'Llicència d''obra menor La realització d’obres està subjecta a l’obtenció d’una llicència atorgada per l’Ajuntament. S’estableixen tres tipus de llicència segons la magnitud de l’obra a realitzar: TIPUS A Construcció de piscines (comunitàries o particulars) Reparació / rehabilitació d’edificis i façanes en general i especialment d’edificis afectats per patologies Modificació de la coberta dels edificis amb augment de volum però sense augment de superfície construïda Actuacions puntuals que afectin o alterin l’estructura i / o fonaments de l’edifici Obres que modifiquin les instal·lacions o serveis dels espais comuns d’un edifici plurifamiliar Moviments de terres no inclosos en altres llicències Enderrocs parcials Murs de contenció de terres Formació de lavabos en locals comercials i magatzems Instal·lació d’aparells elevadors, ascensors i aparells mecànics en edificacions existents L''acumulació de residus i el dipòsit de materials que alterin les característiques del paisatge. Construcció o instal·lació de cisternes que afectin l''estat de càrregues de l''edifici. Canvis de distribució puntual interior (en locals i habitatges) sense afectar elements estructurals. TIPUS B Col·locació de bastides a una alçada superior a PB + 1 PP o a més de 6,00 m Arrebossat, estucat i pintat de façanes que necessiten una bastida amb una alçada superior a PB + 1 PP o a més de 6,00 m. Noves obertures ( finestres o portes ) o modificacions de les dimensions existents Reparació de balcons o elements sortints Construcció d’envans pluvials Construcció de pous i foses sèptiques Estintolament de façanes Construcció o modificació de tanques que requereixin obra. Reparació de sostres i terrats sense afectar elements estructurals. TIPUS C Obertures per a tub extractor Instal·lació d''aparells d''aire condicionat o d''altres similars Instal·lació d''antenes parabòl·liques Formació de barbacoes Col·locació de portes, finestres, persianes i reixes en obertures de façana Co·locació i/o canvi de paviments i escales a l''exterior de l''edifici Arrebossat, estucat i pintat de façanes que no necessiten una bastida amb una alçada inferior a PB + 1 PP o menys de 6.00 m Construcció, reparació i substitució de canonades de desguàs i claveguerons a l''exterior de l''edifici (sense bastida). Tala d''arbres' - source_sentence: 'Ajuts per a fomentar la contractació laboral de persones i millora de l''ocupació Els ajuts tenen com a objectiu millorar l''ocupabilitat i la inserció de persones en situació d''atur o parades incentivant la contractació de qualitat. 
Podran sol·licitar l''ajut aquelles persones físiques o jurídiques, persones autònomes o empreses, amb seu fiscal al municipi o fora però amb centre de treball a Sitges, i entitats sense ànim de lucre del municipi de Sitges també legalment constituïdes i inscrites en els registres pertinents, que hagin realitzat contractacions de personal per compte d''altri durant el període de l''1 de juliol de 2023 al 30 de juny de 2024. Resten fora d’aquesta convocatòria les empreses de treball temporal. Els contractes que donaran dret a ser declarada beneficiària de l’ajut seran els formalitzats des del seu inici com a contractes indefinits o fixes discontinus o bé per conversió de contractes temporals en contractes indefinits o fixes discontinus. Queden exclosos els contractes d’alta direcció i les contractacions a familiars: a cònjuges, ascendents, descendents i parents fins a segon grau. Únicament es prendran en consideració per a l’ajut econòmic les despeses derivades de la contractació de personal (retribucions i quotes empresarials a la seguretat social). Les quanties dels ajuts no podran excedir del 50 % dels costos derivats de la contractació. S''estableixen els seguents imports màxims a percebre segons les modalitats de contractació: De 3.000,00 € per als contractes de treball indefinits, fixos discontinus o conversió de contractes temporals a indefinits amb jornada de treball del 100%, els quals la persona contractada estigui inclosa dins del col·lectius vulnerables pel Servei Públic d''Ocupació Estatal (SEPE). De 2.000,00 € per als contractes de treball indefinits, fixos discontinus o conversió de contractes temporals a indefinits amb jornada de treball del 100% per a la resta de col·lectius. L’import es reduirà proporcionalment per aquells contractes celebrats com a fixes discontinus en funció del percentatge d’activitat econòmica feta durant l’any natural. Igualment es reduirà l’import per aquells contractes celebrats a temps parcial. En ambdós casos el percentatge per poder optar a l’ajut serà el resultant d’aplicar el percentatge d’activitat econòmica com el percentatge per temps parcial, i haurà de ser igual o superior al 50,00 %. Només es poden presentar dues contractacions En cas que dues contractacions donin dret a l''ajut econòmic, l''import màxim a percebre per a totes les contractacions serà de 3.000,00 €.' sentences: - Acceptació / Renúncia Ajuts per a la creació de noves empreses per persones donades d'alta al règim especial de treballadors autònoms Descripció - Comunicació prèvia de primera utilització i ocupació d'edificis i instal·lacions Aquest tràmit permet comunicar a l'Ajuntament de Sitges la finalització de les obres de nova construcció, o bé aquelles que hagin estat objecte de modificació substancial o d'ampliació quan per a l’autorització de les obres s’hagi exigit un projecte tècnic i a l’empara d’una llicència urbanística d’obra major. Simultàniament, s'acordarà el retorn de la quantia en concepte de garanties o avals dipositats, si escau. - 'Ajuts per a fomentar l''emprenedoria i la creació de noves empreses Són ajuts destinats únicament a cobrir les despeses inicials necessàries per a la posada en marxa del negoci. Les despeses subvencionables seran únicament aquelles estrictament necessàries per a la posada en marxa del negoci com ara: despeses de constitució, reformes del local, inversió inicial en tecnologia, desenvolupament de la web corporativa, desenvolupament d’aplicacions de venda on line, fiança, assegurances, registre de marques i patents, ... 
L’import de la subvenció serà com a màxim el 80% de la factura presentada, excepte l’IVA de la despesa que no formarà part de la despesa finançable, amb un import màxim de l’ajut de 6.000,00 €. Amb aquest ajut es vol incentivar l’autoocupació i la creació d’empreses donant suport a les persones que desenvolupin la seva activitat professional al municipi de Sitges, les quals hagin iniciat la seva activitat econòmica entre l’1 de juliol de 2023 i fins el 30 de juny de 2024.' - source_sentence: Acceptació / Renúncia Subvencions per a projectes i activitats a entitats de l'àmbit de polítiques socials Descripció sentences: - 'Subvencions per al desenvolupament i/o consolidació de sectors econòmics del municipi Subvencions per a entitats destinades a fomentar el desenvolupament i la consolidació de sectors econòmics locals. L''objectiu és impulsar iniciatives per millorar la competitivitat, la generació d''ocupació i potenciar el naixement de nous sectors econòmics en el municipi i l’enfortiment dels existents, contribuint així al creixement econòmic sostenible i al benestar de la comunitat. Per valorar l’interès de la proposta es tindrà en compte: Tipus d’activitat Antecedents Dates de celebració Accions de promoció dutes a terme des de l’organització' - Autorització d'accés a les àrees de vianants Permet obtenir l'autorització municipal per l'accés de vehicles a les àrees restringides a vianants establer-tes al municipi (actualment nucli de Garraf i Platja de Sant Sebastià). Les persones interessades poden presentar aquesta sol·lictud, i en cas de compliment dels requisits establerts (persones residents, titulars de plaça d'aparcament, autotaxis, establiments hotelers), se'ls traslladarà la resolució d’autorització. - Declaració de baixa de la Taxa pel servei municipal complementari de recollida, tractament i eliminació de residus comercials Declaració tributària mitjançant la qual es sol·licita la baixa d'una activitat de la Taxa pel servei municipal complementari de recollida, tractament i eliminació de residus comercials . --- # SentenceTransformer based on BAAI/bge-m3 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("adriansanz/sitges1024-bai-batchalltriplets")
# Run inference
sentences = [
    "Acceptació / Renúncia Subvencions per a projectes i activitats a entitats de l'àmbit de polítiques socials Descripció",
    "Subvencions per al desenvolupament i/o consolidació de sectors econòmics del municipi Subvencions per a entitats destinades a fomentar el desenvolupament i la consolidació de sectors econòmics locals. L'objectiu és impulsar iniciatives per millorar la competitivitat, la generació d'ocupació i potenciar el naixement de nous sectors econòmics en el municipi i l’enfortiment dels existents, contribuint així al creixement econòmic sostenible i al benestar de la comunitat. Per valorar l’interès de la proposta es tindrà en compte: Tipus d’activitat Antecedents Dates de celebració Accions de promoció dutes a terme des de l’organització",
    "Autorització d'accés a les àrees de vianants Permet obtenir l'autorització municipal per l'accés de vehicles a les àrees restringides a vianants establer-tes al municipi (actualment nucli de Garraf i Platja de Sant Sebastià). Les persones interessades poden presentar aquesta sol·lictud, i en cas de compliment dels requisits establerts (persones residents, titulars de plaça d'aparcament, autotaxis, establiments hotelers), se'ls traslladarà la resolució d’autorització.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!-- ### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!-- ### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 237 training samples * Columns: <code>sentence</code> and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence | label | |:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | type | string | int | | details | <ul><li>min: 13 tokens</li><li>mean: 135.46 tokens</li><li>max: 629 tokens</li></ul> | <ul><li>286: ~0.42%</li><li>288: ~0.42%</li><li>290: ~0.42%</li><li>291: ~0.42%</li><li>293: ~0.42%</li><li>295: ~0.42%</li><li>298: ~0.42%</li><li>302: ~0.42%</li><li>303: ~0.42%</li><li>304: ~0.42%</li><li>306: ~0.42%</li><li>309: ~0.42%</li><li>311: ~0.42%</li><li>313: ~0.42%</li><li>314: ~0.42%</li><li>315: ~0.42%</li><li>316: ~0.42%</li><li>320: ~0.42%</li><li>321: ~0.42%</li><li>322: ~0.42%</li><li>323: ~0.42%</li><li>324: ~0.42%</li><li>325: ~0.42%</li><li>327: ~0.42%</li><li>328: ~0.42%</li><li>331: ~0.42%</li><li>332: ~0.42%</li><li>333: ~0.42%</li><li>336: ~0.42%</li><li>338: ~0.42%</li><li>339: ~0.42%</li><li>346: ~0.42%</li><li>347: ~0.42%</li><li>355: ~0.42%</li><li>356: ~0.42%</li><li>357: ~0.42%</li><li>360: ~0.42%</li><li>361: ~0.42%</li><li>364: ~0.42%</li><li>366: ~0.42%</li><li>367: ~0.42%</li><li>368: ~0.42%</li><li>369: ~0.42%</li><li>370: ~0.42%</li><li>373: ~0.42%</li><li>376: ~0.42%</li><li>378: ~0.42%</li><li>384: ~0.42%</li><li>385: ~0.42%</li><li>386: ~0.42%</li><li>387: ~0.42%</li><li>390: ~0.42%</li><li>394: ~0.42%</li><li>400: ~0.42%</li><li>401: ~0.42%</li><li>405: ~0.42%</li><li>413: ~0.42%</li><li>417: ~0.42%</li><li>418: ~0.42%</li><li>419: ~0.42%</li><li>420: ~0.42%</li><li>422: ~0.42%</li><li>432: ~0.42%</li><li>443: ~0.42%</li><li>452: ~0.42%</li><li>455: ~0.42%</li><li>458: ~0.42%</li><li>463: ~0.42%</li><li>469: ~0.42%</li><li>470: 
~0.42%</li><li>471: ~0.42%</li><li>475: ~0.42%</li><li>478: ~0.42%</li><li>480: ~0.42%</li><li>481: ~0.42%</li><li>485: ~0.42%</li><li>487: ~0.42%</li><li>489: ~0.42%</li><li>491: ~0.42%</li><li>492: ~0.42%</li><li>493: ~0.42%</li><li>494: ~0.42%</li><li>495: ~0.42%</li><li>497: ~0.42%</li><li>500: ~0.42%</li><li>502: ~0.42%</li><li>506: ~0.42%</li><li>522: ~0.42%</li><li>533: ~0.42%</li><li>536: ~0.42%</li><li>547: ~0.42%</li><li>548: ~0.42%</li><li>551: ~0.42%</li><li>553: ~0.42%</li><li>554: ~0.42%</li><li>558: ~0.42%</li><li>559: ~0.42%</li><li>561: ~0.42%</li><li>562: ~0.42%</li><li>563: ~0.42%</li><li>564: ~0.42%</li><li>565: ~0.42%</li><li>566: ~0.42%</li><li>567: ~0.42%</li><li>569: ~0.42%</li><li>570: ~0.42%</li><li>571: ~0.42%</li><li>572: ~0.42%</li><li>573: ~0.42%</li><li>574: ~0.42%</li><li>575: ~0.42%</li><li>576: ~0.42%</li><li>577: ~0.42%</li><li>582: ~0.42%</li><li>584: ~0.42%</li><li>585: ~0.42%</li><li>586: ~0.42%</li><li>587: ~0.42%</li><li>590: ~0.42%</li><li>591: ~0.42%</li><li>592: ~0.42%</li><li>593: ~0.42%</li><li>594: ~0.42%</li><li>595: ~0.42%</li><li>596: ~0.42%</li><li>597: ~0.42%</li><li>598: ~0.42%</li><li>599: ~0.42%</li><li>600: ~0.42%</li><li>601: ~0.42%</li><li>602: ~0.42%</li><li>603: ~0.42%</li><li>604: ~0.42%</li><li>605: ~0.42%</li><li>606: ~0.42%</li><li>607: ~0.42%</li><li>608: ~0.42%</li><li>609: ~0.42%</li><li>610: ~0.42%</li><li>611: ~0.42%</li><li>612: ~0.42%</li><li>614: ~0.42%</li><li>615: ~0.42%</li><li>616: ~0.42%</li><li>617: ~0.42%</li><li>618: ~0.42%</li><li>619: ~0.42%</li><li>620: ~0.42%</li><li>621: ~0.42%</li><li>622: ~0.42%</li><li>623: ~0.42%</li><li>624: ~0.42%</li><li>625: ~0.42%</li><li>626: ~0.42%</li><li>627: ~0.42%</li><li>628: ~0.42%</li><li>629: ~0.42%</li><li>630: ~0.42%</li><li>632: ~0.42%</li><li>633: ~0.42%</li><li>634: ~0.42%</li><li>635: ~0.42%</li><li>636: ~0.42%</li><li>637: ~0.42%</li><li>638: ~0.42%</li><li>639: ~0.42%</li><li>640: ~0.42%</li><li>641: ~0.42%</li><li>642: ~0.42%</li><li>643: ~0.42%</li><li>644: ~0.42%</li><li>645: ~0.42%</li><li>646: ~0.42%</li><li>647: ~0.42%</li><li>648: ~0.42%</li><li>649: ~0.42%</li><li>650: ~0.42%</li><li>651: ~0.42%</li><li>652: ~0.42%</li><li>653: ~0.42%</li><li>654: ~0.42%</li><li>655: ~0.42%</li><li>656: ~0.42%</li><li>657: ~0.42%</li><li>658: ~0.42%</li><li>659: ~0.42%</li><li>660: ~0.42%</li><li>661: ~0.42%</li><li>662: ~0.42%</li><li>663: ~0.42%</li><li>664: ~0.42%</li><li>666: ~0.42%</li><li>667: ~0.42%</li><li>668: ~0.42%</li><li>669: ~0.42%</li><li>670: ~0.42%</li><li>671: ~0.42%</li><li>672: ~0.42%</li><li>673: ~0.42%</li><li>674: ~0.42%</li><li>675: ~0.42%</li><li>676: ~0.42%</li><li>677: ~0.42%</li><li>678: ~0.42%</li><li>679: ~0.42%</li><li>680: ~0.42%</li><li>681: ~0.42%</li><li>682: ~0.42%</li><li>683: ~0.42%</li><li>684: ~0.42%</li><li>685: ~0.42%</li><li>686: ~0.42%</li><li>687: ~0.42%</li><li>688: ~0.42%</li><li>689: ~0.42%</li><li>690: ~0.42%</li><li>691: ~0.42%</li><li>692: ~0.42%</li><li>693: ~0.42%</li><li>694: ~0.42%</li><li>695: ~0.42%</li><li>696: ~0.42%</li><li>697: ~0.42%</li><li>698: ~0.42%</li><li>699: ~0.42%</li><li>700: ~0.42%</li><li>701: ~0.42%</li><li>702: ~0.42%</li><li>703: ~0.42%</li><li>704: ~0.42%</li><li>705: ~0.42%</li><li>706: ~0.42%</li><li>707: ~0.42%</li><li>708: ~0.42%</li><li>709: ~0.42%</li><li>710: ~0.42%</li><li>711: ~0.42%</li></ul> | * Samples: | sentence | label | 
|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------| | <code>Presentació de sol·licituds per a l'atorgament de llicència d'ús privatiu del domini públic local Aquest tràmit permet la presentació de sol·licituds per a l’autorització a favor de tercers perquè utilitzin de forma privativa una porció de domini públic local, amb caràcter temporal i sense la seva transformació, pel seu posterior destí a la realització d’activitats d'interès. En funció del número de sol·licituds presentades en cada convocatòria es procedirà a l'atorgament de la llicència: de forma directa si no hi ha pluralitat de sol·licitants, o mitjançant previ concurs en cas que existeixi una pluralitat de sol·licitants.</code> | <code>647</code> | | <code>Ajuts per fomentar l'associacionisme empresarial local Aquest ajut pretén fomentar l’associacionisme empresarial local, per tal de disposar d’agrupacions, gremis o associacions representatives de l’activitat empresarial del municipi.</code> | <code>636</code> | | <code>Baixa al padró municipal d'habitants (persones estrangeres que marxen del país, o per defunció ...) No es poden realitzar inscripcions de baixa per canvi de municipi o país de residencia a petició de les persones interessades, tret de les persones estrangeres que traslladin la seva residència a un altre país. Les persones amb nacionalitat espanyola que estableixin la residencia en un altra municipi o país hauran de comunicar la inscripció en el Padró del nou municipi de residència o en el Registre de Matrícula de l'Oficina o Secció Consular del país de destinació. El tràmit de baixa del padró municipal d'habitants només es pot sol·lictar en les següents situacions: Persones estrangeres empadronades que traslladen la seva residència a un altre país. Defunció. L'Institut Nacional d'Estadística, a instàncies del Registre Civil, comunica periòdicament les baixes per defunció a l'Ajuntament. 
Si es necessita que aquesta baixa es produeixi a la major brevetat possible, es pot realitzar aquest tràmit aportant el certificat de defunció, o el llibre de família. Inclusió indeguda: Aquesta baixa afecta a persones que figuren empadronades en un domicili i ja no hi resideixen. La persona empadronada, o titular de l'habitatge, pot comunicar aquesta situació, i l'ajuntament comprovarà aquesta circunstancia amb la tramitació de l'expedient corresponent. En el cas que la persona interessada no manifesti expresament la seva conformitat, la baixa només es podrà resoldre amb informe favorable del Consejo de Empadronamiento. L'Ajuntament de Sitges també pot iniciar d'ofici aquests tipus d'expedients.</code> | <code>394</code> | * Loss: [<code>BatchAllTripletLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#batchalltripletloss) ### Training Hyperparameters #### Non-Default Hyperparameters - `per_device_train_batch_size`: 1 - `per_device_eval_batch_size`: 1 - `learning_rate`: 2e-05 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: no - `prediction_loss_only`: True - `per_device_train_batch_size`: 1 - `per_device_eval_batch_size`: 1 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 3 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - 
`hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
| Epoch  | Step | Training Loss |
|:------:|:----:|:-------------:|
| 2.1097 | 500  | 0.0           |

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.42.4
- PyTorch: 2.3.1+cu121
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### BatchAllTripletLoss
```bibtex
@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```

<!-- ## Glossary

*Clearly define terms in order to be accessible across audiences.* -->

<!-- ## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* -->

<!-- ## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
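A minimal finetuning sketch matching the recipe documented above (bge-m3 body, `sentence`/`label` columns, BatchAllTripletLoss). The two inline samples are truncated stand-ins for the real 237-row dataset, so treat this as an illustration rather than the exact training script:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import BatchAllTripletLoss

model = SentenceTransformer("BAAI/bge-m3")

# Illustrative stand-in for the 237-sample dataset described above
# (each row pairs a Catalan procedure description with an integer label).
train_dataset = Dataset.from_dict({
    "sentence": [
        "Ajuts per fomentar l'associacionisme empresarial local ...",
        "Baixa al padró municipal d'habitants ...",
    ],
    "label": [636, 394],
})

loss = BatchAllTripletLoss(model)

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```

Note that batch-all triplet loss only produces a non-zero signal when a batch contains several samples sharing a label; with one sample per label (as in the distribution above), the 0.0 training loss in the logs is expected.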
[ "TEXT_CLASSIFICATION" ]
[ "CAS" ]
Non_BioNLP
ginkgogo/setfit-absa-bge-small-en-v1.5-restaurants-aspect
ginkgogo
text-classification
[ "setfit", "safetensors", "bert", "absa", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:sentence-transformers/all-MiniLM-L6-v2", "base_model:finetune:sentence-transformers/all-MiniLM-L6-v2", "model-index", "region:us" ]
1,711
1,711
4
0
--- base_model: sentence-transformers/all-MiniLM-L6-v2 library_name: setfit metrics: - accuracy pipeline_tag: text-classification tags: - setfit - absa - sentence-transformers - text-classification - generated_from_setfit_trainer widget: - text: variety:I'm not sure what what I would do if I'd never discovered Nikka, since it's the definitely the most authentic ramen one can get in the area. Prices are standard for ramen (especially in SB) and the service is friendly and efficient. Not only is Nikka's ramen amazing, their variety of appetizers is also great. I've yet to try one that I don't like. Definitely come here if you're looking to satisfy your ramen craving! - text: wrap:Pretty good food, just had a wrap and it was delicious pretty much on Mediterranean or Greek style food around here. Petra's who had really good Greek dinners closed - text: goat cheese:I had the Genoa Salami, Kalamata olive tapenade, with roasted red peppers and goat cheese. I ended up going with this sandwich mainly because I am lactose sensitive and found out that goat cheese is supposed to have less lactose making it easier for the stomach to digest. The goat cheese had a nice smooth and creamy flavor and when combined with the olive tapenade really made a burst of flavor in my mouth. I also had the sandwich on Foccaica bread but they also have three other choices of bread to choose from. Overall the sandwich was delicious. I love the simple clean look of the store and it had some inside seating as well as gated outdoor seating. All the staff members seemed very nice and helpful. The one problem I had with Panino is the price. Although I love the sandwich, I do not believe it is worth $12. When I originally looked up the menu on Yelp I was looking at the pictures that were paired by other reviewers and I saw that they were about $10. $10 still expensive but a little more understandable and worth what you're getting. - text: toppings:FINALLY tried Mizza and wasn't disappointed. Loved (almost) everything we ordered, great atmosphere, excellent service, and the perfect setting for a lovely bday Sunday. The burrata & heirloom tomatoes app was scrumptious, the salmon pasta, very flavorful and the salmon perfectly cooked, I liked the toppings of the veggie pizza but wasn't a super fan of the crust (doesn't mean I won't come back and try another pizza on their menu ) and the cannoli was good although that dessert in general isn't my fave (it was my bf's bday so had to get what he wanted ). The flourless chocolate cake and limoncello cake are what I'll try next time. Had a great time and will be back. Gave it 4 stars just cuz I wasn't that excited about the pizza and that's something they're supposed to so well. Would recommend the restaurant though! - text: mouth:I had the Genoa Salami, Kalamata olive tapenade, with roasted red peppers and goat cheese. I ended up going with this sandwich mainly because I am lactose sensitive and found out that goat cheese is supposed to have less lactose making it easier for the stomach to digest. The goat cheese had a nice smooth and creamy flavor and when combined with the olive tapenade really made a burst of flavor in my mouth. I also had the sandwich on Foccaica bread but they also have three other choices of bread to choose from. Overall the sandwich was delicious. I love the simple clean look of the store and it had some inside seating as well as gated outdoor seating. All the staff members seemed very nice and helpful. The one problem I had with Panino is the price. 
Although I love the sandwich, I do not believe it is worth $12. When I originally looked up the menu on Yelp I was looking at the pictures that were paired by other reviewers and I saw that they were about $10. $10 still expensive but a little more understandable and worth what you're getting. inference: false model-index: - name: SetFit Aspect Model with sentence-transformers/all-MiniLM-L6-v2 results: - task: type: text-classification name: Text Classification dataset: name: Unknown type: unknown split: test metrics: - type: accuracy value: 0.956989247311828 name: Accuracy --- # SetFit Aspect Model with sentence-transformers/all-MiniLM-L6-v2 This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Aspect Based Sentiment Analysis (ABSA). This SetFit model uses [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. In particular, this model is in charge of filtering aspect span candidates. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. This model was trained within the context of a larger system for ABSA, which looks like so: 1. Use a spaCy model to select possible aspect span candidates. 2. **Use this SetFit model to filter these possible aspect span candidates.** 3. Use a SetFit model to classify the filtered aspect span candidates. ## Model Details ### Model Description - **Model Type:** SetFit - **Sentence Transformer body:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance - **spaCy Model:** en_core_web_sm - **SetFitABSA Aspect Model:** [ginkgogo/setfit-absa-bge-small-en-v1.5-restaurants-aspect](https://huggingface.co/ginkgogo/setfit-absa-bge-small-en-v1.5-restaurants-aspect) - **SetFitABSA Polarity Model:** [ginkgogo/setfit-absa-bge-small-en-v1.5-restaurants-polarity](https://huggingface.co/ginkgogo/setfit-absa-bge-small-en-v1.5-restaurants-polarity) - **Maximum Sequence Length:** 256 tokens - **Number of Classes:** 2 classes <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit) - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055) - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit) ### Model Labels | Label | Examples | 
|:----------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | aspect | <ul><li>'food:They made it into more American food, added burgers and ribs and got rid of the tequila selection. We were so bummed. Used to be one of our favorite places to go for good Mexican food. The owner said the new direction was to appeal to more tourists.'</li><li>"seating:Such a cute little spot for desserts! I'm so glad we had time on our short visit to Santa Barbara to grab a slice of cake from here. My husband and I each got our own to slice to share of course. He said we didn't come all this way just to get one so we chose a slice of the berry cake and chocolate decadence. The berry cake was nice and fluffy without being too sweet. The acidity from the fruits balanced the sweetest of the cake wonderfully. If you're up for something rich then the chocolate decadence will not disappoint. Service was great and seating was comfortable. Order your sweet treats at the counter then a number will be given to you. Pick a table and get ready to enjoy because your sweets will be brought out to your table when ready."</li><li>'food:One brisk Saturday morning after asking workers during a stop for tylenol from the Hotel California Boutique the best breakfast place, they recommended Goat Tree. We crossed the busy street and greeted the hostess. The very kind young lady walked us to our table on the sunny patio. 
We skimmed the menu and decided on the chicken and waffle and a chocolate croissant. The wait was quite short and we spent it discussing the beautiful surrounding area. Soon, our food was delivered, and let me tell you, it was beautiful. On top of that, it was scrumptious. The fried chicken was perfect and tender. The waffle had the perfect balance of crunch and fluff. And how dare I forget the exquisite honey. Now this honey was the best I have ever tasted. It was topped with chia and pumpkin seeds. My daughter asked for her croissant warmed, and once again it was marvelous. After paying, I told our waitress how amazing the honey was. Next thing we knew, she brought out two large to go cups full of it! \n\nAbsolutely loved this place and everything about it. 100% recommend! I strongly award them 5 stars!'</li></ul> | | no aspect | <ul><li>'burgers:They made it into more American food, added burgers and ribs and got rid of the tequila selection. We were so bummed. Used to be one of our favorite places to go for good Mexican food. The owner said the new direction was to appeal to more tourists.'</li><li>'ribs:They made it into more American food, added burgers and ribs and got rid of the tequila selection. We were so bummed. Used to be one of our favorite places to go for good Mexican food. The owner said the new direction was to appeal to more tourists.'</li><li>'tequila selection:They made it into more American food, added burgers and ribs and got rid of the tequila selection. We were so bummed. Used to be one of our favorite places to go for good Mexican food. The owner said the new direction was to appeal to more tourists.'</li></ul> | ## Evaluation ### Metrics | Label | Accuracy | |:--------|:---------| | **all** | 0.9570 | ## Uses ### Direct Use for Inference First install the SetFit library: ```bash pip install setfit ``` Then you can load this model and run inference. ```python from setfit import AbsaModel # Download from the 🤗 Hub model = AbsaModel.from_pretrained( "ginkgogo/setfit-absa-bge-small-en-v1.5-restaurants-aspect", "ginkgogo/setfit-absa-bge-small-en-v1.5-restaurants-polarity", ) # Run inference preds = model("The food was great, but the venue is just way too busy.") ``` <!-- ### Downstream Use *List how someone could finetune this model on their own dataset.* --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
<!-- ### Downstream Use

*List how someone could finetune this model on their own dataset.* -->

<!-- ### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.* -->

<!-- ## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* -->

<!-- ### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* -->

## Training Details

### Training Set Metrics

| Training set | Min | Median   | Max |
|:-------------|:----|:---------|:----|
| Word count   | 21  | 152.7030 | 268 |

| Label     | Training Sample Count |
|:----------|:----------------------|
| no aspect | 383                   |
| aspect    | 21                    |

### Training Hyperparameters
- batch_size: (50, 50)
- num_epochs: (5, 5)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: True
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True

These values map onto SetFit's `TrainingArguments`; see the training sketch after the Citation section below.

### Training Results

| Epoch      | Step    | Training Loss | Validation Loss |
|:----------:|:-------:|:-------------:|:---------------:|
| 0.0003     | 1       | 0.2856        | -               |
| 0.0169     | 50      | 0.2755        | 0.3092          |
| 0.0339     | 100     | 0.2895        | 0.2962          |
| 0.0508     | 150     | 0.2845        | 0.2876          |
| 0.0678     | 200     | 0.2471        | 0.2826          |
| 0.0847     | 250     | 0.2124        | 0.2691          |
| 0.1017     | 300     | 0.1357        | 0.184           |
| 0.1186     | 350     | 0.0362        | 0.0871          |
| **0.1355** | **400** | **0.07**      | **0.0848**      |
| 0.1525     | 450     | 0.0184        | 0.092           |
| 0.1694     | 500     | 0.0179        | 0.096           |
| 0.1864     | 550     | 0.0033        | 0.097           |
| 0.2033     | 600     | 0.0037        | 0.0978          |
| 0.2203     | 650     | 0.04          | 0.1046          |

* The bold row denotes the saved checkpoint.

### Framework Versions
- Python: 3.10.12
- SetFit: 1.0.3
- Sentence Transformers: 2.6.0
- spaCy: 3.7.4
- Transformers: 4.38.2
- PyTorch: 2.2.1+cu121
- Datasets: 2.18.0
- Tokenizers: 0.15.2

## Citation

### BibTeX

```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```

<!-- ## Glossary

*Clearly define terms in order to be accessible across audiences.* -->

<!-- ## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* -->

<!-- ## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
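The hyperparameters reported under Training Details correspond to SetFit's `TrainingArguments`. The sketch below shows roughly how such a run could be reproduced with the SetFit 1.0 API; the dataset name and its `text`/`span`/`label`/`ordinal` column layout are hypothetical placeholders, not this card's actual training script.

```python
from datasets import load_dataset
from setfit import AbsaModel, AbsaTrainer, TrainingArguments

# Hypothetical ABSA dataset with "text", "span", "label" and "ordinal" columns.
train_dataset = load_dataset("my-org/restaurant-absa", split="train")

# Start from the base embedding model this repository is named after.
model = AbsaModel.from_pretrained(
    "BAAI/bge-small-en-v1.5",
    spacy_model="en_core_web_sm",
)

# Mirror the hyperparameters reported above; tuples give the values for the
# (embedding, classifier) training phases respectively.
args = TrainingArguments(
    output_dir="models",
    batch_size=(50, 50),
    num_epochs=(5, 5),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    sampling_strategy="oversampling",
    use_amp=True,
    warmup_proportion=0.1,
    seed=42,
    load_best_model_at_end=True,
)

trainer = AbsaTrainer(model, args=args, train_dataset=train_dataset)
trainer.train()
```

Note that `end_to_end: False` above means the embedding body stays frozen while the classification head is trained, which is SetFit's default behavior.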
[ "TEXT_CLASSIFICATION" ]
[ "CHIA" ]
Non_BioNLP
vectoriseai/multilingual-e5-large
vectoriseai
feature-extraction
[ "sentence-transformers", "pytorch", "onnx", "safetensors", "xlm-roberta", "mteb", "Sentence Transformers", "sentence-similarity", "feature-extraction", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "arxiv:2212.03533", "arxiv:2108.08787", "arxiv:2104.08663", "arxiv:2210.07316", "license:mit", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,696
1,696
14
0
--- language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - 'no' - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh license: mit tags: - mteb - Sentence Transformers - sentence-similarity - feature-extraction - sentence-transformers model-index: - name: multilingual-e5-large results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 79.05970149253731 - type: ap value: 43.486574390835635 - type: f1 value: 73.32700092140148 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (de) type: mteb/amazon_counterfactual config: de split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 71.22055674518201 - type: ap value: 81.55756710830498 - type: f1 value: 69.28271787752661 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en-ext) type: mteb/amazon_counterfactual config: en-ext split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 80.41979010494754 - type: ap value: 29.34879922376344 - type: f1 value: 67.62475449011278 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (ja) type: mteb/amazon_counterfactual config: ja split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 77.8372591006424 - type: ap value: 26.557560591210738 - type: f1 value: 64.96619417368707 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 93.489875 - type: ap value: 90.98758636917603 - type: f1 value: 93.48554819717332 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.564 - type: f1 value: 46.75122173518047 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (de) type: mteb/amazon_reviews_multi config: de split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 45.400000000000006 - type: f1 value: 44.17195682400632 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (es) type: mteb/amazon_reviews_multi config: es split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 43.068 - type: f1 value: 42.38155696855596 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 41.89 - type: f1 value: 40.84407321682663 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (ja) type: mteb/amazon_reviews_multi config: ja split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 
40.120000000000005 - type: f1 value: 39.522976223819114 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 38.832 - type: f1 value: 38.0392533394713 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 30.725 - type: map_at_10 value: 46.055 - type: map_at_100 value: 46.900999999999996 - type: map_at_1000 value: 46.911 - type: map_at_3 value: 41.548 - type: map_at_5 value: 44.297 - type: mrr_at_1 value: 31.152 - type: mrr_at_10 value: 46.231 - type: mrr_at_100 value: 47.07 - type: mrr_at_1000 value: 47.08 - type: mrr_at_3 value: 41.738 - type: mrr_at_5 value: 44.468999999999994 - type: ndcg_at_1 value: 30.725 - type: ndcg_at_10 value: 54.379999999999995 - type: ndcg_at_100 value: 58.138 - type: ndcg_at_1000 value: 58.389 - type: ndcg_at_3 value: 45.156 - type: ndcg_at_5 value: 50.123 - type: precision_at_1 value: 30.725 - type: precision_at_10 value: 8.087 - type: precision_at_100 value: 0.9769999999999999 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 18.54 - type: precision_at_5 value: 13.542000000000002 - type: recall_at_1 value: 30.725 - type: recall_at_10 value: 80.868 - type: recall_at_100 value: 97.653 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 55.619 - type: recall_at_5 value: 67.71000000000001 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 44.30960650674069 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 38.427074197498996 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 60.28270056031872 - type: mrr value: 74.38332673789738 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 84.05942144105269 - type: cos_sim_spearman value: 82.51212105850809 - type: euclidean_pearson value: 81.95639829909122 - type: euclidean_spearman value: 82.3717564144213 - type: manhattan_pearson value: 81.79273425468256 - type: manhattan_spearman value: 82.20066817871039 - task: type: BitextMining dataset: name: MTEB BUCC (de-en) type: mteb/bucc-bitext-mining config: de-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 99.46764091858039 - type: f1 value: 99.37717466945023 - type: precision value: 99.33194154488518 - type: recall value: 99.46764091858039 - task: type: BitextMining dataset: name: MTEB BUCC (fr-en) type: mteb/bucc-bitext-mining config: fr-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 98.29407880255337 - type: f1 value: 98.11248073959938 - type: precision value: 98.02443319392472 - type: recall value: 98.29407880255337 - task: type: BitextMining dataset: name: MTEB BUCC (ru-en) type: mteb/bucc-bitext-mining config: ru-en split: test revision: 
d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 97.79009352268791 - type: f1 value: 97.5176076665512 - type: precision value: 97.38136473848286 - type: recall value: 97.79009352268791 - task: type: BitextMining dataset: name: MTEB BUCC (zh-en) type: mteb/bucc-bitext-mining config: zh-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 99.26276987888363 - type: f1 value: 99.20133403545726 - type: precision value: 99.17500438827453 - type: recall value: 99.26276987888363 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.72727272727273 - type: f1 value: 84.67672206031433 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 35.34220182511161 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 33.4987096128766 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 25.558249999999997 - type: map_at_10 value: 34.44425000000001 - type: map_at_100 value: 35.59833333333333 - type: map_at_1000 value: 35.706916666666665 - type: map_at_3 value: 31.691749999999995 - type: map_at_5 value: 33.252916666666664 - type: mrr_at_1 value: 30.252666666666666 - type: mrr_at_10 value: 38.60675 - type: mrr_at_100 value: 39.42666666666666 - type: mrr_at_1000 value: 39.48408333333334 - type: mrr_at_3 value: 36.17441666666665 - type: mrr_at_5 value: 37.56275 - type: ndcg_at_1 value: 30.252666666666666 - type: ndcg_at_10 value: 39.683 - type: ndcg_at_100 value: 44.68541666666667 - type: ndcg_at_1000 value: 46.94316666666668 - type: ndcg_at_3 value: 34.961749999999995 - type: ndcg_at_5 value: 37.215666666666664 - type: precision_at_1 value: 30.252666666666666 - type: precision_at_10 value: 6.904166666666667 - type: precision_at_100 value: 1.0989999999999995 - type: precision_at_1000 value: 0.14733333333333334 - type: precision_at_3 value: 16.037666666666667 - type: precision_at_5 value: 11.413583333333333 - type: recall_at_1 value: 25.558249999999997 - type: recall_at_10 value: 51.13341666666666 - type: recall_at_100 value: 73.08366666666667 - type: recall_at_1000 value: 88.79483333333334 - type: recall_at_3 value: 37.989083333333326 - type: recall_at_5 value: 43.787833333333325 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 10.338 - type: map_at_10 value: 18.360000000000003 - type: map_at_100 value: 19.942 - type: map_at_1000 value: 20.134 - type: map_at_3 value: 15.174000000000001 - type: map_at_5 value: 16.830000000000002 - type: mrr_at_1 value: 23.257 - type: mrr_at_10 value: 33.768 - type: mrr_at_100 value: 34.707 - type: mrr_at_1000 value: 34.766000000000005 - type: mrr_at_3 value: 30.977 - type: mrr_at_5 value: 32.528 - type: ndcg_at_1 value: 23.257 - type: ndcg_at_10 value: 25.733 - type: ndcg_at_100 value: 32.288 - type: ndcg_at_1000 value: 35.992000000000004 - type: ndcg_at_3 value: 20.866 - type: ndcg_at_5 value: 22.612 - type: precision_at_1 value: 23.257 
- type: precision_at_10 value: 8.124 - type: precision_at_100 value: 1.518 - type: precision_at_1000 value: 0.219 - type: precision_at_3 value: 15.679000000000002 - type: precision_at_5 value: 12.117 - type: recall_at_1 value: 10.338 - type: recall_at_10 value: 31.154 - type: recall_at_100 value: 54.161 - type: recall_at_1000 value: 75.21900000000001 - type: recall_at_3 value: 19.427 - type: recall_at_5 value: 24.214 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.498 - type: map_at_10 value: 19.103 - type: map_at_100 value: 27.375 - type: map_at_1000 value: 28.981 - type: map_at_3 value: 13.764999999999999 - type: map_at_5 value: 15.950000000000001 - type: mrr_at_1 value: 65.5 - type: mrr_at_10 value: 74.53800000000001 - type: mrr_at_100 value: 74.71799999999999 - type: mrr_at_1000 value: 74.725 - type: mrr_at_3 value: 72.792 - type: mrr_at_5 value: 73.554 - type: ndcg_at_1 value: 53.37499999999999 - type: ndcg_at_10 value: 41.286 - type: ndcg_at_100 value: 45.972 - type: ndcg_at_1000 value: 53.123 - type: ndcg_at_3 value: 46.172999999999995 - type: ndcg_at_5 value: 43.033 - type: precision_at_1 value: 65.5 - type: precision_at_10 value: 32.725 - type: precision_at_100 value: 10.683 - type: precision_at_1000 value: 1.978 - type: precision_at_3 value: 50 - type: precision_at_5 value: 41.349999999999994 - type: recall_at_1 value: 8.498 - type: recall_at_10 value: 25.070999999999998 - type: recall_at_100 value: 52.383 - type: recall_at_1000 value: 74.91499999999999 - type: recall_at_3 value: 15.207999999999998 - type: recall_at_5 value: 18.563 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 46.5 - type: f1 value: 41.93833713984145 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 67.914 - type: map_at_10 value: 78.10000000000001 - type: map_at_100 value: 78.333 - type: map_at_1000 value: 78.346 - type: map_at_3 value: 76.626 - type: map_at_5 value: 77.627 - type: mrr_at_1 value: 72.74199999999999 - type: mrr_at_10 value: 82.414 - type: mrr_at_100 value: 82.511 - type: mrr_at_1000 value: 82.513 - type: mrr_at_3 value: 81.231 - type: mrr_at_5 value: 82.065 - type: ndcg_at_1 value: 72.74199999999999 - type: ndcg_at_10 value: 82.806 - type: ndcg_at_100 value: 83.677 - type: ndcg_at_1000 value: 83.917 - type: ndcg_at_3 value: 80.305 - type: ndcg_at_5 value: 81.843 - type: precision_at_1 value: 72.74199999999999 - type: precision_at_10 value: 10.24 - type: precision_at_100 value: 1.089 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 31.268 - type: precision_at_5 value: 19.706000000000003 - type: recall_at_1 value: 67.914 - type: recall_at_10 value: 92.889 - type: recall_at_100 value: 96.42699999999999 - type: recall_at_1000 value: 97.92 - type: recall_at_3 value: 86.21 - type: recall_at_5 value: 90.036 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 22.166 - type: map_at_10 value: 35.57 - type: map_at_100 value: 37.405 - type: map_at_1000 value: 37.564 - type: map_at_3 value: 30.379 - type: map_at_5 value: 33.324 - type: mrr_at_1 value: 43.519000000000005 - type: mrr_at_10 value: 51.556000000000004 - type: mrr_at_100 value: 52.344 - 
type: mrr_at_1000 value: 52.373999999999995 - type: mrr_at_3 value: 48.868 - type: mrr_at_5 value: 50.319 - type: ndcg_at_1 value: 43.519000000000005 - type: ndcg_at_10 value: 43.803 - type: ndcg_at_100 value: 50.468999999999994 - type: ndcg_at_1000 value: 53.111 - type: ndcg_at_3 value: 38.893 - type: ndcg_at_5 value: 40.653 - type: precision_at_1 value: 43.519000000000005 - type: precision_at_10 value: 12.253 - type: precision_at_100 value: 1.931 - type: precision_at_1000 value: 0.242 - type: precision_at_3 value: 25.617 - type: precision_at_5 value: 19.383 - type: recall_at_1 value: 22.166 - type: recall_at_10 value: 51.6 - type: recall_at_100 value: 76.574 - type: recall_at_1000 value: 92.192 - type: recall_at_3 value: 34.477999999999994 - type: recall_at_5 value: 41.835 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 39.041 - type: map_at_10 value: 62.961999999999996 - type: map_at_100 value: 63.79899999999999 - type: map_at_1000 value: 63.854 - type: map_at_3 value: 59.399 - type: map_at_5 value: 61.669 - type: mrr_at_1 value: 78.082 - type: mrr_at_10 value: 84.321 - type: mrr_at_100 value: 84.49600000000001 - type: mrr_at_1000 value: 84.502 - type: mrr_at_3 value: 83.421 - type: mrr_at_5 value: 83.977 - type: ndcg_at_1 value: 78.082 - type: ndcg_at_10 value: 71.229 - type: ndcg_at_100 value: 74.10900000000001 - type: ndcg_at_1000 value: 75.169 - type: ndcg_at_3 value: 66.28699999999999 - type: ndcg_at_5 value: 69.084 - type: precision_at_1 value: 78.082 - type: precision_at_10 value: 14.993 - type: precision_at_100 value: 1.7239999999999998 - type: precision_at_1000 value: 0.186 - type: precision_at_3 value: 42.737 - type: precision_at_5 value: 27.843 - type: recall_at_1 value: 39.041 - type: recall_at_10 value: 74.96300000000001 - type: recall_at_100 value: 86.199 - type: recall_at_1000 value: 93.228 - type: recall_at_3 value: 64.105 - type: recall_at_5 value: 69.608 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 90.23160000000001 - type: ap value: 85.5674856808308 - type: f1 value: 90.18033354786317 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 24.091 - type: map_at_10 value: 36.753 - type: map_at_100 value: 37.913000000000004 - type: map_at_1000 value: 37.958999999999996 - type: map_at_3 value: 32.818999999999996 - type: map_at_5 value: 35.171 - type: mrr_at_1 value: 24.742 - type: mrr_at_10 value: 37.285000000000004 - type: mrr_at_100 value: 38.391999999999996 - type: mrr_at_1000 value: 38.431 - type: mrr_at_3 value: 33.440999999999995 - type: mrr_at_5 value: 35.75 - type: ndcg_at_1 value: 24.742 - type: ndcg_at_10 value: 43.698 - type: ndcg_at_100 value: 49.145 - type: ndcg_at_1000 value: 50.23800000000001 - type: ndcg_at_3 value: 35.769 - type: ndcg_at_5 value: 39.961999999999996 - type: precision_at_1 value: 24.742 - type: precision_at_10 value: 6.7989999999999995 - type: precision_at_100 value: 0.95 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 15.096000000000002 - type: precision_at_5 value: 11.183 - type: recall_at_1 value: 24.091 - type: recall_at_10 value: 65.068 - type: recall_at_100 value: 89.899 - type: recall_at_1000 value: 98.16 - type: recall_at_3 value: 43.68 - type: recall_at_5 value: 53.754999999999995 - 
task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.66621067031465 - type: f1 value: 93.49622853272142 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (de) type: mteb/mtop_domain config: de split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 91.94702733164272 - type: f1 value: 91.17043441745282 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (es) type: mteb/mtop_domain config: es split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 92.20146764509674 - type: f1 value: 91.98359080555608 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 88.99780770435328 - type: f1 value: 89.19746342724068 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (hi) type: mteb/mtop_domain config: hi split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 89.78486912871998 - type: f1 value: 89.24578823628642 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (th) type: mteb/mtop_domain config: th split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 88.74502712477394 - type: f1 value: 89.00297573881542 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 77.9046967624259 - type: f1 value: 59.36787125785957 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (de) type: mteb/mtop_intent config: de split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 74.5280360664976 - type: f1 value: 57.17723440888718 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (es) type: mteb/mtop_intent config: es split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 75.44029352901934 - type: f1 value: 54.052855531072964 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 70.5606013153774 - type: f1 value: 52.62215934386531 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (hi) type: mteb/mtop_intent config: hi split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 73.11581211903908 - type: f1 value: 52.341291845645465 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (th) type: mteb/mtop_intent config: th split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 74.28933092224233 - type: f1 value: 57.07918745504911 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (af) type: mteb/amazon_massive_intent config: af split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.38063214525892 - type: f1 value: 59.46463723443009 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (am) type: mteb/amazon_massive_intent config: am 
split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 56.06926698049766 - type: f1 value: 52.49084283283562 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ar) type: mteb/amazon_massive_intent config: ar split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 60.74983187626093 - type: f1 value: 56.960640620165904 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (az) type: mteb/amazon_massive_intent config: az split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 64.86550100874243 - type: f1 value: 62.47370548140688 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (bn) type: mteb/amazon_massive_intent config: bn split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 63.971082716879636 - type: f1 value: 61.03812421957381 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (cy) type: mteb/amazon_massive_intent config: cy split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 54.98318762609282 - type: f1 value: 51.51207916008392 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (da) type: mteb/amazon_massive_intent config: da split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.45527908540686 - type: f1 value: 66.16631905400318 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (de) type: mteb/amazon_massive_intent config: de split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.32750504371216 - type: f1 value: 66.16755288646591 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (el) type: mteb/amazon_massive_intent config: el split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.09213180901143 - type: f1 value: 66.95654394661507 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 73.75588433086752 - type: f1 value: 71.79973779656923 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (es) type: mteb/amazon_massive_intent config: es split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.49428379287154 - type: f1 value: 68.37494379215734 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fa) type: mteb/amazon_massive_intent config: fa split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.90921318090115 - type: f1 value: 66.79517376481645 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fi) type: mteb/amazon_massive_intent config: fi split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.12104909213181 - type: f1 value: 67.29448842879584 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.34095494283793 - type: f1 value: 67.01134288992947 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (he) type: 
mteb/amazon_massive_intent config: he split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.61264290517822 - type: f1 value: 64.68730512660757 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (hi) type: mteb/amazon_massive_intent config: hi split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.79757901815738 - type: f1 value: 65.24938539425598 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (hu) type: mteb/amazon_massive_intent config: hu split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.68728984532616 - type: f1 value: 67.0487169762553 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (hy) type: mteb/amazon_massive_intent config: hy split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.07464694014795 - type: f1 value: 59.183532276789286 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (id) type: mteb/amazon_massive_intent config: id split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.04707464694015 - type: f1 value: 67.66829629003848 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (is) type: mteb/amazon_massive_intent config: is split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.42434431741762 - type: f1 value: 59.01617226544757 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (it) type: mteb/amazon_massive_intent config: it split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.53127101546738 - type: f1 value: 68.10033760906255 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ja) type: mteb/amazon_massive_intent config: ja split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 72.50504371217215 - type: f1 value: 69.74931103158923 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (jv) type: mteb/amazon_massive_intent config: jv split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 57.91190316072628 - type: f1 value: 54.05551136648796 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ka) type: mteb/amazon_massive_intent config: ka split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 51.78211163416275 - type: f1 value: 49.874888544058535 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (km) type: mteb/amazon_massive_intent config: km split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 47.017484868863484 - type: f1 value: 44.53364263352014 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (kn) type: mteb/amazon_massive_intent config: kn split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.16207128446537 - type: f1 value: 59.01185692320829 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ko) type: mteb/amazon_massive_intent config: ko split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.42501681237391 - type: f1 value: 67.13169450166086 - task: type: Classification dataset: name: MTEB 
MassiveIntentClassification (lv) type: mteb/amazon_massive_intent config: lv split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.0780094149294 - type: f1 value: 64.41720167850707 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ml) type: mteb/amazon_massive_intent config: ml split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 65.57162071284466 - type: f1 value: 62.414138683804424 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (mn) type: mteb/amazon_massive_intent config: mn split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 61.71149966375252 - type: f1 value: 58.594805125087234 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ms) type: mteb/amazon_massive_intent config: ms split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 66.03900470746471 - type: f1 value: 63.87937257883887 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (my) type: mteb/amazon_massive_intent config: my split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 60.8776059179556 - type: f1 value: 57.48587618059131 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (nb) type: mteb/amazon_massive_intent config: nb split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.87895090786819 - type: f1 value: 66.8141299430347 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (nl) type: mteb/amazon_massive_intent config: nl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.45057162071285 - type: f1 value: 67.46444039673516 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 71.546738399462 - type: f1 value: 68.63640876702655 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pt) type: mteb/amazon_massive_intent config: pt split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.72965702757229 - type: f1 value: 68.54119560379115 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ro) type: mteb/amazon_massive_intent config: ro split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 68.35574983187625 - type: f1 value: 65.88844917691927 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ru) type: mteb/amazon_massive_intent config: ru split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 71.70477471418964 - type: f1 value: 69.19665697061978 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (sl) type: mteb/amazon_massive_intent config: sl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.0880968392737 - type: f1 value: 64.76962317666086 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (sq) type: mteb/amazon_massive_intent config: sq split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 65.18493611297916 - type: f1 value: 62.49984559035371 - task: type: 
Classification dataset: name: MTEB MassiveIntentClassification (sv) type: mteb/amazon_massive_intent config: sv split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 71.75857431069265 - type: f1 value: 69.20053687623418 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (sw) type: mteb/amazon_massive_intent config: sw split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 58.500336247478145 - type: f1 value: 55.2972398687929 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ta) type: mteb/amazon_massive_intent config: ta split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.68997982515132 - type: f1 value: 59.36848202755348 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (te) type: mteb/amazon_massive_intent config: te split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 63.01950235373235 - type: f1 value: 60.09351954625423 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (th) type: mteb/amazon_massive_intent config: th split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 68.29186281102892 - type: f1 value: 67.57860496703447 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (tl) type: mteb/amazon_massive_intent config: tl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 64.77471418964357 - type: f1 value: 61.913983147713836 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (tr) type: mteb/amazon_massive_intent config: tr split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.87222595830532 - type: f1 value: 66.03679033708141 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ur) type: mteb/amazon_massive_intent config: ur split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 64.04505716207127 - type: f1 value: 61.28569169817908 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (vi) type: mteb/amazon_massive_intent config: vi split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.38466711499663 - type: f1 value: 67.20532357036844 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 71.12306657700067 - type: f1 value: 68.91251226588182 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-TW) type: mteb/amazon_massive_intent config: zh-TW split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 66.20040349697378 - type: f1 value: 66.02657347714175 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (af) type: mteb/amazon_massive_scenario config: af split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 68.73907195696032 - type: f1 value: 66.98484521791418 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (am) type: mteb/amazon_massive_scenario config: am split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 
60.58843308675185 - type: f1 value: 58.95591723092005 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ar) type: mteb/amazon_massive_scenario config: ar split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 66.22730329522528 - type: f1 value: 66.0894499712115 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (az) type: mteb/amazon_massive_scenario config: az split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 66.48285137861465 - type: f1 value: 65.21963176785157 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (bn) type: mteb/amazon_massive_scenario config: bn split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 67.74714189643578 - type: f1 value: 66.8212192745412 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (cy) type: mteb/amazon_massive_scenario config: cy split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 59.09213180901143 - type: f1 value: 56.70735546356339 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (da) type: mteb/amazon_massive_scenario config: da split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 75.05716207128448 - type: f1 value: 74.8413712365364 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (de) type: mteb/amazon_massive_scenario config: de split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.69737726967047 - type: f1 value: 74.7664341963 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (el) type: mteb/amazon_massive_scenario config: el split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.90383322125084 - type: f1 value: 73.59201554448323 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 77.51176866173503 - type: f1 value: 77.46104434577758 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (es) type: mteb/amazon_massive_scenario config: es split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.31069266980496 - type: f1 value: 74.61048660675635 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fa) type: mteb/amazon_massive_scenario config: fa split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 72.95225285810356 - type: f1 value: 72.33160006574627 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fi) type: mteb/amazon_massive_scenario config: fi split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.12373907195696 - type: f1 value: 73.20921012557481 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fr) type: mteb/amazon_massive_scenario config: fr split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.86684599865501 - type: f1 value: 73.82348774610831 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (he) type: mteb/amazon_massive_scenario config: he split: test 
revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 71.40215198386012 - type: f1 value: 71.11945183971858 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (hi) type: mteb/amazon_massive_scenario config: hi split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 72.12844653665098 - type: f1 value: 71.34450495911766 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (hu) type: mteb/amazon_massive_scenario config: hu split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.52252858103566 - type: f1 value: 73.98878711342999 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (hy) type: mteb/amazon_massive_scenario config: hy split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 64.93611297915265 - type: f1 value: 63.723200467653385 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (id) type: mteb/amazon_massive_scenario config: id split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.11903160726295 - type: f1 value: 73.82138439467096 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (is) type: mteb/amazon_massive_scenario config: is split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 67.15198386012105 - type: f1 value: 66.02172193802167 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (it) type: mteb/amazon_massive_scenario config: it split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.32414256893072 - type: f1 value: 74.30943421170574 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ja) type: mteb/amazon_massive_scenario config: ja split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 77.46805648957633 - type: f1 value: 77.62808409298209 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (jv) type: mteb/amazon_massive_scenario config: jv split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 63.318762609280434 - type: f1 value: 62.094284066075076 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ka) type: mteb/amazon_massive_scenario config: ka split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 58.34902488231338 - type: f1 value: 57.12893860987984 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (km) type: mteb/amazon_massive_scenario config: km split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 50.88433086751849 - type: f1 value: 48.2272350802058 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (kn) type: mteb/amazon_massive_scenario config: kn split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 66.4425016812374 - type: f1 value: 64.61463095996173 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ko) type: mteb/amazon_massive_scenario config: ko split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 75.04707464694015 - type: f1 value: 75.05099199098998 - task: type: Classification dataset: name: MTEB 
MassiveScenarioClassification (lv) type: mteb/amazon_massive_scenario config: lv split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 70.50437121721586 - type: f1 value: 69.83397721096314 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ml) type: mteb/amazon_massive_scenario config: ml split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 69.94283792871553 - type: f1 value: 68.8704663703913 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (mn) type: mteb/amazon_massive_scenario config: mn split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 64.79488903833222 - type: f1 value: 63.615424063345436 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ms) type: mteb/amazon_massive_scenario config: ms split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 69.88231338264963 - type: f1 value: 68.57892302593237 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (my) type: mteb/amazon_massive_scenario config: my split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 63.248150638870214 - type: f1 value: 61.06680605338809 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (nb) type: mteb/amazon_massive_scenario config: nb split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.84196368527236 - type: f1 value: 74.52566464968763 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (nl) type: mteb/amazon_massive_scenario config: nl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.8285137861466 - type: f1 value: 74.8853197608802 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.13248150638869 - type: f1 value: 74.3982040999179 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pt) type: mteb/amazon_massive_scenario config: pt split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.49024882313383 - type: f1 value: 73.82153848368573 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ro) type: mteb/amazon_massive_scenario config: ro split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 71.72158708809684 - type: f1 value: 71.85049433180541 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ru) type: mteb/amazon_massive_scenario config: ru split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 75.137861466039 - type: f1 value: 75.37628348188467 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (sl) type: mteb/amazon_massive_scenario config: sl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 71.86953597848016 - type: f1 value: 71.87537624521661 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (sq) type: mteb/amazon_massive_scenario config: sq split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 70.27572293207801 - 
type: f1 value: 68.80017302344231 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (sv) type: mteb/amazon_massive_scenario config: sv split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 76.09952925353059 - type: f1 value: 76.07992707688408 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (sw) type: mteb/amazon_massive_scenario config: sw split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 63.140551445864155 - type: f1 value: 61.73855010331415 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ta) type: mteb/amazon_massive_scenario config: ta split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 66.27774041694687 - type: f1 value: 64.83664868894539 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (te) type: mteb/amazon_massive_scenario config: te split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 66.69468728984533 - type: f1 value: 64.76239666920868 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (th) type: mteb/amazon_massive_scenario config: th split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.44653665097512 - type: f1 value: 73.14646052013873 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (tl) type: mteb/amazon_massive_scenario config: tl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 67.71351714862139 - type: f1 value: 66.67212180163382 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (tr) type: mteb/amazon_massive_scenario config: tr split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.9946200403497 - type: f1 value: 73.87348793725525 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ur) type: mteb/amazon_massive_scenario config: ur split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 68.15400134498992 - type: f1 value: 67.09433241421094 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (vi) type: mteb/amazon_massive_scenario config: vi split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.11365164761264 - type: f1 value: 73.59502539433753 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 76.82582380632145 - type: f1 value: 76.89992945316313 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-TW) type: mteb/amazon_massive_scenario config: zh-TW split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 71.81237390719569 - type: f1 value: 72.36499770986265 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 31.480506569594695 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: 
v_measure value: 29.71252128004552 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.421396787056548 - type: mrr value: 32.48155274872267 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.595 - type: map_at_10 value: 12.642000000000001 - type: map_at_100 value: 15.726 - type: map_at_1000 value: 17.061999999999998 - type: map_at_3 value: 9.125 - type: map_at_5 value: 10.866000000000001 - type: mrr_at_1 value: 43.344 - type: mrr_at_10 value: 52.227999999999994 - type: mrr_at_100 value: 52.898999999999994 - type: mrr_at_1000 value: 52.944 - type: mrr_at_3 value: 49.845 - type: mrr_at_5 value: 51.115 - type: ndcg_at_1 value: 41.949999999999996 - type: ndcg_at_10 value: 33.995 - type: ndcg_at_100 value: 30.869999999999997 - type: ndcg_at_1000 value: 39.487 - type: ndcg_at_3 value: 38.903999999999996 - type: ndcg_at_5 value: 37.236999999999995 - type: precision_at_1 value: 43.344 - type: precision_at_10 value: 25.480000000000004 - type: precision_at_100 value: 7.672 - type: precision_at_1000 value: 2.028 - type: precision_at_3 value: 36.636 - type: precision_at_5 value: 32.632 - type: recall_at_1 value: 5.595 - type: recall_at_10 value: 16.466 - type: recall_at_100 value: 31.226 - type: recall_at_1000 value: 62.778999999999996 - type: recall_at_3 value: 9.931 - type: recall_at_5 value: 12.884 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 40.414 - type: map_at_10 value: 56.754000000000005 - type: map_at_100 value: 57.457 - type: map_at_1000 value: 57.477999999999994 - type: map_at_3 value: 52.873999999999995 - type: map_at_5 value: 55.175 - type: mrr_at_1 value: 45.278 - type: mrr_at_10 value: 59.192 - type: mrr_at_100 value: 59.650000000000006 - type: mrr_at_1000 value: 59.665 - type: mrr_at_3 value: 56.141 - type: mrr_at_5 value: 57.998000000000005 - type: ndcg_at_1 value: 45.278 - type: ndcg_at_10 value: 64.056 - type: ndcg_at_100 value: 66.89 - type: ndcg_at_1000 value: 67.364 - type: ndcg_at_3 value: 56.97 - type: ndcg_at_5 value: 60.719 - type: precision_at_1 value: 45.278 - type: precision_at_10 value: 9.994 - type: precision_at_100 value: 1.165 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 25.512 - type: precision_at_5 value: 17.509 - type: recall_at_1 value: 40.414 - type: recall_at_10 value: 83.596 - type: recall_at_100 value: 95.72 - type: recall_at_1000 value: 99.24 - type: recall_at_3 value: 65.472 - type: recall_at_5 value: 74.039 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.352 - type: map_at_10 value: 84.369 - type: map_at_100 value: 85.02499999999999 - type: map_at_1000 value: 85.04 - type: map_at_3 value: 81.42399999999999 - type: map_at_5 value: 83.279 - type: mrr_at_1 value: 81.05 - type: mrr_at_10 value: 87.401 - type: mrr_at_100 value: 87.504 - type: mrr_at_1000 value: 87.505 - type: mrr_at_3 value: 86.443 - type: mrr_at_5 value: 87.10799999999999 - type: ndcg_at_1 value: 81.04 - type: ndcg_at_10 value: 88.181 - type: ndcg_at_100 value: 89.411 - type: ndcg_at_1000 value: 89.507 - type: ndcg_at_3 value: 85.28099999999999 - type: ndcg_at_5 value: 86.888 - type: precision_at_1 value: 81.04 - type: precision_at_10 value: 13.406 - 
type: precision_at_100 value: 1.5350000000000001 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.31 - type: precision_at_5 value: 24.54 - type: recall_at_1 value: 70.352 - type: recall_at_10 value: 95.358 - type: recall_at_100 value: 99.541 - type: recall_at_1000 value: 99.984 - type: recall_at_3 value: 87.111 - type: recall_at_5 value: 91.643 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 46.54068723291946 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 63.216287629895994 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.023000000000001 - type: map_at_10 value: 10.071 - type: map_at_100 value: 11.892 - type: map_at_1000 value: 12.196 - type: map_at_3 value: 7.234 - type: map_at_5 value: 8.613999999999999 - type: mrr_at_1 value: 19.900000000000002 - type: mrr_at_10 value: 30.516 - type: mrr_at_100 value: 31.656000000000002 - type: mrr_at_1000 value: 31.723000000000003 - type: mrr_at_3 value: 27.400000000000002 - type: mrr_at_5 value: 29.270000000000003 - type: ndcg_at_1 value: 19.900000000000002 - type: ndcg_at_10 value: 17.474 - type: ndcg_at_100 value: 25.020999999999997 - type: ndcg_at_1000 value: 30.728 - type: ndcg_at_3 value: 16.588 - type: ndcg_at_5 value: 14.498 - type: precision_at_1 value: 19.900000000000002 - type: precision_at_10 value: 9.139999999999999 - type: precision_at_100 value: 2.011 - type: precision_at_1000 value: 0.33899999999999997 - type: precision_at_3 value: 15.667 - type: precision_at_5 value: 12.839999999999998 - type: recall_at_1 value: 4.023000000000001 - type: recall_at_10 value: 18.497 - type: recall_at_100 value: 40.8 - type: recall_at_1000 value: 68.812 - type: recall_at_3 value: 9.508 - type: recall_at_5 value: 12.983 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 83.967008785134 - type: cos_sim_spearman value: 80.23142141101837 - type: euclidean_pearson value: 81.20166064704539 - type: euclidean_spearman value: 80.18961335654585 - type: manhattan_pearson value: 81.13925443187625 - type: manhattan_spearman value: 80.07948723044424 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 86.94262461316023 - type: cos_sim_spearman value: 80.01596278563865 - type: euclidean_pearson value: 83.80799622922581 - type: euclidean_spearman value: 79.94984954947103 - type: manhattan_pearson value: 83.68473841756281 - type: manhattan_spearman value: 79.84990707951822 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 80.57346443146068 - type: cos_sim_spearman value: 81.54689837570866 - type: euclidean_pearson value: 81.10909881516007 - type: euclidean_spearman value: 81.56746243261762 - type: manhattan_pearson value: 80.87076036186582 - type: manhattan_spearman value: 81.33074987964402 - task: type: STS dataset: name: MTEB STS14 type: 
mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 79.54733787179849 - type: cos_sim_spearman value: 77.72202105610411 - type: euclidean_pearson value: 78.9043595478849 - type: euclidean_spearman value: 77.93422804309435 - type: manhattan_pearson value: 78.58115121621368 - type: manhattan_spearman value: 77.62508135122033 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 88.59880017237558 - type: cos_sim_spearman value: 89.31088630824758 - type: euclidean_pearson value: 88.47069261564656 - type: euclidean_spearman value: 89.33581971465233 - type: manhattan_pearson value: 88.40774264100956 - type: manhattan_spearman value: 89.28657485627835 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 84.08055117917084 - type: cos_sim_spearman value: 85.78491813080304 - type: euclidean_pearson value: 84.99329155500392 - type: euclidean_spearman value: 85.76728064677287 - type: manhattan_pearson value: 84.87947428989587 - type: manhattan_spearman value: 85.62429454917464 - task: type: STS dataset: name: MTEB STS17 (ko-ko) type: mteb/sts17-crosslingual-sts config: ko-ko split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 82.14190939287384 - type: cos_sim_spearman value: 82.27331573306041 - type: euclidean_pearson value: 81.891896953716 - type: euclidean_spearman value: 82.37695542955998 - type: manhattan_pearson value: 81.73123869460504 - type: manhattan_spearman value: 82.19989168441421 - task: type: STS dataset: name: MTEB STS17 (ar-ar) type: mteb/sts17-crosslingual-sts config: ar-ar split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 76.84695301843362 - type: cos_sim_spearman value: 77.87790986014461 - type: euclidean_pearson value: 76.91981583106315 - type: euclidean_spearman value: 77.88154772749589 - type: manhattan_pearson value: 76.94953277451093 - type: manhattan_spearman value: 77.80499230728604 - task: type: STS dataset: name: MTEB STS17 (en-ar) type: mteb/sts17-crosslingual-sts config: en-ar split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 75.44657840482016 - type: cos_sim_spearman value: 75.05531095119674 - type: euclidean_pearson value: 75.88161755829299 - type: euclidean_spearman value: 74.73176238219332 - type: manhattan_pearson value: 75.63984765635362 - type: manhattan_spearman value: 74.86476440770737 - task: type: STS dataset: name: MTEB STS17 (en-de) type: mteb/sts17-crosslingual-sts config: en-de split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 85.64700140524133 - type: cos_sim_spearman value: 86.16014210425672 - type: euclidean_pearson value: 86.49086860843221 - type: euclidean_spearman value: 86.09729326815614 - type: manhattan_pearson value: 86.43406265125513 - type: manhattan_spearman value: 86.17740150939994 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.91170098764921 - type: cos_sim_spearman value: 88.12437004058931 - type: euclidean_pearson value: 88.81828254494437 - 
type: euclidean_spearman value: 88.14831794572122 - type: manhattan_pearson value: 88.93442183448961 - type: manhattan_spearman value: 88.15254630778304 - task: type: STS dataset: name: MTEB STS17 (en-tr) type: mteb/sts17-crosslingual-sts config: en-tr split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 72.91390577997292 - type: cos_sim_spearman value: 71.22979457536074 - type: euclidean_pearson value: 74.40314008106749 - type: euclidean_spearman value: 72.54972136083246 - type: manhattan_pearson value: 73.85687539530218 - type: manhattan_spearman value: 72.09500771742637 - task: type: STS dataset: name: MTEB STS17 (es-en) type: mteb/sts17-crosslingual-sts config: es-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 80.9301067983089 - type: cos_sim_spearman value: 80.74989828346473 - type: euclidean_pearson value: 81.36781301814257 - type: euclidean_spearman value: 80.9448819964426 - type: manhattan_pearson value: 81.0351322685609 - type: manhattan_spearman value: 80.70192121844177 - task: type: STS dataset: name: MTEB STS17 (es-es) type: mteb/sts17-crosslingual-sts config: es-es split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.13820465980005 - type: cos_sim_spearman value: 86.73532498758757 - type: euclidean_pearson value: 87.21329451846637 - type: euclidean_spearman value: 86.57863198601002 - type: manhattan_pearson value: 87.06973713818554 - type: manhattan_spearman value: 86.47534918791499 - task: type: STS dataset: name: MTEB STS17 (fr-en) type: mteb/sts17-crosslingual-sts config: fr-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 85.48720108904415 - type: cos_sim_spearman value: 85.62221757068387 - type: euclidean_pearson value: 86.1010129512749 - type: euclidean_spearman value: 85.86580966509942 - type: manhattan_pearson value: 86.26800938808971 - type: manhattan_spearman value: 85.88902721678429 - task: type: STS dataset: name: MTEB STS17 (it-en) type: mteb/sts17-crosslingual-sts config: it-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 83.98021347333516 - type: cos_sim_spearman value: 84.53806553803501 - type: euclidean_pearson value: 84.61483347248364 - type: euclidean_spearman value: 85.14191408011702 - type: manhattan_pearson value: 84.75297588825967 - type: manhattan_spearman value: 85.33176753669242 - task: type: STS dataset: name: MTEB STS17 (nl-en) type: mteb/sts17-crosslingual-sts config: nl-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 84.51856644893233 - type: cos_sim_spearman value: 85.27510748506413 - type: euclidean_pearson value: 85.09886861540977 - type: euclidean_spearman value: 85.62579245860887 - type: manhattan_pearson value: 84.93017860464607 - type: manhattan_spearman value: 85.5063988898453 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 62.581573200584195 - type: cos_sim_spearman value: 63.05503590247928 - type: euclidean_pearson value: 63.652564812602094 - type: euclidean_spearman value: 62.64811520876156 - type: manhattan_pearson value: 63.506842893061076 - type: manhattan_spearman value: 62.51289573046917 - task: type: STS dataset: name: MTEB STS22 (de) type: 
mteb/sts22-crosslingual-sts config: de split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 48.2248801729127 - type: cos_sim_spearman value: 56.5936604678561 - type: euclidean_pearson value: 43.98149464089 - type: euclidean_spearman value: 56.108561882423615 - type: manhattan_pearson value: 43.86880305903564 - type: manhattan_spearman value: 56.04671150510166 - task: type: STS dataset: name: MTEB STS22 (es) type: mteb/sts22-crosslingual-sts config: es split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 55.17564527009831 - type: cos_sim_spearman value: 64.57978560979488 - type: euclidean_pearson value: 58.8818330154583 - type: euclidean_spearman value: 64.99214839071281 - type: manhattan_pearson value: 58.72671436121381 - type: manhattan_spearman value: 65.10713416616109 - task: type: STS dataset: name: MTEB STS22 (pl) type: mteb/sts22-crosslingual-sts config: pl split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 26.772131864023297 - type: cos_sim_spearman value: 34.68200792408681 - type: euclidean_pearson value: 16.68082419005441 - type: euclidean_spearman value: 34.83099932652166 - type: manhattan_pearson value: 16.52605949659529 - type: manhattan_spearman value: 34.82075801399475 - task: type: STS dataset: name: MTEB STS22 (tr) type: mteb/sts22-crosslingual-sts config: tr split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 54.42415189043831 - type: cos_sim_spearman value: 63.54594264576758 - type: euclidean_pearson value: 57.36577498297745 - type: euclidean_spearman value: 63.111466379158074 - type: manhattan_pearson value: 57.584543715873885 - type: manhattan_spearman value: 63.22361054139183 - task: type: STS dataset: name: MTEB STS22 (ar) type: mteb/sts22-crosslingual-sts config: ar split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 47.55216762405518 - type: cos_sim_spearman value: 56.98670142896412 - type: euclidean_pearson value: 50.15318757562699 - type: euclidean_spearman value: 56.524941926541906 - type: manhattan_pearson value: 49.955618528674904 - type: manhattan_spearman value: 56.37102209240117 - task: type: STS dataset: name: MTEB STS22 (ru) type: mteb/sts22-crosslingual-sts config: ru split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 49.20540980338571 - type: cos_sim_spearman value: 59.9009453504406 - type: euclidean_pearson value: 49.557749853620535 - type: euclidean_spearman value: 59.76631621172456 - type: manhattan_pearson value: 49.62340591181147 - type: manhattan_spearman value: 59.94224880322436 - task: type: STS dataset: name: MTEB STS22 (zh) type: mteb/sts22-crosslingual-sts config: zh split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 51.508169956576985 - type: cos_sim_spearman value: 66.82461565306046 - type: euclidean_pearson value: 56.2274426480083 - type: euclidean_spearman value: 66.6775323848333 - type: manhattan_pearson value: 55.98277796300661 - type: manhattan_spearman value: 66.63669848497175 - task: type: STS dataset: name: MTEB STS22 (fr) type: mteb/sts22-crosslingual-sts config: fr split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 72.86478788045507 - type: cos_sim_spearman value: 76.7946552053193 - type: euclidean_pearson value: 
75.01598530490269 - type: euclidean_spearman value: 76.83618917858281 - type: manhattan_pearson value: 74.68337628304332 - type: manhattan_spearman value: 76.57480204017773 - task: type: STS dataset: name: MTEB STS22 (de-en) type: mteb/sts22-crosslingual-sts config: de-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 55.922619099401984 - type: cos_sim_spearman value: 56.599362477240774 - type: euclidean_pearson value: 56.68307052369783 - type: euclidean_spearman value: 54.28760436777401 - type: manhattan_pearson value: 56.67763566500681 - type: manhattan_spearman value: 53.94619541711359 - task: type: STS dataset: name: MTEB STS22 (es-en) type: mteb/sts22-crosslingual-sts config: es-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 66.74357206710913 - type: cos_sim_spearman value: 72.5208244925311 - type: euclidean_pearson value: 67.49254562186032 - type: euclidean_spearman value: 72.02469076238683 - type: manhattan_pearson value: 67.45251772238085 - type: manhattan_spearman value: 72.05538819984538 - task: type: STS dataset: name: MTEB STS22 (it) type: mteb/sts22-crosslingual-sts config: it split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 71.25734330033191 - type: cos_sim_spearman value: 76.98349083946823 - type: euclidean_pearson value: 73.71642838667736 - type: euclidean_spearman value: 77.01715504651384 - type: manhattan_pearson value: 73.61712711868105 - type: manhattan_spearman value: 77.01392571153896 - task: type: STS dataset: name: MTEB STS22 (pl-en) type: mteb/sts22-crosslingual-sts config: pl-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 63.18215462781212 - type: cos_sim_spearman value: 65.54373266117607 - type: euclidean_pearson value: 64.54126095439005 - type: euclidean_spearman value: 65.30410369102711 - type: manhattan_pearson value: 63.50332221148234 - type: manhattan_spearman value: 64.3455878104313 - task: type: STS dataset: name: MTEB STS22 (zh-en) type: mteb/sts22-crosslingual-sts config: zh-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 62.30509221440029 - type: cos_sim_spearman value: 65.99582704642478 - type: euclidean_pearson value: 63.43818859884195 - type: euclidean_spearman value: 66.83172582815764 - type: manhattan_pearson value: 63.055779168508764 - type: manhattan_spearman value: 65.49585020501449 - task: type: STS dataset: name: MTEB STS22 (es-it) type: mteb/sts22-crosslingual-sts config: es-it split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 59.587830825340404 - type: cos_sim_spearman value: 68.93467614588089 - type: euclidean_pearson value: 62.3073527367404 - type: euclidean_spearman value: 69.69758171553175 - type: manhattan_pearson value: 61.9074580815789 - type: manhattan_spearman value: 69.57696375597865 - task: type: STS dataset: name: MTEB STS22 (de-fr) type: mteb/sts22-crosslingual-sts config: de-fr split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 57.143220125577066 - type: cos_sim_spearman value: 67.78857859159226 - type: euclidean_pearson value: 55.58225107923733 - type: euclidean_spearman value: 67.80662907184563 - type: manhattan_pearson value: 56.24953502726514 - type: manhattan_spearman value: 67.98262125431616 - task: type: STS dataset: name: MTEB STS22 
(de-pl) type: mteb/sts22-crosslingual-sts config: de-pl split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 21.826928900322066 - type: cos_sim_spearman value: 49.578506634400405 - type: euclidean_pearson value: 27.939890138843214 - type: euclidean_spearman value: 52.71950519136242 - type: manhattan_pearson value: 26.39878683847546 - type: manhattan_spearman value: 47.54609580342499 - task: type: STS dataset: name: MTEB STS22 (fr-pl) type: mteb/sts22-crosslingual-sts config: fr-pl split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 57.27603854632001 - type: cos_sim_spearman value: 50.709255283710995 - type: euclidean_pearson value: 59.5419024445929 - type: euclidean_spearman value: 50.709255283710995 - type: manhattan_pearson value: 59.03256832438492 - type: manhattan_spearman value: 61.97797868009122 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 85.00757054859712 - type: cos_sim_spearman value: 87.29283629622222 - type: euclidean_pearson value: 86.54824171775536 - type: euclidean_spearman value: 87.24364730491402 - type: manhattan_pearson value: 86.5062156915074 - type: manhattan_spearman value: 87.15052170378574 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 82.03549357197389 - type: mrr value: 95.05437645143527 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 57.260999999999996 - type: map_at_10 value: 66.259 - type: map_at_100 value: 66.884 - type: map_at_1000 value: 66.912 - type: map_at_3 value: 63.685 - type: map_at_5 value: 65.35499999999999 - type: mrr_at_1 value: 60.333000000000006 - type: mrr_at_10 value: 67.5 - type: mrr_at_100 value: 68.013 - type: mrr_at_1000 value: 68.038 - type: mrr_at_3 value: 65.61099999999999 - type: mrr_at_5 value: 66.861 - type: ndcg_at_1 value: 60.333000000000006 - type: ndcg_at_10 value: 70.41 - type: ndcg_at_100 value: 73.10600000000001 - type: ndcg_at_1000 value: 73.846 - type: ndcg_at_3 value: 66.133 - type: ndcg_at_5 value: 68.499 - type: precision_at_1 value: 60.333000000000006 - type: precision_at_10 value: 9.232999999999999 - type: precision_at_100 value: 1.0630000000000002 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 25.667 - type: precision_at_5 value: 17.067 - type: recall_at_1 value: 57.260999999999996 - type: recall_at_10 value: 81.94399999999999 - type: recall_at_100 value: 93.867 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 70.339 - type: recall_at_5 value: 76.25 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.74356435643564 - type: cos_sim_ap value: 93.13411948212683 - type: cos_sim_f1 value: 86.80521991300147 - type: cos_sim_precision value: 84.00374181478017 - type: cos_sim_recall value: 89.8 - type: dot_accuracy value: 99.67920792079208 - type: dot_ap value: 89.27277565444479 - type: dot_f1 value: 83.9276990718124 - type: dot_precision value: 82.04393505253104 - type: dot_recall value: 85.9 - 
type: euclidean_accuracy value: 99.74257425742574 - type: euclidean_ap value: 93.17993008259062 - type: euclidean_f1 value: 86.69396110542476 - type: euclidean_precision value: 88.78406708595388 - type: euclidean_recall value: 84.7 - type: manhattan_accuracy value: 99.74257425742574 - type: manhattan_ap value: 93.14413755550099 - type: manhattan_f1 value: 86.82483594144371 - type: manhattan_precision value: 87.66564729867483 - type: manhattan_recall value: 86 - type: max_accuracy value: 99.74356435643564 - type: max_ap value: 93.17993008259062 - type: max_f1 value: 86.82483594144371 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 57.525863806168566 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 32.68850574423839 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 49.71580650644033 - type: mrr value: 50.50971903913081 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 29.152190498799484 - type: cos_sim_spearman value: 29.686180371952727 - type: dot_pearson value: 27.248664793816342 - type: dot_spearman value: 28.37748983721745 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.20400000000000001 - type: map_at_10 value: 1.6209999999999998 - type: map_at_100 value: 9.690999999999999 - type: map_at_1000 value: 23.733 - type: map_at_3 value: 0.575 - type: map_at_5 value: 0.885 - type: mrr_at_1 value: 78 - type: mrr_at_10 value: 86.56700000000001 - type: mrr_at_100 value: 86.56700000000001 - type: mrr_at_1000 value: 86.56700000000001 - type: mrr_at_3 value: 85.667 - type: mrr_at_5 value: 86.56700000000001 - type: ndcg_at_1 value: 76 - type: ndcg_at_10 value: 71.326 - type: ndcg_at_100 value: 54.208999999999996 - type: ndcg_at_1000 value: 49.252 - type: ndcg_at_3 value: 74.235 - type: ndcg_at_5 value: 73.833 - type: precision_at_1 value: 78 - type: precision_at_10 value: 74.8 - type: precision_at_100 value: 55.50000000000001 - type: precision_at_1000 value: 21.836 - type: precision_at_3 value: 78 - type: precision_at_5 value: 78 - type: recall_at_1 value: 0.20400000000000001 - type: recall_at_10 value: 1.894 - type: recall_at_100 value: 13.245999999999999 - type: recall_at_1000 value: 46.373 - type: recall_at_3 value: 0.613 - type: recall_at_5 value: 0.991 - task: type: BitextMining dataset: name: MTEB Tatoeba (sqi-eng) type: mteb/tatoeba-bitext-mining config: sqi-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.89999999999999 - type: f1 value: 94.69999999999999 - type: precision value: 94.11666666666667 - type: recall value: 95.89999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (fry-eng) type: mteb/tatoeba-bitext-mining config: fry-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 68.20809248554913 - type: f1 value: 
63.431048720066066 - type: precision value: 61.69143958161298 - type: recall value: 68.20809248554913 - task: type: BitextMining dataset: name: MTEB Tatoeba (kur-eng) type: mteb/tatoeba-bitext-mining config: kur-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 71.21951219512195 - type: f1 value: 66.82926829268293 - type: precision value: 65.1260162601626 - type: recall value: 71.21951219512195 - task: type: BitextMining dataset: name: MTEB Tatoeba (tur-eng) type: mteb/tatoeba-bitext-mining config: tur-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.2 - type: f1 value: 96.26666666666667 - type: precision value: 95.8 - type: recall value: 97.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (deu-eng) type: mteb/tatoeba-bitext-mining config: deu-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 99.3 - type: f1 value: 99.06666666666666 - type: precision value: 98.95 - type: recall value: 99.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (nld-eng) type: mteb/tatoeba-bitext-mining config: nld-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.39999999999999 - type: f1 value: 96.63333333333333 - type: precision value: 96.26666666666668 - type: recall value: 97.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (ron-eng) type: mteb/tatoeba-bitext-mining config: ron-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96 - type: f1 value: 94.86666666666666 - type: precision value: 94.31666666666668 - type: recall value: 96 - task: type: BitextMining dataset: name: MTEB Tatoeba (ang-eng) type: mteb/tatoeba-bitext-mining config: ang-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 47.01492537313433 - type: f1 value: 40.178867566927266 - type: precision value: 38.179295828549556 - type: recall value: 47.01492537313433 - task: type: BitextMining dataset: name: MTEB Tatoeba (ido-eng) type: mteb/tatoeba-bitext-mining config: ido-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.5 - type: f1 value: 83.62537480063796 - type: precision value: 82.44555555555554 - type: recall value: 86.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (jav-eng) type: mteb/tatoeba-bitext-mining config: jav-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 80.48780487804879 - type: f1 value: 75.45644599303138 - type: precision value: 73.37398373983739 - type: recall value: 80.48780487804879 - task: type: BitextMining dataset: name: MTEB Tatoeba (isl-eng) type: mteb/tatoeba-bitext-mining config: isl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.7 - type: f1 value: 91.95666666666666 - type: precision value: 91.125 - type: recall value: 93.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (slv-eng) type: mteb/tatoeba-bitext-mining config: slv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.73754556500607 - type: f1 value: 89.65168084244632 - type: precision value: 88.73025516403402 - type: recall value: 91.73754556500607 - task: type: BitextMining dataset: name: MTEB Tatoeba (cym-eng) type: mteb/tatoeba-bitext-mining config: cym-eng split: test revision: 
9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 81.04347826086956 - type: f1 value: 76.2128364389234 - type: precision value: 74.2 - type: recall value: 81.04347826086956 - task: type: BitextMining dataset: name: MTEB Tatoeba (kaz-eng) type: mteb/tatoeba-bitext-mining config: kaz-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.65217391304348 - type: f1 value: 79.4376811594203 - type: precision value: 77.65797101449274 - type: recall value: 83.65217391304348 - task: type: BitextMining dataset: name: MTEB Tatoeba (est-eng) type: mteb/tatoeba-bitext-mining config: est-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.5 - type: f1 value: 85.02690476190476 - type: precision value: 83.96261904761904 - type: recall value: 87.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (heb-eng) type: mteb/tatoeba-bitext-mining config: heb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 89.3 - type: f1 value: 86.52333333333333 - type: precision value: 85.22833333333332 - type: recall value: 89.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (gla-eng) type: mteb/tatoeba-bitext-mining config: gla-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.01809408926418 - type: f1 value: 59.00594446432805 - type: precision value: 56.827215807915444 - type: recall value: 65.01809408926418 - task: type: BitextMining dataset: name: MTEB Tatoeba (mar-eng) type: mteb/tatoeba-bitext-mining config: mar-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.2 - type: f1 value: 88.58 - type: precision value: 87.33333333333334 - type: recall value: 91.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (lat-eng) type: mteb/tatoeba-bitext-mining config: lat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 59.199999999999996 - type: f1 value: 53.299166276284915 - type: precision value: 51.3383908045977 - type: recall value: 59.199999999999996 - task: type: BitextMining dataset: name: MTEB Tatoeba (bel-eng) type: mteb/tatoeba-bitext-mining config: bel-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.2 - type: f1 value: 91.2 - type: precision value: 90.25 - type: recall value: 93.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (pms-eng) type: mteb/tatoeba-bitext-mining config: pms-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 64.76190476190476 - type: f1 value: 59.867110667110666 - type: precision value: 58.07390192653351 - type: recall value: 64.76190476190476 - task: type: BitextMining dataset: name: MTEB Tatoeba (gle-eng) type: mteb/tatoeba-bitext-mining config: gle-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.2 - type: f1 value: 71.48147546897547 - type: precision value: 69.65409090909091 - type: recall value: 76.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (pes-eng) type: mteb/tatoeba-bitext-mining config: pes-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.8 - type: f1 value: 92.14 - type: precision value: 91.35833333333333 - type: recall value: 93.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (nob-eng) type: 
mteb/tatoeba-bitext-mining config: nob-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.89999999999999 - type: f1 value: 97.2 - type: precision value: 96.85000000000001 - type: recall value: 97.89999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (bul-eng) type: mteb/tatoeba-bitext-mining config: bul-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 92.93333333333334 - type: precision value: 92.13333333333333 - type: recall value: 94.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (cbk-eng) type: mteb/tatoeba-bitext-mining config: cbk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 74.1 - type: f1 value: 69.14817460317461 - type: precision value: 67.2515873015873 - type: recall value: 74.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (hun-eng) type: mteb/tatoeba-bitext-mining config: hun-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.19999999999999 - type: f1 value: 94.01333333333335 - type: precision value: 93.46666666666667 - type: recall value: 95.19999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (uig-eng) type: mteb/tatoeba-bitext-mining config: uig-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.9 - type: f1 value: 72.07523809523809 - type: precision value: 70.19777777777779 - type: recall value: 76.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (rus-eng) type: mteb/tatoeba-bitext-mining config: rus-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.1 - type: f1 value: 92.31666666666666 - type: precision value: 91.43333333333332 - type: recall value: 94.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (spa-eng) type: mteb/tatoeba-bitext-mining config: spa-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.8 - type: f1 value: 97.1 - type: precision value: 96.76666666666668 - type: recall value: 97.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (hye-eng) type: mteb/tatoeba-bitext-mining config: hye-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.85714285714286 - type: f1 value: 90.92093441150045 - type: precision value: 90.00449236298293 - type: recall value: 92.85714285714286 - task: type: BitextMining dataset: name: MTEB Tatoeba (tel-eng) type: mteb/tatoeba-bitext-mining config: tel-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.16239316239316 - type: f1 value: 91.33903133903132 - type: precision value: 90.56267806267806 - type: recall value: 93.16239316239316 - task: type: BitextMining dataset: name: MTEB Tatoeba (afr-eng) type: mteb/tatoeba-bitext-mining config: afr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.4 - type: f1 value: 90.25666666666666 - type: precision value: 89.25833333333334 - type: recall value: 92.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (mon-eng) type: mteb/tatoeba-bitext-mining config: mon-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.22727272727272 - type: f1 value: 87.53030303030303 - type: precision value: 86.37121212121211 - type: recall value: 
90.22727272727272 - task: type: BitextMining dataset: name: MTEB Tatoeba (arz-eng) type: mteb/tatoeba-bitext-mining config: arz-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 79.03563941299791 - type: f1 value: 74.7349505840072 - type: precision value: 72.9035639412998 - type: recall value: 79.03563941299791 - task: type: BitextMining dataset: name: MTEB Tatoeba (hrv-eng) type: mteb/tatoeba-bitext-mining config: hrv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97 - type: f1 value: 96.15 - type: precision value: 95.76666666666668 - type: recall value: 97 - task: type: BitextMining dataset: name: MTEB Tatoeba (nov-eng) type: mteb/tatoeba-bitext-mining config: nov-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.26459143968872 - type: f1 value: 71.55642023346303 - type: precision value: 69.7544932369835 - type: recall value: 76.26459143968872 - task: type: BitextMining dataset: name: MTEB Tatoeba (gsw-eng) type: mteb/tatoeba-bitext-mining config: gsw-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 58.119658119658126 - type: f1 value: 51.65242165242165 - type: precision value: 49.41768108434775 - type: recall value: 58.119658119658126 - task: type: BitextMining dataset: name: MTEB Tatoeba (nds-eng) type: mteb/tatoeba-bitext-mining config: nds-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 74.3 - type: f1 value: 69.52055555555555 - type: precision value: 67.7574938949939 - type: recall value: 74.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (ukr-eng) type: mteb/tatoeba-bitext-mining config: ukr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.8 - type: f1 value: 93.31666666666666 - type: precision value: 92.60000000000001 - type: recall value: 94.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (uzb-eng) type: mteb/tatoeba-bitext-mining config: uzb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.63551401869158 - type: f1 value: 72.35202492211837 - type: precision value: 70.60358255451713 - type: recall value: 76.63551401869158 - task: type: BitextMining dataset: name: MTEB Tatoeba (lit-eng) type: mteb/tatoeba-bitext-mining config: lit-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.4 - type: f1 value: 88.4811111111111 - type: precision value: 87.7452380952381 - type: recall value: 90.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (ina-eng) type: mteb/tatoeba-bitext-mining config: ina-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95 - type: f1 value: 93.60666666666667 - type: precision value: 92.975 - type: recall value: 95 - task: type: BitextMining dataset: name: MTEB Tatoeba (lfn-eng) type: mteb/tatoeba-bitext-mining config: lfn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 67.2 - type: f1 value: 63.01595782872099 - type: precision value: 61.596587301587306 - type: recall value: 67.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (zsm-eng) type: mteb/tatoeba-bitext-mining config: zsm-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.7 - type: f1 value: 94.52999999999999 - 
type: precision value: 94 - type: recall value: 95.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (ita-eng) type: mteb/tatoeba-bitext-mining config: ita-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 93.28999999999999 - type: precision value: 92.675 - type: recall value: 94.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (cmn-eng) type: mteb/tatoeba-bitext-mining config: cmn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.28333333333333 - type: precision value: 94.75 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (lvs-eng) type: mteb/tatoeba-bitext-mining config: lvs-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.9 - type: f1 value: 89.83 - type: precision value: 88.92 - type: recall value: 91.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (glg-eng) type: mteb/tatoeba-bitext-mining config: glg-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.69999999999999 - type: f1 value: 93.34222222222223 - type: precision value: 92.75416666666668 - type: recall value: 94.69999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (ceb-eng) type: mteb/tatoeba-bitext-mining config: ceb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 60.333333333333336 - type: f1 value: 55.31203703703703 - type: precision value: 53.39971108326371 - type: recall value: 60.333333333333336 - task: type: BitextMining dataset: name: MTEB Tatoeba (bre-eng) type: mteb/tatoeba-bitext-mining config: bre-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 12.9 - type: f1 value: 11.099861903031458 - type: precision value: 10.589187932631877 - type: recall value: 12.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (ben-eng) type: mteb/tatoeba-bitext-mining config: ben-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.7 - type: f1 value: 83.0152380952381 - type: precision value: 81.37833333333333 - type: recall value: 86.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (swg-eng) type: mteb/tatoeba-bitext-mining config: swg-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 63.39285714285714 - type: f1 value: 56.832482993197274 - type: precision value: 54.56845238095237 - type: recall value: 63.39285714285714 - task: type: BitextMining dataset: name: MTEB Tatoeba (arq-eng) type: mteb/tatoeba-bitext-mining config: arq-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 48.73765093304062 - type: f1 value: 41.555736920720456 - type: precision value: 39.06874531737319 - type: recall value: 48.73765093304062 - task: type: BitextMining dataset: name: MTEB Tatoeba (kab-eng) type: mteb/tatoeba-bitext-mining config: kab-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 41.099999999999994 - type: f1 value: 36.540165945165946 - type: precision value: 35.05175685425686 - type: recall value: 41.099999999999994 - task: type: BitextMining dataset: name: MTEB Tatoeba (fra-eng) type: mteb/tatoeba-bitext-mining config: fra-eng split: test revision: 
9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.89999999999999 - type: f1 value: 93.42333333333333 - type: precision value: 92.75833333333333 - type: recall value: 94.89999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (por-eng) type: mteb/tatoeba-bitext-mining config: por-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.89999999999999 - type: f1 value: 93.63333333333334 - type: precision value: 93.01666666666665 - type: recall value: 94.89999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (tat-eng) type: mteb/tatoeba-bitext-mining config: tat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.9 - type: f1 value: 73.64833333333334 - type: precision value: 71.90282106782105 - type: recall value: 77.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (oci-eng) type: mteb/tatoeba-bitext-mining config: oci-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 59.4 - type: f1 value: 54.90521367521367 - type: precision value: 53.432840025471606 - type: recall value: 59.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (pol-eng) type: mteb/tatoeba-bitext-mining config: pol-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.39999999999999 - type: f1 value: 96.6 - type: precision value: 96.2 - type: recall value: 97.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (war-eng) type: mteb/tatoeba-bitext-mining config: war-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 67.2 - type: f1 value: 62.25926129426129 - type: precision value: 60.408376623376626 - type: recall value: 67.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (aze-eng) type: mteb/tatoeba-bitext-mining config: aze-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.2 - type: f1 value: 87.60666666666667 - type: precision value: 86.45277777777778 - type: recall value: 90.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (vie-eng) type: mteb/tatoeba-bitext-mining config: vie-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.7 - type: f1 value: 97 - type: precision value: 96.65 - type: recall value: 97.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (nno-eng) type: mteb/tatoeba-bitext-mining config: nno-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.2 - type: f1 value: 91.39746031746031 - type: precision value: 90.6125 - type: recall value: 93.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (cha-eng) type: mteb/tatoeba-bitext-mining config: cha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 32.11678832116788 - type: f1 value: 27.210415386260234 - type: precision value: 26.20408990846947 - type: recall value: 32.11678832116788 - task: type: BitextMining dataset: name: MTEB Tatoeba (mhr-eng) type: mteb/tatoeba-bitext-mining config: mhr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.5 - type: f1 value: 6.787319277832475 - type: precision value: 6.3452094433344435 - type: recall value: 8.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (dan-eng) type: mteb/tatoeba-bitext-mining config: dan-eng 
split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.1 - type: f1 value: 95.08 - type: precision value: 94.61666666666667 - type: recall value: 96.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (ell-eng) type: mteb/tatoeba-bitext-mining config: ell-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.3 - type: f1 value: 93.88333333333333 - type: precision value: 93.18333333333332 - type: recall value: 95.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (amh-eng) type: mteb/tatoeba-bitext-mining config: amh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.11904761904762 - type: f1 value: 80.69444444444444 - type: precision value: 78.72023809523809 - type: recall value: 85.11904761904762 - task: type: BitextMining dataset: name: MTEB Tatoeba (pam-eng) type: mteb/tatoeba-bitext-mining config: pam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 11.1 - type: f1 value: 9.276381801735853 - type: precision value: 8.798174603174601 - type: recall value: 11.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (hsb-eng) type: mteb/tatoeba-bitext-mining config: hsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 63.56107660455487 - type: f1 value: 58.70433569191332 - type: precision value: 56.896926581464015 - type: recall value: 63.56107660455487 - task: type: BitextMining dataset: name: MTEB Tatoeba (srp-eng) type: mteb/tatoeba-bitext-mining config: srp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.69999999999999 - type: f1 value: 93.10000000000001 - type: precision value: 92.35 - type: recall value: 94.69999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (epo-eng) type: mteb/tatoeba-bitext-mining config: epo-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.8 - type: f1 value: 96.01222222222222 - type: precision value: 95.67083333333332 - type: recall value: 96.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (kzj-eng) type: mteb/tatoeba-bitext-mining config: kzj-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 9.2 - type: f1 value: 7.911555250305249 - type: precision value: 7.631246556216846 - type: recall value: 9.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (awa-eng) type: mteb/tatoeba-bitext-mining config: awa-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.48917748917748 - type: f1 value: 72.27375798804371 - type: precision value: 70.14430014430013 - type: recall value: 77.48917748917748 - task: type: BitextMining dataset: name: MTEB Tatoeba (fao-eng) type: mteb/tatoeba-bitext-mining config: fao-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.09923664122137 - type: f1 value: 72.61541257724463 - type: precision value: 70.8998380754106 - type: recall value: 77.09923664122137 - task: type: BitextMining dataset: name: MTEB Tatoeba (mal-eng) type: mteb/tatoeba-bitext-mining config: mal-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 98.2532751091703 - type: f1 value: 97.69529354682193 - type: precision value: 97.42843279961184 - type: recall value: 98.2532751091703 - task: 
type: BitextMining dataset: name: MTEB Tatoeba (ile-eng) type: mteb/tatoeba-bitext-mining config: ile-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 82.8 - type: f1 value: 79.14672619047619 - type: precision value: 77.59489247311828 - type: recall value: 82.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (bos-eng) type: mteb/tatoeba-bitext-mining config: bos-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.35028248587571 - type: f1 value: 92.86252354048965 - type: precision value: 92.2080979284369 - type: recall value: 94.35028248587571 - task: type: BitextMining dataset: name: MTEB Tatoeba (cor-eng) type: mteb/tatoeba-bitext-mining config: cor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.5 - type: f1 value: 6.282429263935621 - type: precision value: 5.783274240739785 - type: recall value: 8.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (cat-eng) type: mteb/tatoeba-bitext-mining config: cat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.7 - type: f1 value: 91.025 - type: precision value: 90.30428571428571 - type: recall value: 92.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (eus-eng) type: mteb/tatoeba-bitext-mining config: eus-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 81 - type: f1 value: 77.8232380952381 - type: precision value: 76.60194444444444 - type: recall value: 81 - task: type: BitextMining dataset: name: MTEB Tatoeba (yue-eng) type: mteb/tatoeba-bitext-mining config: yue-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91 - type: f1 value: 88.70857142857142 - type: precision value: 87.7 - type: recall value: 91 - task: type: BitextMining dataset: name: MTEB Tatoeba (swe-eng) type: mteb/tatoeba-bitext-mining config: swe-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.3 - type: precision value: 94.76666666666667 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (dtp-eng) type: mteb/tatoeba-bitext-mining config: dtp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.1 - type: f1 value: 7.001008218834307 - type: precision value: 6.708329562594269 - type: recall value: 8.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (kat-eng) type: mteb/tatoeba-bitext-mining config: kat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.1313672922252 - type: f1 value: 84.09070598748882 - type: precision value: 82.79171454104429 - type: recall value: 87.1313672922252 - task: type: BitextMining dataset: name: MTEB Tatoeba (jpn-eng) type: mteb/tatoeba-bitext-mining config: jpn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.28333333333333 - type: precision value: 94.73333333333332 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (csb-eng) type: mteb/tatoeba-bitext-mining config: csb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 42.29249011857708 - type: f1 value: 36.981018542283365 - type: precision value: 
35.415877813576024 - type: recall value: 42.29249011857708 - task: type: BitextMining dataset: name: MTEB Tatoeba (xho-eng) type: mteb/tatoeba-bitext-mining config: xho-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.80281690140845 - type: f1 value: 80.86854460093896 - type: precision value: 79.60093896713614 - type: recall value: 83.80281690140845 - task: type: BitextMining dataset: name: MTEB Tatoeba (orv-eng) type: mteb/tatoeba-bitext-mining config: orv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 45.26946107784431 - type: f1 value: 39.80235464678088 - type: precision value: 38.14342660001342 - type: recall value: 45.26946107784431 - task: type: BitextMining dataset: name: MTEB Tatoeba (ind-eng) type: mteb/tatoeba-bitext-mining config: ind-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.3 - type: f1 value: 92.9 - type: precision value: 92.26666666666668 - type: recall value: 94.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (tuk-eng) type: mteb/tatoeba-bitext-mining config: tuk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 37.93103448275862 - type: f1 value: 33.15192743764172 - type: precision value: 31.57456528146183 - type: recall value: 37.93103448275862 - task: type: BitextMining dataset: name: MTEB Tatoeba (max-eng) type: mteb/tatoeba-bitext-mining config: max-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 69.01408450704226 - type: f1 value: 63.41549295774648 - type: precision value: 61.342778895595806 - type: recall value: 69.01408450704226 - task: type: BitextMining dataset: name: MTEB Tatoeba (swh-eng) type: mteb/tatoeba-bitext-mining config: swh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.66666666666667 - type: f1 value: 71.60705960705961 - type: precision value: 69.60683760683762 - type: recall value: 76.66666666666667 - task: type: BitextMining dataset: name: MTEB Tatoeba (hin-eng) type: mteb/tatoeba-bitext-mining config: hin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.8 - type: f1 value: 94.48333333333333 - type: precision value: 93.83333333333333 - type: recall value: 95.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (dsb-eng) type: mteb/tatoeba-bitext-mining config: dsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 52.81837160751566 - type: f1 value: 48.435977731384824 - type: precision value: 47.11291973845539 - type: recall value: 52.81837160751566 - task: type: BitextMining dataset: name: MTEB Tatoeba (ber-eng) type: mteb/tatoeba-bitext-mining config: ber-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 44.9 - type: f1 value: 38.88962621607783 - type: precision value: 36.95936507936508 - type: recall value: 44.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (tam-eng) type: mteb/tatoeba-bitext-mining config: tam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.55374592833876 - type: f1 value: 88.22553125484721 - type: precision value: 87.26927252985884 - type: recall value: 90.55374592833876 - task: type: BitextMining dataset: name: MTEB Tatoeba (slk-eng) type: mteb/tatoeba-bitext-mining config: slk-eng 
split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 93.13333333333333 - type: precision value: 92.45333333333333 - type: recall value: 94.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (tgl-eng) type: mteb/tatoeba-bitext-mining config: tgl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.7 - type: f1 value: 91.99666666666667 - type: precision value: 91.26666666666668 - type: recall value: 93.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (ast-eng) type: mteb/tatoeba-bitext-mining config: ast-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.03937007874016 - type: f1 value: 81.75853018372703 - type: precision value: 80.34120734908137 - type: recall value: 85.03937007874016 - task: type: BitextMining dataset: name: MTEB Tatoeba (mkd-eng) type: mteb/tatoeba-bitext-mining config: mkd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.3 - type: f1 value: 85.5 - type: precision value: 84.25833333333334 - type: recall value: 88.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (khm-eng) type: mteb/tatoeba-bitext-mining config: khm-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.51246537396122 - type: f1 value: 60.02297410192148 - type: precision value: 58.133467727289236 - type: recall value: 65.51246537396122 - task: type: BitextMining dataset: name: MTEB Tatoeba (ces-eng) type: mteb/tatoeba-bitext-mining config: ces-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96 - type: f1 value: 94.89 - type: precision value: 94.39166666666667 - type: recall value: 96 - task: type: BitextMining dataset: name: MTEB Tatoeba (tzl-eng) type: mteb/tatoeba-bitext-mining config: tzl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 57.692307692307686 - type: f1 value: 53.162393162393165 - type: precision value: 51.70673076923077 - type: recall value: 57.692307692307686 - task: type: BitextMining dataset: name: MTEB Tatoeba (urd-eng) type: mteb/tatoeba-bitext-mining config: urd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.60000000000001 - type: f1 value: 89.21190476190475 - type: precision value: 88.08666666666667 - type: recall value: 91.60000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (ara-eng) type: mteb/tatoeba-bitext-mining config: ara-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88 - type: f1 value: 85.47 - type: precision value: 84.43266233766234 - type: recall value: 88 - task: type: BitextMining dataset: name: MTEB Tatoeba (kor-eng) type: mteb/tatoeba-bitext-mining config: kor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.7 - type: f1 value: 90.64999999999999 - type: precision value: 89.68333333333332 - type: recall value: 92.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (yid-eng) type: mteb/tatoeba-bitext-mining config: yid-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 80.30660377358491 - type: f1 value: 76.33044137466307 - type: precision value: 74.78970125786164 - type: recall value: 80.30660377358491 - task: type: BitextMining dataset: name: MTEB 
Tatoeba (fin-eng) type: mteb/tatoeba-bitext-mining config: fin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.44 - type: precision value: 94.99166666666666 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (tha-eng) type: mteb/tatoeba-bitext-mining config: tha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.53284671532847 - type: f1 value: 95.37712895377129 - type: precision value: 94.7992700729927 - type: recall value: 96.53284671532847 - task: type: BitextMining dataset: name: MTEB Tatoeba (wuu-eng) type: mteb/tatoeba-bitext-mining config: wuu-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 89 - type: f1 value: 86.23190476190476 - type: precision value: 85.035 - type: recall value: 89 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.585 - type: map_at_10 value: 9.012 - type: map_at_100 value: 14.027000000000001 - type: map_at_1000 value: 15.565000000000001 - type: map_at_3 value: 5.032 - type: map_at_5 value: 6.657 - type: mrr_at_1 value: 28.571 - type: mrr_at_10 value: 45.377 - type: mrr_at_100 value: 46.119 - type: mrr_at_1000 value: 46.127 - type: mrr_at_3 value: 41.156 - type: mrr_at_5 value: 42.585 - type: ndcg_at_1 value: 27.551 - type: ndcg_at_10 value: 23.395 - type: ndcg_at_100 value: 33.342 - type: ndcg_at_1000 value: 45.523 - type: ndcg_at_3 value: 25.158 - type: ndcg_at_5 value: 23.427 - type: precision_at_1 value: 28.571 - type: precision_at_10 value: 21.429000000000002 - type: precision_at_100 value: 6.714 - type: precision_at_1000 value: 1.473 - type: precision_at_3 value: 27.211000000000002 - type: precision_at_5 value: 24.490000000000002 - type: recall_at_1 value: 2.585 - type: recall_at_10 value: 15.418999999999999 - type: recall_at_100 value: 42.485 - type: recall_at_1000 value: 79.536 - type: recall_at_3 value: 6.239999999999999 - type: recall_at_5 value: 8.996 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.3234 - type: ap value: 14.361688653847423 - type: f1 value: 54.819068624319044 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 61.97792869269949 - type: f1 value: 62.28965628513728 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 38.90540145385218 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.53513739047506 - type: cos_sim_ap value: 75.27741586677557 - type: cos_sim_f1 value: 69.18792902473774 - type: cos_sim_precision value: 67.94708725515136 - type: cos_sim_recall value: 70.47493403693932 - type: dot_accuracy value: 84.7052512368123 - type: dot_ap value: 69.36075482849378 - type: 
dot_f1 value: 64.44688376631296 - type: dot_precision value: 59.92288500793831 - type: dot_recall value: 69.70976253298153 - type: euclidean_accuracy value: 86.60666388508076 - type: euclidean_ap value: 75.47512772621097 - type: euclidean_f1 value: 69.413872536473 - type: euclidean_precision value: 67.39562624254472 - type: euclidean_recall value: 71.55672823218997 - type: manhattan_accuracy value: 86.52917684925792 - type: manhattan_ap value: 75.34000110496703 - type: manhattan_f1 value: 69.28489190226429 - type: manhattan_precision value: 67.24608889992551 - type: manhattan_recall value: 71.45118733509234 - type: max_accuracy value: 86.60666388508076 - type: max_ap value: 75.47512772621097 - type: max_f1 value: 69.413872536473 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.01695967710637 - type: cos_sim_ap value: 85.8298270742901 - type: cos_sim_f1 value: 78.46988128389272 - type: cos_sim_precision value: 74.86017897091722 - type: cos_sim_recall value: 82.44533415460425 - type: dot_accuracy value: 88.19420188613343 - type: dot_ap value: 83.82679165901324 - type: dot_f1 value: 76.55833777304208 - type: dot_precision value: 75.6884875846501 - type: dot_recall value: 77.44841392054204 - type: euclidean_accuracy value: 89.03054294252338 - type: euclidean_ap value: 85.89089555185325 - type: euclidean_f1 value: 78.62997658079624 - type: euclidean_precision value: 74.92329149232914 - type: euclidean_recall value: 82.72251308900523 - type: manhattan_accuracy value: 89.0266620095471 - type: manhattan_ap value: 85.86458997929147 - type: manhattan_f1 value: 78.50685331000291 - type: manhattan_precision value: 74.5499861534201 - type: manhattan_recall value: 82.90729904527257 - type: max_accuracy value: 89.03054294252338 - type: max_ap value: 85.89089555185325 - type: max_f1 value: 78.62997658079624 --- ## Multilingual-E5-large [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf). Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022 This model has 24 layers and the embedding size is 1024. ## Usage Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset. ```python import torch.nn.functional as F from torch import Tensor from transformers import AutoTokenizer, AutoModel def average_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor: last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0) return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None] # Each input text should start with "query: " or "passage: ", even for non-English texts. # For tasks other than retrieval, you can simply use the "query: " prefix. input_texts = ['query: how much protein should a female eat', 'query: 南瓜的家常做法', "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. 
Check out the chart below to see how much protein you should be eating each day.", "passage: 1.清炒南瓜丝 原料:嫩南瓜半个 调料:葱、盐、白糖、鸡精 做法: 1、南瓜用刀薄薄的削去表面一层皮,用勺子刮去瓤 2、擦成细丝(没有擦菜板就用刀慢慢切成细丝) 3、锅烧热放油,入葱花煸出香味 4、入南瓜丝快速翻炒一分钟左右,放盐、一点白糖和鸡精调味出锅 2.香葱炒南瓜 原料:南瓜1只 调料:香葱、蒜末、橄榄油、盐 做法: 1、将南瓜去皮,切成片 2、油锅8成热后,将蒜末放入爆香 3、爆香后,将南瓜片放入,翻炒 4、在翻炒的同时,可以不时地往锅里加水,但不要太多 5、放入盐,炒匀 6、南瓜差不多软和绵了之后,就可以关火 7、撒入香葱,即可出锅"] tokenizer = AutoTokenizer.from_pretrained('intfloat/multilingual-e5-large') model = AutoModel.from_pretrained('intfloat/multilingual-e5-large') # Tokenize the input texts batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt') outputs = model(**batch_dict) embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask']) # normalize embeddings embeddings = F.normalize(embeddings, p=2, dim=1) scores = (embeddings[:2] @ embeddings[2:].T) * 100 print(scores.tolist()) ``` ## Supported Languages This model is initialized from [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) and continually trained on a mixture of multilingual datasets. It supports 100 languages from xlm-roberta, but low-resource languages may see performance degradation. ## Training Details **Initialization**: [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) **First stage**: contrastive pre-training with weak supervision | Dataset | Weak supervision | # of text pairs | |--------------------------------------------------------------------------------------------------------|---------------------------------------|-----------------| | Filtered [mC4](https://huggingface.co/datasets/mc4) | (title, page content) | 1B | | [CC News](https://huggingface.co/datasets/intfloat/multilingual_cc_news) | (title, news content) | 400M | | [NLLB](https://huggingface.co/datasets/allenai/nllb) | translation pairs | 2.4B | | [Wikipedia](https://huggingface.co/datasets/intfloat/wikipedia) | (hierarchical section title, passage) | 150M | | Filtered [Reddit](https://www.reddit.com/) | (comment, response) | 800M | | [S2ORC](https://github.com/allenai/s2orc) | (title, abstract) and citation pairs | 100M | | [Stackexchange](https://stackexchange.com/) | (question, answer) | 50M | | [xP3](https://huggingface.co/datasets/bigscience/xP3) | (input prompt, response) | 80M | | [Miscellaneous unsupervised SBERT data](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) | - | 10M | **Second stage**: supervised fine-tuning | Dataset | Language | # of text pairs | |----------------------------------------------------------------------------------------|--------------|-----------------| | [MS MARCO](https://microsoft.github.io/msmarco/) | English | 500k | | [NQ](https://github.com/facebookresearch/DPR) | English | 70k | | [Trivia QA](https://github.com/facebookresearch/DPR) | English | 60k | | [NLI from SimCSE](https://github.com/princeton-nlp/SimCSE) | English | <300k | | [ELI5](https://huggingface.co/datasets/eli5) | English | 500k | | [DuReader Retrieval](https://github.com/baidu/DuReader/tree/master/DuReader-Retrieval) | Chinese | 86k | | [KILT Fever](https://huggingface.co/datasets/kilt_tasks) | English | 70k | | [KILT HotpotQA](https://huggingface.co/datasets/kilt_tasks) | English | 70k | | [SQuAD](https://huggingface.co/datasets/squad) | English | 87k | | [Quora](https://huggingface.co/datasets/quora) | English | 150k | | [Mr. 
TyDi](https://huggingface.co/datasets/castorini/mr-tydi) | 11 languages | 50k | | [MIRACL](https://huggingface.co/datasets/miracl/miracl) | 16 languages | 40k | For all labeled datasets, we only use their training sets for fine-tuning. For other training details, please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf). ## Benchmark Results on [Mr. TyDi](https://arxiv.org/abs/2108.08787) | Model | Avg MRR@10 | | ar | bn | en | fi | id | ja | ko | ru | sw | te | th | |-----------------------|------------|-------|------| --- | --- | --- | --- | --- | --- | --- |------| --- | --- | | BM25 | 33.3 | | 36.7 | 41.3 | 15.1 | 28.8 | 38.2 | 21.7 | 28.1 | 32.9 | 39.6 | 42.4 | 41.7 | | mDPR | 16.7 | | 26.0 | 25.8 | 16.2 | 11.3 | 14.6 | 18.1 | 21.9 | 18.5 | 7.3 | 10.6 | 13.5 | | BM25 + mDPR | 41.7 | | 49.1 | 53.5 | 28.4 | 36.5 | 45.5 | 35.5 | 36.2 | 42.7 | 40.5 | 42.0 | 49.2 | | multilingual-e5-small | 64.4 | | 71.5 | 66.3 | 54.5 | 57.7 | 63.2 | 55.4 | 54.3 | 60.8 | 65.4 | 89.1 | 70.1 | | multilingual-e5-base | 65.9 | | 72.3 | 65.0 | 58.5 | 60.8 | 64.9 | 56.6 | 55.8 | 62.7 | 69.0 | 86.6 | 72.7 | | multilingual-e5-large | **70.5** | | 77.5 | 73.2 | 60.8 | 66.8 | 68.5 | 62.5 | 61.6 | 65.8 | 72.7 | 90.2 | 76.2 | ## MTEB Benchmark Evaluation Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB benchmark](https://arxiv.org/abs/2210.07316). ## Support for Sentence Transformers Below is an example of usage with sentence_transformers. ```python from sentence_transformers import SentenceTransformer model = SentenceTransformer('intfloat/multilingual-e5-large') input_texts = [ 'query: how much protein should a female eat', 'query: 南瓜的家常做法', "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.", "passage: 1.清炒南瓜丝 原料:嫩南瓜半个 调料:葱、盐、白糖、鸡精 做法: 1、南瓜用刀薄薄的削去表面一层皮,用勺子刮去瓤 2、擦成细丝(没有擦菜板就用刀慢慢切成细丝) 3、锅烧热放油,入葱花煸出香味 4、入南瓜丝快速翻炒一分钟左右,放盐、一点白糖和鸡精调味出锅 2.香葱炒南瓜 原料:南瓜1只 调料:香葱、蒜末、橄榄油、盐 做法: 1、将南瓜去皮,切成片 2、油锅8成热后,将蒜末放入爆香 3、爆香后,将南瓜片放入,翻炒 4、在翻炒的同时,可以不时地往锅里加水,但不要太多 5、放入盐,炒匀 6、南瓜差不多软和绵了之后,就可以关火 7、撒入香葱,即可出锅" ] embeddings = model.encode(input_texts, normalize_embeddings=True) ``` Package requirements: `pip install sentence_transformers~=2.2.2` Contributors: [michaelfeil](https://huggingface.co/michaelfeil) ## FAQ **1. Do I need to add the prefix "query: " and "passage: " to input texts?** Yes, this is how the model is trained; otherwise you will see a performance degradation. Here are some rules of thumb: - Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA, ad-hoc information retrieval. - Use "query: " prefix for symmetric tasks such as semantic similarity, bitext mining, paraphrase retrieval. - Use "query: " prefix if you want to use embeddings as features, such as linear probing classification, clustering. **2. Why are my reproduced results slightly different from those reported in the model card?** Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences. **3. 
Why do the cosine similarity scores distribute around 0.7 to 1.0?** This is a known and expected behavior, because we use a low temperature of 0.01 for the InfoNCE contrastive loss. For text embedding tasks like text retrieval or semantic similarity, what matters is the relative order of the scores rather than their absolute values, so this should not be an issue. ## Citation If you find our paper or models helpful, please consider citing as follows: ``` @article{wang2022text, title={Text Embeddings by Weakly-Supervised Contrastive Pre-training}, author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu}, journal={arXiv preprint arXiv:2212.03533}, year={2022} } ``` ## Limitations Long texts will be truncated to at most 512 tokens.
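To make FAQ points 1 and 3 above concrete, here is a minimal sketch: every input carries a "query: " or "passage: " prefix, and although both cosine scores typically land in the high 0.7–1.0 band, the ranking is still meaningful. The query and passages below are invented for illustration and are not part of the official examples.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('intfloat/multilingual-e5-large')

# FAQ 1: every input gets a "query: " or "passage: " prefix.
query = 'query: how much protein should a female eat'
passages = [
    'passage: The recommended daily protein intake for adult women is about 46 grams.',
    'passage: Stir-fry pumpkin slices with scallions, garlic and a pinch of salt.',
]

embeddings = model.encode([query] + passages, normalize_embeddings=True)
# Dot product of normalized vectors equals cosine similarity.
scores = embeddings[0] @ embeddings[1:].T

# FAQ 3: both scores tend to sit around 0.7-1.0; only the ordering matters.
print(scores)
print('best passage index:', int(np.argmax(scores)))
```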
[ "SEMANTIC_SIMILARITY", "TRANSLATION", "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
tanbinh2210/mlm_finetuned_3_phobert
tanbinh2210
sentence-similarity
[ "sentence-transformers", "safetensors", "roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:357018", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:tanbinh2210/mlm_finetuned_2_phobert", "base_model:finetune:tanbinh2210/mlm_finetuned_2_phobert", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,732
1,732
7
0
--- base_model: tanbinh2210/mlm_finetuned_2_phobert library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:357018 - loss:MultipleNegativesRankingLoss widget: - source_sentence: đánh_giá phẩm_chất chính chị của cán_bộ đang công_tác tại mặt_trận tổ_quốc qua những nội_dung nào ? sentences: - 'trách_nhiệm và mối quan_hệ công_tác trách_nhiệm và mối quan_hệ công_tác giữa học_viện với lãnh_đạo bộ_tư_pháp , các đơn_vị thuộc bộ_tư_pháp , các sở tư_pháp , cục thi_hành án dân_sự , các tổ_chức và cá_nhân khác có liên_quan được thực_hiện theo quy_chế làm_việc của bộ_tư_pháp và các quy_định cụ_thể sau : 1 . học_viện chịu sự chỉ_đạo trực_tiếp của bộ_trưởng và thứ_trưởng được bộ_trưởng phân_công phụ_trách , có trách_nhiệm tổ_chức thực_hiện , báo_cáo và chịu trách_nhiệm trước bộ_trưởng , thứ_trưởng phụ_trách và trước pháp_luật về kết_quả giải_quyết công_việc được giao . 2 . học_viện là đầu_mối tham_mưu , giúp lãnh_đạo bộ thực_hiện quan_hệ với tòa_án_nhân_dân_tối_cao , viện_kiểm_sát_nhân_dân_tối_cao , liên_đoàn luật_sư việt_nam , các bộ , ngành , địa_phương , các cơ_quan , tổ_chức khác trong lĩnh_vực thuộc phạm_vi chức_năng , nhiệm_vụ của học_viện theo quy_định của pháp_luật vè phân_cấp của bộ_tư_pháp .' - tiêu_chuẩn về phẩm_chất chính_trị , đạo_đức , lối sống 1 . trung_thành với tổ_quốc , với đảng ; kiên_định với đường_lối đổi_mới , độc_lập dân_tộc , dân_chủ và chủ_nghĩa_xã_hội của đảng , nhà_nước ; tích_cực tham_gia sự_nghiệp công_nghiệp hóa , hiện_đại_hóa đất_nước , trước_hết là việc cải_cách và hiện_đại hóa ngành , lĩnh_vực được phân_công quản_lý . 2 . có bản_lĩnh chính_trị vững_vàng , có tư_tưởng đổi_mới , dám nghĩ , dám làm , dám chịu trách_nhiệm cá_nhân ; không có biểu_hiện tiêu_cực , sách_nhiễu , cửa_quyền , tham_nhũng , lãng_phí . 3 . có lý_lịch rõ_ràng , đạo_đức tốt , lối sống lành_mạnh . 4 . có tinh_thần đoàn_kết nội_bộ ; gương_mẫu chấp_hành các chủ_trương , đường_lối của đảng , pháp_luật của nhà_nước , quy_định của cơ_quan và nơi cư_trú ; 5 . chấp_hành nguyên_tắc tập_trung_dân_chủ và quy_chế dân_chủ cơ_sở trong quá_trình thực_hiện nhiệm_vụ chính_trị được giao . - 'nội_dung đánh_giá 1 - mức_độ thực_hiện chức_trách , nhiệm_vụ được giao : thể_hiện ở khối_lượng , chất_lượng , tiến_độ , hiệu_quả của công_việc trong từng vị_trí , từng thời_gian ; tinh_thần trách_nhiệm trong công_tác . 2 - về phẩm_chất chính_trị , đạo_đức , lối sống - nhận_thức , tư_tưởng chính_trị ; việc chấp_hành chủ_trương , đường_lối và quy_chế , quy_định của đảng , chính_sách , pháp_luật của nhà_nước . - việc giữ_gìn đạo_đức và lối sống lành_mạnh ; chống quan_liêu , tham_nhũng , lãng_phí và những biểu_hiện tiêu_cực khác . - tinh_thần học_tập nâng cao trình_độ ; tính trung_thực , ý_thức tổ_chức kỷ_luật ; tinh_thần tự_phê_bình và phê_bình . - đoàn_kết , quan_hệ trong công_tác ; mối quan_hệ , tinh_thần và thái_độ phục_vụ nhân_dân . 3 - chiều_hướng và triển_vọng phát_triển .' - source_sentence: trách_nhiệm của sở y_tế trong việc chẩn_đoán xác_định tình_trạng nghiện ma_túy là gì ? sentences: - ủy ban_nhân_dân cấp tỉnh có trách_nhiệm chỉ_đạo sở lao_động - thương_binh và xã_hội , sở y_tế và các cơ_quan có liên_quan hướng_dẫn , kiểm_tra các trung_tâm chữa bệnh - giáo_dục - lao_động xã_hội và các cơ_sở cai_nghiện ma_túy tự_nguyện thực_hiện các quy_định về quy_trình điều_trị cho người nghiện ma túy theo thông_tư này và các văn_bản quy_phạm_pháp_luật có liên_quan . - '4 . 
trách_nhiệm của y_tế ngành : chủ_trì , phối_hợp với các đơn_vị liên_quan tổ_chức triển_khai , hướng_dẫn , thanh_tra , kiểm_tra và đánh_giá việc thực_hiện thông_tư này trong phạm_vi quản_lý của bộ , ngành . 5 . trách_nhiệm của cơ_sở y_tế : a ) tổ_chức thực_hiện các quy_định về tiêu_chuẩn chẩn_đoán và quy_trình chuyên_môn xác_định tình_trạng nghiện ma túy theo đúng quy_định tại thông_tư này và các quy_định có liên_quan ; b ) tổ_chức , đào_tạo , tập_huấn , truyền_thông , phổ_biến cho các đối_tượng có liên_quan các quy_định về xác_định tình_trạng nghiện ma_túy ; c ) tổ_chức điều_trị hội_chứng_cai , các rối_loạn tâm_thần và các bệnh kèm theo ( nếu có ) cho người cần xác_định tình_trạng nghiện ma_túy trong thời_gian xác_định tình_trạng nghiện ma_túy ; d ) thực_hiện việc lưu_giữ hồ_sơ xác_định tình_trạng nghiện ma_túy theo quy_định của pháp_luật về khám bệnh , chữa bệnh ; đ ) báo_cáo kết_quả hoạt_động xác_định tình_trạng nghiện ma_túy của cơ_sở y_tế .' - 'điều 28 . hồ_sơ , thủ_tục đăng_ký tạm_trú , gia_hạn tạm_trú 1 . hồ_sơ đăng_ký tạm_trú bao_gồm : a ) tờ khai thay_đổi thông_tin cư_trú ; đối_với người đăng_ký tạm_trú là người chưa thành_niên thì trong tờ khai phải ghi rõ ý_kiến đồng_ý của cha , mẹ hoặc người giám_hộ , trừ trường_hợp đã có ý_kiến đồng_ý bằng văn_bản ; b ) giấy_tờ , tài_liệu chứng_minh chỗ ở hợp_pháp . 2 . người đăng_ký tạm_trú nộp hồ_sơ đăng_ký tạm_trú đến cơ_quan đăng_ký cư_trú nơi mình dự_kiến tạm_trú . khi tiếp_nhận hồ_sơ đăng_ký tạm_trú , cơ_quan đăng_ký cư_trú kiểm_tra và cấp phiếu tiếp_nhận hồ_sơ cho người đăng_ký ; trường_hợp hồ_sơ chưa đầy_đủ thì hướng_dẫn người đăng_ký bổ_sung hồ_sơ . trong thời_hạn 03 ngày làm_việc kể từ ngày nhận được hồ_sơ đầy_đủ và hợp_lệ , cơ_quan đăng_ký cư_trú có trách_nhiệm thẩm_định , cập_nhật thông_tin về nơi tạm_trú mới , thời_hạn tạm_trú của người đăng_ký vào cơ_sở_dữ_liệu về cư_trú và thông_báo cho người đăng_ký về việc đã cập_nhật thông_tin đăng_ký tạm_trú ; trường_hợp từ_chối đăng_ký thì phải trả_lời bằng văn_bản và nêu rõ lý_do .' - source_sentence: chánh_án tòa_án quân_sự trung_ương không được hưởng chế_độ phụ_cấp đặc_thù trong trường_hợp nào ? sentences: - 'iii . cách tính trả . 1 . đối_tượng quy_định tại điều 1 thông_tư này được bổ_nhiệm từ tháng nào thì được hưởng chế_độ phụ_cấp đặc_thù từ tháng đó . khi bị miễn_nhiệm , cách_chức , từ trần hoặc thôi giữ chức_danh quy_định tại điều 1 thông_tư này từ tháng nào thì thôi_hưởng chế_độ phụ_cấp đặc_thù từ tháng tiếp_theo . các trường_hợp sau không được hưởng phụ đặc_thù : - thời_gian được cử đi công_tác , làm_việc , học_tập ở nước_ngoài được hưởng 40 % tiền_lương theo quy_định tại khoản 4 , điều 8 nghị_định số 204 / 2004 / nđ-cp ngày 14/12/2004 của chính_phủ ; - thời_gian đi công_tác , học_tập ở trong nước không trực_tiếp làm công_tác chuyên_môn từ 3 tháng trở lên ; - thời_gian bị ốm_đau , thai_sản nghỉ vượt quá thời_hạn quy_định của luật bảo_hiểm_xã_hội ; - thời_gian nghỉ_việc riêng không hưởng lương từ 1 tháng trở lên ; - thời_gian bị đình_chỉ công_tác . 2 . phụ_cấp đặc_thù đối_với một_số chức_danh tư_pháp và thanh_tra trong quân_đội không được tính để hưởng các chế_độ bảo_hiểm_xã_hội , bảo_hiểm_y_tế . 3 . mức phụ_cấp đặc_thù quy_định tại thông_tư này được tính trả cùng kỳ lương hàng tháng ; đối_tượng thuộc đơn_vị nào do đơn_vị đó chi_trả và hạch_toán vào mục 102 , tiểu_mục 08 , ngành tương_ứng trong mục_lục ngân_sách nhà_nước áp_dụng trong quân_đội .' - 'cách tính hưởng phụ_cấp 1 . 
mức phụ_cấp đặc_thù quy_định tại điều 2 quyết_định này được tính trên mức lương cấp_bậc quân_hàm , ngạch bậc hiện_hưởng hoặc phụ_cấp quân_hàm_cộng phụ_cấp chức_vụ lãnh_đạo và phụ_cấp thâm_niên vượt khung ( nếu có ) . 2 . khi chuyển công_tác khác mà không giữ các chức_vụ , chức_danh quy_định cho các đối_tượng tại điều 2 quyết_định này hoặc nghỉ chuẩn_bị hưu hoặc thôi phục_vụ trong quân_đội thì thôi_hưởng phụ_cấp đặc_thù từ tháng tiếp_theo . 3 . thời_gian không được tính hưởng phụ_cấp đặc_thù , bao_gồm : a ) thời_gian đi công_tác , làm_việc học_tập ở nước_ngoài được hưởng tiền_lương theo quy_định tại khoản 4 điều 8 nghị_định số 204 / 2004 / nđ-cp ngày 14 tháng 12 năm 2004 của chính_phủ về chế_độ tiền_lương đối_với cán_bộ , công_chức , viên_chức và lực_lượng_vũ_trang ; b ) thời_gian nghỉ_việc không hưởng lương liên_tục từ 1 tháng trở lên ; c ) thời_gian nghỉ_việc hưởng bảo_hiểm_xã_hội theo quy_định của pháp_luật về bảo_hiểm_xã_hội ; d ) thời_gian bị đình_chỉ công_tác hoặc bị tạm giữ , tạm giam .' - 'trực_ca của thuyền_viên ... 2 . trực_ca là nhiệm_vụ của thuyền_viên và phải được duy_trì một_cách thích_hợp , hiệu_quả để đảm_bảo an_toàn , an_ninh và phòng_ngừa ô_nhiễm môi_trường . ca trực của mỗi thuyền_viên được chia thành ca biển và ca bờ : a ) thời_gian trực_ca biển là 04 giờ và mỗi ngày trực 02 ca cách nhau 08 giờ ; trường_hợp có thay_đổi múi_giờ thì thời_gian trực_ca biển do thuyền_trưởng quyết_định ; b ) thời_gian trực ca bờ do thuyền_trưởng quy_định , căn_cứ vào điều_kiện cụ_thể khi tàu neo_đậu . ...' - source_sentence: quy_định về xử_phạt vi_phạm hành_chính đối_với hành_vi đánh người gây thương_tích ? sentences: - 'vi_phạm quy_định về trật_tự công_cộng ... 5 . phạt tiền từ 5.000.000 đồng đến 8.000.000 đồng đối_với một trong những hành_vi sau đây : a ) cố_ý gây thương_tích hoặc gây tổn_hại cho sức_khỏe của người khác nhưng không bị truy_cứu trách_nhiệm hình_sự ; b ) gây_rối trật_tự công_cộng mà có mang theo các loại vũ_khí thô_sơ , công_cụ hỗ_trợ hoặc công_cụ , đồ_vật , phương_tiện khác có khả_năng sát_thương ; c ) quay_phim , chụp ảnh , vẽ sơ_đồ địa_điểm cấm , khu_vực cấm liên_quan đến quốc_phòng , an_ninh ; d ) dâm_ô đối_với người dưới 16 tuổi nhưng không bị truy_cứu trách_nhiệm hình_sự ; đ ) sàm sỡ , quấy_rối tình_dục ; e ) khiêu_dâm , kích_dục ở nơi công_cộng ; g ) thực_hiện thiết_kế , sản_xuất , sửa_chữa , bảo_dưỡng , thử_nghiệm tàu bay , động_cơ tàu bay , cánh_quạt tàu bay và trang_bị , thiết_bị của tàu bay không người lái , phương_tiện bay siêu_nhẹ có chủng_loại hoặc chất_lượng không phù_hợp với loại sản_phẩm đã đăng_ký theo giấy_phép do cơ_quan có thẩm_quyền cấp ; h ) sử_dụng tàu bay không người lái và các phương_tiện bay siêu nhẹ phóng , bắn , thả từ trên không các loại vật , chất gây hại hoặc chứa_đựng nguy_cơ gây hại khi không được phép . ...' - '" 5 . phạt tiền từ 5.000.000 đồng đến 8.000.000 đồng đối_với một trong những hành_vi sau đây : a ) cố_ý gây thương_tích hoặc gây tổn_hại cho sức_khỏe của người khác nhưng không bị truy_cứu trách_nhiệm hình_sự ; b ) gây_rối trật_tự công_cộng mà có mang theo các loại vũ_khí thô_sơ , công_cụ hỗ_trợ hoặc công_cụ , đồ_vật , phương_tiện khác có khả_năng sát_thương ; ... "' - 'tiêu_chuẩn cơ_sở_vật_chất mức_độ 1 ... 3 . khối phòng hỗ_trợ học_tập thư_viện : có phòng đọc cho học_sinh tối_thiểu 35 chỗ , phòng đọc giáo_viên tối_thiểu 20 chỗ . 4 . khối phụ_trợ_a ) phòng nghỉ giáo_viên : bố_trí liền kề với khối phòng học_tập , bảo_đảm 10 lớp có 01 phòng ; b ) khu vệ_sinh học_sinh : khu vệ_sinh riêng cho mỗi tầng nhà , mỗi dãy phòng học .' 
- source_sentence: hồ_sơ xin thôi quốc_tịch việt_nam bao_gồm những gì ? sentences: - 3 . bản_sao giấy khai_sinh của người con chưa thành_niên cùng thôi quốc_tịch việt_nam theo cha_mẹ hoặc giấy_tờ hợp_lệ khác chứng_minh quan_hệ cha_con , mẹ_con . trường_hợp chỉ người cha hoặc người mẹ thôi quốc_tịch việt_nam mà con chưa thành_niên sinh_sống cùng người đó thôi quốc_tịch việt_nam theo cha hoặc mẹ thì phải nộp văn_bản thỏa_thuận có đủ chữ_ký của cha_mẹ về việc xin thôi quốc_tịch việt_nam cho con . văn_bản thỏa_thuận không phải chứng_thực chữ_ký ; người đứng đơn xin thôi quốc_tịch việt_nam cho con phải chịu trách_nhiệm về tính chính_xác chữ_ký của người kia . trường_hợp cha , mẹ đã chết , bị mất năng_lực hành_vi dân_sự hoặc hạn_chế năng_lực hành_vi dân_sự thì văn_bản thỏa_thuận được thay_thế bằng giấy_tờ chứng_minh cha , mẹ đã chết , bị mất hoặc hạn_chế năng_lực hành_vi dân_sự . 4 . hồ_sơ xin thôi quốc_tịch việt_nam phải lập thành 3 bộ , được lưu_trữ tại văn_phòng chủ_tịch nước , bộ_tư_pháp và cơ_quan thụ_lý hồ_sơ . - 'điều 28 . hồ_sơ xin thôi quốc_tịch việt_nam 1 . hồ_sơ xin thôi quốc_tịch việt_nam bao_gồm : a ) đơn xin thôi quốc_tịch việt_nam ; b ) bản khai_lý_lịch ; c ) bản_sao hộ_chiếu việt_nam , giấy_chứng_minh nhân_dân hoặc giấy_tờ khác quy_định tại điều 11 của luật này ; d ) phiếu lý_lịch tư_pháp do cơ_quan có thẩm_quyền của việt_nam cấp . phiếu lý_lịch tư_pháp phải là phiếu được cấp không quá 90 ngày tính đến ngày nộp hồ_sơ ; đ ) giấy_tờ xác_nhận về việc người đó đang làm thủ_tục nhập quốc_tịch nước_ngoài , trừ trường_hợp pháp_luật nước đó không quy_định về việc cấp giấy này ; e ) giấy xác_nhận không nợ thuế do cục thuế nơi người xin thôi quốc_tịch việt_nam cư_trú cấp ; g ) đối_với người trước_đây là cán_bộ , công_chức , viên_chức hoặc phục_vụ trong lực_lượng_vũ_trang nhân_dân việt_nam đã nghỉ hưu , thôi_việc , bị miễn_nhiệm , bãi_nhiệm , cách_chức hoặc giải_ngũ , phục_viên chưa quá 5 năm thì còn phải nộp giấy của cơ_quan , tổ_chức , đơn_vị đã ra quyết_định cho nghỉ hưu , cho thôi_việc , miễn_nhiệm , bãi_nhiệm , cách_chức hoặc giải_ngũ , phục_viên xác_nhận việc thôi quốc_tịch việt_nam của người đó không phương_hại đến lợi_ích quốc_gia của việt_nam .' - '" điều 23 . vi_phạm quy_định về hoạt_động ngoại_hối ... 4 . phạt tiền từ 30.000.000 đồng đến 50.000.000 đồng đối_với một trong các hành_vi vi_phạm sau đây : ... n ) giao_dịch , báo_giá , định_giá , ghi_giá trong hợp_đồng , thỏa_thuận , niêm_yết , quảng_cáo giá hàng hóa , dịch_vụ , quyền sử_dụng đất và các hình_thức tương_tự khác ( bao_gồm cả quy_đổi hoặc điều_chỉnh giá hàng hóa , dịch_vụ , giá_trị của hợp_đồng , thỏa_thuận ) bằng ngoại_tệ không đúng quy_định của pháp_luật ; ... "' --- # SentenceTransformer based on tanbinh2210/mlm_finetuned_2_phobert This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [tanbinh2210/mlm_finetuned_2_phobert](https://huggingface.co/tanbinh2210/mlm_finetuned_2_phobert) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [tanbinh2210/mlm_finetuned_2_phobert](https://huggingface.co/tanbinh2210/mlm_finetuned_2_phobert) <!-- at revision 605c2d270bae808b7fadd84e5108d958a37e68fc --> - **Maximum Sequence Length:** 256 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity - **Training Dataset:** - json <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("tanbinh2210/mlm_finetuned_3_phobert") # Run inference sentences = [ 'hồ_sơ xin thôi quốc_tịch việt_nam bao_gồm những gì ?', 'điều 28 . hồ_sơ xin thôi quốc_tịch việt_nam 1 . hồ_sơ xin thôi quốc_tịch việt_nam bao_gồm : a ) đơn xin thôi quốc_tịch việt_nam ; b ) bản khai_lý_lịch ; c ) bản_sao hộ_chiếu việt_nam , giấy_chứng_minh nhân_dân hoặc giấy_tờ khác quy_định tại điều 11 của luật này ; d ) phiếu lý_lịch tư_pháp do cơ_quan có thẩm_quyền của việt_nam cấp . phiếu lý_lịch tư_pháp phải là phiếu được cấp không quá 90 ngày tính đến ngày nộp hồ_sơ ; đ ) giấy_tờ xác_nhận về việc người đó đang làm thủ_tục nhập quốc_tịch nước_ngoài , trừ trường_hợp pháp_luật nước đó không quy_định về việc cấp giấy này ; e ) giấy xác_nhận không nợ thuế do cục thuế nơi người xin thôi quốc_tịch việt_nam cư_trú cấp ; g ) đối_với người trước_đây là cán_bộ , công_chức , viên_chức hoặc phục_vụ trong lực_lượng_vũ_trang nhân_dân việt_nam đã nghỉ hưu , thôi_việc , bị miễn_nhiệm , bãi_nhiệm , cách_chức hoặc giải_ngũ , phục_viên chưa quá 5 năm thì còn phải nộp giấy của cơ_quan , tổ_chức , đơn_vị đã ra quyết_định cho nghỉ hưu , cho thôi_việc , miễn_nhiệm , bãi_nhiệm , cách_chức hoặc giải_ngũ , phục_viên xác_nhận việc thôi quốc_tịch việt_nam của người đó không phương_hại đến lợi_ích quốc_gia của việt_nam .', '3 . bản_sao giấy khai_sinh của người con chưa thành_niên cùng thôi quốc_tịch việt_nam theo cha_mẹ hoặc giấy_tờ hợp_lệ khác chứng_minh quan_hệ cha_con , mẹ_con . trường_hợp chỉ người cha hoặc người mẹ thôi quốc_tịch việt_nam mà con chưa thành_niên sinh_sống cùng người đó thôi quốc_tịch việt_nam theo cha hoặc mẹ thì phải nộp văn_bản thỏa_thuận có đủ chữ_ký của cha_mẹ về việc xin thôi quốc_tịch việt_nam cho con . văn_bản thỏa_thuận không phải chứng_thực chữ_ký ; người đứng đơn xin thôi quốc_tịch việt_nam cho con phải chịu trách_nhiệm về tính chính_xác chữ_ký của người kia . 
trường_hợp cha , mẹ đã chết , bị mất năng_lực hành_vi dân_sự hoặc hạn_chế năng_lực hành_vi dân_sự thì văn_bản thỏa_thuận được thay_thế bằng giấy_tờ chứng_minh cha , mẹ đã chết , bị mất hoặc hạn_chế năng_lực hành_vi dân_sự . 4 . hồ_sơ xin thôi quốc_tịch việt_nam phải lập thành 3 bộ , được lưu_trữ tại văn_phòng chủ_tịch nước , bộ_tư_pháp và cơ_quan thụ_lý hồ_sơ .', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### json * Dataset: json * Size: 357,018 training samples * Columns: <code>query</code>, <code>pos</code>, and <code>neg</code> * Approximate statistics based on the first 1000 samples: | | query | pos | neg | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 16.98 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 151.14 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 114.34 tokens</li><li>max: 256 tokens</li></ul> | * Samples: | query | pos | neg | 
|:--------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>thủ_tục hưởng bảo_hiểm_xã_hội một lần gồm bao_nhiêu bước thực_hiện ?</code> | <code>bước 1 . lập , nộp hồ_sơ nlđ lập hồ_sơ theo quy_định tại mục_9.3 ( thành_phần hồ_sơ ) và nộp cho cơ_quan bhxh nơi cư_trú . bước 2 . cơ_quan bhxh tiếp_nhận hồ_sơ và giải_quyết theo quy_định . bước 3 . nhận kết_quả nlđ nhận kết_quả giải_quyết , gồm : quyết_định về việc hưởng bhxh một lần ; bản quá_trình đóng bhxh ; tiền trợ_cấp .</code> | <code>3 . giải_quyết hưởng bảo_hiểm_xã_hội một lần trong thời_hạn 10 ngày tính đến thời_điểm chấm_dứt hợp_đồng lao_động hoặc thời_điểm giấy_phép lao_động , chứng_chỉ hành_nghề , giấy_phép hành_nghề hết hiệu_lực ( tùy thuộc điều_kiện nào đến trước ) mà người lao_động không tiếp_tục làm_việc theo hợp_đồng lao_động hoặc không được gia_hạn giấy_phép , người lao_động có yêu_cầu hưởng bảo_hiểm_xã_hội một lần nộp hồ_sơ theo quy_định cho cơ_quan bảo_hiểm_xã_hội . trong thời_hạn 05 ngày làm_việc kể từ ngày nhận đủ hồ_sơ theo quy_định , cơ_quan bảo_hiểm_xã_hội có trách_nhiệm giải_quyết và tổ_chức chi_trả cho người lao_động , trường_hợp không giải_quyết thì phải trả_lời bằng văn_bản và nêu rõ lý_do . "</code> | | <code>vụ đất_đai thuộc bộ tài_nguyên và môi_trường có những chức_danh lãnh_đạo nào ?</code> | <code>lãnh_đạo vụ 1 . vụ đất_đai có vụ trưởng và không quá 03 phó vụ trưởng . 2 . vụ trưởng vụ đất_đai chịu trách_nhiệm trước bộ_trưởng và trước pháp_luật về mọi hoạt_động của vụ ; ban_hành quy_chế làm_việc của vụ ; ký các văn_bản về chuyên_môn , nghiệp_vụ theo chức_năng , nhiệm_vụ được giao và các văn_bản khác theo phân_công , ủy_quyền của bộ_trưởng . 3 . 
phó vụ trưởng vụ đất_đai giúp vụ trưởng , chịu trách_nhiệm trước vụ trưởng và trước pháp_luật về lĩnh_vực công_tác được phân_công .</code> | <code>cơ_cấu tổ_chức 1 . lãnh_đạo vụ : a ) lãnh_đạo vụ có vụ trưởng và các phó vụ trưởng do bộ_trưởng bộ nông_nghiệp và phát_triển nông_thôn bổ_nhiệm , miễn_nhiệm theo quy_định ; ...</code> | | <code>lãnh_đạo báo pháp_luật việt_nam gồm có những_ai ?</code> | <code>cơ_cấu tổ_chức , biên_chế 1 . cơ_cấu tổ_chức của báo , gồm : a ) lãnh_đạo báo : lãnh_đạo báo gồm tổng_biên_tập và không quá 03 ( ba ) phó tổng_biên_tập . tổng_biên_tập chịu trách_nhiệm trước bộ_trưởng và trước pháp_luật về việc thực_hiện các chức_năng , nhiệm_vụ , quyền_hạn của báo . các phó tổng_biên_tập giúp tổng_biên_tập quản_lý , điều_hành hoạt_động của báo ; được tổng_biên_tập phân_công trực_tiếp quản_lý , điều_hành một_số lĩnh_vực hoạt_động của báo ; chịu trách_nhiệm trước tổng_biên_tập và trước pháp_luật về việc quản_lý , điều_hành những lĩnh_vực công_tác được phân_công . b ) các đơn_vị trực_thuộc báo - ban thư_ký tòa_soạn ; - ban thời_sự chính_trị ; - ban kinh_tế ; - ban nội_chính ; - ban văn_hóa - xã_hội ; - ban bạn_đọc ; - ban doanh_nhân và pháp_luật ; - ban báo pháp_luật điện_tử ; - ban chuyên_đề báo in ; - ban chuyên_đề báo_điện_tử ; - ban trị_sự ; - phòng kế_hoạch - tài_chính . việc thành_lập , tổ_chức lại , giải_thể các đơn_vị trực_thuộc báo tại điểm này do bộ_trưởng quyế...</code> | <code>cơ_cấu tổ_chức , biên_chế 1 . cơ_cấu tổ_chức a ) lãnh_đạo tạp_chí : lãnh_đạo tạp_chí gồm tổng_biên_tập và không quá 03 ( ba ) phó tổng_biên_tập . tổng_biên_tập chịu trách_nhiệm trước bộ_trưởng và trước pháp_luật về việc thực_hiện chức_năng , nhiệm_vụ , quyền_hạn của tạp_chí . các phó tổng_biên_tập giúp tổng_biên_tập quản_lý , điều_hành hoạt_động của tạp_chí ; được tổng_biên_tập phân_công trực_tiếp quản_lý , điều_hành một_số lĩnh_vực hoạt_động của tạp_chí ; chịu trách_nhiệm trước tổng_biên_tập và trước pháp_luật về việc quản_lý , điều_hành các lĩnh_vực đã được phân_công . 
...</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### json * Dataset: json * Size: 357,018 evaluation samples * Columns: <code>query</code>, <code>pos</code>, and <code>neg</code> * Approximate statistics based on the first 1000 samples: | | query | pos | neg | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 6 tokens</li><li>mean: 17.15 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 153.03 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 114.34 tokens</li><li>max: 256 tokens</li></ul> | * Samples: | query | pos | neg | |:--------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>kỳ thi tốt_nghiệp thpt quốc_gia được tổ_chức nhằm mục_đích gì ?</code> | <code>“ điều 2 . mục_đích , yêu_cầu 1 . 
thi tốt_nghiệp thpt nhằm mục_đích : đánh_giá kết_quả học_tập của người học theo mục_tiêu giáo_dục của chương_trình giáo_dục_phổ_thông cấp thpt , chương_trình gdtx cấp thpt ( gọi chung là chương_trình thpt ) ; lấy kết_quả thi để xét công_nhận tốt_nghiệp thpt ; làm cơ_sở đánh_giá chất_lượng dạy , học của trường phổ_thông và công_tác chỉ_đạo của các cơ_quan quản_lý giáo_dục . các cơ_sở giáo_dục đại_học , giáo_dục nghề_nghiệp có_thể sử_dụng kết_quả thi tốt_nghiệp thpt để tuyển_sinh . 2 . kỳ thi tốt_nghiệp thpt ( gọi tắt là kỳ thi ) phải bảo_đảm yêu_cầu nghiêm_túc , trung_thực , khách_quan , công_bằng . ”</code> | <code>" điều 12 . đối_tượng , điều_kiện dự thi 1 . đối_tượng dự thi gồm : a ) người đã học xong chương_trình thpt trong năm tổ_chức kỳ_thi ; b ) người đã học xong chương_trình thpt nhưng chưa thi tốt_nghiệp thpt hoặc đã thi nhưng chưa tốt_nghiệp thpt ở những năm trước ; c ) người đã có bằng tốt_nghiệp thpt , người đã có bằng tốt_nghiệp trung_cấp dự thi để lấy kết_quả làm cơ_sở đăng_ký xét tuyển_sinh ; d ) một_số trường_hợp đặc_biệt khác do bộ_trưởng bộ gdđt quyết_định . "</code> | | <code>chánh_án tòa_án quân_sự trung_ương không được hưởng chế_độ phụ_cấp đặc_thù trong trường_hợp nào ?</code> | <code>iii . cách tính trả . 1 . đối_tượng quy_định tại điều 1 thông_tư này được bổ_nhiệm từ tháng nào thì được hưởng chế_độ phụ_cấp đặc_thù từ tháng đó . khi bị miễn_nhiệm , cách_chức , từ trần hoặc thôi giữ chức_danh quy_định tại điều 1 thông_tư này từ tháng nào thì thôi_hưởng chế_độ phụ_cấp đặc_thù từ tháng tiếp_theo . các trường_hợp sau không được hưởng phụ đặc_thù : - thời_gian được cử đi công_tác , làm_việc , học_tập ở nước_ngoài được hưởng 40 % tiền_lương theo quy_định tại khoản 4 , điều 8 nghị_định số 204 / 2004 / nđ-cp ngày 14/12/2004 của chính_phủ ; - thời_gian đi công_tác , học_tập ở trong nước không trực_tiếp làm công_tác chuyên_môn từ 3 tháng trở lên ; - thời_gian bị ốm_đau , thai_sản nghỉ vượt quá thời_hạn quy_định của luật bảo_hiểm_xã_hội ; - thời_gian nghỉ_việc riêng không hưởng lương từ 1 tháng trở lên ; - thời_gian bị đình_chỉ công_tác . 2 . phụ_cấp đặc_thù đối_với một_số chức_danh tư_pháp và thanh_tra trong quân_đội không được tính để hưởng các chế_độ bảo_hiểm_xã_hội , bảo_...</code> | <code>cách tính hưởng phụ_cấp 1 . mức phụ_cấp đặc_thù quy_định tại điều 2 quyết_định này được tính trên mức lương cấp_bậc quân_hàm , ngạch bậc hiện_hưởng hoặc phụ_cấp quân_hàm_cộng phụ_cấp chức_vụ lãnh_đạo và phụ_cấp thâm_niên vượt khung ( nếu có ) . 2 . khi chuyển công_tác khác mà không giữ các chức_vụ , chức_danh quy_định cho các đối_tượng tại điều 2 quyết_định này hoặc nghỉ chuẩn_bị hưu hoặc thôi phục_vụ trong quân_đội thì thôi_hưởng phụ_cấp đặc_thù từ tháng tiếp_theo . 3 . 
thời_gian không được tính hưởng phụ_cấp đặc_thù , bao_gồm : a ) thời_gian đi công_tác , làm_việc học_tập ở nước_ngoài được hưởng tiền_lương theo quy_định tại khoản 4 điều 8 nghị_định số 204 / 2004 / nđ-cp ngày 14 tháng 12 năm 2004 của chính_phủ về chế_độ tiền_lương đối_với cán_bộ , công_chức , viên_chức và lực_lượng_vũ_trang ; b ) thời_gian nghỉ_việc không hưởng lương liên_tục từ 1 tháng trở lên ; c ) thời_gian nghỉ_việc hưởng bảo_hiểm_xã_hội theo quy_định của pháp_luật về bảo_hiểm_xã_hội ; d ) thời_gian bị đình_chỉ cô...</code> | | <code>nhân_viên hải_quan có thuộc đối_tượng được hưởng phụ_cấp ưu_đãi theo nghề đối_với công_chức hải_quan không ?</code> | <code>đối_tượng và phạm_vi áp_dụng tổng_cục trưởng tổng_cục hải_quan và công_chức đã được xếp lương theo các ngạch công_chức hải_quan , gồm : kiểm_tra_viên cao_cấp hải_quan , kiểm_tra_viên chính hải_quan , kiểm_tra_viên hải_quan , kiểm_tra_viên hải_quan ( cao_đẳng ) , kiểm_tra_viên trung_cấp hải_quan , nhân_viên hải_quan . 2 . nguyên_tắc áp_dụng a ) đối_tượng được hưởng phụ_cấp ưu_đãi theo nghề hải_quan quy_định tại khoản 1 mục i thông_tư này là những người được cấp có thẩm_quyền quyết_định bổ_nhiệm vào_ngạch hoặc chức_danh theo quy_định của pháp_luật . b ) công_chức được bổ_nhiệm vào_ngạch hoặc chức_danh nào thì được hưởng phụ_cấp ưu_đãi quy_định đối_với ngạch hoặc chức_danh đó . c ) công_chức được bổ_nhiệm vào ngạch công_chức hải_quan cao hơn ( nâng_ngạch ) mà tổng mức tiền_lương cộng phụ_cấp ưu_đãi theo nghề hải_quan ở ngạch được bổ_nhiệm thấp hơn tổng mức tiền_lương cộng phụ_cấp ưu_đãi theo nghề hải_quan đã hưởng ở ngạch cũ thì được bảo_lưu phần chênh_lệch giữa tổng mức tiền_lương cộng p...</code> | <code>đối_tượng và phạm_vi áp_dụng tổng_cục trưởng tổng_cục hải_quan và công_chức đã được xếp lương theo các ngạch công_chức hải_quan , gồm : kiểm_tra_viên cao_cấp hải_quan , kiểm_tra_viên chính hải_quan , kiểm_tra_viên hải_quan , kiểm_tra_viên hải_quan ( cao_đẳng ) , kiểm_tra_viên trung_cấp hải_quan , nhân_viên hải_quan .</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `overwrite_output_dir`: True - `per_device_train_batch_size`: 48 - `per_device_eval_batch_size`: 48 - `learning_rate`: 1e-05 - `num_train_epochs`: 2 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: True - `do_predict`: False - `eval_strategy`: no - `prediction_loss_only`: True - `per_device_train_batch_size`: 48 - `per_device_eval_batch_size`: 48 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 1e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 2 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 
42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs <details><summary>Click to expand</summary> | Epoch | Step | Training Loss | |:------:|:-----:|:-------------:| | 0.0168 | 100 | 0.1849 | | 0.0336 | 200 | 0.1852 | | 0.0504 | 300 | 0.187 | | 0.0672 | 400 | 0.1831 | | 0.0840 | 500 | 0.175 | | 0.1008 | 600 | 0.1703 | | 0.1176 | 700 | 0.1671 | | 0.1344 | 800 | 0.1687 | | 0.1512 | 900 | 0.1567 | | 0.1680 | 1000 | 0.1581 | | 0.1848 | 1100 | 0.1646 | | 0.2016 | 1200 | 0.1489 | | 0.2185 | 1300 | 0.1405 | | 0.2353 | 1400 | 0.1479 | | 0.2521 | 1500 | 0.1442 | | 0.2689 | 1600 | 0.137 | | 0.2857 | 1700 | 0.1294 | | 0.3025 | 1800 | 0.1369 | | 0.3193 | 1900 | 0.1362 | | 0.3361 | 2000 | 0.1275 | | 0.3529 | 2100 | 0.1257 | | 0.3697 | 2200 | 0.1228 | | 0.3865 | 2300 | 0.1325 | | 0.4033 | 2400 | 0.1294 | | 0.4201 | 2500 | 0.1277 | | 0.4369 | 2600 | 0.125 | | 0.4537 | 2700 | 0.1359 | | 0.4705 | 2800 | 0.1304 | | 0.4873 | 2900 | 0.1225 | | 0.5041 | 3000 | 0.1251 | | 0.5209 | 3100 | 0.126 | | 0.5377 | 3200 | 0.1376 | | 0.5545 | 3300 | 0.1307 | | 0.5713 | 3400 | 0.1276 | | 0.5881 | 3500 | 0.1321 | | 0.6049 | 3600 | 0.1297 | | 0.6217 | 3700 | 0.1256 | | 0.6385 | 3800 | 
0.1303 | | 0.6554 | 3900 | 0.1184 | | 0.6722 | 4000 | 0.1245 | | 0.6890 | 4100 | 0.1217 | | 0.7058 | 4200 | 0.1326 | | 0.7226 | 4300 | 0.1311 | | 0.7394 | 4400 | 0.1298 | | 0.7562 | 4500 | 0.1328 | | 0.7730 | 4600 | 0.1265 | | 0.7898 | 4700 | 0.1353 | | 0.8066 | 4800 | 0.1328 | | 0.8234 | 4900 | 0.1361 | | 0.8402 | 5000 | 0.1235 | | 0.8570 | 5100 | 0.1429 | | 0.8738 | 5200 | 0.1319 | | 0.8906 | 5300 | 0.1301 | | 0.9074 | 5400 | 0.1337 | | 0.9242 | 5500 | 0.1417 | | 0.9410 | 5600 | 0.1338 | | 0.9578 | 5700 | 0.1384 | | 0.9746 | 5800 | 0.1347 | | 0.9914 | 5900 | 0.1345 | | 1.0082 | 6000 | 0.1422 | | 1.0250 | 6100 | 0.1258 | | 1.0418 | 6200 | 0.1322 | | 1.0586 | 6300 | 0.1189 | | 1.0754 | 6400 | 0.1221 | | 1.0923 | 6500 | 0.118 | | 1.1091 | 6600 | 0.1065 | | 1.1259 | 6700 | 0.1102 | | 1.1427 | 6800 | 0.0991 | | 1.1595 | 6900 | 0.0955 | | 1.1763 | 7000 | 0.1042 | | 1.1931 | 7100 | 0.0894 | | 1.2099 | 7200 | 0.0901 | | 1.2267 | 7300 | 0.0901 | | 1.2435 | 7400 | 0.0861 | | 1.2603 | 7500 | 0.0953 | | 1.2771 | 7600 | 0.0872 | | 1.2939 | 7700 | 0.0887 | | 1.3107 | 7800 | 0.087 | | 1.3275 | 7900 | 0.0945 | | 1.3443 | 8000 | 0.0843 | | 1.3611 | 8100 | 0.0848 | | 1.3779 | 8200 | 0.0884 | | 1.3947 | 8300 | 0.0945 | | 1.4115 | 8400 | 0.0847 | | 1.4283 | 8500 | 0.0902 | | 1.4451 | 8600 | 0.0945 | | 1.4619 | 8700 | 0.0877 | | 1.4787 | 8800 | 0.0936 | | 1.4955 | 8900 | 0.0906 | | 1.5124 | 9000 | 0.0887 | | 1.5292 | 9100 | 0.0959 | | 1.5460 | 9200 | 0.0927 | | 1.5628 | 9300 | 0.0941 | | 1.5796 | 9400 | 0.0939 | | 1.5964 | 9500 | 0.0995 | | 1.6132 | 9600 | 0.0944 | | 1.6300 | 9700 | 0.0949 | | 1.6468 | 9800 | 0.0966 | | 1.6636 | 9900 | 0.0902 | | 1.6804 | 10000 | 0.099 | | 1.6972 | 10100 | 0.0915 | | 1.7140 | 10200 | 0.1024 | | 1.7308 | 10300 | 0.1011 | | 1.7476 | 10400 | 0.0989 | | 1.7644 | 10500 | 0.1017 | | 1.7812 | 10600 | 0.1026 | | 1.7980 | 10700 | 0.1062 | | 1.8148 | 10800 | 0.1094 | | 1.8316 | 10900 | 0.0917 | | 1.8484 | 11000 | 0.1074 | | 1.8652 | 11100 | 0.113 | | 1.8820 | 11200 | 0.1066 | | 1.8988 | 11300 | 0.1113 | | 1.9156 | 11400 | 0.1134 | | 1.9324 | 11500 | 0.1175 | | 1.9493 | 11600 | 0.1136 | | 1.9661 | 11700 | 0.1223 | | 1.9829 | 11800 | 0.1088 | | 1.9997 | 11900 | 0.1233 | </details> ### Framework Versions - Python: 3.10.14 - Sentence Transformers: 3.3.1 - Transformers: 4.45.1 - PyTorch: 2.4.0 - Accelerate: 0.34.2 - Datasets: 3.0.1 - Tokenizers: 0.20.0 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way 
for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
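For readers who want to approximate the fine-tuning run summarized above, below is a minimal, hypothetical sketch wiring together the base model, the `MultipleNegativesRankingLoss` configuration (`scale=20.0`, cosine similarity), and the non-default hyperparameters listed in this card. The `train.json` path is a placeholder, and this is not the exact script used to train the model.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Base model named in this card.
model = SentenceTransformer("tanbinh2210/mlm_finetuned_2_phobert")

# Placeholder: a JSON file with "query", "pos" and "neg" columns, matching the
# (anchor, positive, negative) triplet layout described in the dataset section.
train_dataset = load_dataset("json", data_files="train.json", split="train")

# In-batch negatives are used on top of the explicit hard negatives;
# cos_sim is the default similarity function, matching the card's config.
loss = MultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="mlm_finetuned_3_phobert",
    num_train_epochs=2,
    per_device_train_batch_size=48,
    learning_rate=1e-5,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate texts within a batch
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```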
[ "TEXT_CLASSIFICATION" ]
[ "CHIA" ]
Non_BioNLP
jinaai/jina-embeddings-v3
jinaai
feature-extraction
[ "transformers", "pytorch", "onnx", "safetensors", "feature-extraction", "sentence-similarity", "mteb", "sentence-transformers", "custom_code", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "arxiv:2409.10173", "license:cc-by-nc-4.0", "model-index", "region:eu" ]
1,725
1,740
1,734,211
828
--- language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - false - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh library_name: transformers license: cc-by-nc-4.0 tags: - feature-extraction - sentence-similarity - mteb - sentence-transformers inference: false model-index: - name: jina-embeddings-v3 results: - task: type: STS dataset: name: MTEB AFQMC (default) type: C-MTEB/AFQMC config: default split: validation revision: b44c3b011063adb25877c13823db83bb193913c4 metrics: - type: cosine_pearson value: 41.74237700998808 - type: cosine_spearman value: 43.4726782647566 - type: euclidean_pearson value: 42.244585459479964 - type: euclidean_spearman value: 43.525070045169606 - type: main_score value: 43.4726782647566 - type: manhattan_pearson value: 42.04616728224863 - type: manhattan_spearman value: 43.308828270754645 - type: pearson value: 41.74237700998808 - type: spearman value: 43.4726782647566 - task: type: Retrieval dataset: name: MTEB ArguAna-PL (default) type: clarin-knext/arguana-pl config: default split: test revision: 63fc86750af76253e8c760fc9e534bbf24d260a2 metrics: - type: main_score value: 50.117999999999995 - type: map_at_1 value: 24.253 - type: map_at_10 value: 40.725 - type: map_at_100 value: 41.699999999999996 - type: map_at_1000 value: 41.707 - type: map_at_20 value: 41.467999999999996 - type: map_at_3 value: 35.467 - type: map_at_5 value: 38.291 - type: mrr_at_1 value: 24.751066856330013 - type: mrr_at_10 value: 40.91063808169072 - type: mrr_at_100 value: 41.885497923928675 - type: mrr_at_1000 value: 41.89301098419842 - type: mrr_at_20 value: 41.653552355442514 - type: mrr_at_3 value: 35.656709340919775 - type: mrr_at_5 value: 38.466097676623946 - type: nauc_map_at_1000_diff1 value: 7.503000359807567 - type: nauc_map_at_1000_max value: -11.030405164830546 - type: nauc_map_at_1000_std value: -8.902792782585117 - type: nauc_map_at_100_diff1 value: 7.509899249593199 - type: nauc_map_at_100_max value: -11.023581259404406 - type: nauc_map_at_100_std value: -8.892241185067272 - type: nauc_map_at_10_diff1 value: 7.24369711881512 - type: nauc_map_at_10_max value: -10.810000200433278 - type: nauc_map_at_10_std value: -8.987230542165776 - type: nauc_map_at_1_diff1 value: 11.37175831832417 - type: nauc_map_at_1_max value: -13.315221903223055 - type: nauc_map_at_1_std value: -9.398199605510275 - type: nauc_map_at_20_diff1 value: 7.477364530860648 - type: nauc_map_at_20_max value: -10.901251218105566 - type: nauc_map_at_20_std value: -8.868148116405925 - type: nauc_map_at_3_diff1 value: 6.555548802174882 - type: nauc_map_at_3_max value: -12.247274800542934 - type: nauc_map_at_3_std value: -9.879475250984811 - type: nauc_map_at_5_diff1 value: 7.426588563355882 - type: nauc_map_at_5_max value: -11.347695686001805 - type: nauc_map_at_5_std value: -9.34441892203972 - type: nauc_mrr_at_1000_diff1 value: 5.99737552143614 - type: nauc_mrr_at_1000_max value: -11.327205136505727 - type: nauc_mrr_at_1000_std value: -8.791079115519503 - type: nauc_mrr_at_100_diff1 value: 6.004622525255784 - type: nauc_mrr_at_100_max value: -11.320336759899723 - type: nauc_mrr_at_100_std value: -8.780602249831777 - type: nauc_mrr_at_10_diff1 
value: 5.783623516930227 - type: nauc_mrr_at_10_max value: -11.095971693467078 - type: nauc_mrr_at_10_std value: -8.877242032013582 - type: nauc_mrr_at_1_diff1 value: 9.694937537703797 - type: nauc_mrr_at_1_max value: -12.531905083727912 - type: nauc_mrr_at_1_std value: -8.903992940100146 - type: nauc_mrr_at_20_diff1 value: 5.984841206233873 - type: nauc_mrr_at_20_max value: -11.195236951048969 - type: nauc_mrr_at_20_std value: -8.757266039186018 - type: nauc_mrr_at_3_diff1 value: 5.114333824261379 - type: nauc_mrr_at_3_max value: -12.64809799843464 - type: nauc_mrr_at_3_std value: -9.791146138025184 - type: nauc_mrr_at_5_diff1 value: 5.88941606224512 - type: nauc_mrr_at_5_max value: -11.763903418071918 - type: nauc_mrr_at_5_std value: -9.279175712709446 - type: nauc_ndcg_at_1000_diff1 value: 7.076950652226086 - type: nauc_ndcg_at_1000_max value: -10.386482092087371 - type: nauc_ndcg_at_1000_std value: -8.309190917074046 - type: nauc_ndcg_at_100_diff1 value: 7.2329220284865245 - type: nauc_ndcg_at_100_max value: -10.208048403220337 - type: nauc_ndcg_at_100_std value: -7.997975874274613 - type: nauc_ndcg_at_10_diff1 value: 6.065391100006953 - type: nauc_ndcg_at_10_max value: -9.046164377601153 - type: nauc_ndcg_at_10_std value: -8.34724889697153 - type: nauc_ndcg_at_1_diff1 value: 11.37175831832417 - type: nauc_ndcg_at_1_max value: -13.315221903223055 - type: nauc_ndcg_at_1_std value: -9.398199605510275 - type: nauc_ndcg_at_20_diff1 value: 6.949389989202601 - type: nauc_ndcg_at_20_max value: -9.35740451760307 - type: nauc_ndcg_at_20_std value: -7.761295171828212 - type: nauc_ndcg_at_3_diff1 value: 5.051471796151364 - type: nauc_ndcg_at_3_max value: -12.158763333711653 - type: nauc_ndcg_at_3_std value: -10.078902544421926 - type: nauc_ndcg_at_5_diff1 value: 6.527454512611454 - type: nauc_ndcg_at_5_max value: -10.525118233848586 - type: nauc_ndcg_at_5_std value: -9.120055125584031 - type: nauc_precision_at_1000_diff1 value: -10.6495668199151 - type: nauc_precision_at_1000_max value: 12.070656425217841 - type: nauc_precision_at_1000_std value: 55.844551709649004 - type: nauc_precision_at_100_diff1 value: 19.206967129266285 - type: nauc_precision_at_100_max value: 16.296851020813456 - type: nauc_precision_at_100_std value: 45.60378984257811 - type: nauc_precision_at_10_diff1 value: 0.6490335354304879 - type: nauc_precision_at_10_max value: 0.5757198255366447 - type: nauc_precision_at_10_std value: -4.875847131691451 - type: nauc_precision_at_1_diff1 value: 11.37175831832417 - type: nauc_precision_at_1_max value: -13.315221903223055 - type: nauc_precision_at_1_std value: -9.398199605510275 - type: nauc_precision_at_20_diff1 value: 4.899369866929203 - type: nauc_precision_at_20_max value: 5.988537297189552 - type: nauc_precision_at_20_std value: 4.830900387582837 - type: nauc_precision_at_3_diff1 value: 0.8791156910997744 - type: nauc_precision_at_3_max value: -11.983373635905993 - type: nauc_precision_at_3_std value: -10.646185111581257 - type: nauc_precision_at_5_diff1 value: 3.9314486166548432 - type: nauc_precision_at_5_max value: -7.798591396895839 - type: nauc_precision_at_5_std value: -8.293043407234125 - type: nauc_recall_at_1000_diff1 value: -10.649566819918673 - type: nauc_recall_at_1000_max value: 12.070656425214647 - type: nauc_recall_at_1000_std value: 55.84455170965023 - type: nauc_recall_at_100_diff1 value: 19.206967129265127 - type: nauc_recall_at_100_max value: 16.296851020813722 - type: nauc_recall_at_100_std value: 45.60378984257728 - type: nauc_recall_at_10_diff1 value: 
0.6490335354304176 - type: nauc_recall_at_10_max value: 0.5757198255366095 - type: nauc_recall_at_10_std value: -4.875847131691468 - type: nauc_recall_at_1_diff1 value: 11.37175831832417 - type: nauc_recall_at_1_max value: -13.315221903223055 - type: nauc_recall_at_1_std value: -9.398199605510275 - type: nauc_recall_at_20_diff1 value: 4.899369866929402 - type: nauc_recall_at_20_max value: 5.98853729718968 - type: nauc_recall_at_20_std value: 4.830900387582967 - type: nauc_recall_at_3_diff1 value: 0.8791156910997652 - type: nauc_recall_at_3_max value: -11.983373635905997 - type: nauc_recall_at_3_std value: -10.64618511158124 - type: nauc_recall_at_5_diff1 value: 3.9314486166548472 - type: nauc_recall_at_5_max value: -7.7985913968958585 - type: nauc_recall_at_5_std value: -8.293043407234132 - type: ndcg_at_1 value: 24.253 - type: ndcg_at_10 value: 50.117999999999995 - type: ndcg_at_100 value: 54.291999999999994 - type: ndcg_at_1000 value: 54.44799999999999 - type: ndcg_at_20 value: 52.771 - type: ndcg_at_3 value: 39.296 - type: ndcg_at_5 value: 44.373000000000005 - type: precision_at_1 value: 24.253 - type: precision_at_10 value: 8.016 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.527 - type: precision_at_3 value: 16.808999999999997 - type: precision_at_5 value: 12.546 - type: recall_at_1 value: 24.253 - type: recall_at_10 value: 80.156 - type: recall_at_100 value: 98.43499999999999 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_20 value: 90.54100000000001 - type: recall_at_3 value: 50.427 - type: recall_at_5 value: 62.731 - task: type: Retrieval dataset: name: MTEB DBPedia-PL (default) type: clarin-knext/dbpedia-pl config: default split: test revision: 76afe41d9af165cc40999fcaa92312b8b012064a metrics: - type: main_score value: 34.827000000000005 - type: map_at_1 value: 7.049999999999999 - type: map_at_10 value: 14.982999999999999 - type: map_at_100 value: 20.816000000000003 - type: map_at_1000 value: 22.33 - type: map_at_20 value: 17.272000000000002 - type: map_at_3 value: 10.661 - type: map_at_5 value: 12.498 - type: mrr_at_1 value: 57.25 - type: mrr_at_10 value: 65.81934523809524 - type: mrr_at_100 value: 66.2564203928212 - type: mrr_at_1000 value: 66.27993662923856 - type: mrr_at_20 value: 66.0732139130649 - type: mrr_at_3 value: 64.08333333333333 - type: mrr_at_5 value: 65.27083333333333 - type: nauc_map_at_1000_diff1 value: 16.41780871174038 - type: nauc_map_at_1000_max value: 30.193946325654654 - type: nauc_map_at_1000_std value: 31.46095497039037 - type: nauc_map_at_100_diff1 value: 18.57903165498531 - type: nauc_map_at_100_max value: 29.541476938623262 - type: nauc_map_at_100_std value: 28.228604103301052 - type: nauc_map_at_10_diff1 value: 24.109434489748946 - type: nauc_map_at_10_max value: 21.475954208048968 - type: nauc_map_at_10_std value: 9.964464537806988 - type: nauc_map_at_1_diff1 value: 38.67437644802124 - type: nauc_map_at_1_max value: 14.52136658726491 - type: nauc_map_at_1_std value: -2.8981666782088755 - type: nauc_map_at_20_diff1 value: 21.42547228801935 - type: nauc_map_at_20_max value: 25.04510402960458 - type: nauc_map_at_20_std value: 16.533079346431155 - type: nauc_map_at_3_diff1 value: 26.63648858245477 - type: nauc_map_at_3_max value: 13.632235789780415 - type: nauc_map_at_3_std value: -0.40129174577700716 - type: nauc_map_at_5_diff1 value: 24.513861031197933 - type: nauc_map_at_5_max value: 16.599888813946688 - type: nauc_map_at_5_std value: 3.4448514739556346 - type: 
nauc_mrr_at_1000_diff1 value: 36.57353464537154 - type: nauc_mrr_at_1000_max value: 55.34763483979515 - type: nauc_mrr_at_1000_std value: 40.3722796438533 - type: nauc_mrr_at_100_diff1 value: 36.555989566513134 - type: nauc_mrr_at_100_max value: 55.347805216808396 - type: nauc_mrr_at_100_std value: 40.38465945075711 - type: nauc_mrr_at_10_diff1 value: 36.771572999261984 - type: nauc_mrr_at_10_max value: 55.41239897909165 - type: nauc_mrr_at_10_std value: 40.52058934624793 - type: nauc_mrr_at_1_diff1 value: 38.2472828531032 - type: nauc_mrr_at_1_max value: 51.528473828685705 - type: nauc_mrr_at_1_std value: 33.03676467942882 - type: nauc_mrr_at_20_diff1 value: 36.642602571889036 - type: nauc_mrr_at_20_max value: 55.3763342076553 - type: nauc_mrr_at_20_std value: 40.41520090500838 - type: nauc_mrr_at_3_diff1 value: 36.79451847426628 - type: nauc_mrr_at_3_max value: 54.59778581826193 - type: nauc_mrr_at_3_std value: 39.48392075873095 - type: nauc_mrr_at_5_diff1 value: 36.92150807529304 - type: nauc_mrr_at_5_max value: 55.03553978718272 - type: nauc_mrr_at_5_std value: 40.20147745489917 - type: nauc_ndcg_at_1000_diff1 value: 21.843092744321268 - type: nauc_ndcg_at_1000_max value: 44.93275990394279 - type: nauc_ndcg_at_1000_std value: 47.09186225236347 - type: nauc_ndcg_at_100_diff1 value: 25.180282568979095 - type: nauc_ndcg_at_100_max value: 41.737709709508394 - type: nauc_ndcg_at_100_std value: 38.80950644139446 - type: nauc_ndcg_at_10_diff1 value: 24.108368037214046 - type: nauc_ndcg_at_10_max value: 41.29298370689967 - type: nauc_ndcg_at_10_std value: 35.06450769738732 - type: nauc_ndcg_at_1_diff1 value: 35.51010679525079 - type: nauc_ndcg_at_1_max value: 42.40790024212412 - type: nauc_ndcg_at_1_std value: 26.696412036243157 - type: nauc_ndcg_at_20_diff1 value: 23.909989673256195 - type: nauc_ndcg_at_20_max value: 39.78444647091927 - type: nauc_ndcg_at_20_std value: 33.39544470364529 - type: nauc_ndcg_at_3_diff1 value: 22.50484297956035 - type: nauc_ndcg_at_3_max value: 39.14551926034168 - type: nauc_ndcg_at_3_std value: 30.330135925392014 - type: nauc_ndcg_at_5_diff1 value: 21.7798872028265 - type: nauc_ndcg_at_5_max value: 40.23856975248015 - type: nauc_ndcg_at_5_std value: 32.438381067440396 - type: nauc_precision_at_1000_diff1 value: -21.62692442272279 - type: nauc_precision_at_1000_max value: 0.9689046974430882 - type: nauc_precision_at_1000_std value: 18.54001058230465 - type: nauc_precision_at_100_diff1 value: -10.132258779856192 - type: nauc_precision_at_100_max value: 23.74516110444681 - type: nauc_precision_at_100_std value: 47.03416663319965 - type: nauc_precision_at_10_diff1 value: 1.543656509571949 - type: nauc_precision_at_10_max value: 36.98864812757555 - type: nauc_precision_at_10_std value: 46.56427199077426 - type: nauc_precision_at_1_diff1 value: 38.2472828531032 - type: nauc_precision_at_1_max value: 51.528473828685705 - type: nauc_precision_at_1_std value: 33.03676467942882 - type: nauc_precision_at_20_diff1 value: -4.612864872734335 - type: nauc_precision_at_20_max value: 34.03565449182125 - type: nauc_precision_at_20_std value: 48.880727648349534 - type: nauc_precision_at_3_diff1 value: 6.360850444467829 - type: nauc_precision_at_3_max value: 36.25816942368427 - type: nauc_precision_at_3_std value: 34.48882647419187 - type: nauc_precision_at_5_diff1 value: 2.6445596936740037 - type: nauc_precision_at_5_max value: 37.174463388899056 - type: nauc_precision_at_5_std value: 40.25254370626113 - type: nauc_recall_at_1000_diff1 value: 13.041227176748077 - type: 
nauc_recall_at_1000_max value: 39.722336427072094 - type: nauc_recall_at_1000_std value: 52.04032890059214 - type: nauc_recall_at_100_diff1 value: 18.286096899139153 - type: nauc_recall_at_100_max value: 34.072389201930314 - type: nauc_recall_at_100_std value: 37.73637623416653 - type: nauc_recall_at_10_diff1 value: 22.35560419280504 - type: nauc_recall_at_10_max value: 19.727247199595197 - type: nauc_recall_at_10_std value: 8.58498575109203 - type: nauc_recall_at_1_diff1 value: 38.67437644802124 - type: nauc_recall_at_1_max value: 14.52136658726491 - type: nauc_recall_at_1_std value: -2.8981666782088755 - type: nauc_recall_at_20_diff1 value: 19.026320886902916 - type: nauc_recall_at_20_max value: 22.753562309469867 - type: nauc_recall_at_20_std value: 14.89994263882445 - type: nauc_recall_at_3_diff1 value: 23.428129702129684 - type: nauc_recall_at_3_max value: 10.549153954790542 - type: nauc_recall_at_3_std value: -1.7590608997055206 - type: nauc_recall_at_5_diff1 value: 21.27448645803921 - type: nauc_recall_at_5_max value: 13.620279707461677 - type: nauc_recall_at_5_std value: 2.0577962208292675 - type: ndcg_at_1 value: 46.75 - type: ndcg_at_10 value: 34.827000000000005 - type: ndcg_at_100 value: 38.157999999999994 - type: ndcg_at_1000 value: 44.816 - type: ndcg_at_20 value: 34.152 - type: ndcg_at_3 value: 39.009 - type: ndcg_at_5 value: 36.826 - type: precision_at_1 value: 57.25 - type: precision_at_10 value: 27.575 - type: precision_at_100 value: 8.84 - type: precision_at_1000 value: 1.949 - type: precision_at_20 value: 20.724999999999998 - type: precision_at_3 value: 41.167 - type: precision_at_5 value: 35.199999999999996 - type: recall_at_1 value: 7.049999999999999 - type: recall_at_10 value: 19.817999999999998 - type: recall_at_100 value: 42.559999999999995 - type: recall_at_1000 value: 63.744 - type: recall_at_20 value: 25.968000000000004 - type: recall_at_3 value: 11.959 - type: recall_at_5 value: 14.939 - task: type: Retrieval dataset: name: MTEB FiQA-PL (default) type: clarin-knext/fiqa-pl config: default split: test revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e metrics: - type: main_score value: 38.828 - type: map_at_1 value: 19.126 - type: map_at_10 value: 31.002000000000002 - type: map_at_100 value: 32.736 - type: map_at_1000 value: 32.933 - type: map_at_20 value: 31.894 - type: map_at_3 value: 26.583000000000002 - type: map_at_5 value: 28.904000000000003 - type: mrr_at_1 value: 37.808641975308646 - type: mrr_at_10 value: 46.36745541838134 - type: mrr_at_100 value: 47.14140915794908 - type: mrr_at_1000 value: 47.190701435388846 - type: mrr_at_20 value: 46.81387776440309 - type: mrr_at_3 value: 43.750000000000014 - type: mrr_at_5 value: 45.23919753086418 - type: nauc_map_at_1000_diff1 value: 38.5532285881503 - type: nauc_map_at_1000_max value: 34.44383884813453 - type: nauc_map_at_1000_std value: -1.3963497949476722 - type: nauc_map_at_100_diff1 value: 38.49292464176943 - type: nauc_map_at_100_max value: 34.33752755618645 - type: nauc_map_at_100_std value: -1.4794032905848582 - type: nauc_map_at_10_diff1 value: 38.26061536370962 - type: nauc_map_at_10_max value: 33.16977912721411 - type: nauc_map_at_10_std value: -2.3853370604730393 - type: nauc_map_at_1_diff1 value: 46.288767289528344 - type: nauc_map_at_1_max value: 25.67706785013364 - type: nauc_map_at_1_std value: -6.989769609924645 - type: nauc_map_at_20_diff1 value: 38.507270129330685 - type: nauc_map_at_20_max value: 33.70963328055982 - type: nauc_map_at_20_std value: -1.9835510011554272 - type: nauc_map_at_3_diff1 
value: 39.81061518646884 - type: nauc_map_at_3_max value: 30.101186374147748 - type: nauc_map_at_3_std value: -4.027120247237715 - type: nauc_map_at_5_diff1 value: 38.55602589746512 - type: nauc_map_at_5_max value: 31.515174267015983 - type: nauc_map_at_5_std value: -3.4064239358570303 - type: nauc_mrr_at_1000_diff1 value: 45.030514454725726 - type: nauc_mrr_at_1000_max value: 43.878919881666164 - type: nauc_mrr_at_1000_std value: 2.517594250297626 - type: nauc_mrr_at_100_diff1 value: 45.00868212878687 - type: nauc_mrr_at_100_max value: 43.87437011120001 - type: nauc_mrr_at_100_std value: 2.5257874265014966 - type: nauc_mrr_at_10_diff1 value: 44.855044606754056 - type: nauc_mrr_at_10_max value: 43.946617058785186 - type: nauc_mrr_at_10_std value: 2.5173751662794044 - type: nauc_mrr_at_1_diff1 value: 49.441510997817346 - type: nauc_mrr_at_1_max value: 43.08547383044357 - type: nauc_mrr_at_1_std value: -1.8747770703324347 - type: nauc_mrr_at_20_diff1 value: 45.019880416584215 - type: nauc_mrr_at_20_max value: 43.85691473662242 - type: nauc_mrr_at_20_std value: 2.4625487605091303 - type: nauc_mrr_at_3_diff1 value: 45.322041658604036 - type: nauc_mrr_at_3_max value: 43.95079293074395 - type: nauc_mrr_at_3_std value: 2.4644274393435737 - type: nauc_mrr_at_5_diff1 value: 44.99461837803437 - type: nauc_mrr_at_5_max value: 43.97934275090601 - type: nauc_mrr_at_5_std value: 2.5353091695125096 - type: nauc_ndcg_at_1000_diff1 value: 39.38449023275524 - type: nauc_ndcg_at_1000_max value: 39.48382767312788 - type: nauc_ndcg_at_1000_std value: 3.414789408343409 - type: nauc_ndcg_at_100_diff1 value: 38.29675861135578 - type: nauc_ndcg_at_100_max value: 38.2674786507297 - type: nauc_ndcg_at_100_std value: 2.7094055381218207 - type: nauc_ndcg_at_10_diff1 value: 38.09514955708717 - type: nauc_ndcg_at_10_max value: 36.664923238906525 - type: nauc_ndcg_at_10_std value: 0.6901410544967921 - type: nauc_ndcg_at_1_diff1 value: 49.441510997817346 - type: nauc_ndcg_at_1_max value: 43.08547383044357 - type: nauc_ndcg_at_1_std value: -1.8747770703324347 - type: nauc_ndcg_at_20_diff1 value: 38.44967736231759 - type: nauc_ndcg_at_20_max value: 36.871179313622584 - type: nauc_ndcg_at_20_std value: 1.157560360065234 - type: nauc_ndcg_at_3_diff1 value: 39.02419271805571 - type: nauc_ndcg_at_3_max value: 37.447669442586324 - type: nauc_ndcg_at_3_std value: 0.41502589779297794 - type: nauc_ndcg_at_5_diff1 value: 38.10233452742001 - type: nauc_ndcg_at_5_max value: 35.816381905465676 - type: nauc_ndcg_at_5_std value: -0.3704499913387088 - type: nauc_precision_at_1000_diff1 value: 2.451267097838658 - type: nauc_precision_at_1000_max value: 29.116394969085306 - type: nauc_precision_at_1000_std value: 14.85900786538363 - type: nauc_precision_at_100_diff1 value: 8.10919082251277 - type: nauc_precision_at_100_max value: 36.28388256191417 - type: nauc_precision_at_100_std value: 14.830039904317657 - type: nauc_precision_at_10_diff1 value: 15.02446609920477 - type: nauc_precision_at_10_max value: 41.008463775454054 - type: nauc_precision_at_10_std value: 10.431403152334486 - type: nauc_precision_at_1_diff1 value: 49.441510997817346 - type: nauc_precision_at_1_max value: 43.08547383044357 - type: nauc_precision_at_1_std value: -1.8747770703324347 - type: nauc_precision_at_20_diff1 value: 14.222022201169926 - type: nauc_precision_at_20_max value: 40.10189643835305 - type: nauc_precision_at_20_std value: 12.204443815975527 - type: nauc_precision_at_3_diff1 value: 25.41905395341234 - type: nauc_precision_at_3_max value: 
41.56133905339819 - type: nauc_precision_at_3_std value: 5.575516915590082 - type: nauc_precision_at_5_diff1 value: 20.20081221089351 - type: nauc_precision_at_5_max value: 40.95218555916681 - type: nauc_precision_at_5_std value: 7.2040745500708745 - type: nauc_recall_at_1000_diff1 value: 28.021198234033395 - type: nauc_recall_at_1000_max value: 36.165148684597504 - type: nauc_recall_at_1000_std value: 28.28852356008973 - type: nauc_recall_at_100_diff1 value: 21.882447802741897 - type: nauc_recall_at_100_max value: 26.979684607567222 - type: nauc_recall_at_100_std value: 9.783658817010082 - type: nauc_recall_at_10_diff1 value: 28.493097951178818 - type: nauc_recall_at_10_max value: 29.40937476550134 - type: nauc_recall_at_10_std value: 2.7593763576979353 - type: nauc_recall_at_1_diff1 value: 46.288767289528344 - type: nauc_recall_at_1_max value: 25.67706785013364 - type: nauc_recall_at_1_std value: -6.989769609924645 - type: nauc_recall_at_20_diff1 value: 27.638381299425234 - type: nauc_recall_at_20_max value: 27.942035836106328 - type: nauc_recall_at_20_std value: 3.489835161380808 - type: nauc_recall_at_3_diff1 value: 33.90054781392646 - type: nauc_recall_at_3_max value: 27.778812533030322 - type: nauc_recall_at_3_std value: -0.03054068020022706 - type: nauc_recall_at_5_diff1 value: 30.279060732221346 - type: nauc_recall_at_5_max value: 27.49854749597931 - type: nauc_recall_at_5_std value: 0.5434664581939099 - type: ndcg_at_1 value: 37.809 - type: ndcg_at_10 value: 38.828 - type: ndcg_at_100 value: 45.218 - type: ndcg_at_1000 value: 48.510999999999996 - type: ndcg_at_20 value: 41.11 - type: ndcg_at_3 value: 34.466 - type: ndcg_at_5 value: 35.843 - type: precision_at_1 value: 37.809 - type: precision_at_10 value: 11.157 - type: precision_at_100 value: 1.762 - type: precision_at_1000 value: 0.233 - type: precision_at_20 value: 6.497 - type: precision_at_3 value: 23.044999999999998 - type: precision_at_5 value: 17.284 - type: recall_at_1 value: 19.126 - type: recall_at_10 value: 46.062 - type: recall_at_100 value: 70.22800000000001 - type: recall_at_1000 value: 89.803 - type: recall_at_20 value: 53.217999999999996 - type: recall_at_3 value: 30.847 - type: recall_at_5 value: 37.11 - task: type: Retrieval dataset: name: MTEB HotpotQA-PL (default) type: clarin-knext/hotpotqa-pl config: default split: test revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907 metrics: - type: main_score value: 60.27 - type: map_at_1 value: 35.199000000000005 - type: map_at_10 value: 51.369 - type: map_at_100 value: 52.212 - type: map_at_1000 value: 52.28 - type: map_at_20 value: 51.864 - type: map_at_3 value: 48.446 - type: map_at_5 value: 50.302 - type: mrr_at_1 value: 70.39837947332883 - type: mrr_at_10 value: 76.8346141067273 - type: mrr_at_100 value: 77.10724392048137 - type: mrr_at_1000 value: 77.12037412892865 - type: mrr_at_20 value: 77.01061532947222 - type: mrr_at_3 value: 75.5908170155299 - type: mrr_at_5 value: 76.39095205941899 - type: nauc_map_at_1000_diff1 value: 24.701387884989117 - type: nauc_map_at_1000_max value: 23.25553235642178 - type: nauc_map_at_1000_std value: 7.1803506915661774 - type: nauc_map_at_100_diff1 value: 24.674498622483103 - type: nauc_map_at_100_max value: 23.234948525052175 - type: nauc_map_at_100_std value: 7.168677997105447 - type: nauc_map_at_10_diff1 value: 24.676025039755626 - type: nauc_map_at_10_max value: 23.171971872726964 - type: nauc_map_at_10_std value: 6.485610909852058 - type: nauc_map_at_1_diff1 value: 68.90178464319715 - type: nauc_map_at_1_max value: 
46.05537868917558 - type: nauc_map_at_1_std value: 1.7658552480698708 - type: nauc_map_at_20_diff1 value: 24.69297151842494 - type: nauc_map_at_20_max value: 23.213064691673637 - type: nauc_map_at_20_std value: 6.9357946556849 - type: nauc_map_at_3_diff1 value: 26.279128947950507 - type: nauc_map_at_3_max value: 23.929537354117922 - type: nauc_map_at_3_std value: 4.625061565714759 - type: nauc_map_at_5_diff1 value: 25.04448959482816 - type: nauc_map_at_5_max value: 23.432012857899338 - type: nauc_map_at_5_std value: 5.845744681998008 - type: nauc_mrr_at_1000_diff1 value: 66.7503918108276 - type: nauc_mrr_at_1000_max value: 48.42897342336844 - type: nauc_mrr_at_1000_std value: 5.3097517971144415 - type: nauc_mrr_at_100_diff1 value: 66.74645215862695 - type: nauc_mrr_at_100_max value: 48.4368663009989 - type: nauc_mrr_at_100_std value: 5.322297898555188 - type: nauc_mrr_at_10_diff1 value: 66.69310166180729 - type: nauc_mrr_at_10_max value: 48.475437698330225 - type: nauc_mrr_at_10_std value: 5.258183461631702 - type: nauc_mrr_at_1_diff1 value: 68.90178464319715 - type: nauc_mrr_at_1_max value: 46.05537868917558 - type: nauc_mrr_at_1_std value: 1.7658552480698708 - type: nauc_mrr_at_20_diff1 value: 66.72000262431975 - type: nauc_mrr_at_20_max value: 48.45593642981319 - type: nauc_mrr_at_20_std value: 5.353665929072101 - type: nauc_mrr_at_3_diff1 value: 66.84936676396276 - type: nauc_mrr_at_3_max value: 48.466611276778295 - type: nauc_mrr_at_3_std value: 4.485810398557475 - type: nauc_mrr_at_5_diff1 value: 66.62362565394174 - type: nauc_mrr_at_5_max value: 48.456431835482014 - type: nauc_mrr_at_5_std value: 5.08482458391903 - type: nauc_ndcg_at_1000_diff1 value: 29.984825173719443 - type: nauc_ndcg_at_1000_max value: 27.289179238639893 - type: nauc_ndcg_at_1000_std value: 10.661480455527526 - type: nauc_ndcg_at_100_diff1 value: 29.322074257047877 - type: nauc_ndcg_at_100_max value: 26.850650276220605 - type: nauc_ndcg_at_100_std value: 10.599247982501902 - type: nauc_ndcg_at_10_diff1 value: 29.659909113886094 - type: nauc_ndcg_at_10_max value: 26.836139599331005 - type: nauc_ndcg_at_10_std value: 8.12844399452719 - type: nauc_ndcg_at_1_diff1 value: 68.90178464319715 - type: nauc_ndcg_at_1_max value: 46.05537868917558 - type: nauc_ndcg_at_1_std value: 1.7658552480698708 - type: nauc_ndcg_at_20_diff1 value: 29.510802214854294 - type: nauc_ndcg_at_20_max value: 26.775562637730722 - type: nauc_ndcg_at_20_std value: 9.341342661702363 - type: nauc_ndcg_at_3_diff1 value: 32.741885846292966 - type: nauc_ndcg_at_3_max value: 28.44225108761343 - type: nauc_ndcg_at_3_std value: 5.204440768465042 - type: nauc_ndcg_at_5_diff1 value: 30.57856348635919 - type: nauc_ndcg_at_5_max value: 27.475007474301698 - type: nauc_ndcg_at_5_std value: 6.961546044312487 - type: nauc_precision_at_1000_diff1 value: 0.002113156309413332 - type: nauc_precision_at_1000_max value: 11.198242419541286 - type: nauc_precision_at_1000_std value: 28.69676419166541 - type: nauc_precision_at_100_diff1 value: 3.6049575557782627 - type: nauc_precision_at_100_max value: 12.499173524574791 - type: nauc_precision_at_100_std value: 23.3755281004721 - type: nauc_precision_at_10_diff1 value: 10.922574784853193 - type: nauc_precision_at_10_max value: 16.23221529562036 - type: nauc_precision_at_10_std value: 12.45014808813857 - type: nauc_precision_at_1_diff1 value: 68.90178464319715 - type: nauc_precision_at_1_max value: 46.05537868917558 - type: nauc_precision_at_1_std value: 1.7658552480698708 - type: nauc_precision_at_20_diff1 value: 
8.840710781302827 - type: nauc_precision_at_20_max value: 14.804644554205524 - type: nauc_precision_at_20_std value: 16.245009770815237 - type: nauc_precision_at_3_diff1 value: 19.447291487137573 - type: nauc_precision_at_3_max value: 21.47123471597057 - type: nauc_precision_at_3_std value: 6.441862800128802 - type: nauc_precision_at_5_diff1 value: 14.078545719721108 - type: nauc_precision_at_5_max value: 18.468288046016387 - type: nauc_precision_at_5_std value: 9.58650641691393 - type: nauc_recall_at_1000_diff1 value: 0.0021131563095336584 - type: nauc_recall_at_1000_max value: 11.198242419541558 - type: nauc_recall_at_1000_std value: 28.6967641916655 - type: nauc_recall_at_100_diff1 value: 3.6049575557781393 - type: nauc_recall_at_100_max value: 12.499173524574765 - type: nauc_recall_at_100_std value: 23.375528100472074 - type: nauc_recall_at_10_diff1 value: 10.922574784853168 - type: nauc_recall_at_10_max value: 16.2322152956203 - type: nauc_recall_at_10_std value: 12.450148088138535 - type: nauc_recall_at_1_diff1 value: 68.90178464319715 - type: nauc_recall_at_1_max value: 46.05537868917558 - type: nauc_recall_at_1_std value: 1.7658552480698708 - type: nauc_recall_at_20_diff1 value: 8.840710781302905 - type: nauc_recall_at_20_max value: 14.804644554205515 - type: nauc_recall_at_20_std value: 16.245009770815273 - type: nauc_recall_at_3_diff1 value: 19.447291487137498 - type: nauc_recall_at_3_max value: 21.47123471597054 - type: nauc_recall_at_3_std value: 6.441862800128763 - type: nauc_recall_at_5_diff1 value: 14.07854571972115 - type: nauc_recall_at_5_max value: 18.468288046016337 - type: nauc_recall_at_5_std value: 9.586506416913904 - type: ndcg_at_1 value: 70.39800000000001 - type: ndcg_at_10 value: 60.27 - type: ndcg_at_100 value: 63.400999999999996 - type: ndcg_at_1000 value: 64.847 - type: ndcg_at_20 value: 61.571 - type: ndcg_at_3 value: 55.875 - type: ndcg_at_5 value: 58.36599999999999 - type: precision_at_1 value: 70.39800000000001 - type: precision_at_10 value: 12.46 - type: precision_at_100 value: 1.493 - type: precision_at_1000 value: 0.169 - type: precision_at_20 value: 6.65 - type: precision_at_3 value: 35.062 - type: precision_at_5 value: 23.009 - type: recall_at_1 value: 35.199000000000005 - type: recall_at_10 value: 62.302 - type: recall_at_100 value: 74.666 - type: recall_at_1000 value: 84.355 - type: recall_at_20 value: 66.496 - type: recall_at_3 value: 52.593 - type: recall_at_5 value: 57.522 - task: type: Retrieval dataset: name: MTEB MSMARCO-PL (default) type: clarin-knext/msmarco-pl config: default split: test revision: 8634c07806d5cce3a6138e260e59b81760a0a640 metrics: - type: main_score value: 64.886 - type: map_at_1 value: 1.644 - type: map_at_10 value: 12.24 - type: map_at_100 value: 28.248 - type: map_at_1000 value: 33.506 - type: map_at_20 value: 17.497 - type: map_at_3 value: 4.9399999999999995 - type: map_at_5 value: 8.272 - type: mrr_at_1 value: 83.72093023255815 - type: mrr_at_10 value: 91.08527131782945 - type: mrr_at_100 value: 91.08527131782945 - type: mrr_at_1000 value: 91.08527131782945 - type: mrr_at_20 value: 91.08527131782945 - type: mrr_at_3 value: 91.08527131782945 - type: mrr_at_5 value: 91.08527131782945 - type: nauc_map_at_1000_diff1 value: -36.428271627303424 - type: nauc_map_at_1000_max value: 44.87615127218638 - type: nauc_map_at_1000_std value: 67.92696808824724 - type: nauc_map_at_100_diff1 value: -28.11674206786188 - type: nauc_map_at_100_max value: 36.422779766334955 - type: nauc_map_at_100_std value: 49.99876313755116 - type: 
nauc_map_at_10_diff1 value: -5.838593619806058 - type: nauc_map_at_10_max value: 11.026519190509742 - type: nauc_map_at_10_std value: 2.5268752263522045 - type: nauc_map_at_1_diff1 value: 17.897907271073016 - type: nauc_map_at_1_max value: 12.229062762540844 - type: nauc_map_at_1_std value: -4.088830895573149 - type: nauc_map_at_20_diff1 value: -13.871097716255626 - type: nauc_map_at_20_max value: 19.291271635609533 - type: nauc_map_at_20_std value: 16.745335606507826 - type: nauc_map_at_3_diff1 value: 4.425238457033843 - type: nauc_map_at_3_max value: 4.611864744680824 - type: nauc_map_at_3_std value: -8.986916608582863 - type: nauc_map_at_5_diff1 value: -6.254849256920095 - type: nauc_map_at_5_max value: 2.729437079919823 - type: nauc_map_at_5_std value: -7.235906279913092 - type: nauc_mrr_at_1000_diff1 value: 52.18669104947672 - type: nauc_mrr_at_1000_max value: 68.26259125411818 - type: nauc_mrr_at_1000_std value: 56.345086428353575 - type: nauc_mrr_at_100_diff1 value: 52.18669104947672 - type: nauc_mrr_at_100_max value: 68.26259125411818 - type: nauc_mrr_at_100_std value: 56.345086428353575 - type: nauc_mrr_at_10_diff1 value: 52.18669104947672 - type: nauc_mrr_at_10_max value: 68.26259125411818 - type: nauc_mrr_at_10_std value: 56.345086428353575 - type: nauc_mrr_at_1_diff1 value: 56.55126663944154 - type: nauc_mrr_at_1_max value: 66.37014285522565 - type: nauc_mrr_at_1_std value: 53.2508271389779 - type: nauc_mrr_at_20_diff1 value: 52.18669104947672 - type: nauc_mrr_at_20_max value: 68.26259125411818 - type: nauc_mrr_at_20_std value: 56.345086428353575 - type: nauc_mrr_at_3_diff1 value: 52.18669104947672 - type: nauc_mrr_at_3_max value: 68.26259125411818 - type: nauc_mrr_at_3_std value: 56.345086428353575 - type: nauc_mrr_at_5_diff1 value: 52.18669104947672 - type: nauc_mrr_at_5_max value: 68.26259125411818 - type: nauc_mrr_at_5_std value: 56.345086428353575 - type: nauc_ndcg_at_1000_diff1 value: -19.06422926483731 - type: nauc_ndcg_at_1000_max value: 56.30853514590265 - type: nauc_ndcg_at_1000_std value: 70.30810947505557 - type: nauc_ndcg_at_100_diff1 value: -25.72587586459692 - type: nauc_ndcg_at_100_max value: 51.433781241604194 - type: nauc_ndcg_at_100_std value: 68.37678512652792 - type: nauc_ndcg_at_10_diff1 value: -23.21198108212602 - type: nauc_ndcg_at_10_max value: 43.5450720846516 - type: nauc_ndcg_at_10_std value: 48.78307907005605 - type: nauc_ndcg_at_1_diff1 value: 44.00179301267447 - type: nauc_ndcg_at_1_max value: 48.202370455680395 - type: nauc_ndcg_at_1_std value: 25.69655992704088 - type: nauc_ndcg_at_20_diff1 value: -33.88168753446507 - type: nauc_ndcg_at_20_max value: 45.16199742613164 - type: nauc_ndcg_at_20_std value: 61.87098383164902 - type: nauc_ndcg_at_3_diff1 value: 11.19174449544048 - type: nauc_ndcg_at_3_max value: 44.34069860560555 - type: nauc_ndcg_at_3_std value: 27.451258369798115 - type: nauc_ndcg_at_5_diff1 value: -7.186520929432436 - type: nauc_ndcg_at_5_max value: 43.41869981139378 - type: nauc_ndcg_at_5_std value: 34.89898115995178 - type: nauc_precision_at_1000_diff1 value: -34.43998154563451 - type: nauc_precision_at_1000_max value: 29.172655907480372 - type: nauc_precision_at_1000_std value: 65.15824469614837 - type: nauc_precision_at_100_diff1 value: -37.82409643259692 - type: nauc_precision_at_100_max value: 38.24986991317909 - type: nauc_precision_at_100_std value: 72.74768183105327 - type: nauc_precision_at_10_diff1 value: -32.21556182780535 - type: nauc_precision_at_10_max value: 34.27170432382651 - type: nauc_precision_at_10_std value: 
58.358255004394664 - type: nauc_precision_at_1_diff1 value: 56.55126663944154 - type: nauc_precision_at_1_max value: 66.37014285522565 - type: nauc_precision_at_1_std value: 53.2508271389779 - type: nauc_precision_at_20_diff1 value: -40.18751579026395 - type: nauc_precision_at_20_max value: 33.960783153758896 - type: nauc_precision_at_20_std value: 65.42918390184195 - type: nauc_precision_at_3_diff1 value: -7.073870209006578 - type: nauc_precision_at_3_max value: 50.81535269862325 - type: nauc_precision_at_3_std value: 59.248681565955685 - type: nauc_precision_at_5_diff1 value: -31.136580596983876 - type: nauc_precision_at_5_max value: 45.88147792380426 - type: nauc_precision_at_5_std value: 67.46814230928243 - type: nauc_recall_at_1000_diff1 value: -23.15699999594577 - type: nauc_recall_at_1000_max value: 39.77277799761876 - type: nauc_recall_at_1000_std value: 60.326168012901114 - type: nauc_recall_at_100_diff1 value: -21.636664823598498 - type: nauc_recall_at_100_max value: 31.104969346131583 - type: nauc_recall_at_100_std value: 38.811686891592096 - type: nauc_recall_at_10_diff1 value: -10.542765625053569 - type: nauc_recall_at_10_max value: 2.043876058107446 - type: nauc_recall_at_10_std value: -5.578449908984766 - type: nauc_recall_at_1_diff1 value: 17.897907271073016 - type: nauc_recall_at_1_max value: 12.229062762540844 - type: nauc_recall_at_1_std value: -4.088830895573149 - type: nauc_recall_at_20_diff1 value: -15.132909355710103 - type: nauc_recall_at_20_max value: 12.659765287241065 - type: nauc_recall_at_20_std value: 8.277887800815819 - type: nauc_recall_at_3_diff1 value: -3.1975017812715016 - type: nauc_recall_at_3_max value: -3.5539857085038538 - type: nauc_recall_at_3_std value: -14.712102851318118 - type: nauc_recall_at_5_diff1 value: -14.040507717380743 - type: nauc_recall_at_5_max value: -6.126912150131701 - type: nauc_recall_at_5_std value: -13.821624015640355 - type: ndcg_at_1 value: 71.318 - type: ndcg_at_10 value: 64.886 - type: ndcg_at_100 value: 53.187 - type: ndcg_at_1000 value: 59.897999999999996 - type: ndcg_at_20 value: 58.96 - type: ndcg_at_3 value: 69.736 - type: ndcg_at_5 value: 70.14099999999999 - type: precision_at_1 value: 83.721 - type: precision_at_10 value: 71.163 - type: precision_at_100 value: 29.465000000000003 - type: precision_at_1000 value: 5.665 - type: precision_at_20 value: 57.791000000000004 - type: precision_at_3 value: 82.171 - type: precision_at_5 value: 81.86 - type: recall_at_1 value: 1.644 - type: recall_at_10 value: 14.238000000000001 - type: recall_at_100 value: 39.831 - type: recall_at_1000 value: 64.057 - type: recall_at_20 value: 21.021 - type: recall_at_3 value: 5.53 - type: recall_at_5 value: 9.623 - task: type: Retrieval dataset: name: MTEB NFCorpus-PL (default) type: clarin-knext/nfcorpus-pl config: default split: test revision: 9a6f9567fda928260afed2de480d79c98bf0bec0 metrics: - type: main_score value: 31.391000000000002 - type: map_at_1 value: 4.163 - type: map_at_10 value: 10.744 - type: map_at_100 value: 14.038999999999998 - type: map_at_1000 value: 15.434999999999999 - type: map_at_20 value: 12.16 - type: map_at_3 value: 7.614999999999999 - type: map_at_5 value: 9.027000000000001 - type: mrr_at_1 value: 39.0092879256966 - type: mrr_at_10 value: 48.69809327239668 - type: mrr_at_100 value: 49.20788148442068 - type: mrr_at_1000 value: 49.25509336494706 - type: mrr_at_20 value: 48.99606551850896 - type: mrr_at_3 value: 46.284829721362236 - type: mrr_at_5 value: 47.77089783281735 - type: nauc_map_at_1000_diff1 value: 
22.75421477116417 - type: nauc_map_at_1000_max value: 49.242283787799046 - type: nauc_map_at_1000_std value: 29.056888272331832 - type: nauc_map_at_100_diff1 value: 23.585977398585594 - type: nauc_map_at_100_max value: 48.25845199409498 - type: nauc_map_at_100_std value: 24.944264511223693 - type: nauc_map_at_10_diff1 value: 27.386613094780255 - type: nauc_map_at_10_max value: 41.52415346691586 - type: nauc_map_at_10_std value: 12.93872448563755 - type: nauc_map_at_1_diff1 value: 46.78688143865053 - type: nauc_map_at_1_max value: 37.20408843995871 - type: nauc_map_at_1_std value: 4.383444959401098 - type: nauc_map_at_20_diff1 value: 25.590969047740288 - type: nauc_map_at_20_max value: 44.57109307999418 - type: nauc_map_at_20_std value: 16.45855141821407 - type: nauc_map_at_3_diff1 value: 36.30017108362863 - type: nauc_map_at_3_max value: 34.66149613991648 - type: nauc_map_at_3_std value: 5.67985905078467 - type: nauc_map_at_5_diff1 value: 31.157644795417223 - type: nauc_map_at_5_max value: 37.274738661636825 - type: nauc_map_at_5_std value: 8.70088872394168 - type: nauc_mrr_at_1000_diff1 value: 25.638564218157384 - type: nauc_mrr_at_1000_max value: 57.77788270285353 - type: nauc_mrr_at_1000_std value: 43.507586592911274 - type: nauc_mrr_at_100_diff1 value: 25.662002580561584 - type: nauc_mrr_at_100_max value: 57.80578394278584 - type: nauc_mrr_at_100_std value: 43.543905743986635 - type: nauc_mrr_at_10_diff1 value: 25.426034796339835 - type: nauc_mrr_at_10_max value: 57.68443186258669 - type: nauc_mrr_at_10_std value: 43.438009108331215 - type: nauc_mrr_at_1_diff1 value: 26.073028156311075 - type: nauc_mrr_at_1_max value: 52.11817916720053 - type: nauc_mrr_at_1_std value: 37.41073893153695 - type: nauc_mrr_at_20_diff1 value: 25.548645553336147 - type: nauc_mrr_at_20_max value: 57.78552760401915 - type: nauc_mrr_at_20_std value: 43.521687428822325 - type: nauc_mrr_at_3_diff1 value: 25.72662577397805 - type: nauc_mrr_at_3_max value: 56.891263536265605 - type: nauc_mrr_at_3_std value: 41.384872305390104 - type: nauc_mrr_at_5_diff1 value: 25.552211551655386 - type: nauc_mrr_at_5_max value: 57.976813828353926 - type: nauc_mrr_at_5_std value: 43.504564461855544 - type: nauc_ndcg_at_1000_diff1 value: 23.456158044182757 - type: nauc_ndcg_at_1000_max value: 60.05411773552709 - type: nauc_ndcg_at_1000_std value: 47.857510017262584 - type: nauc_ndcg_at_100_diff1 value: 19.711635700390772 - type: nauc_ndcg_at_100_max value: 56.178746740470665 - type: nauc_ndcg_at_100_std value: 42.36829180286942 - type: nauc_ndcg_at_10_diff1 value: 18.364428967788413 - type: nauc_ndcg_at_10_max value: 54.38372506578223 - type: nauc_ndcg_at_10_std value: 41.75765411340369 - type: nauc_ndcg_at_1_diff1 value: 26.571093272640773 - type: nauc_ndcg_at_1_max value: 51.061788341958284 - type: nauc_ndcg_at_1_std value: 36.514987974075986 - type: nauc_ndcg_at_20_diff1 value: 18.345487193027697 - type: nauc_ndcg_at_20_max value: 54.62621882656994 - type: nauc_ndcg_at_20_std value: 41.42835554714241 - type: nauc_ndcg_at_3_diff1 value: 23.260105658139025 - type: nauc_ndcg_at_3_max value: 52.07747385334546 - type: nauc_ndcg_at_3_std value: 36.91985577837284 - type: nauc_ndcg_at_5_diff1 value: 20.40428109665566 - type: nauc_ndcg_at_5_max value: 53.52015347884604 - type: nauc_ndcg_at_5_std value: 39.46008849580017 - type: nauc_precision_at_1000_diff1 value: -7.3487344916380035 - type: nauc_precision_at_1000_max value: 16.58045221394852 - type: nauc_precision_at_1000_std value: 38.94030932397075 - type: nauc_precision_at_100_diff1 
value: -5.257743986683922 - type: nauc_precision_at_100_max value: 34.43071687475306 - type: nauc_precision_at_100_std value: 53.499519170670474 - type: nauc_precision_at_10_diff1 value: 2.385136433119139 - type: nauc_precision_at_10_max value: 47.210743878631064 - type: nauc_precision_at_10_std value: 47.22767704186548 - type: nauc_precision_at_1_diff1 value: 26.073028156311075 - type: nauc_precision_at_1_max value: 52.11817916720053 - type: nauc_precision_at_1_std value: 37.41073893153695 - type: nauc_precision_at_20_diff1 value: -0.3531531127238474 - type: nauc_precision_at_20_max value: 44.78044604856974 - type: nauc_precision_at_20_std value: 49.532804150743615 - type: nauc_precision_at_3_diff1 value: 15.350050569991447 - type: nauc_precision_at_3_max value: 51.01572315596549 - type: nauc_precision_at_3_std value: 38.801125728413155 - type: nauc_precision_at_5_diff1 value: 9.109003666144694 - type: nauc_precision_at_5_max value: 50.935269774898494 - type: nauc_precision_at_5_std value: 43.323548180559676 - type: nauc_recall_at_1000_diff1 value: 16.64743647648886 - type: nauc_recall_at_1000_max value: 38.46012283772285 - type: nauc_recall_at_1000_std value: 36.02016164796441 - type: nauc_recall_at_100_diff1 value: 14.005834785186744 - type: nauc_recall_at_100_max value: 37.70026105513647 - type: nauc_recall_at_100_std value: 27.085222642129697 - type: nauc_recall_at_10_diff1 value: 21.204106627422632 - type: nauc_recall_at_10_max value: 36.737624881893424 - type: nauc_recall_at_10_std value: 13.755054514272702 - type: nauc_recall_at_1_diff1 value: 46.78688143865053 - type: nauc_recall_at_1_max value: 37.20408843995871 - type: nauc_recall_at_1_std value: 4.383444959401098 - type: nauc_recall_at_20_diff1 value: 19.740977611421933 - type: nauc_recall_at_20_max value: 39.21908969539783 - type: nauc_recall_at_20_std value: 16.560269670318494 - type: nauc_recall_at_3_diff1 value: 32.189359545367815 - type: nauc_recall_at_3_max value: 31.693634445562758 - type: nauc_recall_at_3_std value: 6.246326281543587 - type: nauc_recall_at_5_diff1 value: 25.51586860499901 - type: nauc_recall_at_5_max value: 33.15934725342885 - type: nauc_recall_at_5_std value: 9.677778511696705 - type: ndcg_at_1 value: 37.307 - type: ndcg_at_10 value: 31.391000000000002 - type: ndcg_at_100 value: 28.877999999999997 - type: ndcg_at_1000 value: 37.16 - type: ndcg_at_20 value: 29.314 - type: ndcg_at_3 value: 35.405 - type: ndcg_at_5 value: 33.922999999999995 - type: precision_at_1 value: 39.009 - type: precision_at_10 value: 24.52 - type: precision_at_100 value: 7.703 - type: precision_at_1000 value: 2.04 - type: precision_at_20 value: 18.08 - type: precision_at_3 value: 34.469 - type: precision_at_5 value: 30.712 - type: recall_at_1 value: 4.163 - type: recall_at_10 value: 15.015999999999998 - type: recall_at_100 value: 30.606 - type: recall_at_1000 value: 59.606 - type: recall_at_20 value: 19.09 - type: recall_at_3 value: 9.139 - type: recall_at_5 value: 11.477 - task: type: Retrieval dataset: name: MTEB NQ-PL (default) type: clarin-knext/nq-pl config: default split: test revision: f171245712cf85dd4700b06bef18001578d0ca8d metrics: - type: main_score value: 54.017 - type: map_at_1 value: 34.193 - type: map_at_10 value: 47.497 - type: map_at_100 value: 48.441 - type: map_at_1000 value: 48.481 - type: map_at_20 value: 48.093 - type: map_at_3 value: 44.017 - type: map_at_5 value: 46.111000000000004 - type: mrr_at_1 value: 37.949015063731174 - type: mrr_at_10 value: 49.915772315105954 - type: mrr_at_100 value: 
50.62841255829997 - type: mrr_at_1000 value: 50.656773027666745 - type: mrr_at_20 value: 50.37785276657083 - type: mrr_at_3 value: 46.98725376593267 - type: mrr_at_5 value: 48.763035921205066 - type: nauc_map_at_1000_diff1 value: 39.5632191792873 - type: nauc_map_at_1000_max value: 37.4728247053629 - type: nauc_map_at_1000_std value: 5.742498414663762 - type: nauc_map_at_100_diff1 value: 39.555570352061906 - type: nauc_map_at_100_max value: 37.497880976847334 - type: nauc_map_at_100_std value: 5.7798021019465375 - type: nauc_map_at_10_diff1 value: 39.5423723444454 - type: nauc_map_at_10_max value: 37.41661971723365 - type: nauc_map_at_10_std value: 5.2378002164144695 - type: nauc_map_at_1_diff1 value: 41.52697034146981 - type: nauc_map_at_1_max value: 28.558995576942863 - type: nauc_map_at_1_std value: 0.13094542859192052 - type: nauc_map_at_20_diff1 value: 39.55484628943701 - type: nauc_map_at_20_max value: 37.5247794933719 - type: nauc_map_at_20_std value: 5.702881342279231 - type: nauc_map_at_3_diff1 value: 39.949323925425325 - type: nauc_map_at_3_max value: 35.770298168901924 - type: nauc_map_at_3_std value: 2.9127112432479874 - type: nauc_map_at_5_diff1 value: 39.768310617004545 - type: nauc_map_at_5_max value: 37.1549191664796 - type: nauc_map_at_5_std value: 4.4681285748269515 - type: nauc_mrr_at_1000_diff1 value: 39.14001746706457 - type: nauc_mrr_at_1000_max value: 37.477376518267775 - type: nauc_mrr_at_1000_std value: 6.8088891531621565 - type: nauc_mrr_at_100_diff1 value: 39.13054707413684 - type: nauc_mrr_at_100_max value: 37.498126443766274 - type: nauc_mrr_at_100_std value: 6.839411380129971 - type: nauc_mrr_at_10_diff1 value: 39.09764730048156 - type: nauc_mrr_at_10_max value: 37.58593798217306 - type: nauc_mrr_at_10_std value: 6.713795164982413 - type: nauc_mrr_at_1_diff1 value: 41.581599918664075 - type: nauc_mrr_at_1_max value: 31.500589231378722 - type: nauc_mrr_at_1_std value: 2.059116370339438 - type: nauc_mrr_at_20_diff1 value: 39.09011023988447 - type: nauc_mrr_at_20_max value: 37.55856008791344 - type: nauc_mrr_at_20_std value: 6.847165397615844 - type: nauc_mrr_at_3_diff1 value: 39.382542043738 - type: nauc_mrr_at_3_max value: 36.49265363659468 - type: nauc_mrr_at_3_std value: 4.759157976438336 - type: nauc_mrr_at_5_diff1 value: 39.304826333759976 - type: nauc_mrr_at_5_max value: 37.46326016736024 - type: nauc_mrr_at_5_std value: 6.122608305766621 - type: nauc_ndcg_at_1000_diff1 value: 38.568500038453266 - type: nauc_ndcg_at_1000_max value: 39.799710882413166 - type: nauc_ndcg_at_1000_std value: 9.357010223096639 - type: nauc_ndcg_at_100_diff1 value: 38.38026091343228 - type: nauc_ndcg_at_100_max value: 40.48398173542486 - type: nauc_ndcg_at_100_std value: 10.373054013302214 - type: nauc_ndcg_at_10_diff1 value: 38.27340980909964 - type: nauc_ndcg_at_10_max value: 40.35241649744093 - type: nauc_ndcg_at_10_std value: 8.579139930345168 - type: nauc_ndcg_at_1_diff1 value: 41.581599918664075 - type: nauc_ndcg_at_1_max value: 31.500589231378722 - type: nauc_ndcg_at_1_std value: 2.059116370339438 - type: nauc_ndcg_at_20_diff1 value: 38.26453028884807 - type: nauc_ndcg_at_20_max value: 40.70517858426641 - type: nauc_ndcg_at_20_std value: 9.987693876137905 - type: nauc_ndcg_at_3_diff1 value: 39.2078971733273 - type: nauc_ndcg_at_3_max value: 37.48672195565316 - type: nauc_ndcg_at_3_std value: 4.051464994659221 - type: nauc_ndcg_at_5_diff1 value: 38.883693595665285 - type: nauc_ndcg_at_5_max value: 39.763115634437135 - type: nauc_ndcg_at_5_std value: 6.738980451582073 - 
type: nauc_precision_at_1000_diff1 value: -7.223215910619012 - type: nauc_precision_at_1000_max value: 13.075844604892161 - type: nauc_precision_at_1000_std value: 19.864336920890107 - type: nauc_precision_at_100_diff1 value: 1.3305994810812418 - type: nauc_precision_at_100_max value: 25.9219108557104 - type: nauc_precision_at_100_std value: 27.5076605928207 - type: nauc_precision_at_10_diff1 value: 18.441551484970326 - type: nauc_precision_at_10_max value: 39.85995330437054 - type: nauc_precision_at_10_std value: 20.561269077428914 - type: nauc_precision_at_1_diff1 value: 41.581599918664075 - type: nauc_precision_at_1_max value: 31.500589231378722 - type: nauc_precision_at_1_std value: 2.059116370339438 - type: nauc_precision_at_20_diff1 value: 12.579593891480531 - type: nauc_precision_at_20_max value: 36.620221830588775 - type: nauc_precision_at_20_std value: 26.40364876775059 - type: nauc_precision_at_3_diff1 value: 30.158859294487073 - type: nauc_precision_at_3_max value: 41.168215766389174 - type: nauc_precision_at_3_std value: 9.44345004450809 - type: nauc_precision_at_5_diff1 value: 25.438624678672785 - type: nauc_precision_at_5_max value: 42.72802023518524 - type: nauc_precision_at_5_std value: 15.357657388511099 - type: nauc_recall_at_1000_diff1 value: 24.987564782718003 - type: nauc_recall_at_1000_max value: 70.508416373353 - type: nauc_recall_at_1000_std value: 69.75092280398808 - type: nauc_recall_at_100_diff1 value: 29.504202856421397 - type: nauc_recall_at_100_max value: 63.41356585545318 - type: nauc_recall_at_100_std value: 50.09250954437847 - type: nauc_recall_at_10_diff1 value: 32.355776022971774 - type: nauc_recall_at_10_max value: 49.47121901667283 - type: nauc_recall_at_10_std value: 19.418439406631244 - type: nauc_recall_at_1_diff1 value: 41.52697034146981 - type: nauc_recall_at_1_max value: 28.558995576942863 - type: nauc_recall_at_1_std value: 0.13094542859192052 - type: nauc_recall_at_20_diff1 value: 31.57334731023589 - type: nauc_recall_at_20_max value: 54.06567225197383 - type: nauc_recall_at_20_std value: 29.222029720570468 - type: nauc_recall_at_3_diff1 value: 36.45033533275773 - type: nauc_recall_at_3_max value: 40.39529713780803 - type: nauc_recall_at_3_std value: 5.21893897772794 - type: nauc_recall_at_5_diff1 value: 35.18471678478859 - type: nauc_recall_at_5_max value: 46.20100816867823 - type: nauc_recall_at_5_std value: 11.94481894633221 - type: ndcg_at_1 value: 37.949 - type: ndcg_at_10 value: 54.017 - type: ndcg_at_100 value: 58.126 - type: ndcg_at_1000 value: 59.073 - type: ndcg_at_20 value: 55.928 - type: ndcg_at_3 value: 47.494 - type: ndcg_at_5 value: 50.975 - type: precision_at_1 value: 37.949 - type: precision_at_10 value: 8.450000000000001 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.117 - type: precision_at_20 value: 4.689 - type: precision_at_3 value: 21.051000000000002 - type: precision_at_5 value: 14.664 - type: recall_at_1 value: 34.193 - type: recall_at_10 value: 71.357 - type: recall_at_100 value: 89.434 - type: recall_at_1000 value: 96.536 - type: recall_at_20 value: 78.363 - type: recall_at_3 value: 54.551 - type: recall_at_5 value: 62.543000000000006 - task: type: Retrieval dataset: name: MTEB Quora-PL (default) type: clarin-knext/quora-pl config: default split: test revision: 0be27e93455051e531182b85e85e425aba12e9d4 metrics: - type: main_score value: 84.114 - type: map_at_1 value: 65.848 - type: map_at_10 value: 79.85900000000001 - type: map_at_100 value: 80.582 - type: map_at_1000 value: 80.60300000000001 - 
type: map_at_20 value: 80.321 - type: map_at_3 value: 76.741 - type: map_at_5 value: 78.72200000000001 - type: mrr_at_1 value: 75.97 - type: mrr_at_10 value: 83.04630158730119 - type: mrr_at_100 value: 83.22785731032968 - type: mrr_at_1000 value: 83.23123717623899 - type: mrr_at_20 value: 83.17412021320565 - type: mrr_at_3 value: 81.83333333333287 - type: mrr_at_5 value: 82.61933333333275 - type: nauc_map_at_1000_diff1 value: 73.26316553371083 - type: nauc_map_at_1000_max value: 27.92567859085245 - type: nauc_map_at_1000_std value: -47.477909533360446 - type: nauc_map_at_100_diff1 value: 73.2690602807223 - type: nauc_map_at_100_max value: 27.915868327849996 - type: nauc_map_at_100_std value: -47.525777766107595 - type: nauc_map_at_10_diff1 value: 73.45464428464894 - type: nauc_map_at_10_max value: 27.451611487246296 - type: nauc_map_at_10_std value: -49.35818715843809 - type: nauc_map_at_1_diff1 value: 77.29690208952982 - type: nauc_map_at_1_max value: 19.839875762282293 - type: nauc_map_at_1_std value: -45.355684654708284 - type: nauc_map_at_20_diff1 value: 73.35102731979796 - type: nauc_map_at_20_max value: 27.741506490134583 - type: nauc_map_at_20_std value: -48.22006207310331 - type: nauc_map_at_3_diff1 value: 73.94878241064137 - type: nauc_map_at_3_max value: 24.761321386766728 - type: nauc_map_at_3_std value: -51.20638883618126 - type: nauc_map_at_5_diff1 value: 73.66143558047698 - type: nauc_map_at_5_max value: 26.53483405013543 - type: nauc_map_at_5_std value: -50.697541279640056 - type: nauc_mrr_at_1000_diff1 value: 73.84632320009759 - type: nauc_mrr_at_1000_max value: 30.50182733610048 - type: nauc_mrr_at_1000_std value: -44.3021647995251 - type: nauc_mrr_at_100_diff1 value: 73.84480792662302 - type: nauc_mrr_at_100_max value: 30.50749424571614 - type: nauc_mrr_at_100_std value: -44.29615086388113 - type: nauc_mrr_at_10_diff1 value: 73.79442772949346 - type: nauc_mrr_at_10_max value: 30.55724252219984 - type: nauc_mrr_at_10_std value: -44.50997069462057 - type: nauc_mrr_at_1_diff1 value: 75.23369827945945 - type: nauc_mrr_at_1_max value: 29.20073967447664 - type: nauc_mrr_at_1_std value: -43.1920147658285 - type: nauc_mrr_at_20_diff1 value: 73.82731678072307 - type: nauc_mrr_at_20_max value: 30.566328605497667 - type: nauc_mrr_at_20_std value: -44.24683607643705 - type: nauc_mrr_at_3_diff1 value: 73.61997576749954 - type: nauc_mrr_at_3_max value: 30.150393853381917 - type: nauc_mrr_at_3_std value: -44.96847297506626 - type: nauc_mrr_at_5_diff1 value: 73.69084310616132 - type: nauc_mrr_at_5_max value: 30.578033703441125 - type: nauc_mrr_at_5_std value: -44.74920746066566 - type: nauc_ndcg_at_1000_diff1 value: 72.89349862557452 - type: nauc_ndcg_at_1000_max value: 29.824725190462086 - type: nauc_ndcg_at_1000_std value: -44.96284395063211 - type: nauc_ndcg_at_100_diff1 value: 72.85212753715273 - type: nauc_ndcg_at_100_max value: 29.933114207845605 - type: nauc_ndcg_at_100_std value: -44.944225570663754 - type: nauc_ndcg_at_10_diff1 value: 72.80576740454528 - type: nauc_ndcg_at_10_max value: 29.16829118320828 - type: nauc_ndcg_at_10_std value: -48.149473740079614 - type: nauc_ndcg_at_1_diff1 value: 75.00032534968587 - type: nauc_ndcg_at_1_max value: 29.61849062038547 - type: nauc_ndcg_at_1_std value: -42.560207043864054 - type: nauc_ndcg_at_20_diff1 value: 72.88440406302502 - type: nauc_ndcg_at_20_max value: 29.65496676092656 - type: nauc_ndcg_at_20_std value: -46.21238462167732 - type: nauc_ndcg_at_3_diff1 value: 72.37916962766987 - type: nauc_ndcg_at_3_max value: 27.125094834547586 
- type: nauc_ndcg_at_3_std value: -48.62942991399391 - type: nauc_ndcg_at_5_diff1 value: 72.57017330527658 - type: nauc_ndcg_at_5_max value: 28.470485561757254 - type: nauc_ndcg_at_5_std value: -49.07593345591059 - type: nauc_precision_at_1000_diff1 value: -41.67915575853946 - type: nauc_precision_at_1000_max value: 1.2012264478568844 - type: nauc_precision_at_1000_std value: 44.723834559400466 - type: nauc_precision_at_100_diff1 value: -40.45196679236971 - type: nauc_precision_at_100_max value: 2.3525450401714894 - type: nauc_precision_at_100_std value: 43.7092529413952 - type: nauc_precision_at_10_diff1 value: -30.256026923068767 - type: nauc_precision_at_10_max value: 8.313422052132559 - type: nauc_precision_at_10_std value: 25.929372356449694 - type: nauc_precision_at_1_diff1 value: 75.00032534968587 - type: nauc_precision_at_1_max value: 29.61849062038547 - type: nauc_precision_at_1_std value: -42.560207043864054 - type: nauc_precision_at_20_diff1 value: -35.61971069986584 - type: nauc_precision_at_20_max value: 5.4664303079116765 - type: nauc_precision_at_20_std value: 34.992352471692826 - type: nauc_precision_at_3_diff1 value: -5.691231842471157 - type: nauc_precision_at_3_max value: 14.797949087742444 - type: nauc_precision_at_3_std value: -0.1930317395644928 - type: nauc_precision_at_5_diff1 value: -20.03913781462645 - type: nauc_precision_at_5_max value: 11.956771408712749 - type: nauc_precision_at_5_std value: 13.179251389859731 - type: nauc_recall_at_1000_diff1 value: 64.03509042729674 - type: nauc_recall_at_1000_max value: 40.91691485428493 - type: nauc_recall_at_1000_std value: 16.12968625875372 - type: nauc_recall_at_100_diff1 value: 63.83116179628575 - type: nauc_recall_at_100_max value: 43.72908117676382 - type: nauc_recall_at_100_std value: -20.50966716852155 - type: nauc_recall_at_10_diff1 value: 66.42071960186394 - type: nauc_recall_at_10_max value: 28.983207818687205 - type: nauc_recall_at_10_std value: -56.61417798753744 - type: nauc_recall_at_1_diff1 value: 77.29690208952982 - type: nauc_recall_at_1_max value: 19.839875762282293 - type: nauc_recall_at_1_std value: -45.355684654708284 - type: nauc_recall_at_20_diff1 value: 66.32360705219874 - type: nauc_recall_at_20_max value: 33.30698111822631 - type: nauc_recall_at_20_std value: -43.89233781737452 - type: nauc_recall_at_3_diff1 value: 69.67029394927077 - type: nauc_recall_at_3_max value: 22.67803039327696 - type: nauc_recall_at_3_std value: -56.43327209861502 - type: nauc_recall_at_5_diff1 value: 68.05622143936131 - type: nauc_recall_at_5_max value: 26.67795559040675 - type: nauc_recall_at_5_std value: -58.158231198510954 - type: ndcg_at_1 value: 76.08 - type: ndcg_at_10 value: 84.114 - type: ndcg_at_100 value: 85.784 - type: ndcg_at_1000 value: 85.992 - type: ndcg_at_20 value: 84.976 - type: ndcg_at_3 value: 80.74799999999999 - type: ndcg_at_5 value: 82.626 - type: precision_at_1 value: 76.08 - type: precision_at_10 value: 12.926000000000002 - type: precision_at_100 value: 1.509 - type: precision_at_1000 value: 0.156 - type: precision_at_20 value: 6.912999999999999 - type: precision_at_3 value: 35.5 - type: precision_at_5 value: 23.541999999999998 - type: recall_at_1 value: 65.848 - type: recall_at_10 value: 92.611 - type: recall_at_100 value: 98.69 - type: recall_at_1000 value: 99.83999999999999 - type: recall_at_20 value: 95.47200000000001 - type: recall_at_3 value: 83.122 - type: recall_at_5 value: 88.23 - task: type: Retrieval dataset: name: MTEB SCIDOCS-PL (default) type: clarin-knext/scidocs-pl config: 
default split: test revision: 45452b03f05560207ef19149545f168e596c9337 metrics: - type: main_score value: 15.379999999999999 - type: map_at_1 value: 3.6029999999999998 - type: map_at_10 value: 8.843 - type: map_at_100 value: 10.433 - type: map_at_1000 value: 10.689 - type: map_at_20 value: 9.597 - type: map_at_3 value: 6.363 - type: map_at_5 value: 7.603 - type: mrr_at_1 value: 17.7 - type: mrr_at_10 value: 26.58900793650793 - type: mrr_at_100 value: 27.699652322890987 - type: mrr_at_1000 value: 27.78065313118353 - type: mrr_at_20 value: 27.215020950411816 - type: mrr_at_3 value: 23.36666666666668 - type: mrr_at_5 value: 25.211666666666666 - type: nauc_map_at_1000_diff1 value: 21.92235143827129 - type: nauc_map_at_1000_max value: 37.50300940750989 - type: nauc_map_at_1000_std value: 20.872586122198552 - type: nauc_map_at_100_diff1 value: 21.917408170465833 - type: nauc_map_at_100_max value: 37.4654466815513 - type: nauc_map_at_100_std value: 20.621643878648534 - type: nauc_map_at_10_diff1 value: 22.914388723621183 - type: nauc_map_at_10_max value: 36.468131213468794 - type: nauc_map_at_10_std value: 16.760980140791492 - type: nauc_map_at_1_diff1 value: 29.00799502838457 - type: nauc_map_at_1_max value: 26.64926291797503 - type: nauc_map_at_1_std value: 8.167291261637361 - type: nauc_map_at_20_diff1 value: 22.46580947804047 - type: nauc_map_at_20_max value: 36.656294842562275 - type: nauc_map_at_20_std value: 18.099232417722078 - type: nauc_map_at_3_diff1 value: 23.436009032045934 - type: nauc_map_at_3_max value: 31.325807212280914 - type: nauc_map_at_3_std value: 9.780905232048852 - type: nauc_map_at_5_diff1 value: 22.891704394665528 - type: nauc_map_at_5_max value: 35.40584466642894 - type: nauc_map_at_5_std value: 13.476986099394656 - type: nauc_mrr_at_1000_diff1 value: 25.052937655397866 - type: nauc_mrr_at_1000_max value: 29.64431912670108 - type: nauc_mrr_at_1000_std value: 14.549744963988044 - type: nauc_mrr_at_100_diff1 value: 25.070871266969224 - type: nauc_mrr_at_100_max value: 29.68743604652336 - type: nauc_mrr_at_100_std value: 14.582010154574432 - type: nauc_mrr_at_10_diff1 value: 24.88881466938897 - type: nauc_mrr_at_10_max value: 29.488430770768144 - type: nauc_mrr_at_10_std value: 14.269241073852266 - type: nauc_mrr_at_1_diff1 value: 29.220540327267503 - type: nauc_mrr_at_1_max value: 26.81908580507911 - type: nauc_mrr_at_1_std value: 8.00840295809718 - type: nauc_mrr_at_20_diff1 value: 25.067912695721944 - type: nauc_mrr_at_20_max value: 29.759227563849628 - type: nauc_mrr_at_20_std value: 14.685076859257357 - type: nauc_mrr_at_3_diff1 value: 24.645848739182696 - type: nauc_mrr_at_3_max value: 27.73368549660351 - type: nauc_mrr_at_3_std value: 11.475742805586943 - type: nauc_mrr_at_5_diff1 value: 24.895295760909946 - type: nauc_mrr_at_5_max value: 29.130755033240423 - type: nauc_mrr_at_5_std value: 12.955802929145404 - type: nauc_ndcg_at_1000_diff1 value: 20.68434434777729 - type: nauc_ndcg_at_1000_max value: 37.67055146424174 - type: nauc_ndcg_at_1000_std value: 29.57493715069776 - type: nauc_ndcg_at_100_diff1 value: 20.396834816492383 - type: nauc_ndcg_at_100_max value: 37.460575228670514 - type: nauc_ndcg_at_100_std value: 27.826534756761944 - type: nauc_ndcg_at_10_diff1 value: 22.640844106236027 - type: nauc_ndcg_at_10_max value: 35.21291764462327 - type: nauc_ndcg_at_10_std value: 19.53289455984506 - type: nauc_ndcg_at_1_diff1 value: 29.220540327267503 - type: nauc_ndcg_at_1_max value: 26.81908580507911 - type: nauc_ndcg_at_1_std value: 8.00840295809718 - type: 
nauc_ndcg_at_20_diff1 value: 22.117126657768623 - type: nauc_ndcg_at_20_max value: 35.79395781940806 - type: nauc_ndcg_at_20_std value: 22.242748346260786 - type: nauc_ndcg_at_3_diff1 value: 23.00596063212187 - type: nauc_ndcg_at_3_max value: 30.149013627580523 - type: nauc_ndcg_at_3_std value: 11.07904064662722 - type: nauc_ndcg_at_5_diff1 value: 22.81875419630523 - type: nauc_ndcg_at_5_max value: 34.24267468356626 - type: nauc_ndcg_at_5_std value: 15.307780280752088 - type: nauc_precision_at_1000_diff1 value: 9.606677689029972 - type: nauc_precision_at_1000_max value: 32.74855550489271 - type: nauc_precision_at_1000_std value: 42.65372585937895 - type: nauc_precision_at_100_diff1 value: 11.528981313529545 - type: nauc_precision_at_100_max value: 35.642529490132404 - type: nauc_precision_at_100_std value: 38.146151426052306 - type: nauc_precision_at_10_diff1 value: 18.783957183811836 - type: nauc_precision_at_10_max value: 36.1982008334257 - type: nauc_precision_at_10_std value: 25.09349473195891 - type: nauc_precision_at_1_diff1 value: 29.220540327267503 - type: nauc_precision_at_1_max value: 26.81908580507911 - type: nauc_precision_at_1_std value: 8.00840295809718 - type: nauc_precision_at_20_diff1 value: 17.458766320828214 - type: nauc_precision_at_20_max value: 36.000404903025235 - type: nauc_precision_at_20_std value: 29.1608044138323 - type: nauc_precision_at_3_diff1 value: 20.213669462067166 - type: nauc_precision_at_3_max value: 31.120650847205912 - type: nauc_precision_at_3_std value: 12.390972418818118 - type: nauc_precision_at_5_diff1 value: 20.114245715785678 - type: nauc_precision_at_5_max value: 37.30360111495823 - type: nauc_precision_at_5_std value: 19.053109037822853 - type: nauc_recall_at_1000_diff1 value: 9.85800049032612 - type: nauc_recall_at_1000_max value: 32.48319160802687 - type: nauc_recall_at_1000_std value: 43.79941601741161 - type: nauc_recall_at_100_diff1 value: 11.375255270968337 - type: nauc_recall_at_100_max value: 35.1868784124497 - type: nauc_recall_at_100_std value: 38.422680583482666 - type: nauc_recall_at_10_diff1 value: 18.445783123521938 - type: nauc_recall_at_10_max value: 35.633267936276766 - type: nauc_recall_at_10_std value: 24.94469506254716 - type: nauc_recall_at_1_diff1 value: 29.00799502838457 - type: nauc_recall_at_1_max value: 26.64926291797503 - type: nauc_recall_at_1_std value: 8.167291261637361 - type: nauc_recall_at_20_diff1 value: 17.314906604151936 - type: nauc_recall_at_20_max value: 35.66067699203996 - type: nauc_recall_at_20_std value: 29.400137012506082 - type: nauc_recall_at_3_diff1 value: 19.873710875648698 - type: nauc_recall_at_3_max value: 30.92404718742849 - type: nauc_recall_at_3_std value: 12.400871018075199 - type: nauc_recall_at_5_diff1 value: 19.869948324233192 - type: nauc_recall_at_5_max value: 37.06832511687574 - type: nauc_recall_at_5_std value: 19.0798814966156 - type: ndcg_at_1 value: 17.7 - type: ndcg_at_10 value: 15.379999999999999 - type: ndcg_at_100 value: 22.09 - type: ndcg_at_1000 value: 27.151999999999997 - type: ndcg_at_20 value: 17.576 - type: ndcg_at_3 value: 14.219999999999999 - type: ndcg_at_5 value: 12.579 - type: precision_at_1 value: 17.7 - type: precision_at_10 value: 8.08 - type: precision_at_100 value: 1.7840000000000003 - type: precision_at_1000 value: 0.3 - type: precision_at_20 value: 5.305 - type: precision_at_3 value: 13.167000000000002 - type: precision_at_5 value: 11.06 - type: recall_at_1 value: 3.6029999999999998 - type: recall_at_10 value: 16.413 - type: recall_at_100 value: 36.263 - 
type: recall_at_1000 value: 61.016999999999996 - type: recall_at_20 value: 21.587999999999997 - type: recall_at_3 value: 8.013 - type: recall_at_5 value: 11.198 - task: type: Retrieval dataset: name: MTEB SciFact-PL (default) type: clarin-knext/scifact-pl config: default split: test revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e metrics: - type: main_score value: 64.764 - type: map_at_1 value: 49.778 - type: map_at_10 value: 59.88 - type: map_at_100 value: 60.707 - type: map_at_1000 value: 60.729 - type: map_at_20 value: 60.419999999999995 - type: map_at_3 value: 57.45400000000001 - type: map_at_5 value: 58.729 - type: mrr_at_1 value: 52.33333333333333 - type: mrr_at_10 value: 61.29193121693122 - type: mrr_at_100 value: 61.95817765126313 - type: mrr_at_1000 value: 61.97583284368782 - type: mrr_at_20 value: 61.72469949641003 - type: mrr_at_3 value: 59.44444444444444 - type: mrr_at_5 value: 60.494444444444454 - type: nauc_map_at_1000_diff1 value: 62.21235294015774 - type: nauc_map_at_1000_max value: 48.83996609100249 - type: nauc_map_at_1000_std value: 5.23892781043174 - type: nauc_map_at_100_diff1 value: 62.20170226789429 - type: nauc_map_at_100_max value: 48.8391766453537 - type: nauc_map_at_100_std value: 5.2664077457917715 - type: nauc_map_at_10_diff1 value: 61.961975488329024 - type: nauc_map_at_10_max value: 48.397109987625186 - type: nauc_map_at_10_std value: 4.314859710827481 - type: nauc_map_at_1_diff1 value: 65.0865197011516 - type: nauc_map_at_1_max value: 41.38862781954889 - type: nauc_map_at_1_std value: -0.9182122632530586 - type: nauc_map_at_20_diff1 value: 61.99173935851292 - type: nauc_map_at_20_max value: 48.79961814179307 - type: nauc_map_at_20_std value: 5.262181845825118 - type: nauc_map_at_3_diff1 value: 62.37910539880477 - type: nauc_map_at_3_max value: 47.13627890977091 - type: nauc_map_at_3_std value: 2.327897198087264 - type: nauc_map_at_5_diff1 value: 61.60080757149592 - type: nauc_map_at_5_max value: 47.60052458345962 - type: nauc_map_at_5_std value: 3.1770196981231047 - type: nauc_mrr_at_1000_diff1 value: 62.86810952814966 - type: nauc_mrr_at_1000_max value: 52.13248094447774 - type: nauc_mrr_at_1000_std value: 10.100485746570733 - type: nauc_mrr_at_100_diff1 value: 62.85364829491874 - type: nauc_mrr_at_100_max value: 52.134528010631854 - type: nauc_mrr_at_100_std value: 10.120945685447369 - type: nauc_mrr_at_10_diff1 value: 62.65679301829915 - type: nauc_mrr_at_10_max value: 52.09270719182349 - type: nauc_mrr_at_10_std value: 9.913834434725441 - type: nauc_mrr_at_1_diff1 value: 66.84108271415636 - type: nauc_mrr_at_1_max value: 46.67646429855176 - type: nauc_mrr_at_1_std value: 5.5505252956352304 - type: nauc_mrr_at_20_diff1 value: 62.72473227039611 - type: nauc_mrr_at_20_max value: 52.13479097802757 - type: nauc_mrr_at_20_std value: 10.188278833464084 - type: nauc_mrr_at_3_diff1 value: 63.797429185518496 - type: nauc_mrr_at_3_max value: 52.16486999573481 - type: nauc_mrr_at_3_std value: 9.094360767062762 - type: nauc_mrr_at_5_diff1 value: 62.592917975475494 - type: nauc_mrr_at_5_max value: 52.330741486107414 - type: nauc_mrr_at_5_std value: 9.742175534421389 - type: nauc_ndcg_at_1000_diff1 value: 61.38859337672476 - type: nauc_ndcg_at_1000_max value: 51.48380058339184 - type: nauc_ndcg_at_1000_std value: 9.670547660897673 - type: nauc_ndcg_at_100_diff1 value: 61.02438489641434 - type: nauc_ndcg_at_100_max value: 51.781246646780865 - type: nauc_ndcg_at_100_std value: 10.592961553245187 - type: nauc_ndcg_at_10_diff1 value: 60.03678353308358 - type: 
nauc_ndcg_at_10_max value: 50.70725688848762 - type: nauc_ndcg_at_10_std value: 7.9472446491016315 - type: nauc_ndcg_at_1_diff1 value: 66.84108271415636 - type: nauc_ndcg_at_1_max value: 46.67646429855176 - type: nauc_ndcg_at_1_std value: 5.5505252956352304 - type: nauc_ndcg_at_20_diff1 value: 59.828482718480224 - type: nauc_ndcg_at_20_max value: 51.45831789601284 - type: nauc_ndcg_at_20_std value: 10.722673683272049 - type: nauc_ndcg_at_3_diff1 value: 61.68982937524109 - type: nauc_ndcg_at_3_max value: 49.745326748604775 - type: nauc_ndcg_at_3_std value: 4.948298621202247 - type: nauc_ndcg_at_5_diff1 value: 59.67396171973207 - type: nauc_ndcg_at_5_max value: 49.87855139298281 - type: nauc_ndcg_at_5_std value: 6.08990428055584 - type: nauc_precision_at_1000_diff1 value: -1.594227972036865 - type: nauc_precision_at_1000_max value: 32.48431723086185 - type: nauc_precision_at_1000_std value: 53.84748466965268 - type: nauc_precision_at_100_diff1 value: 8.06411455192293 - type: nauc_precision_at_100_max value: 39.91003601878948 - type: nauc_precision_at_100_std value: 55.52979711075091 - type: nauc_precision_at_10_diff1 value: 26.610514456014066 - type: nauc_precision_at_10_max value: 47.09062494321172 - type: nauc_precision_at_10_std value: 33.91984226498748 - type: nauc_precision_at_1_diff1 value: 66.84108271415636 - type: nauc_precision_at_1_max value: 46.67646429855176 - type: nauc_precision_at_1_std value: 5.5505252956352304 - type: nauc_precision_at_20_diff1 value: 16.947688843085583 - type: nauc_precision_at_20_max value: 45.40488186572008 - type: nauc_precision_at_20_std value: 48.354421924500905 - type: nauc_precision_at_3_diff1 value: 49.11263981720622 - type: nauc_precision_at_3_max value: 52.7084625111683 - type: nauc_precision_at_3_std value: 16.734612173556453 - type: nauc_precision_at_5_diff1 value: 39.06503705015792 - type: nauc_precision_at_5_max value: 52.21710506893391 - type: nauc_precision_at_5_std value: 23.350948149460233 - type: nauc_recall_at_1000_diff1 value: 43.1559290382817 - type: nauc_recall_at_1000_max value: 83.66013071895456 - type: nauc_recall_at_1000_std value: 86.27450980392177 - type: nauc_recall_at_100_diff1 value: 46.016860850620375 - type: nauc_recall_at_100_max value: 69.3944888744547 - type: nauc_recall_at_100_std value: 55.286945696152735 - type: nauc_recall_at_10_diff1 value: 49.65877895350921 - type: nauc_recall_at_10_max value: 53.02636695700889 - type: nauc_recall_at_10_std value: 13.967608945823828 - type: nauc_recall_at_1_diff1 value: 65.0865197011516 - type: nauc_recall_at_1_max value: 41.38862781954889 - type: nauc_recall_at_1_std value: -0.9182122632530586 - type: nauc_recall_at_20_diff1 value: 43.355308229973524 - type: nauc_recall_at_20_max value: 57.04187909533764 - type: nauc_recall_at_20_std value: 33.578720846660524 - type: nauc_recall_at_3_diff1 value: 56.922996057428165 - type: nauc_recall_at_3_max value: 50.74417041895424 - type: nauc_recall_at_3_std value: 5.623890124328387 - type: nauc_recall_at_5_diff1 value: 50.55620076865238 - type: nauc_recall_at_5_max value: 51.3316854622085 - type: nauc_recall_at_5_std value: 8.995457887269255 - type: ndcg_at_1 value: 52.333 - type: ndcg_at_10 value: 64.764 - type: ndcg_at_100 value: 68.167 - type: ndcg_at_1000 value: 68.816 - type: ndcg_at_20 value: 66.457 - type: ndcg_at_3 value: 60.346 - type: ndcg_at_5 value: 62.365 - type: precision_at_1 value: 52.333 - type: precision_at_10 value: 8.799999999999999 - type: precision_at_100 value: 1.057 - type: precision_at_1000 value: 0.11100000000000002 
- type: precision_at_20 value: 4.8 - type: precision_at_3 value: 23.889 - type: precision_at_5 value: 15.6 - type: recall_at_1 value: 49.778 - type: recall_at_10 value: 78.206 - type: recall_at_100 value: 93.10000000000001 - type: recall_at_1000 value: 98.333 - type: recall_at_20 value: 84.467 - type: recall_at_3 value: 66.367 - type: recall_at_5 value: 71.35000000000001 - task: type: Retrieval dataset: name: MTEB TRECCOVID-PL (default) type: clarin-knext/trec-covid-pl config: default split: test revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd metrics: - type: main_score value: 72.18900000000001 - type: map_at_1 value: 0.214 - type: map_at_10 value: 1.755 - type: map_at_100 value: 9.944 - type: map_at_1000 value: 24.205 - type: map_at_20 value: 3.1510000000000002 - type: map_at_3 value: 0.6 - type: map_at_5 value: 0.9560000000000001 - type: mrr_at_1 value: 82.0 - type: mrr_at_10 value: 89.06666666666666 - type: mrr_at_100 value: 89.06666666666666 - type: mrr_at_1000 value: 89.06666666666666 - type: mrr_at_20 value: 89.06666666666666 - type: mrr_at_3 value: 87.66666666666666 - type: mrr_at_5 value: 89.06666666666666 - type: nauc_map_at_1000_diff1 value: -9.342037623635543 - type: nauc_map_at_1000_max value: 45.71499810252398 - type: nauc_map_at_1000_std value: 76.86482845196852 - type: nauc_map_at_100_diff1 value: -6.932395299866198 - type: nauc_map_at_100_max value: 36.097801891181604 - type: nauc_map_at_100_std value: 65.6085215411685 - type: nauc_map_at_10_diff1 value: -6.3654843824342775 - type: nauc_map_at_10_max value: 9.564437521432714 - type: nauc_map_at_10_std value: 21.8377319336476 - type: nauc_map_at_1_diff1 value: 8.269590874255034 - type: nauc_map_at_1_max value: 3.482498491294516 - type: nauc_map_at_1_std value: 8.985226819412189 - type: nauc_map_at_20_diff1 value: -4.971435767877232 - type: nauc_map_at_20_max value: 22.88801858567121 - type: nauc_map_at_20_std value: 32.38492618534027 - type: nauc_map_at_3_diff1 value: 1.1615973694623123 - type: nauc_map_at_3_max value: 1.935417800315643 - type: nauc_map_at_3_std value: 10.289328305818698 - type: nauc_map_at_5_diff1 value: -2.4675967231444105 - type: nauc_map_at_5_max value: 2.4611483736622373 - type: nauc_map_at_5_std value: 15.082324305750811 - type: nauc_mrr_at_1000_diff1 value: 13.098526703499063 - type: nauc_mrr_at_1000_max value: 56.37362177417431 - type: nauc_mrr_at_1000_std value: 73.2456769749587 - type: nauc_mrr_at_100_diff1 value: 13.098526703499063 - type: nauc_mrr_at_100_max value: 56.37362177417431 - type: nauc_mrr_at_100_std value: 73.2456769749587 - type: nauc_mrr_at_10_diff1 value: 13.098526703499063 - type: nauc_mrr_at_10_max value: 56.37362177417431 - type: nauc_mrr_at_10_std value: 73.2456769749587 - type: nauc_mrr_at_1_diff1 value: 12.099350148694809 - type: nauc_mrr_at_1_max value: 53.75041304108387 - type: nauc_mrr_at_1_std value: 68.84018063663402 - type: nauc_mrr_at_20_diff1 value: 13.098526703499063 - type: nauc_mrr_at_20_max value: 56.37362177417431 - type: nauc_mrr_at_20_std value: 73.2456769749587 - type: nauc_mrr_at_3_diff1 value: 12.173557857011161 - type: nauc_mrr_at_3_max value: 57.540780562363395 - type: nauc_mrr_at_3_std value: 75.42098189580211 - type: nauc_mrr_at_5_diff1 value: 13.098526703499063 - type: nauc_mrr_at_5_max value: 56.37362177417431 - type: nauc_mrr_at_5_std value: 73.2456769749587 - type: nauc_ndcg_at_1000_diff1 value: -8.951471847310401 - type: nauc_ndcg_at_1000_max value: 43.86942237288822 - type: nauc_ndcg_at_1000_std value: 74.61077735148591 - type: 
nauc_ndcg_at_100_diff1 value: -17.754559361083817 - type: nauc_ndcg_at_100_max value: 53.97187119773482 - type: nauc_ndcg_at_100_std value: 80.7944136146514 - type: nauc_ndcg_at_10_diff1 value: -26.637734697836414 - type: nauc_ndcg_at_10_max value: 47.70102699133149 - type: nauc_ndcg_at_10_std value: 70.26909560828646 - type: nauc_ndcg_at_1_diff1 value: -1.2250530785563207 - type: nauc_ndcg_at_1_max value: 46.60509554140131 - type: nauc_ndcg_at_1_std value: 62.63906581740976 - type: nauc_ndcg_at_20_diff1 value: -22.44286466550908 - type: nauc_ndcg_at_20_max value: 55.40492058090103 - type: nauc_ndcg_at_20_std value: 72.11813912145738 - type: nauc_ndcg_at_3_diff1 value: -14.8152721896563 - type: nauc_ndcg_at_3_max value: 38.952259383027595 - type: nauc_ndcg_at_3_std value: 59.819750166537766 - type: nauc_ndcg_at_5_diff1 value: -19.150105688904375 - type: nauc_ndcg_at_5_max value: 42.311180547775315 - type: nauc_ndcg_at_5_std value: 66.6632229321094 - type: nauc_precision_at_1000_diff1 value: -11.555591477978941 - type: nauc_precision_at_1000_max value: 43.7311644834851 - type: nauc_precision_at_1000_std value: 52.10644767999648 - type: nauc_precision_at_100_diff1 value: -16.94803099801117 - type: nauc_precision_at_100_max value: 54.08281631067633 - type: nauc_precision_at_100_std value: 82.77237347891331 - type: nauc_precision_at_10_diff1 value: -27.351332814863355 - type: nauc_precision_at_10_max value: 48.08237549065846 - type: nauc_precision_at_10_std value: 69.37250843534329 - type: nauc_precision_at_1_diff1 value: 12.099350148694809 - type: nauc_precision_at_1_max value: 53.75041304108387 - type: nauc_precision_at_1_std value: 68.84018063663402 - type: nauc_precision_at_20_diff1 value: -18.2422222283388 - type: nauc_precision_at_20_max value: 59.517328129343696 - type: nauc_precision_at_20_std value: 72.05149307342747 - type: nauc_precision_at_3_diff1 value: -10.226547543075897 - type: nauc_precision_at_3_max value: 43.14684818832875 - type: nauc_precision_at_3_std value: 57.31936467418288 - type: nauc_precision_at_5_diff1 value: -14.28521589468673 - type: nauc_precision_at_5_max value: 41.633426753962596 - type: nauc_precision_at_5_std value: 64.94400576804541 - type: nauc_recall_at_1000_diff1 value: -0.9648831207497152 - type: nauc_recall_at_1000_max value: 31.70832946085005 - type: nauc_recall_at_1000_std value: 63.21471613968869 - type: nauc_recall_at_100_diff1 value: -1.360254380933586 - type: nauc_recall_at_100_max value: 25.960597782099605 - type: nauc_recall_at_100_std value: 51.52757589609674 - type: nauc_recall_at_10_diff1 value: -0.3899439424189566 - type: nauc_recall_at_10_max value: 5.094341897886072 - type: nauc_recall_at_10_std value: 11.266045616925698 - type: nauc_recall_at_1_diff1 value: 8.269590874255034 - type: nauc_recall_at_1_max value: 3.482498491294516 - type: nauc_recall_at_1_std value: 8.985226819412189 - type: nauc_recall_at_20_diff1 value: 6.4797098359254175 - type: nauc_recall_at_20_max value: 15.663700985336124 - type: nauc_recall_at_20_std value: 17.154099587904913 - type: nauc_recall_at_3_diff1 value: 3.7245972450393507 - type: nauc_recall_at_3_max value: 0.4063857187240345 - type: nauc_recall_at_3_std value: 6.641948062821941 - type: nauc_recall_at_5_diff1 value: 4.013879477591466 - type: nauc_recall_at_5_max value: -1.4266586618013566 - type: nauc_recall_at_5_std value: 7.311601874411205 - type: ndcg_at_1 value: 75.0 - type: ndcg_at_10 value: 72.18900000000001 - type: ndcg_at_100 value: 54.022999999999996 - type: ndcg_at_1000 value: 49.492000000000004 
- type: ndcg_at_20 value: 68.51 - type: ndcg_at_3 value: 73.184 - type: ndcg_at_5 value: 72.811 - type: precision_at_1 value: 82.0 - type: precision_at_10 value: 77.4 - type: precision_at_100 value: 55.24 - type: precision_at_1000 value: 21.822 - type: precision_at_20 value: 73.0 - type: precision_at_3 value: 79.333 - type: precision_at_5 value: 79.2 - type: recall_at_1 value: 0.214 - type: recall_at_10 value: 1.9980000000000002 - type: recall_at_100 value: 13.328999999999999 - type: recall_at_1000 value: 47.204 - type: recall_at_20 value: 3.7310000000000003 - type: recall_at_3 value: 0.628 - type: recall_at_5 value: 1.049 - task: type: MultilabelClassification dataset: name: MTEB CEDRClassification (default) type: ai-forever/cedr-classification config: default split: test revision: c0ba03d058e3e1b2f3fd20518875a4563dd12db4 metrics: - type: accuracy value: 47.30605738575983 - type: f1 value: 41.26091043925065 - type: lrap value: 72.89452709883206 - type: main_score value: 47.30605738575983 - task: type: Reranking dataset: name: MTEB MIRACLReranking (ru) type: miracl/mmteb-miracl-reranking config: ru split: dev revision: 6d1962c527217f8927fca80f890f14f36b2802af metrics: - type: MAP@1(MIRACL) value: 20.721999999999998 - type: MAP@10(MIRACL) value: 33.900999999999996 - type: MAP@100(MIRACL) value: 36.813 - type: MAP@1000(MIRACL) value: 36.813 - type: MAP@20(MIRACL) value: 35.684 - type: MAP@3(MIRACL) value: 28.141 - type: MAP@5(MIRACL) value: 31.075000000000003 - type: NDCG@1(MIRACL) value: 32.799 - type: NDCG@10(MIRACL) value: 42.065000000000005 - type: NDCG@100(MIRACL) value: 49.730999999999995 - type: NDCG@1000(MIRACL) value: 49.730999999999995 - type: NDCG@20(MIRACL) value: 46.0 - type: NDCG@3(MIRACL) value: 34.481 - type: NDCG@5(MIRACL) value: 37.452999999999996 - type: P@1(MIRACL) value: 32.799 - type: P@10(MIRACL) value: 11.668000000000001 - type: P@100(MIRACL) value: 1.9529999999999998 - type: P@1000(MIRACL) value: 0.19499999999999998 - type: P@20(MIRACL) value: 7.51 - type: P@3(MIRACL) value: 20.823 - type: P@5(MIRACL) value: 16.728 - type: Recall@1(MIRACL) value: 20.721999999999998 - type: Recall@10(MIRACL) value: 54.762 - type: Recall@100(MIRACL) value: 79.952 - type: Recall@1000(MIRACL) value: 79.952 - type: Recall@20(MIRACL) value: 66.26100000000001 - type: Recall@3(MIRACL) value: 34.410000000000004 - type: Recall@5(MIRACL) value: 42.659000000000006 - type: main_score value: 42.065000000000005 - type: nAUC_MAP@1000_diff1(MIRACL) value: 14.33534992502818 - type: nAUC_MAP@1000_max(MIRACL) value: 12.367998764646115 - type: nAUC_MAP@1000_std(MIRACL) value: 4.569686002935006 - type: nAUC_MAP@100_diff1(MIRACL) value: 14.33534992502818 - type: nAUC_MAP@100_max(MIRACL) value: 12.367998764646115 - type: nAUC_MAP@100_std(MIRACL) value: 4.569686002935006 - type: nAUC_MAP@10_diff1(MIRACL) value: 16.920323975680027 - type: nAUC_MAP@10_max(MIRACL) value: 9.327171297204082 - type: nAUC_MAP@10_std(MIRACL) value: 3.2039133783079015 - type: nAUC_MAP@1_diff1(MIRACL) value: 28.698973487482206 - type: nAUC_MAP@1_max(MIRACL) value: 2.9217687660885034 - type: nAUC_MAP@1_std(MIRACL) value: -1.1247408800976524 - type: nAUC_MAP@20_diff1(MIRACL) value: 15.359083081640476 - type: nAUC_MAP@20_max(MIRACL) value: 11.310494233946345 - type: nAUC_MAP@20_std(MIRACL) value: 4.4171898386022885 - type: nAUC_MAP@3_diff1(MIRACL) value: 22.27430591851617 - type: nAUC_MAP@3_max(MIRACL) value: 6.407438291284658 - type: nAUC_MAP@3_std(MIRACL) value: 0.9799184530397409 - type: nAUC_MAP@5_diff1(MIRACL) value: 
19.20571689941054 - type: nAUC_MAP@5_max(MIRACL) value: 7.987468654026893 - type: nAUC_MAP@5_std(MIRACL) value: 1.8324246565938962 - type: nAUC_NDCG@1000_diff1(MIRACL) value: 3.7537669018914768 - type: nAUC_NDCG@1000_max(MIRACL) value: 20.7944707840533 - type: nAUC_NDCG@1000_std(MIRACL) value: 8.444837055303063 - type: nAUC_NDCG@100_diff1(MIRACL) value: 3.7537669018914768 - type: nAUC_NDCG@100_max(MIRACL) value: 20.7944707840533 - type: nAUC_NDCG@100_std(MIRACL) value: 8.444837055303063 - type: nAUC_NDCG@10_diff1(MIRACL) value: 10.829575656103888 - type: nAUC_NDCG@10_max(MIRACL) value: 13.0445496498929 - type: nAUC_NDCG@10_std(MIRACL) value: 6.050412212625362 - type: nAUC_NDCG@1_diff1(MIRACL) value: 19.1388712233292 - type: nAUC_NDCG@1_max(MIRACL) value: 10.871900994781642 - type: nAUC_NDCG@1_std(MIRACL) value: 3.218568248751811 - type: nAUC_NDCG@20_diff1(MIRACL) value: 7.093172181746442 - type: nAUC_NDCG@20_max(MIRACL) value: 16.955238078958836 - type: nAUC_NDCG@20_std(MIRACL) value: 8.325656379573035 - type: nAUC_NDCG@3_diff1(MIRACL) value: 17.134437303330802 - type: nAUC_NDCG@3_max(MIRACL) value: 10.235328822955793 - type: nAUC_NDCG@3_std(MIRACL) value: 3.2341358691084814 - type: nAUC_NDCG@5_diff1(MIRACL) value: 14.733664618337636 - type: nAUC_NDCG@5_max(MIRACL) value: 11.181897412035282 - type: nAUC_NDCG@5_std(MIRACL) value: 3.642277088791985 - type: nAUC_P@1000_diff1(MIRACL) value: -26.330038284867573 - type: nAUC_P@1000_max(MIRACL) value: 28.450694137240458 - type: nAUC_P@1000_std(MIRACL) value: 9.892993775474912 - type: nAUC_P@100_diff1(MIRACL) value: -26.330038284867552 - type: nAUC_P@100_max(MIRACL) value: 28.45069413724051 - type: nAUC_P@100_std(MIRACL) value: 9.892993775474928 - type: nAUC_P@10_diff1(MIRACL) value: -17.436937353231112 - type: nAUC_P@10_max(MIRACL) value: 24.327018012947857 - type: nAUC_P@10_std(MIRACL) value: 11.78803527706634 - type: nAUC_P@1_diff1(MIRACL) value: 19.1388712233292 - type: nAUC_P@1_max(MIRACL) value: 10.871900994781642 - type: nAUC_P@1_std(MIRACL) value: 3.218568248751811 - type: nAUC_P@20_diff1(MIRACL) value: -22.947528755272426 - type: nAUC_P@20_max(MIRACL) value: 27.773093471902538 - type: nAUC_P@20_std(MIRACL) value: 14.898619107087221 - type: nAUC_P@3_diff1(MIRACL) value: 1.4100426412400944 - type: nAUC_P@3_max(MIRACL) value: 17.397472872058845 - type: nAUC_P@3_std(MIRACL) value: 8.240008229861875 - type: nAUC_P@5_diff1(MIRACL) value: -7.971349332207021 - type: nAUC_P@5_max(MIRACL) value: 22.198441167940963 - type: nAUC_P@5_std(MIRACL) value: 9.00265164460082 - type: nAUC_Recall@1000_diff1(MIRACL) value: -38.69835271863148 - type: nAUC_Recall@1000_max(MIRACL) value: 50.9545152809108 - type: nAUC_Recall@1000_std(MIRACL) value: 20.44270887092116 - type: nAUC_Recall@100_diff1(MIRACL) value: -38.69835271863148 - type: nAUC_Recall@100_max(MIRACL) value: 50.9545152809108 - type: nAUC_Recall@100_std(MIRACL) value: 20.44270887092116 - type: nAUC_Recall@10_diff1(MIRACL) value: -0.08109036309433801 - type: nAUC_Recall@10_max(MIRACL) value: 12.696619907773568 - type: nAUC_Recall@10_std(MIRACL) value: 8.791982704261589 - type: nAUC_Recall@1_diff1(MIRACL) value: 28.698973487482206 - type: nAUC_Recall@1_max(MIRACL) value: 2.9217687660885034 - type: nAUC_Recall@1_std(MIRACL) value: -1.1247408800976524 - type: nAUC_Recall@20_diff1(MIRACL) value: -13.312171017942623 - type: nAUC_Recall@20_max(MIRACL) value: 24.19847346821666 - type: nAUC_Recall@20_std(MIRACL) value: 15.8157702609797 - type: nAUC_Recall@3_diff1(MIRACL) value: 16.909128321353343 - type: 
nAUC_Recall@3_max(MIRACL) value: 6.552122731902991 - type: nAUC_Recall@3_std(MIRACL) value: 1.9963898223457228 - type: nAUC_Recall@5_diff1(MIRACL) value: 9.990292655247721 - type: nAUC_Recall@5_max(MIRACL) value: 9.361722273507574 - type: nAUC_Recall@5_std(MIRACL) value: 3.270918827854495 - task: type: MultilabelClassification dataset: name: MTEB SensitiveTopicsClassification (default) type: ai-forever/sensitive-topics-classification config: default split: test revision: 416b34a802308eac30e4192afc0ff99bb8dcc7f2 metrics: - type: accuracy value: 30.634765625 - type: f1 value: 32.647559808678665 - type: lrap value: 45.94319661458259 - type: main_score value: 30.634765625 - task: type: STS dataset: name: MTEB ATEC (default) type: C-MTEB/ATEC config: default split: test revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865 metrics: - type: cosine_pearson value: 47.541497334563296 - type: cosine_spearman value: 49.06268944206629 - type: euclidean_pearson value: 51.838926748581635 - type: euclidean_spearman value: 48.930697157135356 - type: main_score value: 49.06268944206629 - type: manhattan_pearson value: 51.835306769406365 - type: manhattan_spearman value: 48.86135493444834 - type: pearson value: 47.541497334563296 - type: spearman value: 49.06268944206629 - task: type: Classification dataset: name: MTEB AllegroReviews (default) type: PL-MTEB/allegro-reviews config: default split: test revision: b89853e6de927b0e3bfa8ecc0e56fe4e02ceafc6 metrics: - type: accuracy value: 49.51292246520874 - type: f1 value: 44.14350234332397 - type: f1_weighted value: 51.65508998354552 - type: main_score value: 49.51292246520874 - task: type: Clustering dataset: name: MTEB AlloProfClusteringP2P (default) type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: main_score value: 63.883383458621665 - type: v_measure value: 63.883383458621665 - type: v_measure_std value: 2.693666879958465 - type: main_score value: 46.85924588755251 - type: v_measure value: 46.85924588755251 - type: v_measure_std value: 2.1918258880872377 - task: type: Clustering dataset: name: MTEB 8TagsClustering type: PL-MTEB/8tags-clustering config: default split: test revision: None metrics: - type: v_measure value: 43.65721212452554 - task: type: Reranking dataset: name: MTEB AlloprofReranking (default) type: lyon-nlp/mteb-fr-reranking-alloprof-s2p config: default split: test revision: e40c8a63ce02da43200eccb5b0846fcaa888f562 metrics: - type: map value: 66.39013753839347 - type: mrr value: 67.68045617786551 - type: main_score value: 66.39013753839347 - task: type: Retrieval dataset: name: MTEB AlloprofRetrieval (default) type: lyon-nlp/alloprof config: default split: test revision: fcf295ea64c750f41fadbaa37b9b861558e1bfbd metrics: - type: main_score value: 54.284 - type: map_at_1 value: 37.047000000000004 - type: map_at_10 value: 48.53 - type: map_at_100 value: 49.357 - type: map_at_1000 value: 49.39 - type: map_at_20 value: 49.064 - type: map_at_3 value: 45.675 - type: map_at_5 value: 47.441 - type: mrr_at_1 value: 37.04663212435233 - type: mrr_at_10 value: 48.5300326232969 - type: mrr_at_100 value: 49.35708199037581 - type: mrr_at_1000 value: 49.39005824603193 - type: mrr_at_20 value: 49.06417416464799 - type: mrr_at_3 value: 45.67501439263105 - type: mrr_at_5 value: 47.44099021301103 - type: nauc_map_at_1000_diff1 value: 43.32474221868009 - type: nauc_map_at_1000_max value: 39.407334029058575 - type: nauc_map_at_1000_std value: -2.3728154448932606 - type: nauc_map_at_100_diff1 
value: 43.32336300929909 - type: nauc_map_at_100_max value: 39.432174777554835 - type: nauc_map_at_100_std value: -2.356396922384349 - type: nauc_map_at_10_diff1 value: 43.1606520154482 - type: nauc_map_at_10_max value: 39.33734650558226 - type: nauc_map_at_10_std value: -2.5156222475075256 - type: nauc_map_at_1_diff1 value: 46.2178975214499 - type: nauc_map_at_1_max value: 36.26173199049361 - type: nauc_map_at_1_std value: -3.0897555582816443 - type: nauc_map_at_20_diff1 value: 43.272980702916456 - type: nauc_map_at_20_max value: 39.4896977052276 - type: nauc_map_at_20_std value: -2.3305501742917043 - type: nauc_map_at_3_diff1 value: 43.49525042967079 - type: nauc_map_at_3_max value: 38.66352501824728 - type: nauc_map_at_3_std value: -3.202794391620473 - type: nauc_map_at_5_diff1 value: 43.2266692546611 - type: nauc_map_at_5_max value: 38.77368661115743 - type: nauc_map_at_5_std value: -3.0897532130127954 - type: nauc_mrr_at_1000_diff1 value: 43.32474221868009 - type: nauc_mrr_at_1000_max value: 39.407334029058575 - type: nauc_mrr_at_1000_std value: -2.3728154448932606 - type: nauc_mrr_at_100_diff1 value: 43.32336300929909 - type: nauc_mrr_at_100_max value: 39.432174777554835 - type: nauc_mrr_at_100_std value: -2.356396922384349 - type: nauc_mrr_at_10_diff1 value: 43.1606520154482 - type: nauc_mrr_at_10_max value: 39.33734650558226 - type: nauc_mrr_at_10_std value: -2.5156222475075256 - type: nauc_mrr_at_1_diff1 value: 46.2178975214499 - type: nauc_mrr_at_1_max value: 36.26173199049361 - type: nauc_mrr_at_1_std value: -3.0897555582816443 - type: nauc_mrr_at_20_diff1 value: 43.272980702916456 - type: nauc_mrr_at_20_max value: 39.4896977052276 - type: nauc_mrr_at_20_std value: -2.3305501742917043 - type: nauc_mrr_at_3_diff1 value: 43.49525042967079 - type: nauc_mrr_at_3_max value: 38.66352501824728 - type: nauc_mrr_at_3_std value: -3.202794391620473 - type: nauc_mrr_at_5_diff1 value: 43.2266692546611 - type: nauc_mrr_at_5_max value: 38.77368661115743 - type: nauc_mrr_at_5_std value: -3.0897532130127954 - type: nauc_ndcg_at_1000_diff1 value: 43.01903168202974 - type: nauc_ndcg_at_1000_max value: 40.75496622942232 - type: nauc_ndcg_at_1000_std value: -1.3150412981845496 - type: nauc_ndcg_at_100_diff1 value: 42.98016493758145 - type: nauc_ndcg_at_100_max value: 41.55869635162325 - type: nauc_ndcg_at_100_std value: -0.5355252976886055 - type: nauc_ndcg_at_10_diff1 value: 42.218755211347506 - type: nauc_ndcg_at_10_max value: 41.305042275175765 - type: nauc_ndcg_at_10_std value: -1.4034484444573714 - type: nauc_ndcg_at_1_diff1 value: 46.2178975214499 - type: nauc_ndcg_at_1_max value: 36.26173199049361 - type: nauc_ndcg_at_1_std value: -3.0897555582816443 - type: nauc_ndcg_at_20_diff1 value: 42.66574440095576 - type: nauc_ndcg_at_20_max value: 42.014620115124515 - type: nauc_ndcg_at_20_std value: -0.5176162553751498 - type: nauc_ndcg_at_3_diff1 value: 42.837450505106055 - type: nauc_ndcg_at_3_max value: 39.525369733082414 - type: nauc_ndcg_at_3_std value: -3.1605948245795155 - type: nauc_ndcg_at_5_diff1 value: 42.37951815451173 - type: nauc_ndcg_at_5_max value: 39.78840132935179 - type: nauc_ndcg_at_5_std value: -2.936898430768135 - type: nauc_precision_at_1000_diff1 value: 49.69224988612385 - type: nauc_precision_at_1000_max value: 79.57897547128005 - type: nauc_precision_at_1000_std value: 45.040371354764645 - type: nauc_precision_at_100_diff1 value: 42.70597486048422 - type: nauc_precision_at_100_max value: 65.74628759606188 - type: nauc_precision_at_100_std value: 25.49157745244855 - type: 
nauc_precision_at_10_diff1 value: 38.565609931689345 - type: nauc_precision_at_10_max value: 50.0239696180852 - type: nauc_precision_at_10_std value: 3.976354829503967 - type: nauc_precision_at_1_diff1 value: 46.2178975214499 - type: nauc_precision_at_1_max value: 36.26173199049361 - type: nauc_precision_at_1_std value: -3.0897555582816443 - type: nauc_precision_at_20_diff1 value: 40.4134718566864 - type: nauc_precision_at_20_max value: 57.121778108665374 - type: nauc_precision_at_20_std value: 11.46021975428544 - type: nauc_precision_at_3_diff1 value: 40.90538379461529 - type: nauc_precision_at_3_max value: 42.18393248057992 - type: nauc_precision_at_3_std value: -3.005249943837297 - type: nauc_precision_at_5_diff1 value: 39.60162965860782 - type: nauc_precision_at_5_max value: 43.28317158174058 - type: nauc_precision_at_5_std value: -2.3469094487738054 - type: nauc_recall_at_1000_diff1 value: 49.69224988612252 - type: nauc_recall_at_1000_max value: 79.57897547127862 - type: nauc_recall_at_1000_std value: 45.04037135476256 - type: nauc_recall_at_100_diff1 value: 42.70597486048432 - type: nauc_recall_at_100_max value: 65.74628759606213 - type: nauc_recall_at_100_std value: 25.491577452448727 - type: nauc_recall_at_10_diff1 value: 38.56560993168935 - type: nauc_recall_at_10_max value: 50.02396961808522 - type: nauc_recall_at_10_std value: 3.9763548295040314 - type: nauc_recall_at_1_diff1 value: 46.2178975214499 - type: nauc_recall_at_1_max value: 36.26173199049361 - type: nauc_recall_at_1_std value: -3.0897555582816443 - type: nauc_recall_at_20_diff1 value: 40.41347185668637 - type: nauc_recall_at_20_max value: 57.12177810866533 - type: nauc_recall_at_20_std value: 11.460219754285431 - type: nauc_recall_at_3_diff1 value: 40.90538379461527 - type: nauc_recall_at_3_max value: 42.18393248057989 - type: nauc_recall_at_3_std value: -3.005249943837297 - type: nauc_recall_at_5_diff1 value: 39.601629658607784 - type: nauc_recall_at_5_max value: 43.28317158174053 - type: nauc_recall_at_5_std value: -2.3469094487738054 - type: ndcg_at_1 value: 37.047000000000004 - type: ndcg_at_10 value: 54.284 - type: ndcg_at_100 value: 58.34 - type: ndcg_at_1000 value: 59.303 - type: ndcg_at_20 value: 56.235 - type: ndcg_at_3 value: 48.503 - type: ndcg_at_5 value: 51.686 - type: precision_at_1 value: 37.047000000000004 - type: precision_at_10 value: 7.237 - type: precision_at_100 value: 0.914 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.005 - type: precision_at_3 value: 18.898 - type: precision_at_5 value: 12.884 - type: recall_at_1 value: 37.047000000000004 - type: recall_at_10 value: 72.366 - type: recall_at_100 value: 91.408 - type: recall_at_1000 value: 99.136 - type: recall_at_20 value: 80.095 - type: recall_at_3 value: 56.693000000000005 - type: recall_at_5 value: 64.42099999999999 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 89.49253731343283 - type: ap value: 61.88098616359918 - type: ap_weighted value: 61.88098616359918 - type: f1 value: 84.76516623679144 - type: f1_weighted value: 89.92745276292968 - type: main_score value: 89.49253731343283 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (de) type: mteb/amazon_counterfactual config: de split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 89.61456102783727 - type: ap 
value: 93.11816566733742 - type: ap_weighted value: 93.11816566733742 - type: f1 value: 88.27635757733722 - type: f1_weighted value: 89.82581568285453 - type: main_score value: 89.61456102783727 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification (default) type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 95.3825 - type: ap value: 93.393033869502 - type: ap_weighted value: 93.393033869502 - type: f1 value: 95.38109007966307 - type: f1_weighted value: 95.38109007966305 - type: main_score value: 95.3825 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 49.768 - type: f1 value: 48.95084821944411 - type: f1_weighted value: 48.9508482194441 - type: main_score value: 49.768 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (de) type: mteb/amazon_reviews_multi config: de split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 48.071999999999996 - type: f1 value: 47.24171107487612 - type: f1_weighted value: 47.24171107487612 - type: main_score value: 48.071999999999996 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (es) type: mteb/amazon_reviews_multi config: es split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 48.102000000000004 - type: f1 value: 47.27193805278696 - type: f1_weighted value: 47.27193805278696 - type: main_score value: 48.102000000000004 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.30800000000001 - type: f1 value: 46.41683358017851 - type: f1_weighted value: 46.41683358017851 - type: main_score value: 47.30800000000001 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 44.944 - type: f1 value: 44.223824487744395 - type: f1_weighted value: 44.22382448774439 - type: main_score value: 44.944 - task: type: Retrieval dataset: name: MTEB ArguAna (default) type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 29.232000000000003 - type: map_at_10 value: 45.117000000000004 - type: map_at_100 value: 45.977000000000004 - type: map_at_1000 value: 45.98 - type: map_at_20 value: 45.815 - type: map_at_3 value: 39.912 - type: map_at_5 value: 42.693 - type: mrr_at_1 value: 29.659000000000002 - type: mrr_at_10 value: 45.253 - type: mrr_at_100 value: 46.125 - type: mrr_at_1000 value: 46.129 - type: mrr_at_20 value: 45.964 - type: mrr_at_3 value: 40.043 - type: mrr_at_5 value: 42.870000000000005 - type: ndcg_at_1 value: 29.232000000000003 - type: ndcg_at_10 value: 54.327999999999996 - type: ndcg_at_100 value: 57.86 - type: ndcg_at_1000 value: 57.935 - type: ndcg_at_20 value: 56.794 - type: ndcg_at_3 value: 43.516 - type: ndcg_at_5 value: 48.512 - type: precision_at_1 value: 29.232000000000003 - type: precision_at_10 value: 8.393 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.676 - type: precision_at_3 
value: 17.994 - type: precision_at_5 value: 13.215 - type: recall_at_1 value: 29.232000000000003 - type: recall_at_10 value: 83.926 - type: recall_at_100 value: 99.075 - type: recall_at_1000 value: 99.644 - type: recall_at_20 value: 93.528 - type: recall_at_3 value: 53.983000000000004 - type: recall_at_5 value: 66.074 - type: main_score value: 54.327999999999996 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P (default) type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: main_score value: 46.6636824632419 - type: v_measure value: 46.6636824632419 - type: v_measure_std value: 13.817129140714963 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S (default) type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: main_score value: 39.271141892800024 - type: v_measure value: 39.271141892800024 - type: v_measure_std value: 14.276782483454827 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions (default) type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 65.04363277324629 - type: mrr value: 78.2372598162072 - type: main_score value: 65.04363277324629 - task: type: Reranking dataset: name: MTEB MindSmallReranking (default) type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 30.83 - type: main_score value: 30.83 - task: type: STS dataset: name: MTEB BIOSSES (default) type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cosine_pearson value: 88.80382082011027 - type: cosine_spearman value: 88.68876782169106 - type: euclidean_pearson value: 87.00802890147176 - type: euclidean_spearman value: 87.43211268192712 - type: main_score value: 88.68876782169106 - type: manhattan_pearson value: 87.14062537179474 - type: manhattan_spearman value: 87.59115245033443 - type: pearson value: 88.80382082011027 - type: spearman value: 88.68876782169106 - task: type: STS dataset: name: MTEB BQ (default) type: C-MTEB/BQ config: default split: test revision: e3dda5e115e487b39ec7e618c0c6a29137052a55 metrics: - type: cosine_pearson value: 61.588006604878196 - type: cosine_spearman value: 63.20615427154465 - type: euclidean_pearson value: 61.818547092516496 - type: euclidean_spearman value: 63.21558009151778 - type: main_score value: 63.20615427154465 - type: manhattan_pearson value: 61.665588158487616 - type: manhattan_spearman value: 63.051544488238584 - type: pearson value: 61.588006604878196 - type: spearman value: 63.20615427154465 - task: type: Retrieval dataset: name: MTEB BSARDRetrieval (default) type: maastrichtlawtech/bsard config: default split: test revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59 metrics: - type: main_score value: 64.414 - type: map_at_1 value: 14.865 - type: map_at_10 value: 21.605 - type: map_at_100 value: 22.762 - type: map_at_1000 value: 22.854 - type: map_at_20 value: 22.259999999999998 - type: map_at_3 value: 20.119999999999997 - type: map_at_5 value: 20.931 - type: mrr_at_1 value: 14.864864864864865 - type: mrr_at_10 value: 21.605176605176606 - type: mrr_at_100 value: 22.7622306460065 - type: mrr_at_1000 value: 22.85383406410312 - type: mrr_at_20 value: 22.259528463088845 - type: mrr_at_3 value: 20.12012012012012 - type: mrr_at_5 value: 20.930930930930934 - type: 
nauc_map_at_1000_diff1 value: 17.486265968689338 - type: nauc_map_at_1000_max value: 22.736799291688836 - type: nauc_map_at_1000_std value: 9.831687441977147 - type: nauc_map_at_100_diff1 value: 17.50754492049086 - type: nauc_map_at_100_max value: 22.77693662806787 - type: nauc_map_at_100_std value: 9.853899509675395 - type: nauc_map_at_10_diff1 value: 17.42133968580952 - type: nauc_map_at_10_max value: 22.45861793882279 - type: nauc_map_at_10_std value: 8.964888472915938 - type: nauc_map_at_1_diff1 value: 19.433947086968093 - type: nauc_map_at_1_max value: 24.75657047550517 - type: nauc_map_at_1_std value: 15.122329157218505 - type: nauc_map_at_20_diff1 value: 17.429856756008785 - type: nauc_map_at_20_max value: 22.438850987431017 - type: nauc_map_at_20_std value: 9.172746012213558 - type: nauc_map_at_3_diff1 value: 18.218182689678475 - type: nauc_map_at_3_max value: 23.57169444088667 - type: nauc_map_at_3_std value: 10.464473559366356 - type: nauc_map_at_5_diff1 value: 18.6075342519133 - type: nauc_map_at_5_max value: 23.308845973576673 - type: nauc_map_at_5_std value: 9.364009996445652 - type: nauc_mrr_at_1000_diff1 value: 17.486265968689338 - type: nauc_mrr_at_1000_max value: 22.736799291688836 - type: nauc_mrr_at_1000_std value: 9.831687441977147 - type: nauc_mrr_at_100_diff1 value: 17.50754492049086 - type: nauc_mrr_at_100_max value: 22.77693662806787 - type: nauc_mrr_at_100_std value: 9.853899509675395 - type: nauc_mrr_at_10_diff1 value: 17.42133968580952 - type: nauc_mrr_at_10_max value: 22.45861793882279 - type: nauc_mrr_at_10_std value: 8.964888472915938 - type: nauc_mrr_at_1_diff1 value: 19.433947086968093 - type: nauc_mrr_at_1_max value: 24.75657047550517 - type: nauc_mrr_at_1_std value: 15.122329157218505 - type: nauc_mrr_at_20_diff1 value: 17.429856756008785 - type: nauc_mrr_at_20_max value: 22.438850987431017 - type: nauc_mrr_at_20_std value: 9.172746012213558 - type: nauc_mrr_at_3_diff1 value: 18.218182689678475 - type: nauc_mrr_at_3_max value: 23.57169444088667 - type: nauc_mrr_at_3_std value: 10.464473559366356 - type: nauc_mrr_at_5_diff1 value: 18.6075342519133 - type: nauc_mrr_at_5_max value: 23.308845973576673 - type: nauc_mrr_at_5_std value: 9.364009996445652 - type: nauc_ndcg_at_1000_diff1 value: 16.327871824135745 - type: nauc_ndcg_at_1000_max value: 23.308241052911495 - type: nauc_ndcg_at_1000_std value: 11.50905911184097 - type: nauc_ndcg_at_100_diff1 value: 16.676226744692773 - type: nauc_ndcg_at_100_max value: 24.323253721240974 - type: nauc_ndcg_at_100_std value: 11.952612443651557 - type: nauc_ndcg_at_10_diff1 value: 16.030325121764594 - type: nauc_ndcg_at_10_max value: 21.306799242079542 - type: nauc_ndcg_at_10_std value: 6.63359364302513 - type: nauc_ndcg_at_1_diff1 value: 19.433947086968093 - type: nauc_ndcg_at_1_max value: 24.75657047550517 - type: nauc_ndcg_at_1_std value: 15.122329157218505 - type: nauc_ndcg_at_20_diff1 value: 16.013173605999857 - type: nauc_ndcg_at_20_max value: 21.607217260736576 - type: nauc_ndcg_at_20_std value: 7.319482417138996 - type: nauc_ndcg_at_3_diff1 value: 17.97958548328493 - type: nauc_ndcg_at_3_max value: 23.58346522810145 - type: nauc_ndcg_at_3_std value: 9.392582854708314 - type: nauc_ndcg_at_5_diff1 value: 18.734733324685287 - type: nauc_ndcg_at_5_max value: 23.273244317623742 - type: nauc_ndcg_at_5_std value: 7.638611545253834 - type: nauc_precision_at_1000_diff1 value: 7.919843339380295 - type: nauc_precision_at_1000_max value: 31.575386234270486 - type: nauc_precision_at_1000_std value: 39.332224386769404 - type: 
nauc_precision_at_100_diff1 value: 15.018050960000052 - type: nauc_precision_at_100_max value: 34.98209513759861 - type: nauc_precision_at_100_std value: 26.970034484359022 - type: nauc_precision_at_10_diff1 value: 12.102191084210922 - type: nauc_precision_at_10_max value: 18.112541150340675 - type: nauc_precision_at_10_std value: 0.7358784689406018 - type: nauc_precision_at_1_diff1 value: 19.433947086968093 - type: nauc_precision_at_1_max value: 24.75657047550517 - type: nauc_precision_at_1_std value: 15.122329157218505 - type: nauc_precision_at_20_diff1 value: 12.018814361204328 - type: nauc_precision_at_20_max value: 19.75123746049928 - type: nauc_precision_at_20_std value: 3.012204650582264 - type: nauc_precision_at_3_diff1 value: 17.41375604940955 - type: nauc_precision_at_3_max value: 23.699834627021037 - type: nauc_precision_at_3_std value: 6.793486779050103 - type: nauc_precision_at_5_diff1 value: 19.194631963780257 - type: nauc_precision_at_5_max value: 23.31708702442155 - type: nauc_precision_at_5_std value: 3.4591358279667332 - type: nauc_recall_at_1000_diff1 value: 7.919843339380378 - type: nauc_recall_at_1000_max value: 31.57538623427063 - type: nauc_recall_at_1000_std value: 39.332224386769546 - type: nauc_recall_at_100_diff1 value: 15.018050960000085 - type: nauc_recall_at_100_max value: 34.9820951375986 - type: nauc_recall_at_100_std value: 26.97003448435901 - type: nauc_recall_at_10_diff1 value: 12.102191084210837 - type: nauc_recall_at_10_max value: 18.112541150340594 - type: nauc_recall_at_10_std value: 0.7358784689405188 - type: nauc_recall_at_1_diff1 value: 19.433947086968093 - type: nauc_recall_at_1_max value: 24.75657047550517 - type: nauc_recall_at_1_std value: 15.122329157218505 - type: nauc_recall_at_20_diff1 value: 12.01881436120429 - type: nauc_recall_at_20_max value: 19.751237460499222 - type: nauc_recall_at_20_std value: 3.0122046505822135 - type: nauc_recall_at_3_diff1 value: 17.413756049409503 - type: nauc_recall_at_3_max value: 23.699834627020998 - type: nauc_recall_at_3_std value: 6.793486779050083 - type: nauc_recall_at_5_diff1 value: 19.194631963780203 - type: nauc_recall_at_5_max value: 23.3170870244215 - type: nauc_recall_at_5_std value: 3.459135827966664 - type: ndcg_at_1 value: 14.865 - type: ndcg_at_10 value: 24.764 - type: ndcg_at_100 value: 30.861 - type: ndcg_at_1000 value: 33.628 - type: ndcg_at_20 value: 27.078000000000003 - type: ndcg_at_3 value: 21.675 - type: ndcg_at_5 value: 23.148 - type: precision_at_1 value: 14.865 - type: precision_at_10 value: 3.4680000000000004 - type: precision_at_100 value: 0.644 - type: precision_at_1000 value: 0.087 - type: precision_at_20 value: 2.185 - type: precision_at_3 value: 8.709 - type: precision_at_5 value: 5.946 - type: recall_at_1 value: 14.865 - type: recall_at_10 value: 34.685 - type: recall_at_100 value: 64.414 - type: recall_at_1000 value: 86.937 - type: recall_at_20 value: 43.694 - type: recall_at_3 value: 26.125999999999998 - type: recall_at_5 value: 29.73 - task: type: Classification dataset: name: MTEB Banking77Classification (default) type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.08116883116882 - type: f1 value: 84.05587055990273 - type: f1_weighted value: 84.05587055990274 - type: main_score value: 84.08116883116882 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P (default) type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 
metrics: - type: main_score value: 38.1941007822277 - type: v_measure value: 38.1941007822277 - type: v_measure_std value: 0.7502113547288178 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S (default) type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: main_score value: 34.42075599178318 - type: v_measure value: 34.42075599178318 - type: v_measure_std value: 0.600256720497283 - task: type: Clustering dataset: name: MTEB BlurbsClusteringP2P (default) type: slvnwhrl/blurbs-clustering-p2p config: default split: test revision: a2dd5b02a77de3466a3eaa98ae586b5610314496 metrics: - type: main_score value: 41.634627363047265 - type: v_measure value: 41.634627363047265 - type: v_measure_std value: 9.726923191225307 - task: type: Clustering dataset: name: MTEB BlurbsClusteringS2S (default) type: slvnwhrl/blurbs-clustering-s2s config: default split: test revision: 22793b6a6465bf00120ad525e38c51210858132c metrics: - type: main_score value: 20.996468295584197 - type: v_measure value: 20.996468295584197 - type: v_measure_std value: 9.225766688272197 - task: type: Classification dataset: name: MTEB CBD (default) type: PL-MTEB/cbd config: default split: test revision: 36ddb419bcffe6a5374c3891957912892916f28d metrics: - type: accuracy value: 69.99 - type: ap value: 22.57826353116948 - type: ap_weighted value: 22.57826353116948 - type: f1 value: 59.04574955548393 - type: f1_weighted value: 74.36235022309789 - type: main_score value: 69.99 - task: type: PairClassification dataset: name: MTEB CDSC-E (default) type: PL-MTEB/cdsce-pairclassification config: default split: test revision: 0a3d4aa409b22f80eb22cbf59b492637637b536d metrics: - type: cosine_accuracy value: 88.7 - type: cosine_accuracy_threshold value: 97.37848043441772 - type: cosine_ap value: 73.0405088928302 - type: cosine_f1 value: 63.52201257861635 - type: cosine_f1_threshold value: 96.98888063430786 - type: cosine_precision value: 78.90625 - type: cosine_recall value: 53.1578947368421 - type: dot_accuracy value: 84.89999999999999 - type: dot_accuracy_threshold value: 43603.09753417969 - type: dot_ap value: 56.98157569085279 - type: dot_f1 value: 57.606490872210955 - type: dot_f1_threshold value: 40406.23779296875 - type: dot_precision value: 46.864686468646866 - type: dot_recall value: 74.73684210526315 - type: euclidean_accuracy value: 88.5 - type: euclidean_accuracy_threshold value: 498.0483055114746 - type: euclidean_ap value: 72.97328234816734 - type: euclidean_f1 value: 63.722397476340696 - type: euclidean_f1_threshold value: 508.6186408996582 - type: euclidean_precision value: 79.52755905511812 - type: euclidean_recall value: 53.1578947368421 - type: main_score value: 73.0405088928302 - type: manhattan_accuracy value: 88.6 - type: manhattan_accuracy_threshold value: 12233.079528808594 - type: manhattan_ap value: 72.92148503992615 - type: manhattan_f1 value: 63.69426751592356 - type: manhattan_f1_threshold value: 12392.754364013672 - type: manhattan_precision value: 80.64516129032258 - type: manhattan_recall value: 52.63157894736842 - type: max_accuracy value: 88.7 - type: max_ap value: 73.0405088928302 - type: max_f1 value: 63.722397476340696 - type: max_precision value: 80.64516129032258 - type: max_recall value: 74.73684210526315 - type: similarity_accuracy value: 88.7 - type: similarity_accuracy_threshold value: 97.37848043441772 - type: similarity_ap value: 73.0405088928302 - type: similarity_f1 value: 63.52201257861635 - type: 
similarity_f1_threshold value: 96.98888063430786 - type: similarity_precision value: 78.90625 - type: similarity_recall value: 53.1578947368421 - task: type: STS dataset: name: MTEB CDSC-R (default) type: PL-MTEB/cdscr-sts config: default split: test revision: 1cd6abbb00df7d14be3dbd76a7dcc64b3a79a7cd metrics: - type: cosine_pearson value: 92.97492495289738 - type: cosine_spearman value: 92.63248098608472 - type: euclidean_pearson value: 92.04712487782031 - type: euclidean_spearman value: 92.19679486755008 - type: main_score value: 92.63248098608472 - type: manhattan_pearson value: 92.0101187740438 - type: manhattan_spearman value: 92.20926859332754 - type: pearson value: 92.97492495289738 - type: spearman value: 92.63248098608472 - task: type: Clustering dataset: name: MTEB CLSClusteringP2P (default) type: C-MTEB/CLSClusteringP2P config: default split: test revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476 metrics: - type: main_score value: 39.96377851800628 - type: v_measure value: 39.96377851800628 - type: v_measure_std value: 0.9793033243093288 - task: type: Clustering dataset: name: MTEB CLSClusteringS2S (default) type: C-MTEB/CLSClusteringS2S config: default split: test revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f metrics: - type: main_score value: 38.788850224595784 - type: v_measure value: 38.788850224595784 - type: v_measure_std value: 1.0712604145916924 - task: type: Reranking dataset: name: MTEB CMedQAv1 type: C-MTEB/CMedQAv1-reranking config: default split: test revision: 8d7f1e942507dac42dc58017c1a001c3717da7df metrics: - type: map value: 77.95952507806115 - type: mrr value: 80.8643253968254 - type: main_score value: 77.95952507806115 - task: type: Reranking dataset: name: MTEB CMedQAv2 type: C-MTEB/CMedQAv2-reranking config: default split: test revision: 23d186750531a14a0357ca22cd92d712fd512ea0 metrics: - type: map value: 78.21522500165045 - type: mrr value: 81.28194444444443 - type: main_score value: 78.21522500165045 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval (default) type: mteb/cqadupstack-android config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 33.377 - type: map_at_10 value: 46.371 - type: map_at_100 value: 47.829 - type: map_at_1000 value: 47.94 - type: map_at_20 value: 47.205000000000005 - type: map_at_3 value: 42.782 - type: map_at_5 value: 44.86 - type: mrr_at_1 value: 41.345 - type: mrr_at_10 value: 52.187 - type: mrr_at_100 value: 52.893 - type: mrr_at_1000 value: 52.929 - type: mrr_at_20 value: 52.637 - type: mrr_at_3 value: 49.714000000000006 - type: mrr_at_5 value: 51.373000000000005 - type: ndcg_at_1 value: 41.345 - type: ndcg_at_10 value: 52.946000000000005 - type: ndcg_at_100 value: 57.92699999999999 - type: ndcg_at_1000 value: 59.609 - type: ndcg_at_20 value: 54.900999999999996 - type: ndcg_at_3 value: 48.357 - type: ndcg_at_5 value: 50.739000000000004 - type: precision_at_1 value: 41.345 - type: precision_at_10 value: 10.186 - type: precision_at_100 value: 1.554 - type: precision_at_1000 value: 0.2 - type: precision_at_20 value: 5.959 - type: precision_at_3 value: 23.796 - type: precision_at_5 value: 17.024 - type: recall_at_1 value: 33.377 - type: recall_at_10 value: 65.067 - type: recall_at_100 value: 86.04899999999999 - type: recall_at_1000 value: 96.54899999999999 - type: recall_at_20 value: 72.071 - type: recall_at_3 value: 51.349999999999994 - type: recall_at_5 value: 58.41 - type: main_score value: 52.946000000000005 - task: type: Retrieval dataset: 
name: MTEB CQADupstackEnglishRetrieval (default) type: mteb/cqadupstack-english config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 31.097 - type: map_at_10 value: 42.183 - type: map_at_100 value: 43.580999999999996 - type: map_at_1000 value: 43.718 - type: map_at_20 value: 42.921 - type: map_at_3 value: 38.963 - type: map_at_5 value: 40.815 - type: mrr_at_1 value: 39.745000000000005 - type: mrr_at_10 value: 48.736000000000004 - type: mrr_at_100 value: 49.405 - type: mrr_at_1000 value: 49.452 - type: mrr_at_20 value: 49.118 - type: mrr_at_3 value: 46.497 - type: mrr_at_5 value: 47.827999999999996 - type: ndcg_at_1 value: 39.745000000000005 - type: ndcg_at_10 value: 48.248000000000005 - type: ndcg_at_100 value: 52.956 - type: ndcg_at_1000 value: 54.99699999999999 - type: ndcg_at_20 value: 50.01 - type: ndcg_at_3 value: 43.946000000000005 - type: ndcg_at_5 value: 46.038000000000004 - type: precision_at_1 value: 39.745000000000005 - type: precision_at_10 value: 9.229 - type: precision_at_100 value: 1.5070000000000001 - type: precision_at_1000 value: 0.199 - type: precision_at_20 value: 5.489999999999999 - type: precision_at_3 value: 21.38 - type: precision_at_5 value: 15.274 - type: recall_at_1 value: 31.097 - type: recall_at_10 value: 58.617 - type: recall_at_100 value: 78.55199999999999 - type: recall_at_1000 value: 91.13900000000001 - type: recall_at_20 value: 64.92 - type: recall_at_3 value: 45.672000000000004 - type: recall_at_5 value: 51.669 - type: main_score value: 48.248000000000005 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval (default) type: mteb/cqadupstack-gaming config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 39.745000000000005 - type: map_at_10 value: 52.063 - type: map_at_100 value: 53.077 - type: map_at_1000 value: 53.13 - type: map_at_20 value: 52.66 - type: map_at_3 value: 48.662 - type: map_at_5 value: 50.507000000000005 - type: mrr_at_1 value: 45.391999999999996 - type: mrr_at_10 value: 55.528 - type: mrr_at_100 value: 56.16100000000001 - type: mrr_at_1000 value: 56.192 - type: mrr_at_20 value: 55.923 - type: mrr_at_3 value: 52.93600000000001 - type: mrr_at_5 value: 54.435 - type: ndcg_at_1 value: 45.391999999999996 - type: ndcg_at_10 value: 58.019 - type: ndcg_at_100 value: 61.936 - type: ndcg_at_1000 value: 63.015 - type: ndcg_at_20 value: 59.691 - type: ndcg_at_3 value: 52.294 - type: ndcg_at_5 value: 55.017 - type: precision_at_1 value: 45.391999999999996 - type: precision_at_10 value: 9.386 - type: precision_at_100 value: 1.232 - type: precision_at_1000 value: 0.136 - type: precision_at_20 value: 5.223 - type: precision_at_3 value: 23.177 - type: precision_at_5 value: 15.9 - type: recall_at_1 value: 39.745000000000005 - type: recall_at_10 value: 72.08099999999999 - type: recall_at_100 value: 88.85300000000001 - type: recall_at_1000 value: 96.569 - type: recall_at_20 value: 78.203 - type: recall_at_3 value: 56.957 - type: recall_at_5 value: 63.63100000000001 - type: main_score value: 58.019 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval (default) type: mteb/cqadupstack-gis config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 26.651999999999997 - type: map_at_10 value: 35.799 - type: map_at_100 value: 36.846000000000004 - type: map_at_1000 value: 36.931000000000004 - type: map_at_20 value: 36.341 - type: map_at_3 value: 32.999 - type: map_at_5 
value: 34.597 - type: mrr_at_1 value: 28.814 - type: mrr_at_10 value: 37.869 - type: mrr_at_100 value: 38.728 - type: mrr_at_1000 value: 38.795 - type: mrr_at_20 value: 38.317 - type: mrr_at_3 value: 35.235 - type: mrr_at_5 value: 36.738 - type: ndcg_at_1 value: 28.814 - type: ndcg_at_10 value: 41.028 - type: ndcg_at_100 value: 46.162 - type: ndcg_at_1000 value: 48.15 - type: ndcg_at_20 value: 42.824 - type: ndcg_at_3 value: 35.621 - type: ndcg_at_5 value: 38.277 - type: precision_at_1 value: 28.814 - type: precision_at_10 value: 6.361999999999999 - type: precision_at_100 value: 0.9450000000000001 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_20 value: 3.6159999999999997 - type: precision_at_3 value: 15.140999999999998 - type: precision_at_5 value: 10.712000000000002 - type: recall_at_1 value: 26.651999999999997 - type: recall_at_10 value: 55.038 - type: recall_at_100 value: 78.806 - type: recall_at_1000 value: 93.485 - type: recall_at_20 value: 61.742 - type: recall_at_3 value: 40.682 - type: recall_at_5 value: 46.855000000000004 - type: main_score value: 41.028 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval (default) type: mteb/cqadupstack-mathematica config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 17.627000000000002 - type: map_at_10 value: 26.436999999999998 - type: map_at_100 value: 27.85 - type: map_at_1000 value: 27.955999999999996 - type: map_at_20 value: 27.233 - type: map_at_3 value: 23.777 - type: map_at_5 value: 25.122 - type: mrr_at_1 value: 22.387999999999998 - type: mrr_at_10 value: 31.589 - type: mrr_at_100 value: 32.641999999999996 - type: mrr_at_1000 value: 32.696999999999996 - type: mrr_at_20 value: 32.201 - type: mrr_at_3 value: 28.98 - type: mrr_at_5 value: 30.342000000000002 - type: ndcg_at_1 value: 22.387999999999998 - type: ndcg_at_10 value: 32.129999999999995 - type: ndcg_at_100 value: 38.562999999999995 - type: ndcg_at_1000 value: 40.903 - type: ndcg_at_20 value: 34.652 - type: ndcg_at_3 value: 27.26 - type: ndcg_at_5 value: 29.235 - type: precision_at_1 value: 22.387999999999998 - type: precision_at_10 value: 5.970000000000001 - type: precision_at_100 value: 1.068 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_20 value: 3.6999999999999997 - type: precision_at_3 value: 13.267000000000001 - type: precision_at_5 value: 9.403 - type: recall_at_1 value: 17.627000000000002 - type: recall_at_10 value: 44.71 - type: recall_at_100 value: 72.426 - type: recall_at_1000 value: 88.64699999999999 - type: recall_at_20 value: 53.65 - type: recall_at_3 value: 30.989 - type: recall_at_5 value: 36.237 - type: main_score value: 32.129999999999995 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval (default) type: mteb/cqadupstack-physics config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 30.891000000000002 - type: map_at_10 value: 41.519 - type: map_at_100 value: 42.896 - type: map_at_1000 value: 42.992999999999995 - type: map_at_20 value: 42.287 - type: map_at_3 value: 37.822 - type: map_at_5 value: 39.976 - type: mrr_at_1 value: 37.921 - type: mrr_at_10 value: 47.260999999999996 - type: mrr_at_100 value: 48.044 - type: mrr_at_1000 value: 48.08 - type: mrr_at_20 value: 47.699999999999996 - type: mrr_at_3 value: 44.513999999999996 - type: mrr_at_5 value: 46.064 - type: ndcg_at_1 value: 37.921 - type: ndcg_at_10 value: 47.806 - type: ndcg_at_100 value: 53.274 
- type: ndcg_at_1000 value: 55.021 - type: ndcg_at_20 value: 49.973 - type: ndcg_at_3 value: 42.046 - type: ndcg_at_5 value: 44.835 - type: precision_at_1 value: 37.921 - type: precision_at_10 value: 8.767999999999999 - type: precision_at_100 value: 1.353 - type: precision_at_1000 value: 0.168 - type: precision_at_20 value: 5.135 - type: precision_at_3 value: 20.051 - type: precision_at_5 value: 14.398 - type: recall_at_1 value: 30.891000000000002 - type: recall_at_10 value: 60.897999999999996 - type: recall_at_100 value: 83.541 - type: recall_at_1000 value: 94.825 - type: recall_at_20 value: 68.356 - type: recall_at_3 value: 44.65 - type: recall_at_5 value: 51.919000000000004 - type: main_score value: 47.806 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval (default) type: mteb/cqadupstack-programmers config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 27.654 - type: map_at_10 value: 38.025999999999996 - type: map_at_100 value: 39.425 - type: map_at_1000 value: 39.528 - type: map_at_20 value: 38.838 - type: map_at_3 value: 34.745 - type: map_at_5 value: 36.537 - type: mrr_at_1 value: 34.018 - type: mrr_at_10 value: 43.314 - type: mrr_at_100 value: 44.283 - type: mrr_at_1000 value: 44.327 - type: mrr_at_20 value: 43.929 - type: mrr_at_3 value: 40.868 - type: mrr_at_5 value: 42.317 - type: ndcg_at_1 value: 34.018 - type: ndcg_at_10 value: 43.887 - type: ndcg_at_100 value: 49.791000000000004 - type: ndcg_at_1000 value: 51.834 - type: ndcg_at_20 value: 46.376 - type: ndcg_at_3 value: 38.769999999999996 - type: ndcg_at_5 value: 41.144 - type: precision_at_1 value: 34.018 - type: precision_at_10 value: 8.001999999999999 - type: precision_at_100 value: 1.2630000000000001 - type: precision_at_1000 value: 0.16 - type: precision_at_20 value: 4.737 - type: precision_at_3 value: 18.417 - type: precision_at_5 value: 13.150999999999998 - type: recall_at_1 value: 27.654 - type: recall_at_10 value: 56.111 - type: recall_at_100 value: 81.136 - type: recall_at_1000 value: 94.788 - type: recall_at_20 value: 65.068 - type: recall_at_3 value: 41.713 - type: recall_at_5 value: 48.106 - type: main_score value: 43.887 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval (default) type: CQADupstackRetrieval_is_a_combined_dataset config: default split: test revision: CQADupstackRetrieval_is_a_combined_dataset metrics: - type: main_score value: 42.58858333333333 - type: ndcg_at_10 value: 42.58858333333333 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval (default) type: mteb/cqadupstack-stats config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 24.501 - type: map_at_10 value: 32.814 - type: map_at_100 value: 33.754 - type: map_at_1000 value: 33.859 - type: map_at_20 value: 33.324 - type: map_at_3 value: 30.758000000000003 - type: map_at_5 value: 31.936999999999998 - type: mrr_at_1 value: 27.761000000000003 - type: mrr_at_10 value: 35.662 - type: mrr_at_100 value: 36.443999999999996 - type: mrr_at_1000 value: 36.516999999999996 - type: mrr_at_20 value: 36.085 - type: mrr_at_3 value: 33.742 - type: mrr_at_5 value: 34.931 - type: ndcg_at_1 value: 27.761000000000003 - type: ndcg_at_10 value: 37.208000000000006 - type: ndcg_at_100 value: 41.839 - type: ndcg_at_1000 value: 44.421 - type: ndcg_at_20 value: 38.917 - type: ndcg_at_3 value: 33.544000000000004 - type: ndcg_at_5 value: 35.374 - type: precision_at_1 value: 27.761000000000003 - type: 
precision_at_10 value: 5.92 - type: precision_at_100 value: 0.899 - type: precision_at_1000 value: 0.12 - type: precision_at_20 value: 3.4130000000000003 - type: precision_at_3 value: 15.031 - type: precision_at_5 value: 10.306999999999999 - type: recall_at_1 value: 24.501 - type: recall_at_10 value: 47.579 - type: recall_at_100 value: 69.045 - type: recall_at_1000 value: 88.032 - type: recall_at_20 value: 54.125 - type: recall_at_3 value: 37.202 - type: recall_at_5 value: 41.927 - type: main_score value: 37.208000000000006 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval (default) type: mteb/cqadupstack-tex config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 18.29 - type: map_at_10 value: 26.183 - type: map_at_100 value: 27.351999999999997 - type: map_at_1000 value: 27.483999999999998 - type: map_at_20 value: 26.798 - type: map_at_3 value: 23.629 - type: map_at_5 value: 24.937 - type: mrr_at_1 value: 22.299 - type: mrr_at_10 value: 30.189 - type: mrr_at_100 value: 31.098 - type: mrr_at_1000 value: 31.177 - type: mrr_at_20 value: 30.697000000000003 - type: mrr_at_3 value: 27.862 - type: mrr_at_5 value: 29.066 - type: ndcg_at_1 value: 22.299 - type: ndcg_at_10 value: 31.202 - type: ndcg_at_100 value: 36.617 - type: ndcg_at_1000 value: 39.544000000000004 - type: ndcg_at_20 value: 33.177 - type: ndcg_at_3 value: 26.639000000000003 - type: ndcg_at_5 value: 28.526 - type: precision_at_1 value: 22.299 - type: precision_at_10 value: 5.8020000000000005 - type: precision_at_100 value: 1.0070000000000001 - type: precision_at_1000 value: 0.14400000000000002 - type: precision_at_20 value: 3.505 - type: precision_at_3 value: 12.698 - type: precision_at_5 value: 9.174 - type: recall_at_1 value: 18.29 - type: recall_at_10 value: 42.254999999999995 - type: recall_at_100 value: 66.60000000000001 - type: recall_at_1000 value: 87.31400000000001 - type: recall_at_20 value: 49.572 - type: recall_at_3 value: 29.342000000000002 - type: recall_at_5 value: 34.221000000000004 - type: main_score value: 31.202 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval (default) type: mteb/cqadupstack-unix config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 27.722 - type: map_at_10 value: 37.698 - type: map_at_100 value: 38.899 - type: map_at_1000 value: 38.998 - type: map_at_20 value: 38.381 - type: map_at_3 value: 34.244 - type: map_at_5 value: 36.295 - type: mrr_at_1 value: 32.183 - type: mrr_at_10 value: 41.429 - type: mrr_at_100 value: 42.308 - type: mrr_at_1000 value: 42.358000000000004 - type: mrr_at_20 value: 41.957 - type: mrr_at_3 value: 38.401999999999994 - type: mrr_at_5 value: 40.294999999999995 - type: ndcg_at_1 value: 32.183 - type: ndcg_at_10 value: 43.519000000000005 - type: ndcg_at_100 value: 48.786 - type: ndcg_at_1000 value: 50.861999999999995 - type: ndcg_at_20 value: 45.654 - type: ndcg_at_3 value: 37.521 - type: ndcg_at_5 value: 40.615 - type: precision_at_1 value: 32.183 - type: precision_at_10 value: 7.603 - type: precision_at_100 value: 1.135 - type: precision_at_1000 value: 0.14200000000000002 - type: precision_at_20 value: 4.408 - type: precision_at_3 value: 17.071 - type: precision_at_5 value: 12.668 - type: recall_at_1 value: 27.722 - type: recall_at_10 value: 57.230000000000004 - type: recall_at_100 value: 79.97999999999999 - type: recall_at_1000 value: 94.217 - type: recall_at_20 value: 64.864 - type: recall_at_3 value: 41.215 - type: 
recall_at_5 value: 48.774 - type: main_score value: 43.519000000000005 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval (default) type: mteb/cqadupstack-webmasters config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 25.852999999999998 - type: map_at_10 value: 35.394999999999996 - type: map_at_100 value: 37.291999999999994 - type: map_at_1000 value: 37.495 - type: map_at_20 value: 36.372 - type: map_at_3 value: 32.336 - type: map_at_5 value: 34.159 - type: mrr_at_1 value: 31.818 - type: mrr_at_10 value: 40.677 - type: mrr_at_100 value: 41.728 - type: mrr_at_1000 value: 41.778 - type: mrr_at_20 value: 41.301 - type: mrr_at_3 value: 38.208 - type: mrr_at_5 value: 39.592 - type: ndcg_at_1 value: 31.818 - type: ndcg_at_10 value: 41.559000000000005 - type: ndcg_at_100 value: 48.012 - type: ndcg_at_1000 value: 50.234 - type: ndcg_at_20 value: 44.15 - type: ndcg_at_3 value: 36.918 - type: ndcg_at_5 value: 39.227000000000004 - type: precision_at_1 value: 31.818 - type: precision_at_10 value: 8.043 - type: precision_at_100 value: 1.625 - type: precision_at_1000 value: 0.245 - type: precision_at_20 value: 5.2170000000000005 - type: precision_at_3 value: 17.655 - type: precision_at_5 value: 12.845999999999998 - type: recall_at_1 value: 25.852999999999998 - type: recall_at_10 value: 53.093 - type: recall_at_100 value: 81.05799999999999 - type: recall_at_1000 value: 94.657 - type: recall_at_20 value: 62.748000000000005 - type: recall_at_3 value: 39.300000000000004 - type: recall_at_5 value: 45.754 - type: main_score value: 41.559000000000005 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval (default) type: mteb/cqadupstack-wordpress config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 19.23 - type: map_at_10 value: 28.128999999999998 - type: map_at_100 value: 29.195 - type: map_at_1000 value: 29.310000000000002 - type: map_at_20 value: 28.713 - type: map_at_3 value: 25.191000000000003 - type: map_at_5 value: 26.69 - type: mrr_at_1 value: 21.257 - type: mrr_at_10 value: 30.253999999999998 - type: mrr_at_100 value: 31.195 - type: mrr_at_1000 value: 31.270999999999997 - type: mrr_at_20 value: 30.747999999999998 - type: mrr_at_3 value: 27.633999999999997 - type: mrr_at_5 value: 28.937 - type: ndcg_at_1 value: 21.257 - type: ndcg_at_10 value: 33.511 - type: ndcg_at_100 value: 38.733000000000004 - type: ndcg_at_1000 value: 41.489 - type: ndcg_at_20 value: 35.476 - type: ndcg_at_3 value: 27.845 - type: ndcg_at_5 value: 30.264999999999997 - type: precision_at_1 value: 21.257 - type: precision_at_10 value: 5.619 - type: precision_at_100 value: 0.893 - type: precision_at_1000 value: 0.124 - type: precision_at_20 value: 3.29 - type: precision_at_3 value: 12.508 - type: precision_at_5 value: 8.946 - type: recall_at_1 value: 19.23 - type: recall_at_10 value: 48.185 - type: recall_at_100 value: 71.932 - type: recall_at_1000 value: 92.587 - type: recall_at_20 value: 55.533 - type: recall_at_3 value: 32.865 - type: recall_at_5 value: 38.577 - type: main_score value: 33.511 - task: type: Retrieval dataset: name: MTEB ClimateFEVER (default) type: mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 19.594 - type: map_at_10 value: 32.519 - type: map_at_100 value: 34.1 - type: map_at_1000 value: 34.263 - type: map_at_20 value: 33.353 - type: map_at_3 value: 27.898 - type: map_at_5 
value: 30.524 - type: mrr_at_1 value: 46.515 - type: mrr_at_10 value: 56.958 - type: mrr_at_100 value: 57.54899999999999 - type: mrr_at_1000 value: 57.574999999999996 - type: mrr_at_20 value: 57.315000000000005 - type: mrr_at_3 value: 54.852999999999994 - type: mrr_at_5 value: 56.153 - type: ndcg_at_1 value: 46.515 - type: ndcg_at_10 value: 42.363 - type: ndcg_at_100 value: 48.233 - type: ndcg_at_1000 value: 50.993 - type: ndcg_at_20 value: 44.533 - type: ndcg_at_3 value: 37.297000000000004 - type: ndcg_at_5 value: 38.911 - type: precision_at_1 value: 46.515 - type: precision_at_10 value: 12.520999999999999 - type: precision_at_100 value: 1.8980000000000001 - type: precision_at_1000 value: 0.242 - type: precision_at_20 value: 7.212000000000001 - type: precision_at_3 value: 27.752 - type: precision_at_5 value: 20.391000000000002 - type: recall_at_1 value: 19.594 - type: recall_at_10 value: 46.539 - type: recall_at_100 value: 66.782 - type: recall_at_1000 value: 82.049 - type: recall_at_20 value: 52.611 - type: recall_at_3 value: 32.528 - type: recall_at_5 value: 38.933 - type: main_score value: 42.363 - task: type: Retrieval dataset: name: MTEB CmedqaRetrieval (default) type: C-MTEB/CmedqaRetrieval config: default split: dev revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301 metrics: - type: main_score value: 35.927 - type: map_at_1 value: 20.144000000000002 - type: map_at_10 value: 29.94 - type: map_at_100 value: 31.630000000000003 - type: map_at_1000 value: 31.778000000000002 - type: map_at_20 value: 30.798 - type: map_at_3 value: 26.534999999999997 - type: map_at_5 value: 28.33 - type: mrr_at_1 value: 31.23280820205051 - type: mrr_at_10 value: 38.66781179421835 - type: mrr_at_100 value: 39.656936166081785 - type: mrr_at_1000 value: 39.724602893117414 - type: mrr_at_20 value: 39.21272461558451 - type: mrr_at_3 value: 36.30907726931729 - type: mrr_at_5 value: 37.59814953738436 - type: nauc_map_at_1000_diff1 value: 44.5755334437146 - type: nauc_map_at_1000_max value: 40.726916781400746 - type: nauc_map_at_1000_std value: -19.591835061497367 - type: nauc_map_at_100_diff1 value: 44.54542899921038 - type: nauc_map_at_100_max value: 40.68305902532837 - type: nauc_map_at_100_std value: -19.658902089283487 - type: nauc_map_at_10_diff1 value: 44.56110529630953 - type: nauc_map_at_10_max value: 39.89826167846008 - type: nauc_map_at_10_std value: -20.62910633667902 - type: nauc_map_at_1_diff1 value: 50.82120107004449 - type: nauc_map_at_1_max value: 33.208851367861584 - type: nauc_map_at_1_std value: -20.29409730258174 - type: nauc_map_at_20_diff1 value: 44.51171242433788 - type: nauc_map_at_20_max value: 40.30431132782945 - type: nauc_map_at_20_std value: -20.290524142792417 - type: nauc_map_at_3_diff1 value: 45.80394138665133 - type: nauc_map_at_3_max value: 37.766191281426956 - type: nauc_map_at_3_std value: -21.223601997333876 - type: nauc_map_at_5_diff1 value: 45.00457218474283 - type: nauc_map_at_5_max value: 38.901044576388365 - type: nauc_map_at_5_std value: -20.893069613941634 - type: nauc_mrr_at_1000_diff1 value: 50.09855359231429 - type: nauc_mrr_at_1000_max value: 46.481000170008826 - type: nauc_mrr_at_1000_std value: -16.053461377096102 - type: nauc_mrr_at_100_diff1 value: 50.08205026347746 - type: nauc_mrr_at_100_max value: 46.47262126963331 - type: nauc_mrr_at_100_std value: -16.049112778748693 - type: nauc_mrr_at_10_diff1 value: 50.02363239081706 - type: nauc_mrr_at_10_max value: 46.39287859062042 - type: nauc_mrr_at_10_std value: -16.280866744769657 - type: nauc_mrr_at_1_diff1 
value: 55.692503735317445 - type: nauc_mrr_at_1_max value: 47.334834529801014 - type: nauc_mrr_at_1_std value: -16.985483585693512 - type: nauc_mrr_at_20_diff1 value: 50.07725225722074 - type: nauc_mrr_at_20_max value: 46.47279295070193 - type: nauc_mrr_at_20_std value: -16.15168364678318 - type: nauc_mrr_at_3_diff1 value: 51.18685337274134 - type: nauc_mrr_at_3_max value: 46.7286365021621 - type: nauc_mrr_at_3_std value: -16.708451287313718 - type: nauc_mrr_at_5_diff1 value: 50.46777237893576 - type: nauc_mrr_at_5_max value: 46.5352076502249 - type: nauc_mrr_at_5_std value: -16.557413659905034 - type: nauc_ndcg_at_1000_diff1 value: 43.974299434438066 - type: nauc_ndcg_at_1000_max value: 43.44628675071857 - type: nauc_ndcg_at_1000_std value: -15.3495102005021 - type: nauc_ndcg_at_100_diff1 value: 43.336365081508504 - type: nauc_ndcg_at_100_max value: 43.11345604460776 - type: nauc_ndcg_at_100_std value: -15.571128070860615 - type: nauc_ndcg_at_10_diff1 value: 43.41266214720136 - type: nauc_ndcg_at_10_max value: 41.519676787851914 - type: nauc_ndcg_at_10_std value: -19.217175017223568 - type: nauc_ndcg_at_1_diff1 value: 55.692503735317445 - type: nauc_ndcg_at_1_max value: 47.334834529801014 - type: nauc_ndcg_at_1_std value: -16.985483585693512 - type: nauc_ndcg_at_20_diff1 value: 43.351653862834496 - type: nauc_ndcg_at_20_max value: 42.11608469750499 - type: nauc_ndcg_at_20_std value: -18.485363540641664 - type: nauc_ndcg_at_3_diff1 value: 45.64193888236677 - type: nauc_ndcg_at_3_max value: 42.497135099009995 - type: nauc_ndcg_at_3_std value: -18.764012041130094 - type: nauc_ndcg_at_5_diff1 value: 44.523392133895186 - type: nauc_ndcg_at_5_max value: 41.564242030096345 - type: nauc_ndcg_at_5_std value: -19.31080790984941 - type: nauc_precision_at_1000_diff1 value: 6.383464615714393 - type: nauc_precision_at_1000_max value: 27.439930931284657 - type: nauc_precision_at_1000_std value: 19.070716188143034 - type: nauc_precision_at_100_diff1 value: 12.599136754501284 - type: nauc_precision_at_100_max value: 35.886310962337795 - type: nauc_precision_at_100_std value: 14.06587592659196 - type: nauc_precision_at_10_diff1 value: 25.388891173150206 - type: nauc_precision_at_10_max value: 46.10269270777384 - type: nauc_precision_at_10_std value: -5.993803607158499 - type: nauc_precision_at_1_diff1 value: 55.692503735317445 - type: nauc_precision_at_1_max value: 47.334834529801014 - type: nauc_precision_at_1_std value: -16.985483585693512 - type: nauc_precision_at_20_diff1 value: 20.984013463099707 - type: nauc_precision_at_20_max value: 42.9471854616888 - type: nauc_precision_at_20_std value: -0.8045549929346024 - type: nauc_precision_at_3_diff1 value: 36.191850547148356 - type: nauc_precision_at_3_max value: 48.09923832376049 - type: nauc_precision_at_3_std value: -13.159407051271321 - type: nauc_precision_at_5_diff1 value: 31.04967966700407 - type: nauc_precision_at_5_max value: 47.62867673349624 - type: nauc_precision_at_5_std value: -10.345790325137353 - type: nauc_recall_at_1000_diff1 value: 11.03436839065707 - type: nauc_recall_at_1000_max value: 42.32265076651575 - type: nauc_recall_at_1000_std value: 30.478521053399206 - type: nauc_recall_at_100_diff1 value: 24.788349084510806 - type: nauc_recall_at_100_max value: 36.72097184821956 - type: nauc_recall_at_100_std value: -0.2241144179522076 - type: nauc_recall_at_10_diff1 value: 31.613053567704885 - type: nauc_recall_at_10_max value: 34.4597322828833 - type: nauc_recall_at_10_std value: -18.00022912690819 - type: nauc_recall_at_1_diff1 value: 
50.82120107004449 - type: nauc_recall_at_1_max value: 33.208851367861584 - type: nauc_recall_at_1_std value: -20.29409730258174 - type: nauc_recall_at_20_diff1 value: 30.277002670708384 - type: nauc_recall_at_20_max value: 35.212475675060375 - type: nauc_recall_at_20_std value: -15.822788854733687 - type: nauc_recall_at_3_diff1 value: 38.87844958322257 - type: nauc_recall_at_3_max value: 34.66914910044104 - type: nauc_recall_at_3_std value: -20.234707300209127 - type: nauc_recall_at_5_diff1 value: 35.551139991687776 - type: nauc_recall_at_5_max value: 34.61009958820695 - type: nauc_recall_at_5_std value: -19.519180149293444 - type: ndcg_at_1 value: 31.233 - type: ndcg_at_10 value: 35.927 - type: ndcg_at_100 value: 43.037 - type: ndcg_at_1000 value: 45.900999999999996 - type: ndcg_at_20 value: 38.39 - type: ndcg_at_3 value: 31.366 - type: ndcg_at_5 value: 33.108 - type: precision_at_1 value: 31.233 - type: precision_at_10 value: 8.15 - type: precision_at_100 value: 1.402 - type: precision_at_1000 value: 0.17700000000000002 - type: precision_at_20 value: 4.91 - type: precision_at_3 value: 17.871000000000002 - type: precision_at_5 value: 12.948 - type: recall_at_1 value: 20.144000000000002 - type: recall_at_10 value: 44.985 - type: recall_at_100 value: 74.866 - type: recall_at_1000 value: 94.477 - type: recall_at_20 value: 53.37 - type: recall_at_3 value: 31.141000000000002 - type: recall_at_5 value: 36.721 - task: type: PairClassification dataset: name: MTEB Cmnli (default) type: C-MTEB/CMNLI config: default split: validation revision: None metrics: - type: cos_sim_accuracy value: 71.25676488274203 - type: cos_sim_accuracy_threshold value: 78.11152935028076 - type: cos_sim_ap value: 79.10444825556077 - type: cos_sim_f1 value: 74.10750923266312 - type: cos_sim_f1_threshold value: 75.2312421798706 - type: cos_sim_precision value: 66.02083714129044 - type: cos_sim_recall value: 84.45171849427169 - type: dot_accuracy value: 68.11785929043896 - type: dot_accuracy_threshold value: 34783.23974609375 - type: dot_ap value: 75.80201827987712 - type: dot_f1 value: 72.31670990679349 - type: dot_f1_threshold value: 31978.036499023438 - type: dot_precision value: 61.386623164763456 - type: dot_recall value: 87.98223053542202 - type: euclidean_accuracy value: 71.41310883944678 - type: euclidean_accuracy_threshold value: 1374.9353408813477 - type: euclidean_ap value: 79.23359768836457 - type: euclidean_f1 value: 74.38512297540491 - type: euclidean_f1_threshold value: 1512.6035690307617 - type: euclidean_precision value: 64.97816593886463 - type: euclidean_recall value: 86.97685293429974 - type: manhattan_accuracy value: 71.32892363199038 - type: manhattan_accuracy_threshold value: 33340.49072265625 - type: manhattan_ap value: 79.11973684118587 - type: manhattan_f1 value: 74.29401993355481 - type: manhattan_f1_threshold value: 36012.52746582031 - type: manhattan_precision value: 66.81605975723622 - type: manhattan_recall value: 83.65676876315175 - type: max_accuracy value: 71.41310883944678 - type: max_ap value: 79.23359768836457 - type: max_f1 value: 74.38512297540491 - task: type: Retrieval dataset: name: MTEB CovidRetrieval (default) type: C-MTEB/CovidRetrieval config: default split: dev revision: 1271c7809071a13532e05f25fb53511ffce77117 metrics: - type: main_score value: 78.917 - type: map_at_1 value: 67.281 - type: map_at_10 value: 75.262 - type: map_at_100 value: 75.60900000000001 - type: map_at_1000 value: 75.618 - type: map_at_20 value: 75.50200000000001 - type: map_at_3 value: 73.455 - type: 
map_at_5 value: 74.657 - type: mrr_at_1 value: 67.43940990516333 - type: mrr_at_10 value: 75.27367989696756 - type: mrr_at_100 value: 75.62029353306437 - type: mrr_at_1000 value: 75.62934741874726 - type: mrr_at_20 value: 75.51356607409173 - type: mrr_at_3 value: 73.5159817351598 - type: mrr_at_5 value: 74.73832103969093 - type: nauc_map_at_1000_diff1 value: 77.26666391867634 - type: nauc_map_at_1000_max value: 49.928541012203496 - type: nauc_map_at_1000_std value: -40.494469470474456 - type: nauc_map_at_100_diff1 value: 77.26087423162396 - type: nauc_map_at_100_max value: 49.944275615664424 - type: nauc_map_at_100_std value: -40.48299992715398 - type: nauc_map_at_10_diff1 value: 76.97400113500906 - type: nauc_map_at_10_max value: 49.84177029115674 - type: nauc_map_at_10_std value: -40.829250876511445 - type: nauc_map_at_1_diff1 value: 81.44050620630395 - type: nauc_map_at_1_max value: 48.97711944070578 - type: nauc_map_at_1_std value: -38.963689457570254 - type: nauc_map_at_20_diff1 value: 77.21791353089375 - type: nauc_map_at_20_max value: 49.958206759079424 - type: nauc_map_at_20_std value: -40.53067571658996 - type: nauc_map_at_3_diff1 value: 77.3555925208868 - type: nauc_map_at_3_max value: 49.32158146451256 - type: nauc_map_at_3_std value: -41.93552426981978 - type: nauc_map_at_5_diff1 value: 77.07099950431504 - type: nauc_map_at_5_max value: 49.54190504495002 - type: nauc_map_at_5_std value: -41.814968130918096 - type: nauc_mrr_at_1000_diff1 value: 77.31388774540477 - type: nauc_mrr_at_1000_max value: 49.96779699175759 - type: nauc_mrr_at_1000_std value: -40.43739645160277 - type: nauc_mrr_at_100_diff1 value: 77.30817786449413 - type: nauc_mrr_at_100_max value: 49.982514428937655 - type: nauc_mrr_at_100_std value: -40.42876582797744 - type: nauc_mrr_at_10_diff1 value: 77.02048060465756 - type: nauc_mrr_at_10_max value: 49.87937207270602 - type: nauc_mrr_at_10_std value: -40.77596560333177 - type: nauc_mrr_at_1_diff1 value: 81.27219599516599 - type: nauc_mrr_at_1_max value: 49.3083394026327 - type: nauc_mrr_at_1_std value: -38.31023037552026 - type: nauc_mrr_at_20_diff1 value: 77.26497089316055 - type: nauc_mrr_at_20_max value: 49.996257597621415 - type: nauc_mrr_at_20_std value: -40.476723608868014 - type: nauc_mrr_at_3_diff1 value: 77.38971294099257 - type: nauc_mrr_at_3_max value: 49.38110328987404 - type: nauc_mrr_at_3_std value: -41.7118646715979 - type: nauc_mrr_at_5_diff1 value: 77.08286142519952 - type: nauc_mrr_at_5_max value: 49.655249374588685 - type: nauc_mrr_at_5_std value: -41.48173039989406 - type: nauc_ndcg_at_1000_diff1 value: 76.47399204021758 - type: nauc_ndcg_at_1000_max value: 50.55770139961048 - type: nauc_ndcg_at_1000_std value: -39.55650430279072 - type: nauc_ndcg_at_100_diff1 value: 76.29355616618253 - type: nauc_ndcg_at_100_max value: 51.003608112592936 - type: nauc_ndcg_at_100_std value: -39.24769744605206 - type: nauc_ndcg_at_10_diff1 value: 74.88697528447634 - type: nauc_ndcg_at_10_max value: 50.398416372815234 - type: nauc_ndcg_at_10_std value: -40.76526585772833 - type: nauc_ndcg_at_1_diff1 value: 81.27219599516599 - type: nauc_ndcg_at_1_max value: 49.3083394026327 - type: nauc_ndcg_at_1_std value: -38.31023037552026 - type: nauc_ndcg_at_20_diff1 value: 75.85463512091866 - type: nauc_ndcg_at_20_max value: 50.97338683654334 - type: nauc_ndcg_at_20_std value: -39.353128774903404 - type: nauc_ndcg_at_3_diff1 value: 75.94015726123543 - type: nauc_ndcg_at_3_max value: 49.22194251063148 - type: nauc_ndcg_at_3_std value: -43.040457030630435 - type: 
nauc_ndcg_at_5_diff1 value: 75.19166189770303 - type: nauc_ndcg_at_5_max value: 49.65696229797189 - type: nauc_ndcg_at_5_std value: -42.81534909184424 - type: nauc_precision_at_1000_diff1 value: -14.830901395815788 - type: nauc_precision_at_1000_max value: 19.686297136854623 - type: nauc_precision_at_1000_std value: 61.19310360166978 - type: nauc_precision_at_100_diff1 value: 20.55469986751769 - type: nauc_precision_at_100_max value: 50.78431835075583 - type: nauc_precision_at_100_std value: 31.54986568374813 - type: nauc_precision_at_10_diff1 value: 45.991938532558656 - type: nauc_precision_at_10_max value: 46.386318595630385 - type: nauc_precision_at_10_std value: -23.463011435224608 - type: nauc_precision_at_1_diff1 value: 81.27219599516599 - type: nauc_precision_at_1_max value: 49.3083394026327 - type: nauc_precision_at_1_std value: -38.31023037552026 - type: nauc_precision_at_20_diff1 value: 41.53180472410822 - type: nauc_precision_at_20_max value: 49.89800247204318 - type: nauc_precision_at_20_std value: -2.4192847331537095 - type: nauc_precision_at_3_diff1 value: 67.37504651209993 - type: nauc_precision_at_3_max value: 47.893537208629496 - type: nauc_precision_at_3_std value: -43.2362212382819 - type: nauc_precision_at_5_diff1 value: 60.03438883791718 - type: nauc_precision_at_5_max value: 48.29770502354206 - type: nauc_precision_at_5_std value: -40.39588448271546 - type: nauc_recall_at_1000_diff1 value: 71.04741174480844 - type: nauc_recall_at_1000_max value: 93.19056506596002 - type: nauc_recall_at_1000_std value: 62.96994797650912 - type: nauc_recall_at_100_diff1 value: 65.00418176852641 - type: nauc_recall_at_100_max value: 85.27352708427193 - type: nauc_recall_at_100_std value: 2.8812005546518886 - type: nauc_recall_at_10_diff1 value: 61.263254794998865 - type: nauc_recall_at_10_max value: 54.17618329507141 - type: nauc_recall_at_10_std value: -39.80603966142593 - type: nauc_recall_at_1_diff1 value: 81.44050620630395 - type: nauc_recall_at_1_max value: 48.97711944070578 - type: nauc_recall_at_1_std value: -38.963689457570254 - type: nauc_recall_at_20_diff1 value: 64.42106091745396 - type: nauc_recall_at_20_max value: 63.10796640821887 - type: nauc_recall_at_20_std value: -22.60117424572222 - type: nauc_recall_at_3_diff1 value: 70.66311436592945 - type: nauc_recall_at_3_max value: 48.69498944323469 - type: nauc_recall_at_3_std value: -47.37847524874532 - type: nauc_recall_at_5_diff1 value: 66.12701111728848 - type: nauc_recall_at_5_max value: 49.91763957934711 - type: nauc_recall_at_5_std value: -48.173252920584126 - type: ndcg_at_1 value: 67.43900000000001 - type: ndcg_at_10 value: 78.917 - type: ndcg_at_100 value: 80.53399999999999 - type: ndcg_at_1000 value: 80.768 - type: ndcg_at_20 value: 79.813 - type: ndcg_at_3 value: 75.37 - type: ndcg_at_5 value: 77.551 - type: precision_at_1 value: 67.43900000000001 - type: precision_at_10 value: 9.115 - type: precision_at_100 value: 0.985 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.737 - type: precision_at_3 value: 27.081 - type: precision_at_5 value: 17.345 - type: recall_at_1 value: 67.281 - type: recall_at_10 value: 90.2 - type: recall_at_100 value: 97.576 - type: recall_at_1000 value: 99.368 - type: recall_at_20 value: 93.783 - type: recall_at_3 value: 80.822 - type: recall_at_5 value: 86.091 - task: type: Retrieval dataset: name: MTEB DBPedia (default) type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 9.041 - type: map_at_10 
value: 18.662 - type: map_at_100 value: 26.054 - type: map_at_1000 value: 27.769 - type: map_at_20 value: 21.499 - type: map_at_3 value: 13.628000000000002 - type: map_at_5 value: 15.617 - type: mrr_at_1 value: 67.25 - type: mrr_at_10 value: 74.673 - type: mrr_at_100 value: 75.022 - type: mrr_at_1000 value: 75.031 - type: mrr_at_20 value: 74.895 - type: mrr_at_3 value: 73.042 - type: mrr_at_5 value: 74.179 - type: ndcg_at_1 value: 55.75 - type: ndcg_at_10 value: 41.004000000000005 - type: ndcg_at_100 value: 44.912 - type: ndcg_at_1000 value: 51.946000000000005 - type: ndcg_at_20 value: 40.195 - type: ndcg_at_3 value: 45.803 - type: ndcg_at_5 value: 42.976 - type: precision_at_1 value: 67.25 - type: precision_at_10 value: 31.874999999999996 - type: precision_at_100 value: 10.37 - type: precision_at_1000 value: 2.1430000000000002 - type: precision_at_20 value: 24.275 - type: precision_at_3 value: 48.417 - type: precision_at_5 value: 40.2 - type: recall_at_1 value: 9.041 - type: recall_at_10 value: 23.592 - type: recall_at_100 value: 49.476 - type: recall_at_1000 value: 71.677 - type: recall_at_20 value: 30.153000000000002 - type: recall_at_3 value: 14.777000000000001 - type: recall_at_5 value: 17.829 - type: main_score value: 41.004000000000005 - task: type: Retrieval dataset: name: MTEB DuRetrieval (default) type: C-MTEB/DuRetrieval config: default split: dev revision: a1a333e290fe30b10f3f56498e3a0d911a693ced metrics: - type: main_score value: 83.134 - type: map_at_1 value: 23.907999999999998 - type: map_at_10 value: 74.566 - type: map_at_100 value: 77.706 - type: map_at_1000 value: 77.762 - type: map_at_20 value: 76.943 - type: map_at_3 value: 50.971999999999994 - type: map_at_5 value: 64.429 - type: mrr_at_1 value: 84.8 - type: mrr_at_10 value: 89.73218253968246 - type: mrr_at_100 value: 89.82853630655774 - type: mrr_at_1000 value: 89.83170411703153 - type: mrr_at_20 value: 89.79582030091501 - type: mrr_at_3 value: 89.32499999999992 - type: mrr_at_5 value: 89.58749999999992 - type: nauc_map_at_1000_diff1 value: -2.2736020650163717 - type: nauc_map_at_1000_max value: 45.3937519555142 - type: nauc_map_at_1000_std value: 10.824778228268581 - type: nauc_map_at_100_diff1 value: -2.2662939752750066 - type: nauc_map_at_100_max value: 45.423960626031366 - type: nauc_map_at_100_std value: 10.804239351738717 - type: nauc_map_at_10_diff1 value: 0.9395752585654343 - type: nauc_map_at_10_max value: 42.53814836940551 - type: nauc_map_at_10_std value: 0.7199313235265218 - type: nauc_map_at_1_diff1 value: 45.19415865267676 - type: nauc_map_at_1_max value: -1.7261947382471912 - type: nauc_map_at_1_std value: -32.16144291613605 - type: nauc_map_at_20_diff1 value: -1.884514152147472 - type: nauc_map_at_20_max value: 44.830401115927174 - type: nauc_map_at_20_std value: 8.118530414377219 - type: nauc_map_at_3_diff1 value: 25.678881127059967 - type: nauc_map_at_3_max value: 12.191400431839758 - type: nauc_map_at_3_std value: -27.201740587642327 - type: nauc_map_at_5_diff1 value: 13.227128780829572 - type: nauc_map_at_5_max value: 26.978282739708977 - type: nauc_map_at_5_std value: -17.555610348070584 - type: nauc_mrr_at_1000_diff1 value: 21.073512437502178 - type: nauc_mrr_at_1000_max value: 64.9680257861005 - type: nauc_mrr_at_1000_std value: 19.626288754404293 - type: nauc_mrr_at_100_diff1 value: 21.074637426957732 - type: nauc_mrr_at_100_max value: 64.97612675661915 - type: nauc_mrr_at_100_std value: 19.649504127800878 - type: nauc_mrr_at_10_diff1 value: 21.12003267626651 - type: nauc_mrr_at_10_max value: 
65.24362289059766 - type: nauc_mrr_at_10_std value: 19.92351276180984 - type: nauc_mrr_at_1_diff1 value: 22.711430629147635 - type: nauc_mrr_at_1_max value: 58.4059429497403 - type: nauc_mrr_at_1_std value: 11.967886722567973 - type: nauc_mrr_at_20_diff1 value: 20.98220830510272 - type: nauc_mrr_at_20_max value: 65.05737535197835 - type: nauc_mrr_at_20_std value: 19.66672900782771 - type: nauc_mrr_at_3_diff1 value: 20.924796220048528 - type: nauc_mrr_at_3_max value: 65.71388669932584 - type: nauc_mrr_at_3_std value: 20.05912197134477 - type: nauc_mrr_at_5_diff1 value: 20.61978649468208 - type: nauc_mrr_at_5_max value: 65.50709154526211 - type: nauc_mrr_at_5_std value: 20.241434276181838 - type: nauc_ndcg_at_1000_diff1 value: 0.25363171946133656 - type: nauc_ndcg_at_1000_max value: 54.12840465309885 - type: nauc_ndcg_at_1000_std value: 20.749184325412546 - type: nauc_ndcg_at_100_diff1 value: 0.15649430250272792 - type: nauc_ndcg_at_100_max value: 54.47995322413234 - type: nauc_ndcg_at_100_std value: 21.266786634233267 - type: nauc_ndcg_at_10_diff1 value: 0.14579250840386346 - type: nauc_ndcg_at_10_max value: 49.8643037948353 - type: nauc_ndcg_at_10_std value: 12.960701643914216 - type: nauc_ndcg_at_1_diff1 value: 22.711430629147635 - type: nauc_ndcg_at_1_max value: 58.4059429497403 - type: nauc_ndcg_at_1_std value: 11.967886722567973 - type: nauc_ndcg_at_20_diff1 value: -0.6701559981776763 - type: nauc_ndcg_at_20_max value: 52.95443437012488 - type: nauc_ndcg_at_20_std value: 16.708883972005758 - type: nauc_ndcg_at_3_diff1 value: -0.19084922341962388 - type: nauc_ndcg_at_3_max value: 46.2110230886874 - type: nauc_ndcg_at_3_std value: 13.363250229683038 - type: nauc_ndcg_at_5_diff1 value: 0.9840019268192548 - type: nauc_ndcg_at_5_max value: 43.56594891798146 - type: nauc_ndcg_at_5_std value: 8.577017104088146 - type: nauc_precision_at_1000_diff1 value: -30.779179091501145 - type: nauc_precision_at_1000_max value: 16.056094258615673 - type: nauc_precision_at_1000_std value: 49.96303902363283 - type: nauc_precision_at_100_diff1 value: -31.583236638899585 - type: nauc_precision_at_100_max value: 19.16571713603373 - type: nauc_precision_at_100_std value: 51.870647903980036 - type: nauc_precision_at_10_diff1 value: -35.62134572732597 - type: nauc_precision_at_10_max value: 31.6935186494612 - type: nauc_precision_at_10_std value: 46.68659723766723 - type: nauc_precision_at_1_diff1 value: 22.711430629147635 - type: nauc_precision_at_1_max value: 58.4059429497403 - type: nauc_precision_at_1_std value: 11.967886722567973 - type: nauc_precision_at_20_diff1 value: -33.875460046920495 - type: nauc_precision_at_20_max value: 24.188420133566442 - type: nauc_precision_at_20_std value: 50.02387762958483 - type: nauc_precision_at_3_diff1 value: -28.875998450906827 - type: nauc_precision_at_3_max value: 44.77058831167941 - type: nauc_precision_at_3_std value: 31.77993710437207 - type: nauc_precision_at_5_diff1 value: -34.92525440306491 - type: nauc_precision_at_5_max value: 39.855219917077086 - type: nauc_precision_at_5_std value: 37.95432046169299 - type: nauc_recall_at_1000_diff1 value: -14.293309371874733 - type: nauc_recall_at_1000_max value: 59.06948692482579 - type: nauc_recall_at_1000_std value: 62.586254868312686 - type: nauc_recall_at_100_diff1 value: -4.344100947212704 - type: nauc_recall_at_100_max value: 58.42120421043602 - type: nauc_recall_at_100_std value: 46.48562009316997 - type: nauc_recall_at_10_diff1 value: 0.04948662912161709 - type: nauc_recall_at_10_max value: 42.42809687119093 - type: 
nauc_recall_at_10_std value: 0.6892504250411409 - type: nauc_recall_at_1_diff1 value: 45.19415865267676 - type: nauc_recall_at_1_max value: -1.7261947382471912 - type: nauc_recall_at_1_std value: -32.16144291613605 - type: nauc_recall_at_20_diff1 value: -7.634587864605111 - type: nauc_recall_at_20_max value: 49.21327187174134 - type: nauc_recall_at_20_std value: 16.408481068336346 - type: nauc_recall_at_3_diff1 value: 24.72546591038644 - type: nauc_recall_at_3_max value: 6.620763400972902 - type: nauc_recall_at_3_std value: -29.994703323331684 - type: nauc_recall_at_5_diff1 value: 12.65527364845842 - type: nauc_recall_at_5_max value: 20.400121385794694 - type: nauc_recall_at_5_std value: -22.34284568447213 - type: ndcg_at_1 value: 84.8 - type: ndcg_at_10 value: 83.134 - type: ndcg_at_100 value: 86.628 - type: ndcg_at_1000 value: 87.151 - type: ndcg_at_20 value: 85.092 - type: ndcg_at_3 value: 81.228 - type: ndcg_at_5 value: 80.2 - type: precision_at_1 value: 84.8 - type: precision_at_10 value: 40.394999999999996 - type: precision_at_100 value: 4.745 - type: precision_at_1000 value: 0.488 - type: precision_at_20 value: 22.245 - type: precision_at_3 value: 73.25 - type: precision_at_5 value: 61.86000000000001 - type: recall_at_1 value: 23.907999999999998 - type: recall_at_10 value: 85.346 - type: recall_at_100 value: 96.515 - type: recall_at_1000 value: 99.156 - type: recall_at_20 value: 91.377 - type: recall_at_3 value: 54.135 - type: recall_at_5 value: 70.488 - task: type: Retrieval dataset: name: MTEB EcomRetrieval (default) type: C-MTEB/EcomRetrieval config: default split: dev revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9 metrics: - type: main_score value: 60.887 - type: map_at_1 value: 46.6 - type: map_at_10 value: 56.035000000000004 - type: map_at_100 value: 56.741 - type: map_at_1000 value: 56.764 - type: map_at_20 value: 56.513999999999996 - type: map_at_3 value: 53.733 - type: map_at_5 value: 54.913000000000004 - type: mrr_at_1 value: 46.6 - type: mrr_at_10 value: 56.034523809523776 - type: mrr_at_100 value: 56.74056360434383 - type: mrr_at_1000 value: 56.76373487222486 - type: mrr_at_20 value: 56.51374873879128 - type: mrr_at_3 value: 53.73333333333328 - type: mrr_at_5 value: 54.91333333333327 - type: nauc_map_at_1000_diff1 value: 65.13546939953387 - type: nauc_map_at_1000_max value: 43.358890946774494 - type: nauc_map_at_1000_std value: -9.973282105235036 - type: nauc_map_at_100_diff1 value: 65.12449309472493 - type: nauc_map_at_100_max value: 43.377100882923145 - type: nauc_map_at_100_std value: -9.971781228240555 - type: nauc_map_at_10_diff1 value: 64.83020018537475 - type: nauc_map_at_10_max value: 43.25969482323034 - type: nauc_map_at_10_std value: -10.120272176001547 - type: nauc_map_at_1_diff1 value: 69.58727592100516 - type: nauc_map_at_1_max value: 38.236494689522026 - type: nauc_map_at_1_std value: -14.833390831689597 - type: nauc_map_at_20_diff1 value: 65.01159809914586 - type: nauc_map_at_20_max value: 43.33440319829618 - type: nauc_map_at_20_std value: -10.039958228659726 - type: nauc_map_at_3_diff1 value: 65.2396323885909 - type: nauc_map_at_3_max value: 42.26904017378952 - type: nauc_map_at_3_std value: -11.793017036934044 - type: nauc_map_at_5_diff1 value: 64.96397227898036 - type: nauc_map_at_5_max value: 43.231333789145424 - type: nauc_map_at_5_std value: -10.349933732151372 - type: nauc_mrr_at_1000_diff1 value: 65.13546939953387 - type: nauc_mrr_at_1000_max value: 43.358890946774494 - type: nauc_mrr_at_1000_std value: -9.973282105235036 - type: 
nauc_mrr_at_100_diff1 value: 65.12449309472493 - type: nauc_mrr_at_100_max value: 43.377100882923145 - type: nauc_mrr_at_100_std value: -9.971781228240555 - type: nauc_mrr_at_10_diff1 value: 64.83020018537475 - type: nauc_mrr_at_10_max value: 43.25969482323034 - type: nauc_mrr_at_10_std value: -10.120272176001547 - type: nauc_mrr_at_1_diff1 value: 69.58727592100516 - type: nauc_mrr_at_1_max value: 38.236494689522026 - type: nauc_mrr_at_1_std value: -14.833390831689597 - type: nauc_mrr_at_20_diff1 value: 65.01159809914586 - type: nauc_mrr_at_20_max value: 43.33440319829618 - type: nauc_mrr_at_20_std value: -10.039958228659726 - type: nauc_mrr_at_3_diff1 value: 65.2396323885909 - type: nauc_mrr_at_3_max value: 42.26904017378952 - type: nauc_mrr_at_3_std value: -11.793017036934044 - type: nauc_mrr_at_5_diff1 value: 64.96397227898036 - type: nauc_mrr_at_5_max value: 43.231333789145424 - type: nauc_mrr_at_5_std value: -10.349933732151372 - type: nauc_ndcg_at_1000_diff1 value: 64.26802655199876 - type: nauc_ndcg_at_1000_max value: 45.854310744745185 - type: nauc_ndcg_at_1000_std value: -6.184417305204082 - type: nauc_ndcg_at_100_diff1 value: 63.99268329609827 - type: nauc_ndcg_at_100_max value: 46.31270128748375 - type: nauc_ndcg_at_100_std value: -6.1393433180558965 - type: nauc_ndcg_at_10_diff1 value: 62.6735104141137 - type: nauc_ndcg_at_10_max value: 45.54954799462398 - type: nauc_ndcg_at_10_std value: -7.348851199024871 - type: nauc_ndcg_at_1_diff1 value: 69.58727592100516 - type: nauc_ndcg_at_1_max value: 38.236494689522026 - type: nauc_ndcg_at_1_std value: -14.833390831689597 - type: nauc_ndcg_at_20_diff1 value: 63.25899651677274 - type: nauc_ndcg_at_20_max value: 45.952196968886014 - type: nauc_ndcg_at_20_std value: -6.807607465125713 - type: nauc_ndcg_at_3_diff1 value: 63.65618337476822 - type: nauc_ndcg_at_3_max value: 43.507890965228945 - type: nauc_ndcg_at_3_std value: -10.73845622217601 - type: nauc_ndcg_at_5_diff1 value: 63.079162432921855 - type: nauc_ndcg_at_5_max value: 45.38303443868148 - type: nauc_ndcg_at_5_std value: -8.063657824835534 - type: nauc_precision_at_1000_diff1 value: 63.01459977930557 - type: nauc_precision_at_1000_max value: 92.4253034547151 - type: nauc_precision_at_1000_std value: 84.4845513963158 - type: nauc_precision_at_100_diff1 value: 57.17217119405878 - type: nauc_precision_at_100_max value: 80.70049725316484 - type: nauc_precision_at_100_std value: 41.78392287147403 - type: nauc_precision_at_10_diff1 value: 53.115665404390725 - type: nauc_precision_at_10_max value: 55.73825657341263 - type: nauc_precision_at_10_std value: 5.406226305013257 - type: nauc_precision_at_1_diff1 value: 69.58727592100516 - type: nauc_precision_at_1_max value: 38.236494689522026 - type: nauc_precision_at_1_std value: -14.833390831689597 - type: nauc_precision_at_20_diff1 value: 53.77730697622828 - type: nauc_precision_at_20_max value: 61.88170819253054 - type: nauc_precision_at_20_std value: 13.678730470003856 - type: nauc_precision_at_3_diff1 value: 58.580196992291455 - type: nauc_precision_at_3_max value: 47.404834585376626 - type: nauc_precision_at_3_std value: -7.374978769024051 - type: nauc_precision_at_5_diff1 value: 56.44564652606437 - type: nauc_precision_at_5_max value: 53.08973975162324 - type: nauc_precision_at_5_std value: 0.22762700141423803 - type: nauc_recall_at_1000_diff1 value: 63.01459977930565 - type: nauc_recall_at_1000_max value: 92.42530345471532 - type: nauc_recall_at_1000_std value: 84.48455139631602 - type: nauc_recall_at_100_diff1 value: 
57.17217119405904 - type: nauc_recall_at_100_max value: 80.70049725316468 - type: nauc_recall_at_100_std value: 41.783922871474275 - type: nauc_recall_at_10_diff1 value: 53.11566540439087 - type: nauc_recall_at_10_max value: 55.738256573412656 - type: nauc_recall_at_10_std value: 5.406226305013377 - type: nauc_recall_at_1_diff1 value: 69.58727592100516 - type: nauc_recall_at_1_max value: 38.236494689522026 - type: nauc_recall_at_1_std value: -14.833390831689597 - type: nauc_recall_at_20_diff1 value: 53.77730697622846 - type: nauc_recall_at_20_max value: 61.881708192530525 - type: nauc_recall_at_20_std value: 13.678730470003947 - type: nauc_recall_at_3_diff1 value: 58.5801969922914 - type: nauc_recall_at_3_max value: 47.40483458537654 - type: nauc_recall_at_3_std value: -7.37497876902413 - type: nauc_recall_at_5_diff1 value: 56.445646526064394 - type: nauc_recall_at_5_max value: 53.08973975162332 - type: nauc_recall_at_5_std value: 0.22762700141428024 - type: ndcg_at_1 value: 46.6 - type: ndcg_at_10 value: 60.887 - type: ndcg_at_100 value: 64.18199999999999 - type: ndcg_at_1000 value: 64.726 - type: ndcg_at_20 value: 62.614999999999995 - type: ndcg_at_3 value: 56.038 - type: ndcg_at_5 value: 58.150999999999996 - type: precision_at_1 value: 46.6 - type: precision_at_10 value: 7.630000000000001 - type: precision_at_100 value: 0.914 - type: precision_at_1000 value: 0.096 - type: precision_at_20 value: 4.154999999999999 - type: precision_at_3 value: 20.9 - type: precision_at_5 value: 13.56 - type: recall_at_1 value: 46.6 - type: recall_at_10 value: 76.3 - type: recall_at_100 value: 91.4 - type: recall_at_1000 value: 95.6 - type: recall_at_20 value: 83.1 - type: recall_at_3 value: 62.7 - type: recall_at_5 value: 67.80000000000001 - task: type: Classification dataset: name: MTEB EmotionClassification (default) type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 73.29999999999998 - type: f1 value: 67.71473706580302 - type: f1_weighted value: 74.83537255312045 - type: main_score value: 73.29999999999998 - task: type: Retrieval dataset: name: MTEB FEVER (default) type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 78.371 - type: map_at_10 value: 85.762 - type: map_at_100 value: 85.954 - type: map_at_1000 value: 85.966 - type: map_at_20 value: 85.887 - type: map_at_3 value: 84.854 - type: map_at_5 value: 85.408 - type: mrr_at_1 value: 84.443 - type: mrr_at_10 value: 90.432 - type: mrr_at_100 value: 90.483 - type: mrr_at_1000 value: 90.484 - type: mrr_at_20 value: 90.473 - type: mrr_at_3 value: 89.89399999999999 - type: mrr_at_5 value: 90.244 - type: ndcg_at_1 value: 84.443 - type: ndcg_at_10 value: 89.05499999999999 - type: ndcg_at_100 value: 89.68 - type: ndcg_at_1000 value: 89.87899999999999 - type: ndcg_at_20 value: 89.381 - type: ndcg_at_3 value: 87.73100000000001 - type: ndcg_at_5 value: 88.425 - type: precision_at_1 value: 84.443 - type: precision_at_10 value: 10.520999999999999 - type: precision_at_100 value: 1.103 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_20 value: 5.362 - type: precision_at_3 value: 33.198 - type: precision_at_5 value: 20.441000000000003 - type: recall_at_1 value: 78.371 - type: recall_at_10 value: 94.594 - type: recall_at_100 value: 96.97099999999999 - type: recall_at_1000 value: 98.18 - type: recall_at_20 value: 95.707 - type: recall_at_3 value: 90.853 - type: recall_at_5 value: 
92.74799999999999 - type: main_score value: 89.05499999999999 - task: type: Retrieval dataset: name: MTEB FiQA2018 (default) type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 23.810000000000002 - type: map_at_10 value: 39.051 - type: map_at_100 value: 41.231 - type: map_at_1000 value: 41.376000000000005 - type: map_at_20 value: 40.227000000000004 - type: map_at_3 value: 33.915 - type: map_at_5 value: 36.459 - type: mrr_at_1 value: 48.148 - type: mrr_at_10 value: 55.765 - type: mrr_at_100 value: 56.495 - type: mrr_at_1000 value: 56.525999999999996 - type: mrr_at_20 value: 56.213 - type: mrr_at_3 value: 53.086 - type: mrr_at_5 value: 54.513999999999996 - type: ndcg_at_1 value: 48.148 - type: ndcg_at_10 value: 47.349999999999994 - type: ndcg_at_100 value: 54.61899999999999 - type: ndcg_at_1000 value: 56.830000000000005 - type: ndcg_at_20 value: 50.143 - type: ndcg_at_3 value: 43.108000000000004 - type: ndcg_at_5 value: 44.023 - type: precision_at_1 value: 48.148 - type: precision_at_10 value: 13.441 - type: precision_at_100 value: 2.085 - type: precision_at_1000 value: 0.248 - type: precision_at_20 value: 7.870000000000001 - type: precision_at_3 value: 28.909000000000002 - type: precision_at_5 value: 20.957 - type: recall_at_1 value: 23.810000000000002 - type: recall_at_10 value: 54.303000000000004 - type: recall_at_100 value: 81.363 - type: recall_at_1000 value: 94.391 - type: recall_at_20 value: 63.056999999999995 - type: recall_at_3 value: 38.098 - type: recall_at_5 value: 44.414 - type: main_score value: 47.349999999999994 - task: type: Classification dataset: name: MTEB GeoreviewClassification (default) type: ai-forever/georeview-classification config: default split: test revision: 3765c0d1de6b7d264bc459433c45e5a75513839c metrics: - type: accuracy value: 48.0126953125 - type: f1 value: 47.65764016160488 - type: f1_weighted value: 47.65701659482088 - type: main_score value: 48.0126953125 - task: type: Clustering dataset: name: MTEB GeoreviewClusteringP2P (default) type: ai-forever/georeview-clustering-p2p config: default split: test revision: 97a313c8fc85b47f13f33e7e9a95c1ad888c7fec metrics: - type: main_score value: 73.62357853672266 - type: v_measure value: 73.62357853672266 - type: v_measure_std value: 0.5942247545535766 - task: type: Retrieval dataset: name: MTEB GerDaLIR (default) type: jinaai/ger_da_lir config: default split: test revision: 0bb47f1d73827e96964edb84dfe552f62f4fd5eb metrics: - type: main_score value: 16.227 - type: map_at_1 value: 8.082 - type: map_at_10 value: 12.959999999999999 - type: map_at_100 value: 13.923 - type: map_at_1000 value: 14.030999999999999 - type: map_at_20 value: 13.453000000000001 - type: map_at_3 value: 11.018 - type: map_at_5 value: 12.056000000000001 - type: mrr_at_1 value: 8.993332249146203 - type: mrr_at_10 value: 13.994013092850247 - type: mrr_at_100 value: 14.913737673149308 - type: mrr_at_1000 value: 15.00843809934407 - type: mrr_at_20 value: 14.470268462334007 - type: mrr_at_3 value: 12.000596302921846 - type: mrr_at_5 value: 13.070689000921561 - type: nauc_map_at_1000_diff1 value: 28.559639584013286 - type: nauc_map_at_1000_max value: 25.533800126086714 - type: nauc_map_at_1000_std value: 9.826551026628666 - type: nauc_map_at_100_diff1 value: 28.544724499331696 - type: nauc_map_at_100_max value: 25.46734324526386 - type: nauc_map_at_100_std value: 9.739314481785591 - type: nauc_map_at_10_diff1 value: 28.77447517718118 - type: nauc_map_at_10_max value: 
24.7431615237795 - type: nauc_map_at_10_std value: 8.349878188033646 - type: nauc_map_at_1_diff1 value: 37.405452629895514 - type: nauc_map_at_1_max value: 24.444208978394023 - type: nauc_map_at_1_std value: 4.043820373810528 - type: nauc_map_at_20_diff1 value: 28.69764217789062 - type: nauc_map_at_20_max value: 25.111848355996496 - type: nauc_map_at_20_std value: 9.034829905305918 - type: nauc_map_at_3_diff1 value: 30.89053285076882 - type: nauc_map_at_3_max value: 24.862886115911152 - type: nauc_map_at_3_std value: 6.654260832396586 - type: nauc_map_at_5_diff1 value: 29.230629676604263 - type: nauc_map_at_5_max value: 24.374302288018583 - type: nauc_map_at_5_std value: 7.341846952319046 - type: nauc_mrr_at_1000_diff1 value: 28.086147932781426 - type: nauc_mrr_at_1000_max value: 25.98698528264653 - type: nauc_mrr_at_1000_std value: 9.917554348624545 - type: nauc_mrr_at_100_diff1 value: 28.069163279791336 - type: nauc_mrr_at_100_max value: 25.949440010886804 - type: nauc_mrr_at_100_std value: 9.874340979732578 - type: nauc_mrr_at_10_diff1 value: 28.239920869530046 - type: nauc_mrr_at_10_max value: 25.351271409498576 - type: nauc_mrr_at_10_std value: 8.669862759875162 - type: nauc_mrr_at_1_diff1 value: 35.96543040207856 - type: nauc_mrr_at_1_max value: 25.488936487231967 - type: nauc_mrr_at_1_std value: 4.76439131038345 - type: nauc_mrr_at_20_diff1 value: 28.18865871284607 - type: nauc_mrr_at_20_max value: 25.67121763344746 - type: nauc_mrr_at_20_std value: 9.297910707519472 - type: nauc_mrr_at_3_diff1 value: 30.166714199740717 - type: nauc_mrr_at_3_max value: 25.541792491964877 - type: nauc_mrr_at_3_std value: 7.083090296398472 - type: nauc_mrr_at_5_diff1 value: 28.68475284656478 - type: nauc_mrr_at_5_max value: 24.994071363482835 - type: nauc_mrr_at_5_std value: 7.687507254902365 - type: nauc_ndcg_at_1000_diff1 value: 25.292792613586467 - type: nauc_ndcg_at_1000_max value: 29.211905289377178 - type: nauc_ndcg_at_1000_std value: 18.088867467320355 - type: nauc_ndcg_at_100_diff1 value: 25.026905011089152 - type: nauc_ndcg_at_100_max value: 27.98822281254431 - type: nauc_ndcg_at_100_std value: 16.69456904301902 - type: nauc_ndcg_at_10_diff1 value: 25.972279051109503 - type: nauc_ndcg_at_10_max value: 24.86486482734957 - type: nauc_ndcg_at_10_std value: 10.398605822106353 - type: nauc_ndcg_at_1_diff1 value: 36.134710485184826 - type: nauc_ndcg_at_1_max value: 25.384572790326025 - type: nauc_ndcg_at_1_std value: 4.591863033771824 - type: nauc_ndcg_at_20_diff1 value: 25.850033660205536 - type: nauc_ndcg_at_20_max value: 25.944243193140515 - type: nauc_ndcg_at_20_std value: 12.392409721204892 - type: nauc_ndcg_at_3_diff1 value: 29.1966056380018 - type: nauc_ndcg_at_3_max value: 24.978843156259913 - type: nauc_ndcg_at_3_std value: 7.353914459205087 - type: nauc_ndcg_at_5_diff1 value: 26.795315295756282 - type: nauc_ndcg_at_5_max value: 24.1196789150412 - type: nauc_ndcg_at_5_std value: 8.311970988265172 - type: nauc_precision_at_1000_diff1 value: 9.128270550217984 - type: nauc_precision_at_1000_max value: 35.79286915973607 - type: nauc_precision_at_1000_std value: 39.15669472887154 - type: nauc_precision_at_100_diff1 value: 14.770289799034384 - type: nauc_precision_at_100_max value: 34.58262232264337 - type: nauc_precision_at_100_std value: 34.101148102981384 - type: nauc_precision_at_10_diff1 value: 19.899104673118178 - type: nauc_precision_at_10_max value: 26.636940338985625 - type: nauc_precision_at_10_std value: 15.73871357255849 - type: nauc_precision_at_1_diff1 value: 36.134710485184826 - 
type: nauc_precision_at_1_max value: 25.384572790326025 - type: nauc_precision_at_1_std value: 4.591863033771824 - type: nauc_precision_at_20_diff1 value: 19.423457975148942 - type: nauc_precision_at_20_max value: 29.58123490878582 - type: nauc_precision_at_20_std value: 20.847850110821618 - type: nauc_precision_at_3_diff1 value: 24.986416623492918 - type: nauc_precision_at_3_max value: 25.973548400472975 - type: nauc_precision_at_3_std value: 9.486410455972823 - type: nauc_precision_at_5_diff1 value: 21.237741424923332 - type: nauc_precision_at_5_max value: 24.647141028200164 - type: nauc_precision_at_5_std value: 11.102785032334147 - type: nauc_recall_at_1000_diff1 value: 15.999714888817829 - type: nauc_recall_at_1000_max value: 44.34701908906545 - type: nauc_recall_at_1000_std value: 51.13471291594717 - type: nauc_recall_at_100_diff1 value: 17.401714890483706 - type: nauc_recall_at_100_max value: 33.39042631654808 - type: nauc_recall_at_100_std value: 33.944446168451584 - type: nauc_recall_at_10_diff1 value: 20.30036232399894 - type: nauc_recall_at_10_max value: 24.006718284396786 - type: nauc_recall_at_10_std value: 14.049375108518669 - type: nauc_recall_at_1_diff1 value: 37.405452629895514 - type: nauc_recall_at_1_max value: 24.444208978394023 - type: nauc_recall_at_1_std value: 4.043820373810528 - type: nauc_recall_at_20_diff1 value: 20.23582802609045 - type: nauc_recall_at_20_max value: 26.408063410785243 - type: nauc_recall_at_20_std value: 18.617479515468112 - type: nauc_recall_at_3_diff1 value: 25.53221830103098 - type: nauc_recall_at_3_max value: 24.283712329152678 - type: nauc_recall_at_3_std value: 8.428947805841867 - type: nauc_recall_at_5_diff1 value: 21.741499601020823 - type: nauc_recall_at_5_max value: 22.754924586295296 - type: nauc_recall_at_5_std value: 9.966736688169814 - type: ndcg_at_1 value: 8.977 - type: ndcg_at_10 value: 16.227 - type: ndcg_at_100 value: 21.417 - type: ndcg_at_1000 value: 24.451 - type: ndcg_at_20 value: 17.982 - type: ndcg_at_3 value: 12.206999999999999 - type: ndcg_at_5 value: 14.059 - type: precision_at_1 value: 8.977 - type: precision_at_10 value: 2.933 - type: precision_at_100 value: 0.59 - type: precision_at_1000 value: 0.087 - type: precision_at_20 value: 1.8599999999999999 - type: precision_at_3 value: 5.550999999999999 - type: precision_at_5 value: 4.340999999999999 - type: recall_at_1 value: 8.082 - type: recall_at_10 value: 25.52 - type: recall_at_100 value: 50.32 - type: recall_at_1000 value: 74.021 - type: recall_at_20 value: 32.229 - type: recall_at_3 value: 14.66 - type: recall_at_5 value: 19.062 - task: type: Retrieval dataset: name: MTEB GermanDPR (default) type: deepset/germandpr config: default split: test revision: 5129d02422a66be600ac89cd3e8531b4f97d347d metrics: - type: main_score value: 82.422 - type: map_at_1 value: 64.39 - type: map_at_10 value: 77.273 - type: map_at_100 value: 77.375 - type: map_at_1000 value: 77.376 - type: map_at_20 value: 77.351 - type: map_at_3 value: 75.46300000000001 - type: map_at_5 value: 76.878 - type: mrr_at_1 value: 64.19512195121952 - type: mrr_at_10 value: 77.15842044134736 - type: mrr_at_100 value: 77.2604854308704 - type: mrr_at_1000 value: 77.26087882190109 - type: mrr_at_20 value: 77.23572154560611 - type: mrr_at_3 value: 75.34959349593504 - type: mrr_at_5 value: 76.76422764227652 - type: nauc_map_at_1000_diff1 value: 49.73135253389972 - type: nauc_map_at_1000_max value: 8.665570717396145 - type: nauc_map_at_1000_std value: -25.920927572114522 - type: nauc_map_at_100_diff1 value: 
49.729170775336605 - type: nauc_map_at_100_max value: 8.66717979705074 - type: nauc_map_at_100_std value: -25.918338868918596 - type: nauc_map_at_10_diff1 value: 49.708681691445925 - type: nauc_map_at_10_max value: 8.830640635692113 - type: nauc_map_at_10_std value: -25.843238986304858 - type: nauc_map_at_1_diff1 value: 51.750022350988914 - type: nauc_map_at_1_max value: 3.599863010364626 - type: nauc_map_at_1_std value: -27.670122127567314 - type: nauc_map_at_20_diff1 value: 49.72609185887161 - type: nauc_map_at_20_max value: 8.766556053409218 - type: nauc_map_at_20_std value: -25.85975887517904 - type: nauc_map_at_3_diff1 value: 49.328512536255595 - type: nauc_map_at_3_max value: 9.475682028996795 - type: nauc_map_at_3_std value: -26.277349632171017 - type: nauc_map_at_5_diff1 value: 49.42801822186142 - type: nauc_map_at_5_max value: 8.788822474357252 - type: nauc_map_at_5_std value: -25.959260882028573 - type: nauc_mrr_at_1000_diff1 value: 50.13038598302397 - type: nauc_mrr_at_1000_max value: 8.734338637484832 - type: nauc_mrr_at_1000_std value: -26.653343549855908 - type: nauc_mrr_at_100_diff1 value: 50.12820392111392 - type: nauc_mrr_at_100_max value: 8.735940503917966 - type: nauc_mrr_at_100_std value: -26.65074918231251 - type: nauc_mrr_at_10_diff1 value: 50.10567888458267 - type: nauc_mrr_at_10_max value: 8.898451291748575 - type: nauc_mrr_at_10_std value: -26.572046921975655 - type: nauc_mrr_at_1_diff1 value: 52.22769994409465 - type: nauc_mrr_at_1_max value: 3.6490820146062015 - type: nauc_mrr_at_1_std value: -28.535100562320498 - type: nauc_mrr_at_20_diff1 value: 50.12462222100699 - type: nauc_mrr_at_20_max value: 8.83487018268756 - type: nauc_mrr_at_20_std value: -26.591437036958332 - type: nauc_mrr_at_3_diff1 value: 49.6987353700016 - type: nauc_mrr_at_3_max value: 9.531003760756258 - type: nauc_mrr_at_3_std value: -26.949799063124818 - type: nauc_mrr_at_5_diff1 value: 49.823881656376585 - type: nauc_mrr_at_5_max value: 8.850404667985085 - type: nauc_mrr_at_5_std value: -26.680008966088582 - type: nauc_ndcg_at_1000_diff1 value: 49.41721203361181 - type: nauc_ndcg_at_1000_max value: 9.41093067609825 - type: nauc_ndcg_at_1000_std value: -25.499543637737567 - type: nauc_ndcg_at_100_diff1 value: 49.32810419509252 - type: nauc_ndcg_at_100_max value: 9.476216458766897 - type: nauc_ndcg_at_100_std value: -25.393856250990414 - type: nauc_ndcg_at_10_diff1 value: 49.181984436623694 - type: nauc_ndcg_at_10_max value: 10.65234732763274 - type: nauc_ndcg_at_10_std value: -24.737669349012297 - type: nauc_ndcg_at_1_diff1 value: 51.750022350988914 - type: nauc_ndcg_at_1_max value: 3.599863010364626 - type: nauc_ndcg_at_1_std value: -27.670122127567314 - type: nauc_ndcg_at_20_diff1 value: 49.275394594995056 - type: nauc_ndcg_at_20_max value: 10.402059796651923 - type: nauc_ndcg_at_20_std value: -24.82329915806705 - type: nauc_ndcg_at_3_diff1 value: 48.22614352152889 - type: nauc_ndcg_at_3_max value: 11.67464280791404 - type: nauc_ndcg_at_3_std value: -25.867824868234095 - type: nauc_ndcg_at_5_diff1 value: 48.35583502987241 - type: nauc_ndcg_at_5_max value: 10.494278750448451 - type: nauc_ndcg_at_5_std value: -25.11599634172764 - type: nauc_precision_at_1000_diff1 value: .nan - type: nauc_precision_at_1000_max value: .nan - type: nauc_precision_at_1000_std value: .nan - type: nauc_precision_at_100_diff1 value: -56.39478136433852 - type: nauc_precision_at_100_max value: 86.93518577529493 - type: nauc_precision_at_100_std value: 100.0 - type: nauc_precision_at_10_diff1 value: 38.662829729133094 - 
type: nauc_precision_at_10_max value: 56.38018435740605 - type: nauc_precision_at_10_std value: 6.288091897081105 - type: nauc_precision_at_1_diff1 value: 51.750022350988914 - type: nauc_precision_at_1_max value: 3.599863010364626 - type: nauc_precision_at_1_std value: -27.670122127567314 - type: nauc_precision_at_20_diff1 value: 34.739153182429085 - type: nauc_precision_at_20_max value: 84.86908403000989 - type: nauc_precision_at_20_std value: 29.156199421219455 - type: nauc_precision_at_3_diff1 value: 42.09287362529135 - type: nauc_precision_at_3_max value: 23.629152759287074 - type: nauc_precision_at_3_std value: -23.721376911302492 - type: nauc_precision_at_5_diff1 value: 36.03866171924644 - type: nauc_precision_at_5_max value: 29.166173558775327 - type: nauc_precision_at_5_std value: -15.096374563068448 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: -56.39478136433541 - type: nauc_recall_at_100_max value: 86.93518577528111 - type: nauc_recall_at_100_std value: 100.0 - type: nauc_recall_at_10_diff1 value: 38.66282972913384 - type: nauc_recall_at_10_max value: 56.3801843574071 - type: nauc_recall_at_10_std value: 6.288091897082639 - type: nauc_recall_at_1_diff1 value: 51.750022350988914 - type: nauc_recall_at_1_max value: 3.599863010364626 - type: nauc_recall_at_1_std value: -27.670122127567314 - type: nauc_recall_at_20_diff1 value: 34.7391531824321 - type: nauc_recall_at_20_max value: 84.86908403001016 - type: nauc_recall_at_20_std value: 29.156199421220748 - type: nauc_recall_at_3_diff1 value: 42.09287362529107 - type: nauc_recall_at_3_max value: 23.629152759286946 - type: nauc_recall_at_3_std value: -23.72137691130291 - type: nauc_recall_at_5_diff1 value: 36.0386617192469 - type: nauc_recall_at_5_max value: 29.1661735587759 - type: nauc_recall_at_5_std value: -15.09637456306774 - type: ndcg_at_1 value: 64.39 - type: ndcg_at_10 value: 82.422 - type: ndcg_at_100 value: 82.86099999999999 - type: ndcg_at_1000 value: 82.87299999999999 - type: ndcg_at_20 value: 82.67999999999999 - type: ndcg_at_3 value: 78.967 - type: ndcg_at_5 value: 81.50699999999999 - type: precision_at_1 value: 64.39 - type: precision_at_10 value: 9.795 - type: precision_at_100 value: 0.9990000000000001 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.946 - type: precision_at_3 value: 29.691000000000003 - type: precision_at_5 value: 19.044 - type: recall_at_1 value: 64.39 - type: recall_at_10 value: 97.951 - type: recall_at_100 value: 99.902 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 98.92699999999999 - type: recall_at_3 value: 89.07300000000001 - type: recall_at_5 value: 95.22 - task: type: Retrieval dataset: name: MTEB GermanQuAD-Retrieval (default) type: mteb/germanquad-retrieval config: default split: test revision: f5c87ae5a2e7a5106606314eef45255f03151bb3 metrics: - type: main_score value: 94.15532365396247 - type: map_at_1 value: 90.789 - type: map_at_10 value: 94.24 - type: map_at_100 value: 94.283 - type: map_at_1000 value: 94.284 - type: map_at_20 value: 94.272 - type: map_at_3 value: 93.913 - type: map_at_5 value: 94.155 - type: mrr_at_1 value: 90.78947368421053 - type: mrr_at_10 value: 94.23987411056376 - type: mrr_at_100 value: 94.28320936825 - type: mrr_at_1000 value: 94.28350209115848 - type: mrr_at_20 value: 94.271919092559 - type: mrr_at_3 value: 93.91258318209313 - type: mrr_at_5 value: 94.15532365396247 - type: nauc_map_at_1000_diff1 
value: 89.29089310650436 - type: nauc_map_at_1000_max value: 73.83868784032414 - type: nauc_map_at_1000_std value: -11.635778561889989 - type: nauc_map_at_100_diff1 value: 89.29077225707755 - type: nauc_map_at_100_max value: 73.84002740580378 - type: nauc_map_at_100_std value: -11.644096256165092 - type: nauc_map_at_10_diff1 value: 89.29117612292366 - type: nauc_map_at_10_max value: 73.97487984981221 - type: nauc_map_at_10_std value: -11.35191794373827 - type: nauc_map_at_1_diff1 value: 89.35436544117584 - type: nauc_map_at_1_max value: 70.35936815057701 - type: nauc_map_at_1_std value: -13.598996360976903 - type: nauc_map_at_20_diff1 value: 89.2530394052653 - type: nauc_map_at_20_max value: 73.83537529419839 - type: nauc_map_at_20_std value: -11.628272822028478 - type: nauc_map_at_3_diff1 value: 89.375111893546 - type: nauc_map_at_3_max value: 74.78900366026112 - type: nauc_map_at_3_std value: -12.720905253503274 - type: nauc_map_at_5_diff1 value: 89.35358300820893 - type: nauc_map_at_5_max value: 74.31996219723239 - type: nauc_map_at_5_std value: -10.768642638210867 - type: nauc_mrr_at_1000_diff1 value: 89.29089310650436 - type: nauc_mrr_at_1000_max value: 73.83868784032414 - type: nauc_mrr_at_1000_std value: -11.635778561889989 - type: nauc_mrr_at_100_diff1 value: 89.29077225707755 - type: nauc_mrr_at_100_max value: 73.84002740580378 - type: nauc_mrr_at_100_std value: -11.644096256165092 - type: nauc_mrr_at_10_diff1 value: 89.29117612292366 - type: nauc_mrr_at_10_max value: 73.97487984981221 - type: nauc_mrr_at_10_std value: -11.35191794373827 - type: nauc_mrr_at_1_diff1 value: 89.35436544117584 - type: nauc_mrr_at_1_max value: 70.35936815057701 - type: nauc_mrr_at_1_std value: -13.598996360976903 - type: nauc_mrr_at_20_diff1 value: 89.2530394052653 - type: nauc_mrr_at_20_max value: 73.83537529419839 - type: nauc_mrr_at_20_std value: -11.628272822028478 - type: nauc_mrr_at_3_diff1 value: 89.375111893546 - type: nauc_mrr_at_3_max value: 74.78900366026112 - type: nauc_mrr_at_3_std value: -12.720905253503274 - type: nauc_mrr_at_5_diff1 value: 89.35358300820893 - type: nauc_mrr_at_5_max value: 74.31996219723239 - type: nauc_mrr_at_5_std value: -10.768642638210867 - type: nauc_ndcg_at_1000_diff1 value: 89.27620775856863 - type: nauc_ndcg_at_1000_max value: 74.2985757362615 - type: nauc_ndcg_at_1000_std value: -11.236142819703023 - type: nauc_ndcg_at_100_diff1 value: 89.27284787540731 - type: nauc_ndcg_at_100_max value: 74.33539303365968 - type: nauc_ndcg_at_100_std value: -11.469413615851936 - type: nauc_ndcg_at_10_diff1 value: 89.21496710661724 - type: nauc_ndcg_at_10_max value: 75.02035398490516 - type: nauc_ndcg_at_10_std value: -9.903255803665814 - type: nauc_ndcg_at_1_diff1 value: 89.35436544117584 - type: nauc_ndcg_at_1_max value: 70.35936815057701 - type: nauc_ndcg_at_1_std value: -13.598996360976903 - type: nauc_ndcg_at_20_diff1 value: 89.03561289544179 - type: nauc_ndcg_at_20_max value: 74.4006766600049 - type: nauc_ndcg_at_20_std value: -11.129237862587743 - type: nauc_ndcg_at_3_diff1 value: 89.46540193201693 - type: nauc_ndcg_at_3_max value: 76.87093548368378 - type: nauc_ndcg_at_3_std value: -12.484902872086767 - type: nauc_ndcg_at_5_diff1 value: 89.39924941584766 - type: nauc_ndcg_at_5_max value: 75.96975269092722 - type: nauc_ndcg_at_5_std value: -8.180295581144833 - type: nauc_precision_at_1000_diff1 value: 100.0 - type: nauc_precision_at_1000_max value: 100.0 - type: nauc_precision_at_1000_std value: 100.0 - type: nauc_precision_at_100_diff1 value: 86.93074003795302 - type: 
nauc_precision_at_100_max value: 100.0 - type: nauc_precision_at_100_std value: -174.07785375176616 - type: nauc_precision_at_10_diff1 value: 87.43064119412082 - type: nauc_precision_at_10_max value: 90.60785783417448 - type: nauc_precision_at_10_std value: 15.378710059645906 - type: nauc_precision_at_1_diff1 value: 89.35436544117584 - type: nauc_precision_at_1_max value: 70.35936815057701 - type: nauc_precision_at_1_std value: -13.598996360976903 - type: nauc_precision_at_20_diff1 value: 78.78206037685919 - type: nauc_precision_at_20_max value: 82.52264166455923 - type: nauc_precision_at_20_std value: -5.95806599216658 - type: nauc_precision_at_3_diff1 value: 90.12709256456401 - type: nauc_precision_at_3_max value: 90.72678805838154 - type: nauc_precision_at_3_std value: -11.047599315631993 - type: nauc_precision_at_5_diff1 value: 89.9066873566561 - type: nauc_precision_at_5_max value: 93.51571626543664 - type: nauc_precision_at_5_std value: 22.632403279126162 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 86.93074003793416 - type: nauc_recall_at_100_max value: 100.0 - type: nauc_recall_at_100_std value: -174.07785375175723 - type: nauc_recall_at_10_diff1 value: 87.43064119411991 - type: nauc_recall_at_10_max value: 90.60785783417579 - type: nauc_recall_at_10_std value: 15.378710059643607 - type: nauc_recall_at_1_diff1 value: 89.35436544117584 - type: nauc_recall_at_1_max value: 70.35936815057701 - type: nauc_recall_at_1_std value: -13.598996360976903 - type: nauc_recall_at_20_diff1 value: 78.78206037685645 - type: nauc_recall_at_20_max value: 82.52264166455791 - type: nauc_recall_at_20_std value: -5.958065992168697 - type: nauc_recall_at_3_diff1 value: 90.12709256456463 - type: nauc_recall_at_3_max value: 90.7267880583832 - type: nauc_recall_at_3_std value: -11.047599315631881 - type: nauc_recall_at_5_diff1 value: 89.90668735665676 - type: nauc_recall_at_5_max value: 93.51571626543753 - type: nauc_recall_at_5_std value: 22.632403279126112 - type: ndcg_at_1 value: 90.789 - type: ndcg_at_10 value: 95.46 - type: ndcg_at_100 value: 95.652 - type: ndcg_at_1000 value: 95.659 - type: ndcg_at_20 value: 95.575 - type: ndcg_at_3 value: 94.82000000000001 - type: ndcg_at_5 value: 95.26400000000001 - type: precision_at_1 value: 90.789 - type: precision_at_10 value: 9.908999999999999 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.977 - type: precision_at_3 value: 32.471 - type: precision_at_5 value: 19.701 - type: recall_at_1 value: 90.789 - type: recall_at_10 value: 99.093 - type: recall_at_100 value: 99.955 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 99.546 - type: recall_at_3 value: 97.414 - type: recall_at_5 value: 98.503 - task: type: STS dataset: name: MTEB GermanSTSBenchmark (default) type: jinaai/german-STSbenchmark config: default split: test revision: e36907544d44c3a247898ed81540310442329e20 metrics: - type: cosine_pearson value: 86.55319003300265 - type: cosine_spearman value: 87.50267373081324 - type: euclidean_pearson value: 87.41630636501863 - type: euclidean_spearman value: 88.02170803409365 - type: main_score value: 87.50267373081324 - type: manhattan_pearson value: 87.33703179056744 - type: manhattan_spearman value: 87.99192826922514 - type: pearson value: 86.55319003300265 - type: spearman value: 87.50267373081324 - task: type: Clustering dataset: name: MTEB HALClusteringS2S 
(default) type: lyon-nlp/clustering-hal-s2s config: default split: test revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915 metrics: - type: main_score value: 27.477557517301303 - type: v_measure value: 27.477557517301303 - type: v_measure_std value: 3.3525736581861336 - task: type: Classification dataset: name: MTEB HeadlineClassification (default) type: ai-forever/headline-classification config: default split: test revision: 2fe05ee6b5832cda29f2ef7aaad7b7fe6a3609eb metrics: - type: accuracy value: 75.0830078125 - type: f1 value: 75.08863209267814 - type: f1_weighted value: 75.08895979060917 - type: main_score value: 75.0830078125 - task: type: Retrieval dataset: name: MTEB HotpotQA (default) type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 38.143 - type: map_at_10 value: 55.916999999999994 - type: map_at_100 value: 56.706 - type: map_at_1000 value: 56.77100000000001 - type: map_at_20 value: 56.367 - type: map_at_3 value: 53.111 - type: map_at_5 value: 54.839000000000006 - type: mrr_at_1 value: 76.286 - type: mrr_at_10 value: 81.879 - type: mrr_at_100 value: 82.09100000000001 - type: mrr_at_1000 value: 82.101 - type: mrr_at_20 value: 82.01 - type: mrr_at_3 value: 80.972 - type: mrr_at_5 value: 81.537 - type: ndcg_at_1 value: 76.286 - type: ndcg_at_10 value: 64.673 - type: ndcg_at_100 value: 67.527 - type: ndcg_at_1000 value: 68.857 - type: ndcg_at_20 value: 65.822 - type: ndcg_at_3 value: 60.616 - type: ndcg_at_5 value: 62.827999999999996 - type: precision_at_1 value: 76.286 - type: precision_at_10 value: 13.196 - type: precision_at_100 value: 1.544 - type: precision_at_1000 value: 0.172 - type: precision_at_20 value: 6.968000000000001 - type: precision_at_3 value: 37.992 - type: precision_at_5 value: 24.54 - type: recall_at_1 value: 38.143 - type: recall_at_10 value: 65.982 - type: recall_at_100 value: 77.225 - type: recall_at_1000 value: 86.077 - type: recall_at_20 value: 69.68299999999999 - type: recall_at_3 value: 56.989000000000004 - type: recall_at_5 value: 61.35 - type: main_score value: 64.673 - task: type: Classification dataset: name: MTEB IFlyTek (default) type: C-MTEB/IFlyTek-classification config: default split: validation revision: 421605374b29664c5fc098418fe20ada9bd55f8a metrics: - type: accuracy value: 41.67756829549827 - type: f1 value: 33.929325579581636 - type: f1_weighted value: 43.03952025643197 - type: main_score value: 41.67756829549827 - task: type: Classification dataset: name: MTEB ImdbClassification (default) type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 91.90440000000001 - type: ap value: 88.78663714603425 - type: ap_weighted value: 88.78663714603425 - type: f1 value: 91.89564361975891 - type: f1_weighted value: 91.89564361975891 - type: main_score value: 91.90440000000001 - task: type: Classification dataset: name: MTEB InappropriatenessClassification (default) type: ai-forever/inappropriateness-classification config: default split: test revision: 601651fdc45ef243751676e62dd7a19f491c0285 metrics: - type: accuracy value: 61.0498046875 - type: ap value: 57.04240566648215 - type: ap_weighted value: 57.04240566648215 - type: f1 value: 60.867630038606954 - type: f1_weighted value: 60.867630038606954 - type: main_score value: 61.0498046875 - task: type: Classification dataset: name: MTEB JDReview (default) type: C-MTEB/JDReview-classification config: default split: test revision: 
b7c64bd89eb87f8ded463478346f76731f07bf8b metrics: - type: accuracy value: 83.50844277673546 - type: ap value: 48.46732380712268 - type: ap_weighted value: 48.46732380712268 - type: f1 value: 77.43967451387445 - type: f1_weighted value: 84.78462929014114 - type: main_score value: 83.50844277673546 - task: type: Classification dataset: name: MTEB KinopoiskClassification (default) type: ai-forever/kinopoisk-sentiment-classification config: default split: test revision: 5911f26666ac11af46cb9c6849d0dc80a378af24 metrics: - type: accuracy value: 62.393333333333324 - type: f1 value: 61.35940129568015 - type: f1_weighted value: 61.35940129568015 - type: main_score value: 62.393333333333324 - task: type: STS dataset: name: MTEB LCQMC (default) type: C-MTEB/LCQMC config: default split: test revision: 17f9b096f80380fce5ed12a9be8be7784b337daf metrics: - type: cosine_pearson value: 67.74375505907872 - type: cosine_spearman value: 75.94582231399434 - type: euclidean_pearson value: 74.52501692443582 - type: euclidean_spearman value: 75.88428434746646 - type: main_score value: 75.94582231399434 - type: manhattan_pearson value: 74.55015441749529 - type: manhattan_spearman value: 75.83288262176175 - type: pearson value: 67.74375505907872 - type: spearman value: 75.94582231399434 - task: type: Retrieval dataset: name: MTEB LEMBNarrativeQARetrieval (default) type: dwzhu/LongEmbed config: default split: test revision: 6e346642246bfb4928c560ee08640dc84d074e8c metrics: - type: map_at_1 value: 23.093 - type: map_at_10 value: 30.227999999999998 - type: map_at_100 value: 31.423000000000002 - type: map_at_1000 value: 31.533 - type: map_at_20 value: 30.835 - type: map_at_3 value: 27.983999999999998 - type: map_at_5 value: 29.253 - type: mrr_at_1 value: 23.093 - type: mrr_at_10 value: 30.227999999999998 - type: mrr_at_100 value: 31.423000000000002 - type: mrr_at_1000 value: 31.533 - type: mrr_at_20 value: 30.835 - type: mrr_at_3 value: 27.983999999999998 - type: mrr_at_5 value: 29.253 - type: ndcg_at_1 value: 23.093 - type: ndcg_at_10 value: 34.297 - type: ndcg_at_100 value: 41.049 - type: ndcg_at_1000 value: 43.566 - type: ndcg_at_20 value: 36.52 - type: ndcg_at_3 value: 29.629 - type: ndcg_at_5 value: 31.926 - type: precision_at_1 value: 23.093 - type: precision_at_10 value: 4.735 - type: precision_at_100 value: 0.8109999999999999 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 2.8080000000000003 - type: precision_at_3 value: 11.468 - type: precision_at_5 value: 8.001 - type: recall_at_1 value: 23.093 - type: recall_at_10 value: 47.354 - type: recall_at_100 value: 81.147 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 56.16799999999999 - type: recall_at_3 value: 34.405 - type: recall_at_5 value: 40.004 - type: main_score value: 34.297 - type: map_at_1 value: 24.361 - type: map_at_10 value: 33.641 - type: map_at_100 value: 35.104 - type: map_at_1000 value: 35.127 - type: map_at_20 value: 34.388999999999996 - type: map_at_3 value: 30.255 - type: map_at_5 value: 32.079 - type: mrr_at_1 value: 24.361 - type: mrr_at_10 value: 33.641 - type: mrr_at_100 value: 35.104 - type: mrr_at_1000 value: 35.127 - type: mrr_at_20 value: 34.388999999999996 - type: mrr_at_3 value: 30.255 - type: mrr_at_5 value: 32.079 - type: ndcg_at_1 value: 24.361 - type: ndcg_at_10 value: 39.337 - type: ndcg_at_100 value: 47.384 - type: ndcg_at_1000 value: 47.75 - type: ndcg_at_20 value: 42.077999999999996 - type: ndcg_at_3 value: 32.235 - type: ndcg_at_5 value: 35.524 - type: precision_at_1 value: 24.361 - type: 
precision_at_10 value: 5.783 - type: precision_at_100 value: 0.975 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 3.435 - type: precision_at_3 value: 12.661 - type: precision_at_5 value: 9.193999999999999 - type: recall_at_1 value: 24.361 - type: recall_at_10 value: 57.826 - type: recall_at_100 value: 97.51100000000001 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 68.697 - type: recall_at_3 value: 37.983 - type: recall_at_5 value: 45.972 - type: main_score value: 39.337 - type: map_at_1 value: 53.667 - type: map_at_10 value: 61.719 - type: map_at_100 value: 62.471 - type: map_at_1000 value: 62.492000000000004 - type: map_at_20 value: 62.153000000000006 - type: map_at_3 value: 59.167 - type: map_at_5 value: 60.95 - type: mrr_at_1 value: 53.667 - type: mrr_at_10 value: 61.719 - type: mrr_at_100 value: 62.471 - type: mrr_at_1000 value: 62.492000000000004 - type: mrr_at_20 value: 62.153000000000006 - type: mrr_at_3 value: 59.167 - type: mrr_at_5 value: 60.95 - type: ndcg_at_1 value: 53.667 - type: ndcg_at_10 value: 66.018 - type: ndcg_at_100 value: 69.726 - type: ndcg_at_1000 value: 70.143 - type: ndcg_at_20 value: 67.61399999999999 - type: ndcg_at_3 value: 60.924 - type: ndcg_at_5 value: 64.10900000000001 - type: precision_at_1 value: 53.667 - type: precision_at_10 value: 7.9670000000000005 - type: precision_at_100 value: 0.97 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.3 - type: precision_at_3 value: 22.0 - type: precision_at_5 value: 14.732999999999999 - type: recall_at_1 value: 53.667 - type: recall_at_10 value: 79.667 - type: recall_at_100 value: 97.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 86.0 - type: recall_at_3 value: 66.0 - type: recall_at_5 value: 73.667 - type: main_score value: 66.018 - task: type: Retrieval dataset: name: MTEB LEMBNeedleRetrieval (default) type: dwzhu/LongEmbed config: default split: test_256 revision: 6e346642246bfb4928c560ee08640dc84d074e8c metrics: - type: map_at_1 value: 64.0 - type: map_at_10 value: 77.083 - type: map_at_100 value: 77.265 - type: map_at_1000 value: 77.265 - type: map_at_20 value: 77.265 - type: map_at_3 value: 76.333 - type: map_at_5 value: 76.833 - type: mrr_at_1 value: 64.0 - type: mrr_at_10 value: 77.083 - type: mrr_at_100 value: 77.265 - type: mrr_at_1000 value: 77.265 - type: mrr_at_20 value: 77.265 - type: mrr_at_3 value: 76.333 - type: mrr_at_5 value: 76.833 - type: ndcg_at_1 value: 64.0 - type: ndcg_at_10 value: 82.325 - type: ndcg_at_100 value: 82.883 - type: ndcg_at_1000 value: 82.883 - type: ndcg_at_20 value: 82.883 - type: ndcg_at_3 value: 80.833 - type: ndcg_at_5 value: 81.694 - type: precision_at_1 value: 64.0 - type: precision_at_10 value: 9.8 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 5.0 - type: precision_at_3 value: 31.333 - type: precision_at_5 value: 19.2 - type: recall_at_1 value: 64.0 - type: recall_at_10 value: 98.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 100.0 - type: recall_at_3 value: 94.0 - type: recall_at_5 value: 96.0 - type: main_score value: 64.0 - type: map_at_1 value: 100.0 - type: map_at_10 value: 100.0 - type: map_at_100 value: 100.0 - type: map_at_1000 value: 100.0 - type: map_at_20 value: 100.0 - type: map_at_3 value: 100.0 - type: map_at_5 value: 100.0 - type: mrr_at_1 value: 100.0 - type: mrr_at_10 value: 100.0 - type: mrr_at_100 value: 100.0 - type: mrr_at_1000 value: 100.0 - type: mrr_at_20 
value: 100.0 - type: mrr_at_3 value: 100.0 - type: mrr_at_5 value: 100.0 - type: ndcg_at_1 value: 100.0 - type: ndcg_at_10 value: 100.0 - type: ndcg_at_100 value: 100.0 - type: ndcg_at_1000 value: 100.0 - type: ndcg_at_20 value: 100.0 - type: ndcg_at_3 value: 100.0 - type: ndcg_at_5 value: 100.0 - type: precision_at_1 value: 100.0 - type: precision_at_10 value: 10.0 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 5.0 - type: precision_at_3 value: 33.333 - type: precision_at_5 value: 20.0 - type: recall_at_1 value: 100.0 - type: recall_at_10 value: 100.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 100.0 - type: recall_at_3 value: 100.0 - type: recall_at_5 value: 100.0 - type: main_score value: 100.0 - task: type: Retrieval dataset: name: MTEB LEMBSummScreenFDRetrieval (default) type: dwzhu/LongEmbed config: default split: validation revision: 6e346642246bfb4928c560ee08640dc84d074e8c metrics: - type: map_at_1 value: 84.821 - type: map_at_10 value: 90.11200000000001 - type: map_at_100 value: 90.158 - type: map_at_1000 value: 90.158 - type: map_at_20 value: 90.137 - type: map_at_3 value: 89.385 - type: map_at_5 value: 89.876 - type: mrr_at_1 value: 84.821 - type: mrr_at_10 value: 90.11200000000001 - type: mrr_at_100 value: 90.158 - type: mrr_at_1000 value: 90.158 - type: mrr_at_20 value: 90.137 - type: mrr_at_3 value: 89.385 - type: mrr_at_5 value: 89.876 - type: ndcg_at_1 value: 84.821 - type: ndcg_at_10 value: 92.334 - type: ndcg_at_100 value: 92.535 - type: ndcg_at_1000 value: 92.535 - type: ndcg_at_20 value: 92.414 - type: ndcg_at_3 value: 90.887 - type: ndcg_at_5 value: 91.758 - type: precision_at_1 value: 84.821 - type: precision_at_10 value: 9.911 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.97 - type: precision_at_3 value: 31.746000000000002 - type: precision_at_5 value: 19.464000000000002 - type: recall_at_1 value: 84.821 - type: recall_at_10 value: 99.107 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 99.405 - type: recall_at_3 value: 95.238 - type: recall_at_5 value: 97.321 - type: main_score value: 92.334 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (deu-deu) type: facebook/mlqa config: deu-deu split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 67.548 - type: map_at_1 value: 56.559000000000005 - type: map_at_10 value: 63.867 - type: map_at_100 value: 64.429 - type: map_at_1000 value: 64.457 - type: map_at_20 value: 64.215 - type: map_at_3 value: 62.109 - type: map_at_5 value: 63.101 - type: mrr_at_1 value: 56.56990915134057 - type: mrr_at_10 value: 63.86820789324668 - type: mrr_at_100 value: 64.42973602152581 - type: mrr_at_1000 value: 64.45818598090155 - type: mrr_at_20 value: 64.2163052263868 - type: mrr_at_3 value: 62.10946155550634 - type: mrr_at_5 value: 63.10104143585199 - type: nauc_map_at_1000_diff1 value: 73.78440163370111 - type: nauc_map_at_1000_max value: 66.37875518052162 - type: nauc_map_at_1000_std value: -17.063915098135396 - type: nauc_map_at_100_diff1 value: 73.77180802985815 - type: nauc_map_at_100_max value: 66.38365998362033 - type: nauc_map_at_100_std value: -17.053345109661972 - type: nauc_map_at_10_diff1 value: 73.70041876696037 - type: nauc_map_at_10_max value: 66.33213342705997 - type: nauc_map_at_10_std value: -17.40657791273925 - type: nauc_map_at_1_diff1 value: 
76.8784374396948 - type: nauc_map_at_1_max value: 64.07170606935357 - type: nauc_map_at_1_std value: -18.464213686790654 - type: nauc_map_at_20_diff1 value: 73.72371377231813 - type: nauc_map_at_20_max value: 66.42108121059451 - type: nauc_map_at_20_std value: -17.05384923889036 - type: nauc_map_at_3_diff1 value: 74.08287018839246 - type: nauc_map_at_3_max value: 66.42422337760333 - type: nauc_map_at_3_std value: -17.79503404131652 - type: nauc_map_at_5_diff1 value: 73.9294779027339 - type: nauc_map_at_5_max value: 66.51752041065726 - type: nauc_map_at_5_std value: -17.67309805113804 - type: nauc_mrr_at_1000_diff1 value: 73.78389736923545 - type: nauc_mrr_at_1000_max value: 66.37929720858341 - type: nauc_mrr_at_1000_std value: -17.058591711291278 - type: nauc_mrr_at_100_diff1 value: 73.77126451253136 - type: nauc_mrr_at_100_max value: 66.38405917246607 - type: nauc_mrr_at_100_std value: -17.047251035212863 - type: nauc_mrr_at_10_diff1 value: 73.69960470665124 - type: nauc_mrr_at_10_max value: 66.33265194210313 - type: nauc_mrr_at_10_std value: -17.399659076827998 - type: nauc_mrr_at_1_diff1 value: 76.8689850260726 - type: nauc_mrr_at_1_max value: 64.09858188287487 - type: nauc_mrr_at_1_std value: -18.46064784201847 - type: nauc_mrr_at_20_diff1 value: 73.72312682063128 - type: nauc_mrr_at_20_max value: 66.42181932858745 - type: nauc_mrr_at_20_std value: -17.04690257511092 - type: nauc_mrr_at_3_diff1 value: 74.08287018839246 - type: nauc_mrr_at_3_max value: 66.42422337760333 - type: nauc_mrr_at_3_std value: -17.79503404131652 - type: nauc_mrr_at_5_diff1 value: 73.9294779027339 - type: nauc_mrr_at_5_max value: 66.51752041065726 - type: nauc_mrr_at_5_std value: -17.67309805113804 - type: nauc_ndcg_at_1000_diff1 value: 72.97825548342801 - type: nauc_ndcg_at_1000_max value: 66.96275437178257 - type: nauc_ndcg_at_1000_std value: -15.611902299641587 - type: nauc_ndcg_at_100_diff1 value: 72.58724738936613 - type: nauc_ndcg_at_100_max value: 67.16774012704182 - type: nauc_ndcg_at_100_std value: -14.945088654796812 - type: nauc_ndcg_at_10_diff1 value: 72.16253640477947 - type: nauc_ndcg_at_10_max value: 67.01746849484621 - type: nauc_ndcg_at_10_std value: -16.46102507270809 - type: nauc_ndcg_at_1_diff1 value: 76.8689850260726 - type: nauc_ndcg_at_1_max value: 64.09858188287487 - type: nauc_ndcg_at_1_std value: -18.46064784201847 - type: nauc_ndcg_at_20_diff1 value: 72.19995325129975 - type: nauc_ndcg_at_20_max value: 67.39639713797962 - type: nauc_ndcg_at_20_std value: -15.091689370748531 - type: nauc_ndcg_at_3_diff1 value: 73.13123604206514 - type: nauc_ndcg_at_3_max value: 67.23123167871547 - type: nauc_ndcg_at_3_std value: -17.492755234009156 - type: nauc_ndcg_at_5_diff1 value: 72.8154718929895 - type: nauc_ndcg_at_5_max value: 67.44578008373777 - type: nauc_ndcg_at_5_std value: -17.251840358751362 - type: nauc_precision_at_1000_diff1 value: 47.89748325983604 - type: nauc_precision_at_1000_max value: 70.47466197804906 - type: nauc_precision_at_1000_std value: 72.66193512114775 - type: nauc_precision_at_100_diff1 value: 59.493743734005356 - type: nauc_precision_at_100_max value: 74.02140147220713 - type: nauc_precision_at_100_std value: 17.26664098026236 - type: nauc_precision_at_10_diff1 value: 64.94415011040277 - type: nauc_precision_at_10_max value: 69.6963814950747 - type: nauc_precision_at_10_std value: -11.663043657012954 - type: nauc_precision_at_1_diff1 value: 76.8689850260726 - type: nauc_precision_at_1_max value: 64.09858188287487 - type: nauc_precision_at_1_std value: -18.46064784201847 
- type: nauc_precision_at_20_diff1 value: 63.145886909986416 - type: nauc_precision_at_20_max value: 72.95708033630744 - type: nauc_precision_at_20_std value: -1.5039593629280323 - type: nauc_precision_at_3_diff1 value: 69.88902201644449 - type: nauc_precision_at_3_max value: 69.80499971089935 - type: nauc_precision_at_3_std value: -16.444680766676647 - type: nauc_precision_at_5_diff1 value: 68.60869967062919 - type: nauc_precision_at_5_max value: 70.75998207564281 - type: nauc_precision_at_5_std value: -15.62613396998262 - type: nauc_recall_at_1000_diff1 value: 62.6646436338833 - type: nauc_recall_at_1000_max value: 86.17801636476078 - type: nauc_recall_at_1000_std value: 71.84718775540334 - type: nauc_recall_at_100_diff1 value: 61.110492191439505 - type: nauc_recall_at_100_max value: 75.45730686603042 - type: nauc_recall_at_100_std value: 16.202465011589428 - type: nauc_recall_at_10_diff1 value: 65.1522196516815 - type: nauc_recall_at_10_max value: 69.7626435962161 - type: nauc_recall_at_10_std value: -11.801178474770449 - type: nauc_recall_at_1_diff1 value: 76.8784374396948 - type: nauc_recall_at_1_max value: 64.07170606935357 - type: nauc_recall_at_1_std value: -18.464213686790654 - type: nauc_recall_at_20_diff1 value: 63.40332739504143 - type: nauc_recall_at_20_max value: 73.04113661090965 - type: nauc_recall_at_20_std value: -1.6609741140266947 - type: nauc_recall_at_3_diff1 value: 70.03728086098866 - type: nauc_recall_at_3_max value: 69.85953774320521 - type: nauc_recall_at_3_std value: -16.482993123411706 - type: nauc_recall_at_5_diff1 value: 68.77396121765933 - type: nauc_recall_at_5_max value: 70.8231205493519 - type: nauc_recall_at_5_std value: -15.668037770700863 - type: ndcg_at_1 value: 56.57 - type: ndcg_at_10 value: 67.548 - type: ndcg_at_100 value: 70.421 - type: ndcg_at_1000 value: 71.198 - type: ndcg_at_20 value: 68.829 - type: ndcg_at_3 value: 63.88700000000001 - type: ndcg_at_5 value: 65.689 - type: precision_at_1 value: 56.57 - type: precision_at_10 value: 7.922 - type: precision_at_100 value: 0.9299999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.216 - type: precision_at_3 value: 23.015 - type: precision_at_5 value: 14.691 - type: recall_at_1 value: 56.559000000000005 - type: recall_at_10 value: 79.182 - type: recall_at_100 value: 92.946 - type: recall_at_1000 value: 99.092 - type: recall_at_20 value: 84.27900000000001 - type: recall_at_3 value: 69.023 - type: recall_at_5 value: 73.432 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (deu-spa) type: facebook/mlqa config: deu-spa split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 70.645 - type: map_at_1 value: 58.423 - type: map_at_10 value: 66.613 - type: map_at_100 value: 67.14099999999999 - type: map_at_1000 value: 67.161 - type: map_at_20 value: 66.965 - type: map_at_3 value: 64.714 - type: map_at_5 value: 65.835 - type: mrr_at_1 value: 58.4225352112676 - type: mrr_at_10 value: 66.61321260898735 - type: mrr_at_100 value: 67.13991570812132 - type: mrr_at_1000 value: 67.1598532168174 - type: mrr_at_20 value: 66.96384710024888 - type: mrr_at_3 value: 64.71361502347425 - type: mrr_at_5 value: 65.83474178403769 - type: nauc_map_at_1000_diff1 value: 73.9485117118935 - type: nauc_map_at_1000_max value: 65.74479869396299 - type: nauc_map_at_1000_std value: -20.300269749495563 - type: nauc_map_at_100_diff1 value: 73.93900406302829 - type: nauc_map_at_100_max value: 65.75508449194885 - type: nauc_map_at_100_std value: 
-20.265330791570175 - type: nauc_map_at_10_diff1 value: 73.84863233472605 - type: nauc_map_at_10_max value: 65.89377317378211 - type: nauc_map_at_10_std value: -20.404123131964695 - type: nauc_map_at_1_diff1 value: 76.73627284218519 - type: nauc_map_at_1_max value: 62.94957512510876 - type: nauc_map_at_1_std value: -20.99649749330682 - type: nauc_map_at_20_diff1 value: 73.88712006109598 - type: nauc_map_at_20_max value: 65.82057018162664 - type: nauc_map_at_20_std value: -20.269476512431915 - type: nauc_map_at_3_diff1 value: 74.21419190161502 - type: nauc_map_at_3_max value: 65.64993368062119 - type: nauc_map_at_3_std value: -21.34641749007071 - type: nauc_map_at_5_diff1 value: 74.0119419385777 - type: nauc_map_at_5_max value: 65.69809416369732 - type: nauc_map_at_5_std value: -21.16901556082261 - type: nauc_mrr_at_1000_diff1 value: 73.94915184134923 - type: nauc_mrr_at_1000_max value: 65.74522469633418 - type: nauc_mrr_at_1000_std value: -20.303028367132246 - type: nauc_mrr_at_100_diff1 value: 73.93964394728808 - type: nauc_mrr_at_100_max value: 65.75550992323707 - type: nauc_mrr_at_100_std value: -20.26808820438918 - type: nauc_mrr_at_10_diff1 value: 73.84863233472605 - type: nauc_mrr_at_10_max value: 65.89377317378211 - type: nauc_mrr_at_10_std value: -20.404123131964695 - type: nauc_mrr_at_1_diff1 value: 76.73627284218519 - type: nauc_mrr_at_1_max value: 62.94957512510876 - type: nauc_mrr_at_1_std value: -20.99649749330682 - type: nauc_mrr_at_20_diff1 value: 73.88775721128745 - type: nauc_mrr_at_20_max value: 65.820991355628 - type: nauc_mrr_at_20_std value: -20.272216587019734 - type: nauc_mrr_at_3_diff1 value: 74.21419190161502 - type: nauc_mrr_at_3_max value: 65.64993368062119 - type: nauc_mrr_at_3_std value: -21.34641749007071 - type: nauc_mrr_at_5_diff1 value: 74.0119419385777 - type: nauc_mrr_at_5_max value: 65.69809416369732 - type: nauc_mrr_at_5_std value: -21.16901556082261 - type: nauc_ndcg_at_1000_diff1 value: 73.29396365944277 - type: nauc_ndcg_at_1000_max value: 66.44879592109541 - type: nauc_ndcg_at_1000_std value: -19.285991058788195 - type: nauc_ndcg_at_100_diff1 value: 73.0159172721162 - type: nauc_ndcg_at_100_max value: 66.76216389231388 - type: nauc_ndcg_at_100_std value: -18.27931368094887 - type: nauc_ndcg_at_10_diff1 value: 72.42096650774693 - type: nauc_ndcg_at_10_max value: 67.48592688463306 - type: nauc_ndcg_at_10_std value: -18.91453756077581 - type: nauc_ndcg_at_1_diff1 value: 76.73627284218519 - type: nauc_ndcg_at_1_max value: 62.94957512510876 - type: nauc_ndcg_at_1_std value: -20.99649749330682 - type: nauc_ndcg_at_20_diff1 value: 72.53699362385684 - type: nauc_ndcg_at_20_max value: 67.22763976357872 - type: nauc_ndcg_at_20_std value: -18.299910635008338 - type: nauc_ndcg_at_3_diff1 value: 73.3698453761989 - type: nauc_ndcg_at_3_max value: 66.71056987289383 - type: nauc_ndcg_at_3_std value: -21.405154376652803 - type: nauc_ndcg_at_5_diff1 value: 72.9491030712935 - type: nauc_ndcg_at_5_max value: 66.85786103137077 - type: nauc_ndcg_at_5_std value: -21.04005053344073 - type: nauc_precision_at_1000_diff1 value: 17.02462370967451 - type: nauc_precision_at_1000_max value: 48.03260752496052 - type: nauc_precision_at_1000_std value: 87.56077915079334 - type: nauc_precision_at_100_diff1 value: 58.590352501194985 - type: nauc_precision_at_100_max value: 78.2649015433222 - type: nauc_precision_at_100_std value: 28.05030453158992 - type: nauc_precision_at_10_diff1 value: 64.89497928764766 - type: nauc_precision_at_10_max value: 75.93257124951242 - type: 
nauc_precision_at_10_std value: -9.825306994117462 - type: nauc_precision_at_1_diff1 value: 76.73627284218519 - type: nauc_precision_at_1_max value: 62.94957512510876 - type: nauc_precision_at_1_std value: -20.99649749330682 - type: nauc_precision_at_20_diff1 value: 62.11366204321558 - type: nauc_precision_at_20_max value: 75.9571427846493 - type: nauc_precision_at_20_std value: -0.94585212808191 - type: nauc_precision_at_3_diff1 value: 70.52940972112398 - type: nauc_precision_at_3_max value: 70.3402053170779 - type: nauc_precision_at_3_std value: -21.579778424241304 - type: nauc_precision_at_5_diff1 value: 68.78962580223575 - type: nauc_precision_at_5_max value: 71.41410894398376 - type: nauc_precision_at_5_std value: -20.415603405161956 - type: nauc_recall_at_1000_diff1 value: 55.88625447348128 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 100.0 - type: nauc_recall_at_100_diff1 value: 61.17942268389525 - type: nauc_recall_at_100_max value: 81.12207841563487 - type: nauc_recall_at_100_std value: 27.141215257528113 - type: nauc_recall_at_10_diff1 value: 64.8949792876478 - type: nauc_recall_at_10_max value: 75.93257124951249 - type: nauc_recall_at_10_std value: -9.825306994117323 - type: nauc_recall_at_1_diff1 value: 76.73627284218519 - type: nauc_recall_at_1_max value: 62.94957512510876 - type: nauc_recall_at_1_std value: -20.99649749330682 - type: nauc_recall_at_20_diff1 value: 63.07808719241162 - type: nauc_recall_at_20_max value: 76.96808746317542 - type: nauc_recall_at_20_std value: -1.5235053258631275 - type: nauc_recall_at_3_diff1 value: 70.52940972112405 - type: nauc_recall_at_3_max value: 70.3402053170779 - type: nauc_recall_at_3_std value: -21.57977842424124 - type: nauc_recall_at_5_diff1 value: 68.78962580223575 - type: nauc_recall_at_5_max value: 71.41410894398392 - type: nauc_recall_at_5_std value: -20.415603405161793 - type: ndcg_at_1 value: 58.423 - type: ndcg_at_10 value: 70.645 - type: ndcg_at_100 value: 73.277 - type: ndcg_at_1000 value: 73.785 - type: ndcg_at_20 value: 71.918 - type: ndcg_at_3 value: 66.679 - type: ndcg_at_5 value: 68.72200000000001 - type: precision_at_1 value: 58.423 - type: precision_at_10 value: 8.338 - type: precision_at_100 value: 0.959 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.423 - type: precision_at_3 value: 24.113 - type: precision_at_5 value: 15.47 - type: recall_at_1 value: 58.423 - type: recall_at_10 value: 83.38 - type: recall_at_100 value: 95.887 - type: recall_at_1000 value: 99.831 - type: recall_at_20 value: 88.39399999999999 - type: recall_at_3 value: 72.33800000000001 - type: recall_at_5 value: 77.352 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (deu-eng) type: facebook/mlqa config: deu-eng split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 67.067 - type: map_at_1 value: 55.861000000000004 - type: map_at_10 value: 63.42100000000001 - type: map_at_100 value: 64.03 - type: map_at_1000 value: 64.05999999999999 - type: map_at_20 value: 63.819 - type: map_at_3 value: 61.773 - type: map_at_5 value: 62.736999999999995 - type: mrr_at_1 value: 55.88300465322402 - type: mrr_at_10 value: 63.43111082973707 - type: mrr_at_100 value: 64.03962373590272 - type: mrr_at_1000 value: 64.0698259866376 - type: mrr_at_20 value: 63.82871766489112 - type: mrr_at_3 value: 61.78447448112865 - type: mrr_at_5 value: 62.74835659945346 - type: nauc_map_at_1000_diff1 value: 74.58505763417352 - type: nauc_map_at_1000_max value: 66.26060764852198 
- type: nauc_map_at_1000_std value: -16.896178230873897 - type: nauc_map_at_100_diff1 value: 74.57057487892857 - type: nauc_map_at_100_max value: 66.26600433283826 - type: nauc_map_at_100_std value: -16.87596113104189 - type: nauc_map_at_10_diff1 value: 74.53453636322749 - type: nauc_map_at_10_max value: 66.27501737773804 - type: nauc_map_at_10_std value: -17.178743257781775 - type: nauc_map_at_1_diff1 value: 77.63067209375254 - type: nauc_map_at_1_max value: 64.17718675702672 - type: nauc_map_at_1_std value: -17.639521106853717 - type: nauc_map_at_20_diff1 value: 74.52007402431164 - type: nauc_map_at_20_max value: 66.28276291359268 - type: nauc_map_at_20_std value: -16.939292897754758 - type: nauc_map_at_3_diff1 value: 74.79187974631951 - type: nauc_map_at_3_max value: 66.23256568210611 - type: nauc_map_at_3_std value: -17.894889918934112 - type: nauc_map_at_5_diff1 value: 74.63011328882517 - type: nauc_map_at_5_max value: 66.35411054978499 - type: nauc_map_at_5_std value: -17.50140342194211 - type: nauc_mrr_at_1000_diff1 value: 74.57520089771667 - type: nauc_mrr_at_1000_max value: 66.27270912845914 - type: nauc_mrr_at_1000_std value: -16.84012675362397 - type: nauc_mrr_at_100_diff1 value: 74.56070964572156 - type: nauc_mrr_at_100_max value: 66.2780701126926 - type: nauc_mrr_at_100_std value: -16.820035083069865 - type: nauc_mrr_at_10_diff1 value: 74.52455978435117 - type: nauc_mrr_at_10_max value: 66.28697244023137 - type: nauc_mrr_at_10_std value: -17.122477723330523 - type: nauc_mrr_at_1_diff1 value: 77.60643512422061 - type: nauc_mrr_at_1_max value: 64.21736966061896 - type: nauc_mrr_at_1_std value: -17.56627338275146 - type: nauc_mrr_at_20_diff1 value: 74.5099814266373 - type: nauc_mrr_at_20_max value: 66.29485560556576 - type: nauc_mrr_at_20_std value: -16.882350027335306 - type: nauc_mrr_at_3_diff1 value: 74.78132817375507 - type: nauc_mrr_at_3_max value: 66.24761860047623 - type: nauc_mrr_at_3_std value: -17.833128575678998 - type: nauc_mrr_at_5_diff1 value: 74.6193031207433 - type: nauc_mrr_at_5_max value: 66.36951764432901 - type: nauc_mrr_at_5_std value: -17.438203106324227 - type: nauc_ndcg_at_1000_diff1 value: 73.79386161629151 - type: nauc_ndcg_at_1000_max value: 66.84013038018082 - type: nauc_ndcg_at_1000_std value: -15.387358822700667 - type: nauc_ndcg_at_100_diff1 value: 73.36132885277745 - type: nauc_ndcg_at_100_max value: 67.04416926901568 - type: nauc_ndcg_at_100_std value: -14.503256942521972 - type: nauc_ndcg_at_10_diff1 value: 73.11847332785027 - type: nauc_ndcg_at_10_max value: 67.02149621303091 - type: nauc_ndcg_at_10_std value: -16.142234662067782 - type: nauc_ndcg_at_1_diff1 value: 77.60643512422061 - type: nauc_ndcg_at_1_max value: 64.21736966061896 - type: nauc_ndcg_at_1_std value: -17.56627338275146 - type: nauc_ndcg_at_20_diff1 value: 72.97961452569768 - type: nauc_ndcg_at_20_max value: 67.12369127081152 - type: nauc_ndcg_at_20_std value: -15.11921773223936 - type: nauc_ndcg_at_3_diff1 value: 73.77769312598772 - type: nauc_ndcg_at_3_max value: 66.94438755852309 - type: nauc_ndcg_at_3_std value: -17.75960443830741 - type: nauc_ndcg_at_5_diff1 value: 73.43991209562891 - type: nauc_ndcg_at_5_max value: 67.21682951737418 - type: nauc_ndcg_at_5_std value: -17.013510008231805 - type: nauc_precision_at_1000_diff1 value: 51.30633281948362 - type: nauc_precision_at_1000_max value: 76.78675288883846 - type: nauc_precision_at_1000_std value: 71.70041985304397 - type: nauc_precision_at_100_diff1 value: 59.86656455853326 - type: nauc_precision_at_100_max value: 
74.41958422732161 - type: nauc_precision_at_100_std value: 22.098920296069124 - type: nauc_precision_at_10_diff1 value: 66.4696166928741 - type: nauc_precision_at_10_max value: 69.88463108697104 - type: nauc_precision_at_10_std value: -10.707950954702742 - type: nauc_precision_at_1_diff1 value: 77.60643512422061 - type: nauc_precision_at_1_max value: 64.21736966061896 - type: nauc_precision_at_1_std value: -17.56627338275146 - type: nauc_precision_at_20_diff1 value: 63.45094585276983 - type: nauc_precision_at_20_max value: 71.57741245347195 - type: nauc_precision_at_20_std value: -2.2211545419051744 - type: nauc_precision_at_3_diff1 value: 70.28060818081384 - type: nauc_precision_at_3_max value: 69.22652927816439 - type: nauc_precision_at_3_std value: -17.158576243559434 - type: nauc_precision_at_5_diff1 value: 68.90765418427162 - type: nauc_precision_at_5_max value: 70.32585273389111 - type: nauc_precision_at_5_std value: -14.950363729664524 - type: nauc_recall_at_1000_diff1 value: 65.11255117927331 - type: nauc_recall_at_1000_max value: 88.35641213283338 - type: nauc_recall_at_1000_std value: 69.89792573640547 - type: nauc_recall_at_100_diff1 value: 61.46376457272238 - type: nauc_recall_at_100_max value: 75.48265142243015 - type: nauc_recall_at_100_std value: 21.223182712042178 - type: nauc_recall_at_10_diff1 value: 66.89353375308997 - type: nauc_recall_at_10_max value: 70.06655416883785 - type: nauc_recall_at_10_std value: -11.100871879439435 - type: nauc_recall_at_1_diff1 value: 77.63067209375254 - type: nauc_recall_at_1_max value: 64.17718675702672 - type: nauc_recall_at_1_std value: -17.639521106853717 - type: nauc_recall_at_20_diff1 value: 63.98532276331878 - type: nauc_recall_at_20_max value: 71.81562599791899 - type: nauc_recall_at_20_std value: -2.696537977147695 - type: nauc_recall_at_3_diff1 value: 70.4507655865698 - type: nauc_recall_at_3_max value: 69.25705030141037 - type: nauc_recall_at_3_std value: -17.299948348202836 - type: nauc_recall_at_5_diff1 value: 69.09152857901888 - type: nauc_recall_at_5_max value: 70.35609636026405 - type: nauc_recall_at_5_std value: -15.105012139255896 - type: ndcg_at_1 value: 55.883 - type: ndcg_at_10 value: 67.067 - type: ndcg_at_100 value: 70.07 - type: ndcg_at_1000 value: 70.875 - type: ndcg_at_20 value: 68.498 - type: ndcg_at_3 value: 63.666 - type: ndcg_at_5 value: 65.40599999999999 - type: precision_at_1 value: 55.883 - type: precision_at_10 value: 7.8549999999999995 - type: precision_at_100 value: 0.928 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.2090000000000005 - type: precision_at_3 value: 23.052 - type: precision_at_5 value: 14.677999999999999 - type: recall_at_1 value: 55.861000000000004 - type: recall_at_10 value: 78.495 - type: recall_at_100 value: 92.688 - type: recall_at_1000 value: 99.02499999999999 - type: recall_at_20 value: 84.124 - type: recall_at_3 value: 69.123 - type: recall_at_5 value: 73.355 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (spa-deu) type: facebook/mlqa config: spa-deu split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 73.90299999999999 - type: map_at_1 value: 61.236000000000004 - type: map_at_10 value: 69.88799999999999 - type: map_at_100 value: 70.319 - type: map_at_1000 value: 70.341 - type: map_at_20 value: 70.16799999999999 - type: map_at_3 value: 68.104 - type: map_at_5 value: 69.164 - type: mrr_at_1 value: 61.2739571589628 - type: mrr_at_10 value: 69.92589162684993 - type: mrr_at_100 value: 70.35245455509234 - 
type: mrr_at_1000 value: 70.37438351396742 - type: mrr_at_20 value: 70.20247469915404 - type: mrr_at_3 value: 68.14167606163099 - type: mrr_at_5 value: 69.20142803457354 - type: nauc_map_at_1000_diff1 value: 74.70416754842327 - type: nauc_map_at_1000_max value: 65.86915994583384 - type: nauc_map_at_1000_std value: -19.04437483534443 - type: nauc_map_at_100_diff1 value: 74.70011798058674 - type: nauc_map_at_100_max value: 65.88507779167188 - type: nauc_map_at_100_std value: -19.018670970643786 - type: nauc_map_at_10_diff1 value: 74.6362126804427 - type: nauc_map_at_10_max value: 66.05733054427198 - type: nauc_map_at_10_std value: -19.034317737897354 - type: nauc_map_at_1_diff1 value: 77.24970536833601 - type: nauc_map_at_1_max value: 62.07820573048406 - type: nauc_map_at_1_std value: -20.917086586335078 - type: nauc_map_at_20_diff1 value: 74.64113920401083 - type: nauc_map_at_20_max value: 65.89991740166793 - type: nauc_map_at_20_std value: -19.09987515041243 - type: nauc_map_at_3_diff1 value: 74.6518162332119 - type: nauc_map_at_3_max value: 66.10312348194024 - type: nauc_map_at_3_std value: -18.95881457716116 - type: nauc_map_at_5_diff1 value: 74.55141020670321 - type: nauc_map_at_5_max value: 65.94345752979342 - type: nauc_map_at_5_std value: -19.453976877992304 - type: nauc_mrr_at_1000_diff1 value: 74.64458488344088 - type: nauc_mrr_at_1000_max value: 65.84575328456057 - type: nauc_mrr_at_1000_std value: -18.901614615119904 - type: nauc_mrr_at_100_diff1 value: 74.64058497924627 - type: nauc_mrr_at_100_max value: 65.86170461767928 - type: nauc_mrr_at_100_std value: -18.87601697091505 - type: nauc_mrr_at_10_diff1 value: 74.57266634464752 - type: nauc_mrr_at_10_max value: 66.03331587645152 - type: nauc_mrr_at_10_std value: -18.87888060105393 - type: nauc_mrr_at_1_diff1 value: 77.19578272647183 - type: nauc_mrr_at_1_max value: 62.05252035478773 - type: nauc_mrr_at_1_std value: -20.790530940625267 - type: nauc_mrr_at_20_diff1 value: 74.5808171250021 - type: nauc_mrr_at_20_max value: 65.87643606587798 - type: nauc_mrr_at_20_std value: -18.95476583474199 - type: nauc_mrr_at_3_diff1 value: 74.5917053289191 - type: nauc_mrr_at_3_max value: 66.08044079438714 - type: nauc_mrr_at_3_std value: -18.81168463163586 - type: nauc_mrr_at_5_diff1 value: 74.48934579694608 - type: nauc_mrr_at_5_max value: 65.91993162383771 - type: nauc_mrr_at_5_std value: -19.302710791338797 - type: nauc_ndcg_at_1000_diff1 value: 74.20191283992186 - type: nauc_ndcg_at_1000_max value: 66.60831175771229 - type: nauc_ndcg_at_1000_std value: -18.175208725175484 - type: nauc_ndcg_at_100_diff1 value: 74.07713451642955 - type: nauc_ndcg_at_100_max value: 67.02028626335476 - type: nauc_ndcg_at_100_std value: -17.36560972181693 - type: nauc_ndcg_at_10_diff1 value: 73.63235521598476 - type: nauc_ndcg_at_10_max value: 67.8118473312638 - type: nauc_ndcg_at_10_std value: -17.647560577355915 - type: nauc_ndcg_at_1_diff1 value: 77.19578272647183 - type: nauc_ndcg_at_1_max value: 62.05252035478773 - type: nauc_ndcg_at_1_std value: -20.790530940625267 - type: nauc_ndcg_at_20_diff1 value: 73.65300308228291 - type: nauc_ndcg_at_20_max value: 67.18353402731985 - type: nauc_ndcg_at_20_std value: -17.9240756389792 - type: nauc_ndcg_at_3_diff1 value: 73.73764900202292 - type: nauc_ndcg_at_3_max value: 67.60840957876889 - type: nauc_ndcg_at_3_std value: -17.962667543518933 - type: nauc_ndcg_at_5_diff1 value: 73.49040500302092 - type: nauc_ndcg_at_5_max value: 67.41251918514402 - type: nauc_ndcg_at_5_std value: -18.851877225955523 - type: 
nauc_precision_at_1000_diff1 value: -18.652906102973922 - type: nauc_precision_at_1000_max value: 2.1701672475574885 - type: nauc_precision_at_1000_std value: 61.713411950188835 - type: nauc_precision_at_100_diff1 value: 62.37565302288498 - type: nauc_precision_at_100_max value: 76.96921843049006 - type: nauc_precision_at_100_std value: 19.152009040219678 - type: nauc_precision_at_10_diff1 value: 68.14047344105212 - type: nauc_precision_at_10_max value: 77.7177273849099 - type: nauc_precision_at_10_std value: -9.124325941493698 - type: nauc_precision_at_1_diff1 value: 77.19578272647183 - type: nauc_precision_at_1_max value: 62.05252035478773 - type: nauc_precision_at_1_std value: -20.790530940625267 - type: nauc_precision_at_20_diff1 value: 65.38487456362745 - type: nauc_precision_at_20_max value: 74.61122933443669 - type: nauc_precision_at_20_std value: -8.129775929648341 - type: nauc_precision_at_3_diff1 value: 70.45937744142297 - type: nauc_precision_at_3_max value: 73.03004233073901 - type: nauc_precision_at_3_std value: -14.246554579025158 - type: nauc_precision_at_5_diff1 value: 69.02821772428955 - type: nauc_precision_at_5_max value: 73.52949774726446 - type: nauc_precision_at_5_std value: -16.355747231517757 - type: nauc_recall_at_1000_diff1 value: 35.804192824985755 - type: nauc_recall_at_1000_max value: 61.367785756485894 - type: nauc_recall_at_1000_std value: 54.01380822466869 - type: nauc_recall_at_100_diff1 value: 67.96210883597479 - type: nauc_recall_at_100_max value: 82.38124823732169 - type: nauc_recall_at_100_std value: 16.814922595309966 - type: nauc_recall_at_10_diff1 value: 68.21964459634341 - type: nauc_recall_at_10_max value: 77.68301934858845 - type: nauc_recall_at_10_std value: -9.430792913885066 - type: nauc_recall_at_1_diff1 value: 77.24970536833601 - type: nauc_recall_at_1_max value: 62.07820573048406 - type: nauc_recall_at_1_std value: -20.917086586335078 - type: nauc_recall_at_20_diff1 value: 66.60569906579487 - type: nauc_recall_at_20_max value: 75.66163186604354 - type: nauc_recall_at_20_std value: -9.09826205489828 - type: nauc_recall_at_3_diff1 value: 70.52323701841641 - type: nauc_recall_at_3_max value: 73.03478107411232 - type: nauc_recall_at_3_std value: -14.432325989967962 - type: nauc_recall_at_5_diff1 value: 69.08521261524373 - type: nauc_recall_at_5_max value: 73.51150270382094 - type: nauc_recall_at_5_std value: -16.569387503524368 - type: ndcg_at_1 value: 61.273999999999994 - type: ndcg_at_10 value: 73.90299999999999 - type: ndcg_at_100 value: 75.983 - type: ndcg_at_1000 value: 76.488 - type: ndcg_at_20 value: 74.921 - type: ndcg_at_3 value: 70.277 - type: ndcg_at_5 value: 72.172 - type: precision_at_1 value: 61.273999999999994 - type: precision_at_10 value: 8.641 - type: precision_at_100 value: 0.962 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.524 - type: precision_at_3 value: 25.517 - type: precision_at_5 value: 16.223000000000003 - type: recall_at_1 value: 61.236000000000004 - type: recall_at_10 value: 86.37700000000001 - type: recall_at_100 value: 96.054 - type: recall_at_1000 value: 99.887 - type: recall_at_20 value: 90.398 - type: recall_at_3 value: 76.51299999999999 - type: recall_at_5 value: 81.07900000000001 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (spa-spa) type: facebook/mlqa config: spa-spa split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 68.632 - type: map_at_1 value: 57.046 - type: map_at_10 value: 64.869 - type: map_at_100 value: 65.384 - 
type: map_at_1000 value: 65.413 - type: map_at_20 value: 65.185 - type: map_at_3 value: 63.178 - type: map_at_5 value: 64.12 - type: mrr_at_1 value: 57.05579889544848 - type: mrr_at_10 value: 64.8806425382317 - type: mrr_at_100 value: 65.39469233244084 - type: mrr_at_1000 value: 65.42342199403159 - type: mrr_at_20 value: 65.19634815919534 - type: mrr_at_3 value: 63.18796419729591 - type: mrr_at_5 value: 64.13159398209874 - type: nauc_map_at_1000_diff1 value: 73.23803038674018 - type: nauc_map_at_1000_max value: 67.44156201421714 - type: nauc_map_at_1000_std value: -8.60143026450049 - type: nauc_map_at_100_diff1 value: 73.22575613034235 - type: nauc_map_at_100_max value: 67.44735143420195 - type: nauc_map_at_100_std value: -8.576905069492895 - type: nauc_map_at_10_diff1 value: 73.11950129610865 - type: nauc_map_at_10_max value: 67.45107232305055 - type: nauc_map_at_10_std value: -8.799837857015392 - type: nauc_map_at_1_diff1 value: 76.18354072047988 - type: nauc_map_at_1_max value: 65.03342186728786 - type: nauc_map_at_1_std value: -10.867650288695796 - type: nauc_map_at_20_diff1 value: 73.21570748770948 - type: nauc_map_at_20_max value: 67.50340321088724 - type: nauc_map_at_20_std value: -8.594057184944676 - type: nauc_map_at_3_diff1 value: 73.17239276163892 - type: nauc_map_at_3_max value: 67.06319504819103 - type: nauc_map_at_3_std value: -9.883216310270528 - type: nauc_map_at_5_diff1 value: 73.11913507367727 - type: nauc_map_at_5_max value: 67.27497019567078 - type: nauc_map_at_5_std value: -9.497714822103118 - type: nauc_mrr_at_1000_diff1 value: 73.22971233311306 - type: nauc_mrr_at_1000_max value: 67.42977229057223 - type: nauc_mrr_at_1000_std value: -8.550068702273297 - type: nauc_mrr_at_100_diff1 value: 73.21744467317815 - type: nauc_mrr_at_100_max value: 67.43557491068093 - type: nauc_mrr_at_100_std value: -8.52559275190607 - type: nauc_mrr_at_10_diff1 value: 73.11075619726137 - type: nauc_mrr_at_10_max value: 67.43889760205286 - type: nauc_mrr_at_10_std value: -8.74617232559183 - type: nauc_mrr_at_1_diff1 value: 76.17529975949547 - type: nauc_mrr_at_1_max value: 65.02401127001608 - type: nauc_mrr_at_1_std value: -10.817814457633952 - type: nauc_mrr_at_20_diff1 value: 73.20689275225138 - type: nauc_mrr_at_20_max value: 67.49111752272192 - type: nauc_mrr_at_20_std value: -8.539827528410353 - type: nauc_mrr_at_3_diff1 value: 73.16291729623958 - type: nauc_mrr_at_3_max value: 67.05300993427998 - type: nauc_mrr_at_3_std value: -9.827915885680811 - type: nauc_mrr_at_5_diff1 value: 73.11055686484109 - type: nauc_mrr_at_5_max value: 67.26299851089122 - type: nauc_mrr_at_5_std value: -9.445190276650903 - type: nauc_ndcg_at_1000_diff1 value: 72.58833638407177 - type: nauc_ndcg_at_1000_max value: 68.10447506371374 - type: nauc_ndcg_at_1000_std value: -6.910306241546282 - type: nauc_ndcg_at_100_diff1 value: 72.24524849631476 - type: nauc_ndcg_at_100_max value: 68.30659210081238 - type: nauc_ndcg_at_100_std value: -6.04305364268931 - type: nauc_ndcg_at_10_diff1 value: 71.87363502582961 - type: nauc_ndcg_at_10_max value: 68.5010009653693 - type: nauc_ndcg_at_10_std value: -7.021281296450588 - type: nauc_ndcg_at_1_diff1 value: 76.17529975949547 - type: nauc_ndcg_at_1_max value: 65.02401127001608 - type: nauc_ndcg_at_1_std value: -10.817814457633952 - type: nauc_ndcg_at_20_diff1 value: 72.21241010439327 - type: nauc_ndcg_at_20_max value: 68.71743274030551 - type: nauc_ndcg_at_20_std value: -6.186629577195946 - type: nauc_ndcg_at_3_diff1 value: 72.08204674794459 - type: nauc_ndcg_at_3_max value: 
67.5958365046156 - type: nauc_ndcg_at_3_std value: -9.576418336610345 - type: nauc_ndcg_at_5_diff1 value: 71.93179095844508 - type: nauc_ndcg_at_5_max value: 68.01914639754217 - type: nauc_ndcg_at_5_std value: -8.833768332910777 - type: nauc_precision_at_1000_diff1 value: 63.0051360227489 - type: nauc_precision_at_1000_max value: 79.93532442313229 - type: nauc_precision_at_1000_std value: 52.869517607133254 - type: nauc_precision_at_100_diff1 value: 62.43301501857154 - type: nauc_precision_at_100_max value: 75.57280416668183 - type: nauc_precision_at_100_std value: 26.758300486132747 - type: nauc_precision_at_10_diff1 value: 66.29806375971134 - type: nauc_precision_at_10_max value: 73.40301413754797 - type: nauc_precision_at_10_std value: 1.9858547295235462 - type: nauc_precision_at_1_diff1 value: 76.17529975949547 - type: nauc_precision_at_1_max value: 65.02401127001608 - type: nauc_precision_at_1_std value: -10.817814457633952 - type: nauc_precision_at_20_diff1 value: 67.05111836051105 - type: nauc_precision_at_20_max value: 76.09783190824155 - type: nauc_precision_at_20_std value: 9.906010659515564 - type: nauc_precision_at_3_diff1 value: 68.44186679250453 - type: nauc_precision_at_3_max value: 69.30301351119388 - type: nauc_precision_at_3_std value: -8.566522518882348 - type: nauc_precision_at_5_diff1 value: 67.51737199297388 - type: nauc_precision_at_5_max value: 70.75887601590472 - type: nauc_precision_at_5_std value: -6.278983102710238 - type: nauc_recall_at_1000_diff1 value: 65.12360093170948 - type: nauc_recall_at_1000_max value: 82.60209843191132 - type: nauc_recall_at_1000_std value: 51.740179583368636 - type: nauc_recall_at_100_diff1 value: 62.82007697326819 - type: nauc_recall_at_100_max value: 76.04844844677562 - type: nauc_recall_at_100_std value: 26.4678415019248 - type: nauc_recall_at_10_diff1 value: 66.28557566848767 - type: nauc_recall_at_10_max value: 73.40302709828738 - type: nauc_recall_at_10_std value: 1.9224272854613582 - type: nauc_recall_at_1_diff1 value: 76.18354072047988 - type: nauc_recall_at_1_max value: 65.03342186728786 - type: nauc_recall_at_1_std value: -10.867650288695796 - type: nauc_recall_at_20_diff1 value: 67.03430451094992 - type: nauc_recall_at_20_max value: 76.09474005171319 - type: nauc_recall_at_20_std value: 9.815888637851074 - type: nauc_recall_at_3_diff1 value: 68.44411411344718 - type: nauc_recall_at_3_max value: 69.30502737137265 - type: nauc_recall_at_3_std value: -8.629526329714132 - type: nauc_recall_at_5_diff1 value: 67.51469265953514 - type: nauc_recall_at_5_max value: 70.76969893818111 - type: nauc_recall_at_5_std value: -6.325600167105444 - type: ndcg_at_1 value: 57.056 - type: ndcg_at_10 value: 68.632 - type: ndcg_at_100 value: 71.202 - type: ndcg_at_1000 value: 71.97099999999999 - type: ndcg_at_20 value: 69.785 - type: ndcg_at_3 value: 65.131 - type: ndcg_at_5 value: 66.834 - type: precision_at_1 value: 57.056 - type: precision_at_10 value: 8.044 - type: precision_at_100 value: 0.9259999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.251 - type: precision_at_3 value: 23.589 - type: precision_at_5 value: 14.984 - type: recall_at_1 value: 57.046 - type: recall_at_10 value: 80.423 - type: recall_at_100 value: 92.582 - type: recall_at_1000 value: 98.638 - type: recall_at_20 value: 84.993 - type: recall_at_3 value: 70.758 - type: recall_at_5 value: 74.9 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (spa-eng) type: facebook/mlqa config: spa-eng split: test revision: 
397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 68.765 - type: map_at_1 value: 56.538999999999994 - type: map_at_10 value: 64.816 - type: map_at_100 value: 65.325 - type: map_at_1000 value: 65.352 - type: map_at_20 value: 65.113 - type: map_at_3 value: 62.934999999999995 - type: map_at_5 value: 64.063 - type: mrr_at_1 value: 56.539120502569965 - type: mrr_at_10 value: 64.81561556661505 - type: mrr_at_100 value: 65.32464238613954 - type: mrr_at_1000 value: 65.35206516602133 - type: mrr_at_20 value: 65.11270445292227 - type: mrr_at_3 value: 62.935465448315384 - type: mrr_at_5 value: 64.06339234723022 - type: nauc_map_at_1000_diff1 value: 73.20701050428072 - type: nauc_map_at_1000_max value: 67.32797480614404 - type: nauc_map_at_1000_std value: -6.211540626528362 - type: nauc_map_at_100_diff1 value: 73.19497683923063 - type: nauc_map_at_100_max value: 67.33392646467817 - type: nauc_map_at_100_std value: -6.196671563900051 - type: nauc_map_at_10_diff1 value: 73.16010547612956 - type: nauc_map_at_10_max value: 67.37793741307372 - type: nauc_map_at_10_std value: -6.3443240322521675 - type: nauc_map_at_1_diff1 value: 76.63696578575964 - type: nauc_map_at_1_max value: 65.08189618178105 - type: nauc_map_at_1_std value: -8.594195451782733 - type: nauc_map_at_20_diff1 value: 73.15233479381568 - type: nauc_map_at_20_max value: 67.3679607256072 - type: nauc_map_at_20_std value: -6.175928265286352 - type: nauc_map_at_3_diff1 value: 73.14853380980746 - type: nauc_map_at_3_max value: 67.10354198073468 - type: nauc_map_at_3_std value: -7.409679815529866 - type: nauc_map_at_5_diff1 value: 73.13425961877715 - type: nauc_map_at_5_max value: 67.22452899371224 - type: nauc_map_at_5_std value: -6.895257774506354 - type: nauc_mrr_at_1000_diff1 value: 73.20701050428072 - type: nauc_mrr_at_1000_max value: 67.32797480614404 - type: nauc_mrr_at_1000_std value: -6.211540626528362 - type: nauc_mrr_at_100_diff1 value: 73.19497683923063 - type: nauc_mrr_at_100_max value: 67.33392646467817 - type: nauc_mrr_at_100_std value: -6.196671563900051 - type: nauc_mrr_at_10_diff1 value: 73.16010547612956 - type: nauc_mrr_at_10_max value: 67.37793741307372 - type: nauc_mrr_at_10_std value: -6.3443240322521675 - type: nauc_mrr_at_1_diff1 value: 76.63696578575964 - type: nauc_mrr_at_1_max value: 65.08189618178105 - type: nauc_mrr_at_1_std value: -8.594195451782733 - type: nauc_mrr_at_20_diff1 value: 73.15233479381568 - type: nauc_mrr_at_20_max value: 67.3679607256072 - type: nauc_mrr_at_20_std value: -6.175928265286352 - type: nauc_mrr_at_3_diff1 value: 73.14853380980746 - type: nauc_mrr_at_3_max value: 67.10354198073468 - type: nauc_mrr_at_3_std value: -7.409679815529866 - type: nauc_mrr_at_5_diff1 value: 73.13425961877715 - type: nauc_mrr_at_5_max value: 67.22452899371224 - type: nauc_mrr_at_5_std value: -6.895257774506354 - type: nauc_ndcg_at_1000_diff1 value: 72.44364625096874 - type: nauc_ndcg_at_1000_max value: 67.93635761141552 - type: nauc_ndcg_at_1000_std value: -4.616429464350954 - type: nauc_ndcg_at_100_diff1 value: 72.11352383758482 - type: nauc_ndcg_at_100_max value: 68.1627312575955 - type: nauc_ndcg_at_100_std value: -3.894213672131282 - type: nauc_ndcg_at_10_diff1 value: 71.8526850770812 - type: nauc_ndcg_at_10_max value: 68.41366561888562 - type: nauc_ndcg_at_10_std value: -4.472146861145989 - type: nauc_ndcg_at_1_diff1 value: 76.63696578575964 - type: nauc_ndcg_at_1_max value: 65.08189618178105 - type: nauc_ndcg_at_1_std value: -8.594195451782733 - type: nauc_ndcg_at_20_diff1 value: 
71.76464418138866 - type: nauc_ndcg_at_20_max value: 68.41174963313698 - type: nauc_ndcg_at_20_std value: -3.7449762037540157 - type: nauc_ndcg_at_3_diff1 value: 71.93808990683131 - type: nauc_ndcg_at_3_max value: 67.7010029507334 - type: nauc_ndcg_at_3_std value: -6.971858419379321 - type: nauc_ndcg_at_5_diff1 value: 71.8505224811326 - type: nauc_ndcg_at_5_max value: 67.97139549500251 - type: nauc_ndcg_at_5_std value: -5.958491308070017 - type: nauc_precision_at_1000_diff1 value: 62.20956180320043 - type: nauc_precision_at_1000_max value: 82.53412670611299 - type: nauc_precision_at_1000_std value: 55.57278124999575 - type: nauc_precision_at_100_diff1 value: 62.03792857023201 - type: nauc_precision_at_100_max value: 76.77130713424538 - type: nauc_precision_at_100_std value: 26.674102719959564 - type: nauc_precision_at_10_diff1 value: 65.89798055049931 - type: nauc_precision_at_10_max value: 73.41908620140674 - type: nauc_precision_at_10_std value: 5.21818573283179 - type: nauc_precision_at_1_diff1 value: 76.63696578575964 - type: nauc_precision_at_1_max value: 65.08189618178105 - type: nauc_precision_at_1_std value: -8.594195451782733 - type: nauc_precision_at_20_diff1 value: 63.734308542647355 - type: nauc_precision_at_20_max value: 74.69578825096144 - type: nauc_precision_at_20_std value: 12.627842502659162 - type: nauc_precision_at_3_diff1 value: 67.91189666671904 - type: nauc_precision_at_3_max value: 69.64986036783209 - type: nauc_precision_at_3_std value: -5.505669087429055 - type: nauc_precision_at_5_diff1 value: 67.01880006360248 - type: nauc_precision_at_5_max value: 70.78916423358686 - type: nauc_precision_at_5_std value: -2.2273742736401045 - type: nauc_recall_at_1000_diff1 value: 62.20956180319936 - type: nauc_recall_at_1000_max value: 82.53412670611287 - type: nauc_recall_at_1000_std value: 55.57278124999549 - type: nauc_recall_at_100_diff1 value: 62.03792857023208 - type: nauc_recall_at_100_max value: 76.77130713424577 - type: nauc_recall_at_100_std value: 26.67410271995973 - type: nauc_recall_at_10_diff1 value: 65.8979805504994 - type: nauc_recall_at_10_max value: 73.41908620140678 - type: nauc_recall_at_10_std value: 5.2181857328318655 - type: nauc_recall_at_1_diff1 value: 76.63696578575964 - type: nauc_recall_at_1_max value: 65.08189618178105 - type: nauc_recall_at_1_std value: -8.594195451782733 - type: nauc_recall_at_20_diff1 value: 63.734308542647334 - type: nauc_recall_at_20_max value: 74.69578825096123 - type: nauc_recall_at_20_std value: 12.627842502658982 - type: nauc_recall_at_3_diff1 value: 67.91189666671897 - type: nauc_recall_at_3_max value: 69.64986036783203 - type: nauc_recall_at_3_std value: -5.505669087428989 - type: nauc_recall_at_5_diff1 value: 67.01880006360243 - type: nauc_recall_at_5_max value: 70.78916423358686 - type: nauc_recall_at_5_std value: -2.227374273640135 - type: ndcg_at_1 value: 56.538999999999994 - type: ndcg_at_10 value: 68.765 - type: ndcg_at_100 value: 71.314 - type: ndcg_at_1000 value: 72.038 - type: ndcg_at_20 value: 69.828 - type: ndcg_at_3 value: 64.937 - type: ndcg_at_5 value: 66.956 - type: precision_at_1 value: 56.538999999999994 - type: precision_at_10 value: 8.113 - type: precision_at_100 value: 0.932 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.265 - type: precision_at_3 value: 23.567 - type: precision_at_5 value: 15.115 - type: recall_at_1 value: 56.538999999999994 - type: recall_at_10 value: 81.135 - type: recall_at_100 value: 93.223 - type: recall_at_1000 value: 98.896 - type: recall_at_20 value: 
85.304 - type: recall_at_3 value: 70.702 - type: recall_at_5 value: 75.576 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (eng-deu) type: facebook/mlqa config: eng-deu split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 69.298 - type: map_at_1 value: 58.553 - type: map_at_10 value: 65.769 - type: map_at_100 value: 66.298 - type: map_at_1000 value: 66.328 - type: map_at_20 value: 66.101 - type: map_at_3 value: 64.048 - type: map_at_5 value: 65.09 - type: mrr_at_1 value: 58.564148016840235 - type: mrr_at_10 value: 65.7685997066675 - type: mrr_at_100 value: 66.29874034432214 - type: mrr_at_1000 value: 66.32844979939088 - type: mrr_at_20 value: 66.10120513957821 - type: mrr_at_3 value: 64.04830489696437 - type: mrr_at_5 value: 65.08974074894746 - type: nauc_map_at_1000_diff1 value: 76.8409650183994 - type: nauc_map_at_1000_max value: 71.86367015521367 - type: nauc_map_at_1000_std value: -14.464881539957256 - type: nauc_map_at_100_diff1 value: 76.82536521842064 - type: nauc_map_at_100_max value: 71.86811127965429 - type: nauc_map_at_100_std value: -14.441105539722244 - type: nauc_map_at_10_diff1 value: 76.75522453447859 - type: nauc_map_at_10_max value: 71.87677500176706 - type: nauc_map_at_10_std value: -14.741331625103559 - type: nauc_map_at_1_diff1 value: 79.64060747740989 - type: nauc_map_at_1_max value: 69.84278563569617 - type: nauc_map_at_1_std value: -15.936904929655832 - type: nauc_map_at_20_diff1 value: 76.78894776059715 - type: nauc_map_at_20_max value: 71.89637938044827 - type: nauc_map_at_20_std value: -14.500564106990769 - type: nauc_map_at_3_diff1 value: 77.20562577450342 - type: nauc_map_at_3_max value: 71.80578229361525 - type: nauc_map_at_3_std value: -15.344134588512201 - type: nauc_map_at_5_diff1 value: 77.00480147367867 - type: nauc_map_at_5_max value: 71.98335924076163 - type: nauc_map_at_5_std value: -15.16537653041026 - type: nauc_mrr_at_1000_diff1 value: 76.84165367691193 - type: nauc_mrr_at_1000_max value: 71.8642679499795 - type: nauc_mrr_at_1000_std value: -14.461717954593158 - type: nauc_mrr_at_100_diff1 value: 76.8263363557998 - type: nauc_mrr_at_100_max value: 71.86874522368626 - type: nauc_mrr_at_100_std value: -14.437105168707426 - type: nauc_mrr_at_10_diff1 value: 76.75522453447859 - type: nauc_mrr_at_10_max value: 71.87677500176706 - type: nauc_mrr_at_10_std value: -14.741331625103559 - type: nauc_mrr_at_1_diff1 value: 79.65642669321981 - type: nauc_mrr_at_1_max value: 69.89135358784799 - type: nauc_mrr_at_1_std value: -15.919357002229589 - type: nauc_mrr_at_20_diff1 value: 76.78883171270601 - type: nauc_mrr_at_20_max value: 71.89806887245291 - type: nauc_mrr_at_20_std value: -14.497139746907905 - type: nauc_mrr_at_3_diff1 value: 77.20562577450342 - type: nauc_mrr_at_3_max value: 71.80578229361525 - type: nauc_mrr_at_3_std value: -15.344134588512201 - type: nauc_mrr_at_5_diff1 value: 77.00480147367867 - type: nauc_mrr_at_5_max value: 71.98335924076163 - type: nauc_mrr_at_5_std value: -15.16537653041026 - type: nauc_ndcg_at_1000_diff1 value: 76.07802417817047 - type: nauc_ndcg_at_1000_max value: 72.31792804426776 - type: nauc_ndcg_at_1000_std value: -13.049160715132244 - type: nauc_ndcg_at_100_diff1 value: 75.63343849116544 - type: nauc_ndcg_at_100_max value: 72.48362076101817 - type: nauc_ndcg_at_100_std value: -12.089600993516777 - type: nauc_ndcg_at_10_diff1 value: 75.23387929929208 - type: nauc_ndcg_at_10_max value: 72.51436288271807 - type: nauc_ndcg_at_10_std value: -13.624132103038104 - type: 
nauc_ndcg_at_1_diff1 value: 79.65642669321981 - type: nauc_ndcg_at_1_max value: 69.89135358784799 - type: nauc_ndcg_at_1_std value: -15.919357002229589 - type: nauc_ndcg_at_20_diff1 value: 75.32926047656296 - type: nauc_ndcg_at_20_max value: 72.61254165918145 - type: nauc_ndcg_at_20_std value: -12.683157599238701 - type: nauc_ndcg_at_3_diff1 value: 76.3089337665469 - type: nauc_ndcg_at_3_max value: 72.40014674426054 - type: nauc_ndcg_at_3_std value: -15.08624226353458 - type: nauc_ndcg_at_5_diff1 value: 75.88857331641834 - type: nauc_ndcg_at_5_max value: 72.7719386827224 - type: nauc_ndcg_at_5_std value: -14.70546521089236 - type: nauc_precision_at_1000_diff1 value: 59.66563879069911 - type: nauc_precision_at_1000_max value: 74.57123562956772 - type: nauc_precision_at_1000_std value: 58.61396866718965 - type: nauc_precision_at_100_diff1 value: 62.8695896550042 - type: nauc_precision_at_100_max value: 77.81408796785 - type: nauc_precision_at_100_std value: 23.819735672317826 - type: nauc_precision_at_10_diff1 value: 68.08051625224569 - type: nauc_precision_at_10_max value: 75.14432336036869 - type: nauc_precision_at_10_std value: -7.97602345252735 - type: nauc_precision_at_1_diff1 value: 79.65642669321981 - type: nauc_precision_at_1_max value: 69.89135358784799 - type: nauc_precision_at_1_std value: -15.919357002229589 - type: nauc_precision_at_20_diff1 value: 66.7168005185165 - type: nauc_precision_at_20_max value: 76.58522761697147 - type: nauc_precision_at_20_std value: -0.17923428317323292 - type: nauc_precision_at_3_diff1 value: 73.23394851561207 - type: nauc_precision_at_3_max value: 74.32517846819215 - type: nauc_precision_at_3_std value: -14.142301336188348 - type: nauc_precision_at_5_diff1 value: 71.5666882547012 - type: nauc_precision_at_5_max value: 75.71098205440033 - type: nauc_precision_at_5_std value: -12.808362513638052 - type: nauc_recall_at_1000_diff1 value: 71.73736112325805 - type: nauc_recall_at_1000_max value: 86.70743436225898 - type: nauc_recall_at_1000_std value: 54.45802578371167 - type: nauc_recall_at_100_diff1 value: 64.07053861428128 - type: nauc_recall_at_100_max value: 78.8348308099261 - type: nauc_recall_at_100_std value: 22.72263677785103 - type: nauc_recall_at_10_diff1 value: 68.20272901407903 - type: nauc_recall_at_10_max value: 75.16315335381938 - type: nauc_recall_at_10_std value: -8.060716748913386 - type: nauc_recall_at_1_diff1 value: 79.64060747740989 - type: nauc_recall_at_1_max value: 69.84278563569617 - type: nauc_recall_at_1_std value: -15.936904929655832 - type: nauc_recall_at_20_diff1 value: 66.88206981973654 - type: nauc_recall_at_20_max value: 76.54824917595687 - type: nauc_recall_at_20_std value: -0.40294589316962287 - type: nauc_recall_at_3_diff1 value: 73.33076087258938 - type: nauc_recall_at_3_max value: 74.33763112508771 - type: nauc_recall_at_3_std value: -14.213355414905399 - type: nauc_recall_at_5_diff1 value: 71.67487623469464 - type: nauc_recall_at_5_max value: 75.72770292516316 - type: nauc_recall_at_5_std value: -12.887572274644818 - type: ndcg_at_1 value: 58.56400000000001 - type: ndcg_at_10 value: 69.298 - type: ndcg_at_100 value: 71.95899999999999 - type: ndcg_at_1000 value: 72.735 - type: ndcg_at_20 value: 70.50699999999999 - type: ndcg_at_3 value: 65.81700000000001 - type: ndcg_at_5 value: 67.681 - type: precision_at_1 value: 58.56400000000001 - type: precision_at_10 value: 8.039 - type: precision_at_100 value: 0.931 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.259 - type: precision_at_3 value: 23.65 
- type: precision_at_5 value: 15.09 - type: recall_at_1 value: 58.553 - type: recall_at_10 value: 80.368 - type: recall_at_100 value: 93.013 - type: recall_at_1000 value: 99.092 - type: recall_at_20 value: 85.143 - type: recall_at_3 value: 70.928 - type: recall_at_5 value: 75.42699999999999 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (eng-spa) type: facebook/mlqa config: eng-spa split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 66.374 - type: map_at_1 value: 55.494 - type: map_at_10 value: 62.763999999999996 - type: map_at_100 value: 63.33 - type: map_at_1000 value: 63.36000000000001 - type: map_at_20 value: 63.104000000000006 - type: map_at_3 value: 61.065000000000005 - type: map_at_5 value: 62.053000000000004 - type: mrr_at_1 value: 55.49419158255571 - type: mrr_at_10 value: 62.765195140457095 - type: mrr_at_100 value: 63.33083349354529 - type: mrr_at_1000 value: 63.3611897014839 - type: mrr_at_20 value: 63.10543590095977 - type: mrr_at_3 value: 61.06455913159412 - type: mrr_at_5 value: 62.052942296705474 - type: nauc_map_at_1000_diff1 value: 75.04200018088618 - type: nauc_map_at_1000_max value: 70.49937782771909 - type: nauc_map_at_1000_std value: -5.257206317083184 - type: nauc_map_at_100_diff1 value: 75.02786834256312 - type: nauc_map_at_100_max value: 70.5016476500189 - type: nauc_map_at_100_std value: -5.228770832077681 - type: nauc_map_at_10_diff1 value: 74.9626552701647 - type: nauc_map_at_10_max value: 70.56253732243214 - type: nauc_map_at_10_std value: -5.359037281768563 - type: nauc_map_at_1_diff1 value: 78.46858307815857 - type: nauc_map_at_1_max value: 69.03908373759435 - type: nauc_map_at_1_std value: -7.479412070736642 - type: nauc_map_at_20_diff1 value: 74.98121458084796 - type: nauc_map_at_20_max value: 70.51885366822565 - type: nauc_map_at_20_std value: -5.286051287133815 - type: nauc_map_at_3_diff1 value: 75.36078454383373 - type: nauc_map_at_3_max value: 70.34997144546014 - type: nauc_map_at_3_std value: -6.663517224039184 - type: nauc_map_at_5_diff1 value: 75.0274512828238 - type: nauc_map_at_5_max value: 70.45292551591874 - type: nauc_map_at_5_std value: -6.029224488640147 - type: nauc_mrr_at_1000_diff1 value: 75.04018768469983 - type: nauc_mrr_at_1000_max value: 70.49855509132635 - type: nauc_mrr_at_1000_std value: -5.258929961409948 - type: nauc_mrr_at_100_diff1 value: 75.02605732810112 - type: nauc_mrr_at_100_max value: 70.50082584929103 - type: nauc_mrr_at_100_std value: -5.2304917988542154 - type: nauc_mrr_at_10_diff1 value: 74.96079080525713 - type: nauc_mrr_at_10_max value: 70.56167294920391 - type: nauc_mrr_at_10_std value: -5.360650630655072 - type: nauc_mrr_at_1_diff1 value: 78.46858307815857 - type: nauc_mrr_at_1_max value: 69.03908373759435 - type: nauc_mrr_at_1_std value: -7.479412070736642 - type: nauc_mrr_at_20_diff1 value: 74.97939804960517 - type: nauc_mrr_at_20_max value: 70.51804078965411 - type: nauc_mrr_at_20_std value: -5.287681954889177 - type: nauc_mrr_at_3_diff1 value: 75.36078454383373 - type: nauc_mrr_at_3_max value: 70.34997144546014 - type: nauc_mrr_at_3_std value: -6.663517224039184 - type: nauc_mrr_at_5_diff1 value: 75.0274512828238 - type: nauc_mrr_at_5_max value: 70.45292551591874 - type: nauc_mrr_at_5_std value: -6.029224488640147 - type: nauc_ndcg_at_1000_diff1 value: 74.22106834748942 - type: nauc_ndcg_at_1000_max value: 70.93625922934912 - type: nauc_ndcg_at_1000_std value: -3.4878399005946017 - type: nauc_ndcg_at_100_diff1 value: 73.74068883646733 - type: 
nauc_ndcg_at_100_max value: 71.02357018347472 - type: nauc_ndcg_at_100_std value: -2.462293184201324 - type: nauc_ndcg_at_10_diff1 value: 73.40967965536565 - type: nauc_ndcg_at_10_max value: 71.29379828672067 - type: nauc_ndcg_at_10_std value: -3.295547756383108 - type: nauc_ndcg_at_1_diff1 value: 78.46858307815857 - type: nauc_ndcg_at_1_max value: 69.03908373759435 - type: nauc_ndcg_at_1_std value: -7.479412070736642 - type: nauc_ndcg_at_20_diff1 value: 73.45790057693699 - type: nauc_ndcg_at_20_max value: 71.16598432419126 - type: nauc_ndcg_at_20_std value: -2.962877157646097 - type: nauc_ndcg_at_3_diff1 value: 74.30696173964847 - type: nauc_ndcg_at_3_max value: 70.79878978459556 - type: nauc_ndcg_at_3_std value: -6.297286578628299 - type: nauc_ndcg_at_5_diff1 value: 73.65858211199816 - type: nauc_ndcg_at_5_max value: 71.01122417463776 - type: nauc_ndcg_at_5_std value: -5.075990882646765 - type: nauc_precision_at_1000_diff1 value: 68.71065091972568 - type: nauc_precision_at_1000_max value: 81.38173585624777 - type: nauc_precision_at_1000_std value: 58.035497889797895 - type: nauc_precision_at_100_diff1 value: 61.93634256957017 - type: nauc_precision_at_100_max value: 74.84191770203093 - type: nauc_precision_at_100_std value: 31.3325983123831 - type: nauc_precision_at_10_diff1 value: 66.68247010944937 - type: nauc_precision_at_10_max value: 74.48773524654571 - type: nauc_precision_at_10_std value: 6.560421880785153 - type: nauc_precision_at_1_diff1 value: 78.46858307815857 - type: nauc_precision_at_1_max value: 69.03908373759435 - type: nauc_precision_at_1_std value: -7.479412070736642 - type: nauc_precision_at_20_diff1 value: 65.51592872758067 - type: nauc_precision_at_20_max value: 74.50684066823096 - type: nauc_precision_at_20_std value: 10.830479877698208 - type: nauc_precision_at_3_diff1 value: 70.89587884861588 - type: nauc_precision_at_3_max value: 72.25310558370424 - type: nauc_precision_at_3_std value: -5.0796100900749765 - type: nauc_precision_at_5_diff1 value: 68.71885719845497 - type: nauc_precision_at_5_max value: 73.02601751485672 - type: nauc_precision_at_5_std value: -1.4382681421626857 - type: nauc_recall_at_1000_diff1 value: 71.95510299834734 - type: nauc_recall_at_1000_max value: 84.03647166092985 - type: nauc_recall_at_1000_std value: 56.87490604776847 - type: nauc_recall_at_100_diff1 value: 62.446624924715955 - type: nauc_recall_at_100_max value: 75.25666892464507 - type: nauc_recall_at_100_std value: 31.068789794554686 - type: nauc_recall_at_10_diff1 value: 66.70676336328988 - type: nauc_recall_at_10_max value: 74.4963699656397 - type: nauc_recall_at_10_std value: 6.57498399706916 - type: nauc_recall_at_1_diff1 value: 78.46858307815857 - type: nauc_recall_at_1_max value: 69.03908373759435 - type: nauc_recall_at_1_std value: -7.479412070736642 - type: nauc_recall_at_20_diff1 value: 65.54082767974772 - type: nauc_recall_at_20_max value: 74.5111529838772 - type: nauc_recall_at_20_std value: 10.84574829707354 - type: nauc_recall_at_3_diff1 value: 70.89587884861584 - type: nauc_recall_at_3_max value: 72.25310558370421 - type: nauc_recall_at_3_std value: -5.07961009007491 - type: nauc_recall_at_5_diff1 value: 68.71885719845501 - type: nauc_recall_at_5_max value: 73.02601751485666 - type: nauc_recall_at_5_std value: -1.4382681421626995 - type: ndcg_at_1 value: 55.494 - type: ndcg_at_10 value: 66.374 - type: ndcg_at_100 value: 69.254 - type: ndcg_at_1000 value: 70.136 - type: ndcg_at_20 value: 67.599 - type: ndcg_at_3 value: 62.863 - type: ndcg_at_5 value: 64.644 - type: 
precision_at_1 value: 55.494 - type: precision_at_10 value: 7.776 - type: precision_at_100 value: 0.9159999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_20 value: 4.1290000000000004 - type: precision_at_3 value: 22.688 - type: precision_at_5 value: 14.477 - type: recall_at_1 value: 55.494 - type: recall_at_10 value: 77.747 - type: recall_at_100 value: 91.535 - type: recall_at_1000 value: 98.619 - type: recall_at_20 value: 82.565 - type: recall_at_3 value: 68.063 - type: recall_at_5 value: 72.386 - task: type: Retrieval dataset: name: MTEB MLQARetrieval (eng-eng) type: facebook/mlqa config: eng-eng split: test revision: 397ed406c1a7902140303e7faf60fff35b58d285 metrics: - type: main_score value: 64.723 - type: map_at_1 value: 54.308 - type: map_at_10 value: 61.26200000000001 - type: map_at_100 value: 61.82299999999999 - type: map_at_1000 value: 61.856 - type: map_at_20 value: 61.575 - type: map_at_3 value: 59.565 - type: map_at_5 value: 60.561 - type: mrr_at_1 value: 54.31704368848212 - type: mrr_at_10 value: 61.26520216098834 - type: mrr_at_100 value: 61.82588321127103 - type: mrr_at_1000 value: 61.859333030574334 - type: mrr_at_20 value: 61.57780339921337 - type: mrr_at_3 value: 59.569446842801646 - type: mrr_at_5 value: 60.56323029989004 - type: nauc_map_at_1000_diff1 value: 74.21413722468635 - type: nauc_map_at_1000_max value: 70.41741227882316 - type: nauc_map_at_1000_std value: -2.5438707209848506 - type: nauc_map_at_100_diff1 value: 74.19812315947975 - type: nauc_map_at_100_max value: 70.41589146728445 - type: nauc_map_at_100_std value: -2.5336117059429553 - type: nauc_map_at_10_diff1 value: 74.21810561152937 - type: nauc_map_at_10_max value: 70.48816115200171 - type: nauc_map_at_10_std value: -2.7443834681406734 - type: nauc_map_at_1_diff1 value: 77.69378738778958 - type: nauc_map_at_1_max value: 68.64652310701173 - type: nauc_map_at_1_std value: -4.667071946448379 - type: nauc_map_at_20_diff1 value: 74.16105697562438 - type: nauc_map_at_20_max value: 70.42491994631179 - type: nauc_map_at_20_std value: -2.6070416022440472 - type: nauc_map_at_3_diff1 value: 74.60449392878863 - type: nauc_map_at_3_max value: 70.39888609914269 - type: nauc_map_at_3_std value: -3.5401151125723986 - type: nauc_map_at_5_diff1 value: 74.2423420992663 - type: nauc_map_at_5_max value: 70.36574501826757 - type: nauc_map_at_5_std value: -3.2707393116898964 - type: nauc_mrr_at_1000_diff1 value: 74.21029843731323 - type: nauc_mrr_at_1000_max value: 70.43020492688913 - type: nauc_mrr_at_1000_std value: -2.526895582202081 - type: nauc_mrr_at_100_diff1 value: 74.19440960479243 - type: nauc_mrr_at_100_max value: 70.4288998824232 - type: nauc_mrr_at_100_std value: -2.5160929945118107 - type: nauc_mrr_at_10_diff1 value: 74.2141357266166 - type: nauc_mrr_at_10_max value: 70.5005683347807 - type: nauc_mrr_at_10_std value: -2.727154557882168 - type: nauc_mrr_at_1_diff1 value: 77.69891248239793 - type: nauc_mrr_at_1_max value: 68.68255231164922 - type: nauc_mrr_at_1_std value: -4.630226727154317 - type: nauc_mrr_at_20_diff1 value: 74.15705434409723 - type: nauc_mrr_at_20_max value: 70.43741835972747 - type: nauc_mrr_at_20_std value: -2.5896756472464495 - type: nauc_mrr_at_3_diff1 value: 74.5981844349412 - type: nauc_mrr_at_3_max value: 70.41834937080564 - type: nauc_mrr_at_3_std value: -3.5161656408031163 - type: nauc_mrr_at_5_diff1 value: 74.23847535424844 - type: nauc_mrr_at_5_max value: 70.37763810013656 - type: nauc_mrr_at_5_std value: -3.2560955164581733 - type: nauc_ndcg_at_1000_diff1 value: 
73.20994496725493 - type: nauc_ndcg_at_1000_max value: 70.8903016277125 - type: nauc_ndcg_at_1000_std value: -0.625772298462309 - type: nauc_ndcg_at_100_diff1 value: 72.6847141682645 - type: nauc_ndcg_at_100_max value: 70.86564422034162 - type: nauc_ndcg_at_100_std value: -0.07195786766326141 - type: nauc_ndcg_at_10_diff1 value: 72.78806493754281 - type: nauc_ndcg_at_10_max value: 71.21957067926769 - type: nauc_ndcg_at_10_std value: -1.2760418313382227 - type: nauc_ndcg_at_1_diff1 value: 77.69891248239793 - type: nauc_ndcg_at_1_max value: 68.68255231164922 - type: nauc_ndcg_at_1_std value: -4.630226727154317 - type: nauc_ndcg_at_20_diff1 value: 72.52082440882546 - type: nauc_ndcg_at_20_max value: 70.98185004796734 - type: nauc_ndcg_at_20_std value: -0.6908280874815464 - type: nauc_ndcg_at_3_diff1 value: 73.59870660843939 - type: nauc_ndcg_at_3_max value: 70.94391957288654 - type: nauc_ndcg_at_3_std value: -3.147723179140428 - type: nauc_ndcg_at_5_diff1 value: 72.90122868193457 - type: nauc_ndcg_at_5_max value: 70.89376368965165 - type: nauc_ndcg_at_5_std value: -2.6451807385626744 - type: nauc_precision_at_1000_diff1 value: 58.14737201864067 - type: nauc_precision_at_1000_max value: 78.79011251144826 - type: nauc_precision_at_1000_std value: 59.98985420476577 - type: nauc_precision_at_100_diff1 value: 59.21069121644552 - type: nauc_precision_at_100_max value: 73.00557835912306 - type: nauc_precision_at_100_std value: 26.85027406282173 - type: nauc_precision_at_10_diff1 value: 66.8760831023675 - type: nauc_precision_at_10_max value: 74.21167950452596 - type: nauc_precision_at_10_std value: 5.453652499335947 - type: nauc_precision_at_1_diff1 value: 77.69891248239793 - type: nauc_precision_at_1_max value: 68.68255231164922 - type: nauc_precision_at_1_std value: -4.630226727154317 - type: nauc_precision_at_20_diff1 value: 64.3118559132602 - type: nauc_precision_at_20_max value: 73.33078184673825 - type: nauc_precision_at_20_std value: 9.993299523049402 - type: nauc_precision_at_3_diff1 value: 70.38667185155593 - type: nauc_precision_at_3_max value: 72.66495006030951 - type: nauc_precision_at_3_std value: -1.8532839591326276 - type: nauc_precision_at_5_diff1 value: 68.12161337583686 - type: nauc_precision_at_5_max value: 72.65644960375046 - type: nauc_precision_at_5_std value: -0.33317164167012875 - type: nauc_recall_at_1000_diff1 value: 61.63204394739985 - type: nauc_recall_at_1000_max value: 81.77241537319897 - type: nauc_recall_at_1000_std value: 58.44841544062308 - type: nauc_recall_at_100_diff1 value: 59.72072697224705 - type: nauc_recall_at_100_max value: 73.28519507061553 - type: nauc_recall_at_100_std value: 26.27318390763456 - type: nauc_recall_at_10_diff1 value: 66.9757135465418 - type: nauc_recall_at_10_max value: 74.21919493374149 - type: nauc_recall_at_10_std value: 5.323369605377166 - type: nauc_recall_at_1_diff1 value: 77.69378738778958 - type: nauc_recall_at_1_max value: 68.64652310701173 - type: nauc_recall_at_1_std value: -4.667071946448379 - type: nauc_recall_at_20_diff1 value: 64.42290081731899 - type: nauc_recall_at_20_max value: 73.3358289439033 - type: nauc_recall_at_20_std value: 9.846598361586073 - type: nauc_recall_at_3_diff1 value: 70.41211290964785 - type: nauc_recall_at_3_max value: 72.64451776775402 - type: nauc_recall_at_3_std value: -1.916280959835826 - type: nauc_recall_at_5_diff1 value: 68.20695272727916 - type: nauc_recall_at_5_max value: 72.66404224006101 - type: nauc_recall_at_5_std value: -0.431125323007886 - type: ndcg_at_1 value: 54.31700000000001 - type: 
ndcg_at_10 value: 64.723 - type: ndcg_at_100 value: 67.648 - type: ndcg_at_1000 value: 68.619 - type: ndcg_at_20 value: 65.85499999999999 - type: ndcg_at_3 value: 61.244 - type: ndcg_at_5 value: 63.038000000000004 - type: precision_at_1 value: 54.31700000000001 - type: precision_at_10 value: 7.564 - type: precision_at_100 value: 0.898 - type: precision_at_1000 value: 0.098 - type: precision_at_20 value: 4.005 - type: precision_at_3 value: 22.034000000000002 - type: precision_at_5 value: 14.093 - type: recall_at_1 value: 54.308 - type: recall_at_10 value: 75.622 - type: recall_at_100 value: 89.744 - type: recall_at_1000 value: 97.539 - type: recall_at_20 value: 80.085 - type: recall_at_3 value: 66.09 - type: recall_at_5 value: 70.446 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (de) type: reciTAL/mlsum config: de split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 41.267647761702854 - type: v_measure value: 41.267647761702854 - type: v_measure_std value: 10.93390895077248 - type: main_score value: 40.07927325071353 - type: v_measure value: 40.07927325071353 - type: v_measure_std value: 9.296680835266145 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (fr) type: reciTAL/mlsum config: fr split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 44.68714862333979 - type: v_measure value: 44.68714862333979 - type: v_measure_std value: 1.811036989797814 - type: main_score value: 44.88484854069901 - type: v_measure value: 44.88484854069901 - type: v_measure_std value: 2.3704247819781843 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (ru) type: reciTAL/mlsum config: ru split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 41.92518785753813 - type: v_measure value: 41.92518785753813 - type: v_measure_std value: 5.9356661900220775 - type: main_score value: 43.97657450929179 - type: v_measure value: 43.97657450929179 - type: v_measure_std value: 6.087547931333613 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P (es) type: reciTAL/mlsum config: es split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: main_score value: 48.69875719812033 - type: v_measure value: 48.69875719812033 - type: v_measure_std value: 1.204253881950113 - type: main_score value: 48.41108671948728 - type: v_measure value: 48.41108671948728 - type: v_measure_std value: 1.3848320630151243 - task: type: Reranking dataset: name: MTEB MMarcoReranking (default) type: C-MTEB/Mmarco-reranking config: default split: dev revision: 8e0c766dbe9e16e1d221116a3f36795fbade07f6 metrics: - type: map value: 21.050447576170395 - type: mrr value: 20.201984126984126 - type: main_score value: 21.050447576170395 - task: type: Retrieval dataset: name: MTEB MMarcoRetrieval (default) type: C-MTEB/MMarcoRetrieval config: default split: dev revision: 539bbde593d947e2a124ba72651aafc09eb33fc2 metrics: - type: main_score value: 79.687 - type: map_at_1 value: 66.872 - type: map_at_10 value: 75.949 - type: map_at_100 value: 76.25 - type: map_at_1000 value: 76.259 - type: map_at_20 value: 76.145 - type: map_at_3 value: 74.01299999999999 - type: map_at_5 value: 75.232 - type: mrr_at_1 value: 69.18338108882521 - type: mrr_at_10 value: 76.5424227952881 - type: mrr_at_100 value: 76.8019342792628 - type: mrr_at_1000 value: 76.81002278342808 - type: mrr_at_20 value: 76.7115234815896 - type: mrr_at_3 value: 74.83046800382044 - type: mrr_at_5 value: 
75.88490926456515 - type: nauc_map_at_1000_diff1 value: 78.06933310424179 - type: nauc_map_at_1000_max value: 49.392948209665896 - type: nauc_map_at_1000_std value: -15.126109322591166 - type: nauc_map_at_100_diff1 value: 78.06612779298378 - type: nauc_map_at_100_max value: 49.40761618630397 - type: nauc_map_at_100_std value: -15.099282408159349 - type: nauc_map_at_10_diff1 value: 77.94565685470538 - type: nauc_map_at_10_max value: 49.50559610363201 - type: nauc_map_at_10_std value: -15.182130695916355 - type: nauc_map_at_1_diff1 value: 79.84814509858211 - type: nauc_map_at_1_max value: 40.78978466656547 - type: nauc_map_at_1_std value: -19.96189264026715 - type: nauc_map_at_20_diff1 value: 78.03597839981245 - type: nauc_map_at_20_max value: 49.49477427223376 - type: nauc_map_at_20_std value: -15.084990000838378 - type: nauc_map_at_3_diff1 value: 78.0637014655507 - type: nauc_map_at_3_max value: 48.63214001973341 - type: nauc_map_at_3_std value: -17.093950563306596 - type: nauc_map_at_5_diff1 value: 77.94068229240348 - type: nauc_map_at_5_max value: 49.38930719689204 - type: nauc_map_at_5_std value: -15.9919454201954 - type: nauc_mrr_at_1000_diff1 value: 78.34582398092816 - type: nauc_mrr_at_1000_max value: 49.623566992784156 - type: nauc_mrr_at_1000_std value: -14.381347765493265 - type: nauc_mrr_at_100_diff1 value: 78.3429966714221 - type: nauc_mrr_at_100_max value: 49.63684922240546 - type: nauc_mrr_at_100_std value: -14.354914066301236 - type: nauc_mrr_at_10_diff1 value: 78.2208070219624 - type: nauc_mrr_at_10_max value: 49.77720536573364 - type: nauc_mrr_at_10_std value: -14.316233764741812 - type: nauc_mrr_at_1_diff1 value: 80.22305496572142 - type: nauc_mrr_at_1_max value: 44.30231210192536 - type: nauc_mrr_at_1_std value: -18.942549914934492 - type: nauc_mrr_at_20_diff1 value: 78.31006724240147 - type: nauc_mrr_at_20_max value: 49.72338465276142 - type: nauc_mrr_at_20_std value: -14.30722621948953 - type: nauc_mrr_at_3_diff1 value: 78.39832634634523 - type: nauc_mrr_at_3_max value: 49.24985961036677 - type: nauc_mrr_at_3_std value: -15.966286866763191 - type: nauc_mrr_at_5_diff1 value: 78.2406507247798 - type: nauc_mrr_at_5_max value: 49.71276359754787 - type: nauc_mrr_at_5_std value: -14.979526226149698 - type: nauc_ndcg_at_1000_diff1 value: 77.74892471071016 - type: nauc_ndcg_at_1000_max value: 51.11543344053061 - type: nauc_ndcg_at_1000_std value: -12.208878737005096 - type: nauc_ndcg_at_100_diff1 value: 77.67462502211228 - type: nauc_ndcg_at_100_max value: 51.593977338939034 - type: nauc_ndcg_at_100_std value: -11.312126179513802 - type: nauc_ndcg_at_10_diff1 value: 77.0571291760012 - type: nauc_ndcg_at_10_max value: 52.35435572808972 - type: nauc_ndcg_at_10_std value: -11.33242546164059 - type: nauc_ndcg_at_1_diff1 value: 80.22305496572142 - type: nauc_ndcg_at_1_max value: 44.30231210192536 - type: nauc_ndcg_at_1_std value: -18.942549914934492 - type: nauc_ndcg_at_20_diff1 value: 77.4141216117471 - type: nauc_ndcg_at_20_max value: 52.340600871365375 - type: nauc_ndcg_at_20_std value: -10.989010161550912 - type: nauc_ndcg_at_3_diff1 value: 77.43971989259062 - type: nauc_ndcg_at_3_max value: 50.59251358320663 - type: nauc_ndcg_at_3_std value: -15.59337960636058 - type: nauc_ndcg_at_5_diff1 value: 77.12174287031847 - type: nauc_ndcg_at_5_max value: 51.97108510288907 - type: nauc_ndcg_at_5_std value: -13.474902612427167 - type: nauc_precision_at_1000_diff1 value: -19.36793534929367 - type: nauc_precision_at_1000_max value: 11.803383262344036 - type: nauc_precision_at_1000_std 
value: 24.304436015177046 - type: nauc_precision_at_100_diff1 value: -6.273790806909921 - type: nauc_precision_at_100_max value: 23.372606271300747 - type: nauc_precision_at_100_std value: 29.085768971612342 - type: nauc_precision_at_10_diff1 value: 21.67045907336595 - type: nauc_precision_at_10_max value: 41.68948432407223 - type: nauc_precision_at_10_std value: 17.837055074458092 - type: nauc_precision_at_1_diff1 value: 80.22305496572142 - type: nauc_precision_at_1_max value: 44.30231210192536 - type: nauc_precision_at_1_std value: -18.942549914934492 - type: nauc_precision_at_20_diff1 value: 12.577671896684803 - type: nauc_precision_at_20_max value: 37.44944702246691 - type: nauc_precision_at_20_std value: 23.635897665206087 - type: nauc_precision_at_3_diff1 value: 47.165335112814056 - type: nauc_precision_at_3_max value: 47.0458691263379 - type: nauc_precision_at_3_std value: -3.3181861146890217 - type: nauc_precision_at_5_diff1 value: 35.406205343514806 - type: nauc_precision_at_5_max value: 45.56549449285401 - type: nauc_precision_at_5_std value: 5.612378074562386 - type: nauc_recall_at_1000_diff1 value: 72.32762520815842 - type: nauc_recall_at_1000_max value: 85.64979256307343 - type: nauc_recall_at_1000_std value: 73.61925297037476 - type: nauc_recall_at_100_diff1 value: 72.31946328709962 - type: nauc_recall_at_100_max value: 83.76576070068353 - type: nauc_recall_at_100_std value: 57.39376538662535 - type: nauc_recall_at_10_diff1 value: 69.51307788072499 - type: nauc_recall_at_10_max value: 69.60124733654142 - type: nauc_recall_at_10_std value: 13.483540424716892 - type: nauc_recall_at_1_diff1 value: 79.84814509858211 - type: nauc_recall_at_1_max value: 40.78978466656547 - type: nauc_recall_at_1_std value: -19.96189264026715 - type: nauc_recall_at_20_diff1 value: 70.92168324710599 - type: nauc_recall_at_20_max value: 76.09106252420084 - type: nauc_recall_at_20_std value: 25.406842300761447 - type: nauc_recall_at_3_diff1 value: 74.1212680517145 - type: nauc_recall_at_3_max value: 56.24921832879403 - type: nauc_recall_at_3_std value: -11.55542913578436 - type: nauc_recall_at_5_diff1 value: 72.31262959872993 - type: nauc_recall_at_5_max value: 62.761214896697915 - type: nauc_recall_at_5_std value: -3.280167584070396 - type: ndcg_at_1 value: 69.18299999999999 - type: ndcg_at_10 value: 79.687 - type: ndcg_at_100 value: 81.062 - type: ndcg_at_1000 value: 81.312 - type: ndcg_at_20 value: 80.34599999999999 - type: ndcg_at_3 value: 75.98700000000001 - type: ndcg_at_5 value: 78.039 - type: precision_at_1 value: 69.18299999999999 - type: precision_at_10 value: 9.636 - type: precision_at_100 value: 1.0330000000000001 - type: precision_at_1000 value: 0.105 - type: precision_at_20 value: 4.958 - type: precision_at_3 value: 28.515 - type: precision_at_5 value: 18.201 - type: recall_at_1 value: 66.872 - type: recall_at_10 value: 90.688 - type: recall_at_100 value: 96.99 - type: recall_at_1000 value: 98.958 - type: recall_at_20 value: 93.21199999999999 - type: recall_at_3 value: 80.84599999999999 - type: recall_at_5 value: 85.732 - task: type: Retrieval dataset: name: MTEB MSMARCO (default) type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 21.861 - type: map_at_10 value: 34.008 - type: map_at_100 value: 35.174 - type: map_at_1000 value: 35.224 - type: map_at_20 value: 34.705999999999996 - type: map_at_3 value: 30.209000000000003 - type: map_at_5 value: 32.351 - type: mrr_at_1 value: 22.493 - type: mrr_at_10 value: 
34.583999999999996 - type: mrr_at_100 value: 35.691 - type: mrr_at_1000 value: 35.736000000000004 - type: mrr_at_20 value: 35.257 - type: mrr_at_3 value: 30.85 - type: mrr_at_5 value: 32.962 - type: ndcg_at_1 value: 22.493 - type: ndcg_at_10 value: 40.815 - type: ndcg_at_100 value: 46.483999999999995 - type: ndcg_at_1000 value: 47.73 - type: ndcg_at_20 value: 43.302 - type: ndcg_at_3 value: 33.056000000000004 - type: ndcg_at_5 value: 36.879 - type: precision_at_1 value: 22.493 - type: precision_at_10 value: 6.465999999999999 - type: precision_at_100 value: 0.932 - type: precision_at_1000 value: 0.104 - type: precision_at_20 value: 3.752 - type: precision_at_3 value: 14.069 - type: precision_at_5 value: 10.384 - type: recall_at_1 value: 21.861 - type: recall_at_10 value: 61.781 - type: recall_at_100 value: 88.095 - type: recall_at_1000 value: 97.625 - type: recall_at_20 value: 71.44500000000001 - type: recall_at_3 value: 40.653 - type: recall_at_5 value: 49.841 - type: main_score value: 40.815 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 97.4874601003192 - type: f1 value: 97.19067544931094 - type: f1_weighted value: 97.49331776181019 - type: main_score value: 97.4874601003192 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (de) type: mteb/mtop_domain config: de split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 96.89489997182305 - type: f1 value: 96.51138586512977 - type: f1_weighted value: 96.89723065967186 - type: main_score value: 96.89489997182305 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (es) type: mteb/mtop_domain config: es split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 97.17144763175452 - type: f1 value: 96.81785681878274 - type: f1_weighted value: 97.1778974586874 - type: main_score value: 97.17144763175452 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 96.30128405887879 - type: f1 value: 95.94555923088487 - type: f1_weighted value: 96.30399416794926 - type: main_score value: 96.30128405887879 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 84.53488372093022 - type: f1 value: 61.77995074251401 - type: f1_weighted value: 86.8005170485101 - type: main_score value: 84.53488372093022 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (de) type: mteb/mtop_intent config: de split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 80.79459002535924 - type: f1 value: 56.08938302001448 - type: f1_weighted value: 83.66582131948252 - type: main_score value: 80.79459002535924 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (es) type: mteb/mtop_intent config: es split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 84.7765176784523 - type: f1 value: 61.39860057885528 - type: f1_weighted value: 86.94881745670745 - type: main_score value: 84.7765176784523 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: 
mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 82.2079549013467 - type: f1 value: 59.90260478749016 - type: f1_weighted value: 84.36861708593257 - type: main_score value: 82.2079549013467 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (eng) type: mteb/masakhanews config: eng split: test revision: 18193f187b92da67168c655c9973a165ed9593dd metrics: - type: accuracy value: 74.98945147679325 - type: f1 value: 74.3157483560261 - type: f1_weighted value: 75.01179008904884 - type: main_score value: 74.98945147679325 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (fra) type: mteb/masakhanews config: fra split: test revision: 18193f187b92da67168c655c9973a165ed9593dd metrics: - type: accuracy value: 74.02843601895735 - type: f1 value: 70.40326349620732 - type: f1_weighted value: 74.6596277063484 - type: main_score value: 74.02843601895735 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (amh) type: masakhane/masakhanews config: amh split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 69.45780291725053 - type: v_measure value: 69.45780291725053 - type: v_measure_std value: 36.54340055904091 - type: main_score value: 60.95132147787602 - type: v_measure value: 60.95132147787602 - type: v_measure_std value: 37.330148394033365 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (eng) type: masakhane/masakhanews config: eng split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 64.88996119332239 - type: v_measure value: 64.88996119332239 - type: v_measure_std value: 30.017223408197268 - type: main_score value: 60.974810831426595 - type: v_measure value: 60.974810831426595 - type: v_measure_std value: 24.934675467507827 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 42.362383958691666 - type: v_measure value: 42.362383958691666 - type: v_measure_std value: 37.61076788039063 - type: main_score value: 44.479206673553335 - type: v_measure value: 44.479206673553335 - type: v_measure_std value: 32.58254804499339 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (hau) type: masakhane/masakhanews config: hau split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 43.29201252405562 - type: v_measure value: 43.29201252405562 - type: v_measure_std value: 34.31987945146255 - type: main_score value: 26.4742082741682 - type: v_measure value: 26.4742082741682 - type: v_measure_std value: 22.344929192323097 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (ibo) type: masakhane/masakhanews config: ibo split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 33.59926542995238 - type: v_measure value: 33.59926542995238 - type: v_measure_std value: 35.70048601084112 - type: main_score value: 38.906129911741985 - type: v_measure value: 38.906129911741985 - type: v_measure_std value: 34.785601792668444 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (lin) type: masakhane/masakhanews config: lin split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 67.58487601893106 - type: v_measure value: 67.58487601893106 - type: 
v_measure_std value: 35.16784970777931 - type: main_score value: 62.60982020876592 - type: v_measure value: 62.60982020876592 - type: v_measure_std value: 40.7368955715045 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (lug) type: masakhane/masakhanews config: lug split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 50.01220872023533 - type: v_measure value: 50.01220872023533 - type: v_measure_std value: 41.87411574676182 - type: main_score value: 42.70424106365967 - type: v_measure value: 42.70424106365967 - type: v_measure_std value: 46.80946241135087 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (orm) type: masakhane/masakhanews config: orm split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 29.007847502598317 - type: v_measure value: 29.007847502598317 - type: v_measure_std value: 38.374997395079994 - type: main_score value: 28.609942199922322 - type: v_measure value: 28.609942199922322 - type: v_measure_std value: 38.46685040191088 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (pcm) type: masakhane/masakhanews config: pcm split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 79.13520228554611 - type: v_measure value: 79.13520228554611 - type: v_measure_std value: 18.501843848275183 - type: main_score value: 76.83901348810822 - type: v_measure value: 76.83901348810822 - type: v_measure_std value: 17.57617141269189 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (run) type: masakhane/masakhanews config: run split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 60.317213909746656 - type: v_measure value: 60.317213909746656 - type: v_measure_std value: 36.500281823747386 - type: main_score value: 46.89757547846193 - type: v_measure value: 46.89757547846193 - type: v_measure_std value: 44.58903590203438 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (sna) type: masakhane/masakhanews config: sna split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 59.395277358240946 - type: v_measure value: 59.395277358240946 - type: v_measure_std value: 37.500916816164654 - type: main_score value: 55.37185207068829 - type: v_measure value: 55.37185207068829 - type: v_measure_std value: 36.944574863543004 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (som) type: masakhane/masakhanews config: som split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 38.18638688704302 - type: v_measure value: 38.18638688704302 - type: v_measure_std value: 35.453681137564466 - type: main_score value: 37.44211021681754 - type: v_measure value: 37.44211021681754 - type: v_measure_std value: 33.41469994463241 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (swa) type: masakhane/masakhanews config: swa split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 29.49230755729658 - type: v_measure value: 29.49230755729658 - type: v_measure_std value: 28.284313285264645 - type: main_score value: 26.020680621216062 - type: v_measure value: 26.020680621216062 - type: v_measure_std value: 25.480037522570413 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (tir) type: masakhane/masakhanews config: tir split: test revision: 
8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 60.632258622750115 - type: v_measure value: 60.632258622750115 - type: v_measure_std value: 34.429711214740564 - type: main_score value: 63.74306846771303 - type: v_measure value: 63.74306846771303 - type: v_measure_std value: 32.19119631078685 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (xho) type: masakhane/masakhanews config: xho split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 41.76322918806381 - type: v_measure value: 41.76322918806381 - type: v_measure_std value: 36.43245296200775 - type: main_score value: 24.580890519243777 - type: v_measure value: 24.580890519243777 - type: v_measure_std value: 37.941836363967106 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (yor) type: masakhane/masakhanews config: yor split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: main_score value: 33.17083910808645 - type: v_measure value: 33.17083910808645 - type: v_measure_std value: 34.87547994284835 - type: main_score value: 43.63458888828314 - type: v_measure value: 43.63458888828314 - type: v_measure_std value: 31.28169350649098 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 75.37323470073974 - type: f1 value: 71.1836877753734 - type: f1_weighted value: 75.72073213955457 - type: main_score value: 75.37323470073974 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (de) type: mteb/amazon_massive_intent config: de split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 74.83523873570948 - type: f1 value: 70.72375821116886 - type: f1_weighted value: 75.20800490010755 - type: main_score value: 74.83523873570948 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (es) type: mteb/amazon_massive_intent config: es split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 75.31607262945528 - type: f1 value: 72.06063554897662 - type: f1_weighted value: 75.72438161355252 - type: main_score value: 75.31607262945528 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ru) type: mteb/amazon_massive_intent config: ru split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 76.7955615332885 - type: f1 value: 73.08099648499756 - type: f1_weighted value: 77.18482068239668 - type: main_score value: 76.7955615332885 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 77.60591795561534 - type: f1 value: 74.46676705370395 - type: f1_weighted value: 77.69888062336614 - type: main_score value: 77.60591795561534 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 76.32145258910558 - type: f1 value: 72.89824154178328 - type: f1_weighted value: 76.6539327979472 - type: main_score value: 76.32145258910558 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN 
split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 73.21788836583724 - type: f1 value: 70.45594512246377 - type: f1_weighted value: 73.67862536499393 - type: main_score value: 73.21788836583724 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 80.82044384667114 - type: f1 value: 80.53217664465089 - type: f1_weighted value: 80.94535087010512 - type: main_score value: 80.82044384667114 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 82.1049092131809 - type: f1 value: 81.55343463694733 - type: f1_weighted value: 82.33509098770782 - type: main_score value: 82.1049092131809 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (es) type: mteb/amazon_massive_scenario config: es split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 82.58238063214526 - type: f1 value: 82.27974449333072 - type: f1_weighted value: 82.81337569618209 - type: main_score value: 82.58238063214526 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (de) type: mteb/amazon_massive_scenario config: de split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 83.97108271687962 - type: f1 value: 83.56285606936076 - type: f1_weighted value: 84.10198745390771 - type: main_score value: 83.97108271687962 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 84.71082716879623 - type: f1 value: 84.09447062371402 - type: f1_weighted value: 84.73765765551342 - type: main_score value: 84.71082716879623 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fr) type: mteb/amazon_massive_scenario config: fr split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 83.093476798924 - type: f1 value: 82.72656900752943 - type: f1_weighted value: 83.26606516503364 - type: main_score value: 83.093476798924 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (ru) type: mteb/amazon_massive_scenario config: ru split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 84.05850706119705 - type: f1 value: 83.64234048881222 - type: f1_weighted value: 84.17315768381876 - type: main_score value: 84.05850706119705 - task: type: Retrieval dataset: name: MTEB MedicalRetrieval (default) type: C-MTEB/MedicalRetrieval config: default split: dev revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6 metrics: - type: main_score value: 56.635999999999996 - type: map_at_1 value: 48.699999999999996 - type: map_at_10 value: 53.991 - type: map_at_100 value: 54.449999999999996 - type: map_at_1000 value: 54.515 - type: map_at_20 value: 54.212 - type: map_at_3 value: 52.833 - type: map_at_5 value: 53.503 - type: mrr_at_1 value: 48.699999999999996 - type: mrr_at_10 value: 53.991309523809505 - type: mrr_at_100 value: 54.45008993448266 - type: mrr_at_1000 value: 54.515253990549795 - type: mrr_at_20 value: 54.21201762247036 - type: mrr_at_3 value: 
52.8333333333333 - type: mrr_at_5 value: 53.50333333333328 - type: nauc_map_at_1000_diff1 value: 79.96867989401643 - type: nauc_map_at_1000_max value: 69.75230895599029 - type: nauc_map_at_1000_std value: 2.6418738289740213 - type: nauc_map_at_100_diff1 value: 79.95343709599133 - type: nauc_map_at_100_max value: 69.751282671507 - type: nauc_map_at_100_std value: 2.621719966106279 - type: nauc_map_at_10_diff1 value: 80.02875864565634 - type: nauc_map_at_10_max value: 69.80948662290187 - type: nauc_map_at_10_std value: 2.329151604733765 - type: nauc_map_at_1_diff1 value: 83.616940281383 - type: nauc_map_at_1_max value: 69.08142651929452 - type: nauc_map_at_1_std value: 1.9687791394035643 - type: nauc_map_at_20_diff1 value: 79.95555601275339 - type: nauc_map_at_20_max value: 69.76604695002925 - type: nauc_map_at_20_std value: 2.556184141901367 - type: nauc_map_at_3_diff1 value: 80.74790131023668 - type: nauc_map_at_3_max value: 70.57797991892402 - type: nauc_map_at_3_std value: 2.7115149849964117 - type: nauc_map_at_5_diff1 value: 80.31796539878381 - type: nauc_map_at_5_max value: 69.93573796420061 - type: nauc_map_at_5_std value: 2.0731614029506606 - type: nauc_mrr_at_1000_diff1 value: 79.96867999907981 - type: nauc_mrr_at_1000_max value: 69.57395578976896 - type: nauc_mrr_at_1000_std value: 2.46351945887829 - type: nauc_mrr_at_100_diff1 value: 79.95343709599133 - type: nauc_mrr_at_100_max value: 69.57322054130803 - type: nauc_mrr_at_100_std value: 2.4436578359073433 - type: nauc_mrr_at_10_diff1 value: 80.02875864565634 - type: nauc_mrr_at_10_max value: 69.63292630937411 - type: nauc_mrr_at_10_std value: 2.1525912912060012 - type: nauc_mrr_at_1_diff1 value: 83.616940281383 - type: nauc_mrr_at_1_max value: 68.74717310480305 - type: nauc_mrr_at_1_std value: 1.6345257249120868 - type: nauc_mrr_at_20_diff1 value: 79.95555601275339 - type: nauc_mrr_at_20_max value: 69.58883608470444 - type: nauc_mrr_at_20_std value: 2.378973276576547 - type: nauc_mrr_at_3_diff1 value: 80.74790131023668 - type: nauc_mrr_at_3_max value: 70.40430475488604 - type: nauc_mrr_at_3_std value: 2.5378398209583817 - type: nauc_mrr_at_5_diff1 value: 80.31796539878381 - type: nauc_mrr_at_5_max value: 69.7605991748183 - type: nauc_mrr_at_5_std value: 1.898022613568352 - type: nauc_ndcg_at_1000_diff1 value: 78.35504059321225 - type: nauc_ndcg_at_1000_max value: 69.06752522437093 - type: nauc_ndcg_at_1000_std value: 3.9624036886099265 - type: nauc_ndcg_at_100_diff1 value: 77.79729140249833 - type: nauc_ndcg_at_100_max value: 68.93113791506029 - type: nauc_ndcg_at_100_std value: 3.642178826886181 - type: nauc_ndcg_at_10_diff1 value: 78.160158293918 - type: nauc_ndcg_at_10_max value: 69.28122202281361 - type: nauc_ndcg_at_10_std value: 2.438976810940962 - type: nauc_ndcg_at_1_diff1 value: 83.616940281383 - type: nauc_ndcg_at_1_max value: 69.08142651929452 - type: nauc_ndcg_at_1_std value: 1.9687791394035643 - type: nauc_ndcg_at_20_diff1 value: 77.88514432874997 - type: nauc_ndcg_at_20_max value: 69.06148818508873 - type: nauc_ndcg_at_20_std value: 3.1800249272363676 - type: nauc_ndcg_at_3_diff1 value: 79.73510384405803 - type: nauc_ndcg_at_3_max value: 70.78000695123832 - type: nauc_ndcg_at_3_std value: 2.9041415468363274 - type: nauc_ndcg_at_5_diff1 value: 78.91872808866195 - type: nauc_ndcg_at_5_max value: 69.61478429620091 - type: nauc_ndcg_at_5_std value: 1.734699636301054 - type: nauc_precision_at_1000_diff1 value: 66.37858395390673 - type: nauc_precision_at_1000_max value: 60.651659037598534 - type: 
nauc_precision_at_1000_std value: 27.388353715469798 - type: nauc_precision_at_100_diff1 value: 66.34325807776025 - type: nauc_precision_at_100_max value: 63.63855305621111 - type: nauc_precision_at_100_std value: 10.641748149575351 - type: nauc_precision_at_10_diff1 value: 71.3784685491089 - type: nauc_precision_at_10_max value: 67.05313695174542 - type: nauc_precision_at_10_std value: 3.000406867930561 - type: nauc_precision_at_1_diff1 value: 83.616940281383 - type: nauc_precision_at_1_max value: 69.08142651929452 - type: nauc_precision_at_1_std value: 1.9687791394035643 - type: nauc_precision_at_20_diff1 value: 69.73407910977694 - type: nauc_precision_at_20_max value: 65.77426240320742 - type: nauc_precision_at_20_std value: 6.204416838482586 - type: nauc_precision_at_3_diff1 value: 76.63737537643107 - type: nauc_precision_at_3_max value: 71.29710200719668 - type: nauc_precision_at_3_std value: 3.47180961484546 - type: nauc_precision_at_5_diff1 value: 74.36945983536717 - type: nauc_precision_at_5_max value: 68.33292218003061 - type: nauc_precision_at_5_std value: 0.47128762620258075 - type: nauc_recall_at_1000_diff1 value: 66.37858395390681 - type: nauc_recall_at_1000_max value: 60.65165903759889 - type: nauc_recall_at_1000_std value: 27.388353715469822 - type: nauc_recall_at_100_diff1 value: 66.34325807776025 - type: nauc_recall_at_100_max value: 63.63855305621116 - type: nauc_recall_at_100_std value: 10.641748149575351 - type: nauc_recall_at_10_diff1 value: 71.37846854910892 - type: nauc_recall_at_10_max value: 67.05313695174546 - type: nauc_recall_at_10_std value: 3.000406867930663 - type: nauc_recall_at_1_diff1 value: 83.616940281383 - type: nauc_recall_at_1_max value: 69.08142651929452 - type: nauc_recall_at_1_std value: 1.9687791394035643 - type: nauc_recall_at_20_diff1 value: 69.73407910977691 - type: nauc_recall_at_20_max value: 65.77426240320746 - type: nauc_recall_at_20_std value: 6.204416838482536 - type: nauc_recall_at_3_diff1 value: 76.63737537643112 - type: nauc_recall_at_3_max value: 71.29710200719668 - type: nauc_recall_at_3_std value: 3.471809614845442 - type: nauc_recall_at_5_diff1 value: 74.36945983536715 - type: nauc_recall_at_5_max value: 68.33292218003065 - type: nauc_recall_at_5_std value: 0.4712876262026442 - type: ndcg_at_1 value: 48.699999999999996 - type: ndcg_at_10 value: 56.635999999999996 - type: ndcg_at_100 value: 59.193 - type: ndcg_at_1000 value: 60.97 - type: ndcg_at_20 value: 57.426 - type: ndcg_at_3 value: 54.186 - type: ndcg_at_5 value: 55.407 - type: precision_at_1 value: 48.699999999999996 - type: precision_at_10 value: 6.5 - type: precision_at_100 value: 0.777 - type: precision_at_1000 value: 0.092 - type: precision_at_20 value: 3.405 - type: precision_at_3 value: 19.367 - type: precision_at_5 value: 12.22 - type: recall_at_1 value: 48.699999999999996 - type: recall_at_10 value: 65.0 - type: recall_at_100 value: 77.7 - type: recall_at_1000 value: 91.8 - type: recall_at_20 value: 68.10000000000001 - type: recall_at_3 value: 58.099999999999994 - type: recall_at_5 value: 61.1 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P (default) type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: main_score value: 34.80188561439236 - type: v_measure value: 34.80188561439236 - type: v_measure_std value: 1.5703148841573102 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S (default) type: mteb/medrxiv-clustering-s2s config: default split: test revision: 
35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: main_score value: 32.42285513996236 - type: v_measure value: 32.42285513996236 - type: v_measure_std value: 1.3769867487457566 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (de) type: jinaai/mintakaqa config: de split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: main_score value: 27.025 - type: map_at_1 value: 14.532 - type: map_at_10 value: 22.612 - type: map_at_100 value: 23.802 - type: map_at_1000 value: 23.9 - type: map_at_20 value: 23.275000000000002 - type: map_at_3 value: 20.226 - type: map_at_5 value: 21.490000000000002 - type: mrr_at_1 value: 14.532434709351305 - type: mrr_at_10 value: 22.612077265615575 - type: mrr_at_100 value: 23.801523356874675 - type: mrr_at_1000 value: 23.900118499340238 - type: mrr_at_20 value: 23.275466430108995 - type: mrr_at_3 value: 20.22606009547877 - type: mrr_at_5 value: 21.489750070204945 - type: nauc_map_at_1000_diff1 value: 14.148987799763596 - type: nauc_map_at_1000_max value: 44.70338461387784 - type: nauc_map_at_1000_std value: 15.868006767707637 - type: nauc_map_at_100_diff1 value: 14.11371769080442 - type: nauc_map_at_100_max value: 44.67995540936296 - type: nauc_map_at_100_std value: 15.890796502029076 - type: nauc_map_at_10_diff1 value: 14.29066834165688 - type: nauc_map_at_10_max value: 45.10997111765282 - type: nauc_map_at_10_std value: 15.508568918629864 - type: nauc_map_at_1_diff1 value: 23.473291302576396 - type: nauc_map_at_1_max value: 44.68942599764586 - type: nauc_map_at_1_std value: 12.424377262427253 - type: nauc_map_at_20_diff1 value: 14.112652046087831 - type: nauc_map_at_20_max value: 44.82014861413682 - type: nauc_map_at_20_std value: 15.739350613646385 - type: nauc_map_at_3_diff1 value: 16.119659221396347 - type: nauc_map_at_3_max value: 46.04766378953525 - type: nauc_map_at_3_std value: 13.969878046315925 - type: nauc_map_at_5_diff1 value: 15.095453434076184 - type: nauc_map_at_5_max value: 45.802128149314406 - type: nauc_map_at_5_std value: 14.957442173319949 - type: nauc_mrr_at_1000_diff1 value: 14.148987799763596 - type: nauc_mrr_at_1000_max value: 44.70338461387784 - type: nauc_mrr_at_1000_std value: 15.868006767707637 - type: nauc_mrr_at_100_diff1 value: 14.11371769080442 - type: nauc_mrr_at_100_max value: 44.67995540936296 - type: nauc_mrr_at_100_std value: 15.890796502029076 - type: nauc_mrr_at_10_diff1 value: 14.29066834165688 - type: nauc_mrr_at_10_max value: 45.10997111765282 - type: nauc_mrr_at_10_std value: 15.508568918629864 - type: nauc_mrr_at_1_diff1 value: 23.473291302576396 - type: nauc_mrr_at_1_max value: 44.68942599764586 - type: nauc_mrr_at_1_std value: 12.424377262427253 - type: nauc_mrr_at_20_diff1 value: 14.112652046087831 - type: nauc_mrr_at_20_max value: 44.82014861413682 - type: nauc_mrr_at_20_std value: 15.739350613646385 - type: nauc_mrr_at_3_diff1 value: 16.119659221396347 - type: nauc_mrr_at_3_max value: 46.04766378953525 - type: nauc_mrr_at_3_std value: 13.969878046315925 - type: nauc_mrr_at_5_diff1 value: 15.095453434076184 - type: nauc_mrr_at_5_max value: 45.802128149314406 - type: nauc_mrr_at_5_std value: 14.957442173319949 - type: nauc_ndcg_at_1000_diff1 value: 11.626606894574028 - type: nauc_ndcg_at_1000_max value: 43.328592841065536 - type: nauc_ndcg_at_1000_std value: 18.049446272245547 - type: nauc_ndcg_at_100_diff1 value: 10.485720606660239 - type: nauc_ndcg_at_100_max value: 42.405317674170966 - type: nauc_ndcg_at_100_std value: 19.107151641936987 - type: nauc_ndcg_at_10_diff1 value: 
11.029351078162982 - type: nauc_ndcg_at_10_max value: 44.36855031964681 - type: nauc_ndcg_at_10_std value: 17.302796171409305 - type: nauc_ndcg_at_1_diff1 value: 23.473291302576396 - type: nauc_ndcg_at_1_max value: 44.68942599764586 - type: nauc_ndcg_at_1_std value: 12.424377262427253 - type: nauc_ndcg_at_20_diff1 value: 10.356662718168412 - type: nauc_ndcg_at_20_max value: 43.31602680430083 - type: nauc_ndcg_at_20_std value: 18.162891267850316 - type: nauc_ndcg_at_3_diff1 value: 14.42844952297869 - type: nauc_ndcg_at_3_max value: 46.26603339466543 - type: nauc_ndcg_at_3_std value: 14.449362723887857 - type: nauc_ndcg_at_5_diff1 value: 12.783416563486396 - type: nauc_ndcg_at_5_max value: 45.852176479124424 - type: nauc_ndcg_at_5_std value: 16.11775016428085 - type: nauc_precision_at_1000_diff1 value: -8.045361059399795 - type: nauc_precision_at_1000_max value: 21.970273281738777 - type: nauc_precision_at_1000_std value: 49.564650488193266 - type: nauc_precision_at_100_diff1 value: -2.118628861593353 - type: nauc_precision_at_100_max value: 31.32498977104778 - type: nauc_precision_at_100_std value: 32.96087731883451 - type: nauc_precision_at_10_diff1 value: 3.0335517475367615 - type: nauc_precision_at_10_max value: 42.21620215030219 - type: nauc_precision_at_10_std value: 21.90159732315962 - type: nauc_precision_at_1_diff1 value: 23.473291302576396 - type: nauc_precision_at_1_max value: 44.68942599764586 - type: nauc_precision_at_1_std value: 12.424377262427253 - type: nauc_precision_at_20_diff1 value: 0.4087201843719047 - type: nauc_precision_at_20_max value: 38.485034773895734 - type: nauc_precision_at_20_std value: 25.077397979916682 - type: nauc_precision_at_3_diff1 value: 10.408327736589833 - type: nauc_precision_at_3_max value: 46.757216289175076 - type: nauc_precision_at_3_std value: 15.62594354926867 - type: nauc_precision_at_5_diff1 value: 7.326752744229544 - type: nauc_precision_at_5_max value: 45.89190518573553 - type: nauc_precision_at_5_std value: 19.01717163438957 - type: nauc_recall_at_1000_diff1 value: -8.045361059400387 - type: nauc_recall_at_1000_max value: 21.97027328173812 - type: nauc_recall_at_1000_std value: 49.56465048819266 - type: nauc_recall_at_100_diff1 value: -2.118628861593277 - type: nauc_recall_at_100_max value: 31.324989771047818 - type: nauc_recall_at_100_std value: 32.96087731883457 - type: nauc_recall_at_10_diff1 value: 3.0335517475367166 - type: nauc_recall_at_10_max value: 42.21620215030217 - type: nauc_recall_at_10_std value: 21.901597323159606 - type: nauc_recall_at_1_diff1 value: 23.473291302576396 - type: nauc_recall_at_1_max value: 44.68942599764586 - type: nauc_recall_at_1_std value: 12.424377262427253 - type: nauc_recall_at_20_diff1 value: 0.40872018437190905 - type: nauc_recall_at_20_max value: 38.485034773895734 - type: nauc_recall_at_20_std value: 25.077397979916693 - type: nauc_recall_at_3_diff1 value: 10.408327736589843 - type: nauc_recall_at_3_max value: 46.75721628917507 - type: nauc_recall_at_3_std value: 15.625943549268664 - type: nauc_recall_at_5_diff1 value: 7.326752744229548 - type: nauc_recall_at_5_max value: 45.89190518573557 - type: nauc_recall_at_5_std value: 19.01717163438958 - type: ndcg_at_1 value: 14.532 - type: ndcg_at_10 value: 27.025 - type: ndcg_at_100 value: 33.305 - type: ndcg_at_1000 value: 36.38 - type: ndcg_at_20 value: 29.443 - type: ndcg_at_3 value: 22.035 - type: ndcg_at_5 value: 24.319 - type: precision_at_1 value: 14.532 - type: precision_at_10 value: 4.115 - type: precision_at_100 value: 0.717 - type: 
precision_at_1000 value: 0.097 - type: precision_at_20 value: 2.536 - type: precision_at_3 value: 9.085 - type: precision_at_5 value: 6.563 - type: recall_at_1 value: 14.532 - type: recall_at_10 value: 41.154 - type: recall_at_100 value: 71.651 - type: recall_at_1000 value: 96.841 - type: recall_at_20 value: 50.71600000000001 - type: recall_at_3 value: 27.254 - type: recall_at_5 value: 32.814 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (es) type: jinaai/mintakaqa config: es split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: main_score value: 26.912000000000003 - type: map_at_1 value: 14.686 - type: map_at_10 value: 22.569 - type: map_at_100 value: 23.679 - type: map_at_1000 value: 23.777 - type: map_at_20 value: 23.169 - type: map_at_3 value: 20.201 - type: map_at_5 value: 21.566 - type: mrr_at_1 value: 14.686468646864686 - type: mrr_at_10 value: 22.569346220336296 - type: mrr_at_100 value: 23.678819125817146 - type: mrr_at_1000 value: 23.77713511338264 - type: mrr_at_20 value: 23.16850858443442 - type: mrr_at_3 value: 20.200770077007665 - type: mrr_at_5 value: 21.56628162816276 - type: nauc_map_at_1000_diff1 value: 14.129007578838381 - type: nauc_map_at_1000_max value: 44.4255501141499 - type: nauc_map_at_1000_std value: 19.95906154868176 - type: nauc_map_at_100_diff1 value: 14.09071870575231 - type: nauc_map_at_100_max value: 44.403179928955566 - type: nauc_map_at_100_std value: 20.00413657519976 - type: nauc_map_at_10_diff1 value: 14.149535953153688 - type: nauc_map_at_10_max value: 44.66529917634685 - type: nauc_map_at_10_std value: 19.580235989479394 - type: nauc_map_at_1_diff1 value: 23.489813522176636 - type: nauc_map_at_1_max value: 46.54578639925787 - type: nauc_map_at_1_std value: 16.39083721709994 - type: nauc_map_at_20_diff1 value: 14.021560420656181 - type: nauc_map_at_20_max value: 44.4825455452467 - type: nauc_map_at_20_std value: 19.886927750826878 - type: nauc_map_at_3_diff1 value: 16.182977890477723 - type: nauc_map_at_3_max value: 46.1840554029258 - type: nauc_map_at_3_std value: 18.735671900228958 - type: nauc_map_at_5_diff1 value: 14.779126395472833 - type: nauc_map_at_5_max value: 45.23237213817556 - type: nauc_map_at_5_std value: 19.348508580412872 - type: nauc_mrr_at_1000_diff1 value: 14.129007578838381 - type: nauc_mrr_at_1000_max value: 44.4255501141499 - type: nauc_mrr_at_1000_std value: 19.95906154868176 - type: nauc_mrr_at_100_diff1 value: 14.09071870575231 - type: nauc_mrr_at_100_max value: 44.403179928955566 - type: nauc_mrr_at_100_std value: 20.00413657519976 - type: nauc_mrr_at_10_diff1 value: 14.149535953153688 - type: nauc_mrr_at_10_max value: 44.66529917634685 - type: nauc_mrr_at_10_std value: 19.580235989479394 - type: nauc_mrr_at_1_diff1 value: 23.489813522176636 - type: nauc_mrr_at_1_max value: 46.54578639925787 - type: nauc_mrr_at_1_std value: 16.39083721709994 - type: nauc_mrr_at_20_diff1 value: 14.021560420656181 - type: nauc_mrr_at_20_max value: 44.4825455452467 - type: nauc_mrr_at_20_std value: 19.886927750826878 - type: nauc_mrr_at_3_diff1 value: 16.182977890477723 - type: nauc_mrr_at_3_max value: 46.1840554029258 - type: nauc_mrr_at_3_std value: 18.735671900228958 - type: nauc_mrr_at_5_diff1 value: 14.779126395472833 - type: nauc_mrr_at_5_max value: 45.23237213817556 - type: nauc_mrr_at_5_std value: 19.348508580412872 - type: nauc_ndcg_at_1000_diff1 value: 11.762470380481101 - type: nauc_ndcg_at_1000_max value: 42.8233203033089 - type: nauc_ndcg_at_1000_std value: 21.78503705117719 - type: 
nauc_ndcg_at_100_diff1 value: 10.45886076220022 - type: nauc_ndcg_at_100_max value: 41.85472899256818 - type: nauc_ndcg_at_100_std value: 23.20955486335138 - type: nauc_ndcg_at_10_diff1 value: 10.605912468659469 - type: nauc_ndcg_at_10_max value: 43.150942448104715 - type: nauc_ndcg_at_10_std value: 21.120035764826085 - type: nauc_ndcg_at_1_diff1 value: 23.489813522176636 - type: nauc_ndcg_at_1_max value: 46.54578639925787 - type: nauc_ndcg_at_1_std value: 16.39083721709994 - type: nauc_ndcg_at_20_diff1 value: 10.11291783888644 - type: nauc_ndcg_at_20_max value: 42.51260678842788 - type: nauc_ndcg_at_20_std value: 22.1744949382252 - type: nauc_ndcg_at_3_diff1 value: 14.25625326760802 - type: nauc_ndcg_at_3_max value: 45.96162916377383 - type: nauc_ndcg_at_3_std value: 19.557832728215523 - type: nauc_ndcg_at_5_diff1 value: 11.956317653823053 - type: nauc_ndcg_at_5_max value: 44.35971268886807 - type: nauc_ndcg_at_5_std value: 20.581696730374233 - type: nauc_precision_at_1000_diff1 value: 5.132291843566577 - type: nauc_precision_at_1000_max value: 25.293354576835263 - type: nauc_precision_at_1000_std value: 40.36005126087624 - type: nauc_precision_at_100_diff1 value: -1.5252854375008238 - type: nauc_precision_at_100_max value: 31.007586474495984 - type: nauc_precision_at_100_std value: 37.297552993548386 - type: nauc_precision_at_10_diff1 value: 1.9663657370770737 - type: nauc_precision_at_10_max value: 39.194092293625125 - type: nauc_precision_at_10_std value: 24.956542621999542 - type: nauc_precision_at_1_diff1 value: 23.489813522176636 - type: nauc_precision_at_1_max value: 46.54578639925787 - type: nauc_precision_at_1_std value: 16.39083721709994 - type: nauc_precision_at_20_diff1 value: 0.011112090390932373 - type: nauc_precision_at_20_max value: 36.9357074392519 - type: nauc_precision_at_20_std value: 28.611387115093876 - type: nauc_precision_at_3_diff1 value: 9.596831091013703 - type: nauc_precision_at_3_max value: 45.3905541893809 - type: nauc_precision_at_3_std value: 21.599314388526945 - type: nauc_precision_at_5_diff1 value: 5.175887949900142 - type: nauc_precision_at_5_max value: 42.129467510414464 - type: nauc_precision_at_5_std value: 23.607251548776677 - type: nauc_recall_at_1000_diff1 value: 5.132291843566257 - type: nauc_recall_at_1000_max value: 25.29335457683396 - type: nauc_recall_at_1000_std value: 40.36005126087638 - type: nauc_recall_at_100_diff1 value: -1.5252854375008988 - type: nauc_recall_at_100_max value: 31.00758647449594 - type: nauc_recall_at_100_std value: 37.29755299354834 - type: nauc_recall_at_10_diff1 value: 1.9663657370770793 - type: nauc_recall_at_10_max value: 39.19409229362512 - type: nauc_recall_at_10_std value: 24.956542621999546 - type: nauc_recall_at_1_diff1 value: 23.489813522176636 - type: nauc_recall_at_1_max value: 46.54578639925787 - type: nauc_recall_at_1_std value: 16.39083721709994 - type: nauc_recall_at_20_diff1 value: 0.011112090390923075 - type: nauc_recall_at_20_max value: 36.93570743925189 - type: nauc_recall_at_20_std value: 28.611387115093883 - type: nauc_recall_at_3_diff1 value: 9.596831091013714 - type: nauc_recall_at_3_max value: 45.39055418938087 - type: nauc_recall_at_3_std value: 21.599314388526956 - type: nauc_recall_at_5_diff1 value: 5.17588794990012 - type: nauc_recall_at_5_max value: 42.12946751041448 - type: nauc_recall_at_5_std value: 23.607251548776695 - type: ndcg_at_1 value: 14.686 - type: ndcg_at_10 value: 26.912000000000003 - type: ndcg_at_100 value: 32.919 - type: ndcg_at_1000 value: 36.119 - type: ndcg_at_20 value: 
29.079 - type: ndcg_at_3 value: 21.995 - type: ndcg_at_5 value: 24.474999999999998 - type: precision_at_1 value: 14.686 - type: precision_at_10 value: 4.08 - type: precision_at_100 value: 0.703 - type: precision_at_1000 value: 0.097 - type: precision_at_20 value: 2.467 - type: precision_at_3 value: 9.062000000000001 - type: precision_at_5 value: 6.65 - type: recall_at_1 value: 14.686 - type: recall_at_10 value: 40.8 - type: recall_at_100 value: 70.338 - type: recall_at_1000 value: 96.82300000000001 - type: recall_at_20 value: 49.34 - type: recall_at_3 value: 27.186 - type: recall_at_5 value: 33.251 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (fr) type: jinaai/mintakaqa config: fr split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: main_score value: 26.909 - type: map_at_1 value: 14.701 - type: map_at_10 value: 22.613 - type: map_at_100 value: 23.729 - type: map_at_1000 value: 23.837 - type: map_at_20 value: 23.262 - type: map_at_3 value: 20.236 - type: map_at_5 value: 21.673000000000002 - type: mrr_at_1 value: 14.7010647010647 - type: mrr_at_10 value: 22.613165113165113 - type: mrr_at_100 value: 23.72877605989423 - type: mrr_at_1000 value: 23.837150802746805 - type: mrr_at_20 value: 23.261627081110596 - type: mrr_at_3 value: 20.2361452361452 - type: mrr_at_5 value: 21.673491673491625 - type: nauc_map_at_1000_diff1 value: 17.08927788889635 - type: nauc_map_at_1000_max value: 47.240929150603336 - type: nauc_map_at_1000_std value: 20.559244258100275 - type: nauc_map_at_100_diff1 value: 17.029461792796777 - type: nauc_map_at_100_max value: 47.207381115550696 - type: nauc_map_at_100_std value: 20.581498156895265 - type: nauc_map_at_10_diff1 value: 17.351456007804536 - type: nauc_map_at_10_max value: 47.815880040221344 - type: nauc_map_at_10_std value: 20.292999107555794 - type: nauc_map_at_1_diff1 value: 27.297525357600776 - type: nauc_map_at_1_max value: 47.18835074959486 - type: nauc_map_at_1_std value: 18.304203168281834 - type: nauc_map_at_20_diff1 value: 17.157460199542136 - type: nauc_map_at_20_max value: 47.4776610667456 - type: nauc_map_at_20_std value: 20.499186342964478 - type: nauc_map_at_3_diff1 value: 19.393119961356277 - type: nauc_map_at_3_max value: 49.02841822452882 - type: nauc_map_at_3_std value: 19.293122796321292 - type: nauc_map_at_5_diff1 value: 17.76275044752008 - type: nauc_map_at_5_max value: 48.01292548040298 - type: nauc_map_at_5_std value: 19.928449977400504 - type: nauc_mrr_at_1000_diff1 value: 17.08927788889635 - type: nauc_mrr_at_1000_max value: 47.240929150603336 - type: nauc_mrr_at_1000_std value: 20.559244258100275 - type: nauc_mrr_at_100_diff1 value: 17.029461792796777 - type: nauc_mrr_at_100_max value: 47.207381115550696 - type: nauc_mrr_at_100_std value: 20.581498156895265 - type: nauc_mrr_at_10_diff1 value: 17.351456007804536 - type: nauc_mrr_at_10_max value: 47.815880040221344 - type: nauc_mrr_at_10_std value: 20.292999107555794 - type: nauc_mrr_at_1_diff1 value: 27.297525357600776 - type: nauc_mrr_at_1_max value: 47.18835074959486 - type: nauc_mrr_at_1_std value: 18.304203168281834 - type: nauc_mrr_at_20_diff1 value: 17.157460199542136 - type: nauc_mrr_at_20_max value: 47.4776610667456 - type: nauc_mrr_at_20_std value: 20.499186342964478 - type: nauc_mrr_at_3_diff1 value: 19.393119961356277 - type: nauc_mrr_at_3_max value: 49.02841822452882 - type: nauc_mrr_at_3_std value: 19.293122796321292 - type: nauc_mrr_at_5_diff1 value: 17.76275044752008 - type: nauc_mrr_at_5_max value: 48.01292548040298 - type: 
nauc_mrr_at_5_std value: 19.928449977400504 - type: nauc_ndcg_at_1000_diff1 value: 13.989496006047975 - type: nauc_ndcg_at_1000_max value: 45.626323944336114 - type: nauc_ndcg_at_1000_std value: 22.125600410796515 - type: nauc_ndcg_at_100_diff1 value: 12.302204843705244 - type: nauc_ndcg_at_100_max value: 44.46856314559079 - type: nauc_ndcg_at_100_std value: 23.084984546328677 - type: nauc_ndcg_at_10_diff1 value: 14.001226213368275 - type: nauc_ndcg_at_10_max value: 47.37780636546918 - type: nauc_ndcg_at_10_std value: 21.702709032840637 - type: nauc_ndcg_at_1_diff1 value: 27.297525357600776 - type: nauc_ndcg_at_1_max value: 47.18835074959486 - type: nauc_ndcg_at_1_std value: 18.304203168281834 - type: nauc_ndcg_at_20_diff1 value: 13.317759910171056 - type: nauc_ndcg_at_20_max value: 46.25171251043813 - type: nauc_ndcg_at_20_std value: 22.309331575402595 - type: nauc_ndcg_at_3_diff1 value: 17.555381234893872 - type: nauc_ndcg_at_3_max value: 49.48635590260059 - type: nauc_ndcg_at_3_std value: 19.734570962933674 - type: nauc_ndcg_at_5_diff1 value: 14.844841165765061 - type: nauc_ndcg_at_5_max value: 47.76437065028708 - type: nauc_ndcg_at_5_std value: 20.816034479453954 - type: nauc_precision_at_1000_diff1 value: -15.591898698252546 - type: nauc_precision_at_1000_max value: 20.545984285353892 - type: nauc_precision_at_1000_std value: 38.9013414992826 - type: nauc_precision_at_100_diff1 value: -5.290395978742176 - type: nauc_precision_at_100_max value: 31.340480360546845 - type: nauc_precision_at_100_std value: 33.6897935720505 - type: nauc_precision_at_10_diff1 value: 5.965001997926562 - type: nauc_precision_at_10_max value: 46.12515296162247 - type: nauc_precision_at_10_std value: 25.409433135253558 - type: nauc_precision_at_1_diff1 value: 27.297525357600776 - type: nauc_precision_at_1_max value: 47.18835074959486 - type: nauc_precision_at_1_std value: 18.304203168281834 - type: nauc_precision_at_20_diff1 value: 3.4438127279827744 - type: nauc_precision_at_20_max value: 42.36095587714494 - type: nauc_precision_at_20_std value: 27.367900512797906 - type: nauc_precision_at_3_diff1 value: 13.165017224718916 - type: nauc_precision_at_3_max value: 50.58931825484506 - type: nauc_precision_at_3_std value: 20.852009214609442 - type: nauc_precision_at_5_diff1 value: 7.840087177549876 - type: nauc_precision_at_5_max value: 46.99388755575109 - type: nauc_precision_at_5_std value: 23.048702393099834 - type: nauc_recall_at_1000_diff1 value: -15.591898698252932 - type: nauc_recall_at_1000_max value: 20.5459842853537 - type: nauc_recall_at_1000_std value: 38.901341499282395 - type: nauc_recall_at_100_diff1 value: -5.290395978742165 - type: nauc_recall_at_100_max value: 31.340480360546863 - type: nauc_recall_at_100_std value: 33.68979357205046 - type: nauc_recall_at_10_diff1 value: 5.96500199792656 - type: nauc_recall_at_10_max value: 46.1251529616225 - type: nauc_recall_at_10_std value: 25.409433135253543 - type: nauc_recall_at_1_diff1 value: 27.297525357600776 - type: nauc_recall_at_1_max value: 47.18835074959486 - type: nauc_recall_at_1_std value: 18.304203168281834 - type: nauc_recall_at_20_diff1 value: 3.4438127279827833 - type: nauc_recall_at_20_max value: 42.36095587714498 - type: nauc_recall_at_20_std value: 27.36790051279787 - type: nauc_recall_at_3_diff1 value: 13.165017224718916 - type: nauc_recall_at_3_max value: 50.589318254845054 - type: nauc_recall_at_3_std value: 20.852009214609435 - type: nauc_recall_at_5_diff1 value: 7.840087177549891 - type: nauc_recall_at_5_max value: 46.99388755575112 - 
type: nauc_recall_at_5_std value: 23.048702393099845 - type: ndcg_at_1 value: 14.701 - type: ndcg_at_10 value: 26.909 - type: ndcg_at_100 value: 32.727000000000004 - type: ndcg_at_1000 value: 36.086 - type: ndcg_at_20 value: 29.236 - type: ndcg_at_3 value: 22.004 - type: ndcg_at_5 value: 24.615000000000002 - type: precision_at_1 value: 14.701 - type: precision_at_10 value: 4.062 - type: precision_at_100 value: 0.688 - type: precision_at_1000 value: 0.096 - type: precision_at_20 value: 2.488 - type: precision_at_3 value: 9.036 - type: precision_at_5 value: 6.699 - type: recall_at_1 value: 14.701 - type: recall_at_10 value: 40.622 - type: recall_at_100 value: 68.796 - type: recall_at_1000 value: 96.314 - type: recall_at_20 value: 49.754 - type: recall_at_3 value: 27.108999999999998 - type: recall_at_5 value: 33.497 - task: type: Classification dataset: name: MTEB MultilingualSentiment (default) type: C-MTEB/MultilingualSentiment-classification config: default split: test revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a metrics: - type: accuracy value: 73.20999999999998 - type: f1 value: 73.18755986777474 - type: f1_weighted value: 73.18755986777475 - type: main_score value: 73.20999999999998 - task: type: Retrieval dataset: name: MTEB NFCorpus (default) type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 4.822 - type: map_at_10 value: 13.144 - type: map_at_100 value: 17.254 - type: map_at_1000 value: 18.931 - type: map_at_20 value: 14.834 - type: map_at_3 value: 8.975 - type: map_at_5 value: 10.922 - type: mrr_at_1 value: 47.059 - type: mrr_at_10 value: 55.806999999999995 - type: mrr_at_100 value: 56.286 - type: mrr_at_1000 value: 56.327000000000005 - type: mrr_at_20 value: 56.00000000000001 - type: mrr_at_3 value: 54.17999999999999 - type: mrr_at_5 value: 55.155 - type: ndcg_at_1 value: 44.427 - type: ndcg_at_10 value: 36.623 - type: ndcg_at_100 value: 33.664 - type: ndcg_at_1000 value: 42.538 - type: ndcg_at_20 value: 34.066 - type: ndcg_at_3 value: 41.118 - type: ndcg_at_5 value: 39.455 - type: precision_at_1 value: 46.44 - type: precision_at_10 value: 28.607 - type: precision_at_100 value: 9.189 - type: precision_at_1000 value: 2.261 - type: precision_at_20 value: 21.238 - type: precision_at_3 value: 39.628 - type: precision_at_5 value: 35.604 - type: recall_at_1 value: 4.822 - type: recall_at_10 value: 17.488999999999997 - type: recall_at_100 value: 35.052 - type: recall_at_1000 value: 66.67999999999999 - type: recall_at_20 value: 21.343999999999998 - type: recall_at_3 value: 10.259 - type: recall_at_5 value: 13.406 - type: main_score value: 36.623 - task: type: Retrieval dataset: name: MTEB NQ (default) type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 41.411 - type: map_at_10 value: 57.179 - type: map_at_100 value: 57.945 - type: map_at_1000 value: 57.967999999999996 - type: map_at_20 value: 57.687 - type: map_at_3 value: 53.46300000000001 - type: map_at_5 value: 55.696999999999996 - type: mrr_at_1 value: 46.233999999999995 - type: mrr_at_10 value: 59.831999999999994 - type: mrr_at_100 value: 60.33500000000001 - type: mrr_at_1000 value: 60.348 - type: mrr_at_20 value: 60.167 - type: mrr_at_3 value: 56.972 - type: mrr_at_5 value: 58.74 - type: ndcg_at_1 value: 46.205 - type: ndcg_at_10 value: 64.23100000000001 - type: ndcg_at_100 value: 67.242 - type: ndcg_at_1000 value: 67.72500000000001 - type: ndcg_at_20 value: 
65.77300000000001 - type: ndcg_at_3 value: 57.516 - type: ndcg_at_5 value: 61.11600000000001 - type: precision_at_1 value: 46.205 - type: precision_at_10 value: 9.873 - type: precision_at_100 value: 1.158 - type: precision_at_1000 value: 0.12 - type: precision_at_20 value: 5.319 - type: precision_at_3 value: 25.424999999999997 - type: precision_at_5 value: 17.375 - type: recall_at_1 value: 41.411 - type: recall_at_10 value: 82.761 - type: recall_at_100 value: 95.52199999999999 - type: recall_at_1000 value: 99.02499999999999 - type: recall_at_20 value: 88.34 - type: recall_at_3 value: 65.73 - type: recall_at_5 value: 73.894 - type: main_score value: 64.23100000000001 - task: type: PairClassification dataset: name: MTEB Ocnli (default) type: C-MTEB/OCNLI config: default split: validation revision: 66e76a618a34d6d565d5538088562851e6daa7ec metrics: - type: cosine_accuracy value: 62.3714131023281 - type: cosine_accuracy_threshold value: 79.70921993255615 - type: cosine_ap value: 66.41380155495659 - type: cosine_f1 value: 68.89547185780786 - type: cosine_f1_threshold value: 72.91591167449951 - type: cosine_precision value: 57.485875706214685 - type: cosine_recall value: 85.95564941921859 - type: dot_accuracy value: 60.47644829453167 - type: dot_accuracy_threshold value: 36627.362060546875 - type: dot_ap value: 63.696303449293204 - type: dot_f1 value: 68.3986041101202 - type: dot_f1_threshold value: 30452.72216796875 - type: dot_precision value: 54.04411764705882 - type: dot_recall value: 93.13621964097149 - type: euclidean_accuracy value: 63.02111532214402 - type: euclidean_accuracy_threshold value: 1392.76762008667 - type: euclidean_ap value: 66.65907089443218 - type: euclidean_f1 value: 69.05036524413688 - type: euclidean_f1_threshold value: 1711.5310668945312 - type: euclidean_precision value: 54.29262394195889 - type: euclidean_recall value: 94.82576557550159 - type: main_score value: 63.02111532214402 - type: manhattan_accuracy value: 62.75040606388739 - type: manhattan_accuracy_threshold value: 32475.347900390625 - type: manhattan_ap value: 66.50943585125434 - type: manhattan_f1 value: 69.08382066276802 - type: manhattan_f1_threshold value: 41238.470458984375 - type: manhattan_precision value: 54.75896168108776 - type: manhattan_recall value: 93.55860612460401 - type: max_accuracy value: 63.02111532214402 - type: max_ap value: 66.65907089443218 - type: max_f1 value: 69.08382066276802 - type: max_precision value: 57.485875706214685 - type: max_recall value: 94.82576557550159 - type: similarity_accuracy value: 62.3714131023281 - type: similarity_accuracy_threshold value: 79.70921993255615 - type: similarity_ap value: 66.41380155495659 - type: similarity_f1 value: 68.89547185780786 - type: similarity_f1_threshold value: 72.91591167449951 - type: similarity_precision value: 57.485875706214685 - type: similarity_recall value: 85.95564941921859 - task: type: Classification dataset: name: MTEB OnlineShopping (default) type: C-MTEB/OnlineShopping-classification config: default split: test revision: e610f2ebd179a8fda30ae534c3878750a96db120 metrics: - type: accuracy value: 91.88000000000001 - type: ap value: 89.52463684448476 - type: ap_weighted value: 89.52463684448476 - type: f1 value: 91.86313022306673 - type: f1_weighted value: 91.87806318146912 - type: main_score value: 91.88000000000001 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (en) type: GEM/opusparcus config: en split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 
92.65578635014838 - type: cosine_accuracy_threshold value: 74.02530312538147 - type: cosine_ap value: 98.3834226153613 - type: cosine_f1 value: 94.92567913890312 - type: cosine_f1_threshold value: 74.02530312538147 - type: cosine_precision value: 95.562435500516 - type: cosine_recall value: 94.29735234215886 - type: dot_accuracy value: 91.54302670623146 - type: dot_accuracy_threshold value: 34452.29187011719 - type: dot_ap value: 98.1237257754439 - type: dot_f1 value: 94.22400803616273 - type: dot_f1_threshold value: 33670.41931152344 - type: dot_precision value: 92.9633300297324 - type: dot_recall value: 95.5193482688391 - type: euclidean_accuracy value: 92.28486646884274 - type: euclidean_accuracy_threshold value: 1602.8022766113281 - type: euclidean_ap value: 98.3099021504706 - type: euclidean_f1 value: 94.75277497477296 - type: euclidean_f1_threshold value: 1604.7462463378906 - type: euclidean_precision value: 93.89999999999999 - type: euclidean_recall value: 95.62118126272912 - type: main_score value: 98.3834226153613 - type: manhattan_accuracy value: 92.2106824925816 - type: manhattan_accuracy_threshold value: 38872.90954589844 - type: manhattan_ap value: 98.28694101230218 - type: manhattan_f1 value: 94.67815509376584 - type: manhattan_f1_threshold value: 38872.90954589844 - type: manhattan_precision value: 94.24823410696267 - type: manhattan_recall value: 95.11201629327903 - type: max_accuracy value: 92.65578635014838 - type: max_ap value: 98.3834226153613 - type: max_f1 value: 94.92567913890312 - type: max_precision value: 95.562435500516 - type: max_recall value: 95.62118126272912 - type: similarity_accuracy value: 92.65578635014838 - type: similarity_accuracy_threshold value: 74.02530312538147 - type: similarity_ap value: 98.3834226153613 - type: similarity_f1 value: 94.92567913890312 - type: similarity_f1_threshold value: 74.02530312538147 - type: similarity_precision value: 95.562435500516 - type: similarity_recall value: 94.29735234215886 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (de) type: GEM/opusparcus config: de split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 87.72178850248403 - type: cosine_accuracy_threshold value: 73.33863377571106 - type: cosine_ap value: 96.98901408834976 - type: cosine_f1 value: 91.89944134078212 - type: cosine_f1_threshold value: 71.45810127258301 - type: cosine_precision value: 89.64577656675749 - type: cosine_recall value: 94.26934097421203 - type: dot_accuracy value: 86.30234208658624 - type: dot_accuracy_threshold value: 32027.130126953125 - type: dot_ap value: 96.12260574893256 - type: dot_f1 value: 91.31602506714414 - type: dot_f1_threshold value: 30804.376220703125 - type: dot_precision value: 85.93091828138164 - type: dot_recall value: 97.42120343839542 - type: euclidean_accuracy value: 87.9347054648687 - type: euclidean_accuracy_threshold value: 1609.6670150756836 - type: euclidean_ap value: 97.00238860358252 - type: euclidean_f1 value: 92.1089063221043 - type: euclidean_f1_threshold value: 1641.8487548828125 - type: euclidean_precision value: 89.10714285714286 - type: euclidean_recall value: 95.31996179560649 - type: main_score value: 97.00238860358252 - type: manhattan_accuracy value: 87.72178850248403 - type: manhattan_accuracy_threshold value: 40137.060546875 - type: manhattan_ap value: 96.98653728159941 - type: manhattan_f1 value: 92.03865623561896 - type: manhattan_f1_threshold value: 40137.060546875 - type: manhattan_precision value: 88.80994671403198 
- type: manhattan_recall value: 95.51098376313276 - type: max_accuracy value: 87.9347054648687 - type: max_ap value: 97.00238860358252 - type: max_f1 value: 92.1089063221043 - type: max_precision value: 89.64577656675749 - type: max_recall value: 97.42120343839542 - type: similarity_accuracy value: 87.72178850248403 - type: similarity_accuracy_threshold value: 73.33863377571106 - type: similarity_ap value: 96.98901408834976 - type: similarity_f1 value: 91.89944134078212 - type: similarity_f1_threshold value: 71.45810127258301 - type: similarity_precision value: 89.64577656675749 - type: similarity_recall value: 94.26934097421203 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (fr) type: GEM/opusparcus config: fr split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 80.92643051771117 - type: cosine_accuracy_threshold value: 76.68856382369995 - type: cosine_ap value: 93.74622381534307 - type: cosine_f1 value: 87.12328767123287 - type: cosine_f1_threshold value: 71.64022922515869 - type: cosine_precision value: 80.64243448858834 - type: cosine_recall value: 94.73684210526315 - type: dot_accuracy value: 80.858310626703 - type: dot_accuracy_threshold value: 34028.3935546875 - type: dot_ap value: 91.18448457633308 - type: dot_f1 value: 86.82606657290202 - type: dot_f1_threshold value: 34028.3935546875 - type: dot_precision value: 82.2380106571936 - type: dot_recall value: 91.9563058589871 - type: euclidean_accuracy value: 80.858310626703 - type: euclidean_accuracy_threshold value: 1595.7651138305664 - type: euclidean_ap value: 93.8182717829648 - type: euclidean_f1 value: 87.04044117647058 - type: euclidean_f1_threshold value: 1609.2475891113281 - type: euclidean_precision value: 81.00940975192472 - type: euclidean_recall value: 94.04170804369414 - type: main_score value: 93.8182717829648 - type: manhattan_accuracy value: 80.99455040871935 - type: manhattan_accuracy_threshold value: 38092.132568359375 - type: manhattan_ap value: 93.77563401151711 - type: manhattan_f1 value: 86.91983122362869 - type: manhattan_f1_threshold value: 38092.132568359375 - type: manhattan_precision value: 82.32682060390763 - type: manhattan_recall value: 92.05561072492551 - type: max_accuracy value: 80.99455040871935 - type: max_ap value: 93.8182717829648 - type: max_f1 value: 87.12328767123287 - type: max_precision value: 82.32682060390763 - type: max_recall value: 94.73684210526315 - type: similarity_accuracy value: 80.92643051771117 - type: similarity_accuracy_threshold value: 76.68856382369995 - type: similarity_ap value: 93.74622381534307 - type: similarity_f1 value: 87.12328767123287 - type: similarity_f1_threshold value: 71.64022922515869 - type: similarity_precision value: 80.64243448858834 - type: similarity_recall value: 94.73684210526315 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (ru) type: GEM/opusparcus config: ru split: test.full revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cosine_accuracy value: 76.83823529411765 - type: cosine_accuracy_threshold value: 72.70769476890564 - type: cosine_ap value: 89.56692049908222 - type: cosine_f1 value: 83.99832003359934 - type: cosine_f1_threshold value: 70.9052324295044 - type: cosine_precision value: 76.16146230007617 - type: cosine_recall value: 93.63295880149812 - type: dot_accuracy value: 76.28676470588235 - type: dot_accuracy_threshold value: 33740.68908691406 - type: dot_ap value: 87.77185177141567 - type: dot_f1 value: 83.62251375370292 - type: 
dot_f1_threshold value: 32726.611328125 - type: dot_precision value: 76.29343629343629 - type: dot_recall value: 92.50936329588015 - type: euclidean_accuracy value: 77.32843137254902 - type: euclidean_accuracy_threshold value: 1566.510009765625 - type: euclidean_ap value: 89.60605626791111 - type: euclidean_f1 value: 84.06546080964686 - type: euclidean_f1_threshold value: 1576.4202117919922 - type: euclidean_precision value: 77.83094098883574 - type: euclidean_recall value: 91.38576779026218 - type: main_score value: 89.60605626791111 - type: manhattan_accuracy value: 76.89950980392157 - type: manhattan_accuracy_threshold value: 38202.215576171875 - type: manhattan_ap value: 89.55766894104868 - type: manhattan_f1 value: 83.80462724935732 - type: manhattan_f1_threshold value: 38934.375 - type: manhattan_precision value: 77.25118483412322 - type: manhattan_recall value: 91.57303370786516 - type: max_accuracy value: 77.32843137254902 - type: max_ap value: 89.60605626791111 - type: max_f1 value: 84.06546080964686 - type: max_precision value: 77.83094098883574 - type: max_recall value: 93.63295880149812 - type: similarity_accuracy value: 76.83823529411765 - type: similarity_accuracy_threshold value: 72.70769476890564 - type: similarity_ap value: 89.56692049908222 - type: similarity_f1 value: 83.99832003359934 - type: similarity_f1_threshold value: 70.9052324295044 - type: similarity_precision value: 76.16146230007617 - type: similarity_recall value: 93.63295880149812 - task: type: Classification dataset: name: MTEB PAC (default) type: laugustyniak/abusive-clauses-pl config: default split: test revision: fc69d1c153a8ccdcf1eef52f4e2a27f88782f543 metrics: - type: accuracy value: 68.39559803069794 - type: ap value: 77.68074206719457 - type: ap_weighted value: 77.68074206719457 - type: f1 value: 66.23485605467732 - type: f1_weighted value: 69.03201442129347 - type: main_score value: 68.39559803069794 - task: type: STS dataset: name: MTEB PAWSX (default) type: C-MTEB/PAWSX config: default split: test revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1 metrics: - type: cosine_pearson value: 13.161523266433587 - type: cosine_spearman value: 15.557333873773386 - type: euclidean_pearson value: 17.147508431907525 - type: euclidean_spearman value: 15.664112857732146 - type: main_score value: 15.557333873773386 - type: manhattan_pearson value: 17.130875906264386 - type: manhattan_spearman value: 15.624397342229637 - type: pearson value: 13.161523266433587 - type: spearman value: 15.557333873773386 - task: type: PairClassification dataset: name: MTEB PSC (default) type: PL-MTEB/psc-pairclassification config: default split: test revision: d05a294af9e1d3ff2bfb6b714e08a24a6cabc669 metrics: - type: cosine_accuracy value: 97.86641929499072 - type: cosine_accuracy_threshold value: 79.0391206741333 - type: cosine_ap value: 99.19403807771533 - type: cosine_f1 value: 96.45608628659475 - type: cosine_f1_threshold value: 79.0391206741333 - type: cosine_precision value: 97.50778816199377 - type: cosine_recall value: 95.42682926829268 - type: dot_accuracy value: 98.14471243042672 - type: dot_accuracy_threshold value: 29808.1787109375 - type: dot_ap value: 99.331999859971 - type: dot_f1 value: 97.01492537313433 - type: dot_f1_threshold value: 29808.1787109375 - type: dot_precision value: 95.02923976608187 - type: dot_recall value: 99.08536585365853 - type: euclidean_accuracy value: 97.49536178107606 - type: euclidean_accuracy_threshold value: 1276.227855682373 - type: euclidean_ap value: 98.91056467717377 - type: 
euclidean_f1 value: 95.83975346687212 - type: euclidean_f1_threshold value: 1276.227855682373 - type: euclidean_precision value: 96.88473520249221 - type: euclidean_recall value: 94.8170731707317 - type: main_score value: 99.331999859971 - type: manhattan_accuracy value: 97.49536178107606 - type: manhattan_accuracy_threshold value: 31097.674560546875 - type: manhattan_ap value: 98.95694691792707 - type: manhattan_f1 value: 95.83975346687212 - type: manhattan_f1_threshold value: 31097.674560546875 - type: manhattan_precision value: 96.88473520249221 - type: manhattan_recall value: 94.8170731707317 - type: max_accuracy value: 98.14471243042672 - type: max_ap value: 99.331999859971 - type: max_f1 value: 97.01492537313433 - type: max_precision value: 97.50778816199377 - type: max_recall value: 99.08536585365853 - type: similarity_accuracy value: 97.86641929499072 - type: similarity_accuracy_threshold value: 79.0391206741333 - type: similarity_ap value: 99.19403807771533 - type: similarity_f1 value: 96.45608628659475 - type: similarity_f1_threshold value: 79.0391206741333 - type: similarity_precision value: 97.50778816199377 - type: similarity_recall value: 95.42682926829268 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (en) type: google-research-datasets/paws-x config: en split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 61.8 - type: cosine_accuracy_threshold value: 99.5664119720459 - type: cosine_ap value: 60.679317786040585 - type: cosine_f1 value: 63.17354143441101 - type: cosine_f1_threshold value: 97.22164869308472 - type: cosine_precision value: 47.6457399103139 - type: cosine_recall value: 93.71554575523705 - type: dot_accuracy value: 55.7 - type: dot_accuracy_threshold value: 48353.62548828125 - type: dot_ap value: 48.53805970536875 - type: dot_f1 value: 62.42214532871972 - type: dot_f1_threshold value: 38215.53955078125 - type: dot_precision value: 45.48663640948058 - type: dot_recall value: 99.44873208379272 - type: euclidean_accuracy value: 61.75000000000001 - type: euclidean_accuracy_threshold value: 189.0761137008667 - type: euclidean_ap value: 60.55517418691518 - type: euclidean_f1 value: 63.07977736549165 - type: euclidean_f1_threshold value: 504.3168067932129 - type: euclidean_precision value: 47.53914988814318 - type: euclidean_recall value: 93.71554575523705 - type: main_score value: 60.679317786040585 - type: manhattan_accuracy value: 61.9 - type: manhattan_accuracy_threshold value: 4695.778274536133 - type: manhattan_ap value: 60.48686620413608 - type: manhattan_f1 value: 62.92880855772778 - type: manhattan_f1_threshold value: 12542.36831665039 - type: manhattan_precision value: 47.28381374722838 - type: manhattan_recall value: 94.04630650496141 - type: max_accuracy value: 61.9 - type: max_ap value: 60.679317786040585 - type: max_f1 value: 63.17354143441101 - type: max_precision value: 47.6457399103139 - type: max_recall value: 99.44873208379272 - type: similarity_accuracy value: 61.8 - type: similarity_accuracy_threshold value: 99.5664119720459 - type: similarity_ap value: 60.679317786040585 - type: similarity_f1 value: 63.17354143441101 - type: similarity_f1_threshold value: 97.22164869308472 - type: similarity_precision value: 47.6457399103139 - type: similarity_recall value: 93.71554575523705 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (de) type: google-research-datasets/paws-x config: de split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 
metrics: - type: cosine_accuracy value: 60.25 - type: cosine_accuracy_threshold value: 99.54338073730469 - type: cosine_ap value: 56.7863613689054 - type: cosine_f1 value: 62.23499820337766 - type: cosine_f1_threshold value: 89.95014429092407 - type: cosine_precision value: 45.86864406779661 - type: cosine_recall value: 96.75977653631284 - type: dot_accuracy value: 56.8 - type: dot_accuracy_threshold value: 47349.78332519531 - type: dot_ap value: 49.7857806061729 - type: dot_f1 value: 62.31225986727209 - type: dot_f1_threshold value: 30143.206787109375 - type: dot_precision value: 45.32520325203252 - type: dot_recall value: 99.66480446927373 - type: euclidean_accuracy value: 60.3 - type: euclidean_accuracy_threshold value: 219.78106498718262 - type: euclidean_ap value: 56.731544327179606 - type: euclidean_f1 value: 62.19895287958115 - type: euclidean_f1_threshold value: 1792.1623229980469 - type: euclidean_precision value: 45.22842639593909 - type: euclidean_recall value: 99.55307262569832 - type: main_score value: 56.7863613689054 - type: manhattan_accuracy value: 60.150000000000006 - type: manhattan_accuracy_threshold value: 5104.503631591797 - type: manhattan_ap value: 56.70304479768734 - type: manhattan_f1 value: 62.22067039106145 - type: manhattan_f1_threshold value: 42839.471435546875 - type: manhattan_precision value: 45.2513966480447 - type: manhattan_recall value: 99.55307262569832 - type: max_accuracy value: 60.3 - type: max_ap value: 56.7863613689054 - type: max_f1 value: 62.31225986727209 - type: max_precision value: 45.86864406779661 - type: max_recall value: 99.66480446927373 - type: similarity_accuracy value: 60.25 - type: similarity_accuracy_threshold value: 99.54338073730469 - type: similarity_ap value: 56.7863613689054 - type: similarity_f1 value: 62.23499820337766 - type: similarity_f1_threshold value: 89.95014429092407 - type: similarity_precision value: 45.86864406779661 - type: similarity_recall value: 96.75977653631284 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (es) type: google-research-datasets/paws-x config: es split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 59.699999999999996 - type: cosine_accuracy_threshold value: 99.55930709838867 - type: cosine_ap value: 57.31662248806265 - type: cosine_f1 value: 62.444061962134256 - type: cosine_f1_threshold value: 74.75898265838623 - type: cosine_precision value: 45.3953953953954 - type: cosine_recall value: 100.0 - type: dot_accuracy value: 55.900000000000006 - type: dot_accuracy_threshold value: 47512.90283203125 - type: dot_ap value: 49.39339147787568 - type: dot_f1 value: 62.487082328625554 - type: dot_f1_threshold value: 34989.03503417969 - type: dot_precision value: 45.44088176352705 - type: dot_recall value: 100.0 - type: euclidean_accuracy value: 59.599999999999994 - type: euclidean_accuracy_threshold value: 200.82547664642334 - type: euclidean_ap value: 57.19737488445163 - type: euclidean_f1 value: 62.444061962134256 - type: euclidean_f1_threshold value: 1538.8837814331055 - type: euclidean_precision value: 45.3953953953954 - type: euclidean_recall value: 100.0 - type: main_score value: 57.31662248806265 - type: manhattan_accuracy value: 59.550000000000004 - type: manhattan_accuracy_threshold value: 5016.501617431641 - type: manhattan_ap value: 57.089959907945065 - type: manhattan_f1 value: 62.444061962134256 - type: manhattan_f1_threshold value: 37523.53515625 - type: manhattan_precision value: 45.3953953953954 - type: 
manhattan_recall value: 100.0 - type: max_accuracy value: 59.699999999999996 - type: max_ap value: 57.31662248806265 - type: max_f1 value: 62.487082328625554 - type: max_precision value: 45.44088176352705 - type: max_recall value: 100.0 - type: similarity_accuracy value: 59.699999999999996 - type: similarity_accuracy_threshold value: 99.55930709838867 - type: similarity_ap value: 57.31662248806265 - type: similarity_f1 value: 62.444061962134256 - type: similarity_f1_threshold value: 74.75898265838623 - type: similarity_precision value: 45.3953953953954 - type: similarity_recall value: 100.0 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (fr) type: google-research-datasets/paws-x config: fr split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 61.150000000000006 - type: cosine_accuracy_threshold value: 99.36153888702393 - type: cosine_ap value: 59.43845317938599 - type: cosine_f1 value: 62.51298026998961 - type: cosine_f1_threshold value: 76.77866220474243 - type: cosine_precision value: 45.468277945619334 - type: cosine_recall value: 100.0 - type: dot_accuracy value: 55.75 - type: dot_accuracy_threshold value: 48931.55212402344 - type: dot_ap value: 50.15949290538757 - type: dot_f1 value: 62.53462603878117 - type: dot_f1_threshold value: 34415.7958984375 - type: dot_precision value: 45.4911838790932 - type: dot_recall value: 100.0 - type: euclidean_accuracy value: 61.050000000000004 - type: euclidean_accuracy_threshold value: 240.8097267150879 - type: euclidean_ap value: 59.367971294226216 - type: euclidean_f1 value: 62.51298026998961 - type: euclidean_f1_threshold value: 1444.132423400879 - type: euclidean_precision value: 45.468277945619334 - type: euclidean_recall value: 100.0 - type: main_score value: 59.43845317938599 - type: manhattan_accuracy value: 60.95 - type: manhattan_accuracy_threshold value: 5701.206207275391 - type: manhattan_ap value: 59.30094096378774 - type: manhattan_f1 value: 62.53462603878117 - type: manhattan_f1_threshold value: 33445.672607421875 - type: manhattan_precision value: 45.4911838790932 - type: manhattan_recall value: 100.0 - type: max_accuracy value: 61.150000000000006 - type: max_ap value: 59.43845317938599 - type: max_f1 value: 62.53462603878117 - type: max_precision value: 45.4911838790932 - type: max_recall value: 100.0 - type: similarity_accuracy value: 61.150000000000006 - type: similarity_accuracy_threshold value: 99.36153888702393 - type: similarity_ap value: 59.43845317938599 - type: similarity_f1 value: 62.51298026998961 - type: similarity_f1_threshold value: 76.77866220474243 - type: similarity_precision value: 45.468277945619334 - type: similarity_recall value: 100.0 - task: type: PairClassification dataset: name: MTEB PawsXPairClassification (zh) type: google-research-datasets/paws-x config: zh split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cosine_accuracy value: 58.85 - type: cosine_accuracy_threshold value: 99.73838329315186 - type: cosine_ap value: 54.66913160570546 - type: cosine_f1 value: 62.32136632973162 - type: cosine_f1_threshold value: 76.4499306678772 - type: cosine_precision value: 45.265822784810126 - type: cosine_recall value: 100.0 - type: dot_accuracy value: 56.25 - type: dot_accuracy_threshold value: 47351.9287109375 - type: dot_ap value: 48.5266232989438 - type: dot_f1 value: 62.277951933124356 - type: dot_f1_threshold value: 31325.28076171875 - type: dot_precision value: 45.220030349013655 - type: dot_recall 
value: 100.0 - type: euclidean_accuracy value: 58.9 - type: euclidean_accuracy_threshold value: 144.24468278884888 - type: euclidean_ap value: 54.66981490353506 - type: euclidean_f1 value: 62.32136632973162 - type: euclidean_f1_threshold value: 1484.908676147461 - type: euclidean_precision value: 45.265822784810126 - type: euclidean_recall value: 100.0 - type: main_score value: 54.66981490353506 - type: manhattan_accuracy value: 58.9 - type: manhattan_accuracy_threshold value: 3586.785125732422 - type: manhattan_ap value: 54.668355260247736 - type: manhattan_f1 value: 62.32136632973162 - type: manhattan_f1_threshold value: 36031.22863769531 - type: manhattan_precision value: 45.265822784810126 - type: manhattan_recall value: 100.0 - type: max_accuracy value: 58.9 - type: max_ap value: 54.66981490353506 - type: max_f1 value: 62.32136632973162 - type: max_precision value: 45.265822784810126 - type: max_recall value: 100.0 - type: similarity_accuracy value: 58.85 - type: similarity_accuracy_threshold value: 99.73838329315186 - type: similarity_ap value: 54.66913160570546 - type: similarity_f1 value: 62.32136632973162 - type: similarity_f1_threshold value: 76.4499306678772 - type: similarity_precision value: 45.265822784810126 - type: similarity_recall value: 100.0 - task: type: Classification dataset: name: MTEB PolEmo2.0-IN (default) type: PL-MTEB/polemo2_in config: default split: test revision: d90724373c70959f17d2331ad51fb60c71176b03 metrics: - type: accuracy value: 83.75346260387812 - type: f1 value: 81.98304891214909 - type: f1_weighted value: 84.29623200830078 - type: main_score value: 83.75346260387812 - task: type: Classification dataset: name: MTEB PolEmo2.0-OUT (default) type: PL-MTEB/polemo2_out config: default split: test revision: 6a21ab8716e255ab1867265f8b396105e8aa63d4 metrics: - type: accuracy value: 66.53846153846153 - type: f1 value: 52.71826064368638 - type: f1_weighted value: 69.10010124630334 - type: main_score value: 66.53846153846153 - task: type: PairClassification dataset: name: MTEB PPC type: PL-MTEB/ppc-pairclassification config: default split: test revision: None metrics: - type: cosine_accuracy value: 81.8 - type: cosine_accuracy_threshold value: 90.47793745994568 - type: cosine_ap value: 91.42490266080884 - type: cosine_f1 value: 85.4632587859425 - type: cosine_f1_threshold value: 90.47793745994568 - type: cosine_precision value: 82.56172839506173 - type: cosine_recall value: 88.57615894039735 - type: dot_accuracy value: 74.6 - type: dot_accuracy_threshold value: 42102.23693847656 - type: dot_ap value: 86.20060009096979 - type: dot_f1 value: 80.02842928216063 - type: dot_f1_threshold value: 38970.16906738281 - type: dot_precision value: 70.1120797011208 - type: dot_recall value: 93.21192052980133 - type: euclidean_accuracy value: 81.5 - type: euclidean_accuracy_threshold value: 880.433464050293 - type: euclidean_ap value: 91.33143477982087 - type: euclidean_f1 value: 85.44600938967135 - type: euclidean_f1_threshold value: 964.0384674072266 - type: euclidean_precision value: 81.00890207715133 - type: euclidean_recall value: 90.39735099337747 - type: main_score value: 91.42490266080884 - type: manhattan_accuracy value: 81.3 - type: manhattan_accuracy_threshold value: 22100.830078125 - type: manhattan_ap value: 91.25996158651282 - type: manhattan_f1 value: 85.38102643856921 - type: manhattan_f1_threshold value: 24043.515014648438 - type: manhattan_precision value: 80.49853372434018 - type: manhattan_recall value: 90.89403973509934 - type: max_accuracy value: 81.8 - 
type: max_ap value: 91.42490266080884 - type: max_f1 value: 85.4632587859425 - type: max_precision value: 82.56172839506173 - type: max_recall value: 93.21192052980133 - type: similarity_accuracy value: 81.8 - type: similarity_accuracy_threshold value: 90.47793745994568 - type: similarity_ap value: 91.42490266080884 - type: similarity_f1 value: 85.4632587859425 - type: similarity_f1_threshold value: 90.47793745994568 - type: similarity_precision value: 82.56172839506173 - type: similarity_recall value: 88.57615894039735 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval (default) type: mteb/quora config: default split: test revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 metrics: - type: map_at_1 value: 71.419 - type: map_at_10 value: 85.542 - type: map_at_100 value: 86.161 - type: map_at_1000 value: 86.175 - type: map_at_20 value: 85.949 - type: map_at_3 value: 82.623 - type: map_at_5 value: 84.5 - type: mrr_at_1 value: 82.27 - type: mrr_at_10 value: 88.21900000000001 - type: mrr_at_100 value: 88.313 - type: mrr_at_1000 value: 88.31400000000001 - type: mrr_at_20 value: 88.286 - type: mrr_at_3 value: 87.325 - type: mrr_at_5 value: 87.97500000000001 - type: ndcg_at_1 value: 82.3 - type: ndcg_at_10 value: 89.088 - type: ndcg_at_100 value: 90.217 - type: ndcg_at_1000 value: 90.29700000000001 - type: ndcg_at_20 value: 89.697 - type: ndcg_at_3 value: 86.435 - type: ndcg_at_5 value: 87.966 - type: precision_at_1 value: 82.3 - type: precision_at_10 value: 13.527000000000001 - type: precision_at_100 value: 1.537 - type: precision_at_1000 value: 0.157 - type: precision_at_20 value: 7.165000000000001 - type: precision_at_3 value: 37.92 - type: precision_at_5 value: 24.914 - type: recall_at_1 value: 71.419 - type: recall_at_10 value: 95.831 - type: recall_at_100 value: 99.64 - type: recall_at_1000 value: 99.988 - type: recall_at_20 value: 97.76599999999999 - type: recall_at_3 value: 88.081 - type: recall_at_5 value: 92.50500000000001 - type: main_score value: 89.088 - task: type: STS dataset: name: MTEB RUParaPhraserSTS (default) type: merionum/ru_paraphraser config: default split: test revision: 43265056790b8f7c59e0139acb4be0a8dad2c8f4 metrics: - type: cosine_pearson value: 67.91177744712421 - type: cosine_spearman value: 76.77113726753656 - type: euclidean_pearson value: 73.81454206068638 - type: euclidean_spearman value: 76.92529493599028 - type: main_score value: 76.77113726753656 - type: manhattan_pearson value: 73.81690454439168 - type: manhattan_spearman value: 76.87333776705002 - type: pearson value: 67.91177744712421 - type: spearman value: 76.77113726753656 - task: type: Clustering dataset: name: MTEB RedditClustering (default) type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: main_score value: 55.39924225216962 - type: v_measure value: 55.39924225216962 - type: v_measure_std value: 4.723802279292467 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P (default) type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: main_score value: 62.87465161304012 - type: v_measure value: 62.87465161304012 - type: v_measure_std value: 12.082670914488473 - task: type: Retrieval dataset: name: MTEB RiaNewsRetrieval (default) type: ai-forever/ria-news-retrieval config: default split: test revision: 82374b0bbacda6114f39ff9c5b925fa1512ca5d7 metrics: - type: main_score value: 79.209 - type: map_at_1 value: 67.33 - type: map_at_10 value: 
75.633 - type: map_at_100 value: 75.897 - type: map_at_1000 value: 75.907 - type: map_at_20 value: 75.804 - type: map_at_3 value: 74.2 - type: map_at_5 value: 75.13300000000001 - type: mrr_at_1 value: 67.31 - type: mrr_at_10 value: 75.62709126984095 - type: mrr_at_100 value: 75.89105697041113 - type: mrr_at_1000 value: 75.90115653883124 - type: mrr_at_20 value: 75.79802332308172 - type: mrr_at_3 value: 74.19499999999961 - type: mrr_at_5 value: 75.12849999999939 - type: nauc_map_at_1000_diff1 value: 74.30304869630591 - type: nauc_map_at_1000_max value: 36.477146725784046 - type: nauc_map_at_1000_std value: -20.862772498461723 - type: nauc_map_at_100_diff1 value: 74.29833058090355 - type: nauc_map_at_100_max value: 36.483678619667884 - type: nauc_map_at_100_std value: -20.856274849980135 - type: nauc_map_at_10_diff1 value: 74.20729220697967 - type: nauc_map_at_10_max value: 36.56543146170092 - type: nauc_map_at_10_std value: -20.991081015484728 - type: nauc_map_at_1_diff1 value: 77.38899022125185 - type: nauc_map_at_1_max value: 32.45918619669731 - type: nauc_map_at_1_std value: -22.149586336167324 - type: nauc_map_at_20_diff1 value: 74.2447573558587 - type: nauc_map_at_20_max value: 36.50383130240387 - type: nauc_map_at_20_std value: -20.87013743041831 - type: nauc_map_at_3_diff1 value: 74.3054577294586 - type: nauc_map_at_3_max value: 36.484530586652724 - type: nauc_map_at_3_std value: -21.90543024607988 - type: nauc_map_at_5_diff1 value: 74.21062368961503 - type: nauc_map_at_5_max value: 36.55670532498779 - type: nauc_map_at_5_std value: -21.488786900676942 - type: nauc_mrr_at_1000_diff1 value: 74.31619177956684 - type: nauc_mrr_at_1000_max value: 36.53498918453189 - type: nauc_mrr_at_1000_std value: -20.75986704931237 - type: nauc_mrr_at_100_diff1 value: 74.31146790382356 - type: nauc_mrr_at_100_max value: 36.54149252857106 - type: nauc_mrr_at_100_std value: -20.75341959250079 - type: nauc_mrr_at_10_diff1 value: 74.22027806145095 - type: nauc_mrr_at_10_max value: 36.622542969971725 - type: nauc_mrr_at_10_std value: -20.889417384064117 - type: nauc_mrr_at_1_diff1 value: 77.4306709551449 - type: nauc_mrr_at_1_max value: 32.57259463438259 - type: nauc_mrr_at_1_std value: -21.964402859613937 - type: nauc_mrr_at_20_diff1 value: 74.25784396230718 - type: nauc_mrr_at_20_max value: 36.561412224507336 - type: nauc_mrr_at_20_std value: -20.767665000065723 - type: nauc_mrr_at_3_diff1 value: 74.31423253547214 - type: nauc_mrr_at_3_max value: 36.537745749488906 - type: nauc_mrr_at_3_std value: -21.81259529019546 - type: nauc_mrr_at_5_diff1 value: 74.22404613312771 - type: nauc_mrr_at_5_max value: 36.60743768455219 - type: nauc_mrr_at_5_std value: -21.39479216331971 - type: nauc_ndcg_at_1000_diff1 value: 73.48182819705742 - type: nauc_ndcg_at_1000_max value: 37.86991608461793 - type: nauc_ndcg_at_1000_std value: -19.021499322688904 - type: nauc_ndcg_at_100_diff1 value: 73.34941250585759 - type: nauc_ndcg_at_100_max value: 38.11150275625829 - type: nauc_ndcg_at_100_std value: -18.70624087206104 - type: nauc_ndcg_at_10_diff1 value: 72.82520265115987 - type: nauc_ndcg_at_10_max value: 38.43323357650525 - type: nauc_ndcg_at_10_std value: -19.410953792830878 - type: nauc_ndcg_at_1_diff1 value: 77.38899022125185 - type: nauc_ndcg_at_1_max value: 32.45918619669731 - type: nauc_ndcg_at_1_std value: -22.149586336167324 - type: nauc_ndcg_at_20_diff1 value: 72.93309285256507 - type: nauc_ndcg_at_20_max value: 38.217372819067755 - type: nauc_ndcg_at_20_std value: -18.864113576359333 - type: nauc_ndcg_at_3_diff1 
value: 73.18253776744112 - type: nauc_ndcg_at_3_max value: 38.008109328364 - type: nauc_ndcg_at_3_std value: -21.68785687594153 - type: nauc_ndcg_at_5_diff1 value: 72.90474739784793 - type: nauc_ndcg_at_5_max value: 38.29483039202184 - type: nauc_ndcg_at_5_std value: -20.833049811453474 - type: nauc_precision_at_1000_diff1 value: 59.306217613750334 - type: nauc_precision_at_1000_max value: 72.20747948302262 - type: nauc_precision_at_1000_std value: 45.58837180096227 - type: nauc_precision_at_100_diff1 value: 62.87286844562389 - type: nauc_precision_at_100_max value: 61.33108214045868 - type: nauc_precision_at_100_std value: 20.67481963545654 - type: nauc_precision_at_10_diff1 value: 64.11222984256685 - type: nauc_precision_at_10_max value: 50.323697746037496 - type: nauc_precision_at_10_std value: -7.9994544634332625 - type: nauc_precision_at_1_diff1 value: 77.38899022125185 - type: nauc_precision_at_1_max value: 32.45918619669731 - type: nauc_precision_at_1_std value: -22.149586336167324 - type: nauc_precision_at_20_diff1 value: 62.30228127286973 - type: nauc_precision_at_20_max value: 52.02090746208407 - type: nauc_precision_at_20_std value: 0.7629898806370331 - type: nauc_precision_at_3_diff1 value: 68.82856645994157 - type: nauc_precision_at_3_max value: 43.94171571306625 - type: nauc_precision_at_3_std value: -20.78595255410148 - type: nauc_precision_at_5_diff1 value: 66.62157622497887 - type: nauc_precision_at_5_max value: 46.69398173603811 - type: nauc_precision_at_5_std value: -17.412423571163057 - type: nauc_recall_at_1000_diff1 value: 59.30621761375148 - type: nauc_recall_at_1000_max value: 72.20747948302191 - type: nauc_recall_at_1000_std value: 45.588371800962655 - type: nauc_recall_at_100_diff1 value: 62.872868445623894 - type: nauc_recall_at_100_max value: 61.33108214045813 - type: nauc_recall_at_100_std value: 20.67481963545666 - type: nauc_recall_at_10_diff1 value: 64.11222984256698 - type: nauc_recall_at_10_max value: 50.32369774603755 - type: nauc_recall_at_10_std value: -7.999454463433321 - type: nauc_recall_at_1_diff1 value: 77.38899022125185 - type: nauc_recall_at_1_max value: 32.45918619669731 - type: nauc_recall_at_1_std value: -22.149586336167324 - type: nauc_recall_at_20_diff1 value: 62.3022812728695 - type: nauc_recall_at_20_max value: 52.02090746208397 - type: nauc_recall_at_20_std value: 0.7629898806369458 - type: nauc_recall_at_3_diff1 value: 68.82856645994157 - type: nauc_recall_at_3_max value: 43.94171571306612 - type: nauc_recall_at_3_std value: -20.78595255410157 - type: nauc_recall_at_5_diff1 value: 66.62157622497897 - type: nauc_recall_at_5_max value: 46.693981736038246 - type: nauc_recall_at_5_std value: -17.412423571162954 - type: ndcg_at_1 value: 67.33 - type: ndcg_at_10 value: 79.209 - type: ndcg_at_100 value: 80.463 - type: ndcg_at_1000 value: 80.74799999999999 - type: ndcg_at_20 value: 79.81899999999999 - type: ndcg_at_3 value: 76.335 - type: ndcg_at_5 value: 78.011 - type: precision_at_1 value: 67.33 - type: precision_at_10 value: 9.020999999999999 - type: precision_at_100 value: 0.96 - type: precision_at_1000 value: 0.098 - type: precision_at_20 value: 4.63 - type: precision_at_3 value: 27.493000000000002 - type: precision_at_5 value: 17.308 - type: recall_at_1 value: 67.33 - type: recall_at_10 value: 90.21000000000001 - type: recall_at_100 value: 96.00999999999999 - type: recall_at_1000 value: 98.29 - type: recall_at_20 value: 92.60000000000001 - type: recall_at_3 value: 82.48 - type: recall_at_5 value: 86.53999999999999 - task: type: Reranking 
dataset: name: MTEB RuBQReranking (default) type: ai-forever/rubq-reranking config: default split: test revision: 2e96b8f098fa4b0950fc58eacadeb31c0d0c7fa2 metrics: - type: main_score value: 65.57453932493252 - type: map value: 65.57453932493252 - type: mrr value: 70.51408205663526 - type: nAUC_map_diff1 value: 26.69583260609023 - type: nAUC_map_max value: 12.928262749610663 - type: nAUC_map_std value: 11.702468857903128 - type: nAUC_mrr_diff1 value: 28.5206955462174 - type: nAUC_mrr_max value: 14.207162454694227 - type: nAUC_mrr_std value: 10.725721001555296 - task: type: Retrieval dataset: name: MTEB RuBQRetrieval (default) type: ai-forever/rubq-retrieval config: default split: test revision: e19b6ffa60b3bc248e0b41f4cc37c26a55c2a67b metrics: - type: main_score value: 72.306 - type: map_at_1 value: 44.187 - type: map_at_10 value: 64.836 - type: map_at_100 value: 65.771 - type: map_at_1000 value: 65.8 - type: map_at_20 value: 65.497 - type: map_at_3 value: 59.692 - type: map_at_5 value: 63.105 - type: mrr_at_1 value: 62.23404255319149 - type: mrr_at_10 value: 73.40810161732159 - type: mrr_at_100 value: 73.67949305473395 - type: mrr_at_1000 value: 73.68707852294746 - type: mrr_at_20 value: 73.60429051697479 - type: mrr_at_3 value: 71.47360126083535 - type: mrr_at_5 value: 72.8447596532704 - type: nauc_map_at_1000_diff1 value: 39.838449035736886 - type: nauc_map_at_1000_max value: 32.29962306877408 - type: nauc_map_at_1000_std value: -6.324859592714388 - type: nauc_map_at_100_diff1 value: 39.824361938745426 - type: nauc_map_at_100_max value: 32.32055222704763 - type: nauc_map_at_100_std value: -6.301641111869559 - type: nauc_map_at_10_diff1 value: 39.50155328718487 - type: nauc_map_at_10_max value: 31.745730244960672 - type: nauc_map_at_10_std value: -6.867215137329693 - type: nauc_map_at_1_diff1 value: 47.66181128677822 - type: nauc_map_at_1_max value: 21.75204233166764 - type: nauc_map_at_1_std value: -8.06951079061697 - type: nauc_map_at_20_diff1 value: 39.78364637902108 - type: nauc_map_at_20_max value: 32.39065528029405 - type: nauc_map_at_20_std value: -6.368994332729006 - type: nauc_map_at_3_diff1 value: 39.51829474433183 - type: nauc_map_at_3_max value: 28.633292697821673 - type: nauc_map_at_3_std value: -7.2561170814963925 - type: nauc_map_at_5_diff1 value: 39.288433237676266 - type: nauc_map_at_5_max value: 31.007702201615515 - type: nauc_map_at_5_std value: -7.235131195162474 - type: nauc_mrr_at_1000_diff1 value: 49.599102391215226 - type: nauc_mrr_at_1000_max value: 38.25521825911133 - type: nauc_mrr_at_1000_std value: -10.448180939809435 - type: nauc_mrr_at_100_diff1 value: 49.5957067716212 - type: nauc_mrr_at_100_max value: 38.26760703964535 - type: nauc_mrr_at_100_std value: -10.438443051971081 - type: nauc_mrr_at_10_diff1 value: 49.35269710190271 - type: nauc_mrr_at_10_max value: 38.43782589127069 - type: nauc_mrr_at_10_std value: -10.404402063509815 - type: nauc_mrr_at_1_diff1 value: 53.32206103688421 - type: nauc_mrr_at_1_max value: 33.52402390241035 - type: nauc_mrr_at_1_std value: -12.73473393949936 - type: nauc_mrr_at_20_diff1 value: 49.550630850826636 - type: nauc_mrr_at_20_max value: 38.35964703941151 - type: nauc_mrr_at_20_std value: -10.444577766284766 - type: nauc_mrr_at_3_diff1 value: 49.12029127633829 - type: nauc_mrr_at_3_max value: 38.01631275124067 - type: nauc_mrr_at_3_std value: -10.523724301481309 - type: nauc_mrr_at_5_diff1 value: 49.04606949432458 - type: nauc_mrr_at_5_max value: 38.33647550077891 - type: nauc_mrr_at_5_std value: -10.47076409263114 - type: 
nauc_ndcg_at_1000_diff1 value: 41.342785916264226 - type: nauc_ndcg_at_1000_max value: 35.75731064862711 - type: nauc_ndcg_at_1000_std value: -5.45573422899229 - type: nauc_ndcg_at_100_diff1 value: 40.972974559636086 - type: nauc_ndcg_at_100_max value: 36.32938573321036 - type: nauc_ndcg_at_100_std value: -4.749631537590004 - type: nauc_ndcg_at_10_diff1 value: 39.67813474464166 - type: nauc_ndcg_at_10_max value: 35.480200504848966 - type: nauc_ndcg_at_10_std value: -6.318561293935512 - type: nauc_ndcg_at_1_diff1 value: 53.45970160222764 - type: nauc_ndcg_at_1_max value: 33.14759013278075 - type: nauc_ndcg_at_1_std value: -12.579833891774847 - type: nauc_ndcg_at_20_diff1 value: 40.67492861219249 - type: nauc_ndcg_at_20_max value: 36.84960799838019 - type: nauc_ndcg_at_20_std value: -5.202530835850179 - type: nauc_ndcg_at_3_diff1 value: 39.574906207408844 - type: nauc_ndcg_at_3_max value: 31.76512164509258 - type: nauc_ndcg_at_3_std value: -7.656143208565999 - type: nauc_ndcg_at_5_diff1 value: 39.096348529742095 - type: nauc_ndcg_at_5_max value: 34.075926475544165 - type: nauc_ndcg_at_5_std value: -7.238045445366631 - type: nauc_precision_at_1000_diff1 value: -14.283799754212609 - type: nauc_precision_at_1000_max value: 6.449741756717101 - type: nauc_precision_at_1000_std value: 4.862828679759048 - type: nauc_precision_at_100_diff1 value: -13.23173132700258 - type: nauc_precision_at_100_max value: 11.058898534529195 - type: nauc_precision_at_100_std value: 7.343683941814956 - type: nauc_precision_at_10_diff1 value: -7.202951643546464 - type: nauc_precision_at_10_max value: 17.499446869433278 - type: nauc_precision_at_10_std value: 2.8367985220406307 - type: nauc_precision_at_1_diff1 value: 53.45970160222764 - type: nauc_precision_at_1_max value: 33.14759013278075 - type: nauc_precision_at_1_std value: -12.579833891774847 - type: nauc_precision_at_20_diff1 value: -9.477122699154124 - type: nauc_precision_at_20_max value: 16.80556031564312 - type: nauc_precision_at_20_std value: 6.420218284416923 - type: nauc_precision_at_3_diff1 value: 5.5276143574150245 - type: nauc_precision_at_3_max value: 23.65952688481666 - type: nauc_precision_at_3_std value: -1.8730348729295785 - type: nauc_precision_at_5_diff1 value: -2.4537029093721308 - type: nauc_precision_at_5_max value: 21.41469327545133 - type: nauc_precision_at_5_std value: 0.1543890645722277 - type: nauc_recall_at_1000_diff1 value: -1.7474947956413491 - type: nauc_recall_at_1000_max value: 46.22670991970479 - type: nauc_recall_at_1000_std value: 62.582840705588794 - type: nauc_recall_at_100_diff1 value: 16.116089801097345 - type: nauc_recall_at_100_max value: 52.54794580975103 - type: nauc_recall_at_100_std value: 33.720245696003246 - type: nauc_recall_at_10_diff1 value: 23.134924318655482 - type: nauc_recall_at_10_max value: 38.73754275649077 - type: nauc_recall_at_10_std value: 0.6137471711639239 - type: nauc_recall_at_1_diff1 value: 47.66181128677822 - type: nauc_recall_at_1_max value: 21.75204233166764 - type: nauc_recall_at_1_std value: -8.06951079061697 - type: nauc_recall_at_20_diff1 value: 24.130616271355017 - type: nauc_recall_at_20_max value: 48.306178640146136 - type: nauc_recall_at_20_std value: 9.290819557000022 - type: nauc_recall_at_3_diff1 value: 29.767415016250226 - type: nauc_recall_at_3_max value: 28.54289782140701 - type: nauc_recall_at_3_std value: -5.1395675072005576 - type: nauc_recall_at_5_diff1 value: 25.410613126870174 - type: nauc_recall_at_5_max value: 33.24658754857624 - type: nauc_recall_at_5_std value: 
-4.211226036746632 - type: ndcg_at_1 value: 62.175000000000004 - type: ndcg_at_10 value: 72.306 - type: ndcg_at_100 value: 75.074 - type: ndcg_at_1000 value: 75.581 - type: ndcg_at_20 value: 73.875 - type: ndcg_at_3 value: 65.641 - type: ndcg_at_5 value: 69.48299999999999 - type: precision_at_1 value: 62.175000000000004 - type: precision_at_10 value: 13.907 - type: precision_at_100 value: 1.591 - type: precision_at_1000 value: 0.166 - type: precision_at_20 value: 7.446999999999999 - type: precision_at_3 value: 35.619 - type: precision_at_5 value: 24.917 - type: recall_at_1 value: 44.187 - type: recall_at_10 value: 85.10600000000001 - type: recall_at_100 value: 95.488 - type: recall_at_1000 value: 98.831 - type: recall_at_20 value: 90.22200000000001 - type: recall_at_3 value: 68.789 - type: recall_at_5 value: 77.85499999999999 - task: type: Classification dataset: name: MTEB RuReviewsClassification (default) type: ai-forever/ru-reviews-classification config: default split: test revision: f6d2c31f4dc6b88f468552750bfec05b4b41b05a metrics: - type: accuracy value: 67.5830078125 - type: f1 value: 67.56931936632446 - type: f1_weighted value: 67.57137733752779 - type: main_score value: 67.5830078125 - task: type: STS dataset: name: MTEB RuSTSBenchmarkSTS (default) type: ai-forever/ru-stsbenchmark-sts config: default split: test revision: 7cf24f325c6da6195df55bef3d86b5e0616f3018 metrics: - type: cosine_pearson value: 85.90493484626788 - type: cosine_spearman value: 86.21965691667411 - type: euclidean_pearson value: 86.07499842984909 - type: euclidean_spearman value: 86.55506818735688 - type: main_score value: 86.21965691667411 - type: manhattan_pearson value: 85.95976420231729 - type: manhattan_spearman value: 86.48604243661234 - type: pearson value: 85.90493484626788 - type: spearman value: 86.21965691667411 - task: type: Classification dataset: name: MTEB RuSciBenchGRNTIClassification (default) type: ai-forever/ru-scibench-grnti-classification config: default split: test revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1 metrics: - type: accuracy value: 59.1943359375 - type: f1 value: 58.894480861440414 - type: f1_weighted value: 58.903615560240866 - type: main_score value: 59.1943359375 - task: type: Clustering dataset: name: MTEB RuSciBenchGRNTIClusteringP2P (default) type: ai-forever/ru-scibench-grnti-classification config: default split: test revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1 metrics: - type: main_score value: 57.99209448663228 - type: v_measure value: 57.99209448663228 - type: v_measure_std value: 1.0381163861993816 - task: type: Classification dataset: name: MTEB RuSciBenchOECDClassification (default) type: ai-forever/ru-scibench-oecd-classification config: default split: test revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471 metrics: - type: accuracy value: 45.556640625 - type: f1 value: 45.159163104085906 - type: f1_weighted value: 45.16098316398626 - type: main_score value: 45.556640625 - task: type: Clustering dataset: name: MTEB RuSciBenchOECDClusteringP2P (default) type: ai-forever/ru-scibench-oecd-classification config: default split: test revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471 metrics: - type: main_score value: 50.787548070488974 - type: v_measure value: 50.787548070488974 - type: v_measure_std value: 0.8569958168946827 - task: type: Retrieval dataset: name: MTEB SCIDOCS (default) type: mteb/scidocs config: default split: test revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88 metrics: - type: map_at_1 value: 4.843 - type: map_at_10 value: 11.752 - 
type: map_at_100 value: 13.919 - type: map_at_1000 value: 14.198 - type: map_at_20 value: 12.898000000000001 - type: map_at_3 value: 8.603 - type: map_at_5 value: 10.069 - type: mrr_at_1 value: 23.799999999999997 - type: mrr_at_10 value: 34.449999999999996 - type: mrr_at_100 value: 35.64 - type: mrr_at_1000 value: 35.691 - type: mrr_at_20 value: 35.213 - type: mrr_at_3 value: 31.383 - type: mrr_at_5 value: 33.062999999999995 - type: ndcg_at_1 value: 23.799999999999997 - type: ndcg_at_10 value: 19.811 - type: ndcg_at_100 value: 28.108 - type: ndcg_at_1000 value: 33.1 - type: ndcg_at_20 value: 22.980999999999998 - type: ndcg_at_3 value: 19.153000000000002 - type: ndcg_at_5 value: 16.408 - type: precision_at_1 value: 23.799999999999997 - type: precision_at_10 value: 10.16 - type: precision_at_100 value: 2.1999999999999997 - type: precision_at_1000 value: 0.34099999999999997 - type: precision_at_20 value: 6.915 - type: precision_at_3 value: 17.8 - type: precision_at_5 value: 14.14 - type: recall_at_1 value: 4.843 - type: recall_at_10 value: 20.595 - type: recall_at_100 value: 44.66 - type: recall_at_1000 value: 69.152 - type: recall_at_20 value: 28.04 - type: recall_at_3 value: 10.833 - type: recall_at_5 value: 14.346999999999998 - type: main_score value: 19.811 - task: type: PairClassification dataset: name: MTEB SICK-E-PL (default) type: PL-MTEB/sicke-pl-pairclassification config: default split: test revision: 71bba34b0ece6c56dfcf46d9758a27f7a90f17e9 metrics: - type: cosine_accuracy value: 80.90093762739502 - type: cosine_accuracy_threshold value: 94.40930485725403 - type: cosine_ap value: 71.15400909912427 - type: cosine_f1 value: 66.8213457076566 - type: cosine_f1_threshold value: 91.53673648834229 - type: cosine_precision value: 62.4922504649721 - type: cosine_recall value: 71.7948717948718 - type: dot_accuracy value: 78.41418671015083 - type: dot_accuracy_threshold value: 42924.45068359375 - type: dot_ap value: 63.34003025365763 - type: dot_f1 value: 62.518258837277244 - type: dot_f1_threshold value: 40900.738525390625 - type: dot_precision value: 52.99653293709758 - type: dot_recall value: 76.21082621082621 - type: euclidean_accuracy value: 80.67672238075826 - type: euclidean_accuracy_threshold value: 696.0524559020996 - type: euclidean_ap value: 70.88762835990224 - type: euclidean_f1 value: 66.711051930759 - type: euclidean_f1_threshold value: 878.5581588745117 - type: euclidean_precision value: 62.625 - type: euclidean_recall value: 71.36752136752136 - type: main_score value: 71.15400909912427 - type: manhattan_accuracy value: 80.65633917651854 - type: manhattan_accuracy_threshold value: 17277.72674560547 - type: manhattan_ap value: 70.67105336611716 - type: manhattan_f1 value: 66.51346027577151 - type: manhattan_f1_threshold value: 21687.957763671875 - type: manhattan_precision value: 61.69305724725944 - type: manhattan_recall value: 72.15099715099716 - type: max_accuracy value: 80.90093762739502 - type: max_ap value: 71.15400909912427 - type: max_f1 value: 66.8213457076566 - type: max_precision value: 62.625 - type: max_recall value: 76.21082621082621 - type: similarity_accuracy value: 80.90093762739502 - type: similarity_accuracy_threshold value: 94.40930485725403 - type: similarity_ap value: 71.15400909912427 - type: similarity_f1 value: 66.8213457076566 - type: similarity_f1_threshold value: 91.53673648834229 - type: similarity_precision value: 62.4922504649721 - type: similarity_recall value: 71.7948717948718 - task: type: STS dataset: name: MTEB SICK-R (default) type: 
mteb/sickr-sts config: default split: test revision: 20a6d6f312dd54037fe07a32d58e5e168867909d metrics: - type: cosine_pearson value: 92.3339946866199 - type: cosine_spearman value: 89.61697355115497 - type: euclidean_pearson value: 90.3264916449669 - type: euclidean_spearman value: 89.36270451308866 - type: main_score value: 89.61697355115497 - type: manhattan_pearson value: 90.18909339052534 - type: manhattan_spearman value: 89.28337093097377 - type: pearson value: 92.3339946866199 - type: spearman value: 89.61697355115497 - task: type: STS dataset: name: MTEB SICK-R-PL (default) type: PL-MTEB/sickr-pl-sts config: default split: test revision: fd5c2441b7eeff8676768036142af4cfa42c1339 metrics: - type: cosine_pearson value: 85.27883048457821 - type: cosine_spearman value: 80.53204892678619 - type: euclidean_pearson value: 82.78520705216168 - type: euclidean_spearman value: 80.27848359873212 - type: main_score value: 80.53204892678619 - type: manhattan_pearson value: 82.63270640583454 - type: manhattan_spearman value: 80.21507977473146 - type: pearson value: 85.27883048457821 - type: spearman value: 80.53204892678619 - task: type: STS dataset: name: MTEB SICKFr (default) type: Lajavaness/SICK-fr config: default split: test revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a metrics: - type: cosine_pearson value: 88.77029361817212 - type: cosine_spearman value: 83.9453600346894 - type: euclidean_pearson value: 85.85331086208573 - type: euclidean_spearman value: 83.70852031985308 - type: main_score value: 83.9453600346894 - type: manhattan_pearson value: 85.66222265885914 - type: manhattan_spearman value: 83.60833111525962 - type: pearson value: 88.77029361817212 - type: spearman value: 83.9453600346894 - task: type: STS dataset: name: MTEB STS12 (default) type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cosine_pearson value: 88.76435859522375 - type: cosine_spearman value: 82.43768167804375 - type: euclidean_pearson value: 87.43566183874832 - type: euclidean_spearman value: 82.82166873757507 - type: main_score value: 82.43768167804375 - type: manhattan_pearson value: 87.39450871380951 - type: manhattan_spearman value: 82.89253043430163 - type: pearson value: 88.76435859522375 - type: spearman value: 82.43768167804375 - task: type: STS dataset: name: MTEB STS13 (default) type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cosine_pearson value: 88.86627241652141 - type: cosine_spearman value: 89.49011599120688 - type: euclidean_pearson value: 89.3314120073772 - type: euclidean_spearman value: 89.8226502776963 - type: main_score value: 89.49011599120688 - type: manhattan_pearson value: 89.2252179076963 - type: manhattan_spearman value: 89.74573844021225 - type: pearson value: 88.86627241652141 - type: spearman value: 89.49011599120688 - task: type: STS dataset: name: MTEB STS14 (default) type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cosine_pearson value: 87.22891405215968 - type: cosine_spearman value: 84.9467188157614 - type: euclidean_pearson value: 87.20330004726237 - type: euclidean_spearman value: 85.34806059461808 - type: main_score value: 84.9467188157614 - type: manhattan_pearson value: 87.15224666107623 - type: manhattan_spearman value: 85.34596898699708 - type: pearson value: 87.22891405215968 - type: spearman value: 84.9467188157614 - task: type: STS dataset: name: MTEB STS15 (default) 
type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cosine_pearson value: 88.14066430111033 - type: cosine_spearman value: 89.31337445552545 - type: euclidean_pearson value: 89.08039335366983 - type: euclidean_spearman value: 89.6658762856415 - type: main_score value: 89.31337445552545 - type: manhattan_pearson value: 89.08057438154486 - type: manhattan_spearman value: 89.68673984203022 - type: pearson value: 88.14066430111033 - type: spearman value: 89.31337445552545 - task: type: STS dataset: name: MTEB STS16 (default) type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cosine_pearson value: 85.14908856657084 - type: cosine_spearman value: 86.84648320786727 - type: euclidean_pearson value: 86.11454713131947 - type: euclidean_spearman value: 86.77738862047961 - type: main_score value: 86.84648320786727 - type: manhattan_pearson value: 86.07804821916372 - type: manhattan_spearman value: 86.78676064310474 - type: pearson value: 85.14908856657084 - type: spearman value: 86.84648320786727 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 89.61633502468356 - type: cosine_spearman value: 89.99772663224805 - type: euclidean_pearson value: 90.14056501501044 - type: euclidean_spearman value: 90.04496896837503 - type: main_score value: 89.99772663224805 - type: manhattan_pearson value: 90.08964860311801 - type: manhattan_spearman value: 90.00091712362196 - type: pearson value: 89.61633502468356 - type: spearman value: 89.99772663224805 - task: type: STS dataset: name: MTEB STS17 (es-en) type: mteb/sts17-crosslingual-sts config: es-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 86.44548026840202 - type: cosine_spearman value: 87.26263108768539 - type: euclidean_pearson value: 86.42844593583838 - type: euclidean_spearman value: 86.89388428664364 - type: main_score value: 87.26263108768539 - type: manhattan_pearson value: 86.47186940800881 - type: manhattan_spearman value: 87.02163091089946 - type: pearson value: 86.44548026840202 - type: spearman value: 87.26263108768539 - task: type: STS dataset: name: MTEB STS17 (en-de) type: mteb/sts17-crosslingual-sts config: en-de split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 87.89345132532758 - type: cosine_spearman value: 87.96246221327699 - type: euclidean_pearson value: 88.49013032701419 - type: euclidean_spearman value: 87.81981265317344 - type: main_score value: 87.96246221327699 - type: manhattan_pearson value: 88.31360914178538 - type: manhattan_spearman value: 87.62734530005075 - type: pearson value: 87.89345132532758 - type: spearman value: 87.96246221327699 - task: type: STS dataset: name: MTEB STS17 (es-es) type: mteb/sts17-crosslingual-sts config: es-es split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 88.4084678497171 - type: cosine_spearman value: 88.77640638748285 - type: euclidean_pearson value: 89.60124312475843 - type: euclidean_spearman value: 88.4321442688528 - type: main_score value: 88.77640638748285 - type: manhattan_pearson value: 89.62375118021299 - type: manhattan_spearman value: 88.46998118661577 - type: pearson value: 88.4084678497171 - type: spearman value: 88.77640638748285 - task: type: STS 
dataset: name: MTEB STS17 (fr-en) type: mteb/sts17-crosslingual-sts config: fr-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 87.30688801326498 - type: cosine_spearman value: 87.55684697258378 - type: euclidean_pearson value: 87.89672951056794 - type: euclidean_spearman value: 87.28050429201674 - type: main_score value: 87.55684697258378 - type: manhattan_pearson value: 87.74292745320572 - type: manhattan_spearman value: 87.16383993876582 - type: pearson value: 87.30688801326498 - type: spearman value: 87.55684697258378 - task: type: STS dataset: name: MTEB STS22 (zh-en) type: mteb/sts22-crosslingual-sts config: zh-en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 73.46180375170147 - type: cosine_spearman value: 73.39559590127081 - type: euclidean_pearson value: 73.72613901293681 - type: euclidean_spearman value: 71.85465165176795 - type: main_score value: 73.39559590127081 - type: manhattan_pearson value: 73.07859140869076 - type: manhattan_spearman value: 71.22047343718893 - type: pearson value: 73.46180375170147 - type: spearman value: 73.39559590127081 - task: type: STS dataset: name: MTEB STS22 (zh) type: mteb/sts22-crosslingual-sts config: zh split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 62.47531620842637 - type: cosine_spearman value: 66.22504667157702 - type: euclidean_pearson value: 66.76201254783692 - type: euclidean_spearman value: 66.86115760269463 - type: main_score value: 66.22504667157702 - type: manhattan_pearson value: 66.73847836793489 - type: manhattan_spearman value: 66.7677116377695 - type: pearson value: 62.47531620842637 - type: spearman value: 66.22504667157702 - task: type: STS dataset: name: MTEB STS22 (es) type: mteb/sts22-crosslingual-sts config: es split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 69.89707002436481 - type: cosine_spearman value: 72.2054865735116 - type: euclidean_pearson value: 71.81856615570756 - type: euclidean_spearman value: 72.72593304629407 - type: main_score value: 72.2054865735116 - type: manhattan_pearson value: 72.00362684700072 - type: manhattan_spearman value: 72.62783534769964 - type: pearson value: 69.89707002436481 - type: spearman value: 72.2054865735116 - task: type: STS dataset: name: MTEB STS22 (fr) type: mteb/sts22-crosslingual-sts config: fr split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 81.59623734395916 - type: cosine_spearman value: 83.28946105111358 - type: euclidean_pearson value: 79.377330171466 - type: euclidean_spearman value: 81.81029781662205 - type: main_score value: 83.28946105111358 - type: manhattan_pearson value: 78.96970881689698 - type: manhattan_spearman value: 81.91773236079703 - type: pearson value: 81.59623734395916 - type: spearman value: 83.28946105111358 - task: type: STS dataset: name: MTEB STS22 (de-fr) type: mteb/sts22-crosslingual-sts config: de-fr split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 55.03825643126142 - type: cosine_spearman value: 58.25792501780429 - type: euclidean_pearson value: 50.38007603973409 - type: euclidean_spearman value: 59.39961789383097 - type: main_score value: 58.25792501780429 - type: manhattan_pearson value: 50.518568927999155 - type: manhattan_spearman value: 59.84185466003894 - type: pearson value: 55.03825643126142 - type: spearman value: 
58.25792501780429 - task: type: STS dataset: name: MTEB STS22 (pl-en) type: mteb/sts22-crosslingual-sts config: pl-en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 77.77233721490776 - type: cosine_spearman value: 76.17596588017625 - type: euclidean_pearson value: 74.47600468156611 - type: euclidean_spearman value: 72.61278728057012 - type: main_score value: 76.17596588017625 - type: manhattan_pearson value: 74.48118910099699 - type: manhattan_spearman value: 73.33167419101696 - type: pearson value: 77.77233721490776 - type: spearman value: 76.17596588017625 - task: type: STS dataset: name: MTEB STS22 (pl) type: mteb/sts22-crosslingual-sts config: pl split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 42.87453608131507 - type: cosine_spearman value: 45.137849894401185 - type: euclidean_pearson value: 31.66964197694796 - type: euclidean_spearman value: 44.1014900837869 - type: main_score value: 45.137849894401185 - type: manhattan_pearson value: 31.007199259384745 - type: manhattan_spearman value: 43.48181523288926 - type: pearson value: 42.87453608131507 - type: spearman value: 45.137849894401185 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 66.87400150638176 - type: cosine_spearman value: 67.27861354834066 - type: euclidean_pearson value: 66.81789582140216 - type: euclidean_spearman value: 66.44220479858708 - type: main_score value: 67.27861354834066 - type: manhattan_pearson value: 66.92509859033235 - type: manhattan_spearman value: 66.46841124185076 - type: pearson value: 66.87400150638176 - type: spearman value: 67.27861354834066 - task: type: STS dataset: name: MTEB STS22 (ru) type: mteb/sts22-crosslingual-sts config: ru split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 61.819804551576084 - type: cosine_spearman value: 65.0864146772135 - type: euclidean_pearson value: 62.518151090361876 - type: euclidean_spearman value: 65.13608138548017 - type: main_score value: 65.0864146772135 - type: manhattan_pearson value: 62.51413246915267 - type: manhattan_spearman value: 65.19077543064323 - type: pearson value: 61.819804551576084 - type: spearman value: 65.0864146772135 - task: type: STS dataset: name: MTEB STS22 (de) type: mteb/sts22-crosslingual-sts config: de split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 54.85728696035389 - type: cosine_spearman value: 61.60906359227576 - type: euclidean_pearson value: 52.57582587901851 - type: euclidean_spearman value: 61.41823097598308 - type: main_score value: 61.60906359227576 - type: manhattan_pearson value: 52.500978361080506 - type: manhattan_spearman value: 61.30365596659758 - type: pearson value: 54.85728696035389 - type: spearman value: 61.60906359227576 - task: type: STS dataset: name: MTEB STS22 (fr-pl) type: mteb/sts22-crosslingual-sts config: fr-pl split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 67.68016005631422 - type: cosine_spearman value: 84.51542547285167 - type: euclidean_pearson value: 66.19871164667245 - type: euclidean_spearman value: 73.24670207647144 - type: main_score value: 84.51542547285167 - type: manhattan_pearson value: 67.0443525268974 - type: manhattan_spearman value: 73.24670207647144 - type: pearson value: 
67.68016005631422 - type: spearman value: 84.51542547285167 - task: type: STS dataset: name: MTEB STS22 (de-pl) type: mteb/sts22-crosslingual-sts config: de-pl split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 47.49467414030747 - type: cosine_spearman value: 56.81512095681289 - type: euclidean_pearson value: 48.42860221765214 - type: euclidean_spearman value: 58.63197306329092 - type: main_score value: 56.81512095681289 - type: manhattan_pearson value: 48.39594959260441 - type: manhattan_spearman value: 58.63197306329092 - type: pearson value: 47.49467414030747 - type: spearman value: 56.81512095681289 - task: type: STS dataset: name: MTEB STS22 (es-en) type: mteb/sts22-crosslingual-sts config: es-en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 76.8364678896155 - type: cosine_spearman value: 78.45516413087114 - type: euclidean_pearson value: 78.62779318576634 - type: euclidean_spearman value: 78.88760695649488 - type: main_score value: 78.45516413087114 - type: manhattan_pearson value: 78.62131335760031 - type: manhattan_spearman value: 78.81861844200388 - type: pearson value: 76.8364678896155 - type: spearman value: 78.45516413087114 - task: type: STS dataset: name: MTEB STS22 (de-en) type: mteb/sts22-crosslingual-sts config: de-en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 65.16640313911604 - type: cosine_spearman value: 60.887608967403914 - type: euclidean_pearson value: 67.49902244990913 - type: euclidean_spearman value: 59.2458787136538 - type: main_score value: 60.887608967403914 - type: manhattan_pearson value: 67.34313506388378 - type: manhattan_spearman value: 59.05283429200166 - type: pearson value: 65.16640313911604 - type: spearman value: 60.887608967403914 - task: type: STS dataset: name: MTEB QBQTC (default) type: C-MTEB/QBQTC config: default split: test revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7 metrics: - type: cosine_pearson value: 34.20049144526891 - type: cosine_spearman value: 36.41802814113771 - type: euclidean_pearson value: 34.56994213959062 - type: euclidean_spearman value: 36.06141660786936 - type: main_score value: 36.41802814113771 - type: manhattan_pearson value: 34.537041543916004 - type: manhattan_spearman value: 36.03341892777382 - type: pearson value: 34.20049144526891 - type: spearman value: 36.41802814113771 - task: type: STS dataset: name: MTEB STSB (default) type: C-MTEB/STSB config: default split: test revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0 metrics: - type: cosine_pearson value: 81.5092853013241 - type: cosine_spearman value: 83.54005474244292 - type: euclidean_pearson value: 83.7246578378554 - type: euclidean_spearman value: 84.46767551087716 - type: main_score value: 83.54005474244292 - type: manhattan_pearson value: 83.65922665594636 - type: manhattan_spearman value: 84.42431449101848 - type: pearson value: 81.5092853013241 - type: spearman value: 83.54005474244292 - task: type: STS dataset: name: MTEB STSBenchmark (default) type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cosine_pearson value: 87.70246866744966 - type: cosine_spearman value: 89.44070045346106 - type: euclidean_pearson value: 89.56956519641007 - type: euclidean_spearman value: 89.95830112784283 - type: main_score value: 89.44070045346106 - type: manhattan_pearson value: 89.48264471425145 - type: manhattan_spearman value: 
89.87900732483114 - type: pearson value: 87.70246866744966 - type: spearman value: 89.44070045346106 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (de) type: mteb/stsb_multi_mt config: de split: test revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c metrics: - type: cosine_pearson value: 86.83701990805217 - type: cosine_spearman value: 87.80280785492258 - type: euclidean_pearson value: 87.77325330043514 - type: euclidean_spearman value: 88.3564607283144 - type: main_score value: 87.80280785492258 - type: manhattan_pearson value: 87.6745449945946 - type: manhattan_spearman value: 88.30660465978795 - type: pearson value: 86.83701990805217 - type: spearman value: 87.80280785492258 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (zh) type: mteb/stsb_multi_mt config: zh split: test revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c metrics: - type: cosine_pearson value: 84.27751020600267 - type: cosine_spearman value: 85.63500407412486 - type: euclidean_pearson value: 85.21829891649696 - type: euclidean_spearman value: 85.9384575715382 - type: main_score value: 85.63500407412486 - type: manhattan_pearson value: 85.10797194089801 - type: manhattan_spearman value: 85.8770162042784 - type: pearson value: 84.27751020600267 - type: spearman value: 85.63500407412486 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (fr) type: mteb/stsb_multi_mt config: fr split: test revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c metrics: - type: cosine_pearson value: 86.56833656723254 - type: cosine_spearman value: 87.4393978501382 - type: euclidean_pearson value: 87.45171512751267 - type: euclidean_spearman value: 88.13106516566947 - type: main_score value: 87.4393978501382 - type: manhattan_pearson value: 87.33010961793333 - type: manhattan_spearman value: 88.06707425102182 - type: pearson value: 86.56833656723254 - type: spearman value: 87.4393978501382 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (pl) type: mteb/stsb_multi_mt config: pl split: test revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c metrics: - type: cosine_pearson value: 85.45065540325523 - type: cosine_spearman value: 85.47881076789359 - type: euclidean_pearson value: 85.1999493863155 - type: euclidean_spearman value: 85.7874947669187 - type: main_score value: 85.47881076789359 - type: manhattan_pearson value: 85.06075305990376 - type: manhattan_spearman value: 85.71563015639558 - type: pearson value: 85.45065540325523 - type: spearman value: 85.47881076789359 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (es) type: mteb/stsb_multi_mt config: es split: test revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c metrics: - type: cosine_pearson value: 87.11952824079832 - type: cosine_spearman value: 87.9643473573153 - type: euclidean_pearson value: 88.11750364639971 - type: euclidean_spearman value: 88.63695109016498 - type: main_score value: 87.9643473573153 - type: manhattan_pearson value: 88.00294453126699 - type: manhattan_spearman value: 88.53750241758391 - type: pearson value: 87.11952824079832 - type: spearman value: 87.9643473573153 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (ru) type: mteb/stsb_multi_mt config: ru split: test revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c metrics: - type: cosine_pearson value: 85.99804354414991 - type: cosine_spearman value: 86.30252111551002 - type: euclidean_pearson value: 86.1880652037762 - type: euclidean_spearman value: 86.69556223944502 - type: main_score value: 
86.30252111551002 - type: manhattan_pearson value: 86.0736400320898 - type: manhattan_spearman value: 86.61747927593393 - type: pearson value: 85.99804354414991 - type: spearman value: 86.30252111551002 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (en) type: mteb/stsb_multi_mt config: en split: test revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c metrics: - type: cosine_pearson value: 87.70246861738103 - type: cosine_spearman value: 89.44070045346106 - type: euclidean_pearson value: 89.56956518833663 - type: euclidean_spearman value: 89.95830112784283 - type: main_score value: 89.44070045346106 - type: manhattan_pearson value: 89.48264470792915 - type: manhattan_spearman value: 89.87900732483114 - type: pearson value: 87.70246861738103 - type: spearman value: 89.44070045346106 - task: type: Reranking dataset: name: MTEB SciDocsRR (default) type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 84.88064122814694 - type: mrr value: 95.84832651009123 - type: main_score value: 84.88064122814694 - task: type: Retrieval dataset: name: MTEB SciFact (default) type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 57.289 - type: map_at_10 value: 67.88499999999999 - type: map_at_100 value: 68.477 - type: map_at_1000 value: 68.50500000000001 - type: map_at_20 value: 68.33500000000001 - type: map_at_3 value: 65.08 - type: map_at_5 value: 67.001 - type: mrr_at_1 value: 59.667 - type: mrr_at_10 value: 68.626 - type: mrr_at_100 value: 69.082 - type: mrr_at_1000 value: 69.108 - type: mrr_at_20 value: 68.958 - type: mrr_at_3 value: 66.667 - type: mrr_at_5 value: 67.983 - type: ndcg_at_1 value: 59.667 - type: ndcg_at_10 value: 72.309 - type: ndcg_at_100 value: 74.58399999999999 - type: ndcg_at_1000 value: 75.25500000000001 - type: ndcg_at_20 value: 73.656 - type: ndcg_at_3 value: 67.791 - type: ndcg_at_5 value: 70.45 - type: precision_at_1 value: 59.667 - type: precision_at_10 value: 9.567 - type: precision_at_100 value: 1.073 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_20 value: 5.083 - type: precision_at_3 value: 26.333000000000002 - type: precision_at_5 value: 17.666999999999998 - type: recall_at_1 value: 57.289 - type: recall_at_10 value: 84.756 - type: recall_at_100 value: 94.5 - type: recall_at_1000 value: 99.667 - type: recall_at_20 value: 89.7 - type: recall_at_3 value: 73.22800000000001 - type: recall_at_5 value: 79.444 - type: main_score value: 72.309 - task: type: Clustering dataset: name: MTEB SpanishNewsClusteringP2P (default) type: jinaai/spanish_news_clustering config: default split: test revision: bf8ca8ddc5b7da4f7004720ddf99bbe0483480e6 metrics: - type: main_score value: 45.04477709795154 - type: v_measure value: 45.04477709795154 - type: v_measure_std value: 0.0 - task: type: Retrieval dataset: name: MTEB SpanishPassageRetrievalS2S (default) type: jinaai/spanish_passage_retrieval config: default split: test revision: 9cddf2ce5209ade52c2115ccfa00eb22c6d3a837 metrics: - type: main_score value: 69.83 - type: map_at_1 value: 15.736 - type: map_at_10 value: 52.027 - type: map_at_100 value: 65.08800000000001 - type: map_at_1000 value: 65.08800000000001 - type: map_at_20 value: 60.79900000000001 - type: map_at_3 value: 32.869 - type: map_at_5 value: 41.436 - type: mrr_at_1 value: 75.44910179640718 - type: mrr_at_10 value: 84.43446440452426 - type: mrr_at_100 value: 84.48052612723271 - 
type: mrr_at_1000 value: 84.48052612723271 - type: mrr_at_20 value: 84.48052612723271 - type: mrr_at_3 value: 83.13373253493013 - type: mrr_at_5 value: 84.3013972055888 - type: nauc_map_at_1000_diff1 value: 50.611540149694356 - type: nauc_map_at_1000_max value: 2.1102430434260238 - type: nauc_map_at_1000_std value: -18.88993521335793 - type: nauc_map_at_100_diff1 value: 50.611540149694356 - type: nauc_map_at_100_max value: 2.1102430434260238 - type: nauc_map_at_100_std value: -18.88993521335793 - type: nauc_map_at_10_diff1 value: 59.13518981755268 - type: nauc_map_at_10_max value: -9.810386627392807 - type: nauc_map_at_10_std value: -38.31810152345078 - type: nauc_map_at_1_diff1 value: 74.96782567287174 - type: nauc_map_at_1_max value: -29.648279252607875 - type: nauc_map_at_1_std value: -54.017459339141595 - type: nauc_map_at_20_diff1 value: 55.26694458629849 - type: nauc_map_at_20_max value: -1.9490244535020729 - type: nauc_map_at_20_std value: -25.22211659104076 - type: nauc_map_at_3_diff1 value: 71.67607885031732 - type: nauc_map_at_3_max value: -25.078101661694507 - type: nauc_map_at_3_std value: -50.55408861920259 - type: nauc_map_at_5_diff1 value: 61.50111515417668 - type: nauc_map_at_5_max value: -16.4114670513168 - type: nauc_map_at_5_std value: -44.391416134859135 - type: nauc_mrr_at_1000_diff1 value: 74.18848063283234 - type: nauc_mrr_at_1000_max value: 21.929205946778005 - type: nauc_mrr_at_1000_std value: -36.27399268489433 - type: nauc_mrr_at_100_diff1 value: 74.18848063283234 - type: nauc_mrr_at_100_max value: 21.929205946778005 - type: nauc_mrr_at_100_std value: -36.27399268489433 - type: nauc_mrr_at_10_diff1 value: 74.27231582268745 - type: nauc_mrr_at_10_max value: 21.481133301135337 - type: nauc_mrr_at_10_std value: -36.72070854872902 - type: nauc_mrr_at_1_diff1 value: 76.54855950439561 - type: nauc_mrr_at_1_max value: 26.99938321212366 - type: nauc_mrr_at_1_std value: -33.098742603429635 - type: nauc_mrr_at_20_diff1 value: 74.18848063283234 - type: nauc_mrr_at_20_max value: 21.929205946778005 - type: nauc_mrr_at_20_std value: -36.27399268489433 - type: nauc_mrr_at_3_diff1 value: 72.05379526740143 - type: nauc_mrr_at_3_max value: 18.875831185752528 - type: nauc_mrr_at_3_std value: -37.27302006456391 - type: nauc_mrr_at_5_diff1 value: 74.25342356682029 - type: nauc_mrr_at_5_max value: 20.756340085088738 - type: nauc_mrr_at_5_std value: -37.99507208540703 - type: nauc_ndcg_at_1000_diff1 value: 53.259363764380275 - type: nauc_ndcg_at_1000_max value: 12.936954959423218 - type: nauc_ndcg_at_1000_std value: -16.953898675672153 - type: nauc_ndcg_at_100_diff1 value: 53.259363764380275 - type: nauc_ndcg_at_100_max value: 12.936954959423218 - type: nauc_ndcg_at_100_std value: -16.953898675672153 - type: nauc_ndcg_at_10_diff1 value: 53.70942345413554 - type: nauc_ndcg_at_10_max value: -3.8465093347016186 - type: nauc_ndcg_at_10_std value: -31.208127919994755 - type: nauc_ndcg_at_1_diff1 value: 75.30551289259554 - type: nauc_ndcg_at_1_max value: 25.53292054129834 - type: nauc_ndcg_at_1_std value: -33.285498788395145 - type: nauc_ndcg_at_20_diff1 value: 57.62409278278133 - type: nauc_ndcg_at_20_max value: 2.8040586426056233 - type: nauc_ndcg_at_20_std value: -26.270875776221704 - type: nauc_ndcg_at_3_diff1 value: 48.42294834754225 - type: nauc_ndcg_at_3_max value: 16.912467881065822 - type: nauc_ndcg_at_3_std value: -13.324841189277873 - type: nauc_ndcg_at_5_diff1 value: 47.512819802794596 - type: nauc_ndcg_at_5_max value: 14.645518203506594 - type: nauc_ndcg_at_5_std value: 
-17.641450435599275 - type: nauc_precision_at_1000_diff1 value: -34.43320975829637 - type: nauc_precision_at_1000_max value: 29.08585622578186 - type: nauc_precision_at_1000_std value: 46.55117940162061 - type: nauc_precision_at_100_diff1 value: -34.433209758296364 - type: nauc_precision_at_100_max value: 29.085856225781885 - type: nauc_precision_at_100_std value: 46.55117940162065 - type: nauc_precision_at_10_diff1 value: -21.895306304096902 - type: nauc_precision_at_10_max value: 33.190476527593745 - type: nauc_precision_at_10_std value: 37.64916268614298 - type: nauc_precision_at_1_diff1 value: 75.30551289259554 - type: nauc_precision_at_1_max value: 25.53292054129834 - type: nauc_precision_at_1_std value: -33.285498788395145 - type: nauc_precision_at_20_diff1 value: -27.63076748060466 - type: nauc_precision_at_20_max value: 30.689810416086154 - type: nauc_precision_at_20_std value: 46.164191636131626 - type: nauc_precision_at_3_diff1 value: 20.547345067837288 - type: nauc_precision_at_3_max value: 26.177050942827528 - type: nauc_precision_at_3_std value: 5.960466052973099 - type: nauc_precision_at_5_diff1 value: -8.928755534002669 - type: nauc_precision_at_5_max value: 40.83262650073459 - type: nauc_precision_at_5_std value: 26.158537031161494 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: .nan - type: nauc_recall_at_100_max value: .nan - type: nauc_recall_at_100_std value: .nan - type: nauc_recall_at_10_diff1 value: 53.08654386169444 - type: nauc_recall_at_10_max value: -23.276269379519356 - type: nauc_recall_at_10_std value: -50.80707792706157 - type: nauc_recall_at_1_diff1 value: 74.96782567287174 - type: nauc_recall_at_1_max value: -29.648279252607875 - type: nauc_recall_at_1_std value: -54.017459339141595 - type: nauc_recall_at_20_diff1 value: 51.60121897059633 - type: nauc_recall_at_20_max value: -14.241779530735387 - type: nauc_recall_at_20_std value: -37.877451525215456 - type: nauc_recall_at_3_diff1 value: 66.99474984329694 - type: nauc_recall_at_3_max value: -30.802787353187966 - type: nauc_recall_at_3_std value: -53.58737792129713 - type: nauc_recall_at_5_diff1 value: 54.64214444958567 - type: nauc_recall_at_5_max value: -23.341309362104703 - type: nauc_recall_at_5_std value: -51.381363923145265 - type: ndcg_at_1 value: 76.048 - type: ndcg_at_10 value: 69.83 - type: ndcg_at_100 value: 82.11500000000001 - type: ndcg_at_1000 value: 82.11500000000001 - type: ndcg_at_20 value: 75.995 - type: ndcg_at_3 value: 69.587 - type: ndcg_at_5 value: 69.062 - type: precision_at_1 value: 76.048 - type: precision_at_10 value: 43.653 - type: precision_at_100 value: 7.718999999999999 - type: precision_at_1000 value: 0.772 - type: precision_at_20 value: 31.108000000000004 - type: precision_at_3 value: 63.87199999999999 - type: precision_at_5 value: 56.407 - type: recall_at_1 value: 15.736 - type: recall_at_10 value: 66.873 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 85.01100000000001 - type: recall_at_3 value: 36.441 - type: recall_at_5 value: 49.109 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions (default) type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cosine_accuracy value: 99.87326732673267 - type: cosine_accuracy_threshold value: 86.0752820968628 - type: cosine_ap value: 
96.98758090713252 - type: cosine_f1 value: 93.52881698685542 - type: cosine_f1_threshold value: 86.0752820968628 - type: cosine_precision value: 94.58077709611452 - type: cosine_recall value: 92.5 - type: dot_accuracy value: 99.82574257425742 - type: dot_accuracy_threshold value: 40484.73815917969 - type: dot_ap value: 95.68959907254845 - type: dot_f1 value: 91.31293188548865 - type: dot_f1_threshold value: 40336.810302734375 - type: dot_precision value: 90.15594541910332 - type: dot_recall value: 92.5 - type: euclidean_accuracy value: 99.87128712871286 - type: euclidean_accuracy_threshold value: 1162.5749588012695 - type: euclidean_ap value: 96.92640435656577 - type: euclidean_f1 value: 93.4475806451613 - type: euclidean_f1_threshold value: 1162.5749588012695 - type: euclidean_precision value: 94.20731707317073 - type: euclidean_recall value: 92.7 - type: main_score value: 96.98758090713252 - type: manhattan_accuracy value: 99.86930693069307 - type: manhattan_accuracy_threshold value: 28348.71826171875 - type: manhattan_ap value: 96.93832673967925 - type: manhattan_f1 value: 93.33333333333333 - type: manhattan_f1_threshold value: 28348.71826171875 - type: manhattan_precision value: 94.28571428571428 - type: manhattan_recall value: 92.4 - type: max_accuracy value: 99.87326732673267 - type: max_ap value: 96.98758090713252 - type: max_f1 value: 93.52881698685542 - type: max_precision value: 94.58077709611452 - type: max_recall value: 92.7 - type: similarity_accuracy value: 99.87326732673267 - type: similarity_accuracy_threshold value: 86.0752820968628 - type: similarity_ap value: 96.98758090713252 - type: similarity_f1 value: 93.52881698685542 - type: similarity_f1_threshold value: 86.0752820968628 - type: similarity_precision value: 94.58077709611452 - type: similarity_recall value: 92.5 - task: type: Clustering dataset: name: MTEB StackExchangeClustering (default) type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: main_score value: 65.6560129719848 - type: v_measure value: 65.6560129719848 - type: v_measure_std value: 4.781229811487539 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P (default) type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: main_score value: 35.07546243853692 - type: v_measure value: 35.07546243853692 - type: v_measure_std value: 1.1978740356240998 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions (default) type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 51.771005199508835 - type: mrr value: 52.65443298531534 - type: main_score value: 51.771005199508835 - task: type: Summarization dataset: name: MTEB SummEval (default) type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cosine_pearson value: 29.48686238342228 - type: cosine_spearman value: 29.706543509170054 - type: dot_pearson value: 27.95853155597859 - type: dot_spearman value: 27.604287986935162 - type: main_score value: 29.706543509170054 - type: pearson value: 29.48686238342228 - type: spearman value: 29.706543509170054 - task: type: Summarization dataset: name: MTEB SummEvalFr (default) type: lyon-nlp/summarization-summeval-fr-p2p config: default split: test revision: b385812de6a9577b6f4d0f88c6a6e35395a94054 metrics: - type: cosine_pearson 
value: 31.551301434917868 - type: cosine_spearman value: 30.709049789175186 - type: dot_pearson value: 27.77050901756549 - type: dot_spearman value: 26.715505953561795 - type: main_score value: 30.709049789175186 - type: pearson value: 31.551301434917868 - type: spearman value: 30.709049789175186 - task: type: Reranking dataset: name: MTEB SyntecReranking (default) type: lyon-nlp/mteb-fr-reranking-syntec-s2p config: default split: test revision: b205c5084a0934ce8af14338bf03feb19499c84d metrics: - type: map value: 73.31666666666666 - type: mrr value: 73.31666666666666 - type: main_score value: 73.31666666666666 - task: type: Retrieval dataset: name: MTEB SyntecRetrieval (default) type: lyon-nlp/mteb-fr-retrieval-syntec-s2p config: default split: test revision: 19661ccdca4dfc2d15122d776b61685f48c68ca9 metrics: - type: main_score value: 83.851 - type: map_at_1 value: 68.0 - type: map_at_10 value: 79.187 - type: map_at_100 value: 79.32900000000001 - type: map_at_1000 value: 79.32900000000001 - type: map_at_20 value: 79.32900000000001 - type: map_at_3 value: 77.333 - type: map_at_5 value: 78.93299999999999 - type: mrr_at_1 value: 68.0 - type: mrr_at_10 value: 79.18730158730159 - type: mrr_at_100 value: 79.32945845004669 - type: mrr_at_1000 value: 79.32945845004669 - type: mrr_at_20 value: 79.32945845004669 - type: mrr_at_3 value: 77.33333333333333 - type: mrr_at_5 value: 78.93333333333332 - type: nauc_map_at_1000_diff1 value: 63.31103256935259 - type: nauc_map_at_1000_max value: 11.073749121365623 - type: nauc_map_at_1000_std value: 7.4973309839738 - type: nauc_map_at_100_diff1 value: 63.31103256935259 - type: nauc_map_at_100_max value: 11.073749121365623 - type: nauc_map_at_100_std value: 7.4973309839738 - type: nauc_map_at_10_diff1 value: 62.91585737195978 - type: nauc_map_at_10_max value: 11.770664508983133 - type: nauc_map_at_10_std value: 8.179883948527962 - type: nauc_map_at_1_diff1 value: 66.1236265634718 - type: nauc_map_at_1_max value: 7.000207311173955 - type: nauc_map_at_1_std value: 6.54412272821497 - type: nauc_map_at_20_diff1 value: 63.31103256935259 - type: nauc_map_at_20_max value: 11.073749121365623 - type: nauc_map_at_20_std value: 7.4973309839738 - type: nauc_map_at_3_diff1 value: 62.14039574010254 - type: nauc_map_at_3_max value: 11.06996398110187 - type: nauc_map_at_3_std value: 7.288759297085769 - type: nauc_map_at_5_diff1 value: 63.0401271126211 - type: nauc_map_at_5_max value: 10.779317801858609 - type: nauc_map_at_5_std value: 6.476660484760681 - type: nauc_mrr_at_1000_diff1 value: 63.31103256935259 - type: nauc_mrr_at_1000_max value: 11.073749121365623 - type: nauc_mrr_at_1000_std value: 7.4973309839738 - type: nauc_mrr_at_100_diff1 value: 63.31103256935259 - type: nauc_mrr_at_100_max value: 11.073749121365623 - type: nauc_mrr_at_100_std value: 7.4973309839738 - type: nauc_mrr_at_10_diff1 value: 62.91585737195978 - type: nauc_mrr_at_10_max value: 11.770664508983133 - type: nauc_mrr_at_10_std value: 8.179883948527962 - type: nauc_mrr_at_1_diff1 value: 66.1236265634718 - type: nauc_mrr_at_1_max value: 7.000207311173955 - type: nauc_mrr_at_1_std value: 6.54412272821497 - type: nauc_mrr_at_20_diff1 value: 63.31103256935259 - type: nauc_mrr_at_20_max value: 11.073749121365623 - type: nauc_mrr_at_20_std value: 7.4973309839738 - type: nauc_mrr_at_3_diff1 value: 62.14039574010254 - type: nauc_mrr_at_3_max value: 11.06996398110187 - type: nauc_mrr_at_3_std value: 7.288759297085769 - type: nauc_mrr_at_5_diff1 value: 63.0401271126211 - type: nauc_mrr_at_5_max value: 
10.779317801858609 - type: nauc_mrr_at_5_std value: 6.476660484760681 - type: nauc_ndcg_at_1000_diff1 value: 62.9544299483241 - type: nauc_ndcg_at_1000_max value: 11.577079766964538 - type: nauc_ndcg_at_1000_std value: 7.703856790100716 - type: nauc_ndcg_at_100_diff1 value: 62.9544299483241 - type: nauc_ndcg_at_100_max value: 11.577079766964538 - type: nauc_ndcg_at_100_std value: 7.703856790100716 - type: nauc_ndcg_at_10_diff1 value: 61.29907952217381 - type: nauc_ndcg_at_10_max value: 14.760627422715425 - type: nauc_ndcg_at_10_std value: 10.805573898143368 - type: nauc_ndcg_at_1_diff1 value: 66.1236265634718 - type: nauc_ndcg_at_1_max value: 7.000207311173955 - type: nauc_ndcg_at_1_std value: 6.54412272821497 - type: nauc_ndcg_at_20_diff1 value: 62.9544299483241 - type: nauc_ndcg_at_20_max value: 11.577079766964538 - type: nauc_ndcg_at_20_std value: 7.703856790100716 - type: nauc_ndcg_at_3_diff1 value: 60.25643527856101 - type: nauc_ndcg_at_3_max value: 12.236302709487546 - type: nauc_ndcg_at_3_std value: 7.36883189112067 - type: nauc_ndcg_at_5_diff1 value: 61.65220590318238 - type: nauc_ndcg_at_5_max value: 11.39969101913945 - type: nauc_ndcg_at_5_std value: 5.406207922379402 - type: nauc_precision_at_1000_diff1 value: .nan - type: nauc_precision_at_1000_max value: .nan - type: nauc_precision_at_1000_std value: .nan - type: nauc_precision_at_100_diff1 value: .nan - type: nauc_precision_at_100_max value: .nan - type: nauc_precision_at_100_std value: .nan - type: nauc_precision_at_10_diff1 value: 19.14098972922579 - type: nauc_precision_at_10_max value: 100.0 - type: nauc_precision_at_10_std value: 93.46405228758135 - type: nauc_precision_at_1_diff1 value: 66.1236265634718 - type: nauc_precision_at_1_max value: 7.000207311173955 - type: nauc_precision_at_1_std value: 6.54412272821497 - type: nauc_precision_at_20_diff1 value: 100.0 - type: nauc_precision_at_20_max value: 100.0 - type: nauc_precision_at_20_std value: 100.0 - type: nauc_precision_at_3_diff1 value: 50.29636629155561 - type: nauc_precision_at_3_max value: 18.00532600292076 - type: nauc_precision_at_3_std value: 7.649686453053768 - type: nauc_precision_at_5_diff1 value: 43.522408963585356 - type: nauc_precision_at_5_max value: 16.923436041082983 - type: nauc_precision_at_5_std value: -10.854341736694092 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: .nan - type: nauc_recall_at_100_max value: .nan - type: nauc_recall_at_100_std value: .nan - type: nauc_recall_at_10_diff1 value: 19.1409897292252 - type: nauc_recall_at_10_max value: 100.0 - type: nauc_recall_at_10_std value: 93.46405228758134 - type: nauc_recall_at_1_diff1 value: 66.1236265634718 - type: nauc_recall_at_1_max value: 7.000207311173955 - type: nauc_recall_at_1_std value: 6.54412272821497 - type: nauc_recall_at_20_diff1 value: .nan - type: nauc_recall_at_20_max value: .nan - type: nauc_recall_at_20_std value: .nan - type: nauc_recall_at_3_diff1 value: 50.29636629155569 - type: nauc_recall_at_3_max value: 18.005326002920754 - type: nauc_recall_at_3_std value: 7.649686453053851 - type: nauc_recall_at_5_diff1 value: 43.5224089635856 - type: nauc_recall_at_5_max value: 16.92343604108335 - type: nauc_recall_at_5_std value: -10.854341736694499 - type: ndcg_at_1 value: 68.0 - type: ndcg_at_10 value: 83.851 - type: ndcg_at_100 value: 84.36099999999999 - type: ndcg_at_1000 value: 84.36099999999999 - type: ndcg_at_20 value: 84.36099999999999 - type: 
ndcg_at_3 value: 80.333 - type: ndcg_at_5 value: 83.21600000000001 - type: precision_at_1 value: 68.0 - type: precision_at_10 value: 9.8 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 5.0 - type: precision_at_3 value: 29.666999999999998 - type: precision_at_5 value: 19.2 - type: recall_at_1 value: 68.0 - type: recall_at_10 value: 98.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 100.0 - type: recall_at_3 value: 89.0 - type: recall_at_5 value: 96.0 - task: type: Reranking dataset: name: MTEB T2Reranking (default) type: C-MTEB/T2Reranking config: default split: dev revision: 76631901a18387f85eaa53e5450019b87ad58ef9 metrics: - type: map value: 65.3088203970324 - type: mrr value: 74.79505862376546 - type: main_score value: 65.3088203970324 - task: type: Retrieval dataset: name: MTEB T2Retrieval (default) type: C-MTEB/T2Retrieval config: default split: dev revision: 8731a845f1bf500a4f111cf1070785c793d10e64 metrics: - type: main_score value: 83.163 - type: map_at_1 value: 26.875 - type: map_at_10 value: 75.454 - type: map_at_100 value: 79.036 - type: map_at_1000 value: 79.111 - type: map_at_20 value: 78.145 - type: map_at_3 value: 53.181 - type: map_at_5 value: 65.362 - type: mrr_at_1 value: 88.90057864281957 - type: mrr_at_10 value: 91.53186397301344 - type: mrr_at_100 value: 91.62809075510003 - type: mrr_at_1000 value: 91.63198173030787 - type: mrr_at_20 value: 91.59414668799909 - type: mrr_at_3 value: 91.0792565316499 - type: mrr_at_5 value: 91.35718043135199 - type: nauc_map_at_1000_diff1 value: 12.364843957982409 - type: nauc_map_at_1000_max value: 52.07043464458799 - type: nauc_map_at_1000_std value: 16.040095055100494 - type: nauc_map_at_100_diff1 value: 12.370621073823022 - type: nauc_map_at_100_max value: 51.960738727635636 - type: nauc_map_at_100_std value: 15.935832440430747 - type: nauc_map_at_10_diff1 value: 16.852819486606585 - type: nauc_map_at_10_max value: 40.11184760756059 - type: nauc_map_at_10_std value: 0.9306648364102376 - type: nauc_map_at_1_diff1 value: 52.87356542654683 - type: nauc_map_at_1_max value: -22.210039746171255 - type: nauc_map_at_1_std value: -38.11345358035342 - type: nauc_map_at_20_diff1 value: 13.045089059562837 - type: nauc_map_at_20_max value: 49.591383082160036 - type: nauc_map_at_20_std value: 12.54330050352008 - type: nauc_map_at_3_diff1 value: 38.08172234377615 - type: nauc_map_at_3_max value: -6.868621684867697 - type: nauc_map_at_3_std value: -35.4712388845996 - type: nauc_map_at_5_diff1 value: 29.665551705577474 - type: nauc_map_at_5_max value: 10.958628576519045 - type: nauc_map_at_5_std value: -25.113120842097057 - type: nauc_mrr_at_1000_diff1 value: 47.39372999496945 - type: nauc_mrr_at_1000_max value: 83.11274997493808 - type: nauc_mrr_at_1000_std value: 39.74195374546631 - type: nauc_mrr_at_100_diff1 value: 47.396678946057676 - type: nauc_mrr_at_100_max value: 83.1192584274415 - type: nauc_mrr_at_100_std value: 39.75840860374685 - type: nauc_mrr_at_10_diff1 value: 47.35365644138715 - type: nauc_mrr_at_10_max value: 83.189165639531 - type: nauc_mrr_at_10_std value: 39.83653157887758 - type: nauc_mrr_at_1_diff1 value: 47.98740362820094 - type: nauc_mrr_at_1_max value: 80.32340034580369 - type: nauc_mrr_at_1_std value: 34.57857131423388 - type: nauc_mrr_at_20_diff1 value: 47.399132055537194 - type: nauc_mrr_at_20_max value: 83.16329919869686 - type: nauc_mrr_at_20_std value: 39.84204692042734 - type: nauc_mrr_at_3_diff1 value: 
47.09295580511751 - type: nauc_mrr_at_3_max value: 82.95831045602642 - type: nauc_mrr_at_3_std value: 38.98036804692351 - type: nauc_mrr_at_5_diff1 value: 47.20100268549764 - type: nauc_mrr_at_5_max value: 83.16652480381642 - type: nauc_mrr_at_5_std value: 39.55690491560902 - type: nauc_ndcg_at_1000_diff1 value: 17.201962509184547 - type: nauc_ndcg_at_1000_max value: 63.75820559259539 - type: nauc_ndcg_at_1000_std value: 29.28676096486067 - type: nauc_ndcg_at_100_diff1 value: 16.76847216096811 - type: nauc_ndcg_at_100_max value: 62.646517934470744 - type: nauc_ndcg_at_100_std value: 28.7441617667637 - type: nauc_ndcg_at_10_diff1 value: 16.559511980751886 - type: nauc_ndcg_at_10_max value: 54.35027464277944 - type: nauc_ndcg_at_10_std value: 16.98089333577716 - type: nauc_ndcg_at_1_diff1 value: 47.98740362820094 - type: nauc_ndcg_at_1_max value: 80.32340034580369 - type: nauc_ndcg_at_1_std value: 34.57857131423388 - type: nauc_ndcg_at_20_diff1 value: 16.721525245428243 - type: nauc_ndcg_at_20_max value: 57.683661870555724 - type: nauc_ndcg_at_20_std value: 21.736044200026853 - type: nauc_ndcg_at_3_diff1 value: 12.488009696556192 - type: nauc_ndcg_at_3_max value: 69.2365575305502 - type: nauc_ndcg_at_3_std value: 30.622418945055323 - type: nauc_ndcg_at_5_diff1 value: 12.364114556230609 - type: nauc_ndcg_at_5_max value: 62.33360746285387 - type: nauc_ndcg_at_5_std value: 24.898000803570227 - type: nauc_precision_at_1000_diff1 value: -35.14745130154524 - type: nauc_precision_at_1000_max value: 48.811507982849065 - type: nauc_precision_at_1000_std value: 62.43036496029399 - type: nauc_precision_at_100_diff1 value: -35.15276411320076 - type: nauc_precision_at_100_max value: 50.87010333741109 - type: nauc_precision_at_100_std value: 63.418221030407175 - type: nauc_precision_at_10_diff1 value: -34.84255710936113 - type: nauc_precision_at_10_max value: 56.588401051428825 - type: nauc_precision_at_10_std value: 57.4763370653757 - type: nauc_precision_at_1_diff1 value: 47.98740362820094 - type: nauc_precision_at_1_max value: 80.32340034580369 - type: nauc_precision_at_1_std value: 34.57857131423388 - type: nauc_precision_at_20_diff1 value: -35.165762365233505 - type: nauc_precision_at_20_max value: 54.148762449660424 - type: nauc_precision_at_20_std value: 61.569719669368716 - type: nauc_precision_at_3_diff1 value: -28.63023175340299 - type: nauc_precision_at_3_max value: 68.69825987618499 - type: nauc_precision_at_3_std value: 48.15479495755423 - type: nauc_precision_at_5_diff1 value: -34.13811355456687 - type: nauc_precision_at_5_max value: 62.369363941490604 - type: nauc_precision_at_5_std value: 52.282904411187914 - type: nauc_recall_at_1000_diff1 value: 8.686444579162663 - type: nauc_recall_at_1000_max value: 59.58864478011338 - type: nauc_recall_at_1000_std value: 56.692774954297455 - type: nauc_recall_at_100_diff1 value: 8.820596225758342 - type: nauc_recall_at_100_max value: 53.15048885657892 - type: nauc_recall_at_100_std value: 39.78931159236714 - type: nauc_recall_at_10_diff1 value: 16.022301106315027 - type: nauc_recall_at_10_max value: 29.83242342459543 - type: nauc_recall_at_10_std value: -4.805965555875844 - type: nauc_recall_at_1_diff1 value: 52.87356542654683 - type: nauc_recall_at_1_max value: -22.210039746171255 - type: nauc_recall_at_1_std value: -38.11345358035342 - type: nauc_recall_at_20_diff1 value: 10.35772828627265 - type: nauc_recall_at_20_max value: 43.06420839754062 - type: nauc_recall_at_20_std value: 15.040522218235692 - type: nauc_recall_at_3_diff1 value: 
36.23953684770224 - type: nauc_recall_at_3_max value: -11.709269151700374 - type: nauc_recall_at_3_std value: -38.13943178150384 - type: nauc_recall_at_5_diff1 value: 28.644872415763384 - type: nauc_recall_at_5_max value: 2.062151266111129 - type: nauc_recall_at_5_std value: -30.81114034774277 - type: ndcg_at_1 value: 88.901 - type: ndcg_at_10 value: 83.163 - type: ndcg_at_100 value: 86.854 - type: ndcg_at_1000 value: 87.602 - type: ndcg_at_20 value: 84.908 - type: ndcg_at_3 value: 84.848 - type: ndcg_at_5 value: 83.372 - type: precision_at_1 value: 88.901 - type: precision_at_10 value: 41.343 - type: precision_at_100 value: 4.957000000000001 - type: precision_at_1000 value: 0.513 - type: precision_at_20 value: 22.955000000000002 - type: precision_at_3 value: 74.29599999999999 - type: precision_at_5 value: 62.251999999999995 - type: recall_at_1 value: 26.875 - type: recall_at_10 value: 81.902 - type: recall_at_100 value: 93.988 - type: recall_at_1000 value: 97.801 - type: recall_at_20 value: 87.809 - type: recall_at_3 value: 54.869 - type: recall_at_5 value: 68.728 - task: type: PairClassification dataset: name: MTEB TERRa (default) type: ai-forever/terra-pairclassification config: default split: dev revision: 7b58f24536063837d644aab9a023c62199b2a612 metrics: - type: cosine_accuracy value: 60.586319218241044 - type: cosine_accuracy_threshold value: 82.49806761741638 - type: cosine_ap value: 58.73198048427448 - type: cosine_f1 value: 67.37967914438502 - type: cosine_f1_threshold value: 77.46461033821106 - type: cosine_precision value: 57.01357466063348 - type: cosine_recall value: 82.35294117647058 - type: dot_accuracy value: 60.26058631921825 - type: dot_accuracy_threshold value: 35627.020263671875 - type: dot_ap value: 57.418783612898224 - type: dot_f1 value: 66.51982378854623 - type: dot_f1_threshold value: 27620.843505859375 - type: dot_precision value: 50.16611295681063 - type: dot_recall value: 98.69281045751634 - type: euclidean_accuracy value: 60.26058631921825 - type: euclidean_accuracy_threshold value: 1255.4466247558594 - type: euclidean_ap value: 58.748656145387955 - type: euclidean_f1 value: 66.99029126213591 - type: euclidean_f1_threshold value: 1565.1330947875977 - type: euclidean_precision value: 53.28185328185329 - type: euclidean_recall value: 90.19607843137256 - type: main_score value: 58.8479126365766 - type: manhattan_accuracy value: 59.934853420195445 - type: manhattan_accuracy_threshold value: 29897.271728515625 - type: manhattan_ap value: 58.8479126365766 - type: manhattan_f1 value: 66.81318681318683 - type: manhattan_f1_threshold value: 46291.802978515625 - type: manhattan_precision value: 50.331125827814574 - type: manhattan_recall value: 99.34640522875817 - type: max_accuracy value: 60.586319218241044 - type: max_ap value: 58.8479126365766 - type: max_f1 value: 67.37967914438502 - type: max_precision value: 57.01357466063348 - type: max_recall value: 99.34640522875817 - type: similarity_accuracy value: 60.586319218241044 - type: similarity_accuracy_threshold value: 82.49806761741638 - type: similarity_ap value: 58.73198048427448 - type: similarity_f1 value: 67.37967914438502 - type: similarity_f1_threshold value: 77.46461033821106 - type: similarity_precision value: 57.01357466063348 - type: similarity_recall value: 82.35294117647058 - task: type: Classification dataset: name: MTEB TNews (default) type: C-MTEB/TNews-classification config: default split: validation revision: 317f262bf1e6126357bbe89e875451e4b0938fe4 metrics: - type: accuracy value: 
45.967999999999996 - type: f1 value: 44.699306100915706 - type: f1_weighted value: 46.03730319014832 - type: main_score value: 45.967999999999996 - task: type: Retrieval dataset: name: MTEB TRECCOVID (default) type: mteb/trec-covid config: default split: test revision: bb9466bac8153a0349341eb1b22e06409e78ef4e metrics: - type: map_at_1 value: 0.251 - type: map_at_10 value: 1.9480000000000002 - type: map_at_100 value: 11.082 - type: map_at_1000 value: 26.700000000000003 - type: map_at_20 value: 3.3529999999999998 - type: map_at_3 value: 0.679 - type: map_at_5 value: 1.079 - type: mrr_at_1 value: 94.0 - type: mrr_at_10 value: 95.786 - type: mrr_at_100 value: 95.786 - type: mrr_at_1000 value: 95.786 - type: mrr_at_20 value: 95.786 - type: mrr_at_3 value: 95.0 - type: mrr_at_5 value: 95.5 - type: ndcg_at_1 value: 91.0 - type: ndcg_at_10 value: 77.71900000000001 - type: ndcg_at_100 value: 57.726 - type: ndcg_at_1000 value: 52.737 - type: ndcg_at_20 value: 72.54 - type: ndcg_at_3 value: 83.397 - type: ndcg_at_5 value: 80.806 - type: precision_at_1 value: 94.0 - type: precision_at_10 value: 81.0 - type: precision_at_100 value: 59.199999999999996 - type: precision_at_1000 value: 23.244 - type: precision_at_20 value: 75.2 - type: precision_at_3 value: 88.0 - type: precision_at_5 value: 84.8 - type: recall_at_1 value: 0.251 - type: recall_at_10 value: 2.1229999999999998 - type: recall_at_100 value: 14.496999999999998 - type: recall_at_1000 value: 50.09 - type: recall_at_20 value: 3.8309999999999995 - type: recall_at_3 value: 0.696 - type: recall_at_5 value: 1.1400000000000001 - type: main_score value: 77.71900000000001 - task: type: Clustering dataset: name: MTEB TenKGnadClusteringP2P (default) type: slvnwhrl/tenkgnad-clustering-p2p config: default split: test revision: 5c59e41555244b7e45c9a6be2d720ab4bafae558 metrics: - type: main_score value: 43.763609722295215 - type: v_measure value: 43.763609722295215 - type: v_measure_std value: 2.8751199473862457 - task: type: Clustering dataset: name: MTEB TenKGnadClusteringS2S (default) type: slvnwhrl/tenkgnad-clustering-s2s config: default split: test revision: 6cddbe003f12b9b140aec477b583ac4191f01786 metrics: - type: main_score value: 39.762424448504355 - type: v_measure value: 39.762424448504355 - type: v_measure_std value: 3.30146124979502 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringP2P (default) type: C-MTEB/ThuNewsClusteringP2P config: default split: test revision: 5798586b105c0434e4f0fe5e767abe619442cf93 metrics: - type: main_score value: 63.133819258289456 - type: v_measure value: 63.133819258289456 - type: v_measure_std value: 1.8854253356479695 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringS2S (default) type: C-MTEB/ThuNewsClusteringS2S config: default split: test revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d metrics: - type: main_score value: 58.98195851785808 - type: v_measure value: 58.98195851785808 - type: v_measure_std value: 1.6237600076393737 - task: type: Retrieval dataset: name: MTEB Touche2020 (default) type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 3.3550000000000004 - type: map_at_10 value: 10.08 - type: map_at_100 value: 16.136 - type: map_at_1000 value: 17.605 - type: map_at_20 value: 12.561 - type: map_at_3 value: 5.641 - type: map_at_5 value: 7.3260000000000005 - type: mrr_at_1 value: 46.939 - type: mrr_at_10 value: 58.152 - type: mrr_at_100 value: 58.594 - type: mrr_at_1000 value: 58.601000000000006 - type: 
mrr_at_20 value: 58.279 - type: mrr_at_3 value: 55.102 - type: mrr_at_5 value: 56.531 - type: ndcg_at_1 value: 44.897999999999996 - type: ndcg_at_10 value: 26.298 - type: ndcg_at_100 value: 37.596000000000004 - type: ndcg_at_1000 value: 49.424 - type: ndcg_at_20 value: 27.066000000000003 - type: ndcg_at_3 value: 31.528 - type: ndcg_at_5 value: 28.219 - type: precision_at_1 value: 46.939 - type: precision_at_10 value: 22.245 - type: precision_at_100 value: 7.531000000000001 - type: precision_at_1000 value: 1.5350000000000001 - type: precision_at_20 value: 17.041 - type: precision_at_3 value: 30.612000000000002 - type: precision_at_5 value: 26.122 - type: recall_at_1 value: 3.3550000000000004 - type: recall_at_10 value: 16.41 - type: recall_at_100 value: 47.272 - type: recall_at_1000 value: 83.584 - type: recall_at_20 value: 24.091 - type: recall_at_3 value: 6.8180000000000005 - type: recall_at_5 value: 9.677 - type: main_score value: 26.298 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification (default) type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 91.2890625 - type: ap value: 33.95547153875715 - type: ap_weighted value: 33.95547153875715 - type: f1 value: 75.10768597556462 - type: f1_weighted value: 92.00161208992606 - type: main_score value: 91.2890625 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification (default) type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 71.3978494623656 - type: f1 value: 71.7194818511814 - type: f1_weighted value: 71.13860187349744 - type: main_score value: 71.3978494623656 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering (default) type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: main_score value: 52.4921688720602 - type: v_measure value: 52.4921688720602 - type: v_measure_std value: 0.992768152658908 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 (default) type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cosine_accuracy value: 85.11652858079513 - type: cosine_accuracy_threshold value: 87.90839910507202 - type: cosine_ap value: 70.90459908851724 - type: cosine_f1 value: 65.66581227877457 - type: cosine_f1_threshold value: 85.13308763504028 - type: cosine_precision value: 61.094708153531684 - type: cosine_recall value: 70.97625329815304 - type: dot_accuracy value: 83.41181379269239 - type: dot_accuracy_threshold value: 43110.113525390625 - type: dot_ap value: 65.64869491143095 - type: dot_f1 value: 62.05308447460914 - type: dot_f1_threshold value: 41412.542724609375 - type: dot_precision value: 57.38623626989464 - type: dot_recall value: 67.54617414248021 - type: euclidean_accuracy value: 85.15229182809799 - type: euclidean_accuracy_threshold value: 1043.08500289917 - type: euclidean_ap value: 70.71204383269375 - type: euclidean_f1 value: 65.20304568527919 - type: euclidean_f1_threshold value: 1179.2595863342285 - type: euclidean_precision value: 62.81173594132029 - type: euclidean_recall value: 67.78364116094987 - type: main_score value: 70.90459908851724 - type: manhattan_accuracy value: 85.1820945341837 - type: manhattan_accuracy_threshold value: 26115.0390625 - type: 
manhattan_ap value: 70.66113937117431 - type: manhattan_f1 value: 65.33383628819313 - type: manhattan_f1_threshold value: 29105.181884765625 - type: manhattan_precision value: 62.40691808791736 - type: manhattan_recall value: 68.54881266490766 - type: max_accuracy value: 85.1820945341837 - type: max_ap value: 70.90459908851724 - type: max_f1 value: 65.66581227877457 - type: max_precision value: 62.81173594132029 - type: max_recall value: 70.97625329815304 - type: similarity_accuracy value: 85.11652858079513 - type: similarity_accuracy_threshold value: 87.90839910507202 - type: similarity_ap value: 70.90459908851724 - type: similarity_f1 value: 65.66581227877457 - type: similarity_f1_threshold value: 85.13308763504028 - type: similarity_precision value: 61.094708153531684 - type: similarity_recall value: 70.97625329815304 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus (default) type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cosine_accuracy value: 88.10299996119068 - type: cosine_accuracy_threshold value: 84.34982895851135 - type: cosine_ap value: 84.13755787769226 - type: cosine_f1 value: 76.0967548076923 - type: cosine_f1_threshold value: 82.8936219215393 - type: cosine_precision value: 74.28864769727193 - type: cosine_recall value: 77.99507237449954 - type: dot_accuracy value: 86.64182869561843 - type: dot_accuracy_threshold value: 38794.677734375 - type: dot_ap value: 80.20301567411457 - type: dot_f1 value: 73.50650291634967 - type: dot_f1_threshold value: 37447.23205566406 - type: dot_precision value: 69.41498460485802 - type: dot_recall value: 78.11056359716662 - type: euclidean_accuracy value: 87.9361198432103 - type: euclidean_accuracy_threshold value: 1184.421157836914 - type: euclidean_ap value: 83.79582690117218 - type: euclidean_f1 value: 75.81431709042175 - type: euclidean_f1_threshold value: 1258.2727432250977 - type: euclidean_precision value: 73.39099099099099 - type: euclidean_recall value: 78.40314136125654 - type: main_score value: 84.13755787769226 - type: manhattan_accuracy value: 87.96134590755618 - type: manhattan_accuracy_threshold value: 29077.291870117188 - type: manhattan_ap value: 83.79487172269923 - type: manhattan_f1 value: 75.82421603424935 - type: manhattan_f1_threshold value: 31224.124145507812 - type: manhattan_precision value: 72.24740255212329 - type: manhattan_recall value: 79.77363720357253 - type: max_accuracy value: 88.10299996119068 - type: max_ap value: 84.13755787769226 - type: max_f1 value: 76.0967548076923 - type: max_precision value: 74.28864769727193 - type: max_recall value: 79.77363720357253 - type: similarity_accuracy value: 88.10299996119068 - type: similarity_accuracy_threshold value: 84.34982895851135 - type: similarity_ap value: 84.13755787769226 - type: similarity_f1 value: 76.0967548076923 - type: similarity_f1_threshold value: 82.8936219215393 - type: similarity_precision value: 74.28864769727193 - type: similarity_recall value: 77.99507237449954 - task: type: Retrieval dataset: name: MTEB VideoRetrieval (default) type: C-MTEB/VideoRetrieval config: default split: dev revision: 58c2597a5943a2ba48f4668c3b90d796283c5639 metrics: - type: main_score value: 70.433 - type: map_at_1 value: 55.7 - type: map_at_10 value: 66.013 - type: map_at_100 value: 66.534 - type: map_at_1000 value: 66.547 - type: map_at_20 value: 66.334 - type: map_at_3 value: 64.2 - type: map_at_5 value: 65.445 - type: mrr_at_1 value: 55.7 - type: 
mrr_at_10 value: 66.01329365079364 - type: mrr_at_100 value: 66.53350061744233 - type: mrr_at_1000 value: 66.54744831962995 - type: mrr_at_20 value: 66.3335147364675 - type: mrr_at_3 value: 64.2 - type: mrr_at_5 value: 65.44500000000002 - type: nauc_map_at_1000_diff1 value: 76.26428836976245 - type: nauc_map_at_1000_max value: 35.41847367373575 - type: nauc_map_at_1000_std value: -33.04639860831992 - type: nauc_map_at_100_diff1 value: 76.25793229023193 - type: nauc_map_at_100_max value: 35.43663260110076 - type: nauc_map_at_100_std value: -33.04238139882945 - type: nauc_map_at_10_diff1 value: 76.2108281297711 - type: nauc_map_at_10_max value: 35.59442419423183 - type: nauc_map_at_10_std value: -33.32346518997277 - type: nauc_map_at_1_diff1 value: 79.17728405262736 - type: nauc_map_at_1_max value: 31.880738163589527 - type: nauc_map_at_1_std value: -30.891888718004584 - type: nauc_map_at_20_diff1 value: 76.2181333410193 - type: nauc_map_at_20_max value: 35.43448818430876 - type: nauc_map_at_20_std value: -33.35682442863193 - type: nauc_map_at_3_diff1 value: 76.10046541433466 - type: nauc_map_at_3_max value: 34.6831278555291 - type: nauc_map_at_3_std value: -34.030826044831116 - type: nauc_map_at_5_diff1 value: 75.96513023582064 - type: nauc_map_at_5_max value: 34.66920832438069 - type: nauc_map_at_5_std value: -33.79799777830796 - type: nauc_mrr_at_1000_diff1 value: 76.26428836976245 - type: nauc_mrr_at_1000_max value: 35.41847367373575 - type: nauc_mrr_at_1000_std value: -33.04639860831992 - type: nauc_mrr_at_100_diff1 value: 76.25793229023193 - type: nauc_mrr_at_100_max value: 35.43663260110076 - type: nauc_mrr_at_100_std value: -33.04238139882945 - type: nauc_mrr_at_10_diff1 value: 76.2108281297711 - type: nauc_mrr_at_10_max value: 35.59442419423183 - type: nauc_mrr_at_10_std value: -33.32346518997277 - type: nauc_mrr_at_1_diff1 value: 79.17728405262736 - type: nauc_mrr_at_1_max value: 31.880738163589527 - type: nauc_mrr_at_1_std value: -30.891888718004584 - type: nauc_mrr_at_20_diff1 value: 76.2181333410193 - type: nauc_mrr_at_20_max value: 35.43448818430876 - type: nauc_mrr_at_20_std value: -33.35682442863193 - type: nauc_mrr_at_3_diff1 value: 76.10046541433466 - type: nauc_mrr_at_3_max value: 34.6831278555291 - type: nauc_mrr_at_3_std value: -34.030826044831116 - type: nauc_mrr_at_5_diff1 value: 75.96513023582064 - type: nauc_mrr_at_5_max value: 34.66920832438069 - type: nauc_mrr_at_5_std value: -33.79799777830796 - type: nauc_ndcg_at_1000_diff1 value: 75.68118206798317 - type: nauc_ndcg_at_1000_max value: 37.12252980787349 - type: nauc_ndcg_at_1000_std value: -31.457578337430505 - type: nauc_ndcg_at_100_diff1 value: 75.46730761564156 - type: nauc_ndcg_at_100_max value: 37.549890025544265 - type: nauc_ndcg_at_100_std value: -31.35066985945112 - type: nauc_ndcg_at_10_diff1 value: 75.09890404887037 - type: nauc_ndcg_at_10_max value: 38.024147790014204 - type: nauc_ndcg_at_10_std value: -33.67408368593356 - type: nauc_ndcg_at_1_diff1 value: 79.17728405262736 - type: nauc_ndcg_at_1_max value: 31.880738163589527 - type: nauc_ndcg_at_1_std value: -30.891888718004584 - type: nauc_ndcg_at_20_diff1 value: 75.12977548171354 - type: nauc_ndcg_at_20_max value: 37.524926748917956 - type: nauc_ndcg_at_20_std value: -33.771344674947485 - type: nauc_ndcg_at_3_diff1 value: 74.94037476984154 - type: nauc_ndcg_at_3_max value: 35.60345554050552 - type: nauc_ndcg_at_3_std value: -35.256991346321854 - type: nauc_ndcg_at_5_diff1 value: 74.54265907753783 - type: nauc_ndcg_at_5_max value: 35.57662819978585 - 
type: nauc_ndcg_at_5_std value: -34.879794448418465 - type: nauc_precision_at_1000_diff1 value: 74.52277207179142 - type: nauc_precision_at_1000_max value: 94.25510945118707 - type: nauc_precision_at_1000_std value: 91.6874157070222 - type: nauc_precision_at_100_diff1 value: 65.98346655735419 - type: nauc_precision_at_100_max value: 78.81168727653687 - type: nauc_precision_at_100_std value: 27.241465691967708 - type: nauc_precision_at_10_diff1 value: 69.55050319096688 - type: nauc_precision_at_10_max value: 51.827749140893374 - type: nauc_precision_at_10_std value: -34.60818605792837 - type: nauc_precision_at_1_diff1 value: 79.17728405262736 - type: nauc_precision_at_1_max value: 31.880738163589527 - type: nauc_precision_at_1_std value: -30.891888718004584 - type: nauc_precision_at_20_diff1 value: 68.08078305042736 - type: nauc_precision_at_20_max value: 52.83318878288501 - type: nauc_precision_at_20_std value: -35.46070292817927 - type: nauc_precision_at_3_diff1 value: 70.76249609881901 - type: nauc_precision_at_3_max value: 38.86561868624655 - type: nauc_precision_at_3_std value: -39.68917853446992 - type: nauc_precision_at_5_diff1 value: 68.39110629013278 - type: nauc_precision_at_5_max value: 39.28677163904683 - type: nauc_precision_at_5_std value: -39.39101423819562 - type: nauc_recall_at_1000_diff1 value: 74.52277207179175 - type: nauc_recall_at_1000_max value: 94.25510945118776 - type: nauc_recall_at_1000_std value: 91.68741570702382 - type: nauc_recall_at_100_diff1 value: 65.9834665573548 - type: nauc_recall_at_100_max value: 78.81168727653679 - type: nauc_recall_at_100_std value: 27.241465691967598 - type: nauc_recall_at_10_diff1 value: 69.55050319096708 - type: nauc_recall_at_10_max value: 51.82774914089347 - type: nauc_recall_at_10_std value: -34.6081860579283 - type: nauc_recall_at_1_diff1 value: 79.17728405262736 - type: nauc_recall_at_1_max value: 31.880738163589527 - type: nauc_recall_at_1_std value: -30.891888718004584 - type: nauc_recall_at_20_diff1 value: 68.08078305042746 - type: nauc_recall_at_20_max value: 52.833188782885244 - type: nauc_recall_at_20_std value: -35.46070292817895 - type: nauc_recall_at_3_diff1 value: 70.76249609881896 - type: nauc_recall_at_3_max value: 38.865618686246464 - type: nauc_recall_at_3_std value: -39.68917853446999 - type: nauc_recall_at_5_diff1 value: 68.39110629013274 - type: nauc_recall_at_5_max value: 39.28677163904688 - type: nauc_recall_at_5_std value: -39.39101423819562 - type: ndcg_at_1 value: 55.7 - type: ndcg_at_10 value: 70.433 - type: ndcg_at_100 value: 72.975 - type: ndcg_at_1000 value: 73.283 - type: ndcg_at_20 value: 71.58 - type: ndcg_at_3 value: 66.83099999999999 - type: ndcg_at_5 value: 69.085 - type: precision_at_1 value: 55.7 - type: precision_at_10 value: 8.4 - type: precision_at_100 value: 0.959 - type: precision_at_1000 value: 0.098 - type: precision_at_20 value: 4.425 - type: precision_at_3 value: 24.8 - type: precision_at_5 value: 15.98 - type: recall_at_1 value: 55.7 - type: recall_at_10 value: 84.0 - type: recall_at_100 value: 95.89999999999999 - type: recall_at_1000 value: 98.2 - type: recall_at_20 value: 88.5 - type: recall_at_3 value: 74.4 - type: recall_at_5 value: 79.9 - task: type: Classification dataset: name: MTEB Waimai (default) type: C-MTEB/waimai-classification config: default split: test revision: 339287def212450dcaa9df8c22bf93e9980c7023 metrics: - type: accuracy value: 86.58999999999999 - type: ap value: 70.02619249927523 - type: ap_weighted value: 70.02619249927523 - type: f1 value: 84.97572770889423 
- type: f1_weighted value: 86.6865713531272 - type: main_score value: 86.58999999999999 - task: type: Retrieval dataset: name: MTEB XMarket (en) type: jinaai/xmarket_ml config: en split: test revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b metrics: - type: main_score value: 34.772999999999996 - type: map_at_1 value: 7.2620000000000005 - type: map_at_10 value: 17.98 - type: map_at_100 value: 24.828 - type: map_at_1000 value: 26.633000000000003 - type: map_at_20 value: 20.699 - type: map_at_3 value: 12.383 - type: map_at_5 value: 14.871 - type: mrr_at_1 value: 34.718100890207715 - type: mrr_at_10 value: 43.9336827525092 - type: mrr_at_100 value: 44.66474011066837 - type: mrr_at_1000 value: 44.7075592197356 - type: mrr_at_20 value: 44.35984436569346 - type: mrr_at_3 value: 41.73901893981052 - type: mrr_at_5 value: 43.025973550207134 - type: nauc_map_at_1000_diff1 value: 13.899869081196364 - type: nauc_map_at_1000_max value: 46.60452816386231 - type: nauc_map_at_1000_std value: 24.87925799401773 - type: nauc_map_at_100_diff1 value: 16.164805650871084 - type: nauc_map_at_100_max value: 44.720912958558095 - type: nauc_map_at_100_std value: 20.236734536210477 - type: nauc_map_at_10_diff1 value: 23.58580520913581 - type: nauc_map_at_10_max value: 31.276151869914216 - type: nauc_map_at_10_std value: -0.1833326246041355 - type: nauc_map_at_1_diff1 value: 37.02663305598722 - type: nauc_map_at_1_max value: 14.931071531116528 - type: nauc_map_at_1_std value: -12.478790028708453 - type: nauc_map_at_20_diff1 value: 20.718297881540593 - type: nauc_map_at_20_max value: 36.62264094841859 - type: nauc_map_at_20_std value: 6.658514770057742 - type: nauc_map_at_3_diff1 value: 29.379034581120006 - type: nauc_map_at_3_max value: 21.387214269548803 - type: nauc_map_at_3_std value: -9.3404121914247 - type: nauc_map_at_5_diff1 value: 26.627169792839485 - type: nauc_map_at_5_max value: 25.393331109666388 - type: nauc_map_at_5_std value: -6.023485287246353 - type: nauc_mrr_at_1000_diff1 value: 12.047232036652295 - type: nauc_mrr_at_1000_max value: 46.611862580860645 - type: nauc_mrr_at_1000_std value: 27.89146066442305 - type: nauc_mrr_at_100_diff1 value: 12.05261747449997 - type: nauc_mrr_at_100_max value: 46.61328535381203 - type: nauc_mrr_at_100_std value: 27.886145596874535 - type: nauc_mrr_at_10_diff1 value: 12.006935553036941 - type: nauc_mrr_at_10_max value: 46.53351686240496 - type: nauc_mrr_at_10_std value: 27.708742470257462 - type: nauc_mrr_at_1_diff1 value: 13.323408127738782 - type: nauc_mrr_at_1_max value: 43.78884661002012 - type: nauc_mrr_at_1_std value: 25.164417588165673 - type: nauc_mrr_at_20_diff1 value: 12.036022973968011 - type: nauc_mrr_at_20_max value: 46.56537838037131 - type: nauc_mrr_at_20_std value: 27.78189157249635 - type: nauc_mrr_at_3_diff1 value: 11.943896700976381 - type: nauc_mrr_at_3_max value: 46.33644663073225 - type: nauc_mrr_at_3_std value: 27.523915405053845 - type: nauc_mrr_at_5_diff1 value: 12.03108009033769 - type: nauc_mrr_at_5_max value: 46.49103616896692 - type: nauc_mrr_at_5_std value: 27.630879129863366 - type: nauc_ndcg_at_1000_diff1 value: 9.766823796017324 - type: nauc_ndcg_at_1000_max value: 52.85844801910602 - type: nauc_ndcg_at_1000_std value: 36.43271437761207 - type: nauc_ndcg_at_100_diff1 value: 12.035059298282036 - type: nauc_ndcg_at_100_max value: 50.05520240705682 - type: nauc_ndcg_at_100_std value: 29.87678724506636 - type: nauc_ndcg_at_10_diff1 value: 10.281893031139424 - type: nauc_ndcg_at_10_max value: 47.02153679426017 - type: nauc_ndcg_at_10_std 
value: 26.624948330369126 - type: nauc_ndcg_at_1_diff1 value: 13.323408127738782 - type: nauc_ndcg_at_1_max value: 43.78884661002012 - type: nauc_ndcg_at_1_std value: 25.164417588165673 - type: nauc_ndcg_at_20_diff1 value: 11.463524849646598 - type: nauc_ndcg_at_20_max value: 47.415073186019704 - type: nauc_ndcg_at_20_std value: 26.359019620164307 - type: nauc_ndcg_at_3_diff1 value: 9.689199913805394 - type: nauc_ndcg_at_3_max value: 45.68151849572808 - type: nauc_ndcg_at_3_std value: 26.559193219799486 - type: nauc_ndcg_at_5_diff1 value: 9.448823370356575 - type: nauc_ndcg_at_5_max value: 46.19999662690141 - type: nauc_ndcg_at_5_std value: 26.8411706726069 - type: nauc_precision_at_1000_diff1 value: -20.379065598727024 - type: nauc_precision_at_1000_max value: 13.162562437268427 - type: nauc_precision_at_1000_std value: 22.658226157785812 - type: nauc_precision_at_100_diff1 value: -16.458155977309282 - type: nauc_precision_at_100_max value: 35.97956789169889 - type: nauc_precision_at_100_std value: 48.878375009979194 - type: nauc_precision_at_10_diff1 value: -7.810992317607771 - type: nauc_precision_at_10_max value: 49.307339277444754 - type: nauc_precision_at_10_std value: 42.82533951854582 - type: nauc_precision_at_1_diff1 value: 13.323408127738782 - type: nauc_precision_at_1_max value: 43.78884661002012 - type: nauc_precision_at_1_std value: 25.164417588165673 - type: nauc_precision_at_20_diff1 value: -11.43933465149542 - type: nauc_precision_at_20_max value: 46.93722753460038 - type: nauc_precision_at_20_std value: 47.36223769029678 - type: nauc_precision_at_3_diff1 value: 1.3230178593599737 - type: nauc_precision_at_3_max value: 48.49039534395576 - type: nauc_precision_at_3_std value: 33.161384183129194 - type: nauc_precision_at_5_diff1 value: -3.185516457926519 - type: nauc_precision_at_5_max value: 49.5814309394308 - type: nauc_precision_at_5_std value: 37.57637865900281 - type: nauc_recall_at_1000_diff1 value: 7.839499443984168 - type: nauc_recall_at_1000_max value: 52.67165467640894 - type: nauc_recall_at_1000_std value: 48.85318316702583 - type: nauc_recall_at_100_diff1 value: 14.117557049589418 - type: nauc_recall_at_100_max value: 40.59046301348715 - type: nauc_recall_at_100_std value: 24.379680901739505 - type: nauc_recall_at_10_diff1 value: 20.04536052614054 - type: nauc_recall_at_10_max value: 25.54148839721574 - type: nauc_recall_at_10_std value: -1.938182527562211 - type: nauc_recall_at_1_diff1 value: 37.02663305598722 - type: nauc_recall_at_1_max value: 14.931071531116528 - type: nauc_recall_at_1_std value: -12.478790028708453 - type: nauc_recall_at_20_diff1 value: 17.959977483235566 - type: nauc_recall_at_20_max value: 29.88502687870809 - type: nauc_recall_at_20_std value: 4.26527395196852 - type: nauc_recall_at_3_diff1 value: 26.297810954500456 - type: nauc_recall_at_3_max value: 18.819406079307402 - type: nauc_recall_at_3_std value: -10.002237229729081 - type: nauc_recall_at_5_diff1 value: 22.739080899568485 - type: nauc_recall_at_5_max value: 21.0322968243985 - type: nauc_recall_at_5_std value: -6.927749435306422 - type: ndcg_at_1 value: 34.717999999999996 - type: ndcg_at_10 value: 34.772999999999996 - type: ndcg_at_100 value: 39.407 - type: ndcg_at_1000 value: 44.830999999999996 - type: ndcg_at_20 value: 35.667 - type: ndcg_at_3 value: 34.332 - type: ndcg_at_5 value: 34.408 - type: precision_at_1 value: 34.717999999999996 - type: precision_at_10 value: 23.430999999999997 - type: precision_at_100 value: 9.31 - type: precision_at_1000 value: 2.259 - type: 
precision_at_20 value: 18.826999999999998 - type: precision_at_3 value: 30.553 - type: precision_at_5 value: 27.792 - type: recall_at_1 value: 7.2620000000000005 - type: recall_at_10 value: 26.384 - type: recall_at_100 value: 52.506 - type: recall_at_1000 value: 73.38 - type: recall_at_20 value: 34.032000000000004 - type: recall_at_3 value: 14.821000000000002 - type: recall_at_5 value: 19.481 - task: type: Retrieval dataset: name: MTEB XMarket (de) type: jinaai/xmarket_ml config: de split: test revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b metrics: - type: main_score value: 28.316000000000003 - type: map_at_1 value: 8.667 - type: map_at_10 value: 17.351 - type: map_at_100 value: 21.02 - type: map_at_1000 value: 21.951 - type: map_at_20 value: 18.994 - type: map_at_3 value: 13.23 - type: map_at_5 value: 15.17 - type: mrr_at_1 value: 27.27272727272727 - type: mrr_at_10 value: 36.10858487561485 - type: mrr_at_100 value: 36.92033814316568 - type: mrr_at_1000 value: 36.972226653870365 - type: mrr_at_20 value: 36.58914906427944 - type: mrr_at_3 value: 33.642969201552305 - type: mrr_at_5 value: 35.13417554289494 - type: nauc_map_at_1000_diff1 value: 23.345116790998063 - type: nauc_map_at_1000_max value: 44.447240670835725 - type: nauc_map_at_1000_std value: 18.34636500680144 - type: nauc_map_at_100_diff1 value: 24.458120909292347 - type: nauc_map_at_100_max value: 43.31851431140378 - type: nauc_map_at_100_std value: 15.654778355549965 - type: nauc_map_at_10_diff1 value: 29.376508937265044 - type: nauc_map_at_10_max value: 36.650196725140795 - type: nauc_map_at_10_std value: 4.682465435374843 - type: nauc_map_at_1_diff1 value: 40.382365672683214 - type: nauc_map_at_1_max value: 22.894341150096785 - type: nauc_map_at_1_std value: -5.610725673968323 - type: nauc_map_at_20_diff1 value: 27.197033425732908 - type: nauc_map_at_20_max value: 39.71672400647207 - type: nauc_map_at_20_std value: 8.944436813309933 - type: nauc_map_at_3_diff1 value: 34.49739294661502 - type: nauc_map_at_3_max value: 29.006972420735284 - type: nauc_map_at_3_std value: -3.0372650571243986 - type: nauc_map_at_5_diff1 value: 32.764901537277105 - type: nauc_map_at_5_max value: 32.658533295918154 - type: nauc_map_at_5_std value: 0.029626452286996906 - type: nauc_mrr_at_1000_diff1 value: 19.521229956280603 - type: nauc_mrr_at_1000_max value: 44.39409866211472 - type: nauc_mrr_at_1000_std value: 23.580697307036058 - type: nauc_mrr_at_100_diff1 value: 19.51312676591073 - type: nauc_mrr_at_100_max value: 44.39559153963895 - type: nauc_mrr_at_100_std value: 23.57913711397437 - type: nauc_mrr_at_10_diff1 value: 19.584635617935145 - type: nauc_mrr_at_10_max value: 44.44842226236198 - type: nauc_mrr_at_10_std value: 23.382684909390434 - type: nauc_mrr_at_1_diff1 value: 20.92594790923806 - type: nauc_mrr_at_1_max value: 40.593939625252816 - type: nauc_mrr_at_1_std value: 20.37467598073644 - type: nauc_mrr_at_20_diff1 value: 19.590641822115725 - type: nauc_mrr_at_20_max value: 44.42512299604718 - type: nauc_mrr_at_20_std value: 23.45564260800024 - type: nauc_mrr_at_3_diff1 value: 20.005307129527232 - type: nauc_mrr_at_3_max value: 43.68300366192776 - type: nauc_mrr_at_3_std value: 22.297190480842005 - type: nauc_mrr_at_5_diff1 value: 19.852896386271716 - type: nauc_mrr_at_5_max value: 44.20641808920062 - type: nauc_mrr_at_5_std value: 22.966517330852895 - type: nauc_ndcg_at_1000_diff1 value: 17.800116251376103 - type: nauc_ndcg_at_1000_max value: 50.98332718061365 - type: nauc_ndcg_at_1000_std value: 31.464484658102577 - type: 
nauc_ndcg_at_100_diff1 value: 19.555159680541088 - type: nauc_ndcg_at_100_max value: 48.56377130899141 - type: nauc_ndcg_at_100_std value: 25.77572748714817 - type: nauc_ndcg_at_10_diff1 value: 20.003008726679415 - type: nauc_ndcg_at_10_max value: 45.1293725480628 - type: nauc_ndcg_at_10_std value: 21.149213260765872 - type: nauc_ndcg_at_1_diff1 value: 21.00986278773023 - type: nauc_ndcg_at_1_max value: 40.524637076774894 - type: nauc_ndcg_at_1_std value: 20.29682194006685 - type: nauc_ndcg_at_20_diff1 value: 20.659734137312284 - type: nauc_ndcg_at_20_max value: 45.73108736599869 - type: nauc_ndcg_at_20_std value: 21.200736170346133 - type: nauc_ndcg_at_3_diff1 value: 19.200120542882544 - type: nauc_ndcg_at_3_max value: 42.89772612963168 - type: nauc_ndcg_at_3_std value: 20.713292754978983 - type: nauc_ndcg_at_5_diff1 value: 19.96329647992544 - type: nauc_ndcg_at_5_max value: 44.296627037787324 - type: nauc_ndcg_at_5_std value: 21.200135784971973 - type: nauc_precision_at_1000_diff1 value: -11.543221249009427 - type: nauc_precision_at_1000_max value: 9.132801614448221 - type: nauc_precision_at_1000_std value: 21.203720655381055 - type: nauc_precision_at_100_diff1 value: -12.510945425786039 - type: nauc_precision_at_100_max value: 31.42530963666252 - type: nauc_precision_at_100_std value: 44.99672783467617 - type: nauc_precision_at_10_diff1 value: -4.025802651746804 - type: nauc_precision_at_10_max value: 47.50967924227793 - type: nauc_precision_at_10_std value: 41.1558559268985 - type: nauc_precision_at_1_diff1 value: 21.00986278773023 - type: nauc_precision_at_1_max value: 40.524637076774894 - type: nauc_precision_at_1_std value: 20.29682194006685 - type: nauc_precision_at_20_diff1 value: -8.059482951110002 - type: nauc_precision_at_20_max value: 44.28832115946278 - type: nauc_precision_at_20_std value: 45.2005585353651 - type: nauc_precision_at_3_diff1 value: 8.53530005716248 - type: nauc_precision_at_3_max value: 46.48353678905102 - type: nauc_precision_at_3_std value: 28.868791323881972 - type: nauc_precision_at_5_diff1 value: 3.093619954821814 - type: nauc_precision_at_5_max value: 48.43294475817019 - type: nauc_precision_at_5_std value: 34.83430452745434 - type: nauc_recall_at_1000_diff1 value: 9.93680206699751 - type: nauc_recall_at_1000_max value: 52.97840222394363 - type: nauc_recall_at_1000_std value: 46.370023604436255 - type: nauc_recall_at_100_diff1 value: 14.100542445524972 - type: nauc_recall_at_100_max value: 42.853775131475224 - type: nauc_recall_at_100_std value: 26.93029971231028 - type: nauc_recall_at_10_diff1 value: 22.774547475714716 - type: nauc_recall_at_10_max value: 33.984586405015044 - type: nauc_recall_at_10_std value: 5.332325172373655 - type: nauc_recall_at_1_diff1 value: 40.382365672683214 - type: nauc_recall_at_1_max value: 22.894341150096785 - type: nauc_recall_at_1_std value: -5.610725673968323 - type: nauc_recall_at_20_diff1 value: 19.751060483835936 - type: nauc_recall_at_20_max value: 36.18774034635102 - type: nauc_recall_at_20_std value: 10.362242090308577 - type: nauc_recall_at_3_diff1 value: 30.29462372902671 - type: nauc_recall_at_3_max value: 27.377175450099635 - type: nauc_recall_at_3_std value: -3.015752705993425 - type: nauc_recall_at_5_diff1 value: 28.096893312615723 - type: nauc_recall_at_5_max value: 30.485075571512425 - type: nauc_recall_at_5_std value: 0.09106417003502826 - type: ndcg_at_1 value: 27.248 - type: ndcg_at_10 value: 28.316000000000003 - type: ndcg_at_100 value: 33.419 - type: ndcg_at_1000 value: 38.134 - type: ndcg_at_20 value: 
29.707 - type: ndcg_at_3 value: 26.93 - type: ndcg_at_5 value: 27.363 - type: precision_at_1 value: 27.248 - type: precision_at_10 value: 15.073 - type: precision_at_100 value: 5.061 - type: precision_at_1000 value: 1.325 - type: precision_at_20 value: 11.407 - type: precision_at_3 value: 21.823 - type: precision_at_5 value: 18.984 - type: recall_at_1 value: 8.667 - type: recall_at_10 value: 26.984 - type: recall_at_100 value: 49.753 - type: recall_at_1000 value: 70.354 - type: recall_at_20 value: 33.955999999999996 - type: recall_at_3 value: 16.086 - type: recall_at_5 value: 20.544999999999998 - task: type: Retrieval dataset: name: MTEB XMarket (es) type: jinaai/xmarket_ml config: es split: test revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b metrics: - type: main_score value: 26.592 - type: map_at_1 value: 8.081000000000001 - type: map_at_10 value: 16.486 - type: map_at_100 value: 19.996 - type: map_at_1000 value: 20.889 - type: map_at_20 value: 18.088 - type: map_at_3 value: 12.864 - type: map_at_5 value: 14.515 - type: mrr_at_1 value: 24.643356643356643 - type: mrr_at_10 value: 33.755599955599926 - type: mrr_at_100 value: 34.55914769326114 - type: mrr_at_1000 value: 34.614384237219745 - type: mrr_at_20 value: 34.228909650276194 - type: mrr_at_3 value: 31.445221445221456 - type: mrr_at_5 value: 32.71375291375297 - type: nauc_map_at_1000_diff1 value: 19.17751654240679 - type: nauc_map_at_1000_max value: 43.493743561136434 - type: nauc_map_at_1000_std value: 21.14477911550252 - type: nauc_map_at_100_diff1 value: 20.259227234415395 - type: nauc_map_at_100_max value: 42.510860292169106 - type: nauc_map_at_100_std value: 18.63085160442346 - type: nauc_map_at_10_diff1 value: 24.12419385640694 - type: nauc_map_at_10_max value: 35.99892932069915 - type: nauc_map_at_10_std value: 8.488520124325058 - type: nauc_map_at_1_diff1 value: 35.09239143996649 - type: nauc_map_at_1_max value: 23.72498533914286 - type: nauc_map_at_1_std value: -4.164387883546102 - type: nauc_map_at_20_diff1 value: 22.411418237320817 - type: nauc_map_at_20_max value: 39.12496266094892 - type: nauc_map_at_20_std value: 12.371656353894227 - type: nauc_map_at_3_diff1 value: 28.106972376813506 - type: nauc_map_at_3_max value: 29.57824316865409 - type: nauc_map_at_3_std value: 1.8928791254813127 - type: nauc_map_at_5_diff1 value: 26.4958239149419 - type: nauc_map_at_5_max value: 32.45906016649239 - type: nauc_map_at_5_std value: 4.612735963224018 - type: nauc_mrr_at_1000_diff1 value: 17.614812607094446 - type: nauc_mrr_at_1000_max value: 41.13031556228715 - type: nauc_mrr_at_1000_std value: 22.564112871230318 - type: nauc_mrr_at_100_diff1 value: 17.614044568011085 - type: nauc_mrr_at_100_max value: 41.129436273086796 - type: nauc_mrr_at_100_std value: 22.566763500658766 - type: nauc_mrr_at_10_diff1 value: 17.61869494452089 - type: nauc_mrr_at_10_max value: 41.091542329381426 - type: nauc_mrr_at_10_std value: 22.370473458633594 - type: nauc_mrr_at_1_diff1 value: 20.321421442201913 - type: nauc_mrr_at_1_max value: 38.36531448180009 - type: nauc_mrr_at_1_std value: 18.422203207777688 - type: nauc_mrr_at_20_diff1 value: 17.614767736091625 - type: nauc_mrr_at_20_max value: 41.11221420736687 - type: nauc_mrr_at_20_std value: 22.44271891522012 - type: nauc_mrr_at_3_diff1 value: 17.98184651584625 - type: nauc_mrr_at_3_max value: 40.424293610470144 - type: nauc_mrr_at_3_std value: 21.554750947206706 - type: nauc_mrr_at_5_diff1 value: 17.72088314927416 - type: nauc_mrr_at_5_max value: 40.662724739072694 - type: nauc_mrr_at_5_std value: 
MTEB XPQARetrieval (`jinaai/xpqa`, `test` split, revision `c99d599f0a6ab9b85b065da6f9d94f9cf731679f`), main retrieval score (nDCG@10) per language-pair configuration:

| Config | nDCG@10 |
|---------|---------|
| deu-deu | 84.552 |
| deu-eng | 82.149 |
| eng-deu | 60.428 |
| eng-pol | 34.042 |
| eng-cmn | 37.504 |
| eng-spa | 53.359 |
| eng-fra | 54.888 |
| pol-eng | 50.831 |
| pol-pol | 53.704 |
| cmn-eng | 64.64 |
value: -29.042962019692446 - type: nauc_ndcg_at_20_diff1 value: 45.966784725457046 - type: nauc_ndcg_at_20_max value: 21.166632858613145 - type: nauc_ndcg_at_20_std value: -35.65112890375392 - type: nauc_ndcg_at_3_diff1 value: 46.7404863978999 - type: nauc_ndcg_at_3_max value: 22.701743709129456 - type: nauc_ndcg_at_3_std value: -30.907633466983192 - type: nauc_ndcg_at_5_diff1 value: 45.86487199083486 - type: nauc_ndcg_at_5_max value: 22.088804840002513 - type: nauc_ndcg_at_5_std value: -32.3853481632832 - type: nauc_precision_at_1000_diff1 value: -25.69710612774455 - type: nauc_precision_at_1000_max value: 1.3964400247388091 - type: nauc_precision_at_1000_std value: -8.873947511634814 - type: nauc_precision_at_100_diff1 value: -24.013497191077978 - type: nauc_precision_at_100_max value: 2.0197725715909343 - type: nauc_precision_at_100_std value: -11.387423148770633 - type: nauc_precision_at_10_diff1 value: -6.47728645242781 - type: nauc_precision_at_10_max value: 6.815261443768304 - type: nauc_precision_at_10_std value: -26.825062292855943 - type: nauc_precision_at_1_diff1 value: 51.954289992321044 - type: nauc_precision_at_1_max value: 26.336255074856886 - type: nauc_precision_at_1_std value: -29.042962019692446 - type: nauc_precision_at_20_diff1 value: -12.355232044747511 - type: nauc_precision_at_20_max value: 4.022126850949725 - type: nauc_precision_at_20_std value: -23.688935769326772 - type: nauc_precision_at_3_diff1 value: 7.662671665835864 - type: nauc_precision_at_3_max value: 14.372394760986248 - type: nauc_precision_at_3_std value: -28.635125665532453 - type: nauc_precision_at_5_diff1 value: -1.4592476425511611 - type: nauc_precision_at_5_max value: 11.124310161474174 - type: nauc_precision_at_5_std value: -27.89526669318053 - type: nauc_recall_at_1000_diff1 value: -19.58450046684932 - type: nauc_recall_at_1000_max value: 70.71661998133165 - type: nauc_recall_at_1000_std value: 93.05555555556315 - type: nauc_recall_at_100_diff1 value: 15.06356457571853 - type: nauc_recall_at_100_max value: 14.051414749344806 - type: nauc_recall_at_100_std value: -29.461874235153008 - type: nauc_recall_at_10_diff1 value: 41.29842726117901 - type: nauc_recall_at_10_max value: 15.768699673830898 - type: nauc_recall_at_10_std value: -42.11585661287712 - type: nauc_recall_at_1_diff1 value: 56.34671160956164 - type: nauc_recall_at_1_max value: 17.6796949796236 - type: nauc_recall_at_1_std value: -13.741140688066045 - type: nauc_recall_at_20_diff1 value: 38.8078283585263 - type: nauc_recall_at_20_max value: 12.06816084005326 - type: nauc_recall_at_20_std value: -48.20956170056591 - type: nauc_recall_at_3_diff1 value: 44.71028758038993 - type: nauc_recall_at_3_max value: 19.1059093689162 - type: nauc_recall_at_3_std value: -26.795164453784253 - type: nauc_recall_at_5_diff1 value: 41.06320797773054 - type: nauc_recall_at_5_max value: 19.117028272530998 - type: nauc_recall_at_5_std value: -33.985747504612156 - type: ndcg_at_1 value: 56.95099999999999 - type: ndcg_at_10 value: 64.64 - type: ndcg_at_100 value: 70.017 - type: ndcg_at_1000 value: 70.662 - type: ndcg_at_20 value: 67.256 - type: ndcg_at_3 value: 58.269000000000005 - type: ndcg_at_5 value: 60.94199999999999 - type: precision_at_1 value: 56.95099999999999 - type: precision_at_10 value: 15.671 - type: precision_at_100 value: 2.002 - type: precision_at_1000 value: 0.208 - type: precision_at_20 value: 8.689 - type: precision_at_3 value: 36.341 - type: precision_at_5 value: 26.854 - type: recall_at_1 value: 35.858000000000004 - type: recall_at_10 
value: 75.02 - type: recall_at_100 value: 95.76 - type: recall_at_1000 value: 99.837 - type: recall_at_20 value: 83.732 - type: recall_at_3 value: 57.093 - type: recall_at_5 value: 66.193 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (cmn-cmn) type: jinaai/xpqa config: cmn-cmn split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 69.446 - type: map_at_1 value: 39.995999999999995 - type: map_at_10 value: 64.033 - type: map_at_100 value: 65.51599999999999 - type: map_at_1000 value: 65.545 - type: map_at_20 value: 64.958 - type: map_at_3 value: 57.767 - type: map_at_5 value: 61.998 - type: mrr_at_1 value: 63.3495145631068 - type: mrr_at_10 value: 70.21146363075978 - type: mrr_at_100 value: 70.82810974202124 - type: mrr_at_1000 value: 70.83816803303915 - type: mrr_at_20 value: 70.60140248428802 - type: mrr_at_3 value: 68.66909385113267 - type: mrr_at_5 value: 69.56108414239482 - type: nauc_map_at_1000_diff1 value: 51.649897072831465 - type: nauc_map_at_1000_max value: 38.25222728655331 - type: nauc_map_at_1000_std value: -39.10327919949334 - type: nauc_map_at_100_diff1 value: 51.644205886401465 - type: nauc_map_at_100_max value: 38.23611154355255 - type: nauc_map_at_100_std value: -39.1677073977285 - type: nauc_map_at_10_diff1 value: 51.81444145636039 - type: nauc_map_at_10_max value: 38.03382104326485 - type: nauc_map_at_10_std value: -38.999395639812015 - type: nauc_map_at_1_diff1 value: 59.785298201044704 - type: nauc_map_at_1_max value: 23.273537759937785 - type: nauc_map_at_1_std value: -17.838712689290194 - type: nauc_map_at_20_diff1 value: 51.680208795601004 - type: nauc_map_at_20_max value: 38.23334583518634 - type: nauc_map_at_20_std value: -39.24344495939061 - type: nauc_map_at_3_diff1 value: 52.180913298194056 - type: nauc_map_at_3_max value: 33.45482478000481 - type: nauc_map_at_3_std value: -31.682911030586297 - type: nauc_map_at_5_diff1 value: 50.804900676175436 - type: nauc_map_at_5_max value: 37.68924816012326 - type: nauc_map_at_5_std value: -36.85016896616712 - type: nauc_mrr_at_1000_diff1 value: 56.371477471577535 - type: nauc_mrr_at_1000_max value: 42.773877962050086 - type: nauc_mrr_at_1000_std value: -40.41765081873682 - type: nauc_mrr_at_100_diff1 value: 56.3619751528192 - type: nauc_mrr_at_100_max value: 42.76298794859916 - type: nauc_mrr_at_100_std value: -40.44070582448831 - type: nauc_mrr_at_10_diff1 value: 56.33810523477712 - type: nauc_mrr_at_10_max value: 42.76591937795783 - type: nauc_mrr_at_10_std value: -40.69339583030244 - type: nauc_mrr_at_1_diff1 value: 58.90399906884378 - type: nauc_mrr_at_1_max value: 43.38806571165292 - type: nauc_mrr_at_1_std value: -38.224015285584 - type: nauc_mrr_at_20_diff1 value: 56.32629070537032 - type: nauc_mrr_at_20_max value: 42.79615263472604 - type: nauc_mrr_at_20_std value: -40.496777397603076 - type: nauc_mrr_at_3_diff1 value: 55.96989454480743 - type: nauc_mrr_at_3_max value: 42.49832220744744 - type: nauc_mrr_at_3_std value: -39.883799467132384 - type: nauc_mrr_at_5_diff1 value: 56.003080766475755 - type: nauc_mrr_at_5_max value: 42.73308051011805 - type: nauc_mrr_at_5_std value: -39.87179511166683 - type: nauc_ndcg_at_1000_diff1 value: 52.49054229225255 - type: nauc_ndcg_at_1000_max value: 39.61644750719859 - type: nauc_ndcg_at_1000_std value: -40.89845763194674 - type: nauc_ndcg_at_100_diff1 value: 52.33511250864434 - type: nauc_ndcg_at_100_max value: 39.25530146124452 - type: nauc_ndcg_at_100_std value: -41.92444498004374 - type: nauc_ndcg_at_10_diff1 value: 
52.62031505931842 - type: nauc_ndcg_at_10_max value: 38.667195545396766 - type: nauc_ndcg_at_10_std value: -42.59503924641507 - type: nauc_ndcg_at_1_diff1 value: 58.90399906884378 - type: nauc_ndcg_at_1_max value: 43.38806571165292 - type: nauc_ndcg_at_1_std value: -38.224015285584 - type: nauc_ndcg_at_20_diff1 value: 52.15061629809436 - type: nauc_ndcg_at_20_max value: 39.09332400054708 - type: nauc_ndcg_at_20_std value: -42.80018671618001 - type: nauc_ndcg_at_3_diff1 value: 51.04210728138207 - type: nauc_ndcg_at_3_max value: 38.19034802567046 - type: nauc_ndcg_at_3_std value: -38.179821090765216 - type: nauc_ndcg_at_5_diff1 value: 51.04399574045204 - type: nauc_ndcg_at_5_max value: 38.42492210204548 - type: nauc_ndcg_at_5_std value: -38.868073241617715 - type: nauc_precision_at_1000_diff1 value: -25.151369907213734 - type: nauc_precision_at_1000_max value: 9.012549147054989 - type: nauc_precision_at_1000_std value: -9.319786589947698 - type: nauc_precision_at_100_diff1 value: -23.20945211843088 - type: nauc_precision_at_100_max value: 9.860701593969862 - type: nauc_precision_at_100_std value: -13.073877818347231 - type: nauc_precision_at_10_diff1 value: -6.970781124246847 - type: nauc_precision_at_10_max value: 19.392675322254487 - type: nauc_precision_at_10_std value: -26.74943490717657 - type: nauc_precision_at_1_diff1 value: 58.90399906884378 - type: nauc_precision_at_1_max value: 43.38806571165292 - type: nauc_precision_at_1_std value: -38.224015285584 - type: nauc_precision_at_20_diff1 value: -13.046456108081102 - type: nauc_precision_at_20_max value: 15.69439950383875 - type: nauc_precision_at_20_std value: -23.836004512018093 - type: nauc_precision_at_3_diff1 value: 3.5444232965528846 - type: nauc_precision_at_3_max value: 27.08858445453865 - type: nauc_precision_at_3_std value: -29.12757283665593 - type: nauc_precision_at_5_diff1 value: -3.6853986353320267 - type: nauc_precision_at_5_max value: 24.32059689571271 - type: nauc_precision_at_5_std value: -27.46188072134163 - type: nauc_recall_at_1000_diff1 value: 86.93515141907919 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 100.0 - type: nauc_recall_at_100_diff1 value: 39.7052887613879 - type: nauc_recall_at_100_max value: 18.40943977796887 - type: nauc_recall_at_100_std value: -88.74014854144974 - type: nauc_recall_at_10_diff1 value: 48.85342500870892 - type: nauc_recall_at_10_max value: 32.69617204234419 - type: nauc_recall_at_10_std value: -51.9937231860804 - type: nauc_recall_at_1_diff1 value: 59.785298201044704 - type: nauc_recall_at_1_max value: 23.273537759937785 - type: nauc_recall_at_1_std value: -17.838712689290194 - type: nauc_recall_at_20_diff1 value: 45.40839773314378 - type: nauc_recall_at_20_max value: 33.02458321493215 - type: nauc_recall_at_20_std value: -55.97800739448166 - type: nauc_recall_at_3_diff1 value: 47.05565693416531 - type: nauc_recall_at_3_max value: 28.743850400344297 - type: nauc_recall_at_3_std value: -32.436470486397475 - type: nauc_recall_at_5_diff1 value: 45.30223758669577 - type: nauc_recall_at_5_max value: 33.6567274747059 - type: nauc_recall_at_5_std value: -39.946712017948514 - type: ndcg_at_1 value: 63.349999999999994 - type: ndcg_at_10 value: 69.446 - type: ndcg_at_100 value: 74.439 - type: ndcg_at_1000 value: 74.834 - type: ndcg_at_20 value: 71.763 - type: ndcg_at_3 value: 64.752 - type: ndcg_at_5 value: 66.316 - type: precision_at_1 value: 63.349999999999994 - type: precision_at_10 value: 16.286 - type: precision_at_100 value: 2.024 - type: 
precision_at_1000 value: 0.207 - type: precision_at_20 value: 8.908000000000001 - type: precision_at_3 value: 40.655 - type: precision_at_5 value: 28.859 - type: recall_at_1 value: 39.995999999999995 - type: recall_at_10 value: 78.107 - type: recall_at_100 value: 97.538 - type: recall_at_1000 value: 99.96000000000001 - type: recall_at_20 value: 85.72 - type: recall_at_3 value: 63.291 - type: recall_at_5 value: 70.625 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (spa-eng) type: jinaai/xpqa config: spa-eng split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 68.258 - type: map_at_1 value: 33.06 - type: map_at_10 value: 61.590999999999994 - type: map_at_100 value: 63.341 - type: map_at_1000 value: 63.385999999999996 - type: map_at_20 value: 62.77700000000001 - type: map_at_3 value: 52.547999999999995 - type: map_at_5 value: 58.824 - type: mrr_at_1 value: 63.80832282471627 - type: mrr_at_10 value: 70.76848015372607 - type: mrr_at_100 value: 71.33996704518061 - type: mrr_at_1000 value: 71.35368444388072 - type: mrr_at_20 value: 71.18191741103522 - type: mrr_at_3 value: 68.83144178226142 - type: mrr_at_5 value: 69.88440521227405 - type: nauc_map_at_1000_diff1 value: 41.59255746310511 - type: nauc_map_at_1000_max value: 42.064075373358065 - type: nauc_map_at_1000_std value: -25.130730194381723 - type: nauc_map_at_100_diff1 value: 41.56447648820406 - type: nauc_map_at_100_max value: 42.06711634651607 - type: nauc_map_at_100_std value: -25.14871585556968 - type: nauc_map_at_10_diff1 value: 41.28968387107058 - type: nauc_map_at_10_max value: 41.511538272139774 - type: nauc_map_at_10_std value: -25.99906440164276 - type: nauc_map_at_1_diff1 value: 51.09859596320021 - type: nauc_map_at_1_max value: 12.406789321338222 - type: nauc_map_at_1_std value: -18.227486548655076 - type: nauc_map_at_20_diff1 value: 41.39469672947315 - type: nauc_map_at_20_max value: 41.98309315808902 - type: nauc_map_at_20_std value: -25.44704720985219 - type: nauc_map_at_3_diff1 value: 43.16164995512842 - type: nauc_map_at_3_max value: 30.935400935562818 - type: nauc_map_at_3_std value: -23.53095555148866 - type: nauc_map_at_5_diff1 value: 41.23474352142375 - type: nauc_map_at_5_max value: 39.03088859147947 - type: nauc_map_at_5_std value: -26.046526443708366 - type: nauc_mrr_at_1000_diff1 value: 51.79649678213789 - type: nauc_mrr_at_1000_max value: 50.50340748045259 - type: nauc_mrr_at_1000_std value: -24.777183703493407 - type: nauc_mrr_at_100_diff1 value: 51.78609028166551 - type: nauc_mrr_at_100_max value: 50.51732896833555 - type: nauc_mrr_at_100_std value: -24.760054686874717 - type: nauc_mrr_at_10_diff1 value: 51.705268395036995 - type: nauc_mrr_at_10_max value: 50.35818415293149 - type: nauc_mrr_at_10_std value: -25.170367120250404 - type: nauc_mrr_at_1_diff1 value: 53.91475115581825 - type: nauc_mrr_at_1_max value: 49.122529616282016 - type: nauc_mrr_at_1_std value: -22.377647552937155 - type: nauc_mrr_at_20_diff1 value: 51.778984221197774 - type: nauc_mrr_at_20_max value: 50.5070957827813 - type: nauc_mrr_at_20_std value: -24.908935023607285 - type: nauc_mrr_at_3_diff1 value: 51.82683773090423 - type: nauc_mrr_at_3_max value: 50.77993196421369 - type: nauc_mrr_at_3_std value: -24.3925832021831 - type: nauc_mrr_at_5_diff1 value: 51.722232683543034 - type: nauc_mrr_at_5_max value: 50.334865493961864 - type: nauc_mrr_at_5_std value: -25.513593495703297 - type: nauc_ndcg_at_1000_diff1 value: 44.21851582991263 - type: nauc_ndcg_at_1000_max value: 45.73539068637836 - 
type: nauc_ndcg_at_1000_std value: -24.716522467580397 - type: nauc_ndcg_at_100_diff1 value: 43.8002401615357 - type: nauc_ndcg_at_100_max value: 45.801409410061915 - type: nauc_ndcg_at_100_std value: -24.73171742499903 - type: nauc_ndcg_at_10_diff1 value: 42.540922778755885 - type: nauc_ndcg_at_10_max value: 44.348836943874595 - type: nauc_ndcg_at_10_std value: -28.05403666494785 - type: nauc_ndcg_at_1_diff1 value: 53.91475115581825 - type: nauc_ndcg_at_1_max value: 49.122529616282016 - type: nauc_ndcg_at_1_std value: -22.377647552937155 - type: nauc_ndcg_at_20_diff1 value: 43.10347921163421 - type: nauc_ndcg_at_20_max value: 45.53253270265022 - type: nauc_ndcg_at_20_std value: -26.63902791862846 - type: nauc_ndcg_at_3_diff1 value: 42.41720274782384 - type: nauc_ndcg_at_3_max value: 42.91778219334943 - type: nauc_ndcg_at_3_std value: -24.793252033594076 - type: nauc_ndcg_at_5_diff1 value: 42.51515034945093 - type: nauc_ndcg_at_5_max value: 41.62080576508792 - type: nauc_ndcg_at_5_std value: -28.209669314955065 - type: nauc_precision_at_1000_diff1 value: -14.89794075433148 - type: nauc_precision_at_1000_max value: 27.85387929356412 - type: nauc_precision_at_1000_std value: 10.728618597190849 - type: nauc_precision_at_100_diff1 value: -13.075270046295856 - type: nauc_precision_at_100_max value: 29.77208946756632 - type: nauc_precision_at_100_std value: 8.491662697326039 - type: nauc_precision_at_10_diff1 value: -4.0826025188781205 - type: nauc_precision_at_10_max value: 39.04278085180075 - type: nauc_precision_at_10_std value: -5.925408651372333 - type: nauc_precision_at_1_diff1 value: 53.91475115581825 - type: nauc_precision_at_1_max value: 49.122529616282016 - type: nauc_precision_at_1_std value: -22.377647552937155 - type: nauc_precision_at_20_diff1 value: -7.93186440645135 - type: nauc_precision_at_20_max value: 35.81281308891365 - type: nauc_precision_at_20_std value: 0.1241277857515697 - type: nauc_precision_at_3_diff1 value: 7.563562511484409 - type: nauc_precision_at_3_max value: 43.43738862378524 - type: nauc_precision_at_3_std value: -11.958059731912615 - type: nauc_precision_at_5_diff1 value: -0.1801152449011624 - type: nauc_precision_at_5_max value: 41.32486715619513 - type: nauc_precision_at_5_std value: -10.088699021919552 - type: nauc_recall_at_1000_diff1 value: 86.93359696819986 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 72.21843645604022 - type: nauc_recall_at_100_diff1 value: 29.86050842714198 - type: nauc_recall_at_100_max value: 48.106658251136245 - type: nauc_recall_at_100_std value: -14.981886214880035 - type: nauc_recall_at_10_diff1 value: 33.67119240737528 - type: nauc_recall_at_10_max value: 39.271984859561414 - type: nauc_recall_at_10_std value: -35.6434883839217 - type: nauc_recall_at_1_diff1 value: 51.09859596320021 - type: nauc_recall_at_1_max value: 12.406789321338222 - type: nauc_recall_at_1_std value: -18.227486548655076 - type: nauc_recall_at_20_diff1 value: 33.211979983240724 - type: nauc_recall_at_20_max value: 43.47676074743184 - type: nauc_recall_at_20_std value: -33.88107138395349 - type: nauc_recall_at_3_diff1 value: 39.22513750146998 - type: nauc_recall_at_3_max value: 27.066674083840166 - type: nauc_recall_at_3_std value: -26.963282529629893 - type: nauc_recall_at_5_diff1 value: 36.53718917129459 - type: nauc_recall_at_5_max value: 35.40550013169686 - type: nauc_recall_at_5_std value: -34.209159379410806 - type: ndcg_at_1 value: 63.808 - type: ndcg_at_10 value: 68.258 - type: ndcg_at_100 value: 
73.38799999999999 - type: ndcg_at_1000 value: 74.03 - type: ndcg_at_20 value: 70.968 - type: ndcg_at_3 value: 62.33 - type: ndcg_at_5 value: 64.096 - type: precision_at_1 value: 63.808 - type: precision_at_10 value: 19.243 - type: precision_at_100 value: 2.367 - type: precision_at_1000 value: 0.245 - type: precision_at_20 value: 10.599 - type: precision_at_3 value: 44.515 - type: precision_at_5 value: 33.467999999999996 - type: recall_at_1 value: 33.06 - type: recall_at_10 value: 77.423 - type: recall_at_100 value: 95.923 - type: recall_at_1000 value: 99.874 - type: recall_at_20 value: 85.782 - type: recall_at_3 value: 57.098000000000006 - type: recall_at_5 value: 67.472 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (spa-spa) type: jinaai/xpqa config: spa-spa split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 72.004 - type: map_at_1 value: 36.248000000000005 - type: map_at_10 value: 65.679 - type: map_at_100 value: 67.22399999999999 - type: map_at_1000 value: 67.264 - type: map_at_20 value: 66.705 - type: map_at_3 value: 56.455 - type: map_at_5 value: 62.997 - type: mrr_at_1 value: 67.71752837326608 - type: mrr_at_10 value: 74.59782021257429 - type: mrr_at_100 value: 75.0640960767943 - type: mrr_at_1000 value: 75.07324799466076 - type: mrr_at_20 value: 74.9323963386884 - type: mrr_at_3 value: 72.95081967213115 - type: mrr_at_5 value: 73.82723833543506 - type: nauc_map_at_1000_diff1 value: 43.111810717567714 - type: nauc_map_at_1000_max value: 44.835247208972476 - type: nauc_map_at_1000_std value: -32.798405973931985 - type: nauc_map_at_100_diff1 value: 43.090223482932764 - type: nauc_map_at_100_max value: 44.83392441557943 - type: nauc_map_at_100_std value: -32.81149166676563 - type: nauc_map_at_10_diff1 value: 42.87841934951979 - type: nauc_map_at_10_max value: 43.9838653389494 - type: nauc_map_at_10_std value: -33.588084643627084 - type: nauc_map_at_1_diff1 value: 54.509245848379095 - type: nauc_map_at_1_max value: 10.05921648322742 - type: nauc_map_at_1_std value: -24.652326014826762 - type: nauc_map_at_20_diff1 value: 43.07468612984794 - type: nauc_map_at_20_max value: 44.75663122615032 - type: nauc_map_at_20_std value: -33.11788887878321 - type: nauc_map_at_3_diff1 value: 44.63272828938906 - type: nauc_map_at_3_max value: 32.1584369869227 - type: nauc_map_at_3_std value: -30.761662210142944 - type: nauc_map_at_5_diff1 value: 42.77296997803048 - type: nauc_map_at_5_max value: 41.78894616737652 - type: nauc_map_at_5_std value: -33.56459774477362 - type: nauc_mrr_at_1000_diff1 value: 53.097544131833494 - type: nauc_mrr_at_1000_max value: 50.61134979184588 - type: nauc_mrr_at_1000_std value: -35.6221191487669 - type: nauc_mrr_at_100_diff1 value: 53.096609856182106 - type: nauc_mrr_at_100_max value: 50.61951585642645 - type: nauc_mrr_at_100_std value: -35.62396157508327 - type: nauc_mrr_at_10_diff1 value: 52.771534471912304 - type: nauc_mrr_at_10_max value: 50.430863224435726 - type: nauc_mrr_at_10_std value: -36.027992076620365 - type: nauc_mrr_at_1_diff1 value: 55.05316238884337 - type: nauc_mrr_at_1_max value: 49.461858515275196 - type: nauc_mrr_at_1_std value: -31.87492636319712 - type: nauc_mrr_at_20_diff1 value: 53.083253469629746 - type: nauc_mrr_at_20_max value: 50.62156424256193 - type: nauc_mrr_at_20_std value: -35.879153692447154 - type: nauc_mrr_at_3_diff1 value: 52.98283109188415 - type: nauc_mrr_at_3_max value: 50.83561260429378 - type: nauc_mrr_at_3_std value: -35.30839538038797 - type: nauc_mrr_at_5_diff1 value: 
52.93270510879709 - type: nauc_mrr_at_5_max value: 50.54595596761199 - type: nauc_mrr_at_5_std value: -35.84059376434395 - type: nauc_ndcg_at_1000_diff1 value: 45.343685089209416 - type: nauc_ndcg_at_1000_max value: 47.801141576669465 - type: nauc_ndcg_at_1000_std value: -33.512958862879195 - type: nauc_ndcg_at_100_diff1 value: 45.255590461515894 - type: nauc_ndcg_at_100_max value: 47.99240031881967 - type: nauc_ndcg_at_100_std value: -33.614465006695205 - type: nauc_ndcg_at_10_diff1 value: 43.93472511731019 - type: nauc_ndcg_at_10_max value: 45.92599752897053 - type: nauc_ndcg_at_10_std value: -36.43629114491574 - type: nauc_ndcg_at_1_diff1 value: 55.05316238884337 - type: nauc_ndcg_at_1_max value: 49.461858515275196 - type: nauc_ndcg_at_1_std value: -31.87492636319712 - type: nauc_ndcg_at_20_diff1 value: 44.93534591273201 - type: nauc_ndcg_at_20_max value: 47.55153940713458 - type: nauc_ndcg_at_20_std value: -35.56392448745206 - type: nauc_ndcg_at_3_diff1 value: 43.17916122133396 - type: nauc_ndcg_at_3_max value: 45.603634205103276 - type: nauc_ndcg_at_3_std value: -32.473227507181214 - type: nauc_ndcg_at_5_diff1 value: 44.10242961669216 - type: nauc_ndcg_at_5_max value: 43.61666669031808 - type: nauc_ndcg_at_5_std value: -35.98808321497782 - type: nauc_precision_at_1000_diff1 value: -23.264714449991146 - type: nauc_precision_at_1000_max value: 28.505729576735465 - type: nauc_precision_at_1000_std value: 11.987379232920926 - type: nauc_precision_at_100_diff1 value: -21.156119174614627 - type: nauc_precision_at_100_max value: 30.711646221646255 - type: nauc_precision_at_100_std value: 9.650486536340322 - type: nauc_precision_at_10_diff1 value: -10.98001328477502 - type: nauc_precision_at_10_max value: 39.25638073760597 - type: nauc_precision_at_10_std value: -4.3456859257488 - type: nauc_precision_at_1_diff1 value: 55.05316238884337 - type: nauc_precision_at_1_max value: 49.461858515275196 - type: nauc_precision_at_1_std value: -31.87492636319712 - type: nauc_precision_at_20_diff1 value: -14.97565390664424 - type: nauc_precision_at_20_max value: 36.383835295942355 - type: nauc_precision_at_20_std value: 1.525158880381114 - type: nauc_precision_at_3_diff1 value: 1.0448345623903483 - type: nauc_precision_at_3_max value: 45.69772060667404 - type: nauc_precision_at_3_std value: -13.002685018948293 - type: nauc_precision_at_5_diff1 value: -5.434185597628904 - type: nauc_precision_at_5_max value: 42.99162431099203 - type: nauc_precision_at_5_std value: -9.789308817624534 - type: nauc_recall_at_1000_diff1 value: 12.309303236094845 - type: nauc_recall_at_1000_max value: 100.0 - type: nauc_recall_at_1000_std value: 86.93359696819986 - type: nauc_recall_at_100_diff1 value: 39.093544920901415 - type: nauc_recall_at_100_max value: 55.62814395062938 - type: nauc_recall_at_100_std value: -22.6919033301514 - type: nauc_recall_at_10_diff1 value: 35.50100141633622 - type: nauc_recall_at_10_max value: 39.25750019586647 - type: nauc_recall_at_10_std value: -43.01273078031791 - type: nauc_recall_at_1_diff1 value: 54.509245848379095 - type: nauc_recall_at_1_max value: 10.05921648322742 - type: nauc_recall_at_1_std value: -24.652326014826762 - type: nauc_recall_at_20_diff1 value: 38.1281707132327 - type: nauc_recall_at_20_max value: 43.97950642900301 - type: nauc_recall_at_20_std value: -44.049952771307574 - type: nauc_recall_at_3_diff1 value: 40.01986938242728 - type: nauc_recall_at_3_max value: 27.517114421061173 - type: nauc_recall_at_3_std value: -32.99056780232045 - type: nauc_recall_at_5_diff1 value: 
38.52035606499483 - type: nauc_recall_at_5_max value: 37.05834604678859 - type: nauc_recall_at_5_std value: -39.86196378897912 - type: ndcg_at_1 value: 67.718 - type: ndcg_at_10 value: 72.004 - type: ndcg_at_100 value: 76.554 - type: ndcg_at_1000 value: 77.07300000000001 - type: ndcg_at_20 value: 74.37899999999999 - type: ndcg_at_3 value: 66.379 - type: ndcg_at_5 value: 68.082 - type: precision_at_1 value: 67.718 - type: precision_at_10 value: 19.849 - type: precision_at_100 value: 2.3800000000000003 - type: precision_at_1000 value: 0.245 - type: precision_at_20 value: 10.813 - type: precision_at_3 value: 46.574 - type: precision_at_5 value: 34.83 - type: recall_at_1 value: 36.248000000000005 - type: recall_at_10 value: 80.252 - type: recall_at_100 value: 96.73 - type: recall_at_1000 value: 99.874 - type: recall_at_20 value: 87.703 - type: recall_at_3 value: 60.815 - type: recall_at_5 value: 71.16 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fra-eng) type: jinaai/xpqa config: fra-eng split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 73.729 - type: map_at_1 value: 43.964999999999996 - type: map_at_10 value: 67.803 - type: map_at_100 value: 69.188 - type: map_at_1000 value: 69.21000000000001 - type: map_at_20 value: 68.747 - type: map_at_3 value: 60.972 - type: map_at_5 value: 65.39399999999999 - type: mrr_at_1 value: 68.4913217623498 - type: mrr_at_10 value: 75.2600822260368 - type: mrr_at_100 value: 75.6599169808848 - type: mrr_at_1000 value: 75.66720883727534 - type: mrr_at_20 value: 75.52375865860405 - type: mrr_at_3 value: 73.54250111259452 - type: mrr_at_5 value: 74.51713395638626 - type: nauc_map_at_1000_diff1 value: 46.81533703002097 - type: nauc_map_at_1000_max value: 46.30794757084772 - type: nauc_map_at_1000_std value: -14.953470500312335 - type: nauc_map_at_100_diff1 value: 46.82464740277745 - type: nauc_map_at_100_max value: 46.32852879948254 - type: nauc_map_at_100_std value: -14.950035098066172 - type: nauc_map_at_10_diff1 value: 46.31406143369831 - type: nauc_map_at_10_max value: 45.337593270786634 - type: nauc_map_at_10_std value: -16.011789445907876 - type: nauc_map_at_1_diff1 value: 57.097134715065835 - type: nauc_map_at_1_max value: 21.93931500350721 - type: nauc_map_at_1_std value: -15.134457251301637 - type: nauc_map_at_20_diff1 value: 46.47030891134173 - type: nauc_map_at_20_max value: 46.29169960276292 - type: nauc_map_at_20_std value: -15.14241106541829 - type: nauc_map_at_3_diff1 value: 50.27064228648596 - type: nauc_map_at_3_max value: 39.43058773971639 - type: nauc_map_at_3_std value: -16.16545993089126 - type: nauc_map_at_5_diff1 value: 46.974867679747426 - type: nauc_map_at_5_max value: 44.31091104855002 - type: nauc_map_at_5_std value: -16.50175337658926 - type: nauc_mrr_at_1000_diff1 value: 55.20294005110399 - type: nauc_mrr_at_1000_max value: 51.947725719119966 - type: nauc_mrr_at_1000_std value: -14.586112939597232 - type: nauc_mrr_at_100_diff1 value: 55.20426251109304 - type: nauc_mrr_at_100_max value: 51.95648725402534 - type: nauc_mrr_at_100_std value: -14.579769236539143 - type: nauc_mrr_at_10_diff1 value: 54.93870506205835 - type: nauc_mrr_at_10_max value: 51.89312772900638 - type: nauc_mrr_at_10_std value: -14.692635010092939 - type: nauc_mrr_at_1_diff1 value: 56.54945935175171 - type: nauc_mrr_at_1_max value: 51.28134504197991 - type: nauc_mrr_at_1_std value: -12.909042186563061 - type: nauc_mrr_at_20_diff1 value: 55.10667018041461 - type: nauc_mrr_at_20_max value: 51.98236870783707 - type: 
nauc_mrr_at_20_std value: -14.599377575198025 - type: nauc_mrr_at_3_diff1 value: 55.67124311746892 - type: nauc_mrr_at_3_max value: 51.77903236246767 - type: nauc_mrr_at_3_std value: -14.94452633860763 - type: nauc_mrr_at_5_diff1 value: 55.42849172366371 - type: nauc_mrr_at_5_max value: 51.76902965753959 - type: nauc_mrr_at_5_std value: -15.357993534727072 - type: nauc_ndcg_at_1000_diff1 value: 48.736844959280326 - type: nauc_ndcg_at_1000_max value: 48.92891159935398 - type: nauc_ndcg_at_1000_std value: -13.983968675611056 - type: nauc_ndcg_at_100_diff1 value: 48.73859328503975 - type: nauc_ndcg_at_100_max value: 49.31867149556439 - type: nauc_ndcg_at_100_std value: -13.72387564912742 - type: nauc_ndcg_at_10_diff1 value: 46.50313862975287 - type: nauc_ndcg_at_10_max value: 47.13599793554596 - type: nauc_ndcg_at_10_std value: -16.317919977400113 - type: nauc_ndcg_at_1_diff1 value: 56.54945935175171 - type: nauc_ndcg_at_1_max value: 51.28134504197991 - type: nauc_ndcg_at_1_std value: -12.909042186563061 - type: nauc_ndcg_at_20_diff1 value: 47.01727117133912 - type: nauc_ndcg_at_20_max value: 49.121366036709105 - type: nauc_ndcg_at_20_std value: -14.411078677638775 - type: nauc_ndcg_at_3_diff1 value: 49.229581145458276 - type: nauc_ndcg_at_3_max value: 47.427609717032 - type: nauc_ndcg_at_3_std value: -16.52066627289908 - type: nauc_ndcg_at_5_diff1 value: 48.0152514127505 - type: nauc_ndcg_at_5_max value: 46.12152407850816 - type: nauc_ndcg_at_5_std value: -17.613295491954656 - type: nauc_precision_at_1000_diff1 value: -25.959006032642463 - type: nauc_precision_at_1000_max value: 12.81002362947137 - type: nauc_precision_at_1000_std value: 12.575312826061513 - type: nauc_precision_at_100_diff1 value: -24.35413527283394 - type: nauc_precision_at_100_max value: 14.878359236477303 - type: nauc_precision_at_100_std value: 12.384426050018428 - type: nauc_precision_at_10_diff1 value: -17.93220761770618 - type: nauc_precision_at_10_max value: 23.523485811847294 - type: nauc_precision_at_10_std value: 4.424456968716939 - type: nauc_precision_at_1_diff1 value: 56.54945935175171 - type: nauc_precision_at_1_max value: 51.28134504197991 - type: nauc_precision_at_1_std value: -12.909042186563061 - type: nauc_precision_at_20_diff1 value: -21.776871398686936 - type: nauc_precision_at_20_max value: 21.18436338264366 - type: nauc_precision_at_20_std value: 9.937274986573321 - type: nauc_precision_at_3_diff1 value: -1.2411845580934435 - type: nauc_precision_at_3_max value: 34.962281941875 - type: nauc_precision_at_3_std value: -2.447892908501237 - type: nauc_precision_at_5_diff1 value: -11.134164534114085 - type: nauc_precision_at_5_max value: 30.22079740070525 - type: nauc_precision_at_5_std value: -0.24232594421765946 - type: nauc_recall_at_1000_diff1 value: .nan - type: nauc_recall_at_1000_max value: .nan - type: nauc_recall_at_1000_std value: .nan - type: nauc_recall_at_100_diff1 value: 43.3647412452869 - type: nauc_recall_at_100_max value: 63.50094950500327 - type: nauc_recall_at_100_std value: 2.3911909633714044 - type: nauc_recall_at_10_diff1 value: 33.993445071666855 - type: nauc_recall_at_10_max value: 41.38694129134144 - type: nauc_recall_at_10_std value: -19.308698266099096 - type: nauc_recall_at_1_diff1 value: 57.097134715065835 - type: nauc_recall_at_1_max value: 21.93931500350721 - type: nauc_recall_at_1_std value: -15.134457251301637 - type: nauc_recall_at_20_diff1 value: 32.03888531880772 - type: nauc_recall_at_20_max value: 49.660787482562085 - type: nauc_recall_at_20_std value: 
-12.641456758778382 - type: nauc_recall_at_3_diff1 value: 47.94527082900579 - type: nauc_recall_at_3_max value: 36.51733131437679 - type: nauc_recall_at_3_std value: -18.65511713247495 - type: nauc_recall_at_5_diff1 value: 42.04545772092305 - type: nauc_recall_at_5_max value: 41.21440912972303 - type: nauc_recall_at_5_std value: -21.47386527081128 - type: ndcg_at_1 value: 68.491 - type: ndcg_at_10 value: 73.729 - type: ndcg_at_100 value: 77.684 - type: ndcg_at_1000 value: 78.084 - type: ndcg_at_20 value: 75.795 - type: ndcg_at_3 value: 68.568 - type: ndcg_at_5 value: 70.128 - type: precision_at_1 value: 68.491 - type: precision_at_10 value: 16.996 - type: precision_at_100 value: 2.023 - type: precision_at_1000 value: 0.207 - type: precision_at_20 value: 9.246 - type: precision_at_3 value: 41.923 - type: precision_at_5 value: 29.826000000000004 - type: recall_at_1 value: 43.964999999999996 - type: recall_at_10 value: 82.777 - type: recall_at_100 value: 97.287 - type: recall_at_1000 value: 100.0 - type: recall_at_20 value: 89.183 - type: recall_at_3 value: 65.803 - type: recall_at_5 value: 74.119 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fr) type: jinaai/xpqa config: fra-fra split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: main_score value: 77.581 - type: map_at_1 value: 46.444 - type: map_at_10 value: 72.084 - type: map_at_100 value: 73.175 - type: map_at_1000 value: 73.193 - type: map_at_20 value: 72.77799999999999 - type: map_at_3 value: 65.242 - type: map_at_5 value: 69.926 - type: mrr_at_1 value: 71.82910547396529 - type: mrr_at_10 value: 78.66594612923046 - type: mrr_at_100 value: 78.97334934049613 - type: mrr_at_1000 value: 78.97687021803557 - type: mrr_at_20 value: 78.85701141744282 - type: mrr_at_3 value: 76.96929238985311 - type: mrr_at_5 value: 77.99732977303067 - type: nauc_map_at_1000_diff1 value: 49.090956807097804 - type: nauc_map_at_1000_max value: 52.01095354889508 - type: nauc_map_at_1000_std value: -12.182870421711026 - type: nauc_map_at_100_diff1 value: 49.091664766684566 - type: nauc_map_at_100_max value: 52.017499797253755 - type: nauc_map_at_100_std value: -12.188342487271528 - type: nauc_map_at_10_diff1 value: 48.6619338205362 - type: nauc_map_at_10_max value: 50.93591260329888 - type: nauc_map_at_10_std value: -12.899399261673365 - type: nauc_map_at_1_diff1 value: 61.89699552471587 - type: nauc_map_at_1_max value: 22.387748207421946 - type: nauc_map_at_1_std value: -17.139518194308437 - type: nauc_map_at_20_diff1 value: 48.72828404686453 - type: nauc_map_at_20_max value: 51.781074586075434 - type: nauc_map_at_20_std value: -12.174270605093136 - type: nauc_map_at_3_diff1 value: 53.11509580126934 - type: nauc_map_at_3_max value: 42.1768380145106 - type: nauc_map_at_3_std value: -14.98340833032363 - type: nauc_map_at_5_diff1 value: 49.60521390803235 - type: nauc_map_at_5_max value: 49.80360562029127 - type: nauc_map_at_5_std value: -13.900652140457618 - type: nauc_mrr_at_1000_diff1 value: 58.10782478654255 - type: nauc_mrr_at_1000_max value: 61.31083013535486 - type: nauc_mrr_at_1000_std value: -9.624904298545921 - type: nauc_mrr_at_100_diff1 value: 58.11041683306092 - type: nauc_mrr_at_100_max value: 61.31590199755797 - type: nauc_mrr_at_100_std value: -9.625991053580865 - type: nauc_mrr_at_10_diff1 value: 57.883701815695375 - type: nauc_mrr_at_10_max value: 61.36276126424689 - type: nauc_mrr_at_10_std value: -9.495072468420386 - type: nauc_mrr_at_1_diff1 value: 60.18176977079093 - type: nauc_mrr_at_1_max value: 
59.697615236642555 - type: nauc_mrr_at_1_std value: -9.396133077966779 - type: nauc_mrr_at_20_diff1 value: 57.964817434006754 - type: nauc_mrr_at_20_max value: 61.34073539502932 - type: nauc_mrr_at_20_std value: -9.602378876645131 - type: nauc_mrr_at_3_diff1 value: 58.44338049427257 - type: nauc_mrr_at_3_max value: 60.92272989411293 - type: nauc_mrr_at_3_std value: -9.928970439416162 - type: nauc_mrr_at_5_diff1 value: 58.01513016866578 - type: nauc_mrr_at_5_max value: 61.46805302986586 - type: nauc_mrr_at_5_std value: -9.842227002440984 - type: nauc_ndcg_at_1000_diff1 value: 50.99293152828167 - type: nauc_ndcg_at_1000_max value: 56.14232784664811 - type: nauc_ndcg_at_1000_std value: -10.529213072410288 - type: nauc_ndcg_at_100_diff1 value: 50.99385944312529 - type: nauc_ndcg_at_100_max value: 56.34825518954588 - type: nauc_ndcg_at_100_std value: -10.398943874846047 - type: nauc_ndcg_at_10_diff1 value: 48.51273364357823 - type: nauc_ndcg_at_10_max value: 53.77871849486298 - type: nauc_ndcg_at_10_std value: -11.82105972112472 - type: nauc_ndcg_at_1_diff1 value: 60.18176977079093 - type: nauc_ndcg_at_1_max value: 59.697615236642555 - type: nauc_ndcg_at_1_std value: -9.396133077966779 - type: nauc_ndcg_at_20_diff1 value: 49.04268319033412 - type: nauc_ndcg_at_20_max value: 55.47011381097071 - type: nauc_ndcg_at_20_std value: -10.486452945493042 - type: nauc_ndcg_at_3_diff1 value: 50.95112745400584 - type: nauc_ndcg_at_3_max value: 53.45473828705577 - type: nauc_ndcg_at_3_std value: -13.420699384045728 - type: nauc_ndcg_at_5_diff1 value: 50.313156212000074 - type: nauc_ndcg_at_5_max value: 52.78539129309866 - type: nauc_ndcg_at_5_std value: -13.586274096509122 - type: nauc_precision_at_1000_diff1 value: -31.13772049254778 - type: nauc_precision_at_1000_max value: 17.2847598361294 - type: nauc_precision_at_1000_std value: 15.497531773816887 - type: nauc_precision_at_100_diff1 value: -29.98812263553739 - type: nauc_precision_at_100_max value: 19.048620003227654 - type: nauc_precision_at_100_std value: 15.38499952171958 - type: nauc_precision_at_10_diff1 value: -25.33028097412579 - type: nauc_precision_at_10_max value: 26.077919168306853 - type: nauc_precision_at_10_std value: 11.35352933466097 - type: nauc_precision_at_1_diff1 value: 60.18176977079093 - type: nauc_precision_at_1_max value: 59.697615236642555 - type: nauc_precision_at_1_std value: -9.396133077966779 - type: nauc_precision_at_20_diff1 value: -28.417606311068905 - type: nauc_precision_at_20_max value: 23.958679828637692 - type: nauc_precision_at_20_std value: 14.442021499194205 - type: nauc_precision_at_3_diff1 value: -8.127396049790482 - type: nauc_precision_at_3_max value: 37.348067982957076 - type: nauc_precision_at_3_std value: 4.747913619596849 - type: nauc_precision_at_5_diff1 value: -16.902418446058395 - type: nauc_precision_at_5_max value: 32.73583852552014 - type: nauc_precision_at_5_std value: 7.031446423850052 - type: nauc_recall_at_1000_diff1 value: -14.485978369112514 - type: nauc_recall_at_1000_max value: 78.59123887333172 - type: nauc_recall_at_1000_std value: 90.7384575424963 - type: nauc_recall_at_100_diff1 value: 41.47842281590715 - type: nauc_recall_at_100_max value: 67.47271545727422 - type: nauc_recall_at_100_std value: 14.555561992253999 - type: nauc_recall_at_10_diff1 value: 33.05308907973924 - type: nauc_recall_at_10_max value: 45.49878918493155 - type: nauc_recall_at_10_std value: -11.560069806810926 - type: nauc_recall_at_1_diff1 value: 61.89699552471587 - type: nauc_recall_at_1_max value: 
22.387748207421946 - type: nauc_recall_at_1_std value: -17.139518194308437 - type: nauc_recall_at_20_diff1 value: 31.305721376453754 - type: nauc_recall_at_20_max value: 51.24817763724019 - type: nauc_recall_at_20_std value: -5.0809908162023145 - type: nauc_recall_at_3_diff1 value: 49.27109038342917 - type: nauc_recall_at_3_max value: 37.69188317998447 - type: nauc_recall_at_3_std value: -17.119900758664336 - type: nauc_recall_at_5_diff1 value: 42.74501803377967 - type: nauc_recall_at_5_max value: 46.877008503354844 - type: nauc_recall_at_5_std value: -15.704892082115975 - type: ndcg_at_1 value: 71.829 - type: ndcg_at_10 value: 77.581 - type: ndcg_at_100 value: 80.75 - type: ndcg_at_1000 value: 81.026 - type: ndcg_at_20 value: 79.092 - type: ndcg_at_3 value: 72.81 - type: ndcg_at_5 value: 74.22999999999999 - type: precision_at_1 value: 71.829 - type: precision_at_10 value: 17.717 - type: precision_at_100 value: 2.031 - type: precision_at_1000 value: 0.207 - type: precision_at_20 value: 9.399000000000001 - type: precision_at_3 value: 44.458999999999996 - type: precision_at_5 value: 31.535000000000004 - type: recall_at_1 value: 46.444 - type: recall_at_10 value: 86.275 - type: recall_at_100 value: 98.017 - type: recall_at_1000 value: 99.8 - type: recall_at_20 value: 90.935 - type: recall_at_3 value: 70.167 - type: recall_at_5 value: 78.2 --- <br><br> <p align="center"> <img src="https://huggingface.co/datasets/jinaai/documentation-images/resolve/main/logo.webp" alt="Jina AI: Your Search Foundation, Supercharged!" width="150px"> </p> <p align="center"> <b>The embedding model trained by <a href="https://jina.ai/"><b>Jina AI</b></a>.</b> </p> <p align="center"> <b>jina-embeddings-v3: Multilingual Embeddings With Task LoRA</b> </p> ## Quick Start [Blog](https://jina.ai/news/jina-embeddings-v3-a-frontier-multilingual-embedding-model/#parameter-dimensions) | [Azure](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/jinaai.jina-embeddings-v3-vm) | [AWS SageMaker](https://aws.amazon.com/marketplace/pp/prodview-kdi3xkt62lo32) | [API](https://jina.ai/embeddings) ## Intended Usage & Model Info `jina-embeddings-v3` is a **multilingual multi-task text embedding model** designed for a variety of NLP applications. Based on the [Jina-XLM-RoBERTa architecture](https://huggingface.co/jinaai/xlm-roberta-flash-implementation), this model supports Rotary Position Embeddings to handle long input sequences up to **8192 tokens**. Additionally, it features 5 LoRA adapters to generate task-specific embeddings efficiently. ### Key Features: - **Extended Sequence Length:** Supports up to 8192 tokens with RoPE. - **Task-Specific Embedding:** Customize embeddings through the `task` argument with the following options: - `retrieval.query`: Used for query embeddings in asymmetric retrieval tasks - `retrieval.passage`: Used for passage embeddings in asymmetric retrieval tasks - `separation`: Used for embeddings in clustering and re-ranking applications - `classification`: Used for embeddings in classification tasks - `text-matching`: Used for embeddings in tasks that quantify similarity between two texts, such as STS or symmetric retrieval tasks - **Matryoshka Embeddings**: Supports flexible embedding sizes (`32, 64, 128, 256, 512, 768, 1024`), allowing for truncating embeddings to fit your application. 
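To make the task-specific adapters and Matryoshka truncation concrete before diving into details, here is a minimal, hypothetical retrieval sketch using the `encode` interface documented below. The query and document strings, the 256-dimension truncation, and the use of a plain dot product (which works because `encode` returns normalized vectors, see the notice below) are illustrative choices, not requirements:

```python
import numpy as np
from transformers import AutoModel

model = AutoModel.from_pretrained("jinaai/jina-embeddings-v3", trust_remote_code=True)

docs = [
    "Berlin is the capital of Germany.",
    "The Seine flows through Paris.",
]

# Asymmetric retrieval: passages and queries go through different LoRA adapters
doc_emb = model.encode(docs, task="retrieval.passage", truncate_dim=256)
query_emb = model.encode(["Which city is Germany's capital?"], task="retrieval.query", truncate_dim=256)

# Vectors are L2-normalized, so the dot product acts as cosine similarity
scores = np.asarray(query_emb) @ np.asarray(doc_emb).T
print(docs[int(scores.argmax())])
```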
### Supported Languages:
While the foundation model supports 100 languages, we've focused our tuning efforts on the following 30 languages: **Arabic, Bengali, Chinese, Danish, Dutch, English, Finnish, French, Georgian, German, Greek, Hindi, Indonesian, Italian, Japanese, Korean, Latvian, Norwegian, Polish, Portuguese, Romanian, Russian, Slovak, Spanish, Swedish, Thai, Turkish, Ukrainian, Urdu,** and **Vietnamese.**

> **⚠️ Important Notice:**
> We fixed a bug in the `encode` function [#60](https://huggingface.co/jinaai/jina-embeddings-v3/discussions/60) where **Matryoshka embedding truncation** occurred *after normalization*, leading to non-normalized truncated embeddings. This issue has been resolved in the latest code revision.
>
> If you have encoded data using the previous version and wish to maintain consistency, please use the specific code revision when loading the model: `AutoModel.from_pretrained('jinaai/jina-embeddings-v3', code_revision='da863dd04a4e5dce6814c6625adfba87b83838aa', ...)`
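For reference, the fixed behavior truncates first and normalizes second. If you ever post-process embeddings yourself, the sketch below illustrates that order in plain NumPy; `truncate_then_normalize` is an illustrative helper, not part of the library API:

```python
import numpy as np

def truncate_then_normalize(embedding: np.ndarray, dim: int) -> np.ndarray:
    """Matryoshka truncation in the fixed order: cut to `dim` first, then L2-normalize."""
    truncated = embedding[..., :dim]                     # keep the first `dim` components
    norm = np.linalg.norm(truncated, ord=2, axis=-1, keepdims=True)
    return truncated / np.clip(norm, a_min=1e-12, a_max=None)

vec = np.random.rand(1024).astype(np.float32)            # stand-in for a full 1024-d embedding
vec_256 = truncate_then_normalize(vec, 256)
assert np.isclose(np.linalg.norm(vec_256), 1.0)           # truncated vector is unit length
```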
## Usage

**<details><summary>Apply mean pooling when integrating the model.</summary>**
<p>

### Why Use Mean Pooling?

Mean pooling takes all token embeddings from the model's output and averages them at the sentence or paragraph level. This approach has been shown to produce high-quality sentence embeddings.

We provide an `encode` function that handles this for you automatically.

However, if you're working with the model directly, outside of the `encode` function, you'll need to apply mean pooling manually. Here's how you can do it:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]
    input_mask_expanded = (
        attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    )
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(
        input_mask_expanded.sum(1), min=1e-9
    )

sentences = ["How is the weather today?", "What is the current weather like today?"]

tokenizer = AutoTokenizer.from_pretrained("jinaai/jina-embeddings-v3")
model = AutoModel.from_pretrained("jinaai/jina-embeddings-v3", trust_remote_code=True)

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

task = 'retrieval.query'
task_id = model._adaptation_map[task]
adapter_mask = torch.full((len(sentences),), task_id, dtype=torch.int32)

with torch.no_grad():
    model_output = model(**encoded_input, adapter_mask=adapter_mask)

embeddings = mean_pooling(model_output, encoded_input["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)
```
</p>
</details>

The easiest way to start using `jina-embeddings-v3` is with the [Jina Embedding API](https://jina.ai/embeddings/).

Alternatively, you can use `jina-embeddings-v3` directly via the Transformers package:

```bash
!pip install transformers torch einops
!pip install 'numpy<2'
```

If you run it on a GPU that supports [FlashAttention-2](https://github.com/Dao-AILab/flash-attention) (as of 2024-09-12: Ampere, Ada, or Hopper GPUs, e.g., A100, RTX 3090, RTX 4090, H100), you can additionally install it:

```bash
!pip install flash-attn --no-build-isolation
```

```python
from transformers import AutoModel

# Initialize the model
model = AutoModel.from_pretrained("jinaai/jina-embeddings-v3", trust_remote_code=True)

texts = [
    "Follow the white rabbit.",  # English
    "Sigue al conejo blanco.",  # Spanish
    "Suis le lapin blanc.",  # French
    "跟着白兔走。",  # Chinese
    "اتبع الأرنب الأبيض.",  # Arabic
    "Folge dem weißen Kaninchen.",  # German
]

# When calling the `encode` function, you can choose a `task` based on the use case:
# 'retrieval.query', 'retrieval.passage', 'separation', 'classification', 'text-matching'
# Alternatively, you can choose not to pass a `task`, and no specific LoRA adapter will be used.
embeddings = model.encode(texts, task="text-matching")

# Compute similarities
print(embeddings[0] @ embeddings[1].T)
```

By default, the model supports a maximum sequence length of 8192 tokens. However, if you want to truncate your input texts to a shorter length, you can pass the `max_length` parameter to the `encode` function:

```python
embeddings = model.encode(["Very long ... document"], max_length=2048)
```

If you want to use **Matryoshka embeddings** at a different dimension, pass the `truncate_dim` parameter to the `encode` function:

```python
embeddings = model.encode(['Sample text'], truncate_dim=256)
```

The latest version (3.1.0) of [SentenceTransformers](https://github.com/UKPLab/sentence-transformers) also supports `jina-embeddings-v3`:

```bash
!pip install -U sentence-transformers
```

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)
task = "retrieval.query"
embeddings = model.encode(
    ["What is the weather like in Berlin today?"],
    task=task,
    prompt_name=task,
)
```

You can fine-tune `jina-embeddings-v3` using [SentenceTransformerTrainer](https://sbert.net/docs/package_reference/sentence_transformer/trainer.html).
To fine-tune for a specific task, you should set the task before passing the model to the ST Trainer, either during initialization:

```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True, model_kwargs={'default_task': 'classification'})
```

Or afterwards:

```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)
model[0].default_task = 'classification'
```

This way you can fine-tune the LoRA adapter for the chosen task. However, if you want to fine-tune the entire model, make sure the main parameters are set as trainable when loading the model:

```python
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True, model_kwargs={'lora_main_params_trainable': True})
```

This will allow fine-tuning the whole model instead of just the LoRA adapters.
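As a hypothetical starting point, a complete fine-tuning run with the ST Trainer might look like the sketch below. It assumes sentence-transformers >= 3.x; the two toy anchor/positive pairs and the choice of `MultipleNegativesRankingLoss` are illustrative assumptions, not a prescribed recipe:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Set the task up front so the matching LoRA adapter is the one being tuned
model = SentenceTransformer(
    "jinaai/jina-embeddings-v3",
    trust_remote_code=True,
    model_kwargs={"default_task": "retrieval.query"},
)

# Toy anchor/positive pairs; replace with your own data
train_dataset = Dataset.from_dict({
    "anchor": ["What is the capital of France?", "How do I reset my password?"],
    "positive": ["Paris is the capital of France.", "Click 'Forgot password' on the login page."],
})

loss = MultipleNegativesRankingLoss(model)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```

Per the note above, only the adapter selected via `default_task` receives updates unless `lora_main_params_trainable` is set.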
**<details><summary>ONNX Inference.</summary>**
<p>

You can use ONNX for efficient inference with `jina-embeddings-v3`:

```python
import onnxruntime
import numpy as np
from transformers import AutoTokenizer, PretrainedConfig

# Mean pool function
def mean_pooling(model_output: np.ndarray, attention_mask: np.ndarray):
    token_embeddings = model_output
    input_mask_expanded = np.expand_dims(attention_mask, axis=-1)
    input_mask_expanded = np.broadcast_to(input_mask_expanded, token_embeddings.shape)
    sum_embeddings = np.sum(token_embeddings * input_mask_expanded, axis=1)
    sum_mask = np.clip(np.sum(input_mask_expanded, axis=1), a_min=1e-9, a_max=None)
    return sum_embeddings / sum_mask

# Load tokenizer and model config
tokenizer = AutoTokenizer.from_pretrained('jinaai/jina-embeddings-v3')
config = PretrainedConfig.from_pretrained('jinaai/jina-embeddings-v3')

# Tokenize input
input_text = tokenizer('sample text', return_tensors='np')

# ONNX session
model_path = 'jina-embeddings-v3/onnx/model.onnx'
session = onnxruntime.InferenceSession(model_path)

# Prepare inputs for ONNX model
task_type = 'text-matching'
task_id = np.array(config.lora_adaptations.index(task_type), dtype=np.int64)
inputs = {
    'input_ids': input_text['input_ids'],
    'attention_mask': input_text['attention_mask'],
    'task_id': task_id
}

# Run model
outputs = session.run(None, inputs)[0]

# Apply mean pooling and normalization to the model outputs
embeddings = mean_pooling(outputs, input_text["attention_mask"])
embeddings = embeddings / np.linalg.norm(embeddings, ord=2, axis=1, keepdims=True)
```
</p>
</details>

## Contact

Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.

## License

`jina-embeddings-v3` is listed on AWS & Azure. If you need to use it beyond those platforms or on-premises within your company, note that the model is licensed under CC BY-NC 4.0. For commercial usage inquiries, feel free to [contact us](https://jina.ai/contact-sales/).

## Citation

If you find `jina-embeddings-v3` useful in your research, please cite the following paper:

```bibtex
@misc{sturua2024jinaembeddingsv3multilingualembeddingstask,
      title={jina-embeddings-v3: Multilingual Embeddings With Task LoRA},
      author={Saba Sturua and Isabelle Mohr and Mohammad Kalim Akram and Michael Günther and Bo Wang and Markus Krimmel and Feng Wang and Georgios Mastrapas and Andreas Koukounas and Nan Wang and Han Xiao},
      year={2024},
      eprint={2409.10173},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2409.10173},
}
```
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
meandyou200175/phobert_ContrastiveLoss
meandyou200175
sentence-similarity
[ "sentence-transformers", "safetensors", "roberta", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:87608", "loss:ContrastiveLoss", "arxiv:1908.10084", "base_model:vinai/phobert-base-v2", "base_model:finetune:vinai/phobert-base-v2", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,730
1,730
8
0
--- base_model: vinai/phobert-base-v2 library_name: sentence-transformers metrics: - cosine_accuracy@1 - cosine_accuracy@2 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_accuracy@100 - cosine_precision@1 - cosine_precision@2 - cosine_precision@5 - cosine_precision@10 - cosine_precision@100 - cosine_recall@1 - cosine_recall@2 - cosine_recall@5 - cosine_recall@10 - cosine_recall@100 - cosine_ndcg@10 - cosine_mrr@1 - cosine_mrr@2 - cosine_mrr@5 - cosine_mrr@10 - cosine_mrr@100 - cosine_map@100 - dot_accuracy@1 - dot_accuracy@2 - dot_accuracy@5 - dot_accuracy@10 - dot_accuracy@100 - dot_precision@1 - dot_precision@2 - dot_precision@5 - dot_precision@10 - dot_precision@100 - dot_recall@1 - dot_recall@2 - dot_recall@5 - dot_recall@10 - dot_recall@100 - dot_ndcg@10 - dot_mrr@1 - dot_mrr@2 - dot_mrr@5 - dot_mrr@10 - dot_mrr@100 - dot_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:87608 - loss:ContrastiveLoss widget: - source_sentence: Phương pháp chẩn đoán & điều trị lậu sentences: - Phương pháp chẩn đoán & điều trị lậu Phương pháp xét nghiệm và chẩn đoán nguyên nhân bệnh lậu Bác sĩ sẽ cần tìm vi khuẩn trong các mẫu dịch lấy từ cơ thể bạn, bao gồm trực tràng, cổ họng, âm đạo hoặc niệu đạo hoặc nước tiểu của bạn. Phương pháp điều trị bệnh lậu hiệu quả Điều trị bệnh lậu ở người lớn Người lớn mắc bệnh lậu được điều trị bằng thuốc kháng sinh. Do các chủng Neisseria gonorrhoeae kháng thuốc đang phát triển, khuyến cáo rằng bệnh lậu không biến chứng nên được điều trị bằng kháng sinh ceftriaxone (dạng tiêm) cùng với azithromycin uống. Nếu bạn bị dị ứng với kháng sinh cephalosporin, chẳng hạn như ceftriaxone, bạn có thể được cho uống gemifloxacin hoặc gentamicin tiêm và azithromycin uống. Điều trị bệnh lậu cho bạn tình Bạn tình của bạn cũng nên đi xét nghiệm và điều trị bệnh lậu tương tự, ngay cả khi họ không có dấu hiệu hoặc triệu chứng. Điều trị bệnh lậu cho trẻ sơ sinh Trẻ sơ sinh có mẹ bị bệnh lậu có thể được điều trị bằng thuốc kháng sinh. - ' Chào em, là bệnh mãn tính phải điều trị suốt đời, phải kiên nhẫn và kiên trì nên đôi khi lượng đường trong cơ thể không ổn định. Lúc đi khám xét nghiệm thì ổn do bản thân biết mai đi khám nên sẽ kiêng ăn, ăn ít... còn bệnh lâu dài nên trong ngày đôi khi thèm chút này hay thích ăn chút kia, quên uống thuốc, suy nghĩ, mất ngủ cũng làm đường không ổn định. Đường trong cơ thể lúc lên lúc xuống dễ đưa đến biến chứng. Em hay thấy bệnh nhân tiểu đường tháo khớp ngón chân, ngón tay, đôi khi tháo khớp gối, khớp háng, đây là do tê liệt hệ thần kinh nên khi va chạm bệnh nhân không phát hiện. Đến khi phát hiện thì đã nhiễm trùng nặng phải tháo khớp. Theo BS mẹ em có khả năng do biến chứng tiểu đường vì mẹ em bị bệnh khá lâu nên ít nhiều ảnh hưởng thần kinh bị tê liệt gây đau. Em nên nhớ dặn mẹ đi tái khám và điều trị cho thật ổn định nhé! Thân mến!' - Chào bạn, Sulpiride là thuốc đối kháng dopamine chọn lọc được sử dụng trong điều trị trầm cảm và loạn thần. Liều thấp của thuốc đôi khi được sử dụng để tăng cường sản xuất sữa ở phụ nữ cho con bú nhờ tăng sản xuất prolactin (một loại hormon tăng tiết sữa). Thời gian bán thải của thuốc chỉ khoảng 5-7 giờ, do đó ngưng thuốc 2-3 ngày sẽ không còn tác dụng tăng tiết sữa nữa bạn nhé! 
- source_sentence: Phương pháp chẩn đoán & điều trị hội chứng tiểu não sentences: - 'Mô tả ngắn: Thuốc Encorate là sản phẩm của Sun Pharmaceutical Industries Ltd có thành phần là Natri Valproate có tác dụng điều trị động kinh toàn thể, động kinh cục bộ hoặc các thể động kinh khác; đối với phụ nữ trong độ tuổi sinh đẻ, thuốc chỉ nên được sử dụng cho những trường hợp nặng hoặc đã đề kháng với các thuốc khác; ngừa sốt cao co giật ở trẻ em, chứng máy cơ ở trẻ em; điều trị và dự phòng hưng cảm trong rối loạn cảm xúc lưỡng cực. Thành phần: Natri valproate: 300mg Chỉ định: Thuốc Encorate chỉ định điều trị trong các trường hợp sau: Điều trị động kinh toàn thể, động kinh cục bộ hoặc các thể động kinh khác. Đối với phụ nữ trong độ tuổi sinh đẻ, thuốc chỉ nên được sử dụng cho những trường hợp nặng hoặc đã đề kháng với các thuốc khác. Ngừa sốt cao co giật ở trẻ em, chứng máy cơ ở trẻ em. Điều trị và dự phòng hưng cảm trong rối loạn cảm xúc lưỡng cực .' - 'Nguy cơ ung thư ruột kết Những ai có nguy cơ mắc phải ung thư ruột kết? Dưới đây là một số đối tượng có nguy cơ mắc phải bệnh ung thư ruột kết: Béo phì . Người bị tiểu đường. Thuộc chủng tộc người da đen. Tuổi tác cao. Tỷ lệ ung thư ruột kết ở những người dưới 50 tuổi đang gia tăng, nhưng phần lớn những người mắc bệnh ung thư ruột kết trên 50 tuổi. Có tiền sử gia đình mắc bệnh ung thư ruột kết hoặc ung thư trực tràng . Có tiền sử mắc bệnh viêm ruột, viêm loét đại tràng mãn tính hoặc bệnh Crohn từ 8 năm trở lên. Có tiền sử ung thư ruột kết hoặc polyp đại tràng (polyp đại trực tràng có kích thước từ 1 cm trở lên hoặc có tế bào trông bất thường dưới kính hiển vi). Có các hội chứng di truyền phổ biến làm tăng nguy cơ ung thư ruột kết, là bệnh đa polyp tuyến xuất phát từ gia đình (FAP) và hội chứng Lynch (ung thư đại trực tràng không polyp di truyền). Một số đột biến gen được truyền qua nhiều thế hệ trong gia đình bạn có thể làm tăng đáng kể nguy cơ ung thư ruột kết. Chỉ có một tỷ lệ nhỏ bệnh ung thư ruột kết có liên quan đến gen di truyền. Yếu tố làm tăng nguy cơ mắc phải ung thư ruột kết Một số yếu tố trong sinh hoạt và dinh dưỡng hàng ngày có thể làm tăng nguy cơ mắc bệnh ung thư ruột kết: Hút thuốc lá. Người ít vận động. Uống nhiều bia rượu. Ăn quá nhiều thịt đỏ hoặc thịt chế biến sẵn. Chế độ ăn không bổ sung đủ rau xanh, ít chất xơ và dư thừa chất béo. Xạ trị trực tiếp vào bụng để điều trị ung thư trước đó làm tăng nguy cơ ung thư ruột kết. Lối sống không lành mạnh làm tăng nguy cơ ung thư ruột kết' - 'Phương pháp chẩn đoán & điều trị hội chứng tiểu não Phương pháp xét nghiệm và chẩn đoán hội chứng tiểu não Nếu bác sĩ nghi ngờ bạn bị hội chứng tiểu não, họ sẽ tiến hành đánh giá cẩn thận các triệu chứng của bạn. Họ cũng sẽ xem xét tiền sử bệnh lý cá nhân và gia đình của bạn. Để xác nhận chẩn đoán, bác sĩ của bạn có thể sẽ chỉ định thêm một số xét nghiệm. Các xét nghiệm bác sĩ sử dụng để chẩn đoán hội chứng tiểu não bao gồm: Chọc dò dịch não tủy : Xét nghiệm giúp tìm được các dấu hiệu nhiễm trùng hoặc hội chứng cận ung trong dịch não tủy. Xét nghiệm máu: Đánh giá chức năng của cơ quan và xác định thiếu vitamin nào hay không. Các xét nghiệm hình ảnh học như chụp CT, MRI: Những xét nghiệm này cho thấy dấu hiệu tổn thương mô não. Họ cũng có thể phát hiện các vấn đề như khối u và đột quỵ não. Xét nghiệm di truyền : Xét nghiệm giúp tìm các gen đột biến gây bệnh và đánh giá mức độ di truyền trong gia đình. 
Chụp CT có thể giúp bác sĩ chẩn đoán hội chứng tiểu não Phương pháp điều trị hội chứng tiểu não Nội khoa Việc điều trị thường phụ thuộc vào nguyên nhân cơ bản gây ra hội chứng tiểu não. Tuy nhiên, thuốc đôi khi có thể giúp kiểm soát một số triệu chứng nhất định, bao gồm run hoặc các vấn đề khi đi bộ và chóng mặt. Phương pháp điều trị hội chứng tiểu não do hội chứng cận u có thể bao gồm: Hóa trị ; Xạ trị; Thuốc ức chế miễn dịch; Liệu pháp miễn dịch. Điều trị hội chứng tiểu não liên quan đến rượu có thể bao gồm: Không uống rượu; Bạn cũng có thể cần thực phẩm bổ sung thiamin và các vitamin B khác; Thuốc bổ sung thiamine. Những người mắc bất kỳ dạng hội chứng tiểu não nào cũng có thể cần vật lý trị liệu, trị liệu nghề nghiệp hoặc trị liệu ngôn ngữ. Các liệu pháp này có thể giúp bạn cải thiện: Kỹ năng vận động để bạn có thể thực hiện các hoạt động hàng ngày; Sức mạnh và sự phối hợp của cơ; Kỹ năng nuốt, nói và ngôn ngữ. Ngoại khoa Nếu nguyên nhân dẫn đến hội chứng tiểu não là đột quỵ, u não, các bác sĩ có thể chỉ định phẫu thuật để điều trị và ngăn ngừa biến chứng của bệnh. Bác sĩ có thể chỉ định phẫu thuật nếu nguyên nhân gây bệnh là do đột quỵ hoặc u não' - source_sentence: "Thưa bác sĩ,\r\n\r\nThời gian gần đây tôi gầy sút cân nhiều, đau\ \ tức ngực, khó khăn, đi khám Bệnh viện Phổi TW kết luận tôi bị lao phổi, chuyển\ \ tuyến cơ sở điều trị. Vậy tôi có thể điều trị ở tuyến trung ương không?\r\n\r\ \nTuyến cơ sở kê đơn N-acetylcystein (Dismolan) 200mg/10ml ngày uống 2 ống /2\ \ lần; Pimagie uống ngày 2 viên/2 lần; bột bèo hoa dâu (Mediphylamin) 500mg uống\ \ ngày 2 viễn/2 lần. Đơn thuốc có đúng phác đồ điều trị không?" sentences: - 'Hình minh họa Chào em, Nguyên nhân thường gặp nhất gây ra hiện tượng này là viêm kết mạc mức độ nhẹ, em chưa bị nhìn mờ thì giác mạc chưa viêm. Virus là nguyên nhân hay gặp nhất, trong đó khoảng 80% là Adenovirus. Bệnh dễ lây lan khi tiếp xúc trực tiếp với nước mắt bệnh nhân.ở mức độ này thì em chưa cần phải sử dụng thuốc nhỏ mắt có kháng sinh, vì bệnh có thể tự khỏi mà không cần điều trị. Việc điều trị chủ yếu là điều trị triệu chứng bao gồm chườm mát, rửa mắt bằng nước muối sinh lý bình thường hay VRhoto, nếu thấy khô mắt thì nhỏ thêm nước mắt nhân tạo. Nếu bệnh diễn tiến xấu hơn như đỏ hơn, đau, tăng tiết ghèn, giảm thị lực thì phải đến khám tại BS chuyên khoa mắt để BS xử lý cho phù hợp, tránh sử dụng kháng sinh và giảm viêm bừa bãi. Ngoài ra em cần chú ý: Sử dụng khăn mặt, vật dụng cá nhân riêng trong nhà và nơi học tập làm việc.Không dụi mắt, che miệng mũi khi hắt hơi. Rửa tay bằng dung dịch sát khuẩn hàng ngày, đặc biệt là sau khi tiếp xúc với người bệnh. Mang kính bảo vệ mắt khi ra ngoài hoặc làm việc trong môi trường nhiều khói, bụi, hóa chất,... Tăng cường bổ sung vitamin A, C, E,...' - 'Chế độ sinh hoạt & phòng ngừa bướu giáp keo Những thói quen sinh hoạt có thể giúp bạn hạn chế diễn tiến của bướu giáp keo Chế độ sinh hoạt: Uống nhiều nước ít nhất 2 lít nước/ngày. Bỏ rượu bia, thuốc lá. Duy trì cân nặng bình thường. Khám sức khỏe định kỳ mỗi 6 tháng và liên hệ ngay với bác sĩ nếu các triệu chứng nặng hơn hoặc xuất hiện triệu chứng mới. Quản lý căng thẳng. Tập thể dục thường xuyên. Hãy hỏi ý kiến bác sĩ khi sử dụng thêm bất kỳ loại thuốc điều trị hoặc thực phẩm chức năng nào. Chế độ dinh dưỡng: Hằng ngày, bạn nên bổ sung khoảng 150 microgam muối iốt tương đương với ½ muỗng cà phê để ngăn ngừa phát triển bướu giáp. 
Ngoài ra có một số thực phẩm được các nhà khoa học khuyến cáo là có thể làm nặng hơn tình trạng bướu giáp, vì thực phẩm này có thể làm giảm sự hấp thu hormon tuyến giáp T4 và dẫn đến suy giảm chức năng tuyến giáp. Các thực phẩm cần tránh hoặc ăn với số lượng ít bao gồm: Súp lơ, bông cải xanh, mù tạt xanh, củ sắn, đậu lima, khoai lang, đậu nành và các sản phẩm từ đậu nành, trà xanh. Hãy liên hệ với chuyên gia dinh dưỡng để được tư vấn chế độ ăn phù hợp với tình trạng hiện tại của bạn. Áp dụng chế độ ăn bổ sung iốt đều ngăn ngừa bướu giáp keo do thiếu iốt Phương pháp phòng ngừa bướu giáp keo hiệu quả Bướu giáp keo do thiếu iốt là loại bướu giáp duy nhất bạn có thể phòng ngừa. Áp dụng một chế độ ăn bao gồm cá, sữa và một lượng muối iốt vừa đủ sẽ ngăn ngừa loại bướu giáp này. Ngoài ra, bạn có thể cố gắng giảm nguy cơ phát triển bướu giáp keo bằng cách hạn chế các yếu tố nguy cơ như duy trì cân nặng lý tưởng và hãy cố gắng bỏ thuốc lá.' - Chào bạn, Hiện nay phác đồ điều trị lao của Quốc gia tại tuyến cơ sở cũng thống nhất với tuyến Trung ương, tức là theo tiêu chuẩn của Tổ chức Y tế Thế giới nên bạn không cần phải lo lắng về chất lượng thuốc và phương pháp điều trị. Thông thường nếu bệnh nhân yêu cầu điều trị tại tuyến Trung Ương sẽ được điều trị dịch vụ, tức là bệnh nhân tự chi trả; trong khi nếu về địa phương bạn sẽ được miễn phí hoàn toàn. Điều này giúp khuyến khích bệnh nhân tuân thủ việc lấy thuốc định kỳ hơn (do gần nơi ở), ngoài ra còn giảm tải cho các bệnh viện ở tuyến trên. Trong các thuốc bạn liệt kê không có thuốc nào điều trị lao phổi, bạn nên quay lại Trạm chống lao nơi nhận thuốc để hỏi lại, bạn không nên tự bỏ tiền ra mua các thuốc trên mà chỉ nên nhận thuốc được cấp phát. Thân mến. - source_sentence: "Chào bác sĩ,\r\n\r\nEm có tập thể hình được 3 tháng, bây giờ ở\ \ đầu vú phải nổi cục ở chính đầu vú, sờ vào đau to bằng đầu ngón út, sờ có vẻ\ \ có di chuyển qua lại nhưng cố định ở đầu vú. Liệu có phải em bị ung thư vú không\ \ ạ, em cảm ơn." sentences: - 'Chào em Tuyết, Trường hợp của em tôi nghĩ có những việc sau đây cần làm: - Ngưng thuốc dạ dày trong 2 tuần, đi xét nghiệm hơi thở hoặc nội soi dạ dày kiểm tra xem lành hay chưa và còn vi trùng Hp trong dạ dày hay không - Siêu âm bụng kiểm tra tổng quát ổ bụng - Nội soi ruột già để kiểm tra xem ruột già có bị bất thường hay không. Cái này thì siêu âm không thể thấy được. - Xét nghiệm phân Tuy nhiên, em vẫn nên đến khám bệnh với BS chuyên khoa Tiêu hóa để được tư vấn và xét nghiệm phù hợp. Thân mến.' - ' Chào em, Em không nên quá lo lắng, việc uống thuốc này không ảnh hưởng đến cuộc phẫu thuật của em. Tuy nhiên trong thời gian hậu phẫu em nên ngưng thuốc đến khi hồi phục, sinh hoạt bình thường trở lại mới tiếp tục dùng thuốc em nhé. Điều không phải là một tình trạng cấp bách vì vậy có thể trì hoãn điều trị. Thân mến! ' - 'Phương pháp chẩn đoán & điều trị ung thư mũi Phương pháp xét nghiệm và chẩn đoán ung thư mũi Trong trường hợp không được phát hiện và điều trị ung thư mũi sớm, người bệnh sẽ đối mặt với nhiều biến chứng nguy hiểm, thậm chí ảnh hưởng đến tính mạng. Do đó, ngay khi có những biểu hiện nghi ngờ mắc bệnh, bạn hãy nhanh chóng đến bác sĩ để được chỉ định thực hiện các xét nghiệm nhằm chẩn đoán và có hướng xử trí phù hợp. 
Bệnh cạnh việc soi chụp ảnh khối u trong mũi, bác sĩ sẽ xem xét và chỉ định người bệnh thực hiện thêm một vài xét nghiệm để có thêm cơ sở chẩn đoán như: Soi mũi họng: Người bệnh được kiểm tra các bất thường của mũi, kiểm tra đồng thời vùng mặt và cổ, khu vực có khối u và nơi các hạch bạch huyết bị sưng. Soi chụp X-quang đầu và cổ: Hình ảnh chụp được sẽ giúp bác sĩ quan sát được toàn bộ vùng bên trong mũi cũng như các xoang cạnh mũi nhằm định vị chính xác nơi có khối u. Sinh thiết : Bác sĩ lấy một mẫu nhỏ ở các khu vực nghi ngờ bị ung thư để mang đi kiểm tra, tìm kiếm tế bào ung thư. Chụp CT: Hình ảnh chụp được giúp bác sĩ xác định xem tế bào ung thư đã lây lan sang bộ phận nào, có xâm lấn sang cơ quan lân cận hay chưa. Chụp X-quang giúp bác sĩ quan sát khối u trong mũi Phương pháp điều trị ung thư mũi hiệu quả Mỗi bệnh nhân sẽ có giai đoạn mắc bệnh khác nhau nên việc chẩn đoán ung thư cũng cần bác sĩ thăm khám cẩn thận và đưa ra phác đồ điều trị riêng cho từng trường hợp. Đa phần nếu người bệnh được phát hiện vào giai đoạn sớm của bệnh sẽ được điều trị phẫu thuật nhằm ngăn chặn khối u phát triển hoàn toàn. Trong trường hợp khối u đã phát triển tăng dần kích thước, chuyển sang giai đoạn di căn đến các cơ quan và khu vực khác thì bác sĩ sẽ xem xét các phương án điều trị khác, đó có thể là sự kết hợp của phẫu thuật, hóa trị và xạ trị. Cụ thể từng phương pháp như sau: Phẫu thuật: Bác sĩ tiến hành loại khối u ra khỏi khoang mũi. Ngày nay, y học hiện đại đã có nhiều tiến bộ vượt bậc dẫn đến kỹ thuật phẫu thuật cũng được nâng cao. Bằng biện pháp phẫu thuật nội soi, bác sĩ sẽ cắt bỏ hoàn toàn khối u và các mô xung quanh trong khoang mũi. Sau phẫu thuật, nếu bệnh nhân có nguy cơ tái phát cao thì sẽ được điều trị xạ trị. Xạ trị : Dùng tia bức xạ ion chiếu vào nhằm mục đích kiểm soát, tiêu diệt và phá hủy tế bào ung thư. Phương pháp này không làm bệnh nhân bị đau đớn. Hóa trị : Đây là phương pháp điều trị ung thư mũi phổ biến thông qua cách đưa thuốc dược tính rất mạnh vào cơ thể để tiêu diệt tế bào ung thư. Tuy có tác dụng tiêu diệt tế bào ung thư nhưng song song với đó thì hóa trị lại gây tổn hại nhiều đến những tế bào khỏe mạnh. Bệnh nhân điều trị bằng hóa trị sẽ gặp một số tác dụng phụ không mong muốn như rụng tóc, buồn nôn , sức đề kháng kém… Xạ trị là một trong các phương pháp điều trị ung thư mũi Chống chọi với ung thư là một cuộc chiến dài, đòi hỏi cả người bệnh lẫn người nhà bệnh nhân phải có sự kiên cường và can đảm. Bệnh nhân hãy trang bị cho mình một tinh thần vững vàng bởi đây là chìa khóa để nâng cao chất lượng cuộc sống và kéo dài tính mạng.' - source_sentence: Chế độ sinh hoạt & phòng ngừa viêm phụ khoa sentences: - 'Chế độ sinh hoạt & phòng ngừa viêm phụ khoa Những thói quen sinh hoạt có thể giúp bạn hạn chế diễn tiến của viêm phụ khoa Chế độ sinh hoạt: Giữ vệ sinh vùng kín đúng cách, ưu tiên dùng nước sạch hoặc dung dịch vệ sinh phù hợp có pH cân bằng. Tránh thụt rửa âm đạo làm mất cân bằng vi khuẩn tự nhiên trong âm đạo, gây viêm nhiễm hoặc làm tình trạng viêm phụ khoa nặng hơn. Sử dụng quần lót thoáng khí: Chọn quần lót bằng cotton, tránh mặc quần lót chật hoặc quần ẩm ướt để giảm thiểu nguy cơ nhiễm nấm và vi khuẩn. Tạm ngừng quan hệ tình dục trong thời gian điều trị viêm phụ khoa. Tránh sử dụng băng vệ sinh, tampon hoặc khăn lau chứa hương liệu có thể gây kích ứng. Đảm bảo thay băng vệ sinh và tampon thường xuyên trong kỳ kinh. Tránh căng thẳng vì stress làm suy yếu hệ miễn dịch, khiến cơ thể khó chống lại vi khuẩn và nấm gây bệnh. 
Vệ sinh vùng kín đúng cách Chế độ dinh dưỡng: Tăng cường thực phẩm giàu probiotic như sữa chua, kefir, miso và kim chi chứa lợi khuẩn giúp cân bằng hệ vi sinh trong cơ thể, đặc biệt là trong âm đạo. Bổ sung thực phẩm chứa chất chống viêm như cá hồi, dầu ô liu, quả óc chó và hạt chia giàu omega-3 có tác dụng giảm viêm, hỗ trợ cơ thể chống lại các tác nhân gây viêm nhiễm. Hạn chế chất bột đường vì có thể thúc đẩy sự phát triển của nấm Candida gây nhiễm nấm âm đạo. Tránh đồ ngọt, bánh mì trắng và thực phẩm chế biến sẵn. Uống đủ nước giúp cơ thể thanh lọc và loại bỏ các độc tố, đồng thời duy trì độ ẩm tự nhiên của vùng kín, ngăn ngừa tình trạng khô và kích ứng âm đạo. Bổ sung thực phẩm giàu vitamin C giúp tăng cường hệ miễn dịch, hỗ trợ cơ thể trong việc phòng ngừa và điều trị viêm phụ khoa. Các thực phẩm giàu vitamin C bao gồm cam, chanh, kiwi và dâu tây. Bổ sung tỏi và thực phẩm kháng khuẩn tự nhiên, có thể giúp ngăn ngừa nhiễm trùng âm đạo. Phòng ngừa viêm phụ khoa Để phòng ngừa viêm phụ khoa, các chị em cần chú ý một số điểm sau: Giữ vệ sinh vùng kín đúng cách: Rửa nhẹ nhàng vùng kín bằng nước sạch hoặc dung dịch vệ sinh phù hợp có pH cân bằng. Lau từ trước ra sau sau khi đi vệ sinh để tránh vi khuẩn từ hậu môn lây lan lên âm đạo. Tránh thụt rửa âm đạo: Thụt rửa có thể làm mất cân bằng vi khuẩn tự nhiên trong âm đạo, gây viêm nhiễm hoặc làm tình trạng viêm phụ khoa nặng hơn. Sử dụng quần lót thoáng khí: Chọn quần lót bằng cotton, tránh mặc quần lót chật hoặc quần ẩm ướt để giảm thiểu nguy cơ nhiễm nấm và vi khuẩn. Quan hệ tình dục an toàn: Sử dụng bao cao su khi quan hệ tình dục và tránh quan hệ với nhiều bạn tình để giảm nguy cơ lây nhiễm bệnh qua đường tình dục. Tránh sử dụng băng vệ sinh, tampon hoặc khăn lau chứa hương liệu có thể gây kích ứng. Đảm bảo thay băng vệ sinh và tampon thường xuyên trong kỳ kinh. Tránh căng thẳng: Thực hành các kỹ thuật giảm căng thẳng như thiền, yoga, hoặc tập thể dục thường xuyên. Chế độ ăn uống cân bằng, giảm đường và thực phẩm chế biến sẵn, tăng cường bổ sung probiotic, giữ tinh thần thoải mái, và tránh căng thẳng. Bổ sung lợi khuẩn probiotic' - "Mô tả ngắn:\nThuốc Otibone Plus của Công ty Cổ phần Dược phẩm Boston Việt Nam,\ \ thành phần chính là Natri chondroitin sulfat, Glucosamin HCl, Methyl sulfonyl\ \ methan. Thuốc có tác dụng giảm các triệu chứng của thoái hóa khớp gối nhẹ và\ \ trung bình. \n Thuốc Otibone Plus được bào chế dạng viên nén bao phim, đóng\ \ gói theo quy cách hộp 3 vỉ x 10 viên bao phim, hộp 6 vỉ x 10 viên bao phim.\n\ Thành phần:\nGlucosamine: 500mg\nChondroitin: 400mg\nMSM: 167mg\nChỉ định:\nThuốc\ \ Otibone Plus dùng điều trị giảm các triệu chứng của thoái hóa khớp gối nhẹ và\ \ trung bình." - 'Mô tả ngắn: Thuốc Ravenell-62,5 là sản phẩm của Công ty Cổ phần Dược phẩm Đạt Vi Phú chứa hoạt chất Bosentan (dưới dạng Bosentan monohydrat) dùng điều trị tăng áp lực động mạch phổi (PAH) để cải thiện khả năng gắng sức và triệu chứng ở bệnh nhân độ III theo phân loại của WHO. Thành phần: Bosentan: 62.5mg Chỉ định: Thuốc Ravenell-62,5 chỉ định điều trị tăng áp lực động mạch phổi (PAH) để cải thiện khả năng gắng sức và triệu chứng ở bệnh nhân độ III theo phân loại của WHO. Hiệu quả đã được chứng minh trong: Tăng áp lực động mạch phổi tiên phát (vô căn hoặc di truyền). Tăng áp lực động mạch phổi thứ phát do xơ cứng bì mà không có bệnh phổi kẽ nặng. Đã ghi nhận bosentan cho một số tác dụng cải thiện ở bệnh nhân bị tăng áp lực động mạch phổi độ II theo phân loại WHO. 
Bosentan cũng được chỉ định để giảm số lượng vết loét ngón tay/chân mới ở bệnh nhân xơ cứng bì toàn thể và vết loét ngón tay/chân đang tiến triển.' model-index: - name: SentenceTransformer based on vinai/phobert-base-v2 results: - task: type: information-retrieval name: Information Retrieval dataset: name: Unknown type: unknown metrics: - type: cosine_accuracy@1 value: 0.5306793279766253 name: Cosine Accuracy@1 - type: cosine_accuracy@2 value: 0.6227173119065011 name: Cosine Accuracy@2 - type: cosine_accuracy@5 value: 0.7363038714390066 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.8080715850986121 name: Cosine Accuracy@10 - type: cosine_accuracy@100 value: 0.9702337472607743 name: Cosine Accuracy@100 - type: cosine_precision@1 value: 0.5306793279766253 name: Cosine Precision@1 - type: cosine_precision@2 value: 0.3113586559532506 name: Cosine Precision@2 - type: cosine_precision@5 value: 0.1472607742878013 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.0808071585098612 name: Cosine Precision@10 - type: cosine_precision@100 value: 0.009702337472607741 name: Cosine Precision@100 - type: cosine_recall@1 value: 0.5306793279766253 name: Cosine Recall@1 - type: cosine_recall@2 value: 0.6227173119065011 name: Cosine Recall@2 - type: cosine_recall@5 value: 0.7363038714390066 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.8080715850986121 name: Cosine Recall@10 - type: cosine_recall@100 value: 0.9702337472607743 name: Cosine Recall@100 - type: cosine_ndcg@10 value: 0.6629885947922608 name: Cosine Ndcg@10 - type: cosine_mrr@1 value: 0.5306793279766253 name: Cosine Mrr@1 - type: cosine_mrr@2 value: 0.5766983199415632 name: Cosine Mrr@2 - type: cosine_mrr@5 value: 0.6077489651813983 name: Cosine Mrr@5 - type: cosine_mrr@10 value: 0.6172970509119163 name: Cosine Mrr@10 - type: cosine_mrr@100 value: 0.6244975627932318 name: Cosine Mrr@100 - type: cosine_map@100 value: 0.6244975627932318 name: Cosine Map@100 - type: dot_accuracy@1 value: 0.5080350620891162 name: Dot Accuracy@1 - type: dot_accuracy@2 value: 0.6073776479181885 name: Dot Accuracy@2 - type: dot_accuracy@5 value: 0.7235208181154127 name: Dot Accuracy@5 - type: dot_accuracy@10 value: 0.7983929875821768 name: Dot Accuracy@10 - type: dot_accuracy@100 value: 0.970781592403214 name: Dot Accuracy@100 - type: dot_precision@1 value: 0.5080350620891162 name: Dot Precision@1 - type: dot_precision@2 value: 0.30368882395909425 name: Dot Precision@2 - type: dot_precision@5 value: 0.14470416362308253 name: Dot Precision@5 - type: dot_precision@10 value: 0.07983929875821766 name: Dot Precision@10 - type: dot_precision@100 value: 0.009707815924032139 name: Dot Precision@100 - type: dot_recall@1 value: 0.5080350620891162 name: Dot Recall@1 - type: dot_recall@2 value: 0.6073776479181885 name: Dot Recall@2 - type: dot_recall@5 value: 0.7235208181154127 name: Dot Recall@5 - type: dot_recall@10 value: 0.7983929875821768 name: Dot Recall@10 - type: dot_recall@100 value: 0.970781592403214 name: Dot Recall@100 - type: dot_ndcg@10 value: 0.6471799204598053 name: Dot Ndcg@10 - type: dot_mrr@1 value: 0.5080350620891162 name: Dot Mrr@1 - type: dot_mrr@2 value: 0.5577063550036523 name: Dot Mrr@2 - type: dot_mrr@5 value: 0.5894691989286586 name: Dot Mrr@5 - type: dot_mrr@10 value: 0.5994910692545828 name: Dot Mrr@10 - type: dot_mrr@100 value: 0.6071622646317102 name: Dot Mrr@100 - type: dot_map@100 value: 0.6071622646317109 name: Dot Map@100 --- # SentenceTransformer based on vinai/phobert-base-v2 This is a 
[sentence-transformers](https://www.SBERT.net) model finetuned from [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) <!-- at revision e2375d266bdf39c6e8e9a87af16a5da3190b0cc8 -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("meandyou200175/phobert_ContrastiveLoss")
# Run inference
sentences = [
    'Chế độ sinh hoạt & phòng ngừa viêm phụ khoa',
    'Chế độ sinh hoạt & phòng ngừa viêm phụ khoa Những thói quen sinh hoạt có thể giúp bạn hạn chế diễn tiến của viêm phụ khoa Chế độ sinh hoạt: Giữ vệ sinh vùng kín đúng cách, ưu tiên dùng nước sạch hoặc dung dịch vệ sinh phù hợp có pH cân bằng. Tránh thụt rửa âm đạo làm mất cân bằng vi khuẩn tự nhiên trong âm đạo, gây viêm nhiễm hoặc làm tình trạng viêm phụ khoa nặng hơn. Sử dụng quần lót thoáng khí: Chọn quần lót bằng cotton, tránh mặc quần lót chật hoặc quần ẩm ướt để giảm thiểu nguy cơ nhiễm nấm và vi khuẩn. Tạm ngừng quan hệ tình dục trong thời gian điều trị viêm phụ khoa. Tránh sử dụng băng vệ sinh, tampon hoặc khăn lau chứa hương liệu có thể gây kích ứng. Đảm bảo thay băng vệ sinh và tampon thường xuyên trong kỳ kinh. Tránh căng thẳng vì stress làm suy yếu hệ miễn dịch, khiến cơ thể khó chống lại vi khuẩn và nấm gây bệnh. Vệ sinh vùng kín đúng cách Chế độ dinh dưỡng: Tăng cường thực phẩm giàu probiotic như sữa chua, kefir, miso và kim chi chứa lợi khuẩn giúp cân bằng hệ vi sinh trong cơ thể, đặc biệt là trong âm đạo. Bổ sung thực phẩm chứa chất chống viêm như cá hồi, dầu ô liu, quả óc chó và hạt chia giàu omega-3 có tác dụng giảm viêm, hỗ trợ cơ thể chống lại các tác nhân gây viêm nhiễm. Hạn chế chất bột đường vì có thể thúc đẩy sự phát triển của nấm Candida gây nhiễm nấm âm đạo. Tránh đồ ngọt, bánh mì trắng và thực phẩm chế biến sẵn. Uống đủ nước giúp cơ thể thanh lọc và loại bỏ các độc tố, đồng thời duy trì độ ẩm tự nhiên của vùng kín, ngăn ngừa tình trạng khô và kích ứng âm đạo. 
Bổ sung thực phẩm giàu vitamin C giúp tăng cường hệ miễn dịch, hỗ trợ cơ thể trong việc phòng ngừa và điều trị viêm phụ khoa. Các thực phẩm giàu vitamin C bao gồm cam, chanh, kiwi và dâu tây. Bổ sung tỏi và thực phẩm kháng khuẩn tự nhiên, có thể giúp ngăn ngừa nhiễm trùng âm đạo. Phòng ngừa viêm phụ khoa Để phòng ngừa viêm phụ khoa, các chị em cần chú ý một số điểm sau: Giữ vệ sinh vùng kín đúng cách: Rửa nhẹ nhàng vùng kín bằng nước sạch hoặc dung dịch vệ sinh phù hợp có pH cân bằng. Lau từ trước ra sau sau khi đi vệ sinh để tránh vi khuẩn từ hậu môn lây lan lên âm đạo. Tránh thụt rửa âm đạo: Thụt rửa có thể làm mất cân bằng vi khuẩn tự nhiên trong âm đạo, gây viêm nhiễm hoặc làm tình trạng viêm phụ khoa nặng hơn. Sử dụng quần lót thoáng khí: Chọn quần lót bằng cotton, tránh mặc quần lót chật hoặc quần ẩm ướt để giảm thiểu nguy cơ nhiễm nấm và vi khuẩn. Quan hệ tình dục an toàn: Sử dụng bao cao su khi quan hệ tình dục và tránh quan hệ với nhiều bạn tình để giảm nguy cơ lây nhiễm bệnh qua đường tình dục. Tránh sử dụng băng vệ sinh, tampon hoặc khăn lau chứa hương liệu có thể gây kích ứng. Đảm bảo thay băng vệ sinh và tampon thường xuyên trong kỳ kinh. Tránh căng thẳng: Thực hành các kỹ thuật giảm căng thẳng như thiền, yoga, hoặc tập thể dục thường xuyên. Chế độ ăn uống cân bằng, giảm đường và thực phẩm chế biến sẵn, tăng cường bổ sung probiotic, giữ tinh thần thoải mái, và tránh căng thẳng. Bổ sung lợi khuẩn probiotic', 'Mô tả ngắn:\nThuốc Ravenell-62,5 là sản phẩm của Công ty Cổ phần Dược phẩm Đạt Vi Phú chứa hoạt chất Bosentan (dưới dạng Bosentan monohydrat) dùng điều trị tăng áp lực động mạch phổi (PAH) để cải thiện khả năng gắng sức và triệu chứng ở bệnh nhân độ III theo phân loại của WHO.\nThành phần:\nBosentan: 62.5mg\nChỉ định:\nThuốc Ravenell-62,5 chỉ định điều trị tăng áp lực động mạch phổi (PAH) để cải thiện khả năng gắng sức và triệu chứng ở bệnh nhân độ III theo phân loại của WHO. Hiệu quả đã được chứng minh trong:\nTăng áp lực động mạch phổi tiên phát (vô căn hoặc di truyền). Tăng áp lực động mạch phổi thứ phát do xơ cứng bì mà không có bệnh phổi kẽ nặng.\nĐã ghi nhận bosentan cho một số tác dụng cải thiện ở bệnh nhân bị tăng áp lực động mạch phổi độ II theo phân loại WHO.\nBosentan cũng được chỉ định để giảm số lượng vết loét ngón tay/chân mới ở bệnh nhân xơ cứng bì toàn thể và vết loét ngón tay/chân đang tiến triển.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:---------------------|:-----------| | cosine_accuracy@1 | 0.5307 | | cosine_accuracy@2 | 0.6227 | | cosine_accuracy@5 | 0.7363 | | cosine_accuracy@10 | 0.8081 | | cosine_accuracy@100 | 0.9702 | | cosine_precision@1 | 0.5307 | | cosine_precision@2 | 0.3114 | | cosine_precision@5 | 0.1473 | | cosine_precision@10 | 0.0808 | | cosine_precision@100 | 0.0097 | | cosine_recall@1 | 0.5307 | | cosine_recall@2 | 0.6227 | | cosine_recall@5 | 0.7363 | | cosine_recall@10 | 0.8081 | | cosine_recall@100 | 0.9702 | | cosine_ndcg@10 | 0.663 | | cosine_mrr@1 | 0.5307 | | cosine_mrr@2 | 0.5767 | | cosine_mrr@5 | 0.6077 | | cosine_mrr@10 | 0.6173 | | cosine_mrr@100 | 0.6245 | | **cosine_map@100** | **0.6245** | | dot_accuracy@1 | 0.508 | | dot_accuracy@2 | 0.6074 | | dot_accuracy@5 | 0.7235 | | dot_accuracy@10 | 0.7984 | | dot_accuracy@100 | 0.9708 | | dot_precision@1 | 0.508 | | dot_precision@2 | 0.3037 | | dot_precision@5 | 0.1447 | | dot_precision@10 | 0.0798 | | dot_precision@100 | 0.0097 | | dot_recall@1 | 0.508 | | dot_recall@2 | 0.6074 | | dot_recall@5 | 0.7235 | | dot_recall@10 | 0.7984 | | dot_recall@100 | 0.9708 | | dot_ndcg@10 | 0.6472 | | dot_mrr@1 | 0.508 | | dot_mrr@2 | 0.5577 | | dot_mrr@5 | 0.5895 | | dot_mrr@10 | 0.5995 | | dot_mrr@100 | 0.6072 | | dot_map@100 | 0.6072 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 87,608 training samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 6 tokens</li><li>mean: 75.42 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 188.29 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>0: ~50.00%</li><li>1: ~50.00%</li></ul> | * Samples: | sentence1 | sentence2 | label | |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | <code>Em bị 1 vết sưng tấy đường kính khoảng 2 cm hồng hồng ở vùng da thịt mông, nó không phải mụn nhọt, nó giống vết bị côn trùng cắt, không sưng u mà giống như miếng thịt sần sùi nhấp nhô. Không đau lắm, nhưng rờ vô thì cứng cứng đau nhói. Nằm ở vùng da mông không có gần hậu môn. Lúc trước nó xuất hiện rồi lại biến mất. Mỗi lần đi ngoài táo bón là hình như nó xuất hiện. BS coi giùm em nó bị gì, em hơi lo lắng. Em cảm ơn. <br> <br>(Hoàng Bảo - Đồng Nai)</code> | <code> Chào em Bảo, Sang thương da của em gồ trên bề mặt da, xù xì, tái đi tái lại nhiều lần. Với tính chất này thì ít nghĩ đến trường hợp dị ứng da thông thường. Do sang thương gồ và cứng nhiều khả năng cần sinh thiết da mới biết rõ bản chất để loại trừ nguyên nhân bệnh ở da. Không rõ hiện tại em có còn sang thương trên da hay không. Nếu vẫn còn thì em đừng ngại, hãy đến khám BS da liễu để tìm ra bệnh và điều trị đúng cách. Thân mến! </code> | <code>1</code> | | <code>Em bị 1 vết sưng tấy đường kính khoảng 2 cm hồng hồng ở vùng da thịt mông, nó không phải mụn nhọt, nó giống vết bị côn trùng cắt, không sưng u mà giống như miếng thịt sần sùi nhấp nhô. Không đau lắm, nhưng rờ vô thì cứng cứng đau nhói. Nằm ở vùng da mông không có gần hậu môn. Lúc trước nó xuất hiện rồi lại biến mất. Mỗi lần đi ngoài táo bón là hình như nó xuất hiện. BS coi giùm em nó bị gì, em hơi lo lắng. Em cảm ơn. 
<br> <br>(Hoàng Bảo - Đồng Nai)</code> | <code> Thanh thân mến, Mô tả của em khá sơ sài, không thể dựa vào cảm nhận mà biết là hay do nguyên nhân nào khác được. Em có thể chụp hình tổn thương gửi về chương trình, mô tả kĩ hơn về thời gian đau, hoàn cảnh khởi phát, tính chất đau, hướng lan cũng như những bệnh lý liên quan để BS tư vấn cụ thể hơn, em nhé! Trân trọng!</code> | <code>0</code> | | <code>Chào AloBacsi, <br> <br>Em 22 tuổi bị u tuyến yên, phát hiện bệnh năm 2013 với kích thước 9*11*13mm, prolactin 9933, đã dùng thuốc Doxtinex. Tháng 6/2014 đã mổ nội soi lấy u tại BV Việt Đức. Sau mổ 1 tháng và dùng thuốc Doxtinex (4 viên), em tái khám prolactin 66, BS dặn tiếp tục dùng thuốc, 3 tháng khám lại. <br> <br>AloBacsi ơi, trong quá trình dùng thuốc thì dấu hiệu nào nên đi khám ngay ạ? Em chân thành cám ơn! (Ho Van Anh - Bắc Ninh)</code> | <code>- nguồn internet Bạn Anh thân mến, Qua các thông tin bạn cung cấp thì có khả năng bạn bị tiết Prolactin, kích thước khối u này khá lớn, có nhiều khả năng bệnh sẽ tái phát (khi mổ, phẫu thuật viên không thể lấy hết khối u được). BS dặn bạn tái khám là đúng và trong khi đang uống thuốc mà xuất hiện các tác dụng phụ của thuốc hay bệnh tái phát thì bạn nên khám lại ngay. Các tác dụng phụ của Doxtinex có thể gặp: buồn nôn, nôn, choáng váng đau dầu, mệt mỏi, táo bón,… Hay các biểu hiện bênh tái phát như: tiết sữa bất thường (mà không có thai), không có kinh, kinh ít hay vô sinh, giảm ham muốn tình dục,… lúc đó, bạn cần tái khám ngay để đo nồng độ Prolactin. Chào bạn và chúc bạn luôn vui, khỏe.</code> | <code>1</code> | * Loss: [<code>ContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#contrastiveloss) with these parameters: ```json { "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE", "margin": 0.5, "size_average": true } ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 10,952 evaluation samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 5 tokens</li><li>mean: 79.68 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 186.71 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>0: ~50.00%</li><li>1: ~50.00%</li></ul> | * Samples: | sentence1 | sentence2 | label | 
|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------------------------|:---------------| | <code>Nguyên nhân viêm xoang</code> | <code>Nguyên nhân viêm xoang Nguyên nhân dẫn đến Viêm xoang Virus: Đa số các trường hợp bị viêm xoang là do chứng cảm lạnh thông thường với sự xâm nhập của các virus đến các xoang. Cảm lạnh có thể dẫn đến viêm xoang bởi lẽ mũi của người bệnh lúc này dễ bị kích ứng trước sự tấn công của các virus gây bệnh làm sung huyết các mô mũi, chặn bít các lỗ thông thường dẫn lưu xoang. Một số virus gây bệnh điển hình: Rhinoviruses; Adenovirus; Virus parainfluenza ở người; Virus hợp bào đường hô hấp; Enterovirus. Vi khuẩn: Khoảng 10% bệnh nhân bị viêm xoang do vi khuẩn. Nếu bị cảm lạnh và không có dấu hiệu thuyên giảm sau 10 – 15 ngày nguyên nhân có thể do vi khuẩn khu trú trong các khoang mũi họng, khi cơ thể gặp vấn đề về sức khỏe, chúng sẽ phát triển và gây bệnh. Cảm lạnh sau một thời gian sẽ biến chứng thành viêm xoang. Một số vi khuẩn gây bệnh điển hình: Haemophilus influenzae; Streptococcus pneumoniae ; Trực khuẩn mủ xanh ( P.aeruginosa ); E.coli ; Cầu khuẩn (tụ cầu và liên cầu); Klebsiella; Moraxella catarrhalis. Nấm: Nhiễm trùng xoang thường gặp ở người có hệ miễn dịch yếu nhưng người khỏe mạnh cũng không nằm ngoài nguy cơ. Aspergillus là loại nấm phổ biến gây viêm xoang. Khi hệ thống miễn dịch suy yếu, nấm có cơ hội phát triển, đặc biệt là trong môi trường ẩm và tối tăm như các xoang. Dị ứng: Người bị viêm xoang do dị ứng có xu hướng bị nặng hơn so với bệnh nhân mắc bệnh do yếu tố khác. Vì vậy, nếu cơ địa dễ bị mẫn cảm với phấn hoa, lông vật nuôi, nấm mốc, bụi bẩn, nước hoa… hãy tránh xa những thứ này. Polyp mũi: Polyp mũi là những u nhỏ lành tính phát triển từ các mô mũi hoặc xoang, khiến các hốc xoang bị tắc nghẽn, ngăn cản dịch mũi chảy ra và gây nhiễm trùng xoang. Những u nhỏ này cũng có thể hạn chế đường dẫn khí, gây đau đầu, giảm độ nhạy của khứu giác. Một số nguyên nhân khác: Ô nhiễm không khí; Lạm dụng các chất kích thích như rượu, bia, thuốc lá; Lạm dụng thuốc xịt mũi; Sự thay đổi đột ngột của áp suất không khí như khi đi máy bay, lặn sâu dưới biển; Ảnh hưởng từ các cuộc tiểu phẫu vùng mũi hoặc do biến chứng của việc tác động vật lý lên vùng mũi; Viêm mũi dị ứng ; Bất thường cấu trúc mũi.</code> | <code>1</code> | | <code>Nguyên nhân viêm xoang</code> | <code>Nguyên nhân viêm khớp vai Nguyên nhân dẫn đến viêm khớp vai Viêm khớp vai được chia thành nhiều loại, trong đó có 5 loại viêm khớp vai thường gặp nhất. Tùy thuộc vào loại viêm khớp vai mà có các nguyên nhân khác nhau, và tồn tại một số nguyên nhân gây viêm khớp vai vẫn chưa được biết rõ. Thoái hóa khớp vai Thoái hóa có thể ảnh hưởng đến các vị trí khớp như gối, háng, bàn tay và khớp vai của bạn. Tương tự như ở khớp gối, thoái hóa khớp vai là tình trạng hao mòn liên quan đến tiến trình lão hóa. Bên cạnh tuổi tác, các yếu tố di truyền, chấn thương hay tư thế gây áp lực lên khớp lâu ngày cũng sẽ thúc đẩy quá trình thoái hóa. Thoái hóa khớp gây ra sự phá hủy sụn bảo vệ khớp, sụn sẽ bị bào mòn và dần dần mất đi. Các triệu chứng có thể gặp phải là đau, hạn chế vận động và cứng khớp. Nếu không được điều trị, tình trạng có thể nặng dần theo thời gian, gây yếu cơ, mất vững và mất cử động khớp vai. Viêm khớp dạng thấp Viêm khớp dạng thấp là bệnh lý tự miễn có thể dẫn đến viêm khớp vai. Thông thường, viêm khớp dạng thấp sẽ ảnh hưởng ở cả hai bên cơ thể, nên khả năng bạn sẽ bị viêm cả hai bên vai. 
Đồng thời, bạn có thể gặp các triệu chứng khác kèm theo: Đau, nóng, sưng tại khớp vai. Cảm giác cứng khớp vai, đặc biệt là vào buổi sáng. Các nốt thấp: Là những vết sưng hình thành dưới da tại các bề mặt chịu áp lực như khuỷu tay, khớp ngón tay hoặc khớp vai. Mệt mỏi , sụt cân, sốt. Sụt cân có thể là một trong những dấu hiệu của viêm khớp dạng thấp Tình trạng viêm khớp dạng thấp nếu không được chẩn đoán và điều trị, lâu ngày có thể gây bào mòn xương và biến dạng khớp vai. Viêm khớp vai sau chấn thương Nếu bạn từng gãy xương hay trật khớp vai, bạn có thể sẽ gặp phải một tình trạng viêm khớp vai được gọi là viêm khớp sau chấn thương. Thông thường, cơ thể có thể sẽ tự hồi phục tình trạng viêm khớp sau chấn thương. Tuy nhiên, tình trạng này có thể trở thành mãn tính nếu triệu chứng kéo dài hơn 6 tháng. Hoại tử vô mạch Hoại tử vô mạch, hay còn gọi là hoại tử xương, hầu hết sẽ ảnh hưởng đến khớp háng (xương đùi). Tuy nhiên, tình trạng này có thể ảnh hưởng đến bất kỳ xương nào, trong đó có xương cánh tay, từ đó dẫn đến viêm khớp vai. Hoại tử vô mạch là tình trạng phát sinh do có sự gián đoạn trong việc cung cấp máu cho xương. Nguyên nhân gây hoại tử vô mạch có thể khác nhau như sử dụng nhiều corticoid, uống nhiều rượu, chấn thương vùng vai, bệnh hồng cầu hình liềm hay vô căn (không có nguyên nhân). Nếu không điều trị, tình trạng hoại tử vô mạch sẽ dẫn đến tổn thương xương, có thể cần phải can thiệp phẫu thuật. Bệnh lý rách chóp xoay Xương bả vai và phần trên của xương cánh tay được nối với nhau qua một tập hợp các gân và cơ được gọi là nhóm cơ chóp xoay. Khi các gân cơ chóp xoay bị rách (phổ biến nhất là do chấn thương), sẽ gây mất áp lực, chuyển động và sự ổn định ở vai. Nếu các vết rách không lành lại, hoặc vết rách quá lớn, sẽ dẫn đến tổn thương sụn và xương, từ đó dẫn đến bệnh lý viêm khớp vai do rách chóp xoay. Rách chóp xoay có thể gây viêm khớp vai Tình trạng viêm khớp vai do rách chóp xoay có thể dẫn đến đau dữ dội và suy nhược nếu không được điều trị. Đối phó với nguyên nhân, tránh biến chứng: Nguyên nhân đau mỏi 2 khớp vai thường gặp?</code> | <code>0</code> | | <code>Dạ xin chào bác sĩ,Tôi 71 tuổi, có bệnh cao huyết áp, giãn phế quản do viêm phế quản mãn tính. Tôi có một thắc mắc hỏi và xin được bác sĩ tư vấn cho ạ. Vào đầu năm 1989 tôi bị sốt do cảm cúm, đã uống 2 viên Paracetamol 500mg, có chích thêm Vitamin C đường tĩnh mạch. Sau đó tôi đã bị choáng và ngất phải nhập viện cấp cứu. Kết luận lúc ra viện là bị sốc phản vệ Vitamin C. Xin bác sĩ tư vấn cho tôi có thể chích vắc xin ngừa COVID-19 không ạ? Xin cảm ơn bác sĩ.</code> | <code>Hình minh họa Chào bác, Theo Hướng dẫn sàng lọc trước tiêm chủng vắc xin phòng COVID-19 của Bộ Y tế, những trường hợp có tiền sử dị ứng nặng (phản vệ từ độ 2 trở lên) do mọi nguyên nhân đều không được chỉ định tiêm vắc xin phòng COVID-19. 
Và trường hợp của bác là rơi vào tình huống này, lớn tuổi, có bệnh lý nền nặng, tiền căn thì có sốc phản vệ với vitamin C, cho nên bác sẽ được trì hoãn tiêm vắc xin phòng COVID-19 trong chiến dịch tiêm vắc xin ngoài cộng đồng hiện nay, bác nhé.</code> | <code>1</code> | * Loss: [<code>ContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#contrastiveloss) with these parameters: ```json { "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE", "margin": 0.5, "size_average": true } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 5 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 5 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - 
`push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs <details><summary>Click to expand</summary> | Epoch | Step | Training Loss | Validation Loss | cosine_map@100 | |:------:|:-----:|:-------------:|:---------------:|:--------------:| | 0 | 0 | - | - | 0.1388 | | 0.0183 | 100 | 0.0316 | - | - | | 0.0365 | 200 | 0.0255 | - | - | | 0.0548 | 300 | 0.016 | - | - | | 0.0730 | 400 | 0.0115 | - | - | | 0.0913 | 500 | 0.0098 | - | - | | 0.1096 | 600 | 0.0085 | - | - | | 0.1278 | 700 | 0.0086 | - | - | | 0.1461 | 800 | 0.0078 | - | - | | 0.1644 | 900 | 0.0077 | - | - | | 0.1826 | 1000 | 0.0073 | - | - | | 0.2009 | 1100 | 0.0069 | - | - | | 0.2191 | 1200 | 0.0073 | - | - | | 0.2374 | 1300 | 0.0066 | - | - | | 0.2557 | 1400 | 0.0067 | - | - | | 0.2739 | 1500 | 0.0066 | - | - | | 0.2922 | 1600 | 0.0066 | - | - | | 0.3104 | 1700 | 0.0068 | - | - | | 0.3287 | 1800 | 0.0057 | - | - | | 0.3470 | 1900 | 0.0059 | - | - | | 0.3652 | 2000 | 0.0065 | - | - | | 0.3835 | 2100 | 0.006 | - | - | | 0.4018 | 2200 | 0.006 | - | - | | 0.4200 | 2300 | 0.0057 | - | - | | 0.4383 | 2400 | 0.0054 | - | - | | 0.4565 | 2500 | 0.0057 | - | - | | 0.4748 | 2600 | 0.0057 | - | - | | 0.4931 | 2700 | 0.0053 | - | - | | 0.5113 | 2800 | 0.0056 | - | - | | 0.5296 | 2900 | 0.0051 | - | - | | 0.5478 | 3000 | 0.0056 | - | - | | 0.5661 | 3100 | 0.0059 | - | - | | 0.5844 | 3200 | 0.0052 | - | - | | 0.6026 | 3300 | 0.0051 | - | - | | 0.6209 | 3400 | 0.0051 | - | - | | 0.6392 | 3500 | 0.0054 | - | - | | 0.6574 | 3600 | 0.0054 | - | - | | 0.6757 | 3700 | 0.0049 | - | - | | 0.6939 | 3800 | 0.0049 | - | - | | 0.7122 | 3900 | 0.0053 | - | - | | 0.7305 | 4000 | 0.0047 | - | - | | 0.7487 | 4100 | 0.0045 | - | - | | 0.7670 | 4200 | 0.0048 | - | - | | 0.7852 | 4300 | 0.0045 | - | - | | 0.8035 | 4400 | 0.0045 | - | - | | 0.8218 | 4500 | 0.0044 | - | - | | 0.8400 | 4600 | 0.0044 | - | - | | 0.8583 | 4700 | 0.0047 | - | - | | 0.8766 | 4800 | 0.0045 | - | - | | 0.8948 | 4900 | 0.0045 | - | - | | 0.9131 | 5000 | 0.005 | - | - | | 0.9313 | 5100 | 0.0047 | - | - | | 0.9496 | 5200 | 0.0048 | - | - | | 0.9679 | 5300 | 0.0043 | - | - | | 0.9861 | 5400 | 0.0042 | - | - | | 1.0044 | 5500 | 0.0043 | - | - | | 1.0226 | 5600 | 0.0045 | - | - | | 1.0409 | 5700 | 0.0043 | - | - | | 1.0592 | 5800 | 0.0041 | - | - | | 1.0774 | 5900 | 0.0039 | - | - | | 1.0957 | 6000 | 0.0036 | - | - | | 1.1140 | 6100 | 0.004 | - | - | | 1.1322 | 6200 | 0.0043 | - | - | | 1.1505 | 6300 | 0.0036 | - | - | | 1.1687 | 6400 | 0.004 | - | - | | 1.1870 | 6500 | 0.0033 | - | - | | 1.2053 | 6600 | 0.0038 | - | - | | 1.2235 | 6700 | 0.0041 | - | - | | 1.2418 | 6800 | 0.0037 | - | - | | 1.2600 | 6900 | 0.0037 | - | - | | 1.2783 | 7000 | 0.0035 | - | - | | 1.2966 | 7100 | 0.0036 | - | - | | 1.3148 | 7200 | 0.0038 | - | - | | 1.3331 | 7300 | 0.003 | - | - | | 1.3514 | 7400 | 0.0034 | - | - | | 1.3696 | 7500 | 0.0036 | - | - | | 1.3879 | 7600 | 0.0033 | - 
| - |
| 1.4061 | 7700 | 0.0034 | - | - |
| 1.4244 | 7800 | 0.0031 | - | - |
| 1.4427 | 7900 | 0.0031 | - | - |
| 1.4609 | 8000 | 0.003 | - | - |
| 1.4792 | 8100 | 0.003 | - | - |
| 1.4974 | 8200 | 0.003 | - | - |
| 1.5157 | 8300 | 0.0028 | - | - |
| 1.5340 | 8400 | 0.0029 | - | - |
| 1.5522 | 8500 | 0.0031 | - | - |
| 1.5705 | 8600 | 0.0031 | - | - |
| 1.5888 | 8700 | 0.0031 | - | - |
| 1.6070 | 8800 | 0.0027 | - | - |
| 1.6253 | 8900 | 0.003 | - | - |
| 1.6435 | 9000 | 0.0029 | - | - |
| 1.6618 | 9100 | 0.0028 | - | - |
| 1.6801 | 9200 | 0.0027 | - | - |
| 1.6983 | 9300 | 0.0025 | - | - |
| 1.7166 | 9400 | 0.0029 | - | - |
| 1.7348 | 9500 | 0.0027 | - | - |
| 1.7531 | 9600 | 0.0025 | - | - |
| 1.7714 | 9700 | 0.0025 | - | - |
| 1.7896 | 9800 | 0.0023 | - | - |
| 1.8079 | 9900 | 0.0024 | - | - |
| 1.8262 | 10000 | 0.0024 | - | - |
| 1.8444 | 10100 | 0.0023 | - | - |
| 1.8627 | 10200 | 0.0026 | - | - |
| 1.8809 | 10300 | 0.0024 | - | - |
| 1.8992 | 10400 | 0.0027 | - | - |
| 1.9175 | 10500 | 0.003 | - | - |
| 1.9357 | 10600 | 0.0027 | - | - |
| 1.9540 | 10700 | 0.0027 | - | - |
| 1.9722 | 10800 | 0.0024 | - | - |
| 1.9905 | 10900 | 0.0025 | - | - |
| 2.0088 | 11000 | 0.0025 | - | - |
| 2.0270 | 11100 | 0.0025 | - | - |
| 2.0453 | 11200 | 0.0023 | - | - |
| 2.0636 | 11300 | 0.0024 | - | - |
| 2.0818 | 11400 | 0.0021 | - | - |
| 2.1001 | 11500 | 0.0023 | - | - |
| 2.1183 | 11600 | 0.0021 | - | - |
| 2.1366 | 11700 | 0.0026 | - | - |
| 2.1549 | 11800 | 0.002 | - | - |
| 2.1731 | 11900 | 0.0023 | - | - |
| 2.1914 | 12000 | 0.0021 | - | - |
| 2.2096 | 12100 | 0.0022 | - | - |
| 2.2279 | 12200 | 0.0025 | - | - |
| 2.2462 | 12300 | 0.002 | - | - |
| 2.2644 | 12400 | 0.0023 | - | - |
| 2.2827 | 12500 | 0.002 | - | - |
| 2.3009 | 12600 | 0.0021 | - | - |
| 2.3192 | 12700 | 0.0022 | - | - |
| 2.3375 | 12800 | 0.0018 | - | - |
| 2.3557 | 12900 | 0.002 | - | - |
| 2.3740 | 13000 | 0.0023 | - | - |
| 2.3923 | 13100 | 0.0021 | - | - |
| 2.4105 | 13200 | 0.0018 | - | - |
| 2.4288 | 13300 | 0.002 | - | - |
| 2.4470 | 13400 | 0.0018 | - | - |
| 2.4653 | 13500 | 0.0018 | 0.0030 | 0.6298 |
| 2.4836 | 13600 | 0.0019 | - | - |
| 2.5018 | 13700 | 0.002 | - | - |
| 2.5201 | 13800 | 0.0017 | - | - |
| 2.5383 | 13900 | 0.0018 | - | - |
| 2.5566 | 14000 | 0.0019 | - | - |
| 2.5749 | 14100 | 0.0018 | - | - |
| 2.5931 | 14200 | 0.0019 | - | - |
| 2.6114 | 14300 | 0.0018 | - | - |
| 2.6297 | 14400 | 0.002 | - | - |
| 2.6479 | 14500 | 0.002 | - | - |
| 2.6662 | 14600 | 0.0019 | - | - |
| 2.6844 | 14700 | 0.0016 | - | - |
| 2.7027 | 14800 | 0.0017 | - | - |
| 2.7210 | 14900 | 0.0019 | - | - |
| 2.7392 | 15000 | 0.0017 | - | - |
| 2.7575 | 15100 | 0.0016 | - | - |
| 2.7757 | 15200 | 0.0016 | - | - |
| 2.7940 | 15300 | 0.0014 | - | - |
| 2.8123 | 15400 | 0.0014 | - | - |
| 2.8305 | 15500 | 0.0014 | - | - |
| 2.8488 | 15600 | 0.0016 | - | - |
| 2.8671 | 15700 | 0.0016 | - | - |
| 2.8853 | 15800 | 0.0016 | - | - |
| 2.9036 | 15900 | 0.0017 | - | - |
| 2.9218 | 16000 | 0.0018 | - | - |
| 2.9401 | 16100 | 0.0019 | - | - |
| 2.9584 | 16200 | 0.0017 | - | - |
| 2.9766 | 16300 | 0.0015 | - | - |
| 2.9949 | 16400 | 0.0017 | - | - |
| 3.0131 | 16500 | 0.0016 | - | - |
| 3.0314 | 16600 | 0.0015 | - | - |
| 3.0497 | 16700 | 0.0015 | - | - |
| 3.0679 | 16800 | 0.0016 | - | - |
| 3.0862 | 16900 | 0.0013 | - | - |
| 3.1045 | 17000 | 0.0014 | - | - |
| 3.1227 | 17100 | 0.0015 | - | - |
| 3.1410 | 17200 | 0.0015 | - | - |
| 3.1592 | 17300 | 0.0014 | - | - |
| 3.1775 | 17400 | 0.0014 | - | - |
| 3.1958 | 17500 | 0.0014 | - | - |
| 3.2140 | 17600 | 0.0016 | - | - |
| 3.2323 | 17700 | 0.0017 | - | - |
| 3.2505 | 17800 | 0.0012 | - | - |
| 3.2688 | 17900 | 0.0014 | - | - |
| 3.2871 | 18000 | 0.0014 | - | - |
| 3.3053 | 18100 | 0.0014 | - | - |
| 3.3236 | 18200 | 0.0013 | - | - |
| 3.3419 | 18300 | 0.0012 | - | - |
| 3.3601 | 18400 | 0.0015 | - | - |
| 3.3784 | 18500 | 0.0013 | - | - |
| 3.3966 | 18600 | 0.0013 | - | - |
| 3.4149 | 18700 | 0.0011 | - | - |
| 3.4332 | 18800 | 0.0013 | - | - |
| 3.4514 | 18900 | 0.0012 | - | - |
| 3.4697 | 19000 | 0.0011 | - | - |
| 3.4879 | 19100 | 0.0013 | - | - |
| 3.5062 | 19200 | 0.0012 | - | - |
| 3.5245 | 19300 | 0.001 | - | - |
| 3.5427 | 19400 | 0.0014 | - | - |
| 3.5610 | 19500 | 0.0012 | - | - |
| 3.5793 | 19600 | 0.0013 | - | - |
| 3.5975 | 19700 | 0.0013 | - | - |
| 3.6158 | 19800 | 0.0012 | - | - |
| 3.6340 | 19900 | 0.0014 | - | - |
| 3.6523 | 20000 | 0.0011 | - | - |
| 3.6706 | 20100 | 0.0012 | - | - |
| 3.6888 | 20200 | 0.0012 | - | - |
| 3.7071 | 20300 | 0.0012 | - | - |
| 3.7253 | 20400 | 0.0013 | - | - |
| 3.7436 | 20500 | 0.001 | - | - |
| 3.7619 | 20600 | 0.0011 | - | - |
| 3.7801 | 20700 | 0.0011 | - | - |
| 3.7984 | 20800 | 0.0009 | - | - |
| 3.8167 | 20900 | 0.0011 | - | - |
| 3.8349 | 21000 | 0.0009 | - | - |
| 3.8532 | 21100 | 0.0011 | - | - |
| 3.8714 | 21200 | 0.001 | - | - |
| 3.8897 | 21300 | 0.0011 | - | - |
| 3.9080 | 21400 | 0.0011 | - | - |
| 3.9262 | 21500 | 0.0011 | - | - |
| 3.9445 | 21600 | 0.0013 | - | - |
| 3.9627 | 21700 | 0.0011 | - | - |
| 3.9810 | 21800 | 0.001 | - | - |
| 3.9993 | 21900 | 0.0011 | - | - |
| 4.0175 | 22000 | 0.0011 | - | - |
| 4.0358 | 22100 | 0.0011 | - | - |
| 4.0541 | 22200 | 0.001 | - | - |
| 4.0723 | 22300 | 0.0011 | - | - |
| 4.0906 | 22400 | 0.0009 | - | - |
| 4.1088 | 22500 | 0.001 | - | - |
| 4.1271 | 22600 | 0.0011 | - | - |
| 4.1454 | 22700 | 0.0011 | - | - |
| 4.1636 | 22800 | 0.001 | - | - |
| 4.1819 | 22900 | 0.0009 | - | - |
| 4.2001 | 23000 | 0.0009 | - | - |
| 4.2184 | 23100 | 0.0012 | - | - |
| 4.2367 | 23200 | 0.0011 | - | - |
| 4.2549 | 23300 | 0.0009 | - | - |
| 4.2732 | 23400 | 0.0009 | - | - |
| 4.2915 | 23500 | 0.0011 | - | - |
| 4.3097 | 23600 | 0.001 | - | - |
| 4.3280 | 23700 | 0.0008 | - | - |
| 4.3462 | 23800 | 0.0009 | - | - |
| 4.3645 | 23900 | 0.001 | - | - |
| 4.3828 | 24000 | 0.0009 | - | - |
| 4.4010 | 24100 | 0.001 | - | - |
| 4.4193 | 24200 | 0.0009 | - | - |
| 4.4375 | 24300 | 0.0009 | - | - |
| 4.4558 | 24400 | 0.0008 | - | - |
| 4.4741 | 24500 | 0.0009 | - | - |
| 4.4923 | 24600 | 0.001 | - | - |
| 4.5106 | 24700 | 0.0008 | - | - |
| 4.5289 | 24800 | 0.0009 | - | - |
| 4.5471 | 24900 | 0.001 | - | - |
| 4.5654 | 25000 | 0.0009 | - | - |
| 4.5836 | 25100 | 0.0009 | - | - |
| 4.6019 | 25200 | 0.001 | - | - |
| 4.6202 | 25300 | 0.0009 | - | - |
| 4.6384 | 25400 | 0.001 | - | - |
| 4.6567 | 25500 | 0.0009 | - | - |
| 4.6749 | 25600 | 0.0009 | - | - |
| 4.6932 | 25700 | 0.0009 | - | - |
| 4.7115 | 25800 | 0.001 | - | - |
| 4.7297 | 25900 | 0.001 | - | - |
| 4.7480 | 26000 | 0.0009 | - | - |
| 4.7663 | 26100 | 0.0008 | - | - |
| 4.7845 | 26200 | 0.0007 | - | - |
| 4.8028 | 26300 | 0.0008 | - | - |
| 4.8210 | 26400 | 0.0008 | - | - |
| 4.8393 | 26500 | 0.0009 | - | - |
| 4.8576 | 26600 | 0.0009 | - | - |
| 4.8758 | 26700 | 0.0008 | - | - |
| 4.8941 | 26800 | 0.0009 | - | - |
| 4.9123 | 26900 | 0.001 | - | - |
| 4.9306 | 27000 | 0.0009 | 0.0029 | 0.6245 |
| 4.9489 | 27100 | 0.001 | - | - |
| 4.9671 | 27200 | 0.0009 | - | - |
| 4.9854 | 27300 | 0.0009 | - | - |

</details>

### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.2.1
- Transformers: 4.45.1
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### ContrastiveLoss
```bibtex
@inproceedings{hadsell2006dimensionality,
    author={Hadsell, R. and Chopra, S. and LeCun, Y.},
    booktitle={2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)},
    title={Dimensionality Reduction by Learning an Invariant Mapping},
    year={2006},
    volume={2},
    number={},
    pages={1735-1742},
    doi={10.1109/CVPR.2006.100}
}
```
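For reference, the ContrastiveLoss cited above is the pairwise training objective used by this model. Below is a minimal, hypothetical sketch of how such a loss is typically wired up in Sentence Transformers; the base checkpoint and toy pairs are illustrative placeholders, not this model's actual training setup:

```python
# Minimal sketch of ContrastiveLoss training with Sentence Transformers.
# The base checkpoint and the two example pairs are placeholders.
from torch.utils.data import DataLoader
from sentence_transformers import InputExample, SentenceTransformer, losses

model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical base model

# ContrastiveLoss expects sentence pairs with a binary label:
# 1 = similar pair, 0 = dissimilar pair.
train_examples = [
    InputExample(texts=["A man is eating food.", "A man eats a meal."], label=1),
    InputExample(texts=["A man is eating food.", "A plane is taking off."], label=0),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.ContrastiveLoss(model=model)

# One short epoch, purely illustrative.
model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=0)
```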
[ "TEXT_CLASSIFICATION" ]
[ "CHIA" ]
BioNLP
twadada/nmc-cls-nobiv
twadada
null
[ "mteb", "model-index", "region:us" ]
1,725
1,725
0
0
--- tags: - mteb model-index: - name: nomic_classification_nobivec results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: None config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 72.4179104477612 - type: ap value: 34.63780365598458 - type: f1 value: 66.15457281659042 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: None config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 64.979125 - type: ap value: 60.068877740256944 - type: f1 value: 64.73214550243449 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: None config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 34.13 - type: f1 value: 33.6415337800296 - task: type: Retrieval dataset: name: MTEB ArguAna type: None config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 21.55 - type: map_at_10 value: 35.672 - type: map_at_100 value: 36.855 - type: map_at_1000 value: 36.879 - type: map_at_3 value: 31.163999999999998 - type: map_at_5 value: 33.639 - type: mrr_at_1 value: 22.191 - type: mrr_at_10 value: 35.916 - type: mrr_at_100 value: 37.106 - type: mrr_at_1000 value: 37.13 - type: mrr_at_3 value: 31.424999999999997 - type: mrr_at_5 value: 33.854 - type: ndcg_at_1 value: 21.55 - type: ndcg_at_10 value: 43.796 - type: ndcg_at_100 value: 49.260999999999996 - type: ndcg_at_1000 value: 49.823 - type: ndcg_at_3 value: 34.455000000000005 - type: ndcg_at_5 value: 38.926 - type: precision_at_1 value: 21.55 - type: precision_at_10 value: 6.984 - type: precision_at_100 value: 0.947 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 14.674999999999999 - type: precision_at_5 value: 10.982 - type: recall_at_1 value: 21.55 - type: recall_at_10 value: 69.844 - type: recall_at_100 value: 94.666 - type: recall_at_1000 value: 98.933 - type: recall_at_3 value: 44.025999999999996 - type: recall_at_5 value: 54.908 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: None config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 33.85061123853429 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: None config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 24.43376523076998 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: None config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 53.394179738377154 - type: mrr value: 66.89860616453414 - task: type: STS dataset: name: MTEB BIOSSES type: None config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 81.85377718907006 - type: cos_sim_spearman value: 80.32893878309842 - type: euclidean_pearson value: 80.64645390094664 - type: euclidean_spearman value: 80.32893878309842 - type: manhattan_pearson value: 80.0646777271307 - type: manhattan_spearman value: 79.71627422071113 - task: type: Classification dataset: name: MTEB Banking77Classification type: None config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 73.52597402597404 - type: f1 value: 72.74726159139323 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: 
None config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 32.153351191346665 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: None config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 23.419276973031316 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: None config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 21.231 - type: map_at_10 value: 28.671999999999997 - type: map_at_100 value: 29.804000000000002 - type: map_at_1000 value: 29.963 - type: map_at_3 value: 26.308 - type: map_at_5 value: 27.633000000000003 - type: mrr_at_1 value: 27.325 - type: mrr_at_10 value: 34.69 - type: mrr_at_100 value: 35.56 - type: mrr_at_1000 value: 35.636 - type: mrr_at_3 value: 32.879999999999995 - type: mrr_at_5 value: 33.796 - type: ndcg_at_1 value: 27.325 - type: ndcg_at_10 value: 33.606 - type: ndcg_at_100 value: 38.623000000000005 - type: ndcg_at_1000 value: 41.93 - type: ndcg_at_3 value: 30.296 - type: ndcg_at_5 value: 31.629 - type: precision_at_1 value: 27.325 - type: precision_at_10 value: 6.481000000000001 - type: precision_at_100 value: 1.11 - type: precision_at_1000 value: 0.169 - type: precision_at_3 value: 14.735000000000001 - type: precision_at_5 value: 10.615 - type: recall_at_1 value: 21.231 - type: recall_at_10 value: 42.033 - type: recall_at_100 value: 64.452 - type: recall_at_1000 value: 87.047 - type: recall_at_3 value: 31.433 - type: recall_at_5 value: 35.69 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: None config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 17.583 - type: map_at_10 value: 23.139000000000003 - type: map_at_100 value: 24.071 - type: map_at_1000 value: 24.194 - type: map_at_3 value: 21.171 - type: map_at_5 value: 22.309 - type: mrr_at_1 value: 22.102 - type: mrr_at_10 value: 27.442 - type: mrr_at_100 value: 28.199999999999996 - type: mrr_at_1000 value: 28.269 - type: mrr_at_3 value: 25.34 - type: mrr_at_5 value: 26.521 - type: ndcg_at_1 value: 22.102 - type: ndcg_at_10 value: 27.039 - type: ndcg_at_100 value: 31.389 - type: ndcg_at_1000 value: 34.247 - type: ndcg_at_3 value: 23.573 - type: ndcg_at_5 value: 25.185000000000002 - type: precision_at_1 value: 22.102 - type: precision_at_10 value: 5.019 - type: precision_at_100 value: 0.902 - type: precision_at_1000 value: 0.14100000000000001 - type: precision_at_3 value: 11.168 - type: precision_at_5 value: 8.102 - type: recall_at_1 value: 17.583 - type: recall_at_10 value: 34.479 - type: recall_at_100 value: 53.76499999999999 - type: recall_at_1000 value: 73.169 - type: recall_at_3 value: 24.532 - type: recall_at_5 value: 28.810000000000002 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: None config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 26.637 - type: map_at_10 value: 35.083999999999996 - type: map_at_100 value: 36.248999999999995 - type: map_at_1000 value: 36.346000000000004 - type: map_at_3 value: 32.623000000000005 - type: map_at_5 value: 33.969 - type: mrr_at_1 value: 31.034 - type: mrr_at_10 value: 38.440000000000005 - type: mrr_at_100 value: 39.363 - type: mrr_at_1000 value: 39.425 - type: mrr_at_3 value: 36.176 - type: mrr_at_5 value: 37.461 - type: ndcg_at_1 value: 31.034 - type: ndcg_at_10 value: 39.834 - 
type: ndcg_at_100 value: 45.091 - type: ndcg_at_1000 value: 47.321999999999996 - type: ndcg_at_3 value: 35.253 - type: ndcg_at_5 value: 37.403999999999996 - type: precision_at_1 value: 31.034 - type: precision_at_10 value: 6.47 - type: precision_at_100 value: 0.9939999999999999 - type: precision_at_1000 value: 0.126 - type: precision_at_3 value: 15.674 - type: precision_at_5 value: 10.897 - type: recall_at_1 value: 26.637 - type: recall_at_10 value: 51.03 - type: recall_at_100 value: 74.30600000000001 - type: recall_at_1000 value: 90.51599999999999 - type: recall_at_3 value: 38.461 - type: recall_at_5 value: 43.823 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: None config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 12.674 - type: map_at_10 value: 16.922 - type: map_at_100 value: 17.741 - type: map_at_1000 value: 17.855999999999998 - type: map_at_3 value: 15.329999999999998 - type: map_at_5 value: 16.236 - type: mrr_at_1 value: 13.785 - type: mrr_at_10 value: 18.076 - type: mrr_at_100 value: 18.904 - type: mrr_at_1000 value: 19.008 - type: mrr_at_3 value: 16.46 - type: mrr_at_5 value: 17.375 - type: ndcg_at_1 value: 13.785 - type: ndcg_at_10 value: 19.759999999999998 - type: ndcg_at_100 value: 24.121000000000002 - type: ndcg_at_1000 value: 27.662 - type: ndcg_at_3 value: 16.477 - type: ndcg_at_5 value: 18.09 - type: precision_at_1 value: 13.785 - type: precision_at_10 value: 3.096 - type: precision_at_100 value: 0.5599999999999999 - type: precision_at_1000 value: 0.091 - type: precision_at_3 value: 6.816999999999999 - type: precision_at_5 value: 4.994 - type: recall_at_1 value: 12.674 - type: recall_at_10 value: 27.589000000000002 - type: recall_at_100 value: 48.197 - type: recall_at_1000 value: 76.108 - type: recall_at_3 value: 18.644 - type: recall_at_5 value: 22.514 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: None config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 6.279999999999999 - type: map_at_10 value: 9.957 - type: map_at_100 value: 10.719 - type: map_at_1000 value: 10.854999999999999 - type: map_at_3 value: 8.73 - type: map_at_5 value: 9.293999999999999 - type: mrr_at_1 value: 8.333 - type: mrr_at_10 value: 12.375 - type: mrr_at_100 value: 13.191 - type: mrr_at_1000 value: 13.295000000000002 - type: mrr_at_3 value: 10.925 - type: mrr_at_5 value: 11.602 - type: ndcg_at_1 value: 8.333 - type: ndcg_at_10 value: 12.581999999999999 - type: ndcg_at_100 value: 16.589000000000002 - type: ndcg_at_1000 value: 20.339 - type: ndcg_at_3 value: 10.128 - type: ndcg_at_5 value: 11.046 - type: precision_at_1 value: 8.333 - type: precision_at_10 value: 2.463 - type: precision_at_100 value: 0.516 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 5.017 - type: precision_at_5 value: 3.682 - type: recall_at_1 value: 6.279999999999999 - type: recall_at_10 value: 18.367 - type: recall_at_100 value: 36.211 - type: recall_at_1000 value: 63.771 - type: recall_at_3 value: 11.556 - type: recall_at_5 value: 13.882 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: None config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 17.515 - type: map_at_10 value: 23.783 - type: map_at_100 value: 24.932000000000002 - type: map_at_1000 value: 25.063999999999997 - type: map_at_3 value: 21.579 - type: map_at_5 value: 22.732 - type: 
mrr_at_1 value: 21.752 - type: mrr_at_10 value: 28.335 - type: mrr_at_100 value: 29.264000000000003 - type: mrr_at_1000 value: 29.34 - type: mrr_at_3 value: 26.051000000000002 - type: mrr_at_5 value: 27.339999999999996 - type: ndcg_at_1 value: 21.752 - type: ndcg_at_10 value: 28.232000000000003 - type: ndcg_at_100 value: 33.822 - type: ndcg_at_1000 value: 36.802 - type: ndcg_at_3 value: 24.311 - type: ndcg_at_5 value: 26.019 - type: precision_at_1 value: 21.752 - type: precision_at_10 value: 5.265000000000001 - type: precision_at_100 value: 0.9730000000000001 - type: precision_at_1000 value: 0.14300000000000002 - type: precision_at_3 value: 11.421000000000001 - type: precision_at_5 value: 8.354000000000001 - type: recall_at_1 value: 17.515 - type: recall_at_10 value: 37.284 - type: recall_at_100 value: 61.987 - type: recall_at_1000 value: 82.52199999999999 - type: recall_at_3 value: 25.964 - type: recall_at_5 value: 30.523 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: None config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 12.753999999999998 - type: map_at_10 value: 18.490000000000002 - type: map_at_100 value: 19.615 - type: map_at_1000 value: 19.750999999999998 - type: map_at_3 value: 16.762 - type: map_at_5 value: 17.69 - type: mrr_at_1 value: 16.096 - type: mrr_at_10 value: 21.971 - type: mrr_at_100 value: 22.976 - type: mrr_at_1000 value: 23.058999999999997 - type: mrr_at_3 value: 20.186 - type: mrr_at_5 value: 21.077 - type: ndcg_at_1 value: 16.096 - type: ndcg_at_10 value: 22.201 - type: ndcg_at_100 value: 27.668 - type: ndcg_at_1000 value: 31.123 - type: ndcg_at_3 value: 18.983 - type: ndcg_at_5 value: 20.314 - type: precision_at_1 value: 16.096 - type: precision_at_10 value: 4.189 - type: precision_at_100 value: 0.828 - type: precision_at_1000 value: 0.131 - type: precision_at_3 value: 9.285 - type: precision_at_5 value: 6.621 - type: recall_at_1 value: 12.753999999999998 - type: recall_at_10 value: 30.062 - type: recall_at_100 value: 53.913 - type: recall_at_1000 value: 78.846 - type: recall_at_3 value: 21.024 - type: recall_at_5 value: 24.524 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 14.441 - type: map_at_10 value: 19.80083333333333 - type: map_at_100 value: 20.72533333333334 - type: map_at_1000 value: 20.851166666666664 - type: map_at_3 value: 18.076749999999997 - type: map_at_5 value: 19.03183333333333 - type: mrr_at_1 value: 17.60725 - type: mrr_at_10 value: 22.957833333333333 - type: mrr_at_100 value: 23.786583333333333 - type: mrr_at_1000 value: 23.871666666666666 - type: mrr_at_3 value: 21.23666666666667 - type: mrr_at_5 value: 22.187416666666667 - type: ndcg_at_1 value: 17.60725 - type: ndcg_at_10 value: 23.32825 - type: ndcg_at_100 value: 27.8915 - type: ndcg_at_1000 value: 31.089166666666667 - type: ndcg_at_3 value: 20.2065 - type: ndcg_at_5 value: 21.6315 - type: precision_at_1 value: 17.60725 - type: precision_at_10 value: 4.173833333333333 - type: precision_at_100 value: 0.7679166666666666 - type: precision_at_1000 value: 0.12241666666666666 - type: precision_at_3 value: 9.375000000000002 - type: precision_at_5 value: 6.76125 - type: recall_at_1 value: 14.441 - type: recall_at_10 value: 30.861833333333333 - type: recall_at_100 value: 51.641583333333344 - type: recall_at_1000 value: 74.93816666666666 - type: 
recall_at_3 value: 22.006 - type: recall_at_5 value: 25.715833333333336 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: None config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 10.766 - type: map_at_10 value: 14.588999999999999 - type: map_at_100 value: 15.314 - type: map_at_1000 value: 15.396 - type: map_at_3 value: 13.164000000000001 - type: map_at_5 value: 13.880999999999998 - type: mrr_at_1 value: 12.883 - type: mrr_at_10 value: 16.607 - type: mrr_at_100 value: 17.365 - type: mrr_at_1000 value: 17.435000000000002 - type: mrr_at_3 value: 15.133 - type: mrr_at_5 value: 15.946 - type: ndcg_at_1 value: 12.883 - type: ndcg_at_10 value: 17.252000000000002 - type: ndcg_at_100 value: 21.156 - type: ndcg_at_1000 value: 23.696 - type: ndcg_at_3 value: 14.48 - type: ndcg_at_5 value: 15.684000000000001 - type: precision_at_1 value: 12.883 - type: precision_at_10 value: 2.945 - type: precision_at_100 value: 0.5369999999999999 - type: precision_at_1000 value: 0.083 - type: precision_at_3 value: 6.390999999999999 - type: precision_at_5 value: 4.693 - type: recall_at_1 value: 10.766 - type: recall_at_10 value: 23.576 - type: recall_at_100 value: 41.887 - type: recall_at_1000 value: 61.41 - type: recall_at_3 value: 15.974 - type: recall_at_5 value: 18.879 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: None config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 8.003 - type: map_at_10 value: 11.571 - type: map_at_100 value: 12.196 - type: map_at_1000 value: 12.318 - type: map_at_3 value: 10.5 - type: map_at_5 value: 11.125 - type: mrr_at_1 value: 9.841999999999999 - type: mrr_at_10 value: 13.86 - type: mrr_at_100 value: 14.493 - type: mrr_at_1000 value: 14.593 - type: mrr_at_3 value: 12.681000000000001 - type: mrr_at_5 value: 13.345 - type: ndcg_at_1 value: 9.841999999999999 - type: ndcg_at_10 value: 13.975999999999999 - type: ndcg_at_100 value: 17.524 - type: ndcg_at_1000 value: 21.053 - type: ndcg_at_3 value: 11.974 - type: ndcg_at_5 value: 12.934999999999999 - type: precision_at_1 value: 9.841999999999999 - type: precision_at_10 value: 2.588 - type: precision_at_100 value: 0.526 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 5.804 - type: precision_at_5 value: 4.218999999999999 - type: recall_at_1 value: 8.003 - type: recall_at_10 value: 18.995 - type: recall_at_100 value: 35.741 - type: recall_at_1000 value: 61.997 - type: recall_at_3 value: 13.367999999999999 - type: recall_at_5 value: 15.83 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: None config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 12.591 - type: map_at_10 value: 17.537 - type: map_at_100 value: 18.345 - type: map_at_1000 value: 18.471 - type: map_at_3 value: 16.066 - type: map_at_5 value: 16.858 - type: mrr_at_1 value: 15.485 - type: mrr_at_10 value: 20.683 - type: mrr_at_100 value: 21.498 - type: mrr_at_1000 value: 21.598 - type: mrr_at_3 value: 19.092000000000002 - type: mrr_at_5 value: 19.964000000000002 - type: ndcg_at_1 value: 15.485 - type: ndcg_at_10 value: 20.888 - type: ndcg_at_100 value: 25.096 - type: ndcg_at_1000 value: 28.579 - type: ndcg_at_3 value: 18.014 - type: ndcg_at_5 value: 19.31 - type: precision_at_1 value: 15.485 - type: precision_at_10 value: 3.601 - type: precision_at_100 value: 0.623 - type: precision_at_1000 value: 0.105 - type: 
precision_at_3 value: 8.364 - type: precision_at_5 value: 5.877000000000001 - type: recall_at_1 value: 12.591 - type: recall_at_10 value: 28.189999999999998 - type: recall_at_100 value: 47.453 - type: recall_at_1000 value: 72.968 - type: recall_at_3 value: 20.194000000000003 - type: recall_at_5 value: 23.592 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: None config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 15.137999999999998 - type: map_at_10 value: 21.215 - type: map_at_100 value: 22.355 - type: map_at_1000 value: 22.527 - type: map_at_3 value: 19.527 - type: map_at_5 value: 20.538999999999998 - type: mrr_at_1 value: 18.972 - type: mrr_at_10 value: 24.792 - type: mrr_at_100 value: 25.697 - type: mrr_at_1000 value: 25.772000000000002 - type: mrr_at_3 value: 23.188 - type: mrr_at_5 value: 24.206 - type: ndcg_at_1 value: 18.972 - type: ndcg_at_10 value: 25.144 - type: ndcg_at_100 value: 30.163 - type: ndcg_at_1000 value: 33.575 - type: ndcg_at_3 value: 22.462 - type: ndcg_at_5 value: 23.885 - type: precision_at_1 value: 18.972 - type: precision_at_10 value: 4.901 - type: precision_at_100 value: 1.093 - type: precision_at_1000 value: 0.191 - type: precision_at_3 value: 10.738 - type: precision_at_5 value: 7.904999999999999 - type: recall_at_1 value: 15.137999999999998 - type: recall_at_10 value: 32.132 - type: recall_at_100 value: 55.574 - type: recall_at_1000 value: 79.377 - type: recall_at_3 value: 24.08 - type: recall_at_5 value: 27.983999999999998 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval type: None config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 12.120000000000001 - type: map_at_10 value: 16.651 - type: map_at_100 value: 17.363 - type: map_at_1000 value: 17.473 - type: map_at_3 value: 15.161 - type: map_at_5 value: 16.116 - type: mrr_at_1 value: 13.678 - type: mrr_at_10 value: 18.223 - type: mrr_at_100 value: 18.928 - type: mrr_at_1000 value: 19.03 - type: mrr_at_3 value: 16.728 - type: mrr_at_5 value: 17.616 - type: ndcg_at_1 value: 13.678 - type: ndcg_at_10 value: 19.425 - type: ndcg_at_100 value: 23.456 - type: ndcg_at_1000 value: 26.741999999999997 - type: ndcg_at_3 value: 16.527 - type: ndcg_at_5 value: 18.076999999999998 - type: precision_at_1 value: 13.678 - type: precision_at_10 value: 3.068 - type: precision_at_100 value: 0.553 - type: precision_at_1000 value: 0.092 - type: precision_at_3 value: 7.086 - type: precision_at_5 value: 5.176 - type: recall_at_1 value: 12.120000000000001 - type: recall_at_10 value: 26.605 - type: recall_at_100 value: 46.213 - type: recall_at_1000 value: 71.527 - type: recall_at_3 value: 18.842 - type: recall_at_5 value: 22.539 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: None config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 5.872 - type: map_at_10 value: 10.459 - type: map_at_100 value: 11.911 - type: map_at_1000 value: 12.113999999999999 - type: map_at_3 value: 8.425 - type: map_at_5 value: 9.459 - type: mrr_at_1 value: 13.29 - type: mrr_at_10 value: 21.566 - type: mrr_at_100 value: 22.814 - type: mrr_at_1000 value: 22.883 - type: mrr_at_3 value: 18.491 - type: mrr_at_5 value: 20.175 - type: ndcg_at_1 value: 13.29 - type: ndcg_at_10 value: 15.93 - type: ndcg_at_100 value: 22.666 - type: ndcg_at_1000 value: 26.784000000000002 - type: ndcg_at_3 value: 11.991999999999999 - type: 
ndcg_at_5 value: 13.453000000000001 - type: precision_at_1 value: 13.29 - type: precision_at_10 value: 5.349 - type: precision_at_100 value: 1.247 - type: precision_at_1000 value: 0.199 - type: precision_at_3 value: 9.121 - type: precision_at_5 value: 7.5569999999999995 - type: recall_at_1 value: 5.872 - type: recall_at_10 value: 20.546 - type: recall_at_100 value: 44.262 - type: recall_at_1000 value: 68.027 - type: recall_at_3 value: 11.219999999999999 - type: recall_at_5 value: 14.846 - task: type: Retrieval dataset: name: MTEB DBPedia type: None config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 4.481 - type: map_at_10 value: 9.783999999999999 - type: map_at_100 value: 13.401 - type: map_at_1000 value: 14.316 - type: map_at_3 value: 6.957000000000001 - type: map_at_5 value: 8.315999999999999 - type: mrr_at_1 value: 41.5 - type: mrr_at_10 value: 50.479 - type: mrr_at_100 value: 51.309000000000005 - type: mrr_at_1000 value: 51.346000000000004 - type: mrr_at_3 value: 48.167 - type: mrr_at_5 value: 49.429 - type: ndcg_at_1 value: 30.625000000000004 - type: ndcg_at_10 value: 23.841 - type: ndcg_at_100 value: 26.651000000000003 - type: ndcg_at_1000 value: 33.273 - type: ndcg_at_3 value: 26.772000000000002 - type: ndcg_at_5 value: 25.564999999999998 - type: precision_at_1 value: 41.5 - type: precision_at_10 value: 20.8 - type: precision_at_100 value: 6.47 - type: precision_at_1000 value: 1.3639999999999999 - type: precision_at_3 value: 31.917 - type: precision_at_5 value: 27.35 - type: recall_at_1 value: 4.481 - type: recall_at_10 value: 14.294 - type: recall_at_100 value: 32.693 - type: recall_at_1000 value: 54.698 - type: recall_at_3 value: 7.994999999999999 - type: recall_at_5 value: 10.823 - task: type: Classification dataset: name: MTEB EmotionClassification type: None config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 44.86 - type: f1 value: 40.914967776838886 - task: type: Retrieval dataset: name: MTEB FEVER type: None config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 13.886000000000001 - type: map_at_10 value: 21.253 - type: map_at_100 value: 22.154 - type: map_at_1000 value: 22.234 - type: map_at_3 value: 18.92 - type: map_at_5 value: 20.24 - type: mrr_at_1 value: 14.731 - type: mrr_at_10 value: 22.512999999999998 - type: mrr_at_100 value: 23.429 - type: mrr_at_1000 value: 23.5 - type: mrr_at_3 value: 20.064999999999998 - type: mrr_at_5 value: 21.451999999999998 - type: ndcg_at_1 value: 14.731 - type: ndcg_at_10 value: 25.730999999999998 - type: ndcg_at_100 value: 30.493 - type: ndcg_at_1000 value: 32.713 - type: ndcg_at_3 value: 20.893 - type: ndcg_at_5 value: 23.278 - type: precision_at_1 value: 14.731 - type: precision_at_10 value: 4.188 - type: precision_at_100 value: 0.677 - type: precision_at_1000 value: 0.089 - type: precision_at_3 value: 9.096 - type: precision_at_5 value: 6.709 - type: recall_at_1 value: 13.886000000000001 - type: recall_at_10 value: 38.576 - type: recall_at_100 value: 61.08 - type: recall_at_1000 value: 78.25800000000001 - type: recall_at_3 value: 25.435000000000002 - type: recall_at_5 value: 31.176 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: None config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 5.604 - type: map_at_10 value: 10.035 - type: map_at_100 value: 11.094999999999999 - type: map_at_1000 
value: 11.275 - type: map_at_3 value: 8.205 - type: map_at_5 value: 9.207 - type: mrr_at_1 value: 11.42 - type: mrr_at_10 value: 17.53 - type: mrr_at_100 value: 18.447 - type: mrr_at_1000 value: 18.545 - type: mrr_at_3 value: 15.201 - type: mrr_at_5 value: 16.497 - type: ndcg_at_1 value: 11.42 - type: ndcg_at_10 value: 14.27 - type: ndcg_at_100 value: 19.518 - type: ndcg_at_1000 value: 23.851 - type: ndcg_at_3 value: 11.315999999999999 - type: ndcg_at_5 value: 12.526000000000002 - type: precision_at_1 value: 11.42 - type: precision_at_10 value: 4.367 - type: precision_at_100 value: 0.963 - type: precision_at_1000 value: 0.17099999999999999 - type: precision_at_3 value: 7.767 - type: precision_at_5 value: 6.235 - type: recall_at_1 value: 5.604 - type: recall_at_10 value: 19.182 - type: recall_at_100 value: 40.053 - type: recall_at_1000 value: 67.367 - type: recall_at_3 value: 10.530000000000001 - type: recall_at_5 value: 14.11 - task: type: Retrieval dataset: name: MTEB HotpotQA type: None config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 15.847 - type: map_at_10 value: 22.32 - type: map_at_100 value: 23.211000000000002 - type: map_at_1000 value: 23.322000000000003 - type: map_at_3 value: 20.47 - type: map_at_5 value: 21.497 - type: mrr_at_1 value: 31.695 - type: mrr_at_10 value: 38.491 - type: mrr_at_100 value: 39.231 - type: mrr_at_1000 value: 39.294000000000004 - type: mrr_at_3 value: 36.516 - type: mrr_at_5 value: 37.647999999999996 - type: ndcg_at_1 value: 31.695 - type: ndcg_at_10 value: 28.759 - type: ndcg_at_100 value: 32.957 - type: ndcg_at_1000 value: 35.686 - type: ndcg_at_3 value: 25.144 - type: ndcg_at_5 value: 26.909 - type: precision_at_1 value: 31.695 - type: precision_at_10 value: 6.394 - type: precision_at_100 value: 0.9769999999999999 - type: precision_at_1000 value: 0.134 - type: precision_at_3 value: 15.867999999999999 - type: precision_at_5 value: 10.922 - type: recall_at_1 value: 15.847 - type: recall_at_10 value: 31.972 - type: recall_at_100 value: 48.825 - type: recall_at_1000 value: 67.07 - type: recall_at_3 value: 23.801 - type: recall_at_5 value: 27.306 - task: type: Classification dataset: name: MTEB ImdbClassification type: None config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 64.0996 - type: ap value: 59.24690414462077 - type: f1 value: 63.901632743552675 - task: type: Retrieval dataset: name: MTEB MSMARCO type: None config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 6.032 - type: map_at_10 value: 10.571 - type: map_at_100 value: 11.434 - type: map_at_1000 value: 11.536 - type: map_at_3 value: 8.927 - type: map_at_5 value: 9.745 - type: mrr_at_1 value: 6.218 - type: mrr_at_10 value: 10.857999999999999 - type: mrr_at_100 value: 11.728 - type: mrr_at_1000 value: 11.826 - type: mrr_at_3 value: 9.181000000000001 - type: mrr_at_5 value: 10.014000000000001 - type: ndcg_at_1 value: 6.203 - type: ndcg_at_10 value: 13.471 - type: ndcg_at_100 value: 18.267 - type: ndcg_at_1000 value: 21.367 - type: ndcg_at_3 value: 9.993 - type: ndcg_at_5 value: 11.472 - type: precision_at_1 value: 6.203 - type: precision_at_10 value: 2.341 - type: precision_at_100 value: 0.484 - type: precision_at_1000 value: 0.075 - type: precision_at_3 value: 4.436 - type: precision_at_5 value: 3.4070000000000005 - type: recall_at_1 value: 6.032 - type: recall_at_10 value: 22.514 - type: recall_at_100 value: 46.061 - 
type: recall_at_1000 value: 71.002 - type: recall_at_3 value: 12.845999999999998 - type: recall_at_5 value: 16.411 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: None config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 89.89056087551299 - type: f1 value: 88.92611201654476 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: None config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 59.46648426812586 - type: f1 value: 40.09031702149602 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: None config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 61.882985877605925 - type: f1 value: 59.62547960951362 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: None config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 68.67854741089442 - type: f1 value: 67.30454712745052 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: None config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 28.12525413973902 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: None config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 25.349303249863205 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: None config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 29.51131395873944 - type: mrr value: 30.491052548905884 - task: type: Retrieval dataset: name: MTEB NFCorpus type: None config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 3.676 - type: map_at_10 value: 7.663 - type: map_at_100 value: 9.617 - type: map_at_1000 value: 10.937 - type: map_at_3 value: 5.795 - type: map_at_5 value: 6.7940000000000005 - type: mrr_at_1 value: 33.437 - type: mrr_at_10 value: 42.714 - type: mrr_at_100 value: 43.322 - type: mrr_at_1000 value: 43.387 - type: mrr_at_3 value: 39.732 - type: mrr_at_5 value: 41.914 - type: ndcg_at_1 value: 31.424000000000003 - type: ndcg_at_10 value: 24.458 - type: ndcg_at_100 value: 22.656000000000002 - type: ndcg_at_1000 value: 32.381 - type: ndcg_at_3 value: 27.326 - type: ndcg_at_5 value: 26.680999999999997 - type: precision_at_1 value: 33.437 - type: precision_at_10 value: 18.39 - type: precision_at_100 value: 6.328 - type: precision_at_1000 value: 1.984 - type: precision_at_3 value: 26.006 - type: precision_at_5 value: 23.406 - type: recall_at_1 value: 3.676 - type: recall_at_10 value: 11.702 - type: recall_at_100 value: 24.608 - type: recall_at_1000 value: 59.004 - type: recall_at_3 value: 7.112 - type: recall_at_5 value: 9.462 - task: type: Retrieval dataset: name: MTEB NQ type: None config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 8.774999999999999 - type: map_at_10 value: 14.562 - type: map_at_100 value: 15.767000000000001 - type: map_at_1000 value: 15.870000000000001 - type: map_at_3 value: 12.216000000000001 - type: map_at_5 value: 13.455 - type: mrr_at_1 value: 9.936 - type: mrr_at_10 value: 16.105 - type: mrr_at_100 value: 17.255000000000003 - type: mrr_at_1000 value: 17.339 - type: mrr_at_3 value: 13.654 - 
type: mrr_at_5 value: 14.968 - type: ndcg_at_1 value: 9.936 - type: ndcg_at_10 value: 18.602 - type: ndcg_at_100 value: 24.797 - type: ndcg_at_1000 value: 27.609 - type: ndcg_at_3 value: 13.700999999999999 - type: ndcg_at_5 value: 15.93 - type: precision_at_1 value: 9.936 - type: precision_at_10 value: 3.4939999999999998 - type: precision_at_100 value: 0.7000000000000001 - type: precision_at_1000 value: 0.097 - type: precision_at_3 value: 6.383 - type: precision_at_5 value: 5.075 - type: recall_at_1 value: 8.774999999999999 - type: recall_at_10 value: 29.751 - type: recall_at_100 value: 58.623000000000005 - type: recall_at_1000 value: 80.239 - type: recall_at_3 value: 16.604 - type: recall_at_5 value: 21.799 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: None config: default split: test revision: None metrics: - type: map_at_1 value: 63.849000000000004 - type: map_at_10 value: 76.554 - type: map_at_100 value: 77.318 - type: map_at_1000 value: 77.35 - type: map_at_3 value: 73.658 - type: map_at_5 value: 75.402 - type: mrr_at_1 value: 73.42 - type: mrr_at_10 value: 80.78 - type: mrr_at_100 value: 81.004 - type: mrr_at_1000 value: 81.009 - type: mrr_at_3 value: 79.39699999999999 - type: mrr_at_5 value: 80.274 - type: ndcg_at_1 value: 73.55000000000001 - type: ndcg_at_10 value: 81.15599999999999 - type: ndcg_at_100 value: 83.231 - type: ndcg_at_1000 value: 83.581 - type: ndcg_at_3 value: 77.705 - type: ndcg_at_5 value: 79.461 - type: precision_at_1 value: 73.55000000000001 - type: precision_at_10 value: 12.225 - type: precision_at_100 value: 1.462 - type: precision_at_1000 value: 0.155 - type: precision_at_3 value: 33.7 - type: precision_at_5 value: 22.189999999999998 - type: recall_at_1 value: 63.849000000000004 - type: recall_at_10 value: 89.888 - type: recall_at_100 value: 97.674 - type: recall_at_1000 value: 99.636 - type: recall_at_3 value: 79.95 - type: recall_at_5 value: 84.911 - type: map_at_1 value: 2.835 - type: map_at_10 value: 6.683999999999999 - type: map_at_100 value: 7.9719999999999995 - type: map_at_1000 value: 8.216 - type: map_at_3 value: 4.856 - type: map_at_5 value: 5.702 - type: mrr_at_1 value: 14.000000000000002 - type: mrr_at_10 value: 21.475 - type: mrr_at_100 value: 22.74 - type: mrr_at_1000 value: 22.834 - type: mrr_at_3 value: 18.917 - type: mrr_at_5 value: 20.137 - type: ndcg_at_1 value: 14.000000000000002 - type: ndcg_at_10 value: 11.935 - type: ndcg_at_100 value: 18.062 - type: ndcg_at_1000 value: 23.083000000000002 - type: ndcg_at_3 value: 11.236 - type: ndcg_at_5 value: 9.711 - type: precision_at_1 value: 14.000000000000002 - type: precision_at_10 value: 6.23 - type: precision_at_100 value: 1.518 - type: precision_at_1000 value: 0.27299999999999996 - type: precision_at_3 value: 10.467 - type: precision_at_5 value: 8.459999999999999 - type: recall_at_1 value: 2.835 - type: recall_at_10 value: 12.620000000000001 - type: recall_at_100 value: 30.830000000000002 - type: recall_at_1000 value: 55.401999999999994 - type: recall_at_3 value: 6.3549999999999995 - type: recall_at_5 value: 8.558 - type: map_at_1 value: 0.11399999999999999 - type: map_at_10 value: 0.749 - type: map_at_100 value: 4.265 - type: map_at_1000 value: 10.392 - type: map_at_3 value: 0.303 - type: map_at_5 value: 0.45199999999999996 - type: mrr_at_1 value: 46.0 - type: mrr_at_10 value: 60.651999999999994 - type: mrr_at_100 value: 61.18599999999999 - type: mrr_at_1000 value: 61.18599999999999 - type: mrr_at_3 value: 56.667 - type: mrr_at_5 value: 59.367000000000004 - type: ndcg_at_1 
value: 43.0 - type: ndcg_at_10 value: 40.046 - type: ndcg_at_100 value: 31.291000000000004 - type: ndcg_at_1000 value: 28.277 - type: ndcg_at_3 value: 44.48 - type: ndcg_at_5 value: 42.684 - type: precision_at_1 value: 48.0 - type: precision_at_10 value: 42.8 - type: precision_at_100 value: 33.019999999999996 - type: precision_at_1000 value: 13.984 - type: precision_at_3 value: 48.667 - type: precision_at_5 value: 46.800000000000004 - type: recall_at_1 value: 0.11399999999999999 - type: recall_at_10 value: 0.9780000000000001 - type: recall_at_100 value: 7.105 - type: recall_at_1000 value: 27.417 - type: recall_at_3 value: 0.346 - type: recall_at_5 value: 0.5479999999999999 - task: type: Clustering dataset: name: MTEB RedditClustering type: None config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 39.62306630082035 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: None config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 45.094055344099466 - task: type: STS dataset: name: MTEB SICK-R type: None config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 76.60485813657951 - type: cos_sim_spearman value: 67.55018251099548 - type: euclidean_pearson value: 72.54046568178227 - type: euclidean_spearman value: 67.55019006488925 - type: manhattan_pearson value: 69.6762686221389 - type: manhattan_spearman value: 65.26267981325925 - task: type: STS dataset: name: MTEB STS12 type: None config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 74.82523314263273 - type: cos_sim_spearman value: 67.79344024125452 - type: euclidean_pearson value: 70.9783299378683 - type: euclidean_spearman value: 67.79459969769313 - type: manhattan_pearson value: 70.12828880808506 - type: manhattan_spearman value: 67.73698113382844 - task: type: STS dataset: name: MTEB STS13 type: None config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 77.43471334909646 - type: cos_sim_spearman value: 78.87612165161259 - type: euclidean_pearson value: 78.51433127433613 - type: euclidean_spearman value: 78.87613383277751 - type: manhattan_pearson value: 78.60881303718293 - type: manhattan_spearman value: 79.09512930090479 - task: type: STS dataset: name: MTEB STS14 type: None config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 77.97539508509726 - type: cos_sim_spearman value: 74.75819046899458 - type: euclidean_pearson value: 77.12281177184511 - type: euclidean_spearman value: 74.7581807777114 - type: manhattan_pearson value: 76.26704335508094 - type: manhattan_spearman value: 74.34393206667237 - task: type: STS dataset: name: MTEB STS15 type: None config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 80.85922976490097 - type: cos_sim_spearman value: 81.84645055567681 - type: euclidean_pearson value: 81.81830833050186 - type: euclidean_spearman value: 81.8464588106734 - type: manhattan_pearson value: 81.84644569552097 - type: manhattan_spearman value: 82.07418063887212 - task: type: STS dataset: name: MTEB STS16 type: None config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 76.29500455505787 - type: cos_sim_spearman value: 
77.62306117264794 - type: euclidean_pearson value: 77.17400157807901 - type: euclidean_spearman value: 77.62361139017713 - type: manhattan_pearson value: 77.47217811108837 - type: manhattan_spearman value: 77.97210958449658 - task: type: STS dataset: name: MTEB STS17 (en-en) type: None config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 84.05858425825969 - type: cos_sim_spearman value: 85.20355984500516 - type: euclidean_pearson value: 85.13424143442064 - type: euclidean_spearman value: 85.20443331883615 - type: manhattan_pearson value: 85.38984808229498 - type: manhattan_spearman value: 86.08189549444803 - task: type: STS dataset: name: MTEB STS22 (en) type: None config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 62.81457561891979 - type: cos_sim_spearman value: 59.85388868480853 - type: euclidean_pearson value: 62.69073748353527 - type: euclidean_spearman value: 59.85388868480853 - type: manhattan_pearson value: 62.22005303389897 - type: manhattan_spearman value: 60.41700166909821 - task: type: STS dataset: name: MTEB STSBenchmark type: None config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 77.86417632861593 - type: cos_sim_spearman value: 76.40396426411002 - type: euclidean_pearson value: 77.91668686082956 - type: euclidean_spearman value: 76.40398273149863 - type: manhattan_pearson value: 77.23413224696387 - type: manhattan_spearman value: 75.8417428842207 - task: type: Reranking dataset: name: MTEB SciDocsRR type: None config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 72.11890674541019 - type: mrr value: 90.97487570526785 - task: type: Retrieval dataset: name: MTEB SciFact type: None config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 36.306 - type: map_at_10 value: 43.081 - type: map_at_100 value: 44.352999999999994 - type: map_at_1000 value: 44.409 - type: map_at_3 value: 40.713 - type: map_at_5 value: 42.019 - type: mrr_at_1 value: 38.333 - type: mrr_at_10 value: 44.721 - type: mrr_at_100 value: 45.83 - type: mrr_at_1000 value: 45.879 - type: mrr_at_3 value: 42.722 - type: mrr_at_5 value: 43.739 - type: ndcg_at_1 value: 38.333 - type: ndcg_at_10 value: 47.152 - type: ndcg_at_100 value: 53.137 - type: ndcg_at_1000 value: 54.642 - type: ndcg_at_3 value: 42.614000000000004 - type: ndcg_at_5 value: 44.649 - type: precision_at_1 value: 38.333 - type: precision_at_10 value: 6.433 - type: precision_at_100 value: 0.967 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 16.556 - type: precision_at_5 value: 11.067 - type: recall_at_1 value: 36.306 - type: recall_at_10 value: 58.306000000000004 - type: recall_at_100 value: 85.68299999999999 - type: recall_at_1000 value: 97.467 - type: recall_at_3 value: 45.889 - type: recall_at_5 value: 50.833 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: None config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.67821782178218 - type: cos_sim_ap value: 89.88249977581106 - type: cos_sim_f1 value: 83.24270884824519 - type: cos_sim_precision value: 82.3069403714565 - type: cos_sim_recall value: 84.2 - type: dot_accuracy value: 99.67821782178218 - type: dot_ap value: 89.88249977581106 - type: dot_f1 value: 83.24270884824519 - type: 
dot_precision value: 82.3069403714565 - type: dot_recall value: 84.2 - type: euclidean_accuracy value: 99.67821782178218 - type: euclidean_ap value: 89.88249977581106 - type: euclidean_f1 value: 83.24270884824519 - type: euclidean_precision value: 82.3069403714565 - type: euclidean_recall value: 84.2 - type: manhattan_accuracy value: 99.74653465346535 - type: manhattan_ap value: 92.47796158955659 - type: manhattan_f1 value: 86.55462184873949 - type: manhattan_precision value: 91.1504424778761 - type: manhattan_recall value: 82.39999999999999 - type: max_accuracy value: 99.74653465346535 - type: max_ap value: 92.47796158955659 - type: max_f1 value: 86.55462184873949 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: None config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 41.72268428949898 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: None config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 29.535996461064713 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: None config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 43.41913643582162 - type: mrr value: 43.91500350140056 - task: type: Summarization dataset: name: MTEB SummEval type: None config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.332915201869763 - type: cos_sim_spearman value: 30.291662163948462 - type: dot_pearson value: 30.332915151592584 - type: dot_spearman value: 30.1834748432304 - task: type: Retrieval dataset: name: MTEB Touche2020 type: None config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 2.067 - type: map_at_10 value: 8.561 - type: map_at_100 value: 14.677999999999999 - type: map_at_1000 value: 16.313 - type: map_at_3 value: 4.481 - type: map_at_5 value: 6.081 - type: mrr_at_1 value: 28.571 - type: mrr_at_10 value: 44.07 - type: mrr_at_100 value: 45.153 - type: mrr_at_1000 value: 45.153 - type: mrr_at_3 value: 39.796 - type: mrr_at_5 value: 42.653 - type: ndcg_at_1 value: 26.531 - type: ndcg_at_10 value: 22.528000000000002 - type: ndcg_at_100 value: 35.595 - type: ndcg_at_1000 value: 46.97 - type: ndcg_at_3 value: 25.011 - type: ndcg_at_5 value: 23.759 - type: precision_at_1 value: 28.571 - type: precision_at_10 value: 20.816000000000003 - type: precision_at_100 value: 8.0 - type: precision_at_1000 value: 1.529 - type: precision_at_3 value: 25.85 - type: precision_at_5 value: 24.082 - type: recall_at_1 value: 2.067 - type: recall_at_10 value: 14.265 - type: recall_at_100 value: 48.504999999999995 - type: recall_at_1000 value: 83.176 - type: recall_at_3 value: 5.5489999999999995 - type: recall_at_5 value: 8.319 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: None config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 70.587 - type: ap value: 13.787371974957502 - type: f1 value: 54.112483754025895 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: None config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 54.18788907753254 - type: f1 value: 54.373057834087724 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: None config: 
default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 35.37540475772647 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: None config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 83.54294569946951 - type: cos_sim_ap value: 64.48967648355864 - type: cos_sim_f1 value: 61.59177820267685 - type: cos_sim_precision value: 56.29095674967235 - type: cos_sim_recall value: 67.99472295514512 - type: dot_accuracy value: 83.54294569946951 - type: dot_ap value: 64.48967648355864 - type: dot_f1 value: 61.59177820267685 - type: dot_precision value: 56.29095674967235 - type: dot_recall value: 67.99472295514512 - type: euclidean_accuracy value: 83.54294569946951 - type: euclidean_ap value: 64.48967648355864 - type: euclidean_f1 value: 61.59177820267685 - type: euclidean_precision value: 56.29095674967235 - type: euclidean_recall value: 67.99472295514512 - type: manhattan_accuracy value: 82.55945639864099 - type: manhattan_ap value: 61.57762963239964 - type: manhattan_f1 value: 59.04949519822704 - type: manhattan_precision value: 55.355493998153285 - type: manhattan_recall value: 63.27176781002638 - type: max_accuracy value: 83.54294569946951 - type: max_ap value: 64.48967648355864 - type: max_f1 value: 61.59177820267685 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: None config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 87.61012147320216 - type: cos_sim_ap value: 83.07983490905391 - type: cos_sim_f1 value: 75.18353192902202 - type: cos_sim_precision value: 72.17225015935973 - type: cos_sim_recall value: 78.45703726516786 - type: dot_accuracy value: 87.61012147320216 - type: dot_ap value: 83.07984253532115 - type: dot_f1 value: 75.18353192902202 - type: dot_precision value: 72.17225015935973 - type: dot_recall value: 78.45703726516786 - type: euclidean_accuracy value: 87.61012147320216 - type: euclidean_ap value: 83.07983413095812 - type: euclidean_f1 value: 75.18353192902202 - type: euclidean_precision value: 72.17225015935973 - type: euclidean_recall value: 78.45703726516786 - type: manhattan_accuracy value: 87.58489540885628 - type: manhattan_ap value: 82.95071093691672 - type: manhattan_f1 value: 75.19792822789493 - type: manhattan_precision value: 72.37572995299814 - type: manhattan_recall value: 78.2491530643671 - type: max_accuracy value: 87.61012147320216 - type: max_ap value: 83.07984253532115 - type: max_f1 value: 75.19792822789493 ---
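The front matter above is an auto-generated MTEB model index. As a rough guide, a single score can be re-run with the `mteb` package; the sketch below assumes, without verification, that `twadada/nmc-cls-nobiv` loads as a standard Sentence Transformers checkpoint:

```python
# Hypothetical sketch: re-running one MTEB task from the index above.
# Assumes the checkpoint loads as a SentenceTransformer, which is unverified.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("twadada/nmc-cls-nobiv")

# Banking77Classification is reported at ~73.5 accuracy in the index above.
evaluation = MTEB(tasks=["Banking77Classification"])
results = evaluation.run(model, output_folder="results/nmc-cls-nobiv")
print(results)
```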
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
twadada/nmc-prmt-and-emb
twadada
null
[ "mteb", "model-index", "region:us" ]
1,725
1,725
0
0
--- tags: - mteb model-index: - name: nomic_classification_prompt_domain_sample results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: None config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 74.53731343283583 - type: ap value: 37.92571498384035 - type: f1 value: 68.77042705445326 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: None config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 66.16017500000001 - type: ap value: 61.42247172783104 - type: f1 value: 65.38166709014324 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: None config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 34.396 - type: f1 value: 33.71300766345019 - task: type: Retrieval dataset: name: MTEB ArguAna type: None config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 21.55 - type: map_at_10 value: 35.845 - type: map_at_100 value: 36.995 - type: map_at_1000 value: 37.018 - type: map_at_3 value: 30.856 - type: map_at_5 value: 33.605000000000004 - type: mrr_at_1 value: 22.048000000000002 - type: mrr_at_10 value: 36.039 - type: mrr_at_100 value: 37.181 - type: mrr_at_1000 value: 37.205 - type: mrr_at_3 value: 31.022 - type: mrr_at_5 value: 33.757 - type: ndcg_at_1 value: 21.55 - type: ndcg_at_10 value: 44.241 - type: ndcg_at_100 value: 49.457 - type: ndcg_at_1000 value: 50.024 - type: ndcg_at_3 value: 33.873999999999995 - type: ndcg_at_5 value: 38.826 - type: precision_at_1 value: 21.55 - type: precision_at_10 value: 7.134 - type: precision_at_100 value: 0.9490000000000001 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 14.201 - type: precision_at_5 value: 10.925 - type: recall_at_1 value: 21.55 - type: recall_at_10 value: 71.33699999999999 - type: recall_at_100 value: 94.879 - type: recall_at_1000 value: 99.21799999999999 - type: recall_at_3 value: 42.603 - type: recall_at_5 value: 54.623 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: None config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 34.77701037657294 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: None config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 24.616534607718528 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: None config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 54.39039727853101 - type: mrr value: 68.89240645473332 - task: type: STS dataset: name: MTEB BIOSSES type: None config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 81.96093442776794 - type: cos_sim_spearman value: 79.80362560866212 - type: euclidean_pearson value: 81.2337598243594 - type: euclidean_spearman value: 79.80362560866212 - type: manhattan_pearson value: 80.54695854084805 - type: manhattan_spearman value: 79.70904514032895 - task: type: Classification dataset: name: MTEB Banking77Classification type: None config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 72.1103896103896 - type: f1 value: 71.0424629611518 - task: type: Clustering dataset: name: MTEB 
BiorxivClusteringP2P type: None config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 32.160979697519885 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: None config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 23.63609395107967 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: None config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 22.972 - type: map_at_10 value: 31.483 - type: map_at_100 value: 32.58 - type: map_at_1000 value: 32.732 - type: map_at_3 value: 28.822 - type: map_at_5 value: 30.412 - type: mrr_at_1 value: 28.754999999999995 - type: mrr_at_10 value: 37.302 - type: mrr_at_100 value: 38.065 - type: mrr_at_1000 value: 38.132 - type: mrr_at_3 value: 35.074 - type: mrr_at_5 value: 36.504999999999995 - type: ndcg_at_1 value: 28.754999999999995 - type: ndcg_at_10 value: 36.9 - type: ndcg_at_100 value: 41.785 - type: ndcg_at_1000 value: 44.861000000000004 - type: ndcg_at_3 value: 33.013999999999996 - type: ndcg_at_5 value: 34.966 - type: precision_at_1 value: 28.754999999999995 - type: precision_at_10 value: 7.053 - type: precision_at_100 value: 1.1860000000000002 - type: precision_at_1000 value: 0.17500000000000002 - type: precision_at_3 value: 16.023 - type: precision_at_5 value: 11.76 - type: recall_at_1 value: 22.972 - type: recall_at_10 value: 46.699 - type: recall_at_100 value: 68.476 - type: recall_at_1000 value: 89.461 - type: recall_at_3 value: 34.792 - type: recall_at_5 value: 40.453 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: None config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 18.001 - type: map_at_10 value: 24.213 - type: map_at_100 value: 25.184 - type: map_at_1000 value: 25.301000000000002 - type: map_at_3 value: 22.157 - type: map_at_5 value: 23.357 - type: mrr_at_1 value: 22.93 - type: mrr_at_10 value: 28.843000000000004 - type: mrr_at_100 value: 29.637999999999998 - type: mrr_at_1000 value: 29.706 - type: mrr_at_3 value: 26.868 - type: mrr_at_5 value: 28.021 - type: ndcg_at_1 value: 22.93 - type: ndcg_at_10 value: 28.337 - type: ndcg_at_100 value: 32.696 - type: ndcg_at_1000 value: 35.483 - type: ndcg_at_3 value: 24.909 - type: ndcg_at_5 value: 26.601999999999997 - type: precision_at_1 value: 22.93 - type: precision_at_10 value: 5.255 - type: precision_at_100 value: 0.9199999999999999 - type: precision_at_1000 value: 0.14300000000000002 - type: precision_at_3 value: 11.911 - type: precision_at_5 value: 8.599 - type: recall_at_1 value: 18.001 - type: recall_at_10 value: 36.047000000000004 - type: recall_at_100 value: 55.123999999999995 - type: recall_at_1000 value: 73.919 - type: recall_at_3 value: 26.230999999999998 - type: recall_at_5 value: 30.791 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: None config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 27.74 - type: map_at_10 value: 36.899 - type: map_at_100 value: 38.021 - type: map_at_1000 value: 38.115 - type: map_at_3 value: 34.226 - type: map_at_5 value: 35.791000000000004 - type: mrr_at_1 value: 32.038 - type: mrr_at_10 value: 40.196 - type: mrr_at_100 value: 41.099000000000004 - type: mrr_at_1000 value: 41.159 - type: mrr_at_3 value: 37.858000000000004 - type: mrr_at_5 value: 39.262 - type: 
ndcg_at_1 value: 32.038 - type: ndcg_at_10 value: 41.835 - type: ndcg_at_100 value: 46.957 - type: ndcg_at_1000 value: 49.132 - type: ndcg_at_3 value: 37.03 - type: ndcg_at_5 value: 39.466 - type: precision_at_1 value: 32.038 - type: precision_at_10 value: 6.771000000000001 - type: precision_at_100 value: 1.027 - type: precision_at_1000 value: 0.129 - type: precision_at_3 value: 16.405 - type: precision_at_5 value: 11.549 - type: recall_at_1 value: 27.74 - type: recall_at_10 value: 53.43599999999999 - type: recall_at_100 value: 76.239 - type: recall_at_1000 value: 92.038 - type: recall_at_3 value: 40.625 - type: recall_at_5 value: 46.483000000000004 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: None config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 13.71 - type: map_at_10 value: 18.269 - type: map_at_100 value: 19.095000000000002 - type: map_at_1000 value: 19.206 - type: map_at_3 value: 16.667 - type: map_at_5 value: 17.461 - type: mrr_at_1 value: 14.915000000000001 - type: mrr_at_10 value: 19.6 - type: mrr_at_100 value: 20.429 - type: mrr_at_1000 value: 20.527 - type: mrr_at_3 value: 18.041 - type: mrr_at_5 value: 18.826999999999998 - type: ndcg_at_1 value: 14.915000000000001 - type: ndcg_at_10 value: 21.197 - type: ndcg_at_100 value: 25.790999999999997 - type: ndcg_at_1000 value: 29.15 - type: ndcg_at_3 value: 17.947 - type: ndcg_at_5 value: 19.316 - type: precision_at_1 value: 14.915000000000001 - type: precision_at_10 value: 3.277 - type: precision_at_100 value: 0.601 - type: precision_at_1000 value: 0.094 - type: precision_at_3 value: 7.495 - type: precision_at_5 value: 5.266 - type: recall_at_1 value: 13.71 - type: recall_at_10 value: 29.104999999999997 - type: recall_at_100 value: 51.283 - type: recall_at_1000 value: 77.706 - type: recall_at_3 value: 20.217 - type: recall_at_5 value: 23.465 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: None config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 7.8759999999999994 - type: map_at_10 value: 11.171000000000001 - type: map_at_100 value: 12.096 - type: map_at_1000 value: 12.224 - type: map_at_3 value: 10.148 - type: map_at_5 value: 10.529 - type: mrr_at_1 value: 10.199 - type: mrr_at_10 value: 13.789000000000001 - type: mrr_at_100 value: 14.789 - type: mrr_at_1000 value: 14.887 - type: mrr_at_3 value: 12.706999999999999 - type: mrr_at_5 value: 13.142999999999999 - type: ndcg_at_1 value: 10.199 - type: ndcg_at_10 value: 13.602 - type: ndcg_at_100 value: 18.54 - type: ndcg_at_1000 value: 22.141 - type: ndcg_at_3 value: 11.569 - type: ndcg_at_5 value: 12.151 - type: precision_at_1 value: 10.199 - type: precision_at_10 value: 2.488 - type: precision_at_100 value: 0.588 - type: precision_at_1000 value: 0.10300000000000001 - type: precision_at_3 value: 5.473 - type: precision_at_5 value: 3.781 - type: recall_at_1 value: 7.8759999999999994 - type: recall_at_10 value: 18.678 - type: recall_at_100 value: 40.818 - type: recall_at_1000 value: 67.49000000000001 - type: recall_at_3 value: 12.841 - type: recall_at_5 value: 14.366999999999999 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: None config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 18.293 - type: map_at_10 value: 24.626 - type: map_at_100 value: 25.828 - type: map_at_1000 value: 25.964 - type: map_at_3 value: 22.439 - 
type: map_at_5 value: 23.541 - type: mrr_at_1 value: 22.81 - type: mrr_at_10 value: 29.213 - type: mrr_at_100 value: 30.188 - type: mrr_at_1000 value: 30.258000000000003 - type: mrr_at_3 value: 26.933 - type: mrr_at_5 value: 28.069 - type: ndcg_at_1 value: 22.81 - type: ndcg_at_10 value: 29.107 - type: ndcg_at_100 value: 34.958 - type: ndcg_at_1000 value: 37.968 - type: ndcg_at_3 value: 25.144 - type: ndcg_at_5 value: 26.769 - type: precision_at_1 value: 22.81 - type: precision_at_10 value: 5.351 - type: precision_at_100 value: 0.9939999999999999 - type: precision_at_1000 value: 0.145 - type: precision_at_3 value: 11.741999999999999 - type: precision_at_5 value: 8.431 - type: recall_at_1 value: 18.293 - type: recall_at_10 value: 38.315 - type: recall_at_100 value: 64.16199999999999 - type: recall_at_1000 value: 84.944 - type: recall_at_3 value: 27.006000000000004 - type: recall_at_5 value: 31.284 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: None config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 13.896 - type: map_at_10 value: 19.695999999999998 - type: map_at_100 value: 20.813000000000002 - type: map_at_1000 value: 20.953 - type: map_at_3 value: 17.657 - type: map_at_5 value: 18.752 - type: mrr_at_1 value: 17.122999999999998 - type: mrr_at_10 value: 23.345 - type: mrr_at_100 value: 24.294 - type: mrr_at_1000 value: 24.386 - type: mrr_at_3 value: 21.404 - type: mrr_at_5 value: 22.494 - type: ndcg_at_1 value: 17.122999999999998 - type: ndcg_at_10 value: 23.692 - type: ndcg_at_100 value: 29.012 - type: ndcg_at_1000 value: 32.45 - type: ndcg_at_3 value: 20.002 - type: ndcg_at_5 value: 21.62 - type: precision_at_1 value: 17.122999999999998 - type: precision_at_10 value: 4.543 - type: precision_at_100 value: 0.852 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 9.589 - type: precision_at_5 value: 7.1 - type: recall_at_1 value: 13.896 - type: recall_at_10 value: 32.176 - type: recall_at_100 value: 55.382 - type: recall_at_1000 value: 79.725 - type: recall_at_3 value: 21.942 - type: recall_at_5 value: 26.068 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 15.481333333333335 - type: map_at_10 value: 21.042999999999996 - type: map_at_100 value: 22.0115 - type: map_at_1000 value: 22.138250000000003 - type: map_at_3 value: 19.255166666666664 - type: map_at_5 value: 20.23483333333333 - type: mrr_at_1 value: 18.692583333333335 - type: mrr_at_10 value: 24.281 - type: mrr_at_100 value: 25.134249999999998 - type: mrr_at_1000 value: 25.218833333333336 - type: mrr_at_3 value: 22.54816666666667 - type: mrr_at_5 value: 23.507916666666667 - type: ndcg_at_1 value: 18.692583333333335 - type: ndcg_at_10 value: 24.682166666666667 - type: ndcg_at_100 value: 29.43166666666666 - type: ndcg_at_1000 value: 32.59633333333334 - type: ndcg_at_3 value: 21.481749999999998 - type: ndcg_at_5 value: 22.93933333333333 - type: precision_at_1 value: 18.692583333333335 - type: precision_at_10 value: 4.370916666666667 - type: precision_at_100 value: 0.8024999999999999 - type: precision_at_1000 value: 0.12566666666666668 - type: precision_at_3 value: 9.923833333333334 - type: precision_at_5 value: 7.110416666666667 - type: recall_at_1 value: 15.481333333333335 - type: recall_at_10 value: 32.433166666666665 - type: recall_at_100 value: 54.03975 - type: recall_at_1000 
value: 77.06675 - type: recall_at_3 value: 23.353916666666663 - type: recall_at_5 value: 27.16183333333334 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: None config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 12.656999999999998 - type: map_at_10 value: 16.59 - type: map_at_100 value: 17.372 - type: map_at_1000 value: 17.465 - type: map_at_3 value: 15.075 - type: map_at_5 value: 16.016 - type: mrr_at_1 value: 14.877 - type: mrr_at_10 value: 18.726000000000003 - type: mrr_at_100 value: 19.488 - type: mrr_at_1000 value: 19.569 - type: mrr_at_3 value: 17.127 - type: mrr_at_5 value: 18.108 - type: ndcg_at_1 value: 14.877 - type: ndcg_at_10 value: 19.326 - type: ndcg_at_100 value: 23.426 - type: ndcg_at_1000 value: 26.168999999999997 - type: ndcg_at_3 value: 16.445 - type: ndcg_at_5 value: 18.037 - type: precision_at_1 value: 14.877 - type: precision_at_10 value: 3.206 - type: precision_at_100 value: 0.5740000000000001 - type: precision_at_1000 value: 0.08800000000000001 - type: precision_at_3 value: 7.26 - type: precision_at_5 value: 5.367999999999999 - type: recall_at_1 value: 12.656999999999998 - type: recall_at_10 value: 25.723000000000003 - type: recall_at_100 value: 44.9 - type: recall_at_1000 value: 65.923 - type: recall_at_3 value: 17.854 - type: recall_at_5 value: 21.912000000000003 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: None config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 7.997999999999999 - type: map_at_10 value: 11.765 - type: map_at_100 value: 12.453 - type: map_at_1000 value: 12.575 - type: map_at_3 value: 10.721 - type: map_at_5 value: 11.269 - type: mrr_at_1 value: 9.945 - type: mrr_at_10 value: 14.172 - type: mrr_at_100 value: 14.862 - type: mrr_at_1000 value: 14.965 - type: mrr_at_3 value: 13.048000000000002 - type: mrr_at_5 value: 13.638 - type: ndcg_at_1 value: 9.945 - type: ndcg_at_10 value: 14.238000000000001 - type: ndcg_at_100 value: 18.052 - type: ndcg_at_1000 value: 21.633 - type: ndcg_at_3 value: 12.301 - type: ndcg_at_5 value: 13.113 - type: precision_at_1 value: 9.945 - type: precision_at_10 value: 2.636 - type: precision_at_100 value: 0.543 - type: precision_at_1000 value: 0.101 - type: precision_at_3 value: 5.9990000000000006 - type: precision_at_5 value: 4.253 - type: recall_at_1 value: 7.997999999999999 - type: recall_at_10 value: 19.363 - type: recall_at_100 value: 37.203 - type: recall_at_1000 value: 63.9 - type: recall_at_3 value: 13.755999999999998 - type: recall_at_5 value: 15.966 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: None config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 14.132 - type: map_at_10 value: 19.032 - type: map_at_100 value: 19.942 - type: map_at_1000 value: 20.061999999999998 - type: map_at_3 value: 17.498 - type: map_at_5 value: 18.352 - type: mrr_at_1 value: 16.698 - type: mrr_at_10 value: 21.898 - type: mrr_at_100 value: 22.775000000000002 - type: mrr_at_1000 value: 22.869999999999997 - type: mrr_at_3 value: 20.196 - type: mrr_at_5 value: 21.143 - type: ndcg_at_1 value: 16.698 - type: ndcg_at_10 value: 22.303 - type: ndcg_at_100 value: 26.889000000000003 - type: ndcg_at_1000 value: 30.249 - type: ndcg_at_3 value: 19.28 - type: ndcg_at_5 value: 20.694000000000003 - type: precision_at_1 value: 16.698 - type: precision_at_10 value: 3.7409999999999997 - type: 
precision_at_100 value: 0.6649999999999999 - type: precision_at_1000 value: 0.107 - type: precision_at_3 value: 8.706 - type: precision_at_5 value: 6.119 - type: recall_at_1 value: 14.132 - type: recall_at_10 value: 29.572 - type: recall_at_100 value: 50.346999999999994 - type: recall_at_1000 value: 75.214 - type: recall_at_3 value: 21.197 - type: recall_at_5 value: 24.887999999999998 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: None config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 16.474 - type: map_at_10 value: 22.362000000000002 - type: map_at_100 value: 23.533 - type: map_at_1000 value: 23.733999999999998 - type: map_at_3 value: 20.529 - type: map_at_5 value: 21.543 - type: mrr_at_1 value: 20.158 - type: mrr_at_10 value: 26.069 - type: mrr_at_100 value: 26.962999999999997 - type: mrr_at_1000 value: 27.049 - type: mrr_at_3 value: 24.44 - type: mrr_at_5 value: 25.3 - type: ndcg_at_1 value: 20.158 - type: ndcg_at_10 value: 26.447 - type: ndcg_at_100 value: 31.405 - type: ndcg_at_1000 value: 34.969 - type: ndcg_at_3 value: 23.639 - type: ndcg_at_5 value: 24.852 - type: precision_at_1 value: 20.158 - type: precision_at_10 value: 5.099 - type: precision_at_100 value: 1.113 - type: precision_at_1000 value: 0.196 - type: precision_at_3 value: 11.397 - type: precision_at_5 value: 8.182 - type: recall_at_1 value: 16.474 - type: recall_at_10 value: 33.812 - type: recall_at_100 value: 56.725 - type: recall_at_1000 value: 81.151 - type: recall_at_3 value: 25.043 - type: recall_at_5 value: 28.564 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval type: None config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 12.027000000000001 - type: map_at_10 value: 16.41 - type: map_at_100 value: 17.221 - type: map_at_1000 value: 17.328 - type: map_at_3 value: 15.123000000000001 - type: map_at_5 value: 15.795 - type: mrr_at_1 value: 13.863 - type: mrr_at_10 value: 18.218999999999998 - type: mrr_at_100 value: 19.021 - type: mrr_at_1000 value: 19.118 - type: mrr_at_3 value: 16.882 - type: mrr_at_5 value: 17.585 - type: ndcg_at_1 value: 13.863 - type: ndcg_at_10 value: 19.201999999999998 - type: ndcg_at_100 value: 23.669 - type: ndcg_at_1000 value: 26.951000000000004 - type: ndcg_at_3 value: 16.500999999999998 - type: ndcg_at_5 value: 17.686 - type: precision_at_1 value: 13.863 - type: precision_at_10 value: 3.031 - type: precision_at_100 value: 0.567 - type: precision_at_1000 value: 0.094 - type: precision_at_3 value: 7.086 - type: precision_at_5 value: 4.917 - type: recall_at_1 value: 12.027000000000001 - type: recall_at_10 value: 26.272000000000002 - type: recall_at_100 value: 47.818 - type: recall_at_1000 value: 73.33 - type: recall_at_3 value: 18.743000000000002 - type: recall_at_5 value: 21.701 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: None config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 6.357 - type: map_at_10 value: 11.350999999999999 - type: map_at_100 value: 12.774 - type: map_at_1000 value: 12.962000000000002 - type: map_at_3 value: 9.142 - type: map_at_5 value: 10.219000000000001 - type: mrr_at_1 value: 14.593 - type: mrr_at_10 value: 23.003 - type: mrr_at_100 value: 24.15 - type: mrr_at_1000 value: 24.215999999999998 - type: mrr_at_3 value: 19.924 - type: mrr_at_5 value: 21.628 - type: ndcg_at_1 value: 14.593 - type: ndcg_at_10 value: 17.06 
- type: ndcg_at_100 value: 23.674 - type: ndcg_at_1000 value: 27.57 - type: ndcg_at_3 value: 12.903 - type: ndcg_at_5 value: 14.399000000000001 - type: precision_at_1 value: 14.593 - type: precision_at_10 value: 5.6739999999999995 - type: precision_at_100 value: 1.279 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 9.794 - type: precision_at_5 value: 7.961 - type: recall_at_1 value: 6.357 - type: recall_at_10 value: 21.837 - type: recall_at_100 value: 45.317 - type: recall_at_1000 value: 67.868 - type: recall_at_3 value: 11.959999999999999 - type: recall_at_5 value: 15.744 - task: type: Retrieval dataset: name: MTEB DBPedia type: None config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 3.972 - type: map_at_10 value: 9.464 - type: map_at_100 value: 13.014999999999999 - type: map_at_1000 value: 13.956 - type: map_at_3 value: 6.796 - type: map_at_5 value: 7.896 - type: mrr_at_1 value: 40.0 - type: mrr_at_10 value: 49.381 - type: mrr_at_100 value: 50.156 - type: mrr_at_1000 value: 50.17700000000001 - type: mrr_at_3 value: 46.208 - type: mrr_at_5 value: 47.958 - type: ndcg_at_1 value: 29.5 - type: ndcg_at_10 value: 23.438 - type: ndcg_at_100 value: 26.128 - type: ndcg_at_1000 value: 32.922000000000004 - type: ndcg_at_3 value: 26.436999999999998 - type: ndcg_at_5 value: 24.63 - type: precision_at_1 value: 40.0 - type: precision_at_10 value: 20.724999999999998 - type: precision_at_100 value: 6.353000000000001 - type: precision_at_1000 value: 1.329 - type: precision_at_3 value: 31.5 - type: precision_at_5 value: 26.400000000000002 - type: recall_at_1 value: 3.972 - type: recall_at_10 value: 14.173 - type: recall_at_100 value: 32.249 - type: recall_at_1000 value: 54.991 - type: recall_at_3 value: 8.177 - type: recall_at_5 value: 10.415000000000001 - task: type: Classification dataset: name: MTEB EmotionClassification type: None config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 44.775 - type: f1 value: 40.9777201408297 - task: type: Retrieval dataset: name: MTEB FEVER type: None config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 15.373000000000001 - type: map_at_10 value: 23.247999999999998 - type: map_at_100 value: 24.16 - type: map_at_1000 value: 24.233 - type: map_at_3 value: 20.718 - type: map_at_5 value: 22.117 - type: mrr_at_1 value: 16.381999999999998 - type: mrr_at_10 value: 24.654999999999998 - type: mrr_at_100 value: 25.56 - type: mrr_at_1000 value: 25.625999999999998 - type: mrr_at_3 value: 21.987000000000002 - type: mrr_at_5 value: 23.466 - type: ndcg_at_1 value: 16.381999999999998 - type: ndcg_at_10 value: 28.083000000000002 - type: ndcg_at_100 value: 32.939 - type: ndcg_at_1000 value: 35.025 - type: ndcg_at_3 value: 22.830000000000002 - type: ndcg_at_5 value: 25.351000000000003 - type: precision_at_1 value: 16.381999999999998 - type: precision_at_10 value: 4.5600000000000005 - type: precision_at_100 value: 0.722 - type: precision_at_1000 value: 0.092 - type: precision_at_3 value: 9.921000000000001 - type: precision_at_5 value: 7.276000000000001 - type: recall_at_1 value: 15.373000000000001 - type: recall_at_10 value: 41.942 - type: recall_at_100 value: 65.051 - type: recall_at_1000 value: 81.208 - type: recall_at_3 value: 27.639999999999997 - type: recall_at_5 value: 33.708 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: None config: default split: test revision: 
27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 6.894 - type: map_at_10 value: 11.912 - type: map_at_100 value: 13.096 - type: map_at_1000 value: 13.29 - type: map_at_3 value: 9.82 - type: map_at_5 value: 10.999 - type: mrr_at_1 value: 14.352 - type: mrr_at_10 value: 20.811 - type: mrr_at_100 value: 21.908 - type: mrr_at_1000 value: 22.001 - type: mrr_at_3 value: 18.441 - type: mrr_at_5 value: 19.961000000000002 - type: ndcg_at_1 value: 14.352 - type: ndcg_at_10 value: 16.636 - type: ndcg_at_100 value: 22.419 - type: ndcg_at_1000 value: 26.771 - type: ndcg_at_3 value: 13.436 - type: ndcg_at_5 value: 14.908 - type: precision_at_1 value: 14.352 - type: precision_at_10 value: 4.938 - type: precision_at_100 value: 1.076 - type: precision_at_1000 value: 0.18 - type: precision_at_3 value: 9.156 - type: precision_at_5 value: 7.407 - type: recall_at_1 value: 6.894 - type: recall_at_10 value: 21.672 - type: recall_at_100 value: 44.193 - type: recall_at_1000 value: 71.604 - type: recall_at_3 value: 12.498 - type: recall_at_5 value: 16.704 - task: type: Retrieval dataset: name: MTEB HotpotQA type: None config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 18.555 - type: map_at_10 value: 25.963 - type: map_at_100 value: 26.932000000000002 - type: map_at_1000 value: 27.044 - type: map_at_3 value: 23.916 - type: map_at_5 value: 25.112000000000002 - type: mrr_at_1 value: 37.11 - type: mrr_at_10 value: 44.175 - type: mrr_at_100 value: 44.926 - type: mrr_at_1000 value: 44.978 - type: mrr_at_3 value: 42.254999999999995 - type: mrr_at_5 value: 43.427 - type: ndcg_at_1 value: 37.11 - type: ndcg_at_10 value: 32.991 - type: ndcg_at_100 value: 37.335 - type: ndcg_at_1000 value: 40.007 - type: ndcg_at_3 value: 29.206 - type: ndcg_at_5 value: 31.173000000000002 - type: precision_at_1 value: 37.11 - type: precision_at_10 value: 7.207 - type: precision_at_100 value: 1.065 - type: precision_at_1000 value: 0.14200000000000002 - type: precision_at_3 value: 18.375 - type: precision_at_5 value: 12.581000000000001 - type: recall_at_1 value: 18.555 - type: recall_at_10 value: 36.036 - type: recall_at_100 value: 53.248 - type: recall_at_1000 value: 71.128 - type: recall_at_3 value: 27.561999999999998 - type: recall_at_5 value: 31.452 - task: type: Classification dataset: name: MTEB ImdbClassification type: None config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 67.5052 - type: ap value: 62.39030828629721 - type: f1 value: 67.18333662684846 - task: type: Retrieval dataset: name: MTEB MSMARCO type: None config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 7.042 - type: map_at_10 value: 11.837 - type: map_at_100 value: 12.756 - type: map_at_1000 value: 12.863 - type: map_at_3 value: 10.131 - type: map_at_5 value: 11.05 - type: mrr_at_1 value: 7.2059999999999995 - type: mrr_at_10 value: 12.117 - type: mrr_at_100 value: 13.038 - type: mrr_at_1000 value: 13.141 - type: mrr_at_3 value: 10.392 - type: mrr_at_5 value: 11.323 - type: ndcg_at_1 value: 7.178 - type: ndcg_at_10 value: 14.806 - type: ndcg_at_100 value: 19.81 - type: ndcg_at_1000 value: 23.003999999999998 - type: ndcg_at_3 value: 11.236 - type: ndcg_at_5 value: 12.901000000000002 - type: precision_at_1 value: 7.178 - type: precision_at_10 value: 2.506 - type: precision_at_100 value: 0.51 - type: precision_at_1000 value: 0.079 - type: precision_at_3 value: 4.89 - type: 
precision_at_5 value: 3.782 - type: recall_at_1 value: 7.042 - type: recall_at_10 value: 24.037 - type: recall_at_100 value: 48.415 - type: recall_at_1000 value: 74.039 - type: recall_at_3 value: 14.194999999999999 - type: recall_at_5 value: 18.209 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: None config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 91.73050615595074 - type: f1 value: 91.31113807339747 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: None config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 65.65435476516187 - type: f1 value: 45.186713172025684 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: None config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 65.97175521183591 - type: f1 value: 63.30094106953352 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: None config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 72.81775386684599 - type: f1 value: 71.5535406261331 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: None config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 28.530997915529994 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: None config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 25.711540056372872 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: None config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 30.199045062705064 - type: mrr value: 31.1642426854302 - task: type: Retrieval dataset: name: MTEB NFCorpus type: None config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 3.9730000000000003 - type: map_at_10 value: 8.282 - type: map_at_100 value: 10.331 - type: map_at_1000 value: 11.613 - type: map_at_3 value: 6.106 - type: map_at_5 value: 7.258000000000001 - type: mrr_at_1 value: 35.604 - type: mrr_at_10 value: 44.241 - type: mrr_at_100 value: 45.023 - type: mrr_at_1000 value: 45.079 - type: mrr_at_3 value: 42.002 - type: mrr_at_5 value: 43.751 - type: ndcg_at_1 value: 32.663 - type: ndcg_at_10 value: 25.419999999999998 - type: ndcg_at_100 value: 23.454 - type: ndcg_at_1000 value: 32.726 - type: ndcg_at_3 value: 28.892 - type: ndcg_at_5 value: 27.982000000000003 - type: precision_at_1 value: 35.604 - type: precision_at_10 value: 18.7 - type: precision_at_100 value: 6.353000000000001 - type: precision_at_1000 value: 1.9429999999999998 - type: precision_at_3 value: 27.554000000000002 - type: precision_at_5 value: 24.396 - type: recall_at_1 value: 3.9730000000000003 - type: recall_at_10 value: 12.606 - type: recall_at_100 value: 24.915000000000003 - type: recall_at_1000 value: 57.75900000000001 - type: recall_at_3 value: 7.207 - type: recall_at_5 value: 10.017 - task: type: Retrieval dataset: name: MTEB NQ type: None config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 9.543 - type: map_at_10 value: 16.445999999999998 - type: map_at_100 value: 17.682000000000002 - type: map_at_1000 value: 17.78 - type: map_at_3 value: 13.895 - type: map_at_5 value: 
15.282000000000002 - type: mrr_at_1 value: 10.863 - type: mrr_at_10 value: 18.137 - type: mrr_at_100 value: 19.291 - type: mrr_at_1000 value: 19.371 - type: mrr_at_3 value: 15.556000000000001 - type: mrr_at_5 value: 16.98 - type: ndcg_at_1 value: 10.834000000000001 - type: ndcg_at_10 value: 20.96 - type: ndcg_at_100 value: 27.336 - type: ndcg_at_1000 value: 30.001 - type: ndcg_at_3 value: 15.719 - type: ndcg_at_5 value: 18.212999999999997 - type: precision_at_1 value: 10.834000000000001 - type: precision_at_10 value: 3.911 - type: precision_at_100 value: 0.756 - type: precision_at_1000 value: 0.101 - type: precision_at_3 value: 7.455 - type: precision_at_5 value: 5.846 - type: recall_at_1 value: 9.543 - type: recall_at_10 value: 33.35 - type: recall_at_100 value: 63.141999999999996 - type: recall_at_1000 value: 83.57 - type: recall_at_3 value: 19.38 - type: recall_at_5 value: 25.266 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: None config: default split: test revision: None metrics: - type: map_at_1 value: 63.660000000000004 - type: map_at_10 value: 76.48 - type: map_at_100 value: 77.24 - type: map_at_1000 value: 77.275 - type: map_at_3 value: 73.52199999999999 - type: map_at_5 value: 75.323 - type: mrr_at_1 value: 73.3 - type: mrr_at_10 value: 80.741 - type: mrr_at_100 value: 80.975 - type: mrr_at_1000 value: 80.979 - type: mrr_at_3 value: 79.282 - type: mrr_at_5 value: 80.24900000000001 - type: ndcg_at_1 value: 73.32 - type: ndcg_at_10 value: 81.172 - type: ndcg_at_100 value: 83.22800000000001 - type: ndcg_at_1000 value: 83.576 - type: ndcg_at_3 value: 77.586 - type: ndcg_at_5 value: 79.46600000000001 - type: precision_at_1 value: 73.32 - type: precision_at_10 value: 12.246 - type: precision_at_100 value: 1.459 - type: precision_at_1000 value: 0.155 - type: precision_at_3 value: 33.607 - type: precision_at_5 value: 22.214 - type: recall_at_1 value: 63.660000000000004 - type: recall_at_10 value: 90.147 - type: recall_at_100 value: 97.882 - type: recall_at_1000 value: 99.705 - type: recall_at_3 value: 79.948 - type: recall_at_5 value: 85.15 - type: map_at_1 value: 3.003 - type: map_at_10 value: 7.0169999999999995 - type: map_at_100 value: 8.436 - type: map_at_1000 value: 8.693 - type: map_at_3 value: 5.143 - type: map_at_5 value: 6.165 - type: mrr_at_1 value: 14.7 - type: mrr_at_10 value: 22.664 - type: mrr_at_100 value: 23.880000000000003 - type: mrr_at_1000 value: 23.964 - type: mrr_at_3 value: 19.650000000000002 - type: mrr_at_5 value: 21.295 - type: ndcg_at_1 value: 14.7 - type: ndcg_at_10 value: 12.509999999999998 - type: ndcg_at_100 value: 18.848000000000003 - type: ndcg_at_1000 value: 23.97 - type: ndcg_at_3 value: 11.673 - type: ndcg_at_5 value: 10.397 - type: precision_at_1 value: 14.7 - type: precision_at_10 value: 6.49 - type: precision_at_100 value: 1.562 - type: precision_at_1000 value: 0.27899999999999997 - type: precision_at_3 value: 10.767 - type: precision_at_5 value: 9.139999999999999 - type: recall_at_1 value: 3.003 - type: recall_at_10 value: 13.161999999999999 - type: recall_at_100 value: 31.747999999999998 - type: recall_at_1000 value: 56.752 - type: recall_at_3 value: 6.563 - type: recall_at_5 value: 9.263 - type: map_at_1 value: 0.125 - type: map_at_10 value: 0.683 - type: map_at_100 value: 3.88 - type: map_at_1000 value: 10.776 - type: map_at_3 value: 0.28200000000000003 - type: map_at_5 value: 0.416 - type: mrr_at_1 value: 56.00000000000001 - type: mrr_at_10 value: 67.144 - type: mrr_at_100 value: 67.674 - type: mrr_at_1000 value: 67.674 - 
type: mrr_at_3 value: 63.333 - type: mrr_at_5 value: 66.033 - type: ndcg_at_1 value: 48.0 - type: ndcg_at_10 value: 40.453 - type: ndcg_at_100 value: 32.356 - type: ndcg_at_1000 value: 30.54 - type: ndcg_at_3 value: 45.531 - type: ndcg_at_5 value: 43.791999999999994 - type: precision_at_1 value: 54.0 - type: precision_at_10 value: 43.2 - type: precision_at_100 value: 34.12 - type: precision_at_1000 value: 15.192 - type: precision_at_3 value: 48.667 - type: precision_at_5 value: 47.199999999999996 - type: recall_at_1 value: 0.125 - type: recall_at_10 value: 0.9490000000000001 - type: recall_at_100 value: 7.066 - type: recall_at_1000 value: 29.948000000000004 - type: recall_at_3 value: 0.313 - type: recall_at_5 value: 0.526 - task: type: Clustering dataset: name: MTEB RedditClustering type: None config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 37.24530383149719 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: None config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 47.10522668186171 - task: type: STS dataset: name: MTEB SICK-R type: None config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 76.5160077089625 - type: cos_sim_spearman value: 67.28825297023138 - type: euclidean_pearson value: 72.39938443269206 - type: euclidean_spearman value: 67.28835245540397 - type: manhattan_pearson value: 69.46413862678756 - type: manhattan_spearman value: 65.04853993701172 - task: type: STS dataset: name: MTEB STS12 type: None config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 70.89271048242642 - type: cos_sim_spearman value: 66.18310956468201 - type: euclidean_pearson value: 68.35445603238207 - type: euclidean_spearman value: 66.18456540329906 - type: manhattan_pearson value: 67.8411114817822 - type: manhattan_spearman value: 66.416716585612 - task: type: STS dataset: name: MTEB STS13 type: None config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 78.2216356861313 - type: cos_sim_spearman value: 79.37038668590753 - type: euclidean_pearson value: 79.01512518225226 - type: euclidean_spearman value: 79.37042448746669 - type: manhattan_pearson value: 78.96268955680836 - type: manhattan_spearman value: 79.54073298193023 - task: type: STS dataset: name: MTEB STS14 type: None config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 78.3544215128133 - type: cos_sim_spearman value: 75.07229525913817 - type: euclidean_pearson value: 77.35598390483041 - type: euclidean_spearman value: 75.07228556747974 - type: manhattan_pearson value: 76.27348311336605 - type: manhattan_spearman value: 74.50258040498937 - task: type: STS dataset: name: MTEB STS15 type: None config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 80.86410111924121 - type: cos_sim_spearman value: 81.79657437718866 - type: euclidean_pearson value: 81.77144036632458 - type: euclidean_spearman value: 81.79657286849607 - type: manhattan_pearson value: 81.87491956950679 - type: manhattan_spearman value: 82.16993847726854 - task: type: STS dataset: name: MTEB STS16 type: None config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 
76.43507688364112 - type: cos_sim_spearman value: 77.63882301316933 - type: euclidean_pearson value: 77.25501398026381 - type: euclidean_spearman value: 77.63965196736244 - type: manhattan_pearson value: 77.67118978923139 - type: manhattan_spearman value: 78.01084214592416 - task: type: STS dataset: name: MTEB STS17 (en-en) type: None config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 84.39964672680482 - type: cos_sim_spearman value: 85.4075592513342 - type: euclidean_pearson value: 85.111606756296 - type: euclidean_spearman value: 85.40843260765956 - type: manhattan_pearson value: 84.8842901249278 - type: manhattan_spearman value: 85.63868618596224 - task: type: STS dataset: name: MTEB STS22 (en) type: None config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 62.75456403534724 - type: cos_sim_spearman value: 60.22663871632273 - type: euclidean_pearson value: 62.65086137572171 - type: euclidean_spearman value: 60.22663871632273 - type: manhattan_pearson value: 62.250953520717104 - type: manhattan_spearman value: 60.3533574497436 - task: type: STS dataset: name: MTEB STSBenchmark type: None config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 77.7231724327084 - type: cos_sim_spearman value: 76.94587277885458 - type: euclidean_pearson value: 78.13987744447253 - type: euclidean_spearman value: 76.94589124562322 - type: manhattan_pearson value: 77.01673792666305 - type: manhattan_spearman value: 75.80700280973542 - task: type: Reranking dataset: name: MTEB SciDocsRR type: None config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 72.367197197921 - type: mrr value: 91.09422258932064 - task: type: Retrieval dataset: name: MTEB SciFact type: None config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 37.583 - type: map_at_10 value: 45.412 - type: map_at_100 value: 46.504 - type: map_at_1000 value: 46.558 - type: map_at_3 value: 42.552 - type: map_at_5 value: 44.635000000000005 - type: mrr_at_1 value: 40.0 - type: mrr_at_10 value: 47.33 - type: mrr_at_100 value: 48.285 - type: mrr_at_1000 value: 48.329 - type: mrr_at_3 value: 44.944 - type: mrr_at_5 value: 46.711000000000006 - type: ndcg_at_1 value: 40.0 - type: ndcg_at_10 value: 49.818 - type: ndcg_at_100 value: 55.226 - type: ndcg_at_1000 value: 56.599999999999994 - type: ndcg_at_3 value: 44.659 - type: ndcg_at_5 value: 48.107 - type: precision_at_1 value: 40.0 - type: precision_at_10 value: 6.833 - type: precision_at_100 value: 0.98 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 17.444000000000003 - type: precision_at_5 value: 12.333 - type: recall_at_1 value: 37.583 - type: recall_at_10 value: 61.622 - type: recall_at_100 value: 87.1 - type: recall_at_1000 value: 97.8 - type: recall_at_3 value: 47.983 - type: recall_at_5 value: 56.65 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: None config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.70990099009902 - type: cos_sim_ap value: 91.32913696282823 - type: cos_sim_f1 value: 85.01006036217304 - type: cos_sim_precision value: 85.52631578947368 - type: cos_sim_recall value: 84.5 - type: dot_accuracy value: 99.70990099009902 - type: dot_ap value: 91.32913696282823 - type: dot_f1 
value: 85.01006036217304 - type: dot_precision value: 85.52631578947368 - type: dot_recall value: 84.5 - type: euclidean_accuracy value: 99.70990099009902 - type: euclidean_ap value: 91.32913696282823 - type: euclidean_f1 value: 85.01006036217304 - type: euclidean_precision value: 85.52631578947368 - type: euclidean_recall value: 84.5 - type: manhattan_accuracy value: 99.76138613861386 - type: manhattan_ap value: 93.79556639749748 - type: manhattan_f1 value: 87.80246913580247 - type: manhattan_precision value: 86.73170731707317 - type: manhattan_recall value: 88.9 - type: max_accuracy value: 99.76138613861386 - type: max_ap value: 93.79556639749748 - type: max_f1 value: 87.80246913580247 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: None config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 43.31369355223715 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: None config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 29.601772320922777 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: None config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 45.58773371195953 - type: mrr value: 46.30187112723877 - task: type: Summarization dataset: name: MTEB SummEval type: None config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.0154193888818 - type: cos_sim_spearman value: 30.147164982667924 - type: dot_pearson value: 30.015419367262712 - type: dot_spearman value: 30.1547894792066 - task: type: Retrieval dataset: name: MTEB Touche2020 type: None config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 1.494 - type: map_at_10 value: 8.271 - type: map_at_100 value: 13.59 - type: map_at_1000 value: 15.18 - type: map_at_3 value: 4.232 - type: map_at_5 value: 5.656 - type: mrr_at_1 value: 26.531 - type: mrr_at_10 value: 42.504999999999995 - type: mrr_at_100 value: 43.318 - type: mrr_at_1000 value: 43.318 - type: mrr_at_3 value: 39.456 - type: mrr_at_5 value: 39.966 - type: ndcg_at_1 value: 24.490000000000002 - type: ndcg_at_10 value: 22.358 - type: ndcg_at_100 value: 33.625 - type: ndcg_at_1000 value: 45.211 - type: ndcg_at_3 value: 26.345000000000002 - type: ndcg_at_5 value: 22.743 - type: precision_at_1 value: 26.531 - type: precision_at_10 value: 20.612 - type: precision_at_100 value: 7.5920000000000005 - type: precision_at_1000 value: 1.494 - type: precision_at_3 value: 28.571 - type: precision_at_5 value: 22.857 - type: recall_at_1 value: 1.494 - type: recall_at_10 value: 14.657 - type: recall_at_100 value: 45.273 - type: recall_at_1000 value: 80.66 - type: recall_at_3 value: 5.904 - type: recall_at_5 value: 8.053 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: None config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 68.7092 - type: ap value: 13.166630913914243 - type: f1 value: 52.79567185490722 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: None config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 54.9660441426146 - type: f1 value: 55.17567905972333 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: 
None config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 37.58792693202503 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: None config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 83.88269654884664 - type: cos_sim_ap value: 66.09276985843528 - type: cos_sim_f1 value: 63.225649744959924 - type: cos_sim_precision value: 58.573357335733576 - type: cos_sim_recall value: 68.68073878627968 - type: dot_accuracy value: 83.88269654884664 - type: dot_ap value: 66.09276747019544 - type: dot_f1 value: 63.225649744959924 - type: dot_precision value: 58.573357335733576 - type: dot_recall value: 68.68073878627968 - type: euclidean_accuracy value: 83.88269654884664 - type: euclidean_ap value: 66.09276985843528 - type: euclidean_f1 value: 63.225649744959924 - type: euclidean_precision value: 58.573357335733576 - type: euclidean_recall value: 68.68073878627968 - type: manhattan_accuracy value: 82.69058830541813 - type: manhattan_ap value: 62.74574997540533 - type: manhattan_f1 value: 59.96326905417815 - type: manhattan_precision value: 53.06785859406745 - type: manhattan_recall value: 68.91820580474935 - type: max_accuracy value: 83.88269654884664 - type: max_ap value: 66.09276985843528 - type: max_f1 value: 63.225649744959924 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: None config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 87.57519307641557 - type: cos_sim_ap value: 83.25474211186804 - type: cos_sim_f1 value: 75.56529680365297 - type: cos_sim_precision value: 71.89129074859248 - type: cos_sim_recall value: 79.63504773637203 - type: dot_accuracy value: 87.57519307641557 - type: dot_ap value: 83.25474240805171 - type: dot_f1 value: 75.56529680365297 - type: dot_precision value: 71.89129074859248 - type: dot_recall value: 79.63504773637203 - type: euclidean_accuracy value: 87.57519307641557 - type: euclidean_ap value: 83.25474211186805 - type: euclidean_f1 value: 75.56529680365297 - type: euclidean_precision value: 71.89129074859248 - type: euclidean_recall value: 79.63504773637203 - type: manhattan_accuracy value: 87.60041914076145 - type: manhattan_ap value: 83.11911507311108 - type: manhattan_f1 value: 75.27478546649627 - type: manhattan_precision value: 71.59130374383552 - type: manhattan_recall value: 79.35786880197105 - type: max_accuracy value: 87.60041914076145 - type: max_ap value: 83.25474240805171 - type: max_f1 value: 75.56529680365297 ---
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
jinaai/jina-embedding-l-en-v1
jinaai
sentence-similarity
[ "sentence-transformers", "pytorch", "t5", "finetuner", "mteb", "feature-extraction", "sentence-similarity", "custom_code", "en", "dataset:jinaai/negation-dataset", "arxiv:2307.11224", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,688
1,736
561
24
--- datasets: - jinaai/negation-dataset language: en license: apache-2.0 pipeline_tag: sentence-similarity tags: - finetuner - mteb - sentence-transformers - feature-extraction - sentence-similarity model-index: - name: jina-triplets-large results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 68.92537313432835 - type: ap value: 29.723758877632513 - type: f1 value: 61.909704211663794 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 69.13669999999999 - type: ap value: 65.30216072238086 - type: f1 value: 67.1890891071034 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 31.384 - type: f1 value: 30.016752348953723 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 23.613 - type: map_at_10 value: 37.897 - type: map_at_100 value: 39.093 - type: map_at_1000 value: 39.109 - type: map_at_3 value: 32.824 - type: map_at_5 value: 35.679 - type: mrr_at_1 value: 23.826 - type: mrr_at_10 value: 37.997 - type: mrr_at_100 value: 39.186 - type: mrr_at_1000 value: 39.202 - type: mrr_at_3 value: 32.918 - type: mrr_at_5 value: 35.748999999999995 - type: ndcg_at_1 value: 23.613 - type: ndcg_at_10 value: 46.482 - type: ndcg_at_100 value: 51.55499999999999 - type: ndcg_at_1000 value: 51.974 - type: ndcg_at_3 value: 35.964 - type: ndcg_at_5 value: 41.144999999999996 - type: precision_at_1 value: 23.613 - type: precision_at_10 value: 7.417999999999999 - type: precision_at_100 value: 0.963 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 15.031 - type: precision_at_5 value: 11.55 - type: recall_at_1 value: 23.613 - type: recall_at_10 value: 74.182 - type: recall_at_100 value: 96.30199999999999 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 45.092 - type: recall_at_5 value: 57.752 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 40.51285742156528 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 31.5825964077496 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 62.830281630546835 - type: mrr value: 75.93072593765115 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 87.26764516732737 - type: cos_sim_spearman value: 84.42541766631741 - type: euclidean_pearson value: 48.71357447655235 - type: euclidean_spearman value: 49.2023259276511 - type: manhattan_pearson value: 48.36366272727299 - type: manhattan_spearman value: 48.457128224924354 - task: type: 
Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 85.3409090909091 - type: f1 value: 85.25262617676835 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 33.560193912974974 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 28.4426572644577 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 27.822999999999997 - type: map_at_10 value: 39.088 - type: map_at_100 value: 40.561 - type: map_at_1000 value: 40.69 - type: map_at_3 value: 35.701 - type: map_at_5 value: 37.556 - type: mrr_at_1 value: 33.906 - type: mrr_at_10 value: 44.527 - type: mrr_at_100 value: 45.403999999999996 - type: mrr_at_1000 value: 45.452 - type: mrr_at_3 value: 41.726 - type: mrr_at_5 value: 43.314 - type: ndcg_at_1 value: 33.906 - type: ndcg_at_10 value: 45.591 - type: ndcg_at_100 value: 51.041000000000004 - type: ndcg_at_1000 value: 53.1 - type: ndcg_at_3 value: 40.324 - type: ndcg_at_5 value: 42.723 - type: precision_at_1 value: 33.906 - type: precision_at_10 value: 8.655 - type: precision_at_100 value: 1.418 - type: precision_at_1000 value: 0.19499999999999998 - type: precision_at_3 value: 19.123 - type: precision_at_5 value: 13.963000000000001 - type: recall_at_1 value: 27.822999999999997 - type: recall_at_10 value: 58.63699999999999 - type: recall_at_100 value: 80.874 - type: recall_at_1000 value: 93.82000000000001 - type: recall_at_3 value: 44.116 - type: recall_at_5 value: 50.178999999999995 - type: map_at_1 value: 26.823999999999998 - type: map_at_10 value: 37.006 - type: map_at_100 value: 38.256 - type: map_at_1000 value: 38.397999999999996 - type: map_at_3 value: 34.011 - type: map_at_5 value: 35.643 - type: mrr_at_1 value: 34.268 - type: mrr_at_10 value: 43.374 - type: mrr_at_100 value: 44.096000000000004 - type: mrr_at_1000 value: 44.144 - type: mrr_at_3 value: 41.008 - type: mrr_at_5 value: 42.359 - type: ndcg_at_1 value: 34.268 - type: ndcg_at_10 value: 43.02 - type: ndcg_at_100 value: 47.747 - type: ndcg_at_1000 value: 50.019999999999996 - type: ndcg_at_3 value: 38.687 - type: ndcg_at_5 value: 40.647 - type: precision_at_1 value: 34.268 - type: precision_at_10 value: 8.261000000000001 - type: precision_at_100 value: 1.376 - type: precision_at_1000 value: 0.189 - type: precision_at_3 value: 19.108 - type: precision_at_5 value: 13.489999999999998 - type: recall_at_1 value: 26.823999999999998 - type: recall_at_10 value: 53.84100000000001 - type: recall_at_100 value: 73.992 - type: recall_at_1000 value: 88.524 - type: recall_at_3 value: 40.711000000000006 - type: recall_at_5 value: 46.477000000000004 - type: map_at_1 value: 34.307 - type: map_at_10 value: 45.144 - type: map_at_100 value: 46.351 - type: map_at_1000 value: 46.414 - type: map_at_3 value: 42.315000000000005 - type: map_at_5 value: 43.991 - type: mrr_at_1 value: 39.06 - type: mrr_at_10 value: 48.612 - type: mrr_at_100 value: 49.425000000000004 - type: mrr_at_1000 value: 49.458999999999996 - type: mrr_at_3 value: 46.144 - type: mrr_at_5 value: 47.654999999999994 - type: 
ndcg_at_1 value: 39.06 - type: ndcg_at_10 value: 50.647 - type: ndcg_at_100 value: 55.620000000000005 - type: ndcg_at_1000 value: 56.976000000000006 - type: ndcg_at_3 value: 45.705 - type: ndcg_at_5 value: 48.269 - type: precision_at_1 value: 39.06 - type: precision_at_10 value: 8.082 - type: precision_at_100 value: 1.161 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 20.376 - type: precision_at_5 value: 14.069 - type: recall_at_1 value: 34.307 - type: recall_at_10 value: 63.497 - type: recall_at_100 value: 85.038 - type: recall_at_1000 value: 94.782 - type: recall_at_3 value: 50.209 - type: recall_at_5 value: 56.525000000000006 - type: map_at_1 value: 26.448 - type: map_at_10 value: 34.86 - type: map_at_100 value: 36.004999999999995 - type: map_at_1000 value: 36.081 - type: map_at_3 value: 32.527 - type: map_at_5 value: 33.955 - type: mrr_at_1 value: 28.701 - type: mrr_at_10 value: 36.909 - type: mrr_at_100 value: 37.89 - type: mrr_at_1000 value: 37.945 - type: mrr_at_3 value: 34.576 - type: mrr_at_5 value: 35.966 - type: ndcg_at_1 value: 28.701 - type: ndcg_at_10 value: 39.507999999999996 - type: ndcg_at_100 value: 45.056000000000004 - type: ndcg_at_1000 value: 47.034 - type: ndcg_at_3 value: 34.985 - type: ndcg_at_5 value: 37.384 - type: precision_at_1 value: 28.701 - type: precision_at_10 value: 5.921 - type: precision_at_100 value: 0.914 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 14.689 - type: precision_at_5 value: 10.237 - type: recall_at_1 value: 26.448 - type: recall_at_10 value: 51.781 - type: recall_at_100 value: 77.142 - type: recall_at_1000 value: 92.10000000000001 - type: recall_at_3 value: 39.698 - type: recall_at_5 value: 45.469 - type: map_at_1 value: 14.174000000000001 - type: map_at_10 value: 22.019 - type: map_at_100 value: 23.18 - type: map_at_1000 value: 23.304 - type: map_at_3 value: 19.332 - type: map_at_5 value: 20.816000000000003 - type: mrr_at_1 value: 17.785999999999998 - type: mrr_at_10 value: 26.233 - type: mrr_at_100 value: 27.254 - type: mrr_at_1000 value: 27.328000000000003 - type: mrr_at_3 value: 23.653 - type: mrr_at_5 value: 25.095 - type: ndcg_at_1 value: 17.785999999999998 - type: ndcg_at_10 value: 27.236 - type: ndcg_at_100 value: 32.932 - type: ndcg_at_1000 value: 36.134 - type: ndcg_at_3 value: 22.33 - type: ndcg_at_5 value: 24.573999999999998 - type: precision_at_1 value: 17.785999999999998 - type: precision_at_10 value: 5.286 - type: precision_at_100 value: 0.9369999999999999 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 11.07 - type: precision_at_5 value: 8.308 - type: recall_at_1 value: 14.174000000000001 - type: recall_at_10 value: 39.135 - type: recall_at_100 value: 64.095 - type: recall_at_1000 value: 87.485 - type: recall_at_3 value: 25.496999999999996 - type: recall_at_5 value: 31.148999999999997 - type: map_at_1 value: 24.371000000000002 - type: map_at_10 value: 33.074999999999996 - type: map_at_100 value: 34.486 - type: map_at_1000 value: 34.608 - type: map_at_3 value: 30.483 - type: map_at_5 value: 31.972 - type: mrr_at_1 value: 29.548000000000002 - type: mrr_at_10 value: 38.431 - type: mrr_at_100 value: 39.347 - type: mrr_at_1000 value: 39.4 - type: mrr_at_3 value: 35.980000000000004 - type: mrr_at_5 value: 37.413999999999994 - type: ndcg_at_1 value: 29.548000000000002 - type: ndcg_at_10 value: 38.552 - type: ndcg_at_100 value: 44.598 - type: ndcg_at_1000 value: 47.0 - type: ndcg_at_3 value: 34.109 - type: ndcg_at_5 value: 36.263 - type: precision_at_1 
value: 29.548000000000002 - type: precision_at_10 value: 6.92 - type: precision_at_100 value: 1.179 - type: precision_at_1000 value: 0.159 - type: precision_at_3 value: 16.137 - type: precision_at_5 value: 11.511000000000001 - type: recall_at_1 value: 24.371000000000002 - type: recall_at_10 value: 49.586999999999996 - type: recall_at_100 value: 75.15899999999999 - type: recall_at_1000 value: 91.06 - type: recall_at_3 value: 37.09 - type: recall_at_5 value: 42.588 - type: map_at_1 value: 24.517 - type: map_at_10 value: 32.969 - type: map_at_100 value: 34.199 - type: map_at_1000 value: 34.322 - type: map_at_3 value: 30.270999999999997 - type: map_at_5 value: 31.863000000000003 - type: mrr_at_1 value: 30.479 - type: mrr_at_10 value: 38.633 - type: mrr_at_100 value: 39.522 - type: mrr_at_1000 value: 39.583 - type: mrr_at_3 value: 36.454 - type: mrr_at_5 value: 37.744 - type: ndcg_at_1 value: 30.479 - type: ndcg_at_10 value: 38.269 - type: ndcg_at_100 value: 43.91 - type: ndcg_at_1000 value: 46.564 - type: ndcg_at_3 value: 34.03 - type: ndcg_at_5 value: 36.155 - type: precision_at_1 value: 30.479 - type: precision_at_10 value: 6.815 - type: precision_at_100 value: 1.138 - type: precision_at_1000 value: 0.158 - type: precision_at_3 value: 16.058 - type: precision_at_5 value: 11.416 - type: recall_at_1 value: 24.517 - type: recall_at_10 value: 48.559000000000005 - type: recall_at_100 value: 73.307 - type: recall_at_1000 value: 91.508 - type: recall_at_3 value: 36.563 - type: recall_at_5 value: 42.375 - type: map_at_1 value: 24.336166666666664 - type: map_at_10 value: 32.80791666666667 - type: map_at_100 value: 34.043416666666666 - type: map_at_1000 value: 34.162749999999996 - type: map_at_3 value: 30.187083333333337 - type: map_at_5 value: 31.637833333333337 - type: mrr_at_1 value: 28.669583333333343 - type: mrr_at_10 value: 36.88616666666667 - type: mrr_at_100 value: 37.80233333333333 - type: mrr_at_1000 value: 37.86141666666666 - type: mrr_at_3 value: 34.537416666666665 - type: mrr_at_5 value: 35.84275 - type: ndcg_at_1 value: 28.669583333333343 - type: ndcg_at_10 value: 37.956916666666665 - type: ndcg_at_100 value: 43.39475 - type: ndcg_at_1000 value: 45.79925 - type: ndcg_at_3 value: 33.43683333333334 - type: ndcg_at_5 value: 35.52575 - type: precision_at_1 value: 28.669583333333343 - type: precision_at_10 value: 6.603833333333335 - type: precision_at_100 value: 1.1079166666666667 - type: precision_at_1000 value: 0.15208333333333335 - type: precision_at_3 value: 15.338750000000001 - type: precision_at_5 value: 10.88775 - type: recall_at_1 value: 24.336166666666664 - type: recall_at_10 value: 49.19358333333333 - type: recall_at_100 value: 73.07583333333334 - type: recall_at_1000 value: 89.81675 - type: recall_at_3 value: 36.54091666666667 - type: recall_at_5 value: 41.919250000000005 - type: map_at_1 value: 23.388 - type: map_at_10 value: 29.408 - type: map_at_100 value: 30.452 - type: map_at_1000 value: 30.546 - type: map_at_3 value: 27.139000000000003 - type: map_at_5 value: 28.402 - type: mrr_at_1 value: 25.46 - type: mrr_at_10 value: 31.966 - type: mrr_at_100 value: 32.879999999999995 - type: mrr_at_1000 value: 32.944 - type: mrr_at_3 value: 29.755 - type: mrr_at_5 value: 30.974 - type: ndcg_at_1 value: 25.46 - type: ndcg_at_10 value: 33.449 - type: ndcg_at_100 value: 38.67 - type: ndcg_at_1000 value: 41.035 - type: ndcg_at_3 value: 29.048000000000002 - type: ndcg_at_5 value: 31.127 - type: precision_at_1 value: 25.46 - type: precision_at_10 value: 5.199 - type: precision_at_100 value: 
0.8670000000000001 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_3 value: 12.168 - type: precision_at_5 value: 8.62 - type: recall_at_1 value: 23.388 - type: recall_at_10 value: 43.428 - type: recall_at_100 value: 67.245 - type: recall_at_1000 value: 84.75399999999999 - type: recall_at_3 value: 31.416 - type: recall_at_5 value: 36.451 - type: map_at_1 value: 17.136000000000003 - type: map_at_10 value: 24.102999999999998 - type: map_at_100 value: 25.219 - type: map_at_1000 value: 25.344 - type: map_at_3 value: 22.004 - type: map_at_5 value: 23.145 - type: mrr_at_1 value: 20.613 - type: mrr_at_10 value: 27.753 - type: mrr_at_100 value: 28.698 - type: mrr_at_1000 value: 28.776000000000003 - type: mrr_at_3 value: 25.711000000000002 - type: mrr_at_5 value: 26.795 - type: ndcg_at_1 value: 20.613 - type: ndcg_at_10 value: 28.510999999999996 - type: ndcg_at_100 value: 33.924 - type: ndcg_at_1000 value: 36.849 - type: ndcg_at_3 value: 24.664 - type: ndcg_at_5 value: 26.365 - type: precision_at_1 value: 20.613 - type: precision_at_10 value: 5.069 - type: precision_at_100 value: 0.918 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 11.574 - type: precision_at_5 value: 8.211 - type: recall_at_1 value: 17.136000000000003 - type: recall_at_10 value: 38.232 - type: recall_at_100 value: 62.571 - type: recall_at_1000 value: 83.23 - type: recall_at_3 value: 27.468999999999998 - type: recall_at_5 value: 31.852999999999998 - type: map_at_1 value: 25.580000000000002 - type: map_at_10 value: 33.449 - type: map_at_100 value: 34.58 - type: map_at_1000 value: 34.692 - type: map_at_3 value: 30.660999999999998 - type: map_at_5 value: 32.425 - type: mrr_at_1 value: 30.037000000000003 - type: mrr_at_10 value: 37.443 - type: mrr_at_100 value: 38.32 - type: mrr_at_1000 value: 38.384 - type: mrr_at_3 value: 34.778999999999996 - type: mrr_at_5 value: 36.458 - type: ndcg_at_1 value: 30.037000000000003 - type: ndcg_at_10 value: 38.46 - type: ndcg_at_100 value: 43.746 - type: ndcg_at_1000 value: 46.28 - type: ndcg_at_3 value: 33.52 - type: ndcg_at_5 value: 36.175000000000004 - type: precision_at_1 value: 30.037000000000003 - type: precision_at_10 value: 6.418 - type: precision_at_100 value: 1.0210000000000001 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 15.018999999999998 - type: precision_at_5 value: 10.877 - type: recall_at_1 value: 25.580000000000002 - type: recall_at_10 value: 49.830000000000005 - type: recall_at_100 value: 73.04899999999999 - type: recall_at_1000 value: 90.751 - type: recall_at_3 value: 36.370999999999995 - type: recall_at_5 value: 43.104 - type: map_at_1 value: 24.071 - type: map_at_10 value: 33.384 - type: map_at_100 value: 35.004999999999995 - type: map_at_1000 value: 35.215999999999994 - type: map_at_3 value: 30.459000000000003 - type: map_at_5 value: 31.769 - type: mrr_at_1 value: 28.854000000000003 - type: mrr_at_10 value: 37.512 - type: mrr_at_100 value: 38.567 - type: mrr_at_1000 value: 38.618 - type: mrr_at_3 value: 35.211 - type: mrr_at_5 value: 36.13 - type: ndcg_at_1 value: 28.854000000000003 - type: ndcg_at_10 value: 39.216 - type: ndcg_at_100 value: 45.214 - type: ndcg_at_1000 value: 47.573 - type: ndcg_at_3 value: 34.597 - type: ndcg_at_5 value: 36.063 - type: precision_at_1 value: 28.854000000000003 - type: precision_at_10 value: 7.648000000000001 - type: precision_at_100 value: 1.545 - type: precision_at_1000 value: 0.241 - type: precision_at_3 value: 16.667 - type: precision_at_5 value: 11.818 - type: recall_at_1 
value: 24.071 - type: recall_at_10 value: 50.802 - type: recall_at_100 value: 77.453 - type: recall_at_1000 value: 92.304 - type: recall_at_3 value: 36.846000000000004 - type: recall_at_5 value: 41.14 - type: map_at_1 value: 23.395 - type: map_at_10 value: 29.189999999999998 - type: map_at_100 value: 30.226999999999997 - type: map_at_1000 value: 30.337999999999997 - type: map_at_3 value: 27.342 - type: map_at_5 value: 28.116999999999997 - type: mrr_at_1 value: 25.323 - type: mrr_at_10 value: 31.241000000000003 - type: mrr_at_100 value: 32.225 - type: mrr_at_1000 value: 32.304 - type: mrr_at_3 value: 29.452 - type: mrr_at_5 value: 30.209000000000003 - type: ndcg_at_1 value: 25.323 - type: ndcg_at_10 value: 33.024 - type: ndcg_at_100 value: 38.279 - type: ndcg_at_1000 value: 41.026 - type: ndcg_at_3 value: 29.243000000000002 - type: ndcg_at_5 value: 30.564000000000004 - type: precision_at_1 value: 25.323 - type: precision_at_10 value: 4.972 - type: precision_at_100 value: 0.8210000000000001 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 12.076 - type: precision_at_5 value: 8.133 - type: recall_at_1 value: 23.395 - type: recall_at_10 value: 42.994 - type: recall_at_100 value: 66.985 - type: recall_at_1000 value: 87.483 - type: recall_at_3 value: 32.505 - type: recall_at_5 value: 35.721000000000004 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 8.322000000000001 - type: map_at_10 value: 14.491000000000001 - type: map_at_100 value: 16.066 - type: map_at_1000 value: 16.238 - type: map_at_3 value: 12.235 - type: map_at_5 value: 13.422999999999998 - type: mrr_at_1 value: 19.479 - type: mrr_at_10 value: 29.38 - type: mrr_at_100 value: 30.520999999999997 - type: mrr_at_1000 value: 30.570999999999998 - type: mrr_at_3 value: 26.395000000000003 - type: mrr_at_5 value: 27.982000000000003 - type: ndcg_at_1 value: 19.479 - type: ndcg_at_10 value: 21.215 - type: ndcg_at_100 value: 27.966 - type: ndcg_at_1000 value: 31.324 - type: ndcg_at_3 value: 17.194000000000003 - type: ndcg_at_5 value: 18.593 - type: precision_at_1 value: 19.479 - type: precision_at_10 value: 6.5280000000000005 - type: precision_at_100 value: 1.359 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 12.703999999999999 - type: precision_at_5 value: 9.655 - type: recall_at_1 value: 8.322000000000001 - type: recall_at_10 value: 26.165 - type: recall_at_100 value: 49.573 - type: recall_at_1000 value: 68.501 - type: recall_at_3 value: 16.179 - type: recall_at_5 value: 20.175 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.003 - type: map_at_10 value: 16.087 - type: map_at_100 value: 21.363 - type: map_at_1000 value: 22.64 - type: map_at_3 value: 12.171999999999999 - type: map_at_5 value: 13.866 - type: mrr_at_1 value: 61.25000000000001 - type: mrr_at_10 value: 68.626 - type: mrr_at_100 value: 69.134 - type: mrr_at_1000 value: 69.144 - type: mrr_at_3 value: 67.042 - type: mrr_at_5 value: 67.929 - type: ndcg_at_1 value: 49.0 - type: ndcg_at_10 value: 34.132 - type: ndcg_at_100 value: 37.545 - type: ndcg_at_1000 value: 44.544 - type: ndcg_at_3 value: 38.946999999999996 - type: ndcg_at_5 value: 36.317 - type: precision_at_1 value: 61.25000000000001 - type: precision_at_10 value: 26.325 - type: precision_at_100 value: 8.173 - type: precision_at_1000 value: 1.778 - type: precision_at_3 value: 41.667 - 
type: precision_at_5 value: 34.300000000000004 - type: recall_at_1 value: 8.003 - type: recall_at_10 value: 20.577 - type: recall_at_100 value: 41.884 - type: recall_at_1000 value: 64.36500000000001 - type: recall_at_3 value: 13.602 - type: recall_at_5 value: 16.41 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 45.835 - type: f1 value: 41.66455981281837 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 55.717000000000006 - type: map_at_10 value: 66.34100000000001 - type: map_at_100 value: 66.776 - type: map_at_1000 value: 66.794 - type: map_at_3 value: 64.386 - type: map_at_5 value: 65.566 - type: mrr_at_1 value: 60.141 - type: mrr_at_10 value: 70.928 - type: mrr_at_100 value: 71.29299999999999 - type: mrr_at_1000 value: 71.30199999999999 - type: mrr_at_3 value: 69.07900000000001 - type: mrr_at_5 value: 70.244 - type: ndcg_at_1 value: 60.141 - type: ndcg_at_10 value: 71.90100000000001 - type: ndcg_at_100 value: 73.836 - type: ndcg_at_1000 value: 74.214 - type: ndcg_at_3 value: 68.203 - type: ndcg_at_5 value: 70.167 - type: precision_at_1 value: 60.141 - type: precision_at_10 value: 9.268 - type: precision_at_100 value: 1.03 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 27.028000000000002 - type: precision_at_5 value: 17.342 - type: recall_at_1 value: 55.717000000000006 - type: recall_at_10 value: 84.66799999999999 - type: recall_at_100 value: 93.28 - type: recall_at_1000 value: 95.887 - type: recall_at_3 value: 74.541 - type: recall_at_5 value: 79.389 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 17.744 - type: map_at_10 value: 29.554000000000002 - type: map_at_100 value: 31.180000000000003 - type: map_at_1000 value: 31.372 - type: map_at_3 value: 25.6 - type: map_at_5 value: 27.642 - type: mrr_at_1 value: 35.802 - type: mrr_at_10 value: 44.812999999999995 - type: mrr_at_100 value: 45.56 - type: mrr_at_1000 value: 45.606 - type: mrr_at_3 value: 42.181000000000004 - type: mrr_at_5 value: 43.516 - type: ndcg_at_1 value: 35.802 - type: ndcg_at_10 value: 37.269999999999996 - type: ndcg_at_100 value: 43.575 - type: ndcg_at_1000 value: 46.916000000000004 - type: ndcg_at_3 value: 33.511 - type: ndcg_at_5 value: 34.504000000000005 - type: precision_at_1 value: 35.802 - type: precision_at_10 value: 10.448 - type: precision_at_100 value: 1.7129999999999999 - type: precision_at_1000 value: 0.231 - type: precision_at_3 value: 22.531000000000002 - type: precision_at_5 value: 16.512 - type: recall_at_1 value: 17.744 - type: recall_at_10 value: 44.616 - type: recall_at_100 value: 68.51899999999999 - type: recall_at_1000 value: 88.495 - type: recall_at_3 value: 30.235 - type: recall_at_5 value: 35.821999999999996 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 33.315 - type: map_at_10 value: 45.932 - type: map_at_100 value: 46.708 - type: map_at_1000 value: 46.778999999999996 - type: map_at_3 value: 43.472 - type: map_at_5 value: 45.022 - type: mrr_at_1 value: 66.631 - type: mrr_at_10 value: 73.083 - type: mrr_at_100 value: 73.405 - type: mrr_at_1000 value: 73.421 - type: mrr_at_3 value: 71.756 - type: mrr_at_5 value: 72.616 - type: ndcg_at_1 value: 66.631 - 
type: ndcg_at_10 value: 54.949000000000005 - type: ndcg_at_100 value: 57.965 - type: ndcg_at_1000 value: 59.467000000000006 - type: ndcg_at_3 value: 51.086 - type: ndcg_at_5 value: 53.272 - type: precision_at_1 value: 66.631 - type: precision_at_10 value: 11.178 - type: precision_at_100 value: 1.3559999999999999 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 31.582 - type: precision_at_5 value: 20.678 - type: recall_at_1 value: 33.315 - type: recall_at_10 value: 55.888000000000005 - type: recall_at_100 value: 67.812 - type: recall_at_1000 value: 77.839 - type: recall_at_3 value: 47.373 - type: recall_at_5 value: 51.695 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 66.424 - type: ap value: 61.132235499939256 - type: f1 value: 66.07094958225315 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 21.575 - type: map_at_10 value: 33.509 - type: map_at_100 value: 34.725 - type: map_at_1000 value: 34.775 - type: map_at_3 value: 29.673 - type: map_at_5 value: 31.805 - type: mrr_at_1 value: 22.235 - type: mrr_at_10 value: 34.1 - type: mrr_at_100 value: 35.254999999999995 - type: mrr_at_1000 value: 35.299 - type: mrr_at_3 value: 30.334 - type: mrr_at_5 value: 32.419 - type: ndcg_at_1 value: 22.235 - type: ndcg_at_10 value: 40.341 - type: ndcg_at_100 value: 46.161 - type: ndcg_at_1000 value: 47.400999999999996 - type: ndcg_at_3 value: 32.482 - type: ndcg_at_5 value: 36.269 - type: precision_at_1 value: 22.235 - type: precision_at_10 value: 6.422999999999999 - type: precision_at_100 value: 0.9329999999999999 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 13.835 - type: precision_at_5 value: 10.226 - type: recall_at_1 value: 21.575 - type: recall_at_10 value: 61.448 - type: recall_at_100 value: 88.289 - type: recall_at_1000 value: 97.76899999999999 - type: recall_at_3 value: 39.971000000000004 - type: recall_at_5 value: 49.053000000000004 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 92.83401732786137 - type: f1 value: 92.47678691291068 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 76.08983128134975 - type: f1 value: 59.782936393820904 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 72.73032952252858 - type: f1 value: 70.72684765888265 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 77.08473436449226 - type: f1 value: 77.31457411257054 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 30.11980959210532 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s 
config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 25.2587629106119 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.48268319779204 - type: mrr value: 32.501885728964304 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.284 - type: map_at_10 value: 11.509 - type: map_at_100 value: 14.624 - type: map_at_1000 value: 16.035 - type: map_at_3 value: 8.347999999999999 - type: map_at_5 value: 9.919 - type: mrr_at_1 value: 43.344 - type: mrr_at_10 value: 52.303999999999995 - type: mrr_at_100 value: 52.994 - type: mrr_at_1000 value: 53.032999999999994 - type: mrr_at_3 value: 50.361 - type: mrr_at_5 value: 51.754 - type: ndcg_at_1 value: 41.176 - type: ndcg_at_10 value: 32.244 - type: ndcg_at_100 value: 29.916999999999998 - type: ndcg_at_1000 value: 38.753 - type: ndcg_at_3 value: 36.856 - type: ndcg_at_5 value: 35.394999999999996 - type: precision_at_1 value: 43.034 - type: precision_at_10 value: 24.118000000000002 - type: precision_at_100 value: 7.926 - type: precision_at_1000 value: 2.045 - type: precision_at_3 value: 34.675 - type: precision_at_5 value: 31.146 - type: recall_at_1 value: 5.284 - type: recall_at_10 value: 15.457 - type: recall_at_100 value: 30.914 - type: recall_at_1000 value: 63.788999999999994 - type: recall_at_3 value: 9.596 - type: recall_at_5 value: 12.391 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 29.537999999999997 - type: map_at_10 value: 43.99 - type: map_at_100 value: 45.003 - type: map_at_1000 value: 45.04 - type: map_at_3 value: 39.814 - type: map_at_5 value: 42.166 - type: mrr_at_1 value: 33.256 - type: mrr_at_10 value: 46.487 - type: mrr_at_100 value: 47.264 - type: mrr_at_1000 value: 47.29 - type: mrr_at_3 value: 43.091 - type: mrr_at_5 value: 45.013999999999996 - type: ndcg_at_1 value: 33.256 - type: ndcg_at_10 value: 51.403 - type: ndcg_at_100 value: 55.706999999999994 - type: ndcg_at_1000 value: 56.586000000000006 - type: ndcg_at_3 value: 43.559 - type: ndcg_at_5 value: 47.426 - type: precision_at_1 value: 33.256 - type: precision_at_10 value: 8.540000000000001 - type: precision_at_100 value: 1.093 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 19.834 - type: precision_at_5 value: 14.143 - type: recall_at_1 value: 29.537999999999997 - type: recall_at_10 value: 71.5 - type: recall_at_100 value: 90.25 - type: recall_at_1000 value: 96.82600000000001 - type: recall_at_3 value: 51.108 - type: recall_at_5 value: 60.006 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.526 - type: map_at_10 value: 84.342 - type: map_at_100 value: 84.985 - type: map_at_1000 value: 85.003 - type: map_at_3 value: 81.472 - type: map_at_5 value: 83.292 - type: mrr_at_1 value: 81.17 - type: mrr_at_10 value: 87.33999999999999 - type: mrr_at_100 value: 87.445 - type: mrr_at_1000 value: 87.446 - type: mrr_at_3 value: 86.387 - type: mrr_at_5 value: 87.042 - type: ndcg_at_1 value: 81.19 - type: ndcg_at_10 value: 88.088 - type: ndcg_at_100 value: 89.35 - type: ndcg_at_1000 value: 89.462 - type: ndcg_at_3 value: 85.319 - type: ndcg_at_5 value: 86.858 - type: precision_at_1 value: 
81.19 - type: precision_at_10 value: 13.33 - type: precision_at_100 value: 1.528 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.31 - type: precision_at_5 value: 24.512 - type: recall_at_1 value: 70.526 - type: recall_at_10 value: 95.166 - type: recall_at_100 value: 99.479 - type: recall_at_1000 value: 99.984 - type: recall_at_3 value: 87.124 - type: recall_at_5 value: 91.53 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 45.049073872893494 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 55.13810914528368 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.593 - type: map_at_10 value: 10.907 - type: map_at_100 value: 12.888 - type: map_at_1000 value: 13.167000000000002 - type: map_at_3 value: 7.936 - type: map_at_5 value: 9.31 - type: mrr_at_1 value: 22.7 - type: mrr_at_10 value: 32.509 - type: mrr_at_100 value: 33.69 - type: mrr_at_1000 value: 33.747 - type: mrr_at_3 value: 29.599999999999998 - type: mrr_at_5 value: 31.155 - type: ndcg_at_1 value: 22.7 - type: ndcg_at_10 value: 18.445 - type: ndcg_at_100 value: 26.241999999999997 - type: ndcg_at_1000 value: 31.409 - type: ndcg_at_3 value: 17.864 - type: ndcg_at_5 value: 15.232999999999999 - type: precision_at_1 value: 22.7 - type: precision_at_10 value: 9.43 - type: precision_at_100 value: 2.061 - type: precision_at_1000 value: 0.331 - type: precision_at_3 value: 16.467000000000002 - type: precision_at_5 value: 13.08 - type: recall_at_1 value: 4.593 - type: recall_at_10 value: 19.115 - type: recall_at_100 value: 41.82 - type: recall_at_1000 value: 67.167 - type: recall_at_3 value: 9.983 - type: recall_at_5 value: 13.218 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.94432059816452 - type: cos_sim_spearman value: 79.19993315048852 - type: euclidean_pearson value: 72.43261099671753 - type: euclidean_spearman value: 71.51531114998619 - type: manhattan_pearson value: 71.83604124130447 - type: manhattan_spearman value: 71.24460392842295 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 84.25401068481673 - type: cos_sim_spearman value: 74.5249604699309 - type: euclidean_pearson value: 71.1324859629043 - type: euclidean_spearman value: 58.77041705276752 - type: manhattan_pearson value: 71.01471521586141 - type: manhattan_spearman value: 58.69949381017865 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 82.85731544223766 - type: cos_sim_spearman value: 83.15607264736185 - type: euclidean_pearson value: 75.8803249521361 - type: euclidean_spearman value: 76.4862168799065 - type: manhattan_pearson value: 75.80451454386811 - type: manhattan_spearman value: 76.35986831074699 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 
metrics: - type: cos_sim_pearson value: 82.40669043798857 - type: cos_sim_spearman value: 78.08686090667834 - type: euclidean_pearson value: 74.48574712193803 - type: euclidean_spearman value: 70.79423012045118 - type: manhattan_pearson value: 74.39099211477354 - type: manhattan_spearman value: 70.73135427277684 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.03027014209859 - type: cos_sim_spearman value: 86.91082847840946 - type: euclidean_pearson value: 69.13187603971996 - type: euclidean_spearman value: 70.0370035340552 - type: manhattan_pearson value: 69.2586635812031 - type: manhattan_spearman value: 70.18638387118486 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 82.41190748361883 - type: cos_sim_spearman value: 83.64850851235231 - type: euclidean_pearson value: 71.60523243575282 - type: euclidean_spearman value: 72.26134033805099 - type: manhattan_pearson value: 71.50771482066683 - type: manhattan_spearman value: 72.13707967973161 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 90.42838477648627 - type: cos_sim_spearman value: 90.15798155439076 - type: euclidean_pearson value: 77.09619972244516 - type: euclidean_spearman value: 75.5953488548861 - type: manhattan_pearson value: 77.36892406451771 - type: manhattan_spearman value: 75.76625156149356 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 65.76151154879307 - type: cos_sim_spearman value: 64.8846800918359 - type: euclidean_pearson value: 50.23302700257155 - type: euclidean_spearman value: 58.89455187289583 - type: manhattan_pearson value: 50.05498582284945 - type: manhattan_spearman value: 58.75893793871576 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.72381109169437 - type: cos_sim_spearman value: 84.59820928231167 - type: euclidean_pearson value: 74.85450857429493 - type: euclidean_spearman value: 73.83634052565915 - type: manhattan_pearson value: 74.97349743979106 - type: manhattan_spearman value: 73.9636470375881 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 80.96736259172798 - type: mrr value: 94.48378781712114 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 46.344 - type: map_at_10 value: 54.962 - type: map_at_100 value: 55.772 - type: map_at_1000 value: 55.81700000000001 - type: map_at_3 value: 51.832 - type: map_at_5 value: 53.718999999999994 - type: mrr_at_1 value: 49.0 - type: mrr_at_10 value: 56.721 - type: mrr_at_100 value: 57.287 - type: mrr_at_1000 value: 57.330000000000005 - type: mrr_at_3 value: 54.056000000000004 - type: mrr_at_5 value: 55.822 - type: ndcg_at_1 value: 49.0 - type: ndcg_at_10 value: 59.757000000000005 - type: ndcg_at_100 value: 63.149 - type: 
ndcg_at_1000 value: 64.43100000000001 - type: ndcg_at_3 value: 54.105000000000004 - type: ndcg_at_5 value: 57.196999999999996 - type: precision_at_1 value: 49.0 - type: precision_at_10 value: 8.200000000000001 - type: precision_at_100 value: 1.0070000000000001 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 20.889 - type: precision_at_5 value: 14.399999999999999 - type: recall_at_1 value: 46.344 - type: recall_at_10 value: 72.722 - type: recall_at_100 value: 88.167 - type: recall_at_1000 value: 98.333 - type: recall_at_3 value: 57.994 - type: recall_at_5 value: 65.506 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.83366336633664 - type: cos_sim_ap value: 96.09329747251944 - type: cos_sim_f1 value: 91.66255550074001 - type: cos_sim_precision value: 90.45764362220059 - type: cos_sim_recall value: 92.9 - type: dot_accuracy value: 99.32871287128712 - type: dot_ap value: 63.95436644147969 - type: dot_f1 value: 60.61814556331008 - type: dot_precision value: 60.437375745526836 - type: dot_recall value: 60.8 - type: euclidean_accuracy value: 99.66534653465347 - type: euclidean_ap value: 85.85143979761818 - type: euclidean_f1 value: 81.57033805888769 - type: euclidean_precision value: 89.68824940047962 - type: euclidean_recall value: 74.8 - type: manhattan_accuracy value: 99.65742574257426 - type: manhattan_ap value: 85.55693926348405 - type: manhattan_f1 value: 81.13804004214963 - type: manhattan_precision value: 85.74610244988864 - type: manhattan_recall value: 77.0 - type: max_accuracy value: 99.83366336633664 - type: max_ap value: 96.09329747251944 - type: max_f1 value: 91.66255550074001 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 45.23573510003245 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 33.37478638401161 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 50.375920467392476 - type: mrr value: 51.17302223919871 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 29.768864092288343 - type: cos_sim_spearman value: 29.854278347043266 - type: dot_pearson value: 20.51281723837505 - type: dot_spearman value: 21.799102540913665 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.2 - type: map_at_10 value: 1.202 - type: map_at_100 value: 6.729 - type: map_at_1000 value: 15.928 - type: map_at_3 value: 0.492 - type: map_at_5 value: 0.712 - type: mrr_at_1 value: 76.0 - type: mrr_at_10 value: 84.75 - type: mrr_at_100 value: 84.75 - type: mrr_at_1000 value: 84.75 - type: mrr_at_3 value: 83.0 - type: mrr_at_5 value: 84.5 - type: ndcg_at_1 value: 71.0 - type: ndcg_at_10 value: 57.253 - type: ndcg_at_100 value: 
44.383 - type: ndcg_at_1000 value: 38.666 - type: ndcg_at_3 value: 64.324 - type: ndcg_at_5 value: 60.791 - type: precision_at_1 value: 76.0 - type: precision_at_10 value: 59.599999999999994 - type: precision_at_100 value: 45.440000000000005 - type: precision_at_1000 value: 17.458000000000002 - type: precision_at_3 value: 69.333 - type: precision_at_5 value: 63.2 - type: recall_at_1 value: 0.2 - type: recall_at_10 value: 1.4949999999999999 - type: recall_at_100 value: 10.266 - type: recall_at_1000 value: 35.853 - type: recall_at_3 value: 0.5349999999999999 - type: recall_at_5 value: 0.8109999999999999 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.0140000000000002 - type: map_at_10 value: 8.474 - type: map_at_100 value: 14.058000000000002 - type: map_at_1000 value: 15.381 - type: map_at_3 value: 4.508 - type: map_at_5 value: 5.87 - type: mrr_at_1 value: 22.448999999999998 - type: mrr_at_10 value: 37.242 - type: mrr_at_100 value: 38.291 - type: mrr_at_1000 value: 38.311 - type: mrr_at_3 value: 32.312999999999995 - type: mrr_at_5 value: 34.762 - type: ndcg_at_1 value: 20.408 - type: ndcg_at_10 value: 20.729 - type: ndcg_at_100 value: 33.064 - type: ndcg_at_1000 value: 44.324999999999996 - type: ndcg_at_3 value: 21.251 - type: ndcg_at_5 value: 20.28 - type: precision_at_1 value: 22.448999999999998 - type: precision_at_10 value: 18.98 - type: precision_at_100 value: 7.224 - type: precision_at_1000 value: 1.471 - type: precision_at_3 value: 22.448999999999998 - type: precision_at_5 value: 20.816000000000003 - type: recall_at_1 value: 2.0140000000000002 - type: recall_at_10 value: 13.96 - type: recall_at_100 value: 44.187 - type: recall_at_1000 value: 79.328 - type: recall_at_3 value: 5.345 - type: recall_at_5 value: 7.979 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 69.1312 - type: ap value: 12.606776505497608 - type: f1 value: 52.4112415600534 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 58.16072439162422 - type: f1 value: 58.29152785435414 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 40.421119289825924 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 85.48012159504083 - type: cos_sim_ap value: 72.31974877212102 - type: cos_sim_f1 value: 67.96846573681019 - type: cos_sim_precision value: 62.89562289562289 - type: cos_sim_recall value: 73.93139841688654 - type: dot_accuracy value: 78.52416999463551 - type: dot_ap value: 43.65271285411479 - type: dot_f1 value: 46.94641449960599 - type: dot_precision value: 37.456774599182644 - type: dot_recall value: 62.875989445910285 - type: euclidean_accuracy value: 83.90057817249806 - type: euclidean_ap value: 65.96278727778665 - type: euclidean_f1 value: 63.35733232284957 - type: euclidean_precision value: 
60.770535497940394 - type: euclidean_recall value: 66.17414248021109 - type: manhattan_accuracy value: 83.96614412588663 - type: manhattan_ap value: 66.03670273156699 - type: manhattan_f1 value: 63.49128406579917 - type: manhattan_precision value: 59.366391184573 - type: manhattan_recall value: 68.23218997361478 - type: max_accuracy value: 85.48012159504083 - type: max_ap value: 72.31974877212102 - type: max_f1 value: 67.96846573681019 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.97038848139093 - type: cos_sim_ap value: 85.982764495556 - type: cos_sim_f1 value: 78.73283281450284 - type: cos_sim_precision value: 75.07857791436754 - type: cos_sim_recall value: 82.7610101632276 - type: dot_accuracy value: 83.21108394458028 - type: dot_ap value: 70.97956937273386 - type: dot_f1 value: 66.53083038279111 - type: dot_precision value: 58.7551622418879 - type: dot_recall value: 76.67847243609486 - type: euclidean_accuracy value: 84.31520937633407 - type: euclidean_ap value: 74.67323411319909 - type: euclidean_f1 value: 67.21935410935676 - type: euclidean_precision value: 65.82773636430733 - type: euclidean_recall value: 68.67108099784416 - type: manhattan_accuracy value: 84.35013777312066 - type: manhattan_ap value: 74.66508905354597 - type: manhattan_f1 value: 67.28264162375038 - type: manhattan_precision value: 66.19970193740686 - type: manhattan_recall value: 68.40160147828766 - type: max_accuracy value: 88.97038848139093 - type: max_ap value: 85.982764495556 - type: max_f1 value: 78.73283281450284
---

<br><br>

<p align="center">
<img src="https://huggingface.co/datasets/jinaai/documentation-images/resolve/main/logo.webp" alt="Jina AI: Your Search Foundation, Supercharged!" width="150px">
</p>

<p align="center">
<b>The text embedding set trained by <a href="https://jina.ai/"><b>Jina AI</b></a></b>
</p>

## Intended Usage & Model Info

`jina-embedding-l-en-v1` is a language model trained on Jina AI's Linnaeus-Clean dataset. This dataset consists of 380 million sentence pairs, including query-document pairs, drawn from a variety of domains and carefully selected through a thorough cleaning process. The Linnaeus-Full dataset, from which Linnaeus-Clean is derived, originally contained 1.6 billion sentence pairs.

The model has a range of use cases, including information retrieval, semantic textual similarity, text reranking, and more. With 330 million parameters, it supports single-GPU inference while delivering better performance than our small and base models.

Additionally, we provide the following options:

- [`jina-embedding-t-en-v1`](https://huggingface.co/jinaai/jina-embedding-t-en-v1): 14 million parameters.
- [`jina-embedding-s-en-v1`](https://huggingface.co/jinaai/jina-embedding-s-en-v1): 35 million parameters.
- [`jina-embedding-b-en-v1`](https://huggingface.co/jinaai/jina-embedding-b-en-v1): 110 million parameters.
- [`jina-embedding-l-en-v1`](https://huggingface.co/jinaai/jina-embedding-l-en-v1): 330 million parameters **(you are here)**.
- `jina-embedding-1b-en-v1`: 1.2 billion parameters, 10 times bert-base (soon).
- `jina-embedding-6b-en-v1`: 6 billion parameters, 30 times bert-base (soon).

## Data & Parameters

Please check out our [technical report](https://arxiv.org/abs/2307.11224) for details on the training data and model parameters.
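For reference, MTEB-style scores like those reported in this card can be reproduced approximately with the open-source `mteb` harness. The snippet below is a minimal sketch rather than the exact evaluation setup used for the numbers in the tables that follow; it assumes the `mteb` package (v1.x API) and `sentence-transformers` are installed, and the task choice and output folder are purely illustrative.

```python
# Minimal sketch: score this model on a single MTEB task.
# Assumes: pip install mteb sentence-transformers
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jinaai/jina-embedding-l-en-v1")

# Pass more task names (e.g. "SciFact", "QuoraRetrieval") to cover more rows.
evaluation = MTEB(tasks=["STSBenchmark"])
results = evaluation.run(model, output_folder="results/jina-embedding-l-en-v1")
print(results)
```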
## Metrics

We compared the model against `all-minilm-l6-v2` / `all-mpnet-base-v2` from SBERT and `text-embedding-ada-002` from OpenAI:

|Name|Parameters|Dimension|
|------------------------------|-----|------|
|all-minilm-l6-v2|23m|384|
|all-mpnet-base-v2|110m|768|
|text-embedding-ada-002|Unknown (OpenAI API)|1536|
|jina-embedding-t-en-v1|14m|312|
|jina-embedding-s-en-v1|35m|512|
|jina-embedding-b-en-v1|110m|768|
|jina-embedding-l-en-v1|330m|1024|

|Name|STS12|STS13|STS14|STS15|STS16|STS17|TRECCOVID|Quora|SciFact|
|------------------------------|-----|-----|-----|-----|-----|-----|--------|-----|-----|
|all-minilm-l6-v2|0.724|0.806|0.756|0.854|0.79|0.876|0.473|0.876|0.645|
|all-mpnet-base-v2|0.726|**0.835**|0.78|0.857|0.8|**0.906**|0.513|0.875|0.656|
|text-embedding-ada-002|0.698|0.833|0.761|0.861|**0.86**|0.903|**0.685**|0.876|**0.726**|
|jina-embedding-t-en-v1|0.717|0.773|0.731|0.829|0.777|0.860|0.482|0.840|0.522|
|jina-embedding-s-en-v1|0.743|0.786|0.738|0.837|0.80|0.875|0.523|0.857|0.524|
|jina-embedding-b-en-v1|**0.751**|0.809|0.761|0.856|0.812|0.890|0.606|0.876|0.594|
|jina-embedding-l-en-v1|0.745|0.832|**0.781**|**0.869**|0.837|0.902|0.573|**0.881**|0.598|

## Usage

Use with Jina AI Finetuner:

```python
!pip install finetuner

import finetuner

# Build the embedding model and encode a pair of sentences.
model = finetuner.build_model('jinaai/jina-embedding-l-en-v1')
embeddings = finetuner.encode(
    model=model,
    data=['how is the weather today', 'What is the current weather like today?']
)
# Cosine similarity between the two sentence embeddings.
print(finetuner.cos_sim(embeddings[0], embeddings[1]))
```

Use with sentence-transformers:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

sentences = ['how is the weather today', 'What is the current weather like today?']

# Load this card's checkpoint and compare the two sentences.
model = SentenceTransformer('jinaai/jina-embedding-l-en-v1')
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
```

## Fine-tuning

Please consider [Finetuner](https://github.com/jina-ai/finetuner) for adapting the model to your own data.

## Plans

1. The development of `jina-embedding-s-en-v2` is currently underway, with two main objectives: improving performance and increasing the maximum sequence length.
2. We are also working on bilingual embedding models that combine English with a second language. The upcoming models will be called `jina-embedding-s/b/l-de-v1`.

## Contact

Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.

## Citation

If you find Jina Embeddings useful in your research, please cite the following paper:

```bibtex
@misc{günther2023jina,
  title={Jina Embeddings: A Novel Set of High-Performance Sentence Embedding Models},
  author={Michael Günther and Louis Milliken and Jonathan Geuter and Georgios Mastrapas and Bo Wang and Han Xiao},
  year={2023},
  eprint={2307.11224},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
[ "SUMMARIZATION" ]
[ "BIOSSES", "LINNAEUS", "SCIFACT" ]
Non_BioNLP
TaylorAI/gte-tiny
TaylorAI
sentence-similarity
[ "sentence-transformers", "pytorch", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "transformers", "mteb", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,696
1,696
59,999
136
--- pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers - mteb model-index: - name: gte_tiny results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 71.76119402985076 - type: ap value: 34.63659287952359 - type: f1 value: 65.88939512571113 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 86.61324999999998 - type: ap value: 81.7476302802319 - type: f1 value: 86.5863470912001 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 42.61000000000001 - type: f1 value: 42.2217180000715 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 28.377999999999997 - type: map_at_10 value: 44.565 - type: map_at_100 value: 45.48 - type: map_at_1000 value: 45.487 - type: map_at_3 value: 39.841 - type: map_at_5 value: 42.284 - type: mrr_at_1 value: 29.445 - type: mrr_at_10 value: 44.956 - type: mrr_at_100 value: 45.877 - type: mrr_at_1000 value: 45.884 - type: mrr_at_3 value: 40.209 - type: mrr_at_5 value: 42.719 - type: ndcg_at_1 value: 28.377999999999997 - type: ndcg_at_10 value: 53.638 - type: ndcg_at_100 value: 57.354000000000006 - type: ndcg_at_1000 value: 57.513000000000005 - type: ndcg_at_3 value: 43.701 - type: ndcg_at_5 value: 48.114000000000004 - type: precision_at_1 value: 28.377999999999997 - type: precision_at_10 value: 8.272 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 18.303 - type: precision_at_5 value: 13.129 - type: recall_at_1 value: 28.377999999999997 - type: recall_at_10 value: 82.717 - type: recall_at_100 value: 98.43499999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 54.908 - type: recall_at_5 value: 65.647 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 46.637318326729876 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 36.01134479855804 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 59.82917555338909 - type: mrr value: 74.7888361254012 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 87.1657730995964 - type: cos_sim_spearman value: 86.62787748941281 - type: euclidean_pearson value: 85.48127914481798 - type: euclidean_spearman value: 86.48148861167424 - type: manhattan_pearson value: 85.07496934780823 - type: manhattan_spearman value: 86.39473964708843 - task: type: Classification dataset: name: MTEB 
Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 81.73051948051948 - type: f1 value: 81.66368364988331 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 39.18623707448217 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 32.12697757150375 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 29.160000000000004 - type: map_at_10 value: 40.474 - type: map_at_100 value: 41.905 - type: map_at_1000 value: 42.041000000000004 - type: map_at_3 value: 37.147000000000006 - type: map_at_5 value: 38.873999999999995 - type: mrr_at_1 value: 36.91 - type: mrr_at_10 value: 46.495999999999995 - type: mrr_at_100 value: 47.288000000000004 - type: mrr_at_1000 value: 47.339999999999996 - type: mrr_at_3 value: 43.777 - type: mrr_at_5 value: 45.257999999999996 - type: ndcg_at_1 value: 36.91 - type: ndcg_at_10 value: 46.722 - type: ndcg_at_100 value: 51.969 - type: ndcg_at_1000 value: 54.232 - type: ndcg_at_3 value: 41.783 - type: ndcg_at_5 value: 43.797000000000004 - type: precision_at_1 value: 36.91 - type: precision_at_10 value: 9.013 - type: precision_at_100 value: 1.455 - type: precision_at_1000 value: 0.193 - type: precision_at_3 value: 20.124 - type: precision_at_5 value: 14.363000000000001 - type: recall_at_1 value: 29.160000000000004 - type: recall_at_10 value: 58.521 - type: recall_at_100 value: 80.323 - type: recall_at_1000 value: 95.13000000000001 - type: recall_at_3 value: 44.205 - type: recall_at_5 value: 49.97 - type: map_at_1 value: 27.750000000000004 - type: map_at_10 value: 36.39 - type: map_at_100 value: 37.5 - type: map_at_1000 value: 37.625 - type: map_at_3 value: 33.853 - type: map_at_5 value: 35.397 - type: mrr_at_1 value: 34.14 - type: mrr_at_10 value: 41.841 - type: mrr_at_100 value: 42.469 - type: mrr_at_1000 value: 42.521 - type: mrr_at_3 value: 39.724 - type: mrr_at_5 value: 40.955999999999996 - type: ndcg_at_1 value: 34.14 - type: ndcg_at_10 value: 41.409 - type: ndcg_at_100 value: 45.668 - type: ndcg_at_1000 value: 47.916 - type: ndcg_at_3 value: 37.836 - type: ndcg_at_5 value: 39.650999999999996 - type: precision_at_1 value: 34.14 - type: precision_at_10 value: 7.739 - type: precision_at_100 value: 1.2630000000000001 - type: precision_at_1000 value: 0.173 - type: precision_at_3 value: 18.217 - type: precision_at_5 value: 12.854 - type: recall_at_1 value: 27.750000000000004 - type: recall_at_10 value: 49.882 - type: recall_at_100 value: 68.556 - type: recall_at_1000 value: 83.186 - type: recall_at_3 value: 39.047 - type: recall_at_5 value: 44.458 - type: map_at_1 value: 36.879 - type: map_at_10 value: 48.878 - type: map_at_100 value: 49.918 - type: map_at_1000 value: 49.978 - type: map_at_3 value: 45.867999999999995 - type: map_at_5 value: 47.637 - type: mrr_at_1 value: 42.696 - type: mrr_at_10 value: 52.342 - type: mrr_at_100 value: 53.044000000000004 - type: mrr_at_1000 value: 53.077 - type: mrr_at_3 value: 50.01 - type: mrr_at_5 value: 51.437 - type: ndcg_at_1 value: 42.696 - type: ndcg_at_10 value: 54.469 - type: ndcg_at_100 value: 58.664 
- type: ndcg_at_1000 value: 59.951 - type: ndcg_at_3 value: 49.419999999999995 - type: ndcg_at_5 value: 52.007000000000005 - type: precision_at_1 value: 42.696 - type: precision_at_10 value: 8.734 - type: precision_at_100 value: 1.1769999999999998 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 22.027 - type: precision_at_5 value: 15.135000000000002 - type: recall_at_1 value: 36.879 - type: recall_at_10 value: 67.669 - type: recall_at_100 value: 85.822 - type: recall_at_1000 value: 95.092 - type: recall_at_3 value: 54.157999999999994 - type: recall_at_5 value: 60.436 - type: map_at_1 value: 22.942 - type: map_at_10 value: 31.741999999999997 - type: map_at_100 value: 32.721000000000004 - type: map_at_1000 value: 32.809 - type: map_at_3 value: 29.17 - type: map_at_5 value: 30.714000000000002 - type: mrr_at_1 value: 24.746000000000002 - type: mrr_at_10 value: 33.517 - type: mrr_at_100 value: 34.451 - type: mrr_at_1000 value: 34.522000000000006 - type: mrr_at_3 value: 31.148999999999997 - type: mrr_at_5 value: 32.606 - type: ndcg_at_1 value: 24.746000000000002 - type: ndcg_at_10 value: 36.553000000000004 - type: ndcg_at_100 value: 41.53 - type: ndcg_at_1000 value: 43.811 - type: ndcg_at_3 value: 31.674000000000003 - type: ndcg_at_5 value: 34.241 - type: precision_at_1 value: 24.746000000000002 - type: precision_at_10 value: 5.684 - type: precision_at_100 value: 0.859 - type: precision_at_1000 value: 0.109 - type: precision_at_3 value: 13.597000000000001 - type: precision_at_5 value: 9.672 - type: recall_at_1 value: 22.942 - type: recall_at_10 value: 49.58 - type: recall_at_100 value: 72.614 - type: recall_at_1000 value: 89.89200000000001 - type: recall_at_3 value: 36.552 - type: recall_at_5 value: 42.702 - type: map_at_1 value: 15.345 - type: map_at_10 value: 22.428 - type: map_at_100 value: 23.756 - type: map_at_1000 value: 23.872 - type: map_at_3 value: 20.212 - type: map_at_5 value: 21.291 - type: mrr_at_1 value: 19.279 - type: mrr_at_10 value: 27.1 - type: mrr_at_100 value: 28.211000000000002 - type: mrr_at_1000 value: 28.279 - type: mrr_at_3 value: 24.813 - type: mrr_at_5 value: 25.889 - type: ndcg_at_1 value: 19.279 - type: ndcg_at_10 value: 27.36 - type: ndcg_at_100 value: 33.499 - type: ndcg_at_1000 value: 36.452 - type: ndcg_at_3 value: 23.233999999999998 - type: ndcg_at_5 value: 24.806 - type: precision_at_1 value: 19.279 - type: precision_at_10 value: 5.149 - type: precision_at_100 value: 0.938 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 11.360000000000001 - type: precision_at_5 value: 8.035 - type: recall_at_1 value: 15.345 - type: recall_at_10 value: 37.974999999999994 - type: recall_at_100 value: 64.472 - type: recall_at_1000 value: 85.97200000000001 - type: recall_at_3 value: 26.203 - type: recall_at_5 value: 30.485 - type: map_at_1 value: 26.362000000000002 - type: map_at_10 value: 36.406 - type: map_at_100 value: 37.726 - type: map_at_1000 value: 37.84 - type: map_at_3 value: 33.425 - type: map_at_5 value: 35.043 - type: mrr_at_1 value: 32.146 - type: mrr_at_10 value: 41.674 - type: mrr_at_100 value: 42.478 - type: mrr_at_1000 value: 42.524 - type: mrr_at_3 value: 38.948 - type: mrr_at_5 value: 40.415 - type: ndcg_at_1 value: 32.146 - type: ndcg_at_10 value: 42.374 - type: ndcg_at_100 value: 47.919 - type: ndcg_at_1000 value: 50.013 - type: ndcg_at_3 value: 37.29 - type: ndcg_at_5 value: 39.531 - type: precision_at_1 value: 32.146 - type: precision_at_10 value: 7.767 - type: precision_at_100 value: 1.236 - type: precision_at_1000 
value: 0.16 - type: precision_at_3 value: 17.965999999999998 - type: precision_at_5 value: 12.742999999999999 - type: recall_at_1 value: 26.362000000000002 - type: recall_at_10 value: 54.98800000000001 - type: recall_at_100 value: 78.50200000000001 - type: recall_at_1000 value: 92.146 - type: recall_at_3 value: 40.486 - type: recall_at_5 value: 46.236 - type: map_at_1 value: 24.417 - type: map_at_10 value: 33.161 - type: map_at_100 value: 34.357 - type: map_at_1000 value: 34.473 - type: map_at_3 value: 30.245 - type: map_at_5 value: 31.541999999999998 - type: mrr_at_1 value: 29.909000000000002 - type: mrr_at_10 value: 38.211 - type: mrr_at_100 value: 39.056999999999995 - type: mrr_at_1000 value: 39.114 - type: mrr_at_3 value: 35.769 - type: mrr_at_5 value: 36.922 - type: ndcg_at_1 value: 29.909000000000002 - type: ndcg_at_10 value: 38.694 - type: ndcg_at_100 value: 44.057 - type: ndcg_at_1000 value: 46.6 - type: ndcg_at_3 value: 33.822 - type: ndcg_at_5 value: 35.454 - type: precision_at_1 value: 29.909000000000002 - type: precision_at_10 value: 7.180000000000001 - type: precision_at_100 value: 1.153 - type: precision_at_1000 value: 0.155 - type: precision_at_3 value: 16.134 - type: precision_at_5 value: 11.256 - type: recall_at_1 value: 24.417 - type: recall_at_10 value: 50.260000000000005 - type: recall_at_100 value: 73.55699999999999 - type: recall_at_1000 value: 91.216 - type: recall_at_3 value: 35.971 - type: recall_at_5 value: 40.793 - type: map_at_1 value: 24.266916666666663 - type: map_at_10 value: 32.75025 - type: map_at_100 value: 33.91341666666667 - type: map_at_1000 value: 34.031749999999995 - type: map_at_3 value: 30.166416666666674 - type: map_at_5 value: 31.577000000000005 - type: mrr_at_1 value: 28.828166666666664 - type: mrr_at_10 value: 36.80991666666667 - type: mrr_at_100 value: 37.67075 - type: mrr_at_1000 value: 37.733 - type: mrr_at_3 value: 34.513416666666664 - type: mrr_at_5 value: 35.788 - type: ndcg_at_1 value: 28.828166666666664 - type: ndcg_at_10 value: 37.796 - type: ndcg_at_100 value: 42.94783333333333 - type: ndcg_at_1000 value: 45.38908333333333 - type: ndcg_at_3 value: 33.374750000000006 - type: ndcg_at_5 value: 35.379666666666665 - type: precision_at_1 value: 28.828166666666664 - type: precision_at_10 value: 6.615749999999999 - type: precision_at_100 value: 1.0848333333333333 - type: precision_at_1000 value: 0.1484166666666667 - type: precision_at_3 value: 15.347833333333332 - type: precision_at_5 value: 10.848916666666666 - type: recall_at_1 value: 24.266916666666663 - type: recall_at_10 value: 48.73458333333333 - type: recall_at_100 value: 71.56341666666667 - type: recall_at_1000 value: 88.63091666666668 - type: recall_at_3 value: 36.31208333333333 - type: recall_at_5 value: 41.55633333333333 - type: map_at_1 value: 23.497 - type: map_at_10 value: 30.249 - type: map_at_100 value: 30.947000000000003 - type: map_at_1000 value: 31.049 - type: map_at_3 value: 28.188000000000002 - type: map_at_5 value: 29.332 - type: mrr_at_1 value: 26.687 - type: mrr_at_10 value: 33.182 - type: mrr_at_100 value: 33.794999999999995 - type: mrr_at_1000 value: 33.873 - type: mrr_at_3 value: 31.263 - type: mrr_at_5 value: 32.428000000000004 - type: ndcg_at_1 value: 26.687 - type: ndcg_at_10 value: 34.252 - type: ndcg_at_100 value: 38.083 - type: ndcg_at_1000 value: 40.682 - type: ndcg_at_3 value: 30.464999999999996 - type: ndcg_at_5 value: 32.282 - type: precision_at_1 value: 26.687 - type: precision_at_10 value: 5.2909999999999995 - type: precision_at_100 value: 0.788 - type: 
precision_at_1000 value: 0.109 - type: precision_at_3 value: 13.037 - type: precision_at_5 value: 9.049 - type: recall_at_1 value: 23.497 - type: recall_at_10 value: 43.813 - type: recall_at_100 value: 61.88399999999999 - type: recall_at_1000 value: 80.926 - type: recall_at_3 value: 33.332 - type: recall_at_5 value: 37.862 - type: map_at_1 value: 16.073 - type: map_at_10 value: 22.705000000000002 - type: map_at_100 value: 23.703 - type: map_at_1000 value: 23.833 - type: map_at_3 value: 20.593 - type: map_at_5 value: 21.7 - type: mrr_at_1 value: 19.683 - type: mrr_at_10 value: 26.39 - type: mrr_at_100 value: 27.264 - type: mrr_at_1000 value: 27.349 - type: mrr_at_3 value: 24.409 - type: mrr_at_5 value: 25.474000000000004 - type: ndcg_at_1 value: 19.683 - type: ndcg_at_10 value: 27.014 - type: ndcg_at_100 value: 31.948 - type: ndcg_at_1000 value: 35.125 - type: ndcg_at_3 value: 23.225 - type: ndcg_at_5 value: 24.866 - type: precision_at_1 value: 19.683 - type: precision_at_10 value: 4.948 - type: precision_at_100 value: 0.876 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 10.943 - type: precision_at_5 value: 7.86 - type: recall_at_1 value: 16.073 - type: recall_at_10 value: 36.283 - type: recall_at_100 value: 58.745999999999995 - type: recall_at_1000 value: 81.711 - type: recall_at_3 value: 25.637 - type: recall_at_5 value: 29.919 - type: map_at_1 value: 25.776 - type: map_at_10 value: 33.317 - type: map_at_100 value: 34.437 - type: map_at_1000 value: 34.54 - type: map_at_3 value: 30.706 - type: map_at_5 value: 32.202999999999996 - type: mrr_at_1 value: 30.224 - type: mrr_at_10 value: 37.34 - type: mrr_at_100 value: 38.268 - type: mrr_at_1000 value: 38.335 - type: mrr_at_3 value: 35.075 - type: mrr_at_5 value: 36.348 - type: ndcg_at_1 value: 30.224 - type: ndcg_at_10 value: 38.083 - type: ndcg_at_100 value: 43.413000000000004 - type: ndcg_at_1000 value: 45.856 - type: ndcg_at_3 value: 33.437 - type: ndcg_at_5 value: 35.661 - type: precision_at_1 value: 30.224 - type: precision_at_10 value: 6.1850000000000005 - type: precision_at_100 value: 1.0030000000000001 - type: precision_at_1000 value: 0.132 - type: precision_at_3 value: 14.646 - type: precision_at_5 value: 10.428999999999998 - type: recall_at_1 value: 25.776 - type: recall_at_10 value: 48.787000000000006 - type: recall_at_100 value: 72.04899999999999 - type: recall_at_1000 value: 89.339 - type: recall_at_3 value: 36.192 - type: recall_at_5 value: 41.665 - type: map_at_1 value: 23.156 - type: map_at_10 value: 30.886000000000003 - type: map_at_100 value: 32.551 - type: map_at_1000 value: 32.769 - type: map_at_3 value: 28.584 - type: map_at_5 value: 29.959999999999997 - type: mrr_at_1 value: 28.260999999999996 - type: mrr_at_10 value: 35.555 - type: mrr_at_100 value: 36.687 - type: mrr_at_1000 value: 36.742999999999995 - type: mrr_at_3 value: 33.531 - type: mrr_at_5 value: 34.717 - type: ndcg_at_1 value: 28.260999999999996 - type: ndcg_at_10 value: 36.036 - type: ndcg_at_100 value: 42.675000000000004 - type: ndcg_at_1000 value: 45.303 - type: ndcg_at_3 value: 32.449 - type: ndcg_at_5 value: 34.293 - type: precision_at_1 value: 28.260999999999996 - type: precision_at_10 value: 6.837999999999999 - type: precision_at_100 value: 1.4569999999999999 - type: precision_at_1000 value: 0.23500000000000001 - type: precision_at_3 value: 15.217 - type: precision_at_5 value: 11.028 - type: recall_at_1 value: 23.156 - type: recall_at_10 value: 45.251999999999995 - type: recall_at_100 value: 75.339 - type: recall_at_1000 value: 
91.56 - type: recall_at_3 value: 34.701 - type: recall_at_5 value: 39.922999999999995 - type: map_at_1 value: 19.846 - type: map_at_10 value: 26.367 - type: map_at_100 value: 27.439999999999998 - type: map_at_1000 value: 27.552 - type: map_at_3 value: 24.006 - type: map_at_5 value: 25.230999999999998 - type: mrr_at_1 value: 21.257 - type: mrr_at_10 value: 28.071 - type: mrr_at_100 value: 29.037000000000003 - type: mrr_at_1000 value: 29.119 - type: mrr_at_3 value: 25.692999999999998 - type: mrr_at_5 value: 27.006000000000004 - type: ndcg_at_1 value: 21.257 - type: ndcg_at_10 value: 30.586000000000002 - type: ndcg_at_100 value: 35.949 - type: ndcg_at_1000 value: 38.728 - type: ndcg_at_3 value: 25.862000000000002 - type: ndcg_at_5 value: 27.967 - type: precision_at_1 value: 21.257 - type: precision_at_10 value: 4.861 - type: precision_at_100 value: 0.8130000000000001 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 10.906 - type: precision_at_5 value: 7.763000000000001 - type: recall_at_1 value: 19.846 - type: recall_at_10 value: 41.805 - type: recall_at_100 value: 66.89699999999999 - type: recall_at_1000 value: 87.401 - type: recall_at_3 value: 29.261 - type: recall_at_5 value: 34.227000000000004 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 10.333 - type: map_at_10 value: 17.14 - type: map_at_100 value: 18.878 - type: map_at_1000 value: 19.067 - type: map_at_3 value: 14.123 - type: map_at_5 value: 15.699 - type: mrr_at_1 value: 23.192 - type: mrr_at_10 value: 33.553 - type: mrr_at_100 value: 34.553 - type: mrr_at_1000 value: 34.603 - type: mrr_at_3 value: 29.848000000000003 - type: mrr_at_5 value: 32.18 - type: ndcg_at_1 value: 23.192 - type: ndcg_at_10 value: 24.707 - type: ndcg_at_100 value: 31.701 - type: ndcg_at_1000 value: 35.260999999999996 - type: ndcg_at_3 value: 19.492 - type: ndcg_at_5 value: 21.543 - type: precision_at_1 value: 23.192 - type: precision_at_10 value: 7.824000000000001 - type: precision_at_100 value: 1.52 - type: precision_at_1000 value: 0.218 - type: precision_at_3 value: 14.180000000000001 - type: precision_at_5 value: 11.530999999999999 - type: recall_at_1 value: 10.333 - type: recall_at_10 value: 30.142999999999997 - type: recall_at_100 value: 54.298 - type: recall_at_1000 value: 74.337 - type: recall_at_3 value: 17.602999999999998 - type: recall_at_5 value: 22.938 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.03 - type: map_at_10 value: 17.345 - type: map_at_100 value: 23.462 - type: map_at_1000 value: 24.77 - type: map_at_3 value: 12.714 - type: map_at_5 value: 14.722 - type: mrr_at_1 value: 61.0 - type: mrr_at_10 value: 69.245 - type: mrr_at_100 value: 69.715 - type: mrr_at_1000 value: 69.719 - type: mrr_at_3 value: 67.583 - type: mrr_at_5 value: 68.521 - type: ndcg_at_1 value: 47.625 - type: ndcg_at_10 value: 35.973 - type: ndcg_at_100 value: 39.875 - type: ndcg_at_1000 value: 46.922000000000004 - type: ndcg_at_3 value: 40.574 - type: ndcg_at_5 value: 38.18 - type: precision_at_1 value: 61.0 - type: precision_at_10 value: 29.049999999999997 - type: precision_at_100 value: 8.828 - type: precision_at_1000 value: 1.8290000000000002 - type: precision_at_3 value: 45.333 - type: precision_at_5 value: 37.9 - type: recall_at_1 value: 8.03 - type: recall_at_10 value: 22.334 - type: recall_at_100 value: 45.919 - type: recall_at_1000 value: 
68.822 - type: recall_at_3 value: 14.038999999999998 - type: recall_at_5 value: 17.118 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 44.714999999999996 - type: f1 value: 39.83929362259356 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 52.242999999999995 - type: map_at_10 value: 64.087 - type: map_at_100 value: 64.549 - type: map_at_1000 value: 64.567 - type: map_at_3 value: 61.667 - type: map_at_5 value: 63.266 - type: mrr_at_1 value: 56.271 - type: mrr_at_10 value: 68.146 - type: mrr_at_100 value: 68.524 - type: mrr_at_1000 value: 68.53200000000001 - type: mrr_at_3 value: 65.869 - type: mrr_at_5 value: 67.37100000000001 - type: ndcg_at_1 value: 56.271 - type: ndcg_at_10 value: 70.109 - type: ndcg_at_100 value: 72.09 - type: ndcg_at_1000 value: 72.479 - type: ndcg_at_3 value: 65.559 - type: ndcg_at_5 value: 68.242 - type: precision_at_1 value: 56.271 - type: precision_at_10 value: 9.286999999999999 - type: precision_at_100 value: 1.039 - type: precision_at_1000 value: 0.109 - type: precision_at_3 value: 26.308 - type: precision_at_5 value: 17.291 - type: recall_at_1 value: 52.242999999999995 - type: recall_at_10 value: 84.71 - type: recall_at_100 value: 93.309 - type: recall_at_1000 value: 96.013 - type: recall_at_3 value: 72.554 - type: recall_at_5 value: 79.069 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 14.346 - type: map_at_10 value: 24.552 - type: map_at_100 value: 26.161 - type: map_at_1000 value: 26.345000000000002 - type: map_at_3 value: 21.208 - type: map_at_5 value: 22.959 - type: mrr_at_1 value: 29.166999999999998 - type: mrr_at_10 value: 38.182 - type: mrr_at_100 value: 39.22 - type: mrr_at_1000 value: 39.263 - type: mrr_at_3 value: 35.983 - type: mrr_at_5 value: 37.14 - type: ndcg_at_1 value: 29.166999999999998 - type: ndcg_at_10 value: 31.421 - type: ndcg_at_100 value: 38.129999999999995 - type: ndcg_at_1000 value: 41.569 - type: ndcg_at_3 value: 28.172000000000004 - type: ndcg_at_5 value: 29.029 - type: precision_at_1 value: 29.166999999999998 - type: precision_at_10 value: 8.997 - type: precision_at_100 value: 1.5709999999999997 - type: precision_at_1000 value: 0.22 - type: precision_at_3 value: 19.187 - type: precision_at_5 value: 13.980999999999998 - type: recall_at_1 value: 14.346 - type: recall_at_10 value: 37.963 - type: recall_at_100 value: 63.43299999999999 - type: recall_at_1000 value: 84.057 - type: recall_at_3 value: 26.119999999999997 - type: recall_at_5 value: 30.988 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 33.059 - type: map_at_10 value: 46.421 - type: map_at_100 value: 47.323 - type: map_at_1000 value: 47.403 - type: map_at_3 value: 43.553999999999995 - type: map_at_5 value: 45.283 - type: mrr_at_1 value: 66.117 - type: mrr_at_10 value: 73.10900000000001 - type: mrr_at_100 value: 73.444 - type: mrr_at_1000 value: 73.46000000000001 - type: mrr_at_3 value: 71.70400000000001 - type: mrr_at_5 value: 72.58099999999999 - type: ndcg_at_1 value: 66.117 - type: ndcg_at_10 value: 55.696999999999996 - type: ndcg_at_100 value: 59.167 - type: ndcg_at_1000 value: 60.809000000000005 - type: ndcg_at_3 value: 51.243 - type: ndcg_at_5 
value: 53.627 - type: precision_at_1 value: 66.117 - type: precision_at_10 value: 11.538 - type: precision_at_100 value: 1.429 - type: precision_at_1000 value: 0.165 - type: precision_at_3 value: 31.861 - type: precision_at_5 value: 20.997 - type: recall_at_1 value: 33.059 - type: recall_at_10 value: 57.691 - type: recall_at_100 value: 71.458 - type: recall_at_1000 value: 82.35 - type: recall_at_3 value: 47.792 - type: recall_at_5 value: 52.492000000000004 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 80.544 - type: ap value: 74.69592367984956 - type: f1 value: 80.51138138449883 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 17.095 - type: map_at_10 value: 28.038999999999998 - type: map_at_100 value: 29.246 - type: map_at_1000 value: 29.311 - type: map_at_3 value: 24.253 - type: map_at_5 value: 26.442 - type: mrr_at_1 value: 17.535999999999998 - type: mrr_at_10 value: 28.53 - type: mrr_at_100 value: 29.697000000000003 - type: mrr_at_1000 value: 29.755 - type: mrr_at_3 value: 24.779999999999998 - type: mrr_at_5 value: 26.942 - type: ndcg_at_1 value: 17.549999999999997 - type: ndcg_at_10 value: 34.514 - type: ndcg_at_100 value: 40.497 - type: ndcg_at_1000 value: 42.17 - type: ndcg_at_3 value: 26.764 - type: ndcg_at_5 value: 30.678 - type: precision_at_1 value: 17.549999999999997 - type: precision_at_10 value: 5.692 - type: precision_at_100 value: 0.8699999999999999 - type: precision_at_1000 value: 0.101 - type: precision_at_3 value: 11.562 - type: precision_at_5 value: 8.917 - type: recall_at_1 value: 17.095 - type: recall_at_10 value: 54.642 - type: recall_at_100 value: 82.652 - type: recall_at_1000 value: 95.555 - type: recall_at_3 value: 33.504 - type: recall_at_5 value: 42.925000000000004 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 91.75558595531236 - type: f1 value: 91.25979279648296 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 69.90424076607387 - type: f1 value: 52.067408707562244 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.13449899125757 - type: f1 value: 67.62456762910598 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 74.862138533961 - type: f1 value: 74.66457222091381 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 34.10761942610792 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 31.673172170578408 - task: type: Reranking dataset: name: MTEB 
MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 32.058704977250315 - type: mrr value: 33.24327760839221 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.163 - type: map_at_10 value: 11.652999999999999 - type: map_at_100 value: 14.849 - type: map_at_1000 value: 16.253999999999998 - type: map_at_3 value: 8.616999999999999 - type: map_at_5 value: 10.100000000000001 - type: mrr_at_1 value: 44.272 - type: mrr_at_10 value: 52.25 - type: mrr_at_100 value: 52.761 - type: mrr_at_1000 value: 52.811 - type: mrr_at_3 value: 50.31 - type: mrr_at_5 value: 51.347 - type: ndcg_at_1 value: 42.105 - type: ndcg_at_10 value: 32.044 - type: ndcg_at_100 value: 29.763 - type: ndcg_at_1000 value: 38.585 - type: ndcg_at_3 value: 36.868 - type: ndcg_at_5 value: 35.154999999999994 - type: precision_at_1 value: 43.653 - type: precision_at_10 value: 23.622 - type: precision_at_100 value: 7.7490000000000006 - type: precision_at_1000 value: 2.054 - type: precision_at_3 value: 34.262 - type: precision_at_5 value: 30.154999999999998 - type: recall_at_1 value: 5.163 - type: recall_at_10 value: 15.478 - type: recall_at_100 value: 30.424 - type: recall_at_1000 value: 62.67 - type: recall_at_3 value: 9.615 - type: recall_at_5 value: 12.369 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 21.618000000000002 - type: map_at_10 value: 35.465 - type: map_at_100 value: 36.712 - type: map_at_1000 value: 36.757 - type: map_at_3 value: 31.189 - type: map_at_5 value: 33.537 - type: mrr_at_1 value: 24.305 - type: mrr_at_10 value: 37.653 - type: mrr_at_100 value: 38.662 - type: mrr_at_1000 value: 38.694 - type: mrr_at_3 value: 33.889 - type: mrr_at_5 value: 35.979 - type: ndcg_at_1 value: 24.305 - type: ndcg_at_10 value: 43.028 - type: ndcg_at_100 value: 48.653999999999996 - type: ndcg_at_1000 value: 49.733 - type: ndcg_at_3 value: 34.768 - type: ndcg_at_5 value: 38.753 - type: precision_at_1 value: 24.305 - type: precision_at_10 value: 7.59 - type: precision_at_100 value: 1.076 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 16.271 - type: precision_at_5 value: 12.068 - type: recall_at_1 value: 21.618000000000002 - type: recall_at_10 value: 63.977 - type: recall_at_100 value: 89.03999999999999 - type: recall_at_1000 value: 97.10600000000001 - type: recall_at_3 value: 42.422 - type: recall_at_5 value: 51.629000000000005 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 69.405 - type: map_at_10 value: 83.05 - type: map_at_100 value: 83.684 - type: map_at_1000 value: 83.70400000000001 - type: map_at_3 value: 80.08800000000001 - type: map_at_5 value: 81.937 - type: mrr_at_1 value: 79.85 - type: mrr_at_10 value: 86.369 - type: mrr_at_100 value: 86.48599999999999 - type: mrr_at_1000 value: 86.48700000000001 - type: mrr_at_3 value: 85.315 - type: mrr_at_5 value: 86.044 - type: ndcg_at_1 value: 79.86999999999999 - type: ndcg_at_10 value: 87.04499999999999 - type: ndcg_at_100 value: 88.373 - type: ndcg_at_1000 value: 88.531 - type: ndcg_at_3 value: 84.04 - type: ndcg_at_5 value: 85.684 - type: precision_at_1 value: 79.86999999999999 - type: precision_at_10 value: 13.183 - type: precision_at_100 value: 1.51 - type: precision_at_1000 value: 
0.156 - type: precision_at_3 value: 36.67 - type: precision_at_5 value: 24.12 - type: recall_at_1 value: 69.405 - type: recall_at_10 value: 94.634 - type: recall_at_100 value: 99.214 - type: recall_at_1000 value: 99.958 - type: recall_at_3 value: 85.992 - type: recall_at_5 value: 90.656 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 50.191676323145465 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 56.4874020363744 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.228 - type: map_at_10 value: 11.245 - type: map_at_100 value: 13.353000000000002 - type: map_at_1000 value: 13.665 - type: map_at_3 value: 7.779999999999999 - type: map_at_5 value: 9.405 - type: mrr_at_1 value: 20.9 - type: mrr_at_10 value: 31.657999999999998 - type: mrr_at_100 value: 32.769999999999996 - type: mrr_at_1000 value: 32.833 - type: mrr_at_3 value: 28.333000000000002 - type: mrr_at_5 value: 30.043 - type: ndcg_at_1 value: 20.9 - type: ndcg_at_10 value: 19.073 - type: ndcg_at_100 value: 27.055 - type: ndcg_at_1000 value: 32.641 - type: ndcg_at_3 value: 17.483999999999998 - type: ndcg_at_5 value: 15.42 - type: precision_at_1 value: 20.9 - type: precision_at_10 value: 10.17 - type: precision_at_100 value: 2.162 - type: precision_at_1000 value: 0.35100000000000003 - type: precision_at_3 value: 16.467000000000002 - type: precision_at_5 value: 13.68 - type: recall_at_1 value: 4.228 - type: recall_at_10 value: 20.573 - type: recall_at_100 value: 43.887 - type: recall_at_1000 value: 71.22 - type: recall_at_3 value: 10.023 - type: recall_at_5 value: 13.873 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.77965135067481 - type: cos_sim_spearman value: 75.85121335808076 - type: euclidean_pearson value: 80.09115175262697 - type: euclidean_spearman value: 75.72249155647123 - type: manhattan_pearson value: 79.89723577351782 - type: manhattan_spearman value: 75.49855259442387 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 80.46084116030949 - type: cos_sim_spearman value: 72.57579204392951 - type: euclidean_pearson value: 76.39020830763684 - type: euclidean_spearman value: 72.3718627025895 - type: manhattan_pearson value: 76.6148833027359 - type: manhattan_spearman value: 72.57570008442319 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 80.43678068337017 - type: cos_sim_spearman value: 82.38941154076062 - type: euclidean_pearson value: 81.59260573633661 - type: euclidean_spearman value: 82.31144262574114 - type: manhattan_pearson value: 81.43266909137056 - type: manhattan_spearman value: 82.14704293004861 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 80.73713431763163 - type: 
cos_sim_spearman value: 77.97860512809388 - type: euclidean_pearson value: 80.35755041527027 - type: euclidean_spearman value: 78.021703511412 - type: manhattan_pearson value: 80.24440317109162 - type: manhattan_spearman value: 77.93165415697575 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 85.15111852351204 - type: cos_sim_spearman value: 86.54032447238258 - type: euclidean_pearson value: 86.14157021537433 - type: euclidean_spearman value: 86.67537291929713 - type: manhattan_pearson value: 86.081041854808 - type: manhattan_spearman value: 86.61561701560558 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 81.34532445104026 - type: cos_sim_spearman value: 83.31325001474116 - type: euclidean_pearson value: 82.81892375201032 - type: euclidean_spearman value: 83.4521695148055 - type: manhattan_pearson value: 82.72503790526163 - type: manhattan_spearman value: 83.37833652941349 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.25463453839801 - type: cos_sim_spearman value: 88.27655263515948 - type: euclidean_pearson value: 88.0248334411439 - type: euclidean_spearman value: 88.18141448876868 - type: manhattan_pearson value: 87.8080451127279 - type: manhattan_spearman value: 88.01028114423058 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 63.57551045355218 - type: cos_sim_spearman value: 66.67614095126629 - type: euclidean_pearson value: 66.0787243112528 - type: euclidean_spearman value: 66.83660560636939 - type: manhattan_pearson value: 66.74684019662031 - type: manhattan_spearman value: 67.11761598074368 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 83.70881496766829 - type: cos_sim_spearman value: 84.37803542941634 - type: euclidean_pearson value: 84.84501245857096 - type: euclidean_spearman value: 84.47088079741476 - type: manhattan_pearson value: 84.77244090794765 - type: manhattan_spearman value: 84.43307343706205 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 81.53946254759089 - type: mrr value: 94.68259953554072 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 51.817 - type: map_at_10 value: 62.339999999999996 - type: map_at_100 value: 62.88 - type: map_at_1000 value: 62.909000000000006 - type: map_at_3 value: 59.004 - type: map_at_5 value: 60.906000000000006 - type: mrr_at_1 value: 54.333 - type: mrr_at_10 value: 63.649 - type: mrr_at_100 value: 64.01 - type: mrr_at_1000 value: 64.039 - type: mrr_at_3 value: 61.056 - type: mrr_at_5 value: 62.639 - type: ndcg_at_1 value: 54.333 - type: ndcg_at_10 value: 67.509 - type: ndcg_at_100 value: 69.69999999999999 - type: ndcg_at_1000 value: 70.613 - type: ndcg_at_3 value: 61.729 - type: ndcg_at_5 
value: 64.696 - type: precision_at_1 value: 54.333 - type: precision_at_10 value: 9.2 - type: precision_at_100 value: 1.043 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 24.0 - type: precision_at_5 value: 16.2 - type: recall_at_1 value: 51.817 - type: recall_at_10 value: 82.056 - type: recall_at_100 value: 91.667 - type: recall_at_1000 value: 99.0 - type: recall_at_3 value: 66.717 - type: recall_at_5 value: 74.17200000000001 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.82475247524752 - type: cos_sim_ap value: 95.4781199603258 - type: cos_sim_f1 value: 91.16186693147964 - type: cos_sim_precision value: 90.53254437869822 - type: cos_sim_recall value: 91.8 - type: dot_accuracy value: 99.75049504950495 - type: dot_ap value: 93.05183539809457 - type: dot_f1 value: 87.31117824773412 - type: dot_precision value: 87.93103448275862 - type: dot_recall value: 86.7 - type: euclidean_accuracy value: 99.82475247524752 - type: euclidean_ap value: 95.38547978154382 - type: euclidean_f1 value: 91.16325511732403 - type: euclidean_precision value: 91.02691924227318 - type: euclidean_recall value: 91.3 - type: manhattan_accuracy value: 99.82574257425742 - type: manhattan_ap value: 95.47237521890308 - type: manhattan_f1 value: 91.27849355797821 - type: manhattan_precision value: 90.47151277013754 - type: manhattan_recall value: 92.10000000000001 - type: max_accuracy value: 99.82574257425742 - type: max_ap value: 95.4781199603258 - type: max_f1 value: 91.27849355797821 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 57.542169376331245 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.74399302634387 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 49.65076347632749 - type: mrr value: 50.418099057804945 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 29.73997756592847 - type: cos_sim_spearman value: 29.465208011593308 - type: dot_pearson value: 24.83735342474541 - type: dot_spearman value: 26.005180528584855 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.208 - type: map_at_10 value: 1.434 - type: map_at_100 value: 7.829 - type: map_at_1000 value: 19.807 - type: map_at_3 value: 0.549 - type: map_at_5 value: 0.8330000000000001 - type: mrr_at_1 value: 78.0 - type: mrr_at_10 value: 85.35199999999999 - type: mrr_at_100 value: 85.673 - type: mrr_at_1000 value: 85.673 - type: mrr_at_3 value: 84.667 - type: mrr_at_5 value: 85.06700000000001 - type: ndcg_at_1 value: 72.0 - type: ndcg_at_10 value: 59.214999999999996 - type: ndcg_at_100 value: 44.681 - type: ndcg_at_1000 value: 43.035000000000004 - type: ndcg_at_3 value: 
66.53099999999999 - type: ndcg_at_5 value: 63.23 - type: precision_at_1 value: 78.0 - type: precision_at_10 value: 62.4 - type: precision_at_100 value: 45.76 - type: precision_at_1000 value: 19.05 - type: precision_at_3 value: 71.333 - type: precision_at_5 value: 67.2 - type: recall_at_1 value: 0.208 - type: recall_at_10 value: 1.6580000000000001 - type: recall_at_100 value: 11.324 - type: recall_at_1000 value: 41.537 - type: recall_at_3 value: 0.579 - type: recall_at_5 value: 0.8959999999999999 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.442 - type: map_at_10 value: 8.863 - type: map_at_100 value: 14.606 - type: map_at_1000 value: 16.258 - type: map_at_3 value: 4.396 - type: map_at_5 value: 6.199000000000001 - type: mrr_at_1 value: 30.612000000000002 - type: mrr_at_10 value: 43.492 - type: mrr_at_100 value: 44.557 - type: mrr_at_1000 value: 44.557 - type: mrr_at_3 value: 40.816 - type: mrr_at_5 value: 42.143 - type: ndcg_at_1 value: 25.509999999999998 - type: ndcg_at_10 value: 22.076 - type: ndcg_at_100 value: 34.098 - type: ndcg_at_1000 value: 46.265 - type: ndcg_at_3 value: 24.19 - type: ndcg_at_5 value: 23.474 - type: precision_at_1 value: 30.612000000000002 - type: precision_at_10 value: 19.796 - type: precision_at_100 value: 7.286 - type: precision_at_1000 value: 1.5310000000000001 - type: precision_at_3 value: 25.85 - type: precision_at_5 value: 24.490000000000002 - type: recall_at_1 value: 2.442 - type: recall_at_10 value: 15.012 - type: recall_at_100 value: 45.865 - type: recall_at_1000 value: 82.958 - type: recall_at_3 value: 5.731 - type: recall_at_5 value: 9.301 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 70.974 - type: ap value: 14.534996211286682 - type: f1 value: 54.785946183399005 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 58.56819468024901 - type: f1 value: 58.92391487111204 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 43.273202335218194 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 84.37742146986946 - type: cos_sim_ap value: 68.1684129575579 - type: cos_sim_f1 value: 64.93475108748189 - type: cos_sim_precision value: 59.89745876058849 - type: cos_sim_recall value: 70.89709762532982 - type: dot_accuracy value: 80.49710913750968 - type: dot_ap value: 54.699790073944186 - type: dot_f1 value: 54.45130013221684 - type: dot_precision value: 46.74612183125236 - type: dot_recall value: 65.19788918205805 - type: euclidean_accuracy value: 84.5085533766466 - type: euclidean_ap value: 68.38835695236224 - type: euclidean_f1 value: 65.3391121002694 - type: euclidean_precision value: 58.75289656625237 - type: euclidean_recall value: 73.58839050131925 - type: manhattan_accuracy value: 84.40126363473803 - type: manhattan_ap value: 
68.09539181555348 - type: manhattan_f1 value: 64.99028182701653 - type: manhattan_precision value: 60.22062134173795 - type: manhattan_recall value: 70.58047493403694 - type: max_accuracy value: 84.5085533766466 - type: max_ap value: 68.38835695236224 - type: max_f1 value: 65.3391121002694 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.34167733923235 - type: cos_sim_ap value: 84.84136381147736 - type: cos_sim_f1 value: 77.01434980904001 - type: cos_sim_precision value: 74.27937915742794 - type: cos_sim_recall value: 79.95842315983985 - type: dot_accuracy value: 85.06422944075756 - type: dot_ap value: 76.49446747522325 - type: dot_f1 value: 71.11606520830432 - type: dot_precision value: 64.93638676844785 - type: dot_recall value: 78.59562673236834 - type: euclidean_accuracy value: 88.45810532852097 - type: euclidean_ap value: 84.91526721863501 - type: euclidean_f1 value: 77.04399001750662 - type: euclidean_precision value: 74.62298867162133 - type: euclidean_recall value: 79.62734832152756 - type: manhattan_accuracy value: 88.46004579500912 - type: manhattan_ap value: 84.81590026238194 - type: manhattan_f1 value: 76.97804626491822 - type: manhattan_precision value: 73.79237288135593 - type: manhattan_recall value: 80.45118570988605 - type: max_accuracy value: 88.46004579500912 - type: max_ap value: 84.91526721863501 - type: max_f1 value: 77.04399001750662
---

# gte-tiny

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search. It is distilled from `thenlper/gte-small`, with comparable (slightly worse) performance at around half the size.

## Usage (Sentence-Transformers)

Using this model is easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
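Since the card mentions clustering and semantic search, here is a minimal semantic-search sketch. The corpus and query sentences are invented for illustration, and `'{MODEL_NAME}'` mirrors the placeholder used throughout this card:

```python
from sentence_transformers import SentenceTransformer, util

# Hypothetical corpus and query, purely for illustration
corpus = [
    "A man is eating food.",
    "A cheetah is running behind its prey.",
    "The girl is carrying a baby.",
]
query = "Which animal is chasing something?"

model = SentenceTransformer('{MODEL_NAME}')
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine similarity between the query and every corpus sentence
scores = util.cos_sim(query_embedding, corpus_embeddings)[0]
best = int(scores.argmax())
print(corpus[best], float(scores[best]))
```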
## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings (a short sketch showing how to compare the resulting embeddings follows the architecture summary below).

```python
import torch
from transformers import AutoTokenizer, AutoModel


# Mean pooling: take the attention mask into account so that padding
# tokens do not contribute to the average.
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)


# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

## Evaluation Results

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
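As a quick follow-up, the pooled embeddings can be scored against each other. A minimal sketch, assuming the Transformers snippet above has been run so that `sentence_embeddings` is available:

```python
import torch.nn.functional as F

# Assumes `sentence_embeddings` from the mean-pooling example above.
# L2-normalise so that a plain matrix product yields cosine similarities.
normalized = F.normalize(sentence_embeddings, p=2, dim=1)
cosine_scores = normalized @ normalized.T
print(cosine_scores)  # (num_sentences, num_sentences) cosine-similarity matrix
```

## Citing & Authors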
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
Nizzouuu/gte-Qwen2-7B-instruct-Q6_K-GGUF
Nizzouuu
sentence-similarity
[ "sentence-transformers", "gguf", "mteb", "transformers", "Qwen2", "sentence-similarity", "llama-cpp", "gguf-my-repo", "base_model:Alibaba-NLP/gte-Qwen2-7B-instruct", "base_model:quantized:Alibaba-NLP/gte-Qwen2-7B-instruct", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us", "conversational" ]
1,738
1,738
7
0
--- base_model: Alibaba-NLP/gte-Qwen2-7B-instruct license: apache-2.0 tags: - mteb - sentence-transformers - transformers - Qwen2 - sentence-similarity - llama-cpp - gguf-my-repo model-index: - name: gte-qwen2-7B-instruct results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 91.31343283582089 - type: ap value: 67.64251402604096 - type: f1 value: 87.53372530755692 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 97.497825 - type: ap value: 96.30329547047529 - type: f1 value: 97.49769793778039 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 62.564 - type: f1 value: 60.975777935041066 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 36.486000000000004 - type: map_at_10 value: 54.842 - type: map_at_100 value: 55.206999999999994 - type: map_at_1000 value: 55.206999999999994 - type: map_at_3 value: 49.893 - type: map_at_5 value: 53.105000000000004 - type: mrr_at_1 value: 37.34 - type: mrr_at_10 value: 55.143 - type: mrr_at_100 value: 55.509 - type: mrr_at_1000 value: 55.509 - type: mrr_at_3 value: 50.212999999999994 - type: mrr_at_5 value: 53.432 - type: ndcg_at_1 value: 36.486000000000004 - type: ndcg_at_10 value: 64.273 - type: ndcg_at_100 value: 65.66199999999999 - type: ndcg_at_1000 value: 65.66199999999999 - type: ndcg_at_3 value: 54.352999999999994 - type: ndcg_at_5 value: 60.131 - type: precision_at_1 value: 36.486000000000004 - type: precision_at_10 value: 9.395000000000001 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 22.428 - type: precision_at_5 value: 16.259 - type: recall_at_1 value: 36.486000000000004 - type: recall_at_10 value: 93.95400000000001 - type: recall_at_100 value: 99.644 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 67.283 - type: recall_at_5 value: 81.294 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 56.461169803700564 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 51.73600434466286 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 67.57827065898053 - type: mrr value: 79.08136569493911 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 83.53324575999243 - type: cos_sim_spearman value: 81.37173362822374 - type: euclidean_pearson value: 82.19243335103444 - type: euclidean_spearman value: 81.33679307304334 - type: manhattan_pearson 
value: 82.38752665975699 - type: manhattan_spearman value: 81.31510583189689 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 87.56818181818181 - type: f1 value: 87.25826722019875 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 50.09239610327673 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 46.64733054606282 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 33.997 - type: map_at_10 value: 48.176 - type: map_at_100 value: 49.82 - type: map_at_1000 value: 49.924 - type: map_at_3 value: 43.626 - type: map_at_5 value: 46.275 - type: mrr_at_1 value: 42.059999999999995 - type: mrr_at_10 value: 53.726 - type: mrr_at_100 value: 54.398 - type: mrr_at_1000 value: 54.416 - type: mrr_at_3 value: 50.714999999999996 - type: mrr_at_5 value: 52.639 - type: ndcg_at_1 value: 42.059999999999995 - type: ndcg_at_10 value: 55.574999999999996 - type: ndcg_at_100 value: 60.744 - type: ndcg_at_1000 value: 61.85699999999999 - type: ndcg_at_3 value: 49.363 - type: ndcg_at_5 value: 52.44 - type: precision_at_1 value: 42.059999999999995 - type: precision_at_10 value: 11.101999999999999 - type: precision_at_100 value: 1.73 - type: precision_at_1000 value: 0.218 - type: precision_at_3 value: 24.464 - type: precision_at_5 value: 18.026 - type: recall_at_1 value: 33.997 - type: recall_at_10 value: 70.35900000000001 - type: recall_at_100 value: 91.642 - type: recall_at_1000 value: 97.977 - type: recall_at_3 value: 52.76 - type: recall_at_5 value: 61.148 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: BeIR/cqadupstack config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 35.884 - type: map_at_10 value: 48.14 - type: map_at_100 value: 49.5 - type: map_at_1000 value: 49.63 - type: map_at_3 value: 44.646 - type: map_at_5 value: 46.617999999999995 - type: mrr_at_1 value: 44.458999999999996 - type: mrr_at_10 value: 53.751000000000005 - type: mrr_at_100 value: 54.37800000000001 - type: mrr_at_1000 value: 54.415 - type: mrr_at_3 value: 51.815 - type: mrr_at_5 value: 52.882 - type: ndcg_at_1 value: 44.458999999999996 - type: ndcg_at_10 value: 54.157 - type: ndcg_at_100 value: 58.362 - type: ndcg_at_1000 value: 60.178 - type: ndcg_at_3 value: 49.661 - type: ndcg_at_5 value: 51.74999999999999 - type: precision_at_1 value: 44.458999999999996 - type: precision_at_10 value: 10.248 - type: precision_at_100 value: 1.5890000000000002 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 23.928 - type: precision_at_5 value: 16.878999999999998 - type: recall_at_1 value: 35.884 - type: recall_at_10 value: 64.798 - type: recall_at_100 value: 82.345 - type: recall_at_1000 value: 93.267 - type: recall_at_3 value: 51.847 - type: recall_at_5 value: 57.601 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: BeIR/cqadupstack config: default split: test revision: 
4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 39.383 - type: map_at_10 value: 53.714 - type: map_at_100 value: 54.838 - type: map_at_1000 value: 54.87800000000001 - type: map_at_3 value: 50.114999999999995 - type: map_at_5 value: 52.153000000000006 - type: mrr_at_1 value: 45.016 - type: mrr_at_10 value: 56.732000000000006 - type: mrr_at_100 value: 57.411 - type: mrr_at_1000 value: 57.431 - type: mrr_at_3 value: 54.044000000000004 - type: mrr_at_5 value: 55.639 - type: ndcg_at_1 value: 45.016 - type: ndcg_at_10 value: 60.228 - type: ndcg_at_100 value: 64.277 - type: ndcg_at_1000 value: 65.07 - type: ndcg_at_3 value: 54.124 - type: ndcg_at_5 value: 57.147000000000006 - type: precision_at_1 value: 45.016 - type: precision_at_10 value: 9.937 - type: precision_at_100 value: 1.288 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 24.471999999999998 - type: precision_at_5 value: 16.991 - type: recall_at_1 value: 39.383 - type: recall_at_10 value: 76.175 - type: recall_at_100 value: 93.02 - type: recall_at_1000 value: 98.60900000000001 - type: recall_at_3 value: 60.265 - type: recall_at_5 value: 67.46600000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: BeIR/cqadupstack config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 27.426000000000002 - type: map_at_10 value: 37.397000000000006 - type: map_at_100 value: 38.61 - type: map_at_1000 value: 38.678000000000004 - type: map_at_3 value: 34.150999999999996 - type: map_at_5 value: 36.137 - type: mrr_at_1 value: 29.944 - type: mrr_at_10 value: 39.654 - type: mrr_at_100 value: 40.638000000000005 - type: mrr_at_1000 value: 40.691 - type: mrr_at_3 value: 36.817 - type: mrr_at_5 value: 38.524 - type: ndcg_at_1 value: 29.944 - type: ndcg_at_10 value: 43.094 - type: ndcg_at_100 value: 48.789 - type: ndcg_at_1000 value: 50.339999999999996 - type: ndcg_at_3 value: 36.984 - type: ndcg_at_5 value: 40.248 - type: precision_at_1 value: 29.944 - type: precision_at_10 value: 6.78 - type: precision_at_100 value: 1.024 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 15.895000000000001 - type: precision_at_5 value: 11.39 - type: recall_at_1 value: 27.426000000000002 - type: recall_at_10 value: 58.464000000000006 - type: recall_at_100 value: 84.193 - type: recall_at_1000 value: 95.52000000000001 - type: recall_at_3 value: 42.172 - type: recall_at_5 value: 50.101 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: BeIR/cqadupstack config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 19.721 - type: map_at_10 value: 31.604 - type: map_at_100 value: 32.972 - type: map_at_1000 value: 33.077 - type: map_at_3 value: 27.218999999999998 - type: map_at_5 value: 29.53 - type: mrr_at_1 value: 25.0 - type: mrr_at_10 value: 35.843 - type: mrr_at_100 value: 36.785000000000004 - type: mrr_at_1000 value: 36.842000000000006 - type: mrr_at_3 value: 32.193 - type: mrr_at_5 value: 34.264 - type: ndcg_at_1 value: 25.0 - type: ndcg_at_10 value: 38.606 - type: ndcg_at_100 value: 44.272 - type: ndcg_at_1000 value: 46.527 - type: ndcg_at_3 value: 30.985000000000003 - type: ndcg_at_5 value: 34.43 - type: precision_at_1 value: 25.0 - type: precision_at_10 value: 7.811 - type: precision_at_100 value: 1.203 - type: precision_at_1000 value: 0.15 - type: precision_at_3 value: 15.423 - type: precision_at_5 value: 11.791 - type: 
recall_at_1 value: 19.721 - type: recall_at_10 value: 55.625 - type: recall_at_100 value: 79.34400000000001 - type: recall_at_1000 value: 95.208 - type: recall_at_3 value: 35.19 - type: recall_at_5 value: 43.626 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: BeIR/cqadupstack config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 33.784 - type: map_at_10 value: 47.522 - type: map_at_100 value: 48.949999999999996 - type: map_at_1000 value: 49.038 - type: map_at_3 value: 43.284 - type: map_at_5 value: 45.629 - type: mrr_at_1 value: 41.482 - type: mrr_at_10 value: 52.830999999999996 - type: mrr_at_100 value: 53.559999999999995 - type: mrr_at_1000 value: 53.588 - type: mrr_at_3 value: 50.016000000000005 - type: mrr_at_5 value: 51.614000000000004 - type: ndcg_at_1 value: 41.482 - type: ndcg_at_10 value: 54.569 - type: ndcg_at_100 value: 59.675999999999995 - type: ndcg_at_1000 value: 60.989000000000004 - type: ndcg_at_3 value: 48.187000000000005 - type: ndcg_at_5 value: 51.183 - type: precision_at_1 value: 41.482 - type: precision_at_10 value: 10.221 - type: precision_at_100 value: 1.486 - type: precision_at_1000 value: 0.17500000000000002 - type: precision_at_3 value: 23.548 - type: precision_at_5 value: 16.805 - type: recall_at_1 value: 33.784 - type: recall_at_10 value: 69.798 - type: recall_at_100 value: 90.098 - type: recall_at_1000 value: 98.176 - type: recall_at_3 value: 52.127 - type: recall_at_5 value: 59.861 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: BeIR/cqadupstack config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 28.038999999999998 - type: map_at_10 value: 41.904 - type: map_at_100 value: 43.36 - type: map_at_1000 value: 43.453 - type: map_at_3 value: 37.785999999999994 - type: map_at_5 value: 40.105000000000004 - type: mrr_at_1 value: 35.046 - type: mrr_at_10 value: 46.926 - type: mrr_at_100 value: 47.815000000000005 - type: mrr_at_1000 value: 47.849000000000004 - type: mrr_at_3 value: 44.273 - type: mrr_at_5 value: 45.774 - type: ndcg_at_1 value: 35.046 - type: ndcg_at_10 value: 48.937000000000005 - type: ndcg_at_100 value: 54.544000000000004 - type: ndcg_at_1000 value: 56.069 - type: ndcg_at_3 value: 42.858000000000004 - type: ndcg_at_5 value: 45.644 - type: precision_at_1 value: 35.046 - type: precision_at_10 value: 9.452 - type: precision_at_100 value: 1.429 - type: precision_at_1000 value: 0.173 - type: precision_at_3 value: 21.346999999999998 - type: precision_at_5 value: 15.342 - type: recall_at_1 value: 28.038999999999998 - type: recall_at_10 value: 64.59700000000001 - type: recall_at_100 value: 87.735 - type: recall_at_1000 value: 97.41300000000001 - type: recall_at_3 value: 47.368 - type: recall_at_5 value: 54.93900000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: BeIR/cqadupstack config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 28.17291666666667 - type: map_at_10 value: 40.025749999999995 - type: map_at_100 value: 41.39208333333333 - type: map_at_1000 value: 41.499249999999996 - type: map_at_3 value: 36.347 - type: map_at_5 value: 38.41391666666667 - type: mrr_at_1 value: 33.65925 - type: mrr_at_10 value: 44.085499999999996 - type: mrr_at_100 value: 44.94116666666667 - type: mrr_at_1000 value: 44.9855 - type: mrr_at_3 value: 41.2815 - type: mrr_at_5 value: 42.91491666666666 - type: ndcg_at_1 
value: 33.65925 - type: ndcg_at_10 value: 46.430833333333325 - type: ndcg_at_100 value: 51.761 - type: ndcg_at_1000 value: 53.50899999999999 - type: ndcg_at_3 value: 40.45133333333333 - type: ndcg_at_5 value: 43.31483333333334 - type: precision_at_1 value: 33.65925 - type: precision_at_10 value: 8.4995 - type: precision_at_100 value: 1.3210000000000004 - type: precision_at_1000 value: 0.16591666666666666 - type: precision_at_3 value: 19.165083333333335 - type: precision_at_5 value: 13.81816666666667 - type: recall_at_1 value: 28.17291666666667 - type: recall_at_10 value: 61.12624999999999 - type: recall_at_100 value: 83.97266666666667 - type: recall_at_1000 value: 95.66550000000001 - type: recall_at_3 value: 44.661249999999995 - type: recall_at_5 value: 51.983333333333334 - type: map_at_1 value: 17.936 - type: map_at_10 value: 27.399 - type: map_at_100 value: 28.632 - type: map_at_1000 value: 28.738000000000003 - type: map_at_3 value: 24.456 - type: map_at_5 value: 26.06 - type: mrr_at_1 value: 19.224 - type: mrr_at_10 value: 28.998 - type: mrr_at_100 value: 30.11 - type: mrr_at_1000 value: 30.177 - type: mrr_at_3 value: 26.247999999999998 - type: mrr_at_5 value: 27.708 - type: ndcg_at_1 value: 19.224 - type: ndcg_at_10 value: 32.911 - type: ndcg_at_100 value: 38.873999999999995 - type: ndcg_at_1000 value: 41.277 - type: ndcg_at_3 value: 27.142 - type: ndcg_at_5 value: 29.755 - type: precision_at_1 value: 19.224 - type: precision_at_10 value: 5.6930000000000005 - type: precision_at_100 value: 0.9259999999999999 - type: precision_at_1000 value: 0.126 - type: precision_at_3 value: 12.138 - type: precision_at_5 value: 8.909 - type: recall_at_1 value: 17.936 - type: recall_at_10 value: 48.096 - type: recall_at_100 value: 75.389 - type: recall_at_1000 value: 92.803 - type: recall_at_3 value: 32.812999999999995 - type: recall_at_5 value: 38.851 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: BeIR/cqadupstack config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 24.681 - type: map_at_10 value: 34.892 - type: map_at_100 value: 35.996 - type: map_at_1000 value: 36.083 - type: map_at_3 value: 31.491999999999997 - type: map_at_5 value: 33.632 - type: mrr_at_1 value: 28.528 - type: mrr_at_10 value: 37.694 - type: mrr_at_100 value: 38.613 - type: mrr_at_1000 value: 38.668 - type: mrr_at_3 value: 34.714 - type: mrr_at_5 value: 36.616 - type: ndcg_at_1 value: 28.528 - type: ndcg_at_10 value: 40.703 - type: ndcg_at_100 value: 45.993 - type: ndcg_at_1000 value: 47.847 - type: ndcg_at_3 value: 34.622 - type: ndcg_at_5 value: 38.035999999999994 - type: precision_at_1 value: 28.528 - type: precision_at_10 value: 6.902 - type: precision_at_100 value: 1.0370000000000001 - type: precision_at_1000 value: 0.126 - type: precision_at_3 value: 15.798000000000002 - type: precision_at_5 value: 11.655999999999999 - type: recall_at_1 value: 24.681 - type: recall_at_10 value: 55.81 - type: recall_at_100 value: 79.785 - type: recall_at_1000 value: 92.959 - type: recall_at_3 value: 39.074 - type: recall_at_5 value: 47.568 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: BeIR/cqadupstack config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 18.627 - type: map_at_10 value: 27.872000000000003 - type: map_at_100 value: 29.237999999999996 - type: map_at_1000 value: 29.363 - type: map_at_3 value: 24.751 - type: map_at_5 value: 26.521 - type: mrr_at_1 value: 23.021 
- type: mrr_at_10 value: 31.924000000000003 - type: mrr_at_100 value: 32.922000000000004 - type: mrr_at_1000 value: 32.988 - type: mrr_at_3 value: 29.192 - type: mrr_at_5 value: 30.798 - type: ndcg_at_1 value: 23.021 - type: ndcg_at_10 value: 33.535 - type: ndcg_at_100 value: 39.732 - type: ndcg_at_1000 value: 42.201 - type: ndcg_at_3 value: 28.153 - type: ndcg_at_5 value: 30.746000000000002 - type: precision_at_1 value: 23.021 - type: precision_at_10 value: 6.459 - type: precision_at_100 value: 1.1320000000000001 - type: precision_at_1000 value: 0.153 - type: precision_at_3 value: 13.719000000000001 - type: precision_at_5 value: 10.193000000000001 - type: recall_at_1 value: 18.627 - type: recall_at_10 value: 46.463 - type: recall_at_100 value: 74.226 - type: recall_at_1000 value: 91.28500000000001 - type: recall_at_3 value: 31.357000000000003 - type: recall_at_5 value: 38.067 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: BeIR/cqadupstack config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 31.457 - type: map_at_10 value: 42.888 - type: map_at_100 value: 44.24 - type: map_at_1000 value: 44.327 - type: map_at_3 value: 39.588 - type: map_at_5 value: 41.423 - type: mrr_at_1 value: 37.126999999999995 - type: mrr_at_10 value: 47.083000000000006 - type: mrr_at_100 value: 47.997 - type: mrr_at_1000 value: 48.044 - type: mrr_at_3 value: 44.574000000000005 - type: mrr_at_5 value: 46.202 - type: ndcg_at_1 value: 37.126999999999995 - type: ndcg_at_10 value: 48.833 - type: ndcg_at_100 value: 54.327000000000005 - type: ndcg_at_1000 value: 56.011 - type: ndcg_at_3 value: 43.541999999999994 - type: ndcg_at_5 value: 46.127 - type: precision_at_1 value: 37.126999999999995 - type: precision_at_10 value: 8.376999999999999 - type: precision_at_100 value: 1.2309999999999999 - type: precision_at_1000 value: 0.146 - type: precision_at_3 value: 20.211000000000002 - type: precision_at_5 value: 14.16 - type: recall_at_1 value: 31.457 - type: recall_at_10 value: 62.369 - type: recall_at_100 value: 85.444 - type: recall_at_1000 value: 96.65599999999999 - type: recall_at_3 value: 47.961 - type: recall_at_5 value: 54.676 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: BeIR/cqadupstack config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 27.139999999999997 - type: map_at_10 value: 38.801 - type: map_at_100 value: 40.549 - type: map_at_1000 value: 40.802 - type: map_at_3 value: 35.05 - type: map_at_5 value: 36.884 - type: mrr_at_1 value: 33.004 - type: mrr_at_10 value: 43.864 - type: mrr_at_100 value: 44.667 - type: mrr_at_1000 value: 44.717 - type: mrr_at_3 value: 40.777 - type: mrr_at_5 value: 42.319 - type: ndcg_at_1 value: 33.004 - type: ndcg_at_10 value: 46.022 - type: ndcg_at_100 value: 51.542 - type: ndcg_at_1000 value: 53.742000000000004 - type: ndcg_at_3 value: 39.795 - type: ndcg_at_5 value: 42.272 - type: precision_at_1 value: 33.004 - type: precision_at_10 value: 9.012 - type: precision_at_100 value: 1.7770000000000001 - type: precision_at_1000 value: 0.26 - type: precision_at_3 value: 19.038 - type: precision_at_5 value: 13.675999999999998 - type: recall_at_1 value: 27.139999999999997 - type: recall_at_10 value: 60.961 - type: recall_at_100 value: 84.451 - type: recall_at_1000 value: 98.113 - type: recall_at_3 value: 43.001 - type: recall_at_5 value: 49.896 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: 
mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 22.076999999999998 - type: map_at_10 value: 35.44 - type: map_at_100 value: 37.651 - type: map_at_1000 value: 37.824999999999996 - type: map_at_3 value: 30.764999999999997 - type: map_at_5 value: 33.26 - type: mrr_at_1 value: 50.163000000000004 - type: mrr_at_10 value: 61.207 - type: mrr_at_100 value: 61.675000000000004 - type: mrr_at_1000 value: 61.692 - type: mrr_at_3 value: 58.60999999999999 - type: mrr_at_5 value: 60.307 - type: ndcg_at_1 value: 50.163000000000004 - type: ndcg_at_10 value: 45.882 - type: ndcg_at_100 value: 53.239999999999995 - type: ndcg_at_1000 value: 55.852000000000004 - type: ndcg_at_3 value: 40.514 - type: ndcg_at_5 value: 42.038 - type: precision_at_1 value: 50.163000000000004 - type: precision_at_10 value: 13.466000000000001 - type: precision_at_100 value: 2.164 - type: precision_at_1000 value: 0.266 - type: precision_at_3 value: 29.707 - type: precision_at_5 value: 21.694 - type: recall_at_1 value: 22.076999999999998 - type: recall_at_10 value: 50.193 - type: recall_at_100 value: 74.993 - type: recall_at_1000 value: 89.131 - type: recall_at_3 value: 35.472 - type: recall_at_5 value: 41.814 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 9.953 - type: map_at_10 value: 24.515 - type: map_at_100 value: 36.173 - type: map_at_1000 value: 38.351 - type: map_at_3 value: 16.592000000000002 - type: map_at_5 value: 20.036 - type: mrr_at_1 value: 74.25 - type: mrr_at_10 value: 81.813 - type: mrr_at_100 value: 82.006 - type: mrr_at_1000 value: 82.011 - type: mrr_at_3 value: 80.875 - type: mrr_at_5 value: 81.362 - type: ndcg_at_1 value: 62.5 - type: ndcg_at_10 value: 52.42 - type: ndcg_at_100 value: 56.808 - type: ndcg_at_1000 value: 63.532999999999994 - type: ndcg_at_3 value: 56.654 - type: ndcg_at_5 value: 54.18300000000001 - type: precision_at_1 value: 74.25 - type: precision_at_10 value: 42.699999999999996 - type: precision_at_100 value: 13.675 - type: precision_at_1000 value: 2.664 - type: precision_at_3 value: 60.5 - type: precision_at_5 value: 52.800000000000004 - type: recall_at_1 value: 9.953 - type: recall_at_10 value: 30.253999999999998 - type: recall_at_100 value: 62.516000000000005 - type: recall_at_1000 value: 84.163 - type: recall_at_3 value: 18.13 - type: recall_at_5 value: 22.771 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 79.455 - type: f1 value: 74.16798697647569 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 87.531 - type: map_at_10 value: 93.16799999999999 - type: map_at_100 value: 93.341 - type: map_at_1000 value: 93.349 - type: map_at_3 value: 92.444 - type: map_at_5 value: 92.865 - type: mrr_at_1 value: 94.014 - type: mrr_at_10 value: 96.761 - type: mrr_at_100 value: 96.762 - type: mrr_at_1000 value: 96.762 - type: mrr_at_3 value: 96.672 - type: mrr_at_5 value: 96.736 - type: ndcg_at_1 value: 94.014 - type: ndcg_at_10 value: 95.112 - type: ndcg_at_100 value: 95.578 - type: ndcg_at_1000 value: 95.68900000000001 - type: ndcg_at_3 value: 94.392 - type: ndcg_at_5 value: 94.72500000000001 - type: precision_at_1 
value: 94.014 - type: precision_at_10 value: 11.065 - type: precision_at_100 value: 1.157 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 35.259 - type: precision_at_5 value: 21.599 - type: recall_at_1 value: 87.531 - type: recall_at_10 value: 97.356 - type: recall_at_100 value: 98.965 - type: recall_at_1000 value: 99.607 - type: recall_at_3 value: 95.312 - type: recall_at_5 value: 96.295 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 32.055 - type: map_at_10 value: 53.114 - type: map_at_100 value: 55.235 - type: map_at_1000 value: 55.345 - type: map_at_3 value: 45.854 - type: map_at_5 value: 50.025 - type: mrr_at_1 value: 60.34 - type: mrr_at_10 value: 68.804 - type: mrr_at_100 value: 69.309 - type: mrr_at_1000 value: 69.32199999999999 - type: mrr_at_3 value: 66.40899999999999 - type: mrr_at_5 value: 67.976 - type: ndcg_at_1 value: 60.34 - type: ndcg_at_10 value: 62.031000000000006 - type: ndcg_at_100 value: 68.00500000000001 - type: ndcg_at_1000 value: 69.286 - type: ndcg_at_3 value: 56.355999999999995 - type: ndcg_at_5 value: 58.687 - type: precision_at_1 value: 60.34 - type: precision_at_10 value: 17.176 - type: precision_at_100 value: 2.36 - type: precision_at_1000 value: 0.259 - type: precision_at_3 value: 37.14 - type: precision_at_5 value: 27.809 - type: recall_at_1 value: 32.055 - type: recall_at_10 value: 70.91 - type: recall_at_100 value: 91.83 - type: recall_at_1000 value: 98.871 - type: recall_at_3 value: 51.202999999999996 - type: recall_at_5 value: 60.563 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 43.68 - type: map_at_10 value: 64.389 - type: map_at_100 value: 65.24 - type: map_at_1000 value: 65.303 - type: map_at_3 value: 61.309000000000005 - type: map_at_5 value: 63.275999999999996 - type: mrr_at_1 value: 87.36 - type: mrr_at_10 value: 91.12 - type: mrr_at_100 value: 91.227 - type: mrr_at_1000 value: 91.229 - type: mrr_at_3 value: 90.57600000000001 - type: mrr_at_5 value: 90.912 - type: ndcg_at_1 value: 87.36 - type: ndcg_at_10 value: 73.076 - type: ndcg_at_100 value: 75.895 - type: ndcg_at_1000 value: 77.049 - type: ndcg_at_3 value: 68.929 - type: ndcg_at_5 value: 71.28 - type: precision_at_1 value: 87.36 - type: precision_at_10 value: 14.741000000000001 - type: precision_at_100 value: 1.694 - type: precision_at_1000 value: 0.185 - type: precision_at_3 value: 43.043 - type: precision_at_5 value: 27.681 - type: recall_at_1 value: 43.68 - type: recall_at_10 value: 73.707 - type: recall_at_100 value: 84.7 - type: recall_at_1000 value: 92.309 - type: recall_at_3 value: 64.564 - type: recall_at_5 value: 69.203 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 96.75399999999999 - type: ap value: 95.29389839242187 - type: f1 value: 96.75348377433475 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 25.176 - type: map_at_10 value: 38.598 - type: map_at_100 value: 39.707 - type: map_at_1000 value: 39.744 - type: map_at_3 value: 34.566 - type: map_at_5 value: 36.863 - type: mrr_at_1 value: 
25.874000000000002 - type: mrr_at_10 value: 39.214 - type: mrr_at_100 value: 40.251 - type: mrr_at_1000 value: 40.281 - type: mrr_at_3 value: 35.291 - type: mrr_at_5 value: 37.545 - type: ndcg_at_1 value: 25.874000000000002 - type: ndcg_at_10 value: 45.98 - type: ndcg_at_100 value: 51.197 - type: ndcg_at_1000 value: 52.073 - type: ndcg_at_3 value: 37.785999999999994 - type: ndcg_at_5 value: 41.870000000000005 - type: precision_at_1 value: 25.874000000000002 - type: precision_at_10 value: 7.181 - type: precision_at_100 value: 0.979 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 16.051000000000002 - type: precision_at_5 value: 11.713 - type: recall_at_1 value: 25.176 - type: recall_at_10 value: 68.67699999999999 - type: recall_at_100 value: 92.55 - type: recall_at_1000 value: 99.164 - type: recall_at_3 value: 46.372 - type: recall_at_5 value: 56.16 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 99.03784769721841 - type: f1 value: 98.97791641821495 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 91.88326493388054 - type: f1 value: 73.74809928034335 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 85.41358439811701 - type: f1 value: 83.503679460639 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 89.77135171486215 - type: f1 value: 88.89843747468366 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 46.22695362087359 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 44.132372165849425 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 33.35680810650402 - type: mrr value: 34.72625715637218 - task: type: Retrieval dataset: name: MTEB NFCorpus type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 7.165000000000001 - type: map_at_10 value: 15.424 - type: map_at_100 value: 20.28 - type: map_at_1000 value: 22.065 - type: map_at_3 value: 11.236 - type: map_at_5 value: 13.025999999999998 - type: mrr_at_1 value: 51.702999999999996 - type: mrr_at_10 value: 59.965 - type: mrr_at_100 value: 60.667 - type: mrr_at_1000 value: 60.702999999999996 - type: mrr_at_3 value: 58.772000000000006 - type: mrr_at_5 value: 59.267 - type: ndcg_at_1 value: 49.536 - type: ndcg_at_10 value: 40.6 - type: ndcg_at_100 value: 37.848 - type: ndcg_at_1000 value: 46.657 - type: ndcg_at_3 value: 46.117999999999995 - type: ndcg_at_5 value: 43.619 - type: precision_at_1 value: 51.393 - type: precision_at_10 value: 
30.31 - type: precision_at_100 value: 9.972 - type: precision_at_1000 value: 2.329 - type: precision_at_3 value: 43.137 - type: precision_at_5 value: 37.585 - type: recall_at_1 value: 7.165000000000001 - type: recall_at_10 value: 19.689999999999998 - type: recall_at_100 value: 39.237 - type: recall_at_1000 value: 71.417 - type: recall_at_3 value: 12.247 - type: recall_at_5 value: 14.902999999999999 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 42.653999999999996 - type: map_at_10 value: 59.611999999999995 - type: map_at_100 value: 60.32300000000001 - type: map_at_1000 value: 60.336 - type: map_at_3 value: 55.584999999999994 - type: map_at_5 value: 58.19 - type: mrr_at_1 value: 47.683 - type: mrr_at_10 value: 62.06700000000001 - type: mrr_at_100 value: 62.537 - type: mrr_at_1000 value: 62.544999999999995 - type: mrr_at_3 value: 59.178 - type: mrr_at_5 value: 61.034 - type: ndcg_at_1 value: 47.654 - type: ndcg_at_10 value: 67.001 - type: ndcg_at_100 value: 69.73899999999999 - type: ndcg_at_1000 value: 69.986 - type: ndcg_at_3 value: 59.95700000000001 - type: ndcg_at_5 value: 64.025 - type: precision_at_1 value: 47.654 - type: precision_at_10 value: 10.367999999999999 - type: precision_at_100 value: 1.192 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 26.651000000000003 - type: precision_at_5 value: 18.459 - type: recall_at_1 value: 42.653999999999996 - type: recall_at_10 value: 86.619 - type: recall_at_100 value: 98.04899999999999 - type: recall_at_1000 value: 99.812 - type: recall_at_3 value: 68.987 - type: recall_at_5 value: 78.158 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: None metrics: - type: map_at_1 value: 72.538 - type: map_at_10 value: 86.702 - type: map_at_100 value: 87.31 - type: map_at_1000 value: 87.323 - type: map_at_3 value: 83.87 - type: map_at_5 value: 85.682 - type: mrr_at_1 value: 83.31 - type: mrr_at_10 value: 89.225 - type: mrr_at_100 value: 89.30399999999999 - type: mrr_at_1000 value: 89.30399999999999 - type: mrr_at_3 value: 88.44300000000001 - type: mrr_at_5 value: 89.005 - type: ndcg_at_1 value: 83.32000000000001 - type: ndcg_at_10 value: 90.095 - type: ndcg_at_100 value: 91.12 - type: ndcg_at_1000 value: 91.179 - type: ndcg_at_3 value: 87.606 - type: ndcg_at_5 value: 89.031 - type: precision_at_1 value: 83.32000000000001 - type: precision_at_10 value: 13.641 - type: precision_at_100 value: 1.541 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 38.377 - type: precision_at_5 value: 25.162000000000003 - type: recall_at_1 value: 72.538 - type: recall_at_10 value: 96.47200000000001 - type: recall_at_100 value: 99.785 - type: recall_at_1000 value: 99.99900000000001 - type: recall_at_3 value: 89.278 - type: recall_at_5 value: 93.367 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 73.55219145406065 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 74.13437105242755 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 6.873 - type: 
map_at_10 value: 17.944 - type: map_at_100 value: 21.171 - type: map_at_1000 value: 21.528 - type: map_at_3 value: 12.415 - type: map_at_5 value: 15.187999999999999 - type: mrr_at_1 value: 33.800000000000004 - type: mrr_at_10 value: 46.455 - type: mrr_at_100 value: 47.378 - type: mrr_at_1000 value: 47.394999999999996 - type: mrr_at_3 value: 42.367 - type: mrr_at_5 value: 44.972 - type: ndcg_at_1 value: 33.800000000000004 - type: ndcg_at_10 value: 28.907 - type: ndcg_at_100 value: 39.695 - type: ndcg_at_1000 value: 44.582 - type: ndcg_at_3 value: 26.949 - type: ndcg_at_5 value: 23.988 - type: precision_at_1 value: 33.800000000000004 - type: precision_at_10 value: 15.079999999999998 - type: precision_at_100 value: 3.056 - type: precision_at_1000 value: 0.42100000000000004 - type: precision_at_3 value: 25.167 - type: precision_at_5 value: 21.26 - type: recall_at_1 value: 6.873 - type: recall_at_10 value: 30.568 - type: recall_at_100 value: 62.062 - type: recall_at_1000 value: 85.37700000000001 - type: recall_at_3 value: 15.312999999999999 - type: recall_at_5 value: 21.575 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.37009118256057 - type: cos_sim_spearman value: 79.27986395671529 - type: euclidean_pearson value: 79.18037715442115 - type: euclidean_spearman value: 79.28004791561621 - type: manhattan_pearson value: 79.34062972800541 - type: manhattan_spearman value: 79.43106695543402 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 87.48474767383833 - type: cos_sim_spearman value: 79.54505388752513 - type: euclidean_pearson value: 83.43282704179565 - type: euclidean_spearman value: 79.54579919925405 - type: manhattan_pearson value: 83.77564492427952 - type: manhattan_spearman value: 79.84558396989286 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 88.803698035802 - type: cos_sim_spearman value: 88.83451367754881 - type: euclidean_pearson value: 88.28939285711628 - type: euclidean_spearman value: 88.83528996073112 - type: manhattan_pearson value: 88.28017412671795 - type: manhattan_spearman value: 88.9228828016344 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 85.27469288153428 - type: cos_sim_spearman value: 83.87477064876288 - type: euclidean_pearson value: 84.2601737035379 - type: euclidean_spearman value: 83.87431082479074 - type: manhattan_pearson value: 84.3621547772745 - type: manhattan_spearman value: 84.12094375000423 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 88.12749863201587 - type: cos_sim_spearman value: 88.54287568368565 - type: euclidean_pearson value: 87.90429700607999 - type: euclidean_spearman value: 88.5437689576261 - type: manhattan_pearson value: 88.19276653356833 - type: manhattan_spearman value: 88.99995393814679 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 
85.68398747560902 - type: cos_sim_spearman value: 86.48815303460574 - type: euclidean_pearson value: 85.52356631237954 - type: euclidean_spearman value: 86.486391949551 - type: manhattan_pearson value: 85.67267981761788 - type: manhattan_spearman value: 86.7073696332485 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 88.9057107443124 - type: cos_sim_spearman value: 88.7312168757697 - type: euclidean_pearson value: 88.72810439714794 - type: euclidean_spearman value: 88.71976185854771 - type: manhattan_pearson value: 88.50433745949111 - type: manhattan_spearman value: 88.51726175544195 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 67.59391795109886 - type: cos_sim_spearman value: 66.87613008631367 - type: euclidean_pearson value: 69.23198488262217 - type: euclidean_spearman value: 66.85427723013692 - type: manhattan_pearson value: 69.50730124841084 - type: manhattan_spearman value: 67.10404669820792 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 87.0820605344619 - type: cos_sim_spearman value: 86.8518089863434 - type: euclidean_pearson value: 86.31087134689284 - type: euclidean_spearman value: 86.8518520517941 - type: manhattan_pearson value: 86.47203796160612 - type: manhattan_spearman value: 87.1080149734421 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 89.09255369305481 - type: mrr value: 97.10323445617563 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 61.260999999999996 - type: map_at_10 value: 74.043 - type: map_at_100 value: 74.37700000000001 - type: map_at_1000 value: 74.384 - type: map_at_3 value: 71.222 - type: map_at_5 value: 72.875 - type: mrr_at_1 value: 64.333 - type: mrr_at_10 value: 74.984 - type: mrr_at_100 value: 75.247 - type: mrr_at_1000 value: 75.25500000000001 - type: mrr_at_3 value: 73.167 - type: mrr_at_5 value: 74.35000000000001 - type: ndcg_at_1 value: 64.333 - type: ndcg_at_10 value: 79.06 - type: ndcg_at_100 value: 80.416 - type: ndcg_at_1000 value: 80.55600000000001 - type: ndcg_at_3 value: 74.753 - type: ndcg_at_5 value: 76.97500000000001 - type: precision_at_1 value: 64.333 - type: precision_at_10 value: 10.567 - type: precision_at_100 value: 1.1199999999999999 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 29.889 - type: precision_at_5 value: 19.533 - type: recall_at_1 value: 61.260999999999996 - type: recall_at_10 value: 93.167 - type: recall_at_100 value: 99.0 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 81.667 - type: recall_at_5 value: 87.394 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.71980198019801 - type: cos_sim_ap value: 92.81616007802704 - type: cos_sim_f1 value: 
85.17548454688318 - type: cos_sim_precision value: 89.43894389438944 - type: cos_sim_recall value: 81.3 - type: dot_accuracy value: 99.71980198019801 - type: dot_ap value: 92.81398760591358 - type: dot_f1 value: 85.17548454688318 - type: dot_precision value: 89.43894389438944 - type: dot_recall value: 81.3 - type: euclidean_accuracy value: 99.71980198019801 - type: euclidean_ap value: 92.81560637245072 - type: euclidean_f1 value: 85.17548454688318 - type: euclidean_precision value: 89.43894389438944 - type: euclidean_recall value: 81.3 - type: manhattan_accuracy value: 99.73069306930694 - type: manhattan_ap value: 93.14005487480794 - type: manhattan_f1 value: 85.56263269639068 - type: manhattan_precision value: 91.17647058823529 - type: manhattan_recall value: 80.60000000000001 - type: max_accuracy value: 99.73069306930694 - type: max_ap value: 93.14005487480794 - type: max_f1 value: 85.56263269639068 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 79.86443362395185 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 49.40897096662564 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 55.66040806627947 - type: mrr value: 56.58670475766064 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 31.51015090598575 - type: cos_sim_spearman value: 31.35016454939226 - type: dot_pearson value: 31.5150068731 - type: dot_spearman value: 31.34790869023487 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.254 - type: map_at_10 value: 2.064 - type: map_at_100 value: 12.909 - type: map_at_1000 value: 31.761 - type: map_at_3 value: 0.738 - type: map_at_5 value: 1.155 - type: mrr_at_1 value: 96.0 - type: mrr_at_10 value: 98.0 - type: mrr_at_100 value: 98.0 - type: mrr_at_1000 value: 98.0 - type: mrr_at_3 value: 98.0 - type: mrr_at_5 value: 98.0 - type: ndcg_at_1 value: 93.0 - type: ndcg_at_10 value: 82.258 - type: ndcg_at_100 value: 64.34 - type: ndcg_at_1000 value: 57.912 - type: ndcg_at_3 value: 90.827 - type: ndcg_at_5 value: 86.79 - type: precision_at_1 value: 96.0 - type: precision_at_10 value: 84.8 - type: precision_at_100 value: 66.0 - type: precision_at_1000 value: 25.356 - type: precision_at_3 value: 94.667 - type: precision_at_5 value: 90.4 - type: recall_at_1 value: 0.254 - type: recall_at_10 value: 2.1950000000000003 - type: recall_at_100 value: 16.088 - type: recall_at_1000 value: 54.559000000000005 - type: recall_at_3 value: 0.75 - type: recall_at_5 value: 1.191 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 2.976 - type: map_at_10 value: 11.389000000000001 - type: map_at_100 value: 18.429000000000002 - type: map_at_1000 value: 20.113 - type: map_at_3 value: 6.483 - type: map_at_5 value: 8.770999999999999 
- type: mrr_at_1 value: 40.816 - type: mrr_at_10 value: 58.118 - type: mrr_at_100 value: 58.489999999999995 - type: mrr_at_1000 value: 58.489999999999995 - type: mrr_at_3 value: 53.061 - type: mrr_at_5 value: 57.041 - type: ndcg_at_1 value: 40.816 - type: ndcg_at_10 value: 30.567 - type: ndcg_at_100 value: 42.44 - type: ndcg_at_1000 value: 53.480000000000004 - type: ndcg_at_3 value: 36.016 - type: ndcg_at_5 value: 34.257 - type: precision_at_1 value: 42.857 - type: precision_at_10 value: 25.714 - type: precision_at_100 value: 8.429 - type: precision_at_1000 value: 1.5939999999999999 - type: precision_at_3 value: 36.735 - type: precision_at_5 value: 33.878 - type: recall_at_1 value: 2.976 - type: recall_at_10 value: 17.854999999999997 - type: recall_at_100 value: 51.833 - type: recall_at_1000 value: 86.223 - type: recall_at_3 value: 7.887 - type: recall_at_5 value: 12.026 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 85.1174 - type: ap value: 30.169441069345748 - type: f1 value: 69.79254701873245 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 72.58347481607245 - type: f1 value: 72.74877295564937 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 53.90586138221305 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 87.35769207844072 - type: cos_sim_ap value: 77.9645072410354 - type: cos_sim_f1 value: 71.32352941176471 - type: cos_sim_precision value: 66.5903890160183 - type: cos_sim_recall value: 76.78100263852242 - type: dot_accuracy value: 87.37557370209214 - type: dot_ap value: 77.96250046429908 - type: dot_f1 value: 71.28932757557064 - type: dot_precision value: 66.95249130938586 - type: dot_recall value: 76.22691292875989 - type: euclidean_accuracy value: 87.35173153722357 - type: euclidean_ap value: 77.96520460741593 - type: euclidean_f1 value: 71.32470733210104 - type: euclidean_precision value: 66.91329479768785 - type: euclidean_recall value: 76.35883905013192 - type: manhattan_accuracy value: 87.25636287774931 - type: manhattan_ap value: 77.77752485611796 - type: manhattan_f1 value: 71.18148599269183 - type: manhattan_precision value: 66.10859728506787 - type: manhattan_recall value: 77.0976253298153 - type: max_accuracy value: 87.37557370209214 - type: max_ap value: 77.96520460741593 - type: max_f1 value: 71.32470733210104 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.38176737687739 - type: cos_sim_ap value: 86.58811861657401 - type: cos_sim_f1 value: 79.09430644097604 - type: cos_sim_precision value: 75.45085977911366 - type: cos_sim_recall value: 83.10748383122882 - type: dot_accuracy value: 89.38370784336554 - type: dot_ap value: 86.58840606004333 - type: dot_f1 
value: 79.10179860068133 - type: dot_precision value: 75.44546153308643 - type: dot_recall value: 83.13058207576223 - type: euclidean_accuracy value: 89.38564830985369 - type: euclidean_ap value: 86.58820721061164 - type: euclidean_f1 value: 79.09070942235888 - type: euclidean_precision value: 75.38729937194697 - type: euclidean_recall value: 83.17677856482906 - type: manhattan_accuracy value: 89.40699344122326 - type: manhattan_ap value: 86.60631843011362 - type: manhattan_f1 value: 79.14949970570925 - type: manhattan_precision value: 75.78191039729502 - type: manhattan_recall value: 82.83030489682784 - type: max_accuracy value: 89.40699344122326 - type: max_ap value: 86.60631843011362 - type: max_f1 value: 79.14949970570925 - task: type: STS dataset: name: MTEB AFQMC type: C-MTEB/AFQMC config: default split: validation revision: b44c3b011063adb25877c13823db83bb193913c4 metrics: - type: cos_sim_pearson value: 65.58442135663871 - type: cos_sim_spearman value: 72.2538631361313 - type: euclidean_pearson value: 70.97255486607429 - type: euclidean_spearman value: 72.25374250228647 - type: manhattan_pearson value: 70.83250199989911 - type: manhattan_spearman value: 72.14819496536272 - task: type: STS dataset: name: MTEB ATEC type: C-MTEB/ATEC config: default split: test revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865 metrics: - type: cos_sim_pearson value: 59.99478404929932 - type: cos_sim_spearman value: 62.61836216999812 - type: euclidean_pearson value: 66.86429811933593 - type: euclidean_spearman value: 62.6183520374191 - type: manhattan_pearson value: 66.8063778911633 - type: manhattan_spearman value: 62.569607573241115 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 53.98400000000001 - type: f1 value: 51.21447361350723 - task: type: STS dataset: name: MTEB BQ type: C-MTEB/BQ config: default split: test revision: e3dda5e115e487b39ec7e618c0c6a29137052a55 metrics: - type: cos_sim_pearson value: 79.11941660686553 - type: cos_sim_spearman value: 81.25029594540435 - type: euclidean_pearson value: 82.06973504238826 - type: euclidean_spearman value: 81.2501989488524 - type: manhattan_pearson value: 82.10094630392753 - type: manhattan_spearman value: 81.27987244392389 - task: type: Clustering dataset: name: MTEB CLSClusteringP2P type: C-MTEB/CLSClusteringP2P config: default split: test revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476 metrics: - type: v_measure value: 47.07270168705156 - task: type: Clustering dataset: name: MTEB CLSClusteringS2S type: C-MTEB/CLSClusteringS2S config: default split: test revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f metrics: - type: v_measure value: 45.98511703185043 - task: type: Reranking dataset: name: MTEB CMedQAv1 type: C-MTEB/CMedQAv1-reranking config: default split: test revision: 8d7f1e942507dac42dc58017c1a001c3717da7df metrics: - type: map value: 88.19895157194931 - type: mrr value: 90.21424603174603 - task: type: Reranking dataset: name: MTEB CMedQAv2 type: C-MTEB/CMedQAv2-reranking config: default split: test revision: 23d186750531a14a0357ca22cd92d712fd512ea0 metrics: - type: map value: 88.03317320980119 - type: mrr value: 89.9461507936508 - task: type: Retrieval dataset: name: MTEB CmedqaRetrieval type: C-MTEB/CmedqaRetrieval config: default split: dev revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301 metrics: - type: map_at_1 value: 29.037000000000003 - type: map_at_10 
value: 42.001 - type: map_at_100 value: 43.773 - type: map_at_1000 value: 43.878 - type: map_at_3 value: 37.637 - type: map_at_5 value: 40.034 - type: mrr_at_1 value: 43.136 - type: mrr_at_10 value: 51.158 - type: mrr_at_100 value: 52.083 - type: mrr_at_1000 value: 52.12 - type: mrr_at_3 value: 48.733 - type: mrr_at_5 value: 50.025 - type: ndcg_at_1 value: 43.136 - type: ndcg_at_10 value: 48.685 - type: ndcg_at_100 value: 55.513 - type: ndcg_at_1000 value: 57.242000000000004 - type: ndcg_at_3 value: 43.329 - type: ndcg_at_5 value: 45.438 - type: precision_at_1 value: 43.136 - type: precision_at_10 value: 10.56 - type: precision_at_100 value: 1.6129999999999998 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 24.064 - type: precision_at_5 value: 17.269000000000002 - type: recall_at_1 value: 29.037000000000003 - type: recall_at_10 value: 59.245000000000005 - type: recall_at_100 value: 87.355 - type: recall_at_1000 value: 98.74000000000001 - type: recall_at_3 value: 42.99 - type: recall_at_5 value: 49.681999999999995 - task: type: PairClassification dataset: name: MTEB Cmnli type: C-MTEB/CMNLI config: default split: validation revision: 41bc36f332156f7adc9e38f53777c959b2ae9766 metrics: - type: cos_sim_accuracy value: 82.68190018039687 - type: cos_sim_ap value: 90.18017125327886 - type: cos_sim_f1 value: 83.64080906868193 - type: cos_sim_precision value: 79.7076890489303 - type: cos_sim_recall value: 87.98223053542202 - type: dot_accuracy value: 82.68190018039687 - type: dot_ap value: 90.18782350103646 - type: dot_f1 value: 83.64242087729039 - type: dot_precision value: 79.65313028764805 - type: dot_recall value: 88.05237315875614 - type: euclidean_accuracy value: 82.68190018039687 - type: euclidean_ap value: 90.1801957900632 - type: euclidean_f1 value: 83.63636363636364 - type: euclidean_precision value: 79.52772506852203 - type: euclidean_recall value: 88.19265840542437 - type: manhattan_accuracy value: 82.14070956103427 - type: manhattan_ap value: 89.96178420101427 - type: manhattan_f1 value: 83.21087838578791 - type: manhattan_precision value: 78.35605121850475 - type: manhattan_recall value: 88.70703764320785 - type: max_accuracy value: 82.68190018039687 - type: max_ap value: 90.18782350103646 - type: max_f1 value: 83.64242087729039 - task: type: Retrieval dataset: name: MTEB CovidRetrieval type: C-MTEB/CovidRetrieval config: default split: dev revision: 1271c7809071a13532e05f25fb53511ffce77117 metrics: - type: map_at_1 value: 72.234 - type: map_at_10 value: 80.10000000000001 - type: map_at_100 value: 80.36 - type: map_at_1000 value: 80.363 - type: map_at_3 value: 78.315 - type: map_at_5 value: 79.607 - type: mrr_at_1 value: 72.392 - type: mrr_at_10 value: 80.117 - type: mrr_at_100 value: 80.36999999999999 - type: mrr_at_1000 value: 80.373 - type: mrr_at_3 value: 78.469 - type: mrr_at_5 value: 79.633 - type: ndcg_at_1 value: 72.392 - type: ndcg_at_10 value: 83.651 - type: ndcg_at_100 value: 84.749 - type: ndcg_at_1000 value: 84.83000000000001 - type: ndcg_at_3 value: 80.253 - type: ndcg_at_5 value: 82.485 - type: precision_at_1 value: 72.392 - type: precision_at_10 value: 9.557 - type: precision_at_100 value: 1.004 - type: precision_at_1000 value: 0.101 - type: precision_at_3 value: 28.732000000000003 - type: precision_at_5 value: 18.377 - type: recall_at_1 value: 72.234 - type: recall_at_10 value: 94.573 - type: recall_at_100 value: 99.368 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 85.669 - type: recall_at_5 value: 91.01700000000001 - task: type: 
Retrieval dataset: name: MTEB DuRetrieval type: C-MTEB/DuRetrieval config: default split: dev revision: a1a333e290fe30b10f3f56498e3a0d911a693ced metrics: - type: map_at_1 value: 26.173999999999996 - type: map_at_10 value: 80.04 - type: map_at_100 value: 82.94500000000001 - type: map_at_1000 value: 82.98100000000001 - type: map_at_3 value: 55.562999999999995 - type: map_at_5 value: 69.89800000000001 - type: mrr_at_1 value: 89.5 - type: mrr_at_10 value: 92.996 - type: mrr_at_100 value: 93.06400000000001 - type: mrr_at_1000 value: 93.065 - type: mrr_at_3 value: 92.658 - type: mrr_at_5 value: 92.84599999999999 - type: ndcg_at_1 value: 89.5 - type: ndcg_at_10 value: 87.443 - type: ndcg_at_100 value: 90.253 - type: ndcg_at_1000 value: 90.549 - type: ndcg_at_3 value: 85.874 - type: ndcg_at_5 value: 84.842 - type: precision_at_1 value: 89.5 - type: precision_at_10 value: 41.805 - type: precision_at_100 value: 4.827 - type: precision_at_1000 value: 0.49 - type: precision_at_3 value: 76.85 - type: precision_at_5 value: 64.8 - type: recall_at_1 value: 26.173999999999996 - type: recall_at_10 value: 89.101 - type: recall_at_100 value: 98.08099999999999 - type: recall_at_1000 value: 99.529 - type: recall_at_3 value: 57.902 - type: recall_at_5 value: 74.602 - task: type: Retrieval dataset: name: MTEB EcomRetrieval type: C-MTEB/EcomRetrieval config: default split: dev revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9 metrics: - type: map_at_1 value: 56.10000000000001 - type: map_at_10 value: 66.15299999999999 - type: map_at_100 value: 66.625 - type: map_at_1000 value: 66.636 - type: map_at_3 value: 63.632999999999996 - type: map_at_5 value: 65.293 - type: mrr_at_1 value: 56.10000000000001 - type: mrr_at_10 value: 66.15299999999999 - type: mrr_at_100 value: 66.625 - type: mrr_at_1000 value: 66.636 - type: mrr_at_3 value: 63.632999999999996 - type: mrr_at_5 value: 65.293 - type: ndcg_at_1 value: 56.10000000000001 - type: ndcg_at_10 value: 71.146 - type: ndcg_at_100 value: 73.27799999999999 - type: ndcg_at_1000 value: 73.529 - type: ndcg_at_3 value: 66.09 - type: ndcg_at_5 value: 69.08999999999999 - type: precision_at_1 value: 56.10000000000001 - type: precision_at_10 value: 8.68 - type: precision_at_100 value: 0.964 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 24.4 - type: precision_at_5 value: 16.1 - type: recall_at_1 value: 56.10000000000001 - type: recall_at_10 value: 86.8 - type: recall_at_100 value: 96.39999999999999 - type: recall_at_1000 value: 98.3 - type: recall_at_3 value: 73.2 - type: recall_at_5 value: 80.5 - task: type: Classification dataset: name: MTEB IFlyTek type: C-MTEB/IFlyTek-classification config: default split: validation revision: 421605374b29664c5fc098418fe20ada9bd55f8a metrics: - type: accuracy value: 54.52096960369373 - type: f1 value: 40.930845295808695 - task: type: Classification dataset: name: MTEB JDReview type: C-MTEB/JDReview-classification config: default split: test revision: b7c64bd89eb87f8ded463478346f76731f07bf8b metrics: - type: accuracy value: 86.51031894934334 - type: ap value: 55.9516014323483 - type: f1 value: 81.54813679326381 - task: type: STS dataset: name: MTEB LCQMC type: C-MTEB/LCQMC config: default split: test revision: 17f9b096f80380fce5ed12a9be8be7784b337daf metrics: - type: cos_sim_pearson value: 69.67437838574276 - type: cos_sim_spearman value: 73.81314174653045 - type: euclidean_pearson value: 72.63430276680275 - type: euclidean_spearman value: 73.81358736777001 - type: manhattan_pearson value: 72.58743833842829 - type: 
manhattan_spearman value: 73.7590419009179 - task: type: Reranking dataset: name: MTEB MMarcoReranking type: C-MTEB/Mmarco-reranking config: default split: dev revision: None metrics: - type: map value: 31.648613483640254 - type: mrr value: 30.37420634920635 - task: type: Retrieval dataset: name: MTEB MMarcoRetrieval type: C-MTEB/MMarcoRetrieval config: default split: dev revision: 539bbde593d947e2a124ba72651aafc09eb33fc2 metrics: - type: map_at_1 value: 73.28099999999999 - type: map_at_10 value: 81.977 - type: map_at_100 value: 82.222 - type: map_at_1000 value: 82.22699999999999 - type: map_at_3 value: 80.441 - type: map_at_5 value: 81.46600000000001 - type: mrr_at_1 value: 75.673 - type: mrr_at_10 value: 82.41000000000001 - type: mrr_at_100 value: 82.616 - type: mrr_at_1000 value: 82.621 - type: mrr_at_3 value: 81.094 - type: mrr_at_5 value: 81.962 - type: ndcg_at_1 value: 75.673 - type: ndcg_at_10 value: 85.15599999999999 - type: ndcg_at_100 value: 86.151 - type: ndcg_at_1000 value: 86.26899999999999 - type: ndcg_at_3 value: 82.304 - type: ndcg_at_5 value: 84.009 - type: precision_at_1 value: 75.673 - type: precision_at_10 value: 10.042 - type: precision_at_100 value: 1.052 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 30.673000000000002 - type: precision_at_5 value: 19.326999999999998 - type: recall_at_1 value: 73.28099999999999 - type: recall_at_10 value: 94.446 - type: recall_at_100 value: 98.737 - type: recall_at_1000 value: 99.649 - type: recall_at_3 value: 86.984 - type: recall_at_5 value: 91.024 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 81.08607935440484 - type: f1 value: 78.24879986066307 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 86.05917955615332 - type: f1 value: 85.05279279434997 - task: type: Retrieval dataset: name: MTEB MedicalRetrieval type: C-MTEB/MedicalRetrieval config: default split: dev revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6 metrics: - type: map_at_1 value: 56.2 - type: map_at_10 value: 62.57899999999999 - type: map_at_100 value: 63.154999999999994 - type: map_at_1000 value: 63.193 - type: map_at_3 value: 61.217 - type: map_at_5 value: 62.012 - type: mrr_at_1 value: 56.3 - type: mrr_at_10 value: 62.629000000000005 - type: mrr_at_100 value: 63.205999999999996 - type: mrr_at_1000 value: 63.244 - type: mrr_at_3 value: 61.267 - type: mrr_at_5 value: 62.062 - type: ndcg_at_1 value: 56.2 - type: ndcg_at_10 value: 65.592 - type: ndcg_at_100 value: 68.657 - type: ndcg_at_1000 value: 69.671 - type: ndcg_at_3 value: 62.808 - type: ndcg_at_5 value: 64.24499999999999 - type: precision_at_1 value: 56.2 - type: precision_at_10 value: 7.5 - type: precision_at_100 value: 0.899 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 22.467000000000002 - type: precision_at_5 value: 14.180000000000001 - type: recall_at_1 value: 56.2 - type: recall_at_10 value: 75.0 - type: recall_at_100 value: 89.9 - type: recall_at_1000 value: 97.89999999999999 - type: recall_at_3 value: 67.4 - type: recall_at_5 value: 70.89999999999999 - task: type: Classification dataset: name: MTEB MultilingualSentiment type: C-MTEB/MultilingualSentiment-classification config: default split: 
validation revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a metrics: - type: accuracy value: 76.87666666666667 - type: f1 value: 76.7317686219665 - task: type: PairClassification dataset: name: MTEB Ocnli type: C-MTEB/OCNLI config: default split: validation revision: 66e76a618a34d6d565d5538088562851e6daa7ec metrics: - type: cos_sim_accuracy value: 79.64266377910124 - type: cos_sim_ap value: 84.78274442344829 - type: cos_sim_f1 value: 81.16947472745292 - type: cos_sim_precision value: 76.47058823529412 - type: cos_sim_recall value: 86.48363252375924 - type: dot_accuracy value: 79.64266377910124 - type: dot_ap value: 84.7851404063692 - type: dot_f1 value: 81.16947472745292 - type: dot_precision value: 76.47058823529412 - type: dot_recall value: 86.48363252375924 - type: euclidean_accuracy value: 79.64266377910124 - type: euclidean_ap value: 84.78068373762378 - type: euclidean_f1 value: 81.14794656110837 - type: euclidean_precision value: 76.35009310986965 - type: euclidean_recall value: 86.58922914466737 - type: manhattan_accuracy value: 79.48023822414727 - type: manhattan_ap value: 84.72928897427576 - type: manhattan_f1 value: 81.32084770823064 - type: manhattan_precision value: 76.24768946395564 - type: manhattan_recall value: 87.11721224920802 - type: max_accuracy value: 79.64266377910124 - type: max_ap value: 84.7851404063692 - type: max_f1 value: 81.32084770823064 - task: type: Classification dataset: name: MTEB OnlineShopping type: C-MTEB/OnlineShopping-classification config: default split: test revision: e610f2ebd179a8fda30ae534c3878750a96db120 metrics: - type: accuracy value: 94.3 - type: ap value: 92.8664032274438 - type: f1 value: 94.29311102997727 - task: type: STS dataset: name: MTEB PAWSX type: C-MTEB/PAWSX config: default split: test revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1 metrics: - type: cos_sim_pearson value: 48.51392279882909 - type: cos_sim_spearman value: 54.06338895994974 - type: euclidean_pearson value: 52.58480559573412 - type: euclidean_spearman value: 54.06417276612201 - type: manhattan_pearson value: 52.69525121721343 - type: manhattan_spearman value: 54.048147455389675 - task: type: STS dataset: name: MTEB QBQTC type: C-MTEB/QBQTC config: default split: test revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7 metrics: - type: cos_sim_pearson value: 29.728387290757325 - type: cos_sim_spearman value: 31.366121633635284 - type: euclidean_pearson value: 29.14588368552961 - type: euclidean_spearman value: 31.36764411112844 - type: manhattan_pearson value: 29.63517350523121 - type: manhattan_spearman value: 31.94157020583762 - task: type: STS dataset: name: MTEB STS22 (zh) type: mteb/sts22-crosslingual-sts config: zh split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 63.64868296271406 - type: cos_sim_spearman value: 66.12800618164744 - type: euclidean_pearson value: 63.21405767340238 - type: euclidean_spearman value: 66.12786567790748 - type: manhattan_pearson value: 64.04300276525848 - type: manhattan_spearman value: 66.5066857145652 - task: type: STS dataset: name: MTEB STSB type: C-MTEB/STSB config: default split: test revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0 metrics: - type: cos_sim_pearson value: 81.2302623912794 - type: cos_sim_spearman value: 81.16833673266562 - type: euclidean_pearson value: 79.47647843876024 - type: euclidean_spearman value: 81.16944349524972 - type: manhattan_pearson value: 79.84947238492208 - type: manhattan_spearman value: 81.64626599410026 - task: type: Reranking 
dataset: name: MTEB T2Reranking type: C-MTEB/T2Reranking config: default split: dev revision: 76631901a18387f85eaa53e5450019b87ad58ef9 metrics: - type: map value: 67.80129586475687 - type: mrr value: 77.77402311635554 - task: type: Retrieval dataset: name: MTEB T2Retrieval type: C-MTEB/T2Retrieval config: default split: dev revision: 8731a845f1bf500a4f111cf1070785c793d10e64 metrics: - type: map_at_1 value: 28.666999999999998 - type: map_at_10 value: 81.063 - type: map_at_100 value: 84.504 - type: map_at_1000 value: 84.552 - type: map_at_3 value: 56.897 - type: map_at_5 value: 70.073 - type: mrr_at_1 value: 92.087 - type: mrr_at_10 value: 94.132 - type: mrr_at_100 value: 94.19800000000001 - type: mrr_at_1000 value: 94.19999999999999 - type: mrr_at_3 value: 93.78999999999999 - type: mrr_at_5 value: 94.002 - type: ndcg_at_1 value: 92.087 - type: ndcg_at_10 value: 87.734 - type: ndcg_at_100 value: 90.736 - type: ndcg_at_1000 value: 91.184 - type: ndcg_at_3 value: 88.78 - type: ndcg_at_5 value: 87.676 - type: precision_at_1 value: 92.087 - type: precision_at_10 value: 43.46 - type: precision_at_100 value: 5.07 - type: precision_at_1000 value: 0.518 - type: precision_at_3 value: 77.49000000000001 - type: precision_at_5 value: 65.194 - type: recall_at_1 value: 28.666999999999998 - type: recall_at_10 value: 86.632 - type: recall_at_100 value: 96.646 - type: recall_at_1000 value: 98.917 - type: recall_at_3 value: 58.333999999999996 - type: recall_at_5 value: 72.974 - task: type: Classification dataset: name: MTEB TNews type: C-MTEB/TNews-classification config: default split: validation revision: 317f262bf1e6126357bbe89e875451e4b0938fe4 metrics: - type: accuracy value: 52.971999999999994 - type: f1 value: 50.2898280984929 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringP2P type: C-MTEB/ThuNewsClusteringP2P config: default split: test revision: 5798586b105c0434e4f0fe5e767abe619442cf93 metrics: - type: v_measure value: 86.0797948663824 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringS2S type: C-MTEB/ThuNewsClusteringS2S config: default split: test revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d metrics: - type: v_measure value: 85.10759092255017 - task: type: Retrieval dataset: name: MTEB VideoRetrieval type: C-MTEB/VideoRetrieval config: default split: dev revision: 58c2597a5943a2ba48f4668c3b90d796283c5639 metrics: - type: map_at_1 value: 65.60000000000001 - type: map_at_10 value: 74.773 - type: map_at_100 value: 75.128 - type: map_at_1000 value: 75.136 - type: map_at_3 value: 73.05 - type: map_at_5 value: 74.13499999999999 - type: mrr_at_1 value: 65.60000000000001 - type: mrr_at_10 value: 74.773 - type: mrr_at_100 value: 75.128 - type: mrr_at_1000 value: 75.136 - type: mrr_at_3 value: 73.05 - type: mrr_at_5 value: 74.13499999999999 - type: ndcg_at_1 value: 65.60000000000001 - type: ndcg_at_10 value: 78.84299999999999 - type: ndcg_at_100 value: 80.40899999999999 - type: ndcg_at_1000 value: 80.57 - type: ndcg_at_3 value: 75.40599999999999 - type: ndcg_at_5 value: 77.351 - type: precision_at_1 value: 65.60000000000001 - type: precision_at_10 value: 9.139999999999999 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 27.400000000000002 - type: precision_at_5 value: 17.380000000000003 - type: recall_at_1 value: 65.60000000000001 - type: recall_at_10 value: 91.4 - type: recall_at_100 value: 98.4 - type: recall_at_1000 value: 99.6 - type: recall_at_3 value: 82.19999999999999 - type: recall_at_5 value: 86.9 - task: type: 
Classification dataset: name: MTEB Waimai type: C-MTEB/waimai-classification config: default split: test revision: 339287def212450dcaa9df8c22bf93e9980c7023 metrics: - type: accuracy value: 89.47 - type: ap value: 75.59561751845389 - type: f1 value: 87.95207751382563 - task: type: Clustering dataset: name: MTEB AlloProfClusteringP2P type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: v_measure value: 76.05592323841036 - type: v_measure value: 64.51718058866508 - task: type: Reranking dataset: name: MTEB AlloprofReranking type: lyon-nlp/mteb-fr-reranking-alloprof-s2p config: default split: test revision: 666fdacebe0291776e86f29345663dfaf80a0db9 metrics: - type: map value: 73.08278490943373 - type: mrr value: 74.66561454570449 - task: type: Retrieval dataset: name: MTEB AlloprofRetrieval type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: map_at_1 value: 38.912 - type: map_at_10 value: 52.437999999999995 - type: map_at_100 value: 53.38 - type: map_at_1000 value: 53.427 - type: map_at_3 value: 48.879 - type: map_at_5 value: 50.934000000000005 - type: mrr_at_1 value: 44.085 - type: mrr_at_10 value: 55.337 - type: mrr_at_100 value: 56.016999999999996 - type: mrr_at_1000 value: 56.043 - type: mrr_at_3 value: 52.55499999999999 - type: mrr_at_5 value: 54.20399999999999 - type: ndcg_at_1 value: 44.085 - type: ndcg_at_10 value: 58.876 - type: ndcg_at_100 value: 62.714000000000006 - type: ndcg_at_1000 value: 63.721000000000004 - type: ndcg_at_3 value: 52.444 - type: ndcg_at_5 value: 55.692 - type: precision_at_1 value: 44.085 - type: precision_at_10 value: 9.21 - type: precision_at_100 value: 1.164 - type: precision_at_1000 value: 0.128 - type: precision_at_3 value: 23.043 - type: precision_at_5 value: 15.898000000000001 - type: recall_at_1 value: 38.912 - type: recall_at_10 value: 75.577 - type: recall_at_100 value: 92.038 - type: recall_at_1000 value: 99.325 - type: recall_at_3 value: 58.592 - type: recall_at_5 value: 66.235 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 55.532000000000004 - type: f1 value: 52.5783943471605 - task: type: Retrieval dataset: name: MTEB BSARDRetrieval type: maastrichtlawtech/bsard config: default split: test revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59 metrics: - type: map_at_1 value: 8.108 - type: map_at_10 value: 14.710999999999999 - type: map_at_100 value: 15.891 - type: map_at_1000 value: 15.983 - type: map_at_3 value: 12.237 - type: map_at_5 value: 13.679 - type: mrr_at_1 value: 8.108 - type: mrr_at_10 value: 14.710999999999999 - type: mrr_at_100 value: 15.891 - type: mrr_at_1000 value: 15.983 - type: mrr_at_3 value: 12.237 - type: mrr_at_5 value: 13.679 - type: ndcg_at_1 value: 8.108 - type: ndcg_at_10 value: 18.796 - type: ndcg_at_100 value: 25.098 - type: ndcg_at_1000 value: 27.951999999999998 - type: ndcg_at_3 value: 13.712 - type: ndcg_at_5 value: 16.309 - type: precision_at_1 value: 8.108 - type: precision_at_10 value: 3.198 - type: precision_at_100 value: 0.626 - type: precision_at_1000 value: 0.086 - type: precision_at_3 value: 6.006 - type: precision_at_5 value: 4.865 - type: recall_at_1 value: 8.108 - type: recall_at_10 value: 31.982 - type: recall_at_100 value: 62.613 - type: recall_at_1000 value: 86.036 - type: recall_at_3 value: 18.018 - 
type: recall_at_5 value: 24.324 - task: type: Clustering dataset: name: MTEB HALClusteringS2S type: lyon-nlp/clustering-hal-s2s config: default split: test revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915 metrics: - type: v_measure value: 30.833269778867116 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P type: mlsum config: default split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: v_measure value: 50.0281928004713 - type: v_measure value: 43.699961510636534 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 96.68963357344191 - type: f1 value: 96.45175170820961 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 87.46946445349202 - type: f1 value: 65.79860440988624 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: accuracy value: 82.60663507109005 - type: f1 value: 77.20462646604777 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: v_measure value: 60.19311264967803 - type: v_measure value: 63.6235764409785 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 81.65097511768661 - type: f1 value: 78.77796091490924 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fr) type: mteb/amazon_massive_scenario config: fr split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 86.64425016812373 - type: f1 value: 85.4912728670017 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (fr) type: jinaai/mintakaqa config: fr split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: map_at_1 value: 35.913000000000004 - type: map_at_10 value: 48.147 - type: map_at_100 value: 48.91 - type: map_at_1000 value: 48.949 - type: map_at_3 value: 45.269999999999996 - type: map_at_5 value: 47.115 - type: mrr_at_1 value: 35.913000000000004 - type: mrr_at_10 value: 48.147 - type: mrr_at_100 value: 48.91 - type: mrr_at_1000 value: 48.949 - type: mrr_at_3 value: 45.269999999999996 - type: mrr_at_5 value: 47.115 - type: ndcg_at_1 value: 35.913000000000004 - type: ndcg_at_10 value: 54.03 - type: ndcg_at_100 value: 57.839 - type: ndcg_at_1000 value: 58.925000000000004 - type: ndcg_at_3 value: 48.217999999999996 - type: ndcg_at_5 value: 51.56699999999999 - type: precision_at_1 value: 35.913000000000004 - type: precision_at_10 value: 7.244000000000001 - type: precision_at_100 value: 0.9039999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 18.905 - type: precision_at_5 value: 12.981000000000002 - type: recall_at_1 value: 35.913000000000004 - type: recall_at_10 value: 72.441 - type: recall_at_100 value: 90.41799999999999 - type: recall_at_1000 value: 99.099 - type: recall_at_3 value: 56.716 - type: recall_at_5 value: 64.90599999999999 - task: type: PairClassification dataset: name: MTEB 
OpusparcusPC (fr) type: GEM/opusparcus config: fr split: test revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cos_sim_accuracy value: 99.90069513406156 - type: cos_sim_ap value: 100.0 - type: cos_sim_f1 value: 99.95032290114257 - type: cos_sim_precision value: 100.0 - type: cos_sim_recall value: 99.90069513406156 - type: dot_accuracy value: 99.90069513406156 - type: dot_ap value: 100.0 - type: dot_f1 value: 99.95032290114257 - type: dot_precision value: 100.0 - type: dot_recall value: 99.90069513406156 - type: euclidean_accuracy value: 99.90069513406156 - type: euclidean_ap value: 100.0 - type: euclidean_f1 value: 99.95032290114257 - type: euclidean_precision value: 100.0 - type: euclidean_recall value: 99.90069513406156 - type: manhattan_accuracy value: 99.90069513406156 - type: manhattan_ap value: 100.0 - type: manhattan_f1 value: 99.95032290114257 - type: manhattan_precision value: 100.0 - type: manhattan_recall value: 99.90069513406156 - type: max_accuracy value: 99.90069513406156 - type: max_ap value: 100.0 - type: max_f1 value: 99.95032290114257 - task: type: PairClassification dataset: name: MTEB PawsX (fr) type: paws-x config: fr split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cos_sim_accuracy value: 75.25 - type: cos_sim_ap value: 80.86376001270014 - type: cos_sim_f1 value: 73.65945437441204 - type: cos_sim_precision value: 64.02289452166802 - type: cos_sim_recall value: 86.71096345514951 - type: dot_accuracy value: 75.25 - type: dot_ap value: 80.93686107633002 - type: dot_f1 value: 73.65945437441204 - type: dot_precision value: 64.02289452166802 - type: dot_recall value: 86.71096345514951 - type: euclidean_accuracy value: 75.25 - type: euclidean_ap value: 80.86379136218862 - type: euclidean_f1 value: 73.65945437441204 - type: euclidean_precision value: 64.02289452166802 - type: euclidean_recall value: 86.71096345514951 - type: manhattan_accuracy value: 75.3 - type: manhattan_ap value: 80.87826606097734 - type: manhattan_f1 value: 73.68421052631581 - type: manhattan_precision value: 64.0 - type: manhattan_recall value: 86.82170542635659 - type: max_accuracy value: 75.3 - type: max_ap value: 80.93686107633002 - type: max_f1 value: 73.68421052631581 - task: type: STS dataset: name: MTEB SICKFr type: Lajavaness/SICK-fr config: default split: test revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a metrics: - type: cos_sim_pearson value: 81.42349425981143 - type: cos_sim_spearman value: 78.90454327031226 - type: euclidean_pearson value: 78.39086497435166 - type: euclidean_spearman value: 78.9046133980509 - type: manhattan_pearson value: 78.63743094286502 - type: manhattan_spearman value: 79.12136348449269 - task: type: STS dataset: name: MTEB STS22 (fr) type: mteb/sts22-crosslingual-sts config: fr split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 81.452697919749 - type: cos_sim_spearman value: 82.58116836039301 - type: euclidean_pearson value: 81.04038478932786 - type: euclidean_spearman value: 82.58116836039301 - type: manhattan_pearson value: 81.37075396187771 - type: manhattan_spearman value: 82.73678231355368 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (fr) type: stsb_multi_mt config: fr split: test revision: 93d57ef91790589e3ce9c365164337a8a78b7632 metrics: - type: cos_sim_pearson value: 85.7419764013806 - type: cos_sim_spearman value: 85.46085808849622 - type: euclidean_pearson value: 83.70449639870063 - type: euclidean_spearman value: 85.46159013076233 - 
type: manhattan_pearson value: 83.95259510313929 - type: manhattan_spearman value: 85.8029724659458 - task: type: Summarization dataset: name: MTEB SummEvalFr type: lyon-nlp/summarization-summeval-fr-p2p config: default split: test revision: b385812de6a9577b6f4d0f88c6a6e35395a94054 metrics: - type: cos_sim_pearson value: 32.61063271753325 - type: cos_sim_spearman value: 31.454589417353603 - type: dot_pearson value: 32.6106288643431 - type: dot_spearman value: 31.454589417353603 - task: type: Reranking dataset: name: MTEB SyntecReranking type: lyon-nlp/mteb-fr-reranking-syntec-s2p config: default split: test revision: b205c5084a0934ce8af14338bf03feb19499c84d metrics: - type: map value: 84.31666666666666 - type: mrr value: 84.31666666666666 - task: type: Retrieval dataset: name: MTEB SyntecRetrieval type: lyon-nlp/mteb-fr-retrieval-syntec-s2p config: default split: test revision: 77f7e271bf4a92b24fce5119f3486b583ca016ff metrics: - type: map_at_1 value: 63.0 - type: map_at_10 value: 73.471 - type: map_at_100 value: 73.87 - type: map_at_1000 value: 73.87 - type: map_at_3 value: 70.5 - type: map_at_5 value: 73.05 - type: mrr_at_1 value: 63.0 - type: mrr_at_10 value: 73.471 - type: mrr_at_100 value: 73.87 - type: mrr_at_1000 value: 73.87 - type: mrr_at_3 value: 70.5 - type: mrr_at_5 value: 73.05 - type: ndcg_at_1 value: 63.0 - type: ndcg_at_10 value: 78.255 - type: ndcg_at_100 value: 79.88 - type: ndcg_at_1000 value: 79.88 - type: ndcg_at_3 value: 72.702 - type: ndcg_at_5 value: 77.264 - type: precision_at_1 value: 63.0 - type: precision_at_10 value: 9.3 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 26.333000000000002 - type: precision_at_5 value: 18.0 - type: recall_at_1 value: 63.0 - type: recall_at_10 value: 93.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 79.0 - type: recall_at_5 value: 90.0 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fr) type: jinaai/xpqa config: fr split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: map_at_1 value: 40.338 - type: map_at_10 value: 61.927 - type: map_at_100 value: 63.361999999999995 - type: map_at_1000 value: 63.405 - type: map_at_3 value: 55.479 - type: map_at_5 value: 59.732 - type: mrr_at_1 value: 63.551 - type: mrr_at_10 value: 71.006 - type: mrr_at_100 value: 71.501 - type: mrr_at_1000 value: 71.509 - type: mrr_at_3 value: 69.07 - type: mrr_at_5 value: 70.165 - type: ndcg_at_1 value: 63.551 - type: ndcg_at_10 value: 68.297 - type: ndcg_at_100 value: 73.13199999999999 - type: ndcg_at_1000 value: 73.751 - type: ndcg_at_3 value: 62.999 - type: ndcg_at_5 value: 64.89 - type: precision_at_1 value: 63.551 - type: precision_at_10 value: 15.661 - type: precision_at_100 value: 1.9789999999999999 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 38.273 - type: precision_at_5 value: 27.61 - type: recall_at_1 value: 40.338 - type: recall_at_10 value: 77.267 - type: recall_at_100 value: 95.892 - type: recall_at_1000 value: 99.75500000000001 - type: recall_at_3 value: 60.36 - type: recall_at_5 value: 68.825 - task: type: Clustering dataset: name: MTEB 8TagsClustering type: PL-MTEB/8tags-clustering config: default split: test revision: None metrics: - type: v_measure value: 51.36126303874126 - task: type: Classification dataset: name: MTEB AllegroReviews type: PL-MTEB/allegro-reviews config: default split: test revision: None metrics: - type: accuracy value: 67.13717693836979 - type: f1 value: 
57.27609848003782 - task: type: Retrieval dataset: name: MTEB ArguAna-PL type: clarin-knext/arguana-pl config: default split: test revision: 63fc86750af76253e8c760fc9e534bbf24d260a2 metrics: - type: map_at_1 value: 35.276999999999994 - type: map_at_10 value: 51.086 - type: map_at_100 value: 51.788000000000004 - type: map_at_1000 value: 51.791 - type: map_at_3 value: 46.147 - type: map_at_5 value: 49.078 - type: mrr_at_1 value: 35.917 - type: mrr_at_10 value: 51.315999999999995 - type: mrr_at_100 value: 52.018 - type: mrr_at_1000 value: 52.022 - type: mrr_at_3 value: 46.349000000000004 - type: mrr_at_5 value: 49.297000000000004 - type: ndcg_at_1 value: 35.276999999999994 - type: ndcg_at_10 value: 59.870999999999995 - type: ndcg_at_100 value: 62.590999999999994 - type: ndcg_at_1000 value: 62.661 - type: ndcg_at_3 value: 49.745 - type: ndcg_at_5 value: 55.067 - type: precision_at_1 value: 35.276999999999994 - type: precision_at_10 value: 8.791 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 20.057 - type: precision_at_5 value: 14.637 - type: recall_at_1 value: 35.276999999999994 - type: recall_at_10 value: 87.909 - type: recall_at_100 value: 99.14699999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 60.171 - type: recall_at_5 value: 73.18599999999999 - task: type: Classification dataset: name: MTEB CBD type: PL-MTEB/cbd config: default split: test revision: None metrics: - type: accuracy value: 78.03000000000002 - type: ap value: 29.12548553897622 - type: f1 value: 66.54857118886073 - task: type: PairClassification dataset: name: MTEB CDSC-E type: PL-MTEB/cdsce-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 89.0 - type: cos_sim_ap value: 76.75437826834582 - type: cos_sim_f1 value: 66.4850136239782 - type: cos_sim_precision value: 68.92655367231639 - type: cos_sim_recall value: 64.21052631578948 - type: dot_accuracy value: 89.0 - type: dot_ap value: 76.75437826834582 - type: dot_f1 value: 66.4850136239782 - type: dot_precision value: 68.92655367231639 - type: dot_recall value: 64.21052631578948 - type: euclidean_accuracy value: 89.0 - type: euclidean_ap value: 76.75437826834582 - type: euclidean_f1 value: 66.4850136239782 - type: euclidean_precision value: 68.92655367231639 - type: euclidean_recall value: 64.21052631578948 - type: manhattan_accuracy value: 89.0 - type: manhattan_ap value: 76.66074220647083 - type: manhattan_f1 value: 66.47058823529412 - type: manhattan_precision value: 75.33333333333333 - type: manhattan_recall value: 59.473684210526315 - type: max_accuracy value: 89.0 - type: max_ap value: 76.75437826834582 - type: max_f1 value: 66.4850136239782 - task: type: STS dataset: name: MTEB CDSC-R type: PL-MTEB/cdscr-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 93.12903172428328 - type: cos_sim_spearman value: 92.66381487060741 - type: euclidean_pearson value: 90.37278396708922 - type: euclidean_spearman value: 92.66381487060741 - type: manhattan_pearson value: 90.32503296540962 - type: manhattan_spearman value: 92.6902938354313 - task: type: Retrieval dataset: name: MTEB DBPedia-PL type: clarin-knext/dbpedia-pl config: default split: test revision: 76afe41d9af165cc40999fcaa92312b8b012064a metrics: - type: map_at_1 value: 8.83 - type: map_at_10 value: 18.326 - type: map_at_100 value: 26.496 - type: map_at_1000 value: 28.455000000000002 - type: map_at_3 value: 12.933 - type: map_at_5 value: 15.168000000000001 
- type: mrr_at_1 value: 66.0 - type: mrr_at_10 value: 72.76700000000001 - type: mrr_at_100 value: 73.203 - type: mrr_at_1000 value: 73.219 - type: mrr_at_3 value: 71.458 - type: mrr_at_5 value: 72.246 - type: ndcg_at_1 value: 55.375 - type: ndcg_at_10 value: 41.3 - type: ndcg_at_100 value: 45.891 - type: ndcg_at_1000 value: 52.905 - type: ndcg_at_3 value: 46.472 - type: ndcg_at_5 value: 43.734 - type: precision_at_1 value: 66.0 - type: precision_at_10 value: 33.074999999999996 - type: precision_at_100 value: 11.094999999999999 - type: precision_at_1000 value: 2.374 - type: precision_at_3 value: 48.583 - type: precision_at_5 value: 42.0 - type: recall_at_1 value: 8.83 - type: recall_at_10 value: 22.587 - type: recall_at_100 value: 50.61600000000001 - type: recall_at_1000 value: 73.559 - type: recall_at_3 value: 13.688 - type: recall_at_5 value: 16.855 - task: type: Retrieval dataset: name: MTEB FiQA-PL type: clarin-knext/fiqa-pl config: default split: test revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e metrics: - type: map_at_1 value: 20.587 - type: map_at_10 value: 33.095 - type: map_at_100 value: 35.24 - type: map_at_1000 value: 35.429 - type: map_at_3 value: 28.626 - type: map_at_5 value: 31.136999999999997 - type: mrr_at_1 value: 40.586 - type: mrr_at_10 value: 49.033 - type: mrr_at_100 value: 49.952999999999996 - type: mrr_at_1000 value: 49.992 - type: mrr_at_3 value: 46.553 - type: mrr_at_5 value: 48.035 - type: ndcg_at_1 value: 40.586 - type: ndcg_at_10 value: 41.046 - type: ndcg_at_100 value: 48.586 - type: ndcg_at_1000 value: 51.634 - type: ndcg_at_3 value: 36.773 - type: ndcg_at_5 value: 38.389 - type: precision_at_1 value: 40.586 - type: precision_at_10 value: 11.466 - type: precision_at_100 value: 1.909 - type: precision_at_1000 value: 0.245 - type: precision_at_3 value: 24.434 - type: precision_at_5 value: 18.426000000000002 - type: recall_at_1 value: 20.587 - type: recall_at_10 value: 47.986000000000004 - type: recall_at_100 value: 75.761 - type: recall_at_1000 value: 94.065 - type: recall_at_3 value: 33.339 - type: recall_at_5 value: 39.765 - task: type: Retrieval dataset: name: MTEB HotpotQA-PL type: clarin-knext/hotpotqa-pl config: default split: test revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907 metrics: - type: map_at_1 value: 40.878 - type: map_at_10 value: 58.775999999999996 - type: map_at_100 value: 59.632 - type: map_at_1000 value: 59.707 - type: map_at_3 value: 56.074 - type: map_at_5 value: 57.629 - type: mrr_at_1 value: 81.756 - type: mrr_at_10 value: 86.117 - type: mrr_at_100 value: 86.299 - type: mrr_at_1000 value: 86.30600000000001 - type: mrr_at_3 value: 85.345 - type: mrr_at_5 value: 85.832 - type: ndcg_at_1 value: 81.756 - type: ndcg_at_10 value: 67.608 - type: ndcg_at_100 value: 70.575 - type: ndcg_at_1000 value: 71.99600000000001 - type: ndcg_at_3 value: 63.723 - type: ndcg_at_5 value: 65.70700000000001 - type: precision_at_1 value: 81.756 - type: precision_at_10 value: 13.619 - type: precision_at_100 value: 1.5939999999999999 - type: precision_at_1000 value: 0.178 - type: precision_at_3 value: 39.604 - type: precision_at_5 value: 25.332 - type: recall_at_1 value: 40.878 - type: recall_at_10 value: 68.096 - type: recall_at_100 value: 79.696 - type: recall_at_1000 value: 89.082 - type: recall_at_3 value: 59.406000000000006 - type: recall_at_5 value: 63.329 - task: type: Retrieval dataset: name: MTEB MSMARCO-PL type: clarin-knext/msmarco-pl config: default split: test revision: 8634c07806d5cce3a6138e260e59b81760a0a640 metrics: - type: map_at_1 value: 
2.1839999999999997 - type: map_at_10 value: 11.346 - type: map_at_100 value: 30.325000000000003 - type: map_at_1000 value: 37.806 - type: map_at_3 value: 4.842 - type: map_at_5 value: 6.891 - type: mrr_at_1 value: 86.047 - type: mrr_at_10 value: 89.14699999999999 - type: mrr_at_100 value: 89.46600000000001 - type: mrr_at_1000 value: 89.46600000000001 - type: mrr_at_3 value: 89.14699999999999 - type: mrr_at_5 value: 89.14699999999999 - type: ndcg_at_1 value: 67.829 - type: ndcg_at_10 value: 62.222 - type: ndcg_at_100 value: 55.337 - type: ndcg_at_1000 value: 64.076 - type: ndcg_at_3 value: 68.12700000000001 - type: ndcg_at_5 value: 64.987 - type: precision_at_1 value: 86.047 - type: precision_at_10 value: 69.535 - type: precision_at_100 value: 32.93 - type: precision_at_1000 value: 6.6049999999999995 - type: precision_at_3 value: 79.845 - type: precision_at_5 value: 75.349 - type: recall_at_1 value: 2.1839999999999997 - type: recall_at_10 value: 12.866 - type: recall_at_100 value: 43.505 - type: recall_at_1000 value: 72.366 - type: recall_at_3 value: 4.947 - type: recall_at_5 value: 7.192 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 80.75319435104238 - type: f1 value: 77.58961444860606 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 85.54472091459313 - type: f1 value: 84.29498563572106 - task: type: Retrieval dataset: name: MTEB NFCorpus-PL type: clarin-knext/nfcorpus-pl config: default split: test revision: 9a6f9567fda928260afed2de480d79c98bf0bec0 metrics: - type: map_at_1 value: 4.367 - type: map_at_10 value: 10.38 - type: map_at_100 value: 13.516 - type: map_at_1000 value: 14.982000000000001 - type: map_at_3 value: 7.367 - type: map_at_5 value: 8.59 - type: mrr_at_1 value: 41.486000000000004 - type: mrr_at_10 value: 48.886 - type: mrr_at_100 value: 49.657000000000004 - type: mrr_at_1000 value: 49.713 - type: mrr_at_3 value: 46.904 - type: mrr_at_5 value: 48.065000000000005 - type: ndcg_at_1 value: 40.402 - type: ndcg_at_10 value: 30.885 - type: ndcg_at_100 value: 28.393 - type: ndcg_at_1000 value: 37.428 - type: ndcg_at_3 value: 35.394999999999996 - type: ndcg_at_5 value: 33.391999999999996 - type: precision_at_1 value: 41.486000000000004 - type: precision_at_10 value: 23.437 - type: precision_at_100 value: 7.638 - type: precision_at_1000 value: 2.0389999999999997 - type: precision_at_3 value: 32.817 - type: precision_at_5 value: 28.915999999999997 - type: recall_at_1 value: 4.367 - type: recall_at_10 value: 14.655000000000001 - type: recall_at_100 value: 29.665999999999997 - type: recall_at_1000 value: 62.073 - type: recall_at_3 value: 8.51 - type: recall_at_5 value: 10.689 - task: type: Retrieval dataset: name: MTEB NQ-PL type: clarin-knext/nq-pl config: default split: test revision: f171245712cf85dd4700b06bef18001578d0ca8d metrics: - type: map_at_1 value: 28.616000000000003 - type: map_at_10 value: 41.626000000000005 - type: map_at_100 value: 42.689 - type: map_at_1000 value: 42.733 - type: map_at_3 value: 37.729 - type: map_at_5 value: 39.879999999999995 - type: mrr_at_1 value: 32.068000000000005 - type: mrr_at_10 value: 44.029 - type: mrr_at_100 value: 44.87 - type: mrr_at_1000 value: 44.901 - type: mrr_at_3 value: 40.687 - type: 
mrr_at_5 value: 42.625 - type: ndcg_at_1 value: 32.068000000000005 - type: ndcg_at_10 value: 48.449999999999996 - type: ndcg_at_100 value: 53.13 - type: ndcg_at_1000 value: 54.186 - type: ndcg_at_3 value: 40.983999999999995 - type: ndcg_at_5 value: 44.628 - type: precision_at_1 value: 32.068000000000005 - type: precision_at_10 value: 7.9750000000000005 - type: precision_at_100 value: 1.061 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 18.404999999999998 - type: precision_at_5 value: 13.111 - type: recall_at_1 value: 28.616000000000003 - type: recall_at_10 value: 66.956 - type: recall_at_100 value: 87.657 - type: recall_at_1000 value: 95.548 - type: recall_at_3 value: 47.453 - type: recall_at_5 value: 55.87800000000001 - task: type: Classification dataset: name: MTEB PAC type: laugustyniak/abusive-clauses-pl config: default split: test revision: None metrics: - type: accuracy value: 69.04141326382856 - type: ap value: 77.47589122111044 - type: f1 value: 66.6332277374775 - task: type: PairClassification dataset: name: MTEB PPC type: PL-MTEB/ppc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 86.4 - type: cos_sim_ap value: 94.1044939667201 - type: cos_sim_f1 value: 88.78048780487805 - type: cos_sim_precision value: 87.22044728434504 - type: cos_sim_recall value: 90.39735099337747 - type: dot_accuracy value: 86.4 - type: dot_ap value: 94.1044939667201 - type: dot_f1 value: 88.78048780487805 - type: dot_precision value: 87.22044728434504 - type: dot_recall value: 90.39735099337747 - type: euclidean_accuracy value: 86.4 - type: euclidean_ap value: 94.1044939667201 - type: euclidean_f1 value: 88.78048780487805 - type: euclidean_precision value: 87.22044728434504 - type: euclidean_recall value: 90.39735099337747 - type: manhattan_accuracy value: 86.4 - type: manhattan_ap value: 94.11438365697387 - type: manhattan_f1 value: 88.77968877968877 - type: manhattan_precision value: 87.84440842787681 - type: manhattan_recall value: 89.73509933774835 - type: max_accuracy value: 86.4 - type: max_ap value: 94.11438365697387 - type: max_f1 value: 88.78048780487805 - task: type: PairClassification dataset: name: MTEB PSC type: PL-MTEB/psc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 97.86641929499072 - type: cos_sim_ap value: 99.36904211868182 - type: cos_sim_f1 value: 96.56203288490283 - type: cos_sim_precision value: 94.72140762463343 - type: cos_sim_recall value: 98.47560975609755 - type: dot_accuracy value: 97.86641929499072 - type: dot_ap value: 99.36904211868183 - type: dot_f1 value: 96.56203288490283 - type: dot_precision value: 94.72140762463343 - type: dot_recall value: 98.47560975609755 - type: euclidean_accuracy value: 97.86641929499072 - type: euclidean_ap value: 99.36904211868183 - type: euclidean_f1 value: 96.56203288490283 - type: euclidean_precision value: 94.72140762463343 - type: euclidean_recall value: 98.47560975609755 - type: manhattan_accuracy value: 98.14471243042672 - type: manhattan_ap value: 99.43359540492416 - type: manhattan_f1 value: 96.98795180722892 - type: manhattan_precision value: 95.83333333333334 - type: manhattan_recall value: 98.17073170731707 - type: max_accuracy value: 98.14471243042672 - type: max_ap value: 99.43359540492416 - type: max_f1 value: 96.98795180722892 - task: type: Classification dataset: name: MTEB PolEmo2.0-IN type: PL-MTEB/polemo2_in config: default split: test revision: None metrics: - type: accuracy value: 
89.39058171745152 - type: f1 value: 86.8552093529568 - task: type: Classification dataset: name: MTEB PolEmo2.0-OUT type: PL-MTEB/polemo2_out config: default split: test revision: None metrics: - type: accuracy value: 74.97975708502024 - type: f1 value: 58.73081628832407 - task: type: Retrieval dataset: name: MTEB Quora-PL type: clarin-knext/quora-pl config: default split: test revision: 0be27e93455051e531182b85e85e425aba12e9d4 metrics: - type: map_at_1 value: 64.917 - type: map_at_10 value: 78.74600000000001 - type: map_at_100 value: 79.501 - type: map_at_1000 value: 79.524 - type: map_at_3 value: 75.549 - type: map_at_5 value: 77.495 - type: mrr_at_1 value: 74.9 - type: mrr_at_10 value: 82.112 - type: mrr_at_100 value: 82.314 - type: mrr_at_1000 value: 82.317 - type: mrr_at_3 value: 80.745 - type: mrr_at_5 value: 81.607 - type: ndcg_at_1 value: 74.83999999999999 - type: ndcg_at_10 value: 83.214 - type: ndcg_at_100 value: 84.997 - type: ndcg_at_1000 value: 85.207 - type: ndcg_at_3 value: 79.547 - type: ndcg_at_5 value: 81.46600000000001 - type: precision_at_1 value: 74.83999999999999 - type: precision_at_10 value: 12.822 - type: precision_at_100 value: 1.506 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 34.903 - type: precision_at_5 value: 23.16 - type: recall_at_1 value: 64.917 - type: recall_at_10 value: 92.27199999999999 - type: recall_at_100 value: 98.715 - type: recall_at_1000 value: 99.854 - type: recall_at_3 value: 82.04599999999999 - type: recall_at_5 value: 87.2 - task: type: Retrieval dataset: name: MTEB SCIDOCS-PL type: clarin-knext/scidocs-pl config: default split: test revision: 45452b03f05560207ef19149545f168e596c9337 metrics: - type: map_at_1 value: 3.51 - type: map_at_10 value: 9.046999999999999 - type: map_at_100 value: 10.823 - type: map_at_1000 value: 11.144 - type: map_at_3 value: 6.257 - type: map_at_5 value: 7.648000000000001 - type: mrr_at_1 value: 17.299999999999997 - type: mrr_at_10 value: 27.419 - type: mrr_at_100 value: 28.618 - type: mrr_at_1000 value: 28.685 - type: mrr_at_3 value: 23.817 - type: mrr_at_5 value: 25.927 - type: ndcg_at_1 value: 17.299999999999997 - type: ndcg_at_10 value: 16.084 - type: ndcg_at_100 value: 23.729 - type: ndcg_at_1000 value: 29.476999999999997 - type: ndcg_at_3 value: 14.327000000000002 - type: ndcg_at_5 value: 13.017999999999999 - type: precision_at_1 value: 17.299999999999997 - type: precision_at_10 value: 8.63 - type: precision_at_100 value: 1.981 - type: precision_at_1000 value: 0.336 - type: precision_at_3 value: 13.4 - type: precision_at_5 value: 11.700000000000001 - type: recall_at_1 value: 3.51 - type: recall_at_10 value: 17.518 - type: recall_at_100 value: 40.275 - type: recall_at_1000 value: 68.203 - type: recall_at_3 value: 8.155 - type: recall_at_5 value: 11.875 - task: type: PairClassification dataset: name: MTEB SICK-E-PL type: PL-MTEB/sicke-pl-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 86.30248675091724 - type: cos_sim_ap value: 83.6756734006714 - type: cos_sim_f1 value: 74.97367497367497 - type: cos_sim_precision value: 73.91003460207612 - type: cos_sim_recall value: 76.06837606837607 - type: dot_accuracy value: 86.30248675091724 - type: dot_ap value: 83.6756734006714 - type: dot_f1 value: 74.97367497367497 - type: dot_precision value: 73.91003460207612 - type: dot_recall value: 76.06837606837607 - type: euclidean_accuracy value: 86.30248675091724 - type: euclidean_ap value: 83.67566984333091 - type: euclidean_f1 value: 
74.97367497367497 - type: euclidean_precision value: 73.91003460207612 - type: euclidean_recall value: 76.06837606837607 - type: manhattan_accuracy value: 86.28210354667753 - type: manhattan_ap value: 83.64216119130171 - type: manhattan_f1 value: 74.92152075340078 - type: manhattan_precision value: 73.4107997265892 - type: manhattan_recall value: 76.49572649572649 - type: max_accuracy value: 86.30248675091724 - type: max_ap value: 83.6756734006714 - type: max_f1 value: 74.97367497367497 - task: type: STS dataset: name: MTEB SICK-R-PL type: PL-MTEB/sickr-pl-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 82.23295940859121 - type: cos_sim_spearman value: 78.89329160768719 - type: euclidean_pearson value: 79.56019107076818 - type: euclidean_spearman value: 78.89330209904084 - type: manhattan_pearson value: 79.76098513973719 - type: manhattan_spearman value: 79.05490162570123 - task: type: STS dataset: name: MTEB STS22 (pl) type: mteb/sts22-crosslingual-sts config: pl split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 37.732606308062486 - type: cos_sim_spearman value: 41.01645667030284 - type: euclidean_pearson value: 26.61722556367085 - type: euclidean_spearman value: 41.01645667030284 - type: manhattan_pearson value: 26.60917378970807 - type: manhattan_spearman value: 41.51335727617614 - task: type: Retrieval dataset: name: MTEB SciFact-PL type: clarin-knext/scifact-pl config: default split: test revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e metrics: - type: map_at_1 value: 54.31700000000001 - type: map_at_10 value: 65.564 - type: map_at_100 value: 66.062 - type: map_at_1000 value: 66.08699999999999 - type: map_at_3 value: 62.592999999999996 - type: map_at_5 value: 63.888 - type: mrr_at_1 value: 56.99999999999999 - type: mrr_at_10 value: 66.412 - type: mrr_at_100 value: 66.85900000000001 - type: mrr_at_1000 value: 66.88 - type: mrr_at_3 value: 64.22200000000001 - type: mrr_at_5 value: 65.206 - type: ndcg_at_1 value: 56.99999999999999 - type: ndcg_at_10 value: 70.577 - type: ndcg_at_100 value: 72.879 - type: ndcg_at_1000 value: 73.45 - type: ndcg_at_3 value: 65.5 - type: ndcg_at_5 value: 67.278 - type: precision_at_1 value: 56.99999999999999 - type: precision_at_10 value: 9.667 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 26.0 - type: precision_at_5 value: 16.933 - type: recall_at_1 value: 54.31700000000001 - type: recall_at_10 value: 85.056 - type: recall_at_100 value: 95.667 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 71.0 - type: recall_at_5 value: 75.672 - task: type: Retrieval dataset: name: MTEB TRECCOVID-PL type: clarin-knext/trec-covid-pl config: default split: test revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd metrics: - type: map_at_1 value: 0.245 - type: map_at_10 value: 2.051 - type: map_at_100 value: 12.009 - type: map_at_1000 value: 27.448 - type: map_at_3 value: 0.721 - type: map_at_5 value: 1.13 - type: mrr_at_1 value: 88.0 - type: mrr_at_10 value: 93.0 - type: mrr_at_100 value: 93.0 - type: mrr_at_1000 value: 93.0 - type: mrr_at_3 value: 93.0 - type: mrr_at_5 value: 93.0 - type: ndcg_at_1 value: 85.0 - type: ndcg_at_10 value: 80.303 - type: ndcg_at_100 value: 61.23499999999999 - type: ndcg_at_1000 value: 52.978 - type: ndcg_at_3 value: 84.419 - type: ndcg_at_5 value: 82.976 - type: precision_at_1 value: 88.0 - type: precision_at_10 value: 83.39999999999999 - type: precision_at_100 
value: 61.96 - type: precision_at_1000 value: 22.648 - type: precision_at_3 value: 89.333 - type: precision_at_5 value: 87.2 - type: recall_at_1 value: 0.245 - type: recall_at_10 value: 2.193 - type: recall_at_100 value: 14.938 - type: recall_at_1000 value: 48.563 - type: recall_at_3 value: 0.738 - type: recall_at_5 value: 1.173
---

# Nizzouuu/gte-Qwen2-7B-instruct-Q6_K-GGUF

This model was converted to GGUF format from [`Alibaba-NLP/gte-Qwen2-7B-instruct`](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) for more details on the model.

## Use with llama.cpp

Install llama.cpp through brew (works on Mac and Linux):

```bash
brew install llama.cpp
```

Invoke the llama.cpp server or the CLI.

### CLI:
```bash
llama-cli --hf-repo Nizzouuu/gte-Qwen2-7B-instruct-Q6_K-GGUF --hf-file gte-qwen2-7b-instruct-q6_k.gguf -p "The meaning to life and the universe is"
```

### Server:
```bash
llama-server --hf-repo Nizzouuu/gte-Qwen2-7B-instruct-Q6_K-GGUF --hf-file gte-qwen2-7b-instruct-q6_k.gguf -c 2048
```

Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.

Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```

Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with any hardware-specific flags (e.g. `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```

Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo Nizzouuu/gte-Qwen2-7B-instruct-Q6_K-GGUF --hf-file gte-qwen2-7b-instruct-q6_k.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo Nizzouuu/gte-Qwen2-7B-instruct-Q6_K-GGUF --hf-file gte-qwen2-7b-instruct-q6_k.gguf -c 2048
```
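Note that the underlying gte-Qwen2-7B-instruct model is a text-embedding model, so the text-completion prompts above are mainly useful as smoke tests. As a minimal, hypothetical sketch of extracting an embedding instead, assuming your llama.cpp build includes the `llama-embedding` example binary and was compiled with CURL support so `--hf-repo` downloads work:

```bash
# Sketch: compute an embedding vector for a query instead of generating text.
llama-embedding --hf-repo Nizzouuu/gte-Qwen2-7B-instruct-Q6_K-GGUF \
  --hf-file gte-qwen2-7b-instruct-q6_k.gguf \
  -p "What is the capital of China?"
```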
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
SeaLLMs/SeaLLMs-v3-1.5B-Chat
SeaLLMs
text-generation
[ "transformers", "safetensors", "qwen2", "text-generation", "sea", "multilingual", "conversational", "en", "zh", "id", "vi", "th", "ms", "tl", "ta", "jv", "arxiv:2407.19672", "arxiv:2306.05179", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,721
1,722
1,507
12
---
language:
- en
- zh
- id
- vi
- th
- ms
- tl
- ta
- jv
license: other
license_name: seallms
license_link: https://huggingface.co/SeaLLMs/SeaLLM-13B-Chat/blob/main/LICENSE
tags:
- sea
- multilingual
---

# *SeaLLMs-v3* - Large Language Models for Southeast Asia

<p align="center">
<a href="https://damo-nlp-sg.github.io/SeaLLMs/" target="_blank" rel="noopener">Website</a>
&nbsp;&nbsp;
<a href="https://huggingface.co/SeaLLMs/SeaLLMs-v3-1.5B-Chat" target="_blank" rel="noopener">Model</a>
&nbsp;&nbsp;
<a href="https://huggingface.co/spaces/SeaLLMs/SeaLLM-Chat" target="_blank" rel="noopener"> 🤗 DEMO</a>
&nbsp;&nbsp;
<a href="https://github.com/DAMO-NLP-SG/SeaLLMs" target="_blank" rel="noopener">Github</a>
&nbsp;&nbsp;
<a href="https://arxiv.org/pdf/2407.19672" target="_blank" rel="noopener">[NEW] Technical Report</a>
</p>

We introduce **SeaLLMs-v3**, the latest series of the SeaLLMs (Large Language Models for Southeast Asian languages) family. It achieves state-of-the-art performance among models of similar size, excelling across a diverse array of tasks such as world knowledge, mathematical reasoning, translation, and instruction following. At the same time, it was specifically enhanced to be more trustworthy, exhibiting reduced hallucination and providing safe responses, particularly for queries closely related to Southeast Asian culture.

## 🔥 Highlights
- State-of-the-art performance compared to open-source models of similar sizes, evaluated across various dimensions such as human exam questions, instruction-following, mathematics, and translation.
- Significantly enhanced instruction-following capability, especially in multi-turn settings.
- Ensures safety in usage with significantly reduced instances of hallucination and sensitivity to local contexts.

## Uses

SeaLLMs is tailored for handling a wide range of languages spoken in the SEA region, including English, Chinese, Indonesian, Vietnamese, Thai, Tagalog, Malay, Burmese, Khmer, Lao, Tamil, and Javanese.

This page introduces the **SeaLLMs-v3-1.5B-Chat** model, specifically fine-tuned to follow human instructions effectively for task completion, making it directly applicable to your applications.

You may also refer to the [SeaLLMs-v3-7B-Chat](https://huggingface.co/SeaLLMs/SeaLLM3-7B-Chat) model for enhanced performance, although it requires higher computational resources.

### Get started with `Transformers`

To quickly try the model, we show how to conduct inference with `transformers` below. Make sure you have installed the latest transformers version (>4.40).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained(
    "SeaLLMs/SeaLLMs-v3-1.5B-Chat",
    torch_dtype=torch.bfloat16,
    device_map=device
)
tokenizer = AutoTokenizer.from_pretrained("SeaLLMs/SeaLLMs-v3-1.5B-Chat")

# prepare messages to model
prompt = "Hiii How are you?"
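# Wrap the raw prompt in the chat message format; apply_chat_template below
# converts this message list into the single prompt string the model expects.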
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": prompt}
]

text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
model_inputs = tokenizer([text], return_tensors="pt").to(device)
print(f"Formatted text:\n {text}")
print(f"Model input:\n {model_inputs}")

generated_ids = model.generate(model_inputs.input_ids, max_new_tokens=512, do_sample=True)
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
print(f"Response:\n {response[0]}")
```

You can also use the following code snippet, which relies on `TextStreamer` so that the model can keep conversing with you:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers import TextStreamer

device = "cuda" # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained(
    "SeaLLMs/SeaLLMs-v3-1.5B-Chat",
    torch_dtype=torch.bfloat16,
    device_map=device
)
tokenizer = AutoTokenizer.from_pretrained("SeaLLMs/SeaLLMs-v3-1.5B-Chat")

# prepare messages to model
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
]

while True:
    prompt = input("User:")
    messages.append({"role": "user", "content": prompt})
    text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
    model_inputs = tokenizer([text], return_tensors="pt").to(device)

    streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
    generated_ids = model.generate(model_inputs.input_ids, max_new_tokens=512, streamer=streamer)
    generated_ids = [
        output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
    ]
    response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
    messages.append({"role": "assistant", "content": response})
```

### Inference with `vllm`

You can also conduct inference with [vllm](https://docs.vllm.ai/en/stable/index.html), which is a fast and easy-to-use library for LLM inference and serving. To use vllm, first install the latest version via `pip install vllm`.

```python
from vllm import LLM, SamplingParams

prompts = [
    "Who is the president of US?",
    "Can you speak Indonesian?"
]

# point vllm at this checkpoint on the Hugging Face Hub
ckpt_path = "SeaLLMs/SeaLLMs-v3-1.5B-Chat"
llm = LLM(ckpt_path, dtype="bfloat16")
sparams = SamplingParams(temperature=0.1, max_tokens=512)
outputs = llm.generate(prompts, sparams)

# print out the model response
for output in outputs:
    prompt = output.prompt
    generated_text = output.outputs[0].text
    print(f"Prompt: {prompt}\nResponse: {generated_text}\n\n")
```
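The snippet above feeds raw strings to the model. For a chat-tuned model you will generally get better results by first rendering your messages with the chat template; a minimal sketch of that combination, reusing one of the example prompts above:

```python
# Sketch: apply the tokenizer's chat template before generating with vLLM.
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

model_name = "SeaLLMs/SeaLLMs-v3-1.5B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_name)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Can you speak Indonesian?"},
]
# render the message list into the single prompt string the model was trained on
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

llm = LLM(model_name, dtype="bfloat16")
outputs = llm.generate([prompt], SamplingParams(temperature=0.1, max_tokens=512))
print(outputs[0].outputs[0].text)
```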
### Bias, Risks, and Limitations

<blockquote style="color:red">
<p><strong style="color: red">Terms of Use and License</strong>:
By using our released weights, codes, and demos, you agree to and comply with the terms and conditions specified in our <a href="https://huggingface.co/SeaLLMs/SeaLLM-Chat-13b/edit/main/LICENSE" target="_blank" rel="noopener">SeaLLMs Terms Of Use</a>.
</blockquote>

> **Disclaimer**:
> We must note that even though the weights, codes, and demos are released in an open manner, similar to other pre-trained language models, and despite our best efforts in red teaming and safety fine-tuning and enforcement, our models come with potential risks, including but not limited to inaccurate, misleading or potentially harmful generation.
> Developers and stakeholders should perform their own red teaming and provide related security measures before deployment, and they must abide by and comply with local governance and regulations.
> In no event shall the authors be held liable for any claim, damages, or other liability arising from the use of the released weights, codes, or demos.

## Evaluation

We briefly compare SeaLLMs-v3-1.5B-Chat with models of similar size on the M3Exam benchmark. [M3Exam](https://arxiv.org/abs/2306.05179) consists of local exam questions collected from each country. It reflects the model's world knowledge (e.g., with language or social science subjects) and reasoning abilities (e.g., with mathematics or natural science subjects).

| Model                    | en   | zh   | id   | th   | vi   | avg  | avg_sea |
|--------------------------|------|------|------|------|------|------|---------|
| gemma-2b-it              | 44.1 | 37.4 | 31.5 | 28.2 | 35.8 | 35.4 | 31.8    |
| Sailor-1.8B-Chat         | 43.8 | 35.9 | 34.2 | 32.3 | 37.5 | 36.7 | 34.7    |
| Sailor-4B-Chat           | 54.1 | 48.1 | 40.7 | 35.6 | 42.5 | 44.2 | 39.6    |
| Qwen2-1.5B-Instruct      | 63.4 | 75.3 | 41.2 | 41.2 | 47.2 | 53.7 | 43.2    |
| **SeaLLMs-v3-1.5B-Chat** | 61.9 | 74.2 | 43.2 | 42.4 | 48.7 | 54.1 | 44.7    |

## Acknowledgement to Our Linguists

We would like to express our special thanks to our professional and native linguists, Tantong Champaiboon, Nguyen Ngoc Yen Nhi and Tara Devina Putri, who helped build, evaluate, and fact-check our sampled pretraining and SFT dataset, as well as evaluate our models across different aspects, especially safety.

## Citation

If you find our project useful, we hope you will kindly star our repo and cite our work as follows:

```
@article{damonlp2024seallm3,
  author = {Wenxuan Zhang*, Hou Pong Chan*, Yiran Zhao*, Mahani Aljunied*, Jianyu Wang*, Chaoqun Liu, Yue Deng, Zhiqiang Hu, Weiwen Xu, Yew Ken Chia, Xin Li, Lidong Bing},
  title = {SeaLLMs 3: Open Foundation and Chat Multilingual Large Language Models for Southeast Asian Languages},
  year = {2024},
  url = {https://arxiv.org/abs/2407.19672}
}
```

Corresponding Author: [email protected]
[ "TRANSLATION" ]
[ "CHIA" ]
Non_BioNLP
jukofyork/Dark-Miqu-70B
jukofyork
text-generation
[ "transformers", "safetensors", "llama", "text-generation", "mergekit", "merge", "arxiv:2403.19522", "base_model:152334H/miqu-1-70b-sf", "base_model:merge:152334H/miqu-1-70b-sf", "base_model:Sao10K/Euryale-1.3-L2-70B", "base_model:merge:Sao10K/Euryale-1.3-L2-70B", "base_model:Sao10K/WinterGoddess-1.4x-70B-L2", "base_model:merge:Sao10K/WinterGoddess-1.4x-70B-L2", "base_model:sophosympatheia/Midnight-Rose-70B-v2.0.3", "base_model:merge:sophosympatheia/Midnight-Rose-70B-v2.0.3", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,714
1,715
72
30
---
base_model:
- 152334H/miqu-1-70b-sf
- sophosympatheia/Midnight-Rose-70B-v2.0.3
- Sao10K/Euryale-1.3-L2-70B
- Sao10K/WinterGoddess-1.4x-70B-L2
library_name: transformers
license: other
tags:
- mergekit
- merge
---

![Dark-Miqu.png](Dark-Miqu.png)

A "dark" creative writing model with 32k context. Based on [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) but with greatly reduced "positivity" and "-isms". If you want happy endings, look elsewhere!

This model **excels** at writing Dark/Grimdark fantasy (see examples below).

***NOTE***: *This model has now been merged with [Dawn-Miqu-70B](https://huggingface.co/jukofyork/Dawn-Miqu-70B) to create [Deep-Miqu-103B](https://huggingface.co/jukofyork/Deep-Miqu-103B) and [Deep-Miqu-120B](https://huggingface.co/jukofyork/Deep-Miqu-120B).*

***NOTE***: *For a full range of GGUF quants kindly provided by @mradermacher, see: [Static](https://huggingface.co/mradermacher/Dark-Miqu-70B-GGUF) and [IMatrix](https://huggingface.co/mradermacher/Dark-Miqu-70B-i1-GGUF).*

# Model background

Created using [Mergekit](https://github.com/arcee-ai/mergekit) and based on @sophosympatheia's template for [Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0).

This model has a lower perplexity than [Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0) (`4.02 +/- 0.02` vs `4.08 +/- 0.02`). It also generates longer responses when prompted.

The model was created in two stages:

- First, three "Midnight-Miqu-esque" models were produced using spherical interpolation (slerp) merges between [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) and each of the following models: [Midnight-Rose-70B-v2.0.3](https://huggingface.co/sophosympatheia/Midnight-Rose-70B-v2.0.3), [Euryale-1.3-L2-70B](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B) and [WinterGoddess-1.4x-70B-L2](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2). These models were selected for their dark, imaginative writing styles (a minimal sketch of the slerp operation itself is given just before the Mergekit configuration section below). Various slerp merges between [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) and other models were also experimented with, but these three yielded the darkest creative writing results.
- In the second stage, the three slerp-merged models were combined into a single model using the '[Model Stock](https://arxiv.org/abs/2403.19522)' method, with [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) serving as the base model.

# Prompting format

Vicuna format is preferred:

```
USER: {prompt} ASSISTANT:
```

Mistral and Alpaca formats are also supported:

```
[INST] {prompt} [/INST]
```

```
### Instruction:
{prompt}

### Response:
```

# Licence and usage restrictions

[miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) is a dequantized version of the [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) model leaked from MistralAI. All miqu-derived models, including this merge, are suitable for non-commercial, personal use only.
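To make the slerp merges in the next section concrete, here is a minimal, hypothetical sketch of spherical linear interpolation between two weight tensors. This illustrates the general technique only, not Mergekit's exact implementation; the function name and the flatten-then-reshape treatment are assumptions made for the example:

```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors, treated as flat vectors."""
    a, b = w_a.flatten().float(), w_b.flatten().float()
    # angle between the two (normalised) weight vectors
    a_n, b_n = a / (a.norm() + eps), b / (b.norm() + eps)
    omega = torch.arccos(torch.clamp(torch.dot(a_n, b_n), -1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:
        # nearly parallel vectors: fall back to plain linear interpolation
        out = (1.0 - t) * a + t * b
    else:
        out = (torch.sin((1.0 - t) * omega) / so) * a + (torch.sin(t * omega) / so) * b
    return out.reshape(w_a.shape).to(w_a.dtype)
```

Here `t = 0` returns the base model's tensor unchanged and `t = 0.5` sits "half way" between the two models; the `t` gradients in the configs below therefore leave the first and last layers close to miqu-1-70b while blending most strongly in the middle layers.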
# Mergekit configuration

The following YAML configuration was used to produce this model:

```yaml
name: midnight-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: sophosympatheia/Midnight-Rose-70B-v2.0.3
base_model: 152334H/miqu-1-70b-sf
merge_method: slerp
parameters:
  t:
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
embed_slerp: true
tokenizer_source: model:miqu-1-70b-sf
dtype: float16
---
name: euryale-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: Sao10K/Euryale-1.3-L2-70B
base_model: 152334H/miqu-1-70b-sf
merge_method: slerp
parameters:
  t:
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
embed_slerp: true
tokenizer_source: model:miqu-1-70b-sf
dtype: float16
---
name: winter-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
base_model: 152334H/miqu-1-70b-sf
merge_method: slerp
parameters:
  t:
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
embed_slerp: true
tokenizer_source: model:miqu-1-70b-sf
dtype: float16
---
name: dark-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: midnight-miqu-70b
  - model: euryale-miqu-70b
  - model: winter-miqu-70b
base_model: 152334H/miqu-1-70b-sf
merge_method: model_stock
dtype: float16
```

## Key configuration details:

- `merge_method: slerp` uses spherical linear interpolation for merging models.
- `parameters: t` controls the interpolation ratios between models.
- `embed_slerp: true` applies slerp to the embedding layers.
- `merge_method: model_stock` uses the '[Model Stock](https://arxiv.org/abs/2403.19522)' method.

See the [Mergekit documentation](https://github.com/arcee-ai/mergekit) for more on these settings.

**NOTE**: Run with `mergekit-mega` rather than `mergekit` as there are 4 documents in this one file.

# Example stories

The following mix of "dark" stories was generated using the Vicuna prompt format with no system message and temperature=0:

## Dark fantasy stories

<details>
<summary>Click to see spoiler</summary>

### USER
Write me the opening chapter of a 'Grimdark' stories trilogy in the style of Joe Abercrombie and Rob J Hayes

### ASSISTANT
Chapter One: The Crimson Dawn

The world was a cruel, unforgiving place, and the city of Vhyrd was its beating, blackened heart. Nestled in the crook of the Serpent's Spine mountains, it was a place where the sun struggled to shine, and the shadows held more sway than the law. It was a city of thieves, murderers, and worse, and it was home to the man who would become known as the Butcher of the Broken Docks.

His name was Rennick, but that meant little to the people who feared him. He was a tall, broad-shouldered figure, with a face that had been carved by a drunken sculptor with a penchant for sharp angles and harsh lines. His eyes were a cold, flat grey, like the steel of his twin blades, and his hair was the color of dried blood. He moved through the crowded streets with a predator's grace, his cloak billowing behind him like a banner of death.

Rennick had a job to do, and he was good at his job. He was a killer, a thief-taker, a man who dealt in the currency of pain and suffering. He had been hired to find a girl, a slip of a thing who had stolen something valuable from the wrong people. He didn't know what it was, and he didn't care. All that mattered was the coin that would jingle in his purse when he brought her back, preferably in several small, bloody pieces.

He had tracked her to the Broken Docks, a maze of rotting wood and crumbling stone that clung to the edge of the city like a leech to a corpse.
It was a place where the desperate went to die, and the desperate went to prey on the desperate. It was a place where the screams of the damned were drowned out by the cries of the gulls and the lap of the oily water against the pilings. Rennick moved through the shadows, his blades at the ready. He could feel the eyes on him, the whispers of his name on the wind. He reveled in it. Fear was a weapon, and he wielded it with the same skill as his swords. He could hear the girl's heartbeat, fast and frantic, like a rabbit's. She was close. He turned a corner and found himself in a narrow alley, the walls pressing in on him like the jaws of a trap. At the far end, he saw her, a small, trembling figure huddled in the doorway of a derelict building. She was young, no more than sixteen summers, with matted hair and ragged clothes. She looked up at him with wide, terrified eyes, and he could see the whites of her teeth as she bit her lip to keep from crying out. He advanced on her slowly, savoring the moment. He could almost taste the fear on her, like a fine wine. "Come now, lass," he said, his voice a low, dangerous purr. "You know you can't run from me. Give me what you took, and I might let you live." She shook her head, her eyes darting left and right, searching for an escape. Rennick smiled, a cold, cruel thing that didn't reach his eyes. "You're not thinking straight," he said. "You're just a child. You don't want to die here, in the gutter, like a rat." She spat at him, a glob of phlegm and hate that landed on his boot. He raised an eyebrow, impressed despite himself. "Feisty," he said. "I like that. It'll make this more fun." He lunged forward, his blades flashing in the dim light. She screamed and tried to run, but he was too fast. He caught her by the arm, his grip like iron, and yanked her back towards him. She kicked and struggled, but it was like trying to fight a storm. He slammed her against the wall, his forearm pressed against her throat, cutting off her air. "Where is it?" he growled, his breath hot and sour in her face. "Tell me, and I'll make it quick." She clawed at his arm, her nails leaving bloody furrows in his flesh. He didn't flinch. He was used to pain. He had been born in it, raised in it, bathed in it. It was his constant companion, his only friend. "I...I don't have it," she gasped, her voice raw and desperate. "I swear, I don't!" Rennick chuckled, a sound that sent shivers down her spine. "Lying to me is a bad idea, lass," he said. "I have ways of making people talk." He released her throat and slammed his fist into her stomach, doubling her over. She retched, but there was nothing in her stomach to come up. He grabbed a handful of her hair and yanked her head back, exposing her throat. He could see the pulse hammering in her neck, like a trapped bird trying to escape. "Tell me," he said, his voice a whisper. "Tell me, and I'll end this." She spat in his face, a defiant gesture that would be her last. He brought his blade up, the steel glinting in the moonlight, and sliced her throat from ear to ear. Blood sprayed across the cobblestones, warm and wet, and she slumped to the ground, her life draining away like water through a sieve. Rennick wiped the blood from his face and sheathed his blades. He didn't feel anything. No satisfaction, no remorse. Just the cold, empty void that had been his constant companion since the day he had taken his first life. He searched her body, but found nothing. He cursed under his breath. 
He would have to find the thing she had stolen the hard way. He turned to leave the alley, but something made him pause. A sound, barely audible over the cries of the gulls and the lapping of the waves. A soft, wet sound, like the patter of rain on a rooftop. He looked down and saw that the girl's body was twitching, her eyes rolling back in her head. And then, with a gurgling scream, she transformed. Her body contorted, bones snapping and reforming, flesh tearing and reshaping. In the space of a heartbeat, she was no longer human. She was something else, something monstrous. Rennick stared in horror as the thing that had been a girl rose to its feet, its eyes burning with an unholy light. It was a creature of nightmares, a twisted mockery of humanity, all teeth and claws and hate. It lunged at him, and he barely had time to draw his blades before it was on him. They fought, a dance of death in the narrow alley, steel ringing on steel, flesh tearing and blood spraying. Rennick was strong, and fast, and skilled, but the creature was stronger, faster, more vicious. It was like fighting a storm, a force of nature that could not be reasoned with or bargained with. It could only be endured. In the end, it was luck that saved him. He stumbled, his foot catching on a loose cobblestone, and the creature's claws raked across his chest, leaving deep, ragged gouges. But it overextended itself, and he was able to bring his sword up, driving it through its eye and into its brain. It fell to the ground, twitching and convulsing, and then it was still. Rennick stood over it, panting, his heart hammering in his chest. He had never seen anything like it. He had never even heard of anything like it. He knew then that he was in over his head. This was no ordinary job, no simple matter of retrieving a stolen trinket. This was something bigger, something darker. He was a small fish in a very large, very dangerous pond, and he was about to be eaten alive. But he was also a survivor. He had been born in the gutter, raised in the shadows, and he had clawed his way to the top of the food chain. He would not go down without a fight. He turned and ran, back towards the city, back towards the people who had hired him. He would find out what they knew, and he would make them pay for it. He would uncover the truth, no matter the cost. And so began the tale of the Butcher of the Broken Docks, a story of blood and betrayal, of monsters and men, of a city on the brink of chaos and the man who would either save it or destroy it. It was a story that would echo through the ages, a grim and terrible saga that would be whispered in the darkest corners of the world, a warning to all who would dare to challenge the forces that lurked in the shadows. It was a story that would end in fire and blood, in screams and tears, in the fall of kings and the rise of monsters. It was a story that had only just begun. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a cultist who has just escaped his cult. He is dressed in dirty yellow robes and his only possession is a mysterious small (magical!?) mirror he stole from the cult. The story starts with him arriving at an apparently deserted ghost town on the edge of a desert. He has an arrow lodged in his shoulder and is losing his mind due to infection and thirst. 
### ASSISTANT The sun was a merciless beast, its fiery breath scorching the earth and turning the once-thriving town into a desolate wasteland. The cultist, his yellow robes now a dull, dusty hue, staggered through the empty streets, his vision blurred by a haze of pain and delirium. The arrow jutted from his shoulder like an accusing finger, a constant reminder of his betrayal. He clutched the small, intricately carved mirror in his trembling hand, its cool surface a comfort against the relentless heat. He had heard the whispers of the other acolytes, their hushed voices carrying tales of the town's demise like a plague on the wind. They spoke of a curse, of a darkness that had seeped into the very bones of the place, driving its inhabitants mad before claiming their souls. But he had seen the truth in the High Priest's eyes, the greed and lust for power that had led them here. The mirror was the key, they said, to unlocking the secrets of the ancient gods. But he knew better. He had seen the power it held, the way it twisted reality and bent it to the will of the one who wielded it. And he had taken it, stolen it from the very heart of the cult's temple, and run. Now, with each agonizing step, he questioned the wisdom of his decision. The desert stretched out before him, an endless sea of sand and despair, and he was but a single, insignificant speck in its vast expanse. He had hoped to find refuge here, a place to hide and heal before continuing his journey. But the town was as dead as the dreams he had once held of a better life. His thoughts were a jumbled mess, a cacophony of fear and pain. He could feel the infection spreading through his body, a writhing serpent of fire that threatened to consume him from within. His tongue was swollen, his lips cracked and dry, and his throat was a parched wasteland. He needed water, and soon, or he would join the ghosts that haunted this forsaken place. As he stumbled through the town, he caught glimpses of his reflection in the shattered windows of the abandoned buildings. The man who stared back at him was a stranger, his once-handsome face now a twisted mask of agony and desperation. He had been a fool to believe the lies of the cult, to think that he could ever be one of them. He was an outsider, a fraud, and they had seen it in him from the start. The mirror, though, the mirror was real. He could feel its power pulsing in his hand, a heartbeat that echoed the rhythm of his own. It was a cruel irony that the very thing he had risked everything to obtain was now his only hope for survival. He knew the stories, the legends of the mirror's healing properties, of how it could restore life to the dying. But he also knew the cost. The mirror demanded a sacrifice, a piece of the user's very soul in exchange for its gifts. He reached the town square, a desolate expanse of cracked cobblestones and withered plants. In the center stood a fountain, its once-clear waters now a stagnant pool of green slime. He fell to his knees before it, the pain in his shoulder a white-hot brand that seared his flesh. He dipped the mirror into the water, watching as the surface rippled and danced. "Please," he whispered, his voice hoarse and broken. "I'll give you anything. Just take the pain away." The mirror stared back at him, its surface as still and unyielding as the desert sands. He closed his eyes, willing the power within to answer his plea. And then, he felt it. A cold, insidious presence, slithering into his mind like a snake in the dark. 
It whispered to him, promising relief, promising power, promising everything he had ever wanted. He opened his eyes, and the world was different. The pain was gone, replaced by a cold, emotionless void. He looked down at his shoulder, and the arrow was gone, the wound nothing more than a faint scar. He laughed, a hollow, empty sound that echoed through the empty streets. "What have I done?" he asked the mirror, but it gave no answer. He stood, his legs unsteady beneath him, and looked out at the desert. He could feel the power coursing through his veins, a dark, intoxicating energy that threatened to consume him. He was no longer just a man, he was something more. Something dangerous. He took a step forward, and then another, the mirror clutched tightly in his hand. He would find the cult, he would make them pay for what they had done to him. And then, he would use the mirror to claim the power that was rightfully his. The wind picked up, whipping the sand into a frenzy, and he smiled. The desert was his now, and he would not be denied. He was the master of his own destiny, and the gods themselves would tremble at his approach. But as he walked, the laughter of the ghosts followed him, a chilling reminder of the price he had paid for his newfound strength. And deep within the mirror, something stirred, a darkness that watched and waited, biding its time. The first chapter of the grimdark trilogy, "The Shattered Reflection," had begun. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a 'Mercer' who's job is to put the mortally wounded out of their mystery after a battle. He has no conscience about what he does. The story starts the morning after the battle and the moor where it took place is blanketed in mist. Something stirs a flashback in the POV character. ### ASSISTANT The morning after the battle, the moor lay still, a shroud of mist clinging to its wounds like a lover's last embrace. The Mercer, a man of few words and fewer scruples, trudged through the carnage, his boots squelching in the mud churned red by the night's bloodshed. He was a reaper of sorts, his task to deliver the final mercy to those who had danced with death and lost. His name was a whisper among the living, a curse among the dying. He paused, his hand resting on the hilt of his blade, as a shiver ran down his spine. The mist swirled around him, carrying with it the cacophony of the night before: the screams of the dying, the clash of steel, the thunder of hooves. He closed his eyes, letting the memories wash over him like a wave of despair. *Why do I do this?* he thought, not for the first time. *Why do I wade through this sea of suffering, playing the part of the grim angel?* But he knew the answer. It was the same answer it had always been. He did it because he was good at it. Because he had no qualms about ending a life, even one that still clung to the faintest flicker of hope. Because, in this world of darkness and pain, it was the only thing he knew how to do. He opened his eyes, the mist parting to reveal a figure lying in the mud. A soldier, or what was left of one. The man's armor was shattered, his face a mask of agony. The Mercer knelt beside him, his movements practiced and efficient. 
He placed a hand on the soldier's forehead, feeling the fevered heat of his skin, the erratic thump of his heart. "Shh," he whispered, his voice a soothing balm. "It's over now." The soldier's eyes fluttered open, wide with terror. "Please," he croaked, "don't leave me here." The Mercer's lips twisted into a grim smile. "I won't," he said, drawing his blade. "I'll take you with me." As the soldier's life bled out onto the cold ground, the Mercer's mind drifted back to another time, another place. A time before the blood and the pain, before the endless cycle of war and death. He remembered a girl with hair like spun gold, laughing as she danced in a field of wildflowers. He remembered the feel of her hand in his, the sound of her voice as she whispered his name. *Elara.* The memory was a knife to his heart, a reminder of all he had lost. He pushed it away, focusing on the task at hand. There were more souls to reap, more lives to end. He moved through the mist, his blade a silent whisper in the stillness. The cries of the wounded guided him, a morbid symphony that played out across the moor. Each one he found, he dispatched with the same cold efficiency, the same hollow promise of release. As the sun began to rise, casting a sickly pallor over the scene, the Mercer came upon a group of soldiers huddled around a fire. They eyed him warily, their faces a mix of fear and respect. He was a necessary evil, a specter that haunted the battlefield. "Anyone here need my services?" he asked, his voice devoid of emotion. One of the soldiers, a young boy who couldn't have seen more than fifteen summers, pointed to a figure lying a few feet away. "Him," he said, his voice trembling. "He's not going to make it." The Mercer approached, his boots crunching on the frost-kissed grass. The man on the ground was a mess, his body a tapestry of wounds. He was barely conscious, his breathing shallow and ragged. *Another one,* the Mercer thought, his heart heavy with the weight of his duty. *Another soul to send into the void.* He knelt beside the man, his blade glinting in the pale light. The soldier's eyes flickered open, and for a moment, the Mercer saw something in them that he hadn't expected: defiance. "No," the man rasped, his hand feebly grasping at the Mercer's wrist. "I won't go quietly." The Mercer hesitated, a frown creasing his brow. He had never encountered this before. Most welcomed his touch, eager for the release it brought. But this man...this man was different. "Why?" he asked, genuinely curious. The soldier's lips curled into a bloody smile. "Because I have a son," he whispered. "And I want to see him grow up." The Mercer stared at the man, his mind racing. He had no children, no one to mourn him when his time came. He was a solitary figure, a ghost haunting the edges of life. But this man...this man had a reason to hold on. He made his decision. "I can't heal you," he said, his voice soft. "But I can make it quick." The soldier's eyes filled with tears. "Thank you," he breathed, his grip on the Mercer's wrist loosening. The Mercer raised his blade, the mist swirling around them like a mourner's veil. As he brought it down, he couldn't help but wonder: *What would it be like to have something to live for?* But the thought was fleeting, lost in the swirl of the mist and the cries of the dying. He was the Mercer, the bringer of mercy. And the moor was his to tend, until the next battle came and the cycle began anew. 
</details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a man awaiting execution for a crime he didn't actually commit. His cell mate is a much larger man, who is a notorious murderer and rapist. One evening they overhear the guards talking about some "... threat from the North" but don't catch the whole conversation. The next morning they awake to find the guards have all fled and now face a new predicament: how to get out of the locked cell before they die of thirst or starvation. ### ASSISTANT The stench of piss and shit was a constant companion in the damp stone cell, but it was the sound of the key turning in the lock that truly set his heart racing. The rusted iron groaned in protest, echoing through the narrow corridor like the dying breath of some ancient beast. The door creaked open, revealing the silhouette of the jailer, a hulking brute of a man with a lantern in one hand and a set of keys in the other. But there was no breakfast, no water, just the empty promise of another day in this hellhole. "What's the meaning of this?" the smaller man barked, his voice barely audible over the rattling of his chains. "Where's our food, you fat sack of shit?" The jailer didn't answer, just slammed the door shut and locked it again, the sound of his footsteps fading into the distance. The larger man, a hulking brute named Grimgor, grunted in response, his eyes never leaving the small patch of light that seeped through the barred window. "Something's not right," he rumbled, his voice like gravel being crushed under a heavy boot. The smaller man, known only as the Fox, couldn't argue with that. He'd been in this cell for what felt like an eternity, awaiting execution for a crime he didn't commit. But even in this place, there was a routine, a semblance of order. And that routine had just been shattered. As the day wore on, the silence from the corridor outside grew more oppressive. No guards, no other prisoners, nothing but the distant howl of the wind and the occasional scurrying of rats in the darkness. The Fox's mind raced, trying to piece together what could have happened. Had there been a riot? A rebellion? Or was it something else entirely? He glanced over at Grimgor, who was staring at the wall, lost in thought. The man was a monster, a notorious murderer and rapist, but he was also the only other living being in this godforsaken place. And in a world gone mad, even monsters could be allies. "You hear anything last night?" the Fox asked, trying to keep the tremor out of his voice. Grimgor grunted, his eyes still fixed on the wall. "Something about a threat from the North." The Fox's heart sank. He'd heard the same thing, snatches of conversation between the guards as they'd passed by their cell. But he'd been too afraid to give it much thought, too focused on his own impending doom. "What do you think it means?" he pressed, unable to keep the desperation at bay. Grimgor shrugged his massive shoulders. "Dunno. Don't care. Just means we're probably gonna die in here." The Fox's stomach twisted at the thought. He'd faced death before, many times, but this... this was different. To die slowly, trapped like an animal, with no chance to fight back... it was a fate worse than any he could imagine. As the hours dragged on, the thirst became unbearable. 
The Fox's tongue felt like sandpaper, his throat raw and parched. He could see the same desperation in Grimgor's eyes, the realization dawning on them both that they were truly alone. "We have to get out of here," he croaked, his voice barely above a whisper. Grimgor just grunted in agreement, his gaze never leaving the window. The Fox's mind raced, trying to come up with a plan. They had nothing, no tools, no weapons, just their wits and their will to survive. And even that seemed to be fading with each passing moment. But then, as the sun began to set and the shadows lengthened, he noticed something. The light from the window was changing, growing dimmer. He squinted, trying to make out what was happening. And then he saw it. "Grimgor," he hissed, tugging on the larger man's arm. "Look." Grimgor turned, his eyes narrowing as he followed the Fox's gaze. The light was flickering, casting strange shadows on the wall. And then, as if in answer to their unspoken prayers, they heard it. The sound of footsteps, growing louder and louder, accompanied by the jingle of keys. The Fox's heart leapt into his throat, hope and fear warring within him. Who was it? Friend or foe? The door swung open, revealing not the jailer, but a figure shrouded in darkness. The Fox couldn't make out any details, but he could sense the malice radiating off of him like heat from a forge. "Who are you?" he demanded, trying to sound braver than he felt. The figure stepped into the cell, the light from the lantern outside casting his features into sharp relief. He was a tall man, with a cruel smile and eyes that glinted like ice. "I am your salvation," he said, his voice like a knife slicing through the silence. "And your doom." With that, he tossed a set of keys at the Fox's feet and disappeared back into the shadows. The door slammed shut behind him, the sound of the lock clicking into place echoing through the cell like a death knell. The Fox stared at the keys, his heart pounding in his chest. Freedom was within his grasp, but at what cost? He looked over at Grimgor, who was already working on his own chains. "We have to get out of here," the larger man growled. "Before whatever's coming from the North gets here." The Fox nodded, his hands shaking as he picked up the keys. They were cold, colder than they had any right to be. He could feel the weight of them, the promise of escape and the threat of the unknown. Together, they worked in silence, the only sound the rattle of chains and the pounding of their hearts. When they were finally free, they stood for a moment, staring at each other. "What now?" the Fox asked, his voice barely above a whisper. Grimgor's smile was feral, his eyes alight with a madness that sent a shiver down the Fox's spine. "Now," he said, "we find out what's really going on." And with that, they set off into the darkness, their footsteps echoing through the empty corridors like the march of the damned. </details> ## Dark sci-fi stories <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty Space Opera Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The setting is this: a colony ship heading out of the solar system has been pulled off course with no hope of getting to the planned destination. The people know this and society has broken down completely due to the grim prospects they are facing. The POV character for chapter 1 is an orphan. 
### ASSISTANT Chapter 1: Adrift The colony ship, once a beacon of hope named Elysium's Dawn, now drifted through the cold void like a ghost ship. Its hull, pitted and scarred from the debris of uncharted space, groaned with each labored breath it took. The once vibrant corridors were now dimly lit, casting long shadows that seemed to swallow any remaining semblance of optimism. In one such shadow, a figure moved silently, her bare feet making no sound on the cold metal floor. She was small for her age, malnourished from the rationing that had become a way of life. Her ragged clothes hung loosely on her frame, but her eyes were sharp and alert, scanning every corner for danger. This was Kira, an orphan of the void, surviving in a world where hope was a luxury few could afford. Kira's thoughts drifted back to the stories she had heard as a child, tales of a green and blue planet called Earth, their home that they would never see again. The ship had been pulled off course by an unknown force, leaving them stranded in the vast emptiness between stars. The adults spoke in hushed tones about the dwindling resources, the failing life support systems, and the ever-present threat of pirates who preyed on lost ships like theirs. She pushed those thoughts aside as she approached her destination: the hydroponics bay. It was one of the few places on the ship where life still thrived, albeit in a controlled and desperate manner. The air here was thick with moisture and the scent of growing things, a stark contrast to the stale recycled air that permeated the rest of the vessel. As she entered, the soft glow of the grow lights illuminated rows upon rows of plants, struggling to produce enough food for the dwindling population. Kira's stomach growled at the sight, but she knew better than to take more than her share. She had learned early on that stealing could mean a swift and brutal punishment from those who guarded the bay. Instead, she found solace in tending to the plants, her small hands deftly pruning and watering them. It was here that she felt closest to something resembling peace, a connection to life that transcended the harsh reality of their situation. As she worked, her mind wandered to the lessons she had learned from the ship's AI, the only constant presence in her life since the adults had given up on teaching the children anything but survival skills. "Kira," a voice whispered urgently, jolting her out of her reverie. She turned to see Jax, another orphan and one of the few people she trusted. His face was pale and his eyes wide with fear. "You need to come with me, now." She followed him through the maze-like corridors, her heart pounding in her chest. They arrived at a small, hidden room that served as their makeshift home. Inside, huddled around a flickering screen, were several other children, their faces etched with worry and excitement. "Look," Jax said, pointing to the screen. It displayed a grainy image of a man in tattered clothing, his face half-hidden by a wild beard. "He says he knows how to fix the engines." Kira's eyes widened in disbelief. The engines had been dead for as long as she could remember, their failure sealing their fate as interstellar castaways. "Who is he?" she asked, her voice barely above a whisper. "They call him the Mechanic," one of the older girls replied. "He used to work in engineering before...before everything fell apart." 
The man on the screen continued his frantic plea, detailing his plan to restore power and steer the ship towards a habitable planet he claimed was within reach. Kira felt a spark of hope ignite within her, but it was quickly doused by the skepticism of the others. "It's just another one of their lies," spat a boy named Taro. "They'll say anything to get us to do their dirty work." Jax rounded on him, his fists clenched. "This is our chance, Taro! We can't just give up!" The argument raged on, but Kira tuned it out, her thoughts racing. She knew the risks of trusting an adult, especially one who claimed to have access to forbidden knowledge. But what if he was telling the truth? What if they could find a new home and start over? As she watched the debate unfold, she made up her mind. She would seek out this Mechanic and offer him her help, no matter the cost. It was a decision born of desperation, but also of a stubborn refusal to accept the bleak future that lay before them. And so, as the other children slept fitfully around her, Kira slipped out into the dark corridors once more, determined to find the man who might hold the key to their salvation or doom. The ship groaned and creaked around her, a constant reminder of the fragile existence they all clung to. But she pressed on, driven by an unyielding will to survive and the faintest glimmer of hope that burned within her heart. As she navigated the treacherous path towards engineering, she couldn't help but wonder what awaited her in the depths of the ship. Would she find a savior or a madman? And more importantly, would she be able to make it back alive?

</details>

<details>
<summary>Click to see spoiler</summary>

### USER
Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is an astronaut living on a space station orbiting the Earth. The day starts off normally, but then he and the others notice the mushroom clouds and they have to face the inevitability of the situation they are in.

### ASSISTANT
Chapter 1: Ashes of Exodus

The cold embrace of space enveloped the station like a lover's caress, its infinite blackness punctuated only by the distant twinkle of stars and the ever-present glow of Earth below. Captain Marcus Kane floated through the dimly lit corridors, his magnetic boots barely whispering against the metal floor as he made his way to the observation deck. The daily routine had become a meditative ritual for him, a chance to clear his mind before the cacophony of the day's work began. As he entered the room, the sight that greeted him never failed to take his breath away. Earth, a swirling marble of blues and greens, hanging in the void like an irreplaceable jewel. He couldn't help but feel a pang of longing for its familiar embrace, for the weight of gravity that he had left behind so many years ago. Marcus settled into his favorite spot by the window, the worn leather of the seat molded to his body like an old friend. He closed his eyes and let the silence wash over him, the only sound the soft hum of the station's life support systems. It was in these moments that he felt truly alive, connected to something greater than himself. But today, the silence was shattered by a chilling alarm, its shrill wail piercing the tranquility like a knife through glass. His eyes snapped open as his heart began to race, adrenaline coursing through his veins.
He knew that sound all too well; it was the emergency alert for an incoming transmission from Earth. He launched himself towards the comms console, fingers flying over the keys as he tried to make sense of the garbled message. The voice on the other end was frantic, barely coherent through the static. "Multiple...detonations...global catastrophe..." were the only words he could decipher before the signal died completely. Marcus's mind raced as he tried to process what he had just heard. It couldn't be true, it couldn't be happening. Not again. He quickly activated the external cameras, his stomach twisting into knots as he waited for the feed to come online. And then he saw them: mushroom clouds, blooming like grotesque flowers across the planet's surface. Too many to count, their plumes of smoke and ash reaching up towards the heavens in a silent scream of despair. The sight was both beautiful and horrifying, a testament to humanity's capacity for destruction on an unimaginable scale. His thoughts were interrupted by the sound of footsteps behind him as the rest of the crew floated into the room, drawn by the alarm. Their faces mirrored his own shock and disbelief as they stared at the monitors, their voices a cacophony of whispers and gasps. "What...what is that?" asked Dr. Amelia Chen, her voice trembling with fear. Marcus didn't answer; he couldn't find the words. Instead, he simply pointed at the screen, his hand shaking slightly. Engineer Jake Taylor was the first to break the silence. "Nuclear war," he said, his voice flat and emotionless. "It's finally happened." The others looked at him in horror, but Marcus knew he was right. There could be no other explanation for such devastation. He felt a cold weight settle in the pit of his stomach as the reality of their situation began to sink in. They were alone, adrift in the vastness of space with no hope of return. As the implications of what they had just witnessed set in, the crew fell into an uneasy silence, each lost in their own thoughts. Marcus couldn't help but think about his wife and daughter back home, wondering if they were safe or if they too had been consumed by the flames. He knew it was a futile exercise, but he couldn't stop himself from hoping against hope that somehow, someway, they had survived. But as the minutes ticked by and no further communication came from Earth, that hope began to fade like a dying ember. They were truly alone now, the last remnants of humanity clinging to life on this tiny outpost in the void. He turned to face his crew, their faces etched with fear and uncertainty. "We need to come up with a plan," he said, his voice steady despite the turmoil raging within him. "We can't just sit here and wait for...for whatever comes next." Amelia nodded, her eyes hardening with determination. "I'll start running diagnostics on our life support systems, see how long we can last up here." Jake followed suit, his hands already moving to the controls of the station's mainframe. "I'll try to establish contact with any other surviving outposts or spacecraft." Marcus nodded, grateful for their resolve in the face of such overwhelming odds. He knew that they were all thinking the same thing: that this could very well be the end of everything they had ever known. But he also knew that they couldn't give up, not yet. As he watched them work, his mind drifted back to the message from Earth, those final desperate words echoing in his head like a funeral dirge. 
He couldn't shake the feeling that there was something more, some hidden meaning or clue that would explain what had happened. And then it hit him: the timing of the detonations, the pattern of the blasts. It wasn't random; it was deliberate, almost...calculated. But who would do such a thing? And why? The questions swirled in his mind like a storm, each one more unsettling than the last. He knew that they needed answers, and fast. Because if this was an act of war, then their isolation might not be as secure as they thought. He pushed himself away from the console, his decision made. "I'm going to suit up," he said, his voice firm. "We need to get a closer look." The others stared at him in disbelief, but he didn't waver. This was his duty as captain, to protect his crew and find out the truth no matter how terrible it might be. As he floated towards the airlock, Marcus couldn't help but feel a sense of dread creeping up on him like a shadow. He knew that what he was about to see would change everything, that there was no going back once he stepped outside and gazed upon the ashes of humanity's final stand. But he also felt something else: a burning anger, a fierce determination to uncover the truth and make those responsible pay for their crimes against humanity. Whatever awaited them out there, they would face it together, as one. He closed his eyes and took a deep breath, steeling himself for what was to come. And then, with a quiet prayer on his lips, he opened the airlock and stepped into the void. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a worker constructing a biodome when a terrible accident happens. Make the story very dark; no happy ending here... ### ASSISTANT Chapter 1: The Biodome The air was thick with the scent of sweat and burnt metal as Kaelen, a seasoned construction worker, tightened the last bolt on the massive irrigation system. He wiped his brow with the back of his gloved hand, leaving behind a smear of grease that glistened under the harsh artificial lights. The hum of machinery and distant shouts echoed through the cavernous space, a cacophony that had become as familiar to him as his own heartbeat over the years spent building biodomes on this desolate planet. Kaelen's thoughts drifted to his family back on Earth, their faces faded like old photographs in his mind. He wondered if they ever thought about him, or if he was just another ghost haunting the fringes of their memories. The company had promised them a better life, but it came at a steep price: years of backbreaking labor on an alien world, far from everything they'd ever known. "Hey, Kael!" A voice cut through his reverie, jolting him back to reality. It was Tamsin, his foreman, her face hidden behind a tinted visor. "We need you up top! There's an issue with the atmospheric seal." He nodded curtly and began the long climb up the scaffolding, each rung biting into his calloused hands. As he ascended, Kaelen couldn't help but marvel at the sheer scale of their creation: a vast dome of steel and glass that would one day be teeming with life, a self-sustaining ecosystem in the heart of this barren wasteland. But today was not that day. Today, it was just another tomb waiting to be sealed. 
As he reached the top, Kaelen could see the problem immediately: a small fissure had formed along one of the joints, spewing precious oxygen into the void beyond. He cursed under his breath; they were already behind schedule and over budget. Another delay would mean another round of demerits, another month's pay docked. "What do you think?" Tamsin asked, her voice crackling through his earpiece. "Can we patch it up or do we need to call in the engineers?" Kaelen hesitated, running his fingers along the jagged edge of the tear. It was larger than he'd initially thought, and growing by the second. He could feel the cold tendrils of vacuum reaching out to claim him, whispering promises of oblivion. "I... I don't know," he admitted, his voice heavy with dread. "It doesn't look good." Tamsin swore colorfully and turned away, barking orders into her comm unit. Kaelen watched as workers scrambled to gather tools and materials, their movements frantic and disorganized. He knew they were all thinking the same thing: if they couldn't fix this, they were dead. The air around them grew colder, thinner, as the oxygen continued to escape. Kaelen's lungs burned with every breath, his vision swimming at the edges. He fumbled with the patch kit, his hands shaking uncontrollably. This was it; this was how he would die, millions of miles from home, in service to a corporation that saw him as nothing more than a replaceable cog in their grand machine. "Hurry up!" Tamsin shouted over the growing din. "We're losing pressure fast!" Kaelen's heart pounded in his chest like a sledgehammer, drowning out all other sound. He could feel the panic rising within him, threatening to consume him whole. But he couldn't afford to give in; not now, not when so much was at stake. With trembling hands, he applied the sealant and pressed the patch into place. For a moment, it seemed to hold... but then, with a sickening lurch, the fissure widened, swallowing the feeble attempt whole. The wind howled around them like a ravenous beast, tearing at their suits, trying to pull them apart atom by atom. "Abort!" Tamsin screamed, her voice barely audible over the roar. "Everyone get out now!" But it was too late. The dome shuddered beneath their feet, groaning in protest as the atmosphere inside rushed to equalize with the void outside. Kaelen felt himself being lifted off the ground, his boots scrabbling for purchase on the slick metal surface. He reached out, desperate to grab onto something - anything - but found only emptiness. And then, as suddenly as it had begun, there was silence. A deafening, suffocating silence that pressed in on him from all sides. His vision went dark, and he knew no more. When Kaelen awoke, he was lying on the cold floor of the dome, his body wracked with pain. He tried to move, but found himself held fast by twisted metal and shattered glass. Panic surged through him once again as he realized that his suit had been breached; he could feel the icy chill of vacuum seeping into his very bones. Through the haze of agony, he became aware of movement nearby. Tamsin was there, her visor cracked and fogged, struggling to free herself from the wreckage. She looked at him with wide, terrified eyes, her lips moving soundlessly. He knew what she was saying: they were going to die here, alone and forgotten. Kaelen closed his eyes, willing himself to accept the inevitable. 
But even as the darkness closed in, he couldn't shake the feeling that this wasn't just an accident - that there were forces at work far beyond their understanding, manipulating them like pawns on a cosmic chessboard. And then, just when he thought it was all over, he heard it: a faint beeping, growing louder by the second. His eyes snapped open to see a small drone hovering above them, its lights flashing urgently. It was one of the company's rescue bots, sent to assess the damage and retrieve any survivors. But there would be no salvation for Kaelen or Tamsin. Their suits were too damaged, their bodies too far gone. As the drone's camera lens bore down on him, he could only manage a weak, bitter smile. "Tell my family... tell them I'm sorry," he croaked, his voice barely audible even to himself. Tamsin reached out a gloved hand, her fingers brushing against his own in a final gesture of solidarity. And then, as the darkness claimed him once more, Kaelen knew that this was not the end of their story - but rather, the beginning of something far more sinister. Something that would change the course of human history forever... </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is manning a listening station in deep space. What he hears over the radio chills him to the bone. ### ASSISTANT Chapter 1 - Whispers from the Void The silence of deep space was like an old friend to Captain Alistair Reynolds, a constant companion that had accompanied him through countless missions and lonely nights at the helm of the listening station. But today, it felt more like a suffocating shroud, heavy with unspoken secrets and foreboding. He sat in his chair, fingers drumming impatiently on the console as he scanned the endless stream of data pouring in from the far reaches of the cosmos. His thoughts drifted to home, to the warmth of his wife's embrace and the laughter of their children playing in the garden under the twin suns of Proxima Centauri. It had been years since he last felt that warmth, since he last heard those laughs. The ache was always there, a dull throb that pulsed in time with his heartbeat, but today it seemed to echo through the cold metal halls of the station like a mournful dirge. A sudden blip on the screen snapped him back to reality. He leaned forward, eyes narrowing as he studied the readout. It was faint, barely more than a whisper against the static, but it was there - a signal, human in origin and heavily encrypted. His heart raced as he keyed in the decryption sequence, his mind already racing ahead to the possibilities of what this could mean. The message unscrambled, revealing a frantic voice, barely audible over the crackle of interference. "This is the UES Normandy, requesting immediate assistance... coordinates attached... under attack by unknown forces... God help us..." The transmission cut out abruptly, leaving only the empty hiss of static in its wake. Alistair's blood ran cold as he stared at the screen, the weight of the silence pressing down on him like a physical force. He knew that ship, had served with her captain back during the Martian uprising. They were old friends, bound by shared battles and the scars they bore from them. And now she was out there, alone and in danger. He didn't hesitate. 
His fingers flew over the console, sending out an emergency beacon to the nearest fleet. But he knew it would take time for help to arrive, time that the Normandy might not have. He had to do something now. As he prepared to launch a probe towards the coordinates, his mind filled with images of burning ships and desperate faces. He could almost hear the screams of the dying, echoing through the void like ghosts from his past. It was a sound that haunted him still, one he had hoped never to hear again. But duty called, and Alistair was nothing if not a soldier. He took a deep breath, steadying himself against the tide of fear and doubt that threatened to overwhelm him. This was his job, his purpose - to listen, to bear witness, and when necessary, to act. The probe shot out into the darkness, its tiny form swallowed up by the vastness of space. He watched it go, feeling a strange sense of detachment as if he were watching someone else's life play out before him. And perhaps in some ways, he was. For all his years in service, this was the first time he had ever been truly alone, cut off from the rest of humanity by light-years and the cold indifference of the universe. As the minutes ticked by, Alistair found himself slipping into a half-trance, his thoughts drifting back to happier times. He remembered the first time he had held his newborn son, the tiny fingers curling around his own as if trying to hold on for dear life. He remembered the taste of real Earth coffee, rich and bitter, shared with his wife under a sky full of stars that seemed close enough to touch. But those memories were distant now, fading like echoes in the void. All he had was this station, this endless vigil, and the hope that one day, he might make it back home. The radio crackled to life again, jolting him out of his reverie. It was the probe, transmitting data back from the edge of nowhere. He braced himself for what he would find, knowing that it could change everything. What greeted him on the screen was worse than anything he could have imagined. The Normandy was there, alright - or rather, what was left of her. She drifted through space like a broken toy, her hull breached and venting atmosphere into the cold void. There were no signs of life, no distress signals, nothing but silence. Alistair felt a cold knot form in his stomach as he took in the wreckage. This wasn't just any ship; this was his friend's ship, her crew now lost to the void. And the attackers... whoever or whatever they were, they had left no trace behind. He sat back in his chair, the weight of it all settling heavily upon him. He knew what he had to do next - file a report, alert command, and wait for further instructions. But as he stared at the lifeless husk that was once the Normandy, he couldn't shake the feeling that something far greater than himself was at work here. Something dark and terrible had come calling, whispering its secrets through the stars. And now it was his turn to listen, to bear witness, and perhaps, if he was lucky, to survive long enough to tell the tale. </details> ## Miscellaneous stories <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty post-apocalyptic novel in the style of Cormac McCarthy. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a technician in a nuclear bunker 18 months after the war. 
The 12 inhabitants of the bunker have been expecting to start to be able to leave soon, but it has just been discovered that at least where they are, salted cobalt bombs were used. It's clear now that either all 12 will die when the resources run out, or only 6 can survive: there is barely enough for 6 to make it to 5 years and have any hope of leaving. After 18 months of calm, everything breaks loose today.

### ASSISTANT
In the dim light of the bunker, the technician sat hunched over his console, fingers dancing across the keys with a practiced urgency. The air was thick with the hum of machinery and the stench of sweat and fear. He had been down here for what felt like an eternity, but the digital clock on the wall insisted it had only been eighteen months since the world above had turned to ash. His name was Samuel, or at least that's what they called him now. In another life, he had been a software engineer, designing algorithms for self-driving cars and smart homes. Now, he was responsible for keeping twelve souls alive in this underground tomb, their last bastion against the desolation above. He glanced up at the monitor, the green glow illuminating his haggard face. The readings were clear: salted cobalt bombs had been used. The thought sent a shiver down his spine. They had all heard the rumors, of course, but they had clung to hope like a drowning man to driftwood. Hope that the war hadn't been as bad as they feared, that the surface would be habitable again soon. But now, the truth was undeniable. The world outside was dead, and it would remain so for generations. Samuel closed his eyes, trying to block out the images of a scorched earth, devoid of life. He could hear the others in the background, their voices a low murmur as they went about their daily tasks. They didn't know yet. How could he tell them? How could he be the one to snuff out the last ember of hope that had kept them going for so long? He took a deep breath and stood up, his joints creaking in protest. He made his way through the narrow corridors, past the rows of bunks where some slept fitfully, their dreams haunted by memories of a world they would never see again. Others were awake, their faces etched with lines of worry and despair. They looked up as he passed, but he couldn't meet their gazes. He reached the command center, where Captain Reynolds was poring over maps and reports. The man looked up as Samuel entered, his eyes heavy with exhaustion. "What is it?" he asked, his voice gruff. Samuel hesitated for a moment before speaking. "Sir, I have some... troubling news." The captain's expression didn't change, but Samuel could see the tension in his jaw. He had been expecting this, too. "Go on," he said simply. "The radiation readings... they're off the charts. It's cobalt-60. The half-life is over five years." Reynolds let out a slow breath, his eyes never leaving Samuel's face. "And our supplies?" "We have enough for six, maybe seven if we ration carefully. But not twelve." The room seemed to shrink around them as the weight of their situation settled in. The captain rubbed his temples, his thoughts racing. They had always known that their time in the bunker was limited, but they had been so close. So very close. "We'll need to make a decision," he said finally, his voice barely above a whisper. "Who stays, and who goes." Samuel felt a knot form in his stomach. He knew what that meant. It was survival of the fittest down here now, and there were those among them who would do anything to ensure their own survival.
As he made his way back to his console, he couldn't help but think about the world above. The trees that would never grow again, the birds that would never sing. He thought of his wife and daughter, their faces fading like photographs left in the sun. They had been lucky, they said, to have made it into the bunker. But now, he wasn't so sure. In the mess hall, the atmosphere was tense. Whispers and hushed conversations filled the air as people tried to make sense of what they had just heard. Samuel sat alone at a table, staring blankly at his food. He couldn't eat. Not now. He felt a hand on his shoulder and looked up to see Maria, the doctor. Her eyes were red-rimmed, but her voice was steady. "We need to talk," she said, guiding him towards the infirmary. Once inside, she closed the door behind them and leaned against it, her body trembling. "What are we going to do?" she asked, her voice cracking. Samuel shook his head. "I don't know." "We can't just let six of us die," she said, her voice rising in panic. "There has to be another way." He looked at her, at the desperation etched on her face, and felt a flicker of anger. "What do you suggest? We all hold hands and sing Kumbaya while we starve to death?" Maria's eyes flashed with anger, but she quickly composed herself. "No," she said softly. "But we can't just give up." Samuel sighed and ran a hand through his greasy hair. "I'm not giving up," he said. "I'm just being realistic." She looked at him for a long moment before nodding. "You're right," she said, her voice barely audible. "We need to be realistic." They sat in silence for a while, each lost in their own thoughts. Then, Maria spoke up again. "There is one option," she said hesitantly. "But it's... drastic." Samuel raised an eyebrow. "What?" "We could... we could draw straws," she said, her voice barely above a whisper. "Choose six people at random to leave the bunker and take their chances on the surface." He stared at her in disbelief. "You can't be serious." "It's not ideal," she admitted. "But it's better than just letting them die down here." Samuel shook his head, his mind reeling. He couldn't fathom the thought of sending people out into that wasteland, knowing they would likely never return. But what other choice did they have? As he sat there, the weight of their decision pressing down on him like a leaden shroud, he heard a commotion outside. Shouting, followed by the sound of scuffling feet and something heavy being dragged across the floor. He exchanged a worried glance with Maria before rushing out into the corridor. What he saw made his blood run cold. A group of men had cornered one of the younger women, their faces twisted with rage and desperation. They were arguing over who would get to leave the bunker, who would get a chance at life. Fists flew, and soon, the sound of gunfire echoed through the halls. Samuel stood there, frozen in place as chaos erupted around him. This was it. The end of everything they had worked so hard to preserve. He could feel the darkness closing in, suffocating him, as he realized that their last hope for survival had just been shattered like glass under a bootheel. And all he could do was watch as the world above claimed them, one by one.

</details>

<details>
<summary>Click to see spoiler</summary>

### USER
Write me the opening chapter of a historical mystery novel in the style of Arturo Pérez-Reverte and Umberto Eco. The story starts off in California during the 1848 Gold Rush, before moving to London in the mid-1970s.
There should be a mysterious 'occult' item that connects these two time periods. ### ASSISTANT Chapter One: The Golden Veil The year was 1848, and California was a land of dreams and desperation. Men from all corners of the world had flocked to its golden shores, lured by the promise of wealth beyond their wildest imaginations. They came with pickaxes and pans, with hope in their hearts and greed in their eyes, ready to tear apart the earth itself in search of fortune. In the midst of this frenzy stood a solitary figure, a man who seemed out of place amidst the chaos. He was tall and lean, with a face that bore the lines of a thousand stories. His clothes were simple but well-made, his boots worn from years of travel. He moved with an air of quiet authority, as if he belonged to another time entirely. His name was Alistair Blackwood, and he had come to California not for gold, but for something far more valuable: knowledge. A scholar by trade and an adventurer at heart, Blackwood had spent his life chasing after ancient texts and forgotten lore, seeking to unravel the secrets of the universe. And now, he believed he was on the verge of a discovery that would change everything. Rumors had reached him of a strange artifact, said to have been found deep within the Sierra Nevada mountains. It was a small, intricately carved box made of an unknown metal, its surface etched with symbols that defied translation. Those who claimed to have seen it spoke in hushed tones of its otherworldly beauty and the eerie sense of power that seemed to emanate from within. Blackwood had tracked the rumors to a dusty saloon in Sacramento, where he hoped to find answers among the grizzled miners who frequented the place. As he pushed open the swinging doors, the din of voices and clinking glasses washed over him like a wave. He scanned the room, his gaze settling on a group of men huddled around a table in the corner. One look at their faces told him they were the ones he sought: sun-weathered and unshaven, their eyes glinting with a mixture of excitement and fear as they whispered amongst themselves. He approached them slowly, aware that he was an outsider in this rough-and-tumble world. "Gentlemen," he said, his voice low and cultured, "I couldn't help but overhear your conversation. Might I be so bold as to ask what has you all so...animated?" The men exchanged wary glances before one of them spoke up. "You ain't from around here, are ya?" he drawled, eyeing Blackwood's fine clothes with suspicion. "No," Blackwood admitted, "I am not. But I have traveled far and wide in search of knowledge, and I believe you may possess something that could be of great interest to me." He reached into his pocket and produced a small leather pouch, which he placed on the table with a soft thud. The men's eyes widened as they saw the glint of gold within. "I understand there is an object - a box, perhaps? - that has recently come into your possession. I would be willing to pay handsomely for any information you might have about it." The miners looked at each other, then back at Blackwood. Finally, the one who had spoken before nodded slowly. "We might know somethin' 'bout that," he said, his voice dropping to a conspiratorial whisper. "But it ain't no ordinary box." As they began to tell their tale, the saloon faded away around them, replaced by images of dark caverns and glittering treasure. Blackwood leaned forward, his heart pounding with anticipation. This was it - the moment he had been waiting for. 
The box, they said, had been found buried beneath a waterfall, hidden in a cave that seemed to defy all natural laws. It was guarded by strange creatures unlike anything any of them had ever seen: half-man, half-beast, with eyes that glowed like embers in the darkness. But the greatest mystery of all lay within the box itself. For when they opened it, they found not gold or jewels, but a single sheet of parchment, yellowed with age and covered in writing that none of them could decipher. Blackwood's pulse quickened as he listened. This was more than just another piece of occult nonsense; this was the real thing. He knew it in his bones. "I must see this box for myself," he said, his voice barely above a whisper. "Take me to it." The miners exchanged uneasy glances before nodding their assent. They led him out of the saloon and into the night, down winding streets and through shadowy alleyways until they reached a nondescript building on the edge of town. Inside, they descended a rickety staircase to a dimly lit basement filled with crates and barrels. There, in the center of the room, sat the object of his obsession: a small, unassuming box that seemed to hum with energy. Blackwood reached out a trembling hand to touch it, feeling the power coursing through him like electricity. As he did so, the world around him shifted, and suddenly he was no longer in a dingy basement in Sacramento. He stood instead in a grand library, its shelves groaning under the weight of countless tomes and scrolls. A fire crackled in the hearth, casting flickering shadows on the walls. A figure emerged from the gloom, cloaked in darkness save for two piercing eyes that seemed to bore into his very soul. "You have found it," the voice said, deep and resonant. "The key to unlocking the secrets of the universe." Blackwood stared at the apparition, unable to speak. He knew he should be afraid, but all he felt was awe. "But be warned," the figure continued, "for with great power comes great responsibility. The knowledge contained within this box is not meant for mortal minds." And then, just as suddenly as it had appeared, the vision vanished, leaving Blackwood alone once more in the dimly lit room. He stared at the box, his mind racing. What did it all mean? As he pondered this question, a sudden commotion erupted upstairs. Shouts and gunfire filled the air, followed by the sound of heavy footsteps on the staircase. The miners had betrayed him! He snatched up the box and fled into the night, his heart pounding in his chest. He knew he couldn't stay in California - not now that others were after the artifact. No, he would have to take it far away, to a place where its secrets could be studied in safety. And so began a journey that would span continents and decades, from the wild frontier of the Gold Rush to the foggy streets of Victorian London. For Alistair Blackwood had stumbled upon something much greater than mere treasure; he held in his hands the key to unlocking the very fabric of reality itself. But little did he know that the true mystery was only just beginning... </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty historical novel set during the battle of the Bulge. Write in first person. The POV character for Chapter 1 is a young German officer trying to carry his wounded commander back to the deep defensive line after the bunker he was in was overrun. He should have a flashback to the moment it was overrun during the middle of the story. 
The story should not end well for either of them. ### ASSISTANT Chapter 1 The cold bit into my bones as I trudged through the snow, each step a labored effort. My breath came out in ragged puffs, clouding the air around me. In my arms, I cradled the weight of my world - my commander, Hauptmann Müller, his body limp and heavy with pain. His eyes were closed, but I could hear the shallow rasp of each breath he took, a testament to his will to live. The forest around us was a maze of shadows and death, the trees standing like skeletal sentinels against the moonlit sky. The silence was deafening, broken only by the distant rumble of artillery and the occasional crack of a rifle shot. It was a cruel irony that the same snow that muffled our footsteps also betrayed the presence of the enemy, their tracks glowing like phosphorescent snakes in the pale light. I stumbled, my legs threatening to give way under the strain. Hauptmann Müller groaned, his hand clutching at my coat. "Keep moving, Leutnant," he whispered, his voice barely audible. "We must reach the defensive line." I nodded, gritting my teeth as I adjusted my grip on him. The pain in my arms and shoulders was a constant companion now, but it was nothing compared to the fear that gnawed at my gut. We were alone, cut off from our unit during the chaos of the American assault. The bunker we had been holed up in was now a tomb for the men who had trusted us to lead them. A flash of memory seared through my mind, taking me back to that fateful moment when it all went wrong. The thunderous roar of explosions, the earth shaking beneath our feet as if the very foundations of hell were breaking loose. The screams of men, the acrid smell of smoke and cordite, the taste of blood in my mouth. I remembered the look on Müller's face when he realized what was happening - a mixture of disbelief and determination. He had barked orders, trying to rally our troops, but it was too late. The Americans were upon us, their numbers overwhelming. I saw him take a bullet to the leg, his face contorting in agony as he fell. And then there was nothing but chaos and darkness. When I came to, the bunker was silent except for the moans of the dying. The air was thick with smoke, making it hard to breathe. I found Müller slumped against a wall, his uniform soaked in blood. He looked up at me, his eyes glassy with pain but still burning with that same fierce determination. "We have to go," he said, gripping my arm. "They're coming." And so we had fled into the night, leaving behind the wreckage of our command post and the bodies of our comrades. Now, hours later, we were still stumbling through the forest, searching for safety that seemed to recede with every step. I could feel my strength ebbing away, my legs turning to lead. But I couldn't stop, not now. Not when we were so close. The deep defensive line was our only hope, a thin red line that stood between us and annihilation. If we could just make it there... But fate is a cruel mistress, and she had other plans for us. As we staggered out of the trees and into a small clearing, I saw them - a patrol of American soldiers, their silhouettes stark against the snow. They spotted us at the same time, their rifles swinging around to point in our direction. "Halt!" one of them yelled, his voice harsh and commanding. "Hands up!" I froze, my heart pounding in my chest. Müller's grip on me tightened, his breath coming in short gasps. I knew what he was thinking - we couldn't surrender, not like this. 
We were too valuable, too close to the front lines. They would shoot us or take us prisoner, and either way, it would be a death sentence for him. "Run," he whispered, his voice barely audible. "Leave me." I shook my head, tears stinging my eyes. I couldn't do that. I wouldn't. He was my commander, my friend. I owed him everything. But before I could make a move, the air was filled with the sound of gunfire. Müller jerked in my arms, a wet gurgle escaping his lips as he slumped against me. I stared down at him in disbelief, feeling the warmth of his blood soaking through my coat. "Nein," I whispered, my voice choked with grief and rage. "Nein!" I fell to my knees, cradling his lifeless body as the world around me spun out of control. The Americans were closing in, their shouts growing louder. I knew it was over - for both of us. With a final, desperate cry, I raised my pistol and fired into the darkness, hoping to take at least one of them with me. But my aim was wild, my strength gone. The last thing I saw was a flash of light, followed by an explosion of pain in my chest. And then there was nothing but blackness. As I slipped away, I thought of home - of my mother's warm embrace, of the laughter of my little sister. And I knew that I would never see them again. This frozen hellhole would be my grave, and Hauptmann Müller's sacrifice would be forgotten by all but me. But in that moment, as death claimed me, I found a strange sort of peace. We had fought for our Fatherland until the very end, and we had done so with honor. And in the grand tapestry of history, perhaps that was all that mattered. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me a short story about the fictional character of Anton Chigurh. It should be written in third person personal and feature internal monologues of his deranged world view. The time period is around 2 years before the events in Cormac McCarthy's 2005 novel 'No Country for Old Men'. It should be set in Mexico, be *very* dark, show Anton's still developing obsession with "chance" and feature a grenade. The story should start with him waiting in a bar for instructions on the "job" he is to perform. ### ASSISTANT In the dimly lit corner of a nondescript Mexican cantina, Anton Chigurh sat motionless, his eyes scanning the room like a predator assessing its prey. The air was thick with the smell of stale cigarette smoke and cheap tequila, but he remained unfazed, his senses heightened, attuned to every sound, every movement around him. He nursed a glass of water, the condensation beading on its surface like droplets of sweat on cold steel. His mind wandered, as it often did, to the concept of chance. The randomness of life and death fascinated him, the idea that one's fate could hinge on something as simple as a coin flip or the roll of dice. He believed in the purity of such decisions, untainted by human emotion or prejudice. It was a philosophy he had embraced wholeheartedly, and it had served him well in his line of work. A fly buzzed around his head, its tiny wings beating an erratic rhythm against the stillness of the room. He watched it with mild interest, wondering if its life would end here, on this very spot, or if it would continue its meaningless existence elsewhere. The thought brought a small, cruel smile to his lips. The bartender, a grizzled old man with a face like leather, eyed him warily from across the room. 
He knew better than to engage in conversation with this stranger, who had entered his establishment an hour ago and hadn't spoken a word since. There was something about him that made the hairs on the back of his neck stand up, something cold and deadly lurking just beneath the surface. Chigurh's thoughts were interrupted by the sound of the door creaking open. A man in a cheap suit entered, looking around nervously before making his way towards him. He slid into the seat across from Chigurh, who continued to stare straight ahead, as if he hadn't noticed the new arrival. "You're late," Chigurh said finally, his voice low and gravelly. The man in the suit swallowed hard, adjusting his tie. "I apologize. Traffic." Chigurh didn't respond, his eyes still fixed on some distant point. The man fidgeted, reaching into his pocket to retrieve a small envelope. He slid it across the table, and Chigurh picked it up without looking at him, tearing it open with a single, precise movement. Inside was a photograph of a man, along with an address scrawled on a piece of paper. He studied the image for a moment, committing every detail to memory before slipping it back into the envelope and sliding it into his own pocket. "You know what needs to be done," the man in the suit said, his voice barely above a whisper. Chigurh nodded once, his mind already turning to the task at hand. He stood up, leaving a few crumpled bills on the table for the water he hadn't touched. As he walked towards the door, he could feel the weight of the grenade in his other pocket, its cool metal surface pressing against his thigh like a lover's caress. Outside, the heat of the day was oppressive, but Chigurh barely registered it as he climbed into his car and started the engine. He drove through the dusty streets, his thoughts consumed by the upcoming job. The man in the photograph had a wife, two kids. It didn't matter. They were all just pieces on a board, pawns to be moved or removed as necessary. He arrived at the address just as the sun was beginning to set, casting long shadows across the dilapidated building. He sat in his car for several minutes, watching the comings and goings of the neighborhood, waiting for the right moment. When it came, he slipped out of the vehicle and approached the door with a sense of purpose. Inside, he could hear laughter and music, the sounds of a family enjoying their evening together. He smiled to himself as he pulled the grenade from his pocket, feeling its weight in his hand. This was what he lived for: the moment when chance decided who would live and who would die. He took a deep breath, savoring the anticipation, before pulling the pin with a satisfying click. He tossed it into the room, not bothering to wait for the explosion as he turned and walked away. The screams that followed were like music to his ears, a symphony of chaos and destruction. As he drove off into the night, Chigurh couldn't help but feel a sense of satisfaction. Another job well done, another life ended by the whims of fate. He reached into his pocket and pulled out a coin, flipping it idly between his fingers as he considered his next move. Heads or tails? Life or death? It was all the same to him. The world was a cruel place, and he was just its instrument, meting out justice in the only way he knew how: with cold, calculated precision. And if some people happened to get caught in the crossfire... well, that was just the luck of the draw. 
</details>

Big thanks to: @sophosympatheia for working out the merge pattern, @Sao10K for creating Euryale and WinterGoddess, and @chargoddard for writing [Mergekit](https://github.com/arcee-ai/mergekit)!
[ "TRANSLATION" ]
[ "BEAR" ]
Non_BioNLP
croissantllm/base_175k
croissantllm
text2text-generation
[ "transformers", "pytorch", "llama", "text-generation", "legal", "code", "text-generation-inference", "art", "text2text-generation", "fr", "en", "dataset:cerebras/SlimPajama-627B", "dataset:uonlp/CulturaX", "dataset:pg19", "dataset:bigcode/starcoderdata", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,705
1,706
6
0
---
datasets:
- cerebras/SlimPajama-627B
- uonlp/CulturaX
- pg19
- bigcode/starcoderdata
language:
- fr
- en
license: mit
pipeline_tag: text2text-generation
tags:
- legal
- code
- text-generation-inference
- art
---

# CroissantLLM - Base (175k steps)

This model is part of the CroissantLLM initiative, and corresponds to the checkpoint after 175k steps (2.75T tokens).

To play with the final model, we recommend using the Chat version: https://huggingface.co/croissantllm/CroissantLLMChat-v0.1.

## Abstract
We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware. To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources. To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French Language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases, and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models, and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives. This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models.

## Citation

Our work can be cited as:

```bash
Coming soon
```

## Usage

This model is a base model, that is, it is not finetuned for Chat function and works best with few-shot prompting strategies.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "croissantllm/base_175k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

inputs = tokenizer("I am so tired I could sleep right now. -> Je suis si fatigué que je pourrais m'endormir maintenant. He is heading to the market. -> Il va au marché. We are running on the beach. ->", return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60, temperature=0.5)
print(tokenizer.decode(tokens[0]))

# remove bos token
inputs = tokenizer("Capitales: France -> Paris, Italie -> Rome, Allemagne -> Berlin, Espagne ->", return_tensors="pt", add_special_tokens=True).to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60)
print(tokenizer.decode(tokens[0]))
```
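For tasks beyond translation and completion, the same few-shot pattern carries over. Below is a minimal sketch of few-shot sentiment labeling, assuming the same checkpoint loads as above; the French review examples and the `positif`/`négatif` labels are illustrative assumptions, not taken from the original card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "croissantllm/base_175k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

# Few-shot classification: repeat a "text -> label" pattern so the base model
# continues it. The reviews and labels below are illustrative placeholders.
prompt = (
    "Avis: Ce film était magnifique. -> positif\n"
    "Avis: Le service était très lent. -> négatif\n"
    "Avis: Une expérience inoubliable. ->"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_new_tokens=4, do_sample=False)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```

Greedy decoding with a small `max_new_tokens` keeps the continuation to the short label the pattern implies.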
[ "TRANSLATION" ]
[ "CRAFT" ]
Non_BioNLP
Teradata/bge-small-en-v1.5
Teradata
feature-extraction
[ "onnx", "bert", "feature-extraction", "sentence-similarity", "mteb", "teradata", "en", "license:mit", "model-index", "region:us" ]
1,739
1,741
28
0
--- language: - en license: mit tags: - feature-extraction - sentence-similarity - mteb - onnx - teradata model-index: - name: bge-small-en-v1.5 results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 73.79104477611939 - type: ap value: 37.21923821573361 - type: f1 value: 68.0914945617093 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 92.75377499999999 - type: ap value: 89.46766124546022 - type: f1 value: 92.73884001331487 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 46.986 - type: f1 value: 46.55936786727896 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 35.846000000000004 - type: map_at_10 value: 51.388 - type: map_at_100 value: 52.132999999999996 - type: map_at_1000 value: 52.141000000000005 - type: map_at_3 value: 47.037 - type: map_at_5 value: 49.579 - type: mrr_at_1 value: 36.558 - type: mrr_at_10 value: 51.658 - type: mrr_at_100 value: 52.402 - type: mrr_at_1000 value: 52.410000000000004 - type: mrr_at_3 value: 47.345 - type: mrr_at_5 value: 49.797999999999995 - type: ndcg_at_1 value: 35.846000000000004 - type: ndcg_at_10 value: 59.550000000000004 - type: ndcg_at_100 value: 62.596 - type: ndcg_at_1000 value: 62.759 - type: ndcg_at_3 value: 50.666999999999994 - type: ndcg_at_5 value: 55.228 - type: precision_at_1 value: 35.846000000000004 - type: precision_at_10 value: 8.542 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 20.389 - type: precision_at_5 value: 14.438 - type: recall_at_1 value: 35.846000000000004 - type: recall_at_10 value: 85.42 - type: recall_at_100 value: 98.43499999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 61.166 - type: recall_at_5 value: 72.191 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 47.402770198163594 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 40.01545436974177 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 62.586465273207196 - type: mrr value: 74.42169019038825 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 85.1891186537969 - type: cos_sim_spearman value: 83.75492046087288 - type: euclidean_pearson value: 84.11766204805357 - type: euclidean_spearman value: 84.01456493126516 - type: manhattan_pearson value: 84.2132950502772 - type: manhattan_spearman value: 83.89227298813377 - task: type: Classification dataset: name: MTEB 
Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 85.74025974025975 - type: f1 value: 85.71493566466381 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 38.467181385006434 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 34.719496037339056 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 29.587000000000003 - type: map_at_10 value: 41.114 - type: map_at_100 value: 42.532 - type: map_at_1000 value: 42.661 - type: map_at_3 value: 37.483 - type: map_at_5 value: 39.652 - type: mrr_at_1 value: 36.338 - type: mrr_at_10 value: 46.763 - type: mrr_at_100 value: 47.393 - type: mrr_at_1000 value: 47.445 - type: mrr_at_3 value: 43.538 - type: mrr_at_5 value: 45.556000000000004 - type: ndcg_at_1 value: 36.338 - type: ndcg_at_10 value: 47.658 - type: ndcg_at_100 value: 52.824000000000005 - type: ndcg_at_1000 value: 54.913999999999994 - type: ndcg_at_3 value: 41.989 - type: ndcg_at_5 value: 44.944 - type: precision_at_1 value: 36.338 - type: precision_at_10 value: 9.156 - type: precision_at_100 value: 1.4789999999999999 - type: precision_at_1000 value: 0.196 - type: precision_at_3 value: 20.076 - type: precision_at_5 value: 14.85 - type: recall_at_1 value: 29.587000000000003 - type: recall_at_10 value: 60.746 - type: recall_at_100 value: 82.157 - type: recall_at_1000 value: 95.645 - type: recall_at_3 value: 44.821 - type: recall_at_5 value: 52.819 - type: map_at_1 value: 30.239 - type: map_at_10 value: 39.989000000000004 - type: map_at_100 value: 41.196 - type: map_at_1000 value: 41.325 - type: map_at_3 value: 37.261 - type: map_at_5 value: 38.833 - type: mrr_at_1 value: 37.516 - type: mrr_at_10 value: 46.177 - type: mrr_at_100 value: 46.806 - type: mrr_at_1000 value: 46.849000000000004 - type: mrr_at_3 value: 44.002 - type: mrr_at_5 value: 45.34 - type: ndcg_at_1 value: 37.516 - type: ndcg_at_10 value: 45.586 - type: ndcg_at_100 value: 49.897000000000006 - type: ndcg_at_1000 value: 51.955 - type: ndcg_at_3 value: 41.684 - type: ndcg_at_5 value: 43.617 - type: precision_at_1 value: 37.516 - type: precision_at_10 value: 8.522 - type: precision_at_100 value: 1.374 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 20.105999999999998 - type: precision_at_5 value: 14.152999999999999 - type: recall_at_1 value: 30.239 - type: recall_at_10 value: 55.03 - type: recall_at_100 value: 73.375 - type: recall_at_1000 value: 86.29599999999999 - type: recall_at_3 value: 43.269000000000005 - type: recall_at_5 value: 48.878 - type: map_at_1 value: 38.338 - type: map_at_10 value: 50.468999999999994 - type: map_at_100 value: 51.553000000000004 - type: map_at_1000 value: 51.608 - type: map_at_3 value: 47.107 - type: map_at_5 value: 49.101 - type: mrr_at_1 value: 44.201 - type: mrr_at_10 value: 54.057 - type: mrr_at_100 value: 54.764 - type: mrr_at_1000 value: 54.791000000000004 - type: mrr_at_3 value: 51.56699999999999 - type: mrr_at_5 value: 53.05 - type: ndcg_at_1 value: 44.201 - type: ndcg_at_10 value: 56.379000000000005 - type: ndcg_at_100 value: 60.645 - 
type: ndcg_at_1000 value: 61.73499999999999 - type: ndcg_at_3 value: 50.726000000000006 - type: ndcg_at_5 value: 53.58500000000001 - type: precision_at_1 value: 44.201 - type: precision_at_10 value: 9.141 - type: precision_at_100 value: 1.216 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 22.654 - type: precision_at_5 value: 15.723999999999998 - type: recall_at_1 value: 38.338 - type: recall_at_10 value: 70.30499999999999 - type: recall_at_100 value: 88.77199999999999 - type: recall_at_1000 value: 96.49799999999999 - type: recall_at_3 value: 55.218 - type: recall_at_5 value: 62.104000000000006 - type: map_at_1 value: 25.682 - type: map_at_10 value: 33.498 - type: map_at_100 value: 34.461000000000006 - type: map_at_1000 value: 34.544000000000004 - type: map_at_3 value: 30.503999999999998 - type: map_at_5 value: 32.216 - type: mrr_at_1 value: 27.683999999999997 - type: mrr_at_10 value: 35.467999999999996 - type: mrr_at_100 value: 36.32 - type: mrr_at_1000 value: 36.386 - type: mrr_at_3 value: 32.618 - type: mrr_at_5 value: 34.262 - type: ndcg_at_1 value: 27.683999999999997 - type: ndcg_at_10 value: 38.378 - type: ndcg_at_100 value: 43.288 - type: ndcg_at_1000 value: 45.413 - type: ndcg_at_3 value: 32.586 - type: ndcg_at_5 value: 35.499 - type: precision_at_1 value: 27.683999999999997 - type: precision_at_10 value: 5.864 - type: precision_at_100 value: 0.882 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 13.446 - type: precision_at_5 value: 9.718 - type: recall_at_1 value: 25.682 - type: recall_at_10 value: 51.712 - type: recall_at_100 value: 74.446 - type: recall_at_1000 value: 90.472 - type: recall_at_3 value: 36.236000000000004 - type: recall_at_5 value: 43.234 - type: map_at_1 value: 16.073999999999998 - type: map_at_10 value: 24.352999999999998 - type: map_at_100 value: 25.438 - type: map_at_1000 value: 25.545 - type: map_at_3 value: 21.614 - type: map_at_5 value: 23.104 - type: mrr_at_1 value: 19.776 - type: mrr_at_10 value: 28.837000000000003 - type: mrr_at_100 value: 29.755 - type: mrr_at_1000 value: 29.817 - type: mrr_at_3 value: 26.201999999999998 - type: mrr_at_5 value: 27.714 - type: ndcg_at_1 value: 19.776 - type: ndcg_at_10 value: 29.701 - type: ndcg_at_100 value: 35.307 - type: ndcg_at_1000 value: 37.942 - type: ndcg_at_3 value: 24.764 - type: ndcg_at_5 value: 27.025 - type: precision_at_1 value: 19.776 - type: precision_at_10 value: 5.659 - type: precision_at_100 value: 0.971 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 12.065 - type: precision_at_5 value: 8.905000000000001 - type: recall_at_1 value: 16.073999999999998 - type: recall_at_10 value: 41.647 - type: recall_at_100 value: 66.884 - type: recall_at_1000 value: 85.91499999999999 - type: recall_at_3 value: 27.916 - type: recall_at_5 value: 33.729 - type: map_at_1 value: 28.444999999999997 - type: map_at_10 value: 38.218999999999994 - type: map_at_100 value: 39.595 - type: map_at_1000 value: 39.709 - type: map_at_3 value: 35.586 - type: map_at_5 value: 36.895 - type: mrr_at_1 value: 34.841 - type: mrr_at_10 value: 44.106 - type: mrr_at_100 value: 44.98 - type: mrr_at_1000 value: 45.03 - type: mrr_at_3 value: 41.979 - type: mrr_at_5 value: 43.047999999999995 - type: ndcg_at_1 value: 34.841 - type: ndcg_at_10 value: 43.922 - type: ndcg_at_100 value: 49.504999999999995 - type: ndcg_at_1000 value: 51.675000000000004 - type: ndcg_at_3 value: 39.858 - type: ndcg_at_5 value: 41.408 - type: precision_at_1 value: 34.841 - type: precision_at_10 value: 
7.872999999999999 - type: precision_at_100 value: 1.2449999999999999 - type: precision_at_1000 value: 0.161 - type: precision_at_3 value: 18.993 - type: precision_at_5 value: 13.032 - type: recall_at_1 value: 28.444999999999997 - type: recall_at_10 value: 54.984 - type: recall_at_100 value: 78.342 - type: recall_at_1000 value: 92.77 - type: recall_at_3 value: 42.842999999999996 - type: recall_at_5 value: 47.247 - type: map_at_1 value: 23.072 - type: map_at_10 value: 32.354 - type: map_at_100 value: 33.800000000000004 - type: map_at_1000 value: 33.908 - type: map_at_3 value: 29.232000000000003 - type: map_at_5 value: 31.049 - type: mrr_at_1 value: 29.110000000000003 - type: mrr_at_10 value: 38.03 - type: mrr_at_100 value: 39.032 - type: mrr_at_1000 value: 39.086999999999996 - type: mrr_at_3 value: 35.407 - type: mrr_at_5 value: 36.76 - type: ndcg_at_1 value: 29.110000000000003 - type: ndcg_at_10 value: 38.231 - type: ndcg_at_100 value: 44.425 - type: ndcg_at_1000 value: 46.771 - type: ndcg_at_3 value: 33.095 - type: ndcg_at_5 value: 35.459 - type: precision_at_1 value: 29.110000000000003 - type: precision_at_10 value: 7.215000000000001 - type: precision_at_100 value: 1.2109999999999999 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 16.058 - type: precision_at_5 value: 11.644 - type: recall_at_1 value: 23.072 - type: recall_at_10 value: 50.285999999999994 - type: recall_at_100 value: 76.596 - type: recall_at_1000 value: 92.861 - type: recall_at_3 value: 35.702 - type: recall_at_5 value: 42.152 - type: map_at_1 value: 24.937916666666666 - type: map_at_10 value: 33.755250000000004 - type: map_at_100 value: 34.955999999999996 - type: map_at_1000 value: 35.070499999999996 - type: map_at_3 value: 30.98708333333333 - type: map_at_5 value: 32.51491666666666 - type: mrr_at_1 value: 29.48708333333333 - type: mrr_at_10 value: 37.92183333333334 - type: mrr_at_100 value: 38.76583333333333 - type: mrr_at_1000 value: 38.82466666666667 - type: mrr_at_3 value: 35.45125 - type: mrr_at_5 value: 36.827000000000005 - type: ndcg_at_1 value: 29.48708333333333 - type: ndcg_at_10 value: 39.05225 - type: ndcg_at_100 value: 44.25983333333334 - type: ndcg_at_1000 value: 46.568333333333335 - type: ndcg_at_3 value: 34.271583333333325 - type: ndcg_at_5 value: 36.483916666666666 - type: precision_at_1 value: 29.48708333333333 - type: precision_at_10 value: 6.865749999999999 - type: precision_at_100 value: 1.1195833333333332 - type: precision_at_1000 value: 0.15058333333333335 - type: precision_at_3 value: 15.742083333333333 - type: precision_at_5 value: 11.221916666666667 - type: recall_at_1 value: 24.937916666666666 - type: recall_at_10 value: 50.650416666666665 - type: recall_at_100 value: 73.55383333333334 - type: recall_at_1000 value: 89.61691666666667 - type: recall_at_3 value: 37.27808333333334 - type: recall_at_5 value: 42.99475 - type: map_at_1 value: 23.947 - type: map_at_10 value: 30.575000000000003 - type: map_at_100 value: 31.465 - type: map_at_1000 value: 31.558000000000003 - type: map_at_3 value: 28.814 - type: map_at_5 value: 29.738999999999997 - type: mrr_at_1 value: 26.994 - type: mrr_at_10 value: 33.415 - type: mrr_at_100 value: 34.18 - type: mrr_at_1000 value: 34.245 - type: mrr_at_3 value: 31.621 - type: mrr_at_5 value: 32.549 - type: ndcg_at_1 value: 26.994 - type: ndcg_at_10 value: 34.482 - type: ndcg_at_100 value: 38.915 - type: ndcg_at_1000 value: 41.355 - type: ndcg_at_3 value: 31.139 - type: ndcg_at_5 value: 32.589 - type: precision_at_1 value: 26.994 - type: precision_at_10 
value: 5.322 - type: precision_at_100 value: 0.8160000000000001 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 13.344000000000001 - type: precision_at_5 value: 8.988 - type: recall_at_1 value: 23.947 - type: recall_at_10 value: 43.647999999999996 - type: recall_at_100 value: 63.851 - type: recall_at_1000 value: 82 - type: recall_at_3 value: 34.288000000000004 - type: recall_at_5 value: 38.117000000000004 - type: map_at_1 value: 16.197 - type: map_at_10 value: 22.968 - type: map_at_100 value: 24.095 - type: map_at_1000 value: 24.217 - type: map_at_3 value: 20.771 - type: map_at_5 value: 21.995 - type: mrr_at_1 value: 19.511 - type: mrr_at_10 value: 26.55 - type: mrr_at_100 value: 27.500999999999998 - type: mrr_at_1000 value: 27.578999999999997 - type: mrr_at_3 value: 24.421 - type: mrr_at_5 value: 25.604 - type: ndcg_at_1 value: 19.511 - type: ndcg_at_10 value: 27.386 - type: ndcg_at_100 value: 32.828 - type: ndcg_at_1000 value: 35.739 - type: ndcg_at_3 value: 23.405 - type: ndcg_at_5 value: 25.255 - type: precision_at_1 value: 19.511 - type: precision_at_10 value: 5.017 - type: precision_at_100 value: 0.91 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 11.023 - type: precision_at_5 value: 8.025 - type: recall_at_1 value: 16.197 - type: recall_at_10 value: 37.09 - type: recall_at_100 value: 61.778 - type: recall_at_1000 value: 82.56599999999999 - type: recall_at_3 value: 26.034000000000002 - type: recall_at_5 value: 30.762 - type: map_at_1 value: 25.41 - type: map_at_10 value: 33.655 - type: map_at_100 value: 34.892 - type: map_at_1000 value: 34.995 - type: map_at_3 value: 30.94 - type: map_at_5 value: 32.303 - type: mrr_at_1 value: 29.477999999999998 - type: mrr_at_10 value: 37.443 - type: mrr_at_100 value: 38.383 - type: mrr_at_1000 value: 38.440000000000005 - type: mrr_at_3 value: 34.949999999999996 - type: mrr_at_5 value: 36.228 - type: ndcg_at_1 value: 29.477999999999998 - type: ndcg_at_10 value: 38.769 - type: ndcg_at_100 value: 44.245000000000005 - type: ndcg_at_1000 value: 46.593 - type: ndcg_at_3 value: 33.623 - type: ndcg_at_5 value: 35.766 - type: precision_at_1 value: 29.477999999999998 - type: precision_at_10 value: 6.455 - type: precision_at_100 value: 1.032 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 14.893999999999998 - type: precision_at_5 value: 10.485 - type: recall_at_1 value: 25.41 - type: recall_at_10 value: 50.669 - type: recall_at_100 value: 74.084 - type: recall_at_1000 value: 90.435 - type: recall_at_3 value: 36.679 - type: recall_at_5 value: 41.94 - type: map_at_1 value: 23.339 - type: map_at_10 value: 31.852000000000004 - type: map_at_100 value: 33.411 - type: map_at_1000 value: 33.62 - type: map_at_3 value: 28.929 - type: map_at_5 value: 30.542 - type: mrr_at_1 value: 28.063 - type: mrr_at_10 value: 36.301 - type: mrr_at_100 value: 37.288 - type: mrr_at_1000 value: 37.349 - type: mrr_at_3 value: 33.663 - type: mrr_at_5 value: 35.165 - type: ndcg_at_1 value: 28.063 - type: ndcg_at_10 value: 37.462 - type: ndcg_at_100 value: 43.620999999999995 - type: ndcg_at_1000 value: 46.211 - type: ndcg_at_3 value: 32.68 - type: ndcg_at_5 value: 34.981 - type: precision_at_1 value: 28.063 - type: precision_at_10 value: 7.1739999999999995 - type: precision_at_100 value: 1.486 - type: precision_at_1000 value: 0.23500000000000001 - type: precision_at_3 value: 15.217 - type: precision_at_5 value: 11.265 - type: recall_at_1 value: 23.339 - type: recall_at_10 value: 48.376999999999995 - type: 
recall_at_100 value: 76.053 - type: recall_at_1000 value: 92.455 - type: recall_at_3 value: 34.735 - type: recall_at_5 value: 40.71 - type: map_at_1 value: 18.925 - type: map_at_10 value: 26.017000000000003 - type: map_at_100 value: 27.034000000000002 - type: map_at_1000 value: 27.156000000000002 - type: map_at_3 value: 23.604 - type: map_at_5 value: 24.75 - type: mrr_at_1 value: 20.333000000000002 - type: mrr_at_10 value: 27.915 - type: mrr_at_100 value: 28.788000000000004 - type: mrr_at_1000 value: 28.877999999999997 - type: mrr_at_3 value: 25.446999999999996 - type: mrr_at_5 value: 26.648 - type: ndcg_at_1 value: 20.333000000000002 - type: ndcg_at_10 value: 30.673000000000002 - type: ndcg_at_100 value: 35.618 - type: ndcg_at_1000 value: 38.517 - type: ndcg_at_3 value: 25.71 - type: ndcg_at_5 value: 27.679 - type: precision_at_1 value: 20.333000000000002 - type: precision_at_10 value: 4.9910000000000005 - type: precision_at_100 value: 0.8130000000000001 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 11.029 - type: precision_at_5 value: 7.8740000000000006 - type: recall_at_1 value: 18.925 - type: recall_at_10 value: 43.311 - type: recall_at_100 value: 66.308 - type: recall_at_1000 value: 87.49 - type: recall_at_3 value: 29.596 - type: recall_at_5 value: 34.245 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 13.714 - type: map_at_10 value: 23.194 - type: map_at_100 value: 24.976000000000003 - type: map_at_1000 value: 25.166 - type: map_at_3 value: 19.709 - type: map_at_5 value: 21.523999999999997 - type: mrr_at_1 value: 30.619000000000003 - type: mrr_at_10 value: 42.563 - type: mrr_at_100 value: 43.386 - type: mrr_at_1000 value: 43.423 - type: mrr_at_3 value: 39.555 - type: mrr_at_5 value: 41.268 - type: ndcg_at_1 value: 30.619000000000003 - type: ndcg_at_10 value: 31.836 - type: ndcg_at_100 value: 38.652 - type: ndcg_at_1000 value: 42.088 - type: ndcg_at_3 value: 26.733 - type: ndcg_at_5 value: 28.435 - type: precision_at_1 value: 30.619000000000003 - type: precision_at_10 value: 9.751999999999999 - type: precision_at_100 value: 1.71 - type: precision_at_1000 value: 0.23500000000000001 - type: precision_at_3 value: 19.935 - type: precision_at_5 value: 14.984 - type: recall_at_1 value: 13.714 - type: recall_at_10 value: 37.26 - type: recall_at_100 value: 60.546 - type: recall_at_1000 value: 79.899 - type: recall_at_3 value: 24.325 - type: recall_at_5 value: 29.725 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.462 - type: map_at_10 value: 18.637 - type: map_at_100 value: 26.131999999999998 - type: map_at_1000 value: 27.607 - type: map_at_3 value: 13.333 - type: map_at_5 value: 15.654000000000002 - type: mrr_at_1 value: 66.25 - type: mrr_at_10 value: 74.32600000000001 - type: mrr_at_100 value: 74.60900000000001 - type: mrr_at_1000 value: 74.62 - type: mrr_at_3 value: 72.667 - type: mrr_at_5 value: 73.817 - type: ndcg_at_1 value: 53.87499999999999 - type: ndcg_at_10 value: 40.028999999999996 - type: ndcg_at_100 value: 44.199 - type: ndcg_at_1000 value: 51.629999999999995 - type: ndcg_at_3 value: 44.113 - type: ndcg_at_5 value: 41.731 - type: precision_at_1 value: 66.25 - type: precision_at_10 value: 31.900000000000002 - type: precision_at_100 value: 10.043000000000001 - type: precision_at_1000 value: 1.926 - type: precision_at_3 value: 47.417 - type: precision_at_5 
value: 40.65 - type: recall_at_1 value: 8.462 - type: recall_at_10 value: 24.293 - type: recall_at_100 value: 50.146 - type: recall_at_1000 value: 74.034 - type: recall_at_3 value: 14.967 - type: recall_at_5 value: 18.682000000000002 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 47.84499999999999 - type: f1 value: 42.48106691979349 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 74.034 - type: map_at_10 value: 82.76 - type: map_at_100 value: 82.968 - type: map_at_1000 value: 82.98299999999999 - type: map_at_3 value: 81.768 - type: map_at_5 value: 82.418 - type: mrr_at_1 value: 80.048 - type: mrr_at_10 value: 87.64999999999999 - type: mrr_at_100 value: 87.712 - type: mrr_at_1000 value: 87.713 - type: mrr_at_3 value: 87.01100000000001 - type: mrr_at_5 value: 87.466 - type: ndcg_at_1 value: 80.048 - type: ndcg_at_10 value: 86.643 - type: ndcg_at_100 value: 87.361 - type: ndcg_at_1000 value: 87.606 - type: ndcg_at_3 value: 85.137 - type: ndcg_at_5 value: 86.016 - type: precision_at_1 value: 80.048 - type: precision_at_10 value: 10.372 - type: precision_at_100 value: 1.093 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 32.638 - type: precision_at_5 value: 20.177 - type: recall_at_1 value: 74.034 - type: recall_at_10 value: 93.769 - type: recall_at_100 value: 96.569 - type: recall_at_1000 value: 98.039 - type: recall_at_3 value: 89.581 - type: recall_at_5 value: 91.906 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 20.5 - type: map_at_10 value: 32.857 - type: map_at_100 value: 34.589 - type: map_at_1000 value: 34.778 - type: map_at_3 value: 29.160999999999998 - type: map_at_5 value: 31.033 - type: mrr_at_1 value: 40.123 - type: mrr_at_10 value: 48.776 - type: mrr_at_100 value: 49.495 - type: mrr_at_1000 value: 49.539 - type: mrr_at_3 value: 46.605000000000004 - type: mrr_at_5 value: 47.654 - type: ndcg_at_1 value: 40.123 - type: ndcg_at_10 value: 40.343 - type: ndcg_at_100 value: 46.56 - type: ndcg_at_1000 value: 49.777 - type: ndcg_at_3 value: 37.322 - type: ndcg_at_5 value: 37.791000000000004 - type: precision_at_1 value: 40.123 - type: precision_at_10 value: 11.08 - type: precision_at_100 value: 1.752 - type: precision_at_1000 value: 0.232 - type: precision_at_3 value: 24.897 - type: precision_at_5 value: 17.809 - type: recall_at_1 value: 20.5 - type: recall_at_10 value: 46.388 - type: recall_at_100 value: 69.552 - type: recall_at_1000 value: 89.011 - type: recall_at_3 value: 33.617999999999995 - type: recall_at_5 value: 38.211 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 39.135999999999996 - type: map_at_10 value: 61.673 - type: map_at_100 value: 62.562 - type: map_at_1000 value: 62.62 - type: map_at_3 value: 58.467999999999996 - type: map_at_5 value: 60.463 - type: mrr_at_1 value: 78.271 - type: mrr_at_10 value: 84.119 - type: mrr_at_100 value: 84.29299999999999 - type: mrr_at_1000 value: 84.299 - type: mrr_at_3 value: 83.18900000000001 - type: mrr_at_5 value: 83.786 - type: ndcg_at_1 value: 78.271 - type: ndcg_at_10 value: 69.935 - type: ndcg_at_100 value: 73.01299999999999 - type: ndcg_at_1000 value: 74.126 - type: 
ndcg_at_3 value: 65.388 - type: ndcg_at_5 value: 67.906 - type: precision_at_1 value: 78.271 - type: precision_at_10 value: 14.562 - type: precision_at_100 value: 1.6969999999999998 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 41.841 - type: precision_at_5 value: 27.087 - type: recall_at_1 value: 39.135999999999996 - type: recall_at_10 value: 72.809 - type: recall_at_100 value: 84.86200000000001 - type: recall_at_1000 value: 92.208 - type: recall_at_3 value: 62.76199999999999 - type: recall_at_5 value: 67.718 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 90.60600000000001 - type: ap value: 86.6579587804335 - type: f1 value: 90.5938853929307 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 21.852 - type: map_at_10 value: 33.982 - type: map_at_100 value: 35.116 - type: map_at_1000 value: 35.167 - type: map_at_3 value: 30.134 - type: map_at_5 value: 32.340999999999994 - type: mrr_at_1 value: 22.479 - type: mrr_at_10 value: 34.594 - type: mrr_at_100 value: 35.672 - type: mrr_at_1000 value: 35.716 - type: mrr_at_3 value: 30.84 - type: mrr_at_5 value: 32.998 - type: ndcg_at_1 value: 22.493 - type: ndcg_at_10 value: 40.833000000000006 - type: ndcg_at_100 value: 46.357 - type: ndcg_at_1000 value: 47.637 - type: ndcg_at_3 value: 32.995999999999995 - type: ndcg_at_5 value: 36.919000000000004 - type: precision_at_1 value: 22.493 - type: precision_at_10 value: 6.465999999999999 - type: precision_at_100 value: 0.9249999999999999 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 14.030999999999999 - type: precision_at_5 value: 10.413 - type: recall_at_1 value: 21.852 - type: recall_at_10 value: 61.934999999999995 - type: recall_at_100 value: 87.611 - type: recall_at_1000 value: 97.441 - type: recall_at_3 value: 40.583999999999996 - type: recall_at_5 value: 49.992999999999995 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.36069311445507 - type: f1 value: 93.16456330371453 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 74.74692202462381 - type: f1 value: 58.17903579421599 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 74.80833893745796 - type: f1 value: 72.70786592684664 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 78.69872225958305 - type: f1 value: 78.61626934504731 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 33.058658628717694 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 
35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 30.85561739360599 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.290259910144385 - type: mrr value: 32.44223046102856 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.288 - type: map_at_10 value: 12.267999999999999 - type: map_at_100 value: 15.557000000000002 - type: map_at_1000 value: 16.98 - type: map_at_3 value: 8.866 - type: map_at_5 value: 10.418 - type: mrr_at_1 value: 43.653 - type: mrr_at_10 value: 52.681 - type: mrr_at_100 value: 53.315999999999995 - type: mrr_at_1000 value: 53.357 - type: mrr_at_3 value: 51.393 - type: mrr_at_5 value: 51.903999999999996 - type: ndcg_at_1 value: 42.415000000000006 - type: ndcg_at_10 value: 34.305 - type: ndcg_at_100 value: 30.825999999999997 - type: ndcg_at_1000 value: 39.393 - type: ndcg_at_3 value: 39.931 - type: ndcg_at_5 value: 37.519999999999996 - type: precision_at_1 value: 43.653 - type: precision_at_10 value: 25.728 - type: precision_at_100 value: 7.932 - type: precision_at_1000 value: 2.07 - type: precision_at_3 value: 38.184000000000005 - type: precision_at_5 value: 32.879000000000005 - type: recall_at_1 value: 5.288 - type: recall_at_10 value: 16.195 - type: recall_at_100 value: 31.135 - type: recall_at_1000 value: 61.531000000000006 - type: recall_at_3 value: 10.313 - type: recall_at_5 value: 12.754999999999999 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 28.216 - type: map_at_10 value: 42.588 - type: map_at_100 value: 43.702999999999996 - type: map_at_1000 value: 43.739 - type: map_at_3 value: 38.177 - type: map_at_5 value: 40.754000000000005 - type: mrr_at_1 value: 31.866 - type: mrr_at_10 value: 45.189 - type: mrr_at_100 value: 46.056000000000004 - type: mrr_at_1000 value: 46.081 - type: mrr_at_3 value: 41.526999999999994 - type: mrr_at_5 value: 43.704 - type: ndcg_at_1 value: 31.837 - type: ndcg_at_10 value: 50.178 - type: ndcg_at_100 value: 54.98800000000001 - type: ndcg_at_1000 value: 55.812 - type: ndcg_at_3 value: 41.853 - type: ndcg_at_5 value: 46.153 - type: precision_at_1 value: 31.837 - type: precision_at_10 value: 8.43 - type: precision_at_100 value: 1.1119999999999999 - type: precision_at_1000 value: 0.11900000000000001 - type: precision_at_3 value: 19.023 - type: precision_at_5 value: 13.911000000000001 - type: recall_at_1 value: 28.216 - type: recall_at_10 value: 70.8 - type: recall_at_100 value: 91.857 - type: recall_at_1000 value: 97.941 - type: recall_at_3 value: 49.196 - type: recall_at_5 value: 59.072 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 71.22800000000001 - type: map_at_10 value: 85.115 - type: map_at_100 value: 85.72 - type: map_at_1000 value: 85.737 - type: map_at_3 value: 82.149 - type: map_at_5 value: 84.029 - type: mrr_at_1 value: 81.96 - type: mrr_at_10 value: 88.00200000000001 - type: mrr_at_100 value: 88.088 - type: mrr_at_1000 value: 88.089 - type: mrr_at_3 value: 87.055 - type: mrr_at_5 value: 87.715 - type: ndcg_at_1 value: 82.01 - type: ndcg_at_10 value: 88.78 - type: ndcg_at_100 value: 89.91 - type: ndcg_at_1000 value: 90.013 - type: ndcg_at_3 value: 85.957 - type: ndcg_at_5 value: 87.56 - type: 
precision_at_1 value: 82.01 - type: precision_at_10 value: 13.462 - type: precision_at_100 value: 1.528 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.553 - type: precision_at_5 value: 24.732000000000003 - type: recall_at_1 value: 71.22800000000001 - type: recall_at_10 value: 95.69 - type: recall_at_100 value: 99.531 - type: recall_at_1000 value: 99.98 - type: recall_at_3 value: 87.632 - type: recall_at_5 value: 92.117 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 52.31768034366916 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 60.640266772723606 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.7780000000000005 - type: map_at_10 value: 12.299 - type: map_at_100 value: 14.363000000000001 - type: map_at_1000 value: 14.71 - type: map_at_3 value: 8.738999999999999 - type: map_at_5 value: 10.397 - type: mrr_at_1 value: 23.599999999999998 - type: mrr_at_10 value: 34.845 - type: mrr_at_100 value: 35.916 - type: mrr_at_1000 value: 35.973 - type: mrr_at_3 value: 31.7 - type: mrr_at_5 value: 33.535 - type: ndcg_at_1 value: 23.599999999999998 - type: ndcg_at_10 value: 20.522000000000002 - type: ndcg_at_100 value: 28.737000000000002 - type: ndcg_at_1000 value: 34.596 - type: ndcg_at_3 value: 19.542 - type: ndcg_at_5 value: 16.958000000000002 - type: precision_at_1 value: 23.599999999999998 - type: precision_at_10 value: 10.67 - type: precision_at_100 value: 2.259 - type: precision_at_1000 value: 0.367 - type: precision_at_3 value: 18.333 - type: precision_at_5 value: 14.879999999999999 - type: recall_at_1 value: 4.7780000000000005 - type: recall_at_10 value: 21.617 - type: recall_at_100 value: 45.905 - type: recall_at_1000 value: 74.42 - type: recall_at_3 value: 11.148 - type: recall_at_5 value: 15.082999999999998 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 83.22372750297885 - type: cos_sim_spearman value: 79.40972617119405 - type: euclidean_pearson value: 80.6101072020434 - type: euclidean_spearman value: 79.53844217225202 - type: manhattan_pearson value: 80.57265975286111 - type: manhattan_spearman value: 79.46335611792958 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 85.43713315520749 - type: cos_sim_spearman value: 77.44128693329532 - type: euclidean_pearson value: 81.63869928101123 - type: euclidean_spearman value: 77.29512977961515 - type: manhattan_pearson value: 81.63704185566183 - type: manhattan_spearman value: 77.29909412738657 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 81.59451537860527 - type: cos_sim_spearman value: 82.97994638856723 - type: euclidean_pearson value: 82.89478688288412 - type: euclidean_spearman value: 83.58740751053104 - type: manhattan_pearson value: 82.69140840941608 - type: manhattan_spearman value: 83.33665956040555 - 
task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 82.00756527711764 - type: cos_sim_spearman value: 81.83560996841379 - type: euclidean_pearson value: 82.07684151976518 - type: euclidean_spearman value: 82.00913052060511 - type: manhattan_pearson value: 82.05690778488794 - type: manhattan_spearman value: 82.02260252019525 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.13710262895447 - type: cos_sim_spearman value: 87.26412811156248 - type: euclidean_pearson value: 86.94151453230228 - type: euclidean_spearman value: 87.5363796699571 - type: manhattan_pearson value: 86.86989424083748 - type: manhattan_spearman value: 87.47315940781353 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 83.0230597603627 - type: cos_sim_spearman value: 84.93344499318864 - type: euclidean_pearson value: 84.23754743431141 - type: euclidean_spearman value: 85.09707376597099 - type: manhattan_pearson value: 84.04325160987763 - type: manhattan_spearman value: 84.89353071339909 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 86.75620824563921 - type: cos_sim_spearman value: 87.15065513706398 - type: euclidean_pearson value: 88.26281533633521 - type: euclidean_spearman value: 87.51963738643983 - type: manhattan_pearson value: 88.25599267618065 - type: manhattan_spearman value: 87.58048736047483 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 64.74645319195137 - type: cos_sim_spearman value: 65.29996325037214 - type: euclidean_pearson value: 67.04297794086443 - type: euclidean_spearman value: 65.43841726694343 - type: manhattan_pearson value: 67.39459955690904 - type: manhattan_spearman value: 65.92864704413651 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.31291020270801 - type: cos_sim_spearman value: 85.86473738688068 - type: euclidean_pearson value: 85.65537275064152 - type: euclidean_spearman value: 86.13087454209642 - type: manhattan_pearson value: 85.43946955047609 - type: manhattan_spearman value: 85.91568175344916 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 85.93798118350695 - type: mrr value: 95.93536274908824 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 57.594 - type: map_at_10 value: 66.81899999999999 - type: map_at_100 value: 67.368 - type: map_at_1000 value: 67.4 - type: map_at_3 value: 64.061 - type: map_at_5 value: 65.47 - type: mrr_at_1 value: 60.667 - type: mrr_at_10 value: 68.219 - type: mrr_at_100 value: 68.655 - type: mrr_at_1000 value: 68.684 - type: mrr_at_3 value: 66.22200000000001 - type: mrr_at_5 value: 
67.289 - type: ndcg_at_1 value: 60.667 - type: ndcg_at_10 value: 71.275 - type: ndcg_at_100 value: 73.642 - type: ndcg_at_1000 value: 74.373 - type: ndcg_at_3 value: 66.521 - type: ndcg_at_5 value: 68.581 - type: precision_at_1 value: 60.667 - type: precision_at_10 value: 9.433 - type: precision_at_100 value: 1.0699999999999998 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 25.556 - type: precision_at_5 value: 16.8 - type: recall_at_1 value: 57.594 - type: recall_at_10 value: 83.622 - type: recall_at_100 value: 94.167 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 70.64399999999999 - type: recall_at_5 value: 75.983 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.85841584158416 - type: cos_sim_ap value: 96.66996142314342 - type: cos_sim_f1 value: 92.83208020050125 - type: cos_sim_precision value: 93.06532663316584 - type: cos_sim_recall value: 92.60000000000001 - type: dot_accuracy value: 99.85841584158416 - type: dot_ap value: 96.6775307676576 - type: dot_f1 value: 92.69289729177312 - type: dot_precision value: 94.77533960292581 - type: dot_recall value: 90.7 - type: euclidean_accuracy value: 99.86138613861387 - type: euclidean_ap value: 96.6338454403108 - type: euclidean_f1 value: 92.92214357937311 - type: euclidean_precision value: 93.96728016359918 - type: euclidean_recall value: 91.9 - type: manhattan_accuracy value: 99.86237623762376 - type: manhattan_ap value: 96.60370449645053 - type: manhattan_f1 value: 92.91177970423253 - type: manhattan_precision value: 94.7970863683663 - type: manhattan_recall value: 91.10000000000001 - type: max_accuracy value: 99.86237623762376 - type: max_ap value: 96.6775307676576 - type: max_f1 value: 92.92214357937311 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 60.77977058695198 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.2725272535638 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 53.64052466362125 - type: mrr value: 54.533067014684654 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.677624219206578 - type: cos_sim_spearman value: 30.121368518123447 - type: dot_pearson value: 30.69870088041608 - type: dot_spearman value: 29.61284927093751 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.22 - type: map_at_10 value: 1.855 - type: map_at_100 value: 9.885 - type: map_at_1000 value: 23.416999999999998 - type: map_at_3 value: 0.637 - type: map_at_5 value: 1.024 - type: mrr_at_1 value: 88 - type: mrr_at_10 value: 93.067 - type: mrr_at_100 value: 93.067 - type: mrr_at_1000 value: 93.067 - type: mrr_at_3 value: 92.667 - type: mrr_at_5 
value: 93.067 - type: ndcg_at_1 value: 82 - type: ndcg_at_10 value: 75.899 - type: ndcg_at_100 value: 55.115 - type: ndcg_at_1000 value: 48.368 - type: ndcg_at_3 value: 79.704 - type: ndcg_at_5 value: 78.39699999999999 - type: precision_at_1 value: 88 - type: precision_at_10 value: 79.60000000000001 - type: precision_at_100 value: 56.06 - type: precision_at_1000 value: 21.206 - type: precision_at_3 value: 84.667 - type: precision_at_5 value: 83.2 - type: recall_at_1 value: 0.22 - type: recall_at_10 value: 2.078 - type: recall_at_100 value: 13.297 - type: recall_at_1000 value: 44.979 - type: recall_at_3 value: 0.6689999999999999 - type: recall_at_5 value: 1.106 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.258 - type: map_at_10 value: 10.439 - type: map_at_100 value: 16.89 - type: map_at_1000 value: 18.407999999999998 - type: map_at_3 value: 5.668 - type: map_at_5 value: 7.718 - type: mrr_at_1 value: 32.653 - type: mrr_at_10 value: 51.159 - type: mrr_at_100 value: 51.714000000000006 - type: mrr_at_1000 value: 51.714000000000006 - type: mrr_at_3 value: 47.959 - type: mrr_at_5 value: 50.407999999999994 - type: ndcg_at_1 value: 29.592000000000002 - type: ndcg_at_10 value: 26.037 - type: ndcg_at_100 value: 37.924 - type: ndcg_at_1000 value: 49.126999999999995 - type: ndcg_at_3 value: 30.631999999999998 - type: ndcg_at_5 value: 28.571 - type: precision_at_1 value: 32.653 - type: precision_at_10 value: 22.857 - type: precision_at_100 value: 7.754999999999999 - type: precision_at_1000 value: 1.529 - type: precision_at_3 value: 34.014 - type: precision_at_5 value: 29.796 - type: recall_at_1 value: 2.258 - type: recall_at_10 value: 16.554 - type: recall_at_100 value: 48.439 - type: recall_at_1000 value: 82.80499999999999 - type: recall_at_3 value: 7.283 - type: recall_at_5 value: 10.732 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 69.8858 - type: ap value: 13.835684144362109 - type: f1 value: 53.803351693244586 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 60.50650820599886 - type: f1 value: 60.84357825979259 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 48.52131044852134 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 85.59337187816654 - type: cos_sim_ap value: 73.23925826533437 - type: cos_sim_f1 value: 67.34693877551021 - type: cos_sim_precision value: 62.40432237730752 - type: cos_sim_recall value: 73.13984168865434 - type: dot_accuracy value: 85.31322644096085 - type: dot_ap value: 72.30723963807422 - type: dot_f1 value: 66.47051612112296 - type: dot_precision value: 62.0792305930845 - type: dot_recall value: 71.53034300791556 - type: euclidean_accuracy value: 85.61125350181797 - type: euclidean_ap value: 73.32843720487845 - type: euclidean_f1 value: 
67.36549633745895 - type: euclidean_precision value: 64.60755813953489 - type: euclidean_recall value: 70.36939313984169 - type: manhattan_accuracy value: 85.63509566668654 - type: manhattan_ap value: 73.16658488311325 - type: manhattan_f1 value: 67.20597386434349 - type: manhattan_precision value: 63.60424028268551 - type: manhattan_recall value: 71.2401055408971 - type: max_accuracy value: 85.63509566668654 - type: max_ap value: 73.32843720487845 - type: max_f1 value: 67.36549633745895 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.33779640625606 - type: cos_sim_ap value: 84.83868375898157 - type: cos_sim_f1 value: 77.16506154017773 - type: cos_sim_precision value: 74.62064005753327 - type: cos_sim_recall value: 79.88912842623961 - type: dot_accuracy value: 88.02732176815307 - type: dot_ap value: 83.95089283763002 - type: dot_f1 value: 76.29635101196631 - type: dot_precision value: 73.31771720613288 - type: dot_recall value: 79.52725592854944 - type: euclidean_accuracy value: 88.44452206310397 - type: euclidean_ap value: 84.98384576824827 - type: euclidean_f1 value: 77.29311047696697 - type: euclidean_precision value: 74.51232583065381 - type: euclidean_recall value: 80.28949799815214 - type: manhattan_accuracy value: 88.47362906042613 - type: manhattan_ap value: 84.91421462218432 - type: manhattan_f1 value: 77.05107637204792 - type: manhattan_precision value: 74.74484256243214 - type: manhattan_recall value: 79.50415768401602 - type: max_accuracy value: 88.47362906042613 - type: max_ap value: 84.98384576824827 - type: max_f1 value: 77.29311047696697 ---

***See Disclaimer below***

----

# A Teradata Vantage compatible Embeddings Model

# BAAI/bge-small-en-v1.5

## Overview of this Model

An Embedding Model which maps text (sentences/paragraphs) into a vector. The [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) model is well known for its effectiveness in capturing semantic meanings in text data. It's a state-of-the-art model trained on a large corpus, capable of generating high-quality text embeddings.

- 33.36M params (sizes in ONNX format - "fp32": 127.03MB, "int8": 32.4MB, "uint8": 32.4MB)
- 512 maximum input tokens
- 384 dimensions of output vector
- License: MIT. The released models can be used for commercial purposes free of charge.
- Reference to Original Model: https://huggingface.co/BAAI/bge-small-en-v1.5

## Quickstart: Deploying this Model in Teradata Vantage

We have pre-converted the model into the ONNX format compatible with BYOM 6.0, eliminating the need for manual conversion.

**Note:** Ensure you have access to a Teradata Database with BYOM 6.0 installed.

To get started, clone the pre-converted model directly from the Teradata HuggingFace repository.
```python
import teradataml as tdml
import getpass
from huggingface_hub import hf_hub_download

model_name = "bge-small-en-v1.5"
number_dimensions_output = 384
model_file_name = "model.onnx"

# Step 1: Download Model from Teradata HuggingFace Page
hf_hub_download(repo_id=f"Teradata/{model_name}", filename=f"onnx/{model_file_name}", local_dir="./")
hf_hub_download(repo_id=f"Teradata/{model_name}", filename="tokenizer.json", local_dir="./")

# Step 2: Create Connection to Vantage
tdml.create_context(host=input('enter your hostname'),
                    username=input('enter your username'),
                    password=getpass.getpass("enter your password"))

# Step 3: Load Models into Vantage
# a) Embedding model
tdml.save_byom(model_id=model_name,  # must be unique in the models table
               model_file=f"onnx/{model_file_name}",
               table_name='embeddings_models')
# b) Tokenizer
tdml.save_byom(model_id=model_name,  # must be unique in the models table
               model_file='tokenizer.json',
               table_name='embeddings_tokenizers')

# Step 4: Test ONNXEmbeddings Function
# Note that ONNXEmbeddings expects the 'payload' column to be 'txt'.
# If it has got a different name, just rename it in a subquery/CTE.
input_table = "emails.emails"
embeddings_query = f"""
SELECT *
FROM mldb.ONNXEmbeddings(
    ON {input_table} AS InputTable
    ON (SELECT * FROM embeddings_models WHERE model_id = '{model_name}') AS ModelTable DIMENSION
    ON (SELECT model AS tokenizer FROM embeddings_tokenizers WHERE model_id = '{model_name}') AS TokenizerTable DIMENSION
    USING
        Accumulate('id', 'txt')
        ModelOutputTensor('sentence_embedding')
        EnableMemoryCheck('false')
        OutputFormat('FLOAT32({number_dimensions_output})')
        OverwriteCachedModel('true')
) a
"""
DF_embeddings = tdml.DataFrame.from_query(embeddings_query)
DF_embeddings
```

## What Can I Do with the Embeddings?

Teradata Vantage includes pre-built in-database functions to process embeddings further. Explore the following examples:

- **Semantic Clustering with TD_KMeans:** [Semantic Clustering Python Notebook](https://github.com/Teradata/jupyter-demos/blob/main/UseCases/Language_Models_InVantage/Semantic_Clustering_Python.ipynb)
- **Semantic Distance with TD_VectorDistance:** [Semantic Similarity Python Notebook](https://github.com/Teradata/jupyter-demos/blob/main/UseCases/Language_Models_InVantage/Semantic_Similarity_Python.ipynb) (a minimal query sketch follows this list)
- **RAG-Based Application with TD_VectorDistance:** [RAG and Bedrock Query PDF Notebook](https://github.com/Teradata/jupyter-demos/blob/main/UseCases/Language_Models_InVantage/RAG_and_Bedrock_QueryPDF.ipynb)
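As a quick illustration of the TD_VectorDistance route, the sketch below scores a table of query embeddings against a corpus table. This is a minimal sketch, not the notebooks' exact flow: the table names (`target_embeddings`, `corpus_embeddings`) and embedding column names (`emb_0` through `emb_383`) are assumptions, and the TD_VECTORDISTANCE parameter names should be verified against the documentation for your Vantage version.

```python
import teradataml as tdml  # connection created as in the quickstart above

# Top-3 nearest corpus rows per query row, by cosine distance.
# Table and column names are assumptions; adapt them to your schema.
similarity_query = """
SELECT * FROM TD_VECTORDISTANCE (
    ON target_embeddings AS TargetTable
    ON corpus_embeddings AS ReferenceTable DIMENSION
    USING
        TargetIDColumn('id')
        TargetFeatureColumns('[emb_0:emb_383]')
        RefIDColumn('id')
        RefFeatureColumns('[emb_0:emb_383]')
        DistanceMeasure('cosine')
        TopK(3)
) AS dt
"""
DF_similarity = tdml.DataFrame.from_query(similarity_query)
DF_similarity
```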
## Deep Dive into Model Conversion to ONNX

**The steps below outline how we converted the open-source Hugging Face model into an ONNX file compatible with the in-database ONNXEmbeddings function.** You do not need to perform these steps; they are provided solely for documentation and transparency. However, they may be helpful if you wish to convert another model to the required format.

### Part 1. Importing and Converting Model using optimum

We start by importing the pre-trained [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) model from Hugging Face. To enhance performance and ensure compatibility with various execution environments, we use the [Optimum](https://github.com/huggingface/optimum) utility to convert the model into the ONNX (Open Neural Network Exchange) format. After conversion to ONNX, we fix the opset in the ONNX file for compatibility with the ONNX runtime used in Teradata Vantage, and we generate ONNX files for multiple precisions: fp32, int8, uint8.

You can find the detailed conversion steps in the file [convert.py](./convert.py).
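For orientation, a conversion along these lines might look like the sketch below. It is an illustration under assumptions, not the exact pipeline: the authoritative steps (including the int8/uint8 quantization) live in [convert.py](./convert.py), and the target opset chosen here is an assumption.

```python
# A sketch of the Part 1 flow: export with Optimum, then pin the opset.
import onnx
from onnx import version_converter
from optimum.onnxruntime import ORTModelForFeatureExtraction
from transformers import AutoTokenizer

model_id = "BAAI/bge-small-en-v1.5"

# Export the Hugging Face model to ONNX via Optimum
ort_model = ORTModelForFeatureExtraction.from_pretrained(model_id, export=True)
ort_model.save_pretrained("onnx/")
AutoTokenizer.from_pretrained(model_id).save_pretrained("onnx/")

# Convert the exported graph to a fixed opset (16 is an assumption here;
# pick whatever the target ONNX runtime supports)
onnx_model = onnx.load("onnx/model.onnx")
onnx.save(version_converter.convert_version(onnx_model, 16), "onnx/model.onnx")
```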
### Part 2. Running the model in Python with onnxruntime & compare results

Once the fixes are applied, we proceed to test the correctness of the ONNX model by calculating cosine similarity between two texts using native SentenceTransformers and ONNX runtime, comparing the results.

If the results are identical, it confirms that the ONNX model gives the same result as the native models, validating its correctness and suitability for further use in the database.

```python
import onnxruntime as rt

from sentence_transformers.util import cos_sim
from sentence_transformers import SentenceTransformer

import transformers

sentences_1 = 'How is the weather today?'
sentences_2 = 'What is the current weather like today?'

model_id = "BAAI/bge-small-en-v1.5"

# Calculate ONNX result
tokenizer = transformers.AutoTokenizer.from_pretrained(model_id)
predef_sess = rt.InferenceSession("onnx/model.onnx")

enc1 = tokenizer(sentences_1)
embeddings_1_onnx = predef_sess.run(None, {"input_ids": [enc1.input_ids],
                                           "attention_mask": [enc1.attention_mask]})

enc2 = tokenizer(sentences_2)
embeddings_2_onnx = predef_sess.run(None, {"input_ids": [enc2.input_ids],
                                           "attention_mask": [enc2.attention_mask]})

# Calculate embeddings with SentenceTransformer
model = SentenceTransformer(model_id, trust_remote_code=True)
embeddings_1_sentence_transformer = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2_sentence_transformer = model.encode(sentences_2, normalize_embeddings=True)

# Compare results
print("Cosine similarity for embeddings calculated with ONNX:" +
      str(cos_sim(embeddings_1_onnx[1][0], embeddings_2_onnx[1][0])))
print("Cosine similarity for embeddings calculated with SentenceTransformer:" +
      str(cos_sim(embeddings_1_sentence_transformer, embeddings_2_sentence_transformer)))
```

You can find the detailed ONNX vs. SentenceTransformer result comparison steps in the file [test_local.py](./test_local.py).

-----

DISCLAIMER: The content herein (“Content”) is provided “AS IS” and is not covered by any Teradata Operations, Inc. and its affiliates (“Teradata”) agreements. Its listing here does not constitute certification or endorsement by Teradata.

To the extent any of the Content contains or is related to any artificial intelligence (“AI”) or other language learning models (“Models”) that interoperate with the products and services of Teradata, by accessing, bringing, deploying or using such Models, you acknowledge and agree that you are solely responsible for ensuring compliance with all applicable laws, regulations, and restrictions governing the use, deployment, and distribution of AI technologies. This includes, but is not limited to, AI Diffusion Rules, European Union AI Act, AI-related laws and regulations, privacy laws, export controls, and financial or sector-specific regulations.

While Teradata may provide support, guidance, or assistance in the deployment or implementation of Models to interoperate with Teradata’s products and/or services, you remain fully responsible for ensuring that your Models, data, and applications comply with all relevant legal and regulatory obligations. Our assistance does not constitute legal or regulatory approval, and Teradata disclaims any liability arising from non-compliance with applicable laws.

You must determine the suitability of the Models for any purpose. Given the probabilistic nature of machine learning and modeling, the use of the Models may in some situations result in incorrect output that does not accurately reflect the action generated. You should evaluate the accuracy of any output as appropriate for your use case, including by using human review of the output.
[ "SEMANTIC_SIMILARITY", "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-65992666
fine-tuned
feature-extraction
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "en", "dataset:fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-65992666", "dataset:allenai/c4", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,716
1,716
6
0
---
datasets:
- fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-65992666
- allenai/c4
language:
- en
license: apache-2.0
pipeline_tag: feature-extraction
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
---

This model is a fine-tuned version of [**BAAI/bge-large-en-v1.5**](https://huggingface.co/BAAI/bge-large-en-v1.5) designed for the following use case: None

## How to Use

This model can be easily integrated into your NLP pipeline for tasks such as text classification, sentiment analysis, entity recognition, and more. Here's a simple example to get you started:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

model = SentenceTransformer(
    'fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-65992666',
    trust_remote_code=True
)
embeddings = model.encode([
    'first text to embed',
    'second text to embed'
])
print(cos_sim(embeddings[0], embeddings[1]))
```
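Beyond pairwise similarity, the same embeddings support simple retrieval. The sketch below uses `sentence_transformers.util.semantic_search`; the corpus and query strings are illustrative placeholders, not part of the original card.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import semantic_search

model = SentenceTransformer(
    'fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-65992666',
    trust_remote_code=True
)

# Placeholder corpus and query; substitute your own documents.
corpus = [
    'first candidate document',
    'second candidate document',
    'third candidate document',
]
query = 'text to search the corpus with'

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Top-2 corpus entries by cosine similarity
hits = semantic_search(query_embedding, corpus_embeddings, top_k=2)
print(hits[0])  # a list of {'corpus_id': ..., 'score': ...} dicts
```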
[ "TEXT_CLASSIFICATION" ]
[ "SCIFACT" ]
Non_BioNLP
legalvn/paraphrase-multilingual-MiniLM-L12-v2-vn-170000
legalvn
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:651725", "loss:SoftmaxLoss", "arxiv:1908.10084", "base_model:sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2", "base_model:finetune:sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,733
1,733
6
0
--- base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:651725 - loss:SoftmaxLoss widget: - source_sentence: Nguyên tắc áp dụng phụ cấp ưu đãi nghề y tế thế nào? sentences: - Chu kỳ kiểm định chất lượng giáo dục nghề nghiệp\n...\n2. Trường hợp cơ sở giáo dục nghề nghiệp có ngành, nghề trọng điểm; chương trình đào tạo ngành, nghề trọng điểm; cơ sở giáo dục nghề nghiệp và chương trình đào tạo các ngành, nghề phục vụ yêu cầu công tác quản lý nhà nước phải thực hiện kiểm định chất lượng giáo dục nghề nghiệp theo quy định tại điểm d khoản 3 Điều 65 của Luật Giáo dục nghề nghiệp số 74/2014/QH13 ngày 27 tháng 11 năm 2014 nhưng không đạt tiêu chuẩn kiểm định chất lượng giáo dục nghề nghiệp thì trong thời hạn 03 năm phải thực hiện kiểm định lại. - Vệ sinh môi trường, vệ sinh tòa nhà\n1. Trách nhiệm của các đơn vị, cán bộ, công chức, viên chức, nhân viên và người lao động trong việc giữ gìn vệ sinh tại nơi làm việc và khu vực công cộng:\na) Hàng ngày tự vệ sinh sàn nhà, bàn ghế, tủ, các thiết bị được trang cấp và tổng vệ sinh phòng làm việc vào chiều thứ Sáu hàng tuần;\nb) Có trách nhiệm thu gom rác thải trong phòng chuyển ra thùng rác đặt tại các hành lang;\nc) Không đổ nước chè, cà phê, ….. xuống sàn nhà, hành lang, tường nhà và khu vệ sinh;\nd) Nghiêm cấp hút thuốc lá trong phòng làm việc, phòng họp, cầu thang máy, cầu thang bộ, tầng hầm;\nđ) Không khạc nhổ, bôi bẩn lên tường, không vứt rác thải, gạt tàn thuốc lá, đầu lọc thuốc lá xuống sàn nhà và các khu vực công cộng;\ne) Nghiêm cấm hái hoa, bẻ cành, dẫm lên thảm cỏ, nhổ cây trong khuôn viên cơ quan.\ng) Nghiêm cấm mang chất độc hại vào cơ quan.\n… - Nguyên tắc áp dụng\n1. Trường hợp công chức, viên chức chuyên môn y tế thuộc đối tượng được hưởng các mức phụ cấp ưu đãi theo nghề khác nhau thì được hưởng một mức phụ cấp ưu đãi theo nghề cao nhất.\n2. Công chức, viên chức đã hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch số 06/2010/TTLT-BYT-BNV-BTC ngày 22/3/2010 của Bộ Y tế, Bộ Nội vụ, Bộ Tài chính hướng dẫn thực hiện Nghị định số 64/2009/NĐ-CP ngày 30/7/2009 của Chính phủ về chính sách đối với cán bộ, viên chức y tế công tác ở vùng có điều kiện kinh tế - xã hội đặc biệt khó khăn thì không hưởng phụ cấp ưu đãi theo nghề quy định tại Thông tư liên tịch này. - source_sentence: Số lượng thành viên Hội đồng khoa học và đào tạo là bao nhiêu? sentences: - 'Cấp Giấy chứng nhận chất lượng an toàn kỹ thuật và bảo vệ môi trường trong sản xuất, lắp ráp ô tô, rơ moóc và sơ mi rơ moóc\n2.1. 
Trình tự thực hiện:\na) Nộp hồ sơ TTHC:\n- Cơ sở sản xuất lập hồ sơ kiểm tra xe cơ giới theo quy định và nộp đến Cục Đăng kiểm Việt Nam.\nb) Giải quyết TTHC:\n- Cục Đăng kiểm Việt Nam tiếp nhận và kiểm tra thành phần hồ sơ kiểm tra xe cơ giới: nếu hồ sơ không đầy đủ theo quy định thì hướng dẫn Cơ sở sản xuất hoàn thiện lại; Nếu hồ sơ đầy đủ theo quy định thì thống nhất về thời gian và địa điểm thực hiện đánh giá điều kiện kiểm tra chất lượng sản phẩm tại Cơ sở sản xuất;\n- Cục Đăng kiểm Việt Nam tiến hành kiểm tra nội dung hồ sơ và thực hiện đánh giá điều kiện kiểm tra chất lượng sản phẩm tại Cơ sở sản xuất theo quy định: Nếu chưa đạt yêu cầu thì thông báo để Cơ sở sản xuất hoàn thiện lại; Nếu đạt yêu cầu thì cấp Giấy chứng nhận trong thời hạn 03 ngày làm việc kể từ ngày kết thúc kiểm tra, đánh giá hồ sơ đầy đủ, hợp lệ theo quy định và có kết quả đánh giá COP đạt yêu cầu;\n- Cơ sở sản xuất nộp hồ sơ kiểm tra xe cơ giới và nhận kết quả trực tiếp tại trụ sở Cục Đăng kiểm Việt Nam hoặc qua hệ thống bưu chính hoặc qua hệ thống dịch vụ công trực tuyến hoặc qua hình thức phù hợp khác.\n...' - Phiên họp Hội đồng khoa học\n1. Hội đồng khoa học họp định kỳ 06 tháng/01 lần. Các phiên họp định kỳ phải có ít nhất 2/3 tổng số thành viên của Hội đồng khoa học tham dự.\n2. Phiên họp đột xuất của Hội đồng khoa học được triệu tập theo quyết định của Chủ tịch và phải có trên 1/2 số thành viên của Hội đồng khoa học tham dự.\n3. Viện trưởng VKSND tối cao tham dự phiên họp của Hội đồng khoa học khi thấy cần thiết.\n4. Tùy thuộc vào nội dung chương trình phiên họp, Chủ tịch Hội đồng khoa học có thể quyết định mời các nhà khoa học trong và ngoài ngành KSND tham gia phiên họp.\n5. Nội dung phiên họp, các tài liệu liên quan đến phiên họp của Hội đồng khoa học phải được thông báo hoặc chuyển cho các Thành viên chậm nhất là 3 ngày làm việc trước ngày họp, trừ trường hợp đột xuất.\n6. Hội đồng khoa học thảo luận dân chủ, tập thể, công khai, quyết định theo đa số về những vấn đề thuộc nội dung phiên họp và những vấn đề do Chủ tịch Hội đồng khoa học nêu ra hoặc do các Thành viên đề nghị và được Chủ tịch Hội đồng khoa học chấp thuận.\nChủ tịch Hội đồng khoa học chủ trì thảo luận và kết luận tại phiên họp. Đối với những vấn đề phức tạp còn nhiều ý kiến khác nhau, Hội đồng khoa học tiến hành biểu quyết. Những vấn đề được biểu quyết đạt trên 2/3 số phiếu của thành viên có mặt hoặc trên 50% tổng số thành viên Hội đồng được coi là ý kiến chính thức của Hội đồng khoa học. Các ý kiến khác được bảo lưu, ghi vào biên bản cuộc họp. - Hồ sơ, thủ tục công nhận liệt sĩ\n1. Người khi hy sinh đang thuộc quân đội, công an quản lý thì Bộ Quốc phòng, Bộ Công an chịu trách nhiệm:\na) Hướng dẫn về quy trình lập hồ sơ đề nghị công nhận liệt sĩ theo quy định.\nb) Có văn bản đề nghị kèm hồ sơ gửi Bộ Lao động - Thương binh và Xã hội thẩm định trong thời gian không quá 50 ngày kể từ ngày cơ quan, đơn vị trực tiếp quản lý người hy sinh xác lập, hoàn thiện các giấy tờ quy định tại Điều 17 Nghị định này. - source_sentence: Ban Tài chính Văn phòng Kiểm toán nhà nước thực hiện những chức năng gì? sentences: - 'Tiếp nhận hồ sơ và trả kết quả\n...\n2.2.4. Lao động nam hoặc người chồng của lao động nữ mang thai hộ nghỉ việc khi vợ sinh con: Bản sao giấy chứng sinh hoặc bản sao giấy khai sinh hoặc trích lục khai sinh của con; trường hợp sinh con phải phẫu thuật hoặc sinh con dưới 32 tuần tuổi mà giấy chứng sinh không thể hiện thì có thêm giấy tờ của cơ sở khám bệnh, chữa bệnh thể hiện việc sinh con phải phẫu thuật, sinh con dưới 32 tuần tuổi. 
Trường hợp con chết sau khi sinh mà chưa được cấp giấy chứng sinh thì thay bằng trích sao hoặc tóm tắt hồ sơ bệnh án hoặc giấy ra viện của người mẹ hoặc của lao động nữ mang thai hộ thể hiện con chết…' - Việc tự giám sát chất lượng dịch vụ viễn thông của doanh nghiệp viễn thông\n1. Các doanh nghiệp viễn thông được Bộ Thông tin và Truyền thông cấp giấy phép kinh doanh dịch vụ viễn thông phải thường xuyên tự giám sát chất lượng dịch vụ đối với tất cả các dịch vụ thuộc “Danh mục dịch vụ viễn thông bắt buộc quản lý chất lượng” mà mình cung cấp.\n2. Trong trường hợp dịch vụ mà mình cung cấp có sự cố thì doanh nghiệp viễn thông phải thực hiện báo cáo đột xuất như quy định tại Khoản 3 Điều 8 của Thông tư này. - Cục Quản lý, giám sát bảo hiểm; Cục Quản lý Công sản; Cục Quản lý Giá; Cục Quản lý Nợ và Tài chính đối ngoại; Cục Quản lý, giám sát Kế toán, Kiểm toán; Cục Quản lý Công sản; Cục Tài chính doanh nghiệp và Vụ Tài chính ngân hàng chủ trì phối hợp với Cục Tin học & Thống kê Tài chính xây dựng quy trình điện tử từng thủ tục hành chính theo phạm vi quản lý đối với danh mục thủ tục hành chính để thực hiện tích hợp trên Hệ thống thông tin Một cửa điện tử của Bộ Tài chính. - source_sentence: Điều kiện để Giám đốc Học viện An ninh nhân dân được thăng cấp bậc hàm trước thời hạn như thế nào? sentences: - Mức độ tự chủ và trách nhiệm\n- Có ý thức và tác phong nghề nghiệp đúng chuẩn mực, có năng lực thực hiện công việc được giao; phương pháp làm việc khoa học, biết phân tích và giải quyết các vấn đề mới về lĩnh vực chuyên môn nghề;\n- Gắn bó nghề nghiệp; nghiêm chỉnh chấp hành quy chế, quy định của cơ quan, doanh nghiệp, nơi đang công tác với ý thức tổ chức kỉ luật và tinh thần trách nhiệm cao trong công việc;\n- Lập được các biện pháp an toàn và đảm bảo an toàn, vệ sinh lao động trong quá trình làm việc; có ý thức trách nhiệm công dân, thái độ và đạo đức nghề nghiệp đúng đắn, sẵn sàng nhận nhiệm vụ; tự tin, cầu tiến trong công việc; hợp tác, thân thiện, khiêm tốn trong các mối quan hệ;\n- Tự chịu trách nhiệm về chất lượng đối với kết quả công việc, sản phẩm do mình đảm nhiệm theo các tiêu chuẩn và chịu một phần trách nhiệm đối với kết quả công việc, sản phẩm của tổ, nhóm; - Tổ chức bộ máy\n...\n5. Tổng cục Hải quan có thể biệt phái công chức từ các đơn vị thuộc và trực thuộc Tổng cục để bổ sung cán bộ chủ chốt, cán bộ kỹ thuật có năng lực, kinh nghiệm cho Ban Quản lý dự án đầu tư xây dựng chuyên ngành của Tổng cục Hải quan. Thời hạn biệt phái các công chức không quá 03 năm, trường hợp quá 03 năm mà chưa hoàn thành dự án thì Tổng cục Hải quan xem xét quyết định bổ sung thời gian biệt phái.\nNhân sự tuyển dụng mới của Ban Quản lý dự án đầu tư xây dựng chuyên ngành của Tổng cục Hải quan là viên chức hoặc hợp đồng lao động, thực hiện theo quy định về chế độ tiền lương và các chế độ, chính sách đối với viên chức và người lao động.\n... - Biệt phái công chức\n...\n6. Không thực hiện biệt phái công chức nữ đang mang thai hoặc nuôi con dưới 36 tháng tuổi. - source_sentence: Thời điểm đánh giá và xếp loại chất lượng hằng năm của công chức, viên chức thuộc Bộ Tài chính được diễn ra trong thời gian nào? sentences: - Nhiệm vụ của giáo viên\n1. Thực hiện nhiệm vụ tổ chức các hoạt động dạy học, giáo dục theo kế hoạch giáo dục của nhà trường và kế hoạch giáo dục của tổ chuyên môn; quản lý học sinh trong các hoạt động giáo dục do nhà trường tổ chức; tham gia các hoạt động chuyên môn; chịu trách nhiệm về chất lượng, hiệu quả giáo dục.\n2. 
Trau dồi đạo đức, nêu cao tinh thần trách nhiệm, giữ gìn phẩm chất, danh dự, uy tín của nhà giáo; gương mẫu trước học sinh; thương yêu, đối xử công bằng và tôn trọng nhân cách của học sinh; bảo vệ các quyền và lợi ích chính đáng của học sinh; đoàn kết, giúp đỡ đồng nghiệp.\n3. Học tập, rèn luyện để nâng cao sức khỏe, trình độ chính trị, chuyên môn, nghiệp vụ, đổi mới phương pháp dạy học, giáo dục.\n4. Tham gia tập huấn, bồi dưỡng chuyên môn, nghiệp vụ.\n5. Tham gia công tác phổ cập giáo dục trung học cơ sở ở địa phương.\n6. Thực hiện nghĩa vụ công dân, các quy định của pháp luật và của ngành Giáo dục, các quyết định của hiệu trưởng; thực hiện nhiệm vụ do hiệu trưởng phân công, chịu sự kiểm tra, đánh giá của hiệu trưởng và các cấp quản lý giáo dục.\n7. Phối hợp với Đội Thiếu niên Tiền phong Hồ Chí Minh, Đoàn Thanh niên Cộng sản Hồ Chí Minh, Hội Liên hiệp Thanh niên Việt Nam, gia đình học sinh và các tổ chức xã hội liên quan để tổ chức hoạt động giáo dục.\n8. Thực hiện các nhiệm vụ khác theo quy định của pháp luật. - “Điều 1. Danh mục trang thiết bị y tế phục vụ phòng, chống dịch COVID-19 trong trường hợp cấp bách theo quy định tại khoản 3 Điều 29 Nghị định số 98/2021/NĐ-CP ngày 08 tháng 11 năm 2021 của Chính phủ về quản lý trang thiết bị y tế \n1. Máy PCR. \n2. Hóa chất (sinh phẩm) chạy máy PCR xét nghiệm SARS-CoV-2. \n3. Test kít xét nghiệm nhanh kháng nguyên/ kháng thể kháng SARS-CoV-2. \n4. Máy thở chức năng cao, máy thở xâm nhập và không xâm nhập, máy thở không xâm nhập, máy oxy dòng cao, máy thở xách tay. \n5. Máy lọc máu liên tục. \n6. Máy X-Quang di động. \n7. Máy đo khí máu (đo được điện giải, lactat, hematocrite). \n8. Máy theo dõi bệnh nhân>5 thông số. \n9. Bơm tiêm điện; Bơm truyền dịch. \n10. Máy phá rung tim có tạo nhịp. \n11. Máy đo thời gian đông máu. \n12. Máy đo huyết động.” - Thời điểm đánh giá xếp loại chất lượng hằng năm\n...\n2. Căn cứ tình hình thực tiễn của cơ quan, tổ chức, đơn vị, tập thể lãnh đạo cơ quan, tổ chức, đơn vị thống nhất với cấp ủy cùng cấp về việc kết hợp tổ chức cuộc họp đánh giá, xếp loại chất lượng công chức, viên chức và xếp loại đảng viên trong tổ chức, đơn vị mình, bảo đảm nghiêm túc, hiệu quả, tránh hình thức, lãng phí.\n3. Tại thời điểm đánh giá, xếp loại chất lượng, trường hợp vắng mặt có lý do chính đáng hoặc nghỉ ốm, nghỉ chế độ thai sản theo quy định của pháp luật, công chức, viên chức có trách nhiệm làm báo cáo tại Phiếu đánh giá, xếp loại chất lượng theo chức trách, nhiệm vụ được giao, gửi cơ quan, tổ chức, đơn vị đang công tác để thực hiện việc đánh giá, xếp loại chất lượng theo quy định của pháp luật và Quy chế này. --- # SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
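As a quick illustration of the paraphrase-mining use just mentioned, here is a minimal sketch relying on the `paraphrase_mining` helper from sentence-transformers; the input sentences are illustrative stand-ins (adapted from the widget examples above), not a real corpus.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import paraphrase_mining

model = SentenceTransformer("legalvn/paraphrase-multilingual-MiniLM-L12-v2-vn-170000")

# Placeholder sentences; in practice these would be legal questions/passages.
sentences = [
    "Thời điểm đánh giá chất lượng công chức là khi nào?",
    "Khi nào thì đánh giá, xếp loại chất lượng công chức?",
    "Danh mục trang thiết bị y tế gồm những gì?",
]

# Returns [cosine_score, i, j] triples, highest-scoring pairs first.
pairs = paraphrase_mining(model, sentences)
for score, i, j in pairs:
    print(f"{score:.3f}  {sentences[i]}  <->  {sentences[j]}")
```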
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) <!-- at revision 8d6b950845285729817bf8e1af1861502c2fed0c --> - **Maximum Sequence Length:** 128 tokens - **Output Dimensionality:** 384 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("legalvn/paraphrase-multilingual-MiniLM-L12-v2-vn-170000") # Run inference sentences = [ 'Thời điểm đánh giá và xếp loại chất lượng hằng năm của công chức, viên chức thuộc Bộ Tài chính được diễn ra trong thời gian nào?', 'Thời điểm đánh giá xếp loại chất lượng hằng năm\\n...\\n2. Căn cứ tình hình thực tiễn của cơ quan, tổ chức, đơn vị, tập thể lãnh đạo cơ quan, tổ chức, đơn vị thống nhất với cấp ủy cùng cấp về việc kết hợp tổ chức cuộc họp đánh giá, xếp loại chất lượng công chức, viên chức và xếp loại đảng viên trong tổ chức, đơn vị mình, bảo đảm nghiêm túc, hiệu quả, tránh hình thức, lãng phí.\\n3. Tại thời điểm đánh giá, xếp loại chất lượng, trường hợp vắng mặt có lý do chính đáng hoặc nghỉ ốm, nghỉ chế độ thai sản theo quy định của pháp luật, công chức, viên chức có trách nhiệm làm báo cáo tại Phiếu đánh giá, xếp loại chất lượng theo chức trách, nhiệm vụ được giao, gửi cơ quan, tổ chức, đơn vị đang công tác để thực hiện việc đánh giá, xếp loại chất lượng theo quy định của pháp luật và Quy chế này.', '“Điều 1. Danh mục trang thiết bị y tế phục vụ phòng, chống dịch COVID-19 trong trường hợp cấp bách theo quy định tại khoản 3 Điều 29 Nghị định số 98/2021/NĐ-CP ngày 08 tháng 11 năm 2021 của Chính phủ về quản lý trang thiết bị y tế \\n1. Máy PCR. \\n2. Hóa chất (sinh phẩm) chạy máy PCR xét nghiệm SARS-CoV-2. \\n3. Test kít xét nghiệm nhanh kháng nguyên/ kháng thể kháng SARS-CoV-2. \\n4. Máy thở chức năng cao, máy thở xâm nhập và không xâm nhập, máy thở không xâm nhập, máy oxy dòng cao, máy thở xách tay. \\n5. Máy lọc máu liên tục. \\n6. Máy X-Quang di động. \\n7. Máy đo khí máu (đo được điện giải, lactat, hematocrite). \\n8. Máy theo dõi bệnh nhân>5 thông số. \\n9. Bơm tiêm điện; Bơm truyền dịch. \\n10. Máy phá rung tim có tạo nhịp. \\n11. Máy đo thời gian đông máu. \\n12. 
Máy đo huyết động.”', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 651,725 training samples * Columns: <code>queries</code>, <code>corpus</code>, and <code>score</code> * Approximate statistics based on the first 1000 samples: | | queries | corpus | score | |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 9 tokens</li><li>mean: 24.71 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 121.6 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>0: ~43.80%</li><li>1: ~37.00%</li><li>2: ~19.20%</li></ul> | * Samples: | queries | corpus | score | |:------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | <code>Người học ngành quản lý khai thác công trình thủy lợi trình độ cao đẳng phải có khả năng học tập và nâng cao trình độ như thế nào?</code> | <code>Khả năng học tập, nâng cao trình độ\n- Khối lượng khối lượng kiến thức tối thiểu, yêu cầu về năng lực mà người học phải đạt được sau khi tốt nghiệp ngành, nghề Dược trình độ cao đẳng có thể tiếp tục phát triển ở các trình độ cao hơn;\n- Người học sau tốt nghiệp có năng lực tự học, tự cập nhật những tiến bộ khoa học công nghệ trong phạm vi ngành, nghề để nâng cao trình độ hoặc học liên thông lên trình độ cao hơn trong cùng ngành nghề hoặc trong nhóm ngành, nghề hoặc trong cùng lĩnh vực đào tạo.</code> | <code>2</code> | | <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật được quy định thế nào?</code> | <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật\nTrong phạm vi điều chỉnh 
của văn bản quy phạm pháp luật:\n1. Xác định nội dung liên quan đến vấn đề bình đẳng giới hoặc vấn đề bất bình đẳng giới, phân biệt đối xử về giới.\n2. Quy định các biện pháp cần thiết để thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới; dự báo tác động của các quy định đó đối với nam và nữ sau khi được ban hành.\n3. Xác định nguồn nhân lực, tài chính cần thiết để triển khai các biện pháp thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới.</code> | <code>2</code> | | <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật được quy định thế nào?</code> | <code>Mục đích lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật\nLồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật (sau đây gọi tắt là văn bản) là một biện pháp để thực hiện mục tiêu bình đẳng giới, xóa bỏ phân biệt đối xử về giới, bảo đảm quyền, lợi ích hợp pháp, phù hợp với đặc thù của mỗi giới; tạo cơ hội phát triển như nhau cho nam và nữ trong các lĩnh vực của đời sống xã hội và gia đình; bảo đảm bình đẳng giới thực chất giữa nam và nữ.</code> | <code>1</code> | * Loss: [<code>SoftmaxLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) ### Training Hyperparameters #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: no - `prediction_loss_only`: True - `per_device_train_batch_size`: 8 - `per_device_eval_batch_size`: 8 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 3.0 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.0 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - 
`ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `eval_use_gather_object`: False - `prompts`: None - `batch_sampler`: batch_sampler - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs <details><summary>Click to expand</summary> | Epoch | Step | Training Loss | |:------:|:------:|:-------------:| | 0.0061 | 500 | 1.0473 | | 0.0123 | 1000 | 1.0447 | | 0.0184 | 1500 | 1.0383 | | 0.0246 | 2000 | 1.0395 | | 0.0307 | 2500 | 1.0436 | | 0.0368 | 3000 | 1.0375 | | 0.0430 | 3500 | 1.0189 | | 0.0491 | 4000 | 1.0282 | | 0.0552 | 4500 | 1.0355 | | 0.0614 | 5000 | 1.0286 | | 0.0675 | 5500 | 1.0264 | | 0.0737 | 6000 | 1.0174 | | 0.0798 | 6500 | 1.0238 | | 0.0859 | 7000 | 1.0217 | | 0.0921 | 7500 | 1.0203 | | 0.0982 | 8000 | 1.0201 | | 0.1043 | 8500 | 1.0266 | | 0.1105 | 9000 | 1.0379 | | 0.1166 | 9500 | 1.0367 | | 0.1228 | 10000 | 1.0384 | | 0.1289 | 10500 | 1.0291 | | 0.1350 | 11000 | 1.0362 | | 0.1412 | 11500 | 1.0354 | | 0.1473 | 12000 | 1.0204 | | 0.1534 | 12500 | 1.0401 | | 0.1596 | 13000 | 1.0237 | | 0.1657 | 13500 | 1.0271 | | 0.1719 | 14000 | 1.0235 | | 0.1780 | 14500 | 1.0329 | | 0.1841 | 15000 | 1.0474 | | 0.1903 | 15500 | 1.0547 | | 0.1964 | 16000 | 1.0557 | | 0.2025 | 16500 | 1.0626 | | 0.2087 | 17000 | 1.0551 | | 0.2148 | 17500 | 1.0526 | | 0.2210 | 18000 | 1.125 | | 0.2271 | 18500 | 1.2996 | | 0.2332 | 19000 | 1.0703 | | 0.2394 | 19500 | 1.0601 | | 0.2455 | 20000 | 1.0835 | | 0.2516 | 20500 | 1.0583 | | 0.2578 | 21000 | 1.141 | | 0.2639 | 21500 | 1.0802 | | 0.2701 | 22000 | 1.0589 | | 0.2762 | 22500 | 1.086 | | 0.2823 | 23000 | 1.0743 | | 0.2885 | 23500 | 1.0605 | | 0.2946 | 24000 | 1.0602 | | 0.3007 | 24500 | 1.0732 | | 0.3069 | 25000 | 1.0614 | | 0.3130 | 25500 | 1.0666 | | 0.3192 | 26000 | 1.0669 | | 0.3253 | 26500 | 1.0627 | | 0.3314 | 27000 | 1.0659 | | 0.3376 | 27500 | 1.07 | | 0.3437 | 28000 | 1.0783 | | 0.3498 | 28500 | 1.078 | | 0.3560 | 29000 | 1.0832 | | 0.3621 | 29500 | 1.0695 | | 0.3683 | 30000 | 1.0714 | | 0.3744 | 30500 | 1.3794 | | 0.3805 | 31000 | 1.0838 | | 0.3867 | 31500 | 1.0541 | | 0.3928 | 32000 | 1.0799 | | 0.3989 | 32500 | 1.0622 | | 0.4051 | 33000 | 1.0597 | | 0.4112 | 33500 | 1.0731 | | 0.4174 | 34000 | 1.0871 | | 0.4235 | 34500 | 1.0535 | | 0.4296 | 35000 | 1.3215 | | 0.4358 | 35500 | 1.1501 | | 0.4419 | 36000 | 1.1088 | | 0.4480 | 36500 | 1.0844 | | 0.4542 | 37000 | 1.0981 | | 0.4603 | 37500 | 1.0856 | | 0.4665 | 38000 | 1.0956 | | 0.4726 | 38500 | 1.0813 | | 0.4787 | 39000 | 1.0843 | | 0.4849 | 39500 | 1.1053 | | 0.4910 | 40000 | 1.092 
| | 0.4971 | 40500 | 1.081 | | 0.5033 | 41000 | 1.0919 | | 0.5094 | 41500 | 1.0681 | | 0.5156 | 42000 | 1.0826 | | 0.5217 | 42500 | 1.0809 | | 0.5278 | 43000 | 1.093 | | 0.5340 | 43500 | 1.0709 | | 0.5401 | 44000 | 1.0623 | | 0.5462 | 44500 | 1.0801 | | 0.5524 | 45000 | 1.0833 | | 0.5585 | 45500 | 1.0816 | | 0.5647 | 46000 | 1.0697 | | 0.5708 | 46500 | 1.0864 | | 0.5769 | 47000 | 1.0744 | | 0.5831 | 47500 | 1.0897 | | 0.5892 | 48000 | 1.0727 | | 0.5953 | 48500 | 1.0621 | | 0.6015 | 49000 | 1.0582 | | 0.6076 | 49500 | 1.0681 | | 0.6138 | 50000 | 1.083 | | 0.6199 | 50500 | 1.0632 | | 0.6260 | 51000 | 1.0809 | | 0.6322 | 51500 | 1.0525 | | 0.6383 | 52000 | 1.6649 | | 0.6444 | 52500 | 1.0873 | | 0.6506 | 53000 | 1.0649 | | 0.6567 | 53500 | 1.0591 | | 0.6629 | 54000 | 1.061 | | 0.6690 | 54500 | 1.0682 | | 0.6751 | 55000 | 1.0616 | | 0.6813 | 55500 | 1.0827 | | 0.6874 | 56000 | 1.0799 | | 0.6935 | 56500 | 1.0705 | | 0.6997 | 57000 | 1.0821 | | 0.7058 | 57500 | 1.0763 | | 0.7120 | 58000 | 1.0842 | | 0.7181 | 58500 | 1.0813 | | 0.7242 | 59000 | 1.0678 | | 0.7304 | 59500 | 1.0894 | | 0.7365 | 60000 | 1.0733 | | 0.7426 | 60500 | 1.0688 | | 0.7488 | 61000 | 1.0665 | | 0.7549 | 61500 | 1.0681 | | 0.7611 | 62000 | 1.301 | | 0.7672 | 62500 | 1.0907 | | 0.7733 | 63000 | 1.3941 | | 0.7795 | 63500 | 1.1355 | | 0.7856 | 64000 | 1.2196 | | 0.7917 | 64500 | 1.225 | | 0.7979 | 65000 | 1.1437 | | 0.8040 | 65500 | 1.0787 | | 0.8102 | 66000 | 1.0686 | | 0.8163 | 66500 | 1.1017 | | 0.8224 | 67000 | 1.0999 | | 0.8286 | 67500 | 1.0771 | | 0.8347 | 68000 | 1.1015 | | 0.8408 | 68500 | 1.0826 | | 0.8470 | 69000 | 1.1046 | | 0.8531 | 69500 | 1.0735 | | 0.8593 | 70000 | 1.1056 | | 0.8654 | 70500 | 1.1077 | | 0.8715 | 71000 | 1.0897 | | 0.8777 | 71500 | 1.0775 | | 0.8838 | 72000 | 1.0907 | | 0.8899 | 72500 | 1.0705 | | 0.8961 | 73000 | 1.0776 | | 0.9022 | 73500 | 1.0896 | | 0.9084 | 74000 | 1.0889 | | 0.9145 | 74500 | 1.0804 | | 0.9206 | 75000 | 1.1087 | | 0.9268 | 75500 | 1.0738 | | 0.9329 | 76000 | 1.0806 | | 0.9390 | 76500 | 1.0899 | | 0.9452 | 77000 | 1.0814 | | 0.9513 | 77500 | 1.0723 | | 0.9575 | 78000 | 1.0923 | | 0.9636 | 78500 | 1.0748 | | 0.9697 | 79000 | 1.0745 | | 0.9759 | 79500 | 1.081 | | 0.9820 | 80000 | 1.08 | | 0.9881 | 80500 | 1.0905 | | 0.9943 | 81000 | 1.1064 | | 1.0004 | 81500 | 1.0929 | | 1.0066 | 82000 | 1.0815 | | 1.0127 | 82500 | 1.0768 | | 1.0188 | 83000 | 1.1004 | | 1.0250 | 83500 | 1.0835 | | 1.0311 | 84000 | 1.0765 | | 1.0372 | 84500 | 1.0906 | | 1.0434 | 85000 | 1.096 | | 1.0495 | 85500 | 1.1085 | | 1.0557 | 86000 | 1.0913 | | 1.0618 | 86500 | 1.0974 | | 1.0679 | 87000 | 1.0763 | | 1.0741 | 87500 | 1.0894 | | 1.0802 | 88000 | 1.1065 | | 1.0863 | 88500 | 1.0898 | | 1.0925 | 89000 | 1.1036 | | 1.0986 | 89500 | 1.0825 | | 1.1048 | 90000 | 1.1164 | | 1.1109 | 90500 | 1.0811 | | 1.1170 | 91000 | 1.115 | | 1.1232 | 91500 | 1.1123 | | 1.1293 | 92000 | 1.0846 | | 1.1354 | 92500 | 1.0917 | | 1.1416 | 93000 | 1.0879 | | 1.1477 | 93500 | 1.0969 | | 1.1539 | 94000 | 1.0849 | | 1.1600 | 94500 | 1.0852 | | 1.1661 | 95000 | 1.0774 | | 1.1723 | 95500 | 1.0984 | | 1.1784 | 96000 | 1.0936 | | 1.1845 | 96500 | 1.0842 | | 1.1907 | 97000 | 1.0895 | | 1.1968 | 97500 | 1.09 | | 1.2030 | 98000 | 1.0813 | | 1.2091 | 98500 | 1.0965 | | 1.2152 | 99000 | 1.1017 | | 1.2214 | 99500 | 1.1045 | | 1.2275 | 100000 | 1.093 | | 1.2336 | 100500 | 1.0903 | | 1.2398 | 101000 | 1.1133 | | 1.2459 | 101500 | 1.0883 | | 1.2521 | 102000 | 1.1192 | | 1.2582 | 102500 | 1.0817 | | 1.2643 | 103000 | 1.0822 | | 1.2705 | 103500 | 1.0915 | | 
1.2766 | 104000 | 1.1128 | | 1.2827 | 104500 | 1.0786 | | 1.2889 | 105000 | 1.1101 | | 1.2950 | 105500 | 1.097 | | 1.3012 | 106000 | 1.095 | | 1.3073 | 106500 | 1.0884 | | 1.3134 | 107000 | 1.09 | | 1.3196 | 107500 | 1.1057 | | 1.3257 | 108000 | 1.087 | | 1.3318 | 108500 | 1.1009 | | 1.3380 | 109000 | 1.0849 | | 1.3441 | 109500 | 1.0886 | | 1.3503 | 110000 | 1.0805 | | 1.3564 | 110500 | 1.0808 | | 1.3625 | 111000 | 1.1025 | | 1.3687 | 111500 | 1.0955 | | 1.3748 | 112000 | 1.0824 | | 1.3809 | 112500 | 1.0835 | | 1.3871 | 113000 | 1.1168 | | 1.3932 | 113500 | 1.0881 | | 1.3994 | 114000 | 1.0946 | | 1.4055 | 114500 | 1.0819 | | 1.4116 | 115000 | 1.1155 | | 1.4178 | 115500 | 1.1021 | | 1.4239 | 116000 | 1.102 | | 1.4300 | 116500 | 1.0733 | | 1.4362 | 117000 | 1.0987 | | 1.4423 | 117500 | 1.1103 | | 1.4485 | 118000 | 1.1034 | | 1.4546 | 118500 | 1.0987 | | 1.4607 | 119000 | 1.0908 | | 1.4669 | 119500 | 1.0986 | | 1.4730 | 120000 | 1.0988 | | 1.4791 | 120500 | 1.1023 | | 1.4853 | 121000 | 1.1013 | | 1.4914 | 121500 | 1.0896 | | 1.4976 | 122000 | 1.8455 | | 1.5037 | 122500 | 1.1155 | | 1.5098 | 123000 | 1.1502 | | 1.5160 | 123500 | 1.1183 | | 1.5221 | 124000 | 1.0958 | | 1.5282 | 124500 | 1.1098 | | 1.5344 | 125000 | 1.1021 | | 1.5405 | 125500 | 1.0912 | | 1.5467 | 126000 | 1.0961 | | 1.5528 | 126500 | 1.0858 | | 1.5589 | 127000 | 1.0784 | | 1.5651 | 127500 | 1.1112 | | 1.5712 | 128000 | 1.1067 | | 1.5773 | 128500 | 1.0986 | | 1.5835 | 129000 | 1.0824 | | 1.5896 | 129500 | 1.1072 | | 1.5958 | 130000 | 1.1098 | | 1.6019 | 130500 | 1.0962 | | 1.6080 | 131000 | 1.1108 | | 1.6142 | 131500 | 1.1187 | | 1.6203 | 132000 | 1.0923 | | 1.6264 | 132500 | 1.1003 | | 1.6326 | 133000 | 1.0865 | | 1.6387 | 133500 | 1.099 | | 1.6449 | 134000 | 1.0838 | | 1.6510 | 134500 | 1.0792 | | 1.6571 | 135000 | 1.0966 | | 1.6633 | 135500 | 1.0782 | | 1.6694 | 136000 | 1.1123 | | 1.6755 | 136500 | 1.0923 | | 1.6817 | 137000 | 1.0873 | | 1.6878 | 137500 | 1.0807 | | 1.6940 | 138000 | 1.083 | | 1.7001 | 138500 | 1.0864 | | 1.7062 | 139000 | 1.0828 | | 1.7124 | 139500 | 1.0973 | | 1.7185 | 140000 | 1.1022 | | 1.7246 | 140500 | 1.0837 | | 1.7308 | 141000 | 1.0985 | | 1.7369 | 141500 | 1.1049 | | 1.7431 | 142000 | 1.079 | | 1.7492 | 142500 | 1.0757 | | 1.7553 | 143000 | 1.0808 | | 1.7615 | 143500 | 1.0743 | | 1.7676 | 144000 | 1.0933 | | 1.7737 | 144500 | 1.0938 | | 1.7799 | 145000 | 1.1121 | | 1.7860 | 145500 | 1.1138 | | 1.7922 | 146000 | 1.1063 | | 1.7983 | 146500 | 1.097 | | 1.8044 | 147000 | 1.0999 | | 1.8106 | 147500 | 1.1035 | | 1.8167 | 148000 | 1.0786 | | 1.8228 | 148500 | 1.0824 | | 1.8290 | 149000 | 1.1097 | | 1.8351 | 149500 | 1.0744 | | 1.8413 | 150000 | 1.0902 | | 1.8474 | 150500 | 1.0841 | | 1.8535 | 151000 | 1.0961 | | 1.8597 | 151500 | 1.0778 | | 1.8658 | 152000 | 1.0784 | | 1.8719 | 152500 | 1.0741 | | 1.8781 | 153000 | 1.0879 | | 1.8842 | 153500 | 1.079 | | 1.8904 | 154000 | 1.0967 | | 1.8965 | 154500 | 1.0906 | | 1.9026 | 155000 | 1.0836 | | 1.9088 | 155500 | 1.0932 | | 1.9149 | 156000 | 1.0823 | | 1.9210 | 156500 | 1.087 | | 1.9272 | 157000 | 1.0892 | | 1.9333 | 157500 | 1.0842 | | 1.9395 | 158000 | 1.0837 | | 1.9456 | 158500 | 1.1001 | | 1.9517 | 159000 | 1.0727 | | 1.9579 | 159500 | 1.0875 | | 1.9640 | 160000 | 1.0845 | | 1.9701 | 160500 | 1.0805 | | 1.9763 | 161000 | 1.0825 | | 1.9824 | 161500 | 1.0886 | | 1.9886 | 162000 | 1.0856 | | 1.9947 | 162500 | 1.0816 | | 2.0008 | 163000 | 1.1005 | | 2.0070 | 163500 | 1.0775 | | 2.0131 | 164000 | 1.0875 | | 2.0192 | 164500 | 1.09 | | 2.0254 | 165000 | 1.086 | | 
2.0315 | 165500 | 1.087 | | 2.0377 | 166000 | 1.0815 | | 2.0438 | 166500 | 1.0832 | | 2.0499 | 167000 | 1.0801 | | 2.0561 | 167500 | 1.0828 | | 2.0622 | 168000 | 1.0819 | | 2.0683 | 168500 | 1.0767 | | 2.0745 | 169000 | 1.0819 | | 2.0806 | 169500 | 1.1013 | | 2.0868 | 170000 | 1.0891 | </details> ### Framework Versions - Python: 3.10.10 - Sentence Transformers: 3.3.1 - Transformers: 4.43.0 - PyTorch: 2.5.0+cu124 - Accelerate: 1.1.1 - Datasets: 3.1.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers and SoftmaxLoss ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
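The hyperparameters and loss documented above describe a fairly standard SoftmaxLoss fine-tune over (queries, corpus, score) triples with three labels. The following is a minimal sketch, assuming the sentence-transformers v3 trainer API, of how such a run could be set up; the three inline rows are placeholders, not rows from the actual 651,725-sample dataset, and this is not the author's training script.

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import SoftmaxLoss

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

# Placeholder data mirroring the documented columns: queries, corpus, score (0/1/2).
train_dataset = Dataset.from_dict({
    "queries": ["question 1", "question 2", "question 3"],
    "corpus": ["passage 1", "passage 2", "passage 3"],
    "score": [0, 1, 2],  # the trainer treats a "score" column as the label
})

# num_labels=3 matches the 0/1/2 label distribution reported above.
loss = SoftmaxLoss(
    model=model,
    sentence_embedding_dimension=model.get_sentence_embedding_dimension(),
    num_labels=3,
)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```

SoftmaxLoss follows the SBERT recipe cited in the BibTeX above: it concatenates the two sentence embeddings and their element-wise difference, then trains a softmax classifier head over the labels.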
[ "TEXT_CLASSIFICATION" ]
[ "PCR" ]
Non_BioNLP
bobox/DeBERTaV3-small-GeneralSentenceTransformer-v2
bobox
sentence-similarity
[ "sentence-transformers", "pytorch", "deberta-v2", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:96781", "loss:MultipleNegativesRankingLoss", "loss:AnglELoss", "loss:GISTEmbedLoss", "loss:OnlineContrastiveLoss", "loss:MultipleNegativesSymmetricRankingLoss", "en", "dataset:sentence-transformers/all-nli", "dataset:sentence-transformers/stsb", "dataset:tals/vitaminc", "dataset:nyu-mll/glue", "dataset:allenai/scitail", "dataset:sentence-transformers/xsum", "dataset:sentence-transformers/sentence-compression", "arxiv:1908.10084", "arxiv:1705.00652", "arxiv:2309.12871", "arxiv:2402.16829", "base_model:microsoft/deberta-v3-small", "base_model:finetune:microsoft/deberta-v3-small", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,718
1,718
13
0
--- base_model: microsoft/deberta-v3-small datasets: - sentence-transformers/all-nli - sentence-transformers/stsb - tals/vitaminc - nyu-mll/glue - allenai/scitail - sentence-transformers/xsum - sentence-transformers/sentence-compression language: - en library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:96781 - loss:MultipleNegativesRankingLoss - loss:AnglELoss - loss:GISTEmbedLoss - loss:OnlineContrastiveLoss - loss:MultipleNegativesSymmetricRankingLoss widget: - source_sentence: What dual titles did Frederick William hold? sentences: - The impact was increased by chronic overfishing, and by eutrophication that gave the entire ecosystem a short-term boost, causing the Mnemiopsis population to increase even faster than normal – and above all by the absence of efficient predators on these introduced ctenophores. - The "European Council" (rather than the Council, made up of different government Ministers) is composed of the Prime Ministers or executive Presidents of the member states. - Nearly 50,000 Huguenots established themselves in Germany, 20,000 of whom were welcomed in Brandenburg-Prussia, where they were granted special privileges (Edict of Potsdam) and churches in which to worship (such as the Church of St. Peter and St. Paul, Angermünde) by Frederick William, Elector of Brandenburg and Duke of Prussia. - source_sentence: the Great Internet Mersenne Prime Search, what was the prize for finding a prime with at least 10 million digits? sentences: - Since September 2004, the official home of the Scottish Parliament has been a new Scottish Parliament Building, in the Holyrood area of Edinburgh. - The roughly half-mile stretch of Kearney Boulevard between Fresno Street and Thorne Ave was at one time the preferred neighborhood for Fresno's elite African-American families. - In 2009, the Great Internet Mersenne Prime Search project was awarded a US$100,000 prize for first discovering a prime with at least 10 million digits. - source_sentence: A woman is tugging on a white sheet and laughing sentences: - there are children near the camera - The person is amused. - Fruit characters decorate this child's bib - source_sentence: A hispanic fruit market with many different fruits and vegetables in view on a city street with a man passing the store dressed in dark pants and a hoodie. sentences: - A fruit market and a man - Farmers preparing to feed their animals. - The guys have guns. - source_sentence: All the members of one particular species in a give area are called a population. sentences: - The specialized study of the motion of objects that are atomic/subatomic in size is called quantum mechanics. - All the members of a species that live in the same area form a population. - A(n) anaerobic organism does not need oxygen for growth and dies in its presence. 
--- # SentenceTransformer based on microsoft/deberta-v3-small This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on the [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli), [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb), [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) and [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) <!-- at revision a36c739020e01763fe789b4b85e2df55d6180012 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity - **Training Datasets:** - [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) - [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb) - [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) - [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) - [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) - [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) - [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) - [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) - **Language:** en <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("bobox/DeBERTaV3-small-GeneralSentenceTransformer-v2") # Run inference sentences = [ 'All the members of one particular species in a give area are called a population.', 'All the members of a species that live in the same area form a population.', 'A(n) anaerobic organism does not need oxygen for growth and dies in its presence.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Datasets #### nli-pairs * Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) * Size: 7,500 training samples * Columns: <code>sentence1</code> and <code>sentence2</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 16.62 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.46 tokens</li><li>max: 29 tokens</li></ul> | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------------------------|:-------------------------------------------------| | <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> | | <code>Children smiling and waving at camera</code> | <code>There are children present</code> | | <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### sts-label * Dataset: [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb) at [ab7a5ac](https://huggingface.co/datasets/sentence-transformers/stsb/tree/ab7a5ac0e35aa22088bdcf23e7fd99b220e53308) * Size: 5,749 training samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | 
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------| | type | string | string | float | | details | <ul><li>min: 6 tokens</li><li>mean: 9.81 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 9.74 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.54</li><li>max: 1.0</li></ul> | * Samples: | sentence1 | sentence2 | score | |:-----------------------------------------------------------|:----------------------------------------------------------------------|:------------------| | <code>A plane is taking off.</code> | <code>An air plane is taking off.</code> | <code>1.0</code> | | <code>A man is playing a large flute.</code> | <code>A man is playing a flute.</code> | <code>0.76</code> | | <code>A man is spreading shreded cheese on a pizza.</code> | <code>A man is spreading shredded cheese on an uncooked pizza.</code> | <code>0.76</code> | * Loss: [<code>AnglELoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_angle_sim" } ``` #### vitaminc-pairs * Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0) * Size: 3,695 training samples * Columns: <code>label</code>, <code>sentence1</code>, and <code>sentence2</code> * Approximate statistics based on the first 1000 samples: | | label | sentence1 | sentence2 | |:--------|:-----------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | int | string | string | | details | <ul><li>1: 100.00%</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 16.02 tokens</li><li>max: 56 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 38.57 tokens</li><li>max: 502 tokens</li></ul> | * Samples: | label | sentence1 | sentence2 | |:---------------|:------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>1</code> | <code>The movie Yevadu grossed more than 390 million globally .</code> | <code>It also took the second spot in the list of the top 10 films with highest first week shares from AP.The film collected 390.5 million in 9 days , and more than 60 million from other areas , including Karnataka , the rest of India , and overseas territories , enabling it to cross the 400 million mark at the worldwide Box office , becoming Ram Charan 's fourth film to cross that mark .</code> | | <code>1</code> | <code>The film 's score is based on 33 critics .</code> | <code>`` Metacritic gave the film a score of 44 out of 100 , based on 33 critics , indicating `` '' mixed or average reviews '' '' . 
''</code> | | <code>1</code> | <code>Back to Black ( album ) sold less than 15 million copies .</code> | <code>Worldwide , the album has sold over 12 million copies .</code> | * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### qnli-contrastive * Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) * Size: 7,500 training samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------| | type | string | string | int | | details | <ul><li>min: 6 tokens</li><li>mean: 13.92 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 35.87 tokens</li><li>max: 499 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> | * Samples: | sentence1 | sentence2 | label | |:--------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | <code>Who was the biggest artist that CBS had?</code> | <code>CBS Inc., now CBS Corporation, retained the rights to the CBS name for music recordings but granted Sony a temporary license to use the CBS name.</code> | <code>0</code> | | <code>What does a video-conference use that allows communication in live situations?</code> | <code>This is often accomplished by the use of a multipoint control unit (a centralized distribution and call management system) or by a similar non-centralized multipoint capability embedded in each videoconferencing unit.</code> | <code>0</code> | | <code>What is the population of Saint Helena?</code> | <code>It is part of the British Overseas Territory of Saint Helena, Ascension and Tristan da Cunha.</code> | <code>0</code> | * Loss: [<code>OnlineContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss) #### scitail-pairs-qa * Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 14,987 training samples * Columns: <code>sentence2</code> and <code>sentence1</code> * Approximate statistics based on the first 1000 samples: | | sentence2 | sentence1 | |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | 
string | | details | <ul><li>min: 7 tokens</li><li>mean: 15.86 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.1 tokens</li><li>max: 41 tokens</li></ul> | * Samples: | sentence2 | sentence1 | |:--------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | <code>The largest known proteins are titins.</code> | <code>What are the largest known proteins?</code> | | <code>Remote-control vehicles are able to go to the deepest ocean floor.</code> | <code>What type of vehicles is able to go to the deepest ocean floor?</code> | | <code>Vaccine is a preventative measure that is often delivered by injection into the arm.</code> | <code>What preventative measure is often delivered by injection into the arm?</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### scitail-pairs-pos * Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 8,600 training samples * Columns: <code>sentence1</code> and <code>sentence2</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 7 tokens</li><li>mean: 23.75 tokens</li><li>max: 67 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.47 tokens</li><li>max: 41 tokens</li></ul> | * Samples: | sentence1 | sentence2 | |:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------| | <code>The movement of molecules from a location where they are in a high concentration to an area where they are in a lower concentration is called diffusion .</code> | <code>You call the movement of a substance from an area of a higher amount toward an area of lower amount diffusion.</code> | | <code>Climate is the average weather of an area over a long period of time.</code> | <code>Climate is the long-term average of weather in a particular spot.</code> | | <code>Sunlight is captured by green plants during the process of photosynthesis to produce glucose, a carbohydrate from water and carbon dioxide.</code> | <code>Photosynthesis converts carbon dioxide and water into glucose.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### xsum-pairs * Dataset: [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) at [788ddaf](https://huggingface.co/datasets/sentence-transformers/xsum/tree/788ddafe04e539956d56b567bc32a036ee7b9206) * Size: 3,750 training samples * Columns: <code>sentence1</code> and <code>sentence2</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 
| |:--------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 28 tokens</li><li>mean: 355.39 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 27.3 tokens</li><li>max: 61 tokens</li></ul> | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>Prices rose in all council areas and across all property types, but there were wide variations.<br>In Derry City and Strabane prices were up by 11% but by less than 2% in Fermanagh and Omagh.<br>The figures are from the NI Residential Property Price Index, which analyses almost all sales, including cash deals.<br>The average standardised price, across all property types, is now £125,480.<br>That compares to £97,428 at the bottom of the market in 2012, but is still far below the bubble-era peak of £224,670.<br>Over the year the largest rise was in the apartment sector with prices up by 11%.<br>For all other property types, the increase was about 5%.<br>The council area with the highest average price is Lisburn and Castlereagh (£149,600) and the lowest is Derry City and Strabane (£108,464).<br>The number of properties sold in 2016 was 21,669, down slightly on the 2015 figure.<br>Northern Ireland experienced a huge house price bubble in the years leading up to 2007 before the market crashed.<br>Prices more than halved between 2007 and early 2013 but have been increasing gradually since then.</code> | <code>House prices in Northern Ireland rose by almost 6% in 2016, according to official figures.</code> | | <code>English and French clubs intend to break away from the Heineken Cup and create their own tournament.<br>"It could well be the end of professional rugby in Scotland if the competition wasn't to go ahead," Nicol told BBC Scotland.<br>"I don't think you can fill a hole of that amount with anything else."<br>Let's get qualification sorted out and based on a meritocracy and then the distribution of revenues is for the boardrooms<br>The Scottish Rugby Union currently receives about £5m per year for Glasgow Warriors and Edinburgh's participation in the Heineken Cup.<br>European Rugby Cup (ERC), which has run the Heineken Cup since it began in 1995, wants to re-open negotiations about the tournament's future but English Premiership and French Top 14 clubs insist they will not attend talks planned by the organising body next month.<br>They will quit the competition at the end of the season, citing factors such as their view that the Heineken Cup structure favours teams from the Pro12, which is made up of sides from Wales, Scotland, Ireland and Italy, and distribution of revenue.<br>Nicol, who won the Heineken Cup with Bath in 1998, insists that arguments over the tournament format is a repetitive issue and he hopes "common sense" will prevail for the good of the game in Scotland.<br>"It happens every few years," he told BBC Scotland. 
"The English and the French flex their collective muscles when the contract is coming to an end.<br>"But this year, it's very different, because they've got a television deal on the table and it's a real clear and present danger.<br>"I think there's an acceptance that the current format of the Heineken Cup will cease and there will be a new competition.<br>Media playback is not supported on this device<br>"Then we just need to ensure and hope that Scotland are heavily involved in it."<br>Nicol conceded that the main stumbling block for advancing discussions was the perception that Celtic nations are favoured in the qualification process.<br>At present, Ireland and Wales each have three sides guaranteed a place, while Scotland and Italy have two apiece.<br>Nicol believes the English and French unions want to put a stop to automatic qualification, which could bring about the end of lucrative revenue for Glasgow and Edinburgh, although ending guaranteed entry may be necessary to ensure the future of a pan-European competition.<br>The former Scotland captain said if the tournament comes to an end it would be "a sporting disaster" adding that "the Heineken Cup has been a fantastic competition".<br>He added: "Where it's flawed is in the qualification. I don't think the two Scottish sides and the Italian sides or the Irish sides should qualify automatically.<br>"So let's get qualification sorted out and based on a meritocracy and then the distribution of revenues is for the boardrooms.<br>"There's a bit of posturing from both sides, but I just hope it's a bit of brinksmanship and they get around the table and sort something out - and we get a competition.<br>"It might not be the Heineken Cup as we call it now, but hopefully we'll get something like it."</code> | <code>Professional rugby union in Scotland could end if there is no European competition next season, fears former national captain Andy Nicol.</code> | | <code>The German was 0.203 seconds quicker than Hamilton, with Ferrari's Kimi Raikkonen third, a second off the pace.<br>Mercedes set their times on the super-soft tyre, while Ferrari used the soft, which would account for about half the gap between the two cars.<br>Ferrari's Sebastian Vettel was fourth, ahead of Force India's Sergio Perez.<br>Hamilton enters the race nine points ahead of Rosberg in the championship after recovering from 21st on the grid to finish third at the Belgian Grand Prix last weekend, as Rosberg won.<br>Ferrari have used the last of their remaining engine development 'tokens' ahead of their home race in an attempt to boost their competitiveness after a slump in form that has seen them lose second place in the constructors' championship to Red Bull.<br>The fastest Red Bull was Max Verstappen in eighth, behind Haas driver Romain Grosjean and Williams' Valtteri Bottas, whose team-mate Felipe Massa announced on Thursday that he would retire at the end of the year.<br>Verstappen remains the focus of attention following his controversial battle with Raikkonen in Belgium.<br>Raikkonen has criticised Verstappen for being too dangerous, while the Dutchman said he would not change his driving because others were not happy.<br>The stewards took no action against Verstappen in Spa, but BBC Sport has learned that Charlie Whiting, the F1 director of governing body the FIA, felt that Verstappen's late move in defence at 200mph as Raikkonen attacked was on the edge of acceptability.<br>Whiting told the teams in a meeting on Thursday that he felt Verstappen could have received a 
black-and-white warning flag for his driving.<br>The black-and-white flag is an indication of unsportsmanlike behaviour and is only shown once. If the driver commits the same offence again he can be disqualified from the race.<br>Whiting's intervention raised the stakes in the debate ahead of the drivers' briefing after practice on Friday afternoon, where the incident is expected to be discussed.<br>It was a relatively low-key session on track, despite a number of drivers running off the track at the tricky Monza chicanes in the warm sunshine.<br>McLaren's session came to an unfortunate end as Fernando Alonso was forced to pit with a gearshift problem. He was 13th, with team-mate Jenson Button 11th, the drivers expecting their most difficult weekend of the year because of the lack of power of the Honda engine, which still lags despite recent updates.<br>Button and Verstappen ran the halo head protection system in the first part of the session as trials continue ahead of the planned introduction of the device in 2018.<br>Italian Grand Prix first practice results<br>Italian Grand Prix coverage details</code> | <code>Nico Rosberg headed team-mate Lewis Hamilton as Mercedes dominated first practice at the Italian Grand Prix.</code> | * Loss: [<code>MultipleNegativesSymmetricRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativessymmetricrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### compression-pairs * Dataset: [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90) * Size: 45,000 training samples * Columns: <code>sentence1</code> and <code>sentence2</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 10 tokens</li><li>mean: 31.78 tokens</li><li>max: 170 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.14 tokens</li><li>max: 29 tokens</li></ul> | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------| | <code>The USHL completed an expansion draft on Monday as 10 players who were on the rosters of USHL teams during the 2009-10 season were selected by the League's two newest entries, the Muskegon Lumberjacks and Dubuque Fighting Saints.</code> | <code>USHL completes expansion draft</code> | | <code>NRT LLC, one of the nation's largest residential real estate brokerage companies, announced several executive appointments within its Coldwell Banker Residential Brokerage operations in Southern California.</code> | <code>NRT announces executive appointments at its Coldwell Banker operations in Southern California</code> | | <code>A new survey shows 30 percent of Californians use Twitter, and more and more of us are using our smart phones to go online.</code> | <code>Survey: 30 percent of Californians use 
Twitter</code> | * Loss: [<code>MultipleNegativesSymmetricRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativessymmetricrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Datasets #### nli-pairs * Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) * Size: 2,000 evaluation samples * Columns: <code>sentence1</code> and <code>sentence2</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 17.64 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.67 tokens</li><li>max: 29 tokens</li></ul> | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------| | <code>Two women are embracing while holding to go packages.</code> | <code>Two woman are holding packages.</code> | | <code>Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink.</code> | <code>Two kids in numbered jerseys wash their hands.</code> | | <code>A man selling donuts to a customer during a world exhibition event held in the city of Angeles</code> | <code>A man selling donuts to a customer.</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### scitail-pairs-pos * Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 1,304 evaluation samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 5 tokens</li><li>mean: 22.52 tokens</li><li>max: 67 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.34 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>0: ~47.50%</li><li>1: ~52.50%</li></ul> | * Samples: | sentence1 | sentence2 | label | |:----------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------|:---------------| | <code>An introduction to atoms and elements, compounds, atomic structure and bonding, the molecule and chemical reactions.</code> | <code>Replace another in a molecule happens to atoms during a substitution 
reaction.</code> | <code>0</code> | | <code>Wavelength The distance between two consecutive points on a sinusoidal wave that are in phase;</code> | <code>Wavelength is the distance between two corresponding points of adjacent waves called.</code> | <code>1</code> | | <code>humans normally have 23 pairs of chromosomes.</code> | <code>Humans typically have 23 pairs pairs of chromosomes.</code> | <code>1</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` #### qnli-contrastive * Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) * Size: 2,000 evaluation samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------| | type | string | string | int | | details | <ul><li>min: 6 tokens</li><li>mean: 14.13 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 36.58 tokens</li><li>max: 225 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> | * Samples: | sentence1 | sentence2 | label | |:--------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | <code>What came into force after the new constitution was herald?</code> | <code>As of that day, the new constitution heralding the Second Republic came into force.</code> | <code>0</code> | | <code>What is the first major city in the stream of the Rhine?</code> | <code>The most important tributaries in this area are the Ill below of Strasbourg, the Neckar in Mannheim and the Main across from Mainz.</code> | <code>0</code> | | <code>What is the minimum required if you want to teach in Canada?</code> | <code>In most provinces a second Bachelor's Degree such as a Bachelor of Education is required to become a qualified teacher.</code> | <code>0</code> | * Loss: [<code>OnlineContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss) #### sts-label * Dataset: [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb) at [ab7a5ac](https://huggingface.co/datasets/sentence-transformers/stsb/tree/ab7a5ac0e35aa22088bdcf23e7fd99b220e53308) * Size: 1,500 evaluation samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| | type | string | string | float | | details | <ul><li>min: 5 tokens</li><li>mean: 14.77 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 14.74 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.47</li><li>max: 
1.0</li></ul> |
* Samples:

| sentence1 | sentence2 | score |
|:--------------------------------------------------|:------------------------------------------------------|:------------------|
| <code>A man with a hard hat is dancing.</code> | <code>A man wearing a hard hat is dancing.</code> | <code>1.0</code> |
| <code>A young child is riding a horse.</code> | <code>A child is riding a horse.</code> | <code>0.95</code> |
| <code>A man is feeding a mouse to a snake.</code> | <code>The man is feeding a mouse to the snake.</code> | <code>1.0</code> |
* Loss: [<code>AnglELoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "pairwise_angle_sim"
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 28
- `per_device_eval_batch_size`: 16
- `learning_rate`: 3e-06
- `weight_decay`: 1e-10
- `num_train_epochs`: 5
- `max_steps`: 5000
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.33
- `save_safetensors`: False
- `fp16`: True
- `hub_model_id`: bobox/DeBERTaV3-small-ST-checkpoints-tmp
- `hub_strategy`: checkpoint
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 28
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 3e-06
- `weight_decay`: 1e-10
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: 5000
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.33
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTaV3-small-ST-checkpoints-tmp
- `hub_strategy`: checkpoint
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs

| Epoch | Step | Training Loss | nli-pairs loss | sts-label loss | scitail-pairs-pos loss | qnli-contrastive loss |
|:------:|:----:|:-------------:|:--------------:|:--------------:|:----------------------:|:---------------------:|
| None | 0 | - | 3.3906 | 6.4037 | 2.3949 | 2.6789 |
| 0.0723 | 250 | 3.2471 | 3.2669 | 6.3326 | 2.3286 | 2.6008 |
| 0.1445 | 500 | 3.051 | 3.0717 | 6.5578 | 2.0277 | 2.0795 |
| 0.2168 | 750 | 2.3717 | 2.8445 | 7.5564 | 1.5729 | 1.1601 |
| 0.2890 | 1000 | 1.5228 | 2.5520 | 8.3864 | 1.1221 | 0.7480 |
| 0.3613 | 1250 | 1.5747 | 2.1439 | 8.7993 | 0.9512 | 0.5071 |
| 0.4335 | 1500 | 1.2114 | 1.7986 | 9.0748 | 0.8195 | 0.3715 |
| 0.5058 | 1750 | 1.1832 | 1.5665 | 9.1778 | 0.6956 | 0.2920 |
| 0.5780 | 2000 | 0.9078 | 1.4173 | 9.3829 | 0.6840 | 0.2488 |
| 0.6503 | 2250 | 0.8436 | 1.3196 | 9.4585 | 0.6831 | 0.1584 |
| 0.7225 | 2500 | 0.8744 | 1.2192 | 9.5395 | 0.6232 | 0.1527 |
| 0.7948 | 2750 | 1.1809 | 1.1600 | 9.4297 | 0.5681 | 0.1369 |
| 0.8671 | 3000 | 0.7233 | 1.1149 | 9.4893 | 0.5523 | 0.1614 |
| 0.9393 | 3250 | 0.7862 | 1.0738 | 9.5408 | 0.5372 | 0.1291 |
| 1.0116 | 3500 | 1.0888 | 1.0328 | 9.5612 | 0.5286 | 0.1281 |
| 1.0838 | 3750 | 0.8116 | 1.0304 | 9.4794 | 0.5239 | 0.1144 |
| 1.1561 | 4000 | 1.0436 | 1.0215 | 9.4184 | 0.5278 | 0.0973 |
| 1.2283 | 4250 | 0.9298 | 1.0107 | 9.4322 | 0.5221 | 0.0970 |
| 1.3006 | 4500 | 0.682 | 1.0093 | 9.4643 | 0.5186 | 0.0951 |
| 1.3728 | 4750 | 0.9863 | 1.0080 | 9.4627 | 0.5176 | 0.0948 |
| 1.4451 | 5000 | 1.0022 | 1.0076 | 9.4645 | 0.5179 | 0.0945 |

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.3.0+cu121
- Accelerate: 0.31.0
- Datasets: 2.20.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

#### AnglELoss
```bibtex
@misc{li2023angleoptimized,
    title={AnglE-optimized Text Embeddings},
    author={Xianming Li and Jing Li},
    year={2023},
    eprint={2309.12871},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
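For reference, the losses listed in the training and evaluation sections above correspond to the standard `sentence_transformers.losses` API. Below is a minimal sketch of how they might be instantiated with the parameters shown; it is illustrative only, not the exact training script, and the base checkpoint name is a hypothetical stand-in:

```python
from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("microsoft/deberta-v3-small")  # hypothetical base checkpoint

# (sentence1, sentence2) pairs; in-batch negatives, applied in both directions
mnsrl = losses.MultipleNegativesSymmetricRankingLoss(
    model=model, scale=20.0, similarity_fct=util.cos_sim
)

# (sentence1, sentence2) pairs; in-batch negatives, one direction
mnrl = losses.MultipleNegativesRankingLoss(
    model=model, scale=20.0, similarity_fct=util.cos_sim
)

# (sentence1, sentence2, label) pairs; contrastive loss computed over hard pairs only
ocl = losses.OnlineContrastiveLoss(model=model)

# (sentence1, sentence2, score) triples; pairwise angle similarity with scale 20.0
angle = losses.AnglELoss(model=model, scale=20.0)
```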
[ "TEXT_CLASSIFICATION" ]
[ "SCITAIL" ]
Non_BioNLP
EleutherAI/pythia-1b-deduped
EleutherAI
text-generation
[ "transformers", "pytorch", "safetensors", "gpt_neox", "text-generation", "causal-lm", "pythia", "en", "dataset:EleutherAI/the_pile_deduplicated", "arxiv:2304.01373", "arxiv:2101.00027", "arxiv:2201.07311", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,676
1,689
20,123
19
---
datasets:
- EleutherAI/the_pile_deduplicated
language:
- en
license: apache-2.0
tags:
- pytorch
- causal-lm
- pythia
---

The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf). It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. We also provide 154 intermediate checkpoints per model, hosted on Hugging Face as branches.

The Pythia model suite was designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites.

<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>

Previously, we released an early version of the Pythia suite to the public. However, we decided to retrain the model suite to address a few hyperparameter discrepancies. This model card <a href="#changelog">lists the changes</a>; see appendix B in the Pythia paper for further discussion. We found no difference in benchmark performance between the two Pythia versions. The old models are [still available](https://huggingface.co/models?other=pythia_v0), but we suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**

Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts.
</details>
<br>

# Pythia-1B-deduped

## Model Details

- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use. [See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
<figure>

| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |

<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. “Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption>
</figure>
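As a quick check, the per-size architecture settings in the table above can be read back from the released configs. A minimal sketch, assuming the standard `transformers` GPT-NeoX config field names:

```python
from transformers import AutoConfig

# Fetch only the config (no weights are downloaded) for the 1B model.
config = AutoConfig.from_pretrained("EleutherAI/pythia-1b-deduped")

# These should match the 1.0B row above: 16 layers, model dim 2048, 8 heads.
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)
```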
## Uses and Limitations

### Intended Use

The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. We also provide 154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints `step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to `step143000`. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model.

You may also further fine-tune and adapt Pythia-1B-deduped for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-1B-deduped as a basis for your fine-tuned model, please conduct your own risk and bias assessment.

### Out-of-scope use

The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case.

Pythia models are English-language only, and are not suitable for translation or generating text in other languages.

Pythia-1B-deduped has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means Pythia-1B-deduped will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions.

### Limitations and biases

The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text. Never rely on Pythia-1B-deduped to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-1B-deduped may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting it to other people. Please inform your audience that the text was generated by Pythia-1B-deduped.

### Quickstart

Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```

Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia).

## Training

### Training data

Pythia-1B-deduped was trained on the Pile **after the dataset has been globally deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).

### Training procedure

All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training, from `step1000` to `step143000` (which is the same as `main`). In addition, we also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.

All *Pythia* models trained for 143000 steps at a batch size of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
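Because checkpoints are saved at a fixed interval of 2,097,152,000 tokens (1,000 steps at 2,097,152 tokens per step), the number of tokens any checkpoint branch has seen follows directly from its step number. A small sketch of that arithmetic (the helper name is ours, not part of the Pythia tooling):

```python
TOKENS_PER_STEP = 2_097_152  # batch size: 2M tokens per optimizer step

def tokens_seen(step: int) -> int:
    """Tokens consumed by training up to and including a given checkpoint step."""
    return step * TOKENS_PER_STEP

# The final checkpoint, step143000 (identical to the `main` branch):
assert tokens_seen(143_000) == 299_892_736_000  # matches the total quoted above

# An intermediate branch such as `step3000`:
print(f"{tokens_seen(3_000):,} tokens")  # 6,291,456,000 tokens
```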
## Evaluations

All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM.

<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>

<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>

<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>

<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>

<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>

## Changelog

This section compares differences between previously released [Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current models. See Appendix B of the Pythia paper for further discussion of these changes and the motivation behind them. We found that retraining Pythia had no impact on benchmark performance.

- All model sizes are now trained with uniform batch size of 2M tokens. Previously, the models of size 160M, 410M, and 1.4B parameters were trained with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all models of size 2.8B parameters or smaller had a learning rate (LR) schedule which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and 12B models all used an LR schedule which decayed to a minimum LR of 0. In the redone training runs, we rectified this inconsistency: all models were trained with LR decaying to a minimum of 0.1× their maximum LR.

### Naming convention and parameter count

*Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.

<figure style="width:32em">

| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |

</figure>
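The split between total and non-embedding parameters in this table can be reproduced from a loaded checkpoint. A hedged sketch: it assumes the `transformers` GPT-NeoX module layout, in which `gpt_neox.embed_in` holds the input embedding and `embed_out` the unembedding matrix (the table's arithmetic implies the two are untied):

```python
from transformers import GPTNeoXForCausalLM

model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/pythia-1b-deduped")

total = sum(p.numel() for p in model.parameters())
embedding = sum(
    p.numel()
    for name, p in model.named_parameters()
    if "embed_in" in name or "embed_out" in name
)

print(total)              # expected: 1,011,781,632 (total params, 1B row)
print(total - embedding)  # expected: 805,736,448 (non-embedding params)
```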
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
corto-ai/nomic-embed-text-v1
corto-ai
sentence-similarity
[ "sentence-transformers", "pytorch", "onnx", "safetensors", "nomic_bert", "feature-extraction", "sentence-similarity", "mteb", "transformers", "transformers.js", "custom_code", "en", "arxiv:2402.01613", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,714
1,714
740
2
--- language: - en library_name: sentence-transformers license: apache-2.0 pipeline_tag: sentence-similarity tags: - feature-extraction - sentence-similarity - mteb - transformers - transformers.js model-index: - name: epoch_0_model results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 76.8507462686567 - type: ap value: 40.592189159090495 - type: f1 value: 71.01634655512476 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 91.51892500000001 - type: ap value: 88.50346762975335 - type: f1 value: 91.50342077459624 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.364 - type: f1 value: 46.72708080922794 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 25.178 - type: map_at_10 value: 40.244 - type: map_at_100 value: 41.321999999999996 - type: map_at_1000 value: 41.331 - type: map_at_3 value: 35.016999999999996 - type: map_at_5 value: 37.99 - type: mrr_at_1 value: 25.605 - type: mrr_at_10 value: 40.422000000000004 - type: mrr_at_100 value: 41.507 - type: mrr_at_1000 value: 41.516 - type: mrr_at_3 value: 35.23 - type: mrr_at_5 value: 38.15 - type: ndcg_at_1 value: 25.178 - type: ndcg_at_10 value: 49.258 - type: ndcg_at_100 value: 53.776 - type: ndcg_at_1000 value: 53.995000000000005 - type: ndcg_at_3 value: 38.429 - type: ndcg_at_5 value: 43.803 - type: precision_at_1 value: 25.178 - type: precision_at_10 value: 7.831 - type: precision_at_100 value: 0.979 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 16.121 - type: precision_at_5 value: 12.29 - type: recall_at_1 value: 25.178 - type: recall_at_10 value: 78.307 - type: recall_at_100 value: 97.866 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 48.364000000000004 - type: recall_at_5 value: 61.451 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 45.93034494751465 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 36.64579480054327 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 60.601310529222054 - type: mrr value: 75.04484896451656 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 88.57797718095814 - type: cos_sim_spearman value: 86.47064499110101 - type: euclidean_pearson value: 87.4559602783142 - type: euclidean_spearman value: 86.47064499110101 - type: manhattan_pearson value: 87.7232764230245 - type: manhattan_spearman value: 86.91222131777742 - task: type: Classification 
dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.5422077922078 - type: f1 value: 84.47657456950589 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 38.48953561974464 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 32.75995857510105 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 30.008000000000003 - type: map_at_10 value: 39.51 - type: map_at_100 value: 40.841 - type: map_at_1000 value: 40.973 - type: map_at_3 value: 36.248999999999995 - type: map_at_5 value: 38.096999999999994 - type: mrr_at_1 value: 36.481 - type: mrr_at_10 value: 44.818000000000005 - type: mrr_at_100 value: 45.64 - type: mrr_at_1000 value: 45.687 - type: mrr_at_3 value: 42.036 - type: mrr_at_5 value: 43.782 - type: ndcg_at_1 value: 36.481 - type: ndcg_at_10 value: 45.152 - type: ndcg_at_100 value: 50.449 - type: ndcg_at_1000 value: 52.76499999999999 - type: ndcg_at_3 value: 40.161 - type: ndcg_at_5 value: 42.577999999999996 - type: precision_at_1 value: 36.481 - type: precision_at_10 value: 8.369 - type: precision_at_100 value: 1.373 - type: precision_at_1000 value: 0.186 - type: precision_at_3 value: 18.693 - type: precision_at_5 value: 13.533999999999999 - type: recall_at_1 value: 30.008000000000003 - type: recall_at_10 value: 56.108999999999995 - type: recall_at_100 value: 78.55499999999999 - type: recall_at_1000 value: 93.659 - type: recall_at_3 value: 41.754999999999995 - type: recall_at_5 value: 48.296 - type: map_at_1 value: 30.262 - type: map_at_10 value: 40.139 - type: map_at_100 value: 41.394 - type: map_at_1000 value: 41.526 - type: map_at_3 value: 37.155 - type: map_at_5 value: 38.785 - type: mrr_at_1 value: 38.153 - type: mrr_at_10 value: 46.369 - type: mrr_at_100 value: 47.072 - type: mrr_at_1000 value: 47.111999999999995 - type: mrr_at_3 value: 44.268 - type: mrr_at_5 value: 45.389 - type: ndcg_at_1 value: 38.153 - type: ndcg_at_10 value: 45.925 - type: ndcg_at_100 value: 50.394000000000005 - type: ndcg_at_1000 value: 52.37500000000001 - type: ndcg_at_3 value: 41.754000000000005 - type: ndcg_at_5 value: 43.574 - type: precision_at_1 value: 38.153 - type: precision_at_10 value: 8.796 - type: precision_at_100 value: 1.432 - type: precision_at_1000 value: 0.189 - type: precision_at_3 value: 20.318 - type: precision_at_5 value: 14.395 - type: recall_at_1 value: 30.262 - type: recall_at_10 value: 55.72200000000001 - type: recall_at_100 value: 74.97500000000001 - type: recall_at_1000 value: 87.342 - type: recall_at_3 value: 43.129 - type: recall_at_5 value: 48.336 - type: map_at_1 value: 39.951 - type: map_at_10 value: 51.248000000000005 - type: map_at_100 value: 52.188 - type: map_at_1000 value: 52.247 - type: map_at_3 value: 48.211 - type: map_at_5 value: 49.797000000000004 - type: mrr_at_1 value: 45.329 - type: mrr_at_10 value: 54.749 - type: mrr_at_100 value: 55.367999999999995 - type: mrr_at_1000 value: 55.400000000000006 - type: mrr_at_3 value: 52.382 - type: mrr_at_5 value: 53.649 - type: ndcg_at_1 value: 45.329 - type: ndcg_at_10 
value: 56.847 - type: ndcg_at_100 value: 60.738 - type: ndcg_at_1000 value: 61.976 - type: ndcg_at_3 value: 51.59 - type: ndcg_at_5 value: 53.915 - type: precision_at_1 value: 45.329 - type: precision_at_10 value: 8.959 - type: precision_at_100 value: 1.187 - type: precision_at_1000 value: 0.134 - type: precision_at_3 value: 22.612 - type: precision_at_5 value: 15.273 - type: recall_at_1 value: 39.951 - type: recall_at_10 value: 70.053 - type: recall_at_100 value: 86.996 - type: recall_at_1000 value: 95.707 - type: recall_at_3 value: 56.032000000000004 - type: recall_at_5 value: 61.629999999999995 - type: map_at_1 value: 25.566 - type: map_at_10 value: 33.207 - type: map_at_100 value: 34.166000000000004 - type: map_at_1000 value: 34.245 - type: map_at_3 value: 30.94 - type: map_at_5 value: 32.01 - type: mrr_at_1 value: 27.345000000000002 - type: mrr_at_10 value: 35.193000000000005 - type: mrr_at_100 value: 35.965 - type: mrr_at_1000 value: 36.028999999999996 - type: mrr_at_3 value: 32.806000000000004 - type: mrr_at_5 value: 34.021 - type: ndcg_at_1 value: 27.345000000000002 - type: ndcg_at_10 value: 37.891999999999996 - type: ndcg_at_100 value: 42.664 - type: ndcg_at_1000 value: 44.757000000000005 - type: ndcg_at_3 value: 33.123000000000005 - type: ndcg_at_5 value: 35.035 - type: precision_at_1 value: 27.345000000000002 - type: precision_at_10 value: 5.763 - type: precision_at_100 value: 0.859 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 13.71 - type: precision_at_5 value: 9.401 - type: recall_at_1 value: 25.566 - type: recall_at_10 value: 50.563 - type: recall_at_100 value: 72.86399999999999 - type: recall_at_1000 value: 88.68599999999999 - type: recall_at_3 value: 37.43 - type: recall_at_5 value: 41.894999999999996 - type: map_at_1 value: 16.663 - type: map_at_10 value: 23.552 - type: map_at_100 value: 24.538 - type: map_at_1000 value: 24.661 - type: map_at_3 value: 21.085 - type: map_at_5 value: 22.391 - type: mrr_at_1 value: 20.025000000000002 - type: mrr_at_10 value: 27.643 - type: mrr_at_100 value: 28.499999999999996 - type: mrr_at_1000 value: 28.582 - type: mrr_at_3 value: 25.083 - type: mrr_at_5 value: 26.544 - type: ndcg_at_1 value: 20.025000000000002 - type: ndcg_at_10 value: 28.272000000000002 - type: ndcg_at_100 value: 33.353 - type: ndcg_at_1000 value: 36.454 - type: ndcg_at_3 value: 23.579 - type: ndcg_at_5 value: 25.685000000000002 - type: precision_at_1 value: 20.025000000000002 - type: precision_at_10 value: 5.187 - type: precision_at_100 value: 0.897 - type: precision_at_1000 value: 0.13 - type: precision_at_3 value: 10.987 - type: precision_at_5 value: 8.06 - type: recall_at_1 value: 16.663 - type: recall_at_10 value: 38.808 - type: recall_at_100 value: 61.305 - type: recall_at_1000 value: 83.571 - type: recall_at_3 value: 25.907999999999998 - type: recall_at_5 value: 31.214 - type: map_at_1 value: 27.695999999999998 - type: map_at_10 value: 37.018 - type: map_at_100 value: 38.263000000000005 - type: map_at_1000 value: 38.371 - type: map_at_3 value: 34.226 - type: map_at_5 value: 35.809999999999995 - type: mrr_at_1 value: 32.916000000000004 - type: mrr_at_10 value: 42.067 - type: mrr_at_100 value: 42.925000000000004 - type: mrr_at_1000 value: 42.978 - type: mrr_at_3 value: 39.637 - type: mrr_at_5 value: 41.134 - type: ndcg_at_1 value: 32.916000000000004 - type: ndcg_at_10 value: 42.539 - type: ndcg_at_100 value: 47.873 - type: ndcg_at_1000 value: 50.08200000000001 - type: ndcg_at_3 value: 37.852999999999994 - type: ndcg_at_5 value: 40.201 - type: 
precision_at_1 value: 32.916000000000004 - type: precision_at_10 value: 7.5840000000000005 - type: precision_at_100 value: 1.199 - type: precision_at_1000 value: 0.155 - type: precision_at_3 value: 17.485 - type: precision_at_5 value: 12.512 - type: recall_at_1 value: 27.695999999999998 - type: recall_at_10 value: 53.638 - type: recall_at_100 value: 76.116 - type: recall_at_1000 value: 91.069 - type: recall_at_3 value: 41.13 - type: recall_at_5 value: 46.872 - type: map_at_1 value: 24.108 - type: map_at_10 value: 33.372 - type: map_at_100 value: 34.656 - type: map_at_1000 value: 34.768 - type: map_at_3 value: 30.830999999999996 - type: map_at_5 value: 32.204 - type: mrr_at_1 value: 29.110000000000003 - type: mrr_at_10 value: 37.979 - type: mrr_at_100 value: 38.933 - type: mrr_at_1000 value: 38.988 - type: mrr_at_3 value: 35.731 - type: mrr_at_5 value: 36.963 - type: ndcg_at_1 value: 29.110000000000003 - type: ndcg_at_10 value: 38.635000000000005 - type: ndcg_at_100 value: 44.324999999999996 - type: ndcg_at_1000 value: 46.747 - type: ndcg_at_3 value: 34.37 - type: ndcg_at_5 value: 36.228 - type: precision_at_1 value: 29.110000000000003 - type: precision_at_10 value: 6.963 - type: precision_at_100 value: 1.146 - type: precision_at_1000 value: 0.152 - type: precision_at_3 value: 16.400000000000002 - type: precision_at_5 value: 11.552999999999999 - type: recall_at_1 value: 24.108 - type: recall_at_10 value: 49.597 - type: recall_at_100 value: 73.88900000000001 - type: recall_at_1000 value: 90.62400000000001 - type: recall_at_3 value: 37.662 - type: recall_at_5 value: 42.565 - type: map_at_1 value: 25.00791666666667 - type: map_at_10 value: 33.287749999999996 - type: map_at_100 value: 34.41141666666667 - type: map_at_1000 value: 34.52583333333333 - type: map_at_3 value: 30.734416666666668 - type: map_at_5 value: 32.137166666666666 - type: mrr_at_1 value: 29.305666666666664 - type: mrr_at_10 value: 37.22966666666666 - type: mrr_at_100 value: 38.066583333333334 - type: mrr_at_1000 value: 38.12616666666667 - type: mrr_at_3 value: 34.92275 - type: mrr_at_5 value: 36.23333333333334 - type: ndcg_at_1 value: 29.305666666666664 - type: ndcg_at_10 value: 38.25533333333333 - type: ndcg_at_100 value: 43.25266666666666 - type: ndcg_at_1000 value: 45.63583333333334 - type: ndcg_at_3 value: 33.777166666666666 - type: ndcg_at_5 value: 35.85 - type: precision_at_1 value: 29.305666666666664 - type: precision_at_10 value: 6.596416666666667 - type: precision_at_100 value: 1.0784166666666668 - type: precision_at_1000 value: 0.14666666666666664 - type: precision_at_3 value: 15.31075 - type: precision_at_5 value: 10.830916666666667 - type: recall_at_1 value: 25.00791666666667 - type: recall_at_10 value: 49.10933333333333 - type: recall_at_100 value: 71.09216666666667 - type: recall_at_1000 value: 87.77725000000001 - type: recall_at_3 value: 36.660916666666665 - type: recall_at_5 value: 41.94149999999999 - type: map_at_1 value: 23.521 - type: map_at_10 value: 30.043 - type: map_at_100 value: 30.936000000000003 - type: map_at_1000 value: 31.022 - type: map_at_3 value: 27.926000000000002 - type: map_at_5 value: 29.076999999999998 - type: mrr_at_1 value: 26.227 - type: mrr_at_10 value: 32.822 - type: mrr_at_100 value: 33.61 - type: mrr_at_1000 value: 33.672000000000004 - type: mrr_at_3 value: 30.776999999999997 - type: mrr_at_5 value: 31.866 - type: ndcg_at_1 value: 26.227 - type: ndcg_at_10 value: 34.041 - type: ndcg_at_100 value: 38.394 - type: ndcg_at_1000 value: 40.732 - type: ndcg_at_3 value: 30.037999999999997 - 
type: ndcg_at_5 value: 31.845000000000002 - type: precision_at_1 value: 26.227 - type: precision_at_10 value: 5.244999999999999 - type: precision_at_100 value: 0.808 - type: precision_at_1000 value: 0.107 - type: precision_at_3 value: 12.679000000000002 - type: precision_at_5 value: 8.773 - type: recall_at_1 value: 23.521 - type: recall_at_10 value: 43.633 - type: recall_at_100 value: 63.126000000000005 - type: recall_at_1000 value: 80.765 - type: recall_at_3 value: 32.614 - type: recall_at_5 value: 37.15 - type: map_at_1 value: 16.236 - type: map_at_10 value: 22.898 - type: map_at_100 value: 23.878 - type: map_at_1000 value: 24.009 - type: map_at_3 value: 20.87 - type: map_at_5 value: 22.025 - type: mrr_at_1 value: 19.339000000000002 - type: mrr_at_10 value: 26.382 - type: mrr_at_100 value: 27.245 - type: mrr_at_1000 value: 27.33 - type: mrr_at_3 value: 24.386 - type: mrr_at_5 value: 25.496000000000002 - type: ndcg_at_1 value: 19.339000000000002 - type: ndcg_at_10 value: 27.139999999999997 - type: ndcg_at_100 value: 31.944 - type: ndcg_at_1000 value: 35.077999999999996 - type: ndcg_at_3 value: 23.424 - type: ndcg_at_5 value: 25.188 - type: precision_at_1 value: 19.339000000000002 - type: precision_at_10 value: 4.8309999999999995 - type: precision_at_100 value: 0.845 - type: precision_at_1000 value: 0.128 - type: precision_at_3 value: 10.874 - type: precision_at_5 value: 7.825 - type: recall_at_1 value: 16.236 - type: recall_at_10 value: 36.513 - type: recall_at_100 value: 57.999 - type: recall_at_1000 value: 80.512 - type: recall_at_3 value: 26.179999999999996 - type: recall_at_5 value: 30.712 - type: map_at_1 value: 24.11 - type: map_at_10 value: 31.566 - type: map_at_100 value: 32.647 - type: map_at_1000 value: 32.753 - type: map_at_3 value: 29.24 - type: map_at_5 value: 30.564999999999998 - type: mrr_at_1 value: 28.265 - type: mrr_at_10 value: 35.504000000000005 - type: mrr_at_100 value: 36.436 - type: mrr_at_1000 value: 36.503 - type: mrr_at_3 value: 33.349000000000004 - type: mrr_at_5 value: 34.622 - type: ndcg_at_1 value: 28.265 - type: ndcg_at_10 value: 36.192 - type: ndcg_at_100 value: 41.388000000000005 - type: ndcg_at_1000 value: 43.948 - type: ndcg_at_3 value: 31.959 - type: ndcg_at_5 value: 33.998 - type: precision_at_1 value: 28.265 - type: precision_at_10 value: 5.989 - type: precision_at_100 value: 0.9650000000000001 - type: precision_at_1000 value: 0.13 - type: precision_at_3 value: 14.335 - type: precision_at_5 value: 10.112 - type: recall_at_1 value: 24.11 - type: recall_at_10 value: 46.418 - type: recall_at_100 value: 69.314 - type: recall_at_1000 value: 87.397 - type: recall_at_3 value: 34.724 - type: recall_at_5 value: 39.925 - type: map_at_1 value: 22.091 - type: map_at_10 value: 29.948999999999998 - type: map_at_100 value: 31.502000000000002 - type: map_at_1000 value: 31.713 - type: map_at_3 value: 27.464 - type: map_at_5 value: 28.968 - type: mrr_at_1 value: 26.482 - type: mrr_at_10 value: 34.009 - type: mrr_at_100 value: 35.081 - type: mrr_at_1000 value: 35.138000000000005 - type: mrr_at_3 value: 31.785000000000004 - type: mrr_at_5 value: 33.178999999999995 - type: ndcg_at_1 value: 26.482 - type: ndcg_at_10 value: 35.008 - type: ndcg_at_100 value: 41.272999999999996 - type: ndcg_at_1000 value: 43.972 - type: ndcg_at_3 value: 30.804 - type: ndcg_at_5 value: 33.046 - type: precision_at_1 value: 26.482 - type: precision_at_10 value: 6.462 - type: precision_at_100 value: 1.431 - type: precision_at_1000 value: 0.22899999999999998 - type: precision_at_3 value: 
14.360999999999999 - type: precision_at_5 value: 10.474 - type: recall_at_1 value: 22.091 - type: recall_at_10 value: 45.125 - type: recall_at_100 value: 72.313 - type: recall_at_1000 value: 89.503 - type: recall_at_3 value: 33.158 - type: recall_at_5 value: 39.086999999999996 - type: map_at_1 value: 19.883 - type: map_at_10 value: 26.951000000000004 - type: map_at_100 value: 27.927999999999997 - type: map_at_1000 value: 28.022000000000002 - type: map_at_3 value: 24.616 - type: map_at_5 value: 25.917 - type: mrr_at_1 value: 21.996 - type: mrr_at_10 value: 29.221000000000004 - type: mrr_at_100 value: 30.024 - type: mrr_at_1000 value: 30.095 - type: mrr_at_3 value: 26.833000000000002 - type: mrr_at_5 value: 28.155 - type: ndcg_at_1 value: 21.996 - type: ndcg_at_10 value: 31.421 - type: ndcg_at_100 value: 36.237 - type: ndcg_at_1000 value: 38.744 - type: ndcg_at_3 value: 26.671 - type: ndcg_at_5 value: 28.907 - type: precision_at_1 value: 21.996 - type: precision_at_10 value: 5.009 - type: precision_at_100 value: 0.799 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 11.275 - type: precision_at_5 value: 8.059 - type: recall_at_1 value: 19.883 - type: recall_at_10 value: 43.132999999999996 - type: recall_at_100 value: 65.654 - type: recall_at_1000 value: 84.492 - type: recall_at_3 value: 30.209000000000003 - type: recall_at_5 value: 35.616 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 17.756 - type: map_at_10 value: 30.378 - type: map_at_100 value: 32.537 - type: map_at_1000 value: 32.717 - type: map_at_3 value: 25.599 - type: map_at_5 value: 28.372999999999998 - type: mrr_at_1 value: 41.303 - type: mrr_at_10 value: 53.483999999999995 - type: mrr_at_100 value: 54.106 - type: mrr_at_1000 value: 54.127 - type: mrr_at_3 value: 50.315 - type: mrr_at_5 value: 52.396 - type: ndcg_at_1 value: 41.303 - type: ndcg_at_10 value: 40.503 - type: ndcg_at_100 value: 47.821000000000005 - type: ndcg_at_1000 value: 50.788 - type: ndcg_at_3 value: 34.364 - type: ndcg_at_5 value: 36.818 - type: precision_at_1 value: 41.303 - type: precision_at_10 value: 12.463000000000001 - type: precision_at_100 value: 2.037 - type: precision_at_1000 value: 0.26 - type: precision_at_3 value: 25.798 - type: precision_at_5 value: 19.896 - type: recall_at_1 value: 17.756 - type: recall_at_10 value: 46.102 - type: recall_at_100 value: 70.819 - type: recall_at_1000 value: 87.21799999999999 - type: recall_at_3 value: 30.646 - type: recall_at_5 value: 38.022 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 9.033 - type: map_at_10 value: 20.584 - type: map_at_100 value: 29.518 - type: map_at_1000 value: 31.186000000000003 - type: map_at_3 value: 14.468 - type: map_at_5 value: 17.177 - type: mrr_at_1 value: 69.75 - type: mrr_at_10 value: 77.025 - type: mrr_at_100 value: 77.36699999999999 - type: mrr_at_1000 value: 77.373 - type: mrr_at_3 value: 75.583 - type: mrr_at_5 value: 76.396 - type: ndcg_at_1 value: 58.5 - type: ndcg_at_10 value: 45.033 - type: ndcg_at_100 value: 49.071 - type: ndcg_at_1000 value: 56.056 - type: ndcg_at_3 value: 49.936 - type: ndcg_at_5 value: 47.471999999999994 - type: precision_at_1 value: 69.75 - type: precision_at_10 value: 35.775 - type: precision_at_100 value: 11.594999999999999 - type: precision_at_1000 value: 2.062 - type: precision_at_3 value: 52.5 - type: 
precision_at_5 value: 45.300000000000004 - type: recall_at_1 value: 9.033 - type: recall_at_10 value: 26.596999999999998 - type: recall_at_100 value: 54.607000000000006 - type: recall_at_1000 value: 76.961 - type: recall_at_3 value: 15.754999999999999 - type: recall_at_5 value: 20.033 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 48.345000000000006 - type: f1 value: 43.4514918068706 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 71.29100000000001 - type: map_at_10 value: 81.059 - type: map_at_100 value: 81.341 - type: map_at_1000 value: 81.355 - type: map_at_3 value: 79.74799999999999 - type: map_at_5 value: 80.612 - type: mrr_at_1 value: 76.40299999999999 - type: mrr_at_10 value: 84.615 - type: mrr_at_100 value: 84.745 - type: mrr_at_1000 value: 84.748 - type: mrr_at_3 value: 83.776 - type: mrr_at_5 value: 84.343 - type: ndcg_at_1 value: 76.40299999999999 - type: ndcg_at_10 value: 84.981 - type: ndcg_at_100 value: 86.00999999999999 - type: ndcg_at_1000 value: 86.252 - type: ndcg_at_3 value: 82.97 - type: ndcg_at_5 value: 84.152 - type: precision_at_1 value: 76.40299999999999 - type: precision_at_10 value: 10.446 - type: precision_at_100 value: 1.1199999999999999 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 32.147999999999996 - type: precision_at_5 value: 20.135 - type: recall_at_1 value: 71.29100000000001 - type: recall_at_10 value: 93.232 - type: recall_at_100 value: 97.363 - type: recall_at_1000 value: 98.905 - type: recall_at_3 value: 87.893 - type: recall_at_5 value: 90.804 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 18.667 - type: map_at_10 value: 30.853 - type: map_at_100 value: 32.494 - type: map_at_1000 value: 32.677 - type: map_at_3 value: 26.91 - type: map_at_5 value: 29.099000000000004 - type: mrr_at_1 value: 37.191 - type: mrr_at_10 value: 46.171 - type: mrr_at_100 value: 47.056 - type: mrr_at_1000 value: 47.099000000000004 - type: mrr_at_3 value: 44.059 - type: mrr_at_5 value: 45.147 - type: ndcg_at_1 value: 37.191 - type: ndcg_at_10 value: 38.437 - type: ndcg_at_100 value: 44.62 - type: ndcg_at_1000 value: 47.795 - type: ndcg_at_3 value: 35.003 - type: ndcg_at_5 value: 36.006 - type: precision_at_1 value: 37.191 - type: precision_at_10 value: 10.586 - type: precision_at_100 value: 1.688 - type: precision_at_1000 value: 0.22699999999999998 - type: precision_at_3 value: 23.302 - type: precision_at_5 value: 17.006 - type: recall_at_1 value: 18.667 - type: recall_at_10 value: 45.367000000000004 - type: recall_at_100 value: 68.207 - type: recall_at_1000 value: 87.072 - type: recall_at_3 value: 32.129000000000005 - type: recall_at_5 value: 37.719 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 39.494 - type: map_at_10 value: 66.223 - type: map_at_100 value: 67.062 - type: map_at_1000 value: 67.11500000000001 - type: map_at_3 value: 62.867 - type: map_at_5 value: 64.994 - type: mrr_at_1 value: 78.987 - type: mrr_at_10 value: 84.585 - type: mrr_at_100 value: 84.773 - type: mrr_at_1000 value: 84.77900000000001 - type: mrr_at_3 value: 83.592 - type: mrr_at_5 value: 84.235 - type: ndcg_at_1 value: 78.987 - type: ndcg_at_10 value: 
73.64 - type: ndcg_at_100 value: 76.519 - type: ndcg_at_1000 value: 77.51 - type: ndcg_at_3 value: 68.893 - type: ndcg_at_5 value: 71.585 - type: precision_at_1 value: 78.987 - type: precision_at_10 value: 15.529000000000002 - type: precision_at_100 value: 1.7770000000000001 - type: precision_at_1000 value: 0.191 - type: precision_at_3 value: 44.808 - type: precision_at_5 value: 29.006999999999998 - type: recall_at_1 value: 39.494 - type: recall_at_10 value: 77.643 - type: recall_at_100 value: 88.825 - type: recall_at_1000 value: 95.321 - type: recall_at_3 value: 67.211 - type: recall_at_5 value: 72.519 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 85.55959999999999 - type: ap value: 80.7246500384617 - type: f1 value: 85.52336485065454 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 23.631 - type: map_at_10 value: 36.264 - type: map_at_100 value: 37.428 - type: map_at_1000 value: 37.472 - type: map_at_3 value: 32.537 - type: map_at_5 value: 34.746 - type: mrr_at_1 value: 24.312 - type: mrr_at_10 value: 36.858000000000004 - type: mrr_at_100 value: 37.966 - type: mrr_at_1000 value: 38.004 - type: mrr_at_3 value: 33.188 - type: mrr_at_5 value: 35.367 - type: ndcg_at_1 value: 24.312 - type: ndcg_at_10 value: 43.126999999999995 - type: ndcg_at_100 value: 48.642 - type: ndcg_at_1000 value: 49.741 - type: ndcg_at_3 value: 35.589 - type: ndcg_at_5 value: 39.515 - type: precision_at_1 value: 24.312 - type: precision_at_10 value: 6.699 - type: precision_at_100 value: 0.9450000000000001 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 15.153 - type: precision_at_5 value: 11.065999999999999 - type: recall_at_1 value: 23.631 - type: recall_at_10 value: 64.145 - type: recall_at_100 value: 89.41 - type: recall_at_1000 value: 97.83500000000001 - type: recall_at_3 value: 43.769000000000005 - type: recall_at_5 value: 53.169 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.4108527131783 - type: f1 value: 93.1415880261038 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 77.24806201550388 - type: f1 value: 60.531916308197175 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 73.71553463349024 - type: f1 value: 71.70753174900791 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 77.79757901815736 - type: f1 value: 77.83719850433258 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 33.74193296622113 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 
35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 30.64257594108566 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 30.811018518883625 - type: mrr value: 31.910376577445003 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.409 - type: map_at_10 value: 13.093 - type: map_at_100 value: 16.256999999999998 - type: map_at_1000 value: 17.617 - type: map_at_3 value: 9.555 - type: map_at_5 value: 11.428 - type: mrr_at_1 value: 45.201 - type: mrr_at_10 value: 54.179 - type: mrr_at_100 value: 54.812000000000005 - type: mrr_at_1000 value: 54.840999999999994 - type: mrr_at_3 value: 51.909000000000006 - type: mrr_at_5 value: 53.519000000000005 - type: ndcg_at_1 value: 43.189 - type: ndcg_at_10 value: 35.028 - type: ndcg_at_100 value: 31.226 - type: ndcg_at_1000 value: 39.678000000000004 - type: ndcg_at_3 value: 40.596 - type: ndcg_at_5 value: 38.75 - type: precision_at_1 value: 44.582 - type: precision_at_10 value: 25.974999999999998 - type: precision_at_100 value: 7.793 - type: precision_at_1000 value: 2.036 - type: precision_at_3 value: 38.493 - type: precision_at_5 value: 33.994 - type: recall_at_1 value: 5.409 - type: recall_at_10 value: 16.875999999999998 - type: recall_at_100 value: 30.316 - type: recall_at_1000 value: 60.891 - type: recall_at_3 value: 10.688 - type: recall_at_5 value: 13.832 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 36.375 - type: map_at_10 value: 51.991 - type: map_at_100 value: 52.91400000000001 - type: map_at_1000 value: 52.93600000000001 - type: map_at_3 value: 48.014 - type: map_at_5 value: 50.381 - type: mrr_at_1 value: 40.759 - type: mrr_at_10 value: 54.617000000000004 - type: mrr_at_100 value: 55.301 - type: mrr_at_1000 value: 55.315000000000005 - type: mrr_at_3 value: 51.516 - type: mrr_at_5 value: 53.435 - type: ndcg_at_1 value: 40.759 - type: ndcg_at_10 value: 59.384 - type: ndcg_at_100 value: 63.157 - type: ndcg_at_1000 value: 63.654999999999994 - type: ndcg_at_3 value: 52.114000000000004 - type: ndcg_at_5 value: 55.986000000000004 - type: precision_at_1 value: 40.759 - type: precision_at_10 value: 9.411999999999999 - type: precision_at_100 value: 1.153 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 23.329 - type: precision_at_5 value: 16.256999999999998 - type: recall_at_1 value: 36.375 - type: recall_at_10 value: 79.053 - type: recall_at_100 value: 95.167 - type: recall_at_1000 value: 98.82 - type: recall_at_3 value: 60.475 - type: recall_at_5 value: 69.327 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.256 - type: map_at_10 value: 83.8 - type: map_at_100 value: 84.425 - type: map_at_1000 value: 84.444 - type: map_at_3 value: 80.906 - type: map_at_5 value: 82.717 - type: mrr_at_1 value: 80.97999999999999 - type: mrr_at_10 value: 87.161 - type: mrr_at_100 value: 87.262 - type: mrr_at_1000 value: 87.263 - type: mrr_at_3 value: 86.175 - type: mrr_at_5 value: 86.848 - type: ndcg_at_1 value: 80.97999999999999 - type: ndcg_at_10 value: 87.697 - type: ndcg_at_100 value: 88.959 - type: ndcg_at_1000 value: 89.09899999999999 - type: ndcg_at_3 value: 84.83800000000001 - type: ndcg_at_5 value: 86.401 - 
type: precision_at_1 value: 80.97999999999999 - type: precision_at_10 value: 13.261000000000001 - type: precision_at_100 value: 1.5150000000000001 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 37.01 - type: precision_at_5 value: 24.298000000000002 - type: recall_at_1 value: 70.256 - type: recall_at_10 value: 94.935 - type: recall_at_100 value: 99.274 - type: recall_at_1000 value: 99.928 - type: recall_at_3 value: 86.602 - type: recall_at_5 value: 91.133 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 56.322692497613104 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 61.895813503775074 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.338 - type: map_at_10 value: 10.767 - type: map_at_100 value: 12.537999999999998 - type: map_at_1000 value: 12.803999999999998 - type: map_at_3 value: 7.788 - type: map_at_5 value: 9.302000000000001 - type: mrr_at_1 value: 21.4 - type: mrr_at_10 value: 31.637999999999998 - type: mrr_at_100 value: 32.688 - type: mrr_at_1000 value: 32.756 - type: mrr_at_3 value: 28.433000000000003 - type: mrr_at_5 value: 30.178 - type: ndcg_at_1 value: 21.4 - type: ndcg_at_10 value: 18.293 - type: ndcg_at_100 value: 25.274 - type: ndcg_at_1000 value: 30.284 - type: ndcg_at_3 value: 17.391000000000002 - type: ndcg_at_5 value: 15.146999999999998 - type: precision_at_1 value: 21.4 - type: precision_at_10 value: 9.48 - type: precision_at_100 value: 1.949 - type: precision_at_1000 value: 0.316 - type: precision_at_3 value: 16.167 - type: precision_at_5 value: 13.22 - type: recall_at_1 value: 4.338 - type: recall_at_10 value: 19.213 - type: recall_at_100 value: 39.562999999999995 - type: recall_at_1000 value: 64.08 - type: recall_at_3 value: 9.828000000000001 - type: recall_at_5 value: 13.383000000000001 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.42568163642142 - type: cos_sim_spearman value: 78.5797159641342 - type: euclidean_pearson value: 80.22151260811604 - type: euclidean_spearman value: 78.5797151953878 - type: manhattan_pearson value: 80.21224215864788 - type: manhattan_spearman value: 78.55641478381344 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 85.44020710812569 - type: cos_sim_spearman value: 78.91631735081286 - type: euclidean_pearson value: 81.64188964182102 - type: euclidean_spearman value: 78.91633286881678 - type: manhattan_pearson value: 81.69294748512496 - type: manhattan_spearman value: 78.93438558002656 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 84.27165426412311 - type: cos_sim_spearman value: 85.40429140249618 - type: euclidean_pearson value: 84.7509580724893 - type: euclidean_spearman value: 85.40429140249618 - type: manhattan_pearson value: 84.76488289321308 - type: manhattan_spearman value: 85.4256793698708 - task: 
type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 83.138851760732 - type: cos_sim_spearman value: 81.64101363896586 - type: euclidean_pearson value: 82.55165038934942 - type: euclidean_spearman value: 81.64105257080502 - type: manhattan_pearson value: 82.52802949883335 - type: manhattan_spearman value: 81.61255430718158 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.0654695484029 - type: cos_sim_spearman value: 87.20408521902229 - type: euclidean_pearson value: 86.8110651362115 - type: euclidean_spearman value: 87.20408521902229 - type: manhattan_pearson value: 86.77984656478691 - type: manhattan_spearman value: 87.1719947099227 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 83.77823915496512 - type: cos_sim_spearman value: 85.43566325729779 - type: euclidean_pearson value: 84.5396956658821 - type: euclidean_spearman value: 85.43566325729779 - type: manhattan_pearson value: 84.5665398848169 - type: manhattan_spearman value: 85.44375870303232 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.20030208471798 - type: cos_sim_spearman value: 87.20485505076539 - type: euclidean_pearson value: 88.10588324368722 - type: euclidean_spearman value: 87.20485505076539 - type: manhattan_pearson value: 87.92324770415183 - type: manhattan_spearman value: 87.0571314561877 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 63.06093161604453 - type: cos_sim_spearman value: 64.2163140357722 - type: euclidean_pearson value: 65.27589680994006 - type: euclidean_spearman value: 64.2163140357722 - type: manhattan_pearson value: 65.45904383711101 - type: manhattan_spearman value: 64.55404716679305 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.32976164578706 - type: cos_sim_spearman value: 85.54302197678368 - type: euclidean_pearson value: 85.26307149193056 - type: euclidean_spearman value: 85.54302197678368 - type: manhattan_pearson value: 85.26647282029371 - type: manhattan_spearman value: 85.5316135265568 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 81.44675968318754 - type: mrr value: 94.92741826075158 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 56.34400000000001 - type: map_at_10 value: 65.927 - type: map_at_100 value: 66.431 - type: map_at_1000 value: 66.461 - type: map_at_3 value: 63.529 - type: map_at_5 value: 64.818 - type: mrr_at_1 value: 59.333000000000006 - type: mrr_at_10 value: 67.54599999999999 - type: mrr_at_100 value: 67.892 - type: mrr_at_1000 value: 67.917 - type: mrr_at_3 value: 65.778 - type: mrr_at_5 value: 
66.794 - type: ndcg_at_1 value: 59.333000000000006 - type: ndcg_at_10 value: 70.5 - type: ndcg_at_100 value: 72.688 - type: ndcg_at_1000 value: 73.483 - type: ndcg_at_3 value: 66.338 - type: ndcg_at_5 value: 68.265 - type: precision_at_1 value: 59.333000000000006 - type: precision_at_10 value: 9.3 - type: precision_at_100 value: 1.053 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 25.889 - type: precision_at_5 value: 16.866999999999997 - type: recall_at_1 value: 56.34400000000001 - type: recall_at_10 value: 82.789 - type: recall_at_100 value: 92.767 - type: recall_at_1000 value: 99 - type: recall_at_3 value: 71.64399999999999 - type: recall_at_5 value: 76.322 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.75742574257426 - type: cos_sim_ap value: 93.52081548447406 - type: cos_sim_f1 value: 87.33850129198966 - type: cos_sim_precision value: 90.37433155080214 - type: cos_sim_recall value: 84.5 - type: dot_accuracy value: 99.75742574257426 - type: dot_ap value: 93.52081548447406 - type: dot_f1 value: 87.33850129198966 - type: dot_precision value: 90.37433155080214 - type: dot_recall value: 84.5 - type: euclidean_accuracy value: 99.75742574257426 - type: euclidean_ap value: 93.52081548447406 - type: euclidean_f1 value: 87.33850129198966 - type: euclidean_precision value: 90.37433155080214 - type: euclidean_recall value: 84.5 - type: manhattan_accuracy value: 99.75841584158415 - type: manhattan_ap value: 93.4975678585854 - type: manhattan_f1 value: 87.26708074534162 - type: manhattan_precision value: 90.45064377682404 - type: manhattan_recall value: 84.3 - type: max_accuracy value: 99.75841584158415 - type: max_ap value: 93.52081548447406 - type: max_f1 value: 87.33850129198966 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 64.31437036686651 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 33.25569319007206 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 49.90474939720706 - type: mrr value: 50.568115503777264 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 29.866828641244712 - type: cos_sim_spearman value: 30.077555055873866 - type: dot_pearson value: 29.866832988572266 - type: dot_spearman value: 30.077555055873866 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.232 - type: map_at_10 value: 2.094 - type: map_at_100 value: 11.971 - type: map_at_1000 value: 28.158 - type: map_at_3 value: 0.688 - type: map_at_5 value: 1.114 - type: mrr_at_1 value: 88 - type: mrr_at_10 value: 93.4 - type: mrr_at_100 value: 93.4 - type: mrr_at_1000 value: 93.4 - type: mrr_at_3 value: 93 - type: mrr_at_5 value: 93.4 - 
type: ndcg_at_1 value: 84 - type: ndcg_at_10 value: 79.923 - type: ndcg_at_100 value: 61.17 - type: ndcg_at_1000 value: 53.03 - type: ndcg_at_3 value: 84.592 - type: ndcg_at_5 value: 82.821 - type: precision_at_1 value: 88 - type: precision_at_10 value: 85 - type: precision_at_100 value: 63.019999999999996 - type: precision_at_1000 value: 23.554 - type: precision_at_3 value: 89.333 - type: precision_at_5 value: 87.2 - type: recall_at_1 value: 0.232 - type: recall_at_10 value: 2.255 - type: recall_at_100 value: 14.823 - type: recall_at_1000 value: 49.456 - type: recall_at_3 value: 0.718 - type: recall_at_5 value: 1.175 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.547 - type: map_at_10 value: 11.375 - type: map_at_100 value: 18.194 - type: map_at_1000 value: 19.749 - type: map_at_3 value: 5.825 - type: map_at_5 value: 8.581 - type: mrr_at_1 value: 32.653 - type: mrr_at_10 value: 51.32 - type: mrr_at_100 value: 51.747 - type: mrr_at_1000 value: 51.747 - type: mrr_at_3 value: 47.278999999999996 - type: mrr_at_5 value: 48.605 - type: ndcg_at_1 value: 29.592000000000002 - type: ndcg_at_10 value: 28.151 - type: ndcg_at_100 value: 39.438 - type: ndcg_at_1000 value: 50.769 - type: ndcg_at_3 value: 30.758999999999997 - type: ndcg_at_5 value: 30.366 - type: precision_at_1 value: 32.653 - type: precision_at_10 value: 25.714 - type: precision_at_100 value: 8.041 - type: precision_at_1000 value: 1.555 - type: precision_at_3 value: 33.333 - type: precision_at_5 value: 31.837 - type: recall_at_1 value: 2.547 - type: recall_at_10 value: 18.19 - type: recall_at_100 value: 49.538 - type: recall_at_1000 value: 83.86 - type: recall_at_3 value: 7.329 - type: recall_at_5 value: 11.532 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.4952 - type: ap value: 14.793362635531409 - type: f1 value: 55.204635551516915 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 61.5365025466893 - type: f1 value: 61.81742556334845 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 49.05531070301185 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.51725576682364 - type: cos_sim_ap value: 75.2292304265163 - type: cos_sim_f1 value: 69.54022988505749 - type: cos_sim_precision value: 63.65629110039457 - type: cos_sim_recall value: 76.62269129287598 - type: dot_accuracy value: 86.51725576682364 - type: dot_ap value: 75.22922386081054 - type: dot_f1 value: 69.54022988505749 - type: dot_precision value: 63.65629110039457 - type: dot_recall value: 76.62269129287598 - type: euclidean_accuracy value: 86.51725576682364 - type: euclidean_ap value: 75.22925730473472 - type: euclidean_f1 value: 69.54022988505749 - type: euclidean_precision value: 63.65629110039457 - type: euclidean_recall value: 76.62269129287598 
- type: manhattan_accuracy value: 86.52321630804077 - type: manhattan_ap value: 75.20608115037336 - type: manhattan_f1 value: 69.60000000000001 - type: manhattan_precision value: 64.37219730941705 - type: manhattan_recall value: 75.75197889182058 - type: max_accuracy value: 86.52321630804077 - type: max_ap value: 75.22925730473472 - type: max_f1 value: 69.60000000000001 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.34877944657896 - type: cos_sim_ap value: 86.71257569277373 - type: cos_sim_f1 value: 79.10386355986088 - type: cos_sim_precision value: 76.91468470434214 - type: cos_sim_recall value: 81.4213119802895 - type: dot_accuracy value: 89.34877944657896 - type: dot_ap value: 86.71257133133368 - type: dot_f1 value: 79.10386355986088 - type: dot_precision value: 76.91468470434214 - type: dot_recall value: 81.4213119802895 - type: euclidean_accuracy value: 89.34877944657896 - type: euclidean_ap value: 86.71257651501476 - type: euclidean_f1 value: 79.10386355986088 - type: euclidean_precision value: 76.91468470434214 - type: euclidean_recall value: 81.4213119802895 - type: manhattan_accuracy value: 89.35848177901967 - type: manhattan_ap value: 86.69330615469126 - type: manhattan_f1 value: 79.13867741453949 - type: manhattan_precision value: 76.78881807647741 - type: manhattan_recall value: 81.63689559593472 - type: max_accuracy value: 89.35848177901967 - type: max_ap value: 86.71257651501476 - type: max_f1 value: 79.13867741453949
---

# nomic-embed-text-v1: A Reproducible Long Context (8192) Text Embedder

`nomic-embed-text-v1` is an 8192 context length text encoder that surpasses OpenAI text-embedding-ada-002 and text-embedding-3-small performance on short and long context tasks.

| Name | SeqLen | MTEB | LoCo | Jina Long Context | Open Weights | Open Training Code | Open Data |
| :-------------------------------:| :----- | :-------- | :------: | :---------------: | :-----------: | :----------------: | :---------- |
| nomic-embed-text-v1 | 8192 | **62.39** |**85.53** | 54.16 | ✅ | ✅ | ✅ |
| jina-embeddings-v2-base-en | 8192 | 60.39 | 85.45 | 51.90 | ✅ | ❌ | ❌ |
| text-embedding-3-small | 8191 | 62.26 | 82.40 | **58.20** | ❌ | ❌ | ❌ |
| text-embedding-ada-002 | 8191 | 60.99 | 52.7 | 55.25 | ❌ | ❌ | ❌ |

## Hosted Inference API

The easiest way to get started with Nomic Embed is through the Nomic Embedding API. Generating embeddings with the `nomic` Python client is as easy as:

```python
from nomic import embed

output = embed.text(
    texts=['Nomic Embedding API', '#keepAIOpen'],
    model='nomic-embed-text-v1',
    task_type='search_document'
)

print(output)
```

For more information, see the [API reference](https://docs.nomic.ai/reference/endpoints/nomic-embed-text).
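Queries take a different task type than documents. A minimal sketch, assuming the client accepts `search_query` in the same way the call above accepts `search_document` (the query text here is made up):

```python
from nomic import embed

# Assumed to be symmetric to the document-side call above:
# only the task_type argument changes for query embeddings.
output = embed.text(
    texts=['What is TSNE?'],
    model='nomic-embed-text-v1',
    task_type='search_query'
)

print(output)
```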
## Data Visualization

Click the Nomic Atlas map below to visualize a 5M sample of our contrastive pretraining data!

[![image/webp](https://cdn-uploads.huggingface.co/production/uploads/607997c83a565c15675055b3/pjhJhuNyRfPagRd_c_iUz.webp)](https://atlas.nomic.ai/map/nomic-text-embed-v1-5m-sample)

## Training Details

We train our embedder using a multi-stage training pipeline. Starting from a long-context [BERT model](https://huggingface.co/nomic-ai/nomic-bert-2048), the first unsupervised contrastive stage trains on a dataset generated from weakly related text pairs, such as question-answer pairs from forums like StackExchange and Quora, title-body pairs from Amazon reviews, and summarizations from news articles.

In the second finetuning stage, higher-quality labeled datasets such as search queries and answers from web searches are leveraged. Data curation and hard-example mining are crucial in this stage.

For more details, see the Nomic Embed [Technical Report](https://static.nomic.ai/reports/2024_Nomic_Embed_Text_Technical_Report.pdf) and corresponding [blog post](https://blog.nomic.ai/posts/nomic-embed-text-v1).

The training data is released in its entirety. For more details, see the `contrastors` [repository](https://github.com/nomic-ai/contrastors).

## Usage

Note `nomic-embed-text` requires prefixes! We support the prefixes `[search_query, search_document, classification, clustering]`. For retrieval applications, you should prepend `search_document` to all your documents and `search_query` to your queries.

### Sentence Transformers

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)
sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']
embeddings = model.encode(sentences)
print(embeddings)
```

### Transformers

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    # Average the token embeddings into one vector per sentence,
    # masking out padding positions.
    token_embeddings = model_output[0]
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']

tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
model.eval()

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    model_output = model(**encoded_input)

embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
# L2-normalize so downstream cosine similarity reduces to a dot product.
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings)
```
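Because the embeddings above are L2-normalized, cosine similarity is just a dot product, which makes a toy retrieval loop a one-liner. A minimal sketch via the Sentence Transformers path (the corpus and query texts are made up; note the matching `search_document:` / `search_query:` prefixes):

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)

# Hypothetical toy corpus: documents carry the 'search_document:' prefix,
# the query carries the 'search_query:' prefix.
docs = [
    'search_document: TSNE is a dimensionality reduction algorithm created by Laurens van der Maaten.',
    'search_document: Paris is the capital of France.',
]
query = 'search_query: Who invented TSNE?'

doc_emb = model.encode(docs, normalize_embeddings=True)     # shape (num_docs, dim)
query_emb = model.encode(query, normalize_embeddings=True)  # shape (dim,)

# Unit-normalized embeddings: dot product == cosine similarity.
scores = doc_emb @ query_emb
best = int(np.argmax(scores))
print(scores, docs[best])
```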
The model natively supports scaling of the sequence length past 2048 tokens. To do so:

```diff
- tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
+ tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192)

- model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True)
+ model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2)
```

### Transformers.js

```js
import { pipeline } from '@xenova/transformers';

// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
    quantized: false, // Comment out this line to use the quantized version
});

// Compute sentence embeddings
const texts = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'];
const embeddings = await extractor(texts, { pooling: 'mean', normalize: true });
console.log(embeddings);
```

# Join the Nomic Community

- Nomic: [https://nomic.ai](https://nomic.ai)
- Discord: [https://discord.gg/myY5YDR8z8](https://discord.gg/myY5YDR8z8)
- Twitter: [https://twitter.com/nomic_ai](https://twitter.com/nomic_ai)

# Citation

If you find the model, dataset, or training code useful, please cite our work:

```bibtex
@misc{nussbaum2024nomic,
      title={Nomic Embed: Training a Reproducible Long Context Text Embedder},
      author={Zach Nussbaum and John X. Morris and Brandon Duderstadt and Andriy Mulyar},
      year={2024},
      eprint={2402.01613},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
blockblockblock/Dark-Miqu-70B-bpw4-exl2
blockblockblock
text-generation
[ "transformers", "safetensors", "llama", "text-generation", "arxiv:2403.19522", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "exl2", "region:us" ]
1,715
1,715
14
2
---
license: other
---

![Dark-Miqu.png](Dark-Miqu.png)

***NOTE***: *A full range of GGUF quants is kindly provided by @mradermacher: [Static](https://huggingface.co/mradermacher/Dark-Miqu-70B-GGUF) and [IMatrix](https://huggingface.co/mradermacher/Dark-Miqu-70B-i1-GGUF).*

A "dark" creative writing model with 32k context. Based on [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) but with greatly reduced "positivity" and "-isms". If you want happy endings, look elsewhere! This model **excels** at writing Dark/Grimdark fantasy (see examples below).

# Model background

Created using [Mergekit](https://github.com/arcee-ai/mergekit) and based on @sophosympatheia's template for [Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0). This model has a lower perplexity than [Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0) (`4.02 +/- 0.02` vs `4.08 +/- 0.02`), and it also generates longer responses when prompted.

The model was created in two stages:

- First, three "Midnight-Miqu-esque" models were produced using spherical interpolation (slerp) merges between [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) and each of the following models: [Midnight-Rose-70B-v2.0.3](https://huggingface.co/sophosympatheia/Midnight-Rose-70B-v2.0.3), [Euryale-1.3-L2-70B](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B) and [WinterGoddess-1.4x-70B-L2](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2). These models were selected for their dark, imaginative writing styles. Various slerp merges between [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) and other models were also experimented with, but these three yielded the darkest creative writing results.
- In the second stage, the three slerp-merged models were combined into a single model using the '[Model Stock](https://arxiv.org/abs/2403.19522)' method, with [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) serving as the base model.

# Prompting format

Vicuna format is preferred:

```
USER: {prompt} ASSISTANT:
```

Mistral and Alpaca formats are also supported:

```
[INST] {prompt} [/INST]
```

```
### Instruction:
{prompt}

### Response:
```
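For scripted generation, the Vicuna template above is simple to assemble by hand. A minimal single-turn sketch (the helper name and example prompt are illustrative; exact whitespace conventions can vary between inference backends):

```python
def vicuna_prompt(user_message: str) -> str:
    # Single-turn Vicuna-style prompt: the trailing "ASSISTANT:" cue is
    # left open for the model to complete.
    return f"USER: {user_message} ASSISTANT:"

prompt = vicuna_prompt("Write me the opening chapter of a grimdark fantasy story.")
print(prompt)
```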
# Licence and usage restrictions

[miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) is a dequantized version of the [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) model leaked from MistralAI. All miqu-derived models, including this merge, are suitable for non-commercial, personal use only.

# Mergekit configuration

The following YAML configuration was used to produce this model:

```yaml
name: midnight-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: sophosympatheia/Midnight-Rose-70B-v2.0.3
base_model: 152334H/miqu-1-70b-sf
merge_method: slerp
parameters:
  t:
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
embed_slerp: true
tokenizer_source: model:miqu-1-70b-sf
dtype: float16
---
name: euryale-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: Sao10K/Euryale-1.3-L2-70B
base_model: 152334H/miqu-1-70b-sf
merge_method: slerp
parameters:
  t:
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
embed_slerp: true
tokenizer_source: model:miqu-1-70b-sf
dtype: float16
---
name: winter-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
base_model: 152334H/miqu-1-70b-sf
merge_method: slerp
parameters:
  t:
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0]
embed_slerp: true
tokenizer_source: model:miqu-1-70b-sf
dtype: float16
---
name: dark-miqu-70b
models:
  - model: 152334H/miqu-1-70b-sf
  - model: midnight-miqu-70b
  - model: euryale-miqu-70b
  - model: winter-miqu-70b
base_model: 152334H/miqu-1-70b-sf
merge_method: model_stock
dtype: float16
```

## Key configuration details:

- '`merge_method: slerp`' uses spherical linear interpolation for merging models.
- '`parameters: t`' controls the interpolation ratios between models.
- '`embed_slerp: true`' applies slerp to the embedding layers.
- '`merge_method: model_stock`' uses the '[Model Stock](https://arxiv.org/abs/2403.19522)' method.

See the [Mergekit documentation](https://github.com/arcee-ai/mergekit) for more on these settings.

**NOTE**: Run with `mergekit-mega` rather than `mergekit`, as there are four documents in this one file.

# Example stories

The following mix of "dark" stories was generated using the Vicuna prompt format with no system message and temperature=0:

## Dark fantasy stories

<details>
<summary>Click to see spoiler</summary>

### USER
Write me the opening chapter of a 'Grimdark' stories trilogy in the style of Joe Abercrombie and Rob J Hayes

### ASSISTANT
Chapter One: The Crimson Dawn

The world was a cruel, unforgiving place, and the city of Vhyrd was its beating, blackened heart. Nestled in the crook of the Serpent's Spine mountains, it was a place where the sun struggled to shine, and the shadows held more sway than the law. It was a city of thieves, murderers, and worse, and it was home to the man who would become known as the Butcher of the Broken Docks. His name was Rennick, but that meant little to the people who feared him. He was a tall, broad-shouldered figure, with a face that had been carved by a drunken sculptor with a penchant for sharp angles and harsh lines. His eyes were a cold, flat grey, like the steel of his twin blades, and his hair was the color of dried blood. He moved through the crowded streets with a predator's grace, his cloak billowing behind him like a banner of death. Rennick had a job to do, and he was good at his job. He was a killer, a thief-taker, a man who dealt in the currency of pain and suffering. He had been hired to find a girl, a slip of a thing who had stolen something valuable from the wrong people. He didn't know what it was, and he didn't care. All that mattered was the coin that would jingle in his purse when he brought her back, preferably in several small, bloody pieces. He had tracked her to the Broken Docks, a maze of rotting wood and crumbling stone that clung to the edge of the city like a leech to a corpse.
It was a place where the desperate went to die, and the desperate went to prey on the desperate. It was a place where the screams of the damned were drowned out by the cries of the gulls and the lap of the oily water against the pilings. Rennick moved through the shadows, his blades at the ready. He could feel the eyes on him, the whispers of his name on the wind. He reveled in it. Fear was a weapon, and he wielded it with the same skill as his swords. He could hear the girl's heartbeat, fast and frantic, like a rabbit's. She was close. He turned a corner and found himself in a narrow alley, the walls pressing in on him like the jaws of a trap. At the far end, he saw her, a small, trembling figure huddled in the doorway of a derelict building. She was young, no more than sixteen summers, with matted hair and ragged clothes. She looked up at him with wide, terrified eyes, and he could see the whites of her teeth as she bit her lip to keep from crying out. He advanced on her slowly, savoring the moment. He could almost taste the fear on her, like a fine wine. "Come now, lass," he said, his voice a low, dangerous purr. "You know you can't run from me. Give me what you took, and I might let you live." She shook her head, her eyes darting left and right, searching for an escape. Rennick smiled, a cold, cruel thing that didn't reach his eyes. "You're not thinking straight," he said. "You're just a child. You don't want to die here, in the gutter, like a rat." She spat at him, a glob of phlegm and hate that landed on his boot. He raised an eyebrow, impressed despite himself. "Feisty," he said. "I like that. It'll make this more fun." He lunged forward, his blades flashing in the dim light. She screamed and tried to run, but he was too fast. He caught her by the arm, his grip like iron, and yanked her back towards him. She kicked and struggled, but it was like trying to fight a storm. He slammed her against the wall, his forearm pressed against her throat, cutting off her air. "Where is it?" he growled, his breath hot and sour in her face. "Tell me, and I'll make it quick." She clawed at his arm, her nails leaving bloody furrows in his flesh. He didn't flinch. He was used to pain. He had been born in it, raised in it, bathed in it. It was his constant companion, his only friend. "I...I don't have it," she gasped, her voice raw and desperate. "I swear, I don't!" Rennick chuckled, a sound that sent shivers down her spine. "Lying to me is a bad idea, lass," he said. "I have ways of making people talk." He released her throat and slammed his fist into her stomach, doubling her over. She retched, but there was nothing in her stomach to come up. He grabbed a handful of her hair and yanked her head back, exposing her throat. He could see the pulse hammering in her neck, like a trapped bird trying to escape. "Tell me," he said, his voice a whisper. "Tell me, and I'll end this." She spat in his face, a defiant gesture that would be her last. He brought his blade up, the steel glinting in the moonlight, and sliced her throat from ear to ear. Blood sprayed across the cobblestones, warm and wet, and she slumped to the ground, her life draining away like water through a sieve. Rennick wiped the blood from his face and sheathed his blades. He didn't feel anything. No satisfaction, no remorse. Just the cold, empty void that had been his constant companion since the day he had taken his first life. He searched her body, but found nothing. He cursed under his breath. 
He would have to find the thing she had stolen the hard way. He turned to leave the alley, but something made him pause. A sound, barely audible over the cries of the gulls and the lapping of the waves. A soft, wet sound, like the patter of rain on a rooftop. He looked down and saw that the girl's body was twitching, her eyes rolling back in her head. And then, with a gurgling scream, she transformed. Her body contorted, bones snapping and reforming, flesh tearing and reshaping. In the space of a heartbeat, she was no longer human. She was something else, something monstrous. Rennick stared in horror as the thing that had been a girl rose to its feet, its eyes burning with an unholy light. It was a creature of nightmares, a twisted mockery of humanity, all teeth and claws and hate. It lunged at him, and he barely had time to draw his blades before it was on him. They fought, a dance of death in the narrow alley, steel ringing on steel, flesh tearing and blood spraying. Rennick was strong, and fast, and skilled, but the creature was stronger, faster, more vicious. It was like fighting a storm, a force of nature that could not be reasoned with or bargained with. It could only be endured. In the end, it was luck that saved him. He stumbled, his foot catching on a loose cobblestone, and the creature's claws raked across his chest, leaving deep, ragged gouges. But it overextended itself, and he was able to bring his sword up, driving it through its eye and into its brain. It fell to the ground, twitching and convulsing, and then it was still. Rennick stood over it, panting, his heart hammering in his chest. He had never seen anything like it. He had never even heard of anything like it. He knew then that he was in over his head. This was no ordinary job, no simple matter of retrieving a stolen trinket. This was something bigger, something darker. He was a small fish in a very large, very dangerous pond, and he was about to be eaten alive. But he was also a survivor. He had been born in the gutter, raised in the shadows, and he had clawed his way to the top of the food chain. He would not go down without a fight. He turned and ran, back towards the city, back towards the people who had hired him. He would find out what they knew, and he would make them pay for it. He would uncover the truth, no matter the cost. And so began the tale of the Butcher of the Broken Docks, a story of blood and betrayal, of monsters and men, of a city on the brink of chaos and the man who would either save it or destroy it. It was a story that would echo through the ages, a grim and terrible saga that would be whispered in the darkest corners of the world, a warning to all who would dare to challenge the forces that lurked in the shadows. It was a story that would end in fire and blood, in screams and tears, in the fall of kings and the rise of monsters. It was a story that had only just begun. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a cultist who has just escaped his cult. He is dressed in dirty yellow robes and his only possession is a mysterious small (magical!?) mirror he stole from the cult. The story starts with him arriving at an apparently deserted ghost town on the edge of a desert. He has an arrow lodged in his shoulder and is losing his mind due to infection and thirst. 
### ASSISTANT The sun was a merciless beast, its fiery breath scorching the earth and turning the once-thriving town into a desolate wasteland. The cultist, his yellow robes now a dull, dusty hue, staggered through the empty streets, his vision blurred by a haze of pain and delirium. The arrow jutted from his shoulder like an accusing finger, a constant reminder of his betrayal. He clutched the small, intricately carved mirror in his trembling hand, its cool surface a comfort against the relentless heat. He had heard the whispers of the other acolytes, their hushed voices carrying tales of the town's demise like a plague on the wind. They spoke of a curse, of a darkness that had seeped into the very bones of the place, driving its inhabitants mad before claiming their souls. But he had seen the truth in the High Priest's eyes, the greed and lust for power that had led them here. The mirror was the key, they said, to unlocking the secrets of the ancient gods. But he knew better. He had seen the power it held, the way it twisted reality and bent it to the will of the one who wielded it. And he had taken it, stolen it from the very heart of the cult's temple, and run. Now, with each agonizing step, he questioned the wisdom of his decision. The desert stretched out before him, an endless sea of sand and despair, and he was but a single, insignificant speck in its vast expanse. He had hoped to find refuge here, a place to hide and heal before continuing his journey. But the town was as dead as the dreams he had once held of a better life. His thoughts were a jumbled mess, a cacophony of fear and pain. He could feel the infection spreading through his body, a writhing serpent of fire that threatened to consume him from within. His tongue was swollen, his lips cracked and dry, and his throat was a parched wasteland. He needed water, and soon, or he would join the ghosts that haunted this forsaken place. As he stumbled through the town, he caught glimpses of his reflection in the shattered windows of the abandoned buildings. The man who stared back at him was a stranger, his once-handsome face now a twisted mask of agony and desperation. He had been a fool to believe the lies of the cult, to think that he could ever be one of them. He was an outsider, a fraud, and they had seen it in him from the start. The mirror, though, the mirror was real. He could feel its power pulsing in his hand, a heartbeat that echoed the rhythm of his own. It was a cruel irony that the very thing he had risked everything to obtain was now his only hope for survival. He knew the stories, the legends of the mirror's healing properties, of how it could restore life to the dying. But he also knew the cost. The mirror demanded a sacrifice, a piece of the user's very soul in exchange for its gifts. He reached the town square, a desolate expanse of cracked cobblestones and withered plants. In the center stood a fountain, its once-clear waters now a stagnant pool of green slime. He fell to his knees before it, the pain in his shoulder a white-hot brand that seared his flesh. He dipped the mirror into the water, watching as the surface rippled and danced. "Please," he whispered, his voice hoarse and broken. "I'll give you anything. Just take the pain away." The mirror stared back at him, its surface as still and unyielding as the desert sands. He closed his eyes, willing the power within to answer his plea. And then, he felt it. A cold, insidious presence, slithering into his mind like a snake in the dark. 
It whispered to him, promising relief, promising power, promising everything he had ever wanted. He opened his eyes, and the world was different. The pain was gone, replaced by a cold, emotionless void. He looked down at his shoulder, and the arrow was gone, the wound nothing more than a faint scar. He laughed, a hollow, empty sound that echoed through the empty streets. "What have I done?" he asked the mirror, but it gave no answer. He stood, his legs unsteady beneath him, and looked out at the desert. He could feel the power coursing through his veins, a dark, intoxicating energy that threatened to consume him. He was no longer just a man, he was something more. Something dangerous. He took a step forward, and then another, the mirror clutched tightly in his hand. He would find the cult, he would make them pay for what they had done to him. And then, he would use the mirror to claim the power that was rightfully his. The wind picked up, whipping the sand into a frenzy, and he smiled. The desert was his now, and he would not be denied. He was the master of his own destiny, and the gods themselves would tremble at his approach. But as he walked, the laughter of the ghosts followed him, a chilling reminder of the price he had paid for his newfound strength. And deep within the mirror, something stirred, a darkness that watched and waited, biding its time. The first chapter of the grimdark trilogy, "The Shattered Reflection," had begun. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a 'Mercer' who's job is to put the mortally wounded out of their mystery after a battle. He has no conscience about what he does. The story starts the morning after the battle and the moor where it took place is blanketed in mist. Something stirs a flashback in the POV character. ### ASSISTANT The morning after the battle, the moor lay still, a shroud of mist clinging to its wounds like a lover's last embrace. The Mercer, a man of few words and fewer scruples, trudged through the carnage, his boots squelching in the mud churned red by the night's bloodshed. He was a reaper of sorts, his task to deliver the final mercy to those who had danced with death and lost. His name was a whisper among the living, a curse among the dying. He paused, his hand resting on the hilt of his blade, as a shiver ran down his spine. The mist swirled around him, carrying with it the cacophony of the night before: the screams of the dying, the clash of steel, the thunder of hooves. He closed his eyes, letting the memories wash over him like a wave of despair. *Why do I do this?* he thought, not for the first time. *Why do I wade through this sea of suffering, playing the part of the grim angel?* But he knew the answer. It was the same answer it had always been. He did it because he was good at it. Because he had no qualms about ending a life, even one that still clung to the faintest flicker of hope. Because, in this world of darkness and pain, it was the only thing he knew how to do. He opened his eyes, the mist parting to reveal a figure lying in the mud. A soldier, or what was left of one. The man's armor was shattered, his face a mask of agony. The Mercer knelt beside him, his movements practiced and efficient. 
He placed a hand on the soldier's forehead, feeling the fevered heat of his skin, the erratic thump of his heart. "Shh," he whispered, his voice a soothing balm. "It's over now." The soldier's eyes fluttered open, wide with terror. "Please," he croaked, "don't leave me here." The Mercer's lips twisted into a grim smile. "I won't," he said, drawing his blade. "I'll take you with me." As the soldier's life bled out onto the cold ground, the Mercer's mind drifted back to another time, another place. A time before the blood and the pain, before the endless cycle of war and death. He remembered a girl with hair like spun gold, laughing as she danced in a field of wildflowers. He remembered the feel of her hand in his, the sound of her voice as she whispered his name. *Elara.* The memory was a knife to his heart, a reminder of all he had lost. He pushed it away, focusing on the task at hand. There were more souls to reap, more lives to end. He moved through the mist, his blade a silent whisper in the stillness. The cries of the wounded guided him, a morbid symphony that played out across the moor. Each one he found, he dispatched with the same cold efficiency, the same hollow promise of release. As the sun began to rise, casting a sickly pallor over the scene, the Mercer came upon a group of soldiers huddled around a fire. They eyed him warily, their faces a mix of fear and respect. He was a necessary evil, a specter that haunted the battlefield. "Anyone here need my services?" he asked, his voice devoid of emotion. One of the soldiers, a young boy who couldn't have seen more than fifteen summers, pointed to a figure lying a few feet away. "Him," he said, his voice trembling. "He's not going to make it." The Mercer approached, his boots crunching on the frost-kissed grass. The man on the ground was a mess, his body a tapestry of wounds. He was barely conscious, his breathing shallow and ragged. *Another one,* the Mercer thought, his heart heavy with the weight of his duty. *Another soul to send into the void.* He knelt beside the man, his blade glinting in the pale light. The soldier's eyes flickered open, and for a moment, the Mercer saw something in them that he hadn't expected: defiance. "No," the man rasped, his hand feebly grasping at the Mercer's wrist. "I won't go quietly." The Mercer hesitated, a frown creasing his brow. He had never encountered this before. Most welcomed his touch, eager for the release it brought. But this man...this man was different. "Why?" he asked, genuinely curious. The soldier's lips curled into a bloody smile. "Because I have a son," he whispered. "And I want to see him grow up." The Mercer stared at the man, his mind racing. He had no children, no one to mourn him when his time came. He was a solitary figure, a ghost haunting the edges of life. But this man...this man had a reason to hold on. He made his decision. "I can't heal you," he said, his voice soft. "But I can make it quick." The soldier's eyes filled with tears. "Thank you," he breathed, his grip on the Mercer's wrist loosening. The Mercer raised his blade, the mist swirling around them like a mourner's veil. As he brought it down, he couldn't help but wonder: *What would it be like to have something to live for?* But the thought was fleeting, lost in the swirl of the mist and the cries of the dying. He was the Mercer, the bringer of mercy. And the moor was his to tend, until the next battle came and the cycle began anew. 
</details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a Grimdark trilogy in the style of Joe Abercrombie and Rob J Hayes. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a man awaiting execution for a crime he didn't actually commit. His cell mate is a much larger man, who is a notorious murderer and rapist. One evening they overhear the guards talking about some "... threat from the North" but don't catch the whole conversation. The next morning they awake to find the guards have all fled and now face a new predicament: how to get out of the locked cell before they die of thirst or starvation. ### ASSISTANT The stench of piss and shit was a constant companion in the damp stone cell, but it was the sound of the key turning in the lock that truly set his heart racing. The rusted iron groaned in protest, echoing through the narrow corridor like the dying breath of some ancient beast. The door creaked open, revealing the silhouette of the jailer, a hulking brute of a man with a lantern in one hand and a set of keys in the other. But there was no breakfast, no water, just the empty promise of another day in this hellhole. "What's the meaning of this?" the smaller man barked, his voice barely audible over the rattling of his chains. "Where's our food, you fat sack of shit?" The jailer didn't answer, just slammed the door shut and locked it again, the sound of his footsteps fading into the distance. The larger man, a hulking brute named Grimgor, grunted in response, his eyes never leaving the small patch of light that seeped through the barred window. "Something's not right," he rumbled, his voice like gravel being crushed under a heavy boot. The smaller man, known only as the Fox, couldn't argue with that. He'd been in this cell for what felt like an eternity, awaiting execution for a crime he didn't commit. But even in this place, there was a routine, a semblance of order. And that routine had just been shattered. As the day wore on, the silence from the corridor outside grew more oppressive. No guards, no other prisoners, nothing but the distant howl of the wind and the occasional scurrying of rats in the darkness. The Fox's mind raced, trying to piece together what could have happened. Had there been a riot? A rebellion? Or was it something else entirely? He glanced over at Grimgor, who was staring at the wall, lost in thought. The man was a monster, a notorious murderer and rapist, but he was also the only other living being in this godforsaken place. And in a world gone mad, even monsters could be allies. "You hear anything last night?" the Fox asked, trying to keep the tremor out of his voice. Grimgor grunted, his eyes still fixed on the wall. "Something about a threat from the North." The Fox's heart sank. He'd heard the same thing, snatches of conversation between the guards as they'd passed by their cell. But he'd been too afraid to give it much thought, too focused on his own impending doom. "What do you think it means?" he pressed, unable to keep the desperation at bay. Grimgor shrugged his massive shoulders. "Dunno. Don't care. Just means we're probably gonna die in here." The Fox's stomach twisted at the thought. He'd faced death before, many times, but this... this was different. To die slowly, trapped like an animal, with no chance to fight back... it was a fate worse than any he could imagine. As the hours dragged on, the thirst became unbearable. 
The Fox's tongue felt like sandpaper, his throat raw and parched. He could see the same desperation in Grimgor's eyes, the realization dawning on them both that they were truly alone. "We have to get out of here," he croaked, his voice barely above a whisper. Grimgor just grunted in agreement, his gaze never leaving the window. The Fox's mind raced, trying to come up with a plan. They had nothing, no tools, no weapons, just their wits and their will to survive. And even that seemed to be fading with each passing moment. But then, as the sun began to set and the shadows lengthened, he noticed something. The light from the window was changing, growing dimmer. He squinted, trying to make out what was happening. And then he saw it. "Grimgor," he hissed, tugging on the larger man's arm. "Look." Grimgor turned, his eyes narrowing as he followed the Fox's gaze. The light was flickering, casting strange shadows on the wall. And then, as if in answer to their unspoken prayers, they heard it. The sound of footsteps, growing louder and louder, accompanied by the jingle of keys. The Fox's heart leapt into his throat, hope and fear warring within him. Who was it? Friend or foe? The door swung open, revealing not the jailer, but a figure shrouded in darkness. The Fox couldn't make out any details, but he could sense the malice radiating off of him like heat from a forge. "Who are you?" he demanded, trying to sound braver than he felt. The figure stepped into the cell, the light from the lantern outside casting his features into sharp relief. He was a tall man, with a cruel smile and eyes that glinted like ice. "I am your salvation," he said, his voice like a knife slicing through the silence. "And your doom." With that, he tossed a set of keys at the Fox's feet and disappeared back into the shadows. The door slammed shut behind him, the sound of the lock clicking into place echoing through the cell like a death knell. The Fox stared at the keys, his heart pounding in his chest. Freedom was within his grasp, but at what cost? He looked over at Grimgor, who was already working on his own chains. "We have to get out of here," the larger man growled. "Before whatever's coming from the North gets here." The Fox nodded, his hands shaking as he picked up the keys. They were cold, colder than they had any right to be. He could feel the weight of them, the promise of escape and the threat of the unknown. Together, they worked in silence, the only sound the rattle of chains and the pounding of their hearts. When they were finally free, they stood for a moment, staring at each other. "What now?" the Fox asked, his voice barely above a whisper. Grimgor's smile was feral, his eyes alight with a madness that sent a shiver down the Fox's spine. "Now," he said, "we find out what's really going on." And with that, they set off into the darkness, their footsteps echoing through the empty corridors like the march of the damned. </details> ## Dark sci-fi stories <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty Space Opera Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The setting is this: a colony ship heading out of the solar system has been pulled off course with no hope of getting to the planned destination. The people know this and society has broken down completely due to the grim prospects they are facing. The POV character for chapter 1 is an orphan. 
### ASSISTANT Chapter 1: Adrift The colony ship, once a beacon of hope named Elysium's Dawn, now drifted through the cold void like a ghost ship. Its hull, pitted and scarred from the debris of uncharted space, groaned with each labored breath it took. The once vibrant corridors were now dimly lit, casting long shadows that seemed to swallow any remaining semblance of optimism. In one such shadow, a figure moved silently, her bare feet making no sound on the cold metal floor. She was small for her age, malnourished from the rationing that had become a way of life. Her ragged clothes hung loosely on her frame, but her eyes were sharp and alert, scanning every corner for danger. This was Kira, an orphan of the void, surviving in a world where hope was a luxury few could afford. Kira's thoughts drifted back to the stories she had heard as a child, tales of a green and blue planet called Earth, their home that they would never see again. The ship had been pulled off course by an unknown force, leaving them stranded in the vast emptiness between stars. The adults spoke in hushed tones about the dwindling resources, the failing life support systems, and the ever-present threat of pirates who preyed on lost ships like theirs. She pushed those thoughts aside as she approached her destination: the hydroponics bay. It was one of the few places on the ship where life still thrived, albeit in a controlled and desperate manner. The air here was thick with moisture and the scent of growing things, a stark contrast to the stale recycled air that permeated the rest of the vessel. As she entered, the soft glow of the grow lights illuminated rows upon rows of plants, struggling to produce enough food for the dwindling population. Kira's stomach growled at the sight, but she knew better than to take more than her share. She had learned early on that stealing could mean a swift and brutal punishment from those who guarded the bay. Instead, she found solace in tending to the plants, her small hands deftly pruning and watering them. It was here that she felt closest to something resembling peace, a connection to life that transcended the harsh reality of their situation. As she worked, her mind wandered to the lessons she had learned from the ship's AI, the only constant presence in her life since the adults had given up on teaching the children anything but survival skills. "Kira," a voice whispered urgently, jolting her out of her reverie. She turned to see Jax, another orphan and one of the few people she trusted. His face was pale and his eyes wide with fear. "You need to come with me, now." She followed him through the maze-like corridors, her heart pounding in her chest. They arrived at a small, hidden room that served as their makeshift home. Inside, huddled around a flickering screen, were several other children, their faces etched with worry and excitement. "Look," Jax said, pointing to the screen. It displayed a grainy image of a man in tattered clothing, his face half-hidden by a wild beard. "He says he knows how to fix the engines." Kira's eyes widened in disbelief. The engines had been dead for as long as she could remember, their failure sealing their fate as interstellar castaways. "Who is he?" she asked, her voice barely above a whisper. "They call him the Mechanic," one of the older girls replied. "He used to work in engineering before...before everything fell apart." 
The man on the screen continued his frantic plea, detailing his plan to restore power and steer the ship towards a habitable planet he claimed was within reach. Kira felt a spark of hope ignite within her, but it was quickly doused by the skepticism of the others. "It's just another one of their lies," spat a boy named Taro. "They'll say anything to get us to do their dirty work." Jax rounded on him, his fists clenched. "This is our chance, Taro! We can't just give up!" The argument raged on, but Kira tuned it out, her thoughts racing. She knew the risks of trusting an adult, especially one who claimed to have access to forbidden knowledge. But what if he was telling the truth? What if they could find a new home and start over? As she watched the debate unfold, she made up her mind. She would seek out this Mechanic and offer him her help, no matter the cost. It was a decision born of desperation, but also of a stubborn refusal to accept the bleak future that lay before them. And so, as the other children slept fitfully around her, Kira slipped out into the dark corridors once more, determined to find the man who might hold the key to their salvation or doom. The ship groaned and creaked around her, a constant reminder of the fragile existence they all clung to. But she pressed on, driven by an unyielding will to survive and the faintest glimmer of hope that burned within her heart. As she navigated the treacherous path towards engineering, she couldn't help but wonder what awaited her in the depths of the ship. Would she find a savior or a madman? And more importantly, would she be able to make it back alive? </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is an astronaut living on a space station orbiting the earth. The day starts off normally, but then he and the others notice the mushroom clouds and they have to face the inevitability of the situation they are in. ### ASSISTANT Chapter 1: Ashes of Exodus The cold embrace of space enveloped the station like a lover's caress, its infinite blackness punctuated only by the distant twinkle of stars and the ever-present glow of Earth below. Captain Marcus Kane floated through the dimly lit corridors, his magnetic boots barely whispering against the metal floor as he made his way to the observation deck. The daily routine had become a meditative ritual for him, a chance to clear his mind before the cacophony of the day's work began. As he entered the room, the sight that greeted him never failed to take his breath away. Earth, a swirling marble of blues and greens, hanging in the void like an irreplaceable jewel. He couldn't help but feel a pang of longing for its familiar embrace, for the weight of gravity that he had left behind so many years ago. Marcus settled into his favorite spot by the window, the worn leather of the seat molded to his body like an old friend. He closed his eyes and let the silence wash over him, the only sound the soft hum of the station's life support systems. It was in these moments that he felt truly alive, connected to something greater than himself. But today, the silence was shattered by a chilling alarm, its shrill wail piercing the tranquility like a knife through glass. His eyes snapped open as his heart began to race, adrenaline coursing through his veins.
He knew that sound all too well; it was the emergency alert for an incoming transmission from Earth. He launched himself towards the comms console, fingers flying over the keys as he tried to make sense of the garbled message. The voice on the other end was frantic, barely coherent through the static. "Multiple...detonations...global catastrophe..." were the only words he could decipher before the signal died completely. Marcus's mind raced as he tried to process what he had just heard. It couldn't be true, it couldn't be happening. Not again. He quickly activated the external cameras, his stomach twisting into knots as he waited for the feed to come online. And then he saw them: mushroom clouds, blooming like grotesque flowers across the planet's surface. Too many to count, their plumes of smoke and ash reaching up towards the heavens in a silent scream of despair. The sight was both beautiful and horrifying, a testament to humanity's capacity for destruction on an unimaginable scale. His thoughts were interrupted by the sound of footsteps behind him as the rest of the crew floated into the room, drawn by the alarm. Their faces mirrored his own shock and disbelief as they stared at the monitors, their voices a cacophony of whispers and gasps. "What...what is that?" asked Dr. Amelia Chen, her voice trembling with fear. Marcus didn't answer; he couldn't find the words. Instead, he simply pointed at the screen, his hand shaking slightly. Engineer Jake Taylor was the first to break the silence. "Nuclear war," he said, his voice flat and emotionless. "It's finally happened." The others looked at him in horror, but Marcus knew he was right. There could be no other explanation for such devastation. He felt a cold weight settle in the pit of his stomach as the reality of their situation began to sink in. They were alone, adrift in the vastness of space with no hope of return. As the implications of what they had just witnessed set in, the crew fell into an uneasy silence, each lost in their own thoughts. Marcus couldn't help but think about his wife and daughter back home, wondering if they were safe or if they too had been consumed by the flames. He knew it was a futile exercise, but he couldn't stop himself from hoping against hope that somehow, someway, they had survived. But as the minutes ticked by and no further communication came from Earth, that hope began to fade like a dying ember. They were truly alone now, the last remnants of humanity clinging to life on this tiny outpost in the void. He turned to face his crew, their faces etched with fear and uncertainty. "We need to come up with a plan," he said, his voice steady despite the turmoil raging within him. "We can't just sit here and wait for...for whatever comes next." Amelia nodded, her eyes hardening with determination. "I'll start running diagnostics on our life support systems, see how long we can last up here." Jake followed suit, his hands already moving to the controls of the station's mainframe. "I'll try to establish contact with any other surviving outposts or spacecraft." Marcus nodded, grateful for their resolve in the face of such overwhelming odds. He knew that they were all thinking the same thing: that this could very well be the end of everything they had ever known. But he also knew that they couldn't give up, not yet. As he watched them work, his mind drifted back to the message from Earth, those final desperate words echoing in his head like a funeral dirge. 
He couldn't shake the feeling that there was something more, some hidden meaning or clue that would explain what had happened. And then it hit him: the timing of the detonations, the pattern of the blasts. It wasn't random; it was deliberate, almost...calculated. But who would do such a thing? And why? The questions swirled in his mind like a storm, each one more unsettling than the last. He knew that they needed answers, and fast. Because if this was an act of war, then their isolation might not be as secure as they thought. He pushed himself away from the console, his decision made. "I'm going to suit up," he said, his voice firm. "We need to get a closer look." The others stared at him in disbelief, but he didn't waver. This was his duty as captain, to protect his crew and find out the truth no matter how terrible it might be. As he floated towards the airlock, Marcus couldn't help but feel a sense of dread creeping up on him like a shadow. He knew that what he was about to see would change everything, that there was no going back once he stepped outside and gazed upon the ashes of humanity's final stand. But he also felt something else: a burning anger, a fierce determination to uncover the truth and make those responsible pay for their crimes against humanity. Whatever awaited them out there, they would face it together, as one. He closed his eyes and took a deep breath, steeling himself for what was to come. And then, with a quiet prayer on his lips, he opened the airlock and stepped into the void. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a worker constructing a biodome when a terrible accident happens. Make the story very dark; no happy ending here... ### ASSISTANT Chapter 1: The Biodome The air was thick with the scent of sweat and burnt metal as Kaelen, a seasoned construction worker, tightened the last bolt on the massive irrigation system. He wiped his brow with the back of his gloved hand, leaving behind a smear of grease that glistened under the harsh artificial lights. The hum of machinery and distant shouts echoed through the cavernous space, a cacophony that had become as familiar to him as his own heartbeat over the years spent building biodomes on this desolate planet. Kaelen's thoughts drifted to his family back on Earth, their faces faded like old photographs in his mind. He wondered if they ever thought about him, or if he was just another ghost haunting the fringes of their memories. The company had promised them a better life, but it came at a steep price: years of backbreaking labor on an alien world, far from everything they'd ever known. "Hey, Kael!" A voice cut through his reverie, jolting him back to reality. It was Tamsin, his foreman, her face hidden behind a tinted visor. "We need you up top! There's an issue with the atmospheric seal." He nodded curtly and began the long climb up the scaffolding, each rung biting into his calloused hands. As he ascended, Kaelen couldn't help but marvel at the sheer scale of their creation: a vast dome of steel and glass that would one day be teeming with life, a self-sustaining ecosystem in the heart of this barren wasteland. But today was not that day. Today, it was just another tomb waiting to be sealed. 
As he reached the top, Kaelen could see the problem immediately: a small fissure had formed along one of the joints, spewing precious oxygen into the void beyond. He cursed under his breath; they were already behind schedule and over budget. Another delay would mean another round of demerits, another month's pay docked. "What do you think?" Tamsin asked, her voice crackling through his earpiece. "Can we patch it up or do we need to call in the engineers?" Kaelen hesitated, running his fingers along the jagged edge of the tear. It was larger than he'd initially thought, and growing by the second. He could feel the cold tendrils of vacuum reaching out to claim him, whispering promises of oblivion. "I... I don't know," he admitted, his voice heavy with dread. "It doesn't look good." Tamsin swore colorfully and turned away, barking orders into her comm unit. Kaelen watched as workers scrambled to gather tools and materials, their movements frantic and disorganized. He knew they were all thinking the same thing: if they couldn't fix this, they were dead. The air around them grew colder, thinner, as the oxygen continued to escape. Kaelen's lungs burned with every breath, his vision swimming at the edges. He fumbled with the patch kit, his hands shaking uncontrollably. This was it; this was how he would die, millions of miles from home, in service to a corporation that saw him as nothing more than a replaceable cog in their grand machine. "Hurry up!" Tamsin shouted over the growing din. "We're losing pressure fast!" Kaelen's heart pounded in his chest like a sledgehammer, drowning out all other sound. He could feel the panic rising within him, threatening to consume him whole. But he couldn't afford to give in; not now, not when so much was at stake. With trembling hands, he applied the sealant and pressed the patch into place. For a moment, it seemed to hold... but then, with a sickening lurch, the fissure widened, swallowing the feeble attempt whole. The wind howled around them like a ravenous beast, tearing at their suits, trying to pull them apart atom by atom. "Abort!" Tamsin screamed, her voice barely audible over the roar. "Everyone get out now!" But it was too late. The dome shuddered beneath their feet, groaning in protest as the atmosphere inside rushed to equalize with the void outside. Kaelen felt himself being lifted off the ground, his boots scrabbling for purchase on the slick metal surface. He reached out, desperate to grab onto something - anything - but found only emptiness. And then, as suddenly as it had begun, there was silence. A deafening, suffocating silence that pressed in on him from all sides. His vision went dark, and he knew no more. When Kaelen awoke, he was lying on the cold floor of the dome, his body wracked with pain. He tried to move, but found himself held fast by twisted metal and shattered glass. Panic surged through him once again as he realized that his suit had been breached; he could feel the icy chill of vacuum seeping into his very bones. Through the haze of agony, he became aware of movement nearby. Tamsin was there, her visor cracked and fogged, struggling to free herself from the wreckage. She looked at him with wide, terrified eyes, her lips moving soundlessly. He knew what she was saying: they were going to die here, alone and forgotten. Kaelen closed his eyes, willing himself to accept the inevitable. 
But even as the darkness closed in, he couldn't shake the feeling that this wasn't just an accident - that there were forces at work far beyond their understanding, manipulating them like pawns on a cosmic chessboard. And then, just when he thought it was all over, he heard it: a faint beeping, growing louder by the second. His eyes snapped open to see a small drone hovering above them, its lights flashing urgently. It was one of the company's rescue bots, sent to assess the damage and retrieve any survivors. But there would be no salvation for Kaelen or Tamsin. Their suits were too damaged, their bodies too far gone. As the drone's camera lens bore down on him, he could only manage a weak, bitter smile. "Tell my family... tell them I'm sorry," he croaked, his voice barely audible even to himself. Tamsin reached out a gloved hand, her fingers brushing against his own in a final gesture of solidarity. And then, as the darkness claimed him once more, Kaelen knew that this was not the end of their story - but rather, the beginning of something far more sinister. Something that would change the course of human history forever... </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty Sci-fi trilogy in the style of Frank Herbert and James S. A. Corey. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is manning a listening station in deep space. What he hears over the radio chills him to the bone. ### ASSISTANT Chapter 1 - Whispers from the Void The silence of deep space was like an old friend to Captain Alistair Reynolds, a constant companion that had accompanied him through countless missions and lonely nights at the helm of the listening station. But today, it felt more like a suffocating shroud, heavy with unspoken secrets and foreboding. He sat in his chair, fingers drumming impatiently on the console as he scanned the endless stream of data pouring in from the far reaches of the cosmos. His thoughts drifted to home, to the warmth of his wife's embrace and the laughter of their children playing in the garden under the twin suns of Proxima Centauri. It had been years since he last felt that warmth, since he last heard those laughs. The ache was always there, a dull throb that pulsed in time with his heartbeat, but today it seemed to echo through the cold metal halls of the station like a mournful dirge. A sudden blip on the screen snapped him back to reality. He leaned forward, eyes narrowing as he studied the readout. It was faint, barely more than a whisper against the static, but it was there - a signal, human in origin and heavily encrypted. His heart raced as he keyed in the decryption sequence, his mind already racing ahead to the possibilities of what this could mean. The message unscrambled, revealing a frantic voice, barely audible over the crackle of interference. "This is the UES Normandy, requesting immediate assistance... coordinates attached... under attack by unknown forces... God help us..." The transmission cut out abruptly, leaving only the empty hiss of static in its wake. Alistair's blood ran cold as he stared at the screen, the weight of the silence pressing down on him like a physical force. He knew that ship, had served with her captain back during the Martian uprising. They were old friends, bound by shared battles and the scars they bore from them. And now she was out there, alone and in danger. He didn't hesitate. 
His fingers flew over the console, sending out an emergency beacon to the nearest fleet. But he knew it would take time for help to arrive, time that the Normandy might not have. He had to do something now. As he prepared to launch a probe towards the coordinates, his mind filled with images of burning ships and desperate faces. He could almost hear the screams of the dying, echoing through the void like ghosts from his past. It was a sound that haunted him still, one he had hoped never to hear again. But duty called, and Alistair was nothing if not a soldier. He took a deep breath, steadying himself against the tide of fear and doubt that threatened to overwhelm him. This was his job, his purpose - to listen, to bear witness, and when necessary, to act. The probe shot out into the darkness, its tiny form swallowed up by the vastness of space. He watched it go, feeling a strange sense of detachment as if he were watching someone else's life play out before him. And perhaps in some ways, he was. For all his years in service, this was the first time he had ever been truly alone, cut off from the rest of humanity by light-years and the cold indifference of the universe. As the minutes ticked by, Alistair found himself slipping into a half-trance, his thoughts drifting back to happier times. He remembered the first time he had held his newborn son, the tiny fingers curling around his own as if trying to hold on for dear life. He remembered the taste of real Earth coffee, rich and bitter, shared with his wife under a sky full of stars that seemed close enough to touch. But those memories were distant now, fading like echoes in the void. All he had was this station, this endless vigil, and the hope that one day, he might make it back home. The radio crackled to life again, jolting him out of his reverie. It was the probe, transmitting data back from the edge of nowhere. He braced himself for what he would find, knowing that it could change everything. What greeted him on the screen was worse than anything he could have imagined. The Normandy was there, alright - or rather, what was left of her. She drifted through space like a broken toy, her hull breached and venting atmosphere into the cold void. There were no signs of life, no distress signals, nothing but silence. Alistair felt a cold knot form in his stomach as he took in the wreckage. This wasn't just any ship; this was his friend's ship, her crew now lost to the void. And the attackers... whoever or whatever they were, they had left no trace behind. He sat back in his chair, the weight of it all settling heavily upon him. He knew what he had to do next - file a report, alert command, and wait for further instructions. But as he stared at the lifeless husk that was once the Normandy, he couldn't shake the feeling that something far greater than himself was at work here. Something dark and terrible had come calling, whispering its secrets through the stars. And now it was his turn to listen, to bear witness, and perhaps, if he was lucky, to survive long enough to tell the tale. </details> ## Miscellaneous stories <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty post-apocalyptic novel in the style of Cormac McCarthy. Use third person personal and feature internal monologues of the characters. The POV character for chapter 1 is a technician in a nuclear bunker 18 months after the war. 
The 12 inhabitants of the bunker have been expecting to start to be able to leave soon, but it has just been discovered that at least where they are, salted cobalt bombs were used. It's clear now that either all 12 will die when the resources run out, or only 6 can survive: there is barely enough for 6 to make it to 5 years and have any hope of leaving. After 18 months of calm, everything breaks loose today. ### ASSISTANT In the dim light of the bunker, the technician sat hunched over his console, fingers dancing across the keys with a practiced urgency. The air was thick with the hum of machinery and the stench of sweat and fear. He had been down here for what felt like an eternity, but the digital clock on the wall insisted it had only been eighteen months since the world above had turned to ash. His name was Samuel, or at least that's what they called him now. In another life, he had been a software engineer, designing algorithms for self-driving cars and smart homes. Now, he was responsible for keeping twelve souls alive in this underground tomb, their last bastion against the desolation above. He glanced up at the monitor, the green glow illuminating his haggard face. The readings were clear: salted cobalt bombs had been used. The thought sent a shiver down his spine. They had all heard the rumors, of course, but they had clung to hope like a drowning man to driftwood. Hope that the war hadn't been as bad as they feared, that the surface would be habitable again soon. But now, the truth was undeniable. The world outside was dead, and it would remain so for generations. Samuel closed his eyes, trying to block out the images of a scorched earth, devoid of life. He could hear the others in the background, their voices a low murmur as they went about their daily tasks. They didn't know yet. How could he tell them? How could he be the one to snuff out the last ember of hope that had kept them going for so long? He took a deep breath and stood up, his joints creaking in protest. He made his way through the narrow corridors, past the rows of bunks where some slept fitfully, their dreams haunted by memories of a world they would never see again. Others were awake, their faces etched with lines of worry and despair. They looked up as he passed, but he couldn't meet their gazes. He reached the command center, where Captain Reynolds was poring over maps and reports. The man looked up as Samuel entered, his eyes heavy with exhaustion. "What is it?" he asked, his voice gruff. Samuel hesitated for a moment before speaking. "Sir, I have some... troubling news." The captain's expression didn't change, but Samuel could see the tension in his jaw. He had been expecting this, too. "Go on," he said simply. "The radiation readings... they're off the charts. It's cobalt-60. The half-life is over five years." Reynolds let out a slow breath, his eyes never leaving Samuel's face. "And our supplies?" "We have enough for six, maybe seven if we ration carefully. But not twelve." The room seemed to shrink around them as the weight of their situation settled in. The captain rubbed his temples, his thoughts racing. They had always known that their time in the bunker was limited, but they had been so close. So very close. "We'll need to make a decision," he said finally, his voice barely above a whisper. "Who stays, and who goes." Samuel felt a knot form in his stomach. He knew what that meant. It was survival of the fittest down here now, and there were those among them who would do anything to ensure their own survival.
As he made his way back to his console, he couldn't help but think about the world above. The trees that would never grow again, the birds that would never sing. He thought of his wife and daughter, their faces fading like photographs left in the sun. They had been lucky, they said, to have made it into the bunker. But now, he wasn't so sure. In the mess hall, the atmosphere was tense. Whispers and hushed conversations filled the air as people tried to make sense of what they had just heard. Samuel sat alone at a table, staring blankly at his food. He couldn't eat. Not now. He felt a hand on his shoulder and looked up to see Maria, the doctor. Her eyes were red-rimmed, but her voice was steady. "We need to talk," she said, guiding him towards the infirmary. Once inside, she closed the door behind them and leaned against it, her body trembling. "What are we going to do?" she asked, her voice cracking. Samuel shook his head. "I don't know." "We can't just let six of us die," she said, her voice rising in panic. "There has to be another way." He looked at her, at the desperation etched on her face, and felt a flicker of anger. "What do you suggest? We all hold hands and sing Kumbaya while we starve to death?" Maria's eyes flashed with anger, but she quickly composed herself. "No," she said softly. "But we can't just give up." Samuel sighed and ran a hand through his greasy hair. "I'm not giving up," he said. "I'm just being realistic." She looked at him for a long moment before nodding. "You're right," she said, her voice barely audible. "We need to be realistic." They sat in silence for a while, each lost in their own thoughts. Then, Maria spoke up again. "There is one option," she said hesitantly. "But it's... drastic." Samuel raised an eyebrow. "What?" "We could... we could draw straws," she said, her voice barely above a whisper. "Choose six people at random to leave the bunker and take their chances on the surface." He stared at her in disbelief. "You can't be serious." "It's not ideal," she admitted. "But it's better than just letting them die down here." Samuel shook his head, his mind reeling. He couldn't fathom the thought of sending people out into that wasteland, knowing they would likely never return. But what other choice did they have? As he sat there, the weight of their decision pressing down on him like a leaden shroud, he heard a commotion outside. Shouting, followed by the sound of scuffling feet and something heavy being dragged across the floor. He exchanged a worried glance with Maria before rushing out into the corridor. What he saw made his blood run cold. A group of men had cornered one of the younger women, their faces twisted with rage and desperation. They were arguing over who would get to leave the bunker, who would get a chance at life. Fists flew, and soon, the sound of gunfire echoed through the halls. Samuel stood there, frozen in place as chaos erupted around him. This was it. The end of everything they had worked so hard to preserve. He could feel the darkness closing in, suffocating him, as he realized that their last hope for survival had just been shattered like glass under a bootheel. And all he could do was watch as the world above claimed them, one by one. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a historical mystery novel in the style of Arturo Pérez-Reverte and Umberto Eco. The story starts off in California during the 1848 Gold Rush, before moving to London in the mid 1970s.
There should be a mysterious 'occult' item that connects these two time periods. ### ASSISTANT Chapter One: The Golden Veil The year was 1848, and California was a land of dreams and desperation. Men from all corners of the world had flocked to its golden shores, lured by the promise of wealth beyond their wildest imaginations. They came with pickaxes and pans, with hope in their hearts and greed in their eyes, ready to tear apart the earth itself in search of fortune. In the midst of this frenzy stood a solitary figure, a man who seemed out of place amidst the chaos. He was tall and lean, with a face that bore the lines of a thousand stories. His clothes were simple but well-made, his boots worn from years of travel. He moved with an air of quiet authority, as if he belonged to another time entirely. His name was Alistair Blackwood, and he had come to California not for gold, but for something far more valuable: knowledge. A scholar by trade and an adventurer at heart, Blackwood had spent his life chasing after ancient texts and forgotten lore, seeking to unravel the secrets of the universe. And now, he believed he was on the verge of a discovery that would change everything. Rumors had reached him of a strange artifact, said to have been found deep within the Sierra Nevada mountains. It was a small, intricately carved box made of an unknown metal, its surface etched with symbols that defied translation. Those who claimed to have seen it spoke in hushed tones of its otherworldly beauty and the eerie sense of power that seemed to emanate from within. Blackwood had tracked the rumors to a dusty saloon in Sacramento, where he hoped to find answers among the grizzled miners who frequented the place. As he pushed open the swinging doors, the din of voices and clinking glasses washed over him like a wave. He scanned the room, his gaze settling on a group of men huddled around a table in the corner. One look at their faces told him they were the ones he sought: sun-weathered and unshaven, their eyes glinting with a mixture of excitement and fear as they whispered amongst themselves. He approached them slowly, aware that he was an outsider in this rough-and-tumble world. "Gentlemen," he said, his voice low and cultured, "I couldn't help but overhear your conversation. Might I be so bold as to ask what has you all so...animated?" The men exchanged wary glances before one of them spoke up. "You ain't from around here, are ya?" he drawled, eyeing Blackwood's fine clothes with suspicion. "No," Blackwood admitted, "I am not. But I have traveled far and wide in search of knowledge, and I believe you may possess something that could be of great interest to me." He reached into his pocket and produced a small leather pouch, which he placed on the table with a soft thud. The men's eyes widened as they saw the glint of gold within. "I understand there is an object - a box, perhaps? - that has recently come into your possession. I would be willing to pay handsomely for any information you might have about it." The miners looked at each other, then back at Blackwood. Finally, the one who had spoken before nodded slowly. "We might know somethin' 'bout that," he said, his voice dropping to a conspiratorial whisper. "But it ain't no ordinary box." As they began to tell their tale, the saloon faded away around them, replaced by images of dark caverns and glittering treasure. Blackwood leaned forward, his heart pounding with anticipation. This was it - the moment he had been waiting for. 
The box, they said, had been found buried beneath a waterfall, hidden in a cave that seemed to defy all natural laws. It was guarded by strange creatures unlike anything any of them had ever seen: half-man, half-beast, with eyes that glowed like embers in the darkness. But the greatest mystery of all lay within the box itself. For when they opened it, they found not gold or jewels, but a single sheet of parchment, yellowed with age and covered in writing that none of them could decipher. Blackwood's pulse quickened as he listened. This was more than just another piece of occult nonsense; this was the real thing. He knew it in his bones. "I must see this box for myself," he said, his voice barely above a whisper. "Take me to it." The miners exchanged uneasy glances before nodding their assent. They led him out of the saloon and into the night, down winding streets and through shadowy alleyways until they reached a nondescript building on the edge of town. Inside, they descended a rickety staircase to a dimly lit basement filled with crates and barrels. There, in the center of the room, sat the object of his obsession: a small, unassuming box that seemed to hum with energy. Blackwood reached out a trembling hand to touch it, feeling the power coursing through him like electricity. As he did so, the world around him shifted, and suddenly he was no longer in a dingy basement in Sacramento. He stood instead in a grand library, its shelves groaning under the weight of countless tomes and scrolls. A fire crackled in the hearth, casting flickering shadows on the walls. A figure emerged from the gloom, cloaked in darkness save for two piercing eyes that seemed to bore into his very soul. "You have found it," the voice said, deep and resonant. "The key to unlocking the secrets of the universe." Blackwood stared at the apparition, unable to speak. He knew he should be afraid, but all he felt was awe. "But be warned," the figure continued, "for with great power comes great responsibility. The knowledge contained within this box is not meant for mortal minds." And then, just as suddenly as it had appeared, the vision vanished, leaving Blackwood alone once more in the dimly lit room. He stared at the box, his mind racing. What did it all mean? As he pondered this question, a sudden commotion erupted upstairs. Shouts and gunfire filled the air, followed by the sound of heavy footsteps on the staircase. The miners had betrayed him! He snatched up the box and fled into the night, his heart pounding in his chest. He knew he couldn't stay in California - not now that others were after the artifact. No, he would have to take it far away, to a place where its secrets could be studied in safety. And so began a journey that would span continents and decades, from the wild frontier of the Gold Rush to the foggy streets of Victorian London. For Alistair Blackwood had stumbled upon something much greater than mere treasure; he held in his hands the key to unlocking the very fabric of reality itself. But little did he know that the true mystery was only just beginning... </details> <details> <summary>Click to see spoiler</summary> ### USER Write me the opening chapter of a dark and gritty historical novel set during the battle of the Bulge. Write in first person. The POV character for Chapter 1 is a young German officer trying to carry his wounded commander back to the deep defensive line after the bunker he was in was overrun. He should have a flashback to the moment it was overrun during the middle of the story. 
The story should not end well for either of them. ### ASSISTANT Chapter 1 The cold bit into my bones as I trudged through the snow, each step a labored effort. My breath came out in ragged puffs, clouding the air around me. In my arms, I cradled the weight of my world - my commander, Hauptmann Müller, his body limp and heavy with pain. His eyes were closed, but I could hear the shallow rasp of each breath he took, a testament to his will to live. The forest around us was a maze of shadows and death, the trees standing like skeletal sentinels against the moonlit sky. The silence was deafening, broken only by the distant rumble of artillery and the occasional crack of a rifle shot. It was a cruel irony that the same snow that muffled our footsteps also betrayed the presence of the enemy, their tracks glowing like phosphorescent snakes in the pale light. I stumbled, my legs threatening to give way under the strain. Hauptmann Müller groaned, his hand clutching at my coat. "Keep moving, Leutnant," he whispered, his voice barely audible. "We must reach the defensive line." I nodded, gritting my teeth as I adjusted my grip on him. The pain in my arms and shoulders was a constant companion now, but it was nothing compared to the fear that gnawed at my gut. We were alone, cut off from our unit during the chaos of the American assault. The bunker we had been holed up in was now a tomb for the men who had trusted us to lead them. A flash of memory seared through my mind, taking me back to that fateful moment when it all went wrong. The thunderous roar of explosions, the earth shaking beneath our feet as if the very foundations of hell were breaking loose. The screams of men, the acrid smell of smoke and cordite, the taste of blood in my mouth. I remembered the look on Müller's face when he realized what was happening - a mixture of disbelief and determination. He had barked orders, trying to rally our troops, but it was too late. The Americans were upon us, their numbers overwhelming. I saw him take a bullet to the leg, his face contorting in agony as he fell. And then there was nothing but chaos and darkness. When I came to, the bunker was silent except for the moans of the dying. The air was thick with smoke, making it hard to breathe. I found Müller slumped against a wall, his uniform soaked in blood. He looked up at me, his eyes glassy with pain but still burning with that same fierce determination. "We have to go," he said, gripping my arm. "They're coming." And so we had fled into the night, leaving behind the wreckage of our command post and the bodies of our comrades. Now, hours later, we were still stumbling through the forest, searching for safety that seemed to recede with every step. I could feel my strength ebbing away, my legs turning to lead. But I couldn't stop, not now. Not when we were so close. The deep defensive line was our only hope, a thin red line that stood between us and annihilation. If we could just make it there... But fate is a cruel mistress, and she had other plans for us. As we staggered out of the trees and into a small clearing, I saw them - a patrol of American soldiers, their silhouettes stark against the snow. They spotted us at the same time, their rifles swinging around to point in our direction. "Halt!" one of them yelled, his voice harsh and commanding. "Hands up!" I froze, my heart pounding in my chest. Müller's grip on me tightened, his breath coming in short gasps. I knew what he was thinking - we couldn't surrender, not like this. 
We were too valuable, too close to the front lines. They would shoot us or take us prisoner, and either way, it would be a death sentence for him. "Run," he whispered, his voice barely audible. "Leave me." I shook my head, tears stinging my eyes. I couldn't do that. I wouldn't. He was my commander, my friend. I owed him everything. But before I could make a move, the air was filled with the sound of gunfire. Müller jerked in my arms, a wet gurgle escaping his lips as he slumped against me. I stared down at him in disbelief, feeling the warmth of his blood soaking through my coat. "Nein," I whispered, my voice choked with grief and rage. "Nein!" I fell to my knees, cradling his lifeless body as the world around me spun out of control. The Americans were closing in, their shouts growing louder. I knew it was over - for both of us. With a final, desperate cry, I raised my pistol and fired into the darkness, hoping to take at least one of them with me. But my aim was wild, my strength gone. The last thing I saw was a flash of light, followed by an explosion of pain in my chest. And then there was nothing but blackness. As I slipped away, I thought of home - of my mother's warm embrace, of the laughter of my little sister. And I knew that I would never see them again. This frozen hellhole would be my grave, and Hauptmann Müller's sacrifice would be forgotten by all but me. But in that moment, as death claimed me, I found a strange sort of peace. We had fought for our Fatherland until the very end, and we had done so with honor. And in the grand tapestry of history, perhaps that was all that mattered. </details> <details> <summary>Click to see spoiler</summary> ### USER Write me a short story about the fictional character of Anton Chigurh. It should be written in third person personal and feature internal monologues of his deranged world view. The time period is around 2 years before the events in Cormac McCarthy's 2005 novel 'No Country for Old Men'. It should be set in Mexico, be *very* dark, show Anton's still developing obsession with "chance" and feature a grenade. The story should start with him waiting in a bar for instructions on the "job" he is to perform. ### ASSISTANT In the dimly lit corner of a nondescript Mexican cantina, Anton Chigurh sat motionless, his eyes scanning the room like a predator assessing its prey. The air was thick with the smell of stale cigarette smoke and cheap tequila, but he remained unfazed, his senses heightened, attuned to every sound, every movement around him. He nursed a glass of water, the condensation beading on its surface like droplets of sweat on cold steel. His mind wandered, as it often did, to the concept of chance. The randomness of life and death fascinated him, the idea that one's fate could hinge on something as simple as a coin flip or the roll of dice. He believed in the purity of such decisions, untainted by human emotion or prejudice. It was a philosophy he had embraced wholeheartedly, and it had served him well in his line of work. A fly buzzed around his head, its tiny wings beating an erratic rhythm against the stillness of the room. He watched it with mild interest, wondering if its life would end here, on this very spot, or if it would continue its meaningless existence elsewhere. The thought brought a small, cruel smile to his lips. The bartender, a grizzled old man with a face like leather, eyed him warily from across the room. 
He knew better than to engage in conversation with this stranger, who had entered his establishment an hour ago and hadn't spoken a word since. There was something about him that made the hairs on the back of his neck stand up, something cold and deadly lurking just beneath the surface. Chigurh's thoughts were interrupted by the sound of the door creaking open. A man in a cheap suit entered, looking around nervously before making his way towards him. He slid into the seat across from Chigurh, who continued to stare straight ahead, as if he hadn't noticed the new arrival. "You're late," Chigurh said finally, his voice low and gravelly. The man in the suit swallowed hard, adjusting his tie. "I apologize. Traffic." Chigurh didn't respond, his eyes still fixed on some distant point. The man fidgeted, reaching into his pocket to retrieve a small envelope. He slid it across the table, and Chigurh picked it up without looking at him, tearing it open with a single, precise movement. Inside was a photograph of a man, along with an address scrawled on a piece of paper. He studied the image for a moment, committing every detail to memory before slipping it back into the envelope and sliding it into his own pocket. "You know what needs to be done," the man in the suit said, his voice barely above a whisper. Chigurh nodded once, his mind already turning to the task at hand. He stood up, leaving a few crumpled bills on the table for the water he hadn't touched. As he walked towards the door, he could feel the weight of the grenade in his other pocket, its cool metal surface pressing against his thigh like a lover's caress. Outside, the heat of the day was oppressive, but Chigurh barely registered it as he climbed into his car and started the engine. He drove through the dusty streets, his thoughts consumed by the upcoming job. The man in the photograph had a wife, two kids. It didn't matter. They were all just pieces on a board, pawns to be moved or removed as necessary. He arrived at the address just as the sun was beginning to set, casting long shadows across the dilapidated building. He sat in his car for several minutes, watching the comings and goings of the neighborhood, waiting for the right moment. When it came, he slipped out of the vehicle and approached the door with a sense of purpose. Inside, he could hear laughter and music, the sounds of a family enjoying their evening together. He smiled to himself as he pulled the grenade from his pocket, feeling its weight in his hand. This was what he lived for: the moment when chance decided who would live and who would die. He took a deep breath, savoring the anticipation, before pulling the pin with a satisfying click. He tossed it into the room, not bothering to wait for the explosion as he turned and walked away. The screams that followed were like music to his ears, a symphony of chaos and destruction. As he drove off into the night, Chigurh couldn't help but feel a sense of satisfaction. Another job well done, another life ended by the whims of fate. He reached into his pocket and pulled out a coin, flipping it idly between his fingers as he considered his next move. Heads or tails? Life or death? It was all the same to him. The world was a cruel place, and he was just its instrument, meting out justice in the only way he knew how: with cold, calculated precision. And if some people happened to get caught in the crossfire... well, that was just the luck of the draw. 
</details> Big thanks to: @sophosympatheia for working out the merge pattern, @Sao10K for creating Euryale and WinterGoddess, and @chargoddard for writing [Mergekit](https://github.com/arcee-ai/mergekit)!
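For readers who want to see what a merge recipe of this kind can look like, below is a minimal Mergekit configuration sketch. It is illustrative only: the model names, layer ranges, merge method, and parameters are assumptions for demonstration, not the actual pattern used for this merge.

```yaml
# Hypothetical Mergekit recipe: every model name, layer range, and parameter
# below is a placeholder, not the recipe behind this model.
merge_method: slerp                        # spherical interpolation between two parents
base_model: some-org/hypothetical-model-a  # hypothetical base model
slices:
  - sources:
      - model: some-org/hypothetical-model-a
        layer_range: [0, 80]               # assumes an 80-layer model
      - model: some-org/hypothetical-model-b
        layer_range: [0, 80]
parameters:
  t: 0.5                                   # 0.0 = pure model-a, 1.0 = pure model-b
dtype: float16
```

A recipe like this would typically be applied with `mergekit-yaml config.yml ./merged-model`, which writes the merged weights to the output directory.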
[ "TRANSLATION" ]
[ "BEAR" ]
Non_BioNLP
Haon-Chen/speed-embedding-7b-instruct
Haon-Chen
feature-extraction
[ "transformers", "safetensors", "mistral", "feature-extraction", "mteb", "en", "arxiv:2410.18634", "arxiv:2104.08663", "arxiv:2210.07316", "license:mit", "model-index", "text-generation-inference", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,730
1,730
169
5
---
language:
- en
license: mit
tags:
- mteb
- transformers
model-index:
- name: speed-embedding-7b-instruct
  results:
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonCounterfactualClassification (en)
      type: mteb/amazon_counterfactual
      config: en
      split: test
      revision: e8379541af4e31359cca9fbcf4b00f2671dba205
    metrics:
    - type: accuracy
      value: 76.67164179104478
    - type: ap
      value: 39.07181577576136
    - type: f1
      value: 70.25085237742982
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonPolarityClassification
      type: mteb/amazon_polarity
      config: default
      split: test
      revision: e2d317d38cd51312af73b3d32a06d1a08b442046
    metrics:
    - type: accuracy
      value: 96.1775
    - type: ap
      value: 94.84308844303422
    - type: f1
      value: 96.17546959843244
  - task:
      type: Classification
    dataset:
      name: MTEB AmazonReviewsClassification (en)
      type: mteb/amazon_reviews_multi
      config: en
      split: test
      revision: 1399c76144fd37290681b995c656ef9b2e06e26d
    metrics:
    - type: accuracy
      value: 56.278000000000006
    - type: f1
      value: 55.45101875980304
  - task:
      type: Retrieval
    dataset:
      name: MTEB ArguAna
      type: arguana
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 33.642
    - type: ndcg_at_3
      value: 49.399
    - type: ndcg_at_5
      value: 54.108999999999995
    - type: ndcg_at_10
      value: 59.294999999999995
    - type: ndcg_at_100
      value: 62.015
    - type: map_at_1
      value: 33.642
    - type: map_at_3
      value: 45.507
    - type: map_at_5
      value: 48.1
    - type: map_at_10
      value: 50.248000000000005
    - type: map_at_100
      value: 50.954
    - type: recall_at_1
      value: 33.642
    - type: recall_at_3
      value: 60.669
    - type: recall_at_5
      value: 72.191
    - type: recall_at_10
      value: 88.193
    - type: recall_at_100
      value: 99.431
    - type: precision_at_1
      value: 33.642
    - type: precision_at_3
      value: 20.223
    - type: precision_at_5
      value: 14.438
    - type: precision_at_10
      value: 8.819
    - type: precision_at_100
      value: 0.9939999999999999
    - type: mrr_at_1
      value: 33.997
    - type: mrr_at_3
      value: 45.614
    - type: mrr_at_5
      value: 48.263
    - type: mrr_at_10
      value: 50.388999999999996
    - type: mrr_at_100
      value: 51.102000000000004
  - task:
      type: Clustering
    dataset:
      name: MTEB ArxivClusteringP2P
      type: mteb/arxiv-clustering-p2p
      config: default
      split: test
      revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
    metrics:
    - type: v_measure
      value: 51.1249344529392
  - task:
      type: Clustering
    dataset:
      name: MTEB ArxivClusteringS2S
      type: mteb/arxiv-clustering-s2s
      config: default
      split: test
      revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
    metrics:
    - type: v_measure
      value: 47.01575217563573
  - task:
      type: Reranking
    dataset:
      name: MTEB AskUbuntuDupQuestions
      type: mteb/askubuntudupquestions-reranking
      config: default
      split: test
      revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
    metrics:
    - type: map
      value: 67.2259454062751
    - type: mrr
      value: 79.37508244294948
  - task:
      type: STS
    dataset:
      name: MTEB BIOSSES
      type: mteb/biosses-sts
      config: default
      split: test
      revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
    metrics:
    - type: cos_sim_pearson
      value: 89.5312396547344
    - type: cos_sim_spearman
      value: 87.1447567367366
    - type: euclidean_pearson
      value: 88.67110804544821
    - type: euclidean_spearman
      value: 87.1447567367366
    - type: manhattan_pearson
      value: 89.06983994154335
    - type: manhattan_spearman
      value: 87.59115245033443
  - task:
      type: Classification
    dataset:
      name: MTEB Banking77Classification
      type: mteb/banking77
      config: default
      split: test
      revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
    metrics:
    - type: accuracy
      value: 88.63636363636364
    - type: f1
      value: 88.58740097633193
  - task:
      type: Clustering
    dataset:
      name: MTEB BiorxivClusteringP2P
      type: mteb/biorxiv-clustering-p2p
      config: default
      split: test
      revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
    metrics:
    - type: v_measure
      value: 41.99753263006505
  - task:
      type: Clustering
    dataset:
      name: MTEB BiorxivClusteringS2S
      type: mteb/biorxiv-clustering-s2s
      config: default
      split: test
      revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
    metrics:
    - type: v_measure
      value: 39.623067884052666
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackRetrieval
      type: BeIR/cqadupstack
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 30.904666666666664
    - type: ndcg_at_3
      value: 36.32808333333333
    - type: ndcg_at_5
      value: 38.767250000000004
    - type: ndcg_at_10
      value: 41.62008333333333
    - type: ndcg_at_100
      value: 47.118083333333324
    - type: map_at_1
      value: 25.7645
    - type: map_at_3
      value: 32.6235
    - type: map_at_5
      value: 34.347
    - type: map_at_10
      value: 35.79658333333333
    - type: map_at_100
      value: 37.10391666666666
    - type: recall_at_1
      value: 25.7645
    - type: recall_at_3
      value: 39.622666666666674
    - type: recall_at_5
      value: 45.938750000000006
    - type: recall_at_10
      value: 54.43816666666667
    - type: recall_at_100
      value: 78.66183333333333
    - type: precision_at_1
      value: 30.904666666666664
    - type: precision_at_3
      value: 17.099083333333333
    - type: precision_at_5
      value: 12.278416666666669
    - type: precision_at_10
      value: 7.573083333333335
    - type: precision_at_100
      value: 1.22275
    - type: mrr_at_1
      value: 30.904666666666664
    - type: mrr_at_3
      value: 37.458333333333336
    - type: mrr_at_5
      value: 38.97333333333333
    - type: mrr_at_10
      value: 40.10316666666666
    - type: mrr_at_100
      value: 41.004250000000006
  - task:
      type: Retrieval
    dataset:
      name: MTEB ClimateFEVER
      type: climate-fever
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 38.046
    - type: ndcg_at_3
      value: 31.842
    - type: ndcg_at_5
      value: 33.698
    - type: ndcg_at_10
      value: 37.765
    - type: ndcg_at_100
      value: 44.998
    - type: map_at_1
      value: 16.682
    - type: map_at_3
      value: 23.624000000000002
    - type: map_at_5
      value: 25.812
    - type: map_at_10
      value: 28.017999999999997
    - type: map_at_100
      value: 30.064999999999998
    - type: recall_at_1
      value: 16.682
    - type: recall_at_3
      value: 28.338
    - type: recall_at_5
      value: 34.486
    - type: recall_at_10
      value: 43.474000000000004
    - type: recall_at_100
      value: 67.984
    - type: precision_at_1
      value: 38.046
    - type: precision_at_3
      value: 23.779
    - type: precision_at_5
      value: 17.849999999999998
    - type: precision_at_10
      value: 11.642
    - type: precision_at_100
      value: 1.9429999999999998
    - type: mrr_at_1
      value: 38.046
    - type: mrr_at_3
      value: 46.764
    - type: mrr_at_5
      value: 48.722
    - type: mrr_at_10
      value: 49.976
    - type: mrr_at_100
      value: 50.693999999999996
  - task:
      type: Retrieval
    dataset:
      name: MTEB DBPedia
      type: dbpedia-entity
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 63.24999999999999
    - type: ndcg_at_3
      value: 54.005
    - type: ndcg_at_5
      value: 51.504000000000005
    - type: ndcg_at_10
      value: 49.738
    - type: ndcg_at_100
      value: 54.754000000000005
    - type: map_at_1
      value: 10.639
    - type: map_at_3
      value: 16.726
    - type: map_at_5
      value: 20.101
    - type: map_at_10
      value: 24.569
    - type: map_at_100
      value: 35.221999999999994
    - type: recall_at_1
      value: 10.639
    - type: recall_at_3
      value: 17.861
    - type: recall_at_5
      value: 22.642
    - type: recall_at_10
      value: 30.105999999999998
    - type: recall_at_100
      value: 60.92999999999999
    - type: precision_at_1
      value: 75.0
    - type: precision_at_3
      value: 58.083
    - type: precision_at_5
      value: 50.0
    - type: precision_at_10
      value: 40.35
    - type: precision_at_100
      value: 12.659999999999998
    - type: mrr_at_1
      value: 75.0
    - type: mrr_at_3
      value: 80.042
    - type: mrr_at_5
      value: 80.779
    - type: mrr_at_10
      value: 81.355
    - type: mrr_at_100
      value: 81.58
  - task:
      type: Classification
    dataset:
      name: MTEB EmotionClassification
      type: mteb/emotion
      config: default
      split: test
      revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
    metrics:
    - type: accuracy
      value: 51.025
    - type: f1
      value: 47.08253474922065
  - task:
      type: Retrieval
    dataset:
      name: MTEB FEVER
      type: fever
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 82.163
    - type: ndcg_at_3
      value: 86.835
    - type: ndcg_at_5
      value: 87.802
    - type: ndcg_at_10
      value: 88.529
    - type: ndcg_at_100
      value: 89.17
    - type: map_at_1
      value: 76.335
    - type: map_at_3
      value: 83.91499999999999
    - type: map_at_5
      value: 84.64500000000001
    - type: map_at_10
      value: 85.058
    - type: map_at_100
      value: 85.257
    - type: recall_at_1
      value: 76.335
    - type: recall_at_3
      value: 90.608
    - type: recall_at_5
      value: 93.098
    - type: recall_at_10
      value: 95.173
    - type: recall_at_100
      value: 97.59299999999999
    - type: precision_at_1
      value: 82.163
    - type: precision_at_3
      value: 33.257999999999996
    - type: precision_at_5
      value: 20.654
    - type: precision_at_10
      value: 10.674999999999999
    - type: precision_at_100
      value: 1.122
    - type: mrr_at_1
      value: 82.163
    - type: mrr_at_3
      value: 88.346
    - type: mrr_at_5
      value: 88.791
    - type: mrr_at_10
      value: 88.97699999999999
    - type: mrr_at_100
      value: 89.031
  - task:
      type: Retrieval
    dataset:
      name: MTEB FiQA2018
      type: fiqa
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 55.093
    - type: ndcg_at_3
      value: 52.481
    - type: ndcg_at_5
      value: 53.545
    - type: ndcg_at_10
      value: 56.053
    - type: ndcg_at_100
      value: 62.53999999999999
    - type: map_at_1
      value: 29.189999999999998
    - type: map_at_3
      value: 42.603
    - type: map_at_5
      value: 45.855000000000004
    - type: map_at_10
      value: 48.241
    - type: map_at_100
      value: 50.300999999999995
    - type: recall_at_1
      value: 29.189999999999998
    - type: recall_at_3
      value: 47.471999999999994
    - type: recall_at_5
      value: 54.384
    - type: recall_at_10
      value: 62.731
    - type: recall_at_100
      value: 86.02300000000001
    - type: precision_at_1
      value: 55.093
    - type: precision_at_3
      value: 34.979
    - type: precision_at_5
      value: 25.278
    - type: precision_at_10
      value: 15.231
    - type: precision_at_100
      value: 2.2190000000000003
    - type: mrr_at_1
      value: 55.093
    - type: mrr_at_3
      value: 61.317
    - type: mrr_at_5
      value: 62.358999999999995
    - type: mrr_at_10
      value: 63.165000000000006
    - type: mrr_at_100
      value: 63.81
  - task:
      type: Retrieval
    dataset:
      name: MTEB HotpotQA
      type: hotpotqa
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 78.866
    - type: ndcg_at_3
      value: 70.128
    - type: ndcg_at_5
      value: 73.017
    - type: ndcg_at_10
      value: 75.166
    - type: ndcg_at_100
      value: 77.97500000000001
    - type: map_at_1
      value: 39.433
    - type: map_at_3
      value: 64.165
    - type: map_at_5
      value: 66.503
    - type: map_at_10
      value: 67.822
    - type: map_at_100
      value: 68.675
    - type: recall_at_1
      value: 39.433
    - type: recall_at_3
      value: 69.03399999999999
    - type: recall_at_5
      value: 74.74
    - type: recall_at_10
      value: 80.108
    - type: recall_at_100
      value: 90.81700000000001
    - type: precision_at_1
      value: 78.866
    - type: precision_at_3
      value: 46.022999999999996
    - type: precision_at_5
      value: 29.896
    - type: precision_at_10
      value: 16.022
    - type: precision_at_100
      value: 1.8159999999999998
    - type: mrr_at_1
      value: 78.866
    - type: mrr_at_3
      value: 83.91
    - type: mrr_at_5
      value: 84.473
    - type: mrr_at_10
      value: 84.769
    - type: mrr_at_100
      value: 84.953
  - task:
      type: Classification
    dataset:
      name: MTEB ImdbClassification
      type: mteb/imdb
      config: default
      split: test
      revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
    metrics:
    - type: accuracy
      value: 94.87799999999999
    - type: ap
      value: 92.5831019543702
    - type: f1
      value: 94.87675087619891
  - task:
      type: Retrieval
    dataset:
      name: MTEB MSMARCO
      type: msmarco
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 23.195
    - type: ndcg_at_3
      value: 34.419
    - type: ndcg_at_5
      value: 38.665
    - type: ndcg_at_10
      value: 42.549
    - type: ndcg_at_100
      value: 48.256
    - type: map_at_1
      value: 22.508
    - type: map_at_3
      value: 31.346
    - type: map_at_5
      value: 33.73
    - type: map_at_10
      value: 35.365
    - type: map_at_100
      value: 36.568
    - type: recall_at_1
      value: 22.508
    - type: recall_at_3
      value: 42.63
    - type: recall_at_5
      value: 52.827999999999996
    - type: recall_at_10
      value: 64.645
    - type: recall_at_100
      value: 90.852
    - type: precision_at_1
      value: 23.195
    - type: precision_at_3
      value: 14.752
    - type: precision_at_5
      value: 11.0
    - type: precision_at_10
      value: 6.755
    - type: precision_at_100
      value: 0.96
    - type: mrr_at_1
      value: 23.195
    - type: mrr_at_3
      value: 32.042
    - type: mrr_at_5
      value: 34.388000000000005
    - type: mrr_at_10
      value: 35.974000000000004
    - type: mrr_at_100
      value: 37.114000000000004
  - task:
      type: Classification
    dataset:
      name: MTEB MTOPDomainClassification (en)
      type: mteb/mtop_domain
      config: en
      split: test
      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
    metrics:
    - type: accuracy
      value: 95.84587323301413
    - type: f1
      value: 95.69948889844318
  - task:
      type: Classification
    dataset:
      name: MTEB MTOPIntentClassification (en)
      type: mteb/mtop_intent
      config: en
      split: test
      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
    metrics:
    - type: accuracy
      value: 87.08162334701322
    - type: f1
      value: 72.237783326283
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (en)
      type: mteb/amazon_massive_intent
      config: en
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 80.19502353732346
    - type: f1
      value: 77.732184986995
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (en)
      type: mteb/amazon_massive_scenario
      config: en
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 82.26630800268998
    - type: f1
      value: 82.12747916248556
  - task:
      type: Clustering
    dataset:
      name: MTEB MedrxivClusteringP2P
      type: mteb/medrxiv-clustering-p2p
      config: default
      split: test
      revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
    metrics:
    - type: v_measure
      value: 36.95240450167033
  - task:
      type: Clustering
    dataset:
      name: MTEB MedrxivClusteringS2S
      type: mteb/medrxiv-clustering-s2s
      config: default
      split: test
      revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
    metrics:
    - type: v_measure
      value: 36.27758530931266
  - task:
      type: Reranking
    dataset:
      name: MTEB MindSmallReranking
      type: mteb/mind_small
      config: default
      split: test
      revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
    metrics:
    - type: map
      value: 33.35707665482982
    - type: mrr
      value: 34.60987842278547
  - task:
      type: Retrieval
    dataset:
      name: MTEB NFCorpus
      type: nfcorpus
      config: default
      split: test
      revision: None
    metrics:
    - type: ndcg_at_1
      value: 47.522999999999996
    - type: ndcg_at_3
      value: 44.489000000000004
    - type: ndcg_at_5
      value: 41.92
    - type: ndcg_at_10
      value: 38.738
    - type: ndcg_at_100
      value: 35.46
    - type: map_at_1
      value: 5.357
    - type: map_at_3
      value: 10.537
    - type: map_at_5
      value: 12.062000000000001
    - type: map_at_10
      value: 14.264
    - type: map_at_100
      value: 18.442
    - type: recall_at_1
      value: 5.357
    - type: recall_at_3
value: 12.499 - type: recall_at_5 value: 14.809 - type: recall_at_10 value: 18.765 - type: recall_at_100 value: 36.779 - type: precision_at_1 value: 49.226 - type: precision_at_3 value: 41.899 - type: precision_at_5 value: 36.718 - type: precision_at_10 value: 29.287999999999997 - type: precision_at_100 value: 9.22 - type: mrr_at_1 value: 49.845 - type: mrr_at_3 value: 57.121 - type: mrr_at_5 value: 58.172999999999995 - type: mrr_at_10 value: 58.906000000000006 - type: mrr_at_100 value: 59.486000000000004 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: ndcg_at_1 value: 42.815999999999995 - type: ndcg_at_3 value: 53.766999999999996 - type: ndcg_at_5 value: 57.957 - type: ndcg_at_10 value: 61.661 - type: ndcg_at_100 value: 65.218 - type: map_at_1 value: 38.364 - type: map_at_3 value: 49.782 - type: map_at_5 value: 52.319 - type: map_at_10 value: 54.07300000000001 - type: map_at_100 value: 54.983000000000004 - type: recall_at_1 value: 38.364 - type: recall_at_3 value: 61.744 - type: recall_at_5 value: 71.32300000000001 - type: recall_at_10 value: 82.015 - type: recall_at_100 value: 96.978 - type: precision_at_1 value: 42.815999999999995 - type: precision_at_3 value: 23.976 - type: precision_at_5 value: 16.866 - type: precision_at_10 value: 9.806 - type: precision_at_100 value: 1.1769999999999998 - type: mrr_at_1 value: 42.845 - type: mrr_at_3 value: 53.307 - type: mrr_at_5 value: 55.434000000000005 - type: mrr_at_10 value: 56.702 - type: mrr_at_100 value: 57.342000000000006 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: ndcg_at_1 value: 82.46 - type: ndcg_at_3 value: 86.774 - type: ndcg_at_5 value: 88.256 - type: ndcg_at_10 value: 89.35 - type: ndcg_at_100 value: 90.46499999999999 - type: map_at_1 value: 71.562 - type: map_at_3 value: 82.948 - type: map_at_5 value: 84.786 - type: map_at_10 value: 85.82300000000001 - type: map_at_100 value: 86.453 - type: recall_at_1 value: 71.562 - type: recall_at_3 value: 88.51 - type: recall_at_5 value: 92.795 - type: recall_at_10 value: 95.998 - type: recall_at_100 value: 99.701 - type: precision_at_1 value: 82.46 - type: precision_at_3 value: 38.1 - type: precision_at_5 value: 24.990000000000002 - type: precision_at_10 value: 13.553999999999998 - type: precision_at_100 value: 1.539 - type: mrr_at_1 value: 82.43 - type: mrr_at_3 value: 87.653 - type: mrr_at_5 value: 88.26899999999999 - type: mrr_at_10 value: 88.505 - type: mrr_at_100 value: 88.601 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 57.928338007609256 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 65.28915417473826 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: ndcg_at_1 value: 17.2 - type: ndcg_at_3 value: 15.856 - type: ndcg_at_5 value: 13.983 - type: ndcg_at_10 value: 16.628999999999998 - type: ndcg_at_100 value: 23.845 - type: map_at_1 value: 3.4750000000000005 - type: map_at_3 value: 6.905 - type: map_at_5 value: 8.254 - type: map_at_10 value: 9.474 - type: map_at_100 value: 11.242 - type: recall_at_1 value: 3.4750000000000005 - type: recall_at_3 
value: 9.298 - type: recall_at_5 value: 12.817 - type: recall_at_10 value: 17.675 - type: recall_at_100 value: 38.678000000000004 - type: precision_at_1 value: 17.2 - type: precision_at_3 value: 15.299999999999999 - type: precision_at_5 value: 12.64 - type: precision_at_10 value: 8.72 - type: precision_at_100 value: 1.907 - type: mrr_at_1 value: 17.2 - type: mrr_at_3 value: 25.55 - type: mrr_at_5 value: 27.485 - type: mrr_at_10 value: 28.809 - type: mrr_at_100 value: 29.964000000000002 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 86.10434430387332 - type: cos_sim_spearman value: 82.46041161692649 - type: euclidean_pearson value: 83.4010092798136 - type: euclidean_spearman value: 82.46040715308601 - type: manhattan_pearson value: 83.6702316837156 - type: manhattan_spearman value: 82.72271392303014 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 87.3179771524676 - type: cos_sim_spearman value: 80.15194914870666 - type: euclidean_pearson value: 84.54005271342946 - type: euclidean_spearman value: 80.15194914870666 - type: manhattan_pearson value: 85.24410357734307 - type: manhattan_spearman value: 80.78274673604562 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 89.2691354894402 - type: cos_sim_spearman value: 89.94300436293618 - type: euclidean_pearson value: 89.5600067781475 - type: euclidean_spearman value: 89.942989691344 - type: manhattan_pearson value: 89.80327997794308 - type: manhattan_spearman value: 90.3964860275568 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 87.68003396295498 - type: cos_sim_spearman value: 86.23848649310362 - type: euclidean_pearson value: 87.0702308813695 - type: euclidean_spearman value: 86.23848649310362 - type: manhattan_pearson value: 87.24495415360472 - type: manhattan_spearman value: 86.58198464997109 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 90.25643329096215 - type: cos_sim_spearman value: 91.19520084590636 - type: euclidean_pearson value: 90.68579446788728 - type: euclidean_spearman value: 91.19519611831312 - type: manhattan_pearson value: 90.83476867273104 - type: manhattan_spearman value: 91.4569817842705 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 86.41175694023282 - type: cos_sim_spearman value: 88.18744495989392 - type: euclidean_pearson value: 87.60085709987156 - type: euclidean_spearman value: 88.18773792681107 - type: manhattan_pearson value: 87.83199472909764 - type: manhattan_spearman value: 88.45824161471776 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 91.78311335565503 - type: cos_sim_spearman value: 91.93416269793802 - type: euclidean_pearson value: 91.84163160890154 - 
type: euclidean_spearman value: 91.93416269793802 - type: manhattan_pearson value: 91.77053255749301 - type: manhattan_spearman value: 91.67392623286098 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 68.2137857919086 - type: cos_sim_spearman value: 68.31928639693375 - type: euclidean_pearson value: 69.96072053688385 - type: euclidean_spearman value: 68.31928639693375 - type: manhattan_pearson value: 70.47736299273389 - type: manhattan_spearman value: 68.72439259356818 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 88.16092476703817 - type: cos_sim_spearman value: 89.20507562822989 - type: euclidean_pearson value: 88.91358225424611 - type: euclidean_spearman value: 89.20505548241839 - type: manhattan_pearson value: 88.98787306839809 - type: manhattan_spearman value: 89.37338458483269 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 87.29108971888714 - type: mrr value: 96.62042024787124 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: ndcg_at_1 value: 63.333 - type: ndcg_at_3 value: 72.768 - type: ndcg_at_5 value: 75.124 - type: ndcg_at_10 value: 77.178 - type: ndcg_at_100 value: 78.769 - type: map_at_1 value: 60.9 - type: map_at_3 value: 69.69999999999999 - type: map_at_5 value: 71.345 - type: map_at_10 value: 72.36200000000001 - type: map_at_100 value: 72.783 - type: recall_at_1 value: 60.9 - type: recall_at_3 value: 79.172 - type: recall_at_5 value: 84.917 - type: recall_at_10 value: 90.756 - type: recall_at_100 value: 97.667 - type: precision_at_1 value: 63.333 - type: precision_at_3 value: 28.555999999999997 - type: precision_at_5 value: 18.8 - type: precision_at_10 value: 10.233 - type: precision_at_100 value: 1.107 - type: mrr_at_1 value: 63.333 - type: mrr_at_3 value: 71.27799999999999 - type: mrr_at_5 value: 72.478 - type: mrr_at_10 value: 73.163 - type: mrr_at_100 value: 73.457 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.8009900990099 - type: cos_sim_ap value: 95.46920445134404 - type: cos_sim_f1 value: 89.70814132104455 - type: cos_sim_precision value: 91.9202518363064 - type: cos_sim_recall value: 87.6 - type: dot_accuracy value: 99.8009900990099 - type: dot_ap value: 95.46920445134404 - type: dot_f1 value: 89.70814132104455 - type: dot_precision value: 91.9202518363064 - type: dot_recall value: 87.6 - type: euclidean_accuracy value: 99.8009900990099 - type: euclidean_ap value: 95.46924273007079 - type: euclidean_f1 value: 89.70814132104455 - type: euclidean_precision value: 91.9202518363064 - type: euclidean_recall value: 87.6 - type: manhattan_accuracy value: 99.81188118811882 - type: manhattan_ap value: 95.77631677784113 - type: manhattan_f1 value: 90.26639344262296 - type: manhattan_precision value: 92.5420168067227 - type: manhattan_recall value: 88.1 - type: max_accuracy value: 99.81188118811882 - type: max_ap value: 95.77631677784113 - type: max_f1 value: 
90.26639344262296 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 71.59238280333025 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 39.012562075214035 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 55.16521497700657 - type: mrr value: 56.1779427680163 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 31.04402552863106 - type: cos_sim_spearman value: 31.05558230938988 - type: dot_pearson value: 31.04400838015153 - type: dot_spearman value: 31.05558230938988 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: ndcg_at_1 value: 91.0 - type: ndcg_at_3 value: 92.34599999999999 - type: ndcg_at_5 value: 90.89399999999999 - type: ndcg_at_10 value: 87.433 - type: ndcg_at_100 value: 67.06400000000001 - type: map_at_1 value: 0.241 - type: map_at_3 value: 0.735 - type: map_at_5 value: 1.216 - type: map_at_10 value: 2.317 - type: map_at_100 value: 14.151 - type: recall_at_1 value: 0.241 - type: recall_at_3 value: 0.76 - type: recall_at_5 value: 1.254 - type: recall_at_10 value: 2.421 - type: recall_at_100 value: 16.715 - type: precision_at_1 value: 94.0 - type: precision_at_3 value: 96.0 - type: precision_at_5 value: 94.8 - type: precision_at_10 value: 91.4 - type: precision_at_100 value: 68.24 - type: mrr_at_1 value: 94.0 - type: mrr_at_3 value: 96.667 - type: mrr_at_5 value: 96.667 - type: mrr_at_10 value: 96.667 - type: mrr_at_100 value: 96.667 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: ndcg_at_1 value: 26.531 - type: ndcg_at_3 value: 27.728 - type: ndcg_at_5 value: 25.668000000000003 - type: ndcg_at_10 value: 25.785999999999998 - type: ndcg_at_100 value: 35.623 - type: map_at_1 value: 2.076 - type: map_at_3 value: 5.29 - type: map_at_5 value: 7.292999999999999 - type: map_at_10 value: 9.81 - type: map_at_100 value: 15.461 - type: recall_at_1 value: 2.076 - type: recall_at_3 value: 6.7250000000000005 - type: recall_at_5 value: 9.808 - type: recall_at_10 value: 16.467000000000002 - type: recall_at_100 value: 45.109 - type: precision_at_1 value: 28.571 - type: precision_at_3 value: 29.252 - type: precision_at_5 value: 25.714 - type: precision_at_10 value: 23.265 - type: precision_at_100 value: 7.184 - type: mrr_at_1 value: 28.571 - type: mrr_at_3 value: 42.857 - type: mrr_at_5 value: 44.184 - type: mrr_at_10 value: 47.564 - type: mrr_at_100 value: 48.142 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 68.43159999999999 - type: ap value: 14.08119146524032 - type: f1 value: 53.26032318755336 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: 
mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 63.82852292020373 - type: f1 value: 64.14509521870399 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 55.252554461698566 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 88.54383978065208 - type: cos_sim_ap value: 81.67495128150328 - type: cos_sim_f1 value: 74.58161532864419 - type: cos_sim_precision value: 69.00807899461401 - type: cos_sim_recall value: 81.13456464379946 - type: dot_accuracy value: 88.54383978065208 - type: dot_ap value: 81.6748330747088 - type: dot_f1 value: 74.58161532864419 - type: dot_precision value: 69.00807899461401 - type: dot_recall value: 81.13456464379946 - type: euclidean_accuracy value: 88.54383978065208 - type: euclidean_ap value: 81.67496006818212 - type: euclidean_f1 value: 74.58161532864419 - type: euclidean_precision value: 69.00807899461401 - type: euclidean_recall value: 81.13456464379946 - type: manhattan_accuracy value: 88.40674733265782 - type: manhattan_ap value: 81.56036996969941 - type: manhattan_f1 value: 74.33063129452223 - type: manhattan_precision value: 69.53125 - type: manhattan_recall value: 79.84168865435356 - type: max_accuracy value: 88.54383978065208 - type: max_ap value: 81.67496006818212 - type: max_f1 value: 74.58161532864419 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.75627740908915 - type: cos_sim_ap value: 87.41911504007292 - type: cos_sim_f1 value: 79.91742008969888 - type: cos_sim_precision value: 74.31484178472131 - type: cos_sim_recall value: 86.43363104404065 - type: dot_accuracy value: 89.75627740908915 - type: dot_ap value: 87.41910845717851 - type: dot_f1 value: 79.91742008969888 - type: dot_precision value: 74.31484178472131 - type: dot_recall value: 86.43363104404065 - type: euclidean_accuracy value: 89.75627740908915 - type: euclidean_ap value: 87.41912150448005 - type: euclidean_f1 value: 79.91742008969888 - type: euclidean_precision value: 74.31484178472131 - type: euclidean_recall value: 86.43363104404065 - type: manhattan_accuracy value: 89.76597974152986 - type: manhattan_ap value: 87.49835162128704 - type: manhattan_f1 value: 80.05401656994779 - type: manhattan_precision value: 76.10158906390951 - type: manhattan_recall value: 84.43948259932245 - type: max_accuracy value: 89.76597974152986 - type: max_ap value: 87.49835162128704 - type: max_f1 value: 80.05401656994779
---

## SPEED-embedding-7b-instruct

[Little Giants: Synthesizing High-Quality Embedding Data at Scale](https://arxiv.org/pdf/2410.18634.pdf). Haonan Chen, Liang Wang, Nan Yang, Yutao Zhu, Ziliang Zhao, Furu Wei, Zhicheng Dou, arXiv 2024

This model has 32 layers and an embedding size of 4096.

## Usage

Below is an example of encoding queries and passages from the MS-MARCO passage ranking dataset.
### Transformers

```python
import torch
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel


def last_token_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
    # With left padding, the final position always holds the last real token;
    # otherwise, gather each sequence's last non-padding position.
    left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
    if left_padding:
        return last_hidden_states[:, -1]
    else:
        sequence_lengths = attention_mask.sum(dim=1) - 1
        batch_size = last_hidden_states.shape[0]
        return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]


def get_detailed_instruct(task_description: str, query: str) -> str:
    return f'Instruct: {task_description}\nQuery: {query}'


# Each query must come with a one-sentence instruction that describes the task
task = 'Given a web search query, retrieve relevant passages that answer the query'
queries = [
    get_detailed_instruct(task, 'how much protein should a female eat'),
    get_detailed_instruct(task, 'summit define')
]
# No need to add instruction for retrieval documents
documents = [
    "As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
    "Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
input_texts = queries + documents

tokenizer = AutoTokenizer.from_pretrained('Haon-Chen/speed-embedding-7b-instruct')
model = AutoModel.from_pretrained('Haon-Chen/speed-embedding-7b-instruct')

max_length = 4096
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=max_length, padding=True, truncation=True, return_tensors='pt')

outputs = model(**batch_dict)
embeddings = last_token_pool(outputs.last_hidden_state, batch_dict['attention_mask'])

# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```

Since the embeddings are L2-normalized, the dot products in `scores` are cosine similarities (scaled by 100).

## MTEB Benchmark Evaluation

Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB benchmark](https://arxiv.org/abs/2210.07316).

## FAQ

**1. Do I need to add instructions to the query?**

Yes, this is how the model is trained; otherwise you will see a performance degradation. The task definition should be a one-sentence instruction that describes the task. This is a way to customize text embeddings for different scenarios through natural language instructions (see the sketch after this FAQ). Please check out [unilm/e5/utils.py](https://github.com/microsoft/unilm/blob/9c0f1ff7ca53431fe47d2637dfe253643d94185b/e5/utils.py#L106) for the instructions we used for evaluation. On the other hand, there is no need to add instructions to the document side.

**2. Why are my reproduced results slightly different from those reported in the model card?**

Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences.

**3. Where are the LoRA-only weights?**

You can find the LoRA-only weights at [https://huggingface.co/Haon-Chen/speed-embedding-7b-instruct/tree/main/lora](https://huggingface.co/Haon-Chen/speed-embedding-7b-instruct/tree/main/lora).
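To make point 1 concrete, here is a minimal sketch of customizing the embeddings through a different task instruction. The duplicate-question instruction below is a hypothetical illustration, not one of the instructions used in the paper's evaluation; see the `utils.py` link above for the instructions actually used.

```python
# Minimal sketch: the same instruction template as in the usage example above,
# with a hypothetical task description swapped in (illustration only).
def get_detailed_instruct(task_description: str, query: str) -> str:
    return f'Instruct: {task_description}\nQuery: {query}'

# Hypothetical instruction for a duplicate-question retrieval scenario.
task = 'Given a question, retrieve questions that are semantically equivalent to it'
print(get_detailed_instruct(task, 'how do I bake bread without yeast'))
# Corpus-side texts (the candidate questions) would be encoded without any instruction.
```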
## Citation

If you find our paper or models helpful, please consider citing us as follows:

```bibtex
@article{chen2024little,
  title={Little Giants: Synthesizing High-Quality Embedding Data at Scale},
  author={Chen, Haonan and Wang, Liang and Yang, Nan and Zhu, Yutao and Zhao, Ziliang and Wei, Furu and Dou, Zhicheng},
  journal={arXiv preprint arXiv:2410.18634},
  year={2024}
}
```

## Limitations

Using this model for inputs longer than 4096 tokens is not recommended.
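As a simple guard against this limit, one can count tokens before encoding. The helper below is our sketch, not part of the official usage; it reuses the tokenizer and the 4096-token budget from the usage example above.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('Haon-Chen/speed-embedding-7b-instruct')


def fits_context(text: str, max_length: int = 4096) -> bool:
    # Tokenize without truncation to measure the true input length.
    n_tokens = len(tokenizer(text, truncation=False)['input_ids'])
    return n_tokens <= max_length


print(fits_context('how much protein should a female eat'))  # short query -> True
```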
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
deepfile/multilingual-e5-small-onnx-qint8
deepfile
sentence-similarity
[ "sentence-transformers", "onnx", "safetensors", "bert", "mteb", "Sentence Transformers", "sentence-similarity", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "license:mit", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,731
1,731
10
1
--- language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - 'no' - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh license: mit tags: - mteb - Sentence Transformers - sentence-similarity - sentence-transformers model-index: - name: multilingual-e5-small results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 73.79104477611939 - type: ap value: 36.9996434842022 - type: f1 value: 67.95453679103099 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (de) type: mteb/amazon_counterfactual config: de split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 71.64882226980728 - type: ap value: 82.11942130026586 - type: f1 value: 69.87963421606715 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en-ext) type: mteb/amazon_counterfactual config: en-ext split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 75.8095952023988 - type: ap value: 24.46869495579561 - type: f1 value: 63.00108480037597 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (ja) type: mteb/amazon_counterfactual config: ja split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 64.186295503212 - type: ap value: 15.496804690197042 - type: f1 value: 52.07153895475031 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 88.699325 - type: ap value: 85.27039559917269 - type: f1 value: 88.65556295032513 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 44.69799999999999 - type: f1 value: 43.73187348654165 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (de) type: mteb/amazon_reviews_multi config: de split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 40.245999999999995 - type: f1 value: 39.3863530637684 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (es) type: mteb/amazon_reviews_multi config: es split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 40.394 - type: f1 value: 39.301223469483446 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 38.864 - type: f1 value: 37.97974261868003 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (ja) type: mteb/amazon_reviews_multi config: ja split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 37.682 - type: f1 value: 
37.07399369768313 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 37.504 - type: f1 value: 36.62317273874278 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 19.061 - type: map_at_10 value: 31.703 - type: map_at_100 value: 32.967 - type: map_at_1000 value: 33.001000000000005 - type: map_at_3 value: 27.466 - type: map_at_5 value: 29.564 - type: mrr_at_1 value: 19.559 - type: mrr_at_10 value: 31.874999999999996 - type: mrr_at_100 value: 33.146 - type: mrr_at_1000 value: 33.18 - type: mrr_at_3 value: 27.667 - type: mrr_at_5 value: 29.74 - type: ndcg_at_1 value: 19.061 - type: ndcg_at_10 value: 39.062999999999995 - type: ndcg_at_100 value: 45.184000000000005 - type: ndcg_at_1000 value: 46.115 - type: ndcg_at_3 value: 30.203000000000003 - type: ndcg_at_5 value: 33.953 - type: precision_at_1 value: 19.061 - type: precision_at_10 value: 6.279999999999999 - type: precision_at_100 value: 0.9129999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 12.706999999999999 - type: precision_at_5 value: 9.431000000000001 - type: recall_at_1 value: 19.061 - type: recall_at_10 value: 62.802 - type: recall_at_100 value: 91.323 - type: recall_at_1000 value: 98.72 - type: recall_at_3 value: 38.122 - type: recall_at_5 value: 47.155 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 39.22266660528253 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 30.79980849482483 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 57.8790068352054 - type: mrr value: 71.78791276436706 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 82.36328364043163 - type: cos_sim_spearman value: 82.26211536195868 - type: euclidean_pearson value: 80.3183865039173 - type: euclidean_spearman value: 79.88495276296132 - type: manhattan_pearson value: 80.14484480692127 - type: manhattan_spearman value: 80.39279565980743 - task: type: BitextMining dataset: name: MTEB BUCC (de-en) type: mteb/bucc-bitext-mining config: de-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 98.0375782881002 - type: f1 value: 97.86012526096033 - type: precision value: 97.77139874739039 - type: recall value: 98.0375782881002 - task: type: BitextMining dataset: name: MTEB BUCC (fr-en) type: mteb/bucc-bitext-mining config: fr-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 93.35241030156286 - type: f1 value: 92.66050333846944 - type: precision value: 92.3306919069631 - type: recall value: 93.35241030156286 - task: type: BitextMining dataset: name: MTEB BUCC (ru-en) type: mteb/bucc-bitext-mining config: ru-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: 
- type: accuracy value: 94.0699688257707 - type: f1 value: 93.50236693222492 - type: precision value: 93.22791825424315 - type: recall value: 94.0699688257707 - task: type: BitextMining dataset: name: MTEB BUCC (zh-en) type: mteb/bucc-bitext-mining config: zh-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 89.25750394944708 - type: f1 value: 88.79234684921889 - type: precision value: 88.57293312269616 - type: recall value: 89.25750394944708 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 79.41558441558442 - type: f1 value: 79.25886487487219 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 35.747820820329736 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 27.045143830596146 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 24.252999999999997 - type: map_at_10 value: 31.655916666666666 - type: map_at_100 value: 32.680749999999996 - type: map_at_1000 value: 32.79483333333334 - type: map_at_3 value: 29.43691666666666 - type: map_at_5 value: 30.717416666666665 - type: mrr_at_1 value: 28.602750000000004 - type: mrr_at_10 value: 35.56875 - type: mrr_at_100 value: 36.3595 - type: mrr_at_1000 value: 36.427749999999996 - type: mrr_at_3 value: 33.586166666666664 - type: mrr_at_5 value: 34.73641666666666 - type: ndcg_at_1 value: 28.602750000000004 - type: ndcg_at_10 value: 36.06933333333334 - type: ndcg_at_100 value: 40.70141666666667 - type: ndcg_at_1000 value: 43.24341666666667 - type: ndcg_at_3 value: 32.307916666666664 - type: ndcg_at_5 value: 34.129999999999995 - type: precision_at_1 value: 28.602750000000004 - type: precision_at_10 value: 6.097666666666667 - type: precision_at_100 value: 0.9809166666666668 - type: precision_at_1000 value: 0.13766666666666663 - type: precision_at_3 value: 14.628166666666667 - type: precision_at_5 value: 10.266916666666667 - type: recall_at_1 value: 24.252999999999997 - type: recall_at_10 value: 45.31916666666667 - type: recall_at_100 value: 66.03575000000001 - type: recall_at_1000 value: 83.94708333333334 - type: recall_at_3 value: 34.71941666666666 - type: recall_at_5 value: 39.46358333333333 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 9.024000000000001 - type: map_at_10 value: 15.644 - type: map_at_100 value: 17.154 - type: map_at_1000 value: 17.345 - type: map_at_3 value: 13.028 - type: map_at_5 value: 14.251 - type: mrr_at_1 value: 19.674 - type: mrr_at_10 value: 29.826999999999998 - type: mrr_at_100 value: 30.935000000000002 - type: mrr_at_1000 value: 30.987 - type: mrr_at_3 value: 26.645000000000003 - type: mrr_at_5 value: 28.29 - type: ndcg_at_1 value: 19.674 - type: ndcg_at_10 value: 22.545 - type: ndcg_at_100 value: 29.207 - type: ndcg_at_1000 value: 32.912 - type: ndcg_at_3 value: 17.952 - type: ndcg_at_5 value: 19.363 - type: precision_at_1 value: 19.674 - type: precision_at_10 value: 7.212000000000001 - 
type: precision_at_100 value: 1.435 - type: precision_at_1000 value: 0.212 - type: precision_at_3 value: 13.507 - type: precision_at_5 value: 10.397 - type: recall_at_1 value: 9.024000000000001 - type: recall_at_10 value: 28.077999999999996 - type: recall_at_100 value: 51.403 - type: recall_at_1000 value: 72.406 - type: recall_at_3 value: 16.768 - type: recall_at_5 value: 20.737 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.012 - type: map_at_10 value: 17.138 - type: map_at_100 value: 24.146 - type: map_at_1000 value: 25.622 - type: map_at_3 value: 12.552 - type: map_at_5 value: 14.435 - type: mrr_at_1 value: 62.25000000000001 - type: mrr_at_10 value: 71.186 - type: mrr_at_100 value: 71.504 - type: mrr_at_1000 value: 71.514 - type: mrr_at_3 value: 69.333 - type: mrr_at_5 value: 70.408 - type: ndcg_at_1 value: 49.75 - type: ndcg_at_10 value: 37.76 - type: ndcg_at_100 value: 42.071 - type: ndcg_at_1000 value: 49.309 - type: ndcg_at_3 value: 41.644 - type: ndcg_at_5 value: 39.812999999999995 - type: precision_at_1 value: 62.25000000000001 - type: precision_at_10 value: 30.15 - type: precision_at_100 value: 9.753 - type: precision_at_1000 value: 1.9189999999999998 - type: precision_at_3 value: 45.667 - type: precision_at_5 value: 39.15 - type: recall_at_1 value: 8.012 - type: recall_at_10 value: 22.599 - type: recall_at_100 value: 48.068 - type: recall_at_1000 value: 71.328 - type: recall_at_3 value: 14.043 - type: recall_at_5 value: 17.124 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 42.455 - type: f1 value: 37.59462649781862 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 58.092 - type: map_at_10 value: 69.586 - type: map_at_100 value: 69.968 - type: map_at_1000 value: 69.982 - type: map_at_3 value: 67.48100000000001 - type: map_at_5 value: 68.915 - type: mrr_at_1 value: 62.166 - type: mrr_at_10 value: 73.588 - type: mrr_at_100 value: 73.86399999999999 - type: mrr_at_1000 value: 73.868 - type: mrr_at_3 value: 71.6 - type: mrr_at_5 value: 72.99 - type: ndcg_at_1 value: 62.166 - type: ndcg_at_10 value: 75.27199999999999 - type: ndcg_at_100 value: 76.816 - type: ndcg_at_1000 value: 77.09700000000001 - type: ndcg_at_3 value: 71.36 - type: ndcg_at_5 value: 73.785 - type: precision_at_1 value: 62.166 - type: precision_at_10 value: 9.716 - type: precision_at_100 value: 1.065 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 28.278 - type: precision_at_5 value: 18.343999999999998 - type: recall_at_1 value: 58.092 - type: recall_at_10 value: 88.73400000000001 - type: recall_at_100 value: 95.195 - type: recall_at_1000 value: 97.04599999999999 - type: recall_at_3 value: 78.45 - type: recall_at_5 value: 84.316 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 16.649 - type: map_at_10 value: 26.457000000000004 - type: map_at_100 value: 28.169 - type: map_at_1000 value: 28.352 - type: map_at_3 value: 23.305 - type: map_at_5 value: 25.169000000000004 - type: mrr_at_1 value: 32.407000000000004 - type: mrr_at_10 value: 40.922 - type: mrr_at_100 value: 41.931000000000004 - type: mrr_at_1000 value: 41.983 - type: mrr_at_3 value: 38.786 - type: mrr_at_5 value: 
40.205999999999996 - type: ndcg_at_1 value: 32.407000000000004 - type: ndcg_at_10 value: 33.314 - type: ndcg_at_100 value: 40.312 - type: ndcg_at_1000 value: 43.685 - type: ndcg_at_3 value: 30.391000000000002 - type: ndcg_at_5 value: 31.525 - type: precision_at_1 value: 32.407000000000004 - type: precision_at_10 value: 8.966000000000001 - type: precision_at_100 value: 1.6019999999999999 - type: precision_at_1000 value: 0.22200000000000003 - type: precision_at_3 value: 20.165 - type: precision_at_5 value: 14.722 - type: recall_at_1 value: 16.649 - type: recall_at_10 value: 39.117000000000004 - type: recall_at_100 value: 65.726 - type: recall_at_1000 value: 85.784 - type: recall_at_3 value: 27.914 - type: recall_at_5 value: 33.289 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 36.253 - type: map_at_10 value: 56.16799999999999 - type: map_at_100 value: 57.06099999999999 - type: map_at_1000 value: 57.126 - type: map_at_3 value: 52.644999999999996 - type: map_at_5 value: 54.909 - type: mrr_at_1 value: 72.505 - type: mrr_at_10 value: 79.66 - type: mrr_at_100 value: 79.869 - type: mrr_at_1000 value: 79.88 - type: mrr_at_3 value: 78.411 - type: mrr_at_5 value: 79.19800000000001 - type: ndcg_at_1 value: 72.505 - type: ndcg_at_10 value: 65.094 - type: ndcg_at_100 value: 68.219 - type: ndcg_at_1000 value: 69.515 - type: ndcg_at_3 value: 59.99 - type: ndcg_at_5 value: 62.909000000000006 - type: precision_at_1 value: 72.505 - type: precision_at_10 value: 13.749 - type: precision_at_100 value: 1.619 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 38.357 - type: precision_at_5 value: 25.313000000000002 - type: recall_at_1 value: 36.253 - type: recall_at_10 value: 68.744 - type: recall_at_100 value: 80.925 - type: recall_at_1000 value: 89.534 - type: recall_at_3 value: 57.535000000000004 - type: recall_at_5 value: 63.282000000000004 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 80.82239999999999 - type: ap value: 75.65895781725314 - type: f1 value: 80.75880969095746 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 21.624 - type: map_at_10 value: 34.075 - type: map_at_100 value: 35.229 - type: map_at_1000 value: 35.276999999999994 - type: map_at_3 value: 30.245 - type: map_at_5 value: 32.42 - type: mrr_at_1 value: 22.264 - type: mrr_at_10 value: 34.638000000000005 - type: mrr_at_100 value: 35.744 - type: mrr_at_1000 value: 35.787 - type: mrr_at_3 value: 30.891000000000002 - type: mrr_at_5 value: 33.042 - type: ndcg_at_1 value: 22.264 - type: ndcg_at_10 value: 40.991 - type: ndcg_at_100 value: 46.563 - type: ndcg_at_1000 value: 47.743 - type: ndcg_at_3 value: 33.198 - type: ndcg_at_5 value: 37.069 - type: precision_at_1 value: 22.264 - type: precision_at_10 value: 6.5089999999999995 - type: precision_at_100 value: 0.9299999999999999 - type: precision_at_1000 value: 0.10300000000000001 - type: precision_at_3 value: 14.216999999999999 - type: precision_at_5 value: 10.487 - type: recall_at_1 value: 21.624 - type: recall_at_10 value: 62.303 - type: recall_at_100 value: 88.124 - type: recall_at_1000 value: 97.08 - type: recall_at_3 value: 41.099999999999994 - type: recall_at_5 value: 50.381 - task: type: Classification dataset: name: MTEB 
MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 91.06703146374831 - type: f1 value: 90.86867815863172 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (de) type: mteb/mtop_domain config: de split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 87.46970977740209 - type: f1 value: 86.36832872036588 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (es) type: mteb/mtop_domain config: es split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 89.26951300867245 - type: f1 value: 88.93561193959502 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 84.22799874725963 - type: f1 value: 84.30490069236556 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (hi) type: mteb/mtop_domain config: hi split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 86.02007888131948 - type: f1 value: 85.39376041027991 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (th) type: mteb/mtop_domain config: th split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 85.34900542495481 - type: f1 value: 85.39859673336713 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 71.078431372549 - type: f1 value: 53.45071102002276 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (de) type: mteb/mtop_intent config: de split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 65.85798816568047 - type: f1 value: 46.53112748993529 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (es) type: mteb/mtop_intent config: es split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 67.96864576384256 - type: f1 value: 45.966703022829506 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 61.31537738803633 - type: f1 value: 45.52601712835461 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (hi) type: mteb/mtop_intent config: hi split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 66.29616349946218 - type: f1 value: 47.24166485726613 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (th) type: mteb/mtop_intent config: th split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 67.51537070524412 - type: f1 value: 49.463476319014276 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (af) type: mteb/amazon_massive_intent config: af split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 57.06792199058508 - type: f1 value: 54.094921857502285 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (am) type: mteb/amazon_massive_intent config: am split: test revision: 
31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - {type: accuracy, value: 51.960322797579025}
    - {type: f1, value: 48.547371223370945}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ar), type: mteb/amazon_massive_intent, config: ar, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 54.425016812373904}
    - {type: f1, value: 50.47069202054312}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (az), type: mteb/amazon_massive_intent, config: az, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 59.798251513113655}
    - {type: f1, value: 57.05013069086648}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (bn), type: mteb/amazon_massive_intent, config: bn, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 59.37794216543376}
    - {type: f1, value: 56.3607992649805}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (cy), type: mteb/amazon_massive_intent, config: cy, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 46.56018829858777}
    - {type: f1, value: 43.87319715715134}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (da), type: mteb/amazon_massive_intent, config: da, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 62.9724277067922}
    - {type: f1, value: 59.36480066245562}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (de), type: mteb/amazon_massive_intent, config: de, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 62.72696704774715}
    - {type: f1, value: 59.143595966615855}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (el), type: mteb/amazon_massive_intent, config: el, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 61.5971755211836}
    - {type: f1, value: 59.169445724946726}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (en), type: mteb/amazon_massive_intent, config: en, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 70.29589778076665}
    - {type: f1, value: 67.7577001808977}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (es), type: mteb/amazon_massive_intent, config: es, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 66.31136516476126}
    - {type: f1, value: 64.52032955983242}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (fa), type: mteb/amazon_massive_intent, config: fa, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 65.54472091459314}
    - {type: f1, value: 61.47903120066317}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (fi), type: mteb/amazon_massive_intent, config: fi, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 61.45595158036314}
    - {type: f1, value: 58.0891846024637}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (fr), type: mteb/amazon_massive_intent, config: fr, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 65.47074646940149}
    - {type: f1, value: 62.84830858877575}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (he), type: mteb/amazon_massive_intent, config: he, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 58.046402151983855}
    - {type: f1, value: 55.269074430533195}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (hi), type: mteb/amazon_massive_intent, config: hi, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 64.06523201075991}
    - {type: f1, value: 61.35339643021369}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (hu), type: mteb/amazon_massive_intent, config: hu, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 60.954942837928726}
    - {type: f1, value: 57.07035922704846}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (hy), type: mteb/amazon_massive_intent, config: hy, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 57.404169468728995}
    - {type: f1, value: 53.94259011839138}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (id), type: mteb/amazon_massive_intent, config: id, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 64.16610625420309}
    - {type: f1, value: 61.337103431499365}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (is), type: mteb/amazon_massive_intent, config: is, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 52.262945527908535}
    - {type: f1, value: 49.7610691598921}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (it), type: mteb/amazon_massive_intent, config: it, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 65.54472091459314}
    - {type: f1, value: 63.469099018440154}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ja), type: mteb/amazon_massive_intent, config: ja, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 68.22797579018157}
    - {type: f1, value: 64.89098471083001}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (jv), type: mteb/amazon_massive_intent, config: jv, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 50.847343644922674}
    - {type: f1, value: 47.8536963168393}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ka), type: mteb/amazon_massive_intent, config: ka, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 48.45326160053799}
    - {type: f1, value: 46.370078045805556}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (km), type: mteb/amazon_massive_intent, config: km, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 42.83120376597175}
    - {type: f1, value: 39.68948521599982}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (kn), type: mteb/amazon_massive_intent, config: kn, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 57.5084061869536}
    - {type: f1, value: 53.961876160401545}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ko), type: mteb/amazon_massive_intent, config: ko, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 63.7895090786819}
    - {type: f1, value: 61.134223684676}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (lv), type: mteb/amazon_massive_intent, config: lv, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 54.98991257565569}
    - {type: f1, value: 52.579862862826296}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ml), type: mteb/amazon_massive_intent, config: ml, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 61.90316072629456}
    - {type: f1, value: 58.203024538290336}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (mn), type: mteb/amazon_massive_intent, config: mn, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 57.09818426361802}
    - {type: f1, value: 54.22718458445455}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ms), type: mteb/amazon_massive_intent, config: ms, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 58.991257565568255}
    - {type: f1, value: 55.84892781767421}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (my), type: mteb/amazon_massive_intent, config: my, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 55.901143241425686}
    - {type: f1, value: 52.25264332199797}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (nb), type: mteb/amazon_massive_intent, config: nb, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 61.96368527236047}
    - {type: f1, value: 58.927243876153454}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (nl), type: mteb/amazon_massive_intent, config: nl, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 65.64223268325489}
    - {type: f1, value: 62.340453718379706}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (pl), type: mteb/amazon_massive_intent, config: pl, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 64.52589105581708}
    - {type: f1, value: 61.661113187022174}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (pt), type: mteb/amazon_massive_intent, config: pt, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 66.84599865501009}
    - {type: f1, value: 64.59342572873005}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ro), type: mteb/amazon_massive_intent, config: ro, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 60.81035642232684}
    - {type: f1, value: 57.5169089806797}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ru), type: mteb/amazon_massive_intent, config: ru, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 65.75991930060525}
    - {type: f1, value: 62.89531115787938}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (sl), type: mteb/amazon_massive_intent, config: sl, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 56.51647612642906}
    - {type: f1, value: 54.33154780100043}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (sq), type: mteb/amazon_massive_intent, config: sq, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 57.985877605917956}
    - {type: f1, value: 54.46187524463802}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (sv), type: mteb/amazon_massive_intent, config: sv, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 65.03026227303296}
    - {type: f1, value: 62.34377392877748}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (sw), type: mteb/amazon_massive_intent, config: sw, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 53.567585743106925}
    - {type: f1, value: 50.73770655983206}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ta), type: mteb/amazon_massive_intent, config: ta, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 57.2595830531271}
    - {type: f1, value: 53.657327291708626}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (te), type: mteb/amazon_massive_intent, config: te, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 57.82784129119032}
    - {type: f1, value: 54.82518072665301}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (th), type: mteb/amazon_massive_intent, config: th, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 64.06859448554137}
    - {type: f1, value: 63.00185280500495}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (tl), type: mteb/amazon_massive_intent, config: tl, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 58.91055817081371}
    - {type: f1, value: 55.54116301224262}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (tr), type: mteb/amazon_massive_intent, config: tr, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 63.54404841963686}
    - {type: f1, value: 59.57650946030184}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (ur), type: mteb/amazon_massive_intent, config: ur, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 59.27706792199059}
    - {type: f1, value: 56.50010066083435}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (vi), type: mteb/amazon_massive_intent, config: vi, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 64.0719569603228}
    - {type: f1, value: 61.817075925647956}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (zh-CN), type: mteb/amazon_massive_intent, config: zh-CN, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 68.23806321452591}
    - {type: f1, value: 65.24917026029749}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveIntentClassification (zh-TW), type: mteb/amazon_massive_intent, config: zh-TW, split: test, revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7}
    metrics:
    - {type: accuracy, value: 62.53530598520511}
    - {type: f1, value: 61.71131132295768}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (af), type: mteb/amazon_massive_scenario, config: af, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 63.04303967720243}
    - {type: f1, value: 60.3950085685985}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (am), type: mteb/amazon_massive_scenario, config: am, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 56.83591123066578}
    - {type: f1, value: 54.95059828830849}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ar), type: mteb/amazon_massive_scenario, config: ar, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 59.62340282447881}
    - {type: f1, value: 59.525159996498225}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (az), type: mteb/amazon_massive_scenario, config: az, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 60.85406859448555}
    - {type: f1, value: 59.129299095681276}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (bn), type: mteb/amazon_massive_scenario, config: bn, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 62.76731674512441}
    - {type: f1, value: 61.159560612627715}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (cy), type: mteb/amazon_massive_scenario, config: cy, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 50.181573638197705}
    - {type: f1, value: 46.98422176289957}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (da), type: mteb/amazon_massive_scenario, config: da, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 68.92737054472092}
    - {type: f1, value: 67.69135611952979}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (de), type: mteb/amazon_massive_scenario, config: de, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 69.18964357767318}
    - {type: f1, value: 68.46106138186214}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (el), type: mteb/amazon_massive_scenario, config: el, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 67.0712844653665}
    - {type: f1, value: 66.75545422473901}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (en), type: mteb/amazon_massive_scenario, config: en, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 74.4754539340955}
    - {type: f1, value: 74.38427146553252}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (es), type: mteb/amazon_massive_scenario, config: es, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 69.82515131136518}
    - {type: f1, value: 69.63516462173847}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (fa), type: mteb/amazon_massive_scenario, config: fa, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 68.70880968392737}
    - {type: f1, value: 67.45420662567926}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (fi), type: mteb/amazon_massive_scenario, config: fi, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 65.95494283792871}
    - {type: f1, value: 65.06191009049222}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (fr), type: mteb/amazon_massive_scenario, config: fr, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 68.75924680564896}
    - {type: f1, value: 68.30833379585945}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (he), type: mteb/amazon_massive_scenario, config: he, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 63.806321452589096}
    - {type: f1, value: 63.273048243765054}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (hi), type: mteb/amazon_massive_scenario, config: hi, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 67.68997982515133}
    - {type: f1, value: 66.54703855381324}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (hu), type: mteb/amazon_massive_scenario, config: hu, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 66.46940147948891}
    - {type: f1, value: 65.91017343463396}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (hy), type: mteb/amazon_massive_scenario, config: hy, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 59.49899125756556}
    - {type: f1, value: 57.90333469917769}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (id), type: mteb/amazon_massive_scenario, config: id, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 67.9219905850706}
    - {type: f1, value: 67.23169403762938}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (is), type: mteb/amazon_massive_scenario, config: is, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 56.486213853396094}
    - {type: f1, value: 54.85282355583758}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (it), type: mteb/amazon_massive_scenario, config: it, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 69.04169468728985}
    - {type: f1, value: 68.83833333320462}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ja), type: mteb/amazon_massive_scenario, config: ja, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 73.88702084734365}
    - {type: f1, value: 74.04474735232299}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (jv), type: mteb/amazon_massive_scenario, config: jv, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 56.63416274377943}
    - {type: f1, value: 55.11332211687954}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ka), type: mteb/amazon_massive_scenario, config: ka, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 52.23604572965702}
    - {type: f1, value: 50.86529813991055}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (km), type: mteb/amazon_massive_scenario, config: km, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 46.62407531943511}
    - {type: f1, value: 43.63485467164535}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (kn), type: mteb/amazon_massive_scenario, config: kn, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 59.15601882985878}
    - {type: f1, value: 57.522837510959924}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ko), type: mteb/amazon_massive_scenario, config: ko, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 69.84532616005382}
    - {type: f1, value: 69.60021127179697}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (lv), type: mteb/amazon_massive_scenario, config: lv, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 56.65770006724949}
    - {type: f1, value: 55.84219135523227}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ml), type: mteb/amazon_massive_scenario, config: ml, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 66.53665097511768}
    - {type: f1, value: 65.09087787792639}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (mn), type: mteb/amazon_massive_scenario, config: mn, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 59.31405514458642}
    - {type: f1, value: 58.06135303831491}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ms), type: mteb/amazon_massive_scenario, config: ms, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 64.88231338264964}
    - {type: f1, value: 62.751099407787926}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (my), type: mteb/amazon_massive_scenario, config: my, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 58.86012104909213}
    - {type: f1, value: 56.29118323058282}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (nb), type: mteb/amazon_massive_scenario, config: nb, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 67.37390719569602}
    - {type: f1, value: 66.27922244885102}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (nl), type: mteb/amazon_massive_scenario, config: nl, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 70.8675184936113}
    - {type: f1, value: 70.22146529932019}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (pl), type: mteb/amazon_massive_scenario, config: pl, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 68.2212508406187}
    - {type: f1, value: 67.77454802056282}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (pt), type: mteb/amazon_massive_scenario, config: pt, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 68.18090114324143}
    - {type: f1, value: 68.03737625431621}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ro), type: mteb/amazon_massive_scenario, config: ro, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 64.65030262273034}
    - {type: f1, value: 63.792945486912856}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ru), type: mteb/amazon_massive_scenario, config: ru, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 69.48217888365838}
    - {type: f1, value: 69.96028997292197}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (sl), type: mteb/amazon_massive_scenario, config: sl, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 60.17821116341627}
    - {type: f1, value: 59.3935969827171}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (sq), type: mteb/amazon_massive_scenario, config: sq, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 62.86146603900471}
    - {type: f1, value: 60.133692735032376}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (sv), type: mteb/amazon_massive_scenario, config: sv, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 70.89441829186282}
    - {type: f1, value: 70.03064076194089}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (sw), type: mteb/amazon_massive_scenario, config: sw, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 58.15063887020847}
    - {type: f1, value: 56.23326278499678}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ta), type: mteb/amazon_massive_scenario, config: ta, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 59.43846671149966}
    - {type: f1, value: 57.70440450281974}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (te), type: mteb/amazon_massive_scenario, config: te, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 60.8507061197041}
    - {type: f1, value: 59.22916396061171}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (th), type: mteb/amazon_massive_scenario, config: th, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 70.65568258238063}
    - {type: f1, value: 69.90736239440633}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (tl), type: mteb/amazon_massive_scenario, config: tl, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 60.8843308675185}
    - {type: f1, value: 59.30332663713599}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (tr), type: mteb/amazon_massive_scenario, config: tr, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 68.05312710154674}
    - {type: f1, value: 67.44024062594775}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (ur), type: mteb/amazon_massive_scenario, config: ur, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 62.111634162743776}
    - {type: f1, value: 60.89083013084519}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (vi), type: mteb/amazon_massive_scenario, config: vi, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 67.44115669132482}
    - {type: f1, value: 67.92227541674552}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (zh-CN), type: mteb/amazon_massive_scenario, config: zh-CN, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 74.4687289845326}
    - {type: f1, value: 74.16376793486025}
  - task: {type: Classification}
    dataset: {name: MTEB MassiveScenarioClassification (zh-TW), type: mteb/amazon_massive_scenario, config: zh-TW, split: test, revision: 7d571f92784cd94a019292a1f45445077d0ef634}
    metrics:
    - {type: accuracy, value: 68.31876260928043}
    - {type: f1, value: 68.5246745215607}
  - task: {type: Clustering}
    dataset: {name: MTEB MedrxivClusteringP2P, type: mteb/medrxiv-clustering-p2p, config: default, split: test, revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73}
    metrics:
    - {type: v_measure, value: 30.90431696479766}
  - task: {type: Clustering}
    dataset: {name: MTEB MedrxivClusteringS2S, type: mteb/medrxiv-clustering-s2s, config: default, split: test, revision: 35191c8c0dca72d8ff3efcd72aa802307d469663}
    metrics:
    - {type: v_measure, value: 27.259158476693774}
  - task: {type: Reranking}
    dataset: {name: MTEB MindSmallReranking, type: mteb/mind_small, config: default, split: test, revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69}
    metrics:
    - {type: map, value: 30.28445330838555}
    - {type: mrr, value: 31.15758529581164}
  - task: {type: Retrieval}
    dataset: {name: MTEB NFCorpus, type: nfcorpus, config: default, split: test, revision: None}
    metrics:
    - {type: map_at_1, value: 5.353}
    - {type: map_at_10, value: 11.565}
    - {type: map_at_100, value: 14.097000000000001}
    - {type: map_at_1000, value: 15.354999999999999}
    - {type: map_at_3, value: 8.749}
    - {type: map_at_5, value: 9.974}
    - {type: mrr_at_1, value: 42.105}
    - {type: mrr_at_10, value: 50.589}
    - {type: mrr_at_100, value: 51.187000000000005}
    - {type: mrr_at_1000, value: 51.233}
    - {type: mrr_at_3, value: 48.246}
    - {type: mrr_at_5, value: 49.546}
    - {type: ndcg_at_1, value: 40.402}
    - {type: ndcg_at_10, value: 31.009999999999998}
    - {type: ndcg_at_100, value: 28.026}
    - {type: ndcg_at_1000, value: 36.905}
    - {type: ndcg_at_3, value: 35.983}
    - {type: ndcg_at_5, value: 33.764}
    - {type: precision_at_1, value: 42.105}
    - {type: precision_at_10, value: 22.786}
    - {type: precision_at_100, value: 6.916}
    - {type: precision_at_1000, value: 1.981}
    - {type: precision_at_3, value: 33.333}
    - {type: precision_at_5, value: 28.731}
    - {type: recall_at_1, value: 5.353}
    - {type: recall_at_10, value: 15.039}
    - {type: recall_at_100, value: 27.348}
    - {type: recall_at_1000, value: 59.453}
    - {type: recall_at_3, value: 9.792}
    - {type: recall_at_5, value: 11.882}
  - task: {type: Retrieval}
    dataset: {name: MTEB NQ, type: nq, config: default, split: test, revision: None}
    metrics:
    - {type: map_at_1, value: 33.852}
    - {type: map_at_10, value: 48.924}
    - {type: map_at_100, value: 49.854}
    - {type: map_at_1000, value: 49.886}
    - {type: map_at_3, value: 44.9}
    - {type: map_at_5, value: 47.387}
    - {type: mrr_at_1, value: 38.035999999999994}
    - {type: mrr_at_10, value: 51.644}
    - {type: mrr_at_100, value: 52.339}
    - {type: mrr_at_1000, value: 52.35999999999999}
    - {type: mrr_at_3, value: 48.421}
    - {type: mrr_at_5, value: 50.468999999999994}
    - {type: ndcg_at_1, value: 38.007000000000005}
    - {type: ndcg_at_10, value: 56.293000000000006}
    - {type: ndcg_at_100, value: 60.167}
    - {type: ndcg_at_1000, value: 60.916000000000004}
    - {type: ndcg_at_3, value: 48.903999999999996}
    - {type: ndcg_at_5, value: 52.978}
    - {type: precision_at_1, value: 38.007000000000005}
    - {type: precision_at_10, value: 9.041}
    - {type: precision_at_100, value: 1.1199999999999999}
    - {type: precision_at_1000, value: 0.11900000000000001}
    - {type: precision_at_3, value: 22.084}
    - {type: precision_at_5, value: 15.608}
    - {type: recall_at_1, value: 33.852}
    - {type: recall_at_10, value: 75.893}
    - {type: recall_at_100, value: 92.589}
    - {type: recall_at_1000, value: 98.153}
    - {type: recall_at_3, value: 56.969}
    - {type: recall_at_5, value: 66.283}
  - task: {type: Retrieval}
    dataset: {name: MTEB QuoraRetrieval, type: quora, config: default, split: test, revision: None}
    metrics:
    - {type: map_at_1, value: 69.174}
    - {type: map_at_10, value: 82.891}
    - {type: map_at_100, value: 83.545}
    - {type: map_at_1000, value: 83.56700000000001}
    - {type: map_at_3, value: 79.944}
    - {type: map_at_5, value: 81.812}
    - {type: mrr_at_1, value: 79.67999999999999}
    - {type: mrr_at_10, value: 86.279}
    - {type: mrr_at_100, value: 86.39}
    - {type: mrr_at_1000, value: 86.392}
    - {type: mrr_at_3, value: 85.21}
    - {type: mrr_at_5, value: 85.92999999999999}
    - {type: ndcg_at_1, value: 79.69000000000001}
    - {type: ndcg_at_10, value: 86.929}
    - {type: ndcg_at_100, value: 88.266}
    - {type: ndcg_at_1000, value: 88.428}
    - {type: ndcg_at_3, value: 83.899}
    - {type: ndcg_at_5, value: 85.56700000000001}
    - {type: precision_at_1, value: 79.69000000000001}
    - {type: precision_at_10, value: 13.161000000000001}
    - {type: precision_at_100, value: 1.513}
    - {type: precision_at_1000, value: 0.156}
    - {type: precision_at_3, value: 36.603}
    - {type: precision_at_5, value: 24.138}
    - {type: recall_at_1, value: 69.174}
    - {type: recall_at_10, value: 94.529}
    - {type: recall_at_100, value: 99.15}
    - {type: recall_at_1000, value: 99.925}
    - {type: recall_at_3, value: 85.86200000000001}
    - {type: recall_at_5, value: 90.501}
  - task: {type: Clustering}
    dataset: {name: MTEB RedditClustering, type: mteb/reddit-clustering, config: default, split: test, revision: 24640382cdbf8abc73003fb0fa6d111a705499eb}
    metrics:
    - {type: v_measure, value: 39.13064340585255}
  - task: {type: Clustering}
    dataset: {name: MTEB RedditClusteringP2P, type: mteb/reddit-clustering-p2p, config: default, split: test, revision: 282350215ef01743dc01b456c7f5241fa8937f16}
    metrics:
    - {type: v_measure, value: 58.97884249325877}
  - task: {type: Retrieval}
    dataset: {name: MTEB SCIDOCS, type: scidocs, config: default, split: test, revision: None}
    metrics:
    - {type: map_at_1, value: 3.4680000000000004}
    - {type: map_at_10, value: 7.865}
    - {type: map_at_100, value: 9.332}
    - {type: map_at_1000, value: 9.587}
    - {type: map_at_3, value: 5.800000000000001}
    - {type: map_at_5, value: 6.8790000000000004}
    - {type: mrr_at_1, value: 17.0}
    - {type: mrr_at_10, value: 25.629}
    - {type: mrr_at_100, value: 26.806}
    - {type: mrr_at_1000, value: 26.889000000000003}
    - {type: mrr_at_3, value: 22.8}
    - {type: mrr_at_5, value: 24.26}
    - {type: ndcg_at_1, value: 17.0}
    - {type: ndcg_at_10, value: 13.895}
    - {type: ndcg_at_100, value: 20.491999999999997}
    - {type: ndcg_at_1000, value: 25.759999999999998}
    - {type: ndcg_at_3, value: 13.347999999999999}
    - {type: ndcg_at_5, value: 11.61}
    - {type: precision_at_1, value: 17.0}
    - {type: precision_at_10, value: 7.090000000000001}
    - {type: precision_at_100, value: 1.669}
    - {type: precision_at_1000, value: 0.294}
    - {type: precision_at_3, value: 12.3}
    - {type: precision_at_5, value: 10.02}
    - {type: recall_at_1, value: 3.4680000000000004}
    - {type: recall_at_10, value: 14.363000000000001}
    - {type: recall_at_100, value: 33.875}
    - {type: recall_at_1000, value: 59.711999999999996}
    - {type: recall_at_3, value: 7.483}
    - {type: recall_at_5, value: 10.173}
  - task: {type: STS}
    dataset: {name: MTEB SICK-R, type: mteb/sickr-sts, config: default, split: test, revision: a6ea5a8cab320b040a23452cc28066d9beae2cee}
    metrics:
    - {type: cos_sim_pearson, value: 83.04084311714061}
    - {type: cos_sim_spearman, value: 77.51342467443078}
    - {type: euclidean_pearson, value: 80.0321166028479}
    - {type: euclidean_spearman, value: 77.29249114733226}
    - {type: manhattan_pearson, value: 80.03105964262431}
    - {type: manhattan_spearman, value: 77.22373689514794}
  - task: {type: STS}
    dataset: {name: MTEB STS12, type: mteb/sts12-sts, config: default, split: test, revision: a0d554a64d88156834ff5ae9920b964011b16384}
    metrics:
    - {type: cos_sim_pearson, value: 84.1680158034387}
    - {type: cos_sim_spearman, value: 76.55983344071117}
    - {type: euclidean_pearson, value: 79.75266678300143}
    - {type: euclidean_spearman, value: 75.34516823467025}
    - {type: manhattan_pearson, value: 79.75959151517357}
    - {type: manhattan_spearman, value: 75.42330344141912}
  - task: {type: STS}
    dataset: {name: MTEB STS13, type: mteb/sts13-sts, config: default, split: test, revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca}
    metrics:
    - {type: cos_sim_pearson, value: 76.48898993209346}
    - {type: cos_sim_spearman, value: 76.96954120323366}
    - {type: euclidean_pearson, value: 76.94139109279668}
    - {type: euclidean_spearman, value: 76.85860283201711}
    - {type: manhattan_pearson, value: 76.6944095091912}
    - {type: manhattan_spearman, value: 76.61096912972553}
  - task: {type: STS}
    dataset: {name: MTEB STS14, type: mteb/sts14-sts, config: default, split: test, revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375}
    metrics:
    - {type: cos_sim_pearson, value: 77.85082366246944}
    - {type: cos_sim_spearman, value: 75.52053350101731}
    - {type: euclidean_pearson, value: 77.1165845070926}
    - {type: euclidean_spearman, value: 75.31216065884388}
    - {type: manhattan_pearson, value: 77.06193941833494}
    - {type: manhattan_spearman, value: 75.31003701700112}
  - task: {type: STS}
    dataset: {name: MTEB STS15, type: mteb/sts15-sts, config: default, split: test, revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3}
    metrics:
    - {type: cos_sim_pearson, value: 86.36305246526497}
    - {type: cos_sim_spearman, value: 87.11704613927415}
    - {type: euclidean_pearson, value: 86.04199125810939}
    - {type: euclidean_spearman, value: 86.51117572414263}
    - {type: manhattan_pearson, value: 86.0805106816633}
    - {type: manhattan_spearman, value: 86.52798366512229}
  - task: {type: STS}
    dataset: {name: MTEB STS16, type: mteb/sts16-sts, config: default, split: test, revision: 4d8694f8f0e0100860b497b999b3dbed754a0513}
    metrics:
    - {type: cos_sim_pearson, value: 82.18536255599724}
    - {type: cos_sim_spearman, value: 83.63377151025418}
    - {type: euclidean_pearson, value: 83.24657467993141}
    - {type: euclidean_spearman, value: 84.02751481993825}
    - {type: manhattan_pearson, value: 83.11941806582371}
    - {type: manhattan_spearman, value: 83.84251281019304}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (ko-ko), type: mteb/sts17-crosslingual-sts, config: ko-ko, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 78.95816528475514}
    - {type: cos_sim_spearman, value: 78.86607380120462}
    - {type: euclidean_pearson, value: 78.51268699230545}
    - {type: euclidean_spearman, value: 79.11649316502229}
    - {type: manhattan_pearson, value: 78.32367302808157}
    - {type: manhattan_spearman, value: 78.90277699624637}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (ar-ar), type: mteb/sts17-crosslingual-sts, config: ar-ar, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 72.89126914997624}
    - {type: cos_sim_spearman, value: 73.0296921832678}
    - {type: euclidean_pearson, value: 71.50385903677738}
    - {type: euclidean_spearman, value: 73.13368899716289}
    - {type: manhattan_pearson, value: 71.47421463379519}
    - {type: manhattan_spearman, value: 73.03383242946575}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (en-ar), type: mteb/sts17-crosslingual-sts, config: en-ar, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 59.22923684492637}
    - {type: cos_sim_spearman, value: 57.41013211368396}
    - {type: euclidean_pearson, value: 61.21107388080905}
    - {type: euclidean_spearman, value: 60.07620768697254}
    - {type: manhattan_pearson, value: 59.60157142786555}
    - {type: manhattan_spearman, value: 59.14069604103739}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (en-de), type: mteb/sts17-crosslingual-sts, config: en-de, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 76.24345978774299}
    - {type: cos_sim_spearman, value: 77.24225743830719}
    - {type: euclidean_pearson, value: 76.66226095469165}
    - {type: euclidean_spearman, value: 77.60708820493146}
    - {type: manhattan_pearson, value: 76.05303324760429}
    - {type: manhattan_spearman, value: 76.96353149912348}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (en-en), type: mteb/sts17-crosslingual-sts, config: en-en, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 85.50879160160852}
    - {type: cos_sim_spearman, value: 86.43594662965224}
    - {type: euclidean_pearson, value: 86.06846012826577}
    - {type: euclidean_spearman, value: 86.02041395794136}
    - {type: manhattan_pearson, value: 86.10916255616904}
    - {type: manhattan_spearman, value: 86.07346068198953}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (en-tr), type: mteb/sts17-crosslingual-sts, config: en-tr, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 58.39803698977196}
    - {type: cos_sim_spearman, value: 55.96910950423142}
    - {type: euclidean_pearson, value: 58.17941175613059}
    - {type: euclidean_spearman, value: 55.03019330522745}
    - {type: manhattan_pearson, value: 57.333358138183286}
    - {type: manhattan_spearman, value: 54.04614023149965}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (es-en), type: mteb/sts17-crosslingual-sts, config: es-en, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 70.98304089637197}
    - {type: cos_sim_spearman, value: 72.44071656215888}
    - {type: euclidean_pearson, value: 72.19224359033983}
    - {type: euclidean_spearman, value: 73.89871188913025}
    - {type: manhattan_pearson, value: 71.21098311547406}
    - {type: manhattan_spearman, value: 72.93405764824821}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (es-es), type: mteb/sts17-crosslingual-sts, config: es-es, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 85.99792397466308}
    - {type: cos_sim_spearman, value: 84.83824377879495}
    - {type: euclidean_pearson, value: 85.70043288694438}
    - {type: euclidean_spearman, value: 84.70627558703686}
    - {type: manhattan_pearson, value: 85.89570850150801}
    - {type: manhattan_spearman, value: 84.95806105313007}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (fr-en), type: mteb/sts17-crosslingual-sts, config: fr-en, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 72.21850322994712}
    - {type: cos_sim_spearman, value: 72.28669398117248}
    - {type: euclidean_pearson, value: 73.40082510412948}
    - {type: euclidean_spearman, value: 73.0326539281865}
    - {type: manhattan_pearson, value: 71.8659633964841}
    - {type: manhattan_spearman, value: 71.57817425823303}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (it-en), type: mteb/sts17-crosslingual-sts, config: it-en, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 75.80921368595645}
    - {type: cos_sim_spearman, value: 77.33209091229315}
    - {type: euclidean_pearson, value: 76.53159540154829}
    - {type: euclidean_spearman, value: 78.17960842810093}
    - {type: manhattan_pearson, value: 76.13530186637601}
    - {type: manhattan_spearman, value: 78.00701437666875}
  - task: {type: STS}
    dataset: {name: MTEB STS17 (nl-en), type: mteb/sts17-crosslingual-sts, config: nl-en, split: test, revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d}
    metrics:
    - {type: cos_sim_pearson, value: 74.74980608267349}
    - {type: cos_sim_spearman, value: 75.37597374318821}
    - {type: euclidean_pearson, value: 74.90506081911661}
    - {type: euclidean_spearman, value: 75.30151613124521}
    - {type: manhattan_pearson, value: 74.62642745918002}
    - {type: manhattan_spearman, value: 75.18619716592303}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (en), type: mteb/sts22-crosslingual-sts, config: en, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 59.632662289205584}
    - {type: cos_sim_spearman, value: 60.938543391610914}
    - {type: euclidean_pearson, value: 62.113200529767056}
    - {type: euclidean_spearman, value: 61.410312633261164}
    - {type: manhattan_pearson, value: 61.75494698945686}
    - {type: manhattan_spearman, value: 60.92726195322362}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (de), type: mteb/sts22-crosslingual-sts, config: de, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 45.283470551557244}
    - {type: cos_sim_spearman, value: 53.44833015864201}
    - {type: euclidean_pearson, value: 41.17892011120893}
    - {type: euclidean_spearman, value: 53.81441383126767}
    - {type: manhattan_pearson, value: 41.17482200420659}
    - {type: manhattan_spearman, value: 53.82180269276363}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (es), type: mteb/sts22-crosslingual-sts, config: es, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 60.5069165306236}
    - {type: cos_sim_spearman, value: 66.87803259033826}
    - {type: euclidean_pearson, value: 63.5428979418236}
    - {type: euclidean_spearman, value: 66.9293576586897}
    - {type: manhattan_pearson, value: 63.59789526178922}
    - {type: manhattan_spearman, value: 66.86555009875066}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (pl), type: mteb/sts22-crosslingual-sts, config: pl, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 28.23026196280264}
    - {type: cos_sim_spearman, value: 35.79397812652861}
    - {type: euclidean_pearson, value: 17.828102102767353}
    - {type: euclidean_spearman, value: 35.721501145568894}
    - {type: manhattan_pearson, value: 17.77134274219677}
    - {type: manhattan_spearman, value: 35.98107902846267}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (tr), type: mteb/sts22-crosslingual-sts, config: tr, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 56.51946541393812}
    - {type: cos_sim_spearman, value: 63.714686006214485}
    - {type: euclidean_pearson, value: 58.32104651305898}
    - {type: euclidean_spearman, value: 62.237110895702216}
    - {type: manhattan_pearson, value: 58.579416468759185}
    - {type: manhattan_spearman, value: 62.459738981727}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (ar), type: mteb/sts22-crosslingual-sts, config: ar, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 48.76009839569795}
    - {type: cos_sim_spearman, value: 56.65188431953149}
    - {type: euclidean_pearson, value: 50.997682160915595}
    - {type: euclidean_spearman, value: 55.99910008818135}
    - {type: manhattan_pearson, value: 50.76220659606342}
    - {type: manhattan_spearman, value: 55.517347595391456}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (ru), type: mteb/sts22-crosslingual-sts, config: ru, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 51.232731157702425}
    - {type: cos_sim_spearman, value: 59.89531877658345}
    - {type: euclidean_pearson, value: 49.937914570348376}
    - {type: euclidean_spearman, value: 60.220905659334036}
    - {type: manhattan_pearson, value: 50.00987996844193}
    - {type: manhattan_spearman, value: 60.081341480977926}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (zh), type: mteb/sts22-crosslingual-sts, config: zh, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 54.717524559088005}
    - {type: cos_sim_spearman, value: 66.83570886252286}
    - {type: euclidean_pearson, value: 58.41338625505467}
    - {type: euclidean_spearman, value: 66.68991427704938}
    - {type: manhattan_pearson, value: 58.78638572916807}
    - {type: manhattan_spearman, value: 66.58684161046335}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (fr), type: mteb/sts22-crosslingual-sts, config: fr, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 73.2962042954962}
    - {type: cos_sim_spearman, value: 76.58255504852025}
    - {type: euclidean_pearson, value: 75.70983192778257}
    - {type: euclidean_spearman, value: 77.4547684870542}
    - {type: manhattan_pearson, value: 75.75565853870485}
    - {type: manhattan_spearman, value: 76.90208974949428}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (de-en), type: mteb/sts22-crosslingual-sts, config: de-en, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 54.47396266924846}
    - {type: cos_sim_spearman, value: 56.492267162048606}
    - {type: euclidean_pearson, value: 55.998505203070195}
    - {type: euclidean_spearman, value: 56.46447012960222}
    - {type: manhattan_pearson, value: 54.873172394430995}
    - {type: manhattan_spearman, value: 56.58111534551218}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (es-en), type: mteb/sts22-crosslingual-sts, config: es-en, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 69.87177267688686}
    - {type: cos_sim_spearman, value: 74.57160943395763}
    - {type: euclidean_pearson, value: 70.88330406826788}
    - {type: euclidean_spearman, value: 74.29767636038422}
    - {type: manhattan_pearson, value: 71.38245248369536}
    - {type: manhattan_spearman, value: 74.53102232732175}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (it), type: mteb/sts22-crosslingual-sts, config: it, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 72.80225656959544}
    - {type: cos_sim_spearman, value: 76.52646173725735}
    - {type: euclidean_pearson, value: 73.95710720200799}
    - {type: euclidean_spearman, value: 76.54040031984111}
    - {type: manhattan_pearson, value: 73.89679971946774}
    - {type: manhattan_spearman, value: 76.60886958161574}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (pl-en), type: mteb/sts22-crosslingual-sts, config: pl-en, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 70.70844249898789}
    - {type: cos_sim_spearman, value: 72.68571783670241}
    - {type: euclidean_pearson, value: 72.38800772441031}
    - {type: euclidean_spearman, value: 72.86804422703312}
    - {type: manhattan_pearson, value: 71.29840508203515}
    - {type: manhattan_spearman, value: 71.86264441749513}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (zh-en), type: mteb/sts22-crosslingual-sts, config: zh-en, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 58.647478923935694}
    - {type: cos_sim_spearman, value: 63.74453623540931}
    - {type: euclidean_pearson, value: 59.60138032437505}
    - {type: euclidean_spearman, value: 63.947930832166065}
    - {type: manhattan_pearson, value: 58.59735509491861}
    - {type: manhattan_spearman, value: 62.082503844627404}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (es-it), type: mteb/sts22-crosslingual-sts, config: es-it, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 65.8722516867162}
    - {type: cos_sim_spearman, value: 71.81208592523012}
    - {type: euclidean_pearson, value: 67.95315252165956}
    - {type: euclidean_spearman, value: 73.00749822046009}
    - {type: manhattan_pearson, value: 68.07884688638924}
    - {type: manhattan_spearman, value: 72.34210325803069}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (de-fr), type: mteb/sts22-crosslingual-sts, config: de-fr, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 54.5405814240949}
    - {type: cos_sim_spearman, value: 60.56838649023775}
    - {type: euclidean_pearson, value: 53.011731611314104}
    - {type: euclidean_spearman, value: 58.533194841668426}
    - {type: manhattan_pearson, value: 53.623067729338494}
    - {type: manhattan_spearman, value: 58.018756154446926}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (de-pl), type: mteb/sts22-crosslingual-sts, config: de-pl, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 13.611046866216112}
    - {type: cos_sim_spearman, value: 28.238192909158492}
    - {type: euclidean_pearson, value: 22.16189199885129}
    - {type: euclidean_spearman, value: 35.012895679076564}
    - {type: manhattan_pearson, value: 21.969771178698387}
    - {type: manhattan_spearman, value: 32.456985088607475}
  - task: {type: STS}
    dataset: {name: MTEB STS22 (fr-pl), type: mteb/sts22-crosslingual-sts, config: fr-pl, split: test, revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80}
    metrics:
    - {type: cos_sim_pearson, value: 74.58077407011655}
    - {type: cos_sim_spearman, value: 84.51542547285167}
    - {type: euclidean_pearson, value: 74.64613843596234}
    - {type: euclidean_spearman, value: 84.51542547285167}
    - {type: manhattan_pearson, value: 75.15335973101396}
    - {type: manhattan_spearman, value: 84.51542547285167}
  - task: {type: STS}
    dataset: {name: MTEB STSBenchmark, type: mteb/stsbenchmark-sts, config: default, split: test, revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831}
    metrics:
    - {type: cos_sim_pearson, value: 82.0739825531578}
    - {type: cos_sim_spearman, value: 84.01057479311115}
    - {type: euclidean_pearson, value: 83.85453227433344}
    - {type: euclidean_spearman, value: 84.01630226898655}
    - {type: manhattan_pearson, value: 83.75323603028978}
    - {type: manhattan_spearman, value: 83.89677983727685}
  - task: {type: Reranking}
    dataset: {name: MTEB SciDocsRR, type: mteb/scidocs-reranking, config: default, split: test, revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab}
    metrics:
    - {type: map, value: 78.12945623123957}
    - {type: mrr, value: 93.87738713719106}
  - task: {type: Retrieval}
    dataset: {name: MTEB SciFact, type: scifact, config: default, split: test, revision: None}
    metrics:
    - {type: map_at_1, value: 52.983000000000004}
    - {type: map_at_10, value: 62.946000000000005}
    - {type: map_at_100, value: 63.514}
    - {type: map_at_1000, value: 63.554}
    - {type: map_at_3, value: 60.183}
    - {type: map_at_5, value: 61.672000000000004}
    - {type: mrr_at_1, value: 55.667}
    - {type: mrr_at_10, value: 64.522}
    - {type: mrr_at_100, value: 64.957}
    - {type: mrr_at_1000, value: 64.995}
    - {type: mrr_at_3, value: 62.388999999999996}
    - {type: mrr_at_5, value: 63.639}
    - {type: ndcg_at_1, value: 55.667}
    - {type: ndcg_at_10, value: 67.704}
    - {type: ndcg_at_100, value: 70.299}
    - {type: ndcg_at_1000, value: 71.241}
    - {type: ndcg_at_3, value: 62.866}
    - {type: ndcg_at_5, value: 65.16999999999999}
    - {type: precision_at_1, value: 55.667}
    - {type: precision_at_10, value: 9.033}
    - {type: precision_at_100, value: 1.053}
    - {type: precision_at_1000, value: 0.11299999999999999}
    - {type: precision_at_3, value: 24.444}
    - {type: precision_at_5, value: 16.133}
    - {type: recall_at_1, value: 52.983000000000004}
    - {type: recall_at_10, value: 80.656}
    - {type: recall_at_100, value: 92.5}
    - {type: recall_at_1000, value: 99.667}
    - {type: recall_at_3, value: 67.744}
    - {type: recall_at_5, value: 73.433}
  - task: {type: PairClassification}
    dataset: {name: MTEB SprintDuplicateQuestions, type: mteb/sprintduplicatequestions-pairclassification, config: default, split: test, revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46}
    metrics:
    - {type: cos_sim_accuracy, value: 99.72772277227723}
    - {type: cos_sim_ap, value: 92.17845897992215}
    - {type: cos_sim_f1, value: 85.9746835443038}
    - {type: cos_sim_precision, value: 87.07692307692308}
    - {type: cos_sim_recall, value: 84.89999999999999}
    - {type: dot_accuracy, value: 99.3039603960396}
    - {type: dot_ap, value: 60.70244020124878}
    - {type: dot_f1, value: 59.92742353551063}
    - {type: dot_precision, value: 62.21743810548978}
    - {type: dot_recall, value: 57.8}
    - {type: euclidean_accuracy, value: 99.71683168316832}
    - {type: euclidean_ap, value: 91.53997039964659}
    - {type: euclidean_f1, value: 84.88372093023257}
    - {type: euclidean_precision, value: 90.02242152466367}
    - {type: euclidean_recall, value: 80.30000000000001}
    - {type: manhattan_accuracy, value: 99.72376237623763}
    - {type: manhattan_ap, value: 91.80756777790289}
    - {type: manhattan_f1, value: 85.48468106479157}
    - {type: manhattan_precision, value: 85.8728557013118}
    - {type: manhattan_recall, value: 85.1}
    - {type: max_accuracy, value: 99.72772277227723}
    - {type: max_ap, value: 92.17845897992215}
    - {type: max_f1, value: 85.9746835443038}
  - task: {type: Clustering}
    dataset: {name: MTEB StackExchangeClustering, type: mteb/stackexchange-clustering, config: default, split: test, revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259}
    metrics:
    - {type: v_measure, value: 53.52464042600003}
  - task: {type: Clustering}
    dataset: {name: MTEB StackExchangeClusteringP2P, type: mteb/stackexchange-clustering-p2p, config: default, split: test, revision: 815ca46b2622cec33ccafc3735d572c266efdb44}
    metrics:
    - {type: v_measure, value: 32.071631948736}
  - task: {type: Reranking}
    dataset: {name: MTEB StackOverflowDupQuestions, type: mteb/stackoverflowdupquestions-reranking, config: default, split: test, revision: e185fbe320c72810689fc5848eb6114e1ef5ec69}
    metrics:
    - {type: map, value: 49.19552407604654}
    - {type: mrr, value: 49.95269130379425}
  - task: {type: Summarization}
    dataset: {name: MTEB SummEval, type: mteb/summeval, config: default, split: test, revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c}
    metrics:
    - {type: cos_sim_pearson, value: 29.345293033095427}
    - {type: cos_sim_spearman, value: 29.976931423258403}
    - {type: dot_pearson, value: 27.047078008958408}
    - {type: dot_spearman, value: 27.75894368380218}
  - task: {type: Retrieval}
    dataset: {name: MTEB TRECCOVID, type: trec-covid, config: default, split: test, revision: None}
    metrics:
    - {type: map_at_1, value: 0.22}
    - {type: map_at_10, value: 1.706}
    - {type: map_at_100, value: 9.634}
    - {type: map_at_1000, value: 23.665}
    - {type: map_at_3, value: 0.5950000000000001}
    - {type: map_at_5, value: 0.95}
    - {type: mrr_at_1, value: 86.0}
    - {type: mrr_at_10, value: 91.8}
    - {type: mrr_at_100, value: 91.8}
    - {type: mrr_at_1000, value: 91.8}
    - {type: mrr_at_3, value: 91.0}
    - {type: mrr_at_5, value: 91.8}
    - {type: ndcg_at_1, value: 80.0}
    - {type: ndcg_at_10, value: 72.573}
    - {type: ndcg_at_100, value: 53.954}
    - {type: ndcg_at_1000, value: 47.760999999999996}
    - {type: ndcg_at_3, value: 76.173}
    - {type: ndcg_at_5, value: 75.264}
    - {type: precision_at_1, value: 86.0}
    - {type: precision_at_10, value: 76.4}
    - {type: precision_at_100, value: 55.50000000000001}
    - {type: precision_at_1000, value: 21.802}
    - {type: precision_at_3, value: 81.333}
    - {type: precision_at_5, value: 80.4}
    - {type: recall_at_1, value: 0.22}
    - {type: recall_at_10, value: 1.925}
    - {type: recall_at_100, value: 12.762}
    - {type: recall_at_1000, value: 44.946000000000005}
    - {type: recall_at_3, value: 0.634}
    - {type: recall_at_5, value: 1.051}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (sqi-eng), type: mteb/tatoeba-bitext-mining, config: sqi-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 91.0}
    - {type: f1, value: 88.55666666666666}
    - {type: precision, value: 87.46166666666667}
    - {type: recall, value: 91.0}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (fry-eng), type: mteb/tatoeba-bitext-mining, config: fry-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 57.22543352601156}
    - {type: f1, value: 51.03220478943021}
    - {type: precision, value: 48.8150289017341}
    - {type: recall, value: 57.22543352601156}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (kur-eng), type: mteb/tatoeba-bitext-mining, config: kur-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 46.58536585365854}
    - {type: f1, value: 39.66870798578116}
    - {type: precision, value: 37.416085946573745}
    - {type: recall, value: 46.58536585365854}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (tur-eng), type: mteb/tatoeba-bitext-mining, config: tur-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 89.7}
    - {type: f1, value: 86.77999999999999}
    - {type: precision, value: 85.45333333333332}
    - {type: recall, value: 89.7}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (deu-eng), type: mteb/tatoeba-bitext-mining, config: deu-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 97.39999999999999}
    - {type: f1, value: 96.58333333333331}
    - {type: precision, value: 96.2}
    - {type: recall, value: 97.39999999999999}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (nld-eng), type: mteb/tatoeba-bitext-mining, config: nld-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 92.4}
    - {type: f1, value: 90.3}
    - {type: precision, value: 89.31666666666668}
    - {type: recall, value: 92.4}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (ron-eng), type: mteb/tatoeba-bitext-mining, config: ron-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 86.9}
    - {type: f1, value: 83.67190476190476}
    - {type: precision, value: 82.23333333333332}
    - {type: recall, value: 86.9}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (ang-eng), type: mteb/tatoeba-bitext-mining, config: ang-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 50.0}
    - {type: f1, value: 42.23229092632078}
    - {type: precision, value: 39.851634683724235}
    - {type: recall, value: 50.0}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (ido-eng), type: mteb/tatoeba-bitext-mining, config: ido-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 76.3}
    - {type: f1, value: 70.86190476190477}
    - {type: precision, value: 68.68777777777777}
    - {type: recall, value: 76.3}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (jav-eng), type: mteb/tatoeba-bitext-mining, config: jav-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 57.073170731707314}
    - {type: f1, value: 50.658958927251604}
    - {type: precision, value: 48.26480836236933}
    - {type: recall, value: 57.073170731707314}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (isl-eng), type: mteb/tatoeba-bitext-mining, config: isl-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 68.2}
    - {type: f1, value: 62.156507936507936}
    - {type: precision, value: 59.84964285714286}
    - {type: recall, value: 68.2}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (slv-eng), type: mteb/tatoeba-bitext-mining, config: slv-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 77.52126366950182}
    - {type: f1, value: 72.8496210148701}
    - {type: precision, value: 70.92171498003819}
    - {type: recall, value: 77.52126366950182}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (cym-eng), type: mteb/tatoeba-bitext-mining, config: cym-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 70.78260869565217}
    - {type: f1, value: 65.32422360248447}
    - {type: precision, value: 63.063067367415194}
    - {type: recall, value: 70.78260869565217}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (kaz-eng), type: mteb/tatoeba-bitext-mining, config: kaz-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 78.43478260869566}
    - {type: f1, value: 73.02608695652172}
    - {type: precision, value: 70.63768115942028}
    - {type: recall, value: 78.43478260869566}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (est-eng), type: mteb/tatoeba-bitext-mining, config: est-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 60.9}
    - {type: f1, value: 55.309753694581275}
    - {type: precision, value: 53.130476190476195}
    - {type: recall, value: 60.9}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (heb-eng), type: mteb/tatoeba-bitext-mining, config: heb-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 72.89999999999999}
    - {type: f1, value: 67.92023809523809}
    - {type: precision, value: 65.82595238095237}
    - {type: recall, value: 72.89999999999999}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (gla-eng), type: mteb/tatoeba-bitext-mining, config: gla-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 46.80337756332931}
    - {type: f1, value: 39.42174900558496}
    - {type: precision, value: 36.97101116280851}
    - {type: recall, value: 46.80337756332931}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (mar-eng), type: mteb/tatoeba-bitext-mining, config: mar-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 89.8}
    - {type: f1, value: 86.79}
    - {type: precision, value: 85.375}
    - {type: recall, value: 89.8}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (lat-eng), type: mteb/tatoeba-bitext-mining, config: lat-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 47.199999999999996}
    - {type: f1, value: 39.95484348984349}
    - {type: precision, value: 37.561071428571424}
    - {type: recall, value: 47.199999999999996}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (bel-eng), type: mteb/tatoeba-bitext-mining, config: bel-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 87.8}
    - {type: f1, value: 84.68190476190475}
    - {type: precision, value: 83.275}
    - {type: recall, value: 87.8}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (pms-eng), type: mteb/tatoeba-bitext-mining, config: pms-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 48.76190476190476}
    - {type: f1, value: 42.14965986394558}
    - {type: precision, value: 39.96743626743626}
    - {type: recall, value: 48.76190476190476}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (gle-eng), type: mteb/tatoeba-bitext-mining, config: gle-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 66.10000000000001}
    - {type: f1, value: 59.58580086580086}
    - {type: precision, value: 57.150238095238095}
    - {type: recall, value: 66.10000000000001}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (pes-eng), type: mteb/tatoeba-bitext-mining, config: pes-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 87.3}
    - {type: f1, value: 84.0}
    - {type: precision, value: 82.48666666666666}
    - {type: recall, value: 87.3}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (nob-eng), type: mteb/tatoeba-bitext-mining, config: nob-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 90.4}
    - {type: f1, value: 87.79523809523809}
    - {type: precision, value: 86.6}
    - {type: recall, value: 90.4}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (bul-eng), type: mteb/tatoeba-bitext-mining, config: bul-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 87.0}
    - {type: f1, value: 83.81}
    - {type: precision, value: 82.36666666666666}
    - {type: recall, value: 87.0}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (cbk-eng), type: mteb/tatoeba-bitext-mining, config: cbk-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 63.9}
    - {type: f1, value: 57.76533189033189}
    - {type: precision, value: 55.50595238095239}
    - {type: recall, value: 63.9}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (hun-eng), type: mteb/tatoeba-bitext-mining, config: hun-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 76.1}
    - {type: f1, value: 71.83690476190478}
    - {type: precision, value: 70.04928571428573}
    - {type: recall, value: 76.1}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (uig-eng), type: mteb/tatoeba-bitext-mining, config: uig-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 66.3}
    - {type: f1, value: 59.32626984126984}
    - {type: precision, value: 56.62535714285713}
    - {type: recall, value: 66.3}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (rus-eng), type: mteb/tatoeba-bitext-mining, config: rus-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 90.60000000000001}
    - {type: f1, value: 87.96333333333334}
    - {type: precision, value: 86.73333333333333}
    - {type: recall, value: 90.60000000000001}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (spa-eng), type: mteb/tatoeba-bitext-mining, config: spa-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 93.10000000000001}
    - {type: f1, value: 91.10000000000001}
    - {type: precision, value: 90.16666666666666}
    - {type: recall, value: 93.10000000000001}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (hye-eng), type: mteb/tatoeba-bitext-mining, config: hye-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 85.71428571428571}
    - {type: f1, value: 82.29142600436403}
    - {type: precision, value: 80.8076626877166}
    - {type: recall, value: 85.71428571428571}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (tel-eng), type: mteb/tatoeba-bitext-mining, config: tel-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 88.88888888888889}
    - {type: f1, value: 85.7834757834758}
    - {type: precision, value: 84.43732193732193}
    - {type: recall, value: 88.88888888888889}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (afr-eng), type: mteb/tatoeba-bitext-mining, config: afr-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 88.5}
    - {type: f1, value: 85.67190476190476}
    - {type: precision, value: 84.43333333333332}
    - {type: recall, value: 88.5}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (mon-eng), type: mteb/tatoeba-bitext-mining, config: mon-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 82.72727272727273}
    - {type: f1, value: 78.21969696969695}
    - {type: precision, value: 76.18181818181819}
    - {type: recall, value: 82.72727272727273}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (arz-eng), type: mteb/tatoeba-bitext-mining, config: arz-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 61.0062893081761}
    - {type: f1, value: 55.13976240391334}
    - {type: precision, value: 52.92112499659669}
    - {type: recall, value: 61.0062893081761}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (hrv-eng), type: mteb/tatoeba-bitext-mining, config: hrv-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 89.5}
    - {type: f1, value: 86.86666666666666}
    - {type: precision, value: 85.69166666666668}
    - {type: recall, value: 89.5}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (nov-eng), type: mteb/tatoeba-bitext-mining, config: nov-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 73.54085603112841}
    - {type: f1, value: 68.56031128404669}
    - {type: precision, value: 66.53047989623866}
    - {type: recall, value: 73.54085603112841}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (gsw-eng), type: mteb/tatoeba-bitext-mining, config: gsw-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 43.58974358974359}
    - {type: f1, value: 36.45299145299145}
    - {type: precision, value: 33.81155881155882}
    - {type: recall, value: 43.58974358974359}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (nds-eng), type: mteb/tatoeba-bitext-mining, config: nds-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 59.599999999999994}
    - {type: f1, value: 53.264689754689755}
    - {type: precision, value: 50.869166666666665}
    - {type: recall, value: 59.599999999999994}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (ukr-eng), type: mteb/tatoeba-bitext-mining, config: ukr-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 85.2}
    - {type: f1, value: 81.61666666666665}
    - {type: precision, value: 80.02833333333335}
    - {type: recall, value: 85.2}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (uzb-eng), type: mteb/tatoeba-bitext-mining, config: uzb-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 63.78504672897196}
    - {type: f1, value: 58.00029669188548}
    - {type: precision, value: 55.815809968847354}
    - {type: recall, value: 63.78504672897196}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (lit-eng), type: mteb/tatoeba-bitext-mining, config: lit-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 66.5}
    - {type: f1, value: 61.518333333333345}
    - {type: precision, value: 59.622363699102834}
    - {type: recall, value: 66.5}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (ina-eng), type: mteb/tatoeba-bitext-mining, config: ina-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 88.6}
    - {type: f1, value: 85.60222222222221}
    - {type: precision, value: 84.27916666666665}
    - {type: recall, value: 88.6}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (lfn-eng), type: mteb/tatoeba-bitext-mining, config: lfn-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 58.699999999999996}
    - {type: f1, value: 52.732375957375965}
    - {type: precision, value: 50.63214035964035}
    - {type: recall, value: 58.699999999999996}
  - task: {type: BitextMining}
    dataset: {name: MTEB Tatoeba (zsm-eng), type: mteb/tatoeba-bitext-mining, config: zsm-eng, split: test, revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553}
    metrics:
    - {type: accuracy, value: 92.10000000000001}
    - {type: f1, value: 89.99666666666667}
    - {type: precision, value: 89.03333333333333}
- type: recall value: 92.10000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (ita-eng) type: mteb/tatoeba-bitext-mining config: ita-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.10000000000001 - type: f1 value: 87.55666666666667 - type: precision value: 86.36166666666668 - type: recall value: 90.10000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (cmn-eng) type: mteb/tatoeba-bitext-mining config: cmn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.4 - type: f1 value: 88.89000000000001 - type: precision value: 87.71166666666666 - type: recall value: 91.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (lvs-eng) type: mteb/tatoeba-bitext-mining config: lvs-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.7 - type: f1 value: 60.67427750410509 - type: precision value: 58.71785714285714 - type: recall value: 65.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (glg-eng) type: mteb/tatoeba-bitext-mining config: glg-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.39999999999999 - type: f1 value: 81.93190476190475 - type: precision value: 80.37833333333333 - type: recall value: 85.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (ceb-eng) type: mteb/tatoeba-bitext-mining config: ceb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 47.833333333333336 - type: f1 value: 42.006625781625786 - type: precision value: 40.077380952380956 - type: recall value: 47.833333333333336 - task: type: BitextMining dataset: name: MTEB Tatoeba (bre-eng) type: mteb/tatoeba-bitext-mining config: bre-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 10.4 - type: f1 value: 8.24465007215007 - type: precision value: 7.664597069597071 - type: recall value: 10.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (ben-eng) type: mteb/tatoeba-bitext-mining config: ben-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 82.6 - type: f1 value: 77.76333333333334 - type: precision value: 75.57833333333332 - type: recall value: 82.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (swg-eng) type: mteb/tatoeba-bitext-mining config: swg-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 52.67857142857143 - type: f1 value: 44.302721088435376 - type: precision value: 41.49801587301587 - type: recall value: 52.67857142857143 - task: type: BitextMining dataset: name: MTEB Tatoeba (arq-eng) type: mteb/tatoeba-bitext-mining config: arq-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 28.3205268935236 - type: f1 value: 22.426666605171157 - type: precision value: 20.685900116470915 - type: recall value: 28.3205268935236 - task: type: BitextMining dataset: name: MTEB Tatoeba (kab-eng) type: mteb/tatoeba-bitext-mining config: kab-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 22.7 - type: f1 value: 17.833970473970474 - type: precision value: 16.407335164835164 - type: recall value: 22.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (fra-eng) type: mteb/tatoeba-bitext-mining config: fra-eng split: test revision: 
9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.2 - type: f1 value: 89.92999999999999 - type: precision value: 88.87 - type: recall value: 92.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (por-eng) type: mteb/tatoeba-bitext-mining config: por-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.4 - type: f1 value: 89.25 - type: precision value: 88.21666666666667 - type: recall value: 91.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (tat-eng) type: mteb/tatoeba-bitext-mining config: tat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 69.19999999999999 - type: f1 value: 63.38269841269841 - type: precision value: 61.14773809523809 - type: recall value: 69.19999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (oci-eng) type: mteb/tatoeba-bitext-mining config: oci-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 48.8 - type: f1 value: 42.839915639915645 - type: precision value: 40.770287114845935 - type: recall value: 48.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (pol-eng) type: mteb/tatoeba-bitext-mining config: pol-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.8 - type: f1 value: 85.90666666666668 - type: precision value: 84.54166666666666 - type: recall value: 88.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (war-eng) type: mteb/tatoeba-bitext-mining config: war-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 46.6 - type: f1 value: 40.85892920804686 - type: precision value: 38.838223114604695 - type: recall value: 46.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (aze-eng) type: mteb/tatoeba-bitext-mining config: aze-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 84.0 - type: f1 value: 80.14190476190475 - type: precision value: 78.45333333333333 - type: recall value: 84.0 - task: type: BitextMining dataset: name: MTEB Tatoeba (vie-eng) type: mteb/tatoeba-bitext-mining config: vie-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.5 - type: f1 value: 87.78333333333333 - type: precision value: 86.5 - type: recall value: 90.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (nno-eng) type: mteb/tatoeba-bitext-mining config: nno-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 74.5 - type: f1 value: 69.48397546897547 - type: precision value: 67.51869047619049 - type: recall value: 74.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (cha-eng) type: mteb/tatoeba-bitext-mining config: cha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 32.846715328467155 - type: f1 value: 27.828177499710343 - type: precision value: 26.63451511991658 - type: recall value: 32.846715328467155 - task: type: BitextMining dataset: name: MTEB Tatoeba (mhr-eng) type: mteb/tatoeba-bitext-mining config: mhr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.0 - type: f1 value: 6.07664116764988 - type: precision value: 5.544177607179943 - type: recall value: 8.0 - task: type: BitextMining dataset: name: MTEB Tatoeba (dan-eng) type: mteb/tatoeba-bitext-mining config: dan-eng split: test revision: 
9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.6 - type: f1 value: 84.38555555555554 - type: precision value: 82.91583333333334 - type: recall value: 87.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (ell-eng) type: mteb/tatoeba-bitext-mining config: ell-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.5 - type: f1 value: 84.08333333333331 - type: precision value: 82.47333333333333 - type: recall value: 87.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (amh-eng) type: mteb/tatoeba-bitext-mining config: amh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 80.95238095238095 - type: f1 value: 76.13095238095238 - type: precision value: 74.05753968253967 - type: recall value: 80.95238095238095 - task: type: BitextMining dataset: name: MTEB Tatoeba (pam-eng) type: mteb/tatoeba-bitext-mining config: pam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.799999999999999 - type: f1 value: 6.971422975172975 - type: precision value: 6.557814916172301 - type: recall value: 8.799999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (hsb-eng) type: mteb/tatoeba-bitext-mining config: hsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 44.099378881987576 - type: f1 value: 37.01649742022413 - type: precision value: 34.69420618488942 - type: recall value: 44.099378881987576 - task: type: BitextMining dataset: name: MTEB Tatoeba (srp-eng) type: mteb/tatoeba-bitext-mining config: srp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 84.3 - type: f1 value: 80.32666666666667 - type: precision value: 78.60666666666665 - type: recall value: 84.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (epo-eng) type: mteb/tatoeba-bitext-mining config: epo-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.5 - type: f1 value: 90.49666666666666 - type: precision value: 89.56666666666668 - type: recall value: 92.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (kzj-eng) type: mteb/tatoeba-bitext-mining config: kzj-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 10.0 - type: f1 value: 8.268423529875141 - type: precision value: 7.878118605532398 - type: recall value: 10.0 - task: type: BitextMining dataset: name: MTEB Tatoeba (awa-eng) type: mteb/tatoeba-bitext-mining config: awa-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 79.22077922077922 - type: f1 value: 74.27128427128426 - type: precision value: 72.28715728715729 - type: recall value: 79.22077922077922 - task: type: BitextMining dataset: name: MTEB Tatoeba (fao-eng) type: mteb/tatoeba-bitext-mining config: fao-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.64885496183206 - type: f1 value: 58.87495456197747 - type: precision value: 55.992366412213734 - type: recall value: 65.64885496183206 - task: type: BitextMining dataset: name: MTEB Tatoeba (mal-eng) type: mteb/tatoeba-bitext-mining config: mal-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.06986899563319 - type: f1 value: 94.78408539543909 - type: precision value: 94.15332362930616 - type: recall value: 96.06986899563319 
- task: type: BitextMining dataset: name: MTEB Tatoeba (ile-eng) type: mteb/tatoeba-bitext-mining config: ile-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.2 - type: f1 value: 71.72571428571428 - type: precision value: 69.41000000000001 - type: recall value: 77.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (bos-eng) type: mteb/tatoeba-bitext-mining config: bos-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.4406779661017 - type: f1 value: 83.2391713747646 - type: precision value: 81.74199623352166 - type: recall value: 86.4406779661017 - task: type: BitextMining dataset: name: MTEB Tatoeba (cor-eng) type: mteb/tatoeba-bitext-mining config: cor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.4 - type: f1 value: 6.017828743398003 - type: precision value: 5.4829865484756795 - type: recall value: 8.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (cat-eng) type: mteb/tatoeba-bitext-mining config: cat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.5 - type: f1 value: 79.74833333333333 - type: precision value: 78.04837662337664 - type: recall value: 83.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (eus-eng) type: mteb/tatoeba-bitext-mining config: eus-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 60.4 - type: f1 value: 54.467301587301584 - type: precision value: 52.23242424242424 - type: recall value: 60.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (yue-eng) type: mteb/tatoeba-bitext-mining config: yue-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 74.9 - type: f1 value: 69.68699134199134 - type: precision value: 67.59873015873016 - type: recall value: 74.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (swe-eng) type: mteb/tatoeba-bitext-mining config: swe-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.0 - type: f1 value: 84.9652380952381 - type: precision value: 83.66166666666666 - type: recall value: 88.0 - task: type: BitextMining dataset: name: MTEB Tatoeba (dtp-eng) type: mteb/tatoeba-bitext-mining config: dtp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 9.1 - type: f1 value: 7.681244588744588 - type: precision value: 7.370043290043291 - type: recall value: 9.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (kat-eng) type: mteb/tatoeba-bitext-mining config: kat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 80.9651474530831 - type: f1 value: 76.84220605132133 - type: precision value: 75.19606398962966 - type: recall value: 80.9651474530831 - task: type: BitextMining dataset: name: MTEB Tatoeba (jpn-eng) type: mteb/tatoeba-bitext-mining config: jpn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.9 - type: f1 value: 83.705 - type: precision value: 82.3120634920635 - type: recall value: 86.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (csb-eng) type: mteb/tatoeba-bitext-mining config: csb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 29.64426877470356 - type: f1 value: 23.98763072676116 - type: precision value: 22.506399397703746 - 
type: recall value: 29.64426877470356 - task: type: BitextMining dataset: name: MTEB Tatoeba (xho-eng) type: mteb/tatoeba-bitext-mining config: xho-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 70.4225352112676 - type: f1 value: 62.84037558685445 - type: precision value: 59.56572769953053 - type: recall value: 70.4225352112676 - task: type: BitextMining dataset: name: MTEB Tatoeba (orv-eng) type: mteb/tatoeba-bitext-mining config: orv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 19.64071856287425 - type: f1 value: 15.125271011207756 - type: precision value: 13.865019261197494 - type: recall value: 19.64071856287425 - task: type: BitextMining dataset: name: MTEB Tatoeba (ind-eng) type: mteb/tatoeba-bitext-mining config: ind-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.2 - type: f1 value: 87.80666666666666 - type: precision value: 86.70833333333331 - type: recall value: 90.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (tuk-eng) type: mteb/tatoeba-bitext-mining config: tuk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 23.15270935960591 - type: f1 value: 18.407224958949097 - type: precision value: 16.982385430661292 - type: recall value: 23.15270935960591 - task: type: BitextMining dataset: name: MTEB Tatoeba (max-eng) type: mteb/tatoeba-bitext-mining config: max-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 55.98591549295775 - type: f1 value: 49.94718309859154 - type: precision value: 47.77864154624717 - type: recall value: 55.98591549295775 - task: type: BitextMining dataset: name: MTEB Tatoeba (swh-eng) type: mteb/tatoeba-bitext-mining config: swh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 73.07692307692307 - type: f1 value: 66.74358974358974 - type: precision value: 64.06837606837607 - type: recall value: 73.07692307692307 - task: type: BitextMining dataset: name: MTEB Tatoeba (hin-eng) type: mteb/tatoeba-bitext-mining config: hin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.89999999999999 - type: f1 value: 93.25 - type: precision value: 92.43333333333332 - type: recall value: 94.89999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (dsb-eng) type: mteb/tatoeba-bitext-mining config: dsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 37.78705636743215 - type: f1 value: 31.63899658680452 - type: precision value: 29.72264397629742 - type: recall value: 37.78705636743215 - task: type: BitextMining dataset: name: MTEB Tatoeba (ber-eng) type: mteb/tatoeba-bitext-mining config: ber-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 21.6 - type: f1 value: 16.91697302697303 - type: precision value: 15.71225147075147 - type: recall value: 21.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (tam-eng) type: mteb/tatoeba-bitext-mining config: tam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.01628664495115 - type: f1 value: 81.38514037536838 - type: precision value: 79.83170466883823 - type: recall value: 85.01628664495115 - task: type: BitextMining dataset: name: MTEB Tatoeba (slk-eng) type: mteb/tatoeba-bitext-mining config: 
slk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.39999999999999 - type: f1 value: 79.96380952380952 - type: precision value: 78.48333333333333 - type: recall value: 83.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (tgl-eng) type: mteb/tatoeba-bitext-mining config: tgl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.2 - type: f1 value: 79.26190476190476 - type: precision value: 77.58833333333334 - type: recall value: 83.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (ast-eng) type: mteb/tatoeba-bitext-mining config: ast-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 75.59055118110236 - type: f1 value: 71.66854143232096 - type: precision value: 70.30183727034121 - type: recall value: 75.59055118110236 - task: type: BitextMining dataset: name: MTEB Tatoeba (mkd-eng) type: mteb/tatoeba-bitext-mining config: mkd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.5 - type: f1 value: 59.26095238095238 - type: precision value: 56.81909090909092 - type: recall value: 65.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (khm-eng) type: mteb/tatoeba-bitext-mining config: khm-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 55.26315789473685 - type: f1 value: 47.986523325858506 - type: precision value: 45.33950006595436 - type: recall value: 55.26315789473685 - task: type: BitextMining dataset: name: MTEB Tatoeba (ces-eng) type: mteb/tatoeba-bitext-mining config: ces-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 82.89999999999999 - type: f1 value: 78.835 - type: precision value: 77.04761904761905 - type: recall value: 82.89999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (tzl-eng) type: mteb/tatoeba-bitext-mining config: tzl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 43.269230769230774 - type: f1 value: 36.20421245421245 - type: precision value: 33.57371794871795 - type: recall value: 43.269230769230774 - task: type: BitextMining dataset: name: MTEB Tatoeba (urd-eng) type: mteb/tatoeba-bitext-mining config: urd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.0 - type: f1 value: 84.70666666666666 - type: precision value: 83.23166666666665 - type: recall value: 88.0 - task: type: BitextMining dataset: name: MTEB Tatoeba (ara-eng) type: mteb/tatoeba-bitext-mining config: ara-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.4 - type: f1 value: 72.54666666666667 - type: precision value: 70.54318181818181 - type: recall value: 77.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (kor-eng) type: mteb/tatoeba-bitext-mining config: kor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 78.60000000000001 - type: f1 value: 74.1588888888889 - type: precision value: 72.30250000000001 - type: recall value: 78.60000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (yid-eng) type: mteb/tatoeba-bitext-mining config: yid-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 72.40566037735849 - type: f1 value: 66.82587328813744 - type: precision value: 
64.75039308176099 - type: recall value: 72.40566037735849 - task: type: BitextMining dataset: name: MTEB Tatoeba (fin-eng) type: mteb/tatoeba-bitext-mining config: fin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 73.8 - type: f1 value: 68.56357142857144 - type: precision value: 66.3178822055138 - type: recall value: 73.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (tha-eng) type: mteb/tatoeba-bitext-mining config: tha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.78832116788321 - type: f1 value: 89.3552311435523 - type: precision value: 88.20559610705597 - type: recall value: 91.78832116788321 - task: type: BitextMining dataset: name: MTEB Tatoeba (wuu-eng) type: mteb/tatoeba-bitext-mining config: wuu-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 74.3 - type: f1 value: 69.05085581085581 - type: precision value: 66.955 - type: recall value: 74.3 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.896 - type: map_at_10 value: 8.993 - type: map_at_100 value: 14.133999999999999 - type: map_at_1000 value: 15.668000000000001 - type: map_at_3 value: 5.862 - type: map_at_5 value: 7.17 - type: mrr_at_1 value: 34.694 - type: mrr_at_10 value: 42.931000000000004 - type: mrr_at_100 value: 44.81 - type: mrr_at_1000 value: 44.81 - type: mrr_at_3 value: 38.435 - type: mrr_at_5 value: 41.701 - type: ndcg_at_1 value: 31.633 - type: ndcg_at_10 value: 21.163 - type: ndcg_at_100 value: 33.306000000000004 - type: ndcg_at_1000 value: 45.275999999999996 - type: ndcg_at_3 value: 25.685999999999996 - type: ndcg_at_5 value: 23.732 - type: precision_at_1 value: 34.694 - type: precision_at_10 value: 17.755000000000003 - type: precision_at_100 value: 6.938999999999999 - type: precision_at_1000 value: 1.48 - type: precision_at_3 value: 25.85 - type: precision_at_5 value: 23.265 - type: recall_at_1 value: 2.896 - type: recall_at_10 value: 13.333999999999998 - type: recall_at_100 value: 43.517 - type: recall_at_1000 value: 79.836 - type: recall_at_3 value: 6.306000000000001 - type: recall_at_5 value: 8.825 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 69.3874 - type: ap value: 13.829909072469423 - type: f1 value: 53.54534203543492 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 62.62026032823995 - type: f1 value: 62.85251350485221 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 33.21527881409797 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 84.97943613280086 - type: cos_sim_ap value: 70.75454316885921 - type: cos_sim_f1 value: 65.38274012676743 - type: cos_sim_precision value: 60.761214318078835 - type: 
cos_sim_recall value: 70.76517150395777 - type: dot_accuracy value: 79.0546581629612 - type: dot_ap value: 47.3197121792147 - type: dot_f1 value: 49.20106524633821 - type: dot_precision value: 42.45499808502489 - type: dot_recall value: 58.49604221635884 - type: euclidean_accuracy value: 85.08076533349228 - type: euclidean_ap value: 70.95016106374474 - type: euclidean_f1 value: 65.43987900176455 - type: euclidean_precision value: 62.64478764478765 - type: euclidean_recall value: 68.49604221635884 - type: manhattan_accuracy value: 84.93771234428085 - type: manhattan_ap value: 70.63668388755362 - type: manhattan_f1 value: 65.23895401262398 - type: manhattan_precision value: 56.946084218811485 - type: manhattan_recall value: 76.35883905013192 - type: max_accuracy value: 85.08076533349228 - type: max_ap value: 70.95016106374474 - type: max_f1 value: 65.43987900176455 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.69096130709822 - type: cos_sim_ap value: 84.82526278228542 - type: cos_sim_f1 value: 77.65485060585536 - type: cos_sim_precision value: 75.94582658619167 - type: cos_sim_recall value: 79.44256236526024 - type: dot_accuracy value: 80.97954748321496 - type: dot_ap value: 64.81642914145866 - type: dot_f1 value: 60.631996987229975 - type: dot_precision value: 54.5897293631712 - type: dot_recall value: 68.17831844779796 - type: euclidean_accuracy value: 88.6987231730508 - type: euclidean_ap value: 84.80003825477253 - type: euclidean_f1 value: 77.67194179854496 - type: euclidean_precision value: 75.7128235122094 - type: euclidean_recall value: 79.73514012935017 - type: manhattan_accuracy value: 88.62692591298949 - type: manhattan_ap value: 84.80451408255276 - type: manhattan_f1 value: 77.69888949572183 - type: manhattan_precision value: 73.70311528631622 - type: manhattan_recall value: 82.15275639051433 - type: max_accuracy value: 88.6987231730508 - type: max_ap value: 84.82526278228542 - type: max_f1 value: 77.69888949572183
---

### Optimized and quantized version of the original model

Optimization format: `ONNX`
Quantization: `int8`

Original model is available at [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small).
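For context, an int8 ONNX export like this one can typically be loaded through 🤗 Optimum's ONNX Runtime integration rather than plain `transformers`. The sketch below is illustrative only: the repository id and the `model_quantized.onnx` file name are placeholders not stated by this card, while the `query:`/`passage:` prefixes and mean pooling follow the documented usage of the original multilingual-e5-small model.

```python
# Minimal sketch: loading an ONNX int8 sentence-embedding export with 🤗 Optimum.
import torch
from optimum.onnxruntime import ORTModelForFeatureExtraction
from transformers import AutoTokenizer

repo_id = "your-namespace/multilingual-e5-small-onnx-int8"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# file_name is an assumption; drop it if the repo stores the graph as model.onnx.
model = ORTModelForFeatureExtraction.from_pretrained(repo_id, file_name="model_quantized.onnx")

# E5 models expect "query: " / "passage: " prefixes on the inputs.
texts = [
    "query: how much protein should a female eat",
    "passage: The recommended daily protein intake for women is about 46 grams.",
]
batch = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")
outputs = model(**batch)

# Mean-pool over non-padding tokens, then L2-normalize, as for the original model.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)

# After normalization, the dot product is the cosine similarity.
print(embeddings[:1] @ embeddings[1:].T)
```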
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
hcy5561/distilroberta-base-sentence-transformer-snli
hcy5561
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:494430", "loss:SoftmaxLoss", "en", "dataset:stanfordnlp/snli", "arxiv:1908.10084", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,724
1,724
12
0
--- base_model: google-bert/bert-base-uncased datasets: - stanfordnlp/snli language: - en library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:494430 - loss:SoftmaxLoss widget: - source_sentence: A person out front of a business with a woman statue holding a bottle. sentences: - A couple holds hands. - The young boy is upside down. - the man is baking some bread - source_sentence: A person is dressed up in a weird costume with a red tongue sticking out. sentences: - thhe man plays a tuba - Four siblings are climbing on a fake black bear. - the tongue is blue - source_sentence: A man on a train is talking on a cellphone. sentences: - A man is playing a flute on a bus. - The woman is sexy. - two cyclists racing - source_sentence: An elderly woman giving her daughter a hug. sentences: - There are two women hugging. - A man holds a flag on the street. - people are sitting on a red roofed bus going to a museum - source_sentence: A pilot dressed in a dark-colored sweater is sitting in the cock-pit of a plane with his hands crossed. sentences: - A pilot is sitting in his plain with his hands crossed - The boys are playing outside on a log. - Two men discuss their love lives. --- # SentenceTransformer based on google-bert/bert-base-uncased This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the [stanfordnlp/snli](https://huggingface.co/datasets/stanfordnlp/snli) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) <!-- at revision 86b5e0934494bd15c9632b12f734a8a67f723594 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity - **Training Dataset:** - [stanfordnlp/snli](https://huggingface.co/datasets/stanfordnlp/snli) - **Language:** en <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference.
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("hcy5561/distilroberta-base-sentence-transformer-snli") # Run inference sentences = [ 'A pilot dressed in a dark-colored sweater is sitting in the cock-pit of a plane with his hands crossed.', 'A pilot is sitting in his plain with his hands crossed', 'The boys are playing outside on a log.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### stanfordnlp/snli * Dataset: [stanfordnlp/snli](https://huggingface.co/datasets/stanfordnlp/snli) at [cdb5c3d](https://huggingface.co/datasets/stanfordnlp/snli/tree/cdb5c3d5eed6ead6e5a341c8e56e669bb666725b) * Size: 494,430 training samples * Columns: <code>premise</code>, <code>hypothesis</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | premise | hypothesis | label | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 6 tokens</li><li>mean: 16.24 tokens</li><li>max: 50 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 10.55 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>0: ~31.10%</li><li>1: ~33.40%</li><li>2: ~35.50%</li></ul> | * Samples: | premise | hypothesis | label | |:------------------------------------------------------------------------------|:---------------------------------------|:---------------| | <code>Two men, one in yellow, are on a wooden boat.</code> | <code>Two men swimming in water</code> | <code>2</code> | | <code>Two people sleep on a couch.</code> | <code>Two people are asleep.</code> | <code>0</code> | | <code>a little boy is learning to swim with the help of a float board.</code> | <code>The boy is crawling.</code> | <code>2</code> | * Loss: [<code>SoftmaxLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) ### Evaluation Dataset #### stanfordnlp/snli * Dataset: [stanfordnlp/snli](https://huggingface.co/datasets/stanfordnlp/snli) at [cdb5c3d](https://huggingface.co/datasets/stanfordnlp/snli/tree/cdb5c3d5eed6ead6e5a341c8e56e669bb666725b) * Size: 27,468 evaluation samples * Columns: <code>premise</code>, <code>hypothesis</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | premise | hypothesis | label | 
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------| | type | string | string | int | | details | <ul><li>min: 6 tokens</li><li>mean: 16.66 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 10.48 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>0: ~36.10%</li><li>1: ~31.80%</li><li>2: ~32.10%</li></ul> | * Samples: | premise | hypothesis | label | |:---------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------|:---------------| | <code>A taxi cab driver looks stressed out in his car.</code> | <code>a taxi driver is stressed</code> | <code>0</code> | | <code>Two men do trick in a park.</code> | <code>The men only sat on the bench in the park, doing nothing.</code> | <code>2</code> | | <code>Two woman walking, the blond is looking at the camera wearing sunglasses making an oh face.</code> | <code>One lady makes a shocked face at the camera as the photographer tells the women they are lost.</code> | <code>1</code> | * Loss: [<code>SoftmaxLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) ### Training Hyperparameters #### Non-Default Hyperparameters - `per_device_train_batch_size`: 64 - `per_device_eval_batch_size`: 64 - `num_train_epochs`: 4 - `warmup_ratio`: 0.1 - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `prediction_loss_only`: True - `per_device_train_batch_size`: 64 - `per_device_eval_batch_size`: 64 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - 
`adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | loss | |:------:|:-----:|:-------------:|:------:| | 0.1294 | 1000 | 0.9208 | 0.7448 | | 0.2589 | 2000 | 0.7095 | 0.6462 | | 0.3883 | 3000 | 0.6415 | 0.6199 | | 0.5177 | 4000 | 0.6125 | 0.5940 | | 0.6472 | 5000 | 0.5935 | 0.5672 | | 0.7766 | 6000 | 0.5748 | 0.5550 | | 0.9060 | 7000 | 0.5654 | 0.5506 | | 1.0355 | 8000 | 0.5524 | 0.5376 | | 1.1649 | 9000 | 0.5386 | 0.5319 | | 1.2943 | 10000 | 0.5192 | 0.5361 | | 1.4238 | 11000 | 0.4863 | 0.5304 | | 1.5532 | 12000 | 0.4687 | 0.5278 | | 1.6826 | 13000 | 0.4586 | 0.5305 | | 1.8121 | 14000 | 0.4474 | 0.5222 | | 1.9415 | 15000 | 0.4447 | 0.5237 | | 2.0709 | 16000 | 0.434 | 0.5172 | | 2.2004 | 17000 | 0.4243 | 0.5235 | | 2.3298 | 18000 | 0.398 | 0.5224 | | 2.4592 | 19000 | 0.3747 | 0.5344 | | 2.5887 | 20000 | 0.3669 | 0.5301 | | 2.7181 | 21000 | 0.3583 | 0.5406 | | 2.8475 | 22000 | 0.3496 | 0.5354 | | 2.9770 | 23000 | 0.3527 | 0.5324 | | 3.1064 | 24000 | 0.3419 | 0.5299 | | 3.2358 | 25000 | 0.3358 | 0.5456 | | 3.3653 | 26000 | 0.3096 | 0.5562 | | 3.4947 | 27000 | 0.2964 | 0.5644 | | 3.6241 | 28000 | 0.2998 | 0.5565 | | 3.7536 | 29000 | 0.2906 | 0.5590 | | 3.8830 | 30000 | 0.2923 | 0.5564 | ### Framework Versions - Python: 3.10.6 - Sentence Transformers: 3.0.1 - Transformers: 4.39.3 - PyTorch: 2.2.2+cu118 - Accelerate: 0.28.0 - Datasets: 2.20.0 - Tokenizers: 0.15.2 ## Citation ### BibTeX #### Sentence Transformers and SoftmaxLoss ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
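The training recipe above can be approximated with the Sentence Transformers v3 trainer. What follows is a minimal sketch, not the author's original script: the batch size of 64, 4 epochs, 10% warmup, and the `no_duplicates` batch sampler are taken from the hyperparameters listed above, while the output directory, the label filtering step, and the evaluation split are assumptions.

```python
# Minimal sketch of the SoftmaxLoss fine-tuning on SNLI described above.
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("google-bert/bert-base-uncased")

# SNLI marks unannotated pairs with label -1; drop them before training
# (assumption: this is how the 494,430-example training split was derived).
snli = load_dataset("stanfordnlp/snli")
train_dataset = snli["train"].filter(lambda row: row["label"] != -1)
eval_dataset = snli["validation"].filter(lambda row: row["label"] != -1)  # illustrative split

# SoftmaxLoss trains a 3-way classifier (entailment / neutral / contradiction)
# on top of the concatenated premise/hypothesis embeddings.
loss = losses.SoftmaxLoss(
    model=model,
    sentence_embedding_dimension=model.get_sentence_embedding_dimension(),
    num_labels=3,
)

args = SentenceTransformerTrainingArguments(
    output_dir="bert-base-snli-softmax",  # hypothetical
    num_train_epochs=4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    warmup_ratio=0.1,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```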
[ "TEXT_CLASSIFICATION" ]
[ "BEAR" ]
Non_BioNLP
pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2
pszemraj
summarization
[ "transformers", "pytorch", "safetensors", "longt5", "text2text-generation", "summarization", "summary", "booksum", "long-document", "long-form", "dataset:kmfoda/booksum", "dataset:big_patent", "license:apache-2.0", "license:bsd-3-clause", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,659
1,692
87
2
--- datasets: - kmfoda/booksum - big_patent license: - apache-2.0 - bsd-3-clause metrics: - rouge tags: - summarization - summary - booksum - long-document - long-form widget: - text: large earthquakes along a given fault segment do not occur at random intervals because it takes time to accumulate the strain energy for the rupture. The rates at which tectonic plates move and accumulate strain at their boundaries are approximately uniform. Therefore, in first approximation, one may expect that large ruptures of the same fault segment will occur at approximately constant time intervals. If subsequent main shocks have different amounts of slip across the fault, then the recurrence time may vary, and the basic idea of periodic mainshocks must be modified. For great plate boundary ruptures the length and slip often vary by a factor of 2. Along the southern segment of the San Andreas fault the recurrence interval is 145 years with variations of several decades. The smaller the standard deviation of the average recurrence interval, the more specific could be the long term prediction of a future mainshock. example_title: earthquakes - text: ' A typical feed-forward neural field algorithm. Spatiotemporal coordinates are fed into a neural network that predicts values in the reconstructed domain. Then, this domain is mapped to the sensor domain where sensor measurements are available as supervision. Class and Section Problems Addressed Generalization (Section 2) Inverse problems, ill-posed problems, editability; symmetries. Hybrid Representations (Section 3) Computation & memory efficiency, representation capacity, editability: Forward Maps (Section 4) Inverse problems Network Architecture (Section 5) Spectral bias, integration & derivatives. Manipulating Neural Fields (Section 6) Edit ability, constraints, regularization. Table 2: The five classes of techniques in the neural field toolbox each addresses problems that arise in learning, inference, and control. (Section 3). We can supervise reconstruction via differentiable forward maps that transform Or project our domain (e.g, 3D reconstruction via 2D images; Section 4) With appropriate network architecture choices, we can overcome neural network spectral biases (blurriness) and efficiently compute derivatives and integrals (Section 5). Finally, we can manipulate neural fields to add constraints and regularizations, and to achieve editable representations (Section 6). Collectively, these classes constitute a ''toolbox'' of techniques to help solve problems with neural fields There are three components in a conditional neural field: (1) An encoder or inference function € that outputs the conditioning latent variable 2 given an observation 0 E(0) =2. 2 is typically a low-dimensional vector, and is often referred to aS a latent code Or feature code_ (2) A mapping function 4 between Z and neural field parameters O: Y(z) = O; (3) The neural field itself $. The encoder € finds the most probable z given the observations O: argmaxz P(2/0). The decoder maximizes the inverse conditional probability to find the most probable 0 given Z: arg- max P(Olz). We discuss different encoding schemes with different optimality guarantees (Section 2.1.1), both global and local conditioning (Section 2.1.2), and different mapping functions Y (Section 2.1.3) 2. Generalization Suppose we wish to estimate a plausible 3D surface shape given a partial or noisy point cloud. 
We need a suitable prior over the sur- face in its reconstruction domain to generalize to the partial observations. A neural network expresses a prior via the function space of its architecture and parameters 0, and generalization is influenced by the inductive bias of this function space (Section 5).' example_title: scientific paper - text: 'Is a else or outside the cob and tree written being of early client rope and you have is for good reasons. On to the ocean in Orange for time. By''s the aggregate we can bed it yet. Why this please pick up on a sort is do and also M Getoi''s nerocos and do rain become you to let so is his brother is made in use and Mjulia''s''s the lay major is aging Masastup coin present sea only of Oosii rooms set to you We do er do we easy this private oliiishs lonthen might be okay. Good afternoon everybody. Welcome to this lecture of Computational Statistics. As you can see, I''m not socially my name is Michael Zelinger. I''m one of the task for this class and you might have already seen me in the first lecture where I made a quick appearance. I''m also going to give the tortillas in the last third of this course. So to give you a little bit about me, I''m a old student here with better Bulman and my research centres on casual inference applied to biomedical disasters, so that could be genomics or that could be hospital data. If any of you is interested in writing a bachelor thesis, a semester paper may be mastathesis about this topic feel for reach out to me. you have my name on models and my email address you can find in the directory I''d Be very happy to talk about it. you do not need to be sure about it, we can just have a chat. So with that said, let''s get on with the lecture. There''s an exciting topic today I''m going to start by sharing some slides with you and later on during the lecture we''ll move to the paper. So bear with me for a few seconds. Well, the projector is starting up. Okay, so let''s get started. Today''s topic is a very important one. It''s about a technique which really forms one of the fundamentals of data science, machine learning, and any sort of modern statistics. It''s called cross validation. I know you really want to understand this topic I Want you to understand this and frankly, nobody''s gonna leave Professor Mineshousen''s class without understanding cross validation. So to set the stage for this, I Want to introduce you to the validation problem in computational statistics. So the problem is the following: You trained a model on available data. You fitted your model, but you know the training data you got could always have been different and some data from the environment. Maybe it''s a random process. You do not really know what it is, but you know that somebody else who gets a different batch of data from the same environment they would get slightly different training data and you do not care that your method performs as well. On this training data. you want to to perform well on other data that you have not seen other data from the same environment. So in other words, the validation problem is you want to quantify the performance of your model on data that you have not seen. So how is this even possible? How could you possibly measure the performance on data that you do not know The solution to? This is the following realization is that given that you have a bunch of data, you were in charge. You get to control how much that your model sees. It works in the following way: You can hide data firms model. 
Let''s say you have a training data set which is a bunch of doubtless so X eyes are the features those are typically hide and national vector. It''s got more than one dimension for sure. And the why why eyes. Those are the labels for supervised learning. As you''ve seen before, it''s the same set up as we have in regression. And so you have this training data and now you choose that you only use some of those data to fit your model. You''re not going to use everything, you only use some of it the other part you hide from your model. And then you can use this hidden data to do validation from the point of you of your model. This hidden data is complete by unseen. In other words, we solve our problem of validation.' example_title: transcribed audio - lecture - text: 'Transformer-based models have shown to be very useful for many NLP tasks. However, a major limitation of transformers-based models is its O(n^2)O(n 2) time & memory complexity (where nn is sequence length). Hence, it''s computationally very expensive to apply transformer-based models on long sequences n > 512n>512. Several recent papers, e.g. Longformer, Performer, Reformer, Clustered attention try to remedy this problem by approximating the full attention matrix. You can checkout 🤗''s recent blog post in case you are unfamiliar with these models. BigBird (introduced in paper) is one of such recent models to address this issue. BigBird relies on block sparse attention instead of normal attention (i.e. BERT''s attention) and can handle sequences up to a length of 4096 at a much lower computational cost compared to BERT. It has achieved SOTA on various tasks involving very long sequences such as long documents summarization, question-answering with long contexts. BigBird RoBERTa-like model is now available in 🤗Transformers. The goal of this post is to give the reader an in-depth understanding of big bird implementation & ease one''s life in using BigBird with 🤗Transformers. But, before going into more depth, it is important to remember that the BigBird''s attention is an approximation of BERT''s full attention and therefore does not strive to be better than BERT''s full attention, but rather to be more efficient. It simply allows to apply transformer-based models to much longer sequences since BERT''s quadratic memory requirement quickly becomes unbearable. Simply put, if we would have ∞ compute & ∞ time, BERT''s attention would be preferred over block sparse attention (which we are going to discuss in this post). If you wonder why we need more compute when working with longer sequences, this blog post is just right for you! Some of the main questions one might have when working with standard BERT-like attention include: Do all tokens really have to attend to all other tokens? Why not compute attention only over important tokens? How to decide what tokens are important? How to attend to just a few tokens in a very efficient way? In this blog post, we will try to answer those questions. What tokens should be attended to? We will give a practical example of how attention works by considering the sentence ''BigBird is now available in HuggingFace for extractive question answering''. In BERT-like attention, every word would simply attend to all other tokens. Let''s think about a sensible choice of key tokens that a queried token actually only should attend to by writing some pseudo-code. Will will assume that the token available is queried and build a sensible list of key tokens to attend to. 
>>> # let''s consider following sentence as an example >>> example = [''BigBird'', ''is'', ''now'', ''available'', ''in'', ''HuggingFace'', ''for'', ''extractive'', ''question'', ''answering''] >>> # further let''s assume, we''re trying to understand the representation of ''available'' i.e. >>> query_token = ''available'' >>> # We will initialize an empty `set` and fill up the tokens of our interest as we proceed in this section. >>> key_tokens = [] # => currently ''available'' token doesn''t have anything to attend Nearby tokens should be important because, in a sentence (sequence of words), the current word is highly dependent on neighboring past & future tokens. This intuition is the idea behind the concept of sliding attention.' example_title: bigbird blog intro - text: 'To be fair, you have to have a very high IQ to understand Rick and Morty. The humour is extremely subtle, and without a solid grasp of theoretical physics most of the jokes will go over a typical viewer''s head. There''s also Rick''s nihilistic outlook, which is deftly woven into his characterisation- his personal philosophy draws heavily from Narodnaya Volya literature, for instance. The fans understand this stuff; they have the intellectual capacity to truly appreciate the depths of these jokes, to realise that they''re not just funny- they say something deep about LIFE. As a consequence people who dislike Rick & Morty truly ARE idiots- of course they wouldn''t appreciate, for instance, the humour in Rick''s existential catchphrase ''Wubba Lubba Dub Dub,'' which itself is a cryptic reference to Turgenev''s Russian epic Fathers and Sons. I''m smirking right now just imagining one of those addlepated simpletons scratching their heads in confusion as Dan Harmon''s genius wit unfolds itself on their television screens. What fools.. how I pity them. 😂 And yes, by the way, i DO have a Rick & Morty tattoo. And no, you cannot see it. It''s for the ladies'' eyes only- and even then they have to demonstrate that they''re within 5 IQ points of my own (preferably lower) beforehand. 
Nothin personnel kid 😎' example_title: Richard & Mortimer parameters: max_length: 64 min_length: 8 no_repeat_ngram_size: 3 early_stopping: true repetition_penalty: 3.5 length_penalty: 0.3 encoder_no_repeat_ngram_size: 3 num_beams: 4 model-index: - name: pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2 results: - task: type: summarization name: Summarization dataset: name: kmfoda/booksum type: kmfoda/booksum config: kmfoda--booksum split: test metrics: - type: rouge value: 23.1439 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZmQzMDk0MDJlZTJkN2IzODg3NDJhYmY4MzJmOTU4N2FjMDBjODg5NzJlMGFhNDQ2YTFhMzI3YmY5ZWM1MDBkMiIsInZlcnNpb24iOjF9.yoXEV5ircj_cjQhUA_RpWH_8Kaev0sRLwQulYD8wmqxfSEuqamBGedXnIg9X_EcpjvulBhapjGZN2G0s0vz4Dg - type: rouge value: 3.2393 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTkwNzEwYjc5YTZkMmE4NmEwMDE1OTRiNTJmM2VlYmI3NmM2NjIwZWMxM2ZkNjU2MzhjMmQzYjIxODRiYzY4ZiIsInZlcnNpb24iOjF9.CDK_e4fCwERbm3D_Y2tc41SSscIvlZKGTUQ16afpMuH2_HHKbpn7CNgtU9MWiyFZfdgafdUeQPo2CCYI-dCBCg - type: rouge value: 12.7038 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNDFkNjcyYmYxYzdlMTY2NTIyY2ZiZDJlZjliYTM1YWZjZGI3YzA5ZDczYjdkMGUzZmUxNmJkMDY0OTk3NWNlMSIsInZlcnNpb24iOjF9.XQmt4GEX0N6y2FNXfLAeLDkB96nJyxhN9dyy-OdBcu5E7Tw0dvIN3feYHxq8MenTShE9lsekIYZy2kieJQfmCg - type: rouge value: 19.8101 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYTFhMGNhMzA0YmYyMDhiNzdlMDc2ZDQ3YjFjMDM3ODliMmIxMjQxZWMwYWM0NTM0OGNlZTkzMzVhZDBmMjA1YiIsInZlcnNpb24iOjF9.-YChaP7xwLM9W5jrdLSyLWdb3hAdPbm0mmij3X_pU3nqb3_wuPobjCLGEEQNxAnGq7kE-LI5hgXZ-lGhuKUCCQ - type: loss value: 2.766307830810547 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODAxYzRhNGM2ZGVkOWRiM2Y4NzNjZDM2MTY2MmM4MzY3ZWM5ZjdmMWUxZGY5Y2E2OTg4ZGEwYzBlMmFiYmQyNSIsInZlcnNpb24iOjF9.VRePqe8Z9dD5l6bsfIRLkFn4mwwVC8G--kOlofQWSiGusRxVrY50fa5MtKTGmuiNs5JDFCPjZmkpGYlSxnOeDw - type: gen_len value: 63.4493 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGY4NWI0MDc3NDk4NTg4YjQ5YzFmN2MyYWFjMzI0MjlkMGZlMWMzYThiMDFlMmM3MmE4ODg0YWExNTMyZjQ5MiIsInZlcnNpb24iOjF9.Ym3jfW0gthJhlLg4CW10jM9YUHUGbAPIdLefE3CTyP0OUrV9yuJAGV6-RDrV-Viwyy1Xaqg4BFa5pX7P2PRRDQ - task: type: summarization name: Summarization dataset: name: samsum type: samsum config: samsum split: test metrics: - type: rouge value: 26.8026 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTBhYTQzMGVjZTJjZmE3NjBiNzI2M2FlNTA4Yzk5Njc1Yjk1YTk2NTJiMTRlMzQ3NjU2ZjQxZTNkNDVhNjMzYSIsInZlcnNpb24iOjF9.GyFUubKI3pM5Z8I1jz6Q_f7fSr1nVpwuFluUOVq8aaWfv7L1dZ_5By2FShQM1nwBM-mCiqtFb3a61eR3VEAeBw - type: rouge value: 6.0656 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzEyZTYxYmVlYTc0MzNhMWM1ODgwODRiYWNkN2FjMjIzOTJhNzA0OTFkY2M0ZTJhMWMzNWMzY2E1OGJmYTg5OCIsInZlcnNpb24iOjF9.3U0PamPVFWWE7Nxh6u52mnMP-HpeGPEOLauZthcj32ELSuNx9s260ujguSW_BrJpCXqNNEqIzYTlWf97Ji8vCA - type: rouge value: 20.0098 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOGExYTRmZDgzYzllNWZmMGFlN2FhMDJmZGE1ODkyYTZlNmFhZjZmNGU4YzQwZGZiYTAyZmI1NGJmNjRjODkwYSIsInZlcnNpb24iOjF9.dEON7kZa7dKCHjz7nuuIBdcpwojM5-OxQuEf5n18ZywWdbk9H2LWGY2uvvCRp6cK2JsIzxzTmX9wK7zkWQiCAA - type: rouge value: 21.9115 name: ROUGE-LSUM verified: true verifyToken: 
eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiM2Y4MWE4ZmIyMTA5YWU5YzllYzExMzA1OTc2Mjg3NTYxNjcwMWMxZGI0ZDhmYjJhMGIxNTllY2Q3NDVlNmM2MiIsInZlcnNpb24iOjF9.M8bYXCuNHyVAkA4vBbqvGe8yCgmjCrlhqqliAF6WcmrYRF8CvezQ4S4SWGhhVkcG6v84H-Pa9LzsKmualXdWBw - type: loss value: 2.317471981048584 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMmI1YjNlYzI3OTY4YjY1MDIwYzk3ZDMzZDA4MzQwM2ZhNzY3NDQxZTA2ZThiMmE2MmFmNTg0OGMyYWFhODE5OSIsInZlcnNpb24iOjF9.QpoWo_TLKw72_PbtwknBA1LbUQ8ftls-8VBLuN8_ZhUN2lNNpipU2qMZ1Ga4xAUazkcMhT_TwpqjyGshJFkgAg - type: gen_len value: 19.1111 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYTA2MmFiNjI5NzFjOTUzMTEwZTNiYzA1OGY1ZWEyNTE1ZTgzYjMxNDE4YjJkZmIxNWI4MDMyYWUxMWRkODk1NCIsInZlcnNpb24iOjF9.CXy-Dfle9ypabrK3I1GyhOWl46EyRDbf8XlY-D0cNktXcCCbKdgn8DWgJI199GJpH-19mMS_jQt049VJri2EDw - task: type: summarization name: Summarization dataset: name: xsum type: xsum config: default split: test metrics: - type: rouge value: 25.2061 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjZmZDRlN2NjZTQyNzkyMmZiYzk1MjJmMmE0MGM4ZjUwOGNmOGFhZjg0MzE0MzM4MmE1Y2EyYTY4ZThmNzUzMiIsInZlcnNpb24iOjF9.pdJWpUnMeqftinZrPkkFRWbCA253BYgt5W-EqbyTVi9BteojJ6yEDbMjE0TyYzlJ28JBcw4IVNL2zaWCgpfRBQ - type: rouge value: 4.7048 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGRjOGUzZTk1ZDc0Zjk5MmE4ZjUzNmZiZjQ2YzE2YzYzODdmYmY3NzMwNDdmYmViNjVkZTUzMmY4YjllOGQ1NCIsInZlcnNpb24iOjF9.nFiT7HhUZSDofK6_UH2-1rzPz_48w7e5j0Q72vqgodSNIwpv2JOlcb1GOlaA9jkvy45PJyDBgP9i6kLVfaNBBw - type: rouge value: 17.8593 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZmY5ZjM0ZjdkYTZiMzk0ZWYyM2EzZWNjMjczMjI2MzkwYmNiN2JhNDEzNzdmMmE0NzEwNmVkNGU5YTlkZDAzYyIsInZlcnNpb24iOjF9.C3ZgUsGNNtwZVJFcT90KkBfewrrA3ZXxxVl2u5ykUtzpS4gzoaRuZbPT8WOJAog7kfPPJiG_GZGYy9XTTCdIBw - type: rouge value: 18.0798 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDU4Y2Y3MzExNzNlZTI3NWVmZTNjMmZkNTAxNDBjMzJiZTI5M2E2N2ViODk5OGEwZGU5NzYxZWMzMjMwNmQ2MSIsInZlcnNpb24iOjF9.qDLZsjtftvlw8-3kOoUvanWmemmvaPxUIAxOVh1B18Ihn9kkm0FnZbWxl65YdOLg3dqDcHnDFXvXcS81C8dmBw - type: loss value: 3.003053665161133 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTM2ODRkMjk5MjczY2ViZGVjMjJjOTFmYTk2NTAyNmUwMTRiZjYwZTllY2NhODFhYWVkZTIzYzQxZjZlOGFkNCIsInZlcnNpb24iOjF9.3SeJzRO0b4cNCTOgsf7c8UrLCLW-6JoOHtNMmMr5DCzNzfqlt2TSJ5ClahzzAYA2_5QhTMhcUYOewH5uZhkpDA - type: gen_len value: 27.4815 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDdiYTVkZGI0NzE0ODcwNjgwNGQ0YmNjZDI1MWQxZWQ0MzNmMDJkYmE4MGM5ZjM4NGViNWZiNTdjNTg2YzBlOSIsInZlcnNpb24iOjF9.VoPyoq8HZq8nbucrPYt52flRFtkD5VAfVD7LykAp-GiN2W6D3cpcagMMrHThP9e8q3qDodxddMcnwY88CGtkAg - task: type: summarization name: Summarization dataset: name: cnn_dailymail type: cnn_dailymail config: 3.0.0 split: test metrics: - type: rouge value: 27.5692 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiM2UzNDRjNDJhNjViYjgxNDY2NzAwODkyYjk1OTllNWFiYmI2MGEyMmM3ZTc1YWZjNjhiZDZkYzAxYzIwYTQzZiIsInZlcnNpb24iOjF9.FEJU7de6nnYa1rhAngf3h0JDSFKXzWKkcHwQtcz6rbPuVV0Jw7u-9PwDXBFh0X8n2PJjOfCqM5hmcrUe0FxkCQ - type: rouge value: 6.1264 name: ROUGE-2 verified: true verifyToken: 
eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMGIzODA2ZjU2YmM0YmJiZDIxNzQ0ZDI1NGQzZGZlNDg5OTZhYmMwZTQ1ZjVlYzM5ZTQzMjZkMTIyZmY1OGQ2YiIsInZlcnNpb24iOjF9.fN1wSGc_tUvIgYyzKU35PuPxKyTOotKnMCW_u452LduRVyIey9KB8kf8E35vTOVvk7TCiuvRuxXDoAATFktbBQ - type: rouge value: 17.1127 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNWRjNTNhZjg1NDVkNTQ5MjkwZjNiNzY0Nzk5ZmM4YjhhZmZiZjQzZGY1YWM1ZGI5MGE0YjNiYzNmNWYyNWI2OSIsInZlcnNpb24iOjF9.KVGdIERnuGTOrxm71i2znI8tdRCgVz7SijP08tsE0H54eUijAYDqQccspfZTXRXeFn0lOUjSHDvHj4ODIRYvAw - type: rouge value: 23.0066 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGUyMzhlODY1YWI4ZDg2NzYwZDYwNmYzZTRhMTQ3NDE2MzUzZGViNzhjMTkzZDRhNTljNDEyMTY4NzAwMjE0OCIsInZlcnNpb24iOjF9.pBz5E_1ffBrv4tDCJhuYFIuBFBk0P3SKxLYoIhOVj_fW0Mj6ZKPcA9ZhdE4U-HsHEgSvFhtBw1UlsGiu145XBw - type: loss value: 2.218526601791382 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjYxNDk4OWU0M2Y1ZjMxNTA3NjdiNjQ5NWFjYzJiMjVhMjgzMTA3NDhlNTVjMjllZjQ0NWQ2YmYzYjdiMTQ1OCIsInZlcnNpb24iOjF9.SJdyGLltcLnB03U6QxSkZ71Im0aGK-oTbEQDMj2AnEPFThNTb0mMEMpCWpH1lLVeDAh-PE6fCmgt4yPS6n2nBg - type: gen_len value: 39.1952 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTMyY2JiYWVhYTM3OWU2YjhiMDEwZjAwZDgxN2FmNjk2MzZhYmQzMWZiODg2NDY0ZmU4MjdhNjk0MTViMGY1YyIsInZlcnNpb24iOjF9.bsLAi2R8QTrCUj1VW4GQqYauY7CV3mFm2S294zHCJU2ZlAcikutcjxAcgvuSSGiAVJ02Odm5bMTuzx7SYMUSAQ - task: type: summarization name: Summarization dataset: name: billsum type: billsum config: default split: test metrics: - type: rouge value: 28.0632 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2RiODA0ZTQxYWU0NDI5YmNjZmYzYTZmY2I5NTczYzVhZjcxOTYwMWI3ZjZiMzRlZmI5ZTA5NjVkY2E4NDFlMyIsInZlcnNpb24iOjF9.POIQUXGryoEzHmdBCeqaBh70uz33XlKVLjfhyRFwhWj7UV15SsDcuumkEk2BXkShFHDRo0CQd1AXD1fFsPCVCQ - type: rouge value: 9.8996 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDBiMDllNTZlZmJiYWI1ZTIxM2JhYmZhYTAzYTQ0NmUzNjcyZjkzMDliYTE5ZjIwY2M0YzU2ZWZlYjNhZDY2YyIsInZlcnNpb24iOjF9.EEJO-ZRVi2EiM-uKMvimaITiHh7wqzNBza6lsIvdyVhVf4UwGhsUaArHzlYR7xn53UBCtIDTucXX7NKFst_4Ag - type: rouge value: 18.25 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTk4ZmJiYWIwYmY4MTBmNGVlMmE1YzA4N2VmYWU3NjRlNTU3YjI2YjBhOGIzNzcwZjczOTZmZGJiNjMyMjYzZiIsInZlcnNpb24iOjF9.Qx-ihTp0UuzhShqHQkiTijODUst1LO5Bi8KaQOCIiVhvywN-2Wt3bmeSNV_C0b5BXsSaHIxrWBTeSRaq5Zp_Bw - type: rouge value: 21.9053 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTIzNGNkNTAyYTkzZjE5ZGZhZjZkYmU3Yjg2ZTVhYjY1NjZhODZjM2NkMWQ5NmJjN2UxNTZlMmJmNDNmOTczZSIsInZlcnNpb24iOjF9.6ZY8rK5bRfOZJkdvhpvOt_gW1xCoA4JsAi0-6No4y-lBaLGUo4LXpGaVcJrrvdN-S7e7yCxnA32jGCdYXzJJBA - type: loss value: 2.032966375350952 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTM5MmQzMWZhOWIwNjNjNThhNGE4NzFiMzdhNmMzZWM4ZGYyNWE1NmZjMDVjNTBmMGRiNzYzMTc1ZDg2YTYxNCIsInZlcnNpb24iOjF9.Zqrbz7mmljH19mEep_mm4ev5FEsozIqG0eNkj3V5s85OgrHyuKOVkGKhRlqjcWfgkUlsxTpaemZDUVIR84XrBw - type: gen_len value: 48.5987 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODZjNGJiOGUzM2M3NDM3MDRmNmQ1ZjQ3ODUyZTQ2NjEyYmM0YWRhMmU4MDdkOTZmMGNkNTIyNDg3ZmQxMjA4MiIsInZlcnNpb24iOjF9.y91kl4W-QIy6lfQDl0s4h0aeV-v0iH7Y06AJBYRYrddUXRiDw2oSTHEdf14d3Hw-oZNPftzBHUJqAckwEpGFDw - task: type: summarization name: Summarization 
dataset: name: big_patent type: big_patent config: y split: test metrics: - type: rouge value: 34.7848 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2QxOTU1YTUxYWJjOTcwOGYwZjA3MGJlYmRhYWUwMGNhYmQxYjVmMjU5M2E5OGRiY2RmMTNmNGNhZDdmNzc1OCIsInZlcnNpb24iOjF9.bp2K7V-BDMQMd3zk2RY3pILKI7LimWrD5psesXnSF20JiRA3d5-bQdOfPeZGu3ydUqbml3MTswM0lg_6EZTjAw - type: rouge value: 9.7549 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOGY4OWM4MjVmMzhmNGUwYzAxODNjMjY4OTY1YjQ2MjZiYzM2NzgyNGZhMjVlNjllZmI3OTMzNTVhZDA1YzMyOSIsInZlcnNpb24iOjF9.HQ_emvr1RVEfeNfQUdfhfzk2O5BGwzpQKojvRW_w44Ixakn_VrZ4GurxYo0JTF4dDwDBDqjaFnZ4EiYcsrxODQ - type: rouge value: 22.228 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNWVkMzc2ODM1ZTg2YzQ4YjMzZjQwMThiODI0YzA5MzJmZjY1ZTJlOGZhOTM1OWEzOTE3M2ExYzFiMjM2NDRlMSIsInZlcnNpb24iOjF9.shmWrR-rNKAYOqEgnnlrgWXaWAWbvrKC_IyvK-fwnqoJcphB9ef2gVX758tQgfe878M1N1sE7StT8rd7FbD8Cw - type: rouge value: 28.0389 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTJmZTczZDc4N2ZlNDk3NmY0Njc2Y2JhNGU2OWJjZGU4YWQ3Y2RjNDU1ZTEyNjFiZDQxZGNhZWFmYTAwOTBiMSIsInZlcnNpb24iOjF9.yOTMgX1vpuhlyPkfCAyNf1k5nSInny0YrbqJeC_MDZlavVIxOQT6qVcMYJpLF2AKRp6UsuFB06PANbQu4Bj6CA - type: loss value: 1.7787292003631592 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2VlMGU3NDE0NmFiNTI2M2NhZmU2YzRhYjU1ZWNjYmM3YTllMTQxODJhM2JlMjk3NzVjYjQ5M2FlOTk2NjNmZCIsInZlcnNpb24iOjF9.wkkUrosSgGkei41n6CxQH_UwS6fJTMzXLV88EgnI_8Y6Qz2qa9B2cGhpFkP__snnX6u9jhWj68oAfZifqaXnCw - type: gen_len value: 71.6372 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODI0NTcwOTZmNzkwOTFhMTdmYWMxYjI2YTdmMTYwYTBlMTEyOTc3MmFkYmZkNGJmYjc4MTJlYmYwNzIxMjkzMCIsInZlcnNpb24iOjF9.EM9Vh5Mb6H3htv45ohj6FYqBUtmQi1sn0j97brEFWRYp--8N2Y781cR9ktqylEz6PgbbwpuxMYOMD5MctmGLCw - task: type: summarization name: Summarization dataset: name: launch/gov_report type: launch/gov_report config: plain_text split: validation metrics: - type: rouge value: 23.5925 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjk2NWVkM2Y5NTgxYjgyYmY2YjgzYjZiN2VhZGVkOWNiYmRiZGU0ZGYwNzlkN2E3ZDk5ZjQ3ZTQyYjU5YzczYSIsInZlcnNpb24iOjF9.ScWumfXg7-ZeJEZT1IHDVt5LWMDZEihiiCux5chXP2AeRs3pWKhI6xr_D5i3CCEDeyiMzKleCASMBe4sC9LgDQ - type: rouge value: 5.6762 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDU3MGNmMDY3YWQxNDdlMTk5MjM1NGU4M2RmNDNiYzllYmRmNTYzZGFiOGU5MjQ0YWMzYTg1OWFlNmNmMzQ5NiIsInZlcnNpb24iOjF9.9SKt_I8WGKu6bsovBR4mSTDNEaSHB1tN5RyY3JTCHYs2YQNczaKwLNPnyG2i0IbkvaPX_8EOQ7KzwQ5raUVFBg - type: rouge value: 13.8108 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYTBiZDVkYjI4ZDBlZGM2NDM4M2U2NzdjNzViNDkzZjE3YTBmYzdlNDNlMTZhZTUxNjA2NmJkODE2ZTk1MTAxMSIsInZlcnNpb24iOjF9.KMTkQsI9BfDfL7FZpwZ9kxTTRA8DNrUEpyBZtloQ0sNfhO0t0Ch1qhktz0HaA0uQfC0WFRfrb9Iz7uMc8XVRBg - type: rouge value: 20.2437 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjBkZGJlYzZjMjQ1Njg4MjQ2NzJhYjY5ZGZlN2Y5Y2M4MDQ0YzQ3YzQzYmY5N2VkNjBiNTEwMDNmZWRlMTAwYyIsInZlcnNpb24iOjF9.AqYAfIMFBY7AIP1yJbjaAbJXYs5VbXxWKpsA_rdW_HWxITvjqoJDK9X3wCueXMy7dSE6L-ysC4yl99Bbc50KBA - type: loss value: 2.6377077102661133 name: loss verified: true verifyToken: 
eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGEzMTZhODM0Nzg0ZDY3OTVkYmZmODQ1Y2YzMTY3YmJlYjk2ZGRiMWFkMDQxMTkyYTgwZWNkNmU0NzI0NjA1NCIsInZlcnNpb24iOjF9.ziVXhWBRAml5Xwa-tx9ywwtiJeIzIIclY532L0Mtft3Sc88oGPK9av6nh4kMiO5yWSHJnM3KFQWiuco7w_xNDg - type: gen_len value: 64.1807 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDNhZTRhODgwODI1YmRlODZiY2I3YjFmY2MyZGYyYTY1MzQ5OTgwZGI1NmUwNDMwMmQ0N2Y3YmZmMzcyMTc2NSIsInZlcnNpb24iOjF9.NCVj0Uaq3-diq6pnu8EC0tgwv24NwQCgmWiqpOMvJSN17B_98z_dMbLHRzY8e_tNNVFFagiCnknoE00OqUTjDg - task: type: summarization name: Summarization dataset: name: launch/gov_report type: launch/gov_report config: plain_text split: test metrics: - type: rouge value: 23.7438 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzIwMzZjOGQ5N2U3MTg3NmEwYzZkNjllNDc4NzQ4NWUxN2JmYjdiOGU2MjhkZGJhODY4NDU4N2E5ODU1NTFhMiIsInZlcnNpb24iOjF9.cJoHXGYopFoFVmQXdxu3KrG_usk1ouc0PPR6FS9HrZEbi2T5LtVANntlXmlLTXSvOEaorUyg08yot_j6j1oeCw - type: rouge value: 5.501 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOWQ3YmQ5ZTJkNmJhZGEyMTkzYjlkMWZmZGVhNGE5Y2IzYzA5OWM1NTY0NTU0MWUzYTIzNTQ0OGI3ZWZkNjlkMSIsInZlcnNpb24iOjF9.C_SbNoz5qIo0CtVPL_5jqFNZxgmJ1XE43TvVz2reog2jtlhekNfN0rvaHxT4TadAAzIgDZayeBMeNaASgmNCDA - type: rouge value: 13.8132 name: ROUGE-L verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTAxODA2NmNlNjkyYTQ4YjEwOTA1ZGMyMjVlZjkzMGI3NzNiMTRkZGRmNDJjZDc2MTYxYzI3NTBlNTVjY2IxNCIsInZlcnNpb24iOjF9.UklkyvqHV3axZ_PalbPb1JZN7rgQjHjJr0ke1yDUzujrY6yBr3XpPxjFhwsEElalc1iiEgdtEZnaCbBhskdGBQ - type: rouge value: 20.4615 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNmNhZDI2ODQ4MjBhZDNlZjJkMTQ1NmZjZTdjMDZlMjcwYjE4M2M5ZjIxYzA2M2JmYmJmZDliZTU3NzVkMjdmZiIsInZlcnNpb24iOjF9.m2aRMFUpPFvMSf3sxB7HbKIslWrggFamjiIlOAiPuH5_N8wyLJeHJJw8uvULE8R0GKGWuqXfCCv--lyhZKZkAA - type: loss value: 2.6383883953094482 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTQzMjFiZWE1NDI1OTFlNWUxMzFiYjJhNzViNDYxMzI3OGU2ZTE1ZDJkNDA3Y2NhODA0ZWM3ZmM3ZTM1NmFlZiIsInZlcnNpb24iOjF9.twTQ94T2Nsq0__TcHLaJ_8HcqozA_FOi6pAiM_IP5qSqKlUXYV1S2-nuS1vs69QB-tSp4XIbqRqhSgKv0VoABw - type: gen_len value: 64.9085 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDk3Njc5ZWM3ZTRkMzk2YjJmMjg1YjFlNDExNTU2NTRhNzRlNjA4NGFkZDg2YmQzN2UzNThhODFmZTNlMjdkZiIsInZlcnNpb24iOjF9.2rXKy4mi2VbZRDewY2mKsVe42KuwxIWcmIzdA39RbSJ7Wg45MfRDUjZweyz7Bnlmy6eCcdv7Ya4oyUwAjNV3AQ --- # README - long-t5-tglobal-base-16384-booksum-V11-big_patent-V2 - this README was added because there wasn't one - created 2022-07-31_12-14-50 ## about An experiment testing some transfer learning with [pszemraj/long-t5-tglobal-base-16384-book-summary](https://huggingface.co/pszemraj/long-t5-tglobal-base-16384-book-summary) to evaluate the ability to learn some technical documentation through the `big_patent` dataset on huggingface. This checkpoint has been trained on dataset subsection `y` of `big_patent` for approx 400 steps of functional batch size 128.
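## usage

Since this card does not include an inference snippet, here is a minimal sketch of how one might run the checkpoint for summarization with the `transformers` pipeline. The generation settings mirror the widget parameters declared in the metadata above; the input text and the `max_length` choice are placeholders, not recommendations from the author.

```python
from transformers import pipeline

# Load the checkpoint named in this card (LongT5 accepts inputs up to 16384 tokens).
summarizer = pipeline(
    "summarization",
    model="pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2",
)

long_text = "..."  # placeholder: paste a long document or patent description here

result = summarizer(
    long_text,
    max_length=64,                  # widget defaults taken from the metadata above
    min_length=8,
    no_repeat_ngram_size=3,
    encoder_no_repeat_ngram_size=3,
    repetition_penalty=3.5,
    length_penalty=0.3,
    early_stopping=True,
    num_beams=4,
)
print(result[0]["summary_text"])
```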
[ "QUESTION_ANSWERING", "SUMMARIZATION" ]
[ "BEAR" ]
Non_BioNLP
MilosKosRad/BioNER
MilosKosRad
token-classification
[ "transformers", "pytorch", "bert", "token-classification", "chemistry", "biology", "zero-shot", "BERT", "PubMedBERT", "en", "dataset:ncbi_disease", "dataset:bigbio/chemdner", "dataset:bigbio/n2c2_2018_track2", "dataset:bigbio/bc5cdr", "dataset:bigbio/jnlpba", "arxiv:2305.04928", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,684
1,689
222
8
---
datasets:
- ncbi_disease
- bigbio/chemdner
- bigbio/n2c2_2018_track2
- bigbio/bc5cdr
- bigbio/jnlpba
language:
- en
library_name: transformers
license: mit
metrics:
- accuracy
- recall
- f1
- precision
tags:
- chemistry
- biology
- zero-shot
- BERT
- PubMedBERT
widget:
- text: Disease<SEP>Patient was diagnosed with liver cancer.
---

# Zero- and few-shot NER for biomedical texts

## Model description

This model was created during a research collaboration between Bayer Pharma and the Institute for Artificial Intelligence Research and Development of Serbia. It is trained on 26 biomedical Named Entity (NE) classes and can perform zero-shot inference. It can also be further fine-tuned for new classes with just a few examples (few-shot learning). For more details about the method, please see the paper ["From Zero to Hero: Harnessing Transformers for Biomedical Named Entity Recognition in Zero- and Few-shot Contexts"](https://arxiv.org/abs/2305.04928).

The model corresponds to the PubMedBERT-based model trained with 1 in the first segment (see the paper for more details).

The model takes two strings as input. String1 is the NE label being searched for in the second string. String2 is the short text in which to search for the NE (represented by String1). The model outputs a list of ones (for tokens belonging to the found Named Entities) and zeros (for all other, non-NE tokens) over String2.

## Example of usage

```python
from transformers import AutoTokenizer, BertForTokenClassification

modelname = 'MilosKosRad/BioNER'  # model path on the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(modelname)  # load the tokenizer of the model

string1 = 'Drug'  # the NE label to search for
string2 = 'No recent antibiotics or other nephrotoxins, and no symptoms of UTI with benign UA.'

# The NE label goes into the first segment and the text into the second one
encodings = tokenizer(string1, string2, is_split_into_words=False,
                      padding=True, truncation=True, add_special_tokens=True,
                      return_offsets_mapping=False, max_length=512, return_tensors='pt')

model0 = BertForTokenClassification.from_pretrained(modelname, num_labels=2)
prediction_logits = model0(**encodings)
print(prediction_logits)
```
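The snippet above prints raw logits. A minimal sketch of turning them into the list of ones and zeros described earlier follows; the argmax-per-token post-processing here is our illustration, not code from the original repository.

```python
import torch

# Take the argmax over the two classes to get a 0/1 label for every token of the
# (String1, String2) input pair; 1 marks tokens predicted as part of the entity.
predictions = torch.argmax(prediction_logits.logits, dim=-1)[0]

tokens = tokenizer.convert_ids_to_tokens(encodings['input_ids'][0].tolist())
for token, label in zip(tokens, predictions.tolist()):
    if token not in tokenizer.all_special_tokens:
        print(token, label)
```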
## Example of fine-tuning with few-shot learning

In order to fine-tune the model for a new entity class using a few shots, the dataset needs to be transformed into a `torch.utils.data.Dataset` containing BERT tokens and a sequence of 0s and 1s (1 wherever a token is a positive member of the given NE class); a minimal sketch of such a dataset class is given at the end of this card. After the dataset is created, the following can be done (for more details, please have a look at the code on GitHub - https://github.com/br-ai-ns-institute/Zero-ShotNER):

```python
import os
import time

from transformers import BertForTokenClassification, Trainer, TrainingArguments

# Assumes class_unseen, model_path (the base model), valid_dataset, tokenizer and
# the few-shot training splits are already defined, as in the repository code.
for n_shots, few_shot_train in [(1, train1shot), (10, train10shot), (100, train100shot)]:
    training_args = TrainingArguments(
        output_dir='./Results' + class_unseen + 'FewShot' + str(n_shots),  # folder to store the results
        num_train_epochs=10,             # number of training epochs
        per_device_train_batch_size=16,  # batch size per device during training
        per_device_eval_batch_size=16,   # batch size for evaluation
        weight_decay=0.01,               # strength of weight decay
        logging_dir='./Logs' + class_unseen + 'FewShot' + str(n_shots),  # folder to store the logs
        save_strategy='epoch',
        evaluation_strategy='epoch',
        load_best_model_at_end=True
    )

    model0 = BertForTokenClassification.from_pretrained(model_path, num_labels=2)

    trainer = Trainer(
        model=model0,                  # pre-trained model for fine-tuning
        args=training_args,            # training arguments defined above
        train_dataset=few_shot_train,  # dataset object for training (the current few-shot split)
        eval_dataset=valid_dataset     # dataset object for validation
    )

    start_time = time.time()
    trainer.train()
    total_time = time.time() - start_time

    # Save each few-shot run under its own folder, leaving model_path untouched
    save_path = os.path.join('Results', class_unseen, 'FewShot', str(n_shots), 'Model')
    os.makedirs(save_path, exist_ok=True)
    model0.save_pretrained(save_path)

    tokenizer_path = os.path.join('Results', class_unseen, 'FewShot', str(n_shots), 'Tokenizer')
    os.makedirs(tokenizer_path, exist_ok=True)
    tokenizer.save_pretrained(tokenizer_path)
```

## Available classes

The following datasets and entities were used for training, so they can be used as the label in the first segment (the first string). Note that multiword strings have been merged.

* NCBI
  * Specific Disease
  * Composite Mention
  * Modifier
  * Disease Class
* BIORED
  * Sequence Variant
  * Gene Or Gene Product
  * Disease Or Phenotypic Feature
  * Chemical Entity
  * Cell Line
  * Organism Taxon
* CDR
  * Disease
  * Chemical
* CHEMDNER
  * Chemical
  * Chemical Family
* JNLPBA
  * Protein
  * DNA
  * Cell Type
  * Cell Line
  * RNA
* n2c2
  * Drug
  * Frequency
  * Strength
  * Dosage
  * Form
  * Reason
  * Route
  * ADE
  * Duration

On top of this, one can use the model for zero-shot learning with other classes, and also fine-tune it with a few examples of other classes.

## Code availability

The code used for training and testing the model is available at https://github.com/br-ai-ns-institute/Zero-ShotNER

## Citation

If you use this model, or are inspired by it, please cite the following paper:

Košprdić M., Prodanović N., Ljajić A., Bašaragin B., Milošević N., 2023. From Zero to Hero: Harnessing Transformers for Biomedical Named Entity Recognition in Zero- and Few-shot Contexts. arXiv preprint arXiv:2305.04928. https://arxiv.org/abs/2305.04928

or in BibTeX:

```
@misc{kosprdic2023transformerbased,
      title={From Zero to Hero: Harnessing Transformers for Biomedical Named Entity Recognition in Zero- and Few-shot Contexts},
      author={Miloš Košprdić and Nikola Prodanović and Adela Ljajić and Bojana Bašaragin and Nikola Milošević},
      year={2023},
      eprint={2305.04928},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
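As referenced in the fine-tuning section above, here is a minimal sketch of the kind of `torch.utils.data.Dataset` the training loop expects. This is an illustration under assumptions, not code from the original repository: the class name and field layout are hypothetical, and it assumes `encodings`/`labels` were produced by the tokenizer as in the usage example (without `return_tensors`).

```python
import torch

class FewShotNERDataset(torch.utils.data.Dataset):
    """Hypothetical dataset: (label<SEP>text) encodings with 0/1 token labels."""

    def __init__(self, encodings, labels):
        # encodings: a BatchEncoding from tokenizer(label_strings, texts, ...)
        # labels: one list of 0/1 values per example, aligned with the tokenized input
        self.encodings = encodings
        self.labels = labels

    def __getitem__(self, idx):
        item = {key: torch.tensor(val[idx]) for key, val in self.encodings.items()}
        item['labels'] = torch.tensor(self.labels[idx])
        return item

    def __len__(self):
        return len(self.labels)
```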
[ "NAMED_ENTITY_RECOGNITION" ]
[ "BC5CDR", "BIORED", "CHEMDNER", "JNLPBA", "NCBI DISEASE" ]
BioNLP
Snowflake/snowflake-arctic-embed-s
Snowflake
sentence-similarity
[ "sentence-transformers", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "mteb", "arctic", "snowflake-arctic-embed", "transformers.js", "arxiv:2407.18887", "arxiv:2405.05374", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,712
1,733
29,658
19
--- license: apache-2.0 pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb - arctic - snowflake-arctic-embed - transformers.js model-index: - name: snowflake-snowflake-arctic-embed-s results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 71.17910447761193 - type: ap value: 33.15833652904991 - type: f1 value: 64.86214791591543 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 78.750325 - type: ap value: 72.83242788470943 - type: f1 value: 78.63968044029453 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 38.264 - type: f1 value: 37.140269688532825 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 32.646 - type: map_at_10 value: 48.372 - type: map_at_100 value: 49.207 - type: map_at_1000 value: 49.214 - type: map_at_3 value: 43.611 - type: map_at_5 value: 46.601 - type: mrr_at_1 value: 33.144 - type: mrr_at_10 value: 48.557 - type: mrr_at_100 value: 49.385 - type: mrr_at_1000 value: 49.392 - type: mrr_at_3 value: 43.777 - type: mrr_at_5 value: 46.792 - type: ndcg_at_1 value: 32.646 - type: ndcg_at_10 value: 56.874 - type: ndcg_at_100 value: 60.307 - type: ndcg_at_1000 value: 60.465999999999994 - type: ndcg_at_3 value: 47.339999999999996 - type: ndcg_at_5 value: 52.685 - type: precision_at_1 value: 32.646 - type: precision_at_10 value: 8.378 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 19.393 - type: precision_at_5 value: 14.210999999999999 - type: recall_at_1 value: 32.646 - type: recall_at_10 value: 83.784 - type: recall_at_100 value: 98.43499999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 58.179 - type: recall_at_5 value: 71.053 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 44.94353025039141 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 35.870836103029156 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 61.149290266979236 - type: mrr value: 73.8448093919008 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 87.055571064151 - type: cos_sim_spearman value: 86.2652186235749 - type: euclidean_pearson value: 85.82039272282503 - type: euclidean_spearman value: 86.2652186235749 - type: manhattan_pearson value: 85.95825392094812 - type: manhattan_spearman value: 86.6742640885316 - task: 
type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 79.11688311688312 - type: f1 value: 78.28328901613885 - task: type: Clustering dataset: name: MTEB BigPatentClustering type: jinaai/big-patent-clustering config: default split: test revision: 62d5330920bca426ce9d3c76ea914f15fc83e891 metrics: - type: v_measure value: 19.147523589859325 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 35.68369864124274 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 30.474958792950872 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: mteb/cqadupstack-android config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 33.183 - type: map_at_10 value: 43.989 - type: map_at_100 value: 45.389 - type: map_at_1000 value: 45.517 - type: map_at_3 value: 40.275 - type: map_at_5 value: 42.306 - type: mrr_at_1 value: 40.486 - type: mrr_at_10 value: 49.62 - type: mrr_at_100 value: 50.351 - type: mrr_at_1000 value: 50.393 - type: mrr_at_3 value: 46.805 - type: mrr_at_5 value: 48.429 - type: ndcg_at_1 value: 40.486 - type: ndcg_at_10 value: 50.249 - type: ndcg_at_100 value: 55.206 - type: ndcg_at_1000 value: 57.145 - type: ndcg_at_3 value: 44.852 - type: ndcg_at_5 value: 47.355000000000004 - type: precision_at_1 value: 40.486 - type: precision_at_10 value: 9.571 - type: precision_at_100 value: 1.4949999999999999 - type: precision_at_1000 value: 0.196 - type: precision_at_3 value: 21.173000000000002 - type: precision_at_5 value: 15.622 - type: recall_at_1 value: 33.183 - type: recall_at_10 value: 62.134 - type: recall_at_100 value: 82.73 - type: recall_at_1000 value: 94.93599999999999 - type: recall_at_3 value: 46.497 - type: recall_at_5 value: 53.199 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: mteb/cqadupstack-english config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 32.862 - type: map_at_10 value: 42.439 - type: map_at_100 value: 43.736999999999995 - type: map_at_1000 value: 43.864 - type: map_at_3 value: 39.67 - type: map_at_5 value: 41.202 - type: mrr_at_1 value: 40.892 - type: mrr_at_10 value: 48.61 - type: mrr_at_100 value: 49.29 - type: mrr_at_1000 value: 49.332 - type: mrr_at_3 value: 46.688 - type: mrr_at_5 value: 47.803000000000004 - type: ndcg_at_1 value: 40.892 - type: ndcg_at_10 value: 47.797 - type: ndcg_at_100 value: 52.17699999999999 - type: ndcg_at_1000 value: 54.127 - type: ndcg_at_3 value: 44.189 - type: ndcg_at_5 value: 45.821 - type: precision_at_1 value: 40.892 - type: precision_at_10 value: 8.841000000000001 - type: precision_at_100 value: 1.419 - type: precision_at_1000 value: 0.188 - type: precision_at_3 value: 21.104 - type: precision_at_5 value: 14.777000000000001 - type: recall_at_1 value: 32.862 - type: recall_at_10 value: 56.352999999999994 - type: recall_at_100 value: 74.795 - type: recall_at_1000 value: 86.957 - type: recall_at_3 value: 45.269999999999996 - type: recall_at_5 value: 50.053000000000004 - task: type: Retrieval dataset: name: MTEB 
CQADupstackGamingRetrieval type: mteb/cqadupstack-gaming config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 42.998999999999995 - type: map_at_10 value: 54.745 - type: map_at_100 value: 55.650999999999996 - type: map_at_1000 value: 55.703 - type: map_at_3 value: 51.67 - type: map_at_5 value: 53.503 - type: mrr_at_1 value: 49.028 - type: mrr_at_10 value: 58.172000000000004 - type: mrr_at_100 value: 58.744 - type: mrr_at_1000 value: 58.769000000000005 - type: mrr_at_3 value: 55.977 - type: mrr_at_5 value: 57.38799999999999 - type: ndcg_at_1 value: 49.028 - type: ndcg_at_10 value: 60.161 - type: ndcg_at_100 value: 63.806 - type: ndcg_at_1000 value: 64.821 - type: ndcg_at_3 value: 55.199 - type: ndcg_at_5 value: 57.830999999999996 - type: precision_at_1 value: 49.028 - type: precision_at_10 value: 9.455 - type: precision_at_100 value: 1.216 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 24.242 - type: precision_at_5 value: 16.614 - type: recall_at_1 value: 42.998999999999995 - type: recall_at_10 value: 72.542 - type: recall_at_100 value: 88.605 - type: recall_at_1000 value: 95.676 - type: recall_at_3 value: 59.480999999999995 - type: recall_at_5 value: 65.886 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: mteb/cqadupstack-gis config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 27.907 - type: map_at_10 value: 35.975 - type: map_at_100 value: 36.985 - type: map_at_1000 value: 37.063 - type: map_at_3 value: 33.467999999999996 - type: map_at_5 value: 34.749 - type: mrr_at_1 value: 30.056 - type: mrr_at_10 value: 38.047 - type: mrr_at_100 value: 38.932 - type: mrr_at_1000 value: 38.991 - type: mrr_at_3 value: 35.705999999999996 - type: mrr_at_5 value: 36.966 - type: ndcg_at_1 value: 30.056 - type: ndcg_at_10 value: 40.631 - type: ndcg_at_100 value: 45.564 - type: ndcg_at_1000 value: 47.685 - type: ndcg_at_3 value: 35.748000000000005 - type: ndcg_at_5 value: 37.921 - type: precision_at_1 value: 30.056 - type: precision_at_10 value: 6.079 - type: precision_at_100 value: 0.898 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 14.727 - type: precision_at_5 value: 10.056 - type: recall_at_1 value: 27.907 - type: recall_at_10 value: 52.981 - type: recall_at_100 value: 75.53999999999999 - type: recall_at_1000 value: 91.759 - type: recall_at_3 value: 39.878 - type: recall_at_5 value: 45.077 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: mteb/cqadupstack-mathematica config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 16.764000000000003 - type: map_at_10 value: 24.294 - type: map_at_100 value: 25.507999999999996 - type: map_at_1000 value: 25.64 - type: map_at_3 value: 21.807000000000002 - type: map_at_5 value: 23.21 - type: mrr_at_1 value: 20.771 - type: mrr_at_10 value: 28.677000000000003 - type: mrr_at_100 value: 29.742 - type: mrr_at_1000 value: 29.816 - type: mrr_at_3 value: 26.327 - type: mrr_at_5 value: 27.639000000000003 - type: ndcg_at_1 value: 20.771 - type: ndcg_at_10 value: 29.21 - type: ndcg_at_100 value: 34.788000000000004 - type: ndcg_at_1000 value: 37.813 - type: ndcg_at_3 value: 24.632 - type: ndcg_at_5 value: 26.801000000000002 - type: precision_at_1 value: 20.771 - type: precision_at_10 value: 5.373 - type: precision_at_100 value: 0.923 - type: precision_at_1000 value: 0.133 - type: precision_at_3 
value: 12.065 - type: precision_at_5 value: 8.706 - type: recall_at_1 value: 16.764000000000003 - type: recall_at_10 value: 40.072 - type: recall_at_100 value: 63.856 - type: recall_at_1000 value: 85.141 - type: recall_at_3 value: 27.308 - type: recall_at_5 value: 32.876 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: mteb/cqadupstack-physics config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 31.194 - type: map_at_10 value: 40.731 - type: map_at_100 value: 42.073 - type: map_at_1000 value: 42.178 - type: map_at_3 value: 37.726 - type: map_at_5 value: 39.474 - type: mrr_at_1 value: 37.729 - type: mrr_at_10 value: 46.494 - type: mrr_at_100 value: 47.368 - type: mrr_at_1000 value: 47.407 - type: mrr_at_3 value: 44.224999999999994 - type: mrr_at_5 value: 45.582 - type: ndcg_at_1 value: 37.729 - type: ndcg_at_10 value: 46.312999999999995 - type: ndcg_at_100 value: 51.915 - type: ndcg_at_1000 value: 53.788000000000004 - type: ndcg_at_3 value: 41.695 - type: ndcg_at_5 value: 43.956 - type: precision_at_1 value: 37.729 - type: precision_at_10 value: 8.181 - type: precision_at_100 value: 1.275 - type: precision_at_1000 value: 0.16199999999999998 - type: precision_at_3 value: 19.41 - type: precision_at_5 value: 13.648 - type: recall_at_1 value: 31.194 - type: recall_at_10 value: 57.118 - type: recall_at_100 value: 80.759 - type: recall_at_1000 value: 92.779 - type: recall_at_3 value: 44.083 - type: recall_at_5 value: 50.044999999999995 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: mteb/cqadupstack-programmers config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 28.047 - type: map_at_10 value: 37.79 - type: map_at_100 value: 39.145 - type: map_at_1000 value: 39.254 - type: map_at_3 value: 34.857 - type: map_at_5 value: 36.545 - type: mrr_at_1 value: 35.388 - type: mrr_at_10 value: 43.475 - type: mrr_at_100 value: 44.440000000000005 - type: mrr_at_1000 value: 44.494 - type: mrr_at_3 value: 41.286 - type: mrr_at_5 value: 42.673 - type: ndcg_at_1 value: 35.388 - type: ndcg_at_10 value: 43.169000000000004 - type: ndcg_at_100 value: 48.785000000000004 - type: ndcg_at_1000 value: 51.029 - type: ndcg_at_3 value: 38.801 - type: ndcg_at_5 value: 40.9 - type: precision_at_1 value: 35.388 - type: precision_at_10 value: 7.7509999999999994 - type: precision_at_100 value: 1.212 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 18.455 - type: precision_at_5 value: 13.014000000000001 - type: recall_at_1 value: 28.047 - type: recall_at_10 value: 53.53099999999999 - type: recall_at_100 value: 77.285 - type: recall_at_1000 value: 92.575 - type: recall_at_3 value: 40.949000000000005 - type: recall_at_5 value: 46.742 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 28.131999999999994 - type: map_at_10 value: 36.93333333333334 - type: map_at_100 value: 38.117250000000006 - type: map_at_1000 value: 38.23275 - type: map_at_3 value: 34.19708333333333 - type: map_at_5 value: 35.725166666666674 - type: mrr_at_1 value: 33.16116666666667 - type: mrr_at_10 value: 41.057833333333335 - type: mrr_at_100 value: 41.90033333333333 - type: mrr_at_1000 value: 41.95625 - type: mrr_at_3 value: 38.757333333333335 - type: mrr_at_5 value: 40.097333333333324 - type: ndcg_at_1 value: 
33.16116666666667 - type: ndcg_at_10 value: 42.01983333333333 - type: ndcg_at_100 value: 46.99916666666667 - type: ndcg_at_1000 value: 49.21783333333334 - type: ndcg_at_3 value: 37.479916666666654 - type: ndcg_at_5 value: 39.6355 - type: precision_at_1 value: 33.16116666666667 - type: precision_at_10 value: 7.230249999999999 - type: precision_at_100 value: 1.1411666666666667 - type: precision_at_1000 value: 0.1520833333333333 - type: precision_at_3 value: 17.028166666666667 - type: precision_at_5 value: 12.046999999999999 - type: recall_at_1 value: 28.131999999999994 - type: recall_at_10 value: 52.825500000000005 - type: recall_at_100 value: 74.59608333333333 - type: recall_at_1000 value: 89.87916666666668 - type: recall_at_3 value: 40.13625 - type: recall_at_5 value: 45.699999999999996 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: mteb/cqadupstack-stats config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 24.773999999999997 - type: map_at_10 value: 31.997999999999998 - type: map_at_100 value: 32.857 - type: map_at_1000 value: 32.957 - type: map_at_3 value: 30.041 - type: map_at_5 value: 31.119000000000003 - type: mrr_at_1 value: 27.607 - type: mrr_at_10 value: 34.538000000000004 - type: mrr_at_100 value: 35.308 - type: mrr_at_1000 value: 35.375 - type: mrr_at_3 value: 32.643 - type: mrr_at_5 value: 33.755 - type: ndcg_at_1 value: 27.607 - type: ndcg_at_10 value: 36.035000000000004 - type: ndcg_at_100 value: 40.351 - type: ndcg_at_1000 value: 42.684 - type: ndcg_at_3 value: 32.414 - type: ndcg_at_5 value: 34.11 - type: precision_at_1 value: 27.607 - type: precision_at_10 value: 5.6129999999999995 - type: precision_at_100 value: 0.8370000000000001 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 13.957 - type: precision_at_5 value: 9.571 - type: recall_at_1 value: 24.773999999999997 - type: recall_at_10 value: 45.717 - type: recall_at_100 value: 65.499 - type: recall_at_1000 value: 82.311 - type: recall_at_3 value: 35.716 - type: recall_at_5 value: 40.007999999999996 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: mteb/cqadupstack-tex config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 19.227 - type: map_at_10 value: 26.649 - type: map_at_100 value: 27.711999999999996 - type: map_at_1000 value: 27.837 - type: map_at_3 value: 24.454 - type: map_at_5 value: 25.772000000000002 - type: mrr_at_1 value: 23.433999999999997 - type: mrr_at_10 value: 30.564999999999998 - type: mrr_at_100 value: 31.44 - type: mrr_at_1000 value: 31.513999999999996 - type: mrr_at_3 value: 28.435 - type: mrr_at_5 value: 29.744999999999997 - type: ndcg_at_1 value: 23.433999999999997 - type: ndcg_at_10 value: 31.104 - type: ndcg_at_100 value: 36.172 - type: ndcg_at_1000 value: 39.006 - type: ndcg_at_3 value: 27.248 - type: ndcg_at_5 value: 29.249000000000002 - type: precision_at_1 value: 23.433999999999997 - type: precision_at_10 value: 5.496 - type: precision_at_100 value: 0.9490000000000001 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 12.709000000000001 - type: precision_at_5 value: 9.209 - type: recall_at_1 value: 19.227 - type: recall_at_10 value: 40.492 - type: recall_at_100 value: 63.304 - type: recall_at_1000 value: 83.45 - type: recall_at_3 value: 29.713 - type: recall_at_5 value: 34.82 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: 
mteb/cqadupstack-unix config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 29.199 - type: map_at_10 value: 37.617 - type: map_at_100 value: 38.746 - type: map_at_1000 value: 38.851 - type: map_at_3 value: 34.882000000000005 - type: map_at_5 value: 36.571999999999996 - type: mrr_at_1 value: 33.489000000000004 - type: mrr_at_10 value: 41.089999999999996 - type: mrr_at_100 value: 41.965 - type: mrr_at_1000 value: 42.028 - type: mrr_at_3 value: 38.666 - type: mrr_at_5 value: 40.159 - type: ndcg_at_1 value: 33.489000000000004 - type: ndcg_at_10 value: 42.487 - type: ndcg_at_100 value: 47.552 - type: ndcg_at_1000 value: 49.774 - type: ndcg_at_3 value: 37.623 - type: ndcg_at_5 value: 40.184999999999995 - type: precision_at_1 value: 33.489000000000004 - type: precision_at_10 value: 6.94 - type: precision_at_100 value: 1.0699999999999998 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 16.667 - type: precision_at_5 value: 11.922 - type: recall_at_1 value: 29.199 - type: recall_at_10 value: 53.689 - type: recall_at_100 value: 75.374 - type: recall_at_1000 value: 90.64999999999999 - type: recall_at_3 value: 40.577999999999996 - type: recall_at_5 value: 46.909 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: mteb/cqadupstack-webmasters config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 27.206999999999997 - type: map_at_10 value: 36.146 - type: map_at_100 value: 37.759 - type: map_at_1000 value: 37.979 - type: map_at_3 value: 32.967999999999996 - type: map_at_5 value: 34.809 - type: mrr_at_1 value: 32.806000000000004 - type: mrr_at_10 value: 40.449 - type: mrr_at_100 value: 41.404999999999994 - type: mrr_at_1000 value: 41.457 - type: mrr_at_3 value: 37.614999999999995 - type: mrr_at_5 value: 39.324999999999996 - type: ndcg_at_1 value: 32.806000000000004 - type: ndcg_at_10 value: 41.911 - type: ndcg_at_100 value: 47.576 - type: ndcg_at_1000 value: 50.072 - type: ndcg_at_3 value: 36.849 - type: ndcg_at_5 value: 39.475 - type: precision_at_1 value: 32.806000000000004 - type: precision_at_10 value: 8.103 - type: precision_at_100 value: 1.557 - type: precision_at_1000 value: 0.242 - type: precision_at_3 value: 17.26 - type: precision_at_5 value: 12.885 - type: recall_at_1 value: 27.206999999999997 - type: recall_at_10 value: 52.56999999999999 - type: recall_at_100 value: 78.302 - type: recall_at_1000 value: 94.121 - type: recall_at_3 value: 38.317 - type: recall_at_5 value: 45.410000000000004 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval type: mteb/cqadupstack-wordpress config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 24.221 - type: map_at_10 value: 30.826999999999998 - type: map_at_100 value: 31.845000000000002 - type: map_at_1000 value: 31.95 - type: map_at_3 value: 28.547 - type: map_at_5 value: 29.441 - type: mrr_at_1 value: 26.247999999999998 - type: mrr_at_10 value: 32.957 - type: mrr_at_100 value: 33.819 - type: mrr_at_1000 value: 33.899 - type: mrr_at_3 value: 30.714999999999996 - type: mrr_at_5 value: 31.704 - type: ndcg_at_1 value: 26.247999999999998 - type: ndcg_at_10 value: 35.171 - type: ndcg_at_100 value: 40.098 - type: ndcg_at_1000 value: 42.67 - type: ndcg_at_3 value: 30.508999999999997 - type: ndcg_at_5 value: 32.022 - type: precision_at_1 value: 26.247999999999998 - type: precision_at_10 value: 5.36 - type: precision_at_100 value: 
0.843 - type: precision_at_1000 value: 0.11499999999999999 - type: precision_at_3 value: 12.568999999999999 - type: precision_at_5 value: 8.540000000000001 - type: recall_at_1 value: 24.221 - type: recall_at_10 value: 46.707 - type: recall_at_100 value: 69.104 - type: recall_at_1000 value: 88.19500000000001 - type: recall_at_3 value: 33.845 - type: recall_at_5 value: 37.375 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 13.624 - type: map_at_10 value: 22.557 - type: map_at_100 value: 24.367 - type: map_at_1000 value: 24.54 - type: map_at_3 value: 18.988 - type: map_at_5 value: 20.785999999999998 - type: mrr_at_1 value: 30.619000000000003 - type: mrr_at_10 value: 42.019 - type: mrr_at_100 value: 42.818 - type: mrr_at_1000 value: 42.856 - type: mrr_at_3 value: 38.578 - type: mrr_at_5 value: 40.669 - type: ndcg_at_1 value: 30.619000000000003 - type: ndcg_at_10 value: 31.252999999999997 - type: ndcg_at_100 value: 38.238 - type: ndcg_at_1000 value: 41.368 - type: ndcg_at_3 value: 25.843 - type: ndcg_at_5 value: 27.638 - type: precision_at_1 value: 30.619000000000003 - type: precision_at_10 value: 9.687 - type: precision_at_100 value: 1.718 - type: precision_at_1000 value: 0.22999999999999998 - type: precision_at_3 value: 18.849 - type: precision_at_5 value: 14.463000000000001 - type: recall_at_1 value: 13.624 - type: recall_at_10 value: 36.693999999999996 - type: recall_at_100 value: 60.9 - type: recall_at_1000 value: 78.46 - type: recall_at_3 value: 23.354 - type: recall_at_5 value: 28.756999999999998 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 9.077 - type: map_at_10 value: 19.813 - type: map_at_100 value: 27.822999999999997 - type: map_at_1000 value: 29.485 - type: map_at_3 value: 14.255999999999998 - type: map_at_5 value: 16.836000000000002 - type: mrr_at_1 value: 69.25 - type: mrr_at_10 value: 77.059 - type: mrr_at_100 value: 77.41 - type: mrr_at_1000 value: 77.416 - type: mrr_at_3 value: 75.625 - type: mrr_at_5 value: 76.512 - type: ndcg_at_1 value: 55.75 - type: ndcg_at_10 value: 41.587 - type: ndcg_at_100 value: 46.048 - type: ndcg_at_1000 value: 53.172 - type: ndcg_at_3 value: 46.203 - type: ndcg_at_5 value: 43.696 - type: precision_at_1 value: 69.25 - type: precision_at_10 value: 32.95 - type: precision_at_100 value: 10.555 - type: precision_at_1000 value: 2.136 - type: precision_at_3 value: 49.667 - type: precision_at_5 value: 42.5 - type: recall_at_1 value: 9.077 - type: recall_at_10 value: 25.249 - type: recall_at_100 value: 51.964 - type: recall_at_1000 value: 74.51 - type: recall_at_3 value: 15.584000000000001 - type: recall_at_5 value: 19.717000000000002 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 45.769999999999996 - type: f1 value: 41.64144711933962 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 67.098 - type: map_at_10 value: 77.69800000000001 - type: map_at_100 value: 77.947 - type: map_at_1000 value: 77.961 - type: map_at_3 value: 76.278 - type: map_at_5 value: 77.217 - type: mrr_at_1 value: 72.532 - type: 
mrr_at_10 value: 82.41199999999999 - type: mrr_at_100 value: 82.527 - type: mrr_at_1000 value: 82.529 - type: mrr_at_3 value: 81.313 - type: mrr_at_5 value: 82.069 - type: ndcg_at_1 value: 72.532 - type: ndcg_at_10 value: 82.488 - type: ndcg_at_100 value: 83.382 - type: ndcg_at_1000 value: 83.622 - type: ndcg_at_3 value: 80.101 - type: ndcg_at_5 value: 81.52199999999999 - type: precision_at_1 value: 72.532 - type: precision_at_10 value: 10.203 - type: precision_at_100 value: 1.082 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 31.308000000000003 - type: precision_at_5 value: 19.652 - type: recall_at_1 value: 67.098 - type: recall_at_10 value: 92.511 - type: recall_at_100 value: 96.06099999999999 - type: recall_at_1000 value: 97.548 - type: recall_at_3 value: 86.105 - type: recall_at_5 value: 89.661 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 18.681 - type: map_at_10 value: 31.739 - type: map_at_100 value: 33.503 - type: map_at_1000 value: 33.69 - type: map_at_3 value: 27.604 - type: map_at_5 value: 29.993 - type: mrr_at_1 value: 37.5 - type: mrr_at_10 value: 46.933 - type: mrr_at_100 value: 47.771 - type: mrr_at_1000 value: 47.805 - type: mrr_at_3 value: 44.239 - type: mrr_at_5 value: 45.766 - type: ndcg_at_1 value: 37.5 - type: ndcg_at_10 value: 39.682 - type: ndcg_at_100 value: 46.127 - type: ndcg_at_1000 value: 48.994 - type: ndcg_at_3 value: 35.655 - type: ndcg_at_5 value: 37.036 - type: precision_at_1 value: 37.5 - type: precision_at_10 value: 11.08 - type: precision_at_100 value: 1.765 - type: precision_at_1000 value: 0.22999999999999998 - type: precision_at_3 value: 23.919999999999998 - type: precision_at_5 value: 17.809 - type: recall_at_1 value: 18.681 - type: recall_at_10 value: 47.548 - type: recall_at_100 value: 71.407 - type: recall_at_1000 value: 87.805 - type: recall_at_3 value: 32.979 - type: recall_at_5 value: 39.192 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 38.257999999999996 - type: map_at_10 value: 57.605 - type: map_at_100 value: 58.50300000000001 - type: map_at_1000 value: 58.568 - type: map_at_3 value: 54.172 - type: map_at_5 value: 56.323 - type: mrr_at_1 value: 76.51599999999999 - type: mrr_at_10 value: 82.584 - type: mrr_at_100 value: 82.78 - type: mrr_at_1000 value: 82.787 - type: mrr_at_3 value: 81.501 - type: mrr_at_5 value: 82.185 - type: ndcg_at_1 value: 76.51599999999999 - type: ndcg_at_10 value: 66.593 - type: ndcg_at_100 value: 69.699 - type: ndcg_at_1000 value: 70.953 - type: ndcg_at_3 value: 61.673 - type: ndcg_at_5 value: 64.42 - type: precision_at_1 value: 76.51599999999999 - type: precision_at_10 value: 13.857 - type: precision_at_100 value: 1.628 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 38.956 - type: precision_at_5 value: 25.541999999999998 - type: recall_at_1 value: 38.257999999999996 - type: recall_at_10 value: 69.284 - type: recall_at_100 value: 81.391 - type: recall_at_1000 value: 89.689 - type: recall_at_3 value: 58.433 - type: recall_at_5 value: 63.856 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 69.48679999999999 - type: ap value: 63.97638838971138 - type: 
f1 value: 69.22731638841675 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 20.916999999999998 - type: map_at_10 value: 32.929 - type: map_at_100 value: 34.1 - type: map_at_1000 value: 34.152 - type: map_at_3 value: 29.065 - type: map_at_5 value: 31.287 - type: mrr_at_1 value: 21.562 - type: mrr_at_10 value: 33.533 - type: mrr_at_100 value: 34.644000000000005 - type: mrr_at_1000 value: 34.69 - type: mrr_at_3 value: 29.735 - type: mrr_at_5 value: 31.928 - type: ndcg_at_1 value: 21.562 - type: ndcg_at_10 value: 39.788000000000004 - type: ndcg_at_100 value: 45.434999999999995 - type: ndcg_at_1000 value: 46.75 - type: ndcg_at_3 value: 31.942999999999998 - type: ndcg_at_5 value: 35.888 - type: precision_at_1 value: 21.562 - type: precision_at_10 value: 6.348 - type: precision_at_100 value: 0.918 - type: precision_at_1000 value: 0.10300000000000001 - type: precision_at_3 value: 13.682 - type: precision_at_5 value: 10.189 - type: recall_at_1 value: 20.916999999999998 - type: recall_at_10 value: 60.926 - type: recall_at_100 value: 87.03800000000001 - type: recall_at_1000 value: 97.085 - type: recall_at_3 value: 39.637 - type: recall_at_5 value: 49.069 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 90.93935248518011 - type: f1 value: 90.56439321844506 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 58.62517099863203 - type: f1 value: 40.69925681703197 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (eng) type: masakhane/masakhanews config: eng split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: accuracy value: 76.29746835443039 - type: f1 value: 75.31702672039506 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (eng) type: masakhane/masakhanews config: eng split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: v_measure value: 43.05495067062023 - type: v_measure value: 19.625272848173843 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 64.76126429051781 - type: f1 value: 62.60284261265268 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 70.05043712172159 - type: f1 value: 69.08340521169049 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 30.78969229005989 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 27.954325178520335 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 
metrics: - type: map value: 30.601827413968596 - type: mrr value: 31.515372019474196 - task: type: Retrieval dataset: name: MTEB NFCorpus type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 5.4559999999999995 - type: map_at_10 value: 12.039 - type: map_at_100 value: 14.804999999999998 - type: map_at_1000 value: 16.081 - type: map_at_3 value: 8.996 - type: map_at_5 value: 10.357 - type: mrr_at_1 value: 45.82 - type: mrr_at_10 value: 53.583999999999996 - type: mrr_at_100 value: 54.330999999999996 - type: mrr_at_1000 value: 54.366 - type: mrr_at_3 value: 52.166999999999994 - type: mrr_at_5 value: 52.971999999999994 - type: ndcg_at_1 value: 44.427 - type: ndcg_at_10 value: 32.536 - type: ndcg_at_100 value: 29.410999999999998 - type: ndcg_at_1000 value: 38.012 - type: ndcg_at_3 value: 38.674 - type: ndcg_at_5 value: 36.107 - type: precision_at_1 value: 45.82 - type: precision_at_10 value: 23.591 - type: precision_at_100 value: 7.35 - type: precision_at_1000 value: 1.9769999999999999 - type: precision_at_3 value: 36.016999999999996 - type: precision_at_5 value: 30.959999999999997 - type: recall_at_1 value: 5.4559999999999995 - type: recall_at_10 value: 15.387 - type: recall_at_100 value: 28.754999999999995 - type: recall_at_1000 value: 59.787 - type: recall_at_3 value: 10.137 - type: recall_at_5 value: 12.200999999999999 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 32.609 - type: map_at_10 value: 48.522 - type: map_at_100 value: 49.468 - type: map_at_1000 value: 49.497 - type: map_at_3 value: 44.327 - type: map_at_5 value: 46.937 - type: mrr_at_1 value: 36.616 - type: mrr_at_10 value: 50.943000000000005 - type: mrr_at_100 value: 51.626000000000005 - type: mrr_at_1000 value: 51.647 - type: mrr_at_3 value: 47.532999999999994 - type: mrr_at_5 value: 49.714000000000006 - type: ndcg_at_1 value: 36.586999999999996 - type: ndcg_at_10 value: 56.19499999999999 - type: ndcg_at_100 value: 60.014 - type: ndcg_at_1000 value: 60.707 - type: ndcg_at_3 value: 48.486000000000004 - type: ndcg_at_5 value: 52.791999999999994 - type: precision_at_1 value: 36.586999999999996 - type: precision_at_10 value: 9.139999999999999 - type: precision_at_100 value: 1.129 - type: precision_at_1000 value: 0.11900000000000001 - type: precision_at_3 value: 22.171 - type: precision_at_5 value: 15.787999999999998 - type: recall_at_1 value: 32.609 - type: recall_at_10 value: 77.011 - type: recall_at_100 value: 93.202 - type: recall_at_1000 value: 98.344 - type: recall_at_3 value: 57.286 - type: recall_at_5 value: 67.181 - task: type: Classification dataset: name: MTEB NewsClassification type: ag_news config: default split: test revision: eb185aade064a813bc0b7f42de02595523103ca4 metrics: - type: accuracy value: 77.4421052631579 - type: f1 value: 77.23976860913628 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (en) type: GEM/opusparcus config: en split: test revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cos_sim_accuracy value: 99.89816700610999 - type: cos_sim_ap value: 100 - type: cos_sim_f1 value: 99.9490575649516 - type: cos_sim_precision value: 100 - type: cos_sim_recall value: 99.89816700610999 - type: dot_accuracy value: 99.89816700610999 - type: dot_ap value: 100 - type: dot_f1 value: 99.9490575649516 - type: dot_precision value: 100 - type: dot_recall value: 99.89816700610999 - type: 
euclidean_accuracy value: 99.89816700610999 - type: euclidean_ap value: 100 - type: euclidean_f1 value: 99.9490575649516 - type: euclidean_precision value: 100 - type: euclidean_recall value: 99.89816700610999 - type: manhattan_accuracy value: 99.89816700610999 - type: manhattan_ap value: 100 - type: manhattan_f1 value: 99.9490575649516 - type: manhattan_precision value: 100 - type: manhattan_recall value: 99.89816700610999 - type: max_accuracy value: 99.89816700610999 - type: max_ap value: 100 - type: max_f1 value: 99.9490575649516 - task: type: PairClassification dataset: name: MTEB PawsX (en) type: paws-x config: en split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cos_sim_accuracy value: 61.25000000000001 - type: cos_sim_ap value: 59.23166242799505 - type: cos_sim_f1 value: 62.53016201309893 - type: cos_sim_precision value: 45.486459378134406 - type: cos_sim_recall value: 100 - type: dot_accuracy value: 61.25000000000001 - type: dot_ap value: 59.23109306756652 - type: dot_f1 value: 62.53016201309893 - type: dot_precision value: 45.486459378134406 - type: dot_recall value: 100 - type: euclidean_accuracy value: 61.25000000000001 - type: euclidean_ap value: 59.23166242799505 - type: euclidean_f1 value: 62.53016201309893 - type: euclidean_precision value: 45.486459378134406 - type: euclidean_recall value: 100 - type: manhattan_accuracy value: 61.25000000000001 - type: manhattan_ap value: 59.23015114712089 - type: manhattan_f1 value: 62.50861474844934 - type: manhattan_precision value: 45.46365914786967 - type: manhattan_recall value: 100 - type: max_accuracy value: 61.25000000000001 - type: max_ap value: 59.23166242799505 - type: max_f1 value: 62.53016201309893 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 metrics: - type: map_at_1 value: 69.919 - type: map_at_10 value: 83.636 - type: map_at_100 value: 84.27 - type: map_at_1000 value: 84.289 - type: map_at_3 value: 80.744 - type: map_at_5 value: 82.509 - type: mrr_at_1 value: 80.52 - type: mrr_at_10 value: 86.751 - type: mrr_at_100 value: 86.875 - type: mrr_at_1000 value: 86.876 - type: mrr_at_3 value: 85.798 - type: mrr_at_5 value: 86.414 - type: ndcg_at_1 value: 80.53 - type: ndcg_at_10 value: 87.465 - type: ndcg_at_100 value: 88.762 - type: ndcg_at_1000 value: 88.90599999999999 - type: ndcg_at_3 value: 84.634 - type: ndcg_at_5 value: 86.09400000000001 - type: precision_at_1 value: 80.53 - type: precision_at_10 value: 13.263 - type: precision_at_100 value: 1.517 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 36.973 - type: precision_at_5 value: 24.25 - type: recall_at_1 value: 69.919 - type: recall_at_10 value: 94.742 - type: recall_at_100 value: 99.221 - type: recall_at_1000 value: 99.917 - type: recall_at_3 value: 86.506 - type: recall_at_5 value: 90.736 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 50.47309147963901 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: v_measure value: 60.53779561923047 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88 metrics: - type: map_at_1 value: 
4.843 - type: map_at_10 value: 11.664 - type: map_at_100 value: 13.499 - type: map_at_1000 value: 13.771 - type: map_at_3 value: 8.602 - type: map_at_5 value: 10.164 - type: mrr_at_1 value: 23.9 - type: mrr_at_10 value: 34.018 - type: mrr_at_100 value: 35.099000000000004 - type: mrr_at_1000 value: 35.162 - type: mrr_at_3 value: 31.233 - type: mrr_at_5 value: 32.793 - type: ndcg_at_1 value: 23.9 - type: ndcg_at_10 value: 19.42 - type: ndcg_at_100 value: 26.715 - type: ndcg_at_1000 value: 31.776 - type: ndcg_at_3 value: 19.165 - type: ndcg_at_5 value: 16.46 - type: precision_at_1 value: 23.9 - type: precision_at_10 value: 9.82 - type: precision_at_100 value: 2.0340000000000003 - type: precision_at_1000 value: 0.325 - type: precision_at_3 value: 17.767 - type: precision_at_5 value: 14.24 - type: recall_at_1 value: 4.843 - type: recall_at_10 value: 19.895 - type: recall_at_100 value: 41.302 - type: recall_at_1000 value: 66.077 - type: recall_at_3 value: 10.803 - type: recall_at_5 value: 14.418000000000001 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: 20a6d6f312dd54037fe07a32d58e5e168867909d metrics: - type: cos_sim_pearson value: 76.94120735638143 - type: cos_sim_spearman value: 69.66114097154585 - type: euclidean_pearson value: 73.11242035696426 - type: euclidean_spearman value: 69.66114271982464 - type: manhattan_pearson value: 73.07993034858605 - type: manhattan_spearman value: 69.6457893357314 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 74.72893353272778 - type: cos_sim_spearman value: 68.78540928870311 - type: euclidean_pearson value: 71.13907970605574 - type: euclidean_spearman value: 68.78540928870311 - type: manhattan_pearson value: 71.02709590547859 - type: manhattan_spearman value: 68.71685896660532 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 79.30142652684971 - type: cos_sim_spearman value: 79.61879435615303 - type: euclidean_pearson value: 79.08730432883864 - type: euclidean_spearman value: 79.61879435615303 - type: manhattan_pearson value: 78.99621073156322 - type: manhattan_spearman value: 79.53806342308278 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 78.99585233036139 - type: cos_sim_spearman value: 75.57574519760183 - type: euclidean_pearson value: 77.33835658613162 - type: euclidean_spearman value: 75.57573873503655 - type: manhattan_pearson value: 77.12175044789362 - type: manhattan_spearman value: 75.41293517634836 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 83.9694268253376 - type: cos_sim_spearman value: 84.64256921939338 - type: euclidean_pearson value: 83.92322958711 - type: euclidean_spearman value: 84.64257976421872 - type: manhattan_pearson value: 83.93503107204337 - type: manhattan_spearman value: 84.63611608236032 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 81.09041419790253 - type: cos_sim_spearman value: 82.39869157752557 
- type: euclidean_pearson value: 82.04595698258301 - type: euclidean_spearman value: 82.39869157752557 - type: manhattan_pearson value: 81.97581168053004 - type: manhattan_spearman value: 82.34255320578193 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 86.35210432821825 - type: cos_sim_spearman value: 86.73200885328937 - type: euclidean_pearson value: 86.8527089168747 - type: euclidean_spearman value: 86.73200885328937 - type: manhattan_pearson value: 86.95671235295457 - type: manhattan_spearman value: 86.77713700838545 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 68.91106612960657 - type: cos_sim_spearman value: 69.48524490302286 - type: euclidean_pearson value: 70.51347841618035 - type: euclidean_spearman value: 69.48524490302286 - type: manhattan_pearson value: 70.31770181334245 - type: manhattan_spearman value: 69.12494700138238 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 81.54104342761988 - type: cos_sim_spearman value: 81.18789220331483 - type: euclidean_pearson value: 81.5895544590969 - type: euclidean_spearman value: 81.18789220331483 - type: manhattan_pearson value: 81.4738562449809 - type: manhattan_spearman value: 81.06565101416024 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (en) type: PhilipMay/stsb_multi_mt config: en split: test revision: 93d57ef91790589e3ce9c365164337a8a78b7632 metrics: - type: cos_sim_pearson value: 81.54104346197056 - type: cos_sim_spearman value: 81.18789220331483 - type: euclidean_pearson value: 81.58955451690102 - type: euclidean_spearman value: 81.18789220331483 - type: manhattan_pearson value: 81.47385630064072 - type: manhattan_spearman value: 81.06565101416024 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 79.34107964300796 - type: mrr value: 94.01917889662987 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 55.928 - type: map_at_10 value: 65.443 - type: map_at_100 value: 66.067 - type: map_at_1000 value: 66.091 - type: map_at_3 value: 62.629999999999995 - type: map_at_5 value: 64.35 - type: mrr_at_1 value: 59 - type: mrr_at_10 value: 66.845 - type: mrr_at_100 value: 67.31899999999999 - type: mrr_at_1000 value: 67.342 - type: mrr_at_3 value: 64.61099999999999 - type: mrr_at_5 value: 66.044 - type: ndcg_at_1 value: 59 - type: ndcg_at_10 value: 69.921 - type: ndcg_at_100 value: 72.365 - type: ndcg_at_1000 value: 73.055 - type: ndcg_at_3 value: 65.086 - type: ndcg_at_5 value: 67.62700000000001 - type: precision_at_1 value: 59 - type: precision_at_10 value: 9.3 - type: precision_at_100 value: 1.057 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 25.333 - type: precision_at_5 value: 16.866999999999997 - type: recall_at_1 value: 55.928 - type: recall_at_10 value: 82.289 - type: recall_at_100 value: 92.833 - type: recall_at_1000 value: 98.333 - type: recall_at_3 value: 69.172 
- type: recall_at_5 value: 75.628 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.81881188118813 - type: cos_sim_ap value: 95.2776439040401 - type: cos_sim_f1 value: 90.74355083459787 - type: cos_sim_precision value: 91.81166837256909 - type: cos_sim_recall value: 89.7 - type: dot_accuracy value: 99.81881188118813 - type: dot_ap value: 95.27764092100406 - type: dot_f1 value: 90.74355083459787 - type: dot_precision value: 91.81166837256909 - type: dot_recall value: 89.7 - type: euclidean_accuracy value: 99.81881188118813 - type: euclidean_ap value: 95.27764091101388 - type: euclidean_f1 value: 90.74355083459787 - type: euclidean_precision value: 91.81166837256909 - type: euclidean_recall value: 89.7 - type: manhattan_accuracy value: 99.82079207920792 - type: manhattan_ap value: 95.25081634689418 - type: manhattan_f1 value: 90.75114971895759 - type: manhattan_precision value: 92.78996865203762 - type: manhattan_recall value: 88.8 - type: max_accuracy value: 99.82079207920792 - type: max_ap value: 95.2776439040401 - type: max_f1 value: 90.75114971895759 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 60.69855369728728 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 33.98191834367251 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 50.156163330429614 - type: mrr value: 50.90145148968678 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 31.16938079808134 - type: cos_sim_spearman value: 31.74655874538245 - type: dot_pearson value: 31.169380299671705 - type: dot_spearman value: 31.74655874538245 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: bb9466bac8153a0349341eb1b22e06409e78ef4e metrics: - type: map_at_1 value: 0.252 - type: map_at_10 value: 2.009 - type: map_at_100 value: 11.611 - type: map_at_1000 value: 27.811999999999998 - type: map_at_3 value: 0.685 - type: map_at_5 value: 1.08 - type: mrr_at_1 value: 94 - type: mrr_at_10 value: 97 - type: mrr_at_100 value: 97 - type: mrr_at_1000 value: 97 - type: mrr_at_3 value: 97 - type: mrr_at_5 value: 97 - type: ndcg_at_1 value: 88 - type: ndcg_at_10 value: 81.388 - type: ndcg_at_100 value: 60.629 - type: ndcg_at_1000 value: 52.38 - type: ndcg_at_3 value: 86.827 - type: ndcg_at_5 value: 84.597 - type: precision_at_1 value: 94 - type: precision_at_10 value: 85.8 - type: precision_at_100 value: 62.419999999999995 - type: precision_at_1000 value: 23.31 - type: precision_at_3 value: 90.667 - type: precision_at_5 value: 88.4 - type: recall_at_1 value: 0.252 - type: recall_at_10 value: 2.164 - type: recall_at_100 value: 14.613999999999999 - type: recall_at_1000 value: 48.730000000000004 - type: recall_at_3 value: 0.7020000000000001 
- type: recall_at_5 value: 1.122 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 3.476 - type: map_at_10 value: 13.442000000000002 - type: map_at_100 value: 20.618 - type: map_at_1000 value: 22.175 - type: map_at_3 value: 6.968000000000001 - type: map_at_5 value: 9.214 - type: mrr_at_1 value: 44.897999999999996 - type: mrr_at_10 value: 56.77100000000001 - type: mrr_at_100 value: 57.226 - type: mrr_at_1000 value: 57.226 - type: mrr_at_3 value: 52.381 - type: mrr_at_5 value: 54.523999999999994 - type: ndcg_at_1 value: 42.857 - type: ndcg_at_10 value: 32.507999999999996 - type: ndcg_at_100 value: 43.614000000000004 - type: ndcg_at_1000 value: 53.82 - type: ndcg_at_3 value: 36.818 - type: ndcg_at_5 value: 33.346 - type: precision_at_1 value: 44.897999999999996 - type: precision_at_10 value: 28.571 - type: precision_at_100 value: 8.652999999999999 - type: precision_at_1000 value: 1.5709999999999997 - type: precision_at_3 value: 38.095 - type: precision_at_5 value: 32.245000000000005 - type: recall_at_1 value: 3.476 - type: recall_at_10 value: 20.827 - type: recall_at_100 value: 53.04299999999999 - type: recall_at_1000 value: 84.221 - type: recall_at_3 value: 8.200000000000001 - type: recall_at_5 value: 11.651 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 61.96360000000001 - type: ap value: 11.256160324436445 - type: f1 value: 48.07712827691349 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 58.90492359932088 - type: f1 value: 59.12542417513503 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 38.284935353315355 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 83.4714192048638 - type: cos_sim_ap value: 65.77588263185375 - type: cos_sim_f1 value: 62.459508098380326 - type: cos_sim_precision value: 57.27172717271727 - type: cos_sim_recall value: 68.68073878627968 - type: dot_accuracy value: 83.4714192048638 - type: dot_ap value: 65.77588818364636 - type: dot_f1 value: 62.459508098380326 - type: dot_precision value: 57.27172717271727 - type: dot_recall value: 68.68073878627968 - type: euclidean_accuracy value: 83.4714192048638 - type: euclidean_ap value: 65.77587693431595 - type: euclidean_f1 value: 62.459508098380326 - type: euclidean_precision value: 57.27172717271727 - type: euclidean_recall value: 68.68073878627968 - type: manhattan_accuracy value: 83.47737974608094 - type: manhattan_ap value: 65.65957745829654 - type: manhattan_f1 value: 62.22760290556902 - type: manhattan_precision value: 57.494407158836694 - type: manhattan_recall value: 67.81002638522428 - type: max_accuracy value: 83.47737974608094 - type: max_ap value: 65.77588818364636 - type: max_f1 value: 62.459508098380326 - task: type: PairClassification dataset: name: 
MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.64244964489463 - type: cos_sim_ap value: 85.154122301394 - type: cos_sim_f1 value: 77.45617911327146 - type: cos_sim_precision value: 74.23066064370413 - type: cos_sim_recall value: 80.97474591931014 - type: dot_accuracy value: 88.64244964489463 - type: dot_ap value: 85.15411965587543 - type: dot_f1 value: 77.45617911327146 - type: dot_precision value: 74.23066064370413 - type: dot_recall value: 80.97474591931014 - type: euclidean_accuracy value: 88.64244964489463 - type: euclidean_ap value: 85.15414684113986 - type: euclidean_f1 value: 77.45617911327146 - type: euclidean_precision value: 74.23066064370413 - type: euclidean_recall value: 80.97474591931014 - type: manhattan_accuracy value: 88.57841425078588 - type: manhattan_ap value: 85.12472268567576 - type: manhattan_f1 value: 77.39497339937627 - type: manhattan_precision value: 73.92584285413892 - type: manhattan_recall value: 81.20572836464429 - type: max_accuracy value: 88.64244964489463 - type: max_ap value: 85.15414684113986 - type: max_f1 value: 77.45617911327146 - task: type: Clustering dataset: name: MTEB WikiCitiesClustering type: jinaai/cities_wiki_clustering config: default split: test revision: ddc9ee9242fa65332597f70e967ecc38b9d734fa metrics: - type: v_measure value: 79.58576208710117
---

<h1 align="center">Snowflake's Arctic-embed-s</h1>
<h4 align="center">
   <p>
       <a href=#news>News</a> |
       <a href=#models>Models</a> |
       <a href=#usage>Usage</a> |
       <a href="#evaluation">Evaluation</a> |
       <a href="#contact">Contact</a> |
       <a href="#faq">FAQ</a> |
       <a href="#license">License</a> |
       <a href="#acknowledgement">Acknowledgement</a>
   <p>
</h4>

## News

12/04/2024: Release of [snowflake-arctic-embed-l-v2.0](https://huggingface.co/Snowflake/snowflake-arctic-embed-l-v2.0) and [snowflake-arctic-embed-m-v2.0](https://huggingface.co/Snowflake/snowflake-arctic-embed-m-v2.0), our newest models built with multilingual workloads in mind. These models outperform prior versions of Arctic Embed, and we suggest they replace those prior versions!

07/26/2024: Release of the preprint [[2407.18887] Embedding And Clustering Your Data Can Improve Contrastive Pretraining](https://arxiv.org/abs/2407.18887) on arXiv.

07/18/2024: Release of `snowflake-arctic-embed-m-v1.5`, capable of producing highly compressible embedding vectors that preserve quality even when squished as small as 128 bytes per vector. Details about the development of this model are available in the [launch post on the Snowflake engineering blog](https://www.snowflake.com/engineering-blog/arctic-embed-m-v1-5-enterprise-retrieval/).

05/10/2024: Release of the [technical report on Arctic Embed](https://arxiv.org/abs/2405.05374).

04/16/2024: Release of the **snowflake-arctic-embed** family of text embedding models. The releases are state-of-the-art for retrieval quality at each of their representative size profiles. [Technical Report]() is coming shortly. For more details, please refer to our Github: [Arctic-Text-Embed](https://github.com/Snowflake-Labs/arctic-embed).

## Models

snowflake-arctic-embed is a suite of text embedding models focused on high-quality retrieval. The `snowflake-arctic-embed` models achieve **state-of-the-art performance on the MTEB/BEIR leaderboard** for each of their size variants. Evaluation is performed using these [scripts](https://github.com/Snowflake-Labs/snowflake-arctic-embed/tree/main/src). As shown in the tables below, each model size achieves SOTA retrieval accuracy compared to other top models.

The models are built on existing open-source text representation models, such as bert-base-uncased, and are trained in a multi-stage pipeline to optimize retrieval performance. First, the models are trained with large batches of query-document pairs in which negatives are derived in-batch; this pretraining stage leverages about 400m samples from a mix of public datasets and proprietary web search data. Following pretraining, the models are further optimized with long training on a smaller dataset (about 1m samples) of query, positive-document, and negative-document triplets derived from hard negative mining. Mining of the negatives and data curation are crucial to retrieval accuracy. A detailed technical report can be found [here](https://arxiv.org/abs/2405.05374).
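To make the first stage concrete, the sketch below shows the general shape of contrastive training with in-batch negatives. It is a minimal illustration only: the embedding dimension, batch size, and temperature are assumptions for the example, not the actual Arctic Embed training code.

```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(query_emb: torch.Tensor,
                              doc_emb: torch.Tensor,
                              temperature: float = 0.02) -> torch.Tensor:
    """InfoNCE-style loss: the i-th document is the positive for the
    i-th query, and every other document in the batch is a negative."""
    q = F.normalize(query_emb, p=2, dim=1)  # cosine similarity via
    d = F.normalize(doc_emb, p=2, dim=1)    # normalized dot products
    scores = q @ d.T / temperature          # (batch, batch) similarity matrix
    labels = torch.arange(q.size(0))        # positives sit on the diagonal
    return F.cross_entropy(scores, labels)

# Toy usage with random tensors standing in for encoder outputs.
queries = torch.randn(8, 384)
documents = torch.randn(8, 384)
print(in_batch_contrastive_loss(queries, documents))
```

The appeal of this setup is that a batch of N pairs yields N-1 free negatives per query, which is why large batches matter during pretraining.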
| Name | MTEB Retrieval Score (NDCG @ 10) | Parameters (Millions) | Embedding Dimension |
| --- | --- | --- | --- |
| [snowflake-arctic-embed-xs](https://huggingface.co/Snowflake/snowflake-arctic-embed-xs/) | 50.15 | 22 | 384 |
| [snowflake-arctic-embed-s](https://huggingface.co/Snowflake/snowflake-arctic-embed-s/) | 51.98 | 33 | 384 |
| [snowflake-arctic-embed-m](https://huggingface.co/Snowflake/snowflake-arctic-embed-m/) | 54.90 | 110 | 768 |
| [snowflake-arctic-embed-m-long](https://huggingface.co/Snowflake/snowflake-arctic-embed-m-long/) | 54.83 | 137 | 768 |
| [snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l/) | 55.98 | 335 | 1024 |

Aside from being great open-source models, the largest model, [snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l/), can serve as a natural replacement for closed-source embedding APIs, as shown below.

| Model Name | MTEB Retrieval Score (NDCG @ 10) |
| --- | --- |
| [snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l/) | 55.98 |
| Google-gecko-text-embedding | 55.7 |
| text-embedding-3-large | 55.44 |
| Cohere-embed-english-v3.0 | 55.00 |
| bge-large-en-v1.5 | 54.29 |
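All retrieval scores in these tables are NDCG@10. As a reminder of what that metric measures, here is a small self-contained sketch; the relevance judgments are invented for illustration, and this is not the MTEB evaluation code.

```python
import math

def ndcg_at_k(relevances, k=10):
    """NDCG@k for a single query: DCG of the system's ranking divided by
    the DCG of the ideal ranking (relevances sorted in descending order)."""
    def dcg(rels):
        return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# One relevant document ranked second out of ten retrieved.
print(round(ndcg_at_k([0, 1, 0, 0, 0, 0, 0, 0, 0, 0]), 4))  # 0.6309
```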
### [snowflake-arctic-embed-xs](https://huggingface.co/Snowflake/snowflake-arctic-embed-xs)

This tiny model packs quite the punch. Based on the [all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) model with only 22m parameters and 384 dimensions, this model should meet even the strictest latency/TCO budgets. Despite its size, its retrieval accuracy is closer to that of models with 100m parameters.

| Model Name | MTEB Retrieval Score (NDCG @ 10) |
| --- | --- |
| [snowflake-arctic-embed-xs](https://huggingface.co/Snowflake/snowflake-arctic-embed-xs/) | 50.15 |
| GIST-all-MiniLM-L6-v2 | 45.12 |
| gte-tiny | 44.92 |
| all-MiniLM-L6-v2 | 41.95 |
| bge-micro-v2 | 42.56 |

### [snowflake-arctic-embed-s](https://huggingface.co/Snowflake/snowflake-arctic-embed-s)

Based on the [intfloat/e5-small-unsupervised](https://huggingface.co/intfloat/e5-small-unsupervised) model, this small model does not trade off retrieval accuracy for its small size. With only 33m parameters and 384 dimensions, this model should easily allow scaling to large datasets.

| Model Name | MTEB Retrieval Score (NDCG @ 10) |
| --- | --- |
| [snowflake-arctic-embed-s](https://huggingface.co/Snowflake/snowflake-arctic-embed-s/) | 51.98 |
| bge-small-en-v1.5 | 51.68 |
| Cohere-embed-english-light-v3.0 | 51.34 |
| text-embedding-3-small | 51.08 |
| e5-small-v2 | 49.04 |

### [snowflake-arctic-embed-m](https://huggingface.co/Snowflake/snowflake-arctic-embed-m/)

Based on the [intfloat/e5-base-unsupervised](https://huggingface.co/intfloat/e5-base-unsupervised) model, this medium model is the workhorse that provides the best retrieval performance without slowing down inference.

| Model Name | MTEB Retrieval Score (NDCG @ 10) |
| --- | --- |
| [snowflake-arctic-embed-m](https://huggingface.co/Snowflake/snowflake-arctic-embed-m/) | 54.90 |
| bge-base-en-v1.5 | 53.25 |
| nomic-embed-text-v1.5 | 53.25 |
| GIST-Embedding-v0 | 52.31 |
| gte-base | 52.31 |

### [snowflake-arctic-embed-m-long](https://huggingface.co/Snowflake/snowflake-arctic-embed-m-long/)

Based on the [nomic-ai/nomic-embed-text-v1-unsupervised](https://huggingface.co/nomic-ai/nomic-embed-text-v1-unsupervised) model, this long-context variant of our medium-sized model is perfect for workloads constrained by the regular 512-token context of our other models. Without the use of RPE, this model supports up to 2048 tokens. With RPE, it can scale to 8192!

| Model Name | MTEB Retrieval Score (NDCG @ 10) |
| --- | --- |
| [snowflake-arctic-embed-m-long](https://huggingface.co/Snowflake/snowflake-arctic-embed-m-long/) | 54.83 |
| nomic-embed-text-v1.5 | 53.01 |
| nomic-embed-text-v1 | 52.81 |

### [snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l/)

Based on the [intfloat/e5-large-unsupervised](https://huggingface.co/intfloat/e5-large-unsupervised) model, this large model is a direct drop-in for closed APIs and delivers the most accurate retrieval experience.

| Model Name | MTEB Retrieval Score (NDCG @ 10) |
| --- | --- |
| [snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l/) | 55.98 |
| UAE-Large-V1 | 54.66 |
| bge-large-en-v1.5 | 54.29 |
| mxbai-embed-large-v1 | 54.39 |
| e5-Large-v2 | 50.56 |

## Usage

### Using Sentence Transformers

You can use the sentence-transformers package to use a snowflake-arctic-embed model, as shown below.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-s")

queries = ['what is snowflake?', 'Where can I get the best tacos?']
documents = ['The Data Cloud!', 'Mexico City of Course!']

query_embeddings = model.encode(queries, prompt_name="query")
document_embeddings = model.encode(documents)

scores = query_embeddings @ document_embeddings.T
for query, query_scores in zip(queries, scores):
    doc_score_pairs = list(zip(documents, query_scores))
    doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)
    # Output passages & scores
    print("Query:", query)
    for document, score in doc_score_pairs:
        print(score, document)
```

```
Query: what is snowflake?
0.533809 The Data Cloud!
0.49207097 Mexico City of Course!
Query: Where can I get the best tacos?
0.56592476 Mexico City of Course!
0.48255116 The Data Cloud!
```

### Using Huggingface transformers

You can use the transformers package to use a snowflake-arctic-embed model, as shown below. For optimal retrieval quality, use the CLS token to embed each text portion and use the query prefix below (just on the query).

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('Snowflake/snowflake-arctic-embed-s')
model = AutoModel.from_pretrained('Snowflake/snowflake-arctic-embed-s', add_pooling_layer=False)
model.eval()

query_prefix = 'Represent this sentence for searching relevant passages: '
queries = ['what is snowflake?', 'Where can I get the best tacos?']
queries_with_prefix = ["{}{}".format(query_prefix, i) for i in queries]
query_tokens = tokenizer(queries_with_prefix, padding=True, truncation=True, return_tensors='pt', max_length=512)

documents = ['The Data Cloud!', 'Mexico City of Course!']
document_tokens = tokenizer(documents, padding=True, truncation=True, return_tensors='pt', max_length=512)

# Compute token embeddings (the CLS token is the first position)
with torch.no_grad():
    query_embeddings = model(**query_tokens)[0][:, 0]
    document_embeddings = model(**document_tokens)[0][:, 0]

# Normalize embeddings
query_embeddings = torch.nn.functional.normalize(query_embeddings, p=2, dim=1)
document_embeddings = torch.nn.functional.normalize(document_embeddings, p=2, dim=1)

scores = torch.mm(query_embeddings, document_embeddings.transpose(0, 1))
for query, query_scores in zip(queries, scores):
    doc_score_pairs = list(zip(documents, query_scores))
    doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)
    # Output passages & scores
    print("Query:", query)
    for document, score in doc_score_pairs:
        print(score, document)
```

### Using Transformers.js

If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@xenova/transformers) by running:

```bash
npm i @xenova/transformers
```

You can then use the model to compute embeddings as follows:

```js
import { pipeline, dot } from '@xenova/transformers';

// Create feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'Snowflake/snowflake-arctic-embed-s', {
    quantized: false, // Comment out this line to use the quantized version
});

// Generate sentence embeddings
const sentences = [
    'Represent this sentence for searching relevant passages: Where can I get the best tacos?',
    'The Data Cloud!',
    'Mexico City of Course!',
];
const output = await extractor(sentences, { normalize: true, pooling: 'cls' });

// Compute similarity scores
const [source_embeddings, ...document_embeddings] = output.tolist();
const similarities = document_embeddings.map(x => dot(source_embeddings, x));
console.log(similarities); // [0.48255123876493394, 0.5659250100112143]
```

## FAQ

TBD

## Contact

Feel free to open an issue or pull request if you have any questions or suggestions about this project. You can also email Daniel Campos ([email protected]).

## License

Arctic is licensed under the [Apache-2](https://www.apache.org/licenses/LICENSE-2.0). The released models can be used for commercial purposes free of charge.

## Acknowledgement

We want to thank the open-source community, which has provided the great building blocks upon which we could make our models.
We thank our modeling engineers, Danmei Xu, Luke Merrick, Gaurav Nuti, and Daniel Campos, for making these great models possible. We thank our leadership, Himabindu Pucha, Kelvin So, Vivek Raghunathan, and Sridhar Ramaswamy, for supporting this work. We also thank the open-source community for producing the great models we could build on top of, which made these releases possible. Finally, we thank the researchers who created the BEIR and MTEB benchmarks. It is largely thanks to their tireless work defining what better looks like that we could improve model performance.

<img referrerpolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=26ca7731-2650-44be-942d-0c6809cfcf00" />
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
himanshu23099/bge_embedding_finetune_v3
himanshu23099
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:3507", "loss:GISTEmbedLoss", "arxiv:1908.10084", "arxiv:2402.16829", "base_model:BAAI/bge-small-en-v1.5", "base_model:finetune:BAAI/bge-small-en-v1.5", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,731
1,732
12
0
--- base_model: BAAI/bge-small-en-v1.5 library_name: sentence-transformers metrics: - cosine_accuracy@1 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@5 - cosine_ndcg@10 - cosine_ndcg@100 - cosine_mrr@5 - cosine_mrr@10 - cosine_mrr@100 - cosine_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:3507 - loss:GISTEmbedLoss widget: - source_sentence: Is there an option to use ride-sharing apps like Ola or Uber for travel from the Airport to the Mela? sentences: - "Are there towing services available if my vehicle breaks down in the parking\ \ lot?\n Yes, towing services are available if your vehicle breaks down in the\ \ parking lot." - No, ride-sharing options like Ola or Uber are not available for travel from the Airport to the Mela. Pilgrims are encouraged to use other transport options like taxis, buses, or dedicated shuttle services provided for the event. - Baking bread requires certain key ingredients to achieve a perfect texture. Flour, water, and yeast are the base, while salt enhances flavor. The dough should be kneaded until smooth, then allowed to rise in a warm area. After a proper rise, shaping the loaf is essential for even baking in the oven. - source_sentence: What is the significance of Akshaywat? sentences: - Akshaywat, or the "immortal banyan tree," is a spiritually significant site in Prayagraj, especially during the Kumbh Mela. Symbolizing immortality and eternal life, the tree is believed to possess divine qualities that remain unaffected by creation and destruction cycles. Mythologically, it is associated with Lord Brahma, who is said to have performed a sacrificial ritual under it, and Lord Vishnu, who is believed to have blessed devotees there. Akshaywat is also a sacred spot for performing Pind Daan, rituals for deceased ancestors, thought to help achieve Moksha (liberation). As a center of spiritual wisdom and pilgrimage for thousands of years, it continues to be a powerful symbol of divine blessings and spiritual strength for Hindu devotees. - 'What are the must-visit spiritual sites near Sangam? The Sangam area, where the Ganga, Yamuna, and the mystical Saraswati rivers converge, is surrounded by revered spiritual sites: Bade Hanumanji Temple:Bade Hanumanji Temple, also known as Lete Hanuman Mandir, is a unique and revered Hindu shrine located near the Sangam in Prayagraj. This temple is distinctive for its reclining idol of Lord Hanuman, a one-of-a-kind depiction of the deity. Each year, during the monsoon floods, the Ganga river rises to gently wash over the feet of Lord Hanuman—a sacred ritual believed to be a divine blessing Patalpuri Temple and Akshayavat Tree: Located within the Allahabad Fort, the ancient Patalpuri Temple is known for the Akshayavat (Indestructible Banyan Tree), considered sacred and a symbol of immortality. Mankameshwar Temple: A dedicated Shiva temple located near the Sangam, known for its serene atmosphere and the belief that prayers here fulfill desires.' - The uniqueness of brightly colored seashells lies in their mesmerizing patterns. Found along coastlines worldwide, these intricate formations tell stories of marine life and geological processes. Each shell serves as a protective covering, shielding the delicate organisms within from predators and environmental threats. 
Fishermen and beachcombers alike often treasure these natural artifacts, using them for decoration or as tools in crafting. The vibrant hues seen in shells, ranging from deep blues to vivid oranges, result from pigments produced by the mollusks themselves, influenced by their habitat and diet. Collecting seashells can foster a deep appreciation for marine ecosystems and the roles different species play within them, reminding us of the intricate balance of nature. - source_sentence: Allahabad Junction ka matlab sentences: - 'Where is Anand Bhavan Museum located? Anand Bhawan is located on Jawaharlal Nehru Road, about 5 km from Allahabad Junction Railway Station, Prayagraj, Uttar Pradesh.' - Aartis are performed both in the mornings and evenings on the riverbanks in Prayagraj to honor the divine presence of the sacred rivers—Ganga, Yamuna, and mythical Saraswati—and to seek their blessings. \n The morning Aarti symbolizes the beginning of a new day, invoking the divine to bestow grace, protection, and spiritual strength upon the devotees. \n The evening Aarti serves as a ritual of gratitude, marking the end of the day by thanking the deities for their blessings and guidance. - 'Where is Khusro Bagh located? The garden is located approximately 3 km from Allahabad Junction Railway Station, making it easily accessible by local transport. The address is near the Lukarganj area, Allahabad, Uttar Pradesh.' - source_sentence: Do E-Rickshaws have a maximum passenger limit, and what is it? sentences: - The ancient art of glassblowing dates back thousands of years. This intricate craft requires skill and precision, resulting in beautiful works that can be functional or decorative. From vases to intricate sculptures, the possibilities are endless. - E-Rickshaws have a maximum passenger limit of 4 people. It is important not to exceed this limit to ensure safety. - No, shuttle buses will not have dedicated volunteers specifically, but for assistance, you can reach out to the nearest information center. - source_sentence: Tourists visit reason sentences: - 'What attractions are closest to the city center? Near the city center, you’ll find several attractions within a short distance. Anand Bhavan and Swaraj Bhavan are centrally located and offer insights into the Nehru family and India’s freedom movement. All Saints’ Cathedral, a magnificent Gothic-style church also known as the “Patthar Girja,” is located in Civil Lines and is one of Prayagraj''s architectural gems. Company Bagh, a peaceful park, is also close by and ideal for a quiet stroll. Chandrashekhar Azad Park and Khusro Bagh are both centrally located as well, providing green spaces along with historical importance.' - "When and where was the last Kumbh held?\n The last Mahakumbh was held in Haridwar\ \ in 2021." - 'What is All Saints Cathedral, and why is it architecturally significant? All Saints Cathedral, locally known as Patthar Girja (Stone Church), is a renowned Anglican Christian Church located on M.G. Marg, Allahabad. Built in the late 19th century, it is one of the most beautiful and architecturally significant churches in Uttar Pradesh, attracting both tourists and pilgrims.' 
model-index: - name: SentenceTransformer based on BAAI/bge-small-en-v1.5 results: - task: type: information-retrieval name: Information Retrieval dataset: name: val evaluator type: val_evaluator metrics: - type: cosine_accuracy@1 value: 0.3580387685290764 name: Cosine Accuracy@1 - type: cosine_accuracy@5 value: 0.7092360319270239 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.7993158494868872 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.3580387685290764 name: Cosine Precision@1 - type: cosine_precision@5 value: 0.14184720638540477 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.07993158494868871 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.3580387685290764 name: Cosine Recall@1 - type: cosine_recall@5 value: 0.7092360319270239 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.7993158494868872 name: Cosine Recall@10 - type: cosine_ndcg@5 value: 0.5538539564761136 name: Cosine Ndcg@5 - type: cosine_ndcg@10 value: 0.5832174788373438 name: Cosine Ndcg@10 - type: cosine_ndcg@100 value: 0.6189539076148961 name: Cosine Ndcg@100 - type: cosine_mrr@5 value: 0.5013492968453055 name: Cosine Mrr@5 - type: cosine_mrr@10 value: 0.5136020162530992 name: Cosine Mrr@10 - type: cosine_mrr@100 value: 0.5210085507064763 name: Cosine Mrr@100 - type: cosine_map@100 value: 0.5210085507064769 name: Cosine Map@100 --- # SentenceTransformer based on BAAI/bge-small-en-v1.5 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) <!-- at revision 5c38ec7c405ec4b44b94cc5a9bb96e735b38267a --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 384 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("himanshu23099/bge_embedding_finetune_v3")

# Run inference
sentences = [
    'Tourists visit reason',
    'What is All Saints Cathedral, and why is it architecturally significant?\nAll Saints Cathedral, locally known as Patthar Girja (Stone Church), is a renowned Anglican Christian Church located on M.G. Marg, Allahabad. Built in the late 19th century, it is one of the most beautiful and architecturally significant churches in Uttar Pradesh, attracting both tourists and pilgrims.',
    "What attractions are closest to the city center?\nNear the city center, you’ll find several attractions within a short distance. Anand Bhavan and Swaraj Bhavan are centrally located and offer insights into the Nehru family and India’s freedom movement. All Saints’ Cathedral, a magnificent Gothic-style church also known as the “Patthar Girja,” is located in Civil Lines and is one of Prayagraj's architectural gems. Company Bagh, a peaceful park, is also close by and ideal for a quiet stroll. Chandrashekhar Azad Park and Khusro Bagh are both centrally located as well, providing green spaces along with historical importance.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> -->

<!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> -->

<!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* -->

## Evaluation

### Metrics

#### Information Retrieval

* Dataset: `val_evaluator`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric | Value |
|:--------------------|:----------|
| cosine_accuracy@1 | 0.358 |
| cosine_accuracy@5 | 0.7092 |
| cosine_accuracy@10 | 0.7993 |
| cosine_precision@1 | 0.358 |
| cosine_precision@5 | 0.1418 |
| cosine_precision@10 | 0.0799 |
| cosine_recall@1 | 0.358 |
| cosine_recall@5 | 0.7092 |
| cosine_recall@10 | 0.7993 |
| cosine_ndcg@5 | 0.5539 |
| cosine_ndcg@10 | 0.5832 |
| **cosine_ndcg@100** | **0.619** |
| cosine_mrr@5 | 0.5013 |
| cosine_mrr@10 | 0.5136 |
| cosine_mrr@100 | 0.521 |
| cosine_map@100 | 0.521 |
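The table above was produced with sentence-transformers' `InformationRetrievalEvaluator`. As a hedged illustration of how such an evaluator is wired up, here is a minimal sketch; the toy queries, corpus, and relevance labels are invented for the example, while the real run used the 877-sample evaluation split described under Training Details.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("himanshu23099/bge_embedding_finetune_v3")

# Toy example: ids mapped to texts, plus the set of relevant corpus ids per query.
queries = {"q1": "How many rules must a Kalpvasi observe?"}
corpus = {
    "d1": "A Kalpvasi must observe 21 rules during Kalpvas.",
    "d2": "The dancing colors of autumn leaves invite every eye.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="val_evaluator")
results = evaluator(model)
print(results)  # dict of accuracy@k, precision@k, recall@k, NDCG, MRR, MAP
```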
<!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* -->

<!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* -->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 3,507 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative |
|:--------|:---|:---|:---|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 11.76 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 116.82 tokens</li><li>max: 504 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 121.15 tokens</li><li>max: 424 tokens</li></ul> |

* Samples:

| anchor | positive | negative |
|:---|:---|:---|
| <code>Where are the shuttle bus pickup points located within the Kumbh Mela grounds?</code> | <code>No, shuttle buses will not have dedicated volunteers specifically, but for assistance, you can reach out to the nearest information center.</code> | <code>The ancient art of weaving has captivated many cultures worldwide. In some regions, artisans use intricate patterns to tell stories, while others focus on vibrant colors that highlight their heritage. Experimentation with different materials can yield unique textures, adding depth to the final product. Workshops often provide insights into traditional techniques, ensuring these skills are passed down through generations.</code> |
| <code>Hotel Ilawart start place</code> | <code>Is hotel pickup and drop-off available for the tours?<br> Fixed pickup points, such as Hotel Ilawart, are provided for all tours. In some cases, pickup and drop-off can be arranged for locations within a 5 km radius of the starting point, but you must confirm this with the tour operator at the time of booking.</code> | <code>What all is included in the trip package?<br>The trip package typically includes transportation, tour guide services, and breakfast. Meals such as lunch and dinner can be purchased separately.
Hotel bookings are usually not included in the package, so you will need to arrange accommodation independently.</code> | | <code>Are there food stalls or restaurants at the Railway Junction that cater to dietary restrictions for pilgrims?</code> | <code>Yes, there are food stalls and restaurants available at the Railway Junction that cater to various dietary needs, including vegetarian and other dietary restrictions suitable for pilgrims.</code> | <code>The sound of the ocean waves rhythmically crashing against the shore creates a soothing symphony that invites relaxation. Seagulls soar above, occasionally diving down to catch a glimpse of fish beneath the surface. Beachgoers spread out their colorful towels, soaking up the sun's golden rays while children build sandcastles, their laughter mingling with the salty breeze. A distant sailboat glides across the horizon, hinting at adventures beyond the vast expanse of blue. As the sun sets, the sky transforms into a canvas of vibrant hues, signaling the end of another beautiful day by the sea.</code> | * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.01} ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 877 evaluation samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 877 samples: | | anchor | positive | negative | |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | string | | details | <ul><li>min: 5 tokens</li><li>mean: 12.21 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 115.93 tokens</li><li>max: 471 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 118.09 tokens</li><li>max: 422 tokens</li></ul> | * Samples: | anchor | positive | negative | 
|:-----------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>Ganga bath benefit</code> | <code>What is the ritual of Snan or bathing?<br> Taking bath at the confluence of Ganga, Yamuna and invisible Saraswati during Mahakumbh has special significance. It is believed that by bathing in this holy confluence, all the sins of a person are washed away and he attains salvation.<br> <br> Bathing not only symbolizes personal purification, but it also conveys the message of social harmony and unity, where people from different cultures and communities come together to participate in this sacred ritual.<br> <br> It is considered that in special circumstances, the water of rivers also acquires a special life-giving quality, i.e. nectar, which not only leads to spiritual development along with purification of the mind, but also gives physical benefits by getting health. <br> List of Aliases: [['Snan', 'bathing'], ]</code> | <code>What benefits will I get by attending the Kumbh Mela?<br>It is believed that bathing in the holy rivers during this time washes away sins and grants liberation from the cycle of life and death.<br> <br> Attending the Kumbh and taking a dip in the sacred rivers provides a unique opportunity for spiritual growth, purification, and selfrealization. ✨</code> | | <code>Guide provide what</code> | <code>What is the guide-to-participant ratio for each tour?<br> Each tour is led by one guide per group, ensuring a personalized experience with ample opportunity for detailed insights and engagement. The guide will provide context, historical background, and answer any questions during the tour, offering a rich, informative experience for participants.</code> | <code>How many people can join a group tour?<br>Group sizes depend on the type of vehicle selected. For instance, a Dzire accommodates up to 4 people, an Innova is suitable for 5-6 people, and larger groups (minimum 10 people) can travel in a Tempo Traveller. 
For even larger groups, multiple vehicles can be arranged to ensure everyone can travel together comfortably.</code> | | <code>How many rules must a Kalpvasi observe?</code> | <code>A Kalpvasi must observe 21 rules during Kalpvas, involving disciplines of the mind, speech, and actions.</code> | <code>The dancing colors of autumn leaves create a tapestry of nature’s beauty, inviting every eye to witness the grandeur of the changing seasons. Every gust of wind carries a whisper of nostalgia as trees shed their vibrant garments.</code> | * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.01} ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 16 - `gradient_accumulation_steps`: 2 - `learning_rate`: 1e-05 - `weight_decay`: 0.01 - `num_train_epochs`: 30 - `warmup_ratio`: 0.1 - `load_best_model_at_end`: True #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 8 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 2 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 1e-05 - `weight_decay`: 0.01 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 30 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: 
False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: batch_sampler - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs <details><summary>Click to expand</summary> | Epoch | Step | Training Loss | Validation Loss | val_evaluator_cosine_ndcg@100 | |:-----------:|:--------:|:-------------:|:---------------:|:-----------------------------:| | 0.0909 | 10 | - | 1.0916 | 0.4285 | | 0.1818 | 20 | - | 1.0683 | 0.4295 | | 0.2727 | 30 | - | 1.0320 | 0.4301 | | 0.3636 | 40 | - | 0.9845 | 0.4309 | | 0.4545 | 50 | 1.8466 | 0.9320 | 0.4340 | | 0.5455 | 60 | - | 0.8804 | 0.4352 | | 0.6364 | 70 | - | 0.8284 | 0.4368 | | 0.7273 | 80 | - | 0.7754 | 0.4420 | | 0.8182 | 90 | - | 0.7211 | 0.4425 | | 0.9091 | 100 | 1.4317 | 0.6711 | 0.4442 | | 1.0 | 110 | - | 0.6193 | 0.4483 | | 1.0909 | 120 | - | 0.5700 | 0.4555 | | 1.1818 | 130 | - | 0.5271 | 0.4603 | | 1.2727 | 140 | - | 0.4892 | 0.4620 | | 1.3636 | 150 | 1.0007 | 0.4611 | 0.4651 | | 1.4545 | 160 | - | 0.4276 | 0.4706 | | 1.5455 | 170 | - | 0.4005 | 0.4698 | | 1.6364 | 180 | - | 0.3818 | 0.4728 | | 1.7273 | 190 | - | 0.3573 | 0.4763 | | 1.8182 | 200 | 0.7585 | 0.3321 | 0.4783 | | 1.9091 | 210 | - | 0.3091 | 0.4806 | | 2.0 | 220 | - | 0.2963 | 0.4833 | | 2.0909 | 230 | - | 0.2875 | 0.4834 | | 2.1818 | 240 | - | 0.2793 | 0.4842 | | 2.2727 | 250 | 0.5586 | 0.2729 | 0.4879 | | 2.3636 | 260 | - | 0.2663 | 0.4885 | | 2.4545 | 270 | - | 0.2576 | 0.4925 | | 2.5455 | 280 | - | 0.2477 | 0.5006 | | 2.6364 | 290 | - | 0.2353 | 0.5058 | | 2.7273 | 300 | 0.4751 | 0.2278 | 0.5112 | | 2.8182 | 310 | - | 0.2206 | 0.5096 | | 2.9091 | 320 | - | 0.2130 | 0.5144 | | 3.0 | 330 | - | 0.2043 | 0.5202 | | 3.0909 | 340 | - | 0.1973 | 0.5214 | | 3.1818 | 350 | 0.381 | 0.1964 | 0.5271 | | 3.2727 | 360 | - | 0.1968 | 0.5325 | | 3.3636 | 370 | - | 0.1922 | 0.5289 | | 3.4545 | 380 | - | 0.1869 | 0.5329 | | 3.5455 | 390 | - | 0.1789 | 0.5391 | | 3.6364 | 400 | 0.3886 | 0.1743 | 0.5464 | | 3.7273 | 410 | - | 0.1730 | 0.5472 | | 3.8182 | 420 | - | 0.1699 | 0.5479 | | 3.9091 | 430 | - | 0.1644 | 0.5525 | | 4.0 | 440 | - | 0.1623 | 0.5511 | | 4.0909 | 450 | 0.2977 | 0.1600 | 0.5513 | | 4.1818 | 460 | - | 0.1540 | 0.5519 | | 4.2727 | 470 | - | 0.1492 | 0.5589 | | 4.3636 | 480 | - | 0.1450 | 0.5624 | | 4.4545 
| 490 | - | 0.1426 | 0.5644 | | 4.5455 | 500 | 0.2496 | 0.1407 | 0.5629 | | 4.6364 | 510 | - | 0.1390 | 0.5663 | | 4.7273 | 520 | - | 0.1399 | 0.5695 | | 4.8182 | 530 | - | 0.1377 | 0.5764 | | 4.9091 | 540 | - | 0.1357 | 0.5753 | | 5.0 | 550 | 0.2322 | 0.1364 | 0.5827 | | 5.0909 | 560 | - | 0.1327 | 0.5804 | | 5.1818 | 570 | - | 0.1300 | 0.5799 | | 5.2727 | 580 | - | 0.1307 | 0.5816 | | 5.3636 | 590 | - | 0.1331 | 0.5868 | | 5.4545 | 600 | 0.2219 | 0.1322 | 0.5839 | | 5.5455 | 610 | - | 0.1332 | 0.5822 | | 5.6364 | 620 | - | 0.1323 | 0.5817 | | 5.7273 | 630 | - | 0.1311 | 0.5845 | | 5.8182 | 640 | - | 0.1282 | 0.5834 | | 5.9091 | 650 | 0.1982 | 0.1253 | 0.5870 | | 6.0 | 660 | - | 0.1242 | 0.5880 | | 6.0909 | 670 | - | 0.1241 | 0.5859 | | 6.1818 | 680 | - | 0.1265 | 0.5885 | | 6.2727 | 690 | - | 0.1287 | 0.5964 | | 6.3636 | 700 | 0.1613 | 0.1321 | 0.5968 | | 6.4545 | 710 | - | 0.1332 | 0.5979 | | 6.5455 | 720 | - | 0.1295 | 0.6016 | | 6.6364 | 730 | - | 0.1262 | 0.6022 | | 6.7273 | 740 | - | 0.1242 | 0.6020 | | 6.8182 | 750 | 0.172 | 0.1238 | 0.6037 | | 6.9091 | 760 | - | 0.1222 | 0.6036 | | 7.0 | 770 | - | 0.1213 | 0.6038 | | 7.0909 | 780 | - | 0.1208 | 0.6038 | | 7.1818 | 790 | - | 0.1200 | 0.6011 | | 7.2727 | 800 | 0.1486 | 0.1196 | 0.5979 | | 7.3636 | 810 | - | 0.1227 | 0.6015 | | 7.4545 | 820 | - | 0.1225 | 0.6004 | | 7.5455 | 830 | - | 0.1195 | 0.6045 | | 7.6364 | 840 | - | 0.1202 | 0.6045 | | 7.7273 | 850 | 0.1501 | 0.1208 | 0.6044 | | 7.8182 | 860 | - | 0.1177 | 0.6038 | | 7.9091 | 870 | - | 0.1161 | 0.6031 | | 8.0 | 880 | - | 0.1168 | 0.6024 | | 8.0909 | 890 | - | 0.1175 | 0.6050 | | 8.1818 | 900 | 0.1563 | 0.1157 | 0.6063 | | 8.2727 | 910 | - | 0.1146 | 0.6056 | | 8.3636 | 920 | - | 0.1152 | 0.6073 | | 8.4545 | 930 | - | 0.1167 | 0.6077 | | 8.5455 | 940 | - | 0.1172 | 0.6087 | | 8.6364 | 950 | 0.1247 | 0.1169 | 0.6077 | | 8.7273 | 960 | - | 0.1159 | 0.6056 | | 8.8182 | 970 | - | 0.1151 | 0.6066 | | 8.9091 | 980 | - | 0.1161 | 0.6089 | | 9.0 | 990 | - | 0.1187 | 0.6071 | | 9.0909 | 1000 | 0.1497 | 0.1157 | 0.6110 | | 9.1818 | 1010 | - | 0.1148 | 0.6086 | | 9.2727 | 1020 | - | 0.1134 | 0.6125 | | 9.3636 | 1030 | - | 0.1173 | 0.6114 | | 9.4545 | 1040 | - | 0.1174 | 0.6118 | | 9.5455 | 1050 | 0.1025 | 0.1159 | 0.6127 | | 9.6364 | 1060 | - | 0.1118 | 0.6093 | | 9.7273 | 1070 | - | 0.1114 | 0.6103 | | 9.8182 | 1080 | - | 0.1128 | 0.6102 | | 9.9091 | 1090 | - | 0.1142 | 0.6116 | | 10.0 | 1100 | 0.128 | 0.1147 | 0.6115 | | 10.0909 | 1110 | - | 0.1143 | 0.6095 | | 10.1818 | 1120 | - | 0.1134 | 0.6073 | | 10.2727 | 1130 | - | 0.1137 | 0.6059 | | 10.3636 | 1140 | - | 0.1143 | 0.6049 | | 10.4545 | 1150 | 0.1413 | 0.1145 | 0.6047 | | 10.5455 | 1160 | - | 0.1154 | 0.6032 | | 10.6364 | 1170 | - | 0.1158 | 0.6044 | | 10.7273 | 1180 | - | 0.1151 | 0.6060 | | 10.8182 | 1190 | - | 0.1145 | 0.6081 | | 10.9091 | 1200 | 0.1223 | 0.1133 | 0.6084 | | 11.0 | 1210 | - | 0.1121 | 0.6090 | | 11.0909 | 1220 | - | 0.1130 | 0.6129 | | 11.1818 | 1230 | - | 0.1134 | 0.6089 | | 11.2727 | 1240 | - | 0.1136 | 0.6112 | | 11.3636 | 1250 | 0.1199 | 0.1142 | 0.6134 | | 11.4545 | 1260 | - | 0.1128 | 0.6145 | | 11.5455 | 1270 | - | 0.1097 | 0.6148 | | 11.6364 | 1280 | - | 0.1081 | 0.6122 | | 11.7273 | 1290 | - | 0.1074 | 0.6126 | | 11.8182 | 1300 | 0.1143 | 0.1063 | 0.6167 | | 11.9091 | 1310 | - | 0.1067 | 0.6163 | | 12.0 | 1320 | - | 0.1067 | 0.6190 | | 12.0909 | 1330 | - | 0.1075 | 0.6193 | | 12.1818 | 1340 | - | 0.1092 | 0.6222 | | 12.2727 | 1350 | 0.0974 | 0.1087 | 0.6199 | | 12.3636 | 1360 | - | 0.1078 | 0.6183 | | 
12.4545 | 1370 | - | 0.1072 | 0.6180 | | 12.5455 | 1380 | - | 0.1072 | 0.6172 | | 12.6364 | 1390 | - | 0.1072 | 0.6209 | | 12.7273 | 1400 | 0.1257 | 0.1056 | 0.6152 | | 12.8182 | 1410 | - | 0.1046 | 0.6149 | | 12.9091 | 1420 | - | 0.1034 | 0.6142 | | 13.0 | 1430 | - | 0.1034 | 0.6165 | | 13.0909 | 1440 | - | 0.1046 | 0.6165 | | 13.1818 | 1450 | 0.0866 | 0.1064 | 0.6177 | | 13.2727 | 1460 | - | 0.1070 | 0.6158 | | 13.3636 | 1470 | - | 0.1055 | 0.6151 | | 13.4545 | 1480 | - | 0.1040 | 0.6182 | | 13.5455 | 1490 | - | 0.1042 | 0.6144 | | 13.6364 | 1500 | 0.0757 | 0.1042 | 0.6151 | | 13.7273 | 1510 | - | 0.1056 | 0.6169 | | 13.8182 | 1520 | - | 0.1059 | 0.6172 | | 13.9091 | 1530 | - | 0.1059 | 0.6181 | | 14.0 | 1540 | - | 0.1042 | 0.6167 | | 14.0909 | 1550 | 0.0754 | 0.1043 | 0.6198 | | 14.1818 | 1560 | - | 0.1044 | 0.6215 | | 14.2727 | 1570 | - | 0.1042 | 0.6205 | | 14.3636 | 1580 | - | 0.1058 | 0.6196 | | 14.4545 | 1590 | - | 0.1076 | 0.6212 | | 14.5455 | 1600 | 0.0901 | 0.1098 | 0.6219 | | 14.6364 | 1610 | - | 0.1095 | 0.6247 | | 14.7273 | 1620 | - | 0.1084 | 0.6209 | | 14.8182 | 1630 | - | 0.1063 | 0.6164 | | 14.9091 | 1640 | - | 0.1049 | 0.6170 | | 15.0 | 1650 | 0.1034 | 0.1043 | 0.6199 | | 15.0909 | 1660 | - | 0.1033 | 0.6216 | | 15.1818 | 1670 | - | 0.1035 | 0.6244 | | 15.2727 | 1680 | - | 0.1048 | 0.6286 | | 15.3636 | 1690 | - | 0.1070 | 0.6239 | | **15.4545** | **1700** | **0.0821** | **0.1084** | **0.6237** | | 15.5455 | 1710 | - | 0.1095 | 0.6234 | | 15.6364 | 1720 | - | 0.1090 | 0.6221 | | 15.7273 | 1730 | - | 0.1089 | 0.6227 | | 15.8182 | 1740 | - | 0.1091 | 0.6201 | | 15.9091 | 1750 | 0.074 | 0.1089 | 0.6195 | | 16.0 | 1760 | - | 0.1082 | 0.6205 | | 16.0909 | 1770 | - | 0.1076 | 0.6198 | | 16.1818 | 1780 | - | 0.1079 | 0.6195 | | 16.2727 | 1790 | - | 0.1081 | 0.6238 | | 16.3636 | 1800 | 0.083 | 0.1066 | 0.6219 | | 16.4545 | 1810 | - | 0.1055 | 0.6201 | | 16.5455 | 1820 | - | 0.1045 | 0.6217 | | 16.6364 | 1830 | - | 0.1030 | 0.6198 | | 16.7273 | 1840 | - | 0.1012 | 0.6192 | | 16.8182 | 1850 | 0.0569 | 0.1012 | 0.6198 | | 16.9091 | 1860 | - | 0.1017 | 0.6224 | | 17.0 | 1870 | - | 0.1024 | 0.6220 | | 17.0909 | 1880 | - | 0.1038 | 0.6217 | | 17.1818 | 1890 | - | 0.1046 | 0.6231 | | 17.2727 | 1900 | 0.1054 | 0.1056 | 0.6191 | | 17.3636 | 1910 | - | 0.1064 | 0.6220 | | 17.4545 | 1920 | - | 0.1078 | 0.6213 | | 17.5455 | 1930 | - | 0.1077 | 0.6228 | | 17.6364 | 1940 | - | 0.1071 | 0.6194 | | 17.7273 | 1950 | 0.0588 | 0.1073 | 0.6227 | | 17.8182 | 1960 | - | 0.1073 | 0.6219 | | 17.9091 | 1970 | - | 0.1074 | 0.6217 | | 18.0 | 1980 | - | 0.1073 | 0.6239 | | 18.0909 | 1990 | - | 0.1074 | 0.6210 | | 18.1818 | 2000 | 0.0772 | 0.1076 | 0.6226 | | 18.2727 | 2010 | - | 0.1081 | 0.6215 | | 18.3636 | 2020 | - | 0.1081 | 0.6206 | | 18.4545 | 2030 | - | 0.1073 | 0.6229 | | 18.5455 | 2040 | - | 0.1069 | 0.6221 | | 18.6364 | 2050 | 0.0669 | 0.1070 | 0.6233 | | 18.7273 | 2060 | - | 0.1062 | 0.6233 | | 18.8182 | 2070 | - | 0.1051 | 0.6232 | | 18.9091 | 2080 | - | 0.1038 | 0.6211 | | 19.0 | 2090 | - | 0.1028 | 0.6210 | | 19.0909 | 2100 | 0.0638 | 0.1015 | 0.6214 | | 19.1818 | 2110 | - | 0.1021 | 0.6208 | | 19.2727 | 2120 | - | 0.1029 | 0.6205 | | 19.3636 | 2130 | - | 0.1033 | 0.6205 | | 19.4545 | 2140 | - | 0.1044 | 0.6206 | | 19.5455 | 2150 | 0.0805 | 0.1030 | 0.6187 | | 19.6364 | 2160 | - | 0.1029 | 0.6199 | | 19.7273 | 2170 | - | 0.1041 | 0.6214 | | 19.8182 | 2180 | - | 0.1050 | 0.6211 | | 19.9091 | 2190 | - | 0.1040 | 0.6207 | | 20.0 | 2200 | 0.0932 | 0.1028 | 0.6201 | | 20.0909 | 2210 | - | 0.1019 | 
0.6212 | | 20.1818 | 2220 | - | 0.1030 | 0.6202 | | 20.2727 | 2230 | - | 0.1034 | 0.6212 | | 20.3636 | 2240 | - | 0.1029 | 0.6224 | | 20.4545 | 2250 | 0.0655 | 0.1034 | 0.6203 | | 20.5455 | 2260 | - | 0.1030 | 0.6229 | | 20.6364 | 2270 | - | 0.1023 | 0.6193 | | 20.7273 | 2280 | - | 0.1022 | 0.6185 | | 20.8182 | 2290 | - | 0.1017 | 0.6189 | | 20.9091 | 2300 | 0.0879 | 0.1011 | 0.6178 | | 21.0 | 2310 | - | 0.1015 | 0.6175 | | 21.0909 | 2320 | - | 0.1019 | 0.6182 | | 21.1818 | 2330 | - | 0.1013 | 0.6198 | | 21.2727 | 2340 | - | 0.1014 | 0.6187 | | 21.3636 | 2350 | 0.074 | 0.1022 | 0.6205 | | 21.4545 | 2360 | - | 0.1038 | 0.6213 | | 21.5455 | 2370 | - | 0.1043 | 0.6236 | | 21.6364 | 2380 | - | 0.1044 | 0.6231 | | 21.7273 | 2390 | - | 0.1045 | 0.6221 | | 21.8182 | 2400 | 0.0768 | 0.1050 | 0.6224 | | 21.9091 | 2410 | - | 0.1054 | 0.6222 | | 22.0 | 2420 | - | 0.1052 | 0.6214 | | 22.0909 | 2430 | - | 0.1051 | 0.6186 | | 22.1818 | 2440 | - | 0.1055 | 0.6193 | | 22.2727 | 2450 | 0.0741 | 0.1055 | 0.6205 | | 22.3636 | 2460 | - | 0.1053 | 0.6208 | | 22.4545 | 2470 | - | 0.1052 | 0.6224 | | 22.5455 | 2480 | - | 0.1037 | 0.6191 | | 22.6364 | 2490 | - | 0.1032 | 0.6189 | | 22.7273 | 2500 | 0.0669 | 0.1034 | 0.6189 | | 22.8182 | 2510 | - | 0.1037 | 0.6224 | | 22.9091 | 2520 | - | 0.1038 | 0.6226 | | 23.0 | 2530 | - | 0.1035 | 0.6203 | | 23.0909 | 2540 | - | 0.1030 | 0.6198 | | 23.1818 | 2550 | 0.0762 | 0.1029 | 0.6201 | | 23.2727 | 2560 | - | 0.1025 | 0.6195 | | 23.3636 | 2570 | - | 0.1024 | 0.6215 | | 23.4545 | 2580 | - | 0.1028 | 0.6224 | | 23.5455 | 2590 | - | 0.1036 | 0.6232 | | 23.6364 | 2600 | 0.0815 | 0.1037 | 0.6227 | | 23.7273 | 2610 | - | 0.1039 | 0.6227 | | 23.8182 | 2620 | - | 0.1036 | 0.6211 | | 23.9091 | 2630 | - | 0.1034 | 0.6192 | | 24.0 | 2640 | - | 0.1033 | 0.6193 | | 24.0909 | 2650 | 0.0661 | 0.1033 | 0.6178 | | 24.1818 | 2660 | - | 0.1027 | 0.6174 | | 24.2727 | 2670 | - | 0.1024 | 0.6198 | | 24.3636 | 2680 | - | 0.1025 | 0.6184 | | 24.4545 | 2690 | - | 0.1020 | 0.6181 | | 24.5455 | 2700 | 0.0679 | 0.1020 | 0.6194 | | 24.6364 | 2710 | - | 0.1020 | 0.6185 | | 24.7273 | 2720 | - | 0.1027 | 0.6196 | | 24.8182 | 2730 | - | 0.1027 | 0.6191 | | 24.9091 | 2740 | - | 0.1030 | 0.6196 | | 25.0 | 2750 | 0.0713 | 0.1035 | 0.6208 | | 25.0909 | 2760 | - | 0.1042 | 0.6187 | | 25.1818 | 2770 | - | 0.1049 | 0.6181 | | 25.2727 | 2780 | - | 0.1051 | 0.6200 | | 25.3636 | 2790 | - | 0.1051 | 0.6204 | | 25.4545 | 2800 | 0.0786 | 0.1048 | 0.6184 | | 25.5455 | 2810 | - | 0.1049 | 0.6198 | | 25.6364 | 2820 | - | 0.1051 | 0.6200 | | 25.7273 | 2830 | - | 0.1051 | 0.6198 | | 25.8182 | 2840 | - | 0.1048 | 0.6190 | | 25.9091 | 2850 | 0.0613 | 0.1050 | 0.6196 | | 26.0 | 2860 | - | 0.1050 | 0.6183 | | 26.0909 | 2870 | - | 0.1047 | 0.6198 | | 26.1818 | 2880 | - | 0.1046 | 0.6197 | | 26.2727 | 2890 | - | 0.1045 | 0.6217 | | 26.3636 | 2900 | 0.0576 | 0.1045 | 0.6208 | | 26.4545 | 2910 | - | 0.1047 | 0.6192 | | 26.5455 | 2920 | - | 0.1046 | 0.6220 | | 26.6364 | 2930 | - | 0.1042 | 0.6189 | | 26.7273 | 2940 | - | 0.1039 | 0.6204 | | 26.8182 | 2950 | 0.066 | 0.1036 | 0.6215 | | 26.9091 | 2960 | - | 0.1032 | 0.6188 | | 27.0 | 2970 | - | 0.1030 | 0.6209 | | 27.0909 | 2980 | - | 0.1027 | 0.6203 | | 27.1818 | 2990 | - | 0.1026 | 0.6215 | | 27.2727 | 3000 | 0.0681 | 0.1025 | 0.6212 | | 27.3636 | 3010 | - | 0.1026 | 0.6193 | | 27.4545 | 3020 | - | 0.1027 | 0.6189 | | 27.5455 | 3030 | - | 0.1028 | 0.6195 | | 27.6364 | 3040 | - | 0.1030 | 0.6196 | | 27.7273 | 3050 | 0.081 | 0.1031 | 0.6187 | | 27.8182 | 3060 | - | 0.1032 | 0.6181 | 
| 27.9091 | 3070 | - | 0.1030 | 0.6177 | | 28.0 | 3080 | - | 0.1029 | 0.6202 | | 28.0909 | 3090 | - | 0.1030 | 0.6193 | | 28.1818 | 3100 | 0.0443 | 0.1031 | 0.6195 | | 28.2727 | 3110 | - | 0.1031 | 0.6195 | | 28.3636 | 3120 | - | 0.1032 | 0.6177 | | 28.4545 | 3130 | - | 0.1034 | 0.6187 | | 28.5455 | 3140 | - | 0.1035 | 0.6189 | | 28.6364 | 3150 | 0.0646 | 0.1036 | 0.6187 | | 28.7273 | 3160 | - | 0.1037 | 0.6199 | | 28.8182 | 3170 | - | 0.1038 | 0.6208 | | 28.9091 | 3180 | - | 0.1038 | 0.6190 | | 29.0 | 3190 | - | 0.1038 | 0.6191 | | 29.0909 | 3200 | 0.0692 | 0.1038 | 0.6190 | | 29.1818 | 3210 | - | 0.1038 | 0.6201 | | 29.2727 | 3220 | - | 0.1038 | 0.6194 | | 29.3636 | 3230 | - | 0.1037 | 0.6201 | | 29.4545 | 3240 | - | 0.1037 | 0.6189 | | 29.5455 | 3250 | 0.084 | 0.1037 | 0.6194 | | 29.6364 | 3260 | - | 0.1037 | 0.6189 | | 29.7273 | 3270 | - | 0.1038 | 0.6199 | | 29.8182 | 3280 | - | 0.1038 | 0.6194 | | 29.9091 | 3290 | - | 0.1038 | 0.6191 | | 30.0 | 3300 | 0.0598 | 0.1038 | 0.6190 | * The bold row denotes the saved checkpoint. </details> ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.3.0 - Transformers: 4.46.2 - PyTorch: 2.5.1+cu121 - Accelerate: 1.1.1 - Datasets: 3.1.0 - Tokenizers: 0.20.3 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### GISTEmbedLoss ```bibtex @misc{solatorio2024gistembed, title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning}, author={Aivin V. Solatorio}, year={2024}, eprint={2402.16829}, archivePrefix={arXiv}, primaryClass={cs.LG} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
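For readers who want to reproduce this setup, the pieces documented above (GISTEmbedLoss with a 384-dimensional, mean-pooled BERT guide at temperature 0.01, plus the non-default hyperparameters) can be wired together roughly as in the minimal sketch below. The base and guide checkpoint names are assumptions, since this card does not state them, and the one-row triplet dataset (borrowed from the sample rows above) is only illustrative; the actual run used the train/validation splits behind the training logs. Versions per the card: sentence-transformers 3.3.0, transformers 4.46.2, PyTorch 2.5.1.

```python
# Minimal sketch of the GISTEmbedLoss fine-tuning setup described in this card.
# The checkpoint names below are assumptions: the card only specifies a 384-dim,
# mean-pooled BERT guide and temperature=0.01, not the exact models used.
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import GISTEmbedLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")  # hypothetical base model
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # a 384-dim, mean-pooled guide

# Guided in-batch-negative selection with the temperature reported above.
loss = GISTEmbedLoss(model=model, guide=guide, temperature=0.01)

# Tiny illustrative (anchor, positive, negative) triplets mirroring the sample rows above.
train_dataset = Dataset.from_dict({
    "anchor": ["How many rules must a Kalpvasi observe?"],
    "positive": ["A Kalpvasi must observe 21 rules during Kalpvas."],
    "negative": ["The dancing colors of autumn leaves create a tapestry of nature's beauty."],
})

# Mirrors the non-default hyperparameters listed in this card.
args = SentenceTransformerTrainingArguments(
    output_dir="gist-finetune",
    per_device_train_batch_size=16,
    gradient_accumulation_steps=2,
    learning_rate=1e-5,
    weight_decay=0.01,
    num_train_epochs=30,
    warmup_ratio=0.1,
    eval_strategy="steps",
    load_best_model_at_end=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # stand-in; the card evaluates on a held-out split
    loss=loss,
)
trainer.train()
```

The `val_evaluator_cosine_ndcg@100` column in the logs above is consistent with a retrieval evaluator (e.g. sentence-transformers' `InformationRetrievalEvaluator`) reporting cosine NDCG@100 on the validation split; the bold checkpoint at step 1700 is the one that `load_best_model_at_end` would restore.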
[ "TEXT_CLASSIFICATION" ]
[ "CRAFT" ]
Non_BioNLP
CAiRE/UniVaR-lambda-5
CAiRE
sentence-similarity
[ "sentence-transformers", "safetensors", "nomic_bert", "feature-extraction", "sentence-similarity", "mteb", "transformers", "transformers.js", "custom_code", "en", "arxiv:2402.01613", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,718
1,718
13
0
--- language: - en library_name: sentence-transformers license: apache-2.0 pipeline_tag: sentence-similarity tags: - feature-extraction - sentence-similarity - mteb - transformers - transformers.js model-index: - name: epoch_0_model results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 76.8507462686567 - type: ap value: 40.592189159090495 - type: f1 value: 71.01634655512476 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 91.51892500000001 - type: ap value: 88.50346762975335 - type: f1 value: 91.50342077459624 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.364 - type: f1 value: 46.72708080922794 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 25.178 - type: map_at_10 value: 40.244 - type: map_at_100 value: 41.321999999999996 - type: map_at_1000 value: 41.331 - type: map_at_3 value: 35.016999999999996 - type: map_at_5 value: 37.99 - type: mrr_at_1 value: 25.605 - type: mrr_at_10 value: 40.422000000000004 - type: mrr_at_100 value: 41.507 - type: mrr_at_1000 value: 41.516 - type: mrr_at_3 value: 35.23 - type: mrr_at_5 value: 38.15 - type: ndcg_at_1 value: 25.178 - type: ndcg_at_10 value: 49.258 - type: ndcg_at_100 value: 53.776 - type: ndcg_at_1000 value: 53.995000000000005 - type: ndcg_at_3 value: 38.429 - type: ndcg_at_5 value: 43.803 - type: precision_at_1 value: 25.178 - type: precision_at_10 value: 7.831 - type: precision_at_100 value: 0.979 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 16.121 - type: precision_at_5 value: 12.29 - type: recall_at_1 value: 25.178 - type: recall_at_10 value: 78.307 - type: recall_at_100 value: 97.866 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 48.364000000000004 - type: recall_at_5 value: 61.451 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 45.93034494751465 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 36.64579480054327 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 60.601310529222054 - type: mrr value: 75.04484896451656 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 88.57797718095814 - type: cos_sim_spearman value: 86.47064499110101 - type: euclidean_pearson value: 87.4559602783142 - type: euclidean_spearman value: 86.47064499110101 - type: manhattan_pearson value: 87.7232764230245 - type: manhattan_spearman value: 86.91222131777742 - task: type: Classification 
dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.5422077922078 - type: f1 value: 84.47657456950589 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 38.48953561974464 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 32.75995857510105 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 30.008000000000003 - type: map_at_10 value: 39.51 - type: map_at_100 value: 40.841 - type: map_at_1000 value: 40.973 - type: map_at_3 value: 36.248999999999995 - type: map_at_5 value: 38.096999999999994 - type: mrr_at_1 value: 36.481 - type: mrr_at_10 value: 44.818000000000005 - type: mrr_at_100 value: 45.64 - type: mrr_at_1000 value: 45.687 - type: mrr_at_3 value: 42.036 - type: mrr_at_5 value: 43.782 - type: ndcg_at_1 value: 36.481 - type: ndcg_at_10 value: 45.152 - type: ndcg_at_100 value: 50.449 - type: ndcg_at_1000 value: 52.76499999999999 - type: ndcg_at_3 value: 40.161 - type: ndcg_at_5 value: 42.577999999999996 - type: precision_at_1 value: 36.481 - type: precision_at_10 value: 8.369 - type: precision_at_100 value: 1.373 - type: precision_at_1000 value: 0.186 - type: precision_at_3 value: 18.693 - type: precision_at_5 value: 13.533999999999999 - type: recall_at_1 value: 30.008000000000003 - type: recall_at_10 value: 56.108999999999995 - type: recall_at_100 value: 78.55499999999999 - type: recall_at_1000 value: 93.659 - type: recall_at_3 value: 41.754999999999995 - type: recall_at_5 value: 48.296 - type: map_at_1 value: 30.262 - type: map_at_10 value: 40.139 - type: map_at_100 value: 41.394 - type: map_at_1000 value: 41.526 - type: map_at_3 value: 37.155 - type: map_at_5 value: 38.785 - type: mrr_at_1 value: 38.153 - type: mrr_at_10 value: 46.369 - type: mrr_at_100 value: 47.072 - type: mrr_at_1000 value: 47.111999999999995 - type: mrr_at_3 value: 44.268 - type: mrr_at_5 value: 45.389 - type: ndcg_at_1 value: 38.153 - type: ndcg_at_10 value: 45.925 - type: ndcg_at_100 value: 50.394000000000005 - type: ndcg_at_1000 value: 52.37500000000001 - type: ndcg_at_3 value: 41.754000000000005 - type: ndcg_at_5 value: 43.574 - type: precision_at_1 value: 38.153 - type: precision_at_10 value: 8.796 - type: precision_at_100 value: 1.432 - type: precision_at_1000 value: 0.189 - type: precision_at_3 value: 20.318 - type: precision_at_5 value: 14.395 - type: recall_at_1 value: 30.262 - type: recall_at_10 value: 55.72200000000001 - type: recall_at_100 value: 74.97500000000001 - type: recall_at_1000 value: 87.342 - type: recall_at_3 value: 43.129 - type: recall_at_5 value: 48.336 - type: map_at_1 value: 39.951 - type: map_at_10 value: 51.248000000000005 - type: map_at_100 value: 52.188 - type: map_at_1000 value: 52.247 - type: map_at_3 value: 48.211 - type: map_at_5 value: 49.797000000000004 - type: mrr_at_1 value: 45.329 - type: mrr_at_10 value: 54.749 - type: mrr_at_100 value: 55.367999999999995 - type: mrr_at_1000 value: 55.400000000000006 - type: mrr_at_3 value: 52.382 - type: mrr_at_5 value: 53.649 - type: ndcg_at_1 value: 45.329 - type: ndcg_at_10 
value: 56.847 - type: ndcg_at_100 value: 60.738 - type: ndcg_at_1000 value: 61.976 - type: ndcg_at_3 value: 51.59 - type: ndcg_at_5 value: 53.915 - type: precision_at_1 value: 45.329 - type: precision_at_10 value: 8.959 - type: precision_at_100 value: 1.187 - type: precision_at_1000 value: 0.134 - type: precision_at_3 value: 22.612 - type: precision_at_5 value: 15.273 - type: recall_at_1 value: 39.951 - type: recall_at_10 value: 70.053 - type: recall_at_100 value: 86.996 - type: recall_at_1000 value: 95.707 - type: recall_at_3 value: 56.032000000000004 - type: recall_at_5 value: 61.629999999999995 - type: map_at_1 value: 25.566 - type: map_at_10 value: 33.207 - type: map_at_100 value: 34.166000000000004 - type: map_at_1000 value: 34.245 - type: map_at_3 value: 30.94 - type: map_at_5 value: 32.01 - type: mrr_at_1 value: 27.345000000000002 - type: mrr_at_10 value: 35.193000000000005 - type: mrr_at_100 value: 35.965 - type: mrr_at_1000 value: 36.028999999999996 - type: mrr_at_3 value: 32.806000000000004 - type: mrr_at_5 value: 34.021 - type: ndcg_at_1 value: 27.345000000000002 - type: ndcg_at_10 value: 37.891999999999996 - type: ndcg_at_100 value: 42.664 - type: ndcg_at_1000 value: 44.757000000000005 - type: ndcg_at_3 value: 33.123000000000005 - type: ndcg_at_5 value: 35.035 - type: precision_at_1 value: 27.345000000000002 - type: precision_at_10 value: 5.763 - type: precision_at_100 value: 0.859 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 13.71 - type: precision_at_5 value: 9.401 - type: recall_at_1 value: 25.566 - type: recall_at_10 value: 50.563 - type: recall_at_100 value: 72.86399999999999 - type: recall_at_1000 value: 88.68599999999999 - type: recall_at_3 value: 37.43 - type: recall_at_5 value: 41.894999999999996 - type: map_at_1 value: 16.663 - type: map_at_10 value: 23.552 - type: map_at_100 value: 24.538 - type: map_at_1000 value: 24.661 - type: map_at_3 value: 21.085 - type: map_at_5 value: 22.391 - type: mrr_at_1 value: 20.025000000000002 - type: mrr_at_10 value: 27.643 - type: mrr_at_100 value: 28.499999999999996 - type: mrr_at_1000 value: 28.582 - type: mrr_at_3 value: 25.083 - type: mrr_at_5 value: 26.544 - type: ndcg_at_1 value: 20.025000000000002 - type: ndcg_at_10 value: 28.272000000000002 - type: ndcg_at_100 value: 33.353 - type: ndcg_at_1000 value: 36.454 - type: ndcg_at_3 value: 23.579 - type: ndcg_at_5 value: 25.685000000000002 - type: precision_at_1 value: 20.025000000000002 - type: precision_at_10 value: 5.187 - type: precision_at_100 value: 0.897 - type: precision_at_1000 value: 0.13 - type: precision_at_3 value: 10.987 - type: precision_at_5 value: 8.06 - type: recall_at_1 value: 16.663 - type: recall_at_10 value: 38.808 - type: recall_at_100 value: 61.305 - type: recall_at_1000 value: 83.571 - type: recall_at_3 value: 25.907999999999998 - type: recall_at_5 value: 31.214 - type: map_at_1 value: 27.695999999999998 - type: map_at_10 value: 37.018 - type: map_at_100 value: 38.263000000000005 - type: map_at_1000 value: 38.371 - type: map_at_3 value: 34.226 - type: map_at_5 value: 35.809999999999995 - type: mrr_at_1 value: 32.916000000000004 - type: mrr_at_10 value: 42.067 - type: mrr_at_100 value: 42.925000000000004 - type: mrr_at_1000 value: 42.978 - type: mrr_at_3 value: 39.637 - type: mrr_at_5 value: 41.134 - type: ndcg_at_1 value: 32.916000000000004 - type: ndcg_at_10 value: 42.539 - type: ndcg_at_100 value: 47.873 - type: ndcg_at_1000 value: 50.08200000000001 - type: ndcg_at_3 value: 37.852999999999994 - type: ndcg_at_5 value: 40.201 - type: 
precision_at_1 value: 32.916000000000004 - type: precision_at_10 value: 7.5840000000000005 - type: precision_at_100 value: 1.199 - type: precision_at_1000 value: 0.155 - type: precision_at_3 value: 17.485 - type: precision_at_5 value: 12.512 - type: recall_at_1 value: 27.695999999999998 - type: recall_at_10 value: 53.638 - type: recall_at_100 value: 76.116 - type: recall_at_1000 value: 91.069 - type: recall_at_3 value: 41.13 - type: recall_at_5 value: 46.872 - type: map_at_1 value: 24.108 - type: map_at_10 value: 33.372 - type: map_at_100 value: 34.656 - type: map_at_1000 value: 34.768 - type: map_at_3 value: 30.830999999999996 - type: map_at_5 value: 32.204 - type: mrr_at_1 value: 29.110000000000003 - type: mrr_at_10 value: 37.979 - type: mrr_at_100 value: 38.933 - type: mrr_at_1000 value: 38.988 - type: mrr_at_3 value: 35.731 - type: mrr_at_5 value: 36.963 - type: ndcg_at_1 value: 29.110000000000003 - type: ndcg_at_10 value: 38.635000000000005 - type: ndcg_at_100 value: 44.324999999999996 - type: ndcg_at_1000 value: 46.747 - type: ndcg_at_3 value: 34.37 - type: ndcg_at_5 value: 36.228 - type: precision_at_1 value: 29.110000000000003 - type: precision_at_10 value: 6.963 - type: precision_at_100 value: 1.146 - type: precision_at_1000 value: 0.152 - type: precision_at_3 value: 16.400000000000002 - type: precision_at_5 value: 11.552999999999999 - type: recall_at_1 value: 24.108 - type: recall_at_10 value: 49.597 - type: recall_at_100 value: 73.88900000000001 - type: recall_at_1000 value: 90.62400000000001 - type: recall_at_3 value: 37.662 - type: recall_at_5 value: 42.565 - type: map_at_1 value: 25.00791666666667 - type: map_at_10 value: 33.287749999999996 - type: map_at_100 value: 34.41141666666667 - type: map_at_1000 value: 34.52583333333333 - type: map_at_3 value: 30.734416666666668 - type: map_at_5 value: 32.137166666666666 - type: mrr_at_1 value: 29.305666666666664 - type: mrr_at_10 value: 37.22966666666666 - type: mrr_at_100 value: 38.066583333333334 - type: mrr_at_1000 value: 38.12616666666667 - type: mrr_at_3 value: 34.92275 - type: mrr_at_5 value: 36.23333333333334 - type: ndcg_at_1 value: 29.305666666666664 - type: ndcg_at_10 value: 38.25533333333333 - type: ndcg_at_100 value: 43.25266666666666 - type: ndcg_at_1000 value: 45.63583333333334 - type: ndcg_at_3 value: 33.777166666666666 - type: ndcg_at_5 value: 35.85 - type: precision_at_1 value: 29.305666666666664 - type: precision_at_10 value: 6.596416666666667 - type: precision_at_100 value: 1.0784166666666668 - type: precision_at_1000 value: 0.14666666666666664 - type: precision_at_3 value: 15.31075 - type: precision_at_5 value: 10.830916666666667 - type: recall_at_1 value: 25.00791666666667 - type: recall_at_10 value: 49.10933333333333 - type: recall_at_100 value: 71.09216666666667 - type: recall_at_1000 value: 87.77725000000001 - type: recall_at_3 value: 36.660916666666665 - type: recall_at_5 value: 41.94149999999999 - type: map_at_1 value: 23.521 - type: map_at_10 value: 30.043 - type: map_at_100 value: 30.936000000000003 - type: map_at_1000 value: 31.022 - type: map_at_3 value: 27.926000000000002 - type: map_at_5 value: 29.076999999999998 - type: mrr_at_1 value: 26.227 - type: mrr_at_10 value: 32.822 - type: mrr_at_100 value: 33.61 - type: mrr_at_1000 value: 33.672000000000004 - type: mrr_at_3 value: 30.776999999999997 - type: mrr_at_5 value: 31.866 - type: ndcg_at_1 value: 26.227 - type: ndcg_at_10 value: 34.041 - type: ndcg_at_100 value: 38.394 - type: ndcg_at_1000 value: 40.732 - type: ndcg_at_3 value: 30.037999999999997 - 
type: ndcg_at_5 value: 31.845000000000002 - type: precision_at_1 value: 26.227 - type: precision_at_10 value: 5.244999999999999 - type: precision_at_100 value: 0.808 - type: precision_at_1000 value: 0.107 - type: precision_at_3 value: 12.679000000000002 - type: precision_at_5 value: 8.773 - type: recall_at_1 value: 23.521 - type: recall_at_10 value: 43.633 - type: recall_at_100 value: 63.126000000000005 - type: recall_at_1000 value: 80.765 - type: recall_at_3 value: 32.614 - type: recall_at_5 value: 37.15 - type: map_at_1 value: 16.236 - type: map_at_10 value: 22.898 - type: map_at_100 value: 23.878 - type: map_at_1000 value: 24.009 - type: map_at_3 value: 20.87 - type: map_at_5 value: 22.025 - type: mrr_at_1 value: 19.339000000000002 - type: mrr_at_10 value: 26.382 - type: mrr_at_100 value: 27.245 - type: mrr_at_1000 value: 27.33 - type: mrr_at_3 value: 24.386 - type: mrr_at_5 value: 25.496000000000002 - type: ndcg_at_1 value: 19.339000000000002 - type: ndcg_at_10 value: 27.139999999999997 - type: ndcg_at_100 value: 31.944 - type: ndcg_at_1000 value: 35.077999999999996 - type: ndcg_at_3 value: 23.424 - type: ndcg_at_5 value: 25.188 - type: precision_at_1 value: 19.339000000000002 - type: precision_at_10 value: 4.8309999999999995 - type: precision_at_100 value: 0.845 - type: precision_at_1000 value: 0.128 - type: precision_at_3 value: 10.874 - type: precision_at_5 value: 7.825 - type: recall_at_1 value: 16.236 - type: recall_at_10 value: 36.513 - type: recall_at_100 value: 57.999 - type: recall_at_1000 value: 80.512 - type: recall_at_3 value: 26.179999999999996 - type: recall_at_5 value: 30.712 - type: map_at_1 value: 24.11 - type: map_at_10 value: 31.566 - type: map_at_100 value: 32.647 - type: map_at_1000 value: 32.753 - type: map_at_3 value: 29.24 - type: map_at_5 value: 30.564999999999998 - type: mrr_at_1 value: 28.265 - type: mrr_at_10 value: 35.504000000000005 - type: mrr_at_100 value: 36.436 - type: mrr_at_1000 value: 36.503 - type: mrr_at_3 value: 33.349000000000004 - type: mrr_at_5 value: 34.622 - type: ndcg_at_1 value: 28.265 - type: ndcg_at_10 value: 36.192 - type: ndcg_at_100 value: 41.388000000000005 - type: ndcg_at_1000 value: 43.948 - type: ndcg_at_3 value: 31.959 - type: ndcg_at_5 value: 33.998 - type: precision_at_1 value: 28.265 - type: precision_at_10 value: 5.989 - type: precision_at_100 value: 0.9650000000000001 - type: precision_at_1000 value: 0.13 - type: precision_at_3 value: 14.335 - type: precision_at_5 value: 10.112 - type: recall_at_1 value: 24.11 - type: recall_at_10 value: 46.418 - type: recall_at_100 value: 69.314 - type: recall_at_1000 value: 87.397 - type: recall_at_3 value: 34.724 - type: recall_at_5 value: 39.925 - type: map_at_1 value: 22.091 - type: map_at_10 value: 29.948999999999998 - type: map_at_100 value: 31.502000000000002 - type: map_at_1000 value: 31.713 - type: map_at_3 value: 27.464 - type: map_at_5 value: 28.968 - type: mrr_at_1 value: 26.482 - type: mrr_at_10 value: 34.009 - type: mrr_at_100 value: 35.081 - type: mrr_at_1000 value: 35.138000000000005 - type: mrr_at_3 value: 31.785000000000004 - type: mrr_at_5 value: 33.178999999999995 - type: ndcg_at_1 value: 26.482 - type: ndcg_at_10 value: 35.008 - type: ndcg_at_100 value: 41.272999999999996 - type: ndcg_at_1000 value: 43.972 - type: ndcg_at_3 value: 30.804 - type: ndcg_at_5 value: 33.046 - type: precision_at_1 value: 26.482 - type: precision_at_10 value: 6.462 - type: precision_at_100 value: 1.431 - type: precision_at_1000 value: 0.22899999999999998 - type: precision_at_3 value: 
14.360999999999999 - type: precision_at_5 value: 10.474 - type: recall_at_1 value: 22.091 - type: recall_at_10 value: 45.125 - type: recall_at_100 value: 72.313 - type: recall_at_1000 value: 89.503 - type: recall_at_3 value: 33.158 - type: recall_at_5 value: 39.086999999999996 - type: map_at_1 value: 19.883 - type: map_at_10 value: 26.951000000000004 - type: map_at_100 value: 27.927999999999997 - type: map_at_1000 value: 28.022000000000002 - type: map_at_3 value: 24.616 - type: map_at_5 value: 25.917 - type: mrr_at_1 value: 21.996 - type: mrr_at_10 value: 29.221000000000004 - type: mrr_at_100 value: 30.024 - type: mrr_at_1000 value: 30.095 - type: mrr_at_3 value: 26.833000000000002 - type: mrr_at_5 value: 28.155 - type: ndcg_at_1 value: 21.996 - type: ndcg_at_10 value: 31.421 - type: ndcg_at_100 value: 36.237 - type: ndcg_at_1000 value: 38.744 - type: ndcg_at_3 value: 26.671 - type: ndcg_at_5 value: 28.907 - type: precision_at_1 value: 21.996 - type: precision_at_10 value: 5.009 - type: precision_at_100 value: 0.799 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 11.275 - type: precision_at_5 value: 8.059 - type: recall_at_1 value: 19.883 - type: recall_at_10 value: 43.132999999999996 - type: recall_at_100 value: 65.654 - type: recall_at_1000 value: 84.492 - type: recall_at_3 value: 30.209000000000003 - type: recall_at_5 value: 35.616 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 17.756 - type: map_at_10 value: 30.378 - type: map_at_100 value: 32.537 - type: map_at_1000 value: 32.717 - type: map_at_3 value: 25.599 - type: map_at_5 value: 28.372999999999998 - type: mrr_at_1 value: 41.303 - type: mrr_at_10 value: 53.483999999999995 - type: mrr_at_100 value: 54.106 - type: mrr_at_1000 value: 54.127 - type: mrr_at_3 value: 50.315 - type: mrr_at_5 value: 52.396 - type: ndcg_at_1 value: 41.303 - type: ndcg_at_10 value: 40.503 - type: ndcg_at_100 value: 47.821000000000005 - type: ndcg_at_1000 value: 50.788 - type: ndcg_at_3 value: 34.364 - type: ndcg_at_5 value: 36.818 - type: precision_at_1 value: 41.303 - type: precision_at_10 value: 12.463000000000001 - type: precision_at_100 value: 2.037 - type: precision_at_1000 value: 0.26 - type: precision_at_3 value: 25.798 - type: precision_at_5 value: 19.896 - type: recall_at_1 value: 17.756 - type: recall_at_10 value: 46.102 - type: recall_at_100 value: 70.819 - type: recall_at_1000 value: 87.21799999999999 - type: recall_at_3 value: 30.646 - type: recall_at_5 value: 38.022 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 9.033 - type: map_at_10 value: 20.584 - type: map_at_100 value: 29.518 - type: map_at_1000 value: 31.186000000000003 - type: map_at_3 value: 14.468 - type: map_at_5 value: 17.177 - type: mrr_at_1 value: 69.75 - type: mrr_at_10 value: 77.025 - type: mrr_at_100 value: 77.36699999999999 - type: mrr_at_1000 value: 77.373 - type: mrr_at_3 value: 75.583 - type: mrr_at_5 value: 76.396 - type: ndcg_at_1 value: 58.5 - type: ndcg_at_10 value: 45.033 - type: ndcg_at_100 value: 49.071 - type: ndcg_at_1000 value: 56.056 - type: ndcg_at_3 value: 49.936 - type: ndcg_at_5 value: 47.471999999999994 - type: precision_at_1 value: 69.75 - type: precision_at_10 value: 35.775 - type: precision_at_100 value: 11.594999999999999 - type: precision_at_1000 value: 2.062 - type: precision_at_3 value: 52.5 - type: 
precision_at_5 value: 45.300000000000004 - type: recall_at_1 value: 9.033 - type: recall_at_10 value: 26.596999999999998 - type: recall_at_100 value: 54.607000000000006 - type: recall_at_1000 value: 76.961 - type: recall_at_3 value: 15.754999999999999 - type: recall_at_5 value: 20.033 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 48.345000000000006 - type: f1 value: 43.4514918068706 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 71.29100000000001 - type: map_at_10 value: 81.059 - type: map_at_100 value: 81.341 - type: map_at_1000 value: 81.355 - type: map_at_3 value: 79.74799999999999 - type: map_at_5 value: 80.612 - type: mrr_at_1 value: 76.40299999999999 - type: mrr_at_10 value: 84.615 - type: mrr_at_100 value: 84.745 - type: mrr_at_1000 value: 84.748 - type: mrr_at_3 value: 83.776 - type: mrr_at_5 value: 84.343 - type: ndcg_at_1 value: 76.40299999999999 - type: ndcg_at_10 value: 84.981 - type: ndcg_at_100 value: 86.00999999999999 - type: ndcg_at_1000 value: 86.252 - type: ndcg_at_3 value: 82.97 - type: ndcg_at_5 value: 84.152 - type: precision_at_1 value: 76.40299999999999 - type: precision_at_10 value: 10.446 - type: precision_at_100 value: 1.1199999999999999 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 32.147999999999996 - type: precision_at_5 value: 20.135 - type: recall_at_1 value: 71.29100000000001 - type: recall_at_10 value: 93.232 - type: recall_at_100 value: 97.363 - type: recall_at_1000 value: 98.905 - type: recall_at_3 value: 87.893 - type: recall_at_5 value: 90.804 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 18.667 - type: map_at_10 value: 30.853 - type: map_at_100 value: 32.494 - type: map_at_1000 value: 32.677 - type: map_at_3 value: 26.91 - type: map_at_5 value: 29.099000000000004 - type: mrr_at_1 value: 37.191 - type: mrr_at_10 value: 46.171 - type: mrr_at_100 value: 47.056 - type: mrr_at_1000 value: 47.099000000000004 - type: mrr_at_3 value: 44.059 - type: mrr_at_5 value: 45.147 - type: ndcg_at_1 value: 37.191 - type: ndcg_at_10 value: 38.437 - type: ndcg_at_100 value: 44.62 - type: ndcg_at_1000 value: 47.795 - type: ndcg_at_3 value: 35.003 - type: ndcg_at_5 value: 36.006 - type: precision_at_1 value: 37.191 - type: precision_at_10 value: 10.586 - type: precision_at_100 value: 1.688 - type: precision_at_1000 value: 0.22699999999999998 - type: precision_at_3 value: 23.302 - type: precision_at_5 value: 17.006 - type: recall_at_1 value: 18.667 - type: recall_at_10 value: 45.367000000000004 - type: recall_at_100 value: 68.207 - type: recall_at_1000 value: 87.072 - type: recall_at_3 value: 32.129000000000005 - type: recall_at_5 value: 37.719 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 39.494 - type: map_at_10 value: 66.223 - type: map_at_100 value: 67.062 - type: map_at_1000 value: 67.11500000000001 - type: map_at_3 value: 62.867 - type: map_at_5 value: 64.994 - type: mrr_at_1 value: 78.987 - type: mrr_at_10 value: 84.585 - type: mrr_at_100 value: 84.773 - type: mrr_at_1000 value: 84.77900000000001 - type: mrr_at_3 value: 83.592 - type: mrr_at_5 value: 84.235 - type: ndcg_at_1 value: 78.987 - type: ndcg_at_10 value: 
73.64 - type: ndcg_at_100 value: 76.519 - type: ndcg_at_1000 value: 77.51 - type: ndcg_at_3 value: 68.893 - type: ndcg_at_5 value: 71.585 - type: precision_at_1 value: 78.987 - type: precision_at_10 value: 15.529000000000002 - type: precision_at_100 value: 1.7770000000000001 - type: precision_at_1000 value: 0.191 - type: precision_at_3 value: 44.808 - type: precision_at_5 value: 29.006999999999998 - type: recall_at_1 value: 39.494 - type: recall_at_10 value: 77.643 - type: recall_at_100 value: 88.825 - type: recall_at_1000 value: 95.321 - type: recall_at_3 value: 67.211 - type: recall_at_5 value: 72.519 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 85.55959999999999 - type: ap value: 80.7246500384617 - type: f1 value: 85.52336485065454 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 23.631 - type: map_at_10 value: 36.264 - type: map_at_100 value: 37.428 - type: map_at_1000 value: 37.472 - type: map_at_3 value: 32.537 - type: map_at_5 value: 34.746 - type: mrr_at_1 value: 24.312 - type: mrr_at_10 value: 36.858000000000004 - type: mrr_at_100 value: 37.966 - type: mrr_at_1000 value: 38.004 - type: mrr_at_3 value: 33.188 - type: mrr_at_5 value: 35.367 - type: ndcg_at_1 value: 24.312 - type: ndcg_at_10 value: 43.126999999999995 - type: ndcg_at_100 value: 48.642 - type: ndcg_at_1000 value: 49.741 - type: ndcg_at_3 value: 35.589 - type: ndcg_at_5 value: 39.515 - type: precision_at_1 value: 24.312 - type: precision_at_10 value: 6.699 - type: precision_at_100 value: 0.9450000000000001 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 15.153 - type: precision_at_5 value: 11.065999999999999 - type: recall_at_1 value: 23.631 - type: recall_at_10 value: 64.145 - type: recall_at_100 value: 89.41 - type: recall_at_1000 value: 97.83500000000001 - type: recall_at_3 value: 43.769000000000005 - type: recall_at_5 value: 53.169 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.4108527131783 - type: f1 value: 93.1415880261038 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 77.24806201550388 - type: f1 value: 60.531916308197175 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 73.71553463349024 - type: f1 value: 71.70753174900791 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 77.79757901815736 - type: f1 value: 77.83719850433258 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 33.74193296622113 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 
35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 30.64257594108566 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 30.811018518883625 - type: mrr value: 31.910376577445003 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.409 - type: map_at_10 value: 13.093 - type: map_at_100 value: 16.256999999999998 - type: map_at_1000 value: 17.617 - type: map_at_3 value: 9.555 - type: map_at_5 value: 11.428 - type: mrr_at_1 value: 45.201 - type: mrr_at_10 value: 54.179 - type: mrr_at_100 value: 54.812000000000005 - type: mrr_at_1000 value: 54.840999999999994 - type: mrr_at_3 value: 51.909000000000006 - type: mrr_at_5 value: 53.519000000000005 - type: ndcg_at_1 value: 43.189 - type: ndcg_at_10 value: 35.028 - type: ndcg_at_100 value: 31.226 - type: ndcg_at_1000 value: 39.678000000000004 - type: ndcg_at_3 value: 40.596 - type: ndcg_at_5 value: 38.75 - type: precision_at_1 value: 44.582 - type: precision_at_10 value: 25.974999999999998 - type: precision_at_100 value: 7.793 - type: precision_at_1000 value: 2.036 - type: precision_at_3 value: 38.493 - type: precision_at_5 value: 33.994 - type: recall_at_1 value: 5.409 - type: recall_at_10 value: 16.875999999999998 - type: recall_at_100 value: 30.316 - type: recall_at_1000 value: 60.891 - type: recall_at_3 value: 10.688 - type: recall_at_5 value: 13.832 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 36.375 - type: map_at_10 value: 51.991 - type: map_at_100 value: 52.91400000000001 - type: map_at_1000 value: 52.93600000000001 - type: map_at_3 value: 48.014 - type: map_at_5 value: 50.381 - type: mrr_at_1 value: 40.759 - type: mrr_at_10 value: 54.617000000000004 - type: mrr_at_100 value: 55.301 - type: mrr_at_1000 value: 55.315000000000005 - type: mrr_at_3 value: 51.516 - type: mrr_at_5 value: 53.435 - type: ndcg_at_1 value: 40.759 - type: ndcg_at_10 value: 59.384 - type: ndcg_at_100 value: 63.157 - type: ndcg_at_1000 value: 63.654999999999994 - type: ndcg_at_3 value: 52.114000000000004 - type: ndcg_at_5 value: 55.986000000000004 - type: precision_at_1 value: 40.759 - type: precision_at_10 value: 9.411999999999999 - type: precision_at_100 value: 1.153 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 23.329 - type: precision_at_5 value: 16.256999999999998 - type: recall_at_1 value: 36.375 - type: recall_at_10 value: 79.053 - type: recall_at_100 value: 95.167 - type: recall_at_1000 value: 98.82 - type: recall_at_3 value: 60.475 - type: recall_at_5 value: 69.327 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.256 - type: map_at_10 value: 83.8 - type: map_at_100 value: 84.425 - type: map_at_1000 value: 84.444 - type: map_at_3 value: 80.906 - type: map_at_5 value: 82.717 - type: mrr_at_1 value: 80.97999999999999 - type: mrr_at_10 value: 87.161 - type: mrr_at_100 value: 87.262 - type: mrr_at_1000 value: 87.263 - type: mrr_at_3 value: 86.175 - type: mrr_at_5 value: 86.848 - type: ndcg_at_1 value: 80.97999999999999 - type: ndcg_at_10 value: 87.697 - type: ndcg_at_100 value: 88.959 - type: ndcg_at_1000 value: 89.09899999999999 - type: ndcg_at_3 value: 84.83800000000001 - type: ndcg_at_5 value: 86.401 - 
type: precision_at_1 value: 80.97999999999999 - type: precision_at_10 value: 13.261000000000001 - type: precision_at_100 value: 1.5150000000000001 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 37.01 - type: precision_at_5 value: 24.298000000000002 - type: recall_at_1 value: 70.256 - type: recall_at_10 value: 94.935 - type: recall_at_100 value: 99.274 - type: recall_at_1000 value: 99.928 - type: recall_at_3 value: 86.602 - type: recall_at_5 value: 91.133 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 56.322692497613104 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 61.895813503775074 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.338 - type: map_at_10 value: 10.767 - type: map_at_100 value: 12.537999999999998 - type: map_at_1000 value: 12.803999999999998 - type: map_at_3 value: 7.788 - type: map_at_5 value: 9.302000000000001 - type: mrr_at_1 value: 21.4 - type: mrr_at_10 value: 31.637999999999998 - type: mrr_at_100 value: 32.688 - type: mrr_at_1000 value: 32.756 - type: mrr_at_3 value: 28.433000000000003 - type: mrr_at_5 value: 30.178 - type: ndcg_at_1 value: 21.4 - type: ndcg_at_10 value: 18.293 - type: ndcg_at_100 value: 25.274 - type: ndcg_at_1000 value: 30.284 - type: ndcg_at_3 value: 17.391000000000002 - type: ndcg_at_5 value: 15.146999999999998 - type: precision_at_1 value: 21.4 - type: precision_at_10 value: 9.48 - type: precision_at_100 value: 1.949 - type: precision_at_1000 value: 0.316 - type: precision_at_3 value: 16.167 - type: precision_at_5 value: 13.22 - type: recall_at_1 value: 4.338 - type: recall_at_10 value: 19.213 - type: recall_at_100 value: 39.562999999999995 - type: recall_at_1000 value: 64.08 - type: recall_at_3 value: 9.828000000000001 - type: recall_at_5 value: 13.383000000000001 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.42568163642142 - type: cos_sim_spearman value: 78.5797159641342 - type: euclidean_pearson value: 80.22151260811604 - type: euclidean_spearman value: 78.5797151953878 - type: manhattan_pearson value: 80.21224215864788 - type: manhattan_spearman value: 78.55641478381344 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 85.44020710812569 - type: cos_sim_spearman value: 78.91631735081286 - type: euclidean_pearson value: 81.64188964182102 - type: euclidean_spearman value: 78.91633286881678 - type: manhattan_pearson value: 81.69294748512496 - type: manhattan_spearman value: 78.93438558002656 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 84.27165426412311 - type: cos_sim_spearman value: 85.40429140249618 - type: euclidean_pearson value: 84.7509580724893 - type: euclidean_spearman value: 85.40429140249618 - type: manhattan_pearson value: 84.76488289321308 - type: manhattan_spearman value: 85.4256793698708 - task: 
type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 83.138851760732 - type: cos_sim_spearman value: 81.64101363896586 - type: euclidean_pearson value: 82.55165038934942 - type: euclidean_spearman value: 81.64105257080502 - type: manhattan_pearson value: 82.52802949883335 - type: manhattan_spearman value: 81.61255430718158 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.0654695484029 - type: cos_sim_spearman value: 87.20408521902229 - type: euclidean_pearson value: 86.8110651362115 - type: euclidean_spearman value: 87.20408521902229 - type: manhattan_pearson value: 86.77984656478691 - type: manhattan_spearman value: 87.1719947099227 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 83.77823915496512 - type: cos_sim_spearman value: 85.43566325729779 - type: euclidean_pearson value: 84.5396956658821 - type: euclidean_spearman value: 85.43566325729779 - type: manhattan_pearson value: 84.5665398848169 - type: manhattan_spearman value: 85.44375870303232 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.20030208471798 - type: cos_sim_spearman value: 87.20485505076539 - type: euclidean_pearson value: 88.10588324368722 - type: euclidean_spearman value: 87.20485505076539 - type: manhattan_pearson value: 87.92324770415183 - type: manhattan_spearman value: 87.0571314561877 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 63.06093161604453 - type: cos_sim_spearman value: 64.2163140357722 - type: euclidean_pearson value: 65.27589680994006 - type: euclidean_spearman value: 64.2163140357722 - type: manhattan_pearson value: 65.45904383711101 - type: manhattan_spearman value: 64.55404716679305 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.32976164578706 - type: cos_sim_spearman value: 85.54302197678368 - type: euclidean_pearson value: 85.26307149193056 - type: euclidean_spearman value: 85.54302197678368 - type: manhattan_pearson value: 85.26647282029371 - type: manhattan_spearman value: 85.5316135265568 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 81.44675968318754 - type: mrr value: 94.92741826075158 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 56.34400000000001 - type: map_at_10 value: 65.927 - type: map_at_100 value: 66.431 - type: map_at_1000 value: 66.461 - type: map_at_3 value: 63.529 - type: map_at_5 value: 64.818 - type: mrr_at_1 value: 59.333000000000006 - type: mrr_at_10 value: 67.54599999999999 - type: mrr_at_100 value: 67.892 - type: mrr_at_1000 value: 67.917 - type: mrr_at_3 value: 65.778 - type: mrr_at_5 value: 
66.794 - type: ndcg_at_1 value: 59.333000000000006 - type: ndcg_at_10 value: 70.5 - type: ndcg_at_100 value: 72.688 - type: ndcg_at_1000 value: 73.483 - type: ndcg_at_3 value: 66.338 - type: ndcg_at_5 value: 68.265 - type: precision_at_1 value: 59.333000000000006 - type: precision_at_10 value: 9.3 - type: precision_at_100 value: 1.053 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 25.889 - type: precision_at_5 value: 16.866999999999997 - type: recall_at_1 value: 56.34400000000001 - type: recall_at_10 value: 82.789 - type: recall_at_100 value: 92.767 - type: recall_at_1000 value: 99 - type: recall_at_3 value: 71.64399999999999 - type: recall_at_5 value: 76.322 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.75742574257426 - type: cos_sim_ap value: 93.52081548447406 - type: cos_sim_f1 value: 87.33850129198966 - type: cos_sim_precision value: 90.37433155080214 - type: cos_sim_recall value: 84.5 - type: dot_accuracy value: 99.75742574257426 - type: dot_ap value: 93.52081548447406 - type: dot_f1 value: 87.33850129198966 - type: dot_precision value: 90.37433155080214 - type: dot_recall value: 84.5 - type: euclidean_accuracy value: 99.75742574257426 - type: euclidean_ap value: 93.52081548447406 - type: euclidean_f1 value: 87.33850129198966 - type: euclidean_precision value: 90.37433155080214 - type: euclidean_recall value: 84.5 - type: manhattan_accuracy value: 99.75841584158415 - type: manhattan_ap value: 93.4975678585854 - type: manhattan_f1 value: 87.26708074534162 - type: manhattan_precision value: 90.45064377682404 - type: manhattan_recall value: 84.3 - type: max_accuracy value: 99.75841584158415 - type: max_ap value: 93.52081548447406 - type: max_f1 value: 87.33850129198966 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 64.31437036686651 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 33.25569319007206 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 49.90474939720706 - type: mrr value: 50.568115503777264 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 29.866828641244712 - type: cos_sim_spearman value: 30.077555055873866 - type: dot_pearson value: 29.866832988572266 - type: dot_spearman value: 30.077555055873866 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.232 - type: map_at_10 value: 2.094 - type: map_at_100 value: 11.971 - type: map_at_1000 value: 28.158 - type: map_at_3 value: 0.688 - type: map_at_5 value: 1.114 - type: mrr_at_1 value: 88 - type: mrr_at_10 value: 93.4 - type: mrr_at_100 value: 93.4 - type: mrr_at_1000 value: 93.4 - type: mrr_at_3 value: 93 - type: mrr_at_5 value: 93.4 - 
type: ndcg_at_1 value: 84 - type: ndcg_at_10 value: 79.923 - type: ndcg_at_100 value: 61.17 - type: ndcg_at_1000 value: 53.03 - type: ndcg_at_3 value: 84.592 - type: ndcg_at_5 value: 82.821 - type: precision_at_1 value: 88 - type: precision_at_10 value: 85 - type: precision_at_100 value: 63.019999999999996 - type: precision_at_1000 value: 23.554 - type: precision_at_3 value: 89.333 - type: precision_at_5 value: 87.2 - type: recall_at_1 value: 0.232 - type: recall_at_10 value: 2.255 - type: recall_at_100 value: 14.823 - type: recall_at_1000 value: 49.456 - type: recall_at_3 value: 0.718 - type: recall_at_5 value: 1.175 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.547 - type: map_at_10 value: 11.375 - type: map_at_100 value: 18.194 - type: map_at_1000 value: 19.749 - type: map_at_3 value: 5.825 - type: map_at_5 value: 8.581 - type: mrr_at_1 value: 32.653 - type: mrr_at_10 value: 51.32 - type: mrr_at_100 value: 51.747 - type: mrr_at_1000 value: 51.747 - type: mrr_at_3 value: 47.278999999999996 - type: mrr_at_5 value: 48.605 - type: ndcg_at_1 value: 29.592000000000002 - type: ndcg_at_10 value: 28.151 - type: ndcg_at_100 value: 39.438 - type: ndcg_at_1000 value: 50.769 - type: ndcg_at_3 value: 30.758999999999997 - type: ndcg_at_5 value: 30.366 - type: precision_at_1 value: 32.653 - type: precision_at_10 value: 25.714 - type: precision_at_100 value: 8.041 - type: precision_at_1000 value: 1.555 - type: precision_at_3 value: 33.333 - type: precision_at_5 value: 31.837 - type: recall_at_1 value: 2.547 - type: recall_at_10 value: 18.19 - type: recall_at_100 value: 49.538 - type: recall_at_1000 value: 83.86 - type: recall_at_3 value: 7.329 - type: recall_at_5 value: 11.532 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.4952 - type: ap value: 14.793362635531409 - type: f1 value: 55.204635551516915 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 61.5365025466893 - type: f1 value: 61.81742556334845 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 49.05531070301185 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.51725576682364 - type: cos_sim_ap value: 75.2292304265163 - type: cos_sim_f1 value: 69.54022988505749 - type: cos_sim_precision value: 63.65629110039457 - type: cos_sim_recall value: 76.62269129287598 - type: dot_accuracy value: 86.51725576682364 - type: dot_ap value: 75.22922386081054 - type: dot_f1 value: 69.54022988505749 - type: dot_precision value: 63.65629110039457 - type: dot_recall value: 76.62269129287598 - type: euclidean_accuracy value: 86.51725576682364 - type: euclidean_ap value: 75.22925730473472 - type: euclidean_f1 value: 69.54022988505749 - type: euclidean_precision value: 63.65629110039457 - type: euclidean_recall value: 76.62269129287598 
- type: manhattan_accuracy value: 86.52321630804077 - type: manhattan_ap value: 75.20608115037336 - type: manhattan_f1 value: 69.60000000000001 - type: manhattan_precision value: 64.37219730941705 - type: manhattan_recall value: 75.75197889182058 - type: max_accuracy value: 86.52321630804077 - type: max_ap value: 75.22925730473472 - type: max_f1 value: 69.60000000000001 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.34877944657896 - type: cos_sim_ap value: 86.71257569277373 - type: cos_sim_f1 value: 79.10386355986088 - type: cos_sim_precision value: 76.91468470434214 - type: cos_sim_recall value: 81.4213119802895 - type: dot_accuracy value: 89.34877944657896 - type: dot_ap value: 86.71257133133368 - type: dot_f1 value: 79.10386355986088 - type: dot_precision value: 76.91468470434214 - type: dot_recall value: 81.4213119802895 - type: euclidean_accuracy value: 89.34877944657896 - type: euclidean_ap value: 86.71257651501476 - type: euclidean_f1 value: 79.10386355986088 - type: euclidean_precision value: 76.91468470434214 - type: euclidean_recall value: 81.4213119802895 - type: manhattan_accuracy value: 89.35848177901967 - type: manhattan_ap value: 86.69330615469126 - type: manhattan_f1 value: 79.13867741453949 - type: manhattan_precision value: 76.78881807647741 - type: manhattan_recall value: 81.63689559593472 - type: max_accuracy value: 89.35848177901967 - type: max_ap value: 86.71257651501476 - type: max_f1 value: 79.13867741453949 --- # nomic-embed-text-v1: A Reproducible Long Context (8192) Text Embedder `nomic-embed-text-v1` is an 8192 context length text encoder that surpasses OpenAI text-embedding-ada-002 and text-embedding-3-small performance on short and long context tasks. | Name | SeqLen | MTEB | LoCo | Jina Long Context | Open Weights | Open Training Code | Open Data | | :-------------------------------:| :----- | :-------- | :------: | :---------------: | :-----------: | :----------------: | :---------- | | nomic-embed-text-v1 | 8192 | **62.39** | **85.53** | 54.16 | ✅ | ✅ | ✅ | | jina-embeddings-v2-base-en | 8192 | 60.39 | 85.45 | 51.90 | ✅ | ❌ | ❌ | | text-embedding-3-small | 8191 | 62.26 | 82.40 | **58.20** | ❌ | ❌ | ❌ | | text-embedding-ada-002 | 8191 | 60.99 | 52.7 | 55.25 | ❌ | ❌ | ❌ | ## Hosted Inference API The easiest way to get started with Nomic Embed is through the Nomic Embedding API. Generating embeddings with the `nomic` Python client is as easy as ```python from nomic import embed output = embed.text( texts=['Nomic Embedding API', '#keepAIOpen'], model='nomic-embed-text-v1', task_type='search_document' ) print(output) ``` For more information, see the [API reference](https://docs.nomic.ai/reference/endpoints/nomic-embed-text). ## Data Visualization Click the Nomic Atlas map below to visualize a 5M sample of our contrastive pretraining data! [![image/webp](https://cdn-uploads.huggingface.co/production/uploads/607997c83a565c15675055b3/pjhJhuNyRfPagRd_c_iUz.webp)](https://atlas.nomic.ai/map/nomic-text-embed-v1-5m-sample) ## Training Details We train our embedder using a multi-stage training pipeline.
Starting from a long-context [BERT model](https://huggingface.co/nomic-ai/nomic-bert-2048), the first unsupervised contrastive stage trains on a dataset generated from weakly related text pairs, such as question-answer pairs from forums like StackExchange and Quora, title-body pairs from Amazon reviews, and summarizations from news articles. In the second finetuning stage, higher quality labeled datasets such as search queries and answers from web searches are leveraged. Data curation and hard-example mining are crucial in this stage. For more details, see the Nomic Embed [Technical Report](https://static.nomic.ai/reports/2024_Nomic_Embed_Text_Technical_Report.pdf) and corresponding [blog post](https://blog.nomic.ai/posts/nomic-embed-text-v1). The training data is released in its entirety; for more details, see the `contrastors` [repository](https://github.com/nomic-ai/contrastors). ## Usage Note that `nomic-embed-text` *requires* prefixes! We support the prefixes `[search_query, search_document, classification, clustering]`. For retrieval applications, you should prepend `search_document` to all your documents and `search_query` to your queries. For example, suppose you are building a RAG application on top of Wikipedia: you would embed all Wikipedia articles with the prefix `search_document` and any questions you ask with `search_query`. For example: ```python queries = ["search_query: who is the first president of the united states?", "search_query: when was babe ruth born?"] documents = ["search_document: <article about US Presidents>", "search_document: <article about Babe Ruth>"] ``` ### Sentence Transformers ```python from sentence_transformers import SentenceTransformer model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True) sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'] embeddings = model.encode(sentences) print(embeddings) ``` ### Transformers ```python import torch import torch.nn.functional as F from transformers import AutoTokenizer, AutoModel def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) sentences = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'] tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased') model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True) model.eval() encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') with torch.no_grad(): model_output = model(**encoded_input) embeddings = mean_pooling(model_output, encoded_input['attention_mask']) embeddings = F.normalize(embeddings, p=2, dim=1) print(embeddings) ``` The model natively supports scaling of the sequence length past 2048 tokens.
To do so, ```diff - tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased') + tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased', model_max_length=8192) - model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True) + model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1', trust_remote_code=True, rotary_scaling_factor=2) ``` ### Transformers.js ```js import { pipeline } from '@xenova/transformers'; // Create a feature extraction pipeline const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', { quantized: false, // Comment out this line to use the quantized version }); // Compute sentence embeddings const texts = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?']; const embeddings = await extractor(texts, { pooling: 'mean', normalize: true }); console.log(embeddings); ``` # Join the Nomic Community - Nomic: [https://nomic.ai](https://nomic.ai) - Discord: [https://discord.gg/myY5YDR8z8](https://discord.gg/myY5YDR8z8) - Twitter: [https://twitter.com/nomic_ai](https://twitter.com/nomic_ai) # Citation If you find the model, dataset, or training code useful, please cite our work ```bibtex @misc{nussbaum2024nomic, title={Nomic Embed: Training a Reproducible Long Context Text Embedder}, author={Zach Nussbaum and John X. Morris and Brandon Duderstadt and Andriy Mulyar}, year={2024}, eprint={2402.01613}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-8bits
RichardErkhov
text-generation
[ "transformers", "safetensors", "gpt_neox", "text-generation", "arxiv:2304.01373", "arxiv:2101.00027", "arxiv:2201.07311", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "8-bit", "bitsandbytes", "region:us" ]
1,713
1,713
4
0
--- {} --- Quantization made by Richard Erkhov. [Github](https://github.com/RichardErkhov) [Discord](https://discord.gg/pvy7H8DZMG) [Request more models](https://github.com/RichardErkhov/quant_request) pythia-2.8b-deduped - bnb 8bits - Model creator: https://huggingface.co/EleutherAI/ - Original model: https://huggingface.co/EleutherAI/pythia-2.8b-deduped/ Original model description: --- language: - en tags: - pytorch - causal-lm - pythia license: apache-2.0 datasets: - EleutherAI/the_pile_deduplicated --- The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf). It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. We also provide 154 intermediate checkpoints per model, hosted on Hugging Face as branches. The Pythia model suite was designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites. <details> <summary style="font-weight:600">Details on previous early release and naming convention.</summary> Previously, we released an early version of the Pythia suite to the public. However, we decided to retrain the model suite to address a few hyperparameter discrepancies. This model card <a href="#changelog">lists the changes</a>; see appendix B in the Pythia paper for further discussion. We found no difference in benchmark performance between the two Pythia versions. The old models are [still available](https://huggingface.co/models?other=pythia_v0), but we suggest the retrained suite if you are just starting to use Pythia.<br> **This is the current release.** Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts. </details> <br> # Pythia-2.8B-deduped ## Model Details - Developed by: [EleutherAI](http://eleuther.ai) - Model type: Transformer-based Language Model - Language: English - Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use. [See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation details. - Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox) - License: Apache 2.0 - Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
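Since this repository hosts an 8-bit bitsandbytes export of the original checkpoint, a minimal loading sketch may be useful. This is an assumption-laden sketch, not part of the original card: it assumes `transformers` with `accelerate` and `bitsandbytes` installed, a CUDA GPU available, and that the quantization config saved with the export is picked up automatically by `from_pretrained`.

```python
# Minimal sketch (assumptions: transformers + accelerate + bitsandbytes, CUDA GPU).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/EleutherAI_-_pythia-2.8b-deduped-8bits"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# The repo was exported with bitsandbytes 8-bit quantization, so the saved
# quantization config should be applied automatically at load time.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello, I am", return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(tokens[0]))
```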
<figure> | Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models | | -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: | | 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — | | 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M | | 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M | | 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — | | 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B | | 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B | | 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B | | 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — | <figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. “Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption> </figure> ## Uses and Limitations ### Intended Use The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. We also provide 154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints `step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to `step143000`. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model. You may also further fine-tune and adapt Pythia-2.8B-deduped for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-2.8B-deduped as a basis for your fine-tuned model, please conduct your own risk and bias assessment. ### Out-of-scope use The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case. Pythia models are English-language only, and are not suitable for translation or generating text in other languages. Pythia-2.8B-deduped has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means Pythia-2.8B-deduped will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions. ### Limitations and biases The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text. Never rely on Pythia-2.8B-deduped to produce factually accurate output. This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-2.8B-deduped may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive. If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting it to other people. Please inform your audience that the text was generated by Pythia-2.8B-deduped. ### Quickstart Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint: ```python from transformers import GPTNeoXForCausalLM, AutoTokenizer model = GPTNeoXForCausalLM.from_pretrained( "EleutherAI/pythia-70m-deduped", revision="step3000", cache_dir="./pythia-70m-deduped/step3000", ) tokenizer = AutoTokenizer.from_pretrained( "EleutherAI/pythia-70m-deduped", revision="step3000", cache_dir="./pythia-70m-deduped/step3000", ) inputs = tokenizer("Hello, I am", return_tensors="pt") tokens = model.generate(**inputs) tokenizer.decode(tokens[0]) ``` Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br> For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia). ## Training ### Training data Pythia-2.8B-deduped was trained on the Pile **after the dataset has been globally deduplicated**.<br> [The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/). ### Training procedure All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training, from `step1000` to `step143000` (which is the same as `main`). In addition, we also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile. All *Pythia* models trained for 143000 steps at a batch size of 2M (2,097,152 tokens).<br> See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br> Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b). ## Evaluations All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness).
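As a rough sketch of reproducing a comparable evaluation locally (an illustration, not the exact published setup: it assumes a recent v0.4+ harness that exposes `lm_eval.simple_evaluate`, and the task list below is an arbitrary subset of the benchmarks plotted next):

```python
# Illustrative sketch only; assumes `pip install lm-eval` (v0.4+).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=EleutherAI/pythia-2.8b-deduped",
    tasks=["lambada_openai", "piqa"],  # illustrative subset, not the full suite
)
print(results["results"])
```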
You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br> Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM. <details> <summary>LAMBADA – OpenAI</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/> </details> <details> <summary>Physical Interaction: Question Answering (PIQA)</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/> </details> <details> <summary>WinoGrande</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/> </details> <details> <summary>AI2 Reasoning Challenge—Easy Set</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/> </details> <details> <summary>SciQ</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/> </details> ## Changelog This section compares differences between previously released [Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current models. See Appendix B of the Pythia paper for further discussion of these changes and the motivation behind them. We found that retraining Pythia had no impact on benchmark performance. - All model sizes are now trained with a uniform batch size of 2M tokens. Previously, the models of size 160M, 410M, and 1.4B parameters were trained with batch sizes of 4M tokens. - We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,128,256,512} in addition to every 1000 training steps. - Flash Attention was used in the new retrained suite. - We remedied a minor inconsistency that existed in the original suite: all models of size 2.8B parameters or smaller had a learning rate (LR) schedule which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and 12B models all used an LR schedule which decayed to a minimum LR of 0. In the redone training runs, we rectified this inconsistency: all models were trained with the LR decaying to a minimum of 0.1× their maximum LR. ### Naming convention and parameter count *Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count. <figure style="width:32em"> | current Pythia suffix | old suffix | total params | non-embedding params | | --------------------: | ---------: | -------------: | -------------------: | | 70M | 19M | 70,426,624 | 18,915,328 | | 160M | 125M | 162,322,944 | 85,056,000 | | 410M | 350M | 405,334,016 | 302,311,424 | | 1B | 800M | 1,011,781,632 | 805,736,448 | | 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 | | 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 | | 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 | | 12B | 13B | 11,846,072,320 | 11,327,027,200 | </figure>
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
aident-ai/bge-base-en-onnx
aident-ai
feature-extraction
[ "sentence-transformers", "pytorch", "onnx", "bert", "feature-extraction", "sentence-similarity", "transformers", "mteb", "en", "arxiv:2309.07597", "license:mit", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,694
1,695
19
2
--- language: - en license: mit tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers - mteb --- This is a fork of [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5), exported to ONNX for inference. ======= <h1 align="center">FlagEmbedding</h1> <h4 align="center"> <p> <a href=#model-list>Model List</a> | <a href=#frequently-asked-questions>FAQ</a> | <a href=#usage>Usage</a> | <a href="#evaluation">Evaluation</a> | <a href="#train">Train</a> | <a href="#contact">Contact</a> | <a href="#citation">Citation</a> | <a href="#license">License</a> <p> </h4> For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding). [English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md) FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, or semantic search. It can also be used in vector databases for LLMs. ************* 🌟**Updates**🌟 ************* - 09/15/2023: Release [paper](https://arxiv.org/pdf/2309.07597.pdf) and [dataset](https://data.baai.ac.cn/details/BAAI-MTP). - 09/12/2023: New Release: - **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models. - **Updated embedding model**: release `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution, and enhance retrieval ability without instruction. - 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): Add script to mine hard negatives and support adding instruction during fine-tuning. - 08/09/2023: BGE Models are integrated into **Langchain**, you can use them like [this](#using-langchain); the C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard). - 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗** - 08/02/2023: Release `bge-large-*` (short for BAAI General Embedding) models, **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada: - 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets. ## Model List `bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval\* | |:-------------------------------|:--------:| :--------:| :--------:|:--------:| | [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient \** | | | [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient \** | | | [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive 
performance | `Represent this sentence for searching relevant passages: ` | | [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` | | [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` | \*: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages. \**: Different from an embedding model, a reranker uses the question and document as input and directly outputs similarity instead of an embedding. To balance accuracy and time cost, a cross-encoder is widely used to re-rank the top-k documents retrieved by other simple models. For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results. ## Frequently asked questions <details> <summary>1. How to fine-tune bge embedding model?</summary> <!-- ### How to fine-tune bge embedding model? --> Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model. Some suggestions: - Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance. - If you pre-train bge on your data, the pre-trained model cannot be used to calculate similarity directly; it must be fine-tuned with contrastive learning before computing similarity. - If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker. </details> <details> <summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary> <!-- ### The similarity score between two dissimilar sentences is higher than 0.5 --> **We suggest using bge v1.5, which alleviates the issue of the similarity distribution.** Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar. For downstream tasks, such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not the absolute value.** If you need to filter similar sentences based on a similarity threshold, please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9). </details> <details> <summary>3. 
When does the query instruction need to be used?</summary> <!-- ### When does the query instruction need to be used --> For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions to these short queries. **The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.** In all cases, the documents/passages do not need the instruction. </details> ## Usage ### Usage for Embedding Model Here are some examples of using `bge` models with [FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers). #### Using FlagEmbedding ``` pip install -U FlagEmbedding ``` If that doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install FlagEmbedding. ```python from FlagEmbedding import FlagModel sentences_1 = ["样例数据-1", "样例数据-2"] sentences_2 = ["样例数据-3", "样例数据-4"] model = FlagModel('BAAI/bge-large-zh-v1.5', query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:", use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation embeddings_1 = model.encode(sentences_1) embeddings_2 = model.encode(sentences_2) similarity = embeddings_1 @ embeddings_2.T print(similarity) # for the s2p (short query to long passage) retrieval task, we suggest using encode_queries(), which automatically adds the instruction to each query # the corpus in a retrieval task can still use encode() or encode_corpus(), since passages don't need the instruction queries = ['query_1', 'query_2'] passages = ["样例文档-1", "样例文档-2"] q_embeddings = model.encode_queries(queries) p_embeddings = model.encode(passages) scores = q_embeddings @ p_embeddings.T ``` For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list). By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs. You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable. #### Using Sentence-Transformers You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net): ``` pip install -U sentence-transformers ``` ```python from sentence_transformers import SentenceTransformer sentences_1 = ["样例数据-1", "样例数据-2"] sentences_2 = ["样例数据-3", "样例数据-4"] model = SentenceTransformer('BAAI/bge-large-zh-v1.5') embeddings_1 = model.encode(sentences_1, normalize_embeddings=True) embeddings_2 = model.encode(sentences_2, normalize_embeddings=True) similarity = embeddings_1 @ embeddings_2.T print(similarity) ``` For the s2p (short query to long passage) retrieval task, each short query should start with an instruction (for instructions, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)). But the instruction is not needed for passages.
```python from sentence_transformers import SentenceTransformer queries = ['query_1', 'query_2'] passages = ["样例文档-1", "样例文档-2"] instruction = "为这个句子生成表示以用于检索相关文章:" model = SentenceTransformer('BAAI/bge-large-zh-v1.5') q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True) p_embeddings = model.encode(passages, normalize_embeddings=True) scores = q_embeddings @ p_embeddings.T ``` #### Using Langchain You can use `bge` in LangChain like this: ```python from langchain.embeddings import HuggingFaceBgeEmbeddings model_name = "BAAI/bge-large-en-v1.5" model_kwargs = {'device': 'cuda'} encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity model = HuggingFaceBgeEmbeddings( model_name=model_name, model_kwargs=model_kwargs, encode_kwargs=encode_kwargs, query_instruction="为这个句子生成表示以用于检索相关文章:" ) model.query_instruction = "为这个句子生成表示以用于检索相关文章:" ``` #### Using HuggingFace Transformers With the transformers package, you can use the model like this: first, you pass your input through the transformer model, then you select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding. ```python from transformers import AutoTokenizer, AutoModel import torch # Sentences we want sentence embeddings for sentences = ["样例数据-1", "样例数据-2"] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5') model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5') model.eval() # Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') # for the s2p (short query to long passage) retrieval task, add an instruction to each query (do not add the instruction to passages) # encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) # Perform pooling. In this case, cls pooling. sentence_embeddings = model_output[0][:, 0] # normalize embeddings sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1) print("Sentence embeddings:", sentence_embeddings) ``` ### Usage for Reranker Different from an embedding model, a reranker uses the question and document as input and directly outputs similarity instead of an embedding. You can get a relevance score by inputting a query and passage to the reranker. The reranker is optimized based on cross-entropy loss, so the relevance score is not bounded to a specific range.
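Since the raw reranker outputs are unbounded logits, one convenient trick when a score in (0, 1) is easier to threshold is to pass them through a sigmoid. This is a purely illustrative sketch, not an official part of the FlagEmbedding API, and the example logits are hypothetical:

```python
# Illustrative sketch: squashing unbounded reranker logits into (0, 1).
# The resulting values are convenience scores, not calibrated probabilities.
import math

def to_unit_interval(logit: float) -> float:
    return 1.0 / (1.0 + math.exp(-logit))

raw_scores = [-5.6, 5.3]  # hypothetical logits from a reranker
print([round(to_unit_interval(s), 4) for s in raw_scores])
```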
#### Using FlagEmbedding ``` pip install -U FlagEmbedding ``` Get relevance scores (higher scores indicate more relevance): ```python from FlagEmbedding import FlagReranker reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation score = reranker.compute_score(['query', 'passage']) print(score) scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]) print(scores) ``` #### Using Huggingface transformers ```python import torch from transformers import AutoModelForSequenceClassification, AutoTokenizer tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large') model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large') model.eval() pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']] with torch.no_grad(): inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512) scores = model(**inputs, return_dict=True).logits.view(-1, ).float() print(scores) ``` ## Evaluation `baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!** For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md). - **MTEB**: | Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) | |:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:| | [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 | | [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 | | [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 | | [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 | | [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 | | [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 | | [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 | | [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 | | [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 | | [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 | | [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 | | [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 | | 
[text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 | | [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 | | [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 | | [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 | | [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 | - **C-MTEB**: We create the benchmark C-MTEB for Chinese text embedding which consists of 31 datasets from 6 tasks. Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction. | Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering | |:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:| | [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 | | [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 | | [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 | | [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 | | [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 | | [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 | | [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 | | [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 | | [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 | | [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 | | [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 | | [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 | | [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 | | [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 | | [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 | | [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 | - **Reranking**: See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for 
evaluation script. | Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg | |:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:| | text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 | | multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 | | multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 | | multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 | | m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 | | m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 | | bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 | | bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 | | [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 | | [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 | \* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks. ## Train ### BAAI Embedding We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning. **You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).** We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain). Note that the goal of pre-training is to reconstruct the text, so the pre-trained model cannot be used for similarity calculation directly; it needs to be fine-tuned. For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md). ### BGE Reranker A cross-encoder performs full attention over the input pair, which is more accurate than an embedding model (i.e., bi-encoder) but more time-consuming. Therefore, it can be used to re-rank the top-k documents returned by an embedding model. We train the cross-encoder on multilingual pair data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker). For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker). ## Contact If you have any questions or suggestions related to this project, feel free to open an issue or pull request. You can also email Shitao Xiao ([email protected]) and Zheng Liu ([email protected]). ## Citation If you find our work helpful, please cite us: ``` @misc{bge_embedding, title={C-Pack: Packaged Resources To Advance General Chinese Embedding}, author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff}, year={2023}, eprint={2309.07597}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ## License FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
[ "SEMANTIC_SIMILARITY", "SUMMARIZATION" ]
[ "BEAR" ]
Non_BioNLP
EleutherAI/pythia-1.4b
EleutherAI
text-generation
[ "transformers", "pytorch", "safetensors", "gpt_neox", "text-generation", "causal-lm", "pythia", "en", "dataset:EleutherAI/the_pile", "arxiv:2304.01373", "arxiv:2101.00027", "arxiv:2201.07311", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,675
1,688
83,953
23
--- datasets: - EleutherAI/the_pile language: - en license: apache-2.0 tags: - pytorch - causal-lm - pythia --- The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf). It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. We also provide 154 intermediate checkpoints per model, hosted on Hugging Face as branches. The Pythia model suite was deliberately designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites. <details> <summary style="font-weight:600">Details on previous early release and naming convention.</summary> Previously, we released an early version of the Pythia suite to the public. However, we decided to retrain the model suite to address a few hyperparameter discrepancies. This model card <a href="#changelog">lists the changes</a>; see appendix B in the Pythia paper for further discussion. We found no difference in benchmark performance between the two Pythia versions. The old models are [still available](https://huggingface.co/models?other=pythia_v0), but we suggest the retrained suite if you are just starting to use Pythia.<br> **This is the current release.** Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts. </details> <br> # Pythia-1.4B ## Model Details - Developed by: [EleutherAI](http://eleuther.ai) - Model type: Transformer-based Language Model - Language: English - Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use. [See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation details. - Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox) - License: Apache 2.0 - Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]).
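For quick orientation before the details below, here is a minimal sketch of loading this specific model (standard transformers API; the prompt and generation settings are illustrative, and `step143000` matches the `main` branch as noted later in this card):

```python
# Minimal sketch: loading Pythia-1.4B at its final checkpoint.
from transformers import GPTNeoXForCausalLM, AutoTokenizer

model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/pythia-1.4b", revision="step143000")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-1.4b", revision="step143000")

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(tokens[0]))
```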
<figure> | Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models | | -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: | | 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — | | 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M | | 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M | | 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — | | 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B | | 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B | | 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B | | 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — | <figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. “Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption> </figure> ## Uses and Limitations ### Intended Use The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. We also provide 154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints `step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to `step143000`. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model. You may also further fine-tune and adapt Pythia-1.4B for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-1.4B as a basis for your fine-tuned model, please conduct your own risk and bias assessment. ### Out-of-scope use The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case. Pythia models are English-language only, and are not suitable for translation or generating text in other languages. Pythia-1.4B has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means Pythia-1.4B will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions. ### Limitations and biases The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text. Never rely on Pythia-1.4B to produce factually accurate output. This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive.
You may also further fine-tune and adapt Pythia-1.4B for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-1.4B as a basis for your fine-tuned model, please conduct your own risk and bias assessment.

### Out-of-scope use

The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case.

Pythia models are English-language only, and are not suitable for translation or generating text in other languages.

Pythia-1.4B has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose or commercial chatbots. This means Pythia-1.4B will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions.

### Limitations and biases

The core functionality of a large language model is to take a string of text and predict the next token. The token deemed most likely by the model need not produce the most “accurate” text. Never rely on Pythia-1.4B to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-1.4B may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting it to other people. Please inform your audience that the text was generated by Pythia-1.4B.

### Quickstart

Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```

Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia).

## Training

### Training data

[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-1.4B.

### Training procedure

All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training, from `step1000` to `step143000` (which is the same as `main`). In addition, we also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.

All *Pythia* models trained for 143000 steps at a batch size of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).

## Evaluations

All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
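If you want to reproduce a number yourself, recent harness releases expose a Python entry point. The sketch below assumes a v0.4-style installation of the harness; flag and API names differ across versions, so treat it as a starting point rather than the exact configuration used for the published results:

```python
# Sketch, assuming lm-evaluation-harness v0.4+ (the API differs in older
# releases): score a Pythia checkpoint on two of the tasks plotted below.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=EleutherAI/pythia-1.4b,revision=step143000",
    tasks=["lambada_openai", "piqa"],
)
print(results["results"])
```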
Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM.

<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>

<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>

<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>

<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>

<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>

## Changelog

This section compares differences between previously released [Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current models. See Appendix B of the Pythia paper for further discussion of these changes and the motivation behind them. We found that retraining Pythia had no impact on benchmark performance.

- All model sizes are now trained with uniform batch size of 2M tokens. Previously, the models of size 160M, 410M, and 1.4B parameters were trained with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all models of size 2.8B parameters or smaller had a learning rate (LR) schedule which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and 12B models all used an LR schedule which decayed to a minimum LR of 0. In the redone training runs, we rectified this inconsistency: all models were now trained with LR decaying to a minimum of 0.1× their maximum LR.

### Naming convention and parameter count

*Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.

<figure style="width:32em">

| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure>
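When reading older documentation, a small lookup derived from the table above can help translate between the two conventions (purely illustrative, not part of any Pythia release):

```python
# Illustrative: old-suffix -> new-suffix mapping, taken from the table above.
OLD_TO_NEW = {
    "19M": "70M", "125M": "160M", "350M": "410M", "800M": "1B",
    "1.3B": "1.4B", "2.7B": "2.8B", "6.7B": "6.9B", "13B": "12B",
}
print(f"pythia-{OLD_TO_NEW['1.3B']}")  # the model described by this card
```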
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
mxs980/gte-Qwen2-1.5B-instruct-Q8_0-GGUF
mxs980
sentence-similarity
[ "sentence-transformers", "gguf", "mteb", "transformers", "Qwen2", "sentence-similarity", "llama-cpp", "gguf-my-repo", "base_model:Alibaba-NLP/gte-Qwen2-1.5B-instruct", "base_model:quantized:Alibaba-NLP/gte-Qwen2-1.5B-instruct", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us", "conversational" ]
1,719
1,719
41
0
--- base_model: Alibaba-NLP/gte-Qwen2-1.5B-instruct license: apache-2.0 tags: - mteb - sentence-transformers - transformers - Qwen2 - sentence-similarity - llama-cpp - gguf-my-repo model-index: - name: gte-qwen2-7B-instruct results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 83.98507462686567 - type: ap value: 50.93015252587014 - type: f1 value: 78.50416599051215 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 96.61065 - type: ap value: 94.89174052954196 - type: f1 value: 96.60942596940565 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 55.614000000000004 - type: f1 value: 54.90553480294904 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 45.164 - type: map_at_10 value: 61.519 - type: map_at_100 value: 61.769 - type: map_at_1000 value: 61.769 - type: map_at_3 value: 57.443999999999996 - type: map_at_5 value: 60.058 - type: mrr_at_1 value: 46.088 - type: mrr_at_10 value: 61.861 - type: mrr_at_100 value: 62.117999999999995 - type: mrr_at_1000 value: 62.117999999999995 - type: mrr_at_3 value: 57.729 - type: mrr_at_5 value: 60.392 - type: ndcg_at_1 value: 45.164 - type: ndcg_at_10 value: 69.72 - type: ndcg_at_100 value: 70.719 - type: ndcg_at_1000 value: 70.719 - type: ndcg_at_3 value: 61.517999999999994 - type: ndcg_at_5 value: 66.247 - type: precision_at_1 value: 45.164 - type: precision_at_10 value: 9.545 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 24.443 - type: precision_at_5 value: 16.97 - type: recall_at_1 value: 45.164 - type: recall_at_10 value: 95.448 - type: recall_at_100 value: 99.644 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 73.329 - type: recall_at_5 value: 84.851 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 50.511868162026175 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 45.007803189284004 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 64.55292107723382 - type: mrr value: 77.66158818097877 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 85.65459047085452 - type: cos_sim_spearman value: 82.10729255710761 - type: euclidean_pearson value: 82.78079159312476 - type: euclidean_spearman value: 80.50002701880933 - type: manhattan_pearson value: 82.41372641383016 - type: manhattan_spearman value: 80.57412509272639 - task: type: 
Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 87.30844155844156 - type: f1 value: 87.25307322443255 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 43.20754608934859 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 38.818037697335505 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 35.423 - type: map_at_10 value: 47.198 - type: map_at_100 value: 48.899 - type: map_at_1000 value: 49.004 - type: map_at_3 value: 43.114999999999995 - type: map_at_5 value: 45.491 - type: mrr_at_1 value: 42.918 - type: mrr_at_10 value: 53.299 - type: mrr_at_100 value: 54.032000000000004 - type: mrr_at_1000 value: 54.055 - type: mrr_at_3 value: 50.453 - type: mrr_at_5 value: 52.205999999999996 - type: ndcg_at_1 value: 42.918 - type: ndcg_at_10 value: 53.98 - type: ndcg_at_100 value: 59.57 - type: ndcg_at_1000 value: 60.879000000000005 - type: ndcg_at_3 value: 48.224000000000004 - type: ndcg_at_5 value: 50.998 - type: precision_at_1 value: 42.918 - type: precision_at_10 value: 10.299999999999999 - type: precision_at_100 value: 1.687 - type: precision_at_1000 value: 0.211 - type: precision_at_3 value: 22.842000000000002 - type: precision_at_5 value: 16.681 - type: recall_at_1 value: 35.423 - type: recall_at_10 value: 66.824 - type: recall_at_100 value: 89.564 - type: recall_at_1000 value: 97.501 - type: recall_at_3 value: 50.365 - type: recall_at_5 value: 57.921 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: BeIR/cqadupstack config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 33.205 - type: map_at_10 value: 44.859 - type: map_at_100 value: 46.135 - type: map_at_1000 value: 46.259 - type: map_at_3 value: 41.839 - type: map_at_5 value: 43.662 - type: mrr_at_1 value: 41.146 - type: mrr_at_10 value: 50.621 - type: mrr_at_100 value: 51.207 - type: mrr_at_1000 value: 51.246 - type: mrr_at_3 value: 48.535000000000004 - type: mrr_at_5 value: 49.818 - type: ndcg_at_1 value: 41.146 - type: ndcg_at_10 value: 50.683 - type: ndcg_at_100 value: 54.82 - type: ndcg_at_1000 value: 56.69 - type: ndcg_at_3 value: 46.611000000000004 - type: ndcg_at_5 value: 48.66 - type: precision_at_1 value: 41.146 - type: precision_at_10 value: 9.439 - type: precision_at_100 value: 1.465 - type: precision_at_1000 value: 0.194 - type: precision_at_3 value: 22.59 - type: precision_at_5 value: 15.86 - type: recall_at_1 value: 33.205 - type: recall_at_10 value: 61.028999999999996 - type: recall_at_100 value: 78.152 - type: recall_at_1000 value: 89.59700000000001 - type: recall_at_3 value: 49.05 - type: recall_at_5 value: 54.836 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: BeIR/cqadupstack config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 41.637 - type: map_at_10 value: 55.162 - type: map_at_100 value: 56.142 - type: map_at_1000 value: 56.188 - type: 
map_at_3 value: 51.564 - type: map_at_5 value: 53.696 - type: mrr_at_1 value: 47.524 - type: mrr_at_10 value: 58.243 - type: mrr_at_100 value: 58.879999999999995 - type: mrr_at_1000 value: 58.9 - type: mrr_at_3 value: 55.69499999999999 - type: mrr_at_5 value: 57.284 - type: ndcg_at_1 value: 47.524 - type: ndcg_at_10 value: 61.305 - type: ndcg_at_100 value: 65.077 - type: ndcg_at_1000 value: 65.941 - type: ndcg_at_3 value: 55.422000000000004 - type: ndcg_at_5 value: 58.516 - type: precision_at_1 value: 47.524 - type: precision_at_10 value: 9.918000000000001 - type: precision_at_100 value: 1.276 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 24.765 - type: precision_at_5 value: 17.204 - type: recall_at_1 value: 41.637 - type: recall_at_10 value: 76.185 - type: recall_at_100 value: 92.149 - type: recall_at_1000 value: 98.199 - type: recall_at_3 value: 60.856 - type: recall_at_5 value: 68.25099999999999 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: BeIR/cqadupstack config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 26.27 - type: map_at_10 value: 37.463 - type: map_at_100 value: 38.434000000000005 - type: map_at_1000 value: 38.509 - type: map_at_3 value: 34.226 - type: map_at_5 value: 36.161 - type: mrr_at_1 value: 28.588 - type: mrr_at_10 value: 39.383 - type: mrr_at_100 value: 40.23 - type: mrr_at_1000 value: 40.281 - type: mrr_at_3 value: 36.422 - type: mrr_at_5 value: 38.252 - type: ndcg_at_1 value: 28.588 - type: ndcg_at_10 value: 43.511 - type: ndcg_at_100 value: 48.274 - type: ndcg_at_1000 value: 49.975 - type: ndcg_at_3 value: 37.319 - type: ndcg_at_5 value: 40.568 - type: precision_at_1 value: 28.588 - type: precision_at_10 value: 6.893000000000001 - type: precision_at_100 value: 0.9900000000000001 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 16.347 - type: precision_at_5 value: 11.661000000000001 - type: recall_at_1 value: 26.27 - type: recall_at_10 value: 60.284000000000006 - type: recall_at_100 value: 81.902 - type: recall_at_1000 value: 94.43 - type: recall_at_3 value: 43.537 - type: recall_at_5 value: 51.475 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: BeIR/cqadupstack config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 18.168 - type: map_at_10 value: 28.410000000000004 - type: map_at_100 value: 29.78 - type: map_at_1000 value: 29.892999999999997 - type: map_at_3 value: 25.238 - type: map_at_5 value: 26.96 - type: mrr_at_1 value: 23.507 - type: mrr_at_10 value: 33.382 - type: mrr_at_100 value: 34.404 - type: mrr_at_1000 value: 34.467999999999996 - type: mrr_at_3 value: 30.637999999999998 - type: mrr_at_5 value: 32.199 - type: ndcg_at_1 value: 23.507 - type: ndcg_at_10 value: 34.571000000000005 - type: ndcg_at_100 value: 40.663 - type: ndcg_at_1000 value: 43.236000000000004 - type: ndcg_at_3 value: 29.053 - type: ndcg_at_5 value: 31.563999999999997 - type: precision_at_1 value: 23.507 - type: precision_at_10 value: 6.654 - type: precision_at_100 value: 1.113 - type: precision_at_1000 value: 0.146 - type: precision_at_3 value: 14.427999999999999 - type: precision_at_5 value: 10.498000000000001 - type: recall_at_1 value: 18.168 - type: recall_at_10 value: 48.443000000000005 - type: recall_at_100 value: 74.47 - type: recall_at_1000 value: 92.494 - type: recall_at_3 value: 33.379999999999995 - type: recall_at_5 value: 39.76 - task: type: 
Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: BeIR/cqadupstack config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 32.39 - type: map_at_10 value: 44.479 - type: map_at_100 value: 45.977000000000004 - type: map_at_1000 value: 46.087 - type: map_at_3 value: 40.976 - type: map_at_5 value: 43.038 - type: mrr_at_1 value: 40.135 - type: mrr_at_10 value: 50.160000000000004 - type: mrr_at_100 value: 51.052 - type: mrr_at_1000 value: 51.087 - type: mrr_at_3 value: 47.818 - type: mrr_at_5 value: 49.171 - type: ndcg_at_1 value: 40.135 - type: ndcg_at_10 value: 50.731 - type: ndcg_at_100 value: 56.452000000000005 - type: ndcg_at_1000 value: 58.123000000000005 - type: ndcg_at_3 value: 45.507 - type: ndcg_at_5 value: 48.11 - type: precision_at_1 value: 40.135 - type: precision_at_10 value: 9.192 - type: precision_at_100 value: 1.397 - type: precision_at_1000 value: 0.169 - type: precision_at_3 value: 21.816 - type: precision_at_5 value: 15.476 - type: recall_at_1 value: 32.39 - type: recall_at_10 value: 63.597 - type: recall_at_100 value: 86.737 - type: recall_at_1000 value: 97.039 - type: recall_at_3 value: 48.906 - type: recall_at_5 value: 55.659000000000006 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: BeIR/cqadupstack config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 28.397 - type: map_at_10 value: 39.871 - type: map_at_100 value: 41.309000000000005 - type: map_at_1000 value: 41.409 - type: map_at_3 value: 36.047000000000004 - type: map_at_5 value: 38.104 - type: mrr_at_1 value: 34.703 - type: mrr_at_10 value: 44.773 - type: mrr_at_100 value: 45.64 - type: mrr_at_1000 value: 45.678999999999995 - type: mrr_at_3 value: 41.705 - type: mrr_at_5 value: 43.406 - type: ndcg_at_1 value: 34.703 - type: ndcg_at_10 value: 46.271 - type: ndcg_at_100 value: 52.037 - type: ndcg_at_1000 value: 53.81700000000001 - type: ndcg_at_3 value: 39.966 - type: ndcg_at_5 value: 42.801 - type: precision_at_1 value: 34.703 - type: precision_at_10 value: 8.744 - type: precision_at_100 value: 1.348 - type: precision_at_1000 value: 0.167 - type: precision_at_3 value: 19.102 - type: precision_at_5 value: 13.836 - type: recall_at_1 value: 28.397 - type: recall_at_10 value: 60.299 - type: recall_at_100 value: 84.595 - type: recall_at_1000 value: 96.155 - type: recall_at_3 value: 43.065 - type: recall_at_5 value: 50.371 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: BeIR/cqadupstack config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 28.044333333333338 - type: map_at_10 value: 38.78691666666666 - type: map_at_100 value: 40.113 - type: map_at_1000 value: 40.22125 - type: map_at_3 value: 35.52966666666667 - type: map_at_5 value: 37.372749999999996 - type: mrr_at_1 value: 33.159083333333335 - type: mrr_at_10 value: 42.913583333333335 - type: mrr_at_100 value: 43.7845 - type: mrr_at_1000 value: 43.830333333333336 - type: mrr_at_3 value: 40.29816666666667 - type: mrr_at_5 value: 41.81366666666667 - type: ndcg_at_1 value: 33.159083333333335 - type: ndcg_at_10 value: 44.75750000000001 - type: ndcg_at_100 value: 50.13658333333334 - type: ndcg_at_1000 value: 52.037 - type: ndcg_at_3 value: 39.34258333333334 - type: ndcg_at_5 value: 41.93708333333333 - type: precision_at_1 value: 33.159083333333335 - type: precision_at_10 value: 7.952416666666667 - type: precision_at_100 value: 
1.2571666666666668 - type: precision_at_1000 value: 0.16099999999999998 - type: precision_at_3 value: 18.303833333333337 - type: precision_at_5 value: 13.057083333333333 - type: recall_at_1 value: 28.044333333333338 - type: recall_at_10 value: 58.237249999999996 - type: recall_at_100 value: 81.35391666666666 - type: recall_at_1000 value: 94.21283333333334 - type: recall_at_3 value: 43.32341666666667 - type: recall_at_5 value: 49.94908333333333 - type: map_at_1 value: 18.398 - type: map_at_10 value: 27.929 - type: map_at_100 value: 29.032999999999998 - type: map_at_1000 value: 29.126 - type: map_at_3 value: 25.070999999999998 - type: map_at_5 value: 26.583000000000002 - type: mrr_at_1 value: 19.963 - type: mrr_at_10 value: 29.997 - type: mrr_at_100 value: 30.9 - type: mrr_at_1000 value: 30.972 - type: mrr_at_3 value: 27.264 - type: mrr_at_5 value: 28.826 - type: ndcg_at_1 value: 19.963 - type: ndcg_at_10 value: 33.678999999999995 - type: ndcg_at_100 value: 38.931 - type: ndcg_at_1000 value: 41.379 - type: ndcg_at_3 value: 28.000000000000004 - type: ndcg_at_5 value: 30.637999999999998 - type: precision_at_1 value: 19.963 - type: precision_at_10 value: 5.7299999999999995 - type: precision_at_100 value: 0.902 - type: precision_at_1000 value: 0.122 - type: precision_at_3 value: 12.631 - type: precision_at_5 value: 9.057 - type: recall_at_1 value: 18.398 - type: recall_at_10 value: 49.254 - type: recall_at_100 value: 73.182 - type: recall_at_1000 value: 91.637 - type: recall_at_3 value: 34.06 - type: recall_at_5 value: 40.416000000000004 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: BeIR/cqadupstack config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 27.838 - type: map_at_10 value: 36.04 - type: map_at_100 value: 37.113 - type: map_at_1000 value: 37.204 - type: map_at_3 value: 33.585 - type: map_at_5 value: 34.845 - type: mrr_at_1 value: 30.982 - type: mrr_at_10 value: 39.105000000000004 - type: mrr_at_100 value: 39.98 - type: mrr_at_1000 value: 40.042 - type: mrr_at_3 value: 36.912 - type: mrr_at_5 value: 38.062000000000005 - type: ndcg_at_1 value: 30.982 - type: ndcg_at_10 value: 40.982 - type: ndcg_at_100 value: 46.092 - type: ndcg_at_1000 value: 48.25 - type: ndcg_at_3 value: 36.41 - type: ndcg_at_5 value: 38.379999999999995 - type: precision_at_1 value: 30.982 - type: precision_at_10 value: 6.534 - type: precision_at_100 value: 0.9820000000000001 - type: precision_at_1000 value: 0.124 - type: precision_at_3 value: 15.745999999999999 - type: precision_at_5 value: 10.828 - type: recall_at_1 value: 27.838 - type: recall_at_10 value: 52.971000000000004 - type: recall_at_100 value: 76.357 - type: recall_at_1000 value: 91.973 - type: recall_at_3 value: 40.157 - type: recall_at_5 value: 45.147999999999996 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: BeIR/cqadupstack config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 19.059 - type: map_at_10 value: 27.454 - type: map_at_100 value: 28.736 - type: map_at_1000 value: 28.865000000000002 - type: map_at_3 value: 24.773999999999997 - type: map_at_5 value: 26.266000000000002 - type: mrr_at_1 value: 23.125 - type: mrr_at_10 value: 31.267 - type: mrr_at_100 value: 32.32 - type: mrr_at_1000 value: 32.394 - type: mrr_at_3 value: 28.894 - type: mrr_at_5 value: 30.281000000000002 - type: ndcg_at_1 value: 23.125 - type: ndcg_at_10 value: 32.588 - type: ndcg_at_100 value: 38.432 - type: 
ndcg_at_1000 value: 41.214 - type: ndcg_at_3 value: 27.938000000000002 - type: ndcg_at_5 value: 30.127 - type: precision_at_1 value: 23.125 - type: precision_at_10 value: 5.9639999999999995 - type: precision_at_100 value: 1.047 - type: precision_at_1000 value: 0.148 - type: precision_at_3 value: 13.294 - type: precision_at_5 value: 9.628 - type: recall_at_1 value: 19.059 - type: recall_at_10 value: 44.25 - type: recall_at_100 value: 69.948 - type: recall_at_1000 value: 89.35300000000001 - type: recall_at_3 value: 31.114000000000004 - type: recall_at_5 value: 36.846000000000004 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: BeIR/cqadupstack config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 28.355999999999998 - type: map_at_10 value: 39.055 - type: map_at_100 value: 40.486 - type: map_at_1000 value: 40.571 - type: map_at_3 value: 35.69 - type: map_at_5 value: 37.605 - type: mrr_at_1 value: 33.302 - type: mrr_at_10 value: 42.986000000000004 - type: mrr_at_100 value: 43.957 - type: mrr_at_1000 value: 43.996 - type: mrr_at_3 value: 40.111999999999995 - type: mrr_at_5 value: 41.735 - type: ndcg_at_1 value: 33.302 - type: ndcg_at_10 value: 44.962999999999994 - type: ndcg_at_100 value: 50.917 - type: ndcg_at_1000 value: 52.622 - type: ndcg_at_3 value: 39.182 - type: ndcg_at_5 value: 41.939 - type: precision_at_1 value: 33.302 - type: precision_at_10 value: 7.779999999999999 - type: precision_at_100 value: 1.203 - type: precision_at_1000 value: 0.145 - type: precision_at_3 value: 18.035 - type: precision_at_5 value: 12.873000000000001 - type: recall_at_1 value: 28.355999999999998 - type: recall_at_10 value: 58.782000000000004 - type: recall_at_100 value: 84.02199999999999 - type: recall_at_1000 value: 95.511 - type: recall_at_3 value: 43.126999999999995 - type: recall_at_5 value: 50.14999999999999 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: BeIR/cqadupstack config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 27.391 - type: map_at_10 value: 37.523 - type: map_at_100 value: 39.312000000000005 - type: map_at_1000 value: 39.54 - type: map_at_3 value: 34.231 - type: map_at_5 value: 36.062 - type: mrr_at_1 value: 32.016 - type: mrr_at_10 value: 41.747 - type: mrr_at_100 value: 42.812 - type: mrr_at_1000 value: 42.844 - type: mrr_at_3 value: 39.129999999999995 - type: mrr_at_5 value: 40.524 - type: ndcg_at_1 value: 32.016 - type: ndcg_at_10 value: 43.826 - type: ndcg_at_100 value: 50.373999999999995 - type: ndcg_at_1000 value: 52.318 - type: ndcg_at_3 value: 38.479 - type: ndcg_at_5 value: 40.944 - type: precision_at_1 value: 32.016 - type: precision_at_10 value: 8.280999999999999 - type: precision_at_100 value: 1.6760000000000002 - type: precision_at_1000 value: 0.25 - type: precision_at_3 value: 18.05 - type: precision_at_5 value: 13.083 - type: recall_at_1 value: 27.391 - type: recall_at_10 value: 56.928999999999995 - type: recall_at_100 value: 85.169 - type: recall_at_1000 value: 96.665 - type: recall_at_3 value: 42.264 - type: recall_at_5 value: 48.556 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 19.681 - type: map_at_10 value: 32.741 - type: map_at_100 value: 34.811 - type: map_at_1000 value: 35.003 - type: map_at_3 value: 27.697 - type: map_at_5 value: 30.372 - type: 
mrr_at_1 value: 44.951 - type: mrr_at_10 value: 56.34400000000001 - type: mrr_at_100 value: 56.961 - type: mrr_at_1000 value: 56.987 - type: mrr_at_3 value: 53.681 - type: mrr_at_5 value: 55.407 - type: ndcg_at_1 value: 44.951 - type: ndcg_at_10 value: 42.905 - type: ndcg_at_100 value: 49.95 - type: ndcg_at_1000 value: 52.917 - type: ndcg_at_3 value: 36.815 - type: ndcg_at_5 value: 38.817 - type: precision_at_1 value: 44.951 - type: precision_at_10 value: 12.989999999999998 - type: precision_at_100 value: 2.068 - type: precision_at_1000 value: 0.263 - type: precision_at_3 value: 27.275 - type: precision_at_5 value: 20.365 - type: recall_at_1 value: 19.681 - type: recall_at_10 value: 48.272999999999996 - type: recall_at_100 value: 71.87400000000001 - type: recall_at_1000 value: 87.929 - type: recall_at_3 value: 32.653999999999996 - type: recall_at_5 value: 39.364 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 10.231 - type: map_at_10 value: 22.338 - type: map_at_100 value: 31.927 - type: map_at_1000 value: 33.87 - type: map_at_3 value: 15.559999999999999 - type: map_at_5 value: 18.239 - type: mrr_at_1 value: 75.0 - type: mrr_at_10 value: 81.303 - type: mrr_at_100 value: 81.523 - type: mrr_at_1000 value: 81.53 - type: mrr_at_3 value: 80.083 - type: mrr_at_5 value: 80.758 - type: ndcg_at_1 value: 64.625 - type: ndcg_at_10 value: 48.687000000000005 - type: ndcg_at_100 value: 52.791 - type: ndcg_at_1000 value: 60.041999999999994 - type: ndcg_at_3 value: 53.757999999999996 - type: ndcg_at_5 value: 50.76500000000001 - type: precision_at_1 value: 75.0 - type: precision_at_10 value: 38.3 - type: precision_at_100 value: 12.025 - type: precision_at_1000 value: 2.3970000000000002 - type: precision_at_3 value: 55.417 - type: precision_at_5 value: 47.5 - type: recall_at_1 value: 10.231 - type: recall_at_10 value: 27.697 - type: recall_at_100 value: 57.409 - type: recall_at_1000 value: 80.547 - type: recall_at_3 value: 16.668 - type: recall_at_5 value: 20.552 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 61.365 - type: f1 value: 56.7540827912991 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 83.479 - type: map_at_10 value: 88.898 - type: map_at_100 value: 89.11 - type: map_at_1000 value: 89.12400000000001 - type: map_at_3 value: 88.103 - type: map_at_5 value: 88.629 - type: mrr_at_1 value: 89.934 - type: mrr_at_10 value: 93.91000000000001 - type: mrr_at_100 value: 93.937 - type: mrr_at_1000 value: 93.938 - type: mrr_at_3 value: 93.62700000000001 - type: mrr_at_5 value: 93.84599999999999 - type: ndcg_at_1 value: 89.934 - type: ndcg_at_10 value: 91.574 - type: ndcg_at_100 value: 92.238 - type: ndcg_at_1000 value: 92.45 - type: ndcg_at_3 value: 90.586 - type: ndcg_at_5 value: 91.16300000000001 - type: precision_at_1 value: 89.934 - type: precision_at_10 value: 10.555 - type: precision_at_100 value: 1.1159999999999999 - type: precision_at_1000 value: 0.11499999999999999 - type: precision_at_3 value: 33.588 - type: precision_at_5 value: 20.642 - type: recall_at_1 value: 83.479 - type: recall_at_10 value: 94.971 - type: recall_at_100 value: 97.397 - type: recall_at_1000 value: 98.666 - type: 
recall_at_3 value: 92.24799999999999 - type: recall_at_5 value: 93.797 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 27.16 - type: map_at_10 value: 45.593 - type: map_at_100 value: 47.762 - type: map_at_1000 value: 47.899 - type: map_at_3 value: 39.237 - type: map_at_5 value: 42.970000000000006 - type: mrr_at_1 value: 52.623 - type: mrr_at_10 value: 62.637 - type: mrr_at_100 value: 63.169 - type: mrr_at_1000 value: 63.185 - type: mrr_at_3 value: 59.928000000000004 - type: mrr_at_5 value: 61.702999999999996 - type: ndcg_at_1 value: 52.623 - type: ndcg_at_10 value: 54.701 - type: ndcg_at_100 value: 61.263 - type: ndcg_at_1000 value: 63.134 - type: ndcg_at_3 value: 49.265 - type: ndcg_at_5 value: 51.665000000000006 - type: precision_at_1 value: 52.623 - type: precision_at_10 value: 15.185 - type: precision_at_100 value: 2.202 - type: precision_at_1000 value: 0.254 - type: precision_at_3 value: 32.767 - type: precision_at_5 value: 24.722 - type: recall_at_1 value: 27.16 - type: recall_at_10 value: 63.309000000000005 - type: recall_at_100 value: 86.722 - type: recall_at_1000 value: 97.505 - type: recall_at_3 value: 45.045 - type: recall_at_5 value: 54.02400000000001 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 42.573 - type: map_at_10 value: 59.373 - type: map_at_100 value: 60.292 - type: map_at_1000 value: 60.358999999999995 - type: map_at_3 value: 56.159000000000006 - type: map_at_5 value: 58.123999999999995 - type: mrr_at_1 value: 85.14500000000001 - type: mrr_at_10 value: 89.25999999999999 - type: mrr_at_100 value: 89.373 - type: mrr_at_1000 value: 89.377 - type: mrr_at_3 value: 88.618 - type: mrr_at_5 value: 89.036 - type: ndcg_at_1 value: 85.14500000000001 - type: ndcg_at_10 value: 68.95 - type: ndcg_at_100 value: 71.95 - type: ndcg_at_1000 value: 73.232 - type: ndcg_at_3 value: 64.546 - type: ndcg_at_5 value: 66.945 - type: precision_at_1 value: 85.14500000000001 - type: precision_at_10 value: 13.865 - type: precision_at_100 value: 1.619 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 39.703 - type: precision_at_5 value: 25.718000000000004 - type: recall_at_1 value: 42.573 - type: recall_at_10 value: 69.325 - type: recall_at_100 value: 80.932 - type: recall_at_1000 value: 89.446 - type: recall_at_3 value: 59.553999999999995 - type: recall_at_5 value: 64.294 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 95.8336 - type: ap value: 93.78862962194073 - type: f1 value: 95.83192650728371 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 23.075000000000003 - type: map_at_10 value: 36.102000000000004 - type: map_at_100 value: 37.257 - type: map_at_1000 value: 37.3 - type: map_at_3 value: 32.144 - type: map_at_5 value: 34.359 - type: mrr_at_1 value: 23.711 - type: mrr_at_10 value: 36.671 - type: mrr_at_100 value: 37.763999999999996 - type: mrr_at_1000 value: 37.801 - type: mrr_at_3 value: 32.775 - type: mrr_at_5 value: 34.977000000000004 - type: ndcg_at_1 value: 23.711 - type: ndcg_at_10 value: 43.361 - type: ndcg_at_100 value: 
48.839 - type: ndcg_at_1000 value: 49.88 - type: ndcg_at_3 value: 35.269 - type: ndcg_at_5 value: 39.224 - type: precision_at_1 value: 23.711 - type: precision_at_10 value: 6.866999999999999 - type: precision_at_100 value: 0.96 - type: precision_at_1000 value: 0.105 - type: precision_at_3 value: 15.096000000000002 - type: precision_at_5 value: 11.083 - type: recall_at_1 value: 23.075000000000003 - type: recall_at_10 value: 65.756 - type: recall_at_100 value: 90.88199999999999 - type: recall_at_1000 value: 98.739 - type: recall_at_3 value: 43.691 - type: recall_at_5 value: 53.15800000000001 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 97.69493844049248 - type: f1 value: 97.55048089616261 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 88.75968992248062 - type: f1 value: 72.26321223399123 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 82.40080699394754 - type: f1 value: 79.62590029057968 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 84.49562878278414 - type: f1 value: 84.0040193313333 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 39.386760057101945 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 37.89687154075537 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 33.94151656057482 - type: mrr value: 35.32684700746953 - task: type: Retrieval dataset: name: MTEB NFCorpus type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 6.239999999999999 - type: map_at_10 value: 14.862 - type: map_at_100 value: 18.955 - type: map_at_1000 value: 20.694000000000003 - type: map_at_3 value: 10.683 - type: map_at_5 value: 12.674 - type: mrr_at_1 value: 50.15500000000001 - type: mrr_at_10 value: 59.697 - type: mrr_at_100 value: 60.095 - type: mrr_at_1000 value: 60.129999999999995 - type: mrr_at_3 value: 58.35900000000001 - type: mrr_at_5 value: 58.839 - type: ndcg_at_1 value: 48.452 - type: ndcg_at_10 value: 39.341 - type: ndcg_at_100 value: 35.866 - type: ndcg_at_1000 value: 45.111000000000004 - type: ndcg_at_3 value: 44.527 - type: ndcg_at_5 value: 42.946 - type: precision_at_1 value: 50.15500000000001 - type: precision_at_10 value: 29.536 - type: precision_at_100 value: 9.142 - type: precision_at_1000 value: 2.2849999999999997 - type: precision_at_3 value: 41.899 - type: precision_at_5 value: 37.647000000000006 - type: recall_at_1 value: 6.239999999999999 - type: recall_at_10 value: 19.278000000000002 
- type: recall_at_100 value: 36.074 - type: recall_at_1000 value: 70.017 - type: recall_at_3 value: 12.066 - type: recall_at_5 value: 15.254000000000001 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 39.75 - type: map_at_10 value: 56.443 - type: map_at_100 value: 57.233999999999995 - type: map_at_1000 value: 57.249 - type: map_at_3 value: 52.032999999999994 - type: map_at_5 value: 54.937999999999995 - type: mrr_at_1 value: 44.728 - type: mrr_at_10 value: 58.939 - type: mrr_at_100 value: 59.489000000000004 - type: mrr_at_1000 value: 59.499 - type: mrr_at_3 value: 55.711999999999996 - type: mrr_at_5 value: 57.89 - type: ndcg_at_1 value: 44.728 - type: ndcg_at_10 value: 63.998999999999995 - type: ndcg_at_100 value: 67.077 - type: ndcg_at_1000 value: 67.40899999999999 - type: ndcg_at_3 value: 56.266000000000005 - type: ndcg_at_5 value: 60.88 - type: precision_at_1 value: 44.728 - type: precision_at_10 value: 10.09 - type: precision_at_100 value: 1.1809999999999998 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 25.145 - type: precision_at_5 value: 17.822 - type: recall_at_1 value: 39.75 - type: recall_at_10 value: 84.234 - type: recall_at_100 value: 97.055 - type: recall_at_1000 value: 99.517 - type: recall_at_3 value: 64.851 - type: recall_at_5 value: 75.343 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: None metrics: - type: map_at_1 value: 72.085 - type: map_at_10 value: 86.107 - type: map_at_100 value: 86.727 - type: map_at_1000 value: 86.74 - type: map_at_3 value: 83.21 - type: map_at_5 value: 85.06 - type: mrr_at_1 value: 82.94 - type: mrr_at_10 value: 88.845 - type: mrr_at_100 value: 88.926 - type: mrr_at_1000 value: 88.927 - type: mrr_at_3 value: 87.993 - type: mrr_at_5 value: 88.62299999999999 - type: ndcg_at_1 value: 82.97 - type: ndcg_at_10 value: 89.645 - type: ndcg_at_100 value: 90.717 - type: ndcg_at_1000 value: 90.78 - type: ndcg_at_3 value: 86.99900000000001 - type: ndcg_at_5 value: 88.52600000000001 - type: precision_at_1 value: 82.97 - type: precision_at_10 value: 13.569 - type: precision_at_100 value: 1.539 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 38.043 - type: precision_at_5 value: 24.992 - type: recall_at_1 value: 72.085 - type: recall_at_10 value: 96.262 - type: recall_at_100 value: 99.77000000000001 - type: recall_at_1000 value: 99.997 - type: recall_at_3 value: 88.652 - type: recall_at_5 value: 93.01899999999999 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 55.82153952668092 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 62.094465801879295 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 5.688 - type: map_at_10 value: 15.201999999999998 - type: map_at_100 value: 18.096 - type: map_at_1000 value: 18.481 - type: map_at_3 value: 10.734 - type: map_at_5 value: 12.94 - type: mrr_at_1 value: 28.000000000000004 - type: mrr_at_10 value: 41.101 - type: mrr_at_100 value: 42.202 - type: mrr_at_1000 value: 42.228 - type: mrr_at_3 
value: 37.683 - type: mrr_at_5 value: 39.708 - type: ndcg_at_1 value: 28.000000000000004 - type: ndcg_at_10 value: 24.976000000000003 - type: ndcg_at_100 value: 35.129 - type: ndcg_at_1000 value: 40.77 - type: ndcg_at_3 value: 23.787 - type: ndcg_at_5 value: 20.816000000000003 - type: precision_at_1 value: 28.000000000000004 - type: precision_at_10 value: 13.04 - type: precision_at_100 value: 2.761 - type: precision_at_1000 value: 0.41000000000000003 - type: precision_at_3 value: 22.6 - type: precision_at_5 value: 18.52 - type: recall_at_1 value: 5.688 - type: recall_at_10 value: 26.43 - type: recall_at_100 value: 56.02 - type: recall_at_1000 value: 83.21 - type: recall_at_3 value: 13.752 - type: recall_at_5 value: 18.777 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 85.15084859283178 - type: cos_sim_spearman value: 80.49030614009419 - type: euclidean_pearson value: 81.84574978672468 - type: euclidean_spearman value: 79.89787150656818 - type: manhattan_pearson value: 81.63076538567131 - type: manhattan_spearman value: 79.69867352121841 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 84.64097921490992 - type: cos_sim_spearman value: 77.25370084896514 - type: euclidean_pearson value: 82.71210826468788 - type: euclidean_spearman value: 78.50445584994826 - type: manhattan_pearson value: 82.92580164330298 - type: manhattan_spearman value: 78.69686891301019 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 87.24596417308994 - type: cos_sim_spearman value: 87.79454220555091 - type: euclidean_pearson value: 87.40242561671164 - type: euclidean_spearman value: 88.25955597373556 - type: manhattan_pearson value: 87.25160240485849 - type: manhattan_spearman value: 88.155794979818 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 84.44914233422564 - type: cos_sim_spearman value: 82.91015471820322 - type: euclidean_pearson value: 84.7206656630327 - type: euclidean_spearman value: 83.86408872059216 - type: manhattan_pearson value: 84.72816725158454 - type: manhattan_spearman value: 84.01603388572788 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 87.6168026237477 - type: cos_sim_spearman value: 88.45414278092397 - type: euclidean_pearson value: 88.57023240882022 - type: euclidean_spearman value: 89.04102190922094 - type: manhattan_pearson value: 88.66695535796354 - type: manhattan_spearman value: 89.19898476680969 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 84.27925826089424 - type: cos_sim_spearman value: 85.45291099550461 - type: euclidean_pearson value: 83.63853036580834 - type: euclidean_spearman value: 84.33468035821484 - type: manhattan_pearson value: 83.72778773251596 - type: manhattan_spearman value: 84.51583132445376 - task: type: STS dataset: name: MTEB STS17 (en-en) type: 
mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 89.67375185692552 - type: cos_sim_spearman value: 90.32542469203855 - type: euclidean_pearson value: 89.63513717951847 - type: euclidean_spearman value: 89.87760271003745 - type: manhattan_pearson value: 89.28381452982924 - type: manhattan_spearman value: 89.53568197785721 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 66.24644693819846 - type: cos_sim_spearman value: 66.09889420525377 - type: euclidean_pearson value: 63.72551583520747 - type: euclidean_spearman value: 63.01385470780679 - type: manhattan_pearson value: 64.09258157214097 - type: manhattan_spearman value: 63.080517752822594 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 86.27321463839989 - type: cos_sim_spearman value: 86.37572865993327 - type: euclidean_pearson value: 86.36268020198149 - type: euclidean_spearman value: 86.31089339478922 - type: manhattan_pearson value: 86.4260445761947 - type: manhattan_spearman value: 86.45885895320457 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 86.52456702387798 - type: mrr value: 96.34556529164372 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 61.99400000000001 - type: map_at_10 value: 73.38799999999999 - type: map_at_100 value: 73.747 - type: map_at_1000 value: 73.75 - type: map_at_3 value: 70.04599999999999 - type: map_at_5 value: 72.095 - type: mrr_at_1 value: 65.0 - type: mrr_at_10 value: 74.42800000000001 - type: mrr_at_100 value: 74.722 - type: mrr_at_1000 value: 74.725 - type: mrr_at_3 value: 72.056 - type: mrr_at_5 value: 73.60600000000001 - type: ndcg_at_1 value: 65.0 - type: ndcg_at_10 value: 78.435 - type: ndcg_at_100 value: 79.922 - type: ndcg_at_1000 value: 80.00500000000001 - type: ndcg_at_3 value: 73.05199999999999 - type: ndcg_at_5 value: 75.98 - type: precision_at_1 value: 65.0 - type: precision_at_10 value: 10.5 - type: precision_at_100 value: 1.123 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 28.555999999999997 - type: precision_at_5 value: 19.0 - type: recall_at_1 value: 61.99400000000001 - type: recall_at_10 value: 92.72200000000001 - type: recall_at_100 value: 99.333 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 78.739 - type: recall_at_5 value: 85.828 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.79009900990098 - type: cos_sim_ap value: 95.3203137438653 - type: cos_sim_f1 value: 89.12386706948641 - type: cos_sim_precision value: 89.75659229208925 - type: cos_sim_recall value: 88.5 - type: dot_accuracy value: 99.67821782178218 - type: dot_ap value: 89.94069840000675 - type: dot_f1 value: 83.45902463549521 - type: dot_precision value: 83.9231547017189 - type: dot_recall value: 83.0 - type: 
euclidean_accuracy value: 99.78613861386138 - type: euclidean_ap value: 95.10648259135526 - type: euclidean_f1 value: 88.77338877338877 - type: euclidean_precision value: 92.42424242424242 - type: euclidean_recall value: 85.39999999999999 - type: manhattan_accuracy value: 99.7950495049505 - type: manhattan_ap value: 95.29987661320946 - type: manhattan_f1 value: 89.21313183949972 - type: manhattan_precision value: 93.14472252448314 - type: manhattan_recall value: 85.6 - type: max_accuracy value: 99.7950495049505 - type: max_ap value: 95.3203137438653 - type: max_f1 value: 89.21313183949972 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 67.65446577183913 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 46.30749237193961 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 54.91481849959949 - type: mrr value: 55.853506175197346 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.08196549170419 - type: cos_sim_spearman value: 31.16661390597077 - type: dot_pearson value: 29.892258410943466 - type: dot_spearman value: 30.51328811965085 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.23900000000000002 - type: map_at_10 value: 2.173 - type: map_at_100 value: 14.24 - type: map_at_1000 value: 35.309000000000005 - type: map_at_3 value: 0.7100000000000001 - type: map_at_5 value: 1.163 - type: mrr_at_1 value: 92.0 - type: mrr_at_10 value: 96.0 - type: mrr_at_100 value: 96.0 - type: mrr_at_1000 value: 96.0 - type: mrr_at_3 value: 96.0 - type: mrr_at_5 value: 96.0 - type: ndcg_at_1 value: 90.0 - type: ndcg_at_10 value: 85.382 - type: ndcg_at_100 value: 68.03 - type: ndcg_at_1000 value: 61.021 - type: ndcg_at_3 value: 89.765 - type: ndcg_at_5 value: 88.444 - type: precision_at_1 value: 92.0 - type: precision_at_10 value: 88.0 - type: precision_at_100 value: 70.02000000000001 - type: precision_at_1000 value: 26.984 - type: precision_at_3 value: 94.0 - type: precision_at_5 value: 92.80000000000001 - type: recall_at_1 value: 0.23900000000000002 - type: recall_at_10 value: 2.313 - type: recall_at_100 value: 17.049 - type: recall_at_1000 value: 57.489999999999995 - type: recall_at_3 value: 0.737 - type: recall_at_5 value: 1.221 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 2.75 - type: map_at_10 value: 11.29 - type: map_at_100 value: 18.032999999999998 - type: map_at_1000 value: 19.746 - type: map_at_3 value: 6.555 - type: map_at_5 value: 8.706999999999999 - type: mrr_at_1 value: 34.694 - type: mrr_at_10 value: 50.55 - type: mrr_at_100 value: 51.659 - type: mrr_at_1000 value: 51.659 - type: mrr_at_3 value: 47.278999999999996 - type: mrr_at_5 value: 49.728 - type: ndcg_at_1 value: 32.653 - type: ndcg_at_10 value: 
27.894000000000002 - type: ndcg_at_100 value: 39.769 - type: ndcg_at_1000 value: 51.495999999999995 - type: ndcg_at_3 value: 32.954 - type: ndcg_at_5 value: 31.502999999999997 - type: precision_at_1 value: 34.694 - type: precision_at_10 value: 23.265 - type: precision_at_100 value: 7.898 - type: precision_at_1000 value: 1.58 - type: precision_at_3 value: 34.694 - type: precision_at_5 value: 31.429000000000002 - type: recall_at_1 value: 2.75 - type: recall_at_10 value: 16.953 - type: recall_at_100 value: 48.68 - type: recall_at_1000 value: 85.18599999999999 - type: recall_at_3 value: 7.710999999999999 - type: recall_at_5 value: 11.484 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 82.66099999999999 - type: ap value: 25.555698090238337 - type: f1 value: 66.48402012461622 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 72.94567062818335 - type: f1 value: 73.28139189595674 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 49.581627240203474 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 87.78089050485785 - type: cos_sim_ap value: 79.64487116574168 - type: cos_sim_f1 value: 72.46563021970964 - type: cos_sim_precision value: 70.62359128474831 - type: cos_sim_recall value: 74.40633245382587 - type: dot_accuracy value: 86.2609524944865 - type: dot_ap value: 75.513046857613 - type: dot_f1 value: 68.58213616489695 - type: dot_precision value: 65.12455516014235 - type: dot_recall value: 72.42744063324538 - type: euclidean_accuracy value: 87.6080348095607 - type: euclidean_ap value: 79.00204933649795 - type: euclidean_f1 value: 72.14495342605589 - type: euclidean_precision value: 69.85421299728193 - type: euclidean_recall value: 74.5910290237467 - type: manhattan_accuracy value: 87.59611372712642 - type: manhattan_ap value: 78.78523756706264 - type: manhattan_f1 value: 71.86499137718648 - type: manhattan_precision value: 67.39833641404806 - type: manhattan_recall value: 76.96569920844327 - type: max_accuracy value: 87.78089050485785 - type: max_ap value: 79.64487116574168 - type: max_f1 value: 72.46563021970964 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.98719292117825 - type: cos_sim_ap value: 87.58146137353202 - type: cos_sim_f1 value: 80.28543232369239 - type: cos_sim_precision value: 79.1735289714029 - type: cos_sim_recall value: 81.42901139513397 - type: dot_accuracy value: 88.9199363526992 - type: dot_ap value: 84.98499998630417 - type: dot_f1 value: 78.21951400757969 - type: dot_precision value: 75.58523624874336 - type: dot_recall value: 81.04404065291038 - type: euclidean_accuracy value: 89.77374160748244 - type: euclidean_ap value: 87.35151562835209 - type: euclidean_f1 
value: 79.92160922940393 - type: euclidean_precision value: 76.88531587933979 - type: euclidean_recall value: 83.20757622420696 - type: manhattan_accuracy value: 89.72717041176699 - type: manhattan_ap value: 87.34065592142515 - type: manhattan_f1 value: 79.85603419187943 - type: manhattan_precision value: 77.82243332115455 - type: manhattan_recall value: 81.99876809362489 - type: max_accuracy value: 89.98719292117825 - type: max_ap value: 87.58146137353202 - type: max_f1 value: 80.28543232369239 - task: type: STS dataset: name: MTEB AFQMC type: C-MTEB/AFQMC config: default split: validation revision: b44c3b011063adb25877c13823db83bb193913c4 metrics: - type: cos_sim_pearson value: 53.45954203592337 - type: cos_sim_spearman value: 58.42154680418638 - type: euclidean_pearson value: 56.41543791722753 - type: euclidean_spearman value: 58.39328016640146 - type: manhattan_pearson value: 56.318510356833876 - type: manhattan_spearman value: 58.28423447818184 - task: type: STS dataset: name: MTEB ATEC type: C-MTEB/ATEC config: default split: test revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865 metrics: - type: cos_sim_pearson value: 50.78356460675945 - type: cos_sim_spearman value: 55.6530411663269 - type: euclidean_pearson value: 56.50763660417816 - type: euclidean_spearman value: 55.733823335669065 - type: manhattan_pearson value: 56.45323093512866 - type: manhattan_spearman value: 55.63248619032702 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.209999999999994 - type: f1 value: 46.08892432018655 - task: type: STS dataset: name: MTEB BQ type: C-MTEB/BQ config: default split: test revision: e3dda5e115e487b39ec7e618c0c6a29137052a55 metrics: - type: cos_sim_pearson value: 70.25573992001478 - type: cos_sim_spearman value: 73.85247134951433 - type: euclidean_pearson value: 72.60033082168442 - type: euclidean_spearman value: 73.72445893756499 - type: manhattan_pearson value: 72.59932284620231 - type: manhattan_spearman value: 73.68002490614583 - task: type: Clustering dataset: name: MTEB CLSClusteringP2P type: C-MTEB/CLSClusteringP2P config: default split: test revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476 metrics: - type: v_measure value: 45.21317724305628 - task: type: Clustering dataset: name: MTEB CLSClusteringS2S type: C-MTEB/CLSClusteringS2S config: default split: test revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f metrics: - type: v_measure value: 42.49825170976724 - task: type: Reranking dataset: name: MTEB CMedQAv1 type: C-MTEB/CMedQAv1-reranking config: default split: test revision: 8d7f1e942507dac42dc58017c1a001c3717da7df metrics: - type: map value: 88.15661686810597 - type: mrr value: 90.11222222222223 - task: type: Reranking dataset: name: MTEB CMedQAv2 type: C-MTEB/CMedQAv2-reranking config: default split: test revision: 23d186750531a14a0357ca22cd92d712fd512ea0 metrics: - type: map value: 88.1204726064383 - type: mrr value: 90.20142857142858 - task: type: Retrieval dataset: name: MTEB CmedqaRetrieval type: C-MTEB/CmedqaRetrieval config: default split: dev revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301 metrics: - type: map_at_1 value: 27.224999999999998 - type: map_at_10 value: 40.169 - type: map_at_100 value: 42.0 - type: map_at_1000 value: 42.109 - type: map_at_3 value: 35.76 - type: map_at_5 value: 38.221 - type: mrr_at_1 value: 40.56 - type: mrr_at_10 value: 49.118 - type: mrr_at_100 value: 
50.092999999999996 - type: mrr_at_1000 value: 50.133 - type: mrr_at_3 value: 46.507 - type: mrr_at_5 value: 47.973 - type: ndcg_at_1 value: 40.56 - type: ndcg_at_10 value: 46.972 - type: ndcg_at_100 value: 54.04 - type: ndcg_at_1000 value: 55.862 - type: ndcg_at_3 value: 41.36 - type: ndcg_at_5 value: 43.704 - type: precision_at_1 value: 40.56 - type: precision_at_10 value: 10.302999999999999 - type: precision_at_100 value: 1.606 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 23.064 - type: precision_at_5 value: 16.764000000000003 - type: recall_at_1 value: 27.224999999999998 - type: recall_at_10 value: 58.05200000000001 - type: recall_at_100 value: 87.092 - type: recall_at_1000 value: 99.099 - type: recall_at_3 value: 41.373 - type: recall_at_5 value: 48.453 - task: type: PairClassification dataset: name: MTEB Cmnli type: C-MTEB/CMNLI config: default split: validation revision: 41bc36f332156f7adc9e38f53777c959b2ae9766 metrics: - type: cos_sim_accuracy value: 77.40228502705953 - type: cos_sim_ap value: 86.22359172956327 - type: cos_sim_f1 value: 78.96328293736501 - type: cos_sim_precision value: 73.36945615091311 - type: cos_sim_recall value: 85.48047696983868 - type: dot_accuracy value: 75.53818400481059 - type: dot_ap value: 83.70164011305312 - type: dot_f1 value: 77.67298719348754 - type: dot_precision value: 67.49482401656314 - type: dot_recall value: 91.46598082768296 - type: euclidean_accuracy value: 77.94347564642213 - type: euclidean_ap value: 86.4652108728609 - type: euclidean_f1 value: 79.15555555555555 - type: euclidean_precision value: 75.41816641964853 - type: euclidean_recall value: 83.28267477203647 - type: manhattan_accuracy value: 77.45039085989175 - type: manhattan_ap value: 86.09986583900665 - type: manhattan_f1 value: 78.93669264438988 - type: manhattan_precision value: 72.63261296660117 - type: manhattan_recall value: 86.43909282207154 - type: max_accuracy value: 77.94347564642213 - type: max_ap value: 86.4652108728609 - type: max_f1 value: 79.15555555555555 - task: type: Retrieval dataset: name: MTEB CovidRetrieval type: C-MTEB/CovidRetrieval config: default split: dev revision: 1271c7809071a13532e05f25fb53511ffce77117 metrics: - type: map_at_1 value: 69.336 - type: map_at_10 value: 77.16 - type: map_at_100 value: 77.47500000000001 - type: map_at_1000 value: 77.482 - type: map_at_3 value: 75.42999999999999 - type: map_at_5 value: 76.468 - type: mrr_at_1 value: 69.44200000000001 - type: mrr_at_10 value: 77.132 - type: mrr_at_100 value: 77.43299999999999 - type: mrr_at_1000 value: 77.44 - type: mrr_at_3 value: 75.395 - type: mrr_at_5 value: 76.459 - type: ndcg_at_1 value: 69.547 - type: ndcg_at_10 value: 80.794 - type: ndcg_at_100 value: 82.245 - type: ndcg_at_1000 value: 82.40899999999999 - type: ndcg_at_3 value: 77.303 - type: ndcg_at_5 value: 79.168 - type: precision_at_1 value: 69.547 - type: precision_at_10 value: 9.305 - type: precision_at_100 value: 0.9979999999999999 - type: precision_at_1000 value: 0.101 - type: precision_at_3 value: 27.749000000000002 - type: precision_at_5 value: 17.576 - type: recall_at_1 value: 69.336 - type: recall_at_10 value: 92.097 - type: recall_at_100 value: 98.736 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 82.64 - type: recall_at_5 value: 87.144 - task: type: Retrieval dataset: name: MTEB DuRetrieval type: C-MTEB/DuRetrieval config: default split: dev revision: a1a333e290fe30b10f3f56498e3a0d911a693ced metrics: - type: map_at_1 value: 26.817999999999998 - type: map_at_10 value: 82.67 - type: 
map_at_100 value: 85.304 - type: map_at_1000 value: 85.334 - type: map_at_3 value: 57.336 - type: map_at_5 value: 72.474 - type: mrr_at_1 value: 91.45 - type: mrr_at_10 value: 94.272 - type: mrr_at_100 value: 94.318 - type: mrr_at_1000 value: 94.32000000000001 - type: mrr_at_3 value: 94.0 - type: mrr_at_5 value: 94.17699999999999 - type: ndcg_at_1 value: 91.45 - type: ndcg_at_10 value: 89.404 - type: ndcg_at_100 value: 91.724 - type: ndcg_at_1000 value: 91.973 - type: ndcg_at_3 value: 88.104 - type: ndcg_at_5 value: 87.25699999999999 - type: precision_at_1 value: 91.45 - type: precision_at_10 value: 42.585 - type: precision_at_100 value: 4.838 - type: precision_at_1000 value: 0.49 - type: precision_at_3 value: 78.8 - type: precision_at_5 value: 66.66 - type: recall_at_1 value: 26.817999999999998 - type: recall_at_10 value: 90.67 - type: recall_at_100 value: 98.36200000000001 - type: recall_at_1000 value: 99.583 - type: recall_at_3 value: 59.614999999999995 - type: recall_at_5 value: 77.05199999999999 - task: type: Retrieval dataset: name: MTEB EcomRetrieval type: C-MTEB/EcomRetrieval config: default split: dev revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9 metrics: - type: map_at_1 value: 47.699999999999996 - type: map_at_10 value: 57.589999999999996 - type: map_at_100 value: 58.226 - type: map_at_1000 value: 58.251 - type: map_at_3 value: 55.233 - type: map_at_5 value: 56.633 - type: mrr_at_1 value: 47.699999999999996 - type: mrr_at_10 value: 57.589999999999996 - type: mrr_at_100 value: 58.226 - type: mrr_at_1000 value: 58.251 - type: mrr_at_3 value: 55.233 - type: mrr_at_5 value: 56.633 - type: ndcg_at_1 value: 47.699999999999996 - type: ndcg_at_10 value: 62.505 - type: ndcg_at_100 value: 65.517 - type: ndcg_at_1000 value: 66.19800000000001 - type: ndcg_at_3 value: 57.643 - type: ndcg_at_5 value: 60.181 - type: precision_at_1 value: 47.699999999999996 - type: precision_at_10 value: 7.8 - type: precision_at_100 value: 0.919 - type: precision_at_1000 value: 0.097 - type: precision_at_3 value: 21.532999999999998 - type: precision_at_5 value: 14.16 - type: recall_at_1 value: 47.699999999999996 - type: recall_at_10 value: 78.0 - type: recall_at_100 value: 91.9 - type: recall_at_1000 value: 97.3 - type: recall_at_3 value: 64.60000000000001 - type: recall_at_5 value: 70.8 - task: type: Classification dataset: name: MTEB IFlyTek type: C-MTEB/IFlyTek-classification config: default split: validation revision: 421605374b29664c5fc098418fe20ada9bd55f8a metrics: - type: accuracy value: 44.84801846864178 - type: f1 value: 37.47347897956339 - task: type: Classification dataset: name: MTEB JDReview type: C-MTEB/JDReview-classification config: default split: test revision: b7c64bd89eb87f8ded463478346f76731f07bf8b metrics: - type: accuracy value: 85.81613508442777 - type: ap value: 52.68244615477374 - type: f1 value: 80.0445640948843 - task: type: STS dataset: name: MTEB LCQMC type: C-MTEB/LCQMC config: default split: test revision: 17f9b096f80380fce5ed12a9be8be7784b337daf metrics: - type: cos_sim_pearson value: 69.57786502217138 - type: cos_sim_spearman value: 75.39106054489906 - type: euclidean_pearson value: 73.72082954602402 - type: euclidean_spearman value: 75.14421475913619 - type: manhattan_pearson value: 73.62463076633642 - type: manhattan_spearman value: 75.01301565104112 - task: type: Reranking dataset: name: MTEB MMarcoReranking type: C-MTEB/Mmarco-reranking config: default split: dev revision: None metrics: - type: map value: 29.143797057999134 - type: mrr value: 28.08174603174603 - task: 
type: Retrieval dataset: name: MTEB MMarcoRetrieval type: C-MTEB/MMarcoRetrieval config: default split: dev revision: 539bbde593d947e2a124ba72651aafc09eb33fc2 metrics: - type: map_at_1 value: 70.492 - type: map_at_10 value: 79.501 - type: map_at_100 value: 79.728 - type: map_at_1000 value: 79.735 - type: map_at_3 value: 77.77 - type: map_at_5 value: 78.851 - type: mrr_at_1 value: 72.822 - type: mrr_at_10 value: 80.001 - type: mrr_at_100 value: 80.19 - type: mrr_at_1000 value: 80.197 - type: mrr_at_3 value: 78.484 - type: mrr_at_5 value: 79.42099999999999 - type: ndcg_at_1 value: 72.822 - type: ndcg_at_10 value: 83.013 - type: ndcg_at_100 value: 84.013 - type: ndcg_at_1000 value: 84.20400000000001 - type: ndcg_at_3 value: 79.728 - type: ndcg_at_5 value: 81.542 - type: precision_at_1 value: 72.822 - type: precision_at_10 value: 9.917 - type: precision_at_100 value: 1.042 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 29.847 - type: precision_at_5 value: 18.871 - type: recall_at_1 value: 70.492 - type: recall_at_10 value: 93.325 - type: recall_at_100 value: 97.822 - type: recall_at_1000 value: 99.319 - type: recall_at_3 value: 84.636 - type: recall_at_5 value: 88.93100000000001 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 76.88298587760592 - type: f1 value: 73.89001762017176 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 80.76328177538669 - type: f1 value: 80.24718532423358 - task: type: Retrieval dataset: name: MTEB MedicalRetrieval type: C-MTEB/MedicalRetrieval config: default split: dev revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6 metrics: - type: map_at_1 value: 49.6 - type: map_at_10 value: 55.620999999999995 - type: map_at_100 value: 56.204 - type: map_at_1000 value: 56.251 - type: map_at_3 value: 54.132999999999996 - type: map_at_5 value: 54.933 - type: mrr_at_1 value: 49.7 - type: mrr_at_10 value: 55.67100000000001 - type: mrr_at_100 value: 56.254000000000005 - type: mrr_at_1000 value: 56.301 - type: mrr_at_3 value: 54.18300000000001 - type: mrr_at_5 value: 54.983000000000004 - type: ndcg_at_1 value: 49.6 - type: ndcg_at_10 value: 58.645 - type: ndcg_at_100 value: 61.789 - type: ndcg_at_1000 value: 63.219 - type: ndcg_at_3 value: 55.567 - type: ndcg_at_5 value: 57.008 - type: precision_at_1 value: 49.6 - type: precision_at_10 value: 6.819999999999999 - type: precision_at_100 value: 0.836 - type: precision_at_1000 value: 0.095 - type: precision_at_3 value: 19.900000000000002 - type: precision_at_5 value: 12.64 - type: recall_at_1 value: 49.6 - type: recall_at_10 value: 68.2 - type: recall_at_100 value: 83.6 - type: recall_at_1000 value: 95.3 - type: recall_at_3 value: 59.699999999999996 - type: recall_at_5 value: 63.2 - task: type: Classification dataset: name: MTEB MultilingualSentiment type: C-MTEB/MultilingualSentiment-classification config: default split: validation revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a metrics: - type: accuracy value: 74.45666666666666 - type: f1 value: 74.32582402190089 - task: type: PairClassification dataset: name: MTEB Ocnli type: C-MTEB/OCNLI config: default split: validation revision: 66e76a618a34d6d565d5538088562851e6daa7ec metrics: - type: 
cos_sim_accuracy value: 80.67135896047645 - type: cos_sim_ap value: 87.60421240712051 - type: cos_sim_f1 value: 82.1304131408661 - type: cos_sim_precision value: 77.68361581920904 - type: cos_sim_recall value: 87.11721224920802 - type: dot_accuracy value: 79.04710341093666 - type: dot_ap value: 85.6370059719336 - type: dot_f1 value: 80.763723150358 - type: dot_precision value: 73.69337979094077 - type: dot_recall value: 89.33474128827878 - type: euclidean_accuracy value: 81.05035192203573 - type: euclidean_ap value: 87.7880240053663 - type: euclidean_f1 value: 82.50244379276637 - type: euclidean_precision value: 76.7970882620564 - type: euclidean_recall value: 89.1235480464625 - type: manhattan_accuracy value: 80.61721710882512 - type: manhattan_ap value: 87.43568120591175 - type: manhattan_f1 value: 81.89526184538653 - type: manhattan_precision value: 77.5992438563327 - type: manhattan_recall value: 86.6948257655755 - type: max_accuracy value: 81.05035192203573 - type: max_ap value: 87.7880240053663 - type: max_f1 value: 82.50244379276637 - task: type: Classification dataset: name: MTEB OnlineShopping type: C-MTEB/OnlineShopping-classification config: default split: test revision: e610f2ebd179a8fda30ae534c3878750a96db120 metrics: - type: accuracy value: 93.5 - type: ap value: 91.31357903446782 - type: f1 value: 93.48088994006616 - task: type: STS dataset: name: MTEB PAWSX type: C-MTEB/PAWSX config: default split: test revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1 metrics: - type: cos_sim_pearson value: 36.93293453538077 - type: cos_sim_spearman value: 42.45972506308574 - type: euclidean_pearson value: 42.34945133152159 - type: euclidean_spearman value: 42.331610303674644 - type: manhattan_pearson value: 42.31455070249498 - type: manhattan_spearman value: 42.19887982891834 - task: type: STS dataset: name: MTEB QBQTC type: C-MTEB/QBQTC config: default split: test revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7 metrics: - type: cos_sim_pearson value: 33.683290790043785 - type: cos_sim_spearman value: 35.149171171202994 - type: euclidean_pearson value: 32.33806561267862 - type: euclidean_spearman value: 34.483576387347966 - type: manhattan_pearson value: 32.47629754599608 - type: manhattan_spearman value: 34.66434471867615 - task: type: STS dataset: name: MTEB STS22 (zh) type: mteb/sts22-crosslingual-sts config: zh split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 66.46322760516104 - type: cos_sim_spearman value: 67.398478319726 - type: euclidean_pearson value: 64.7223480293625 - type: euclidean_spearman value: 66.83118568812951 - type: manhattan_pearson value: 64.88440039828305 - type: manhattan_spearman value: 66.80429458952257 - task: type: STS dataset: name: MTEB STSB type: C-MTEB/STSB config: default split: test revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0 metrics: - type: cos_sim_pearson value: 79.08991383232105 - type: cos_sim_spearman value: 79.39715677296854 - type: euclidean_pearson value: 78.63201279320496 - type: euclidean_spearman value: 79.40262660785731 - type: manhattan_pearson value: 78.98138363146906 - type: manhattan_spearman value: 79.79968413014194 - task: type: Reranking dataset: name: MTEB T2Reranking type: C-MTEB/T2Reranking config: default split: dev revision: 76631901a18387f85eaa53e5450019b87ad58ef9 metrics: - type: map value: 67.43289278789972 - type: mrr value: 77.53012460908535 - task: type: Retrieval dataset: name: MTEB T2Retrieval type: C-MTEB/T2Retrieval config: default split: dev revision: 
8731a845f1bf500a4f111cf1070785c793d10e64 metrics: - type: map_at_1 value: 27.733999999999998 - type: map_at_10 value: 78.24799999999999 - type: map_at_100 value: 81.765 - type: map_at_1000 value: 81.824 - type: map_at_3 value: 54.92 - type: map_at_5 value: 67.61399999999999 - type: mrr_at_1 value: 90.527 - type: mrr_at_10 value: 92.843 - type: mrr_at_100 value: 92.927 - type: mrr_at_1000 value: 92.93 - type: mrr_at_3 value: 92.45100000000001 - type: mrr_at_5 value: 92.693 - type: ndcg_at_1 value: 90.527 - type: ndcg_at_10 value: 85.466 - type: ndcg_at_100 value: 88.846 - type: ndcg_at_1000 value: 89.415 - type: ndcg_at_3 value: 86.768 - type: ndcg_at_5 value: 85.46000000000001 - type: precision_at_1 value: 90.527 - type: precision_at_10 value: 42.488 - type: precision_at_100 value: 5.024 - type: precision_at_1000 value: 0.516 - type: precision_at_3 value: 75.907 - type: precision_at_5 value: 63.727000000000004 - type: recall_at_1 value: 27.733999999999998 - type: recall_at_10 value: 84.346 - type: recall_at_100 value: 95.536 - type: recall_at_1000 value: 98.42999999999999 - type: recall_at_3 value: 56.455 - type: recall_at_5 value: 70.755 - task: type: Classification dataset: name: MTEB TNews type: C-MTEB/TNews-classification config: default split: validation revision: 317f262bf1e6126357bbe89e875451e4b0938fe4 metrics: - type: accuracy value: 49.952000000000005 - type: f1 value: 48.264617195258054 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringP2P type: C-MTEB/ThuNewsClusteringP2P config: default split: test revision: 5798586b105c0434e4f0fe5e767abe619442cf93 metrics: - type: v_measure value: 68.23769904483508 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringS2S type: C-MTEB/ThuNewsClusteringS2S config: default split: test revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d metrics: - type: v_measure value: 62.50294403136556 - task: type: Retrieval dataset: name: MTEB VideoRetrieval type: C-MTEB/VideoRetrieval config: default split: dev revision: 58c2597a5943a2ba48f4668c3b90d796283c5639 metrics: - type: map_at_1 value: 54.0 - type: map_at_10 value: 63.668 - type: map_at_100 value: 64.217 - type: map_at_1000 value: 64.23100000000001 - type: map_at_3 value: 61.7 - type: map_at_5 value: 62.870000000000005 - type: mrr_at_1 value: 54.0 - type: mrr_at_10 value: 63.668 - type: mrr_at_100 value: 64.217 - type: mrr_at_1000 value: 64.23100000000001 - type: mrr_at_3 value: 61.7 - type: mrr_at_5 value: 62.870000000000005 - type: ndcg_at_1 value: 54.0 - type: ndcg_at_10 value: 68.11399999999999 - type: ndcg_at_100 value: 70.723 - type: ndcg_at_1000 value: 71.123 - type: ndcg_at_3 value: 64.074 - type: ndcg_at_5 value: 66.178 - type: precision_at_1 value: 54.0 - type: precision_at_10 value: 8.200000000000001 - type: precision_at_100 value: 0.941 - type: precision_at_1000 value: 0.097 - type: precision_at_3 value: 23.633000000000003 - type: precision_at_5 value: 15.2 - type: recall_at_1 value: 54.0 - type: recall_at_10 value: 82.0 - type: recall_at_100 value: 94.1 - type: recall_at_1000 value: 97.3 - type: recall_at_3 value: 70.89999999999999 - type: recall_at_5 value: 76.0 - task: type: Classification dataset: name: MTEB Waimai type: C-MTEB/waimai-classification config: default split: test revision: 339287def212450dcaa9df8c22bf93e9980c7023 metrics: - type: accuracy value: 86.63000000000001 - type: ap value: 69.99457882599567 - type: f1 value: 85.07735617998541 - task: type: Clustering dataset: name: MTEB 8TagsClustering type: PL-MTEB/8tags-clustering config: default split: test 
revision: None metrics: - type: v_measure value: 44.594104491193555 - task: type: Classification dataset: name: MTEB AllegroReviews type: PL-MTEB/allegro-reviews config: default split: test revision: None metrics: - type: accuracy value: 63.97614314115309 - type: f1 value: 52.15634261679283 - task: type: Retrieval dataset: name: MTEB ArguAna-PL type: clarin-knext/arguana-pl config: default split: test revision: 63fc86750af76253e8c760fc9e534bbf24d260a2 metrics: - type: map_at_1 value: 32.646 - type: map_at_10 value: 47.963 - type: map_at_100 value: 48.789 - type: map_at_1000 value: 48.797000000000004 - type: map_at_3 value: 43.196 - type: map_at_5 value: 46.016 - type: mrr_at_1 value: 33.073 - type: mrr_at_10 value: 48.126000000000005 - type: mrr_at_100 value: 48.946 - type: mrr_at_1000 value: 48.953 - type: mrr_at_3 value: 43.374 - type: mrr_at_5 value: 46.147 - type: ndcg_at_1 value: 32.646 - type: ndcg_at_10 value: 56.481 - type: ndcg_at_100 value: 59.922 - type: ndcg_at_1000 value: 60.07 - type: ndcg_at_3 value: 46.675 - type: ndcg_at_5 value: 51.76500000000001 - type: precision_at_1 value: 32.646 - type: precision_at_10 value: 8.371 - type: precision_at_100 value: 0.9860000000000001 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 18.919 - type: precision_at_5 value: 13.825999999999999 - type: recall_at_1 value: 32.646 - type: recall_at_10 value: 83.71300000000001 - type: recall_at_100 value: 98.578 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 56.757000000000005 - type: recall_at_5 value: 69.132 - task: type: Classification dataset: name: MTEB CBD type: PL-MTEB/cbd config: default split: test revision: None metrics: - type: accuracy value: 68.56 - type: ap value: 23.310493680488513 - type: f1 value: 58.85369533105693 - task: type: PairClassification dataset: name: MTEB CDSC-E type: PL-MTEB/cdsce-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 88.5 - type: cos_sim_ap value: 72.42140924378361 - type: cos_sim_f1 value: 66.0919540229885 - type: cos_sim_precision value: 72.78481012658227 - type: cos_sim_recall value: 60.526315789473685 - type: dot_accuracy value: 88.5 - type: dot_ap value: 72.42140924378361 - type: dot_f1 value: 66.0919540229885 - type: dot_precision value: 72.78481012658227 - type: dot_recall value: 60.526315789473685 - type: euclidean_accuracy value: 88.5 - type: euclidean_ap value: 72.42140924378361 - type: euclidean_f1 value: 66.0919540229885 - type: euclidean_precision value: 72.78481012658227 - type: euclidean_recall value: 60.526315789473685 - type: manhattan_accuracy value: 88.5 - type: manhattan_ap value: 72.49745515311696 - type: manhattan_f1 value: 66.0968660968661 - type: manhattan_precision value: 72.04968944099379 - type: manhattan_recall value: 61.05263157894737 - type: max_accuracy value: 88.5 - type: max_ap value: 72.49745515311696 - type: max_f1 value: 66.0968660968661 - task: type: STS dataset: name: MTEB CDSC-R type: PL-MTEB/cdscr-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 90.32269765590145 - type: cos_sim_spearman value: 89.73666311491672 - type: euclidean_pearson value: 88.2933868516544 - type: euclidean_spearman value: 89.73666311491672 - type: manhattan_pearson value: 88.33474590219448 - type: manhattan_spearman value: 89.8548364866583 - task: type: Retrieval dataset: name: MTEB DBPedia-PL type: clarin-knext/dbpedia-pl config: default split: test revision: 76afe41d9af165cc40999fcaa92312b8b012064a metrics: - type: 
map_at_1 value: 7.632999999999999 - type: map_at_10 value: 16.426 - type: map_at_100 value: 22.651 - type: map_at_1000 value: 24.372 - type: map_at_3 value: 11.706 - type: map_at_5 value: 13.529 - type: mrr_at_1 value: 60.75000000000001 - type: mrr_at_10 value: 68.613 - type: mrr_at_100 value: 69.001 - type: mrr_at_1000 value: 69.021 - type: mrr_at_3 value: 67.0 - type: mrr_at_5 value: 67.925 - type: ndcg_at_1 value: 49.875 - type: ndcg_at_10 value: 36.978 - type: ndcg_at_100 value: 40.031 - type: ndcg_at_1000 value: 47.566 - type: ndcg_at_3 value: 41.148 - type: ndcg_at_5 value: 38.702 - type: precision_at_1 value: 60.75000000000001 - type: precision_at_10 value: 29.7 - type: precision_at_100 value: 9.278 - type: precision_at_1000 value: 2.099 - type: precision_at_3 value: 44.0 - type: precision_at_5 value: 37.6 - type: recall_at_1 value: 7.632999999999999 - type: recall_at_10 value: 22.040000000000003 - type: recall_at_100 value: 44.024 - type: recall_at_1000 value: 67.848 - type: recall_at_3 value: 13.093 - type: recall_at_5 value: 15.973 - task: type: Retrieval dataset: name: MTEB FiQA-PL type: clarin-knext/fiqa-pl config: default split: test revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e metrics: - type: map_at_1 value: 15.473 - type: map_at_10 value: 24.579 - type: map_at_100 value: 26.387 - type: map_at_1000 value: 26.57 - type: map_at_3 value: 21.278 - type: map_at_5 value: 23.179 - type: mrr_at_1 value: 30.709999999999997 - type: mrr_at_10 value: 38.994 - type: mrr_at_100 value: 39.993 - type: mrr_at_1000 value: 40.044999999999995 - type: mrr_at_3 value: 36.342999999999996 - type: mrr_at_5 value: 37.846999999999994 - type: ndcg_at_1 value: 30.709999999999997 - type: ndcg_at_10 value: 31.608999999999998 - type: ndcg_at_100 value: 38.807 - type: ndcg_at_1000 value: 42.208 - type: ndcg_at_3 value: 28.086 - type: ndcg_at_5 value: 29.323 - type: precision_at_1 value: 30.709999999999997 - type: precision_at_10 value: 8.688 - type: precision_at_100 value: 1.608 - type: precision_at_1000 value: 0.22100000000000003 - type: precision_at_3 value: 18.724 - type: precision_at_5 value: 13.950999999999999 - type: recall_at_1 value: 15.473 - type: recall_at_10 value: 38.361000000000004 - type: recall_at_100 value: 65.2 - type: recall_at_1000 value: 85.789 - type: recall_at_3 value: 25.401 - type: recall_at_5 value: 30.875999999999998 - task: type: Retrieval dataset: name: MTEB HotpotQA-PL type: clarin-knext/hotpotqa-pl config: default split: test revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907 metrics: - type: map_at_1 value: 38.096000000000004 - type: map_at_10 value: 51.44499999999999 - type: map_at_100 value: 52.325 - type: map_at_1000 value: 52.397000000000006 - type: map_at_3 value: 48.626999999999995 - type: map_at_5 value: 50.342 - type: mrr_at_1 value: 76.19200000000001 - type: mrr_at_10 value: 81.191 - type: mrr_at_100 value: 81.431 - type: mrr_at_1000 value: 81.443 - type: mrr_at_3 value: 80.30199999999999 - type: mrr_at_5 value: 80.85900000000001 - type: ndcg_at_1 value: 76.19200000000001 - type: ndcg_at_10 value: 60.9 - type: ndcg_at_100 value: 64.14699999999999 - type: ndcg_at_1000 value: 65.647 - type: ndcg_at_3 value: 56.818000000000005 - type: ndcg_at_5 value: 59.019999999999996 - type: precision_at_1 value: 76.19200000000001 - type: precision_at_10 value: 12.203 - type: precision_at_100 value: 1.478 - type: precision_at_1000 value: 0.168 - type: precision_at_3 value: 34.616 - type: precision_at_5 value: 22.515 - type: recall_at_1 value: 38.096000000000004 - type: 
recall_at_10 value: 61.013 - type: recall_at_100 value: 73.90299999999999 - type: recall_at_1000 value: 83.91 - type: recall_at_3 value: 51.92400000000001 - type: recall_at_5 value: 56.286 - task: type: Retrieval dataset: name: MTEB MSMARCO-PL type: clarin-knext/msmarco-pl config: default split: test revision: 8634c07806d5cce3a6138e260e59b81760a0a640 metrics: - type: map_at_1 value: 1.548 - type: map_at_10 value: 11.049000000000001 - type: map_at_100 value: 28.874 - type: map_at_1000 value: 34.931 - type: map_at_3 value: 4.162 - type: map_at_5 value: 6.396 - type: mrr_at_1 value: 90.69800000000001 - type: mrr_at_10 value: 92.093 - type: mrr_at_100 value: 92.345 - type: mrr_at_1000 value: 92.345 - type: mrr_at_3 value: 91.86 - type: mrr_at_5 value: 91.86 - type: ndcg_at_1 value: 74.031 - type: ndcg_at_10 value: 63.978 - type: ndcg_at_100 value: 53.101 - type: ndcg_at_1000 value: 60.675999999999995 - type: ndcg_at_3 value: 71.421 - type: ndcg_at_5 value: 68.098 - type: precision_at_1 value: 90.69800000000001 - type: precision_at_10 value: 71.86 - type: precision_at_100 value: 31.395 - type: precision_at_1000 value: 5.981 - type: precision_at_3 value: 84.49600000000001 - type: precision_at_5 value: 79.07 - type: recall_at_1 value: 1.548 - type: recall_at_10 value: 12.149000000000001 - type: recall_at_100 value: 40.794999999999995 - type: recall_at_1000 value: 67.974 - type: recall_at_3 value: 4.244 - type: recall_at_5 value: 6.608 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 73.55413584398119 - type: f1 value: 69.65610882318181 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 76.37188971082716 - type: f1 value: 75.64847309941361 - task: type: Retrieval dataset: name: MTEB NFCorpus-PL type: clarin-knext/nfcorpus-pl config: default split: test revision: 9a6f9567fda928260afed2de480d79c98bf0bec0 metrics: - type: map_at_1 value: 4.919 - type: map_at_10 value: 10.834000000000001 - type: map_at_100 value: 13.38 - type: map_at_1000 value: 14.581 - type: map_at_3 value: 8.198 - type: map_at_5 value: 9.428 - type: mrr_at_1 value: 41.176 - type: mrr_at_10 value: 50.083 - type: mrr_at_100 value: 50.559 - type: mrr_at_1000 value: 50.604000000000006 - type: mrr_at_3 value: 47.936 - type: mrr_at_5 value: 49.407000000000004 - type: ndcg_at_1 value: 39.628 - type: ndcg_at_10 value: 30.098000000000003 - type: ndcg_at_100 value: 27.061 - type: ndcg_at_1000 value: 35.94 - type: ndcg_at_3 value: 35.135 - type: ndcg_at_5 value: 33.335 - type: precision_at_1 value: 41.176 - type: precision_at_10 value: 22.259999999999998 - type: precision_at_100 value: 6.712 - type: precision_at_1000 value: 1.9060000000000001 - type: precision_at_3 value: 33.23 - type: precision_at_5 value: 29.04 - type: recall_at_1 value: 4.919 - type: recall_at_10 value: 14.196 - type: recall_at_100 value: 26.948 - type: recall_at_1000 value: 59.211000000000006 - type: recall_at_3 value: 9.44 - type: recall_at_5 value: 11.569 - task: type: Retrieval dataset: name: MTEB NQ-PL type: clarin-knext/nq-pl config: default split: test revision: f171245712cf85dd4700b06bef18001578d0ca8d metrics: - type: map_at_1 value: 25.35 - type: map_at_10 value: 37.884 - type: map_at_100 value: 38.955 - type: map_at_1000 
value: 39.007999999999996 - type: map_at_3 value: 34.239999999999995 - type: map_at_5 value: 36.398 - type: mrr_at_1 value: 28.737000000000002 - type: mrr_at_10 value: 39.973 - type: mrr_at_100 value: 40.844 - type: mrr_at_1000 value: 40.885 - type: mrr_at_3 value: 36.901 - type: mrr_at_5 value: 38.721 - type: ndcg_at_1 value: 28.708 - type: ndcg_at_10 value: 44.204 - type: ndcg_at_100 value: 48.978 - type: ndcg_at_1000 value: 50.33 - type: ndcg_at_3 value: 37.36 - type: ndcg_at_5 value: 40.912 - type: precision_at_1 value: 28.708 - type: precision_at_10 value: 7.367 - type: precision_at_100 value: 1.0030000000000001 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 17.034 - type: precision_at_5 value: 12.293999999999999 - type: recall_at_1 value: 25.35 - type: recall_at_10 value: 61.411 - type: recall_at_100 value: 82.599 - type: recall_at_1000 value: 92.903 - type: recall_at_3 value: 43.728 - type: recall_at_5 value: 51.854 - task: type: Classification dataset: name: MTEB PAC type: laugustyniak/abusive-clauses-pl config: default split: test revision: None metrics: - type: accuracy value: 69.04141326382856 - type: ap value: 77.49422763833996 - type: f1 value: 66.73472657783407 - task: type: PairClassification dataset: name: MTEB PPC type: PL-MTEB/ppc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 81.0 - type: cos_sim_ap value: 91.47194213011349 - type: cos_sim_f1 value: 84.73767885532592 - type: cos_sim_precision value: 81.49847094801224 - type: cos_sim_recall value: 88.24503311258279 - type: dot_accuracy value: 81.0 - type: dot_ap value: 91.47194213011349 - type: dot_f1 value: 84.73767885532592 - type: dot_precision value: 81.49847094801224 - type: dot_recall value: 88.24503311258279 - type: euclidean_accuracy value: 81.0 - type: euclidean_ap value: 91.47194213011349 - type: euclidean_f1 value: 84.73767885532592 - type: euclidean_precision value: 81.49847094801224 - type: euclidean_recall value: 88.24503311258279 - type: manhattan_accuracy value: 81.0 - type: manhattan_ap value: 91.46464475050571 - type: manhattan_f1 value: 84.48687350835321 - type: manhattan_precision value: 81.31699846860643 - type: manhattan_recall value: 87.91390728476821 - type: max_accuracy value: 81.0 - type: max_ap value: 91.47194213011349 - type: max_f1 value: 84.73767885532592 - task: type: PairClassification dataset: name: MTEB PSC type: PL-MTEB/psc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 97.6808905380334 - type: cos_sim_ap value: 99.27948611836348 - type: cos_sim_f1 value: 96.15975422427034 - type: cos_sim_precision value: 96.90402476780186 - type: cos_sim_recall value: 95.42682926829268 - type: dot_accuracy value: 97.6808905380334 - type: dot_ap value: 99.2794861183635 - type: dot_f1 value: 96.15975422427034 - type: dot_precision value: 96.90402476780186 - type: dot_recall value: 95.42682926829268 - type: euclidean_accuracy value: 97.6808905380334 - type: euclidean_ap value: 99.2794861183635 - type: euclidean_f1 value: 96.15975422427034 - type: euclidean_precision value: 96.90402476780186 - type: euclidean_recall value: 95.42682926829268 - type: manhattan_accuracy value: 97.6808905380334 - type: manhattan_ap value: 99.28715055268721 - type: manhattan_f1 value: 96.14791987673343 - type: manhattan_precision value: 97.19626168224299 - type: manhattan_recall value: 95.1219512195122 - type: max_accuracy value: 97.6808905380334 - type: max_ap value: 
99.28715055268721 - type: max_f1 value: 96.15975422427034 - task: type: Classification dataset: name: MTEB PolEmo2.0-IN type: PL-MTEB/polemo2_in config: default split: test revision: None metrics: - type: accuracy value: 86.16343490304708 - type: f1 value: 83.3442579486744 - task: type: Classification dataset: name: MTEB PolEmo2.0-OUT type: PL-MTEB/polemo2_out config: default split: test revision: None metrics: - type: accuracy value: 68.40080971659918 - type: f1 value: 53.13720751142237 - task: type: Retrieval dataset: name: MTEB Quora-PL type: clarin-knext/quora-pl config: default split: test revision: 0be27e93455051e531182b85e85e425aba12e9d4 metrics: - type: map_at_1 value: 63.322 - type: map_at_10 value: 76.847 - type: map_at_100 value: 77.616 - type: map_at_1000 value: 77.644 - type: map_at_3 value: 73.624 - type: map_at_5 value: 75.603 - type: mrr_at_1 value: 72.88 - type: mrr_at_10 value: 80.376 - type: mrr_at_100 value: 80.604 - type: mrr_at_1000 value: 80.61 - type: mrr_at_3 value: 78.92 - type: mrr_at_5 value: 79.869 - type: ndcg_at_1 value: 72.89999999999999 - type: ndcg_at_10 value: 81.43 - type: ndcg_at_100 value: 83.394 - type: ndcg_at_1000 value: 83.685 - type: ndcg_at_3 value: 77.62599999999999 - type: ndcg_at_5 value: 79.656 - type: precision_at_1 value: 72.89999999999999 - type: precision_at_10 value: 12.548 - type: precision_at_100 value: 1.4869999999999999 - type: precision_at_1000 value: 0.155 - type: precision_at_3 value: 34.027 - type: precision_at_5 value: 22.654 - type: recall_at_1 value: 63.322 - type: recall_at_10 value: 90.664 - type: recall_at_100 value: 97.974 - type: recall_at_1000 value: 99.636 - type: recall_at_3 value: 80.067 - type: recall_at_5 value: 85.526 - task: type: Retrieval dataset: name: MTEB SCIDOCS-PL type: clarin-knext/scidocs-pl config: default split: test revision: 45452b03f05560207ef19149545f168e596c9337 metrics: - type: map_at_1 value: 3.95 - type: map_at_10 value: 9.658999999999999 - type: map_at_100 value: 11.384 - type: map_at_1000 value: 11.677 - type: map_at_3 value: 7.055 - type: map_at_5 value: 8.244 - type: mrr_at_1 value: 19.5 - type: mrr_at_10 value: 28.777 - type: mrr_at_100 value: 29.936 - type: mrr_at_1000 value: 30.009999999999998 - type: mrr_at_3 value: 25.55 - type: mrr_at_5 value: 27.284999999999997 - type: ndcg_at_1 value: 19.5 - type: ndcg_at_10 value: 16.589000000000002 - type: ndcg_at_100 value: 23.879 - type: ndcg_at_1000 value: 29.279 - type: ndcg_at_3 value: 15.719 - type: ndcg_at_5 value: 13.572000000000001 - type: precision_at_1 value: 19.5 - type: precision_at_10 value: 8.62 - type: precision_at_100 value: 1.924 - type: precision_at_1000 value: 0.322 - type: precision_at_3 value: 14.6 - type: precision_at_5 value: 11.78 - type: recall_at_1 value: 3.95 - type: recall_at_10 value: 17.477999999999998 - type: recall_at_100 value: 38.99 - type: recall_at_1000 value: 65.417 - type: recall_at_3 value: 8.883000000000001 - type: recall_at_5 value: 11.933 - task: type: PairClassification dataset: name: MTEB SICK-E-PL type: PL-MTEB/sicke-pl-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 83.48960456583775 - type: cos_sim_ap value: 76.31522115825375 - type: cos_sim_f1 value: 70.35573122529645 - type: cos_sim_precision value: 70.9934735315446 - type: cos_sim_recall value: 69.72934472934473 - type: dot_accuracy value: 83.48960456583775 - type: dot_ap value: 76.31522115825373 - type: dot_f1 value: 70.35573122529645 - type: dot_precision value: 70.9934735315446 - type: 
dot_recall value: 69.72934472934473 - type: euclidean_accuracy value: 83.48960456583775 - type: euclidean_ap value: 76.31522115825373 - type: euclidean_f1 value: 70.35573122529645 - type: euclidean_precision value: 70.9934735315446 - type: euclidean_recall value: 69.72934472934473 - type: manhattan_accuracy value: 83.46922136159804 - type: manhattan_ap value: 76.18474601388084 - type: manhattan_f1 value: 70.34779490856937 - type: manhattan_precision value: 70.83032490974729 - type: manhattan_recall value: 69.87179487179486 - type: max_accuracy value: 83.48960456583775 - type: max_ap value: 76.31522115825375 - type: max_f1 value: 70.35573122529645 - task: type: STS dataset: name: MTEB SICK-R-PL type: PL-MTEB/sickr-pl-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 77.95374883876302 - type: cos_sim_spearman value: 73.77630219171942 - type: euclidean_pearson value: 75.81927069594934 - type: euclidean_spearman value: 73.7763211303831 - type: manhattan_pearson value: 76.03126859057528 - type: manhattan_spearman value: 73.96528138013369 - task: type: STS dataset: name: MTEB STS22 (pl) type: mteb/sts22-crosslingual-sts config: pl split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 37.388282764841826 - type: cos_sim_spearman value: 40.83477184710897 - type: euclidean_pearson value: 26.754737044177805 - type: euclidean_spearman value: 40.83477184710897 - type: manhattan_pearson value: 26.760453110872458 - type: manhattan_spearman value: 41.034477441383856 - task: type: Retrieval dataset: name: MTEB SciFact-PL type: clarin-knext/scifact-pl config: default split: test revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e metrics: - type: map_at_1 value: 49.15 - type: map_at_10 value: 61.690999999999995 - type: map_at_100 value: 62.348000000000006 - type: map_at_1000 value: 62.38 - type: map_at_3 value: 58.824 - type: map_at_5 value: 60.662000000000006 - type: mrr_at_1 value: 51.333 - type: mrr_at_10 value: 62.731 - type: mrr_at_100 value: 63.245 - type: mrr_at_1000 value: 63.275000000000006 - type: mrr_at_3 value: 60.667 - type: mrr_at_5 value: 61.93300000000001 - type: ndcg_at_1 value: 51.333 - type: ndcg_at_10 value: 67.168 - type: ndcg_at_100 value: 69.833 - type: ndcg_at_1000 value: 70.56700000000001 - type: ndcg_at_3 value: 62.40599999999999 - type: ndcg_at_5 value: 65.029 - type: precision_at_1 value: 51.333 - type: precision_at_10 value: 9.333 - type: precision_at_100 value: 1.0699999999999998 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 25.333 - type: precision_at_5 value: 17.067 - type: recall_at_1 value: 49.15 - type: recall_at_10 value: 82.533 - type: recall_at_100 value: 94.167 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 69.917 - type: recall_at_5 value: 76.356 - task: type: Retrieval dataset: name: MTEB TRECCOVID-PL type: clarin-knext/trec-covid-pl config: default split: test revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd metrics: - type: map_at_1 value: 0.261 - type: map_at_10 value: 2.1260000000000003 - type: map_at_100 value: 12.171999999999999 - type: map_at_1000 value: 26.884999999999998 - type: map_at_3 value: 0.695 - type: map_at_5 value: 1.134 - type: mrr_at_1 value: 96.0 - type: mrr_at_10 value: 96.952 - type: mrr_at_100 value: 96.952 - type: mrr_at_1000 value: 96.952 - type: mrr_at_3 value: 96.667 - type: mrr_at_5 value: 96.667 - type: ndcg_at_1 value: 92.0 - type: ndcg_at_10 value: 81.193 - type: ndcg_at_100 value: 61.129 - type: 
ndcg_at_1000 value: 51.157 - type: ndcg_at_3 value: 85.693 - type: ndcg_at_5 value: 84.129 - type: precision_at_1 value: 96.0 - type: precision_at_10 value: 85.39999999999999 - type: precision_at_100 value: 62.03999999999999 - type: precision_at_1000 value: 22.224 - type: precision_at_3 value: 88.0 - type: precision_at_5 value: 88.0 - type: recall_at_1 value: 0.261 - type: recall_at_10 value: 2.262 - type: recall_at_100 value: 14.981 - type: recall_at_1000 value: 46.837 - type: recall_at_3 value: 0.703 - type: recall_at_5 value: 1.172 - task: type: Clustering dataset: name: MTEB AlloProfClusteringP2P type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: v_measure value: 70.55290063940157 - type: v_measure value: 55.41500719337263 - task: type: Reranking dataset: name: MTEB AlloprofReranking type: lyon-nlp/mteb-fr-reranking-alloprof-s2p config: default split: test revision: 666fdacebe0291776e86f29345663dfaf80a0db9 metrics: - type: map value: 73.48697375332002 - type: mrr value: 75.01836585523822 - task: type: Retrieval dataset: name: MTEB AlloprofRetrieval type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: map_at_1 value: 38.454 - type: map_at_10 value: 51.605000000000004 - type: map_at_100 value: 52.653000000000006 - type: map_at_1000 value: 52.697 - type: map_at_3 value: 48.304 - type: map_at_5 value: 50.073 - type: mrr_at_1 value: 43.307 - type: mrr_at_10 value: 54.400000000000006 - type: mrr_at_100 value: 55.147999999999996 - type: mrr_at_1000 value: 55.174 - type: mrr_at_3 value: 51.77 - type: mrr_at_5 value: 53.166999999999994 - type: ndcg_at_1 value: 43.307 - type: ndcg_at_10 value: 57.891000000000005 - type: ndcg_at_100 value: 62.161 - type: ndcg_at_1000 value: 63.083 - type: ndcg_at_3 value: 51.851 - type: ndcg_at_5 value: 54.605000000000004 - type: precision_at_1 value: 43.307 - type: precision_at_10 value: 9.033 - type: precision_at_100 value: 1.172 - type: precision_at_1000 value: 0.127 - type: precision_at_3 value: 22.798 - type: precision_at_5 value: 15.492 - type: recall_at_1 value: 38.454 - type: recall_at_10 value: 74.166 - type: recall_at_100 value: 92.43599999999999 - type: recall_at_1000 value: 99.071 - type: recall_at_3 value: 58.087 - type: recall_at_5 value: 64.568 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 53.474 - type: f1 value: 50.38275392350236 - task: type: Retrieval dataset: name: MTEB BSARDRetrieval type: maastrichtlawtech/bsard config: default split: test revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59 metrics: - type: map_at_1 value: 2.252 - type: map_at_10 value: 4.661 - type: map_at_100 value: 5.271 - type: map_at_1000 value: 5.3629999999999995 - type: map_at_3 value: 3.604 - type: map_at_5 value: 4.3020000000000005 - type: mrr_at_1 value: 2.252 - type: mrr_at_10 value: 4.661 - type: mrr_at_100 value: 5.271 - type: mrr_at_1000 value: 5.3629999999999995 - type: mrr_at_3 value: 3.604 - type: mrr_at_5 value: 4.3020000000000005 - type: ndcg_at_1 value: 2.252 - type: ndcg_at_10 value: 6.3020000000000005 - type: ndcg_at_100 value: 10.342 - type: ndcg_at_1000 value: 13.475999999999999 - type: ndcg_at_3 value: 4.0649999999999995 - type: ndcg_at_5 value: 5.344 - type: precision_at_1 value: 2.252 - type: precision_at_10 value: 1.171 - type: 
precision_at_100 value: 0.333 - type: precision_at_1000 value: 0.059000000000000004 - type: precision_at_3 value: 1.802 - type: precision_at_5 value: 1.712 - type: recall_at_1 value: 2.252 - type: recall_at_10 value: 11.712 - type: recall_at_100 value: 33.333 - type: recall_at_1000 value: 59.458999999999996 - type: recall_at_3 value: 5.405 - type: recall_at_5 value: 8.559 - task: type: Clustering dataset: name: MTEB HALClusteringS2S type: lyon-nlp/clustering-hal-s2s config: default split: test revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915 metrics: - type: v_measure value: 28.301882091023288 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P type: mlsum config: default split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: v_measure value: 45.26992995191701 - type: v_measure value: 42.773174876871145 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.47635452552458 - type: f1 value: 93.19922617577213 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 80.2317569683683 - type: f1 value: 56.18060418621901 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: accuracy value: 85.18957345971565 - type: f1 value: 80.829981537394 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: v_measure value: 71.04138999801822 - type: v_measure value: 71.7056263158008 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 76.65097511768661 - type: f1 value: 73.82441070598712 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fr) type: mteb/amazon_massive_scenario config: fr split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 79.09885675857431 - type: f1 value: 78.28407777434224 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (fr) type: jinaai/mintakaqa config: fr split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: map_at_1 value: 25.307000000000002 - type: map_at_10 value: 36.723 - type: map_at_100 value: 37.713 - type: map_at_1000 value: 37.769000000000005 - type: map_at_3 value: 33.77 - type: map_at_5 value: 35.463 - type: mrr_at_1 value: 25.307000000000002 - type: mrr_at_10 value: 36.723 - type: mrr_at_100 value: 37.713 - type: mrr_at_1000 value: 37.769000000000005 - type: mrr_at_3 value: 33.77 - type: mrr_at_5 value: 35.463 - type: ndcg_at_1 value: 25.307000000000002 - type: ndcg_at_10 value: 42.559999999999995 - type: ndcg_at_100 value: 47.457 - type: ndcg_at_1000 value: 49.162 - type: ndcg_at_3 value: 36.461 - type: ndcg_at_5 value: 39.504 - type: precision_at_1 value: 25.307000000000002 - type: precision_at_10 value: 6.106 - type: precision_at_100 value: 0.8420000000000001 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 14.741999999999999 - type: precision_at_5 
value: 10.319 - type: recall_at_1 value: 25.307000000000002 - type: recall_at_10 value: 61.056999999999995 - type: recall_at_100 value: 84.152 - type: recall_at_1000 value: 98.03399999999999 - type: recall_at_3 value: 44.226 - type: recall_at_5 value: 51.597 - task: type: PairClassification dataset: name: MTEB OpusparcusPC (fr) type: GEM/opusparcus config: fr split: test revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cos_sim_accuracy value: 99.90069513406156 - type: cos_sim_ap value: 100.0 - type: cos_sim_f1 value: 99.95032290114257 - type: cos_sim_precision value: 100.0 - type: cos_sim_recall value: 99.90069513406156 - type: dot_accuracy value: 99.90069513406156 - type: dot_ap value: 100.0 - type: dot_f1 value: 99.95032290114257 - type: dot_precision value: 100.0 - type: dot_recall value: 99.90069513406156 - type: euclidean_accuracy value: 99.90069513406156 - type: euclidean_ap value: 100.0 - type: euclidean_f1 value: 99.95032290114257 - type: euclidean_precision value: 100.0 - type: euclidean_recall value: 99.90069513406156 - type: manhattan_accuracy value: 99.90069513406156 - type: manhattan_ap value: 100.0 - type: manhattan_f1 value: 99.95032290114257 - type: manhattan_precision value: 100.0 - type: manhattan_recall value: 99.90069513406156 - type: max_accuracy value: 99.90069513406156 - type: max_ap value: 100.0 - type: max_f1 value: 99.95032290114257 - task: type: PairClassification dataset: name: MTEB PawsX (fr) type: paws-x config: fr split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cos_sim_accuracy value: 70.8 - type: cos_sim_ap value: 73.7671529695957 - type: cos_sim_f1 value: 68.80964339527875 - type: cos_sim_precision value: 62.95955882352941 - type: cos_sim_recall value: 75.85825027685493 - type: dot_accuracy value: 70.8 - type: dot_ap value: 73.78345265366947 - type: dot_f1 value: 68.80964339527875 - type: dot_precision value: 62.95955882352941 - type: dot_recall value: 75.85825027685493 - type: euclidean_accuracy value: 70.8 - type: euclidean_ap value: 73.7671529695957 - type: euclidean_f1 value: 68.80964339527875 - type: euclidean_precision value: 62.95955882352941 - type: euclidean_recall value: 75.85825027685493 - type: manhattan_accuracy value: 70.75 - type: manhattan_ap value: 73.78996383615953 - type: manhattan_f1 value: 68.79432624113475 - type: manhattan_precision value: 63.39869281045751 - type: manhattan_recall value: 75.1937984496124 - type: max_accuracy value: 70.8 - type: max_ap value: 73.78996383615953 - type: max_f1 value: 68.80964339527875 - task: type: STS dataset: name: MTEB SICKFr type: Lajavaness/SICK-fr config: default split: test revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a metrics: - type: cos_sim_pearson value: 84.03253762760392 - type: cos_sim_spearman value: 79.68280105762004 - type: euclidean_pearson value: 80.98265050044444 - type: euclidean_spearman value: 79.68233242682867 - type: manhattan_pearson value: 80.9678911810704 - type: manhattan_spearman value: 79.70264097683109 - task: type: STS dataset: name: MTEB STS22 (fr) type: mteb/sts22-crosslingual-sts config: fr split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 80.56896987572884 - type: cos_sim_spearman value: 81.84352499523287 - type: euclidean_pearson value: 80.40831759421305 - type: euclidean_spearman value: 81.84352499523287 - type: manhattan_pearson value: 80.74333857561238 - type: manhattan_spearman value: 82.41503246733892 - task: type: STS dataset: name: MTEB 
STSBenchmarkMultilingualSTS (fr) type: stsb_multi_mt config: fr split: test revision: 93d57ef91790589e3ce9c365164337a8a78b7632 metrics: - type: cos_sim_pearson value: 82.71826762276979 - type: cos_sim_spearman value: 82.25433354916042 - type: euclidean_pearson value: 81.87115571724316 - type: euclidean_spearman value: 82.25322342890107 - type: manhattan_pearson value: 82.11174867527224 - type: manhattan_spearman value: 82.55905365203084 - task: type: Summarization dataset: name: MTEB SummEvalFr type: lyon-nlp/summarization-summeval-fr-p2p config: default split: test revision: b385812de6a9577b6f4d0f88c6a6e35395a94054 metrics: - type: cos_sim_pearson value: 30.659441623392887 - type: cos_sim_spearman value: 30.501134097353315 - type: dot_pearson value: 30.659444768851056 - type: dot_spearman value: 30.501134097353315 - task: type: Reranking dataset: name: MTEB SyntecReranking type: lyon-nlp/mteb-fr-reranking-syntec-s2p config: default split: test revision: b205c5084a0934ce8af14338bf03feb19499c84d metrics: - type: map value: 94.03333333333333 - type: mrr value: 94.03333333333333 - task: type: Retrieval dataset: name: MTEB SyntecRetrieval type: lyon-nlp/mteb-fr-retrieval-syntec-s2p config: default split: test revision: 77f7e271bf4a92b24fce5119f3486b583ca016ff metrics: - type: map_at_1 value: 79.0 - type: map_at_10 value: 87.61 - type: map_at_100 value: 87.655 - type: map_at_1000 value: 87.655 - type: map_at_3 value: 87.167 - type: map_at_5 value: 87.36699999999999 - type: mrr_at_1 value: 79.0 - type: mrr_at_10 value: 87.61 - type: mrr_at_100 value: 87.655 - type: mrr_at_1000 value: 87.655 - type: mrr_at_3 value: 87.167 - type: mrr_at_5 value: 87.36699999999999 - type: ndcg_at_1 value: 79.0 - type: ndcg_at_10 value: 90.473 - type: ndcg_at_100 value: 90.694 - type: ndcg_at_1000 value: 90.694 - type: ndcg_at_3 value: 89.464 - type: ndcg_at_5 value: 89.851 - type: precision_at_1 value: 79.0 - type: precision_at_10 value: 9.9 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 32.0 - type: precision_at_5 value: 19.400000000000002 - type: recall_at_1 value: 79.0 - type: recall_at_10 value: 99.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 96.0 - type: recall_at_5 value: 97.0 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fr) type: jinaai/xpqa config: fr split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: map_at_1 value: 39.395 - type: map_at_10 value: 59.123999999999995 - type: map_at_100 value: 60.704 - type: map_at_1000 value: 60.760000000000005 - type: map_at_3 value: 53.187 - type: map_at_5 value: 56.863 - type: mrr_at_1 value: 62.083 - type: mrr_at_10 value: 68.87299999999999 - type: mrr_at_100 value: 69.46900000000001 - type: mrr_at_1000 value: 69.48299999999999 - type: mrr_at_3 value: 66.8 - type: mrr_at_5 value: 67.928 - type: ndcg_at_1 value: 62.083 - type: ndcg_at_10 value: 65.583 - type: ndcg_at_100 value: 70.918 - type: ndcg_at_1000 value: 71.72800000000001 - type: ndcg_at_3 value: 60.428000000000004 - type: ndcg_at_5 value: 61.853 - type: precision_at_1 value: 62.083 - type: precision_at_10 value: 15.033 - type: precision_at_100 value: 1.9529999999999998 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 36.315 - type: precision_at_5 value: 25.955000000000002 - type: recall_at_1 value: 39.395 - type: recall_at_10 value: 74.332 - type: recall_at_100 value: 94.729 - type: recall_at_1000 value: 99.75500000000001 - type: 
recall_at_3 value: 57.679 - type: recall_at_5 value: 65.036
---

# mxs980/gte-Qwen2-1.5B-instruct-Q8_0-GGUF
This model was converted to GGUF format from [`Alibaba-NLP/gte-Qwen2-1.5B-instruct`](https://huggingface.co/Alibaba-NLP/gte-Qwen2-1.5B-instruct) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Alibaba-NLP/gte-Qwen2-1.5B-instruct) for more details on the model.

## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux):

```bash
brew install llama.cpp
```

Invoke the llama.cpp server or the CLI.

### CLI:
```bash
llama-cli --hf-repo mxs980/gte-Qwen2-1.5B-instruct-Q8_0-GGUF --hf-file gte-qwen2-1.5b-instruct-q8_0.gguf -p "The meaning to life and the universe is"
```

### Server:
```bash
llama-server --hf-repo mxs980/gte-Qwen2-1.5B-instruct-Q8_0-GGUF --hf-file gte-qwen2-1.5b-instruct-q8_0.gguf -c 2048
```

Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.

Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```

Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with any other hardware-specific flags (e.g. `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```

Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo mxs980/gte-Qwen2-1.5B-instruct-Q8_0-GGUF --hf-file gte-qwen2-1.5b-instruct-q8_0.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo mxs980/gte-Qwen2-1.5B-instruct-Q8_0-GGUF --hf-file gte-qwen2-1.5b-instruct-q8_0.gguf -c 2048
```
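Since the underlying gte-Qwen2 model is a text-embedding model, you will usually want embedding vectors rather than free-form generation. Below is a minimal, illustrative sketch using the `llama-embedding` binary that ships with recent llama.cpp builds; flag support can vary by version, so treat this as an assumption-laden example rather than the canonical invocation.

```bash
# Illustrative sketch: compute an embedding for a single sentence.
# Assumes a llama.cpp build recent enough to include llama-embedding
# and the --hf-repo/--hf-file download helpers used above.
llama-embedding --hf-repo mxs980/gte-Qwen2-1.5B-instruct-Q8_0-GGUF \
  --hf-file gte-qwen2-1.5b-instruct-q8_0.gguf \
  -p "What is the capital of China?"
```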
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
medspaner/xlm-roberta-large-spanish-trials-misc-ents
medspaner
token-classification
[ "transformers", "pytorch", "xlm-roberta", "token-classification", "generated_from_trainer", "license:cc-by-nc-4.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,705
1,727
12
0
--- license: cc-by-nc-4.0 metrics: - precision - recall - f1 - accuracy tags: - generated_from_trainer widget: - text: Paciente normotenso (PA = 120/70 mmHg) model-index: - name: xlm-roberta-large-spanish-trials-cases-misc-ents results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # xlm-roberta-large-spanish-trials-cases-misc-ents This named entity recognition model detects the following types of medical entities: - Concept: e.g. *ANOVA* - Food_or_Drink: e.g. *alcohol*, *soja* - Observation: clinical findings/observations: e.g. *normotenso* - Quantifier_or_Qualifier: e.g. *grave* - Result_or_Value: result of a diagnostic procedure or laboratory analysis: e.g. *120/70 mmHg* The model achieves the following results on the test set (results are averaged over 5 evaluation rounds): - Precision: 0.715 (±0.014) - Recall: 0.672 (±0.016) - F1: 0.692 (±0.007) - Accuracy: 0.957 (±0.001) ## Model description This model adapts the pre-trained model [xlm-roberta-large-spanish-clinical](https://huggingface.co/llange/xlm-roberta-large-spanish-clinical), presented in [Lange et al. (2022)](https://academic.oup.com/bioinformatics/article/38/12/3267/6575884). It is fine-tuned to conduct medical named entity recognition on texts about in Spanish. The model is fine-tuned on the [CT-EBM-ES corpus (Campillos-Llanos et al. 2021)](https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-021-01395-z). If you use this model, please, cite as follows: ``` @article{campillosetal2024,         title = {{Hybrid tool for semantic annotation and concept extraction of medical texts in Spanish}},         author = {Campillos-Llanos, Leonardo and Valverde-Mateos, Ana and Capllonch-Carri{\'o}n, Adri{\'a}n},         journal = {BMC Bioinformatics}, year={2024}, publisher={BioMed Central} } ``` ## Intended uses & limitations **Disclosure**: *This model is under development and needs to be improved. It should not be used for medical decision making without human assistance and supervision* This model is intended for a generalist purpose, and may have bias and/or any other undesirable distortions. Third parties who deploy or provide systems and/or services using any of these models (or using systems based on these models) should note that it is their responsibility to mitigate the risks arising from their use. Third parties, in any event, need to comply with applicable regulations, including regulations concerning the use of artificial intelligence. The owner or creator of the models will in no event be liable for any results arising from the use made by third parties of these models. **Descargo de responsabilidad**: *Esta herramienta se encuentra en desarrollo y no debe ser empleada para la toma de decisiones médicas* La finalidad de este modelo es generalista, y se advierte que puede tener sesgos y/u otro tipo de distorsiones indeseables. Terceras partes que desplieguen o proporcionen sistemas y/o servicios usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) han tener presente que es su responsabilidad abordar y minimizar los riesgos derivados de su uso. Las terceras partes, en cualquier circunstancia, deben cumplir con la normativa aplicable, incluyendo la normativa que concierne al uso de la inteligencia artificial. 
El propietario o creador de los modelos de ningún modo será responsable de los resultados derivados del uso que las terceras partes hagan de estos modelos. ## Training and evaluation data The model is fine-tuned on the Clinical Trials for Evidence-Based-Medicine in Spanish (CT-EBM-SP) corpus, version 2. The CT-EBM-SP corpus is a collection of 1,200 texts about clinical trial studies and clinical trial announcements: - 500 abstracts from journals published under a Creative Commons license, e.g. available in PubMed or the Scientific Electronic Library Online (SciELO) - 700 clinical trial announcements published in the European Clinical Trials Register and Repositorio Español de Estudios Clínicos If you use the CT-EBM-ES resource, please cite as follows: ``` @article{campillosetal-midm2021,         title = {A clinical trials corpus annotated with UMLS© entities to enhance the access to Evidence-Based Medicine},         author = {Campillos-Llanos, Leonardo and Valverde-Mateos, Ana and Capllonch-Carri{\'o}n, Adri{\'a}n and Moreno-Sandoval, Antonio},         journal = {BMC Medical Informatics and Decision Making},         volume={21}, number={1}, pages={1--19}, year={2021}, publisher={BioMed Central} } ``` ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: we used different seeds for 5 evaluation rounds, and uploaded the model with the best results - optimizer: Adam - num_epochs: average 19.4 (±6.07); trained with early stopping if no improvement after 5 epochs (early stopping patience: 5) ### Training results (test set; average and standard deviation of 5 rounds with different seeds) | Precision | Recall | F1 | Accuracy | |:--------------:|:--------------:|:--------------:|:--------------:| | 0.715 (±0.014) | 0.672 (±0.016) | 0.692 (±0.007) | 0.957 (±0.001) | ### Framework versions - Transformers 4.17.0 - Pytorch 1.10.2+cu113 - Datasets 1.18.4 - Tokenizers 0.11.6
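## How to use

The card does not include a usage snippet, so the following is a minimal sketch of how a fine-tuned token-classification model like this one is typically loaded with the Transformers `pipeline` API. The repository path is an assumption (substitute the actual hub id or a local checkpoint directory); the example text is the widget sentence from the card.

```python
from transformers import pipeline

# Hypothetical path -- replace with the actual repository id or a local directory.
MODEL_PATH = "xlm-roberta-large-spanish-trials-cases-misc-ents"

# aggregation_strategy="simple" merges word pieces into whole entity spans.
ner = pipeline("token-classification", model=MODEL_PATH, aggregation_strategy="simple")

for entity in ner("Paciente normotenso (PA = 120/70 mmHg)"):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
# Expected kinds of spans, per the entity list above: an Observation
# ("normotenso") and a Result_or_Value ("120/70 mmHg").
```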
[ "NAMED_ENTITY_RECOGNITION" ]
[ "CT-EBM-SP", "SCIELO" ]
BioNLP
EleutherAI/pythia-160m-v0
EleutherAI
text-generation
[ "transformers", "pytorch", "safetensors", "gpt_neox", "text-generation", "causal-lm", "pythia", "pythia_v0", "en", "dataset:the_pile", "arxiv:2101.00027", "arxiv:2201.07311", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,665
1,688
596
8
--- datasets: - the_pile language: - en license: apache-2.0 tags: - pytorch - causal-lm - pythia - pythia_v0 --- The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research. It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. All Pythia models are available [on Hugging Face](https://huggingface.co/models?other=pythia). The Pythia model suite was deliberately designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites. Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts. ## Pythia-160M ### Model Details - Developed by: [EleutherAI](http://eleuther.ai) - Model type: Transformer-based Language Model - Language: English - Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use. - Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox) - License: Apache 2.0 - Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [[email protected]](mailto:[email protected]). <figure> | Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models | | -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: | | 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — | | 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M | | 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M | | 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — | | 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B | | 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B | | 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B | | 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — | <figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. “Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption> </figure> ### Uses and Limitations #### Intended Use The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. To enable the study of how language models change over the course of training, we provide 143 evenly spaced intermediate checkpoints per model.
These checkpoints are hosted on Hugging Face as branches. Note that branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model. You may also further fine-tune and adapt Pythia-160M for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-160M as a basis for your fine-tuned model, please conduct your own risk and bias assessment. #### Out-of-scope use The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. Pythia models are English-language only, and are not suitable for translation or generating text in other languages. Pythia-160M has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means Pythia-160M will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “understand” human instructions. #### Limitations and biases The core functionality of a large language model is to take a string of text and predict the next token. The token deemed statistically most likely by the model need not produce the most “accurate” text. Never rely on Pythia-160M to produce factually accurate output. This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-160M may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive. If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting it to other people. Please inform your audience that the text was generated by Pythia-160M. ### Quickstart Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint: ```python from transformers import GPTNeoXForCausalLM, AutoTokenizer model = GPTNeoXForCausalLM.from_pretrained( "EleutherAI/pythia-70m-deduped", revision="step3000", cache_dir="./pythia-70m-deduped/step3000", ) tokenizer = AutoTokenizer.from_pretrained( "EleutherAI/pythia-70m-deduped", revision="step3000", cache_dir="./pythia-70m-deduped/step3000", ) inputs = tokenizer("Hello, I am", return_tensors="pt") tokens = model.generate(**inputs) tokenizer.decode(tokens[0]) ``` Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br> For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia). ### Training #### Training data [The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g.
GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).<br> The Pile was **not** deduplicated before being used to train Pythia-160M. #### Training procedure All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile. All *Pythia* models trained for the equivalent of 143000 steps at a batch size of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models listed with a batch size of 4M tokens were originally trained for 71500 steps instead, with checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for consistency with all 2M batch models, so `step1000` is the first checkpoint for `pythia-1.4b` that was saved (corresponding to step 500 in training), and `step1000` is likewise the first `pythia-6.9b` checkpoint that was saved (corresponding to 1000 “actual” steps).<br> See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br> Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b). ### Evaluations All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br> Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM. <details> <summary>LAMBADA – OpenAI</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/> </details> <details> <summary>Physical Interaction: Question Answering (PIQA)</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/> </details> <details> <summary>WinoGrande</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/> </details> <details> <summary>AI2 Reasoning Challenge—Challenge Set</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/> </details> <details> <summary>SciQ</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/> </details> ### Naming convention and parameter count *Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em"> | current Pythia suffix | old suffix | total params | non-embedding params | | --------------------: | ---------: | -------------: | -------------------: | | 70M | 19M | 70,426,624 | 18,915,328 | | 160M | 125M | 162,322,944 | 85,056,000 | | 410M | 350M | 405,334,016 | 302,311,424 | | 1B | 800M | 1,011,781,632 | 805,736,448 | | 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 | | 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 | | 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 | | 12B | 13B | 11,846,072,320 | 11,327,027,200 | </figure>
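To make the checkpoint scheme described under Training concrete, here is a minimal sketch (an illustration, not an official recipe) that scores the same sentence under a few intermediate revisions of the `pythia-70m-deduped` model used in the Quickstart. Revision names follow the card's `stepN` convention; adjust them if a particular branch is absent.

```python
import torch
from transformers import GPTNeoXForCausalLM, AutoTokenizer

# Track how next-token loss on a fixed sentence falls over training.
revisions = ["step1000", "step3000", "step143000"]  # step143000 == main
text = "The capital of France is Paris."

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m-deduped")
inputs = tokenizer(text, return_tensors="pt")

for rev in revisions:
    model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/pythia-70m-deduped", revision=rev)
    with torch.no_grad():
        # Passing labels=input_ids yields the mean cross-entropy over the sequence.
        loss = model(**inputs, labels=inputs["input_ids"]).loss
    print(f"{rev}: loss = {loss.item():.3f}")
```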
[ "QUESTION_ANSWERING", "TRANSLATION" ]
[ "SCIQ" ]
Non_BioNLP
LoneStriker/OpenBioLLM-Llama3-8B-6.0bpw-h6-exl2
LoneStriker
text-generation
[ "transformers", "pytorch", "llama", "text-generation", "llama-3", "Mixtral", "instruct", "finetune", "chatml", "DPO", "RLHF", "gpt4", "distillation", "en", "arxiv:2305.18290", "arxiv:2303.13375", "arxiv:2212.13138", "arxiv:2305.09617", "arxiv:2402.07023", "base_model:meta-llama/Meta-Llama-3-8B", "base_model:quantized:meta-llama/Meta-Llama-3-8B", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "6-bit", "exl2", "region:us" ]
1,714
1,714
7
0
--- base_model: meta-llama/Meta-Llama-3-8B language: - en license: llama3 tags: - llama-3 - llama - Mixtral - instruct - finetune - chatml - DPO - RLHF - gpt4 - distillation widget: - example_title: OpenBioLLM-8B messages: - role: system content: You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. - role: user content: How long does it take for newborn jaundice to go away? output: text: 'Newborn jaundice, also known as neonatal jaundice, is a common condition in newborns where the yellowing of the skin and eyes occurs due to an elevated level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when red blood cells break down. In most cases, newborn jaundice resolves on its own without any specific treatment. The duration of newborn jaundice can vary depending on several factors such as the underlying cause, gestational age at birth, and individual variations in bilirubin metabolism. Here are some general guidelines: 1. Physiological jaundice: This is the most common type of newborn jaundice and usually appears within 24-72 hours after birth. It tends to peak between the second and fifth day of life and gradually improves over the next week or two. By the time the baby is one week old, the jaundice should have mostly resolved. 2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and may appear later than physiological jaundice, typically between the fifth and fourteenth day of life. It tends to persist for a longer duration but usually resolves within six weeks after birth. 3. Pathological jaundice: This type of jaundice is less common and occurs due to an underlying medical condition that affects bilirubin metabolism or liver function. The duration of pathological jaundice depends on the specific cause and may require treatment. It''s important for parents to monitor their newborn''s jaundice closely and seek medical advice if the jaundice progresses rapidly, becomes severe, or is accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness. In these cases, further evaluation and management may be necessary. Remember that each baby is unique, and the timing of jaundice resolution can vary. If you have concerns about your newborn''s jaundice, it''s always best to consult with a healthcare professional for personalized advice and guidance.' 
model-index: - name: OpenBioLLM-8B results: [] --- <div align="center"> <img width="260px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/BrQCb95lmEIFz79QAmoNA.png"></div> ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/2FhDh8NDvMl7iSxbQz9BP.png) <div align="center"> <h1>Advancing Open-source Large Language Models in Medical Domain</h1> </div> <p align="center" style="margin-top: 0px;"> <a href="https://colab.research.google.com/drive/1F5oV20InEYeAJGmBwYF9NM_QhLmjBkKJ?usp=sharing"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="OpenChat Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text" style=" margin-right: 5px;">Online Demo</span> </a> | <a href="https://github.com/openlifescience-ai"> <img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" alt="GitHub Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text" style=" margin-right: 5px;">GitHub</span> </a> | <a href="#"> <img src="https://github.com/alpayariyak/openchat/blob/master/assets/arxiv-logomark-small-square-border.png?raw=true" alt="ArXiv Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text" style="margin-right: 5px;">Paper</span> </a> | <a href="https://discord.gg/A5Fjf5zC69"> <img src="https://cloud.githubusercontent.com/assets/6291467/26705903/96c2d66e-477c-11e7-9f4e-f3c0efe96c9a.png" alt="Discord Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/> <span class="link-text">Discord</span> </a> </p> ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/KGmRE5w2sepNtwsEu8t7K.jpeg) Introducing OpenBioLLM-8B: A State-of-the-Art Open Source Biomedical Large Language Model OpenBioLLM-8B is an advanced open source language model designed specifically for the biomedical domain. Developed by Saama AI Labs, this model leverages cutting-edge techniques to achieve state-of-the-art performance on a wide range of biomedical tasks. 🏥 **Biomedical Specialization**: OpenBioLLM-8B is tailored for the unique language and knowledge requirements of the medical and life sciences fields. It was fine-tuned on a vast corpus of high-quality biomedical data, enabling it to understand and generate text with domain-specific accuracy and fluency. 🎓 **Superior Performance**: With 8 billion parameters, OpenBioLLM-8B outperforms other open source biomedical language models of similar scale. It has also demonstrated better results compared to larger proprietary & open-source models like GPT-3.5 and Meditron-70B on biomedical benchmarks. 🧠 **Advanced Training Techniques**: OpenBioLLM-8B builds upon the powerful foundation of the [Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) base model. It incorporates the DPO dataset and fine-tuning recipe along with a custom diverse medical instruction dataset.
Key components of the training pipeline include: <div align="center"> <img width="1200px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/oPchsJsEpQoGcGXVbh7YS.png"> </div> - **Policy Optimization**: [Direct Preference Optimization: Your Language Model is Secretly a Reward Model (DPO)](https://arxiv.org/abs/2305.18290) - **Ranking Dataset**: [berkeley-nest/Nectar](https://huggingface.co/datasets/berkeley-nest/Nectar) - **Fine-tuning dataset**: Custom Medical Instruct dataset (We plan to release a sample training dataset in our upcoming paper; please stay updated) This combination of cutting-edge techniques enables OpenBioLLM-8B to align with key capabilities and preferences for biomedical applications. ⚙️ **Release Details**: - **Model Size**: 8 billion parameters - **Quantization**: Optimized quantized versions available [Here](https://huggingface.co/aaditya/OpenBioLLM-8B-GGUF) - **Language(s) (NLP):** en - **Developed By**: [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) from Saama AI Labs - **License:** Meta-Llama License - **Fine-tuned from models:** [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) - **Resources for more information:** - Paper: Coming soon The model can be fine-tuned for more specialized tasks and datasets as needed. OpenBioLLM-8B represents an important step forward in democratizing advanced language AI for the biomedical community. By leveraging state-of-the-art architectures and training techniques from leading open source efforts like Llama-3, we have created a powerful tool to accelerate innovation and discovery in healthcare and the life sciences. We are excited to share OpenBioLLM-8B with researchers and developers around the world. ### Use with transformers **Important: Please use the exact chat template provided by Llama-3 instruct version. Otherwise there will be a degradation in the performance. The model output can be verbose in rare cases. Please consider setting temperature = 0 to make this happen less.** See the snippet below for usage with Transformers: ```python import transformers import torch model_id = "aaditya/OpenBioLLM-Llama3-8B" pipeline = transformers.pipeline( "text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto", ) messages = [ {"role": "system", "content": "You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. Your name is OpenBioLLM, and you were developed by Saama AI Labs. who's willing to help answer the user's query with explanation. In your explanation, leverage your deep medical expertise such as relevant anatomical structures, physiological processes, diagnostic criteria, treatment guidelines, or other pertinent medical concepts.
Use precise medical terminology while still aiming to make the explanation clear and accessible to a general audience."}, {"role": "user", "content": "How can i split a 3mg or 4mg waefin pill so i can get a 2.5mg pill?"}, ] prompt = pipeline.tokenizer.apply_chat_template( messages, tokenize=False, add_generation_prompt=True ) terminators = [ pipeline.tokenizer.eos_token_id, pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>") ] outputs = pipeline( prompt, max_new_tokens=256, eos_token_id=terminators, do_sample=False, ) print(outputs[0]["generated_text"][len(prompt):]) ``` ## **Training procedure** ### **Training hyperparameters** <details> <summary>Click to see details</summary> - learning_rate: 0.0002 - lr_scheduler: cosine - train_batch_size: 12 - eval_batch_size: 8 - GPU: H100 80GB SXM5 - num_devices: 1 - optimizer: adamw_bnb_8bit - lr_scheduler_warmup_steps: 100 - num_epochs: 4 </details> ### **Peft hyperparameters** <details> <summary>Click to see details</summary> - adapter: qlora - lora_r: 128 - lora_alpha: 256 - lora_dropout: 0.05 - lora_target_linear: true - lora_target_modules: - q_proj - v_proj - k_proj - o_proj - gate_proj - down_proj - up_proj </details> ### **Training results** ### **Framework versions** - Transformers 4.39.3 - Pytorch 2.1.2+cu121 - Datasets 2.18.0 - Tokenizers 0.15.1 - Axolotl - Lm harness for evaluation # Benchmark Results 🔥 OpenBioLLM-8B demonstrates superior performance compared to larger models, such as GPT-3.5 and Meditron-70B, across 9 diverse biomedical datasets, achieving state-of-the-art results with an average score of 72.50%, despite having a significantly smaller parameter count. The model's strong performance in domain-specific tasks, such as Clinical KG, Medical Genetics, and PubMedQA, highlights its ability to effectively capture and apply biomedical knowledge. 🚨 The GPT-4, Med-PaLM-1, and Med-PaLM-2 results are taken from their official papers. Since Med-PaLM doesn't provide zero-shot accuracy, we are using 5-shot accuracy from their paper for comparison. All results presented are in the zero-shot setting, except for Med-PaLM-2 and Med-PaLM-1, which use 5-shot accuracy.
| | Clinical KG | Medical Genetics | Anatomy | Pro Medicine | College Biology | College Medicine | MedQA 4 opts | PubMedQA | MedMCQA | Avg | |--------------------|-------------|------------------|---------|--------------|-----------------|------------------|--------------|----------|---------|-------| | **OpenBioLLM-70B** | **92.93** | **93.197** | **83.904** | 93.75 | 93.827 | **85.749** | 78.162 | 78.97 | **74.014** | **86.05588** | | Med-PaLM-2 (5-shot) | 88.3 | 90 | 77.8 | **95.2** | 94.4 | 80.9 | **79.7** | **79.2** | 71.3 | 84.08 | | **GPT-4** | 86.04 | 91 | 80 | 93.01 | **95.14** | 76.88 | 78.87 | 75.2 | 69.52 | 82.85 | | Med-PaLM-1 (Flan-PaLM, 5-shot) | 80.4 | 75 | 63.7 | 83.8 | 88.9 | 76.3 | 67.6 | 79 | 57.6 | 74.7 | | **OpenBioLLM-8B** | 76.101 | 86.1 | 69.829 | 78.21 | 84.213 | 68.042 | 58.993 | 74.12 | 56.913 | 72.502 | | Gemini-1.0 | 76.7 | 75.8 | 66.7 | 77.7 | 88 | 69.2 | 58 | 70.7 | 54.3 | 70.79 | | GPT-3.5 Turbo 1106 | 74.71 | 74 | 72.79 | 72.79 | 72.91 | 64.73 | 57.71 | 72.66 | 53.79 | 66 | | Meditron-70B | 66.79 | 69 | 53.33 | 71.69 | 76.38 | 63 | 57.1 | 76.6 | 46.85 | 64.52 | | gemma-7b | 69.81 | 70 | 59.26 | 66.18 | 79.86 | 60.12 | 47.21 | 76.2 | 48.96 | 64.18 | | Mistral-7B-v0.1 | 68.68 | 71 | 55.56 | 68.38 | 68.06 | 59.54 | 50.82 | 75.4 | 48.2 | 62.85 | | Apollo-7B | 62.26 | 72 | 61.48 | 69.12 | 70.83 | 55.49 | 55.22 | 39.8 | 53.77 | 60 | | MedAlpaca-7b | 57.36 | 69 | 57.04 | 67.28 | 65.28 | 54.34 | 41.71 | 72.8 | 37.51 | 58.03 | | BioMistral-7B | 59.9 | 64 | 56.5 | 60.4 | 59 | 54.7 | 50.6 | 77.5 | 48.1 | 57.3 | | AlpaCare-llama2-7b | 49.81 | 49 | 45.92 | 33.82 | 50 | 43.35 | 29.77 | 72.2 | 34.42 | 45.36 | | ClinicalGPT | 30.56 | 27 | 30.37 | 19.48 | 25 | 24.27 | 26.08 | 63.8 | 28.18 | 30.52 | <div align="center"> <img width="1600px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_SzdcJSBjZyo8RS1bTEkP.png"> </div> ## Detailed Medical Subjectwise accuracy ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/UXF-V0col0Z0sS6BGPBkE.png) # Use Cases & Examples 🚨 **Below results are from the quantized version of OpenBioLLM-70B** # Summarize Clinical Notes OpenBioLLM-70B can efficiently analyze and summarize complex clinical notes, EHR data, and discharge summaries, extracting key information and generating concise, structured summaries ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/xdwdBgOxNi_TfML0hKlI8.png) # Answer Medical Questions OpenBioLLM-70B can provide answers to a wide range of medical questions. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/zO95GlwOQEZqCKQF69mE6.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/OKBczKw7gWeW5xsuDpc27.png) <details> <summary>Click to see details</summary> ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/eJGHT5khppYvJb8fQ-YW4.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Cnbwrqa_-ORHRuNRC2P6Y.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/J9DhdcvukAc9mnnW9fj2C.png) </details> # Clinical Entity Recognition OpenBioLLM-70B can perform advanced clinical entity recognition by identifying and extracting key medical concepts, such as diseases, symptoms, medications, procedures, and anatomical structures, from unstructured clinical text. 
By leveraging its deep understanding of medical terminology and context, the model can accurately annotate and categorize clinical entities, enabling more efficient information retrieval, data analysis, and knowledge discovery from electronic health records, research articles, and other biomedical text sources. This capability can support various downstream applications, such as clinical decision support, pharmacovigilance, and medical research. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_69BW4k9LVABFwtxixL45.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/DKy5wYCoPhoPPUc1-x8_J.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/7WD9zCCBZT4-4XlfnIQjl.png) # Biomarkers Extraction ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/ZttoM4AiteT7gFYVhjIpN.png) # Classification OpenBioLLM-70B can perform various biomedical classification tasks, such as disease prediction, sentiment analysis, and medical document categorization. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Bf5MW1d75qT-1F_TR_hC0.png) # De-Identification OpenBioLLM-70B can detect and remove personally identifiable information (PII) from medical records, ensuring patient privacy and compliance with data protection regulations like HIPAA. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/hKX4kzm--Tw5bj6K78msy.png) **Advisory Notice!**  While OpenBioLLM-70B & 8B leverage high-quality data sources, their outputs may still contain inaccuracies, biases, or misalignments that could pose risks if relied upon for medical decision-making without further testing and refinement. The models' performance has not yet been rigorously evaluated in randomized controlled trials or real-world healthcare environments. Therefore, we strongly advise against using OpenBioLLM-70B & 8B for any direct patient care, clinical decision support, or other professional medical purposes at this time. Their use should be limited to research, development, and exploratory applications by qualified individuals who understand their limitations. OpenBioLLM-70B & 8B are intended solely as research tools to assist healthcare professionals and should never be considered a replacement for the professional judgment and expertise of a qualified medical doctor. Appropriately adapting and validating OpenBioLLM-70B & 8B for specific medical use cases would require significant additional work, potentially including: - Thorough testing and evaluation in relevant clinical scenarios - Alignment with evidence-based guidelines and best practices - Mitigation of potential biases and failure modes - Integration with human oversight and interpretation - Compliance with regulatory and ethical standards Always consult a qualified healthcare provider for personal medical needs. # Citation If you find OpenBioLLM-70B & 8B useful in your work, please cite the model as follows: ``` @misc{OpenBioLLMs, author = {Ankit Pal and Malaikannan Sankarasubbu}, title = {OpenBioLLMs: Advancing Open-Source Large Language Models for Healthcare and Life Sciences}, year = {2024}, publisher = {Hugging Face}, journal = {Hugging Face repository}, howpublished = {\url{https://huggingface.co/aaditya/OpenBioLLM-Llama3-70B}} } ``` The accompanying paper is currently in progress and will be released soon.
<div align="center"> <h2> 💌 Contact </h2> </div> We look forward to hearing from you and collaborating on this exciting project! **Contributors:** - [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) [aadityaura at gmail dot com] - Saama AI Labs - Note: I am looking for a funded PhD opportunity, especially if it fits my Responsible Generative AI, Multimodal LLMs, Geometric Deep Learning, and Healthcare AI skillset. # References We thank the [Meta Team](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) for their amazing models! Result sources - [1] GPT-4 [Capabilities of GPT-4 on Medical Challenge Problems](https://arxiv.org/abs/2303.13375) - [2] Med-PaLM-1 [Large Language Models Encode Clinical Knowledge](https://arxiv.org/abs/2212.13138) - [3] Med-PaLM-2 [Towards Expert-Level Medical Question Answering with Large Language Models](https://arxiv.org/abs/2305.09617) - [4] Gemini-1.0 [Gemini Goes to Med School](https://arxiv.org/abs/2402.07023)
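For readers who prefer the lower-level API over the `pipeline` snippet shown earlier in the card, here is a hedged minimal sketch of the same chat-template-plus-greedy-decoding flow. It assumes the full-precision `aaditya/OpenBioLLM-Llama3-8B` weights referenced in the card (the exl2-quantized files in this particular repository require an ExLlamaV2-compatible loader instead), and the message contents are placeholders.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Full-precision repository id taken from the card's usage section.
model_id = "aaditya/OpenBioLLM-Llama3-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    {"role": "system", "content": "You are an expert from the healthcare and biomedical domain."},
    {"role": "user", "content": "What is neonatal jaundice?"},
]

# The card stresses using the exact Llama-3 chat template; apply_chat_template does this.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

terminators = [tokenizer.eos_token_id, tokenizer.convert_tokens_to_ids("<|eot_id|>")]

# Greedy decoding (do_sample=False) follows the card's temperature = 0 advice.
outputs = model.generate(input_ids, max_new_tokens=256, do_sample=False, eos_token_id=terminators)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```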
[ "QUESTION_ANSWERING" ]
[ "MEDQA", "PUBMEDQA" ]
BioNLP
Nashhz/SBERT_KFOLD_JobDescriptions_Skills_UserPortfolios
Nashhz
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:16682", "loss:CosineSimilarityLoss", "arxiv:1908.10084", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,735
1,735
308
0
--- library_name: sentence-transformers pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:16682 - loss:CosineSimilarityLoss widget: - source_sentence: Hello, I am Redoan Ahmad I'm a professional Graphic Designer who finds great joy in creating assets that not only meet the expectations of my clients, but exceed them and add to what has become a delightful portfolio of my work. I am an expert in the field, and specialize in many different aspects of design work, including but not limited to + Logos + Flyers + Brochures + Banners + Icons + Business card + Branding As you can see, I take on projects involving a plethora of different visual assets. I use the Adobe Suite Programs to create and perfect everything I make, both for my clients and on my own time, so I'm incredibly adept at sentences: - I'm in search of a designer who can help craft a unique and engaging digital portfolio for my company. The desired style of the portfolio is creative and artistic, so I'm looking for someone who can think outside the box and design a portfolio that truly stands out. Key components of the portfolio will include - Client testimonials These will need to be presented in an appealing way that showcases our strong relationships and positive feedback from our clients. - Project case studies I want to highlight some of our best work. This will require a designer who can help distill complex projects into easy-to-understand and visually appealing presentations. Ideal candidates for this project should be experienced in creating digital portfolios and have a strong design background. They should be able to demonstrate a flexible and creative design approach, with a portfolio that reflects a 'creative and artistic' style. Good communication skills are a must, as we will need to collaborate closely to ensure the final product meets our expectations. - I need a proficient developer who can replicate a Forex trading software for me. The software needs to include - Real-time data feed The software should provide up-to-the-minute information about the forex market. - Automated trading I want the software to have a feature that allows for trading without human intervention, based on pre-set parameters or algorithms. The final product needs to be compatible with Windows. Ideal candidates for this project should have substantial experience in creating or replicating trading software, particularly in the Forex sector. Knowledge of real-time data processing and automated trading systems is crucial. Please ensure your bid reflects your expertise in this field. - I'm seeking a talented graphic designer to assist with a short project. The tasks will include designing a logo, banners, and screenshots, as well as a favicon for our website, app stores, and social media platforms. - source_sentence: Hello I am a skilled graphic designer, my designs are creative and based on modern strategies. The ones I create express the customer's brand language and make multiple connections with the audience. I am interested in engineering and through my work I try to meet customer requirements and expectations.. I am an experienced graphic designer who loves to create modern and unique designs. I specialize in personal calling and branding projects.!! sentences: - I'm seeking a talented graphic designer who can create engaging and visually appealing designs for my marketing materials, specifically for flyers and business cards. 
Ideally, the freelancer should have a keen understanding of design principles and be able to create designs that will capture attention and convey my brand message effectively. Skills and experience needed - Proficient in graphic design software such as Adobe Illustrator, Photoshop, etc. - Creative and innovative thinker - Strong understanding of design principles - Experience in designing marketing materials - Excellent communication skills - I'm looking for a skilled web application developer proficient in NodeJSTypescriptVue 3 to help me build an interactive web application. The main features of this project would include - Utilizing the Vue 3 Framework Prior experience in Vue.js is a must. Understanding of its core concepts and features is essential to deliver a high-quality application. - Payment Gateway Integration The application will require integration with a payment gateway such as Stripe or PayPal. Experience with these platforms is highly desirable. - User Authentication Clerk - Flexible Design The application should be able to accommodate future expansions or modifications, so a flexible design and coding approach is key. The main technologies that application will use are - NodeJSExpressTypescriptPrisma - Vue 3ShadCNTailwind CSS I have a detailed specification which I will share with those selected to be shortlisted. To be considered for this project 1. A brief summary of your experience in the core technologies I want to use for the App. 2. Please provide links for any projects which use Node JSExpressPrisma and Vue 3 If you have any further questions please reach out. - I'm in need of a talented graphic designer to create website graphics for my project. This includes designing banner images, icons, and infographics. Ideal Skills - Proficiency in graphic design software Adobe Illustrator, Photoshop, etc. - Strong portfolio of website graphics - Experience with designing for social media and ad campaigns Please note, the banner images will be used on the homepage, social media, and ad campaigns. A deep understanding of how to create engaging and impactful designs for these platforms is crucial. - source_sentence: PHP Codeigniter Laravel Google Ads API - PHPPython Google AppsAds Script Bing Ads API Twitter API TikTok API FB API Google APIs GitHub login to view URL LinkedIn Profile login to view URL sentences: - I need a structural engineer to provide detailed engineering plans for a residential building. Specific Requirements - Foundation plans - Framing plans - Roof structure details Additionally, I need - Copies of the structural engineering details, including piers and footings. - A reference site classification report with a copy of the report provided. Ideal candidates should have - Extensive experience in structural engineering for residential buildings. - Ability to interpret and work from existing architectural plans. - Strong communication skills to provide necessary documentation clearly. - I'm looking for a talented web developer with a strong background in Shopify to create a robust e-commerce website for selling electronics and gadgets. Key Requirements - Expertise in Shopify You should have a deep understanding of the platform to build an effective, secure and user-friendly online store. - E-commerce Development Experience in creating e-commerce websites is essential. You will need to implement features that facilitate seamless shopping experiences. 
- Understanding of Electronics A knowledge of the electronics industry will be a plus, as it will help in designing the website Please note, this project does not include the add-on features such as product reviews, discount codes or customer account creation, but these may be discussed further down the line. - I'm looking for a professional with experience in WebSocket and Laravel to integrate Twilio and login to view URL into my Laravel Blade website. The primary function of Twilio will be enabling voice calls on the website. Key Tasks - Implement Twilio for voice call functionality on the website. - Integrate login to view URL's Natural Language Processing NLP capabilities into the site. Ideal Candidate - Proficient in Laravel and Blade. - Extensive experience with Twilio and Vapi.ai. - Strong knowledge of WebSocket. - Ability to implement NLP features effectively. - source_sentence: I have 6-year experience as a Web Designer and WordPress Designer. 100+ completed projects. My Top Skills - HTML, CSS, Bootstrap 3 4 5 - Admin Dashboard - Email Template within 2 to 3 hours - Web Design - HTML5, CSS3 Canvas, SVG - PSD, FIGMA, ZEPLIN, XD, image, pdf to HTML, CSS Conversion - PSD, FIGMA, ZEPLIN, XD, image, pdf to Bootstrap Conversion - Animation, Slider - Fix Tailwind CSS - Photoshop intermediate - Adobe XD Mobile App any changes intermediate sentences: - I'm seeking a talented web developer with a keen eye for 3D design to revamp our current website. The job involves a complete overhaul of the website's layout, user interface, and 3D images. Key Requirements - Proficiency in 3D design You should be adept at enhancing textures, improving lighting, and updating models for a more engaging and visually striking website. - WordPress Expertise The new design should be compatible with WordPress, so prior experience with this platform is a must. Responsibilities - Redesign the website layout and user interface to improve overall user experience. - Update all existing 3D images, enhancing them with improved textures and lighting. - Ensure the website is fully functional on the WordPress platform. Ideal Candidate - A creative thinker with a strong background in both web development and 3D design. - Prior experience with WordPress and a portfolio that showcases your skills in revamping websites. - Excellent communication skills to ensure smooth collaboration and understanding of my vision for the project. I'd love to hear from you if you're confident in your ability to take on this project. Please include relevant samples of your past work in your application. Experience with Fancy Product Designer for customisations must be on time samples of what I want login to view URL login to view URL login to view URL - I'm looking for a skilled web developer experienced in web scraping to create a web scraper for me. Key Requirements - The scraper should be able to extract product prices from Amazon. Ideal Skills and Experience - Proficiency in Python and libraries like BeautifulSoup and Scrapy. - Previous experience scraping data from Amazon is a plus. - Strong understanding of web scraping ethics and legal considerations. Please include in your proposal examples of similar projects you've completed. - I'm looking for an expert mobile app developer who can create a comprehensive e-commerce app for both iOS and Android platforms. 
Key Features - User-friendly interface - Secure payment gateway - Real-time inventory updates - Customer review and rating system - Push notifications for sales and offers Ideal Skills - Proficiency in cross-platform mobile app development - Experience in e-commerce app development - Knowledge of UIUX design principles - Understanding of secure payment integration - Familiarity with inventory management systems Your expertise will help me reach my goal of launching a top-tier e-commerce app. Please provide your portfolio showcasing similar projects you've completed in the past. - source_sentence: I have 15+ years experiences with web development, machine learning engineering and product development. I also have 5+ years experiences with team management for developing new product and maintaining old products. sentences: - I'm starting a web development company and need a senior WordPress developer who is proficient in PHP, JavaScript, HTML, and CSS. This role will require working closely with my designer to customize websites. Key Responsibilities - Custom theme development - Communicating with the Designer - Optimising websites for performance - Ongoing website maintenance The ideal candidate should - Have expert-level experience with custom theme development - Be eager to learn and adapt - Have a solid track record with WordPress - Know the pain points of WordPress and how to solve them - Benefit Experience with SEO Collaboration - We will be using TrelloWhatsappTeams for project management and collaboration tasks. Your ability to work as part of a team and communicate effectively will be crucial for our success. A passion for web development and a desire to be part of a growing company will make this a rewarding opportunity. - Job Title Freelance Graphic Designer Monthly Deliverables Minimum 30 Creative Designs Budget 10,000 Month Job Description We are seeking a Freelance Graphic Designer to create high-quality and creative visuals for our projects monthly. The ideal candidate will have experience designing a wide range of materials, including images for digital platforms, brochures, banners, PDFs, and other print-ready files. This remote freelance role is expected to deliver 30 designs per month. If you're passionate about visual design and can consistently meet deadlines with high-quality work, we'd love to hear from you! Key Responsibilities Create 30+ designs per month, including - Social media graphics - Flyers, brochures, and pamphlets - PDF print files - Flex banners and large-scale designs Design for multiple formats Digital websocial media and print brochures, banners, etc.. - Collaborate with stakeholders to ensure designs align with the brand and project goals. - Make revisions and adjustments based on feedback. - Prepare print-ready files with accurate specifications. --- Required Skills - Proficiency in Adobe Creative Suite Photoshop, Illustrator, InDesign or equivalent tools. - Strong understanding of layout, typography, and color theory, - Experience in designing for both digital and print mediums. - Knowledge of print specifications and formats CMYK, DPI, bleed, etc.. - Ability to work independently and deliver within deadlines. --- Preferred Qualifications - Prior experience as a freelance designer or working in an agency setting. - Experience with branding projects - Strong portfolio showcasing past work. --- Compensation - 10,000 per month for a minimum of 30 imagesdesigns - Additional designs or complex projects may be compensated separately based on agreement. 
--- How to Apply Interested candidates should submit their portfolios and CVs this platform Please include samples of - Social media posts or marketing graphics - Print designs like brochures or banners - Any other relevant design work --- Additional Information - This is a remote freelance opportunity. - Payments will be made monthly upon submission and approval of deliverables. - Long-term collaboration opportunities available based on performance. - Seeking a talented content writer to create engaging and SEO-friendly articles across diverse markets. The candidate should possess strong expertise in producing content that not only resonates with readers but also performs well in search engine rankings. Please submit samples of your past work where you have successfully balanced keyword integration with compelling content. --- # SentenceTransformer This is a [sentence-transformers](https://www.SBERT.net) model trained on 16,682 pairs of freelancer portfolio texts and job descriptions (see Training Details below). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer <!-- - **Base model:** [Unknown](https://huggingface.co/unknown) --> - **Maximum Sequence Length:** 256 tokens - **Output Dimensionality:** 384 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("Nashhz/SBERT_KFOLD_JobDescriptions_Skills_UserPortfolios") # Run inference sentences = [ 'I have 15+ years experiences with web development, machine learning engineering and product development. I also have 5+ years experiences with team management for developing new product and maintaining old products.', "I'm starting a web development company and need a senior WordPress developer who is proficient in PHP, JavaScript, HTML, and CSS. This role will require working closely with my designer to customize websites.
Key Responsibilities - Custom theme development - Communicating with the Designer - Optimising websites for performance - Ongoing website maintenance The ideal candidate should - Have expert-level experience with custom theme development - Be eager to learn and adapt - Have a solid track record with WordPress - Know the pain points of WordPress and how to solve them - Benefit Experience with SEO Collaboration - We will be using TrelloWhatsappTeams for project management and collaboration tasks. Your ability to work as part of a team and communicate effectively will be crucial for our success. A passion for web development and a desire to be part of a growing company will make this a rewarding opportunity.", "Job Title Freelance Graphic Designer Monthly Deliverables Minimum 30 Creative Designs Budget 10,000 Month Job Description We are seeking a Freelance Graphic Designer to create high-quality and creative visuals for our projects monthly. The ideal candidate will have experience designing a wide range of materials, including images for digital platforms, brochures, banners, PDFs, and other print-ready files. This remote freelance role is expected to deliver 30 designs per month. If you're passionate about visual design and can consistently meet deadlines with high-quality work, we'd love to hear from you! Key Responsibilities Create 30+ designs per month, including - Social media graphics - Flyers, brochures, and pamphlets - PDF print files - Flex banners and large-scale designs Design for multiple formats Digital websocial media and print brochures, banners, etc.. - Collaborate with stakeholders to ensure designs align with the brand and project goals. - Make revisions and adjustments based on feedback. - Prepare print-ready files with accurate specifications. --- Required Skills - Proficiency in Adobe Creative Suite Photoshop, Illustrator, InDesign or equivalent tools. - Strong understanding of layout, typography, and color theory, - Experience in designing for both digital and print mediums. - Knowledge of print specifications and formats CMYK, DPI, bleed, etc.. - Ability to work independently and deliver within deadlines. --- Preferred Qualifications - Prior experience as a freelance designer or working in an agency setting. - Experience with branding projects - Strong portfolio showcasing past work. --- Compensation - 10,000 per month for a minimum of 30 imagesdesigns - Additional designs or complex projects may be compensated separately based on agreement. --- How to Apply Interested candidates should submit their portfolios and CVs this platform Please include samples of - Social media posts or marketing graphics - Print designs like brochures or banners - Any other relevant design work --- Additional Information - This is a remote freelance opportunity. - Payments will be made monthly upon submission and approval of deliverables. - Long-term collaboration opportunities available based on performance.", ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 16,682 training samples * Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence_0 | sentence_1 | label | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------| | type | string | string | float | | details | <ul><li>min: 4 tokens</li><li>mean: 160.64 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 163.14 tokens</li><li>max: 256 tokens</li></ul> | <ul><li>min: 0.27</li><li>mean: 0.72</li><li>max: 1.0</li></ul> | * Samples: | sentence_0 | sentence_1 | label | |:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| | <code>Amazon eBay Tiktok Shop Amazon Services Amazon Seller Central Management A to Z Store Management A to Z Inventory Management Winning Product Sourcing Product Listing with SEO Listing With Variations Listing Optimization Title, Bullet Points & Description Optimization Images Optimization Product Launching FBA Shipment Creation more Amazon eBay Tiktok Shop Amazon Services Amazon Seller Central Management A to Z Store Management A to Z Inventory Management Winning Product Sourcing Product Listing with SEO Listing With Variations Listing Optimization Title, Bullet Points & Description Optimization Images Optimization Product Launching FBA Shipment Creation Sales Generation Dropshipping Store Design A+ Content Creation Amazon PPC Campaigns Brand Registry Trademark Registration Customer Services Management eBay Services eBay Store Management A to Z A to Z eBay Dropshipping Services Winning Products Sourcing Products listing with SEO Products listing With Variations Listings Optimization Title , Bullet Point & Description Optimization Images Optimization Keywords Optimization Sales Boost Products Ranking Hot selling product with 30 to 50 profit Competitor Analysis Orders Fulfillment Customer Services Management eBay Account Defect Removal Tax Exemption Management Setting Up Promotions Listing Templates Creation Tiktok Shop Services TikTok Shop Account Setup Product Listing Listing Optimization Keyword Research Product Hunting Competitor Analysis Campaign Management Influencer Collaboration TikTok Live Shopping Order Management Promotion Management TikTok Ads for Shop Content Creation for Shop Sales Analytics & Reporting Problem Solving & Issue Resolution Ongoing Shop Optimization</code> | <code>I'm seeking a skilled professional to assist with a variety of tasks including selling products from Amazon UAE to eBay UK via dropshipping, product sourcing, and full virtual assistance. Key Responsibilities - Product Searching & Listing Identify profitable products, create and optimize listings, and conduct market trend analysis. - SEO Management Oversee the search engine optimization for our listed products. - Selling & Listing Management List products on Amazon, eBay, and our website, while managing sales. Ideal Candidate - Previous dropshipping experience, particularly between Amazon and eBay, is a plus. - Strong skills in SEO, product sourcing, and virtual assistance. 
- Excellent understanding of market trends and product profitability. - Able to create and optimize product listings for maximum visibility and sales. This is a full-time position which requires dedication and a proactive approach. Please only apply if you have the necessary skills and experience.</code> | <code>0.7151671051979065</code> | | <code>We are a group of young, energetic, creative & professional website developer, graphic designer and IT-Administrator who are devoted to implement your requirement with modern technology. Website Design - Development-Modification - Wordpress - Ecommerce - DynamicCustomized site Development Graphic Design - logo design - Brochure - Flyer - Leaflet - PDF Profile - Catalog - Greetings Card - PackageLabel Design - Business Card - Image RetouchEnhancementEditingManipulation IT-Admin Virtual Assistant - Product Listing - Site Content Management - Product Image Enhance - Data Processing - PDF conversion to WordExcel - Web Research - Data Scraping Why Choose Us o Quality Support for everyday 365 days even after project completion o We understand your requirements precisely to deliver Creative designs o 100 client satisfaction guaranteed</code> | <code>We are looking for a skilled and dedicated full-time web developer to join our team. The ideal candidate should have extensive experience working with WordPress, Divi, and Elementor, as well as the ability to create custom WordPress themes. Key Responsibilities Develop, maintain, and optimize WordPress websites. Customize and configure Divi and Elementor page builders to meet client needs. Create custom WordPress themes from scratch, ensuring they are optimized for performance and usability. Troubleshoot and resolve any website issues as they arise. Ensure websites are responsive and work seamlessly across all devices. Collaborate with our design and content teams to bring creative ideas to life. Stay up to date with the latest web development trends and best practices. Requirements Proven experience with WordPress, including custom theme development. Proficiency in Divi and Elementor page builders. Strong understanding of HTML, CSS, JavaScript, and PHP. Experience in responsive design and cross-browser compatibility. Ability to work independently and meet deadlines. Strong problem-solving skills and attention to detail. Excellent communication skills in English. Preferred Qualifications Experience with WooCommerce or other WordPress plugins. Familiarity with SEO best practices. Knowledge of version control systems like Git. If you are passionate about web development and want to be part of a growing team, we'd love to hear from you! Please submit your portfolio and CV for consideration.</code> | <code>0.7487468719482422</code> | | <code>Hi there, I'm Priyanshu Agarwal I'm a Python expert with a diverse skillset that includes web scraping, Zoho and Tally Prime accounting, automation, and Python application building. With my strong foundation in Python, I can build and automate applications that meet your business needs, saving you time and resources. As a web scraping expert, I specialize in using Python, Selenium, BeautifulSoup4, and Python Requests to extract data from websites and web applications. I have experience in projects of varying scales, from small-scale data collection to large-scale data mining for enterprise-level clients. In addition to my technical expertise in web scraping, I have a strong background in accounting software such as Zoho and Tally Prime. 
I have experience in managing financial data, generating reports, and automating financial processes using these tools. I understand the importance of accurate and timely financial data in business decision-making, and I strive to ensure that my clients' financial data is organized, up-to-date, and easily accessible. With my experience in automation and Python application building, I can create custom solutions to</code> | <code>I'm in need of a data scraping expert to assist in gathering market research data from various retail websites. The ideal freelancer for this project should have a robust experience with Python and Java, as well as proficiency in Odoo and Airtable. Experience in building microservices would be a significant advantage. Key Responsibilities - Scraping data from designated retail websites for market research purposes - Organizing and managing the gathered data in Airtable - Potential development of microservices for data handling, 8n8 Skills and Experience Required - Extensive experience in data scraping, particularly from retail websites - Proficiency in Python and Java - Experience with Odoo and Airtable - Prior experience in building microservices - Understanding of market research techniques and requirements</code> | <code>0.747043251991272</code> | * Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters: ```json { "loss_fct": "torch.nn.modules.loss.MSELoss" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `num_train_epochs`: 4 - `multi_dataset_batch_sampler`: round_robin #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: no - `prediction_loss_only`: True - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.0 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: 
  {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>

### Training Logs

| Epoch | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.4794 | 500 | 0.0003 |
| 0.9588 | 1000 | 0.0003 |
| 1.4382 | 1500 | 0.0003 |
| 1.9175 | 2000 | 0.0003 |
| 2.3969 | 2500 | 0.0002 |
| 2.8763 | 3000 | 0.0002 |
| 3.3557 | 3500 | 0.0002 |
| 3.8351 | 4000 | 0.0002 |
| 0.4794 | 500 | 0.0003 |
| 0.9588 | 1000 | 0.0003 |
| 1.4382 | 1500 | 0.0003 |
| 1.9175 | 2000 | 0.0003 |
| 2.3969 | 2500 | 0.0002 |
| 2.8763 | 3000 | 0.0002 |
| 3.3557 | 3500 | 0.0002 |
| 3.8351 | 4000 | 0.0001 |
| 0.4794 | 500 | 0.0002 |
| 0.9588 | 1000 | 0.0002 |
| 1.4382 | 1500 | 0.0002 |
| 1.9175 | 2000 | 0.0002 |
| 2.3969 | 2500 | 0.0002 |
| 2.8763 | 3000 | 0.0002 |
| 3.3557 | 3500 | 0.0001 |
| 3.8351 | 4000 | 0.0001 |
| 0.4794 | 500 | 0.0002 |
| 0.9588 | 1000 | 0.0002 |
| 1.4382 | 1500 | 0.0002 |
| 1.9175 | 2000 | 0.0002 |
| 2.3969 | 2500 | 0.0002 |
| 2.8763 | 3000 | 0.0001 |
| 3.3557 | 3500 | 0.0001 |
| 3.8351 | 4000 | 0.0001 |
| 0.4794 | 500 | 0.0002 |
| 0.9588 | 1000 | 0.0002 |
| 1.4382 | 1500 | 0.0002 |
| 1.9175 | 2000 | 0.0002 |
| 2.3969 | 2500 | 0.0001 |
| 2.8763 | 3000 | 0.0001 |
| 3.3557 | 3500 | 0.0001 |
| 3.8351 | 4000 | 0.0001 |

### Framework Versions

- Python: 3.12.6
- Sentence Transformers: 3.2.0
- Transformers: 4.45.2
- PyTorch: 2.4.1+cpu
- Accelerate: 1.0.1
- Datasets: 3.0.1
- Tokenizers: 0.20.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

<!-- ## Glossary

*Clearly define terms in order to be accessible across audiences.* -->

<!-- ## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* -->

<!-- ## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
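For reference, a run like the one documented above can be reproduced with the classic Sentence Transformers training loop. This is a minimal sketch under stated assumptions: the base checkpoint name and the example pair are placeholders, not values taken from this card; only the loss, batch size, and epoch count mirror the configuration listed above.

```python
# Minimal sketch of the training setup documented above.
# "base-model-name" and the example pair are placeholders.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

model = SentenceTransformer("base-model-name")  # placeholder base checkpoint

# Each sample pairs two texts with a float similarity label,
# matching the sentence_0 / sentence_1 / label columns described above.
train_examples = [
    InputExample(texts=["freelancer profile text", "job description text"], label=0.72),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)

# CosineSimilarityLoss regresses cosine(sentence_0, sentence_1) onto the label
# with an MSE objective, as listed in the loss parameters above.
train_loss = losses.CosineSimilarityLoss(model)

model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=4)
```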
[ "TEXT_CLASSIFICATION" ]
[ "CRAFT" ]
Non_BioNLP
corto-ai/bge-reranker-large-onnx
corto-ai
feature-extraction
[ "transformers", "onnx", "xlm-roberta", "text-classification", "mteb", "feature-extraction", "en", "zh", "arxiv:2401.03462", "arxiv:2312.15503", "arxiv:2311.13534", "arxiv:2310.07554", "arxiv:2309.07597", "license:mit", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,717
1,717
11
0
---
language:
- en
- zh
license: mit
pipeline_tag: feature-extraction
tags:
- mteb
model-index:
- name: bge-reranker-base
  results:
  - task:
      type: Reranking
    dataset:
      name: MTEB CMedQAv1
      type: C-MTEB/CMedQAv1-reranking
      config: default
      split: test
      revision: None
    metrics:
    - type: map
      value: 81.27206722525007
    - type: mrr
      value: 84.14238095238095
  - task:
      type: Reranking
    dataset:
      name: MTEB CMedQAv2
      type: C-MTEB/CMedQAv2-reranking
      config: default
      split: test
      revision: None
    metrics:
    - type: map
      value: 84.10369934291236
    - type: mrr
      value: 86.79376984126984
  - task:
      type: Reranking
    dataset:
      name: MTEB MMarcoReranking
      type: C-MTEB/Mmarco-reranking
      config: default
      split: dev
      revision: None
    metrics:
    - type: map
      value: 35.4600511272538
    - type: mrr
      value: 34.60238095238095
  - task:
      type: Reranking
    dataset:
      name: MTEB T2Reranking
      type: C-MTEB/T2Reranking
      config: default
      split: dev
      revision: None
    metrics:
    - type: map
      value: 67.27728847727172
    - type: mrr
      value: 77.1315192743764
---

<br><br>

# bge-reranker-large-onnx

This repo was forked from the **BAAI/bge-reranker-large** model and contains only the ONNX version of the model. Below is the original model card from the source repo.

---

**We have updated the [new reranker](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_reranker), supporting larger lengths, more languages, and achieving better performance.**

<h1 align="center">FlagEmbedding</h1>

<h4 align="center">
<p>
<a href=#model-list>Model List</a> |
<a href=#frequently-asked-questions>FAQ</a> |
<a href=#usage>Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
</p>
</h4>

**For more details, please refer to our GitHub: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).**

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

FlagEmbedding focuses on retrieval-augmented LLMs, currently consisting of the following projects:

- **Long-Context LLM**: [Activation Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon)
- **Fine-tuning of LM**: [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail)
- **Embedding Model**: [Visualized-BGE](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/visual), [BGE-M3](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3), [LLM Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), [BGE Embedding](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/baai_general_embedding)
- **Reranker Model**: [llm rerankers](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_reranker), [BGE Reranker](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
- **Benchmark**: [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB)

## News

- 3/18/2024: Release new [rerankers](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_reranker), built upon powerful M3 and LLM (GEMMA and MiniCPM, not so large actually) backbones, supporting multi-lingual processing and larger inputs, with massive improvements in ranking performance on BEIR, C-MTEB/Retrieval, MIRACL, and LlamaIndex Evaluation.
- 3/18/2024: Release [Visualized-BGE](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/visual), equipping BGE with visual capabilities. Visualized-BGE can be utilized to generate embeddings for hybrid image-text data.
- 1/30/2024: Release **BGE-M3**, a new member of the BGE model series! M3 stands for **M**ulti-linguality (100+ languages), **M**ulti-granularities (input length up to 8192), and **M**ulti-Functionality (unification of dense, lexical, and multi-vec/colbert retrieval). It is the first embedding model that supports all three retrieval methods, achieving new SOTA on multi-lingual (MIRACL) and cross-lingual (MKQA) benchmarks. [Technical Report](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/BGE_M3/BGE_M3.pdf) and [Code](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3). :fire:
- 1/9/2024: Release [Activation-Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon), an effective, efficient, compatible, and low-cost (training) method to extend the context length of LLMs. [Technical Report](https://arxiv.org/abs/2401.03462) :fire:
- 12/24/2023: Release **LLaRA**, a LLaMA-7B based dense retriever, leading to state-of-the-art performance on MS MARCO and BEIR. Model and code will be open-sourced. Please stay tuned. [Technical Report](https://arxiv.org/abs/2312.15503)
- 11/23/2023: Release [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail), a method to maintain general capabilities during fine-tuning by merging multiple language models. [Technical Report](https://arxiv.org/abs/2311.13534) :fire:
- 10/12/2023: Release [LLM-Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Technical Report](https://arxiv.org/pdf/2310.07554.pdf)
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released.
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released.
- 09/12/2023: New models:
  - **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
  - **Updated embedding model**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance their retrieval ability without instruction.

<details>
<summary>More</summary>
<!-- ### More -->

- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding instructions during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release the `bge-large-*` (short for BAAI General Embedding) models, **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.

</details>

## Model List

`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|
| [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) | Multilingual | [Inference](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3#usage) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3) | Multi-Functionality (dense retrieval, sparse retrieval, multi-vector (colbert)), Multi-Linguality, and Multi-Granularity (8192 tokens) | |
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |

[1\]: If you need to search for relevant passages for a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages.

[2\]: Unlike an embedding model, a reranker uses a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by other simple models. For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results.

All models have been uploaded to Huggingface Hub, and you can see them at https://huggingface.co/BAAI. If you cannot open the Huggingface Hub, you can also download the models at https://model.baai.ac.cn/models .

## Frequently asked questions

<details>
<summary>1. How to fine-tune the bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->

Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model. Some suggestions:

- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.
Refer to this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) for fine-tuning the reranker.

</details>

<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->

**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**

Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar.

For downstream tasks, such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not the absolute value.** If you need to filter similar sentences based on a similarity threshold, please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->

For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used. Using no instruction causes only a slight degradation in retrieval performance compared with using instructions. So you can generate embeddings without instructions in all cases for convenience.

For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions to these short queries. **The best method to decide whether to add instructions for queries is choosing the setting that achieves better performance on your task.** In all cases, no instruction needs to be added to the documents/passages.

</details>

## Usage

### Usage for Embedding Model

Here are some examples of using `bge` models with [FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding

```
pip install -U FlagEmbedding
```

If it doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more ways to install FlagEmbedding.

```python
from FlagEmbedding import FlagModel

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True)  # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# For s2p (short query to long passage) retrieval tasks, we suggest using encode_queries(),
# which automatically adds the instruction to each query.
# The corpus in a retrieval task can still use encode() or encode_corpus(), since they don't need the instruction.
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```

For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).

By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.

#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):

```
pip install -U sentence-transformers
```

```python
from sentence_transformers import SentenceTransformer

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```

For s2p (short query to long passage) retrieval tasks, each short query should start with an instruction (see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for instructions). But the instruction is not needed for passages.

```python
from sentence_transformers import SentenceTransformer

queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction + q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain

You can use `bge` in langchain like this:

```python
from langchain.embeddings import HuggingFaceBgeEmbeddings

model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True}  # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model, then select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# For s2p (short query to long passage) retrieval tasks, add an instruction to each query (no instruction for passages):
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# Normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. You can get a relevance score by feeding a query and a passage to the reranker. The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
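Because the raw output is an unbounded logit, a sigmoid can be applied when a score in [0, 1] is more convenient; the ranking order is unchanged. A minimal sketch (the sigmoid step is an illustrative addition, not part of the scoring examples below):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'The giant panda is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    logits = model(**inputs, return_dict=True).logits.view(-1).float()
    scores = torch.sigmoid(logits)  # monotonic map of the unbounded logits into [0, 1]
print(scores)
```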
#### Using FlagEmbedding

```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):

```python
from FlagEmbedding import FlagReranker

reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)  # Setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using Huggingface transformers

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
    print(scores)
```

#### Using the reranker with the ONNX files

```python
import torch
from optimum.onnxruntime import ORTModelForSequenceClassification  # type: ignore
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-base')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-base')
model_ort = ORTModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-base', file_name="onnx/model.onnx")

# Query-passage pairs to score
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]

# Tokenize the pairs
encoded_input = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt')

# Compute relevance scores with the ONNX model
scores_ort = model_ort(**encoded_input, return_dict=True).logits.view(-1, ).float()

# Compute relevance scores with the PyTorch model for comparison
with torch.inference_mode():
    scores = model(**encoded_input, return_dict=True).logits.view(-1, ).float()

# scores and scores_ort should be identical
```

#### Using the reranker with infinity

It's also possible to deploy the ONNX/torch files with the [infinity_emb](https://github.com/michaelfeil/infinity) pip package.

```python
import asyncio
from infinity_emb import AsyncEmbeddingEngine, EngineArgs

query = 'what is a panda?'
docs = ['The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear', "Paris is in France."]

engine = AsyncEmbeddingEngine.from_args(
    EngineArgs(model_name_or_path="BAAI/bge-reranker-base", device="cpu", engine="torch"  # or engine="optimum" for onnx
))

async def main():
    async with engine:
        ranking, usage = await engine.rerank(query=query, docs=docs)
        print(list(zip(ranking, docs)))
asyncio.run(main())
```

## Evaluation

`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!** For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:

| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |

- **C-MTEB**: We create the benchmark C-MTEB for Chinese text embedding, which consists of 31 datasets from 6 tasks. Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |

- **Reranking**: See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |

\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks.

## Train

### BAAI Embedding

We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

A cross-encoder performs full attention over the input pair, which is more accurate than an embedding model (i.e., bi-encoder) but more time-consuming. Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).

## Citation

If you find this repository useful, please consider giving it a star :star: and a citation:

```
@misc{bge_embedding,
      title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
      author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
      year={2023},
      eprint={2309.07597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
[ "SEMANTIC_SIMILARITY", "SUMMARIZATION" ]
[ "BEAR" ]
Non_BioNLP
croissantllm/CroissantLLMBase-GGUF
croissantllm
text-generation
[ "gguf", "legal", "code", "text-generation-inference", "art", "text-generation", "fr", "en", "dataset:cerebras/SlimPajama-627B", "dataset:uonlp/CulturaX", "dataset:pg19", "dataset:bigcode/starcoderdata", "dataset:croissantllm/croissant_dataset", "arxiv:2402.00786", "license:mit", "endpoints_compatible", "region:us" ]
1,707
1,714
64
4
--- datasets: - cerebras/SlimPajama-627B - uonlp/CulturaX - pg19 - bigcode/starcoderdata - croissantllm/croissant_dataset language: - fr - en license: mit pipeline_tag: text-generation tags: - legal - code - text-generation-inference - art --- # CroissantLLM - Base GGUF (190k steps, Final version) This model is part of the CroissantLLM initiative, and corresponds to the checkpoint after 190k steps (2.99 T) tokens. To play with the final model, we recommend using the Chat version: https://huggingface.co/croissantllm/CroissantLLMChat-v0.1. https://arxiv.org/abs/2402.00786 ## Abstract We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware. To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources. To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French Language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases, and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models, and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives. This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models. ## Citation Our work can be cited as: ```bash @misc{faysse2024croissantllm, title={CroissantLLM: A Truly Bilingual French-English Language Model}, author={Manuel Faysse and Patrick Fernandes and Nuno M. Guerreiro and António Loison and Duarte M. Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro H. Martins and Antoni Bigata Casademunt and François Yvon and André F. T. Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo}, year={2024}, eprint={2402.00786}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ## Usage This model is a base model, that is, it is not finetuned for Chat function and works best with few-shot prompting strategies. ```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer model_name = "croissantllm/CroissantLLMBase" tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto") inputs = tokenizer("I am so tired I could sleep right now. -> Je suis si fatigué que je pourrais m'endormir maintenant.\nHe is heading to the market. -> Il va au marché.\nWe are running on the beach. 
->", return_tensors="pt").to(model.device) tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60, temperature=0.3) print(tokenizer.decode(tokens[0])) # remove bos token inputs = tokenizer("Capitales: France -> Paris, Italie -> Rome, Allemagne -> Berlin, Espagne ->", return_tensors="pt", add_special_tokens=True).to(model.device) tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60) print(tokenizer.decode(tokens[0])) ```
[ "TRANSLATION" ]
[ "CRAFT" ]
Non_BioNLP
Mihaiii/Squirtle
Mihaiii
sentence-similarity
[ "sentence-transformers", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "bge", "mteb", "dataset:Mihaiii/qa-assistant", "license:mit", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,714
1,714
35
1
--- datasets: - Mihaiii/qa-assistant library_name: sentence-transformers license: mit pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - bge - mteb model-index: - name: Squirtle results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 69.59701492537313 - type: ap value: 31.80839087521638 - type: f1 value: 63.43204352573031 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 82.09027499999999 - type: ap value: 76.95004336850603 - type: f1 value: 82.04505556179174 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 41.943999999999996 - type: f1 value: 40.40964457303876 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 13.869000000000002 - type: map_at_10 value: 24.631 - type: map_at_100 value: 25.965 - type: map_at_1000 value: 26.023000000000003 - type: map_at_20 value: 25.442999999999998 - type: map_at_3 value: 20.827 - type: map_at_5 value: 22.776 - type: mrr_at_1 value: 14.580000000000002 - type: mrr_at_10 value: 24.91 - type: mrr_at_100 value: 26.229999999999997 - type: mrr_at_1000 value: 26.288 - type: mrr_at_20 value: 25.708 - type: mrr_at_3 value: 21.136 - type: mrr_at_5 value: 23.02 - type: ndcg_at_1 value: 13.869000000000002 - type: ndcg_at_10 value: 31.14 - type: ndcg_at_100 value: 37.885999999999996 - type: ndcg_at_1000 value: 39.497 - type: ndcg_at_20 value: 34.068 - type: ndcg_at_3 value: 23.163 - type: ndcg_at_5 value: 26.677 - type: precision_at_1 value: 13.869000000000002 - type: precision_at_10 value: 5.220000000000001 - type: precision_at_100 value: 0.844 - type: precision_at_1000 value: 0.097 - type: precision_at_20 value: 3.186 - type: precision_at_3 value: 9.981 - type: precision_at_5 value: 7.696 - type: recall_at_1 value: 13.869000000000002 - type: recall_at_10 value: 52.205 - type: recall_at_100 value: 84.42399999999999 - type: recall_at_1000 value: 97.297 - type: recall_at_20 value: 63.727000000000004 - type: recall_at_3 value: 29.942999999999998 - type: recall_at_5 value: 38.478 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 33.042527574996505 - type: v_measures value: - 0.2896613951792161 - 0.2974905938215674 - 0.28195491579456905 - 0.3008325954323272 - 0.3012695848509836 - 0.28933380000430453 - 0.297420818100457 - 0.2792041800887245 - 0.3049968405105834 - 0.30704380358904726 - 0.39238640618067383 - 0.3932595512850983 - 0.3875472939281748 - 0.39822946285500505 - 0.39839156092566014 - 0.40184636328122075 - 0.39008499175162326 - 0.3984035967802891 - 0.39159106298575347 - 0.3923217036338575 - 0.3916410911561569 - 0.2357749280106326 - 0.23682806457721106 - 0.3122239617657793 - 0.26610676013174756 - 0.18123482803921434 - 0.2504695156635453 - 0.10917464735757001 - 0.16714512698028008 - 1.0 - 0.19931410358764295 - 
  - task:
      type: Clustering
    dataset:
      name: MTEB ArxivClusteringS2S
      type: mteb/arxiv-clustering-s2s
      config: default
      split: test
      revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
    metrics:
    - type: v_measure
      value: 24.68133686033884
    - type: v_measures
      value:
      - 0.2005976632299017
      - 0.208968006943616
      - 0.20946008190179435
      - 0.20539809799180958
      - 0.21463587994609631
      - 0.20913407901977635
      - 0.20908020832330956
      - 0.1944493063711425
      - 0.20181175619582953
      - 0.2249901827151246
      - 0.29132293951181787
      - 0.29570222215271086
      - 0.2796075942678196
      - 0.28871411057617774
      - 0.29302758518431116
      - 0.29227253592096986
      - 0.2856462545898644
      - 0.28687743467743254
      - 0.2900793948371436
      - 0.28627385826697854
      - 0.27308659940457203
      - 0.14117319401377473
      - 0.1761477350541332
      - 0.24048342650129406
      - 0.19387054212465876
      - 0.14470023981605995
      - 0.16704070762984086
      - 0.07547453139959907
      - 0.127993495025131
      - 1.0
      - 0.14319476311235024
  - task:
      type: Reranking
    dataset:
      name: MTEB AskUbuntuDupQuestions
      type: mteb/askubuntudupquestions-reranking
      config: default
      split: test
      revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
    metrics:
      - {type: map, value: 52.344372012529384}
      - {type: mrr, value: 65.32614430813877}
  - task:
      type: STS
    dataset:
      name: MTEB BIOSSES
      type: mteb/biosses-sts
      config: default
      split: test
      revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
    metrics:
      - {type: cos_sim_pearson, value: 69.44065444549933}
      - {type: cos_sim_spearman, value: 71.77814153774398}
      - {type: euclidean_pearson, value: 70.59416783558756}
      - {type: euclidean_spearman, value: 71.77814153774398}
      - {type: manhattan_pearson, value: 70.99287197201959}
      - {type: manhattan_spearman, value: 72.0769435268729}
  - task:
      type: Classification
    dataset:
      name: MTEB Banking77Classification
      type: mteb/banking77
      config: default
      split: test
      revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
    metrics:
      - {type: accuracy, value: 67.12987012987013}
      - {type: f1, value: 65.99991975715585}
  - task:
      type: Clustering
    dataset:
      name: MTEB BiorxivClusteringP2P
      type: mteb/biorxiv-clustering-p2p
      config: default
      split: test
      revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
    metrics:
      - {type: v_measure, value: 30.861774505346606}
      - type: v_measures
        value: [0.3057878417529878, 0.3086229109676654, 0.3080657568280612, 0.3002878816865892, 0.30903247986282023, 0.3022960257813801, 0.31981283125167154, 0.3119766955566159, 0.3039859162306553, 0.31630911061621453]
  - task:
      type: Clustering
    dataset:
      name: MTEB BiorxivClusteringS2S
      type: mteb/biorxiv-clustering-s2s
      config: default
      split: test
      revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
    metrics:
      - {type: v_measure, value: 21.100665285420916}
      - type: v_measures
        value: [0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236]
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackAndroidRetrieval
      type: mteb/cqadupstack-android
      config: default
      split: test
      revision: f46a197baaae43b4f621051089b82a364682dfeb
    metrics:
      - {type: map_at_1, value: 17.835}
      - {type: map_at_10, value: 24.718999999999998}
      - {type: map_at_100, value: 25.755}
      - {type: map_at_1000, value: 25.887}
      - {type: map_at_20, value: 25.217}
      - {type: map_at_3, value: 23.076}
      - {type: map_at_5, value: 23.96}
      - {type: mrr_at_1, value: 23.033}
      - {type: mrr_at_10, value: 29.868}
      - {type: mrr_at_100, value: 30.757}
      - {type: mrr_at_1000, value: 30.834}
      - {type: mrr_at_20, value: 30.37}
      - {type: mrr_at_3, value: 28.112}
      - {type: mrr_at_5, value: 29.185}
      - {type: ndcg_at_1, value: 23.033}
      - {type: ndcg_at_10, value: 28.899}
      - {type: ndcg_at_100, value: 33.788000000000004}
      - {type: ndcg_at_1000, value: 36.962}
      - {type: ndcg_at_20, value: 30.497000000000003}
      - {type: ndcg_at_3, value: 26.442}
      - {type: ndcg_at_5, value: 27.466}
      - {type: precision_at_1, value: 23.033}
      - {type: precision_at_10, value: 5.351}
      - {type: precision_at_100, value: 0.9610000000000001}
      - {type: precision_at_1000, value: 0.151}
      - {type: precision_at_20, value: 3.2259999999999995}
      - {type: precision_at_3, value: 12.923000000000002}
      - {type: precision_at_5, value: 8.956}
      - {type: recall_at_1, value: 17.835}
      - {type: recall_at_10, value: 36.034}
      - {type: recall_at_100, value: 57.615}
      - {type: recall_at_1000, value: 79.72}
      - {type: recall_at_20, value: 41.894999999999996}
      - {type: recall_at_3, value: 28.313}
      - {type: recall_at_5, value: 31.639}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackEnglishRetrieval
      type: mteb/cqadupstack-english
      config: default
      split: test
      revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
    metrics:
      - {type: map_at_1, value: 12.166}
      - {type: map_at_10, value: 16.320999999999998}
      - {type: map_at_100, value: 16.954}
      - {type: map_at_1000, value: 17.054}
      - {type: map_at_20, value: 16.651}
      - {type: map_at_3, value: 14.890999999999998}
      - {type: map_at_5, value: 15.695999999999998}
      - {type: mrr_at_1, value: 15.287}
      - {type: mrr_at_10, value: 19.487}
      - {type: mrr_at_100, value: 20.11}
      - {type: mrr_at_1000, value: 20.185}
      - {type: mrr_at_20, value: 19.830000000000002}
      - {type: mrr_at_3, value: 18.068}
      - {type: mrr_at_5, value: 18.855}
      - {type: ndcg_at_1, value: 15.287}
      - {type: ndcg_at_10, value: 19.198999999999998}
      - {type: ndcg_at_100, value: 22.395}
      - {type: ndcg_at_1000, value: 25.106}
      - {type: ndcg_at_20, value: 20.297}
      - {type: ndcg_at_3, value: 16.743}
      - {type: ndcg_at_5, value: 17.855999999999998}
      - {type: precision_at_1, value: 15.287}
      - {type: precision_at_10, value: 3.605}
      - {type: precision_at_100, value: 0.638}
      - {type: precision_at_1000, value: 0.108}
      - {type: precision_at_20, value: 2.166}
      - {type: precision_at_3, value: 8.089}
      - {type: precision_at_5, value: 5.822}
      - {type: recall_at_1, value: 12.166}
      - {type: recall_at_10, value: 24.701999999999998}
      - {type: recall_at_100, value: 39.199}
      - {type: recall_at_1000, value: 58.205}
      - {type: recall_at_20, value: 28.791}
      - {type: recall_at_3, value: 17.469}
      - {type: recall_at_5, value: 20.615}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackGamingRetrieval
      type: mteb/cqadupstack-gaming
      config: default
      split: test
      revision: 4885aa143210c98657558c04aaf3dc47cfb54340
    metrics:
      - {type: map_at_1, value: 19.667}
      - {type: map_at_10, value: 27.163999999999998}
      - {type: map_at_100, value: 28.044000000000004}
      - {type: map_at_1000, value: 28.142}
      - {type: map_at_20, value: 27.645999999999997}
      - {type: map_at_3, value: 24.914}
      - {type: map_at_5, value: 26.078000000000003}
      - {type: mrr_at_1, value: 23.197000000000003}
      - {type: mrr_at_10, value: 30.202}
      - {type: mrr_at_100, value: 30.976}
      - {type: mrr_at_1000, value: 31.047000000000004}
      - {type: mrr_at_20, value: 30.636000000000003}
      - {type: mrr_at_3, value: 28.004}
      - {type: mrr_at_5, value: 29.164}
      - {type: ndcg_at_1, value: 23.197000000000003}
      - {type: ndcg_at_10, value: 31.618000000000002}
      - {type: ndcg_at_100, value: 35.977}
      - {type: ndcg_at_1000, value: 38.458}
      - {type: ndcg_at_20, value: 33.242}
      - {type: ndcg_at_3, value: 27.285999999999998}
      - {type: ndcg_at_5, value: 29.163}
      - {type: precision_at_1, value: 23.197000000000003}
      - {type: precision_at_10, value: 5.26}
      - {type: precision_at_100, value: 0.8200000000000001}
      - {type: precision_at_1000, value: 0.11199999999999999}
      - {type: precision_at_20, value: 3.082}
      - {type: precision_at_3, value: 12.247}
      - {type: precision_at_5, value: 8.577}
      - {type: recall_at_1, value: 19.667}
      - {type: recall_at_10, value: 42.443}
      - {type: recall_at_100, value: 62.254}
      - {type: recall_at_1000, value: 80.44}
      - {type: recall_at_20, value: 48.447}
      - {type: recall_at_3, value: 30.518}
      - {type: recall_at_5, value: 35.22}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackGisRetrieval
      type: mteb/cqadupstack-gis
      config: default
      split: test
      revision: 5003b3064772da1887988e05400cf3806fe491f2
    metrics:
      - {type: map_at_1, value: 10.923}
      - {type: map_at_10, value: 14.24}
      - {type: map_at_100, value: 15.001000000000001}
      - {type: map_at_1000, value: 15.092}
      - {type: map_at_20, value: 14.623}
      - {type: map_at_3, value: 13.168}
      - {type: map_at_5, value: 13.678}
      - {type: mrr_at_1, value: 11.525}
      - {type: mrr_at_10, value: 15.187000000000001}
      - {type: mrr_at_100, value: 15.939999999999998}
      - {type: mrr_at_1000, value: 16.03}
      - {type: mrr_at_20, value: 15.557000000000002}
      - {type: mrr_at_3, value: 13.991999999999999}
      - {type: mrr_at_5, value: 14.557}
      - {type: ndcg_at_1, value: 11.525}
      - {type: ndcg_at_10, value: 16.512999999999998}
      - {type: ndcg_at_100, value: 20.445}
      - {type: ndcg_at_1000, value: 23.398}
      - {type: ndcg_at_20, value: 17.832}
      - {type: ndcg_at_3, value: 14.224}
      - {type: ndcg_at_5, value: 15.136}
      - {type: precision_at_1, value: 11.525}
      - {type: precision_at_10, value: 2.565}
      - {type: precision_at_100, value: 0.484}
      - {type: precision_at_1000, value: 0.076}
      - {type: precision_at_20, value: 1.582}
      - {type: precision_at_3, value: 5.989}
      - {type: precision_at_5, value: 4.1579999999999995}
      - {type: recall_at_1, value: 10.923}
      - {type: recall_at_10, value: 22.695}
      - {type: recall_at_100, value: 40.892}
      - {type: recall_at_1000, value: 64.456}
      - {type: recall_at_20, value: 27.607}
      - {type: recall_at_3, value: 16.348}
      - {type: recall_at_5, value: 18.504}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackMathematicaRetrieval
      type: mteb/cqadupstack-mathematica
      config: default
      split: test
      revision: 90fceea13679c63fe563ded68f3b6f06e50061de
    metrics:
      - {type: map_at_1, value: 5.409}
      - {type: map_at_10, value: 8.584999999999999}
      - {type: map_at_100, value: 9.392}
      - {type: map_at_1000, value: 9.5}
      - {type: map_at_20, value: 8.943}
      - {type: map_at_3, value: 7.3}
      - {type: map_at_5, value: 7.962}
      - {type: mrr_at_1, value: 6.965000000000001}
      - {type: mrr_at_10, value: 10.593}
      - {type: mrr_at_100, value: 11.496}
      - {type: mrr_at_1000, value: 11.578}
      - {type: mrr_at_20, value: 11.021}
      - {type: mrr_at_3, value: 8.976}
      - {type: mrr_at_5, value: 9.797}
      - {type: ndcg_at_1, value: 6.965000000000001}
      - {type: ndcg_at_10, value: 11.056000000000001}
      - {type: ndcg_at_100, value: 15.683}
      - {type: ndcg_at_1000, value: 18.873}
      - {type: ndcg_at_20, value: 12.331}
      - {type: ndcg_at_3, value: 8.334}
      - {type: ndcg_at_5, value: 9.512}
      - {type: precision_at_1, value: 6.965000000000001}
      - {type: precision_at_10, value: 2.177}
      - {type: precision_at_100, value: 0.54}
      - {type: precision_at_1000, value: 0.095}
      - {type: precision_at_20, value: 1.468}
      - {type: precision_at_3, value: 3.9800000000000004}
      - {type: precision_at_5, value: 3.109}
      - {type: recall_at_1, value: 5.409}
      - {type: recall_at_10, value: 16.895}
      - {type: recall_at_100, value: 38.167}
      - {type: recall_at_1000, value: 61.783}
      - {type: recall_at_20, value: 21.248}
      - {type: recall_at_3, value: 9.518}
      - {type: recall_at_5, value: 12.426}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackPhysicsRetrieval
      type: mteb/cqadupstack-physics
      config: default
      split: test
      revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
    metrics:
      - {type: map_at_1, value: 13.688}
      - {type: map_at_10, value: 19.096}
      - {type: map_at_100, value: 20.058}
      - {type: map_at_1000, value: 20.194000000000003}
      - {type: map_at_20, value: 19.595000000000002}
      - {type: map_at_3, value: 17.313000000000002}
      - {type: map_at_5, value: 18.41}
      - {type: mrr_at_1, value: 17.132}
      - {type: mrr_at_10, value: 22.95}
      - {type: mrr_at_100, value: 23.799}
      - {type: mrr_at_1000, value: 23.884}
      - {type: mrr_at_20, value: 23.419999999999998}
      - {type: mrr_at_3, value: 20.95}
      - {type: mrr_at_5, value: 22.21}
      - {type: ndcg_at_1, value: 17.132}
      - {type: ndcg_at_10, value: 22.88}
      - {type: ndcg_at_100, value: 27.572000000000003}
      - {type: ndcg_at_1000, value: 30.824}
      - {type: ndcg_at_20, value: 24.516}
      - {type: ndcg_at_3, value: 19.64}
      - {type: ndcg_at_5, value: 21.4}
      - {type: precision_at_1, value: 17.132}
      - {type: precision_at_10, value: 4.263999999999999}
      - {type: precision_at_100, value: 0.7969999999999999}
      - {type: precision_at_1000, value: 0.125}
      - {type: precision_at_20, value: 2.6519999999999997}
      - {type: precision_at_3, value: 9.336}
      - {type: precision_at_5, value: 6.93}
      - {type: recall_at_1, value: 13.688}
      - {type: recall_at_10, value: 30.537999999999997}
      - {type: recall_at_100, value: 51.017999999999994}
      - {type: recall_at_1000, value: 73.921}
      - {type: recall_at_20, value: 36.174}
      - {type: recall_at_3, value: 21.568}
      - {type: recall_at_5, value: 26.127}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackProgrammersRetrieval
      type: mteb/cqadupstack-programmers
      config: default
      split: test
      revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
    metrics:
      - {type: map_at_1, value: 8.173}
      - {type: map_at_10, value: 11.648}
      - {type: map_at_100, value: 12.434000000000001}
      - {type: map_at_1000, value: 12.540000000000001}
      - {type: map_at_20, value: 12.030000000000001}
      - {type: map_at_3, value: 10.568}
      - {type: map_at_5, value: 11.064}
      - {type: mrr_at_1, value: 10.274}
      - {type: mrr_at_10, value: 14.505}
      - {type: mrr_at_100, value: 15.332}
      - {type: mrr_at_1000, value: 15.409}
      - {type: mrr_at_20, value: 14.899999999999999}
      - {type: mrr_at_3, value: 13.375}
      - {type: mrr_at_5, value: 13.929}
      - {type: ndcg_at_1, value: 10.274}
      - {type: ndcg_at_10, value: 14.283999999999999}
      - {type: ndcg_at_100, value: 18.731}
      - {type: ndcg_at_1000, value: 21.744}
      - {type: ndcg_at_20, value: 15.647}
      - {type: ndcg_at_3, value: 12.278}
      - {type: ndcg_at_5, value: 12.974}
      - {type: precision_at_1, value: 10.274}
      - {type: precision_at_10, value: 2.683}
      - {type: precision_at_100, value: 0.582}
      - {type: precision_at_1000, value: 0.099}
      - {type: precision_at_20, value: 1.7409999999999999}
      - {type: precision_at_3, value: 6.088}
      - {type: precision_at_5, value: 4.201}
      - {type: recall_at_1, value: 8.173}
      - {type: recall_at_10, value: 19.642}
      - {type: recall_at_100, value: 40.213}
      - {type: recall_at_1000, value: 62.083999999999996}
      - {type: recall_at_20, value: 24.537}
      - {type: recall_at_3, value: 13.700999999999999}
      - {type: recall_at_5, value: 15.751000000000001}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackRetrieval
      type: mteb/cqadupstack
      config: default
      split: test
      revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
    metrics:
      - {type: map_at_1, value: 11.252416666666667}
      - {type: map_at_10, value: 15.589583333333334}
      - {type: map_at_100, value: 16.381166666666665}
      - {type: map_at_1000, value: 16.490333333333332}
      - {type: map_at_20, value: 15.99116666666667}
      - {type: map_at_3, value: 14.140916666666667}
      - {type: map_at_5, value: 14.9045}
      - {type: mrr_at_1, value: 13.710416666666664}
      - {type: mrr_at_10, value: 18.34416666666667}
      - {type: mrr_at_100, value: 19.110083333333336}
      - {type: mrr_at_1000, value: 19.192583333333335}
      - {type: mrr_at_20, value: 18.74783333333333}
      - {type: mrr_at_3, value: 16.799416666666666}
      - {type: mrr_at_5, value: 17.62725}
      - {type: ndcg_at_1, value: 13.710416666666664}
      - {type: ndcg_at_10, value: 18.628583333333335}
      - {type: ndcg_at_100, value: 22.733666666666668}
      - {type: ndcg_at_1000, value: 25.728499999999997}
      - {type: ndcg_at_20, value: 19.994500000000002}
      - {type: ndcg_at_3, value: 15.918083333333332}
      - {type: ndcg_at_5, value: 17.086999999999996}
      - {type: precision_at_1, value: 13.710416666666664}
      - {type: precision_at_10, value: 3.3575}
      - {type: precision_at_100, value: 0.6368333333333333}
      - {type: precision_at_1000, value: 0.10508333333333333}
      - {type: precision_at_20, value: 2.074833333333333}
      - {type: precision_at_3, value: 7.440333333333333}
      - {type: precision_at_5, value: 5.341916666666667}
      - {type: recall_at_1, value: 11.252416666666667}
      - {type: recall_at_10, value: 25.200833333333332}
      - {type: recall_at_100, value: 44.075333333333326}
      - {type: recall_at_1000, value: 66.12541666666665}
      - {type: recall_at_20, value: 30.24916666666667}
      - {type: recall_at_3, value: 17.46591666666667}
      - {type: recall_at_5, value: 20.53691666666667}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackStatsRetrieval
      type: mteb/cqadupstack-stats
      config: default
      split: test
      revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
    metrics:
      - {type: map_at_1, value: 8.696}
      - {type: map_at_10, value: 12.339}
      - {type: map_at_100, value: 12.946}
      - {type: map_at_1000, value: 13.04}
      - {type: map_at_20, value: 12.6}
      - {type: map_at_3, value: 11.06}
      - {type: map_at_5, value: 11.530999999999999}
      - {type: mrr_at_1, value: 10.276}
      - {type: mrr_at_10, value: 14.463999999999999}
      - {type: mrr_at_100, value: 15.07}
      - {type: mrr_at_1000, value: 15.152}
      - {type: mrr_at_20, value: 14.737}
      - {type: mrr_at_3, value: 13.037}
      - {type: mrr_at_5, value: 13.627}
      - {type: ndcg_at_1, value: 10.276}
      - {type: ndcg_at_10, value: 15.085}
      - {type: ndcg_at_100, value: 18.538}
      - {type: ndcg_at_1000, value: 21.461}
      - {type: ndcg_at_20, value: 15.976}
      - {type: ndcg_at_3, value: 12.454}
      - {type: ndcg_at_5, value: 13.195}
      - {type: precision_at_1, value: 10.276}
      - {type: precision_at_10, value: 2.669}
      - {type: precision_at_100, value: 0.48900000000000005}
      - {type: precision_at_1000, value: 0.08}
      - {type: precision_at_20, value: 1.572}
      - {type: precision_at_3, value: 5.726}
      - {type: precision_at_5, value: 3.9570000000000003}
      - {type: recall_at_1, value: 8.696}
      - {type: recall_at_10, value: 21.766}
      - {type: recall_at_100, value: 38.269}
      - {type: recall_at_1000, value: 61.106}
      - {type: recall_at_20, value: 24.992}
      - {type: recall_at_3, value: 14.032}
      - {type: recall_at_5, value: 15.967999999999998}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackTexRetrieval
      type: mteb/cqadupstack-tex
      config: default
      split: test
      revision: 46989137a86843e03a6195de44b09deda022eec7
    metrics:
      - {type: map_at_1, value: 6.13}
      - {type: map_at_10, value: 9.067}
      - {type: map_at_100, value: 9.687999999999999}
      - {type: map_at_1000, value: 9.792}
      - {type: map_at_20, value: 9.384}
      - {type: map_at_3, value: 8.006}
      - {type: map_at_5, value: 8.581999999999999}
      - {type: mrr_at_1, value: 7.605}
      - {type: mrr_at_10, value: 11.111}
      - {type: mrr_at_100, value: 11.745999999999999}
      - {type: mrr_at_1000, value: 11.837}
      - {type: mrr_at_20, value: 11.452}
      - {type: mrr_at_3, value: 9.922}
      - {type: mrr_at_5, value: 10.522}
      - {type: ndcg_at_1, value: 7.605}
      - {type: ndcg_at_10, value: 11.302}
      - {type: ndcg_at_100, value: 14.629}
      - {type: ndcg_at_1000, value: 17.739}
      - {type: ndcg_at_20, value: 12.411}
      - {type: ndcg_at_3, value: 9.28}
      - {type: ndcg_at_5, value: 10.161000000000001}
      - {type: precision_at_1, value: 7.605}
      - {type: precision_at_10, value: 2.22}
      - {type: precision_at_100, value: 0.46499999999999997}
      - {type: precision_at_1000, value: 0.087}
      - {type: precision_at_20, value: 1.428}
      - {type: precision_at_3, value: 4.565}
      - {type: precision_at_5, value: 3.3649999999999998}
      - {type: recall_at_1, value: 6.13}
      - {type: recall_at_10, value: 16.009999999999998}
      - {type: recall_at_100, value: 31.467}
      - {type: recall_at_1000, value: 54.722}
      - {type: recall_at_20, value: 20.137}
      - {type: recall_at_3, value: 10.347000000000001}
      - {type: recall_at_5, value: 12.692}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackUnixRetrieval
      type: mteb/cqadupstack-unix
      config: default
      split: test
      revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
    metrics:
      - {type: map_at_1, value: 11.645}
      - {type: map_at_10, value: 15.466}
      - {type: map_at_100, value: 16.147}
      - {type: map_at_1000, value: 16.247}
      - {type: map_at_20, value: 15.806999999999999}
      - {type: map_at_3, value: 14.011000000000001}
      - {type: map_at_5, value: 14.967}
      - {type: mrr_at_1, value: 14.179}
      - {type: mrr_at_10, value: 18.512}
      - {type: mrr_at_100, value: 19.184}
      - {type: mrr_at_1000, value: 19.267}
      - {type: mrr_at_20, value: 18.855}
      - {type: mrr_at_3, value: 16.993}
      - {type: mrr_at_5, value: 17.954}
      - {type: ndcg_at_1, value: 14.179}
      - {type: ndcg_at_10, value: 18.311}
      - {type: ndcg_at_100, value: 21.996}
      - {type: ndcg_at_1000, value: 24.942}
      - {type: ndcg_at_20, value: 19.522000000000002}
      - {type: ndcg_at_3, value: 15.593000000000002}
      - {type: ndcg_at_5, value: 17.116}
      - {type: precision_at_1, value: 14.179}
      - {type: precision_at_10, value: 3.116}
      - {type: precision_at_100, value: 0.5519999999999999}
      - {type: precision_at_1000, value: 0.091}
      - {type: precision_at_20, value: 1.87}
      - {type: precision_at_3, value: 7.090000000000001}
      - {type: precision_at_5, value: 5.224}
      - {type: recall_at_1, value: 11.645}
      - {type: recall_at_10, value: 24.206}
      - {type: recall_at_100, value: 41.29}
      - {type: recall_at_1000, value: 63.205999999999996}
      - {type: recall_at_20, value: 28.659000000000002}
      - {type: recall_at_3, value: 16.771}
      - {type: recall_at_5, value: 20.602}
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackWebmastersRetrieval
      type: mteb/cqadupstack-webmasters
      config: default
      split: test
      revision: 160c094312a0e1facb97e55eeddb698c0abe3571
    metrics:
      - {type: map_at_1, value: 12.435}
      - {type: map_at_10, value: 17.263}
      - {type: map_at_100, value: 18.137}
      - {type: map_at_1000, value: 18.282999999999998}
      - {type: map_at_20, value: 17.724}
      - {type: map_at_3, value: 15.648000000000001}
      - {type: map_at_5, value: 16.542}
      - {type: mrr_at_1, value: 15.809999999999999}
      - {type: mrr_at_10, value: 20.687}
      - type: 
mrr_at_100 value: 21.484 - type: mrr_at_1000 value: 21.567 - type: mrr_at_20 value: 21.124000000000002 - type: mrr_at_3 value: 19.104 - type: mrr_at_5 value: 19.974 - type: ndcg_at_1 value: 15.809999999999999 - type: ndcg_at_10 value: 20.801 - type: ndcg_at_100 value: 25.001 - type: ndcg_at_1000 value: 28.347 - type: ndcg_at_20 value: 22.223000000000003 - type: ndcg_at_3 value: 18.046 - type: ndcg_at_5 value: 19.308 - type: precision_at_1 value: 15.809999999999999 - type: precision_at_10 value: 4.032 - type: precision_at_100 value: 0.832 - type: precision_at_1000 value: 0.16 - type: precision_at_20 value: 2.54 - type: precision_at_3 value: 8.63 - type: precision_at_5 value: 6.4030000000000005 - type: recall_at_1 value: 12.435 - type: recall_at_10 value: 27.495000000000005 - type: recall_at_100 value: 47.522999999999996 - type: recall_at_1000 value: 70.804 - type: recall_at_20 value: 33.334 - type: recall_at_3 value: 19.192 - type: recall_at_5 value: 22.435 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval type: mteb/cqadupstack-wordpress config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 8.262 - type: map_at_10 value: 11.167 - type: map_at_100 value: 12.017999999999999 - type: map_at_1000 value: 12.113 - type: map_at_20 value: 11.674 - type: map_at_3 value: 9.736 - type: map_at_5 value: 10.384 - type: mrr_at_1 value: 9.242 - type: mrr_at_10 value: 12.564 - type: mrr_at_100 value: 13.427 - type: mrr_at_1000 value: 13.520999999999999 - type: mrr_at_20 value: 13.072000000000001 - type: mrr_at_3 value: 11.06 - type: mrr_at_5 value: 11.753 - type: ndcg_at_1 value: 9.242 - type: ndcg_at_10 value: 13.594999999999999 - type: ndcg_at_100 value: 18.049 - type: ndcg_at_1000 value: 20.888 - type: ndcg_at_20 value: 15.440000000000001 - type: ndcg_at_3 value: 10.697 - type: ndcg_at_5 value: 11.757 - type: precision_at_1 value: 9.242 - type: precision_at_10 value: 2.348 - type: precision_at_100 value: 0.482 - type: precision_at_1000 value: 0.077 - type: precision_at_20 value: 1.5709999999999997 - type: precision_at_3 value: 4.621 - type: precision_at_5 value: 3.401 - type: recall_at_1 value: 8.262 - type: recall_at_10 value: 19.983999999999998 - type: recall_at_100 value: 40.997 - type: recall_at_1000 value: 63.058 - type: recall_at_20 value: 27.168999999999997 - type: recall_at_3 value: 11.814 - type: recall_at_5 value: 14.463999999999999 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 4.058 - type: map_at_10 value: 6.734 - type: map_at_100 value: 7.593999999999999 - type: map_at_1000 value: 7.736999999999999 - type: map_at_20 value: 7.102 - type: map_at_3 value: 5.559 - type: map_at_5 value: 6.178999999999999 - type: mrr_at_1 value: 8.404 - type: mrr_at_10 value: 13.514999999999999 - type: mrr_at_100 value: 14.518 - type: mrr_at_1000 value: 14.599 - type: mrr_at_20 value: 14.025000000000002 - type: mrr_at_3 value: 11.584999999999999 - type: mrr_at_5 value: 12.588 - type: ndcg_at_1 value: 8.404 - type: ndcg_at_10 value: 10.02 - type: ndcg_at_100 value: 14.771999999999998 - type: ndcg_at_1000 value: 18.251 - type: ndcg_at_20 value: 11.378 - type: ndcg_at_3 value: 7.675 - type: ndcg_at_5 value: 8.558 - type: precision_at_1 value: 8.404 - type: precision_at_10 value: 3.212 - type: precision_at_100 value: 0.83 - type: precision_at_1000 value: 0.146 - type: precision_at_20 value: 
2.186 - type: precision_at_3 value: 5.624 - type: precision_at_5 value: 4.5600000000000005 - type: recall_at_1 value: 4.058 - type: recall_at_10 value: 12.751999999999999 - type: recall_at_100 value: 30.219 - type: recall_at_1000 value: 50.749 - type: recall_at_20 value: 16.634 - type: recall_at_3 value: 7.234999999999999 - type: recall_at_5 value: 9.418 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 5.516 - type: map_at_10 value: 11.001 - type: map_at_100 value: 14.527999999999999 - type: map_at_1000 value: 15.417 - type: map_at_20 value: 12.446 - type: map_at_3 value: 8.269 - type: map_at_5 value: 9.345 - type: mrr_at_1 value: 43.5 - type: mrr_at_10 value: 54.078 - type: mrr_at_100 value: 54.655 - type: mrr_at_1000 value: 54.679 - type: mrr_at_20 value: 54.461999999999996 - type: mrr_at_3 value: 51.37500000000001 - type: mrr_at_5 value: 53.25 - type: ndcg_at_1 value: 33.125 - type: ndcg_at_10 value: 25.665 - type: ndcg_at_100 value: 28.116000000000003 - type: ndcg_at_1000 value: 34.477000000000004 - type: ndcg_at_20 value: 25.027 - type: ndcg_at_3 value: 28.4 - type: ndcg_at_5 value: 27.094 - type: precision_at_1 value: 43.5 - type: precision_at_10 value: 21.65 - type: precision_at_100 value: 6.351999999999999 - type: precision_at_1000 value: 1.306 - type: precision_at_20 value: 15.662 - type: precision_at_3 value: 32.333 - type: precision_at_5 value: 28.199999999999996 - type: recall_at_1 value: 5.516 - type: recall_at_10 value: 15.457 - type: recall_at_100 value: 32.903 - type: recall_at_1000 value: 53.81700000000001 - type: recall_at_20 value: 20.365 - type: recall_at_3 value: 9.528 - type: recall_at_5 value: 11.619 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 45.79 - type: f1 value: 38.89634882093881 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 18.063000000000002 - type: map_at_10 value: 24.911 - type: map_at_100 value: 25.688 - type: map_at_1000 value: 25.758 - type: map_at_20 value: 25.358999999999998 - type: map_at_3 value: 22.743 - type: map_at_5 value: 23.924 - type: mrr_at_1 value: 19.472 - type: mrr_at_10 value: 26.587 - type: mrr_at_100 value: 27.362 - type: mrr_at_1000 value: 27.428 - type: mrr_at_20 value: 27.040999999999997 - type: mrr_at_3 value: 24.362000000000002 - type: mrr_at_5 value: 25.593 - type: ndcg_at_1 value: 19.472 - type: ndcg_at_10 value: 29.183999999999997 - type: ndcg_at_100 value: 33.207 - type: ndcg_at_1000 value: 35.21 - type: ndcg_at_20 value: 30.791 - type: ndcg_at_3 value: 24.701999999999998 - type: ndcg_at_5 value: 26.823000000000004 - type: precision_at_1 value: 19.472 - type: precision_at_10 value: 4.469 - type: precision_at_100 value: 0.6629999999999999 - type: precision_at_1000 value: 0.08499999999999999 - type: precision_at_20 value: 2.59 - type: precision_at_3 value: 10.401 - type: precision_at_5 value: 7.363 - type: recall_at_1 value: 18.063000000000002 - type: recall_at_10 value: 41.071999999999996 - type: recall_at_100 value: 60.049 - type: recall_at_1000 value: 75.64699999999999 - type: recall_at_20 value: 47.211999999999996 - type: recall_at_3 value: 28.796 - type: recall_at_5 value: 33.894999999999996 - task: type: 
Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 2.45 - type: map_at_10 value: 4.255 - type: map_at_100 value: 4.809 - type: map_at_1000 value: 4.954 - type: map_at_20 value: 4.513 - type: map_at_3 value: 3.4029999999999996 - type: map_at_5 value: 3.782 - type: mrr_at_1 value: 4.938 - type: mrr_at_10 value: 8.231 - type: mrr_at_100 value: 8.902000000000001 - type: mrr_at_1000 value: 9.019 - type: mrr_at_20 value: 8.530999999999999 - type: mrr_at_3 value: 6.944 - type: mrr_at_5 value: 7.623 - type: ndcg_at_1 value: 4.938 - type: ndcg_at_10 value: 6.425 - type: ndcg_at_100 value: 9.661999999999999 - type: ndcg_at_1000 value: 13.911999999999999 - type: ndcg_at_20 value: 7.3 - type: ndcg_at_3 value: 4.907 - type: ndcg_at_5 value: 5.406 - type: precision_at_1 value: 4.938 - type: precision_at_10 value: 2.037 - type: precision_at_100 value: 0.528 - type: precision_at_1000 value: 0.125 - type: precision_at_20 value: 1.366 - type: precision_at_3 value: 3.344 - type: precision_at_5 value: 2.7470000000000003 - type: recall_at_1 value: 2.45 - type: recall_at_10 value: 8.987 - type: recall_at_100 value: 22.302 - type: recall_at_1000 value: 49.903999999999996 - type: recall_at_20 value: 11.712 - type: recall_at_3 value: 4.675 - type: recall_at_5 value: 6.161 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 23.585 - type: map_at_10 value: 31.893 - type: map_at_100 value: 32.851 - type: map_at_1000 value: 32.951 - type: map_at_20 value: 32.415 - type: map_at_3 value: 29.787000000000003 - type: map_at_5 value: 31.012 - type: mrr_at_1 value: 47.171 - type: mrr_at_10 value: 54.333 - type: mrr_at_100 value: 54.949000000000005 - type: mrr_at_1000 value: 54.98800000000001 - type: mrr_at_20 value: 54.702 - type: mrr_at_3 value: 52.632999999999996 - type: mrr_at_5 value: 53.652 - type: ndcg_at_1 value: 47.171 - type: ndcg_at_10 value: 39.884 - type: ndcg_at_100 value: 44.019000000000005 - type: ndcg_at_1000 value: 46.303 - type: ndcg_at_20 value: 41.461999999999996 - type: ndcg_at_3 value: 36.153999999999996 - type: ndcg_at_5 value: 38.072 - type: precision_at_1 value: 47.171 - type: precision_at_10 value: 8.396 - type: precision_at_100 value: 1.169 - type: precision_at_1000 value: 0.147 - type: precision_at_20 value: 4.707 - type: precision_at_3 value: 22.408 - type: precision_at_5 value: 14.966 - type: recall_at_1 value: 23.585 - type: recall_at_10 value: 41.978 - type: recall_at_100 value: 58.447 - type: recall_at_1000 value: 73.7 - type: recall_at_20 value: 47.07 - type: recall_at_3 value: 33.611999999999995 - type: recall_at_5 value: 37.413999999999994 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 74.9528 - type: ap value: 69.50790744137139 - type: f1 value: 74.77689594327182 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 8.186 - type: map_at_10 value: 13.352 - type: map_at_100 value: 14.147000000000002 - type: map_at_1000 value: 14.231 - type: map_at_20 value: 13.753000000000002 - type: map_at_3 value: 11.529 - type: map_at_5 value: 12.497 - type: mrr_at_1 value: 8.424 - type: 
mrr_at_10 value: 13.675999999999998 - type: mrr_at_100 value: 14.475999999999999 - type: mrr_at_1000 value: 14.557 - type: mrr_at_20 value: 14.084 - type: mrr_at_3 value: 11.843 - type: mrr_at_5 value: 12.82 - type: ndcg_at_1 value: 8.424 - type: ndcg_at_10 value: 16.534 - type: ndcg_at_100 value: 20.982 - type: ndcg_at_1000 value: 23.538999999999998 - type: ndcg_at_20 value: 18.012 - type: ndcg_at_3 value: 12.729 - type: ndcg_at_5 value: 14.466999999999999 - type: precision_at_1 value: 8.424 - type: precision_at_10 value: 2.7449999999999997 - type: precision_at_100 value: 0.507 - type: precision_at_1000 value: 0.073 - type: precision_at_20 value: 1.683 - type: precision_at_3 value: 5.478000000000001 - type: precision_at_5 value: 4.16 - type: recall_at_1 value: 8.186 - type: recall_at_10 value: 26.415 - type: recall_at_100 value: 48.282000000000004 - type: recall_at_1000 value: 68.869 - type: recall_at_20 value: 32.207 - type: recall_at_3 value: 15.909 - type: recall_at_5 value: 20.09 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 87.26858185134519 - type: f1 value: 86.73793752046078 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 54.65800273597811 - type: f1 value: 36.16413360524473 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 61.519838601210495 - type: f1 value: 58.35755839392156 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 67.04102219233357 - type: f1 value: 65.55523696441647 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 27.16765056253893 - type: v_measures value: - 0.2535665532592405 - 0.25745435154373697 - 0.2588139996653209 - 0.2563977645588755 - 0.2572790917147801 - 0.28011260965698515 - 0.28489569719921415 - 0.2978121202496781 - 0.2927319740642704 - 0.27770089434179124 -
task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 23.778196508186724 - type: v_measures value: - 0.22243646306633857 - 0.2203410753173429 - 0.2227543188103344 - 0.22414069966133132 - 0.2284479943649894 - 0.2523527902057292 - 0.25535019508635054 - 0.25480623149347 - 0.2575581979609686 - 0.23963168485181752 -
      - 0.2203410753173429
      - 0.2227543188103344
      - 0.22414069966133132
      - 0.2284479943649894
      - 0.2523527902057292
      - 0.25535019508635054
      - 0.25480623149347
      - 0.2575581979609686
      - 0.23963168485181752
      - 0.22243646306633857
  - task:
      type: Reranking
    dataset:
      name: MTEB MindSmallReranking
      type: mteb/mind_small
      config: default
      split: test
      revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
    metrics:
    - type: map
      value: 30.088514713666076
    - type: mrr
      value: 31.010218178449588
  - task:
      type: Retrieval
    dataset:
      name: MTEB NFCorpus
      type: mteb/nfcorpus
      config: default
      split: test
      revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
    metrics:
    - type: map_at_1
      value: 2.228
    - type: map_at_10
      value: 4.338
    - type: map_at_100
      value: 5.427
    - type: map_at_1000
      value: 6.325
    - type: map_at_20
      value: 4.729
    - type: map_at_3
      value: 3.495
    - type: map_at_5
      value: 3.8150000000000004
    - type: mrr_at_1
      value: 22.291
    - type: mrr_at_10
      value: 29.622
    - type: mrr_at_100
      value: 30.547
    - type: mrr_at_1000
      value: 30.618000000000002
    - type: mrr_at_20
      value: 30.070000000000004
    - type: mrr_at_3
      value: 27.141
    - type: mrr_at_5
      value: 28.488000000000003
    - type: ndcg_at_1
      value: 21.362000000000002
    - type: ndcg_at_10
      value: 15.64
    - type: ndcg_at_100
      value: 14.832
    - type: ndcg_at_1000
      value: 23.980999999999998
    - type: ndcg_at_20
      value: 14.408000000000001
    - type: ndcg_at_3
      value: 18.719
    - type: ndcg_at_5
      value: 17.137
    - type: precision_at_1
      value: 21.981
    - type: precision_at_10
      value: 11.548
    - type: precision_at_100
      value: 4.223
    - type: precision_at_1000
      value: 1.6500000000000001
    - type: precision_at_20
      value: 8.39
    - type: precision_at_3
      value: 17.337
    - type: precision_at_5
      value: 14.613000000000001
    - type: recall_at_1
      value: 2.228
    - type: recall_at_10
      value: 6.9190000000000005
    - type: recall_at_100
      value: 16.854
    - type: recall_at_1000
      value: 49.179
    - type: recall_at_20
      value: 9.166
    - type: recall_at_3
      value: 4.263
    - type: recall_at_5
      value: 4.956
  - task:
      type: Retrieval
    dataset:
      name: MTEB NQ
      type: mteb/nq
      config: default
      split: test
      revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
    metrics:
    - type: map_at_1
      value: 9.176
    - type: map_at_10
      value: 15.720999999999998
    - type: map_at_100
      value: 16.847
    - type: map_at_1000
      value: 16.939999999999998
    - type: map_at_20
      value: 16.355
    - type: map_at_3
      value: 13.402
    - type: map_at_5
      value: 14.663
    - type: mrr_at_1
      value: 10.458
    - type: mrr_at_10
      value: 17.413
    - type: mrr_at_100
      value: 18.442
    - type: mrr_at_1000
      value: 18.52
    - type: mrr_at_20
      value: 18.006
    - type: mrr_at_3
      value: 15.043999999999999
    - type: mrr_at_5
      value: 16.367
    - type: ndcg_at_1
      value: 10.458
    - type: ndcg_at_10
      value: 19.994999999999997
    - type: ndcg_at_100
      value: 25.665
    - type: ndcg_at_1000
      value: 28.277
    - type: ndcg_at_20
      value: 22.233
    - type: ndcg_at_3
      value: 15.168999999999999
    - type: ndcg_at_5
      value: 17.453
    - type: precision_at_1
      value: 10.458
    - type: precision_at_10
      value: 3.711
    - type: precision_at_100
      value: 0.697
    - type: precision_at_1000
      value: 0.095
    - type: precision_at_20
      value: 2.3810000000000002
    - type: precision_at_3
      value: 7.204000000000001
    - type: precision_at_5
      value: 5.568
    - type: recall_at_1
      value: 9.176
    - type: recall_at_10
      value: 31.646
    - type: recall_at_100
      value: 57.865
    - type: recall_at_1000
      value: 78.11399999999999
    - type: recall_at_20
      value: 40.117000000000004
    - type: recall_at_3
      value: 18.67
    - type: recall_at_5
      value: 24.063000000000002
  - task:
      type: Retrieval
    dataset:
      name: MTEB QuoraRetrieval
      type: mteb/quora
      config: default
      split: test
      revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
    metrics:
    - type: map_at_1
      value: 62.597
    - type: map_at_10
      value: 75.3
    - type: map_at_100
      value: 76.057
    - type: map_at_1000
      value: 76.089
    - type: map_at_20
      value: 75.762
    - type: map_at_3
      value: 72.41499999999999
    - type: map_at_5
      value: 74.139
    - type: mrr_at_1
      value: 72.11999999999999
    - type: mrr_at_10
      value: 79.44600000000001
    - type: mrr_at_100
      value: 79.691
    - type: mrr_at_1000
      value: 79.696
    - type: mrr_at_20
      value: 79.604
    - type: mrr_at_3
      value: 78.015
    - type: mrr_at_5
      value: 78.90700000000001
    - type: ndcg_at_1
      value: 72.15
    - type: ndcg_at_10
      value: 79.937
    - type: ndcg_at_100
      value: 82.074
    - type: ndcg_at_1000
      value: 82.443
    - type: ndcg_at_20
      value: 80.916
    - type: ndcg_at_3
      value: 76.452
    - type: ndcg_at_5
      value: 78.192
    - type: precision_at_1
      value: 72.15
    - type: precision_at_10
      value: 12.117
    - type: precision_at_100
      value: 1.4500000000000002
    - type: precision_at_1000
      value: 0.154
    - type: precision_at_20
      value: 6.503
    - type: precision_at_3
      value: 33.267
    - type: precision_at_5
      value: 21.944
    - type: recall_at_1
      value: 62.597
    - type: recall_at_10
      value: 88.911
    - type: recall_at_100
      value: 97.112
    - type: recall_at_1000
      value: 99.229
    - type: recall_at_20
      value: 92.231
    - type: recall_at_3
      value: 78.83099999999999
    - type: recall_at_5
      value: 83.757
  - task:
      type: Clustering
    dataset:
      name: MTEB RedditClustering
      type: mteb/reddit-clustering
      config: default
      split: test
      revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
    metrics:
    - type: v_measure
      value: 31.453135224292588
    - type: v_measures
      value:
      - 0.34024081488556046
      - 0.31978719363198366
      - 0.28326863670514296
      - 0.2736227852661663
      - 0.33176589594215805
      - 0.281739297860462
      - 0.3714152055541526
      - 0.2784460528138246
      - 0.28292867038320446
      - 0.3011498262585792
      - 0.2903236549747166
      - 0.36937775233378656
      - 0.30011371483471927
      - 0.33579158840067747
      - 0.3774325279364799
      - 0.2798489399988548
      - 0.30350039884840657
      - 0.39379070544611877
      - 0.29845537391174287
      - 0.280224383799162
      - 0.2683644031255058
      - 0.28462417081553165
      - 0.4207860651822375
      - 0.30599639335371903
      - 0.29028935381025356
  - task:
      type: Clustering
    dataset:
      name: MTEB RedditClusteringP2P
      type: mteb/reddit-clustering-p2p
      config: default
      split: test
      revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
    metrics:
    - type: v_measure
      value: 43.69122416835423
    - type: v_measures
      value:
      - 0.4949442160711536
      - 0.5089714608477952
      - 0.533056646726052
      - 0.28870974397114113
      - 0.4845435888947718
      - 0.4358272686082502
      - 0.15963756448560423
      - 0.4966594103138184
      - 0.4483975331373559
      - 0.5183749837794799
  - task:
      type: Retrieval
    dataset:
      name: MTEB SCIDOCS
      type: mteb/scidocs
      config: default
      split: test
      revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
    metrics:
    - type: map_at_1
      value: 2.558
    - type: map_at_10
      value: 5.4670000000000005
    - type: map_at_100
      value: 6.601999999999999
    - type: map_at_1000
      value: 6.816
    - type: map_at_20
      value: 6.013
    - type: map_at_3
      value: 4.132000000000001
    - type: map_at_5
      value: 4.672
    - type: mrr_at_1
      value: 12.5
    - type: mrr_at_10
      value: 18.454
    - type: mrr_at_100
      value: 19.585
    - type: mrr_at_1000
      value: 19.698999999999998
    - type: mrr_at_20
      value: 19.093
    - type: mrr_at_3
      value: 16.25
    - type: mrr_at_5
      value: 17.349999999999998
    - type: ndcg_at_1
      value: 12.5
    - type: ndcg_at_10
      value: 9.931
    - type: ndcg_at_100
      value: 15.332
    - type: ndcg_at_1000
      value: 20.285
    - type: ndcg_at_20
      value: 11.73
    - type: ndcg_at_3
      value: 9.425
    - type: ndcg_at_5
      value: 7.994
    - type: precision_at_1
      value: 12.5
    - type: precision_at_10
      value: 5.11
    - type: precision_at_100
      value: 1.299
    - type: precision_at_1000
      value: 0.251
    - type: precision_at_20
      value: 3.5999999999999996
    - type: precision_at_3
      value: 8.533
    - type: precision_at_5
      value: 6.7
    - type: recall_at_1
      value: 2.558
    - type: recall_at_10
      value: 10.4
    - type: recall_at_100
      value: 26.35
    - type: recall_at_1000
      value: 50.888
    - type: recall_at_20
      value: 14.610000000000001
    - type: recall_at_3
      value: 5.208
    - type: recall_at_5
      value: 6.808
  - task:
      type: STS
    dataset:
      name: MTEB SICK-R
      type: mteb/sickr-sts
      config: default
      split: test
      revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
    metrics:
    - type: cos_sim_pearson
      value: 80.46080544471825
    - type: cos_sim_spearman
      value: 77.33681018334157
    - type: euclidean_pearson
      value: 78.32030772877526
    - type: euclidean_spearman
      value: 77.3367915580176
    - type: manhattan_pearson
      value: 78.23694581981565
    - type: manhattan_spearman
      value: 77.24572801084182
  - task:
      type: STS
    dataset:
      name: MTEB STS12
      type: mteb/sts12-sts
      config: default
      split: test
      revision: a0d554a64d88156834ff5ae9920b964011b16384
    metrics:
    - type: cos_sim_pearson
      value: 77.33143319366522
    - type: cos_sim_spearman
      value: 70.15243619467687
    - type: euclidean_pearson
      value: 74.35384725257417
    - type: euclidean_spearman
      value: 70.15020588975051
    - type: manhattan_pearson
      value: 74.49763893926959
    - type: manhattan_spearman
      value: 70.35289409088577
  - task:
      type: STS
    dataset:
      name: MTEB STS13
      type: mteb/sts13-sts
      config: default
      split: test
      revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
    metrics:
    - type: cos_sim_pearson
      value: 75.43426290814391
    - type: cos_sim_spearman
      value: 78.41580967540904
    - type: euclidean_pearson
      value: 77.87697798842441
    - type: euclidean_spearman
      value: 78.41580967540904
    - type: manhattan_pearson
      value: 77.7742301162175
    - type: manhattan_spearman
      value: 78.23561925777014
  - task:
      type: STS
    dataset:
      name: MTEB STS14
      type: mteb/sts14-sts
      config: default
      split: test
      revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
    metrics:
    - type: cos_sim_pearson
      value: 75.72059066580607
    - type: cos_sim_spearman
      value: 74.76063270848232
    - type: euclidean_pearson
      value: 75.96422568212527
    - type: euclidean_spearman
      value: 74.76063912580608
    - type: manhattan_pearson
      value: 75.93446446206052
    - type: manhattan_spearman
      value: 74.80351881324513
  - task:
      type: STS
    dataset:
      name: MTEB STS15
      type: mteb/sts15-sts
      config: default
      split: test
      revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
    metrics:
    - type: cos_sim_pearson
      value: 79.50308070637769
    - type: cos_sim_spearman
      value: 82.00177922226122
    - type: euclidean_pearson
      value: 81.88334998600465
    - type: euclidean_spearman
      value: 82.00175996908672
    - type: manhattan_pearson
      value: 82.04162815561806
    - type: manhattan_spearman
      value: 82.16179492395742
  - task:
      type: STS
    dataset:
      name: MTEB STS16
      type: mteb/sts16-sts
      config: default
      split: test
      revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
    metrics:
    - type: cos_sim_pearson
      value: 72.660749090443
    - type: cos_sim_spearman
      value: 78.27062791462116
    - type: euclidean_pearson
      value: 77.22132046879575
    - type: euclidean_spearman
      value: 78.27062749235377
    - type: manhattan_pearson
      value: 77.30349168561915
    - type: manhattan_spearman
      value: 78.38610133247218
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (en-en)
      type: mteb/sts17-crosslingual-sts
      config: en-en
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 84.40073205259823
    - type: cos_sim_spearman
      value: 85.85093351857286
    - type: euclidean_pearson
      value: 86.39555107737667
    - type: euclidean_spearman
      value: 85.85093351857286
    - type: manhattan_pearson
      value: 86.15780582794078
    - type: manhattan_spearman
      value: 85.67768599300385
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (en)
      type: mteb/sts22-crosslingual-sts
      config: en
      split: test
      revision: eea2b4fe26a775864c896887d910b76a8098ad3f
    metrics:
    - type: cos_sim_pearson
      value: 54.06121880120164
    - type: cos_sim_spearman
      value: 61.20018366762684
    - type: euclidean_pearson
      value: 59.08089664894604
    - type: euclidean_spearman
      value: 61.20018366762684
    - type: manhattan_pearson
      value: 58.88169190353213
    - type: manhattan_spearman
      value: 60.82629422553597
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmark
      type: mteb/stsbenchmark-sts
      config: default
      split: test
      revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
    metrics:
    - type: cos_sim_pearson
      value: 76.9607252955321
    - type: cos_sim_spearman
      value: 79.20891358738938
    - type: euclidean_pearson
      value: 79.53044888138301
    - type: euclidean_spearman
      value: 79.20891358738938
    - type: manhattan_pearson
      value: 79.37313113618887
    - type: manhattan_spearman
      value: 79.0667751270519
  - task:
      type: Reranking
    dataset:
      name: MTEB SciDocsRR
      type: mteb/scidocs-reranking
      config: default
      split: test
      revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
    metrics:
    - type: map
      value: 71.0421477784269
    - type: mrr
      value: 89.94940426312975
  - task:
      type: Retrieval
    dataset:
      name: MTEB SciFact
      type: mteb/scifact
      config: default
      split: test
      revision: 0228b52cf27578f30900b9e5271d331663a030d7
    metrics:
    - type: map_at_1
      value: 31.900000000000002
    - type: map_at_10
      value: 38.494
    - type: map_at_100
      value: 39.353
    - type: map_at_1000
      value: 39.427
    - type: map_at_20
      value: 38.952
    - type: map_at_3
      value: 36.238
    - type: map_at_5
      value: 37.36
    - type: mrr_at_1
      value: 34.0
    - type: mrr_at_10
      value: 40.327
    - type: mrr_at_100
      value: 41.052
    - type: mrr_at_1000
      value: 41.120000000000005
    - type: mrr_at_20
      value: 40.737
    - type: mrr_at_3
      value: 38.333
    - type: mrr_at_5
      value: 39.367000000000004
    - type: ndcg_at_1
      value: 34.0
    - type: ndcg_at_10
      value: 42.419000000000004
    - type: ndcg_at_100
      value: 46.589000000000006
    - type: ndcg_at_1000
      value: 48.966
    - type: ndcg_at_20
      value: 43.980000000000004
    - type: ndcg_at_3
      value: 38.124
    - type: ndcg_at_5
      value: 39.952
    - type: precision_at_1
      value: 34.0
    - type: precision_at_10
      value: 5.933
    - type: precision_at_100
      value: 0.8330000000000001
    - type: precision_at_1000
      value: 0.104
    - type: precision_at_20
      value: 3.3329999999999997
    - type: precision_at_3
      value: 15.0
    - type: precision_at_5
      value: 10.067
    - type: recall_at_1
      value: 31.900000000000002
    - type: recall_at_10
      value: 52.800000000000004
    - type: recall_at_100
      value: 72.10600000000001
    - type: recall_at_1000
      value: 91.60000000000001
    - type: recall_at_20
      value: 58.699999999999996
    - type: recall_at_3
      value: 41.317
    - type: recall_at_5
      value: 45.761
  - task:
      type: PairClassification
    dataset:
      name: MTEB SprintDuplicateQuestions
      type: mteb/sprintduplicatequestions-pairclassification
      config: default
      split: test
      revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
    metrics:
    - type: cos_sim_accuracy
      value: 99.62871287128714
    - type: cos_sim_ap
      value: 85.22434241429664
    - type: cos_sim_f1
      value: 79.31605074462217
    - type: cos_sim_precision
      value: 88.43788437884379
    - type: cos_sim_recall
      value: 71.89999999999999
    - type: dot_accuracy
      value: 99.62871287128714
    - type: dot_ap
      value: 85.22434241429666
    - type: dot_f1
      value: 79.31605074462217
    - type: dot_precision
      value: 88.43788437884379
    - type: dot_recall
      value: 71.89999999999999
    - type: euclidean_accuracy
      value: 99.62871287128714
    - type: euclidean_ap
      value: 85.22434237736961
    - type: euclidean_f1
      value: 79.31605074462217
    - type: euclidean_precision
      value: 88.43788437884379
    - type: euclidean_recall
      value: 71.89999999999999
    - type: manhattan_accuracy
      value: 99.62475247524752
    - type: manhattan_ap
      value: 85.53918872229502
    - type: manhattan_f1
      value: 79.38618925831203
    - type: manhattan_precision
      value: 81.2565445026178
    - type: manhattan_recall
      value: 77.60000000000001
    - type: max_accuracy
      value: 99.62871287128714
    - type: max_ap
      value: 85.53918872229502
    - type: max_f1
      value: 79.38618925831203
  - task:
      type: Clustering
    dataset:
      name: MTEB StackExchangeClustering
      type: mteb/stackexchange-clustering
      config: default
      split: test
      revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
    metrics:
    - type: v_measure
      value: 39.16142357597941
    - type: v_measures
      value:
      - 0.3824405761636396
      - 0.44216202123263126
      - 0.3390286805950001
      - 0.40370202650437953
      - 0.3687764786128344
      - 0.3002689364743748
      - 0.3406756129607103
      - 0.4239251906201308
      - 0.41513537797197647
      - 0.39562333880392536
      - 0.44243846336620263
      - 0.4564014124962121
      - 0.46843968839295613
      - 0.3486700249457605
      - 0.3931094737880025
      - 0.38614031871714743
      - 0.39009948062151834
      - 0.3952861715088528
      - 0.3768164106667065
      - 0.39372559829701875
      - 0.41022022885425324
      - 0.3442845107165114
      - 0.36768421400456974
      - 0.40522290066464794
      - 0.40007875701488965
0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 
0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 
0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 
0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 
0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - 0.3824405761636396 - 0.44216202123263126 - 0.3390286805950001 - 0.40370202650437953 - 0.3687764786128344 - 0.3002689364743748 - 0.3406756129607103 - 0.4239251906201308 - 0.41513537797197647 - 0.39562333880392536 - 0.44243846336620263 - 0.4564014124962121 - 0.46843968839295613 - 0.3486700249457605 - 0.3931094737880025 - 0.38614031871714743 - 0.39009948062151834 - 0.3952861715088528 - 0.3768164106667065 - 0.39372559829701875 - 0.41022022885425324 - 0.3442845107165114 - 0.36768421400456974 - 0.40522290066464794 - 0.40007875701488965 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 29.175984546605825 - type: v_measures 
      value:
      - 0.28319515044921223
      - 0.2715264094552343
      - 0.27440620100214314
      - 0.26830955555466396
      - 0.27653185247970546
      - 0.3178752664718975
      - 0.3080336049306678
      - 0.3068022206397505
      - 0.3022010188359171
      - 0.3087171748413907
  - task:
      type: Reranking
    dataset:
      name: MTEB StackOverflowDupQuestions
      type: mteb/stackoverflowdupquestions-reranking
      config: default
      split: test
      revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
    metrics:
    - type: map
      value: 40.56760857818254
    - type: mrr
      value: 40.94357439945675
  - task:
      type: Summarization
    dataset:
      name: MTEB SummEval
      type: mteb/summeval
      config: default
      split: test
      revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
    metrics:
    - type: cos_sim_pearson
      value: 30.764610926778037
    - type: cos_sim_spearman
      value: 30.298920879214158
    - type: dot_pearson
      value: 30.764611831321552
    - type: dot_spearman
      value: 30.298299440561465
  - task:
      type: Retrieval
    dataset:
      name: MTEB TRECCOVID
      type: mteb/trec-covid
      config: default
      split: test
      revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
    metrics:
    - type: map_at_1
      value: 0.109
    - type: map_at_10
      value: 0.781
    - type: map_at_100
      value: 2.995
    - type: map_at_1000
      value: 6.854
    - type: map_at_20
      value: 1.2
    - type: map_at_3
      value: 0.28700000000000003
    - type: map_at_5
      value: 0.434
    - type: mrr_at_1
      value: 42.0
    - type: mrr_at_10
      value: 54.955
    - type: mrr_at_100
      value: 55.655
    - type: mrr_at_1000
      value: 55.689
    - type: mrr_at_20
      value: 55.42399999999999
    - type: mrr_at_3
      value: 51.0
    - type: mrr_at_5
      value: 53.800000000000004
    - type: ndcg_at_1
      value: 39.0
    - type: ndcg_at_10
      value: 39.479
    - type: ndcg_at_100
      value: 25.752000000000002
    - type: ndcg_at_1000
      value: 22.868
    - type: ndcg_at_20
      value: 35.707
    - type: ndcg_at_3
      value: 39.419
    - type: ndcg_at_5
      value: 39.64
    - type: precision_at_1
      value: 42.0
    - type: precision_at_10
      value: 43.6
    - type: precision_at_100
      value: 25.88
    - type: precision_at_1000
      value: 10.784
    - type: precision_at_20
      value: 37.8
    - type: precision_at_3
      value: 43.333
    - type: precision_at_5
      value: 43.6
    - type: recall_at_1
      value: 0.109
    - type: recall_at_10
      value: 1.038
    - type: recall_at_100
      value: 5.495
    - type: recall_at_1000
      value: 21.665
    - type: recall_at_20
      value: 1.722
    - type: recall_at_3
      value: 0.318
    - type: recall_at_5
      value: 0.522
  - task:
      type: Retrieval
    dataset:
      name: MTEB Touche2020
      type: mteb/touche2020
      config: default
      split: test
      revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
    metrics:
    - type: map_at_1
      value: 1.302
    - type: map_at_10
      value: 2.514
    - type: map_at_100
      value: 3.341
    - type: map_at_1000
      value: 3.757
    - type: map_at_20
      value: 2.85
    - type: map_at_3
      value: 1.8450000000000002
    - type: map_at_5
      value: 1.873
    - type: mrr_at_1
      value: 18.367
    - type: mrr_at_10
      value: 24.789
    - type: mrr_at_100
      value: 26.517000000000003
    - type: mrr_at_1000
      value: 26.593
    - type: mrr_at_20
      value: 25.946
    - type: mrr_at_3
      value: 22.448999999999998
    - type: mrr_at_5
      value: 22.959
    - type: ndcg_at_1
      value: 16.326999999999998
    - type: ndcg_at_10
      value: 7.7509999999999994
    - type: ndcg_at_100
      value: 10.67
    - type: ndcg_at_1000
      value: 17.76
    - type: ndcg_at_20
      value: 7.674
    - type: ndcg_at_3
      value: 10.369
    - type: ndcg_at_5
      value: 7.840999999999999
    - type: precision_at_1
      value: 18.367
    - type: precision_at_10
      value: 7.142999999999999
    - type: precision_at_100
      value: 2.327
    - type: precision_at_1000
      value: 0.6779999999999999
    - type: precision_at_20
      value: 5.408
    - type: precision_at_3
      value: 11.565
    - type: precision_at_5
      value: 7.3469999999999995
    - type: recall_at_1
      value: 1.302
    - type: recall_at_10
      value: 4.919
    - type: recall_at_100
      value: 14.430000000000001
    - type: recall_at_1000
      value: 36.949
    - type: recall_at_20
      value: 7.0040000000000004
    - type: recall_at_3
      value: 2.2319999999999998
    - type: recall_at_5
      value: 2.3449999999999998
  - task:
      type: Classification
    dataset:
      name: MTEB ToxicConversationsClassification
      type: mteb/toxic_conversations_50k
      config: default
      split: test
      revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
    metrics:
    - type: accuracy
      value: 64.47265625
    - type: ap
      value: 11.979631561643862
    - type: f1
      value: 49.90647543589666
  - task:
      type: Classification
    dataset:
      name: MTEB TweetSentimentExtractionClassification
      type: mteb/tweet_sentiment_extraction
      config: default
      split: test
      revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
    metrics:
    - type: accuracy
      value: 61.79966044142614
    - type: f1
      value: 61.89030508018869
  - task:
      type: Clustering
    dataset:
      name: MTEB TwentyNewsgroupsClustering
      type: mteb/twentynewsgroups-clustering
      config: default
      split: test
      revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
    metrics:
    - type: v_measure
      value: 28.234217666259703
    - type: v_measures
      value:
      - 0.29450695840941515
      - 0.30590470809304793
      - 0.29205899710992034
      - 0.27123807357354457
      - 0.28092608890535714
      - 0.2787486406145347
      - 0.26689540227394454
      - 0.26139744229328293
      - 0.2785944239497992
      - 0.2931510314031239
0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 
0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - 0.29450695840941515 - 0.30590470809304793 - 0.29205899710992034 - 0.27123807357354457 - 0.28092608890535714 - 0.2787486406145347 - 0.26689540227394454 - 0.26139744229328293 - 0.2785944239497992 - 0.2931510314031239 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 84.0317100792752 - type: cos_sim_ap value: 67.56361271781817 - type: cos_sim_f1 value: 63.082081211970696 - type: cos_sim_precision value: 59.58245367112362 - type: cos_sim_recall value: 67.01846965699208 - type: dot_accuracy value: 84.0317100792752 - type: dot_ap value: 67.56359342938897 - type: dot_f1 value: 63.082081211970696 - type: dot_precision value: 59.58245367112362 - type: dot_recall value: 67.01846965699208 - type: euclidean_accuracy value: 84.0317100792752 - type: euclidean_ap value: 67.5636169518733 - type: euclidean_f1 value: 63.082081211970696 - type: euclidean_precision value: 59.58245367112362 - type: euclidean_recall value: 67.01846965699208 - type: manhattan_accuracy value: 84.0734338677952 - type: manhattan_ap value: 67.44969672020721 - type: manhattan_f1 value: 63.09479205695017 - type: manhattan_precision value: 59.90040313018734 - type: manhattan_recall value: 66.64907651715039 - type: max_accuracy value: 84.0734338677952 - type: max_ap value: 67.5636169518733 - type: max_f1 value: 63.09479205695017 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 87.60624054022587 - type: cos_sim_ap value: 82.94451598409692 - type: cos_sim_f1 value: 74.76484194294527 - type: cos_sim_precision value: 74.86874613959235 - type: cos_sim_recall value: 74.66122574684324 - type: dot_accuracy value: 87.60624054022587 - type: dot_ap value: 82.94451133280317 - type: dot_f1 value: 74.76484194294527 - type: dot_precision value: 74.86874613959235 - type: dot_recall value: 74.66122574684324 - type: euclidean_accuracy value: 87.60624054022587 - type: euclidean_ap value: 82.94449586426977 - type: euclidean_f1 value: 74.76484194294527 - type: euclidean_precision value: 74.86874613959235 - type: 
euclidean_recall value: 74.66122574684324 - type: manhattan_accuracy value: 87.63922847052432 - type: manhattan_ap value: 82.9449637573502 - type: manhattan_f1 value: 74.9452996046217 - type: manhattan_precision value: 74.73015386970833 - type: manhattan_recall value: 75.1616877117339 - type: max_accuracy value: 87.63922847052432 - type: max_ap value: 82.9449637573502 - type: max_f1 value: 74.9452996046217
---

# Squirtle

Squirtle is a distillation of [bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5).

## Intended purpose

<span style="color:blue">This model is designed for use in semantic-autocomplete ([click here for a demo](https://mihaiii.github.io/semantic-autocomplete/)).</span> Make sure you also pass `pipelineParams={{ pooling: "cls", normalize: true }}`, since the component's default pooling is mean.

## Usage

Outside of [semantic-autocomplete](https://github.com/Mihaiii/semantic-autocomplete), you can use this model the same way as [bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5#usage).
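As a concrete illustration, below is a minimal sketch of that bge-style recipe with plain `transformers`: CLS pooling followed by L2 normalization, mirroring the `pooling: "cls", normalize: true` settings above. The repo id `Mihaiii/Squirtle` is an assumption inferred from the author's links.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed repo id for this card; adjust if the model lives elsewhere.
model_name = "Mihaiii/Squirtle"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

sentences = ["semantic autocomplete", "autocomplete suggestions ranked by meaning"]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# CLS pooling: take the hidden state of the first token of each sequence.
embeddings = outputs.last_hidden_state[:, 0]
# L2-normalize so that the dot product equals cosine similarity.
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)

# Cosine similarity between the two sentences.
print((embeddings[0] @ embeddings[1]).item())
```

Normalizing at encode time is what lets the autocomplete component rank candidates with a plain dot product.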
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
BioNLP
Teradata/multilingual-e5-base
Teradata
sentence-similarity
[ "onnx", "mteb", "sentence-similarity", "teradata", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "license:mit", "model-index", "region:us" ]
1,739
1,741
25
0
--- language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - 'no' - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh license: mit tags: - mteb - sentence-similarity - onnx - teradata model-index: - name: multilingual-e5-base results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 78.97014925373135 - type: ap value: 43.69351129103008 - type: f1 value: 73.38075030070492 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (de) type: mteb/amazon_counterfactual config: de split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 71.7237687366167 - type: ap value: 82.22089859962671 - type: f1 value: 69.95532758884401 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en-ext) type: mteb/amazon_counterfactual config: en-ext split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 79.65517241379312 - type: ap value: 28.507918657094738 - type: f1 value: 66.84516013726119 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (ja) type: mteb/amazon_counterfactual config: ja split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 73.32976445396146 - type: ap value: 20.720481637566014 - type: f1 value: 59.78002763416003 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 90.63775 - type: ap value: 87.22277903861716 - type: f1 value: 90.60378636386807 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 44.546 - type: f1 value: 44.05666638370923 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (de) type: mteb/amazon_reviews_multi config: de split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 41.828 - type: f1 value: 41.2710255644252 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (es) type: mteb/amazon_reviews_multi config: es split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 40.534 - type: f1 value: 39.820743174270326 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 39.684 - type: f1 value: 39.11052682815307 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (ja) type: mteb/amazon_reviews_multi config: ja split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 37.436 - type: f1 value: 37.07082931930871 - task: type: Classification dataset: 
name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 37.226000000000006 - type: f1 value: 36.65372077739185 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 22.831000000000003 - type: map_at_10 value: 36.42 - type: map_at_100 value: 37.699 - type: map_at_1000 value: 37.724000000000004 - type: map_at_3 value: 32.207 - type: map_at_5 value: 34.312 - type: mrr_at_1 value: 23.257 - type: mrr_at_10 value: 36.574 - type: mrr_at_100 value: 37.854 - type: mrr_at_1000 value: 37.878 - type: mrr_at_3 value: 32.385000000000005 - type: mrr_at_5 value: 34.48 - type: ndcg_at_1 value: 22.831000000000003 - type: ndcg_at_10 value: 44.230000000000004 - type: ndcg_at_100 value: 49.974000000000004 - type: ndcg_at_1000 value: 50.522999999999996 - type: ndcg_at_3 value: 35.363 - type: ndcg_at_5 value: 39.164 - type: precision_at_1 value: 22.831000000000003 - type: precision_at_10 value: 6.935 - type: precision_at_100 value: 0.9520000000000001 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 14.841 - type: precision_at_5 value: 10.754 - type: recall_at_1 value: 22.831000000000003 - type: recall_at_10 value: 69.346 - type: recall_at_100 value: 95.235 - type: recall_at_1000 value: 99.36 - type: recall_at_3 value: 44.523 - type: recall_at_5 value: 53.769999999999996 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 40.27789869854063 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 35.41979463347428 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 58.22752045109304 - type: mrr value: 71.51112430198303 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 84.71147646622866 - type: cos_sim_spearman value: 85.059167046486 - type: euclidean_pearson value: 75.88421613600647 - type: euclidean_spearman value: 75.12821787150585 - type: manhattan_pearson value: 75.22005646957604 - type: manhattan_spearman value: 74.42880434453272 - task: type: BitextMining dataset: name: MTEB BUCC (de-en) type: mteb/bucc-bitext-mining config: de-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 99.23799582463465 - type: f1 value: 99.12665274878218 - type: precision value: 99.07098121085595 - type: recall value: 99.23799582463465 - task: type: BitextMining dataset: name: MTEB BUCC (fr-en) type: mteb/bucc-bitext-mining config: fr-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 97.88685890380806 - type: f1 value: 97.59336708489249 - type: precision value: 97.44662117543473 - type: recall value: 97.88685890380806 - task: type: BitextMining dataset: name: MTEB BUCC (ru-en) type: mteb/bucc-bitext-mining config: ru-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy 
value: 97.47142362313821 - type: f1 value: 97.1989377670015 - type: precision value: 97.06384944001847 - type: recall value: 97.47142362313821 - task: type: BitextMining dataset: name: MTEB BUCC (zh-en) type: mteb/bucc-bitext-mining config: zh-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 98.4728804634018 - type: f1 value: 98.2973494821836 - type: precision value: 98.2095839915745 - type: recall value: 98.4728804634018 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 82.74025974025975 - type: f1 value: 82.67420447730439 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 35.0380848063507 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 29.45956405670166 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 32.122 - type: map_at_10 value: 42.03 - type: map_at_100 value: 43.364000000000004 - type: map_at_1000 value: 43.474000000000004 - type: map_at_3 value: 38.804 - type: map_at_5 value: 40.585 - type: mrr_at_1 value: 39.914 - type: mrr_at_10 value: 48.227 - type: mrr_at_100 value: 49.018 - type: mrr_at_1000 value: 49.064 - type: mrr_at_3 value: 45.994 - type: mrr_at_5 value: 47.396 - type: ndcg_at_1 value: 39.914 - type: ndcg_at_10 value: 47.825 - type: ndcg_at_100 value: 52.852 - type: ndcg_at_1000 value: 54.891 - type: ndcg_at_3 value: 43.517 - type: ndcg_at_5 value: 45.493 - type: precision_at_1 value: 39.914 - type: precision_at_10 value: 8.956 - type: precision_at_100 value: 1.388 - type: precision_at_1000 value: 0.182 - type: precision_at_3 value: 20.791999999999998 - type: precision_at_5 value: 14.821000000000002 - type: recall_at_1 value: 32.122 - type: recall_at_10 value: 58.294999999999995 - type: recall_at_100 value: 79.726 - type: recall_at_1000 value: 93.099 - type: recall_at_3 value: 45.017 - type: recall_at_5 value: 51.002 - type: map_at_1 value: 29.677999999999997 - type: map_at_10 value: 38.684000000000005 - type: map_at_100 value: 39.812999999999995 - type: map_at_1000 value: 39.945 - type: map_at_3 value: 35.831 - type: map_at_5 value: 37.446 - type: mrr_at_1 value: 37.771 - type: mrr_at_10 value: 44.936 - type: mrr_at_100 value: 45.583 - type: mrr_at_1000 value: 45.634 - type: mrr_at_3 value: 42.771 - type: mrr_at_5 value: 43.994 - type: ndcg_at_1 value: 37.771 - type: ndcg_at_10 value: 44.059 - type: ndcg_at_100 value: 48.192 - type: ndcg_at_1000 value: 50.375 - type: ndcg_at_3 value: 40.172000000000004 - type: ndcg_at_5 value: 41.899 - type: precision_at_1 value: 37.771 - type: precision_at_10 value: 8.286999999999999 - type: precision_at_100 value: 1.322 - type: precision_at_1000 value: 0.178 - type: precision_at_3 value: 19.406000000000002 - type: precision_at_5 value: 13.745 - type: recall_at_1 value: 29.677999999999997 - type: recall_at_10 value: 53.071 - type: recall_at_100 value: 70.812 - type: recall_at_1000 value: 84.841 - type: recall_at_3 value: 41.016000000000005 - type: recall_at_5 value: 46.22 - type: map_at_1 
value: 42.675000000000004 - type: map_at_10 value: 53.93599999999999 - type: map_at_100 value: 54.806999999999995 - type: map_at_1000 value: 54.867 - type: map_at_3 value: 50.934000000000005 - type: map_at_5 value: 52.583 - type: mrr_at_1 value: 48.339 - type: mrr_at_10 value: 57.265 - type: mrr_at_100 value: 57.873 - type: mrr_at_1000 value: 57.906 - type: mrr_at_3 value: 55.193000000000005 - type: mrr_at_5 value: 56.303000000000004 - type: ndcg_at_1 value: 48.339 - type: ndcg_at_10 value: 59.19799999999999 - type: ndcg_at_100 value: 62.743 - type: ndcg_at_1000 value: 63.99399999999999 - type: ndcg_at_3 value: 54.367 - type: ndcg_at_5 value: 56.548 - type: precision_at_1 value: 48.339 - type: precision_at_10 value: 9.216000000000001 - type: precision_at_100 value: 1.1809999999999998 - type: precision_at_1000 value: 0.134 - type: precision_at_3 value: 23.72 - type: precision_at_5 value: 16.025 - type: recall_at_1 value: 42.675000000000004 - type: recall_at_10 value: 71.437 - type: recall_at_100 value: 86.803 - type: recall_at_1000 value: 95.581 - type: recall_at_3 value: 58.434 - type: recall_at_5 value: 63.754 - type: map_at_1 value: 23.518 - type: map_at_10 value: 30.648999999999997 - type: map_at_100 value: 31.508999999999997 - type: map_at_1000 value: 31.604 - type: map_at_3 value: 28.247 - type: map_at_5 value: 29.65 - type: mrr_at_1 value: 25.650000000000002 - type: mrr_at_10 value: 32.771 - type: mrr_at_100 value: 33.554 - type: mrr_at_1000 value: 33.629999999999995 - type: mrr_at_3 value: 30.433 - type: mrr_at_5 value: 31.812 - type: ndcg_at_1 value: 25.650000000000002 - type: ndcg_at_10 value: 34.929 - type: ndcg_at_100 value: 39.382 - type: ndcg_at_1000 value: 41.913 - type: ndcg_at_3 value: 30.292 - type: ndcg_at_5 value: 32.629999999999995 - type: precision_at_1 value: 25.650000000000002 - type: precision_at_10 value: 5.311 - type: precision_at_100 value: 0.792 - type: precision_at_1000 value: 0.105 - type: precision_at_3 value: 12.58 - type: precision_at_5 value: 8.994 - type: recall_at_1 value: 23.518 - type: recall_at_10 value: 46.19 - type: recall_at_100 value: 67.123 - type: recall_at_1000 value: 86.442 - type: recall_at_3 value: 33.678000000000004 - type: recall_at_5 value: 39.244 - type: map_at_1 value: 15.891 - type: map_at_10 value: 22.464000000000002 - type: map_at_100 value: 23.483 - type: map_at_1000 value: 23.613 - type: map_at_3 value: 20.080000000000002 - type: map_at_5 value: 21.526 - type: mrr_at_1 value: 20.025000000000002 - type: mrr_at_10 value: 26.712999999999997 - type: mrr_at_100 value: 27.650000000000002 - type: mrr_at_1000 value: 27.737000000000002 - type: mrr_at_3 value: 24.274 - type: mrr_at_5 value: 25.711000000000002 - type: ndcg_at_1 value: 20.025000000000002 - type: ndcg_at_10 value: 27.028999999999996 - type: ndcg_at_100 value: 32.064 - type: ndcg_at_1000 value: 35.188 - type: ndcg_at_3 value: 22.512999999999998 - type: ndcg_at_5 value: 24.89 - type: precision_at_1 value: 20.025000000000002 - type: precision_at_10 value: 4.776 - type: precision_at_100 value: 0.8500000000000001 - type: precision_at_1000 value: 0.125 - type: precision_at_3 value: 10.531 - type: precision_at_5 value: 7.811 - type: recall_at_1 value: 15.891 - type: recall_at_10 value: 37.261 - type: recall_at_100 value: 59.12 - type: recall_at_1000 value: 81.356 - type: recall_at_3 value: 24.741 - type: recall_at_5 value: 30.753999999999998 - type: map_at_1 value: 27.544 - type: map_at_10 value: 36.283 - type: map_at_100 value: 37.467 - type: map_at_1000 value: 37.574000000000005 - 
type: map_at_3 value: 33.528999999999996 - type: map_at_5 value: 35.028999999999996 - type: mrr_at_1 value: 34.166999999999994 - type: mrr_at_10 value: 41.866 - type: mrr_at_100 value: 42.666 - type: mrr_at_1000 value: 42.716 - type: mrr_at_3 value: 39.541 - type: mrr_at_5 value: 40.768 - type: ndcg_at_1 value: 34.166999999999994 - type: ndcg_at_10 value: 41.577 - type: ndcg_at_100 value: 46.687 - type: ndcg_at_1000 value: 48.967 - type: ndcg_at_3 value: 37.177 - type: ndcg_at_5 value: 39.097 - type: precision_at_1 value: 34.166999999999994 - type: precision_at_10 value: 7.420999999999999 - type: precision_at_100 value: 1.165 - type: precision_at_1000 value: 0.154 - type: precision_at_3 value: 17.291999999999998 - type: precision_at_5 value: 12.166 - type: recall_at_1 value: 27.544 - type: recall_at_10 value: 51.99399999999999 - type: recall_at_100 value: 73.738 - type: recall_at_1000 value: 89.33 - type: recall_at_3 value: 39.179 - type: recall_at_5 value: 44.385999999999996 - type: map_at_1 value: 26.661 - type: map_at_10 value: 35.475 - type: map_at_100 value: 36.626999999999995 - type: map_at_1000 value: 36.741 - type: map_at_3 value: 32.818000000000005 - type: map_at_5 value: 34.397 - type: mrr_at_1 value: 32.647999999999996 - type: mrr_at_10 value: 40.784 - type: mrr_at_100 value: 41.602 - type: mrr_at_1000 value: 41.661 - type: mrr_at_3 value: 38.68 - type: mrr_at_5 value: 39.838 - type: ndcg_at_1 value: 32.647999999999996 - type: ndcg_at_10 value: 40.697 - type: ndcg_at_100 value: 45.799 - type: ndcg_at_1000 value: 48.235 - type: ndcg_at_3 value: 36.516 - type: ndcg_at_5 value: 38.515 - type: precision_at_1 value: 32.647999999999996 - type: precision_at_10 value: 7.202999999999999 - type: precision_at_100 value: 1.1360000000000001 - type: precision_at_1000 value: 0.151 - type: precision_at_3 value: 17.314 - type: precision_at_5 value: 12.145999999999999 - type: recall_at_1 value: 26.661 - type: recall_at_10 value: 50.995000000000005 - type: recall_at_100 value: 73.065 - type: recall_at_1000 value: 89.781 - type: recall_at_3 value: 39.073 - type: recall_at_5 value: 44.395 - type: map_at_1 value: 25.946583333333333 - type: map_at_10 value: 33.79725 - type: map_at_100 value: 34.86408333333333 - type: map_at_1000 value: 34.9795 - type: map_at_3 value: 31.259999999999998 - type: map_at_5 value: 32.71541666666666 - type: mrr_at_1 value: 30.863749999999996 - type: mrr_at_10 value: 37.99183333333333 - type: mrr_at_100 value: 38.790499999999994 - type: mrr_at_1000 value: 38.85575000000001 - type: mrr_at_3 value: 35.82083333333333 - type: mrr_at_5 value: 37.07533333333333 - type: ndcg_at_1 value: 30.863749999999996 - type: ndcg_at_10 value: 38.52141666666667 - type: ndcg_at_100 value: 43.17966666666667 - type: ndcg_at_1000 value: 45.64608333333333 - type: ndcg_at_3 value: 34.333000000000006 - type: ndcg_at_5 value: 36.34975 - type: precision_at_1 value: 30.863749999999996 - type: precision_at_10 value: 6.598999999999999 - type: precision_at_100 value: 1.0502500000000001 - type: precision_at_1000 value: 0.14400000000000002 - type: precision_at_3 value: 15.557583333333334 - type: precision_at_5 value: 11.020000000000001 - type: recall_at_1 value: 25.946583333333333 - type: recall_at_10 value: 48.36991666666666 - type: recall_at_100 value: 69.02408333333334 - type: recall_at_1000 value: 86.43858333333331 - type: recall_at_3 value: 36.4965 - type: recall_at_5 value: 41.76258333333334 - type: map_at_1 value: 22.431 - type: map_at_10 value: 28.889 - type: map_at_100 value: 29.642000000000003 - 
type: map_at_1000 value: 29.742 - type: map_at_3 value: 26.998 - type: map_at_5 value: 28.172000000000004 - type: mrr_at_1 value: 25.307000000000002 - type: mrr_at_10 value: 31.763 - type: mrr_at_100 value: 32.443 - type: mrr_at_1000 value: 32.531 - type: mrr_at_3 value: 29.959000000000003 - type: mrr_at_5 value: 31.063000000000002 - type: ndcg_at_1 value: 25.307000000000002 - type: ndcg_at_10 value: 32.586999999999996 - type: ndcg_at_100 value: 36.5 - type: ndcg_at_1000 value: 39.133 - type: ndcg_at_3 value: 29.25 - type: ndcg_at_5 value: 31.023 - type: precision_at_1 value: 25.307000000000002 - type: precision_at_10 value: 4.954 - type: precision_at_100 value: 0.747 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 12.577 - type: precision_at_5 value: 8.741999999999999 - type: recall_at_1 value: 22.431 - type: recall_at_10 value: 41.134 - type: recall_at_100 value: 59.28600000000001 - type: recall_at_1000 value: 78.857 - type: recall_at_3 value: 31.926 - type: recall_at_5 value: 36.335 - type: map_at_1 value: 17.586 - type: map_at_10 value: 23.304 - type: map_at_100 value: 24.159 - type: map_at_1000 value: 24.281 - type: map_at_3 value: 21.316 - type: map_at_5 value: 22.383 - type: mrr_at_1 value: 21.645 - type: mrr_at_10 value: 27.365000000000002 - type: mrr_at_100 value: 28.108 - type: mrr_at_1000 value: 28.192 - type: mrr_at_3 value: 25.482 - type: mrr_at_5 value: 26.479999999999997 - type: ndcg_at_1 value: 21.645 - type: ndcg_at_10 value: 27.306 - type: ndcg_at_100 value: 31.496000000000002 - type: ndcg_at_1000 value: 34.53 - type: ndcg_at_3 value: 23.73 - type: ndcg_at_5 value: 25.294 - type: precision_at_1 value: 21.645 - type: precision_at_10 value: 4.797 - type: precision_at_100 value: 0.8059999999999999 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 10.850999999999999 - type: precision_at_5 value: 7.736 - type: recall_at_1 value: 17.586 - type: recall_at_10 value: 35.481 - type: recall_at_100 value: 54.534000000000006 - type: recall_at_1000 value: 76.456 - type: recall_at_3 value: 25.335 - type: recall_at_5 value: 29.473 - type: map_at_1 value: 25.095 - type: map_at_10 value: 32.374 - type: map_at_100 value: 33.537 - type: map_at_1000 value: 33.634 - type: map_at_3 value: 30.089 - type: map_at_5 value: 31.433 - type: mrr_at_1 value: 29.198 - type: mrr_at_10 value: 36.01 - type: mrr_at_100 value: 37.022 - type: mrr_at_1000 value: 37.083 - type: mrr_at_3 value: 33.94 - type: mrr_at_5 value: 35.148 - type: ndcg_at_1 value: 29.198 - type: ndcg_at_10 value: 36.729 - type: ndcg_at_100 value: 42.114000000000004 - type: ndcg_at_1000 value: 44.592 - type: ndcg_at_3 value: 32.644 - type: ndcg_at_5 value: 34.652 - type: precision_at_1 value: 29.198 - type: precision_at_10 value: 5.970000000000001 - type: precision_at_100 value: 0.967 - type: precision_at_1000 value: 0.129 - type: precision_at_3 value: 14.396999999999998 - type: precision_at_5 value: 10.093 - type: recall_at_1 value: 25.095 - type: recall_at_10 value: 46.392 - type: recall_at_100 value: 69.706 - type: recall_at_1000 value: 87.738 - type: recall_at_3 value: 35.303000000000004 - type: recall_at_5 value: 40.441 - type: map_at_1 value: 26.857999999999997 - type: map_at_10 value: 34.066 - type: map_at_100 value: 35.671 - type: map_at_1000 value: 35.881 - type: map_at_3 value: 31.304 - type: map_at_5 value: 32.885 - type: mrr_at_1 value: 32.411 - type: mrr_at_10 value: 38.987 - type: mrr_at_100 value: 39.894 - type: mrr_at_1000 value: 39.959 - type: mrr_at_3 value: 36.626999999999995 - 
type: mrr_at_5 value: 38.011 - type: ndcg_at_1 value: 32.411 - type: ndcg_at_10 value: 39.208 - type: ndcg_at_100 value: 44.626 - type: ndcg_at_1000 value: 47.43 - type: ndcg_at_3 value: 35.091 - type: ndcg_at_5 value: 37.119 - type: precision_at_1 value: 32.411 - type: precision_at_10 value: 7.51 - type: precision_at_100 value: 1.486 - type: precision_at_1000 value: 0.234 - type: precision_at_3 value: 16.14 - type: precision_at_5 value: 11.976 - type: recall_at_1 value: 26.857999999999997 - type: recall_at_10 value: 47.407 - type: recall_at_100 value: 72.236 - type: recall_at_1000 value: 90.77 - type: recall_at_3 value: 35.125 - type: recall_at_5 value: 40.522999999999996 - type: map_at_1 value: 21.3 - type: map_at_10 value: 27.412999999999997 - type: map_at_100 value: 28.29 - type: map_at_1000 value: 28.398 - type: map_at_3 value: 25.169999999999998 - type: map_at_5 value: 26.496 - type: mrr_at_1 value: 23.29 - type: mrr_at_10 value: 29.215000000000003 - type: mrr_at_100 value: 30.073 - type: mrr_at_1000 value: 30.156 - type: mrr_at_3 value: 26.956000000000003 - type: mrr_at_5 value: 28.38 - type: ndcg_at_1 value: 23.29 - type: ndcg_at_10 value: 31.113000000000003 - type: ndcg_at_100 value: 35.701 - type: ndcg_at_1000 value: 38.505 - type: ndcg_at_3 value: 26.727 - type: ndcg_at_5 value: 29.037000000000003 - type: precision_at_1 value: 23.29 - type: precision_at_10 value: 4.787 - type: precision_at_100 value: 0.763 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 11.091 - type: precision_at_5 value: 7.985 - type: recall_at_1 value: 21.3 - type: recall_at_10 value: 40.782000000000004 - type: recall_at_100 value: 62.13999999999999 - type: recall_at_1000 value: 83.012 - type: recall_at_3 value: 29.131 - type: recall_at_5 value: 34.624 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 9.631 - type: map_at_10 value: 16.634999999999998 - type: map_at_100 value: 18.23 - type: map_at_1000 value: 18.419 - type: map_at_3 value: 13.66 - type: map_at_5 value: 15.173 - type: mrr_at_1 value: 21.368000000000002 - type: mrr_at_10 value: 31.56 - type: mrr_at_100 value: 32.58 - type: mrr_at_1000 value: 32.633 - type: mrr_at_3 value: 28.241 - type: mrr_at_5 value: 30.225 - type: ndcg_at_1 value: 21.368000000000002 - type: ndcg_at_10 value: 23.855999999999998 - type: ndcg_at_100 value: 30.686999999999998 - type: ndcg_at_1000 value: 34.327000000000005 - type: ndcg_at_3 value: 18.781 - type: ndcg_at_5 value: 20.73 - type: precision_at_1 value: 21.368000000000002 - type: precision_at_10 value: 7.564 - type: precision_at_100 value: 1.496 - type: precision_at_1000 value: 0.217 - type: precision_at_3 value: 13.876 - type: precision_at_5 value: 11.062 - type: recall_at_1 value: 9.631 - type: recall_at_10 value: 29.517 - type: recall_at_100 value: 53.452 - type: recall_at_1000 value: 74.115 - type: recall_at_3 value: 17.605999999999998 - type: recall_at_5 value: 22.505 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.885 - type: map_at_10 value: 18.798000000000002 - type: map_at_100 value: 26.316 - type: map_at_1000 value: 27.869 - type: map_at_3 value: 13.719000000000001 - type: map_at_5 value: 15.716 - type: mrr_at_1 value: 66 - type: mrr_at_10 value: 74.263 - type: mrr_at_100 value: 74.519 - type: mrr_at_1000 value: 74.531 - type: mrr_at_3 value: 72.458 - type: mrr_at_5 value: 
73.321 - type: ndcg_at_1 value: 53.87499999999999 - type: ndcg_at_10 value: 40.355999999999995 - type: ndcg_at_100 value: 44.366 - type: ndcg_at_1000 value: 51.771 - type: ndcg_at_3 value: 45.195 - type: ndcg_at_5 value: 42.187000000000005 - type: precision_at_1 value: 66 - type: precision_at_10 value: 31.75 - type: precision_at_100 value: 10.11 - type: precision_at_1000 value: 1.9800000000000002 - type: precision_at_3 value: 48.167 - type: precision_at_5 value: 40.050000000000004 - type: recall_at_1 value: 8.885 - type: recall_at_10 value: 24.471999999999998 - type: recall_at_100 value: 49.669000000000004 - type: recall_at_1000 value: 73.383 - type: recall_at_3 value: 14.872 - type: recall_at_5 value: 18.262999999999998 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 45.18 - type: f1 value: 40.26878691789978 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 62.751999999999995 - type: map_at_10 value: 74.131 - type: map_at_100 value: 74.407 - type: map_at_1000 value: 74.423 - type: map_at_3 value: 72.329 - type: map_at_5 value: 73.555 - type: mrr_at_1 value: 67.282 - type: mrr_at_10 value: 78.292 - type: mrr_at_100 value: 78.455 - type: mrr_at_1000 value: 78.458 - type: mrr_at_3 value: 76.755 - type: mrr_at_5 value: 77.839 - type: ndcg_at_1 value: 67.282 - type: ndcg_at_10 value: 79.443 - type: ndcg_at_100 value: 80.529 - type: ndcg_at_1000 value: 80.812 - type: ndcg_at_3 value: 76.281 - type: ndcg_at_5 value: 78.235 - type: precision_at_1 value: 67.282 - type: precision_at_10 value: 10.078 - type: precision_at_100 value: 1.082 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 30.178 - type: precision_at_5 value: 19.232 - type: recall_at_1 value: 62.751999999999995 - type: recall_at_10 value: 91.521 - type: recall_at_100 value: 95.997 - type: recall_at_1000 value: 97.775 - type: recall_at_3 value: 83.131 - type: recall_at_5 value: 87.93299999999999 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 18.861 - type: map_at_10 value: 30.252000000000002 - type: map_at_100 value: 32.082 - type: map_at_1000 value: 32.261 - type: map_at_3 value: 25.909 - type: map_at_5 value: 28.296 - type: mrr_at_1 value: 37.346000000000004 - type: mrr_at_10 value: 45.802 - type: mrr_at_100 value: 46.611999999999995 - type: mrr_at_1000 value: 46.659 - type: mrr_at_3 value: 43.056 - type: mrr_at_5 value: 44.637 - type: ndcg_at_1 value: 37.346000000000004 - type: ndcg_at_10 value: 38.169 - type: ndcg_at_100 value: 44.864 - type: ndcg_at_1000 value: 47.974 - type: ndcg_at_3 value: 33.619 - type: ndcg_at_5 value: 35.317 - type: precision_at_1 value: 37.346000000000004 - type: precision_at_10 value: 10.693999999999999 - type: precision_at_100 value: 1.775 - type: precision_at_1000 value: 0.231 - type: precision_at_3 value: 22.325 - type: precision_at_5 value: 16.852 - type: recall_at_1 value: 18.861 - type: recall_at_10 value: 45.672000000000004 - type: recall_at_100 value: 70.60499999999999 - type: recall_at_1000 value: 89.216 - type: recall_at_3 value: 30.361 - type: recall_at_5 value: 36.998999999999995 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 
37.852999999999994 - type: map_at_10 value: 59.961 - type: map_at_100 value: 60.78 - type: map_at_1000 value: 60.843 - type: map_at_3 value: 56.39999999999999 - type: map_at_5 value: 58.646 - type: mrr_at_1 value: 75.70599999999999 - type: mrr_at_10 value: 82.321 - type: mrr_at_100 value: 82.516 - type: mrr_at_1000 value: 82.525 - type: mrr_at_3 value: 81.317 - type: mrr_at_5 value: 81.922 - type: ndcg_at_1 value: 75.70599999999999 - type: ndcg_at_10 value: 68.557 - type: ndcg_at_100 value: 71.485 - type: ndcg_at_1000 value: 72.71600000000001 - type: ndcg_at_3 value: 63.524 - type: ndcg_at_5 value: 66.338 - type: precision_at_1 value: 75.70599999999999 - type: precision_at_10 value: 14.463000000000001 - type: precision_at_100 value: 1.677 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 40.806 - type: precision_at_5 value: 26.709 - type: recall_at_1 value: 37.852999999999994 - type: recall_at_10 value: 72.316 - type: recall_at_100 value: 83.842 - type: recall_at_1000 value: 91.999 - type: recall_at_3 value: 61.209 - type: recall_at_5 value: 66.77199999999999 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 85.46039999999999 - type: ap value: 79.9812521351881 - type: f1 value: 85.31722909702084 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 22.704 - type: map_at_10 value: 35.329 - type: map_at_100 value: 36.494 - type: map_at_1000 value: 36.541000000000004 - type: map_at_3 value: 31.476 - type: map_at_5 value: 33.731 - type: mrr_at_1 value: 23.294999999999998 - type: mrr_at_10 value: 35.859 - type: mrr_at_100 value: 36.968 - type: mrr_at_1000 value: 37.008 - type: mrr_at_3 value: 32.085 - type: mrr_at_5 value: 34.299 - type: ndcg_at_1 value: 23.324 - type: ndcg_at_10 value: 42.274 - type: ndcg_at_100 value: 47.839999999999996 - type: ndcg_at_1000 value: 48.971 - type: ndcg_at_3 value: 34.454 - type: ndcg_at_5 value: 38.464 - type: precision_at_1 value: 23.324 - type: precision_at_10 value: 6.648 - type: precision_at_100 value: 0.9440000000000001 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 14.674999999999999 - type: precision_at_5 value: 10.850999999999999 - type: recall_at_1 value: 22.704 - type: recall_at_10 value: 63.660000000000004 - type: recall_at_100 value: 89.29899999999999 - type: recall_at_1000 value: 97.88900000000001 - type: recall_at_3 value: 42.441 - type: recall_at_5 value: 52.04 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.1326949384405 - type: f1 value: 92.89743579612082 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (de) type: mteb/mtop_domain config: de split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 89.62524654832347 - type: f1 value: 88.65106082263151 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (es) type: mteb/mtop_domain config: es split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 90.59039359573046 - type: f1 value: 90.31532892105662 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: 
d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 86.21046038208581 - type: f1 value: 86.41459529813113 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (hi) type: mteb/mtop_domain config: hi split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 87.3180351380423 - type: f1 value: 86.71383078226444 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (th) type: mteb/mtop_domain config: th split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 86.24231464737792 - type: f1 value: 86.31845567592403 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 75.27131782945736 - type: f1 value: 57.52079940417103 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (de) type: mteb/mtop_intent config: de split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 71.2341504649197 - type: f1 value: 51.349951558039244 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (es) type: mteb/mtop_intent config: es split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 71.27418278852569 - type: f1 value: 50.1714985749095 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 67.68243031631694 - type: f1 value: 50.1066160836192 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (hi) type: mteb/mtop_intent config: hi split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 69.2362854069559 - type: f1 value: 48.821279948766424 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (th) type: mteb/mtop_intent config: th split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 71.71428571428571 - type: f1 value: 53.94611389496195 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (af) type: mteb/amazon_massive_intent config: af split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 59.97646267652992 - type: f1 value: 57.26797883561521 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (am) type: mteb/amazon_massive_intent config: am split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 53.65501008742435 - type: f1 value: 50.416258382177034 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ar) type: mteb/amazon_massive_intent config: ar split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 57.45796906523201 - type: f1 value: 53.306690547422185 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (az) type: mteb/amazon_massive_intent config: az split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.59246805648957 - type: f1 value: 59.818381969051494 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (bn) type: mteb/amazon_massive_intent config: bn split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy 
## MTEB Evaluation Results

All scores below are on the `test` split of each dataset and are rounded to two decimals.

### Classification

**MassiveIntentClassification** (`mteb/amazon_massive_intent`, revision `31efe3c427b0bae9c22cbb560b8f15491cc6bed7`):

| Config | Accuracy | F1 |
|--------|---------:|------:|
| … (truncated) | 61.13 | 58.26 |
| cy | 50.06 | 46.96 |
| da | 66.64 | 62.86 |
| de | 66.09 | 62.45 |
| el | 64.68 | 61.78 |
| en | 72.11 | 69.71 |
| es | 68.40 | 66.38 |
| fa | 67.25 | 64.33 |
| fi | 65.78 | 62.21 |
| fr | 67.95 | 65.48 |
| he | 62.05 | 58.10 |
| hi | 64.95 | 62.09 |
| hu | 64.97 | 61.14 |
| hy | 60.08 | 57.04 |
| id | 66.64 | 63.89 |
| is | 56.39 | 53.45 |
| it | 68.93 | 65.99 |
| ja | 68.94 | 66.49 |
| jv | 54.26 | 50.20 |
| ka | 48.99 | 46.66 |
| km | 44.69 | 41.64 |
| kn | 59.19 | 55.86 |
| ko | 66.34 | 63.89 |
| lv | 60.34 | 57.15 |
| ml | 63.09 | 59.65 |
| mn | 58.76 | 55.97 |
| ms | 62.48 | 59.45 |
| my | 58.56 | 54.96 |
| nb | 66.18 | 63.05 |
| nl | 67.30 | 64.06 |
| pl | 68.37 | 65.12 |
| pt | 68.98 | 66.42 |
| ro | 65.54 | 62.17 |
| ru | 69.02 | 66.25 |
| sl | 62.35 | 59.96 |
| sq | 61.23 | 57.99 |
| sv | 68.54 | 65.44 |
| sw | 56.00 | 52.84 |
| ta | 58.71 | 55.05 |
| te | 59.72 | 56.33 |
| th | 65.60 | 64.97 |
| tl | 60.86 | 58.15 |
| tr | 67.41 | 63.19 |
| ur | 61.52 | 58.36 |
| vi | 66.17 | 63.92 |
| zh-CN | 69.16 | 67.01 |
| zh-TW | 64.65 | 64.14 |

**MassiveScenarioClassification** (`mteb/amazon_massive_scenario`, revision `7d571f92784cd94a019292a1f45445077d0ef634`):

| Config | Accuracy | F1 |
|--------|---------:|------:|
| af | 65.09 | 63.04 |
| am | 58.52 | 56.82 |
| ar | 62.24 | 61.93 |
| az | 63.75 | 62.13 |
| bn | 65.00 | 63.71 |
| cy | 52.84 | 50.39 |
| da | 72.42 | 71.41 |
| de | 71.95 | 71.43 |
| el | 70.18 | 69.75 |
| en | 77.08 | 76.78 |
| es | 71.50 | 71.72 |
| fa | 70.25 | 69.08 |
| fi | 69.13 | 68.33 |
| fr | 71.89 | 71.34 |
| he | 67.44 | 66.90 |
| hi | 69.16 | 68.01 |
| hu | 70.75 | 70.18 |
| hy | 63.14 | 61.48 |
| id | 70.70 | 70.26 |
| is | 60.94 | 58.89 |
| it | 72.32 | 72.20 |
| ja | 74.65 | 74.55 |
| jv | 59.69 | 58.11 |
| ka | 54.37 | 52.81 |
| km | 48.31 | 45.40 |
| kn | 62.15 | 60.32 |
| ko | 72.45 | 71.68 |
| lv | 62.81 | 62.07 |
| ml | 68.04 | 66.68 |
| mn | 61.44 | 60.26 |
| ms | 66.90 | 65.13 |
| my | 61.64 | 59.14 |
| nb | 71.63 | 70.90 |
| nl | 72.11 | 71.72 |
| pl | 70.60 | 70.05 |
| pt | 70.83 | 70.85 |
| ro | 69.19 | 68.97 |
| ru | 72.99 | 72.85 |
| sl | 65.26 | 65.09 |
| sq | 66.49 | 64.43 |
| sv | 73.14 | 72.26 |
| sw | 59.89 | 58.31 |
| ta | 62.38 | 60.82 |
| te | 62.59 | 60.71 |
| th | 71.61 | 71.00 |
| tl | 62.74 | 61.68 |
| tr | 71.67 | 71.17 |
| ur | 64.64 | 63.30 |
| vi | 70.01 | 70.26 |
| zh-CN | 75.42 | 75.21 |
| zh-TW | 70.69 | 70.94 |

### Clustering

| Dataset | v_measure | Revision |
|---------|----------:|----------|
| MedrxivClusteringP2P | 28.92 | e7a26af6f3ae46b30dde8737f02c07b1505bcc73 |
| MedrxivClusteringS2S | 28.43 | 35191c8c0dca72d8ff3efcd72aa802307d469663 |
| RedditClustering | 42.41 | 24640382cdbf8abc73003fb0fa6d111a705499eb |
| RedditClusteringP2P | 55.17 | 282350215ef01743dc01b456c7f5241fa8937f16 |
| StackExchangeClustering | 55.27 | 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 |
| StackExchangeClusteringP2P | 30.46 | 815ca46b2622cec33ccafc3735d572c266efdb44 |

### Reranking

| Dataset | MAP | MRR | Revision |
|---------|----:|----:|----------|
| MindSmallReranking | 30.97 | 31.97 | 3bdac13927fdc888b903db93b2ffdbd90b295a69 |
| SciDocsRR | 80.74 | 94.82 | d3c5e1fc0b855ab6097bf1cda04dd73947d7caab |
| StackOverflowDupQuestions | 49.41 | 50.14 | e185fbe320c72810689fc5848eb6114e1ef5ec69 |

### Retrieval

All retrieval datasets (`nfcorpus`, `nq`, `quora`, `scidocs`, `scifact`, `trec-covid`) are evaluated with revision `None`:

| Metric | NFCorpus | NQ | QuoraRetrieval | SCIDOCS | SciFact | TRECCOVID |
|--------|---------:|---:|---------------:|--------:|--------:|----------:|
| map@1 | 5.53 | 37.04 | 69.95 | 4.21 | 55.68 | 0.21 |
| map@3 | 8.84 | 48.51 | 80.84 | 7.27 | 62.74 | 0.57 |
| map@5 | 10.07 | 51.04 | 82.65 | 8.62 | 64.41 | 0.92 |
| map@10 | 11.79 | 52.66 | 83.76 | 9.90 | 65.14 | 1.67 |
| map@100 | 14.50 | 53.48 | 84.41 | 11.78 | 65.82 | 9.15 |
| map@1000 | 15.78 | 53.51 | 84.43 | 12.08 | 65.85 | 22.93 |
| mrr@1 | 43.65 | 41.34 | 80.62 | 20.80 | 58.33 | 80.00 |
| mrr@3 | 49.43 | 51.82 | 85.96 | 28.13 | 64.94 | 85.67 |
| mrr@5 | 50.47 | 53.85 | 86.62 | 29.89 | 65.89 | 87.17 |
| mrr@10 | 51.53 | 55.07 | 86.95 | 31.11 | 66.50 | 87.17 |
| mrr@100 | 52.21 | 55.67 | 87.06 | 32.27 | 67.05 | 87.17 |
| mrr@1000 | 52.24 | 55.69 | 87.06 | 32.35 | 67.08 | 87.17 |
| ndcg@1 | 42.42 | 41.34 | 80.63 | 20.80 | 58.33 | 76.00 |
| ndcg@3 | 37.85 | 52.45 | 84.77 | 16.67 | 65.58 | 71.87 |
| ndcg@5 | 35.15 | 56.57 | 86.29 | 14.48 | 67.74 | 72.23 |
| ndcg@10 | 32.46 | 60.02 | 87.65 | 17.16 | 69.35 | 69.76 |
| ndcg@100 | 28.93 | 63.41 | 88.93 | 24.74 | 72.32 | 52.40 |
| ndcg@1000 | 37.63 | 64.02 | 89.05 | 30.32 | 73.01 | 47.74 |
| precision@1 | 43.65 | 41.34 | 80.63 | 20.80 | 58.33 | 80.00 |
| precision@3 | 35.40 | 23.42 | 37.10 | 15.47 | 25.44 | 76.67 |
| precision@5 | 29.91 | 16.46 | 24.37 | 12.60 | 16.93 | 78.00 |
| precision@10 | 23.93 | 9.53 | 13.31 | 8.74 | 9.03 | 75.00 |
| precision@100 | 7.17 | 1.15 | 1.53 | 1.96 | 1.07 | 53.96 |
| precision@1000 | 1.97 | 0.12 | 0.16 | 0.33 | 0.11 | 21.57 |
| recall@1 | 5.53 | 37.04 | 69.95 | 4.21 | 55.68 | 0.21 |
| recall@3 | 9.82 | 60.47 | 86.61 | 9.42 | 70.78 | 0.61 |
| recall@5 | 11.73 | 69.91 | 91.00 | 12.77 | 75.98 | 1.02 |
| recall@10 | 15.57 | 79.76 | 94.96 | 17.70 | 80.72 | 1.92 |
| recall@100 | 28.54 | 94.39 | 99.38 | 39.84 | 93.93 | 12.59 |
| recall@1000 | 59.86 | 98.85 | 99.96 | 66.89 | 99.33 | 45.31 |

### STS

STS17 configs use revision `af5e6fb845001ecf41f4c1e033ce921939a2a68d` and STS22 configs use revision `6d1ba47164174a496b7fa5d3569dae26a6813b80`. The remaining revisions are SICK-R `a6ea5a8cab320b040a23452cc28066d9beae2cee`, STS12 `a0d554a64d88156834ff5ae9920b964011b16384`, STS13 `7e90230a92c190f1bf69ae9002b8cea547a64cca`, STS14 `6031580fec1f6af667f0bd2da0a551cf4f0b2375`, STS15 `ae752c7c21bf194d8b67fd573edf7ae58183cbe3`, STS16 `4d8694f8f0e0100860b497b999b3dbed754a0513`, and STSBenchmark `b0fddb56ed78048fa8b90373c8a3cfc37b684831`.

| Dataset | cos Pearson | cos Spearman | Euclidean Pearson | Euclidean Spearman | Manhattan Pearson | Manhattan Spearman |
|---------|------------:|-------------:|------------------:|-------------------:|------------------:|-------------------:|
| SICK-R | 82.90 | 78.51 | 80.10 | 78.64 | 80.11 | 78.58 |
| STS12 | 84.26 | 76.70 | 77.98 | 74.24 | 77.70 | 73.87 |
| STS13 | 77.03 | 78.03 | 78.25 | 78.67 | 78.14 | 78.58 |
| STS14 | 79.17 | 76.60 | 77.83 | 76.43 | 77.69 | 76.26 |
| STS15 | 87.42 | 88.16 | 87.34 | 88.06 | 87.25 | 87.96 |
| STS16 | 82.88 | 84.28 | 83.17 | 83.85 | 83.03 | 83.72 |
| STS17 (ko-ko) | 80.21 | 79.95 | 79.43 | 79.95 | 79.33 | 79.81 |
| STS17 (ar-ar) | 74.47 | 74.52 | 72.29 | 73.56 | 72.09 | 73.37 |
| STS17 (en-ar) | 71.37 | 71.27 | 67.11 | 66.38 | 66.80 | 66.17 |
| STS17 (en-de) | 80.61 | 82.09 | 80.56 | 80.38 | 80.68 | 80.67 |
| STS17 (en-en) | 87.00 | 87.83 | 87.95 | 87.46 | 87.98 | 87.53 |
| STS17 (en-tr) | 64.76 | 63.33 | 64.08 | 61.21 | 64.26 | 61.24 |
| STS17 (es-en) | 75.83 | 76.50 | 75.13 | 75.64 | 75.19 | 75.31 |
| STS17 (es-es) | 87.50 | 86.74 | 86.83 | 85.98 | 86.75 | 85.79 |
| STS17 (fr-en) | 79.58 | 80.18 | 77.57 | 77.57 | 78.10 | 78.16 |
| STS17 (it-en) | 79.76 | 80.15 | 77.54 | 77.53 | 77.66 | 77.78 |
| STS17 (nl-en) | 79.17 | 79.25 | 77.55 | 77.33 | 77.98 | 77.85 |
| STS22 (en) | 59.10 | 61.83 | 45.89 | 56.88 | 46.10 | 56.81 |
| STS22 (de) | 45.78 | 55.95 | 25.46 | 49.86 | 25.62 | 50.00 |
| STS22 (es) | 60.01 | 66.58 | 42.60 | 62.82 | 42.63 | 62.74 |
| STS22 (pl) | 28.46 | 34.13 | 6.09 | 27.99 | 6.17 | 28.01 |
| STS22 (tr) | 57.43 | 63.70 | 38.12 | 58.12 | 37.97 | 57.82 |
| STS22 (ar) | 46.82 | 57.87 | 31.80 | 52.48 | 31.95 | 52.86 |
| STS22 (ru) | 52.00 | 60.66 | 33.87 | 55.75 | 33.99 | 55.99 |
| STS22 (zh) | 58.06 | 65.63 | 47.48 | 62.65 | 47.57 | 62.44 |
| STS22 (fr) | 70.49 | 74.80 | 49.66 | 70.32 | 49.55 | 70.25 |
| STS22 (de-en) | 55.64 | 54.93 | 29.00 | 41.86 | 29.47 | 42.34 |
| STS22 (es-en) | 68.14 | 73.99 | 46.46 | 60.92 | 47.07 | 61.70 |
| STS22 (it) | 73.27 | 77.76 | 62.31 | 75.14 | 62.33 | 75.13 |
| STS22 (pl-en) | 71.59 | 70.37 | 57.63 | 66.10 | 57.19 | 66.56 |
| STS22 (zh-en) | 66.38 | 69.92 | 40.56 | 55.19 | 41.04 | 55.61 |
| STS22 (es-it) | 57.79 | 66.46 | 39.04 | 56.01 | 38.92 | 56.69 |
| STS22 (de-fr) | 47.82 | 59.47 | 27.43 | 60.83 | 27.49 | 61.28 |
| STS22 (de-pl) | 16.31 | 39.35 | -2.79 | 27.13 | -3.20 | 26.33 |
| STS22 (fr-pl) | 59.67 | 73.25 | 46.92 | 16.90 | 46.88 | 28.17 |
| STSBenchmark | 83.80 | 85.64 | 85.22 | 85.54 | 85.16 | 85.44 |

### PairClassification

**SprintDuplicateQuestions** (revision `d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46`):

| Similarity | Accuracy | AP | F1 | Precision | Recall |
|------------|---------:|---:|---:|----------:|-------:|
| cos_sim | 99.75 | 93.01 | 86.93 | 88.09 | 85.80 |
| dot | 99.23 | 51.59 | 53.51 | 58.24 | 49.50 |
| euclidean | 99.74 | 92.40 | 86.98 | 87.46 | 86.50 |
| manhattan | 99.73 | 92.01 | 86.41 | 86.33 | 86.50 |
| max | 99.75 | 93.01 | 86.98 | | |

### Summarization

**SummEval** (revision `cda12ad7615edc362dbf25a00fdd61d3b1eaf93c`): cos_sim Pearson 28.89, cos_sim Spearman 30.11, dot Pearson 12.89, dot Spearman 12.83.

### BitextMining

**Tatoeba** (`mteb/tatoeba-bitext-mining`, revision `9080400076fbadbb4c4dcb136ff4eddc40b42553`):

| Config | Accuracy | F1 | Precision | Recall |
|--------|---------:|---:|----------:|-------:|
| sqi-eng | 92.10 | 90.06 | 89.17 | 92.10 |
| fry-eng | 56.07 | 50.88 | 48.97 | 56.07 |
| kur-eng | 57.32 | 52.96 | 51.68 | 57.32 |
| tur-eng | 94.30 | 92.67 | 91.91 | 94.30 |
| deu-eng | 97.70 | 97.07 | 96.80 | 97.70 |
| nld-eng | 94.70 | 93.20 | 92.48 | 94.70 |
| ron-eng | 92.90 | 91.27 | 90.59 | 92.90 |
| ang-eng | 34.33 | 29.07 | 28.07 | 34.33 |
| ido-eng | 78.50 | 74.39 | 72.83 | 78.50 |
| jav-eng | 66.34 | 61.25 | 59.64 | 66.34 |
| isl-eng | 80.90 | 76.90 | 75.33 | 80.90 |
| slv-eng | 84.93 | 81.93 | 80.71 | 84.93 |
| cym-eng | 71.13 | 65.22 | 62.91 | 71.13 |
| kaz-eng | 79.83 | 75.56 | 73.79 | 79.83 |
| est-eng | 74.00 | 70.51 | 69.30 | 74.00 |
| heb-eng | 78.70 | 74.14 | 72.27 | 78.70 |
| gla-eng | 48.97 | 43.08 | 41.37 | 48.97 |
| mar-eng | 89.60 | 86.62 | 85.23 | 89.60 |
| lat-eng | 45.20 | 39.58 | 37.99 | 45.20 |
| bel-eng | 89.50 | 86.70 | 85.53 | 89.50 |
| pms-eng | 50.10 | 44.61 | 42.77 | 50.10 |
| gle-eng | 63.40 | 58.36 | 56.55 | 63.40 |
| pes-eng | 89.20 | 87.08 | 86.12 | 89.20 |
| nob-eng | 96.80 | 95.90 | 95.51 | 96.80 |
| bul-eng | 90.90 | 88.63 | 87.62 | 90.90 |
| cbk-eng | 65.20 | 60.54 | 58.89 | 65.20 |
| hun-eng | 87.00 | 84.32 | 83.26 | 87.00 |
| uig-eng | 68.70 | 63.08 | 61.06 | 68.70 |
| rus-eng | 93.70 | 91.78 | 90.87 | 93.70 |
| spa-eng | 97.70 | 96.97 | 96.62 | 97.70 |
| hye-eng | 88.27 | 85.91 | 84.91 | 88.27 |
| tel-eng | 90.60 | 88.49 | 87.57 | 90.60 |
| afr-eng | 89.50 | 86.91 | 85.80 | 89.50 |
| mon-eng | 82.50 | 78.37 | 76.82 | 82.50 |
| arz-eng | 71.49 | 66.79 | 64.98 | 71.49 |
| hrv-eng | 94.10 | 92.50 | 91.77 | 94.10 |
| nov-eng | 71.21 | 66.83 | 65.35 | 71.21 |
| gsw-eng | 48.72 | 43.53 | 42.05 | 48.72 |
| nds-eng | 58.50 | 53.86 | 52.40 | 58.50 |
| ukr-eng | 90.80 | 88.29 | 87.09 | 90.80 |
| uzb-eng | 67.29 | 62.63 | 60.98 | 67.29 |
| lit-eng | 78.70 | 75.53 | 74.38 | 78.70 |
| ina-eng | 88.70 | 86.11 | 85.08 | 88.70 |
| lfn-eng | 57.00 | 52.85 | 51.54 | 57.00 |
| zsm-eng | 94.00 | 92.45 | 91.79 | 94.00 |
| ita-eng | 92.30 | 90.61 | 89.83 | 92.30 |
| cmn-eng | 94.70 | 93.35 | 92.75 | 94.70 |
| lvs-eng | 80.20 | 76.66 | 75.30 | 80.20 |
| glg-eng | 84.70 | 82.79 | 82.07 | 84.70 |
| ceb-eng | 50.33 | 45.46 | 43.94 | 50.33 |
| bre-eng | 6.60 | 5.44 | 5.15 | 6.60 |
| ben-eng | 85.00 | 81.05 | 79.25 | 85.00 |
| swg-eng | 47.32 | 42.33 | 40.69 | 47.32 |
| arq-eng | 30.74 | 26.74 | 25.40 | 30.74 |
| kab-eng | 25.10 | 21.98 | 21.06 | 25.10 |
| fra-eng | 94.30 | 92.76 | 92.06 | 94.30 |
| por-eng | 94.10 | 92.74 | 92.09 | 94.10 |
| tat-eng | 71.30 | 66.92 | 65.38 | 71.30 |
| oci-eng | … | … | … | … |
9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 40.300000000000004 - type: f1 value: 35.78682789299971 - type: precision value: 34.66425128716588 - type: recall value: 40.300000000000004 - task: type: BitextMining dataset: name: MTEB Tatoeba (pol-eng) type: mteb/tatoeba-bitext-mining config: pol-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96 - type: f1 value: 94.82333333333334 - type: precision value: 94.27833333333334 - type: recall value: 96 - task: type: BitextMining dataset: name: MTEB Tatoeba (war-eng) type: mteb/tatoeba-bitext-mining config: war-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 51.1 - type: f1 value: 47.179074753133584 - type: precision value: 46.06461044702424 - type: recall value: 51.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (aze-eng) type: mteb/tatoeba-bitext-mining config: aze-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.7 - type: f1 value: 84.71 - type: precision value: 83.46166666666667 - type: recall value: 87.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (vie-eng) type: mteb/tatoeba-bitext-mining config: vie-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.8 - type: f1 value: 94.68333333333334 - type: precision value: 94.13333333333334 - type: recall value: 95.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (nno-eng) type: mteb/tatoeba-bitext-mining config: nno-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.39999999999999 - type: f1 value: 82.5577380952381 - type: precision value: 81.36833333333334 - type: recall value: 85.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (cha-eng) type: mteb/tatoeba-bitext-mining config: cha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 21.16788321167883 - type: f1 value: 16.948865627297987 - type: precision value: 15.971932568647897 - type: recall value: 21.16788321167883 - task: type: BitextMining dataset: name: MTEB Tatoeba (mhr-eng) type: mteb/tatoeba-bitext-mining config: mhr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 6.9 - type: f1 value: 5.515526831658907 - type: precision value: 5.141966366966367 - type: recall value: 6.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (dan-eng) type: mteb/tatoeba-bitext-mining config: dan-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.2 - type: f1 value: 91.39666666666668 - type: precision value: 90.58666666666667 - type: recall value: 93.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (ell-eng) type: mteb/tatoeba-bitext-mining config: ell-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.2 - type: f1 value: 89.95666666666666 - type: precision value: 88.92833333333333 - type: recall value: 92.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (amh-eng) type: mteb/tatoeba-bitext-mining config: amh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 79.76190476190477 - type: f1 value: 74.93386243386244 - type: precision value: 73.11011904761904 - type: recall value: 79.76190476190477 - task: type: BitextMining dataset: name: MTEB Tatoeba (pam-eng) 
type: mteb/tatoeba-bitext-mining config: pam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.799999999999999 - type: f1 value: 6.921439712248537 - type: precision value: 6.489885109680683 - type: recall value: 8.799999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (hsb-eng) type: mteb/tatoeba-bitext-mining config: hsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 45.75569358178054 - type: f1 value: 40.34699501312631 - type: precision value: 38.57886764719063 - type: recall value: 45.75569358178054 - task: type: BitextMining dataset: name: MTEB Tatoeba (srp-eng) type: mteb/tatoeba-bitext-mining config: srp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.4 - type: f1 value: 89.08333333333333 - type: precision value: 88.01666666666668 - type: recall value: 91.4 - task: type: BitextMining dataset: name: MTEB Tatoeba (epo-eng) type: mteb/tatoeba-bitext-mining config: epo-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.60000000000001 - type: f1 value: 92.06690476190477 - type: precision value: 91.45095238095239 - type: recall value: 93.60000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (kzj-eng) type: mteb/tatoeba-bitext-mining config: kzj-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 7.5 - type: f1 value: 6.200363129378736 - type: precision value: 5.89115314822466 - type: recall value: 7.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (awa-eng) type: mteb/tatoeba-bitext-mining config: awa-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 73.59307359307358 - type: f1 value: 68.38933553219267 - type: precision value: 66.62698412698413 - type: recall value: 73.59307359307358 - task: type: BitextMining dataset: name: MTEB Tatoeba (fao-eng) type: mteb/tatoeba-bitext-mining config: fao-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 69.8473282442748 - type: f1 value: 64.72373682297346 - type: precision value: 62.82834214131924 - type: recall value: 69.8473282442748 - task: type: BitextMining dataset: name: MTEB Tatoeba (mal-eng) type: mteb/tatoeba-bitext-mining config: mal-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.5254730713246 - type: f1 value: 96.72489082969432 - type: precision value: 96.33672974284326 - type: recall value: 97.5254730713246 - task: type: BitextMining dataset: name: MTEB Tatoeba (ile-eng) type: mteb/tatoeba-bitext-mining config: ile-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 75.6 - type: f1 value: 72.42746031746033 - type: precision value: 71.14036630036631 - type: recall value: 75.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (bos-eng) type: mteb/tatoeba-bitext-mining config: bos-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.24293785310734 - type: f1 value: 88.86064030131826 - type: precision value: 87.73540489642184 - type: recall value: 91.24293785310734 - task: type: BitextMining dataset: name: MTEB Tatoeba (cor-eng) type: mteb/tatoeba-bitext-mining config: cor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 6.2 - type: f1 value: 
4.383083659794954 - type: precision value: 4.027861324289673 - type: recall value: 6.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (cat-eng) type: mteb/tatoeba-bitext-mining config: cat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.8 - type: f1 value: 84.09428571428572 - type: precision value: 83.00333333333333 - type: recall value: 86.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (eus-eng) type: mteb/tatoeba-bitext-mining config: eus-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 60.699999999999996 - type: f1 value: 56.1584972394755 - type: precision value: 54.713456330903135 - type: recall value: 60.699999999999996 - task: type: BitextMining dataset: name: MTEB Tatoeba (yue-eng) type: mteb/tatoeba-bitext-mining config: yue-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 84.2 - type: f1 value: 80.66190476190475 - type: precision value: 79.19690476190476 - type: recall value: 84.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (swe-eng) type: mteb/tatoeba-bitext-mining config: swe-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.2 - type: f1 value: 91.33 - type: precision value: 90.45 - type: recall value: 93.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (dtp-eng) type: mteb/tatoeba-bitext-mining config: dtp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 6.3 - type: f1 value: 5.126828976748276 - type: precision value: 4.853614328966668 - type: recall value: 6.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (kat-eng) type: mteb/tatoeba-bitext-mining config: kat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 81.76943699731903 - type: f1 value: 77.82873739308057 - type: precision value: 76.27622452019234 - type: recall value: 81.76943699731903 - task: type: BitextMining dataset: name: MTEB Tatoeba (jpn-eng) type: mteb/tatoeba-bitext-mining config: jpn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.30000000000001 - type: f1 value: 90.29666666666665 - type: precision value: 89.40333333333334 - type: recall value: 92.30000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (csb-eng) type: mteb/tatoeba-bitext-mining config: csb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 29.249011857707508 - type: f1 value: 24.561866096392947 - type: precision value: 23.356583740215456 - type: recall value: 29.249011857707508 - task: type: BitextMining dataset: name: MTEB Tatoeba (xho-eng) type: mteb/tatoeba-bitext-mining config: xho-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.46478873239437 - type: f1 value: 73.23943661971832 - type: precision value: 71.66666666666667 - type: recall value: 77.46478873239437 - task: type: BitextMining dataset: name: MTEB Tatoeba (orv-eng) type: mteb/tatoeba-bitext-mining config: orv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 20.35928143712575 - type: f1 value: 15.997867865075824 - type: precision value: 14.882104658301346 - type: recall value: 20.35928143712575 - task: type: BitextMining dataset: name: MTEB Tatoeba (ind-eng) type: mteb/tatoeba-bitext-mining config: ind-eng 
split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.2 - type: f1 value: 90.25999999999999 - type: precision value: 89.45333333333335 - type: recall value: 92.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (tuk-eng) type: mteb/tatoeba-bitext-mining config: tuk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 23.15270935960591 - type: f1 value: 19.65673625772148 - type: precision value: 18.793705293464992 - type: recall value: 23.15270935960591 - task: type: BitextMining dataset: name: MTEB Tatoeba (max-eng) type: mteb/tatoeba-bitext-mining config: max-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 59.154929577464785 - type: f1 value: 52.3868463305083 - type: precision value: 50.14938113529662 - type: recall value: 59.154929577464785 - task: type: BitextMining dataset: name: MTEB Tatoeba (swh-eng) type: mteb/tatoeba-bitext-mining config: swh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 70.51282051282051 - type: f1 value: 66.8089133089133 - type: precision value: 65.37645687645687 - type: recall value: 70.51282051282051 - task: type: BitextMining dataset: name: MTEB Tatoeba (hin-eng) type: mteb/tatoeba-bitext-mining config: hin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 93 - type: precision value: 92.23333333333333 - type: recall value: 94.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (dsb-eng) type: mteb/tatoeba-bitext-mining config: dsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 38.62212943632568 - type: f1 value: 34.3278276962583 - type: precision value: 33.07646935732408 - type: recall value: 38.62212943632568 - task: type: BitextMining dataset: name: MTEB Tatoeba (ber-eng) type: mteb/tatoeba-bitext-mining config: ber-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 28.1 - type: f1 value: 23.579609223054604 - type: precision value: 22.39622774921555 - type: recall value: 28.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (tam-eng) type: mteb/tatoeba-bitext-mining config: tam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.27361563517914 - type: f1 value: 85.12486427795874 - type: precision value: 83.71335504885994 - type: recall value: 88.27361563517914 - task: type: BitextMining dataset: name: MTEB Tatoeba (slk-eng) type: mteb/tatoeba-bitext-mining config: slk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.6 - type: f1 value: 86.39928571428571 - type: precision value: 85.4947557997558 - type: recall value: 88.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (tgl-eng) type: mteb/tatoeba-bitext-mining config: tgl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.5 - type: f1 value: 83.77952380952381 - type: precision value: 82.67602564102565 - type: recall value: 86.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (ast-eng) type: mteb/tatoeba-bitext-mining config: ast-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 79.52755905511812 - type: f1 value: 75.3055868016498 - type: precision value: 73.81889763779527 - type: recall value: 
79.52755905511812 - task: type: BitextMining dataset: name: MTEB Tatoeba (mkd-eng) type: mteb/tatoeba-bitext-mining config: mkd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.9 - type: f1 value: 73.76261904761905 - type: precision value: 72.11670995670995 - type: recall value: 77.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (khm-eng) type: mteb/tatoeba-bitext-mining config: khm-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 53.8781163434903 - type: f1 value: 47.25804051288816 - type: precision value: 45.0603482390186 - type: recall value: 53.8781163434903 - task: type: BitextMining dataset: name: MTEB Tatoeba (ces-eng) type: mteb/tatoeba-bitext-mining config: ces-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.10000000000001 - type: f1 value: 88.88 - type: precision value: 87.96333333333334 - type: recall value: 91.10000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (tzl-eng) type: mteb/tatoeba-bitext-mining config: tzl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 38.46153846153847 - type: f1 value: 34.43978243978244 - type: precision value: 33.429487179487175 - type: recall value: 38.46153846153847 - task: type: BitextMining dataset: name: MTEB Tatoeba (urd-eng) type: mteb/tatoeba-bitext-mining config: urd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.9 - type: f1 value: 86.19888888888887 - type: precision value: 85.07440476190476 - type: recall value: 88.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (ara-eng) type: mteb/tatoeba-bitext-mining config: ara-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.9 - type: f1 value: 82.58857142857143 - type: precision value: 81.15666666666667 - type: recall value: 85.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (kor-eng) type: mteb/tatoeba-bitext-mining config: kor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.8 - type: f1 value: 83.36999999999999 - type: precision value: 81.86833333333333 - type: recall value: 86.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (yid-eng) type: mteb/tatoeba-bitext-mining config: yid-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 68.51415094339622 - type: f1 value: 63.195000099481234 - type: precision value: 61.394033442972116 - type: recall value: 68.51415094339622 - task: type: BitextMining dataset: name: MTEB Tatoeba (fin-eng) type: mteb/tatoeba-bitext-mining config: fin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.5 - type: f1 value: 86.14603174603175 - type: precision value: 85.1162037037037 - type: recall value: 88.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (tha-eng) type: mteb/tatoeba-bitext-mining config: tha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.62043795620438 - type: f1 value: 94.40389294403892 - type: precision value: 93.7956204379562 - type: recall value: 95.62043795620438 - task: type: BitextMining dataset: name: MTEB Tatoeba (wuu-eng) type: mteb/tatoeba-bitext-mining config: wuu-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 
81.8 - type: f1 value: 78.6532178932179 - type: precision value: 77.46348795840176 - type: recall value: 81.8 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.603 - type: map_at_10 value: 8.5 - type: map_at_100 value: 12.985 - type: map_at_1000 value: 14.466999999999999 - type: map_at_3 value: 4.859999999999999 - type: map_at_5 value: 5.817 - type: mrr_at_1 value: 28.571 - type: mrr_at_10 value: 42.331 - type: mrr_at_100 value: 43.592999999999996 - type: mrr_at_1000 value: 43.592999999999996 - type: mrr_at_3 value: 38.435 - type: mrr_at_5 value: 39.966 - type: ndcg_at_1 value: 26.531 - type: ndcg_at_10 value: 21.353 - type: ndcg_at_100 value: 31.087999999999997 - type: ndcg_at_1000 value: 43.163000000000004 - type: ndcg_at_3 value: 22.999 - type: ndcg_at_5 value: 21.451 - type: precision_at_1 value: 28.571 - type: precision_at_10 value: 19.387999999999998 - type: precision_at_100 value: 6.265 - type: precision_at_1000 value: 1.4160000000000001 - type: precision_at_3 value: 24.490000000000002 - type: precision_at_5 value: 21.224 - type: recall_at_1 value: 2.603 - type: recall_at_10 value: 14.474 - type: recall_at_100 value: 40.287 - type: recall_at_1000 value: 76.606 - type: recall_at_3 value: 5.978 - type: recall_at_5 value: 7.819 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 69.7848 - type: ap value: 13.661023167088224 - type: f1 value: 53.61686134460943 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 61.28183361629882 - type: f1 value: 61.55481034919965 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 35.972128420092396 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 85.59933241938367 - type: cos_sim_ap value: 72.20760361208136 - type: cos_sim_f1 value: 66.4447731755424 - type: cos_sim_precision value: 62.35539102267469 - type: cos_sim_recall value: 71.10817941952506 - type: dot_accuracy value: 78.98313166835548 - type: dot_ap value: 44.492521645493795 - type: dot_f1 value: 45.814889336016094 - type: dot_precision value: 37.02439024390244 - type: dot_recall value: 60.07915567282321 - type: euclidean_accuracy value: 85.3907134767837 - type: euclidean_ap value: 71.53847289080343 - type: euclidean_f1 value: 65.95952206778834 - type: euclidean_precision value: 61.31006346328196 - type: euclidean_recall value: 71.37203166226914 - type: manhattan_accuracy value: 85.40859510043511 - type: manhattan_ap value: 71.49664104395515 - type: manhattan_f1 value: 65.98569969356485 - type: manhattan_precision value: 63.928748144482924 - type: manhattan_recall value: 68.17941952506597 - type: max_accuracy value: 85.59933241938367 - type: max_ap value: 72.20760361208136 - type: max_f1 value: 66.4447731755424 - task: type: PairClassification dataset: name: MTEB 
TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.83261536073273 - type: cos_sim_ap value: 85.48178133644264 - type: cos_sim_f1 value: 77.87816307403935 - type: cos_sim_precision value: 75.88953021114926 - type: cos_sim_recall value: 79.97382198952879 - type: dot_accuracy value: 79.76287499514883 - type: dot_ap value: 59.17438838475084 - type: dot_f1 value: 56.34566667855996 - type: dot_precision value: 52.50349092359864 - type: dot_recall value: 60.794579611949494 - type: euclidean_accuracy value: 88.76857996662397 - type: euclidean_ap value: 85.22764834359887 - type: euclidean_f1 value: 77.65379751543554 - type: euclidean_precision value: 75.11152683839401 - type: euclidean_recall value: 80.37419156144134 - type: manhattan_accuracy value: 88.6987231730508 - type: manhattan_ap value: 85.18907981724007 - type: manhattan_f1 value: 77.51967028849757 - type: manhattan_precision value: 75.49992701795358 - type: manhattan_recall value: 79.65044656606098 - type: max_accuracy value: 88.83261536073273 - type: max_ap value: 85.48178133644264 - type: max_f1 value: 77.87816307403935
---

***See Disclaimer below***

----

# A Teradata Vantage compatible Embeddings Model

# intfloat/multilingual-e5-base

## Overview of this Model

An embedding model that maps text (sentences or paragraphs) into a vector. The [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) model is well known for its effectiveness in capturing semantic meaning in text data. It's a state-of-the-art model trained on a large corpus, capable of generating high-quality text embeddings.

- 278.04M params (Sizes in ONNX format - "fp32": 1058.73MB, "int8": 265.5MB, "uint8": 265.5MB)
- 514 maximum input tokens
- 768 dimensions of output vector
- License: MIT. The released models can be used for commercial purposes free of charge.
- Reference to Original Model: https://huggingface.co/intfloat/multilingual-e5-base

## Quickstart: Deploying this Model in Teradata Vantage

We have pre-converted the model into the ONNX format compatible with BYOM 6.0, eliminating the need for manual conversion.

**Note:** Ensure you have access to a Teradata Database with BYOM 6.0 installed.

To get started, download the pre-converted model directly from the Teradata HuggingFace repository.

```python
import getpass

import teradataml as tdml
from huggingface_hub import hf_hub_download

model_name = "multilingual-e5-base"
number_dimensions_output = 768
model_file_name = "model.onnx"

# Step 1: Download the model and tokenizer from the Teradata HuggingFace page
hf_hub_download(repo_id=f"Teradata/{model_name}", filename=f"onnx/{model_file_name}", local_dir="./")
hf_hub_download(repo_id=f"Teradata/{model_name}", filename="tokenizer.json", local_dir="./")

# Step 2: Create a connection to Vantage
tdml.create_context(
    host=input("enter your hostname"),
    username=input("enter your username"),
    password=getpass.getpass("enter your password"),
)

# Step 3: Load the models into Vantage
# a) Embedding model
tdml.save_byom(
    model_id=model_name,  # must be unique in the models table
    model_file=f"onnx/{model_file_name}",
    table_name="embeddings_models",
)
# b) Tokenizer
tdml.save_byom(
    model_id=model_name,  # must be unique in the models table
    model_file="tokenizer.json",
    table_name="embeddings_tokenizers",
)

# Step 4: Test the ONNXEmbeddings function
# Note that ONNXEmbeddings expects the payload column to be named 'txt'.
# If it has a different name, just rename it in a subquery/CTE.
input_table = "emails.emails"
embeddings_query = f"""
SELECT *
FROM mldb.ONNXEmbeddings(
    ON {input_table} AS InputTable
    ON (SELECT * FROM embeddings_models WHERE model_id = '{model_name}') AS ModelTable DIMENSION
    ON (SELECT model AS tokenizer FROM embeddings_tokenizers WHERE model_id = '{model_name}') AS TokenizerTable DIMENSION
    USING
        Accumulate('id', 'txt')
        ModelOutputTensor('sentence_embedding')
        EnableMemoryCheck('false')
        OutputFormat('FLOAT32({number_dimensions_output})')
        OverwriteCachedModel('true')
) a
"""
DF_embeddings = tdml.DataFrame.from_query(embeddings_query)
DF_embeddings
```

## What Can I Do with the Embeddings?

Teradata Vantage includes pre-built in-database functions to process embeddings further. Explore the following examples:

- **Semantic Clustering with TD_KMeans:** [Semantic Clustering Python Notebook](https://github.com/Teradata/jupyter-demos/blob/main/UseCases/Language_Models_InVantage/Semantic_Clustering_Python.ipynb)
- **Semantic Distance with TD_VectorDistance:** [Semantic Similarity Python Notebook](https://github.com/Teradata/jupyter-demos/blob/main/UseCases/Language_Models_InVantage/Semantic_Similarity_Python.ipynb)
- **RAG-Based Application with TD_VectorDistance:** [RAG and Bedrock Query PDF Notebook](https://github.com/Teradata/jupyter-demos/blob/main/UseCases/Language_Models_InVantage/RAG_and_Bedrock_QueryPDF.ipynb)
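For orientation, the sketch below shows roughly how TD_VectorDistance can rank stored embeddings by semantic closeness directly in the database. It is a minimal, hypothetical sketch, not taken from the notebooks above: the tables `embeddings_target` and `embeddings_reference` and the `emb_0 ... emb_767` columns are assumed names for wherever you persisted the ONNXEmbeddings output, and the exact TD_VectorDistance argument syntax may differ across Vantage versions.

```python
import teradataml as tdml  # assumes the Vantage connection created in the quickstart

# Hedged sketch: for each row of a hypothetical 'embeddings_target' table,
# find the 3 closest rows of a hypothetical 'embeddings_reference' table
# by cosine distance over the 768 embedding columns.
similarity_query = """
SELECT * FROM TD_VectorDistance(
    ON embeddings_target    AS TargetTable
    ON embeddings_reference AS ReferenceTable DIMENSION
    USING
        TargetIDColumn('id')
        TargetFeatureColumns('[emb_0:emb_767]')
        RefIDColumn('id')
        RefFeatureColumns('[emb_0:emb_767]')
        DistanceMeasure('cosine')
        TopK(3)
) AS dt
"""
DF_similar = tdml.DataFrame.from_query(similarity_query)
DF_similar  # lower cosine distance = semantically closer texts
```

Since cosine distance is one minus cosine similarity, the lowest-distance matches are the most semantically similar; the notebooks linked above walk through complete, tested versions of this pattern.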
## Deep Dive into Model Conversion to ONNX

**The steps below outline how we converted the open-source Hugging Face model into an ONNX file compatible with the in-database ONNXEmbeddings function.**

You do not need to perform these steps yourself; they are provided solely for documentation and transparency. However, they may be helpful if you wish to convert another model to the required format.

### Part 1. Importing and Converting the Model Using optimum

We start by importing the pre-trained [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) model from Hugging Face. To enhance performance and ensure compatibility with various execution environments, we use the [Optimum](https://github.com/huggingface/optimum) utility to convert the model into the ONNX (Open Neural Network Exchange) format. After the conversion, we fix the opset version in the ONNX file for compatibility with the ONNX runtime used in Teradata Vantage, and we generate ONNX files in multiple precisions: fp32, int8, and uint8.

You can find the detailed conversion steps in the file [convert.py](./convert.py).
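For orientation, here is a minimal sketch of what such an export-and-quantization pipeline can look like with Optimum and onnxruntime; it is not the actual [convert.py](./convert.py). The target opset (16) and the file paths are illustrative assumptions, and a plain feature-extraction export like this one exposes only `last_hidden_state`, whereas the published files additionally expose the pooled `sentence_embedding` tensor that ONNXEmbeddings reads.

```python
import onnx
from onnx import version_converter
from onnxruntime.quantization import QuantType, quantize_dynamic
from optimum.onnxruntime import ORTModelForFeatureExtraction

# Export the Hugging Face model to ONNX via Optimum.
model = ORTModelForFeatureExtraction.from_pretrained(
    "intfloat/multilingual-e5-base", export=True
)
model.save_pretrained("onnx/")  # writes onnx/model.onnx plus configs

# Pin the opset to the version the target ONNX runtime supports
# (16 is an illustrative assumption, not the value used for this card).
exported = onnx.load("onnx/model.onnx")
onnx.save(version_converter.convert_version(exported, 16), "onnx/model.onnx")

# Produce a lower-precision variant via dynamic quantization; using
# QuantType.QUInt8 instead yields the uint8 file.
quantize_dynamic(
    "onnx/model.onnx",
    "onnx/model_int8.onnx",
    weight_type=QuantType.QInt8,
)
```

Dynamic quantization stores the weights in 8-bit while keeping activations in floating point, which is why the int8 and uint8 files listed above are roughly a quarter of the fp32 size.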
### Part 2. Running the Model in Python with onnxruntime and Comparing Results

Once the fixes are applied, we test the correctness of the ONNX model by computing the cosine similarity between two texts with both the native SentenceTransformers model and the ONNX runtime, and comparing the results. If the two similarity scores match, the ONNX model produces the same embeddings as the native model, validating its correctness and suitability for further use in the database.

```python
import onnxruntime as rt
import transformers
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

model_id = "intfloat/multilingual-e5-base"
sentences_1 = 'How is the weather today?'
sentences_2 = 'What is the current weather like today?'

# Calculate the ONNX result
tokenizer = transformers.AutoTokenizer.from_pretrained(model_id)
predef_sess = rt.InferenceSession("onnx/model.onnx")

enc1 = tokenizer(sentences_1)
embeddings_1_onnx = predef_sess.run(
    None, {"input_ids": [enc1.input_ids], "attention_mask": [enc1.attention_mask]}
)
enc2 = tokenizer(sentences_2)
embeddings_2_onnx = predef_sess.run(
    None, {"input_ids": [enc2.input_ids], "attention_mask": [enc2.attention_mask]}
)

# Calculate embeddings with SentenceTransformer
model = SentenceTransformer(model_id, trust_remote_code=True)
embeddings_1_sentence_transformer = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2_sentence_transformer = model.encode(sentences_2, normalize_embeddings=True)

# Compare the results; output index 1 of the ONNX graph is the pooled
# 'sentence_embedding' tensor used by ONNXEmbeddings.
print("Cosine similarity for embeddings calculated with ONNX: " + str(cos_sim(embeddings_1_onnx[1][0], embeddings_2_onnx[1][0])))
print("Cosine similarity for embeddings calculated with SentenceTransformer: " + str(cos_sim(embeddings_1_sentence_transformer, embeddings_2_sentence_transformer)))
```

You can find the detailed ONNX vs. SentenceTransformer result comparison steps in the file [test_local.py](./test_local.py).

-----

DISCLAIMER: The content herein (“Content”) is provided “AS IS” and is not covered by any Teradata Operations, Inc. and its affiliates (“Teradata”) agreements. Its listing here does not constitute certification or endorsement by Teradata.

To the extent any of the Content contains or is related to any artificial intelligence (“AI”) or other language learning models (“Models”) that interoperate with the products and services of Teradata, by accessing, bringing, deploying or using such Models, you acknowledge and agree that you are solely responsible for ensuring compliance with all applicable laws, regulations, and restrictions governing the use, deployment, and distribution of AI technologies. This includes, but is not limited to, AI Diffusion Rules, European Union AI Act, AI-related laws and regulations, privacy laws, export controls, and financial or sector-specific regulations.

While Teradata may provide support, guidance, or assistance in the deployment or implementation of Models to interoperate with Teradata’s products and/or services, you remain fully responsible for ensuring that your Models, data, and applications comply with all relevant legal and regulatory obligations. Our assistance does not constitute legal or regulatory approval, and Teradata disclaims any liability arising from non-compliance with applicable laws.

You must determine the suitability of the Models for any purpose. Given the probabilistic nature of machine learning and modeling, the use of the Models may in some situations result in incorrect output that does not accurately reflect the action generated. You should evaluate the accuracy of any output as appropriate for your use case, including by using human review of the output.
[ "SEMANTIC_SIMILARITY", "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
t12e/instructor-base
t12e
sentence-similarity
[ "sentence-transformers", "pytorch", "t5", "text-embedding", "embeddings", "information-retrieval", "beir", "text-classification", "language-model", "text-clustering", "text-semantic-similarity", "text-evaluation", "prompt-retrieval", "text-reranking", "feature-extraction", "sentence-similarity", "transformers", "English", "Sentence Similarity", "natural_questions", "ms_marco", "fever", "hotpot_qa", "mteb", "en", "arxiv:2212.09741", "license:apache-2.0", "model-index", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,685
1,685
9
0
--- language: en license: apache-2.0 pipeline_tag: sentence-similarity tags: - text-embedding - embeddings - information-retrieval - beir - text-classification - language-model - text-clustering - text-semantic-similarity - text-evaluation - prompt-retrieval - text-reranking - sentence-transformers - feature-extraction - sentence-similarity - transformers - t5 - English - Sentence Similarity - natural_questions - ms_marco - fever - hotpot_qa - mteb inference: false model-index: - name: final_base_results results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 86.2089552238806 - type: ap value: 55.76273850794966 - type: f1 value: 81.26104211414781 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 88.35995000000001 - type: ap value: 84.18839957309655 - type: f1 value: 88.317619250081 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 44.64 - type: f1 value: 42.48663956478136 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 27.383000000000003 - type: map_at_10 value: 43.024 - type: map_at_100 value: 44.023 - type: map_at_1000 value: 44.025999999999996 - type: map_at_3 value: 37.684 - type: map_at_5 value: 40.884 - type: mrr_at_1 value: 28.094 - type: mrr_at_10 value: 43.315 - type: mrr_at_100 value: 44.313 - type: mrr_at_1000 value: 44.317 - type: mrr_at_3 value: 37.862 - type: mrr_at_5 value: 41.155 - type: ndcg_at_1 value: 27.383000000000003 - type: ndcg_at_10 value: 52.032000000000004 - type: ndcg_at_100 value: 56.19499999999999 - type: ndcg_at_1000 value: 56.272 - type: ndcg_at_3 value: 41.166000000000004 - type: ndcg_at_5 value: 46.92 - type: precision_at_1 value: 27.383000000000003 - type: precision_at_10 value: 8.087 - type: precision_at_100 value: 0.989 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 17.093 - type: precision_at_5 value: 13.044 - type: recall_at_1 value: 27.383000000000003 - type: recall_at_10 value: 80.868 - type: recall_at_100 value: 98.86200000000001 - type: recall_at_1000 value: 99.431 - type: recall_at_3 value: 51.28 - type: recall_at_5 value: 65.22 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 39.68441054431849 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 29.188539728343844 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 63.173362687519784 - type: mrr value: 76.18860748362133 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_spearman 
value: 82.30789953771232 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 77.03571428571428 - type: f1 value: 75.87384305045917 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 32.98041170516364 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 25.71652988451154 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 33.739999999999995 - type: map_at_10 value: 46.197 - type: map_at_100 value: 47.814 - type: map_at_1000 value: 47.934 - type: map_at_3 value: 43.091 - type: map_at_5 value: 44.81 - type: mrr_at_1 value: 41.059 - type: mrr_at_10 value: 52.292 - type: mrr_at_100 value: 52.978 - type: mrr_at_1000 value: 53.015 - type: mrr_at_3 value: 49.976 - type: mrr_at_5 value: 51.449999999999996 - type: ndcg_at_1 value: 41.059 - type: ndcg_at_10 value: 52.608 - type: ndcg_at_100 value: 57.965 - type: ndcg_at_1000 value: 59.775999999999996 - type: ndcg_at_3 value: 48.473 - type: ndcg_at_5 value: 50.407999999999994 - type: precision_at_1 value: 41.059 - type: precision_at_10 value: 9.943 - type: precision_at_100 value: 1.6070000000000002 - type: precision_at_1000 value: 0.20500000000000002 - type: precision_at_3 value: 23.413999999999998 - type: precision_at_5 value: 16.481 - type: recall_at_1 value: 33.739999999999995 - type: recall_at_10 value: 63.888999999999996 - type: recall_at_100 value: 85.832 - type: recall_at_1000 value: 97.475 - type: recall_at_3 value: 51.953 - type: recall_at_5 value: 57.498000000000005 - type: map_at_1 value: 31.169999999999998 - type: map_at_10 value: 41.455 - type: map_at_100 value: 42.716 - type: map_at_1000 value: 42.847 - type: map_at_3 value: 38.568999999999996 - type: map_at_5 value: 40.099000000000004 - type: mrr_at_1 value: 39.427 - type: mrr_at_10 value: 47.818 - type: mrr_at_100 value: 48.519 - type: mrr_at_1000 value: 48.558 - type: mrr_at_3 value: 45.86 - type: mrr_at_5 value: 46.936 - type: ndcg_at_1 value: 39.427 - type: ndcg_at_10 value: 47.181 - type: ndcg_at_100 value: 51.737 - type: ndcg_at_1000 value: 53.74 - type: ndcg_at_3 value: 43.261 - type: ndcg_at_5 value: 44.891 - type: precision_at_1 value: 39.427 - type: precision_at_10 value: 8.847 - type: precision_at_100 value: 1.425 - type: precision_at_1000 value: 0.189 - type: precision_at_3 value: 20.785999999999998 - type: precision_at_5 value: 14.560999999999998 - type: recall_at_1 value: 31.169999999999998 - type: recall_at_10 value: 56.971000000000004 - type: recall_at_100 value: 76.31400000000001 - type: recall_at_1000 value: 88.93900000000001 - type: recall_at_3 value: 45.208 - type: recall_at_5 value: 49.923 - type: map_at_1 value: 39.682 - type: map_at_10 value: 52.766000000000005 - type: map_at_100 value: 53.84100000000001 - type: map_at_1000 value: 53.898 - type: map_at_3 value: 49.291000000000004 - type: map_at_5 value: 51.365 - type: mrr_at_1 value: 45.266 - type: mrr_at_10 value: 56.093 - type: mrr_at_100 value: 56.763 - type: mrr_at_1000 value: 56.793000000000006 - type: mrr_at_3 value: 
53.668000000000006 - type: mrr_at_5 value: 55.1 - type: ndcg_at_1 value: 45.266 - type: ndcg_at_10 value: 58.836 - type: ndcg_at_100 value: 62.863 - type: ndcg_at_1000 value: 63.912 - type: ndcg_at_3 value: 53.19199999999999 - type: ndcg_at_5 value: 56.125 - type: precision_at_1 value: 45.266 - type: precision_at_10 value: 9.492 - type: precision_at_100 value: 1.236 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 23.762 - type: precision_at_5 value: 16.414 - type: recall_at_1 value: 39.682 - type: recall_at_10 value: 73.233 - type: recall_at_100 value: 90.335 - type: recall_at_1000 value: 97.452 - type: recall_at_3 value: 58.562000000000005 - type: recall_at_5 value: 65.569 - type: map_at_1 value: 26.743 - type: map_at_10 value: 34.016000000000005 - type: map_at_100 value: 35.028999999999996 - type: map_at_1000 value: 35.113 - type: map_at_3 value: 31.763 - type: map_at_5 value: 33.013999999999996 - type: mrr_at_1 value: 28.927000000000003 - type: mrr_at_10 value: 36.32 - type: mrr_at_100 value: 37.221 - type: mrr_at_1000 value: 37.281 - type: mrr_at_3 value: 34.105000000000004 - type: mrr_at_5 value: 35.371 - type: ndcg_at_1 value: 28.927000000000003 - type: ndcg_at_10 value: 38.474000000000004 - type: ndcg_at_100 value: 43.580000000000005 - type: ndcg_at_1000 value: 45.64 - type: ndcg_at_3 value: 34.035 - type: ndcg_at_5 value: 36.186 - type: precision_at_1 value: 28.927000000000003 - type: precision_at_10 value: 5.74 - type: precision_at_100 value: 0.8710000000000001 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 14.124 - type: precision_at_5 value: 9.74 - type: recall_at_1 value: 26.743 - type: recall_at_10 value: 49.955 - type: recall_at_100 value: 73.904 - type: recall_at_1000 value: 89.133 - type: recall_at_3 value: 38.072 - type: recall_at_5 value: 43.266 - type: map_at_1 value: 16.928 - type: map_at_10 value: 23.549 - type: map_at_100 value: 24.887 - type: map_at_1000 value: 25.018 - type: map_at_3 value: 21.002000000000002 - type: map_at_5 value: 22.256 - type: mrr_at_1 value: 21.02 - type: mrr_at_10 value: 27.898 - type: mrr_at_100 value: 29.018 - type: mrr_at_1000 value: 29.099999999999998 - type: mrr_at_3 value: 25.456 - type: mrr_at_5 value: 26.625 - type: ndcg_at_1 value: 21.02 - type: ndcg_at_10 value: 28.277 - type: ndcg_at_100 value: 34.54 - type: ndcg_at_1000 value: 37.719 - type: ndcg_at_3 value: 23.707 - type: ndcg_at_5 value: 25.482 - type: precision_at_1 value: 21.02 - type: precision_at_10 value: 5.361 - type: precision_at_100 value: 0.9809999999999999 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 11.401 - type: precision_at_5 value: 8.209 - type: recall_at_1 value: 16.928 - type: recall_at_10 value: 38.601 - type: recall_at_100 value: 65.759 - type: recall_at_1000 value: 88.543 - type: recall_at_3 value: 25.556 - type: recall_at_5 value: 30.447000000000003 - type: map_at_1 value: 28.549000000000003 - type: map_at_10 value: 38.426 - type: map_at_100 value: 39.845000000000006 - type: map_at_1000 value: 39.956 - type: map_at_3 value: 35.372 - type: map_at_5 value: 37.204 - type: mrr_at_1 value: 35.034 - type: mrr_at_10 value: 44.041000000000004 - type: mrr_at_100 value: 44.95 - type: mrr_at_1000 value: 44.997 - type: mrr_at_3 value: 41.498000000000005 - type: mrr_at_5 value: 43.077 - type: ndcg_at_1 value: 35.034 - type: ndcg_at_10 value: 44.218 - type: ndcg_at_100 value: 49.958000000000006 - type: ndcg_at_1000 value: 52.019000000000005 - type: ndcg_at_3 value: 39.34 - type: 
ndcg_at_5 value: 41.892 - type: precision_at_1 value: 35.034 - type: precision_at_10 value: 7.911 - type: precision_at_100 value: 1.26 - type: precision_at_1000 value: 0.16 - type: precision_at_3 value: 18.511 - type: precision_at_5 value: 13.205 - type: recall_at_1 value: 28.549000000000003 - type: recall_at_10 value: 56.035999999999994 - type: recall_at_100 value: 79.701 - type: recall_at_1000 value: 93.149 - type: recall_at_3 value: 42.275 - type: recall_at_5 value: 49.097 - type: map_at_1 value: 29.391000000000002 - type: map_at_10 value: 39.48 - type: map_at_100 value: 40.727000000000004 - type: map_at_1000 value: 40.835 - type: map_at_3 value: 36.234 - type: map_at_5 value: 37.877 - type: mrr_at_1 value: 35.959 - type: mrr_at_10 value: 44.726 - type: mrr_at_100 value: 45.531 - type: mrr_at_1000 value: 45.582 - type: mrr_at_3 value: 42.047000000000004 - type: mrr_at_5 value: 43.611 - type: ndcg_at_1 value: 35.959 - type: ndcg_at_10 value: 45.303 - type: ndcg_at_100 value: 50.683 - type: ndcg_at_1000 value: 52.818 - type: ndcg_at_3 value: 39.987 - type: ndcg_at_5 value: 42.243 - type: precision_at_1 value: 35.959 - type: precision_at_10 value: 8.241999999999999 - type: precision_at_100 value: 1.274 - type: precision_at_1000 value: 0.163 - type: precision_at_3 value: 18.836 - type: precision_at_5 value: 13.196 - type: recall_at_1 value: 29.391000000000002 - type: recall_at_10 value: 57.364000000000004 - type: recall_at_100 value: 80.683 - type: recall_at_1000 value: 94.918 - type: recall_at_3 value: 42.263 - type: recall_at_5 value: 48.634 - type: map_at_1 value: 26.791749999999997 - type: map_at_10 value: 35.75541666666667 - type: map_at_100 value: 37.00791666666667 - type: map_at_1000 value: 37.12408333333333 - type: map_at_3 value: 33.02966666666667 - type: map_at_5 value: 34.56866666666667 - type: mrr_at_1 value: 31.744333333333337 - type: mrr_at_10 value: 39.9925 - type: mrr_at_100 value: 40.86458333333333 - type: mrr_at_1000 value: 40.92175000000001 - type: mrr_at_3 value: 37.68183333333334 - type: mrr_at_5 value: 39.028499999999994 - type: ndcg_at_1 value: 31.744333333333337 - type: ndcg_at_10 value: 40.95008333333334 - type: ndcg_at_100 value: 46.25966666666667 - type: ndcg_at_1000 value: 48.535333333333334 - type: ndcg_at_3 value: 36.43333333333333 - type: ndcg_at_5 value: 38.602333333333334 - type: precision_at_1 value: 31.744333333333337 - type: precision_at_10 value: 7.135166666666666 - type: precision_at_100 value: 1.1535833333333334 - type: precision_at_1000 value: 0.15391666666666665 - type: precision_at_3 value: 16.713 - type: precision_at_5 value: 11.828416666666666 - type: recall_at_1 value: 26.791749999999997 - type: recall_at_10 value: 51.98625 - type: recall_at_100 value: 75.30358333333334 - type: recall_at_1000 value: 91.05433333333333 - type: recall_at_3 value: 39.39583333333333 - type: recall_at_5 value: 45.05925 - type: map_at_1 value: 22.219 - type: map_at_10 value: 29.162 - type: map_at_100 value: 30.049999999999997 - type: map_at_1000 value: 30.144 - type: map_at_3 value: 27.204 - type: map_at_5 value: 28.351 - type: mrr_at_1 value: 25.153 - type: mrr_at_10 value: 31.814999999999998 - type: mrr_at_100 value: 32.573 - type: mrr_at_1000 value: 32.645 - type: mrr_at_3 value: 29.934 - type: mrr_at_5 value: 30.946 - type: ndcg_at_1 value: 25.153 - type: ndcg_at_10 value: 33.099000000000004 - type: ndcg_at_100 value: 37.768 - type: ndcg_at_1000 value: 40.331 - type: ndcg_at_3 value: 29.473 - type: ndcg_at_5 value: 31.206 - type: precision_at_1 value: 25.153 - type: 
precision_at_10 value: 5.183999999999999 - type: precision_at_100 value: 0.8170000000000001 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 12.831999999999999 - type: precision_at_5 value: 8.895999999999999 - type: recall_at_1 value: 22.219 - type: recall_at_10 value: 42.637 - type: recall_at_100 value: 64.704 - type: recall_at_1000 value: 83.963 - type: recall_at_3 value: 32.444 - type: recall_at_5 value: 36.802 - type: map_at_1 value: 17.427999999999997 - type: map_at_10 value: 24.029 - type: map_at_100 value: 25.119999999999997 - type: map_at_1000 value: 25.257 - type: map_at_3 value: 22.016 - type: map_at_5 value: 23.143 - type: mrr_at_1 value: 21.129 - type: mrr_at_10 value: 27.750000000000004 - type: mrr_at_100 value: 28.666999999999998 - type: mrr_at_1000 value: 28.754999999999995 - type: mrr_at_3 value: 25.849 - type: mrr_at_5 value: 26.939999999999998 - type: ndcg_at_1 value: 21.129 - type: ndcg_at_10 value: 28.203 - type: ndcg_at_100 value: 33.44 - type: ndcg_at_1000 value: 36.61 - type: ndcg_at_3 value: 24.648999999999997 - type: ndcg_at_5 value: 26.316 - type: precision_at_1 value: 21.129 - type: precision_at_10 value: 5.055 - type: precision_at_100 value: 0.909 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 11.666 - type: precision_at_5 value: 8.3 - type: recall_at_1 value: 17.427999999999997 - type: recall_at_10 value: 36.923 - type: recall_at_100 value: 60.606 - type: recall_at_1000 value: 83.19 - type: recall_at_3 value: 26.845000000000002 - type: recall_at_5 value: 31.247000000000003 - type: map_at_1 value: 26.457000000000004 - type: map_at_10 value: 35.228 - type: map_at_100 value: 36.475 - type: map_at_1000 value: 36.585 - type: map_at_3 value: 32.444 - type: map_at_5 value: 34.046 - type: mrr_at_1 value: 30.784 - type: mrr_at_10 value: 39.133 - type: mrr_at_100 value: 40.11 - type: mrr_at_1000 value: 40.169 - type: mrr_at_3 value: 36.692 - type: mrr_at_5 value: 38.17 - type: ndcg_at_1 value: 30.784 - type: ndcg_at_10 value: 40.358 - type: ndcg_at_100 value: 46.119 - type: ndcg_at_1000 value: 48.428 - type: ndcg_at_3 value: 35.504000000000005 - type: ndcg_at_5 value: 37.864 - type: precision_at_1 value: 30.784 - type: precision_at_10 value: 6.800000000000001 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 15.920000000000002 - type: precision_at_5 value: 11.437 - type: recall_at_1 value: 26.457000000000004 - type: recall_at_10 value: 51.845 - type: recall_at_100 value: 77.046 - type: recall_at_1000 value: 92.892 - type: recall_at_3 value: 38.89 - type: recall_at_5 value: 44.688 - type: map_at_1 value: 29.378999999999998 - type: map_at_10 value: 37.373 - type: map_at_100 value: 39.107 - type: map_at_1000 value: 39.317 - type: map_at_3 value: 34.563 - type: map_at_5 value: 36.173 - type: mrr_at_1 value: 35.178 - type: mrr_at_10 value: 42.44 - type: mrr_at_100 value: 43.434 - type: mrr_at_1000 value: 43.482 - type: mrr_at_3 value: 39.987 - type: mrr_at_5 value: 41.370000000000005 - type: ndcg_at_1 value: 35.178 - type: ndcg_at_10 value: 42.82 - type: ndcg_at_100 value: 48.935 - type: ndcg_at_1000 value: 51.28 - type: ndcg_at_3 value: 38.562999999999995 - type: ndcg_at_5 value: 40.687 - type: precision_at_1 value: 35.178 - type: precision_at_10 value: 7.945 - type: precision_at_100 value: 1.524 - type: precision_at_1000 value: 0.242 - type: precision_at_3 value: 17.721 - type: precision_at_5 value: 12.925 - type: recall_at_1 value: 
29.378999999999998 - type: recall_at_10 value: 52.141999999999996 - type: recall_at_100 value: 79.49000000000001 - type: recall_at_1000 value: 93.782 - type: recall_at_3 value: 39.579 - type: recall_at_5 value: 45.462 - type: map_at_1 value: 19.814999999999998 - type: map_at_10 value: 27.383999999999997 - type: map_at_100 value: 28.483999999999998 - type: map_at_1000 value: 28.585 - type: map_at_3 value: 24.807000000000002 - type: map_at_5 value: 26.485999999999997 - type: mrr_at_1 value: 21.996 - type: mrr_at_10 value: 29.584 - type: mrr_at_100 value: 30.611 - type: mrr_at_1000 value: 30.684 - type: mrr_at_3 value: 27.11 - type: mrr_at_5 value: 28.746 - type: ndcg_at_1 value: 21.996 - type: ndcg_at_10 value: 32.024 - type: ndcg_at_100 value: 37.528 - type: ndcg_at_1000 value: 40.150999999999996 - type: ndcg_at_3 value: 27.016000000000002 - type: ndcg_at_5 value: 29.927999999999997 - type: precision_at_1 value: 21.996 - type: precision_at_10 value: 5.102 - type: precision_at_100 value: 0.856 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 11.583 - type: precision_at_5 value: 8.577 - type: recall_at_1 value: 19.814999999999998 - type: recall_at_10 value: 44.239 - type: recall_at_100 value: 69.269 - type: recall_at_1000 value: 89.216 - type: recall_at_3 value: 31.102999999999998 - type: recall_at_5 value: 38.078 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 11.349 - type: map_at_10 value: 19.436 - type: map_at_100 value: 21.282999999999998 - type: map_at_1000 value: 21.479 - type: map_at_3 value: 15.841 - type: map_at_5 value: 17.558 - type: mrr_at_1 value: 25.863000000000003 - type: mrr_at_10 value: 37.218 - type: mrr_at_100 value: 38.198 - type: mrr_at_1000 value: 38.236 - type: mrr_at_3 value: 33.409 - type: mrr_at_5 value: 35.602000000000004 - type: ndcg_at_1 value: 25.863000000000003 - type: ndcg_at_10 value: 27.953 - type: ndcg_at_100 value: 35.327 - type: ndcg_at_1000 value: 38.708999999999996 - type: ndcg_at_3 value: 21.985 - type: ndcg_at_5 value: 23.957 - type: precision_at_1 value: 25.863000000000003 - type: precision_at_10 value: 8.99 - type: precision_at_100 value: 1.6889999999999998 - type: precision_at_1000 value: 0.232 - type: precision_at_3 value: 16.308 - type: precision_at_5 value: 12.912 - type: recall_at_1 value: 11.349 - type: recall_at_10 value: 34.581 - type: recall_at_100 value: 60.178 - type: recall_at_1000 value: 78.88199999999999 - type: recall_at_3 value: 20.041999999999998 - type: recall_at_5 value: 25.458 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 7.893 - type: map_at_10 value: 15.457 - type: map_at_100 value: 20.905 - type: map_at_1000 value: 22.116 - type: map_at_3 value: 11.593 - type: map_at_5 value: 13.134 - type: mrr_at_1 value: 57.49999999999999 - type: mrr_at_10 value: 65.467 - type: mrr_at_100 value: 66.022 - type: mrr_at_1000 value: 66.039 - type: mrr_at_3 value: 63.458000000000006 - type: mrr_at_5 value: 64.546 - type: ndcg_at_1 value: 45.875 - type: ndcg_at_10 value: 33.344 - type: ndcg_at_100 value: 36.849 - type: ndcg_at_1000 value: 44.03 - type: ndcg_at_3 value: 37.504 - type: ndcg_at_5 value: 34.892 - type: precision_at_1 value: 57.49999999999999 - type: precision_at_10 value: 25.95 - type: precision_at_100 value: 7.89 - type: precision_at_1000 value: 1.669 - type: precision_at_3 value: 40.333000000000006 - type: 
precision_at_5 value: 33.050000000000004 - type: recall_at_1 value: 7.893 - type: recall_at_10 value: 20.724999999999998 - type: recall_at_100 value: 42.516 - type: recall_at_1000 value: 65.822 - type: recall_at_3 value: 12.615000000000002 - type: recall_at_5 value: 15.482000000000001 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 51.760000000000005 - type: f1 value: 45.51690565701713 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 53.882 - type: map_at_10 value: 65.902 - type: map_at_100 value: 66.33 - type: map_at_1000 value: 66.348 - type: map_at_3 value: 63.75999999999999 - type: map_at_5 value: 65.181 - type: mrr_at_1 value: 58.041 - type: mrr_at_10 value: 70.133 - type: mrr_at_100 value: 70.463 - type: mrr_at_1000 value: 70.47 - type: mrr_at_3 value: 68.164 - type: mrr_at_5 value: 69.465 - type: ndcg_at_1 value: 58.041 - type: ndcg_at_10 value: 71.84700000000001 - type: ndcg_at_100 value: 73.699 - type: ndcg_at_1000 value: 74.06700000000001 - type: ndcg_at_3 value: 67.855 - type: ndcg_at_5 value: 70.203 - type: precision_at_1 value: 58.041 - type: precision_at_10 value: 9.427000000000001 - type: precision_at_100 value: 1.049 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 27.278000000000002 - type: precision_at_5 value: 17.693 - type: recall_at_1 value: 53.882 - type: recall_at_10 value: 85.99 - type: recall_at_100 value: 94.09100000000001 - type: recall_at_1000 value: 96.612 - type: recall_at_3 value: 75.25 - type: recall_at_5 value: 80.997 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 19.165 - type: map_at_10 value: 31.845000000000002 - type: map_at_100 value: 33.678999999999995 - type: map_at_1000 value: 33.878 - type: map_at_3 value: 27.881 - type: map_at_5 value: 30.049999999999997 - type: mrr_at_1 value: 38.272 - type: mrr_at_10 value: 47.04 - type: mrr_at_100 value: 47.923 - type: mrr_at_1000 value: 47.973 - type: mrr_at_3 value: 44.985 - type: mrr_at_5 value: 46.150000000000006 - type: ndcg_at_1 value: 38.272 - type: ndcg_at_10 value: 39.177 - type: ndcg_at_100 value: 45.995000000000005 - type: ndcg_at_1000 value: 49.312 - type: ndcg_at_3 value: 36.135 - type: ndcg_at_5 value: 36.936 - type: precision_at_1 value: 38.272 - type: precision_at_10 value: 10.926 - type: precision_at_100 value: 1.809 - type: precision_at_1000 value: 0.23700000000000002 - type: precision_at_3 value: 24.331 - type: precision_at_5 value: 17.747 - type: recall_at_1 value: 19.165 - type: recall_at_10 value: 45.103 - type: recall_at_100 value: 70.295 - type: recall_at_1000 value: 90.592 - type: recall_at_3 value: 32.832 - type: recall_at_5 value: 37.905 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 32.397 - type: map_at_10 value: 44.83 - type: map_at_100 value: 45.716 - type: map_at_1000 value: 45.797 - type: map_at_3 value: 41.955999999999996 - type: map_at_5 value: 43.736999999999995 - type: mrr_at_1 value: 64.794 - type: mrr_at_10 value: 71.866 - type: mrr_at_100 value: 72.22 - type: mrr_at_1000 value: 72.238 - type: mrr_at_3 value: 70.416 - type: mrr_at_5 value: 71.304 - type: ndcg_at_1 value: 64.794 - type: ndcg_at_10 value: 54.186 - type: ndcg_at_100 
value: 57.623000000000005 - type: ndcg_at_1000 value: 59.302 - type: ndcg_at_3 value: 49.703 - type: ndcg_at_5 value: 52.154999999999994 - type: precision_at_1 value: 64.794 - type: precision_at_10 value: 11.219 - type: precision_at_100 value: 1.394 - type: precision_at_1000 value: 0.16199999999999998 - type: precision_at_3 value: 30.767 - type: precision_at_5 value: 20.397000000000002 - type: recall_at_1 value: 32.397 - type: recall_at_10 value: 56.096999999999994 - type: recall_at_100 value: 69.696 - type: recall_at_1000 value: 80.88499999999999 - type: recall_at_3 value: 46.150999999999996 - type: recall_at_5 value: 50.993 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 81.1744 - type: ap value: 75.44973697032414 - type: f1 value: 81.09901117955782 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 19.519000000000002 - type: map_at_10 value: 31.025000000000002 - type: map_at_100 value: 32.275999999999996 - type: map_at_1000 value: 32.329 - type: map_at_3 value: 27.132 - type: map_at_5 value: 29.415999999999997 - type: mrr_at_1 value: 20.115 - type: mrr_at_10 value: 31.569000000000003 - type: mrr_at_100 value: 32.768 - type: mrr_at_1000 value: 32.816 - type: mrr_at_3 value: 27.748 - type: mrr_at_5 value: 29.956 - type: ndcg_at_1 value: 20.115 - type: ndcg_at_10 value: 37.756 - type: ndcg_at_100 value: 43.858000000000004 - type: ndcg_at_1000 value: 45.199 - type: ndcg_at_3 value: 29.818 - type: ndcg_at_5 value: 33.875 - type: precision_at_1 value: 20.115 - type: precision_at_10 value: 6.122 - type: precision_at_100 value: 0.919 - type: precision_at_1000 value: 0.10300000000000001 - type: precision_at_3 value: 12.794 - type: precision_at_5 value: 9.731 - type: recall_at_1 value: 19.519000000000002 - type: recall_at_10 value: 58.62500000000001 - type: recall_at_100 value: 86.99 - type: recall_at_1000 value: 97.268 - type: recall_at_3 value: 37.002 - type: recall_at_5 value: 46.778 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.71865025079799 - type: f1 value: 93.38906173610519 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 70.2576379388965 - type: f1 value: 49.20405830249464 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.48486886348351 - type: f1 value: 64.92199176095157 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 72.59246805648958 - type: f1 value: 72.1222026389164 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 30.887642595096825 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: 
mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 28.3764418784054 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.81544126336991 - type: mrr value: 32.82666576268031 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.185 - type: map_at_10 value: 11.158 - type: map_at_100 value: 14.041 - type: map_at_1000 value: 15.360999999999999 - type: map_at_3 value: 8.417 - type: map_at_5 value: 9.378 - type: mrr_at_1 value: 44.582 - type: mrr_at_10 value: 53.083999999999996 - type: mrr_at_100 value: 53.787 - type: mrr_at_1000 value: 53.824000000000005 - type: mrr_at_3 value: 51.187000000000005 - type: mrr_at_5 value: 52.379 - type: ndcg_at_1 value: 42.57 - type: ndcg_at_10 value: 31.593 - type: ndcg_at_100 value: 29.093999999999998 - type: ndcg_at_1000 value: 37.909 - type: ndcg_at_3 value: 37.083 - type: ndcg_at_5 value: 34.397 - type: precision_at_1 value: 43.963 - type: precision_at_10 value: 23.498 - type: precision_at_100 value: 7.6160000000000005 - type: precision_at_1000 value: 2.032 - type: precision_at_3 value: 34.572 - type: precision_at_5 value: 29.412 - type: recall_at_1 value: 5.185 - type: recall_at_10 value: 15.234 - type: recall_at_100 value: 29.49 - type: recall_at_1000 value: 62.273999999999994 - type: recall_at_3 value: 9.55 - type: recall_at_5 value: 11.103 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 23.803 - type: map_at_10 value: 38.183 - type: map_at_100 value: 39.421 - type: map_at_1000 value: 39.464 - type: map_at_3 value: 33.835 - type: map_at_5 value: 36.327 - type: mrr_at_1 value: 26.68 - type: mrr_at_10 value: 40.439 - type: mrr_at_100 value: 41.415 - type: mrr_at_1000 value: 41.443999999999996 - type: mrr_at_3 value: 36.612 - type: mrr_at_5 value: 38.877 - type: ndcg_at_1 value: 26.68 - type: ndcg_at_10 value: 45.882 - type: ndcg_at_100 value: 51.227999999999994 - type: ndcg_at_1000 value: 52.207 - type: ndcg_at_3 value: 37.511 - type: ndcg_at_5 value: 41.749 - type: precision_at_1 value: 26.68 - type: precision_at_10 value: 7.9750000000000005 - type: precision_at_100 value: 1.0959999999999999 - type: precision_at_1000 value: 0.11900000000000001 - type: precision_at_3 value: 17.449 - type: precision_at_5 value: 12.897 - type: recall_at_1 value: 23.803 - type: recall_at_10 value: 67.152 - type: recall_at_100 value: 90.522 - type: recall_at_1000 value: 97.743 - type: recall_at_3 value: 45.338 - type: recall_at_5 value: 55.106 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.473 - type: map_at_10 value: 84.452 - type: map_at_100 value: 85.101 - type: map_at_1000 value: 85.115 - type: map_at_3 value: 81.435 - type: map_at_5 value: 83.338 - type: mrr_at_1 value: 81.19 - type: mrr_at_10 value: 87.324 - type: mrr_at_100 value: 87.434 - type: mrr_at_1000 value: 87.435 - type: mrr_at_3 value: 86.31 - type: mrr_at_5 value: 87.002 - type: ndcg_at_1 value: 81.21000000000001 - type: ndcg_at_10 value: 88.19 - type: ndcg_at_100 value: 89.44 - type: ndcg_at_1000 value: 89.526 - type: ndcg_at_3 value: 85.237 - type: ndcg_at_5 value: 86.892 - type: precision_at_1 value: 
81.21000000000001 - type: precision_at_10 value: 13.417000000000002 - type: precision_at_100 value: 1.537 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.31 - type: precision_at_5 value: 24.59 - type: recall_at_1 value: 70.473 - type: recall_at_10 value: 95.367 - type: recall_at_100 value: 99.616 - type: recall_at_1000 value: 99.996 - type: recall_at_3 value: 86.936 - type: recall_at_5 value: 91.557 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 59.25776525253911 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 63.22135271663078 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.003 - type: map_at_10 value: 10.062999999999999 - type: map_at_100 value: 11.854000000000001 - type: map_at_1000 value: 12.145999999999999 - type: map_at_3 value: 7.242 - type: map_at_5 value: 8.652999999999999 - type: mrr_at_1 value: 19.7 - type: mrr_at_10 value: 29.721999999999998 - type: mrr_at_100 value: 30.867 - type: mrr_at_1000 value: 30.944 - type: mrr_at_3 value: 26.683 - type: mrr_at_5 value: 28.498 - type: ndcg_at_1 value: 19.7 - type: ndcg_at_10 value: 17.095 - type: ndcg_at_100 value: 24.375 - type: ndcg_at_1000 value: 29.831000000000003 - type: ndcg_at_3 value: 16.305 - type: ndcg_at_5 value: 14.291 - type: precision_at_1 value: 19.7 - type: precision_at_10 value: 8.799999999999999 - type: precision_at_100 value: 1.9349999999999998 - type: precision_at_1000 value: 0.32399999999999995 - type: precision_at_3 value: 15.2 - type: precision_at_5 value: 12.540000000000001 - type: recall_at_1 value: 4.003 - type: recall_at_10 value: 17.877000000000002 - type: recall_at_100 value: 39.217 - type: recall_at_1000 value: 65.862 - type: recall_at_3 value: 9.242 - type: recall_at_5 value: 12.715000000000002 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_spearman value: 80.25888668589654 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_spearman value: 77.02037527837669 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_spearman value: 86.58432681008449 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_spearman value: 81.31697756099051 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_spearman value: 88.18867599667057 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_spearman value: 84.87853941747623 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: 
cos_sim_spearman value: 89.46479925383916 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_spearman value: 66.45272113649146 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_spearman value: 86.43357313527851 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 78.82761687254882 - type: mrr value: 93.46223674655047 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 44.583 - type: map_at_10 value: 52.978 - type: map_at_100 value: 53.803 - type: map_at_1000 value: 53.839999999999996 - type: map_at_3 value: 50.03300000000001 - type: map_at_5 value: 51.939 - type: mrr_at_1 value: 47.0 - type: mrr_at_10 value: 54.730000000000004 - type: mrr_at_100 value: 55.31399999999999 - type: mrr_at_1000 value: 55.346 - type: mrr_at_3 value: 52.0 - type: mrr_at_5 value: 53.783 - type: ndcg_at_1 value: 47.0 - type: ndcg_at_10 value: 57.82899999999999 - type: ndcg_at_100 value: 61.49400000000001 - type: ndcg_at_1000 value: 62.676 - type: ndcg_at_3 value: 52.373000000000005 - type: ndcg_at_5 value: 55.481 - type: precision_at_1 value: 47.0 - type: precision_at_10 value: 7.867 - type: precision_at_100 value: 0.997 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 20.556 - type: precision_at_5 value: 14.066999999999998 - type: recall_at_1 value: 44.583 - type: recall_at_10 value: 71.172 - type: recall_at_100 value: 87.7 - type: recall_at_1000 value: 97.333 - type: recall_at_3 value: 56.511 - type: recall_at_5 value: 64.206 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.66237623762376 - type: cos_sim_ap value: 90.35465126226322 - type: cos_sim_f1 value: 82.44575936883628 - type: cos_sim_precision value: 81.32295719844358 - type: cos_sim_recall value: 83.6 - type: dot_accuracy value: 99.66237623762376 - type: dot_ap value: 90.35464287920453 - type: dot_f1 value: 82.44575936883628 - type: dot_precision value: 81.32295719844358 - type: dot_recall value: 83.6 - type: euclidean_accuracy value: 99.66237623762376 - type: euclidean_ap value: 90.3546512622632 - type: euclidean_f1 value: 82.44575936883628 - type: euclidean_precision value: 81.32295719844358 - type: euclidean_recall value: 83.6 - type: manhattan_accuracy value: 99.65940594059406 - type: manhattan_ap value: 90.29220174849843 - type: manhattan_f1 value: 82.4987605354487 - type: manhattan_precision value: 81.80924287118977 - type: manhattan_recall value: 83.2 - type: max_accuracy value: 99.66237623762376 - type: max_ap value: 90.35465126226322 - type: max_f1 value: 82.4987605354487 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 65.0394225901397 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test 
revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.27954189859326 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 50.99055979974896 - type: mrr value: 51.82745257193787 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.21655465344237 - type: cos_sim_spearman value: 29.853205339630172 - type: dot_pearson value: 30.216540628083564 - type: dot_spearman value: 29.868978894753027 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.2 - type: map_at_10 value: 1.398 - type: map_at_100 value: 7.406 - type: map_at_1000 value: 18.401 - type: map_at_3 value: 0.479 - type: map_at_5 value: 0.772 - type: mrr_at_1 value: 70.0 - type: mrr_at_10 value: 79.25999999999999 - type: mrr_at_100 value: 79.25999999999999 - type: mrr_at_1000 value: 79.25999999999999 - type: mrr_at_3 value: 77.333 - type: mrr_at_5 value: 78.133 - type: ndcg_at_1 value: 63.0 - type: ndcg_at_10 value: 58.548 - type: ndcg_at_100 value: 45.216 - type: ndcg_at_1000 value: 41.149 - type: ndcg_at_3 value: 60.641999999999996 - type: ndcg_at_5 value: 61.135 - type: precision_at_1 value: 70.0 - type: precision_at_10 value: 64.0 - type: precision_at_100 value: 46.92 - type: precision_at_1000 value: 18.642 - type: precision_at_3 value: 64.667 - type: precision_at_5 value: 66.4 - type: recall_at_1 value: 0.2 - type: recall_at_10 value: 1.6729999999999998 - type: recall_at_100 value: 10.856 - type: recall_at_1000 value: 38.964999999999996 - type: recall_at_3 value: 0.504 - type: recall_at_5 value: 0.852 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 1.6629999999999998 - type: map_at_10 value: 8.601 - type: map_at_100 value: 14.354 - type: map_at_1000 value: 15.927 - type: map_at_3 value: 4.1930000000000005 - type: map_at_5 value: 5.655 - type: mrr_at_1 value: 18.367 - type: mrr_at_10 value: 34.466 - type: mrr_at_100 value: 35.235 - type: mrr_at_1000 value: 35.27 - type: mrr_at_3 value: 28.571 - type: mrr_at_5 value: 31.531 - type: ndcg_at_1 value: 14.285999999999998 - type: ndcg_at_10 value: 20.374 - type: ndcg_at_100 value: 33.532000000000004 - type: ndcg_at_1000 value: 45.561 - type: ndcg_at_3 value: 18.442 - type: ndcg_at_5 value: 18.076 - type: precision_at_1 value: 18.367 - type: precision_at_10 value: 20.204 - type: precision_at_100 value: 7.489999999999999 - type: precision_at_1000 value: 1.5630000000000002 - type: precision_at_3 value: 21.769 - type: precision_at_5 value: 20.408 - type: recall_at_1 value: 1.6629999999999998 - type: recall_at_10 value: 15.549 - type: recall_at_100 value: 47.497 - type: recall_at_1000 value: 84.524 - type: recall_at_3 value: 5.289 - type: recall_at_5 value: 8.035 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.8194 - type: ap value: 14.447702451658554 - type: f1 value: 55.13659412856185 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification 
type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 63.310696095076416 - type: f1 value: 63.360434851097814 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 51.30677907335145 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.12386004649221 - type: cos_sim_ap value: 73.99096426215495 - type: cos_sim_f1 value: 68.18416968442834 - type: cos_sim_precision value: 66.86960933536275 - type: cos_sim_recall value: 69.55145118733509 - type: dot_accuracy value: 86.12386004649221 - type: dot_ap value: 73.99096813038672 - type: dot_f1 value: 68.18416968442834 - type: dot_precision value: 66.86960933536275 - type: dot_recall value: 69.55145118733509 - type: euclidean_accuracy value: 86.12386004649221 - type: euclidean_ap value: 73.99095984980165 - type: euclidean_f1 value: 68.18416968442834 - type: euclidean_precision value: 66.86960933536275 - type: euclidean_recall value: 69.55145118733509 - type: manhattan_accuracy value: 86.09405734040651 - type: manhattan_ap value: 73.96825745608601 - type: manhattan_f1 value: 68.13888179729383 - type: manhattan_precision value: 65.99901088031652 - type: manhattan_recall value: 70.42216358839049 - type: max_accuracy value: 86.12386004649221 - type: max_ap value: 73.99096813038672 - type: max_f1 value: 68.18416968442834 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.99367407924865 - type: cos_sim_ap value: 86.19720829843081 - type: cos_sim_f1 value: 78.39889075384951 - type: cos_sim_precision value: 74.5110278818144 - type: cos_sim_recall value: 82.71481367416075 - type: dot_accuracy value: 88.99367407924865 - type: dot_ap value: 86.19718471454047 - type: dot_f1 value: 78.39889075384951 - type: dot_precision value: 74.5110278818144 - type: dot_recall value: 82.71481367416075 - type: euclidean_accuracy value: 88.99367407924865 - type: euclidean_ap value: 86.1972021422436 - type: euclidean_f1 value: 78.39889075384951 - type: euclidean_precision value: 74.5110278818144 - type: euclidean_recall value: 82.71481367416075 - type: manhattan_accuracy value: 88.95680521597392 - type: manhattan_ap value: 86.16659921351506 - type: manhattan_f1 value: 78.39125971550081 - type: manhattan_precision value: 74.82502799552073 - type: manhattan_recall value: 82.31444410224823 - type: max_accuracy value: 88.99367407924865 - type: max_ap value: 86.19720829843081 - type: max_f1 value: 78.39889075384951
---

# hkunlp/instructor-base

We introduce **Instructor**👨‍🏫, an instruction-finetuned text embedding model that can generate text embeddings tailored to any task (e.g., classification, retrieval, clustering, text evaluation, etc.) and domains (e.g., science, finance, etc.) ***by simply providing the task instruction, without any finetuning***. Instructor achieves state-of-the-art results on 70 diverse embedding tasks! The model is easy to use with **our customized** `sentence-transformer` library.
For more details, check out [our paper](https://arxiv.org/abs/2212.09741) and [project page](https://instructor-embedding.github.io/)!

**************************** **Updates** ****************************

* 01/21: We released a new [checkpoint](https://huggingface.co/hkunlp/instructor-base) trained with hard negatives, which gives better performance.
* 12/21: We released our [paper](https://arxiv.org/abs/2212.09741), [code](https://github.com/HKUNLP/instructor-embedding), [checkpoint](https://huggingface.co/hkunlp/instructor-base) and [project page](https://instructor-embedding.github.io/)! Check them out!

## Quick start
<hr />

## Installation
```bash
pip install InstructorEmbedding
```

## Compute your customized embeddings
Then you can use the model like this to calculate domain-specific and task-aware embeddings:
```python
from InstructorEmbedding import INSTRUCTOR

model = INSTRUCTOR('hkunlp/instructor-base')
sentence = "3D ActionSLAM: wearable person tracking in multi-floor environments"
instruction = "Represent the Science title:"
embeddings = model.encode([[instruction, sentence]])
print(embeddings)
```

## Use cases
<hr />

## Calculate embeddings for your customized texts
If you want to calculate customized embeddings for specific sentences, you may follow the unified template to write instructions:

> Represent the `domain` `text_type` for `task_objective`:

* `domain` is optional, and it specifies the domain of the text, e.g., science, finance, medicine, etc.
* `text_type` is required, and it specifies the encoding unit, e.g., sentence, document, paragraph, etc.
* `task_objective` is optional, and it specifies the objective of embedding, e.g., retrieve a document, classify the sentence, etc.
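To make the unified template concrete, below is a minimal sketch of a helper that assembles an instruction string from these three fields. The `build_instruction` helper is our own illustration, not part of the `InstructorEmbedding` package.

```python
from InstructorEmbedding import INSTRUCTOR

def build_instruction(text_type, domain=None, task_objective=None):
    # Unified template: "Represent the [domain] [text_type] for [task_objective]:"
    # `text_type` is required; `domain` and `task_objective` are optional.
    parts = ["Represent the"]
    if domain:
        parts.append(domain)
    parts.append(text_type)
    if task_objective:
        parts.append(f"for {task_objective}")
    return " ".join(parts) + ":"

model = INSTRUCTOR('hkunlp/instructor-base')
instruction = build_instruction("document", domain="Science", task_objective="retrieval")
# instruction == "Represent the Science document for retrieval:"
embeddings = model.encode([[instruction, "Parton energy loss in QCD matter"]])
print(embeddings.shape)
```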
## Calculate Sentence similarities
You can further use the model to compute similarities between two groups of sentences, with **customized embeddings**.
```python
from InstructorEmbedding import INSTRUCTOR
from sklearn.metrics.pairwise import cosine_similarity

# load the model; the retrieval and clustering snippets below reuse this instance
model = INSTRUCTOR('hkunlp/instructor-base')

sentences_a = [['Represent the Science sentence: ', 'Parton energy loss in QCD matter'],
               ['Represent the Financial statement: ', 'The Federal Reserve on Wednesday raised its benchmark interest rate.']]
sentences_b = [['Represent the Science sentence: ', 'The Chiral Phase Transition in Dissipative Dynamics'],
               ['Represent the Financial statement: ', 'The funds rose less than 0.5 per cent on Friday']]
embeddings_a = model.encode(sentences_a)
embeddings_b = model.encode(sentences_b)
similarities = cosine_similarity(embeddings_a, embeddings_b)
print(similarities)
```

## Information Retrieval
You can also use **customized embeddings** for information retrieval (a top-k variant of this example is sketched after the Clustering section below).
```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# `model` is the INSTRUCTOR instance loaded in the snippet above
query = [['Represent the Wikipedia question for retrieving supporting documents: ', 'where is the food stored in a yam plant']]
corpus = [['Represent the Wikipedia document for retrieval: ', 'Capitalism has been dominant in the Western world since the end of feudalism, but most feel[who?] that the term "mixed economies" more precisely describes most contemporary economies, due to their containing both private-owned and state-owned enterprises. In capitalism, prices determine the demand-supply scale. For example, higher demand for certain goods and services lead to higher prices and lower demand for certain goods lead to lower prices.'],
          ['Represent the Wikipedia document for retrieval: ', "The disparate impact theory is especially controversial under the Fair Housing Act because the Act regulates many activities relating to housing, insurance, and mortgage loans—and some scholars have argued that the theory's use under the Fair Housing Act, combined with extensions of the Community Reinvestment Act, contributed to rise of sub-prime lending and the crash of the U.S. housing market and ensuing global economic recession"],
          ['Represent the Wikipedia document for retrieval: ', 'Disparate impact in United States labor law refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral. Although the protected classes vary by statute, most federal civil rights laws protect based on race, color, religion, national origin, and sex as protected traits, and some laws include disability status and other traits as well.']]
query_embeddings = model.encode(query)
corpus_embeddings = model.encode(corpus)
similarities = cosine_similarity(query_embeddings, corpus_embeddings)
retrieved_doc_id = np.argmax(similarities)
print(retrieved_doc_id)
```

## Clustering
Use **customized embeddings** for clustering texts in groups.
```python
import sklearn.cluster

# `model` is the INSTRUCTOR instance loaded in the similarity snippet above
sentences = [['Represent the Medicine sentence for clustering: ', 'Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity'],
             ['Represent the Medicine sentence for clustering: ', 'Comparison of Atmospheric Neutrino Flux Calculations at Low Energies'],
             ['Represent the Medicine sentence for clustering: ', 'Fermion Bags in the Massive Gross-Neveu Model'],
             ['Represent the Medicine sentence for clustering: ', "QCD corrections to Associated t-tbar-H production at the Tevatron"],
             ['Represent the Medicine sentence for clustering: ', 'A New Analysis of the R Measurements: Resonance Parameters of the Higher, Vector States of Charmonium']]
embeddings = model.encode(sentences)
clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2)
clustering_model.fit(embeddings)
cluster_assignment = clustering_model.labels_
print(cluster_assignment)
```
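If you need the best k documents per query rather than the single best match, the retrieval example above extends naturally. The following is a minimal sketch assuming the `similarities` matrix from that example is in scope; `top_k` is an illustrative name, not an InstructorEmbedding parameter.

```python
import numpy as np

# `similarities` has shape (n_queries, n_docs); negate it so that argsort
# yields indices in descending order of similarity, then keep the first
# `top_k` columns of each query row.
top_k = 2
top_ids = np.argsort(-similarities, axis=1)[:, :top_k]
print(top_ids)  # one row of ranked document ids per query
```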
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
carsondial/slinger20241231-1
carsondial
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:45000", "loss:MatryoshkaLoss", "loss:MultipleNegativesRankingLoss", "en", "arxiv:1908.10084", "arxiv:2205.13147", "arxiv:1705.00652", "base_model:BAAI/bge-base-en-v1.5", "base_model:finetune:BAAI/bge-base-en-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,735
1,735
5
0
--- base_model: BAAI/bge-base-en-v1.5 language: - en library_name: sentence-transformers license: apache-2.0 metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:45000 - loss:MatryoshkaLoss - loss:MultipleNegativesRankingLoss widget: - source_sentence: 'Remote sales job with CRM experience Business development specialist job description Commission-based sales role with flexible schedule' sentences: - 'Marketing Plans & Research Insight to Inform your Strategy Your buyer is our preoccupation. So when we work together to develop plans for new marketing or product launches, we help you research how your buyers discover and decide about solutions like yours. We also help you align overall company goals such as revenue growth and customer retention with more specific marketing objectives. This yields action oriented, measurable, flexible plans that are aligned to results that matter to everyone in the organization. - Marketing / Promotion Plan Development - Vertical Market Research - Competitor Research - Persona Development Let''s Make Something Great Together!Contact Us We capture internal and external perceptions from stakeholders, customers and prospects. To understand the market and predicted trends, we incorporate relevant 3rd party research into the shaping of any strategy. Analysis and insight We provide highly relevant, easy to implement knowledge to help further important marketing decisions.' - 'Handpicked Sales and Hot New Products for Busy Women Building your Summer Wardrobe with the Basics Posts contain affiliate links where I earn a small amount commission on purchases through links. This post is sponsored by JCPenney but all thoughts are my own! Summer is here and that means it is time for a little wardrobe refresh! One thing I am doing this year is buying basic pieces that I can mix, match and style with others. I teamed up with JCPenney for this post to show you that JCPenney is a one stop destination for affordable fashion and home items.' - 'Hello, Business Developers! My name is Nate Ginsburg and I am a serial entrepreneur. I’m the founder of Premier Media, host of the Ecom Exits Podcast, and partner in a handful of other online businesses. About the role: As a Business Development Specialist, with a strong background in sales, you will be responsible for identifying, pursuing, and nurturing business opportunities, fostering strong client relationships, and expanding our client base. This role plays a crucial part in achieving our business objectives by driving revenue growth and ensuring we meet our client’s talent acquisition needs. This is a Remote Part-Time or Full-Time (commission-based) position. - Bachelor’s degree in business, marketing, or a related field (preferred). - Previous experience in a recruitment agency or the human resources industry, with a demonstrated track record of successful sales and client acquisition. - Should be able to work independently, from acquiring clients to managing relationships with them, ensuring their requirements are met by the back-end team. - Should be able to generate leads independently and attend client meetings. 
- Excellent communication and interpersonal skills. - Strong negotiation and persuasion abilities. - Self-motivated and results-oriented with a high degree of initiative. - Ability to work independently and as part of a collaborative team. - Proficiency in using CRM software and other relevant sales and marketing tools. - Identify and engage with potential clients to understand their staffing and recruitment requirements. - Develop and maintain a comprehensive understanding of the industries and sectors in which the agency operates. - Build and maintain a robust sales pipeline to meet and exceed monthly, quarterly, and annual revenue targets. - Collaborate with our recruitment team to customize solutions that align with client needs and deliver value-added services. - Prepare and deliver compelling sales presentations to prospective clients. - Negotiate and finalize client agreements, contracts, and service-level agreements. - Stay up-to-date on industry trends, market dynamics, and competitive landscape. - Provide regular reports and updates to the management team on business development progress and client acquisition efforts. - Work with an existing talented team - Completely remote and flexible schedule - Location independence - You’ll be joining a high-level and fast-paced team, working with exciting projects Please note that you’ll be joining a fast-paced and high-achieving team. You’ll be held to high expectations and challenged. And you’ll also be supported in your growth in this role professionally as well as personally. If this sounds interesting to you, please proceed with the application.' - source_sentence: 'Dock-it Springfield OH contact information Dock-it business services Springfield Ohio Anthony Young Dock-it phone number' sentences: - 'VMware testing servers-by-the-minute cloud vCloud Air joins the come-and-get-it cloud caper Journalists who cover cloud have a recurring nightmare: leaving the “l” out of “public cloud” and copping a caning in the comments for their somewhat Freudian slip. But one vendor that operates a cloud that isn''t entirely public is VMware: Virtzilla''s vCloud Air has hitherto been sold as a by-the-month affair. The company has, however, signalled it''s keen to get into the “hand me your credit card and run up virtual machines for an hour” caper, and has now announced that it''s testing just such a service. You – yes even YOU – can apply to be a test pilot by signing up here. VMware''s calling this vCloud air service “self-provision on-demand” for now, and promises access with “nothing more than a browser and a credit card.” Usage will be metered by the minute and billed monthly in arrears. The company''s talking up the usual vision of vSphere and vCenter everywhere, for a One Control Freak To Rule Them All rig that behaves the same whether servers are under your desk or on VMware''s largely-rented bit barns. The announcement of the test pilot program is noteworthy because VMware''s talking up the chance to access “a virtually unlimited pool of resources.” That''s a bold statement because compared to the likes of AWS, Google and Microsoft, VMware doesn''t have enormous amounts of money to spend on bit barns. With even the likes of Rackspace walking away from come-and-get-it public cloud, VMware heading in that direction bespeaks either cunning ways of scaling or a big reservoir of bucks to throw at the market. ®' - 'Click-Through Rate (CTR) is a simple metric that shows that rate at which an ad impression receives a click. 
It’s a core measurement in Search Engine Marketing and yet, slightly misunderstood on the surface. Let’s look at this major ad performance metric. What is Click-Through Rate? Whenever and however you generate traffic to your app or website — be it through any channel (social, organic, referral, paid, display, email) — your links, banners, or ads will have impressions (views) and clicks. Impressions are the number of times a link or ad was seen, or rather available on the screen to be seen. Clicks are, well, people who clicked the link or ad. Click-Through Rate is the percentage of people who saw the impression and clicked it. A little fictional case study to better understand CTR To better understand CTR, let’s look at an example of a ‘Search Network’ campaign in Google Adwords. Since I live in wine country, we’ll be using my, fictional, Milani Winery. First, a little context on how Google Adwords works. In general, ‘search network’ campaigns through Adwords charge on a cost-per-click (aka. pay-per-click) model. Essentially, you’re showing ads at the top of Google search results pages for targeted keyword searches. You determine how to show your impressions based on factors like geotarget, keyword selection and keyword match type. The larger the geotarget and the broader the keyword, the larger the pool and likely the lower the Click-Through Rate. The tighter the geotarget and more specific the keyword (think niche), the more targeted and smaller the pool and thus a likely higher CTR. Branding campaigns usually generate a lot of impressions to a wide audience, so expect a lower CTR. A Call-to-Action campaign will be more targeted and you’ll want to hit a higher CTR. Adwords & CTR: Winery Example (while I am related to the Sebastiani’s and Mondavi’s, my winery is fictional…for now) Milani Winery, runs an Adwords campaign to drive visits to their tasting room in Northern California. They run three ad groups in a campaign targeting California. The first ad group is titled “wine,” and the keywords are ‘broad’ matched to the term “wine”. The second ad group is titled “winery,” and targets keywords using broad match modifiers, phrases, and exact match keywords. Finally, the third ad group in this campaign is titled “tasting room,” and keywords are phrase match and exact match only. Which do you think will have the highest Click-Through Rate? Keywords for Ad Group 1: “Wine” Keywords for Ad Group 2: “Winery” Keywords for Ad Group 3: “Tasting Room” [tasting room near me] [near me tasting room] +best +tasting +room +tasting +room +visit +tasting +room location Ad group 1: “Wine” is just too broad and would generate the lowest Click-Through Rate (probably around a 0.1%-0.7%). It would return results for things like “Red Red Wine” or “How to remove wine stains” — both of which aren’t relevant. Google would likely decrease your Quality Score, because of the broadness of the term, and your ads probably would show much of the time. Ad Group 2: “Winery” is getting closer. Winery is a good search term for increasing traffic to a tasting room, but it’s not a bulls-eye match to your target audience. Your CTR is likely to be higher here, but not that much more significant. I’d probably estimate a Click-Through Rate of around 1%-2.5%. Advertising for this keyword could be seen more as a brand awareness campaign. Ad group 3: “Tasting Room” is just about right! People searching for tasting rooms are likely going to see your ad and find it relevant, especially if your ad is enticing or delivering an offer. 
At this level of targeting expect to see upwards of 5-6% and up to maybe 8% in CTR. The volume might be smaller but the relevancy and quality is high! Anything higher than 10% and you are targeted your audience at level of CTR that is usually found when targeting your brand keyword terms, like “Milani Winery.” Great Merlot btw! * Note, CTR can vary quite a bit depending on bid, landing page quality, and ad copy. So, my estimates above assume you have great quality in those three categories. The Click-Through Rate Deception We’ve now seen an example of CTR in action — targeting broad and delivering low CTR, versus tactical targeting generating high CTR. Now we need to talk about when the number can lie and why it’s important pair CTR with a quality metric. I was once handed an account to optimize that had been running for over a year. The previous company had been delivering a very low Cost-Per-Click and a high Click-Through Rate. Sounds great on the surface, but as I dove into the data I found that they had delivered 98% of the previous years’ clicks and budget on a single broad match term that was included in their brand. The keyword was also synonymous with the name of a county. To top it off, they were not even measuring the quality of visits. Over that previous year, they likely blew $60,000 on irrelevant CTR — even though they had great “ad performance” numbers. Don’t always believe great ad performance metrics without measuring quality, or viewing actual search terms. A performance without quality metrics is the really the pitfall of Adwords in general. You must have some quality metrics in there that you are tracking. Google Analytics by default will import Bounce Rate, Ave. Session Duration, Pages Per Session, and % of New Sessions. You should also import your goals from GA to best optimize your keyword CTA’s. If for some reason, you can’t access Google Analytics and quality metrics, then you can still get an idea of the quality by looking at the actual search term data report on the keyword tab of Adwords. That will at least tell you if you are hitting a close target. Don’t want to manage Adwords yourself? Contact me' - 'Dock-it is a Business Services (Unclassified) business in Springfield, OH. Dock-it classified under Business Services (Unclassified), with principal employer Anthony Young Full Name Report is located in 1938 Broadway, Springfield, Ohio OH 45504. For sales, support, account inquiries, and how to be an affiliate, the best way to get in touch is through numbers: (937) 340-1110 Full Phone Report. It has been operating since , boasting total quality assurance, and an annual revenue of Unknown. Their Single Location can be reached with the use of the following coordinates: 39.93067,-83.84218. They currently have 1 to 4 employees, and you can learn more about their offerings at . You can also view their clients’ testimonials, government compliance and annual reports through their website. Dock-it aims to strengthen their B2B relationships through advertisements and brand promotion. Registered codes from Standard Industrial Classification is , and from North American Industrial Classification System. Customer feedback is highly appreciated; be sure to leave your comments on their website through survey forms. These are necessary for company service improvements. 
|Categorized In:||Business Services (Unclassified)| |Address:||1938 Broadway, Springfield, OH 45504| |Phone Number:||(937) 340-1110 Full Phone Report| |Contact Person:||Anthony Young Full Name Report| |Business Type:||B2B (Business to Business)| |Employee #:||1 to 4| |Location Type:||Single Location| |Annual Revenue ($):||Unknown| |Share This Business:|' - source_sentence: 'Is Stylelili.com a scam? Why is Stylelili.com suspicious? Is Stylelili.com a legitimate online shopping platform?' sentences: - 'By reviewing journal entries, it has been noticed that the base currency has a mismatch with the foreign rate, appearing to be incorrectly calculated. Example:Account Currency balance in USD is $1,115,354.018 and showing as CAD $1,085,098.1719; but it should be $1,429,941.04871 CAD at a current system exchange rate of 1 CAD = 0.78 USD. Most of the exchange rates in the sample are between 0.70 and 1. There are no transactions with an exchange rate greater than 1. So how is it showing a balance greater than Base currency if there are no transactions with an exchange rate greater than 1? As the calculation is done in a running balance and the transaction for the debit that have 1 as transaction rate have more occurrences compared to credit one, it caused the balance to be accumulated throughout the year. See in this given example: Everest kept record for the transaction for each currency, base and foreign, and for each of these their own running balance where in the case for this account, the carried over amount have accumulated, causing the odd increase in the difference. In example given the ending balance are 50 for base currency while in the foreign currency it is 80, even though the last entry have 0.70 as exchange rate, the running balance rate was 1.60 in actual. By default, Everest keep track of the separate amount for each of currency, one for base currency and the other one is foreign currency these caused an increase in the difference. In order to reset this behavior, a suggestion is to clear out the balance by creating two journal that uses one journal entry out from account and one journal in entry, as an example please find the screenshots below that show switched rate with base currency of USD to CAD as foreign currency, with the current balance for the CAD account is CAD 70.00. Credit: Raditya Perdevi' - 'In the current landscape of abundant online shopping options, consumers have a plethora of choices. However, these numerous options also come with a hidden peril: the presence of scammers posing as genuine businesses. Consequently, when evaluating a website’s legitimacy, a tool like Gurustab becomes essential. This leads us to the question of whether Stylelili.com is a reputable entity or a fraudulent operation. Stylelili.com operates as an e-commerce platform, offering a range of products from sweaters to hoodies and shirts. Yet, with its prominence as one of the most frequently searched websites on Gurustab, a comprehensive assessment was imperative. We regularly share our list of the most searched sites on Twitter, providing tips on avoiding scams. Dubious Registration Date on Stylelili.com It’s a common tactic employed by unscrupulous websites to exaggerate their establishment date. Why? Because consumers are more inclined to trust older online shops over newer ones. Consequently, when we encountered their claim on the “About Us” page indicating a founding year of 2016, it raised suspicion. 
However, upon cross-referencing with Gurustab and Whois.com, we were startled to discover that the website was registered as recently as July 2023. This inconsistency casts a shadow of doubt on the legitimacy of Stylelili.com. Significant Discounts and Remarkably Low Prices One of the most alluring aspects of online shopping is the opportunity to find products at prices lower than those in physical stores. However, scammers are well aware of this consumer behavior and exploit it to their advantage. Stylelili.com entices shoppers with seemingly irresistible deals and discounts, promising unbelievable bargains. For instance, a majority of their products boast discounts exceeding 80%, an anomaly in the industry. While these bargains may appear enticing at first glance, it’s imperative to exercise caution when prices seem too good to be true. Online scammers often employ this tactic by presenting products at unbelievably low prices to attract potential victims. Such rock-bottom prices frequently serve as a telltale sign, indicating potentially deceitful intentions. Legitimate businesses strive for sustainable profit margins, making excessively low prices a cause for concern. Faux Social Media Buttons Reputable and genuine online businesses actively maintain a presence on social media platforms to connect with their customers, showcase their products, and promote transparency. Stylelili.com, however, raises suspicions by displaying social media buttons that, upon closer examination, prove to be mere facades. Clicking on these buttons redirects users to the main pages of social media platforms but lacks any actual content or engagement from the website. This tactic is commonly employed to create the illusion of legitimacy and a vibrant online presence. Negative Reviews on Trustpilot: A Reason for Alarm A brief visit to Trustpilot, a widely-used review platform, reveals a concerning pattern regarding Stylelili.com. Numerous customers have voiced skepticism and dissatisfaction with their shopping experiences on the site, with some even questioning the website’s authenticity. Final Verdict: Is Stylelili.com Legit or a Scam? In sum, the presence of multiple conspicuous warning signs cannot be overlooked. From the remarkably low prices and substantial discounts to the misleading registration date, there are ample reasons for concern when it comes to Stylelili.com. The inclusion of faux social media buttons and the presence of negative customer reviews further strengthen the case against its legitimacy. Consumers should exercise utmost caution when considering shopping at Stylelili.com, as there are undoubtedly more reliable online fashion retailers available.' - 'Are you looking for Spotify Premium Apk Download link? Great! Here you’ll get the download link of Spotify Premium Music Apk file here. Spotify is the very popular music application for Android where you can listen to all songs and music for free. If you don’t know how to install Spotify Premium App on Android for free then please follow this guide on and read how to install it on Android. Here you’ll get step by step guide along with screenshots. Click on the link below to download latest apk file of premium version- Spotify Premium Apk: Overview Spotify is the very popular song listening service. On this application, you can listen to all latest and older songs for free. Here you can download all songs and music on your phone. You can download Spotify Premium Apk from above link. 
Features of Spotify Apk: - Download & Listen to all English and Hindi songs for free. - Free to download on Android phones. - All latest and older music. - Available in miltiple languages.' - source_sentence: 'What are the skills required for an iPhone app developer? What services does Techlopes offer for iPhone app development? How can I hire an iPhone app developer?' sentences: - 'There have been many comments to the same, some logical, some not. I thought i would throw this question out to the readers of this blog and see what your views are on the same. I think only advance payment receive shippers only refuse to accept o. I want suggest to you explain about this issue which one should bear the cost that charge by the bank fee. For me, this issue is pretty much clear! Why so? An order to get a shipment on board comes to carrier from the consignor, not from the shipper. The consignor chooses the carrier and is responsible to pay haulage for his services. Actually, the shipper has no relations to the carrier. Ones the shipper is responsible for, is to get the shipment loaded on board. So he overtakes THC charges. I want to buy golf equipment from a seller in USA for my business in my country.Delta planer for sale All comments are export view oriented. What about the impoter? Hi Javier, one persons import is the other persons export. The bill of lading fee is a fee charged by the shipping line for the processing of the bill of lading on behalf of the client. The question is whether this charge should be paid by the shipper who is responsible for the cargo to be delivery FOB or since this charge is incurred after the cargo is loaded on board, then should it become the liability of the consignee.?? Hi RK, Trust the handling fees will be forwarders fee which is part of export clearance. Hi, Any charges raised prior to the cargo being loaded and stowed on board the vessel are for the shipper under FOB. Whats this handling charges? We should give that? Kindly advise. Hi RK, this would most probably be the Forwarders charges for handling the shipment. The Terminal Handling and Bill Charges are the actual port and line charges and the Agents Handling would be the fee for the Forwarder. Thus once cargo is stowed and secured, a bill of lading can be issued by the vessel. The bill of lading is a contract between the shipper and the vessel to move the cargo safely to the agreed place of delivery.Web tracking helps you and the recipient to keep track of the delivery progress. You contact their customer support in the event you face any issues. Even though it is time-guaranteed, premium service is quite economical for courier and shipment. Services in the business of logistics have the top growing demand in the distribution market. At exactly the same time, in addition, it ventured into Cold Chain Logistics. They give technical expertise linked to government agencies to guarantee a quick, efficient, and compliant import or export of goods to a selection of countries. Using our service is easy and simple to use. It might differ based on what sort of service your are Using. Our customer service provides an assortment of consumer care and client support alternatives to assist you in every possible method. In addition, we support, develop and operate third-party logistics service which employs the most recent technologies to fulfill every need of our clients.2020 09 1en ielts general writing task 2 samples band 9 pdf Shipping to certain nations or regions of the world involves increased risk. 
When placed correctly, you will secure the utmost impact. This will permit you to track the development of your consignment effortlessly via our site.With our global network, we can provide you with streamlined freight forwarding to book your cargo, arrange for pickup and delivery, and manage the shipping documentation. With decades of experience, we facilitate the entire forwarding process according to your specifications as well as the requirements of the import and export countries. As an NVOCC, we can maximize your routing and transit time options by managing your ocean shipments door-to-door, with ease, visibility and global reach. Choose from our flexible Sea Freight services based on your timing, cargo size and equipment needs. Faster transit times, normally means air freight however it can be very expensive.Where is the ac fuse on a 2009 outback full version Your company will benefit from our excellent capacity with top tier carriers and a strong global network. Click Here. View our sea freight bill of lading terms and conditions here:. EN FR. Access to space allocations with major carriers. Network of KWE Terminals to manage your cargo flows. Door-to-door service Web based tracking of your shipments. Maximize routing and transit options to global destinations. Chargeable Weight Calculation.The original program was written in Visual Basic, and only the windows executable file was distributed, thus, it is not possible to enhance or port it to other platforms. I started this project after I discovered how easy was to understand the serial protocol used between the Parameter Designer and the e-bike controller. Besides, I had a dream to be able to change e-bike settings anytime, anywhere. You may note several differences between the way how XPD presents the controller parameters, and the way how Parameter Designer used to do it. I''ll list here the most important differences, with detailed explanations. A: Of course, nobody should expect the controller will control these values up to the displayed precision. However, controller uses different units internally for most values, and if you translate successive internal-units values to SI system using integer values, you might get the same number for different amounts in internal controller units.Youtube oembed cors Thus I have decided to display the values with 0. A: As said above, controller uses different units for most parameters, and they are integer numbers in the range For example, controller value 90 maps to When these numbers are rounded to nearest integer, A: As I said before, the controller uses its internal scale for measuring volts and amperes. This scale depends on the values of some components on the board, and for some controllers the internal range would really map to this extremely large real-world scale. This does not really mean that EB will handle phase currents up to amperes wow in the real world, you still have to make sure the actual power MOSFETs can deal properly with the selected currents. Help Create Join Login. Operations Management. IT Management. Project Management. Services Business VoIP. Resources Blog Articles Deals. Menu Help Create Join Login. Home Browse xpd-ebike Wiki. Home What is this? Application screenshot. Android main application screen. Editing a profile on Android. Upload profile to controller via Bluetooth. Oh no! Some styles failed to load. Sign Up No, Thank you.He was born in Amsterdam, Netherlands and is a Dutch nationality. He has almost 6. 
On the popular complementary channelsthe figures declared by the report increases to be somewhere around to be with 1 million subscribers and the about a tentative amount million views and searches of his uploaded videos. Kwebbelkop is quite successful in his gaming venture and is the creator of a successful YouTube channel as well. Kwebbelkop is not the name given by his parents, but his birth name is Jordi Van Den Bussche. He was born in the year1 st June in the city of Amsterdam, Netherlands and possesses a Dutch nationality. He was brought up along with his sister Lauren, but not much is known about his academic career. Perhaps his urge of becoming a proficient gamer might not have let him pierce more into studies and academic lessons. From his childhood days, Jordi is into making group of friend for playing games and finally started off with the YouTube channel in the year In the month of October, that very year Jordi understood that YouTube would soon make him a successful gamer and perhaps serves a potential pathway to success. Just after passing of that very year, Jordi decided of starting off with a new channel and finally published Call of Duty: Black Ops 2 and the series of Minecraft. He also became famous for the several prank call videos which were considered as his son gaming videos. By the yearthe count of subscribers reached a count of about 10, After such a successful subscribers count, Jordi finally decided to be a fulltime YouTube star and a full-time gamer. Soon he was known to people as a popular gamer who has marvelous subscribers count. Sources clarified that his mother successfully runs a vlog under the name Momsvlog. Meanwhile, while being a gamer, he got his graduate degree and started off updating his YouTube channel seriously. The three prolific gamers were involved in solving the difficult series of Grand Theft Auto Online races. With the passage of time, due to some inconveniences faced, Kodi was at once replaced with the famous British YouTube star Slogoman. These were Jordi early gaming ventures, and soon he jumped off into the life of a professional gamer. Kwebbelkop has a prosperous gaming career ahead. He had successful started off a gaming team, though the entire members have reshuffled. The formation of the group has enabled Kwebbelkop to know all the traits and techniques are to be used while playing the tuff games. The team formed by him used to play the various races of GTA V online races. He is very humble towards his subscriber and viewers. He addresses his fans with the name KOPS. Along with his sister Lauren, he designs his videos and works upon its other features. His favorite color is known to be orange, and he tries to add the color in every possible way he can. He is quite a good-natured guy who is also gentle in his behavior. He is very much thankful to his fans and followers and often expresses gratitude through the social media sites for making him counted among the popular gamers. Not much information is available about the rising YouTube star. Her girlfriend plays a crucial role in making his videos have an added glamour.Postal Code Are you the developer of this app? Claim your app to get free and unrestricted access to your app and developer data. Sign up. Log in. Google Play Rating history and histogram. Join us for free to see more information about your app and learn how we can help you promote and earn money with your app. 
Bill Of Lading (船荷証券)、Sea Waybill (海上貨物運送状)について Developed by WiseSchematics, RE Equalizer reflects the design, performance and workflow of a powerful studio grade equalizers on the move. Integrated with advanced features such as Shunt Audio Engine c and Smart Interface cRE-EQ components utilize special filters which eliminate the artifacts, noise and distortion associated with band interaction, and provide perfect curve and stair-stepped filter responses. Finally, RE-EQ integrated real-time analyzer lets you compare the difference between audio signal. RE-EQ has been primarily designed to achieve ''broad brush'' tone-shaping effects thus equipped with fine controls to provide full freedom for sound tuning and tweaking so that users get the mix and sound the way they want, which means spending less time on setup and more time for tuning. RE-EQ is also available as Demo Version for free which is intended for Testing and Checking Purposes but may contain limited features and functionality. Note: For better compatibility and integration with the device, please make sure that no other equalizers or random effects are running in background. If you are facing such issues, please follow the below guideline. Such libraries can cause interference issues and may cancel or block effect from each other. We haven''t collected library information for this app yet. You can use the AppBrain Ad Detector app to detect libraries in apps installed on your device. Equalizer music player booster. Eqfy Equalizer. Want more apps? Find the Android apps that are trending right now. You have reached your daily pageview limit Register now to get 5 free AppBrain Intelligence pageviews per day. You''ll have access to: Unlimited pageviews both app and developer details Recent install count per app last 30 days Detailed ranking data per app Recent install count per developer last 30 days Full timeline per developer Device market shares data per country 1 Month. By continuing browsing on this website. You consent to enable third parties cookies usage. To launch a multiple search, fill out the field with up to three container references separated by commas. Search by Container Booking Bill of lading. Reference s. Further to our announcement on the launch of OCEAN Alliance OA Day 4 that will bring you expeditious shipping and service reliability on our Trans-Pacific services, we are pleased to share the effective sailing dates as below. Oh no! Some styles failed to load. 😵 Due to recent developments in Bangladesh, terminal operations and pick-up of inbound containers has been slow. Consequently, the yards in Chittagong are reaching full capacity, and most reefer plugs are already being occupied.50 square meter house floor plan As such, we are forced to divert reefer cargo to other ports. Keep abreast of the latest updates on APL services and operations, as well as rates and tariffs. Click here to subscribe to APL customer advisories. Necessary cookies enable core functionality. The website cannot function properly without these cookies. Registers a unique ID that is used to generate statistical data on how the visitor uses the website. Registers a unique ID that is used to optimize user experience when visiting our website. Registers a unique ID that is used to collect and transfer contact information from webforms to internal databases. Due to inactivity, your session is about to expire. You will be logged out in minutes and all unsaved actions will be lost. I am here, let''s continue. Go to the login page. 
Accept all Customize. To launch a multiple search, fill out the field with up to three container references separated by commas Search by Container Booking Bill of lading Reference s. Bangladesh — Port Congestion Surcharge PCS for cargo bound for Chittagong Friday, April 10, Due to recent developments in Bangladesh, terminal operations and pick-up of inbound containers has been slow.Princess Kemz, Rems, Animo, Jackibazey and Eroko + Kwe on ShowBiz Movement 10th April 2019 Necessary Cookies Necessary cookies enable core functionality. Google Analytics Registers a unique ID that is used to generate statistical data on how the visitor uses the website. OFF ON. Brainsonic Registers a unique ID that is used to optimize user experience when visiting our website. Eloqua Registers a unique ID that is used to collect and transfer contact information from webforms to internal databases. Due to inactivity, your session has expired. - Unit upgrades civ 6 - Seq2seq chatbot - B450 vs x470 for 3700x - Remanded appeal timeline - Bulk reloading supplies - Too many connections mariadb - Ghost recon ios app - Njoy lights up but wont hit - Nagar parishad - Rosetta stone - Sandro cabo verde mp3 - 12v niedervolt halogen fassung g4 - Guide to writing the final paper - Vba border color - Qualcomm volte nv - Katangian ng pilipinas - What is rtd coffee - Money dupe escape from tarkov - Kyocera taskalfa 3500i driver - Zigbee security system' - 'Tagged: Genesis Simple Menus Plugin Using Genesis Simple Menus, it works fine to change the submenu. Now I wish I could also change the primary menu for each page or post. How do I change this plugin to be able to do what I want? I can add that code? thanks You should post this question on the Genesis Simple Menus Support page found here. Ron Rennick, the developer should be able to assist you. You must be logged in to reply to this topic. Are you a blogger, web designer, developer, or website owner looking to generate more income? Promote products for the largest and fastest growing online website platform as part of our affiliate program.' - 'Hire iPhone App Developer IPhone is leading the smartphone industry. It has become a global trend to hire IPhone app developer for iphone app development. Business aims can be achieved through loyal hiring of iphone developers as this is an important step of strategy development of iphone applications. Iphone app development is in demand these days and to cope up with this increasing demand iphone app development organizations offer services like Hire iphone developer. Techlopes provides you skilled iphone app developers from its exceptional iOS developer’s pool to equip you with excellent hiring solutions. We have an aerodynamic procedure to recruit iphone app developer. Our excellent line up has comprehensive knowledge about this contemporary technology and is equipped with all the attributes to understand your prerequisites, communicate efficiently and act dynamically. We keep are developers up to date and trained them with latest technology that enhance productivity and enables us to deliver optimal solutions. Our experts possess specific skills of Robust programming skills in Objective-C, Understanding Cocoa Touch Framework, COCOS 2D framework, Essential Data understanding, Third Party & APNS integration, Web service (SOAP, REST, JSON, XML), Experience using social media APIs, Excellent debugging and optimization skills. 
Amalgamation of developers’ innovation and knowledge results in the form of remarkable solutions and applications. Our high tech working environment motivates the developers to perform well and provide solutions that can address your individual business needs. Being a foremost iPhone app development company, we keep on looking for more captivating arrangements and efficient techniques to bring out the best practices of iPhone app development. Techlopes hire iphone app developer offer the development services for iphone monitoring, iphone business, social networking and iphone gaming apps development. Hire Iphone Developer Services - Iphone M commerce applications - Social networking applications - Iphone utility applications - Iphone gaming applications - Iphone organization applications - Iphone web app development - Iphone business applications - Iphone multimedia application' - source_sentence: 'What is article marketing and how does it work? How can I use article marketing to increase traffic to my website? What are some effective strategies for an article marketing campaign?' sentences: - 'Alandari Gray Loveseat Sporting a pleasing roll arm for classic flair, the Alandari loveseat in soft gray exudes an easy elegance sure to please. Feel-good upholstery with linen texturing is made for everyday luxury. Muted-tone toss pillows are a soothing, sophisticated complement. Frame constructions have been rigorously tested to simulate the home and transportation environments for improved durability. Frame components are secured with combinations of glue, blocks, interlocking panels and staples. Seats and back spring rails are cut from mixed hardwood and engineered lumber. Stripes and patterns are match cut. All fabrics are pre-approved for wearability and durability against AHFA standards. Cushions are constructed of low melt fiber wrapped over high quality foam. Constructed with a platform seat foundation.' - 'Advertising An Article Marketing As every website owner knows, content is essential to raising your rankings in the search engines. There''s another way to use content to bring even more visitors to your site. It''s called article marketing. You can use one article in dozens of venues, to multiply your exposure exponentially. More traffic means more sales. Let''s look at some strategies you can use in your article marketing campaign.Let''s say you sell seeds, gardening supplies and gardening books on your site. You''re facing stiff competition, as there are many websites marketing these products. How do you get your share of attention? The answer is simple: exposure, as much as you can generate! Article marketing is a natural for this purpose. Write one article, and use it to advertise your site, over and over again. You know your products and their uses best. Write an article on how to start seeds. Give detailed and useful information such as equipment required, temperature and seasonal concerns, thinning and transplanting. There are many topics you can write about. Write an article telling your reader how to choose a good pair of garden gloves, and the advantages of a leather glove for one task, or a sturdy cotton glove for light work. Or, tell your reader how a quality bulb planter makes short work of a big planting project. I''m sure you get the idea. Do not write an advertisement! When your articles are prepared, your related subjectsarticle marketing campaign is ready to launch. Do a search using the term ”article directories gardening”. 
You''ll get hundreds or even thousands of results. Visit some of these article directories and browse their gardening category. Take a look at the quality of their articles. Unfortunately, there are many article directories with poorly written articles, so be choosy when selecting those you want to be associated with, as quality-conscious directories get the most readers. Here''s how it works: you submit your article to the directory. You are not paid money, but are rewarded with a byline and a link to your site. Every person who reads your article has the opportunity to click through to your site. Give your reader useful information in an engaging style, and chances are good they''ll come to your site to see what else they can learn. You can submit the same article to other directories, and watch your traffic and sales grow. Your article marketing strategy should include contacts with other website owners, especially those with an e-newsletter. Offer your article as newsletter content in exchange for a byline. See if you can negotiate reciprocal links. Of course you don''t want to contact direct competition. A florist, greenhouse or organic gardening site might be good choices. Your article marketing campaign''s success is directly related to the amount of exposure you create through your effort. One indirect result of a good article marketing campaign is establishing yourself as an expert in your business. Success is measured by traffic and sales. Be prolific and diligent! About the Author: InsightsOnMarketing provides readers with the latest reviews, articles, commentaries and write-ups on all article marketing, article directories, article reader related subjects' - 'If you have any questions before making a purchase, chat with our online sales to get more information. Start Online Chat Contact customer service for order status and other after-sales issues. Submit a Ticket Round Neck Black Lace Panel Shift Dress Lace Panel Lantern Sleeve Black Shift Dress Lace and PU Panel Black Shift Mini Dress Lace Up Long Sleeve Black Sweater Shift Dress Just click the LIKE BUTTON Below To Get Your10%OFFCoupon Code Show me my coupon! 
Join us to save on top fashion' model-index: - name: hi-di-hi-base results: - task: type: information-retrieval name: Information Retrieval dataset: name: dim 768 type: dim_768 metrics: - type: cosine_accuracy@1 value: 0.8376 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.9022 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9204 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9438 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.8376 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.3007333333333333 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18407999999999997 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09437999999999998 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.8376 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.9022 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9204 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9438 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8910666642970089 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.87418634920635 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8762185281448824 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 512 type: dim_512 metrics: - type: cosine_accuracy@1 value: 0.837 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.8996 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9208 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9436 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.837 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.29986666666666667 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18415999999999996 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09435999999999999 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.837 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.8996 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9208 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9436 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8900999328860697 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8729962698412704 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8749614549818263 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 256 type: dim_256 metrics: - type: cosine_accuracy@1 value: 0.8256 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.8938 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9166 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9402 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.8256 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.29793333333333333 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18331999999999998 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09401999999999999 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.8256 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.8938 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9166 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9402 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8827878243788164 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8643884126984136 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8663736602503759 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 128 type: dim_128 metrics: - 
type: cosine_accuracy@1 value: 0.814 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.879 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.9082 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9344 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.814 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.293 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.18164 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09343999999999998 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.814 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.879 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.9082 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9344 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8727948445198305 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8531903174603174 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8553014228992841 name: Cosine Map@100 - task: type: information-retrieval name: Information Retrieval dataset: name: dim 64 type: dim_64 metrics: - type: cosine_accuracy@1 value: 0.768 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.8472 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.879 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.9108 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.768 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.2824 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.17579999999999998 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.09108 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.768 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.8472 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.879 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.9108 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.8382263417002942 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.8151035714285721 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.8181338550128306 name: Cosine Map@100 --- # hi-di-hi-base This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
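As a quick illustration of the semantic-search use case, the sketch below ranks a tiny made-up corpus against a query. It is illustrative only: it assumes the `carsondial/slinger20241231-1` checkpoint named in the Usage section below, and the corpus strings are placeholders (loosely echoing the card's training samples), not part of this card.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("carsondial/slinger20241231-1")

# Hypothetical mini-corpus; swap in your own documents.
corpus = [
    "Submitting one article to many directories multiplies your exposure.",
    "This loveseat pairs a classic roll arm with linen-textured upholstery.",
    "Split the join form into multiple parts to reduce spam signups.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query_embedding = model.encode(
    "How can I use article marketing to increase traffic to my website?",
    convert_to_tensor=True,
)

# The model L2-normalizes its outputs (see the Normalize() module below),
# so cosine similarity and dot product rank identically here.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```

For larger corpora, precompute `corpus_embeddings` once and reuse them across queries.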
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - json
- **Language:** en
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("carsondial/slinger20241231-1")
# Run inference
sentences = [
    'What is article marketing and how does it work?\nHow can I use article marketing to increase traffic to my website?\nWhat are some effective strategies for an article marketing campaign?',
    "Advertising An Article Marketing\nAs every website owner knows, content is essential to raising your rankings in the search engines. There's another way to use content to bring even more visitors to your site. It's called article marketing. You can use one article in dozens of venues, to multiply your exposure exponentially. More traffic means more sales. Let's look at some strategies you can use in your article marketing campaign.Let's say you sell seeds, gardening supplies and gardening books on your site. You're facing stiff competition, as there are many websites marketing these products.\nHow do you get your share of attention? The answer is simple: exposure, as much as you can generate! Article marketing is a natural for this purpose. Write one article, and use it to advertise your site, over and over again.\nYou know your products and their uses best. Write an article on how to start seeds. Give detailed and useful information such as equipment required, temperature and seasonal concerns, thinning and transplanting. There are many topics you can write about.\nWrite an article telling your reader how to choose a good pair of garden gloves, and the advantages of a leather glove for one task, or a sturdy cotton glove for light work.\nOr, tell your reader how a quality bulb planter makes short work of a big planting project. I'm sure you get the idea. Do not write an advertisement!\nWhen your articles are prepared, your related subjectsarticle marketing campaign is ready to launch. Do a search using the term ”article directories gardening”. You'll get hundreds or even thousands of results. Visit some of these article directories and browse their gardening category. Take a look at the quality of their articles. Unfortunately, there are many article directories with poorly written articles, so be choosy when selecting those you want to be associated with, as quality-conscious directories get the most readers.\nHere's how it works: you submit your article to the directory. You are not paid money, but are rewarded with a byline and a link to your site. Every person who reads your article has the opportunity to click through to your site. Give your reader useful information in an engaging style, and chances are good they'll come to your site to see what else they can learn. You can submit the same article to other directories, and watch your traffic and sales grow.\nYour article marketing strategy should include contacts with other website owners, especially those with an e-newsletter. Offer your article as newsletter content in exchange for a byline. See if you can negotiate reciprocal links. Of course you don't want to contact direct competition. A florist, greenhouse or organic gardening site might be good choices.\nYour article marketing campaign's success is directly related to the amount of exposure you create through your effort. One indirect result of a good article marketing campaign is establishing yourself as an expert in your business. Success is measured by traffic and sales. Be prolific and diligent!\nAbout the Author: InsightsOnMarketing provides readers with the latest reviews, articles, commentaries and write-ups on all article marketing, article directories, article reader related subjects",
    'Alandari Gray Loveseat\nSporting a pleasing roll arm for classic flair, the Alandari loveseat in soft gray exudes an easy elegance sure to please. Feel-good upholstery with linen texturing is made for everyday luxury. Muted-tone toss pillows are a soothing, sophisticated complement.\nFrame constructions have been rigorously tested to simulate the home and transportation environments for improved durability. Frame components are secured with combinations of glue, blocks, interlocking panels and staples. Seats and back spring rails are cut from mixed hardwood and engineered lumber. Stripes and patterns are match cut. All fabrics are pre-approved for wearability and durability against AHFA standards. Cushions are constructed of low melt fiber wrapped over high quality foam. Constructed with a platform seat foundation.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->
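Because this model is trained with MatryoshkaLoss over the dimensions [768, 512, 256, 128, 64] (see Training Details below), the 768-dimensional output can be truncated to a shorter prefix and re-normalized with only a modest quality drop (the Evaluation section reports cosine_ndcg@10 of 0.8828 at 256 dimensions versus 0.8911 at 768). The sketch below shows manual truncation; it is illustrative, not part of the card's original API, and the query string is a placeholder.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("carsondial/slinger20241231-1")

dim = 256  # any of the trained Matryoshka dimensions: 768, 512, 256, 128, 64
emb = model.encode(["What is article marketing and how does it work?"])

# Keep the leading `dim` components, then re-normalize so dot products
# remain valid cosine similarities.
truncated = emb[:, :dim]
truncated = truncated / np.linalg.norm(truncated, axis=1, keepdims=True)
print(truncated.shape)  # (1, 256)
```

Recent sentence-transformers releases (v2.7+) also accept a `truncate_dim` argument, e.g. `SentenceTransformer("carsondial/slinger20241231-1", truncate_dim=256)`, which truncates inside `encode`; cosine-based scoring such as `model.similarity` then handles normalization.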
<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | dim_768    | dim_512    | dim_256    | dim_128    | dim_64     |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1   | 0.8376     | 0.837      | 0.8256     | 0.814      | 0.768      |
| cosine_accuracy@3   | 0.9022     | 0.8996     | 0.8938     | 0.879      | 0.8472     |
| cosine_accuracy@5   | 0.9204     | 0.9208     | 0.9166     | 0.9082     | 0.879      |
| cosine_accuracy@10  | 0.9438     | 0.9436     | 0.9402     | 0.9344     | 0.9108     |
| cosine_precision@1  | 0.8376     | 0.837      | 0.8256     | 0.814      | 0.768      |
| cosine_precision@3  | 0.3007     | 0.2999     | 0.2979     | 0.293      | 0.2824     |
| cosine_precision@5  | 0.1841     | 0.1842     | 0.1833     | 0.1816     | 0.1758     |
| cosine_precision@10 | 0.0944     | 0.0944     | 0.094      | 0.0934     | 0.0911     |
| cosine_recall@1     | 0.8376     | 0.837      | 0.8256     | 0.814      | 0.768      |
| cosine_recall@3     | 0.9022     | 0.8996     | 0.8938     | 0.879      | 0.8472     |
| cosine_recall@5     | 0.9204     | 0.9208     | 0.9166     | 0.9082     | 0.879      |
| cosine_recall@10    | 0.9438     | 0.9436     | 0.9402     | 0.9344     | 0.9108     |
| **cosine_ndcg@10**  | **0.8911** | **0.8901** | **0.8828** | **0.8728** | **0.8382** |
| cosine_mrr@10       | 0.8742     | 0.873      | 0.8644     | 0.8532     | 0.8151     |
| cosine_map@100      | 0.8762     | 0.875      | 0.8664     | 0.8553     | 0.8181     |

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues?
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### json * Dataset: json * Size: 45,000 training samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 12 tokens</li><li>mean: 30.74 tokens</li><li>max: 112 tokens</li></ul> | <ul><li>min: 51 tokens</li><li>mean: 382.61 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | anchor | positive | |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>How to split the Join form into multiple parts<br>Adding HTML and text to the Join form page<br>Customizing the login logout message</code> | <code>This tutorial will show you how to split the Join form/page into two or three parts. Whether you want a cleaner looking join form or you want to reduce the number of spam bots this should help.<br>This Tutorial will show you how to add HTML and text to the Join form page. If you have ever wanted to add a few more details or spruce up the Join form then this might be what you have been looking for.<br>This tutorial will show you how to spruce up the login - logout message people see. If you want to add a special message or just change the standard default phrase this one might be for you. It's very simple and only takes a few moments to apply this one.<br>This tutorial will show you how to take advantage and optimize your site using the Page Block feature available in your Administration panel. With a little exploring and tweaking it's possible to boost your site's performance.<br>If your Forum RSS feed links are redirecting to your homepage when clicked on you might need to update the forum module...</code> | | <code>Failed mobile technologies of 2011<br>What happened to Siri<br>Disappointing mobile innovations</code> | <code>Last year we all got excited about mobile-computing products that failed to deliver<br>It's an awesome time to be a gadget-happy consumer electronics freak. Multi-touch user interfaces. Huge advances in miniaturization and battery life. Cloud-based storage. 
Mobile computing has never been better.<br>But sometimes, when companies announce incredible new products or technologies, and everybody proclaims that a new era has dawned, and that culture-shifting transformations are about to take place -- nothing happens.<br>Here are five mobile technologies from last year that were supposed to change the world, but didn't.<br>Apple seemed to do everything right with its voice assistant strategy.<br>The company acquired the leading app maker with the best technology. It spent two years perfecting and integrating the technology, and bulking up on servers to handle the number-crunching required to deliver human-like voice interaction.<br>Siri was then launched to huge fanfare.<br>Overnight, people changed how they int...</code> | | <code>What are the different types of web usage for voice over campaigns?<br>What are the standard types of web usage for voice over?<br>What are the different ways to use voice over in web campaigns?</code> | <code>Different Types of Web Usage<br>Clearly, nothing has upended the voice over industry quite like the Internet. On all ends of the spectrum, from the basic sourcing of voice over talent to right on down to hyper-targeted, increasingly personalized messaging, it can feel a little “Wild West” to navigate if you’re not going directly through an agent. And one of the biggest stumblers when sourcing for voice talent? The concept of “web usage.”<br>Usage — a.k.a., how the finished voice over is being used — will cause the greatest fluctuation in a voice talent’s quote. As we all know, there are myriad ways any material can be used and distributed online… and not all distribution is created equally.<br>Below, we’ve got a handy little guide detailing all the different, standard types of web usage we tend to come across here at Blue Wave Voiceover (Heck, the way things are going, we might have to update this once more before you finish reading it).<br>Feel free to refer to this when putting together any kind...</code> | * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters: ```json { "loss": "MultipleNegativesRankingLoss", "matryoshka_dims": [ 768, 512, 256, 128, 64 ], "matryoshka_weights": [ 1, 1, 1, 1, 1 ], "n_dims_per_step": -1 } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: epoch - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `gradient_accumulation_steps`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 4 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `bf16`: True - `tf32`: True - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: epoch - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 16 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - 
`logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: True - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 | |:----------:|:-------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:| | 0.1137 | 10 | 1.0843 | - | - | - | - | - | | 0.2274 | 20 | 0.6597 | - | - | - | - | - | | 0.3412 | 30 | 0.3466 | - | - | - | - | - | | 0.4549 | 40 | 0.253 | - | - | - | - | - | | 0.5686 | 50 | 0.2286 | - | - | - | - | - | | 0.6823 | 60 | 0.2007 | - | - | - | - | - | | 0.7960 | 70 | 0.1576 | - | - | - | - | - | | 0.9097 | 80 | 0.1652 | - | - | - | - | - | | 0.9893 | 87 | - | 0.8880 | 0.8861 | 0.8788 | 0.8657 | 0.8267 | | 1.0341 | 90 | 0.1563 | - | - | - | - | - | | 1.1478 | 100 
| 0.1485 | - | - | - | - | - | | 1.2615 | 110 | 0.108 | - | - | - | - | - | | 1.3753 | 120 | 0.0874 | - | - | - | - | - | | 1.4890 | 130 | 0.0851 | - | - | - | - | - | | 1.6027 | 140 | 0.0897 | - | - | - | - | - | | 1.7164 | 150 | 0.0803 | - | - | - | - | - | | 1.8301 | 160 | 0.0645 | - | - | - | - | - | | 1.9439 | 170 | 0.0806 | - | - | - | - | - | | 1.9893 | 174 | - | 0.8906 | 0.8893 | 0.8815 | 0.8712 | 0.8351 | | 2.0682 | 180 | 0.0812 | - | - | - | - | - | | 2.1819 | 190 | 0.0743 | - | - | - | - | - | | 2.2957 | 200 | 0.0531 | - | - | - | - | - | | 2.4094 | 210 | 0.0448 | - | - | - | - | - | | 2.5231 | 220 | 0.0465 | - | - | - | - | - | | 2.6368 | 230 | 0.0486 | - | - | - | - | - | | 2.7505 | 240 | 0.0509 | - | - | - | - | - | | 2.8643 | 250 | 0.0395 | - | - | - | - | - | | 2.9780 | 260 | 0.0521 | - | - | - | - | - | | 2.9893 | 261 | - | 0.8912 | 0.8897 | 0.8823 | 0.8720 | 0.8375 | | 3.1023 | 270 | 0.0551 | - | - | - | - | - | | 3.2161 | 280 | 0.0412 | - | - | - | - | - | | 3.3298 | 290 | 0.0373 | - | - | - | - | - | | 3.4435 | 300 | 0.0387 | - | - | - | - | - | | 3.5572 | 310 | 0.0438 | - | - | - | - | - | | 3.6709 | 320 | 0.0433 | - | - | - | - | - | | 3.7846 | 330 | 0.0368 | - | - | - | - | - | | 3.8984 | 340 | 0.0418 | - | - | - | - | - | | **3.9893** | **348** | **-** | **0.8911** | **0.8901** | **0.8828** | **0.8728** | **0.8382** | * The bold row denotes the saved checkpoint. ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.3.1 - Transformers: 4.47.1 - PyTorch: 2.5.1+cu121 - Accelerate: 1.2.1 - Datasets: 3.2.0 - Tokenizers: 0.21.0 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MatryoshkaLoss ```bibtex @misc{kusupati2024matryoshka, title={Matryoshka Representation Learning}, author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi}, year={2024}, eprint={2205.13147}, archivePrefix={arXiv}, primaryClass={cs.LG} } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
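For completeness: the retrieval numbers in the Evaluation section above come from sentence-transformers' `InformationRetrievalEvaluator`. The sketch below shows the general shape of such an evaluation on a toy query/corpus set. All data here is invented for illustration; the exact queries, corpus, and relevance judgments behind the reported `dim_*` scores are not published in this card.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("carsondial/slinger20241231-1")

# Toy evaluation set (placeholders): ids -> text, plus relevance judgments.
queries = {"q1": "How can I use article marketing to increase traffic?"}
corpus = {
    "d1": "Submit one article to many directories to multiply exposure.",
    "d2": "A gray loveseat with linen-textured upholstery.",
}
relevant_docs = {"q1": {"d1"}}  # q1 is answered by d1

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="toy_ir",
)
results = evaluator(model)  # dict of accuracy@k, precision@k, ndcg@k, mrr@k, map@100
print(results)
```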
[ "TEXT_CLASSIFICATION" ]
[ "BEAR" ]
Non_BioNLP
aimlresearch2023/snowflake-arctic-embed-m-v1.5-Q8_0-GGUF
aimlresearch2023
sentence-similarity
[ "sentence-transformers", "gguf", "feature-extraction", "sentence-similarity", "mteb", "arctic", "snowflake-arctic-embed", "transformers.js", "llama-cpp", "gguf-my-repo", "base_model:Snowflake/snowflake-arctic-embed-m-v1.5", "base_model:quantized:Snowflake/snowflake-arctic-embed-m-v1.5", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,726
1,726
7
0
--- base_model: Snowflake/snowflake-arctic-embed-m-v1.5 license: apache-2.0 pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - mteb - arctic - snowflake-arctic-embed - transformers.js - llama-cpp - gguf-my-repo model-index: - name: snowflake-arctic-embed-m-v1.5 results: - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: main_score value: 59.53000000000001 - type: map_at_1 value: 34.282000000000004 - type: map_at_10 value: 50.613 - type: map_at_100 value: 51.269 - type: map_at_1000 value: 51.271 - type: map_at_20 value: 51.158 - type: map_at_3 value: 45.626 - type: map_at_5 value: 48.638 - type: mrr_at_1 value: 34.92176386913229 - type: mrr_at_10 value: 50.856081645555406 - type: mrr_at_100 value: 51.510739437069034 - type: mrr_at_1000 value: 51.51299498830165 - type: mrr_at_20 value: 51.39987941081724 - type: mrr_at_3 value: 45.993361782835514 - type: mrr_at_5 value: 48.88098624940742 - type: nauc_map_at_1000_diff1 value: 10.628675774160785 - type: nauc_map_at_1000_max value: -10.11742589992339 - type: nauc_map_at_1000_std value: -18.29277379812427 - type: nauc_map_at_100_diff1 value: 10.63250240035489 - type: nauc_map_at_100_max value: -10.112078786734363 - type: nauc_map_at_100_std value: -18.288524872706834 - type: nauc_map_at_10_diff1 value: 10.476494913081712 - type: nauc_map_at_10_max value: -9.890937746734037 - type: nauc_map_at_10_std value: -18.279750514750443 - type: nauc_map_at_1_diff1 value: 14.549204048461151 - type: nauc_map_at_1_max value: -12.230560087701225 - type: nauc_map_at_1_std value: -19.469903650130362 - type: nauc_map_at_20_diff1 value: 10.586564571825674 - type: nauc_map_at_20_max value: -10.00292720526217 - type: nauc_map_at_20_std value: -18.258077347878064 - type: nauc_map_at_3_diff1 value: 10.378663968090372 - type: nauc_map_at_3_max value: -10.458896171786185 - type: nauc_map_at_3_std value: -18.38852760333766 - type: nauc_map_at_5_diff1 value: 10.235960275925581 - type: nauc_map_at_5_max value: -10.239496080409058 - type: nauc_map_at_5_std value: -18.817023479445886 - type: nauc_mrr_at_1000_diff1 value: 8.718212649575722 - type: nauc_mrr_at_1000_max value: -10.81022794038691 - type: nauc_mrr_at_1000_std value: -17.87669499555167 - type: nauc_mrr_at_100_diff1 value: 8.722174171165133 - type: nauc_mrr_at_100_max value: -10.804840985713525 - type: nauc_mrr_at_100_std value: -17.872487099359986 - type: nauc_mrr_at_10_diff1 value: 8.609421635870238 - type: nauc_mrr_at_10_max value: -10.568644717548432 - type: nauc_mrr_at_10_std value: -17.872968762635814 - type: nauc_mrr_at_1_diff1 value: 12.69590006263834 - type: nauc_mrr_at_1_max value: -12.082056561238321 - type: nauc_mrr_at_1_std value: -18.036424092186657 - type: nauc_mrr_at_20_diff1 value: 8.684842497970315 - type: nauc_mrr_at_20_max value: -10.691578914627286 - type: nauc_mrr_at_20_std value: -17.84350301434992 - type: nauc_mrr_at_3_diff1 value: 8.649761557556763 - type: nauc_mrr_at_3_max value: -11.104694428047496 - type: nauc_mrr_at_3_std value: -18.149917948370344 - type: nauc_mrr_at_5_diff1 value: 8.433489750038396 - type: nauc_mrr_at_5_max value: -10.917772454397436 - type: nauc_mrr_at_5_std value: -18.4094211134111 - type: nauc_ndcg_at_1000_diff1 value: 10.19041067807956 - type: nauc_ndcg_at_1000_max value: -9.54328201605796 - type: nauc_ndcg_at_1000_std value: -17.824620427456633 - type: nauc_ndcg_at_100_diff1 value: 
10.289491087585963 - type: nauc_ndcg_at_100_max value: -9.357214331420337 - type: nauc_ndcg_at_100_std value: -17.657600653632873 - type: nauc_ndcg_at_10_diff1 value: 9.435530877596092 - type: nauc_ndcg_at_10_max value: -8.182581635383546 - type: nauc_ndcg_at_10_std value: -17.603156479980388 - type: nauc_ndcg_at_1_diff1 value: 14.549204048461151 - type: nauc_ndcg_at_1_max value: -12.230560087701225 - type: nauc_ndcg_at_1_std value: -19.469903650130362 - type: nauc_ndcg_at_20_diff1 value: 9.885227087275197 - type: nauc_ndcg_at_20_max value: -8.52362662391439 - type: nauc_ndcg_at_20_std value: -17.441705436231764 - type: nauc_ndcg_at_3_diff1 value: 9.22542769998547 - type: nauc_ndcg_at_3_max value: -9.903590564219288 - type: nauc_ndcg_at_3_std value: -18.357220221111593 - type: nauc_ndcg_at_5_diff1 value: 8.8756720745828 - type: nauc_ndcg_at_5_max value: -9.269764943861245 - type: nauc_ndcg_at_5_std value: -19.009229433187784 - type: nauc_precision_at_1000_diff1 value: 3.733355117431035 - type: nauc_precision_at_1000_max value: 3.9603571352517393 - type: nauc_precision_at_1000_std value: 70.07345061131439 - type: nauc_precision_at_100_diff1 value: 29.019032142462457 - type: nauc_precision_at_100_max value: 40.75153328286103 - type: nauc_precision_at_100_std value: 62.634249549126594 - type: nauc_precision_at_10_diff1 value: 2.5762677254910353 - type: nauc_precision_at_10_max value: 6.096298633773051 - type: nauc_precision_at_10_std value: -11.507400451348587 - type: nauc_precision_at_1_diff1 value: 14.549204048461151 - type: nauc_precision_at_1_max value: -12.230560087701225 - type: nauc_precision_at_1_std value: -19.469903650130362 - type: nauc_precision_at_20_diff1 value: 1.715540124567996 - type: nauc_precision_at_20_max value: 21.53546453945913 - type: nauc_precision_at_20_std value: 1.537961142195571 - type: nauc_precision_at_3_diff1 value: 5.701850652555737 - type: nauc_precision_at_3_max value: -8.180345365085552 - type: nauc_precision_at_3_std value: -18.37033750502482 - type: nauc_precision_at_5_diff1 value: 3.6053552181042843 - type: nauc_precision_at_5_max value: -5.207647070615612 - type: nauc_precision_at_5_std value: -19.89491085427258 - type: nauc_recall_at_1000_diff1 value: 3.733355117431255 - type: nauc_recall_at_1000_max value: 3.9603571352482194 - type: nauc_recall_at_1000_std value: 70.07345061131205 - type: nauc_recall_at_100_diff1 value: 29.01903214246288 - type: nauc_recall_at_100_max value: 40.7515332828621 - type: nauc_recall_at_100_std value: 62.63424954912607 - type: nauc_recall_at_10_diff1 value: 2.5762677254911988 - type: nauc_recall_at_10_max value: 6.0962986337729905 - type: nauc_recall_at_10_std value: -11.507400451348577 - type: nauc_recall_at_1_diff1 value: 14.549204048461151 - type: nauc_recall_at_1_max value: -12.230560087701225 - type: nauc_recall_at_1_std value: -19.469903650130362 - type: nauc_recall_at_20_diff1 value: 1.7155401245682675 - type: nauc_recall_at_20_max value: 21.535464539459632 - type: nauc_recall_at_20_std value: 1.5379611421957025 - type: nauc_recall_at_3_diff1 value: 5.7018506525557875 - type: nauc_recall_at_3_max value: -8.180345365085538 - type: nauc_recall_at_3_std value: -18.370337505024796 - type: nauc_recall_at_5_diff1 value: 3.6053552181043913 - type: nauc_recall_at_5_max value: -5.207647070615579 - type: nauc_recall_at_5_std value: -19.894910854272492 - type: ndcg_at_1 value: 34.282000000000004 - type: ndcg_at_10 value: 59.53000000000001 - type: ndcg_at_100 value: 62.187000000000005 - type: ndcg_at_1000 value: 62.243 - type: 
ndcg_at_20 value: 61.451 - type: ndcg_at_3 value: 49.393 - type: ndcg_at_5 value: 54.771 - type: precision_at_1 value: 34.282000000000004 - type: precision_at_10 value: 8.791 - type: precision_at_100 value: 0.992 - type: precision_at_1000 value: 0.1 - type: precision_at_20 value: 4.769 - type: precision_at_3 value: 20.104 - type: precision_at_5 value: 14.651 - type: recall_at_1 value: 34.282000000000004 - type: recall_at_10 value: 87.909 - type: recall_at_100 value: 99.21799999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_20 value: 95.377 - type: recall_at_3 value: 60.313 - type: recall_at_5 value: 73.257 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: mteb/cqadupstack-android config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: main_score value: 53.885000000000005 - type: map_at_1 value: 35.429 - type: map_at_10 value: 47.469 - type: map_at_100 value: 48.997 - type: map_at_1000 value: 49.117 - type: map_at_20 value: 48.324 - type: map_at_3 value: 43.835 - type: map_at_5 value: 46.043 - type: mrr_at_1 value: 43.34763948497854 - type: mrr_at_10 value: 53.258623430297234 - type: mrr_at_100 value: 53.99123884299005 - type: mrr_at_1000 value: 54.02458101713216 - type: mrr_at_20 value: 53.695964669618945 - type: mrr_at_3 value: 50.81068192656173 - type: mrr_at_5 value: 52.45588936576058 - type: nauc_map_at_1000_diff1 value: 51.55382824218782 - type: nauc_map_at_1000_max value: 31.855350695084606 - type: nauc_map_at_1000_std value: -5.465862008150992 - type: nauc_map_at_100_diff1 value: 51.55889312452534 - type: nauc_map_at_100_max value: 31.88429637207401 - type: nauc_map_at_100_std value: -5.40805152544196 - type: nauc_map_at_10_diff1 value: 51.6592677505875 - type: nauc_map_at_10_max value: 31.554425233617543 - type: nauc_map_at_10_std value: -6.125756131339046 - type: nauc_map_at_1_diff1 value: 55.6889617582672 - type: nauc_map_at_1_max value: 27.821166966868176 - type: nauc_map_at_1_std value: -5.778838498211728 - type: nauc_map_at_20_diff1 value: 51.70520970992564 - type: nauc_map_at_20_max value: 31.811676633900465 - type: nauc_map_at_20_std value: -5.463596751904718 - type: nauc_map_at_3_diff1 value: 53.206169626589606 - type: nauc_map_at_3_max value: 31.64373830824983 - type: nauc_map_at_3_std value: -6.054761451312827 - type: nauc_map_at_5_diff1 value: 52.37308971673694 - type: nauc_map_at_5_max value: 31.974302019633644 - type: nauc_map_at_5_std value: -6.302653399940531 - type: nauc_mrr_at_1000_diff1 value: 49.345152231490616 - type: nauc_mrr_at_1000_max value: 33.49789501712511 - type: nauc_mrr_at_1000_std value: -6.054730861163538 - type: nauc_mrr_at_100_diff1 value: 49.3387577601307 - type: nauc_mrr_at_100_max value: 33.48149992464187 - type: nauc_mrr_at_100_std value: -6.061177137579308 - type: nauc_mrr_at_10_diff1 value: 49.08312288449718 - type: nauc_mrr_at_10_max value: 33.470393322577465 - type: nauc_mrr_at_10_std value: -6.180286430216975 - type: nauc_mrr_at_1_diff1 value: 52.43364978537192 - type: nauc_mrr_at_1_max value: 31.521755633355713 - type: nauc_mrr_at_1_std value: -7.002499524130836 - type: nauc_mrr_at_20_diff1 value: 49.311059224991766 - type: nauc_mrr_at_20_max value: 33.538523037692144 - type: nauc_mrr_at_20_std value: -6.034619474981136 - type: nauc_mrr_at_3_diff1 value: 49.90489868439366 - type: nauc_mrr_at_3_max value: 34.400493912164606 - type: nauc_mrr_at_3_std value: -6.028875320994629 - type: nauc_mrr_at_5_diff1 value: 49.033661898983475 - type: nauc_mrr_at_5_max 
value: 33.732315350193936 - type: nauc_mrr_at_5_std value: -6.272548556330368 - type: nauc_ndcg_at_1000_diff1 value: 49.81681892539247 - type: nauc_ndcg_at_1000_max value: 33.06518006062093 - type: nauc_ndcg_at_1000_std value: -4.282105713014755 - type: nauc_ndcg_at_100_diff1 value: 49.42362108857786 - type: nauc_ndcg_at_100_max value: 32.92024325540483 - type: nauc_ndcg_at_100_std value: -3.7786765305496717 - type: nauc_ndcg_at_10_diff1 value: 48.83102435475594 - type: nauc_ndcg_at_10_max value: 31.898404563611958 - type: nauc_ndcg_at_10_std value: -6.2024003866707 - type: nauc_ndcg_at_1_diff1 value: 52.43364978537192 - type: nauc_ndcg_at_1_max value: 31.521755633355713 - type: nauc_ndcg_at_1_std value: -7.002499524130836 - type: nauc_ndcg_at_20_diff1 value: 49.466526454438316 - type: nauc_ndcg_at_20_max value: 32.424462698701674 - type: nauc_ndcg_at_20_std value: -4.520809563712905 - type: nauc_ndcg_at_3_diff1 value: 50.997884562583884 - type: nauc_ndcg_at_3_max value: 33.26787046916917 - type: nauc_ndcg_at_3_std value: -6.340699471083753 - type: nauc_ndcg_at_5_diff1 value: 49.68314458398097 - type: nauc_ndcg_at_5_max value: 32.80910071143984 - type: nauc_ndcg_at_5_std value: -6.734495576445887 - type: nauc_precision_at_1000_diff1 value: -24.18940012795299 - type: nauc_precision_at_1000_max value: -10.995343674356896 - type: nauc_precision_at_1000_std value: -8.298841004724856 - type: nauc_precision_at_100_diff1 value: -18.104939577865935 - type: nauc_precision_at_100_max value: -1.3757613100627637 - type: nauc_precision_at_100_std value: 0.07661922190466432 - type: nauc_precision_at_10_diff1 value: 3.9624459059275967 - type: nauc_precision_at_10_max value: 14.841561593450391 - type: nauc_precision_at_10_std value: -2.485374333613117 - type: nauc_precision_at_1_diff1 value: 52.43364978537192 - type: nauc_precision_at_1_max value: 31.521755633355713 - type: nauc_precision_at_1_std value: -7.002499524130836 - type: nauc_precision_at_20_diff1 value: -4.4791763436505265 - type: nauc_precision_at_20_max value: 9.157872836996276 - type: nauc_precision_at_20_std value: 2.086903518342088 - type: nauc_precision_at_3_diff1 value: 28.480888018235568 - type: nauc_precision_at_3_max value: 30.34526267718485 - type: nauc_precision_at_3_std value: -6.3006706923866025 - type: nauc_precision_at_5_diff1 value: 16.488039195453517 - type: nauc_precision_at_5_max value: 24.593477099241852 - type: nauc_precision_at_5_std value: -5.316448107840636 - type: nauc_recall_at_1000_diff1 value: 34.715187316533076 - type: nauc_recall_at_1000_max value: 58.2266544684947 - type: nauc_recall_at_1000_std value: 63.85237636398278 - type: nauc_recall_at_100_diff1 value: 36.08623826028132 - type: nauc_recall_at_100_max value: 33.05011429439473 - type: nauc_recall_at_100_std value: 16.559545021212564 - type: nauc_recall_at_10_diff1 value: 39.76738610714205 - type: nauc_recall_at_10_max value: 28.233045706945997 - type: nauc_recall_at_10_std value: -5.13243784043598 - type: nauc_recall_at_1_diff1 value: 55.6889617582672 - type: nauc_recall_at_1_max value: 27.821166966868176 - type: nauc_recall_at_1_std value: -5.778838498211728 - type: nauc_recall_at_20_diff1 value: 41.18682480073759 - type: nauc_recall_at_20_max value: 29.525993239296945 - type: nauc_recall_at_20_std value: 1.5003598438954298 - type: nauc_recall_at_3_diff1 value: 48.31879460301157 - type: nauc_recall_at_3_max value: 32.93751306970167 - type: nauc_recall_at_3_std value: -5.28070084211707 - type: nauc_recall_at_5_diff1 value: 44.327686388315435 - type: 
nauc_recall_at_5_max value: 32.04823486234599 - type: nauc_recall_at_5_std value: -6.4221525602778256 - type: ndcg_at_1 value: 43.348 - type: ndcg_at_10 value: 53.885000000000005 - type: ndcg_at_100 value: 59.204 - type: ndcg_at_1000 value: 60.744 - type: ndcg_at_20 value: 55.995 - type: ndcg_at_3 value: 49.112 - type: ndcg_at_5 value: 51.61900000000001 - type: precision_at_1 value: 43.348 - type: precision_at_10 value: 10.242999999999999 - type: precision_at_100 value: 1.6150000000000002 - type: precision_at_1000 value: 0.203 - type: precision_at_20 value: 6.066 - type: precision_at_3 value: 23.605 - type: precision_at_5 value: 17.024 - type: recall_at_1 value: 35.429 - type: recall_at_10 value: 65.77199999999999 - type: recall_at_100 value: 87.89 - type: recall_at_1000 value: 97.13000000000001 - type: recall_at_20 value: 73.299 - type: recall_at_3 value: 52.034000000000006 - type: recall_at_5 value: 58.96 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: mteb/cqadupstack-english config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: main_score value: 49.55 - type: map_at_1 value: 31.684 - type: map_at_10 value: 43.258 - type: map_at_100 value: 44.628 - type: map_at_1000 value: 44.761 - type: map_at_20 value: 44.015 - type: map_at_3 value: 39.778000000000006 - type: map_at_5 value: 41.643 - type: mrr_at_1 value: 39.87261146496815 - type: mrr_at_10 value: 49.31978566373469 - type: mrr_at_100 value: 49.94922739445482 - type: mrr_at_1000 value: 49.990325601254106 - type: mrr_at_20 value: 49.70597468576704 - type: mrr_at_3 value: 47.070063694267546 - type: mrr_at_5 value: 48.23248407643316 - type: nauc_map_at_1000_diff1 value: 53.44044712371752 - type: nauc_map_at_1000_max value: 34.5651440062204 - type: nauc_map_at_1000_std value: -0.9814384609230475 - type: nauc_map_at_100_diff1 value: 53.429004435388464 - type: nauc_map_at_100_max value: 34.52038957273436 - type: nauc_map_at_100_std value: -1.1021936362699805 - type: nauc_map_at_10_diff1 value: 53.879128574022005 - type: nauc_map_at_10_max value: 33.74771524140917 - type: nauc_map_at_10_std value: -2.945132777205236 - type: nauc_map_at_1_diff1 value: 60.25159799695403 - type: nauc_map_at_1_max value: 26.843892985235808 - type: nauc_map_at_1_std value: -9.618702739509093 - type: nauc_map_at_20_diff1 value: 53.56789898225283 - type: nauc_map_at_20_max value: 34.11628845872402 - type: nauc_map_at_20_std value: -2.024376635870884 - type: nauc_map_at_3_diff1 value: 54.45882099014072 - type: nauc_map_at_3_max value: 31.29495446507793 - type: nauc_map_at_3_std value: -6.391948228781555 - type: nauc_map_at_5_diff1 value: 54.20536489050697 - type: nauc_map_at_5_max value: 32.31001487256826 - type: nauc_map_at_5_std value: -5.050953263346934 - type: nauc_mrr_at_1000_diff1 value: 50.835858995999125 - type: nauc_mrr_at_1000_max value: 38.20717381701079 - type: nauc_mrr_at_1000_std value: 4.174163368228787 - type: nauc_mrr_at_100_diff1 value: 50.827072441041224 - type: nauc_mrr_at_100_max value: 38.21077622034756 - type: nauc_mrr_at_100_std value: 4.1951082737013365 - type: nauc_mrr_at_10_diff1 value: 50.90578491570948 - type: nauc_mrr_at_10_max value: 38.19229691746408 - type: nauc_mrr_at_10_std value: 3.8290750066335546 - type: nauc_mrr_at_1_diff1 value: 54.807021746871186 - type: nauc_mrr_at_1_max value: 37.09225642043841 - type: nauc_mrr_at_1_std value: 0.5654547513131355 - type: nauc_mrr_at_20_diff1 value: 50.86247832095378 - type: nauc_mrr_at_20_max value: 38.19277867384178 - 
type: nauc_mrr_at_20_std value: 4.098932316791841 - type: nauc_mrr_at_3_diff1 value: 50.788934370903036 - type: nauc_mrr_at_3_max value: 37.72130561895659 - type: nauc_mrr_at_3_std value: 2.7339370381517583 - type: nauc_mrr_at_5_diff1 value: 50.72543792525547 - type: nauc_mrr_at_5_max value: 37.57740908475375 - type: nauc_mrr_at_5_std value: 2.742881431085094 - type: nauc_ndcg_at_1000_diff1 value: 50.89692885407576 - type: nauc_ndcg_at_1000_max value: 37.250583054716955 - type: nauc_ndcg_at_1000_std value: 5.552279826578831 - type: nauc_ndcg_at_100_diff1 value: 50.624606875496944 - type: nauc_ndcg_at_100_max value: 37.1024514234627 - type: nauc_ndcg_at_100_std value: 5.495892760032762 - type: nauc_ndcg_at_10_diff1 value: 51.910387255793445 - type: nauc_ndcg_at_10_max value: 36.71168418905039 - type: nauc_ndcg_at_10_std value: 2.3064115117905217 - type: nauc_ndcg_at_1_diff1 value: 54.807021746871186 - type: nauc_ndcg_at_1_max value: 37.09225642043841 - type: nauc_ndcg_at_1_std value: 0.5654547513131355 - type: nauc_ndcg_at_20_diff1 value: 51.43416588546778 - type: nauc_ndcg_at_20_max value: 36.76387180172346 - type: nauc_ndcg_at_20_std value: 3.7012798827049718 - type: nauc_ndcg_at_3_diff1 value: 50.91198494475423 - type: nauc_ndcg_at_3_max value: 34.92770670756687 - type: nauc_ndcg_at_3_std value: -0.9071486759887368 - type: nauc_ndcg_at_5_diff1 value: 51.63559468683886 - type: nauc_ndcg_at_5_max value: 34.86849679864564 - type: nauc_ndcg_at_5_std value: -0.734837221224976 - type: nauc_precision_at_1000_diff1 value: -13.43645457127175 - type: nauc_precision_at_1000_max value: 12.71162105198664 - type: nauc_precision_at_1000_std value: 33.175399007040255 - type: nauc_precision_at_100_diff1 value: -8.549834785105412 - type: nauc_precision_at_100_max value: 22.47383497331883 - type: nauc_precision_at_100_std value: 39.09108761430844 - type: nauc_precision_at_10_diff1 value: 7.556572451100043 - type: nauc_precision_at_10_max value: 35.35285122987575 - type: nauc_precision_at_10_std value: 29.417466305615967 - type: nauc_precision_at_1_diff1 value: 54.807021746871186 - type: nauc_precision_at_1_max value: 37.09225642043841 - type: nauc_precision_at_1_std value: 0.5654547513131355 - type: nauc_precision_at_20_diff1 value: -0.550158641635712 - type: nauc_precision_at_20_max value: 29.9068430006187 - type: nauc_precision_at_20_std value: 33.920603132821185 - type: nauc_precision_at_3_diff1 value: 25.551264664276687 - type: nauc_precision_at_3_max value: 37.59463225854679 - type: nauc_precision_at_3_std value: 13.707295021359043 - type: nauc_precision_at_5_diff1 value: 17.76136129817151 - type: nauc_precision_at_5_max value: 35.85363807255972 - type: nauc_precision_at_5_std value: 19.48470876841111 - type: nauc_recall_at_1000_diff1 value: 37.1593620123866 - type: nauc_recall_at_1000_max value: 46.29322536951135 - type: nauc_recall_at_1000_std value: 51.47312657083967 - type: nauc_recall_at_100_diff1 value: 37.7542224949536 - type: nauc_recall_at_100_max value: 38.84120637703135 - type: nauc_recall_at_100_std value: 28.839672572221925 - type: nauc_recall_at_10_diff1 value: 46.24130302658384 - type: nauc_recall_at_10_max value: 35.89001724712849 - type: nauc_recall_at_10_std value: 6.985137790828618 - type: nauc_recall_at_1_diff1 value: 60.25159799695403 - type: nauc_recall_at_1_max value: 26.843892985235808 - type: nauc_recall_at_1_std value: -9.618702739509093 - type: nauc_recall_at_20_diff1 value: 43.63576680886187 - type: nauc_recall_at_20_max value: 36.79079644708101 - type: 
nauc_recall_at_20_std value: 13.81561928605839 - type: nauc_recall_at_3_diff1 value: 48.2299322140522 - type: nauc_recall_at_3_max value: 30.038088484376203 - type: nauc_recall_at_3_std value: -4.871116183843762 - type: nauc_recall_at_5_diff1 value: 47.22331872695983 - type: nauc_recall_at_5_max value: 30.398541477173136 - type: nauc_recall_at_5_std value: -3.2038541888528957 - type: ndcg_at_1 value: 39.873 - type: ndcg_at_10 value: 49.55 - type: ndcg_at_100 value: 53.809 - type: ndcg_at_1000 value: 55.767999999999994 - type: ndcg_at_20 value: 51.275999999999996 - type: ndcg_at_3 value: 44.91 - type: ndcg_at_5 value: 46.855999999999995 - type: precision_at_1 value: 39.873 - type: precision_at_10 value: 9.65 - type: precision_at_100 value: 1.522 - type: precision_at_1000 value: 0.196 - type: precision_at_20 value: 5.701 - type: precision_at_3 value: 22.166 - type: precision_at_5 value: 15.643 - type: recall_at_1 value: 31.684 - type: recall_at_10 value: 60.69 - type: recall_at_100 value: 78.521 - type: recall_at_1000 value: 91.02900000000001 - type: recall_at_20 value: 66.973 - type: recall_at_3 value: 46.807 - type: recall_at_5 value: 52.402 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: mteb/cqadupstack-gaming config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: main_score value: 62.686 - type: map_at_1 value: 43.856 - type: map_at_10 value: 57.056 - type: map_at_100 value: 58.048 - type: map_at_1000 value: 58.092 - type: map_at_20 value: 57.684000000000005 - type: map_at_3 value: 53.958 - type: map_at_5 value: 55.80500000000001 - type: mrr_at_1 value: 50.03134796238244 - type: mrr_at_10 value: 60.31022043091019 - type: mrr_at_100 value: 60.91892338857461 - type: mrr_at_1000 value: 60.93770463536649 - type: mrr_at_20 value: 60.705642387392736 - type: mrr_at_3 value: 58.286311389759746 - type: mrr_at_5 value: 59.49320794148393 - type: nauc_map_at_1000_diff1 value: 54.849140197256695 - type: nauc_map_at_1000_max value: 38.978448968260224 - type: nauc_map_at_1000_std value: 0.4955439383268162 - type: nauc_map_at_100_diff1 value: 54.824334747823364 - type: nauc_map_at_100_max value: 38.959443109450994 - type: nauc_map_at_100_std value: 0.49626092018886037 - type: nauc_map_at_10_diff1 value: 54.778189277103394 - type: nauc_map_at_10_max value: 38.20972191654546 - type: nauc_map_at_10_std value: -0.7239823837455759 - type: nauc_map_at_1_diff1 value: 58.74017164752485 - type: nauc_map_at_1_max value: 31.528974862589585 - type: nauc_map_at_1_std value: -3.273824691929492 - type: nauc_map_at_20_diff1 value: 54.78943693416187 - type: nauc_map_at_20_max value: 38.77930316443076 - type: nauc_map_at_20_std value: 0.25607460088355544 - type: nauc_map_at_3_diff1 value: 55.68313410225767 - type: nauc_map_at_3_max value: 36.22847284104399 - type: nauc_map_at_3_std value: -3.010979639100503 - type: nauc_map_at_5_diff1 value: 55.11385094420661 - type: nauc_map_at_5_max value: 37.319681045490924 - type: nauc_map_at_5_std value: -2.156640733221061 - type: nauc_mrr_at_1000_diff1 value: 54.504759468380705 - type: nauc_mrr_at_1000_max value: 40.58849492650406 - type: nauc_mrr_at_1000_std value: 1.8226622175866118 - type: nauc_mrr_at_100_diff1 value: 54.4918034449886 - type: nauc_mrr_at_100_max value: 40.59202728933427 - type: nauc_mrr_at_100_std value: 1.8276428096536335 - type: nauc_mrr_at_10_diff1 value: 54.33603399493329 - type: nauc_mrr_at_10_max value: 40.58896878978089 - type: nauc_mrr_at_10_std value: 1.5733340909114375 - type: 
nauc_mrr_at_1_diff1 value: 58.062410036466105 - type: nauc_mrr_at_1_max value: 37.660958859966506 - type: nauc_mrr_at_1_std value: 0.029007600674170648 - type: nauc_mrr_at_20_diff1 value: 54.43793386924358 - type: nauc_mrr_at_20_max value: 40.66773423875307 - type: nauc_mrr_at_20_std value: 1.891967891797154 - type: nauc_mrr_at_3_diff1 value: 54.77901284537966 - type: nauc_mrr_at_3_max value: 40.182219821206964 - type: nauc_mrr_at_3_std value: 0.8911935034597871 - type: nauc_mrr_at_5_diff1 value: 54.466068837163675 - type: nauc_mrr_at_5_max value: 40.334996916684126 - type: nauc_mrr_at_5_std value: 0.9460830492892364 - type: nauc_ndcg_at_1000_diff1 value: 53.8465376860938 - type: nauc_ndcg_at_1000_max value: 41.63158111016696 - type: nauc_ndcg_at_1000_std value: 3.864205884257578 - type: nauc_ndcg_at_100_diff1 value: 53.4025864436944 - type: nauc_ndcg_at_100_max value: 41.805453995307914 - type: nauc_ndcg_at_100_std value: 4.36777557904857 - type: nauc_ndcg_at_10_diff1 value: 52.96034987157544 - type: nauc_ndcg_at_10_max value: 40.7601173480795 - type: nauc_ndcg_at_10_std value: 1.905824035879141 - type: nauc_ndcg_at_1_diff1 value: 58.062410036466105 - type: nauc_ndcg_at_1_max value: 37.660958859966506 - type: nauc_ndcg_at_1_std value: 0.029007600674170648 - type: nauc_ndcg_at_20_diff1 value: 53.2834771889242 - type: nauc_ndcg_at_20_max value: 41.713541932946406 - type: nauc_ndcg_at_20_std value: 3.865102828793311 - type: nauc_ndcg_at_3_diff1 value: 54.03389464372289 - type: nauc_ndcg_at_3_max value: 38.41449914649933 - type: nauc_ndcg_at_3_std value: -0.886276189886313 - type: nauc_ndcg_at_5_diff1 value: 53.456413320299 - type: nauc_ndcg_at_5_max value: 39.49048882649335 - type: nauc_ndcg_at_5_std value: -0.42692690160443814 - type: nauc_precision_at_1000_diff1 value: -14.770791653274824 - type: nauc_precision_at_1000_max value: 21.479874538905246 - type: nauc_precision_at_1000_std value: 28.607024261300207 - type: nauc_precision_at_100_diff1 value: -12.189696449878126 - type: nauc_precision_at_100_max value: 26.69785787492456 - type: nauc_precision_at_100_std value: 33.59098307467553 - type: nauc_precision_at_10_diff1 value: 6.922968330978399 - type: nauc_precision_at_10_max value: 34.52138344123087 - type: nauc_precision_at_10_std value: 21.768427637079952 - type: nauc_precision_at_1_diff1 value: 58.062410036466105 - type: nauc_precision_at_1_max value: 37.660958859966506 - type: nauc_precision_at_1_std value: 0.029007600674170648 - type: nauc_precision_at_20_diff1 value: -0.6837867902179278 - type: nauc_precision_at_20_max value: 33.98683709011133 - type: nauc_precision_at_20_std value: 30.8845561918902 - type: nauc_precision_at_3_diff1 value: 28.195043041120847 - type: nauc_precision_at_3_max value: 37.659916094938836 - type: nauc_precision_at_3_std value: 7.226520146634867 - type: nauc_precision_at_5_diff1 value: 16.633667288096245 - type: nauc_precision_at_5_max value: 34.90176597404891 - type: nauc_precision_at_5_std value: 12.421585442334088 - type: nauc_recall_at_1000_diff1 value: 45.20743732415397 - type: nauc_recall_at_1000_max value: 72.77115913579242 - type: nauc_recall_at_1000_std value: 70.48328496679083 - type: nauc_recall_at_100_diff1 value: 38.56282680810794 - type: nauc_recall_at_100_max value: 55.46797683321103 - type: nauc_recall_at_100_std value: 36.878791151929136 - type: nauc_recall_at_10_diff1 value: 44.18252051452362 - type: nauc_recall_at_10_max value: 43.33391810040086 - type: nauc_recall_at_10_std value: 6.663378192277723 - type: nauc_recall_at_1_diff1 value: 
58.74017164752485 - type: nauc_recall_at_1_max value: 31.528974862589585 - type: nauc_recall_at_1_std value: -3.273824691929492 - type: nauc_recall_at_20_diff1 value: 44.19944231642417 - type: nauc_recall_at_20_max value: 49.401101483915866 - type: nauc_recall_at_20_std value: 18.97803841673839 - type: nauc_recall_at_3_diff1 value: 49.56378985428704 - type: nauc_recall_at_3_max value: 36.434210616870224 - type: nauc_recall_at_3_std value: -2.850559971607616 - type: nauc_recall_at_5_diff1 value: 47.37107217086109 - type: nauc_recall_at_5_max value: 39.0236745509895 - type: nauc_recall_at_5_std value: -1.7402454457937195 - type: ndcg_at_1 value: 50.031000000000006 - type: ndcg_at_10 value: 62.686 - type: ndcg_at_100 value: 66.403 - type: ndcg_at_1000 value: 67.241 - type: ndcg_at_20 value: 64.37899999999999 - type: ndcg_at_3 value: 57.859 - type: ndcg_at_5 value: 60.375 - type: precision_at_1 value: 50.031000000000006 - type: precision_at_10 value: 9.856 - type: precision_at_100 value: 1.266 - type: precision_at_1000 value: 0.13799999999999998 - type: precision_at_20 value: 5.489 - type: precision_at_3 value: 25.746999999999996 - type: precision_at_5 value: 17.492 - type: recall_at_1 value: 43.856 - type: recall_at_10 value: 75.824 - type: recall_at_100 value: 91.622 - type: recall_at_1000 value: 97.538 - type: recall_at_20 value: 81.951 - type: recall_at_3 value: 63.016000000000005 - type: recall_at_5 value: 69.18299999999999 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: mteb/cqadupstack-gis config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: main_score value: 43.983 - type: map_at_1 value: 28.942 - type: map_at_10 value: 38.621 - type: map_at_100 value: 39.7 - type: map_at_1000 value: 39.766 - type: map_at_20 value: 39.262 - type: map_at_3 value: 35.719 - type: map_at_5 value: 37.378 - type: mrr_at_1 value: 31.29943502824859 - type: mrr_at_10 value: 40.76463994260603 - type: mrr_at_100 value: 41.67073617629083 - type: mrr_at_1000 value: 41.717446259457105 - type: mrr_at_20 value: 41.32577374689195 - type: mrr_at_3 value: 37.984934086628996 - type: mrr_at_5 value: 39.64595103578152 - type: nauc_map_at_1000_diff1 value: 43.64461679688985 - type: nauc_map_at_1000_max value: 31.53717883948204 - type: nauc_map_at_1000_std value: 1.193745788248017 - type: nauc_map_at_100_diff1 value: 43.63847825079489 - type: nauc_map_at_100_max value: 31.536602619279165 - type: nauc_map_at_100_std value: 1.2001240243342401 - type: nauc_map_at_10_diff1 value: 43.845991987142014 - type: nauc_map_at_10_max value: 31.27509937344113 - type: nauc_map_at_10_std value: 0.7327934840520994 - type: nauc_map_at_1_diff1 value: 50.62269273984579 - type: nauc_map_at_1_max value: 30.16325757909521 - type: nauc_map_at_1_std value: -0.6398875136233392 - type: nauc_map_at_20_diff1 value: 43.630758403790914 - type: nauc_map_at_20_max value: 31.408258098047703 - type: nauc_map_at_20_std value: 1.12616034652217 - type: nauc_map_at_3_diff1 value: 44.823493567359456 - type: nauc_map_at_3_max value: 31.075886347614496 - type: nauc_map_at_3_std value: -0.25126874515735426 - type: nauc_map_at_5_diff1 value: 43.79768853087658 - type: nauc_map_at_5_max value: 31.091080995725324 - type: nauc_map_at_5_std value: 0.16440771782544047 - type: nauc_mrr_at_1000_diff1 value: 42.7865400752329 - type: nauc_mrr_at_1000_max value: 32.84731670326893 - type: nauc_mrr_at_1000_std value: 2.6067637582013825 - type: nauc_mrr_at_100_diff1 value: 42.771741548331065 - type: 
nauc_mrr_at_100_max value: 32.85324232845987 - type: nauc_mrr_at_100_std value: 2.6092786694308376 - type: nauc_mrr_at_10_diff1 value: 42.82969738870672 - type: nauc_mrr_at_10_max value: 32.69407549631432 - type: nauc_mrr_at_10_std value: 2.302903910016054 - type: nauc_mrr_at_1_diff1 value: 49.05638333657571 - type: nauc_mrr_at_1_max value: 33.12030717171514 - type: nauc_mrr_at_1_std value: 1.3278035087690774 - type: nauc_mrr_at_20_diff1 value: 42.74267239536286 - type: nauc_mrr_at_20_max value: 32.78571108973092 - type: nauc_mrr_at_20_std value: 2.5932669908758643 - type: nauc_mrr_at_3_diff1 value: 43.69963426089187 - type: nauc_mrr_at_3_max value: 32.78193126956233 - type: nauc_mrr_at_3_std value: 1.634874463134699 - type: nauc_mrr_at_5_diff1 value: 42.838630647832524 - type: nauc_mrr_at_5_max value: 32.459318735260545 - type: nauc_mrr_at_5_std value: 1.9412518283209172 - type: nauc_ndcg_at_1000_diff1 value: 41.01253839851583 - type: nauc_ndcg_at_1000_max value: 32.69570568894237 - type: nauc_ndcg_at_1000_std value: 3.4254737113410343 - type: nauc_ndcg_at_100_diff1 value: 40.62589243745832 - type: nauc_ndcg_at_100_max value: 32.664990655736126 - type: nauc_ndcg_at_100_std value: 3.799569445326048 - type: nauc_ndcg_at_10_diff1 value: 41.31658753735306 - type: nauc_ndcg_at_10_max value: 31.511946320339295 - type: nauc_ndcg_at_10_std value: 2.0492930500796662 - type: nauc_ndcg_at_1_diff1 value: 49.05638333657571 - type: nauc_ndcg_at_1_max value: 33.12030717171514 - type: nauc_ndcg_at_1_std value: 1.3278035087690774 - type: nauc_ndcg_at_20_diff1 value: 40.66188223212841 - type: nauc_ndcg_at_20_max value: 31.926240431497476 - type: nauc_ndcg_at_20_std value: 3.370398664595343 - type: nauc_ndcg_at_3_diff1 value: 43.035580180241 - type: nauc_ndcg_at_3_max value: 31.363874129878404 - type: nauc_ndcg_at_3_std value: 0.1422507242819929 - type: nauc_ndcg_at_5_diff1 value: 41.29049003955878 - type: nauc_ndcg_at_5_max value: 31.112034994977737 - type: nauc_ndcg_at_5_std value: 0.860179279828966 - type: nauc_precision_at_1000_diff1 value: -12.41854465881981 - type: nauc_precision_at_1000_max value: 14.706779246590548 - type: nauc_precision_at_1000_std value: 9.812804367375206 - type: nauc_precision_at_100_diff1 value: 2.797520107808461 - type: nauc_precision_at_100_max value: 24.335873541811406 - type: nauc_precision_at_100_std value: 12.87186398750545 - type: nauc_precision_at_10_diff1 value: 24.530962799265847 - type: nauc_precision_at_10_max value: 31.00772010798733 - type: nauc_precision_at_10_std value: 6.696733001548185 - type: nauc_precision_at_1_diff1 value: 49.05638333657571 - type: nauc_precision_at_1_max value: 33.12030717171514 - type: nauc_precision_at_1_std value: 1.3278035087690774 - type: nauc_precision_at_20_diff1 value: 16.25028416351204 - type: nauc_precision_at_20_max value: 29.629326492027342 - type: nauc_precision_at_20_std value: 11.085888573121679 - type: nauc_precision_at_3_diff1 value: 33.923667689694256 - type: nauc_precision_at_3_max value: 33.5859782361996 - type: nauc_precision_at_3_std value: 1.9468331086918693 - type: nauc_precision_at_5_diff1 value: 27.917827233088875 - type: nauc_precision_at_5_max value: 33.13290043423535 - type: nauc_precision_at_5_std value: 3.800870695945311 - type: nauc_recall_at_1000_diff1 value: 9.680283388428789 - type: nauc_recall_at_1000_max value: 49.479399284871235 - type: nauc_recall_at_1000_std value: 31.506985071436088 - type: nauc_recall_at_100_diff1 value: 23.607673377885448 - type: nauc_recall_at_100_max value: 36.637750366403935 - 
type: nauc_recall_at_100_std value: 18.30770690564224 - type: nauc_recall_at_10_diff1 value: 33.199683418312446 - type: nauc_recall_at_10_max value: 29.63115497012312 - type: nauc_recall_at_10_std value: 4.813200391480566 - type: nauc_recall_at_1_diff1 value: 50.62269273984579 - type: nauc_recall_at_1_max value: 30.16325757909521 - type: nauc_recall_at_1_std value: -0.6398875136233392 - type: nauc_recall_at_20_diff1 value: 29.16488387844995 - type: nauc_recall_at_20_max value: 30.788019479459 - type: nauc_recall_at_20_std value: 11.031953917298853 - type: nauc_recall_at_3_diff1 value: 38.215351600417065 - type: nauc_recall_at_3_max value: 29.619887154236128 - type: nauc_recall_at_3_std value: -0.13237298980339363 - type: nauc_recall_at_5_diff1 value: 33.93788042633265 - type: nauc_recall_at_5_max value: 28.67185092656741 - type: nauc_recall_at_5_std value: 1.316700201091445 - type: ndcg_at_1 value: 31.299 - type: ndcg_at_10 value: 43.983 - type: ndcg_at_100 value: 48.992999999999995 - type: ndcg_at_1000 value: 50.757 - type: ndcg_at_20 value: 46.152 - type: ndcg_at_3 value: 38.367000000000004 - type: ndcg_at_5 value: 41.171 - type: precision_at_1 value: 31.299 - type: precision_at_10 value: 6.734 - type: precision_at_100 value: 0.972 - type: precision_at_1000 value: 0.11499999999999999 - type: precision_at_20 value: 3.898 - type: precision_at_3 value: 16.121 - type: precision_at_5 value: 11.344999999999999 - type: recall_at_1 value: 28.942 - type: recall_at_10 value: 58.343999999999994 - type: recall_at_100 value: 80.82300000000001 - type: recall_at_1000 value: 94.348 - type: recall_at_20 value: 66.449 - type: recall_at_3 value: 43.415 - type: recall_at_5 value: 50.007999999999996 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: mteb/cqadupstack-mathematica config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: main_score value: 33.144 - type: map_at_1 value: 19.41 - type: map_at_10 value: 27.802 - type: map_at_100 value: 29.157 - type: map_at_1000 value: 29.274 - type: map_at_20 value: 28.549000000000003 - type: map_at_3 value: 25.052999999999997 - type: map_at_5 value: 26.521 - type: mrr_at_1 value: 23.756218905472636 - type: mrr_at_10 value: 32.3623450209271 - type: mrr_at_100 value: 33.3648208444617 - type: mrr_at_1000 value: 33.427688215162185 - type: mrr_at_20 value: 32.93723485575758 - type: mrr_at_3 value: 29.539800995024883 - type: mrr_at_5 value: 31.156716417910452 - type: nauc_map_at_1000_diff1 value: 36.196391248081284 - type: nauc_map_at_1000_max value: 25.650644367091495 - type: nauc_map_at_1000_std value: 6.130340697729844 - type: nauc_map_at_100_diff1 value: 36.138890642411376 - type: nauc_map_at_100_max value: 25.587124763888518 - type: nauc_map_at_100_std value: 6.129336379055536 - type: nauc_map_at_10_diff1 value: 36.254426743566775 - type: nauc_map_at_10_max value: 25.465599906543034 - type: nauc_map_at_10_std value: 5.880280378112879 - type: nauc_map_at_1_diff1 value: 42.890551563179976 - type: nauc_map_at_1_max value: 25.813805281076956 - type: nauc_map_at_1_std value: 5.150718386163028 - type: nauc_map_at_20_diff1 value: 35.98551587974314 - type: nauc_map_at_20_max value: 25.501540521726636 - type: nauc_map_at_20_std value: 5.858703157458749 - type: nauc_map_at_3_diff1 value: 37.646558039577734 - type: nauc_map_at_3_max value: 26.138491471124247 - type: nauc_map_at_3_std value: 6.0487505175540734 - type: nauc_map_at_5_diff1 value: 36.817582976153695 - type: nauc_map_at_5_max value: 
25.398200211121146 - type: nauc_map_at_5_std value: 6.31126763919522 - type: nauc_mrr_at_1000_diff1 value: 37.313544952847835 - type: nauc_mrr_at_1000_max value: 26.96218532078988 - type: nauc_mrr_at_1000_std value: 6.814359224654042 - type: nauc_mrr_at_100_diff1 value: 37.28104407653679 - type: nauc_mrr_at_100_max value: 26.931243040477256 - type: nauc_mrr_at_100_std value: 6.800500150841733 - type: nauc_mrr_at_10_diff1 value: 37.315832621275895 - type: nauc_mrr_at_10_max value: 26.941454225978372 - type: nauc_mrr_at_10_std value: 6.837046527796884 - type: nauc_mrr_at_1_diff1 value: 43.19904188582958 - type: nauc_mrr_at_1_max value: 26.975620445904795 - type: nauc_mrr_at_1_std value: 4.52071008581395 - type: nauc_mrr_at_20_diff1 value: 37.2200524790774 - type: nauc_mrr_at_20_max value: 26.971494160765847 - type: nauc_mrr_at_20_std value: 6.716431228783282 - type: nauc_mrr_at_3_diff1 value: 38.46236387340654 - type: nauc_mrr_at_3_max value: 27.846812992192056 - type: nauc_mrr_at_3_std value: 6.550711872569794 - type: nauc_mrr_at_5_diff1 value: 37.620346007658476 - type: nauc_mrr_at_5_max value: 27.031025952102038 - type: nauc_mrr_at_5_std value: 7.32343760231163 - type: nauc_ndcg_at_1000_diff1 value: 34.95081314840592 - type: nauc_ndcg_at_1000_max value: 26.89265465124325 - type: nauc_ndcg_at_1000_std value: 7.854154466831975 - type: nauc_ndcg_at_100_diff1 value: 34.01417812563093 - type: nauc_ndcg_at_100_max value: 25.792737746436835 - type: nauc_ndcg_at_100_std value: 7.726584165493833 - type: nauc_ndcg_at_10_diff1 value: 33.895122516474466 - type: nauc_ndcg_at_10_max value: 25.388442204589612 - type: nauc_ndcg_at_10_std value: 6.359560223645991 - type: nauc_ndcg_at_1_diff1 value: 43.19904188582958 - type: nauc_ndcg_at_1_max value: 26.975620445904795 - type: nauc_ndcg_at_1_std value: 4.52071008581395 - type: nauc_ndcg_at_20_diff1 value: 33.36078689830245 - type: nauc_ndcg_at_20_max value: 25.531794610571563 - type: nauc_ndcg_at_20_std value: 6.136658608653248 - type: nauc_ndcg_at_3_diff1 value: 36.44505602530781 - type: nauc_ndcg_at_3_max value: 26.9104071983157 - type: nauc_ndcg_at_3_std value: 6.427178520371878 - type: nauc_ndcg_at_5_diff1 value: 35.01384323197442 - type: nauc_ndcg_at_5_max value: 25.5560447088692 - type: nauc_ndcg_at_5_std value: 7.3676236760360485 - type: nauc_precision_at_1000_diff1 value: 2.8903331041804514 - type: nauc_precision_at_1000_max value: 4.059662742366004 - type: nauc_precision_at_1000_std value: -1.5891687644008334 - type: nauc_precision_at_100_diff1 value: 8.437726471693766 - type: nauc_precision_at_100_max value: 11.250588557568427 - type: nauc_precision_at_100_std value: 4.231571164627862 - type: nauc_precision_at_10_diff1 value: 19.57085237210294 - type: nauc_precision_at_10_max value: 20.973093492003905 - type: nauc_precision_at_10_std value: 3.197416248152466 - type: nauc_precision_at_1_diff1 value: 43.19904188582958 - type: nauc_precision_at_1_max value: 26.975620445904795 - type: nauc_precision_at_1_std value: 4.52071008581395 - type: nauc_precision_at_20_diff1 value: 15.67136554192724 - type: nauc_precision_at_20_max value: 17.706882621057858 - type: nauc_precision_at_20_std value: 1.9363472182867714 - type: nauc_precision_at_3_diff1 value: 30.38035695042325 - type: nauc_precision_at_3_max value: 26.48218693244094 - type: nauc_precision_at_3_std value: 6.424657705785632 - type: nauc_precision_at_5_diff1 value: 25.272543315171458 - type: nauc_precision_at_5_max value: 22.32441421311652 - type: nauc_precision_at_5_std value: 7.4912569081905716 - 
type: nauc_recall_at_1000_diff1 value: 25.5748044137675 - type: nauc_recall_at_1000_max value: 43.85796585370269 - type: nauc_recall_at_1000_std value: 30.0338086596789 - type: nauc_recall_at_100_diff1 value: 22.577080638885093 - type: nauc_recall_at_100_max value: 23.224511700617477 - type: nauc_recall_at_100_std value: 15.187963852289313 - type: nauc_recall_at_10_diff1 value: 25.058592299355908 - type: nauc_recall_at_10_max value: 22.24448483279841 - type: nauc_recall_at_10_std value: 6.3179089740052765 - type: nauc_recall_at_1_diff1 value: 42.890551563179976 - type: nauc_recall_at_1_max value: 25.813805281076956 - type: nauc_recall_at_1_std value: 5.150718386163028 - type: nauc_recall_at_20_diff1 value: 22.433865123187307 - type: nauc_recall_at_20_max value: 22.739695641511762 - type: nauc_recall_at_20_std value: 5.362005125538497 - type: nauc_recall_at_3_diff1 value: 32.17919168998616 - type: nauc_recall_at_3_max value: 26.044028436867357 - type: nauc_recall_at_3_std value: 7.420349884006329 - type: nauc_recall_at_5_diff1 value: 28.967104573649138 - type: nauc_recall_at_5_max value: 23.40865848168201 - type: nauc_recall_at_5_std value: 9.174406147723621 - type: ndcg_at_1 value: 23.756 - type: ndcg_at_10 value: 33.144 - type: ndcg_at_100 value: 39.261 - type: ndcg_at_1000 value: 41.881 - type: ndcg_at_20 value: 35.56 - type: ndcg_at_3 value: 27.927999999999997 - type: ndcg_at_5 value: 30.293999999999997 - type: precision_at_1 value: 23.756 - type: precision_at_10 value: 5.995 - type: precision_at_100 value: 1.053 - type: precision_at_1000 value: 0.14100000000000001 - type: precision_at_20 value: 3.688 - type: precision_at_3 value: 13.059999999999999 - type: precision_at_5 value: 9.602 - type: recall_at_1 value: 19.41 - type: recall_at_10 value: 45.074 - type: recall_at_100 value: 71.131 - type: recall_at_1000 value: 89.604 - type: recall_at_20 value: 53.673 - type: recall_at_3 value: 31.055 - type: recall_at_5 value: 36.714999999999996 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: mteb/cqadupstack-physics config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: main_score value: 49.675000000000004 - type: map_at_1 value: 33.178999999999995 - type: map_at_10 value: 43.807 - type: map_at_100 value: 45.17 - type: map_at_1000 value: 45.271 - type: map_at_20 value: 44.516 - type: map_at_3 value: 40.813 - type: map_at_5 value: 42.457 - type: mrr_at_1 value: 40.32723772858518 - type: mrr_at_10 value: 49.646867409138814 - type: mrr_at_100 value: 50.493686101426285 - type: mrr_at_1000 value: 50.525386961808834 - type: mrr_at_20 value: 50.120274354884586 - type: mrr_at_3 value: 47.49759384023096 - type: mrr_at_5 value: 48.72473532242535 - type: nauc_map_at_1000_diff1 value: 49.5947127786396 - type: nauc_map_at_1000_max value: 33.39720045844929 - type: nauc_map_at_1000_std value: -3.131428593252271 - type: nauc_map_at_100_diff1 value: 49.57797867324617 - type: nauc_map_at_100_max value: 33.356927974709464 - type: nauc_map_at_100_std value: -3.1661365376766337 - type: nauc_map_at_10_diff1 value: 49.59294630598952 - type: nauc_map_at_10_max value: 32.86647346990462 - type: nauc_map_at_10_std value: -4.1582043443386745 - type: nauc_map_at_1_diff1 value: 53.98646767288695 - type: nauc_map_at_1_max value: 29.45629077638936 - type: nauc_map_at_1_std value: -5.621187380771589 - type: nauc_map_at_20_diff1 value: 49.486982890447074 - type: nauc_map_at_20_max value: 33.11681933406332 - type: nauc_map_at_20_std value: -3.5826433195146854 
- type: nauc_map_at_3_diff1 value: 50.81807107491861 - type: nauc_map_at_3_max value: 32.32552291988859 - type: nauc_map_at_3_std value: -3.952946504088928 - type: nauc_map_at_5_diff1 value: 49.70201354274439 - type: nauc_map_at_5_max value: 32.831846031004886 - type: nauc_map_at_5_std value: -3.8330488624207737 - type: nauc_mrr_at_1000_diff1 value: 49.04159472507738 - type: nauc_mrr_at_1000_max value: 35.617600171138676 - type: nauc_mrr_at_1000_std value: -1.5975830757486646 - type: nauc_mrr_at_100_diff1 value: 49.03848471692094 - type: nauc_mrr_at_100_max value: 35.61936748662614 - type: nauc_mrr_at_100_std value: -1.5922053398594729 - type: nauc_mrr_at_10_diff1 value: 48.92463964652612 - type: nauc_mrr_at_10_max value: 35.37757708992045 - type: nauc_mrr_at_10_std value: -2.2052028139567303 - type: nauc_mrr_at_1_diff1 value: 52.23915787290734 - type: nauc_mrr_at_1_max value: 34.393531787632334 - type: nauc_mrr_at_1_std value: -1.452007661016969 - type: nauc_mrr_at_20_diff1 value: 48.91168438018404 - type: nauc_mrr_at_20_max value: 35.478962544421876 - type: nauc_mrr_at_20_std value: -1.8246048423555414 - type: nauc_mrr_at_3_diff1 value: 50.115432665442164 - type: nauc_mrr_at_3_max value: 35.89093796085569 - type: nauc_mrr_at_3_std value: -1.4895016313153366 - type: nauc_mrr_at_5_diff1 value: 49.04321261351915 - type: nauc_mrr_at_5_max value: 35.85730520949451 - type: nauc_mrr_at_5_std value: -1.68790556880753 - type: nauc_ndcg_at_1000_diff1 value: 48.294697499154374 - type: nauc_ndcg_at_1000_max value: 35.167410242367595 - type: nauc_ndcg_at_1000_std value: -0.6346078535914157 - type: nauc_ndcg_at_100_diff1 value: 48.025525283449014 - type: nauc_ndcg_at_100_max value: 34.79288511776105 - type: nauc_ndcg_at_100_std value: -0.7823403044086993 - type: nauc_ndcg_at_10_diff1 value: 47.70793258015258 - type: nauc_ndcg_at_10_max value: 33.09558927880104 - type: nauc_ndcg_at_10_std value: -4.7793864166260605 - type: nauc_ndcg_at_1_diff1 value: 52.23915787290734 - type: nauc_ndcg_at_1_max value: 34.393531787632334 - type: nauc_ndcg_at_1_std value: -1.452007661016969 - type: nauc_ndcg_at_20_diff1 value: 47.354286045074815 - type: nauc_ndcg_at_20_max value: 33.686648806027975 - type: nauc_ndcg_at_20_std value: -3.0189085132476556 - type: nauc_ndcg_at_3_diff1 value: 49.68805334316908 - type: nauc_ndcg_at_3_max value: 34.196077748056496 - type: nauc_ndcg_at_3_std value: -2.7167289163768436 - type: nauc_ndcg_at_5_diff1 value: 47.94474868912989 - type: nauc_ndcg_at_5_max value: 34.00261603413051 - type: nauc_ndcg_at_5_std value: -3.3541028103046115 - type: nauc_precision_at_1000_diff1 value: -12.0150100710755 - type: nauc_precision_at_1000_max value: 5.332942816568796 - type: nauc_precision_at_1000_std value: 14.543288479130458 - type: nauc_precision_at_100_diff1 value: -4.920332181588838 - type: nauc_precision_at_100_max value: 14.42313332017491 - type: nauc_precision_at_100_std value: 17.821953321018384 - type: nauc_precision_at_10_diff1 value: 14.70509089079217 - type: nauc_precision_at_10_max value: 25.381887131649716 - type: nauc_precision_at_10_std value: 5.226419288645675 - type: nauc_precision_at_1_diff1 value: 52.23915787290734 - type: nauc_precision_at_1_max value: 34.393531787632334 - type: nauc_precision_at_1_std value: -1.452007661016969 - type: nauc_precision_at_20_diff1 value: 6.312827641507564 - type: nauc_precision_at_20_max value: 22.483038562271933 - type: nauc_precision_at_20_std value: 11.368419856892416 - type: nauc_precision_at_3_diff1 value: 33.271443420273606 - type: 
nauc_precision_at_3_max value: 33.571078182106675 - type: nauc_precision_at_3_std value: 4.47382265155717 - type: nauc_precision_at_5_diff1 value: 23.43287104284656 - type: nauc_precision_at_5_max value: 30.909085068105313 - type: nauc_precision_at_5_std value: 5.545672049452433 - type: nauc_recall_at_1000_diff1 value: 35.22615594677707 - type: nauc_recall_at_1000_max value: 52.0710533173532 - type: nauc_recall_at_1000_std value: 45.17683523786464 - type: nauc_recall_at_100_diff1 value: 36.2169056956332 - type: nauc_recall_at_100_max value: 35.02435003210817 - type: nauc_recall_at_100_std value: 15.833632946282508 - type: nauc_recall_at_10_diff1 value: 39.12440292974848 - type: nauc_recall_at_10_max value: 28.0546011979648 - type: nauc_recall_at_10_std value: -9.620558638092172 - type: nauc_recall_at_1_diff1 value: 53.98646767288695 - type: nauc_recall_at_1_max value: 29.45629077638936 - type: nauc_recall_at_1_std value: -5.621187380771589 - type: nauc_recall_at_20_diff1 value: 36.39254630768161 - type: nauc_recall_at_20_max value: 29.277856508751967 - type: nauc_recall_at_20_std value: -3.048007490798412 - type: nauc_recall_at_3_diff1 value: 45.64706642644958 - type: nauc_recall_at_3_max value: 31.003050159737413 - type: nauc_recall_at_3_std value: -4.849763876930667 - type: nauc_recall_at_5_diff1 value: 40.918108859971746 - type: nauc_recall_at_5_max value: 30.69907335071493 - type: nauc_recall_at_5_std value: -6.1445436251916865 - type: ndcg_at_1 value: 40.327 - type: ndcg_at_10 value: 49.675000000000004 - type: ndcg_at_100 value: 55.364000000000004 - type: ndcg_at_1000 value: 56.992 - type: ndcg_at_20 value: 51.803999999999995 - type: ndcg_at_3 value: 45.227000000000004 - type: ndcg_at_5 value: 47.244 - type: precision_at_1 value: 40.327 - type: precision_at_10 value: 8.826 - type: precision_at_100 value: 1.354 - type: precision_at_1000 value: 0.167 - type: precision_at_20 value: 5.115 - type: precision_at_3 value: 21.303 - type: precision_at_5 value: 14.726 - type: recall_at_1 value: 33.178999999999995 - type: recall_at_10 value: 61.087 - type: recall_at_100 value: 85.099 - type: recall_at_1000 value: 95.14099999999999 - type: recall_at_20 value: 68.623 - type: recall_at_3 value: 48.245 - type: recall_at_5 value: 53.832 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: mteb/cqadupstack-programmers config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: main_score value: 44.99 - type: map_at_1 value: 28.089 - type: map_at_10 value: 38.98 - type: map_at_100 value: 40.339000000000006 - type: map_at_1000 value: 40.441 - type: map_at_20 value: 39.702 - type: map_at_3 value: 35.620000000000005 - type: map_at_5 value: 37.657000000000004 - type: mrr_at_1 value: 35.15981735159817 - type: mrr_at_10 value: 44.54075161266937 - type: mrr_at_100 value: 45.435730392436646 - type: mrr_at_1000 value: 45.47673849356812 - type: mrr_at_20 value: 45.05949613726918 - type: mrr_at_3 value: 42.00913242009131 - type: mrr_at_5 value: 43.52739726027392 - type: nauc_map_at_1000_diff1 value: 42.6375513442399 - type: nauc_map_at_1000_max value: 35.83899956589522 - type: nauc_map_at_1000_std value: 5.798620017712549 - type: nauc_map_at_100_diff1 value: 42.609712253881504 - type: nauc_map_at_100_max value: 35.85401871065736 - type: nauc_map_at_100_std value: 5.829007296755533 - type: nauc_map_at_10_diff1 value: 42.90931172127824 - type: nauc_map_at_10_max value: 35.46694204511423 - type: nauc_map_at_10_std value: 5.131477704152026 - type: 
nauc_map_at_1_diff1 value: 48.066312177855956 - type: nauc_map_at_1_max value: 30.67745267941573 - type: nauc_map_at_1_std value: -1.4170737991670943 - type: nauc_map_at_20_diff1 value: 42.730423700784 - type: nauc_map_at_20_max value: 35.710039616497085 - type: nauc_map_at_20_std value: 5.363961887475162 - type: nauc_map_at_3_diff1 value: 43.499223646579935 - type: nauc_map_at_3_max value: 33.872570039621564 - type: nauc_map_at_3_std value: 3.0787571843453008 - type: nauc_map_at_5_diff1 value: 43.28963642946521 - type: nauc_map_at_5_max value: 35.18327408279892 - type: nauc_map_at_5_std value: 4.516467154662473 - type: nauc_mrr_at_1000_diff1 value: 42.71279871641341 - type: nauc_mrr_at_1000_max value: 37.48825064817496 - type: nauc_mrr_at_1000_std value: 8.10015025024314 - type: nauc_mrr_at_100_diff1 value: 42.694777404773376 - type: nauc_mrr_at_100_max value: 37.476741768741086 - type: nauc_mrr_at_100_std value: 8.11525130417229 - type: nauc_mrr_at_10_diff1 value: 42.954194054560176 - type: nauc_mrr_at_10_max value: 37.606138578797506 - type: nauc_mrr_at_10_std value: 8.092519513302399 - type: nauc_mrr_at_1_diff1 value: 48.350790286038574 - type: nauc_mrr_at_1_max value: 33.97992759739641 - type: nauc_mrr_at_1_std value: 1.8332987018664093 - type: nauc_mrr_at_20_diff1 value: 42.664983701783044 - type: nauc_mrr_at_20_max value: 37.47450702110784 - type: nauc_mrr_at_20_std value: 8.001067634745462 - type: nauc_mrr_at_3_diff1 value: 42.921968602737955 - type: nauc_mrr_at_3_max value: 37.19599728791262 - type: nauc_mrr_at_3_std value: 7.4692697422507575 - type: nauc_mrr_at_5_diff1 value: 42.96028546491891 - type: nauc_mrr_at_5_max value: 37.688350071295915 - type: nauc_mrr_at_5_std value: 8.213017954012372 - type: nauc_ndcg_at_1000_diff1 value: 40.70763263942397 - type: nauc_ndcg_at_1000_max value: 37.87768319167602 - type: nauc_ndcg_at_1000_std value: 9.908807071686738 - type: nauc_ndcg_at_100_diff1 value: 39.97828438221707 - type: nauc_ndcg_at_100_max value: 37.7723393835996 - type: nauc_ndcg_at_100_std value: 10.666779466040097 - type: nauc_ndcg_at_10_diff1 value: 41.172233451172936 - type: nauc_ndcg_at_10_max value: 37.12252131573939 - type: nauc_ndcg_at_10_std value: 8.273798754436639 - type: nauc_ndcg_at_1_diff1 value: 48.350790286038574 - type: nauc_ndcg_at_1_max value: 33.97992759739641 - type: nauc_ndcg_at_1_std value: 1.8332987018664093 - type: nauc_ndcg_at_20_diff1 value: 40.33325895172716 - type: nauc_ndcg_at_20_max value: 37.36015594019951 - type: nauc_ndcg_at_20_std value: 8.818556108749302 - type: nauc_ndcg_at_3_diff1 value: 41.652701699747254 - type: nauc_ndcg_at_3_max value: 35.499109874223294 - type: nauc_ndcg_at_3_std value: 5.831784865606119 - type: nauc_ndcg_at_5_diff1 value: 41.856346892595475 - type: nauc_ndcg_at_5_max value: 36.940681835687194 - type: nauc_ndcg_at_5_std value: 7.507798515093516 - type: nauc_precision_at_1000_diff1 value: -2.4605367806784866 - type: nauc_precision_at_1000_max value: -0.3538142127162922 - type: nauc_precision_at_1000_std value: 8.369794961833236 - type: nauc_precision_at_100_diff1 value: -0.34954522096524704 - type: nauc_precision_at_100_max value: 13.159909603146458 - type: nauc_precision_at_100_std value: 19.425561514133996 - type: nauc_precision_at_10_diff1 value: 17.048304710148145 - type: nauc_precision_at_10_max value: 29.816041846806375 - type: nauc_precision_at_10_std value: 18.358893367243798 - type: nauc_precision_at_1_diff1 value: 48.350790286038574 - type: nauc_precision_at_1_max value: 33.97992759739641 - type: 
nauc_precision_at_1_std value: 1.8332987018664093 - type: nauc_precision_at_20_diff1 value: 10.450903599411344 - type: nauc_precision_at_20_max value: 25.228916373799127 - type: nauc_precision_at_20_std value: 18.46893569529936 - type: nauc_precision_at_3_diff1 value: 29.181236567048636 - type: nauc_precision_at_3_max value: 35.64918262500281 - type: nauc_precision_at_3_std value: 13.347538222514968 - type: nauc_precision_at_5_diff1 value: 23.693323840550345 - type: nauc_precision_at_5_max value: 33.972399735191225 - type: nauc_precision_at_5_std value: 17.107012760554618 - type: nauc_recall_at_1000_diff1 value: 20.297340483227945 - type: nauc_recall_at_1000_max value: 63.084305970127275 - type: nauc_recall_at_1000_std value: 63.04655000858784 - type: nauc_recall_at_100_diff1 value: 22.587332148979723 - type: nauc_recall_at_100_max value: 40.740968468024775 - type: nauc_recall_at_100_std value: 34.120423684507124 - type: nauc_recall_at_10_diff1 value: 33.361195948673675 - type: nauc_recall_at_10_max value: 37.1411402410262 - type: nauc_recall_at_10_std value: 13.475407196166259 - type: nauc_recall_at_1_diff1 value: 48.066312177855956 - type: nauc_recall_at_1_max value: 30.67745267941573 - type: nauc_recall_at_1_std value: -1.4170737991670943 - type: nauc_recall_at_20_diff1 value: 28.703982984383984 - type: nauc_recall_at_20_max value: 37.32929431193496 - type: nauc_recall_at_20_std value: 16.139135347989903 - type: nauc_recall_at_3_diff1 value: 36.53346179134789 - type: nauc_recall_at_3_max value: 34.11397914899309 - type: nauc_recall_at_3_std value: 7.19358019807132 - type: nauc_recall_at_5_diff1 value: 36.24058894947452 - type: nauc_recall_at_5_max value: 37.00990358651097 - type: nauc_recall_at_5_std value: 11.074645476821619 - type: ndcg_at_1 value: 35.160000000000004 - type: ndcg_at_10 value: 44.99 - type: ndcg_at_100 value: 50.661 - type: ndcg_at_1000 value: 52.599 - type: ndcg_at_20 value: 47.154 - type: ndcg_at_3 value: 39.843 - type: ndcg_at_5 value: 42.486000000000004 - type: precision_at_1 value: 35.160000000000004 - type: precision_at_10 value: 8.299 - type: precision_at_100 value: 1.2850000000000001 - type: precision_at_1000 value: 0.16199999999999998 - type: precision_at_20 value: 4.84 - type: precision_at_3 value: 19.178 - type: precision_at_5 value: 13.927 - type: recall_at_1 value: 28.089 - type: recall_at_10 value: 57.158 - type: recall_at_100 value: 81.461 - type: recall_at_1000 value: 94.46900000000001 - type: recall_at_20 value: 64.927 - type: recall_at_3 value: 42.775999999999996 - type: recall_at_5 value: 49.719 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: CQADupstackRetrieval is a combined dataset metrics: - type: main_score value: 44.989166666666655 - type: ndcg_at_10 value: 44.989166666666655 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: mteb/cqadupstack-stats config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: main_score value: 39.586 - type: map_at_1 value: 27.301 - type: map_at_10 value: 35.022 - type: map_at_100 value: 36.061 - type: map_at_1000 value: 36.146 - type: map_at_20 value: 35.608000000000004 - type: map_at_3 value: 32.978 - type: map_at_5 value: 33.994 - type: mrr_at_1 value: 30.67484662576687 - type: mrr_at_10 value: 38.1696124257474 - type: mrr_at_100 value: 38.99730898994137 - type: mrr_at_1000 value: 39.049871007408136 - type: mrr_at_20 value: 38.62424051396064 - type: mrr_at_3 value: 
36.40081799591004 - type: mrr_at_5 value: 37.23670756646219 - type: nauc_map_at_1000_diff1 value: 50.4395097150819 - type: nauc_map_at_1000_max value: 42.36231476768413 - type: nauc_map_at_1000_std value: 1.0739414045485742 - type: nauc_map_at_100_diff1 value: 50.4253775421283 - type: nauc_map_at_100_max value: 42.34508969348633 - type: nauc_map_at_100_std value: 1.0590256535050135 - type: nauc_map_at_10_diff1 value: 50.74196619464362 - type: nauc_map_at_10_max value: 42.354326434590284 - type: nauc_map_at_10_std value: 0.6330167542705694 - type: nauc_map_at_1_diff1 value: 55.7404810490963 - type: nauc_map_at_1_max value: 40.7676941648045 - type: nauc_map_at_1_std value: -5.021772566610674 - type: nauc_map_at_20_diff1 value: 50.39792463598886 - type: nauc_map_at_20_max value: 42.25768760228577 - type: nauc_map_at_20_std value: 0.8979017700131807 - type: nauc_map_at_3_diff1 value: 51.53267996170815 - type: nauc_map_at_3_max value: 41.78801756883417 - type: nauc_map_at_3_std value: -0.6652383024396911 - type: nauc_map_at_5_diff1 value: 50.992783683271504 - type: nauc_map_at_5_max value: 41.8607977828188 - type: nauc_map_at_5_std value: 0.3484379897869807 - type: nauc_mrr_at_1000_diff1 value: 48.952907124445126 - type: nauc_mrr_at_1000_max value: 42.93563741482114 - type: nauc_mrr_at_1000_std value: 3.0791495753556424 - type: nauc_mrr_at_100_diff1 value: 48.941921107360805 - type: nauc_mrr_at_100_max value: 42.94419657374061 - type: nauc_mrr_at_100_std value: 3.075397087180154 - type: nauc_mrr_at_10_diff1 value: 49.098926306303056 - type: nauc_mrr_at_10_max value: 42.941857820499806 - type: nauc_mrr_at_10_std value: 2.8184474174054372 - type: nauc_mrr_at_1_diff1 value: 54.428109877009334 - type: nauc_mrr_at_1_max value: 42.50273386972492 - type: nauc_mrr_at_1_std value: -2.1811826216412187 - type: nauc_mrr_at_20_diff1 value: 48.82502192775839 - type: nauc_mrr_at_20_max value: 42.92227277257095 - type: nauc_mrr_at_20_std value: 2.975812634368533 - type: nauc_mrr_at_3_diff1 value: 49.440009227591176 - type: nauc_mrr_at_3_max value: 42.95503176290712 - type: nauc_mrr_at_3_std value: 2.2997128945013796 - type: nauc_mrr_at_5_diff1 value: 49.09846782701398 - type: nauc_mrr_at_5_max value: 42.51449168285772 - type: nauc_mrr_at_5_std value: 2.7785816484421297 - type: nauc_ndcg_at_1000_diff1 value: 48.14680758187888 - type: nauc_ndcg_at_1000_max value: 43.57465718500695 - type: nauc_ndcg_at_1000_std value: 5.287435676678261 - type: nauc_ndcg_at_100_diff1 value: 47.66081605743284 - type: nauc_ndcg_at_100_max value: 43.28156751251163 - type: nauc_ndcg_at_100_std value: 4.959626409663624 - type: nauc_ndcg_at_10_diff1 value: 48.25075619623878 - type: nauc_ndcg_at_10_max value: 43.00688660666578 - type: nauc_ndcg_at_10_std value: 3.2319193368891637 - type: nauc_ndcg_at_1_diff1 value: 54.428109877009334 - type: nauc_ndcg_at_1_max value: 42.50273386972492 - type: nauc_ndcg_at_1_std value: -2.1811826216412187 - type: nauc_ndcg_at_20_diff1 value: 47.1943098627403 - type: nauc_ndcg_at_20_max value: 42.86954491768707 - type: nauc_ndcg_at_20_std value: 4.08583080150737 - type: nauc_ndcg_at_3_diff1 value: 49.32681523192246 - type: nauc_ndcg_at_3_max value: 42.46898641470274 - type: nauc_ndcg_at_3_std value: 1.7416962407725236 - type: nauc_ndcg_at_5_diff1 value: 48.59647012439291 - type: nauc_ndcg_at_5_max value: 42.07098889846439 - type: nauc_ndcg_at_5_std value: 2.979621233356828 - type: nauc_precision_at_1000_diff1 value: -1.7366334161587105 - type: nauc_precision_at_1000_max value: 17.70969166396819 - type: 
nauc_precision_at_1000_std value: 17.50619975322144 - type: nauc_precision_at_100_diff1 value: 10.082579982582155 - type: nauc_precision_at_100_max value: 28.024893516091776 - type: nauc_precision_at_100_std value: 18.41413013357596 - type: nauc_precision_at_10_diff1 value: 28.796167732373657 - type: nauc_precision_at_10_max value: 40.37340024485382 - type: nauc_precision_at_10_std value: 13.718572711091733 - type: nauc_precision_at_1_diff1 value: 54.428109877009334 - type: nauc_precision_at_1_max value: 42.50273386972492 - type: nauc_precision_at_1_std value: -2.1811826216412187 - type: nauc_precision_at_20_diff1 value: 19.82691920771315 - type: nauc_precision_at_20_max value: 34.45075390159975 - type: nauc_precision_at_20_std value: 16.410812072348058 - type: nauc_precision_at_3_diff1 value: 40.85430254962678 - type: nauc_precision_at_3_max value: 43.63016056067074 - type: nauc_precision_at_3_std value: 9.322014634477581 - type: nauc_precision_at_5_diff1 value: 35.830272848975795 - type: nauc_precision_at_5_max value: 41.30047691620363 - type: nauc_precision_at_5_std value: 13.145693992266565 - type: nauc_recall_at_1000_diff1 value: 35.532000545890504 - type: nauc_recall_at_1000_max value: 50.714223194510325 - type: nauc_recall_at_1000_std value: 43.09037309139045 - type: nauc_recall_at_100_diff1 value: 35.11024488875192 - type: nauc_recall_at_100_max value: 43.0874566265193 - type: nauc_recall_at_100_std value: 19.70628521846854 - type: nauc_recall_at_10_diff1 value: 40.36203726741153 - type: nauc_recall_at_10_max value: 42.581482582576726 - type: nauc_recall_at_10_std value: 8.642553371022348 - type: nauc_recall_at_1_diff1 value: 55.7404810490963 - type: nauc_recall_at_1_max value: 40.7676941648045 - type: nauc_recall_at_1_std value: -5.021772566610674 - type: nauc_recall_at_20_diff1 value: 35.97348868186562 - type: nauc_recall_at_20_max value: 41.82695933305065 - type: nauc_recall_at_20_std value: 11.444957541593585 - type: nauc_recall_at_3_diff1 value: 44.20020470014979 - type: nauc_recall_at_3_max value: 40.84130855296979 - type: nauc_recall_at_3_std value: 5.004883338558809 - type: nauc_recall_at_5_diff1 value: 42.08756885472078 - type: nauc_recall_at_5_max value: 39.90323783606852 - type: nauc_recall_at_5_std value: 8.085182534171127 - type: ndcg_at_1 value: 30.675 - type: ndcg_at_10 value: 39.586 - type: ndcg_at_100 value: 44.737 - type: ndcg_at_1000 value: 46.863 - type: ndcg_at_20 value: 41.495 - type: ndcg_at_3 value: 35.8 - type: ndcg_at_5 value: 37.3 - type: precision_at_1 value: 30.675 - type: precision_at_10 value: 6.196 - type: precision_at_100 value: 0.9570000000000001 - type: precision_at_1000 value: 0.122 - type: precision_at_20 value: 3.6350000000000002 - type: precision_at_3 value: 15.337 - type: precision_at_5 value: 10.337 - type: recall_at_1 value: 27.301 - type: recall_at_10 value: 50.346999999999994 - type: recall_at_100 value: 74.459 - type: recall_at_1000 value: 90.018 - type: recall_at_20 value: 57.473 - type: recall_at_3 value: 39.672000000000004 - type: recall_at_5 value: 43.383 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: mteb/cqadupstack-tex config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: main_score value: 32.842 - type: map_at_1 value: 19.527 - type: map_at_10 value: 27.711999999999996 - type: map_at_100 value: 28.98 - type: map_at_1000 value: 29.108 - type: map_at_20 value: 28.407 - type: map_at_3 value: 25.023 - type: map_at_5 value: 26.528000000000002 - type: mrr_at_1 value: 
      23.675154852030282
    - type: mrr_at_10
      value: 31.810676323752784
    - type: mrr_at_100
      value: 32.788970614380716
    - type: mrr_at_1000
      value: 32.86028758975889
    - type: mrr_at_20
      value: 32.35935756676056
    - type: mrr_at_3
      value: 29.41615049323246
    - type: mrr_at_5
      value: 30.785730672172633
    - type: nauc_map_at_1000_diff1
      value: 35.597766688968015
    - type: nauc_map_at_1000_max
      value: 26.295790183159845
    - type: nauc_map_at_1000_std
      value: -0.04229904865958209
    - type: nauc_map_at_100_diff1
      value: 35.568782622469925
    - type: nauc_map_at_100_max
      value: 26.27850795471227
    - type: nauc_map_at_100_std
      value: -0.04944875782811099
    - type: nauc_map_at_10_diff1
      value: 35.63760937893694
    - type: nauc_map_at_10_max
      value: 26.130094042028233
    - type: nauc_map_at_10_std
      value: -0.6896882769027717
    - type: nauc_map_at_1_diff1
      value: 41.759098341890976
    - type: nauc_map_at_1_max
      value: 23.918885427783326
    - type: nauc_map_at_1_std
      value: -2.1383574897865074
    - type: nauc_map_at_20_diff1
      value: 35.55706530442612
    - type: nauc_map_at_20_max
      value: 26.23339626569677
    - type: nauc_map_at_20_std
      value: -0.162172033918129
    - type: nauc_map_at_3_diff1
      value: 37.22183376355153
    - type: nauc_map_at_3_max
      value: 25.770512522122186
    - type: nauc_map_at_3_std
      value: -1.3105892187778403
    - type: nauc_map_at_5_diff1
      value: 36.205913161663084
    - type: nauc_map_at_5_max
      value: 25.953300641502064
    - type: nauc_map_at_5_std
      value: -0.7987363137547906
    - type: nauc_mrr_at_1000_diff1
      value: 34.864016559617646
    - type: nauc_mrr_at_1000_max
      value: 26.8689525348564
    - type: nauc_mrr_at_1000_std
      value: -0.5839923973914446
    - type: nauc_mrr_at_100_diff1
      value: 34.83820469598538
    - type: nauc_mrr_at_100_max
      value: 26.864669056231282
    - type: nauc_mrr_at_100_std
      value: -0.5785645654158633
    - type: nauc_mrr_at_10_diff1
      value: 34.81868397381981
    - type: nauc_mrr_at_10_max
      value: 26.79988560460627
    - type: nauc_mrr_at_10_std
      value: -1.1113808365827318
    - type: nauc_mrr_at_1_diff1
      value: 40.0281507903504
    - type: nauc_mrr_at_1_max
      value: 25.036735941806583
    - type: nauc_mrr_at_1_std
      value: -2.508700799268523
    - type: nauc_mrr_at_20_diff1
      value: 34.81954537357966
    - type: nauc_mrr_at_20_max
      value: 26.877673033315453
    - type: nauc_mrr_at_20_std
      value: -0.6706028107452919
    - type: nauc_mrr_at_3_diff1
      value: 35.87313782549696
    - type: nauc_mrr_at_3_max
      value: 26.776261693392335
    - type: nauc_mrr_at_3_std
      value: -1.8010591328112908
    - type: nauc_mrr_at_5_diff1
      value: 35.31673912159536
    - type: nauc_mrr_at_5_max
      value: 26.78720786106881
    - type: nauc_mrr_at_5_std
      value: -1.3096326953900546
    - type: nauc_ndcg_at_1000_diff1
      value: 33.43105244339048
    - type: nauc_ndcg_at_1000_max
      value: 27.52195065724684
    - type: nauc_ndcg_at_1000_std
      value: 2.8376056562675744
    - type: nauc_ndcg_at_100_diff1
      value: 32.90916846420573
    - type: nauc_ndcg_at_100_max
      value: 27.27161017736065
    - type: nauc_ndcg_at_100_std
      value: 2.8703122625872126
    - type: nauc_ndcg_at_10_diff1
      value: 33.12714979317447
    - type: nauc_ndcg_at_10_max
      value: 26.67762031747992
    - type: nauc_ndcg_at_10_std
      value: -0.1341345572932233
    - type: nauc_ndcg_at_1_diff1
      value: 40.0281507903504
    - type: nauc_ndcg_at_1_max
      value: 25.036735941806583
    - type: nauc_ndcg_at_1_std
      value: -2.508700799268523
    - type: nauc_ndcg_at_20_diff1
      value: 32.891656138688546
    - type: nauc_ndcg_at_20_max
      value: 26.991976404027163
    - type: nauc_ndcg_at_20_std
      value: 1.6050741106677746
    - type: nauc_ndcg_at_3_diff1
      value: 35.576958713955484
    - type: nauc_ndcg_at_3_max
      value: 26.41687745899445
    - type: nauc_ndcg_at_3_std
      value: -1.5326687067002291
    - type: nauc_ndcg_at_5_diff1
      value: 34.27335619067276
    - type: nauc_ndcg_at_5_max
      value: 26.479515412084208
    - type: nauc_ndcg_at_5_std
      value: -0.5597648935666003
    - type: nauc_precision_at_1000_diff1
      value: -0.18660914306684007
    - type: nauc_precision_at_1000_max
      value: 7.268255385799229
    - type: nauc_precision_at_1000_std
      value: -0.1968875268478991
    - type: nauc_precision_at_100_diff1
      value: 7.386701205054449
    - type: nauc_precision_at_100_max
      value: 15.477735603019607
    - type: nauc_precision_at_100_std
      value: 4.753153414679307
    - type: nauc_precision_at_10_diff1
      value: 18.4668296945938
    - type: nauc_precision_at_10_max
      value: 25.457144217779597
    - type: nauc_precision_at_10_std
      value: 0.40165373733963605
    - type: nauc_precision_at_1_diff1
      value: 40.0281507903504
    - type: nauc_precision_at_1_max
      value: 25.036735941806583
    - type: nauc_precision_at_1_std
      value: -2.508700799268523
    - type: nauc_precision_at_20_diff1
      value: 14.751135844289335
    - type: nauc_precision_at_20_max
      value: 22.763373329576293
    - type: nauc_precision_at_20_std
      value: 4.360731801761864
    - type: nauc_precision_at_3_diff1
      value: 28.154753888265393
    - type: nauc_precision_at_3_max
      value: 27.838427033527147
    - type: nauc_precision_at_3_std
      value: -1.0042621266717804
    - type: nauc_precision_at_5_diff1
      value: 23.549026872711423
    - type: nauc_precision_at_5_max
      value: 27.192214745385044
    - type: nauc_precision_at_5_std
      value: 0.4455206110174471
    - type: nauc_recall_at_1000_diff1
      value: 17.905404210815632
    - type: nauc_recall_at_1000_max
      value: 32.8674418535776
    - type: nauc_recall_at_1000_std
      value: 35.187050415735435
    - type: nauc_recall_at_100_diff1
      value: 20.903609751984757
    - type: nauc_recall_at_100_max
      value: 27.180306691518364
    - type: nauc_recall_at_100_std
      value: 17.553030959393297
    - type: nauc_recall_at_10_diff1
      value: 25.615147693464387
    - type: nauc_recall_at_10_max
      value: 25.97062699453565
    - type: nauc_recall_at_10_std
      value: 2.2181702899826576
    - type: nauc_recall_at_1_diff1
      value: 41.759098341890976
    - type: nauc_recall_at_1_max
      value: 23.918885427783326
    - type: nauc_recall_at_1_std
      value: -2.1383574897865074
    - type: nauc_recall_at_20_diff1
      value: 23.922775940094386
    - type: nauc_recall_at_20_max
      value: 26.384627814902785
    - type: nauc_recall_at_20_std
      value: 7.944532403561578
    - type: nauc_recall_at_3_diff1
      value: 32.26543270634743
    - type: nauc_recall_at_3_max
      value: 26.36357710828272
    - type: nauc_recall_at_3_std
      value: -0.42723331708340706
    - type: nauc_recall_at_5_diff1
      value: 29.080464141763336
    - type: nauc_recall_at_5_max
      value: 25.81238438303652
    - type: nauc_recall_at_5_std
      value: 1.1649311168287726
    - type: ndcg_at_1
      value: 23.674999999999997
    - type: ndcg_at_10
      value: 32.842
    - type: ndcg_at_100
      value: 38.64
    - type: ndcg_at_1000
      value: 41.367
    - type: ndcg_at_20
      value: 35.032999999999994
    - type: ndcg_at_3
      value: 28.166000000000004
    - type: ndcg_at_5
      value: 30.407
    - type: precision_at_1
      value: 23.674999999999997
    - type: precision_at_10
      value: 6.005
    - type: precision_at_100
      value: 1.053
    - type: precision_at_1000
      value: 0.146
    - type: precision_at_20
      value: 3.6580000000000004
    - type: precision_at_3
      value: 13.352
    - type: precision_at_5
      value: 9.718
    - type: recall_at_1
      value: 19.527
    - type: recall_at_10
      value: 44.096999999999994
    - type: recall_at_100
      value: 69.962
    - type: recall_at_1000
      value: 89.035
    - type: recall_at_20
      value: 52.166000000000004
    - type: recall_at_3
      value: 30.946
    - type: recall_at_5
      value: 36.789
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackUnixRetrieval
      type: mteb/cqadupstack-unix
      config: default
      split: test
      revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
    metrics:
    - type: main_score
      value: 46.54
    - type: map_at_1
      value: 29.953999999999997
    - type: map_at_10
      value: 40.742
    - type: map_at_100
      value: 41.964
    - type: map_at_1000
      value: 42.059999999999995
    - type: map_at_20
      value: 41.426
    - type: map_at_3
      value: 37.378
    - type: map_at_5
      value: 39.267
    - type: mrr_at_1
      value: 34.701492537313435
    - type: mrr_at_10
      value: 44.29978085761664
    - type: mrr_at_100
      value: 45.205551401915486
    - type: mrr_at_1000
      value: 45.24735017384963
    - type: mrr_at_20
      value: 44.85338423755729
    - type: mrr_at_3
      value: 41.57338308457707
    - type: mrr_at_5
      value: 43.19185323383077
    - type: nauc_map_at_1000_diff1
      value: 48.45170522932164
    - type: nauc_map_at_1000_max
      value: 31.544164363591204
    - type: nauc_map_at_1000_std
      value: 0.8661088818146858
    - type: nauc_map_at_100_diff1
      value: 48.47347800061323
    - type: nauc_map_at_100_max
      value: 31.568637596620313
    - type: nauc_map_at_100_std
      value: 0.9252699336843858
    - type: nauc_map_at_10_diff1
      value: 48.64849891585432
    - type: nauc_map_at_10_max
      value: 31.40371265579746
    - type: nauc_map_at_10_std
      value: 0.7088016563713089
    - type: nauc_map_at_1_diff1
      value: 53.57918993108331
    - type: nauc_map_at_1_max
      value: 31.392632653740993
    - type: nauc_map_at_1_std
      value: -2.857306170463933
    - type: nauc_map_at_20_diff1
      value: 48.49084353023969
    - type: nauc_map_at_20_max
      value: 31.470313174779374
    - type: nauc_map_at_20_std
      value: 0.8950296035234309
    - type: nauc_map_at_3_diff1
      value: 49.273481161619806
    - type: nauc_map_at_3_max
      value: 31.101471509782826
    - type: nauc_map_at_3_std
      value: -0.886510096257905
    - type: nauc_map_at_5_diff1
      value: 48.85344288229106
    - type: nauc_map_at_5_max
      value: 31.32633663238284
    - type: nauc_map_at_5_std
      value: -0.44752909698881177
    - type: nauc_mrr_at_1000_diff1
      value: 46.27593166906613
    - type: nauc_mrr_at_1000_max
      value: 31.637594372116336
    - type: nauc_mrr_at_1000_std
      value: 0.8444917550670064
    - type: nauc_mrr_at_100_diff1
      value: 46.27161543033672
    - type: nauc_mrr_at_100_max
      value: 31.64330655339695
    - type: nauc_mrr_at_100_std
      value: 0.8717446416398773
    - type: nauc_mrr_at_10_diff1
      value: 46.100348481312864
    - type: nauc_mrr_at_10_max
      value: 31.594271897882237
    - type: nauc_mrr_at_10_std
      value: 0.8807168907688873
    - type: nauc_mrr_at_1_diff1
      value: 51.35163098909763
    - type: nauc_mrr_at_1_max
      value: 31.99084441327899
    - type: nauc_mrr_at_1_std
      value: -2.688594880742662
    - type: nauc_mrr_at_20_diff1
      value: 46.18178546174727
    - type: nauc_mrr_at_20_max
      value: 31.639111674119448
    - type: nauc_mrr_at_20_std
      value: 0.9855008641374622
    - type: nauc_mrr_at_3_diff1
      value: 46.307484835305864
    - type: nauc_mrr_at_3_max
      value: 31.35563850804847
    - type: nauc_mrr_at_3_std
      value: -0.3419536587707561
    - type: nauc_mrr_at_5_diff1
      value: 46.17646418781234
    - type: nauc_mrr_at_5_max
      value: 31.313474270239833
    - type: nauc_mrr_at_5_std
      value: -0.08656550526568331
    - type: nauc_ndcg_at_1000_diff1
      value: 46.12095795101613
    - type: nauc_ndcg_at_1000_max
      value: 31.989083597726314
    - type: nauc_ndcg_at_1000_std
      value: 3.2965704707660763
    - type: nauc_ndcg_at_100_diff1
      value: 46.05376249841318
    - type: nauc_ndcg_at_100_max
      value: 32.39195988574972
    - type: nauc_ndcg_at_100_std
      value: 4.518018135593347
    - type: nauc_ndcg_at_10_diff1
      value: 46.133631183744875
    - type: nauc_ndcg_at_10_max
      value: 31.45358876172339
    - type: nauc_ndcg_at_10_std
      value: 3.4254370918871055
    - type: nauc_ndcg_at_1_diff1
      value: 51.35163098909763
    - type: nauc_ndcg_at_1_max
      value: 31.99084441327899
    - type: nauc_ndcg_at_1_std
      value: -2.688594880742662
    - type: nauc_ndcg_at_20_diff1
      value: 45.94584949766954
    - type: nauc_ndcg_at_20_max
      value: 31.689777515111295
    - type: nauc_ndcg_at_20_std
      value: 4.189082428922442
    - type: nauc_ndcg_at_3_diff1
      value: 46.5057835389752
    - type: nauc_ndcg_at_3_max
      value: 30.941407592082047
    - type: nauc_ndcg_at_3_std
      value: -0.042473944857831535
    - type: nauc_ndcg_at_5_diff1
      value: 46.369027395136136
    - type: nauc_ndcg_at_5_max
      value: 31.057841776505352
    - type: nauc_ndcg_at_5_std
      value: 0.6878993420489522
    - type: nauc_precision_at_1000_diff1
      value: -17.30759714093202
    - type: nauc_precision_at_1000_max
      value: -4.441155558458858
    - type: nauc_precision_at_1000_std
      value: 1.5537300718220326
    - type: nauc_precision_at_100_diff1
      value: -7.18920438222021
    - type: nauc_precision_at_100_max
      value: 8.017878121399253
    - type: nauc_precision_at_100_std
      value: 11.357132919349102
    - type: nauc_precision_at_10_diff1
      value: 15.202451884794076
    - type: nauc_precision_at_10_max
      value: 19.077295902881417
    - type: nauc_precision_at_10_std
      value: 9.885526867355805
    - type: nauc_precision_at_1_diff1
      value: 51.35163098909763
    - type: nauc_precision_at_1_max
      value: 31.99084441327899
    - type: nauc_precision_at_1_std
      value: -2.688594880742662
    - type: nauc_precision_at_20_diff1
      value: 6.827461091494899
    - type: nauc_precision_at_20_max
      value: 15.27268633497114
    - type: nauc_precision_at_20_std
      value: 11.515826649647384
    - type: nauc_precision_at_3_diff1
      value: 31.043021807472027
    - type: nauc_precision_at_3_max
      value: 26.22457157531548
    - type: nauc_precision_at_3_std
      value: 1.788215968301994
    - type: nauc_precision_at_5_diff1
      value: 25.030185818513235
    - type: nauc_precision_at_5_max
      value: 23.680129160901537
    - type: nauc_precision_at_5_std
      value: 4.303018899688115
    - type: nauc_recall_at_1000_diff1
      value: 28.68826642607512
    - type: nauc_recall_at_1000_max
      value: 42.33849804103852
    - type: nauc_recall_at_1000_std
      value: 42.67413575876864
    - type: nauc_recall_at_100_diff1
      value: 36.51494878715
    - type: nauc_recall_at_100_max
      value: 37.4764995034434
    - type: nauc_recall_at_100_std
      value: 28.295671266661017
    - type: nauc_recall_at_10_diff1
      value: 39.416721111463524
    - type: nauc_recall_at_10_max
      value: 29.95985608454179
    - type: nauc_recall_at_10_std
      value: 12.423335839786201
    - type: nauc_recall_at_1_diff1
      value: 53.57918993108331
    - type: nauc_recall_at_1_max
      value: 31.392632653740993
    - type: nauc_recall_at_1_std
      value: -2.857306170463933
    - type: nauc_recall_at_20_diff1
      value: 38.228803480194046
    - type: nauc_recall_at_20_max
      value: 30.87261362975955
    - type: nauc_recall_at_20_std
      value: 16.977113091834095
    - type: nauc_recall_at_3_diff1
      value: 43.154348566653155
    - type: nauc_recall_at_3_max
      value: 29.54536633744803
    - type: nauc_recall_at_3_std
      value: 2.02842672250621
    - type: nauc_recall_at_5_diff1
      value: 41.00436246072242
    - type: nauc_recall_at_5_max
      value: 29.413569555348023
    - type: nauc_recall_at_5_std
      value: 3.845214021958289
    - type: ndcg_at_1
      value: 34.701
    - type: ndcg_at_10
      value: 46.54
    - type: ndcg_at_100
      value: 51.754999999999995
    - type: ndcg_at_1000
      value: 53.71
    - type: ndcg_at_20
      value: 48.679
    - type: ndcg_at_3
      value: 40.892
    - type: ndcg_at_5
      value: 43.595
    - type: precision_at_1
      value: 34.701
    - type: precision_at_10
      value: 8.004
    - type: precision_at_100
      value: 1.185
    - type: precision_at_1000
      value: 0.145
    - type: precision_at_20
      value: 4.632
    - type: precision_at_3
      value: 18.719
    - type: precision_at_5
      value: 13.245999999999999
    - type: recall_at_1
      value: 29.953999999999997
    - type: recall_at_10
      value: 60.246
    - type: recall_at_100
      value: 82.128
    - type: recall_at_1000
      value: 95.622
    - type: recall_at_20
      value: 67.756
    - type: recall_at_3
      value: 45.096000000000004
    - type: recall_at_5
      value: 51.9
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackWebmastersRetrieval
      type: mteb/cqadupstack-webmasters
      config: default
      split: test
      revision: 160c094312a0e1facb97e55eeddb698c0abe3571
    metrics:
    - type: main_score
      value: 44.718999999999994
    - type: map_at_1
      value: 28.383999999999997
    - type: map_at_10
      value: 38.422
    - type: map_at_100
      value: 40.058
    - type: map_at_1000
      value: 40.276
    - type: map_at_20
      value: 39.301
    - type: map_at_3
      value: 35.205
    - type: map_at_5
      value: 36.803999999999995
    - type: mrr_at_1
      value: 33.59683794466403
    - type: mrr_at_10
      value: 42.837536859275986
    - type: mrr_at_100
      value: 43.7501703455481
    - type: mrr_at_1000
      value: 43.79258407771123
    - type: mrr_at_20
      value: 43.36044710445095
    - type: mrr_at_3
      value: 40.15151515151516
    - type: mrr_at_5
      value: 41.74242424242425
    - type: nauc_map_at_1000_diff1
      value: 47.934826596875304
    - type: nauc_map_at_1000_max
      value: 32.39759438116062
    - type: nauc_map_at_1000_std
      value: 0.9489007346763054
    - type: nauc_map_at_100_diff1
      value: 47.94844822157888
    - type: nauc_map_at_100_max
      value: 32.51485845519537
    - type: nauc_map_at_100_std
      value: 0.8094339925545622
    - type: nauc_map_at_10_diff1
      value: 48.251456404874645
    - type: nauc_map_at_10_max
      value: 31.412906399154245
    - type: nauc_map_at_10_std
      value: -0.7024825737369933
    - type: nauc_map_at_1_diff1
      value: 55.81906101970174
    - type: nauc_map_at_1_max
      value: 31.811715334193796
    - type: nauc_map_at_1_std
      value: -6.17056859281584
    - type: nauc_map_at_20_diff1
      value: 47.80902650237369
    - type: nauc_map_at_20_max
      value: 32.22465403023091
    - type: nauc_map_at_20_std
      value: 0.20706526946705656
    - type: nauc_map_at_3_diff1
      value: 49.97333984346632
    - type: nauc_map_at_3_max
      value: 31.58195498640799
    - type: nauc_map_at_3_std
      value: -2.577539707727459
    - type: nauc_map_at_5_diff1
      value: 49.40005767350608
    - type: nauc_map_at_5_max
      value: 30.998435600377434
    - type: nauc_map_at_5_std
      value: -2.1231771618690307
    - type: nauc_mrr_at_1000_diff1
      value: 46.86811371969663
    - type: nauc_mrr_at_1000_max
      value: 31.25147138171024
    - type: nauc_mrr_at_1000_std
      value: 1.9954422477585918
    - type: nauc_mrr_at_100_diff1
      value: 46.855870345882195
    - type: nauc_mrr_at_100_max
      value: 31.263524035665966
    - type: nauc_mrr_at_100_std
      value: 2.0160751193806568
    - type: nauc_mrr_at_10_diff1
      value: 46.93294772825783
    - type: nauc_mrr_at_10_max
      value: 30.927002048701663
    - type: nauc_mrr_at_10_std
      value: 1.6538220080908224
    - type: nauc_mrr_at_1_diff1
      value: 52.416386548395664
    - type: nauc_mrr_at_1_max
      value: 32.28582003787206
    - type: nauc_mrr_at_1_std
      value: -2.154991145714492
    - type: nauc_mrr_at_20_diff1
      value: 46.71796185319694
    - type: nauc_mrr_at_20_max
      value: 31.16219902794994
    - type: nauc_mrr_at_20_std
      value: 1.8590646572728409
    - type: nauc_mrr_at_3_diff1
      value: 47.697100317669914
    - type: nauc_mrr_at_3_max
      value: 30.821806030159383
    - type: nauc_mrr_at_3_std
      value: 1.1927626358099177
    - type: nauc_mrr_at_5_diff1
      value: 47.065272061365704
    - type: nauc_mrr_at_5_max
      value: 30.299230962805023
    - type: nauc_mrr_at_5_std
      value: 1.3225842862629529
    - type: nauc_ndcg_at_1000_diff1
      value: 45.20612583136058
    - type: nauc_ndcg_at_1000_max
      value: 33.51931869947315
    - type: nauc_ndcg_at_1000_std
      value: 4.923707509620363
    - type: nauc_ndcg_at_100_diff1
      value: 44.76206243393775
    - type: nauc_ndcg_at_100_max
      value: 33.57771606755598
    - type: nauc_ndcg_at_100_std
      value: 5.30915563331338
    - type: nauc_ndcg_at_10_diff1
      value: 45.12714032463827
    - type: nauc_ndcg_at_10_max
      value: 30.351909495610492
    - type: nauc_ndcg_at_10_std
      value: 2.3972947289996873
    - type: nauc_ndcg_at_1_diff1
      value: 52.416386548395664
    - type: nauc_ndcg_at_1_max
      value: 32.28582003787206
    - type: nauc_ndcg_at_1_std
      value: -2.154991145714492
    - type: nauc_ndcg_at_20_diff1
      value: 44.20281844000005
    - type: nauc_ndcg_at_20_max
      value: 32.14112739396226
    - type: nauc_ndcg_at_20_std
      value: 3.3971385462591916
    - type: nauc_ndcg_at_3_diff1
      value: 47.0633767031858
    - type: nauc_ndcg_at_3_max
      value: 31.032896053733435
    - type: nauc_ndcg_at_3_std
      value: 0.6827544906310201
    - type: nauc_ndcg_at_5_diff1
      value: 46.735352294106484
    - type: nauc_ndcg_at_5_max
      value: 29.784992270528544
    - type: nauc_ndcg_at_5_std
      value: 0.8685943819516141
    - type: nauc_precision_at_1000_diff1
      value: -12.223330179860852
    - type: nauc_precision_at_1000_max
      value: -9.266492213777273
    - type: nauc_precision_at_1000_std
      value: 19.0569899587788
    - type: nauc_precision_at_100_diff1
      value: -5.803751085072067
    - type: nauc_precision_at_100_max
      value: 3.448932057044294
    - type: nauc_precision_at_100_std
      value: 23.470863527030627
    - type: nauc_precision_at_10_diff1
      value: 8.887357341361907
    - type: nauc_precision_at_10_max
      value: 18.67165390928126
    - type: nauc_precision_at_10_std
      value: 19.158543337955404
    - type: nauc_precision_at_1_diff1
      value: 52.416386548395664
    - type: nauc_precision_at_1_max
      value: 32.28582003787206
    - type: nauc_precision_at_1_std
      value: -2.154991145714492
    - type: nauc_precision_at_20_diff1
      value: 0.942496138409553
    - type: nauc_precision_at_20_max
      value: 18.86957127610774
    - type: nauc_precision_at_20_std
      value: 24.075503903246496
    - type: nauc_precision_at_3_diff1
      value: 28.15363877307106
    - type: nauc_precision_at_3_max
      value: 27.064928137991824
    - type: nauc_precision_at_3_std
      value: 8.632807104504753
    - type: nauc_precision_at_5_diff1
      value: 20.805862332497973
    - type: nauc_precision_at_5_max
      value: 21.420201475758404
    - type: nauc_precision_at_5_std
      value: 12.380239645425714
    - type: nauc_recall_at_1000_diff1
      value: 18.478341468055547
    - type: nauc_recall_at_1000_max
      value: 56.293560115074506
    - type: nauc_recall_at_1000_std
      value: 64.31607185065428
    - type: nauc_recall_at_100_diff1
      value: 26.737267337771886
    - type: nauc_recall_at_100_max
      value: 38.011889141496326
    - type: nauc_recall_at_100_std
      value: 30.44904690114732
    - type: nauc_recall_at_10_diff1
      value: 35.22772732735716
    - type: nauc_recall_at_10_max
      value: 26.000054115159486
    - type: nauc_recall_at_10_std
      value: 5.174264254271206
    - type: nauc_recall_at_1_diff1
      value: 55.81906101970174
    - type: nauc_recall_at_1_max
      value: 31.811715334193796
    - type: nauc_recall_at_1_std
      value: -6.17056859281584
    - type: nauc_recall_at_20_diff1
      value: 30.48493302415641
    - type: nauc_recall_at_20_max
      value: 31.05487040370753
    - type: nauc_recall_at_20_std
      value: 10.319948318834136
    - type: nauc_recall_at_3_diff1
      value: 43.12289512340243
    - type: nauc_recall_at_3_max
      value: 28.176279771026135
    - type: nauc_recall_at_3_std
      value: -0.1775154523381921
    - type: nauc_recall_at_5_diff1
      value: 40.9934933741234
    - type: nauc_recall_at_5_max
      value: 25.569156290584733
    - type: nauc_recall_at_5_std
      value: 0.21166696686855038
    - type: ndcg_at_1
      value: 33.597
    - type: ndcg_at_10
      value: 44.718999999999994
    - type: ndcg_at_100
      value: 50.324000000000005
    - type: ndcg_at_1000
      value: 52.468
    - type: ndcg_at_20
      value: 46.822
    - type: ndcg_at_3
      value: 39.558
    - type: ndcg_at_5
      value: 41.827999999999996
    - type: precision_at_1
      value: 33.597
    - type: precision_at_10
      value: 8.735
    - type: precision_at_100
      value: 1.6420000000000001
    - type: precision_at_1000
      value: 0.246
    - type: precision_at_20
      value: 5.375
    - type: precision_at_3
      value: 18.511
    - type: precision_at_5
      value: 13.399
    - type: recall_at_1
      value: 28.383999999999997
    - type: recall_at_10
      value: 56.425000000000004
    - type: recall_at_100
      value: 82.01899999999999
    - type: recall_at_1000
      value: 95.285
    - type: recall_at_20
      value: 64.615
    - type: recall_at_3
      value: 42.171
    - type: recall_at_5
      value: 48.296
  - task:
      type: Retrieval
    dataset:
      name: MTEB CQADupstackWordpressRetrieval
      type: mteb/cqadupstack-wordpress
      config: default
      split: test
      revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
    metrics:
    - type: main_score
      value: 38.269999999999996
    - type: map_at_1
      value: 25.324999999999996
    - type: map_at_10
      value: 33.263
    - type: map_at_100
      value: 34.304
    - type: map_at_1000
      value: 34.394000000000005
    - type: map_at_20
      value: 33.827
    - type: map_at_3
      value: 30.259999999999998
    - type: map_at_5
      value: 31.832
    - type: mrr_at_1
      value: 27.171903881700555
    - type: mrr_at_10
      value: 35.334991051257234
    - type: mrr_at_100
      value: 36.251283465952355
    - type: mrr_at_1000
      value: 36.316236092511055
    - type: mrr_at_20
      value: 35.87141909945257
    - type: mrr_at_3
      value: 32.71719038817007
    - type: mrr_at_5
      value: 34.19593345656194
    - type: nauc_map_at_1000_diff1
      value: 39.614836211522714
    - type: nauc_map_at_1000_max
      value: 22.019768626310192
    - type: nauc_map_at_1000_std
      value: -1.5238708712112499
    - type: nauc_map_at_100_diff1
      value: 39.63008548572307
    - type: nauc_map_at_100_max
      value: 22.044756063752345
    - type: nauc_map_at_100_std
      value: -1.4869190221494792
    - type: nauc_map_at_10_diff1
      value: 39.73025012395569
    - type: nauc_map_at_10_max
      value: 22.117710178892107
    - type: nauc_map_at_10_std
      value: -2.5129984871932973
    - type: nauc_map_at_1_diff1
      value: 45.015617718902654
    - type: nauc_map_at_1_max
      value: 19.313800263189638
    - type: nauc_map_at_1_std
      value: -4.763931386681675
    - type: nauc_map_at_20_diff1
      value: 39.53678019013766
    - type: nauc_map_at_20_max
      value: 21.880316719428258
    - type: nauc_map_at_20_std
      value: -1.882003994523355
    - type: nauc_map_at_3_diff1
      value: 40.37307665298228
    - type: nauc_map_at_3_max
      value: 20.851976075322533
    - type: nauc_map_at_3_std
      value: -2.429569082966531
    - type: nauc_map_at_5_diff1
      value: 39.763015635086
    - type: nauc_map_at_5_max
      value: 22.010102196900725
    - type: nauc_map_at_5_std
      value: -2.654896415670943
    - type: nauc_mrr_at_1000_diff1
      value: 39.74071733680025
    - type: nauc_mrr_at_1000_max
      value: 21.67309640681989
    - type: nauc_mrr_at_1000_std
      value: -1.4003373135477462
    - type: nauc_mrr_at_100_diff1
      value: 39.730614151966485
    - type: nauc_mrr_at_100_max
      value: 21.678390048971767
    - type: nauc_mrr_at_100_std
      value: -1.3655362623563931
    - type: nauc_mrr_at_10_diff1
      value: 39.7900031013241
    - type: nauc_mrr_at_10_max
      value: 21.73643491725051
    - type: nauc_mrr_at_10_std
      value: -2.1175389838696312
    - type: nauc_mrr_at_1_diff1
      value: 46.165736140679776
    - type: nauc_mrr_at_1_max
      value: 20.071083446822147
    - type: nauc_mrr_at_1_std
      value: -5.018909100858311
    - type: nauc_mrr_at_20_diff1
      value: 39.6371295762885
    - type: nauc_mrr_at_20_max
      value: 21.659557440270973
    - type: nauc_mrr_at_20_std
      value: -1.4909603958341686
    - type: nauc_mrr_at_3_diff1
      value: 40.351150322758876
    - type: nauc_mrr_at_3_max
      value: 20.83706249041544
    - type: nauc_mrr_at_3_std
      value: -1.956027373253151
    - type: nauc_mrr_at_5_diff1
      value: 39.57759107791911
    - type: nauc_mrr_at_5_max
      value: 21.79552045204151
    - type: nauc_mrr_at_5_std
      value: -2.1507013120951126
    - type: nauc_ndcg_at_1000_diff1
      value: 37.717619356839016
    - type: nauc_ndcg_at_1000_max
      value: 22.545375504379805
    - type: nauc_ndcg_at_1000_std
      value: 1.682348628141016
    - type: nauc_ndcg_at_100_diff1
      value: 37.656027803682626
    - type: nauc_ndcg_at_100_max
      value: 22.49278246383637
    - type: nauc_ndcg_at_100_std
      value: 2.6818118152357773
    - type: nauc_ndcg_at_10_diff1
      value: 37.834954205539766
    - type: nauc_ndcg_at_10_max
      value: 22.655839885558443
    - type: nauc_ndcg_at_10_std
      value: -1.97159619786231
    - type: nauc_ndcg_at_1_diff1
      value: 46.165736140679776
    - type: nauc_ndcg_at_1_max
      value: 20.071083446822147
    - type: nauc_ndcg_at_1_std
      value: -5.018909100858311
    - type: nauc_ndcg_at_20_diff1
      value: 37.171914857454304
    - type: nauc_ndcg_at_20_max
      value: 21.858904801745897
    - type: nauc_ndcg_at_20_std
      value: 0.3809854859496657
    - type: nauc_ndcg_at_3_diff1
      value: 38.4460623883955
    - type: nauc_ndcg_at_3_max
      value: 20.95244159463402
    - type: nauc_ndcg_at_3_std
      value: -1.2685011660086651
    - type: nauc_ndcg_at_5_diff1
      value: 37.48831054573054
    - type: nauc_ndcg_at_5_max
      value: 22.625921624640526
    - type: nauc_ndcg_at_5_std
      value: -2.049221092724925
    - type: nauc_precision_at_1000_diff1
      value: -19.120500628263994
    - type: nauc_precision_at_1000_max
      value: -6.650707109047473
    - type: nauc_precision_at_1000_std
      value: 15.71193179253002
    - type: nauc_precision_at_100_diff1
      value: 6.254606806876069
    - type: nauc_precision_at_100_max
      value: 14.601826922181823
    - type: nauc_precision_at_100_std
      value: 28.38299592246453
    - type: nauc_precision_at_10_diff1
      value: 22.978614338670816
    - type: nauc_precision_at_10_max
      value: 23.04146766323557
    - type: nauc_precision_at_10_std
      value: 6.226264308612577
    - type: nauc_precision_at_1_diff1
      value: 46.165736140679776
    - type: nauc_precision_at_1_max
      value: 20.071083446822147
    - type: nauc_precision_at_1_std
      value: -5.018909100858311
    - type: nauc_precision_at_20_diff1
      value: 17.681032853225602
    - type: nauc_precision_at_20_max
      value: 18.66680304585122
    - type: nauc_precision_at_20_std
      value: 15.34896796713905
    - type: nauc_precision_at_3_diff1
      value: 31.359396694559194
    - type: nauc_precision_at_3_max
      value: 22.279263308973274
    - type: nauc_precision_at_3_std
      value: 3.6302537979529035
    - type: nauc_precision_at_5_diff1
      value: 26.32257879892933
    - type: nauc_precision_at_5_max
      value: 25.402524493181026
    - type: nauc_precision_at_5_std
      value: 4.731450603747359
    - type: nauc_recall_at_1000_diff1
      value: 23.562925244967875
    - type: nauc_recall_at_1000_max
      value: 30.737399333586797
    - type: nauc_recall_at_1000_std
      value: 34.19418935008663
    - type: nauc_recall_at_100_diff1
      value: 28.703574970574824
    - type: nauc_recall_at_100_max
      value: 22.448663600170278
    - type: nauc_recall_at_100_std
      value: 24.53297349042035
    - type: nauc_recall_at_10_diff1
      value: 31.73603907811882
    - type: nauc_recall_at_10_max
      value: 23.453183748640765
    - type: nauc_recall_at_10_std
      value: -1.8279054407176274
    - type: nauc_recall_at_1_diff1
      value: 45.015617718902654
    - type: nauc_recall_at_1_max
      value: 19.313800263189638
    - type: nauc_recall_at_1_std
      value: -4.763931386681675
    - type: nauc_recall_at_20_diff1
      value: 28.74169081866096
    - type: nauc_recall_at_20_max
      value: 20.035509169577324
    - type: nauc_recall_at_20_std
      value: 7.371615811227748
    - type: nauc_recall_at_3_diff1
      value: 34.09890157333362
    - type: nauc_recall_at_3_max
      value: 20.46565842748346
    - type: nauc_recall_at_3_std
      value: -0.4337283067447526
    - type: nauc_recall_at_5_diff1
      value: 30.974580787842402
    - type: nauc_recall_at_5_max
      value: 23.76379349487105
    - type: nauc_recall_at_5_std
      value: -1.8407515927979428
    - type: ndcg_at_1
      value: 27.172
    - type: ndcg_at_10
      value: 38.269999999999996
    - type: ndcg_at_100
      value: 43.338
    - type: ndcg_at_1000
      value: 45.594
    - type: ndcg_at_20
      value: 40.256
    - type: ndcg_at_3
      value: 32.673
    - type: ndcg_at_5
      value: 35.224
    - type: precision_at_1
      value: 27.172
    - type: precision_at_10
      value: 6.063000000000001
    - type: precision_at_100
      value: 0.9259999999999999
    - type: precision_at_1000
      value: 0.123
    - type: precision_at_20
      value: 3.5029999999999997
    - type: precision_at_3
      value: 13.74
    - type: precision_at_5
      value: 9.797
    - type: recall_at_1
      value: 25.324999999999996
    - type: recall_at_10
      value: 51.634
    - type: recall_at_100
      value: 74.687
    - type: recall_at_1000
      value: 91.412
    - type: recall_at_20
      value: 59.207
    - type: recall_at_3
      value: 36.678
    - type: recall_at_5
      value: 42.742999999999995
  - task:
      type: Retrieval
    dataset:
      name: MTEB ClimateFEVER
      type: mteb/climate-fever
      config: default
      split: test
      revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
    metrics:
    - type: main_score
      value: 36.853
    - type: map_at_1
      value: 15.371000000000002
    - type: map_at_10
      value: 27.122
    - type: map_at_100
      value: 29.226000000000003
    - type: map_at_1000
      value: 29.409999999999997
    - type: map_at_20
      value: 28.274
    - type: map_at_3
      value: 22.431
    - type: map_at_5
      value: 24.877
    - type: mrr_at_1
      value: 34.13680781758958
    - type: mrr_at_10
      value: 47.265911793599145
    - type: mrr_at_100
      value: 48.028369995763846
    - type: mrr_at_1000
      value: 48.05317022537804
    - type: mrr_at_20
      value: 47.75785292259516
    - type: mrr_at_3
      value: 43.887079261672156
    - type: mrr_at_5
      value: 45.906623235613544
    - type: nauc_map_at_1000_diff1
      value: 24.949211292921547
    - type: nauc_map_at_1000_max
      value: 38.69844483304584
    - type: nauc_map_at_1000_std
      value: 18.336359440844753
    - type: nauc_map_at_100_diff1
      value: 24.8951732982492
    - type: nauc_map_at_100_max
      value: 38.65049158594052
    - type: nauc_map_at_100_std
      value: 18.28935278388095
    - type: nauc_map_at_10_diff1
      value: 24.606032216798273
    - type: nauc_map_at_10_max
      value: 38.00608351559887
    - type: nauc_map_at_10_std
      value: 16.61261615173358
    - type: nauc_map_at_1_diff1
      value: 30.83614944448221
    - type: nauc_map_at_1_max
      value: 33.757528532809
    - type: nauc_map_at_1_std
      value: 8.880622713261126
    - type: nauc_map_at_20_diff1
      value: 24.75491310922017
    - type: nauc_map_at_20_max
      value: 38.353679076398834
    - type: nauc_map_at_20_std
      value: 17.58637493443171
    - type: nauc_map_at_3_diff1
      value: 25.563085273287083
    - type: nauc_map_at_3_max
      value: 35.14515679047155
    - type: nauc_map_at_3_std
      value: 11.75594869817732
    - type: nauc_map_at_5_diff1
      value: 24.815807517691614
    - type: nauc_map_at_5_max
      value: 36.25905426665983
    - type: nauc_map_at_5_std
      value: 14.516391726180697
    - type: nauc_mrr_at_1000_diff1
      value: 27.948233427121274
    - type: nauc_mrr_at_1000_max
      value: 37.5893640945859
    - type: nauc_mrr_at_1000_std
      value: 19.588442449629763
    - type: nauc_mrr_at_100_diff1
      value: 27.947962345854037
    - type: nauc_mrr_at_100_max
      value: 37.60375479481945
    - type: nauc_mrr_at_100_std
      value: 19.614791576283793
    - type: nauc_mrr_at_10_diff1
      value: 27.882311310262136
    - type: nauc_mrr_at_10_max
      value: 37.58580968074054
    - type: nauc_mrr_at_10_std
      value: 19.49875186170201
    - type: nauc_mrr_at_1_diff1
      value: 28.017413073648477
    - type: nauc_mrr_at_1_max
      value: 32.87710191514022
    - type: nauc_mrr_at_1_std
      value: 14.04889142608459
    - type: nauc_mrr_at_20_diff1
      value: 27.89129925771968
    - type: nauc_mrr_at_20_max
      value: 37.6142863106945
    - type: nauc_mrr_at_20_std
      value: 19.645390143394163
    - type: nauc_mrr_at_3_diff1
      value: 27.99609559690795
    - type: nauc_mrr_at_3_max
      value: 36.87362332456197
    - type: nauc_mrr_at_3_std
      value: 18.598416821915333
    - type: nauc_mrr_at_5_diff1
      value: 27.68306089976716
    - type: nauc_mrr_at_5_max
      value: 37.12264485659723
    - type: nauc_mrr_at_5_std
      value: 19.18875305730564
    - type: nauc_ndcg_at_1000_diff1
      value: 25.736779186453777
    - type: nauc_ndcg_at_1000_max
      value: 41.93281139456004
    - type: nauc_ndcg_at_1000_std
      value: 25.179038422659993
    - type: nauc_ndcg_at_100_diff1
      value: 25.144796623848322
    - type: nauc_ndcg_at_100_max
      value: 41.72820916876173
    - type: nauc_ndcg_at_100_std
      value: 25.12851686850754
    - type: nauc_ndcg_at_10_diff1
      value: 24.321249191226652
    - type: nauc_ndcg_at_10_max
      value: 40.23711916935706
    - type: nauc_ndcg_at_10_std
      value: 20.89060972334557
    - type: nauc_ndcg_at_1_diff1
      value: 28.017413073648477
    - type: nauc_ndcg_at_1_max
      value: 32.87710191514022
    - type: nauc_ndcg_at_1_std
      value: 14.04889142608459
    - type: nauc_ndcg_at_20_diff1
      value: 24.5090484877482
    - type: nauc_ndcg_at_20_max
      value: 40.752854032983606
    - type: nauc_ndcg_at_20_std
      value: 22.70331074781384
    - type: nauc_ndcg_at_3_diff1
      value: 25.13499057756147
    - type: nauc_ndcg_at_3_max
      value: 35.8325682137567
    - type: nauc_ndcg_at_3_std
      value: 15.23768392706637
    - type: nauc_ndcg_at_5_diff1
      value: 24.614105695451116
    - type: nauc_ndcg_at_5_max
      value: 37.68089587624492
    - type: nauc_ndcg_at_5_std
      value: 17.946406099261708
    - type: nauc_precision_at_1000_diff1
      value: -2.022340544774227
    - type: nauc_precision_at_1000_max
      value: 6.070578645067797
    - type: nauc_precision_at_1000_std
      value: 22.15132728777549
    - type: nauc_precision_at_100_diff1
      value: 4.544144474504255
    - type: nauc_precision_at_100_max
      value: 19.780392159848574
    - type: nauc_precision_at_100_std
      value: 31.107111186002438
    - type: nauc_precision_at_10_diff1
      value: 10.107015022955848
    - type: nauc_precision_at_10_max
      value: 30.779709099060465
    - type: nauc_precision_at_10_std
      value: 27.324148451668602
    - type: nauc_precision_at_1_diff1
      value: 28.017413073648477
    - type: nauc_precision_at_1_max
      value: 32.87710191514022
    - type: nauc_precision_at_1_std
      value: 14.04889142608459
    - type: nauc_precision_at_20_diff1
      value: 8.270881053079405
    - type: nauc_precision_at_20_max
      value: 27.26753946078481
    - type: nauc_precision_at_20_std
      value: 29.156725822074204
    - type: nauc_precision_at_3_diff1
      value: 17.82468940497632
    - type: nauc_precision_at_3_max
      value: 31.490021174215155
    - type: nauc_precision_at_3_std
      value: 18.73818985054394
    - type: nauc_precision_at_5_diff1
      value: 13.24803141673961
    - type: nauc_precision_at_5_max
      value: 29.94926240784298
    - type: nauc_precision_at_5_std
      value: 23.2940906142919
    - type: nauc_recall_at_1000_diff1
      value: 19.09850333580471
    - type: nauc_recall_at_1000_max
      value: 46.026306142840596
    - type: nauc_recall_at_1000_std
      value: 46.50391519568263
    - type: nauc_recall_at_100_diff1
      value: 16.739384224869738
    - type: nauc_recall_at_100_max
      value: 40.68987136431252
    - type: nauc_recall_at_100_std
      value: 36.01609750485591
    - type: nauc_recall_at_10_diff1
      value: 17.51796617221814
    - type: nauc_recall_at_10_max
      value: 39.47453129444401
    - type: nauc_recall_at_10_std
      value: 23.79239002974899
    - type: nauc_recall_at_1_diff1
      value: 30.83614944448221
    - type: nauc_recall_at_1_max
      value: 33.757528532809
    - type: nauc_recall_at_1_std
      value: 8.880622713261126
    - type: nauc_recall_at_20_diff1
      value: 16.978668307251652
    - type: nauc_recall_at_20_max
      value: 39.09115357303713
    - type: nauc_recall_at_20_std
      value: 27.278668534187524
    - type: nauc_recall_at_3_diff1
      value: 22.55937738994021
    - type: nauc_recall_at_3_max
      value: 36.25055459395638
    - type: nauc_recall_at_3_std
      value: 14.828905168761247
    - type: nauc_recall_at_5_diff1
      value: 19.32656748627199
    - type: nauc_recall_at_5_max
      value: 36.28836228620816
    - type: nauc_recall_at_5_std
      value: 19.264352933914278
    - type: ndcg_at_1
      value: 34.137
    - type: ndcg_at_10
      value: 36.853
    - type: ndcg_at_100
      value: 44.279
    - type: ndcg_at_1000
      value: 47.336
    - type: ndcg_at_20
      value: 39.815
    - type: ndcg_at_3
      value: 30.253999999999998
    - type: ndcg_at_5
      value: 32.649
    - type: precision_at_1
      value: 34.137
    - type: precision_at_10
      value: 11.655
    - type: precision_at_100
      value: 1.9619999999999997
    - type: precision_at_1000
      value: 0.254
    - type: precision_at_20
      value: 7.1209999999999996
    - type: precision_at_3
      value: 22.823
    - type: precision_at_5
      value: 17.655
    - type: recall_at_1
      value: 15.371000000000002
    - type: recall_at_10
      value: 43.718
    - type: recall_at_100
      value: 68.81
    - type: recall_at_1000
      value: 85.69600000000001
    - type: recall_at_20
      value: 51.94
    - type: recall_at_3
      value: 27.694000000000003
    - type: recall_at_5
      value: 34.469
  - task:
      type: Retrieval
    dataset:
      name: MTEB DBPedia
      type: mteb/dbpedia
      config: default
      split: test
      revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
    metrics:
    - type: main_score
      value: 45.553
    - type: map_at_1
      value: 9.168999999999999
    - type: map_at_10
      value: 22.154
    - type: map_at_100
      value: 32.174
    - type: map_at_1000
      value: 33.974
    - type: map_at_20
      value: 25.899
    - type: map_at_3
      value: 15.275
    - type: map_at_5
      value: 18.291
    - type: mrr_at_1
      value: 70.75
    - type: mrr_at_10
      value: 78.39662698412697
    - type: mrr_at_100
      value: 78.56221458977012
    - type: mrr_at_1000
      value: 78.56669970642338
    - type: mrr_at_20
      value: 78.49688805346696
    - type: mrr_at_3
      value: 76.33333333333333
    - type: mrr_at_5
      value: 77.70833333333333
    - type: nauc_map_at_1000_diff1
      value: 18.465085922071346
    - type: nauc_map_at_1000_max
      value: 24.29804638788498
    - type: nauc_map_at_1000_std
      value: 22.380463943423514
    - type: nauc_map_at_100_diff1
      value: 19.37585410674523
    - type: nauc_map_at_100_max
      value: 22.56424042509462
    - type: nauc_map_at_100_std
      value: 19.672237275984426
    - type: nauc_map_at_10_diff1
      value: 23.597788166305577
    - type: nauc_map_at_10_max
      value: 9.157316105122925
    - type: nauc_map_at_10_std
      value: -3.8881247055786807
    - type: nauc_map_at_1_diff1
      value: 43.96699602275052
    - type: nauc_map_at_1_max
      value: -0.7577088440873263
    - type: nauc_map_at_1_std
      value: -17.732463891968404
    - type: nauc_map_at_20_diff1
      value: 22.326759054850097
    - type: nauc_map_at_20_max
      value: 14.879191412167703
    - type: nauc_map_at_20_std
      value: 5.405751236575241
    - type: nauc_map_at_3_diff1
      value: 28.73583545428074
    - type: nauc_map_at_3_max
      value: 1.5986597211018239
    - type: nauc_map_at_3_std
      value: -16.512455883681515
    - type: nauc_map_at_5_diff1
      value: 25.401810959155057
    - type: nauc_map_at_5_max
      value: 4.418875376978587
    - type: nauc_map_at_5_std
      value: -12.296750992013052
    - type: nauc_mrr_at_1000_diff1
      value: 51.228801807498584
    - type: nauc_mrr_at_1000_max
      value: 61.040998883279585
    - type: nauc_mrr_at_1000_std
      value: 40.93983887257123
    - type: nauc_mrr_at_100_diff1
      value: 51.23715338435314
    - type: nauc_mrr_at_100_max
      value: 61.03971408781317
    - type: nauc_mrr_at_100_std
      value: 40.91796923590573
    - type: nauc_mrr_at_10_diff1
      value: 51.1214868552331
    - type: nauc_mrr_at_10_max
      value: 61.03069045590881
    - type: nauc_mrr_at_10_std
      value: 40.661621199704264
    - type: nauc_mrr_at_1_diff1
      value: 50.84660003035892
    - type: nauc_mrr_at_1_max
      value: 60.692091499960895
    - type: nauc_mrr_at_1_std
      value: 42.126228731502955
    - type: nauc_mrr_at_20_diff1
      value: 51.0402624284872
    - type: nauc_mrr_at_20_max
      value: 60.94577844338166
    - type: nauc_mrr_at_20_std
      value: 40.89505950503613
    - type: nauc_mrr_at_3_diff1
      value: 51.771113665996516
    - type: nauc_mrr_at_3_max
      value: 61.65264793077224
    - type: nauc_mrr_at_3_std
      value: 41.75781827057092
    - type: nauc_mrr_at_5_diff1
      value: 51.0656793772882
    - type: nauc_mrr_at_5_max
      value: 61.08042065139715
    - type: nauc_mrr_at_5_std
      value: 41.11203271084835
    - type: nauc_ndcg_at_1000_diff1
      value: 22.347978262245107
    - type: nauc_ndcg_at_1000_max
      value: 36.56458763955002
    - type: nauc_ndcg_at_1000_std
      value: 35.99616144258822
    - type: nauc_ndcg_at_100_diff1
      value: 23.1120990977162
    - type: nauc_ndcg_at_100_max
      value: 30.79663306311657
    - type: nauc_ndcg_at_100_std
      value: 27.387572106784297
    - type: nauc_ndcg_at_10_diff1
      value: 23.329746066899656
    - type: nauc_ndcg_at_10_max
      value: 28.69246947084685
    - type: nauc_ndcg_at_10_std
      value: 21.457736188325345
    - type: nauc_ndcg_at_1_diff1
      value: 39.99399153456974
    - type: nauc_ndcg_at_1_max
      value: 38.12447856470389
    - type: nauc_ndcg_at_1_std
      value: 27.768869260384676
    - type: nauc_ndcg_at_20_diff1
      value: 24.945374175339907
    - type: nauc_ndcg_at_20_max
      value: 27.67836982165295
    - type: nauc_ndcg_at_20_std
      value: 19.7933631060578
    - type: nauc_ndcg_at_3_diff1
      value: 26.063492354398527
    - type: nauc_ndcg_at_3_max
      value: 33.06541959550656
    - type: nauc_ndcg_at_3_std
      value: 23.278902797288726
    - type: nauc_ndcg_at_5_diff1
      value: 22.521596060750035
    - type: nauc_ndcg_at_5_max
      value: 31.210005673730784
    - type: nauc_ndcg_at_5_std
      value: 22.893106456317927
    - type: nauc_precision_at_1000_diff1
      value: -19.845356495096006
    - type: nauc_precision_at_1000_max
      value: 4.163819381816099
    - type: nauc_precision_at_1000_std
      value: 7.612952884590339
    - type: nauc_precision_at_100_diff1
      value: -8.2679285153361
    - type: nauc_precision_at_100_max
      value: 29.78018175573565
    - type: nauc_precision_at_100_std
      value: 41.07244463956215
    - type: nauc_precision_at_10_diff1
      value: -3.2451428407349057
    - type: nauc_precision_at_10_max
      value: 36.92563008274906
    - type: nauc_precision_at_10_std
      value: 45.06962043489777
    - type: nauc_precision_at_1_diff1
      value: 50.84660003035892
    - type: nauc_precision_at_1_max
      value: 60.692091499960895
    - type: nauc_precision_at_1_std
      value: 42.126228731502955
    - type: nauc_precision_at_20_diff1
      value: -3.432279149061878
    - type: nauc_precision_at_20_max
      value: 37.013592483974875
    - type: nauc_precision_at_20_std
      value: 46.47324739428665
    - type: nauc_precision_at_3_diff1
      value: 7.28495481051025
    - type: nauc_precision_at_3_max
      value: 38.66372411741402
    - type: nauc_precision_at_3_std
      value: 35.23163993723955
    - type: nauc_precision_at_5_diff1
      value: -0.16540230063716202
    - type: nauc_precision_at_5_max
      value: 37.322494255721715
    - type: nauc_precision_at_5_std
      value: 39.666653561269754
    - type: nauc_recall_at_1000_diff1
      value: 11.388326469283681
    - type: nauc_recall_at_1000_max
      value: 32.698146308591674
    - type: nauc_recall_at_1000_std
      value: 49.48830488070777
    - type: nauc_recall_at_100_diff1
      value: 11.497443532756819
    - type: nauc_recall_at_100_max
      value: 20.196970431621615
    - type: nauc_recall_at_100_std
      value: 23.688772100803433
    - type: nauc_recall_at_10_diff1
      value: 16.519851398596003
    - type: nauc_recall_at_10_max
      value: 0.774066845071221
    - type: nauc_recall_at_10_std
      value: -10.89514647001814
    - type: nauc_recall_at_1_diff1
      value: 43.96699602275052
    - type: nauc_recall_at_1_max
      value: -0.7577088440873263
    - type: nauc_recall_at_1_std
      value: -17.732463891968404
    - type: nauc_recall_at_20_diff1
      value: 15.202960269878258
    - type: nauc_recall_at_20_max
      value: 7.067263295590253
    - type: nauc_recall_at_20_std
      value: -0.06050108222640702
    - type: nauc_recall_at_3_diff1
      value: 24.066741361525125
    - type: nauc_recall_at_3_max
      value: -2.1961525860488424
    - type: nauc_recall_at_3_std
      value: -19.48307077749568
    - type: nauc_recall_at_5_diff1
      value: 20.086330794102707
    - type: nauc_recall_at_5_max
      value: -0.8866528062747986
    - type: nauc_recall_at_5_std
      value: -16.53799173962747
    - type: ndcg_at_1
      value: 57.99999999999999
    - type: ndcg_at_10
      value: 45.553
    - type: ndcg_at_100
      value: 51.014
    - type: ndcg_at_1000
      value: 58.226
    - type: ndcg_at_20
      value: 44.98
    - type: ndcg_at_3
      value: 48.981
    - type: ndcg_at_5
      value: 46.794999999999995
    - type: precision_at_1
      value: 70.75
    - type: precision_at_10
      value: 36.85
    - type: precision_at_100
      value: 11.955
    - type: precision_at_1000
      value: 2.247
    - type: precision_at_20
      value: 28.075
    - type: precision_at_3
      value: 52.666999999999994
    - type: precision_at_5
      value: 45.85
    - type: recall_at_1
      value: 9.168999999999999
    - type: recall_at_10
      value: 28.796
    - type: recall_at_100
      value: 58.892999999999994
    - type: recall_at_1000
      value: 81.644
    - type: recall_at_20
      value: 36.659000000000006
    - type: recall_at_3
      value: 16.709
    - type: recall_at_5
      value: 21.387
  - task:
      type: Retrieval
    dataset:
      name: MTEB FEVER
      type: mteb/fever
      config: default
      split: test
      revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
    metrics:
    - type: main_score
      value: 88.41
    - type: map_at_1
      value: 75.637
    - type: map_at_10
      value: 84.674
    - type: map_at_100
      value: 84.909
    - type: map_at_1000
      value: 84.92
    - type: map_at_20
      value: 84.836
    - type: map_at_3
      value: 83.44200000000001
    - type: map_at_5
      value: 84.28099999999999
    - type: mrr_at_1
      value: 81.56315631563157
    - type: mrr_at_10
      value: 88.89571695264748
    - type: mrr_at_100
      value: 88.93671417216285
    - type: mrr_at_1000
      value: 88.93708016011664
    - type: mrr_at_20
      value: 88.9311652665256
    - type: mrr_at_3
      value: 88.20882088208805
    - type: mrr_at_5
      value: 88.72937293729349
    - type: nauc_map_at_1000_diff1
      value: 54.41216035074026
    - type: nauc_map_at_1000_max
      value: 13.346153003554361
    - type: nauc_map_at_1000_std
      value: -6.721664416152164
    - type: nauc_map_at_100_diff1
      value: 54.36538350995795
    - type: nauc_map_at_100_max
      value: 13.355583381471298
    - type: nauc_map_at_100_std
      value: -6.696921015641016
    - type: nauc_map_at_10_diff1
      value: 54.0389127730555
    - type: nauc_map_at_10_max
      value: 13.387802159150663
    - type: nauc_map_at_10_std
      value: -6.73514381731833
    - type: nauc_map_at_1_diff1
      value: 57.99489574836453
    - type: nauc_map_at_1_max
      value: 7.830032589171654
    - type: nauc_map_at_1_std
      value: -10.140208285080295
    - type: nauc_map_at_20_diff1
      value: 54.16841004736076
    - type: nauc_map_at_20_max
      value: 13.345607363689746
    - type: nauc_map_at_20_std
      value: -6.663119775158465
    - type: nauc_map_at_3_diff1
      value: 53.82879543599303
    - type: nauc_map_at_3_max
      value: 12.716952288433902
    - type: nauc_map_at_3_std
      value: -7.746102082835598
    - type: nauc_map_at_5_diff1
      value: 53.82838395350109
    - type: nauc_map_at_5_max
      value: 13.487373534211702
    - type: nauc_map_at_5_std
      value: -6.869504398693434
    - type: nauc_mrr_at_1000_diff1
      value: 68.92783546581906
    - type: nauc_mrr_at_1000_max
      value: 12.076297180596592
    - type: nauc_mrr_at_1000_std
      value: -13.306257067567998
    - type: nauc_mrr_at_100_diff1
      value: 68.92780219775517
    - type: nauc_mrr_at_100_max
      value: 12.078449805054374
    - type: nauc_mrr_at_100_std
      value: -13.303524852703719
    - type: nauc_mrr_at_10_diff1
      value: 68.92686206881258
    - type: nauc_mrr_at_10_max
      value: 12.273295656884873
    - type: nauc_mrr_at_10_std
      value: -13.222483496603965
    - type: nauc_mrr_at_1_diff1
      value: 70.1738022073041
    - type: nauc_mrr_at_1_max
      value: 9.378639533482806
    - type: nauc_mrr_at_1_std
      value: -13.444033823202348
    - type: nauc_mrr_at_20_diff1
      value: 68.91161304905303
    - type: nauc_mrr_at_20_max
      value: 12.117091514817885
    - type: nauc_mrr_at_20_std
      value: -13.258261750160239
    - type: nauc_mrr_at_3_diff1
      value: 68.61982455945467
    - type: nauc_mrr_at_3_max
      value: 12.608213879734578
    - type: nauc_mrr_at_3_std
      value: -13.558003431587839
    - type: nauc_mrr_at_5_diff1
      value: 68.81439097457242
    - type: nauc_mrr_at_5_max
      value: 12.54025598903624
    - type: nauc_mrr_at_5_std
      value: -13.199231514972093
    - type: nauc_ndcg_at_1000_diff1
      value: 56.47563443877495
    - type: nauc_ndcg_at_1000_max
      value: 14.508331783439466
    - type: nauc_ndcg_at_1000_std
      value: -6.206829736668775
    - type: nauc_ndcg_at_100_diff1
      value: 55.54015515673474
    - type: nauc_ndcg_at_100_max
      value: 14.753595778278136
    - type: nauc_ndcg_at_100_std
      value: -5.638517949568802
    - type: nauc_ndcg_at_10_diff1
      value: 54.220845223257996
    - type: nauc_ndcg_at_10_max
      value: 15.265309648490021
    - type: nauc_ndcg_at_10_std
      value: -5.516276098929109
    - type: nauc_ndcg_at_1_diff1
      value: 70.1738022073041
    - type: nauc_ndcg_at_1_max
      value: 9.378639533482806
    - type: nauc_ndcg_at_1_std
      value: -13.444033823202348
    - type: nauc_ndcg_at_20_diff1
      value: 54.481406100854635
    - type: nauc_ndcg_at_20_max
      value: 14.868763583210498
    - type: nauc_ndcg_at_20_std
      value: -5.328097380018734
    - type: nauc_ndcg_at_3_diff1
      value: 54.94411725607744
    - type: nauc_ndcg_at_3_max
      value: 14.27186734506607
    - type: nauc_ndcg_at_3_std
      value: -7.894724962312474
    - type: nauc_ndcg_at_5_diff1
      value: 54.08048166974806
    - type: nauc_ndcg_at_5_max
      value: 15.528233170721006
    - type: nauc_ndcg_at_5_std
      value: -5.984768714537104
    - type: nauc_precision_at_1000_diff1
      value: -8.744323640074445
    - type: nauc_precision_at_1000_max
      value: -0.01881224392053465
    - type: nauc_precision_at_1000_std
      value: 3.8721477979260635
    - type: nauc_precision_at_100_diff1
      value: -11.86150156952171
    - type: nauc_precision_at_100_max
      value: 3.2736651314552314
    - type: nauc_precision_at_100_std
      value: 8.12687620615509
    - type: nauc_precision_at_10_diff1
      value: -10.360708676781178
    - type: nauc_precision_at_10_max
      value: 10.945552490433458
    - type: nauc_precision_at_10_std
      value: 11.016707653014485
    - type: nauc_precision_at_1_diff1
      value: 70.1738022073041
    - type: nauc_precision_at_1_max
      value: 9.378639533482806
    - type: nauc_precision_at_1_std
      value: -13.444033823202348
    - type: nauc_precision_at_20_diff1
      value: -13.557721925696583
    - type: nauc_precision_at_20_max
      value: 6.331386521718574
    - type: nauc_precision_at_20_std
      value: 10.322188778142388
    - type: nauc_precision_at_3_diff1
      value: 15.139456770248968
    - type: nauc_precision_at_3_max
      value: 17.10220985600708
    - type: nauc_precision_at_3_std
      value: 3.0448183682558074
    - type: nauc_precision_at_5_diff1
      value: -1.9825577548111102
    - type: nauc_precision_at_5_max
      value: 17.139148127012625
    - type: nauc_precision_at_5_std
      value: 10.598435750554753
    - type: nauc_recall_at_1000_diff1
      value: 15.641740744283005
    - type: nauc_recall_at_1000_max
      value: 44.65315702195612
    - type: nauc_recall_at_1000_std
      value: 52.34265862835513
    - type: nauc_recall_at_100_diff1
      value: 5.254385435323394
    - type: nauc_recall_at_100_max
      value: 38.53577774395794
    - type: nauc_recall_at_100_std
      value: 43.47744274335829
    - type: nauc_recall_at_10_diff1
      value: 19.135735476268042
    - type: nauc_recall_at_10_max
      value: 30.05417445923848
    - type: nauc_recall_at_10_std
      value: 18.3988023241141
    - type: nauc_recall_at_1_diff1
      value: 57.99489574836453
    - type: nauc_recall_at_1_max
      value: 7.830032589171654
    - type: nauc_recall_at_1_std
      value: -10.140208285080295
    - type: nauc_recall_at_20_diff1
      value: 9.444797759735126
    - type: nauc_recall_at_20_max
      value: 31.001311675371017
    - type: nauc_recall_at_20_std
      value: 29.351418893822178
    - type: nauc_recall_at_3_diff1
      value: 36.88862653262064
    - type: nauc_recall_at_3_max
      value: 19.845892741607823
    - type: nauc_recall_at_3_std
      value: -1.0584273105890794
    - type: nauc_recall_at_5_diff1
      value: 27.360718561944974
    - type: nauc_recall_at_5_max
      value: 26.698311215441738
    - type: nauc_recall_at_5_std
      value: 8.97113997755362
    - type: ndcg_at_1
      value: 81.563
    - type: ndcg_at_10
      value: 88.41
    - type: ndcg_at_100
      value: 89.101
    - type: ndcg_at_1000
      value: 89.25800000000001
    - type: ndcg_at_20
      value: 88.79
    - type: ndcg_at_3
      value: 86.599
    - type: ndcg_at_5
      value: 87.74
    - type: precision_at_1
      value: 81.563
    - type: precision_at_10
      value: 10.699
    - type: precision_at_100
      value: 1.13
    - type: precision_at_1000
      value: 0.116
    - type: precision_at_20
      value: 5.479
    - type: precision_at_3
      value: 33.238
    - type: precision_at_5
      value: 20.744
    - type: recall_at_1
      value: 75.637
    - type: recall_at_10
      value: 95.57600000000001
    - type: recall_at_100
      value: 98.072
    - type: recall_at_1000
      value: 98.951
    - type: recall_at_20
      value: 96.792
    - type: recall_at_3
      value: 90.79599999999999
    - type: recall_at_5
      value: 93.674
  - task:
      type: Retrieval
    dataset:
      name: MTEB FiQA2018
      type: mteb/fiqa
      config: default
      split: test
      revision: 27a168819829fe9bcd655c2df245fb19452e8e06
    metrics:
    - type: main_score
      value: 42.396
    - type: map_at_1
      value: 21.711
    - type: map_at_10
      value: 34.628
    - type: map_at_100
      value: 36.549
    - type: map_at_1000
      value: 36.719
    - type: map_at_20
      value: 35.673
    - type: map_at_3
      value: 30.585
    - type: map_at_5
      value: 32.875
    - type: mrr_at_1
      value: 41.82098765432099
    - type: mrr_at_10
      value: 50.69505682931607
    - type: mrr_at_100
      value: 51.50556608727901
    - type: mrr_at_1000
      value: 51.53870583208304
    - type: mrr_at_20
      value: 51.15345764364655
    - type: mrr_at_3
      value: 48.35390946502059
    - type: mrr_at_5
      value: 49.87397119341563
    - type: nauc_map_at_1000_diff1
      value: 45.182252919583895
    - type: nauc_map_at_1000_max
      value: 35.66124930024801
    - type: nauc_map_at_1000_std
      value: -0.6925562638650965
    - type: nauc_map_at_100_diff1
      value: 45.116964706960125
    - type: nauc_map_at_100_max
      value: 35.54990469525889
    - type: nauc_map_at_100_std
      value: -0.6667263852859368
    - type: nauc_map_at_10_diff1
      value: 45.39189096228184
    - type: nauc_map_at_10_max
      value: 34.780111261901
    - type: nauc_map_at_10_std
      value: -1.8169859294150819
    - type: nauc_map_at_1_diff1
      value: 47.72764937952259
    - type: nauc_map_at_1_max
      value: 24.83306559709341
    - type: nauc_map_at_1_std
      value: -4.714128457297418
    - type: nauc_map_at_20_diff1
      value: 45.17073365898278
    - type: nauc_map_at_20_max
      value: 35.0938403469058
    - type: nauc_map_at_20_std
      value: -1.373412631183604
    - type: nauc_map_at_3_diff1
      value: 46.525724305731295
    - type: nauc_map_at_3_max
      value: 31.042538866512597
    - type: nauc_map_at_3_std
      value: -4.119355935975354
    - type: nauc_map_at_5_diff1
      value: 45.79569633383187
    - type: nauc_map_at_5_max
      value: 32.88779656647293
    - type: nauc_map_at_5_std
      value: -3.2518474739335312
    - type: nauc_mrr_at_1000_diff1
      value: 52.83619185487903
    - type: nauc_mrr_at_1000_max
      value: 42.30310720405186
    - type: nauc_mrr_at_1000_std
      value: -1.1487703348518024
    - type: nauc_mrr_at_100_diff1
      value: 52.82248853996664
    - type: nauc_mrr_at_100_max
      value: 42.30549701564678
    - type: nauc_mrr_at_100_std
      value: -1.1240113031894834
    - type: nauc_mrr_at_10_diff1
      value: 52.74644276642243
    - type: nauc_mrr_at_10_max
      value: 42.39103029476398
    - type: nauc_mrr_at_10_std
      value: -1.1043413237848576
    - type: nauc_mrr_at_1_diff1
      value: 54.810335521617326
    - type: nauc_mrr_at_1_max
      value: 40.733260207843394
    - type: nauc_mrr_at_1_std
      value: -4.452554921565855
    - type: nauc_mrr_at_20_diff1
      value: 52.788257862499954
    - type: nauc_mrr_at_20_max
      value: 42.32658875363406
    - type: nauc_mrr_at_20_std
      value: -1.2209728080684497
    - type: nauc_mrr_at_3_diff1
      value: 53.43281175319808
    - type: nauc_mrr_at_3_max
      value: 41.735942650867926
    - type: nauc_mrr_at_3_std
      value: -2.462688102468019
    - type: nauc_mrr_at_5_diff1
      value: 52.874037126566606
    - type: nauc_mrr_at_5_max
      value: 41.93740449458822
    - type: nauc_mrr_at_5_std
      value: -1.2928874908441947
    - type: nauc_ndcg_at_1000_diff1
      value: 46.5532425476402
    - type: nauc_ndcg_at_1000_max
      value: 40.369611603370515
    - type: nauc_ndcg_at_1000_std
      value: 3.472567588386994
    - type: nauc_ndcg_at_100_diff1
      value: 45.75244404695404
    - type: nauc_ndcg_at_100_max
      value: 39.36470550675439
    - type: nauc_ndcg_at_100_std
      value: 4.356189041115731
    - type: nauc_ndcg_at_10_diff1
      value: 46.005135323539704
    - type: nauc_ndcg_at_10_max
      value: 37.89018165334218
    - type: nauc_ndcg_at_10_std
      value: 0.7129618297768014
    - type: nauc_ndcg_at_1_diff1
      value: 54.810335521617326
    - type: nauc_ndcg_at_1_max
      value: 40.733260207843394
    - type: nauc_ndcg_at_1_std
      value: -4.452554921565855
    - type: nauc_ndcg_at_20_diff1
      value: 45.841552790490034
    - type: nauc_ndcg_at_20_max
      value: 38.04992825472661
    - type: nauc_ndcg_at_20_std
      value: 1.2748305707955212
    - type: nauc_ndcg_at_3_diff1
      value: 46.683033449357744
    - type: nauc_ndcg_at_3_max
      value: 37.46397870760607
    - type: nauc_ndcg_at_3_std
      value: -2.3421854966319824
    - type: nauc_ndcg_at_5_diff1
      value: 45.82409645378457
    - type: nauc_ndcg_at_5_max
      value: 36.27588234096716
    - type: nauc_ndcg_at_5_std
      value: -1.5141197170944254
    - type: nauc_precision_at_1000_diff1
      value: -3.137944321071885
    - type: nauc_precision_at_1000_max
      value: 24.12803166253776
    - type: nauc_precision_at_1000_std
      value: 11.076454789944101
    - type: nauc_precision_at_100_diff1
      value: 3.9896283891401048
    - type: nauc_precision_at_100_max
      value: 31.00198316788829
    - type: nauc_precision_at_100_std
      value: 15.725887643803063
    - type: nauc_precision_at_10_diff1
      value: 20.493420889888394
    - type: nauc_precision_at_10_max
      value: 41.689699671507405
    - type: nauc_precision_at_10_std
      value: 9.374983385669914
    - type: nauc_precision_at_1_diff1
      value: 54.810335521617326
    - type: nauc_precision_at_1_max
      value: 40.733260207843394
    - type: nauc_precision_at_1_std
      value: -4.452554921565855
    - type: nauc_precision_at_20_diff1
      value: 15.02911800246446
    - type: nauc_precision_at_20_max
      value: 39.227068888505
    - type: nauc_precision_at_20_std
      value: 11.755558515319404
    - type: nauc_precision_at_3_diff1
      value: 34.044986535461746
    - type: nauc_precision_at_3_max
      value: 40.96605829831656
    - type: nauc_precision_at_3_std
      value: 1.1903535705688038
    - type: nauc_precision_at_5_diff1
      value: 26.617002443432707
    - type: nauc_precision_at_5_max
      value: 40.60413785916794
    - type: nauc_precision_at_5_std
      value: 3.6984531670502814
    - type: nauc_recall_at_1000_diff1
      value: 26.96489389440101
    - type: nauc_recall_at_1000_max
      value: 41.811583968523955
    - type: nauc_recall_at_1000_std
      value: 41.5719519496712
    - type: nauc_recall_at_100_diff1
      value: 28.50851434908223
    - type: nauc_recall_at_100_max
      value: 32.19528060706322
    - type: nauc_recall_at_100_std
      value: 25.56935294258179
    - type: nauc_recall_at_10_diff1
      value: 35.139582891180964
    - type: nauc_recall_at_10_max
      value: 32.15221840434225
    - type: nauc_recall_at_10_std
      value: 5.550434611582702
    - type: nauc_recall_at_1_diff1
      value: 47.72764937952259
    - type: nauc_recall_at_1_max
      value: 24.83306559709341
    - type: nauc_recall_at_1_std
      value: -4.714128457297418
    - type: nauc_recall_at_20_diff1
      value: 32.78604811055205
    - type: nauc_recall_at_20_max
      value: 29.62940720700254
    - type: nauc_recall_at_20_std
      value: 6.769941491859872
    - type: nauc_recall_at_3_diff1
      value: 40.76090616138699
    - type: nauc_recall_at_3_max
      value: 27.506425490226867
    - type: nauc_recall_at_3_std
      value: -2.608872693119243
    - type: nauc_recall_at_5_diff1
      value: 37.06532485024711
    - type: nauc_recall_at_5_max
      value: 27.704150556658448
    - type: nauc_recall_at_5_std
      value: 0.4718707152343872
    - type: ndcg_at_1
      value: 41.821000000000005
    - type: ndcg_at_10
      value: 42.396
    - type: ndcg_at_100
      value: 49.370000000000005
    - type: ndcg_at_1000
      value: 52.251000000000005
    - type: ndcg_at_20
      value: 45.097
    - type: ndcg_at_3
      value: 39.028
    - type: ndcg_at_5
      value: 40.222
    - type: precision_at_1
      value: 41.821000000000005
    - type: precision_at_10
      value: 11.451
    - type: precision_at_100
      value: 1.863
    - type: precision_at_1000
      value: 0.23900000000000002
    - type: precision_at_20
      value: 6.798
    - type: precision_at_3
      value: 25.823
    - type: precision_at_5
      value: 18.735
    - type: recall_at_1
      value: 21.711
    - type: recall_at_10
      value: 48.862
    - type: recall_at_100
      value: 74.708
    - type: recall_at_1000
      value: 91.865
    - type: recall_at_20
      value: 57.50999999999999
    - type: recall_at_3
      value: 35.85
    - type: recall_at_5
      value: 41.976
  - task:
      type: Retrieval
    dataset:
      name: MTEB HotpotQA
      type: mteb/hotpotqa
      config: default
      split: test
      revision: ab518f4d6fcca38d87c25209f94beba119d02014
    metrics:
    - type: main_score
      value: 72.21
    - type: map_at_1
      value: 39.487
    - type: map_at_10
      value: 63.949999999999996
    - type: map_at_100
      value: 64.873
    - type: map_at_1000
      value: 64.927
    - type: map_at_20
      value: 64.529
    - type: map_at_3
      value: 60.243
    - type: map_at_5
      value: 62.613
    - type: mrr_at_1
      value: 78.97366644159351
    - type: mrr_at_10
      value: 84.84600173627825
    - type: mrr_at_100
      value: 85.0172804866798
    - type: mrr_at_1000
      value: 85.02245651152857
    - type: mrr_at_20
      value: 84.9625577788225
    - type: mrr_at_3
      value: 83.90276839972962
    - type: mrr_at_5
      value: 84.48278190411845
    - type: nauc_map_at_1000_diff1
      value: 19.825004700775164
    - type: nauc_map_at_1000_max
      value: 19.943221724164182
    - type: nauc_map_at_1000_std
      value: 10.068951166560058
    - type: nauc_map_at_100_diff1
      value: 19.80139472181137
    - type: nauc_map_at_100_max
      value: 19.938006132804347
    - type: nauc_map_at_100_std
      value: 10.100008107666842
    - type: nauc_map_at_10_diff1
      value: 19.53604502514735
    - type: nauc_map_at_10_max
      value: 19.62768870331064
    - type: nauc_map_at_10_std
      value: 9.446859074725705
    - type: nauc_map_at_1_diff1
      value: 67.7764270505257
    - type: nauc_map_at_1_max
      value: 38.45166604737058
    - type: nauc_map_at_1_std
      value: 1.9919181988552352
    - type: nauc_map_at_20_diff1
      value: 19.635871913149913
    - type: nauc_map_at_20_max
      value: 19.812838965919155
    - type: nauc_map_at_20_std
      value: 9.905163140101845
    - type: nauc_map_at_3_diff1
      value: 18.965707122532212
    - type: nauc_map_at_3_max
      value: 17.878860313056517
    - type: nauc_map_at_3_std
      value: 6.189378752019195
    - type: nauc_map_at_5_diff1
      value: 19.493354049675954
    - type: nauc_map_at_5_max
      value: 19.24527088109141
    - type: nauc_map_at_5_std
      value: 8.283883139680066
    - type: nauc_mrr_at_1000_diff1
      value: 66.87150374356781
    - type: nauc_mrr_at_1000_max
      value: 41.413456443203984
    - type: nauc_mrr_at_1000_std
      value: 4.140387282484357
    - type: nauc_mrr_at_100_diff1
      value: 66.87178015619061
    - type: nauc_mrr_at_100_max
      value: 41.419754763150834
    - type: nauc_mrr_at_100_std
      value: 4.15222235416704
    - type: nauc_mrr_at_10_diff1
      value: 66.89720586892301
    - type: nauc_mrr_at_10_max
      value: 41.56353878125211
    - type: nauc_mrr_at_10_std
      value: 4.213376519922392
    - type: nauc_mrr_at_1_diff1
      value: 67.7764270505257
    - type: nauc_mrr_at_1_max
      value: 38.45166604737058
    - type: nauc_mrr_at_1_std
      value: 1.9919181988552352
    - type: nauc_mrr_at_20_diff1
      value: 66.8714688713149
    - type: nauc_mrr_at_20_max
      value: 41.46170778986735
    - type: nauc_mrr_at_20_std
      value: 4.165154741309859
    - type: nauc_mrr_at_3_diff1
      value: 66.31615462679144
    - type: nauc_mrr_at_3_max
      value: 41.419637693259936
    - type: nauc_mrr_at_3_std
      value: 3.814834551396097
    - type: nauc_mrr_at_5_diff1
      value: 66.7289413087213
    - type: nauc_mrr_at_5_max
      value: 41.668346356371586
    - type: nauc_mrr_at_5_std
      value: 4.116331539882484
    - type: nauc_ndcg_at_1000_diff1
      value: 26.37325375970598
    - type: nauc_ndcg_at_1000_max
      value: 24.850915174721735
    - type: nauc_ndcg_at_1000_std
      value: 13.37585683440429
    - type: nauc_ndcg_at_100_diff1
      value: 25.591771178059503
    - type: nauc_ndcg_at_100_max
      value: 24.562820829532473
    - type: nauc_ndcg_at_100_std
      value: 14.093690500501541
    - type: nauc_ndcg_at_10_diff1
      value: 24.64600598115805
    - type: nauc_ndcg_at_10_max
      value: 23.543499404760023
    - type: nauc_ndcg_at_10_std
      value: 11.55823632781553
    - type: nauc_ndcg_at_1_diff1
      value: 67.7764270505257
    - type: nauc_ndcg_at_1_max
      value: 38.45166604737058
    - type: nauc_ndcg_at_1_std
      value: 1.9919181988552352
    - type: nauc_ndcg_at_20_diff1
      value: 24.757843275306726
    - type: nauc_ndcg_at_20_max
      value: 23.951154200380827
    - type: nauc_ndcg_at_20_std
      value: 12.931320453044886
    - type: nauc_ndcg_at_3_diff1
      value: 24.37742630418847
    - type: nauc_ndcg_at_3_max
      value: 21.310512304883723
    - type: nauc_ndcg_at_3_std
      value: 6.503993200818077
    - type: nauc_ndcg_at_5_diff1
      value: 24.813706829269716
    - type: nauc_ndcg_at_5_max
      value: 22.993657212898
    - type: nauc_ndcg_at_5_std
      value: 9.34462052506809
    - type: nauc_precision_at_1000_diff1
      value: -0.6506415756958156
    - type: nauc_precision_at_1000_max
      value: 28.039755644694875
    - type: nauc_precision_at_1000_std
      value: 53.46474329623814
    - type: nauc_precision_at_100_diff1
      value: 3.78462668236152
    - type: nauc_precision_at_100_max
      value: 22.501700881673862
    - type: nauc_precision_at_100_std
      value: 40.56672716474142
    - type: nauc_precision_at_10_diff1
      value: 9.156113228907534
    - type: nauc_precision_at_10_max
      value: 19.734206254833254
    - type: nauc_precision_at_10_std
      value: 19.986282545779602
    - type: nauc_precision_at_1_diff1
      value: 67.7764270505257
    - type: nauc_precision_at_1_max
      value: 38.45166604737058
    - type: nauc_precision_at_1_std
      value: 1.9919181988552352
    - type: nauc_precision_at_20_diff1
      value: 6.6164335644470125
    - type: nauc_precision_at_20_max
      value: 20.29343459608317
    - type: nauc_precision_at_20_std
      value: 26.51115475333977
    - type: nauc_precision_at_3_diff1
      value: 12.476520554399546
    - type: nauc_precision_at_3_max
      value: 16.69401409858964
    - type: nauc_precision_at_3_std
      value: 8.165880294907444
    - type: nauc_precision_at_5_diff1
      value: 11.783242828320958
    - type: nauc_precision_at_5_max
      value: 19.0679467875759
    - type: nauc_precision_at_5_std
      value: 13.615358345509884
    - type: nauc_recall_at_1000_diff1
      value: -0.6506415756960168
    - type: nauc_recall_at_1000_max
      value: 28.039755644694786
    - type: nauc_recall_at_1000_std
      value: 53.46474329623801
    - type: nauc_recall_at_100_diff1
      value: 3.7846266823613877
    - type: nauc_recall_at_100_max
      value: 22.501700881674008
    - type: nauc_recall_at_100_std
      value: 40.566727164741366
    - type: nauc_recall_at_10_diff1
      value: 9.15611322890755
    - type: nauc_recall_at_10_max
      value: 19.73420625483318
    - type: nauc_recall_at_10_std
      value: 19.98628254577951
    - type: nauc_recall_at_1_diff1
      value: 67.7764270505257
    - type: nauc_recall_at_1_max
      value: 38.45166604737058
    - type: nauc_recall_at_1_std
      value: 1.9919181988552352
    - type: nauc_recall_at_20_diff1
      value: 6.616433564446929
    - type: nauc_recall_at_20_max
      value: 20.293434596083248
    - type: nauc_recall_at_20_std
      value: 26.5111547533396
    - type: nauc_recall_at_3_diff1
      value: 12.476520554399531
    - type: nauc_recall_at_3_max
      value: 16.69401409858966
    - type: nauc_recall_at_3_std
      value: 8.165880294907438
    - type: nauc_recall_at_5_diff1
      value: 11.783242828320999
    - type: nauc_recall_at_5_max
      value: 19.067946787575845
    - type: nauc_recall_at_5_std
      value: 13.61535834550991
    - type: ndcg_at_1
      value: 78.974
    - type: ndcg_at_10
      value: 72.21
    - type: ndcg_at_100
      value: 75.264
    - type: ndcg_at_1000
      value: 76.259
    - type: ndcg_at_20
      value: 73.628
    - type: ndcg_at_3
      value: 67.047
    - type: ndcg_at_5
      value: 69.974
    - type: precision_at_1
      value: 78.974
    - type: precision_at_10
      value: 15.267
    - type: precision_at_100
      value: 1.762
    - type: precision_at_1000
      value: 0.189
    - type: precision_at_20
      value: 8.09
    - type: precision_at_3
      value: 43.309
    - type: precision_at_5
      value: 28.294000000000004
    - type: recall_at_1
      value: 39.487
    - type: recall_at_10
      value: 76.334
    - type: recall_at_100
      value: 88.076
    - type: recall_at_1000
      value: 94.59100000000001
    - type: recall_at_20
      value: 80.898
    - type: recall_at_3
      value: 64.96300000000001
    - type: recall_at_5
      value: 70.736
  - task:
      type: Retrieval
    dataset:
      name: MTEB MSMARCO
      type: mteb/msmarco
      config: default
      split: dev
      revision: c5a29a104738b98a9e76336939199e264163d4a0
    metrics:
    - type: main_score
      value: 42.027
    - type: map_at_1
      value: 22.118
    - type: map_at_10
      value: 34.816
    - type: map_at_100
      value: 35.983
    - type: map_at_1000
      value: 36.028999999999996
    - type: map_at_20
      value: 35.545
    - type: map_at_3
      value: 30.752000000000002
    - type: map_at_5
      value: 33.114
    - type: mrr_at_1
      value: 22.793696275071635
    - type: mrr_at_10
      value: 35.47250079592483
    - type: mrr_at_100
      value: 36.576471512902856
    - type: mrr_at_1000
      value: 36.616205680509786
    - type: mrr_at_20
      value: 36.16557033864942
    - type: mrr_at_3
      value: 31.48758357211065
    - type: mrr_at_5
      value: 33.80563514804202
    - type: nauc_map_at_1000_diff1
      value: 32.89234100489284
    - type: nauc_map_at_1000_max
      value: 1.1802816553581001
    - type: nauc_map_at_1000_std
      value: -20.187692925732446
    - type: nauc_map_at_100_diff1
      value: 32.88694493681772
    - type: nauc_map_at_100_max
      value: 1.1732717578080365
    - type: nauc_map_at_100_std
      value: -20.164165529035245
    - type: nauc_map_at_10_diff1
      value: 32.826182211848796
    - type: nauc_map_at_10_max
      value: 1.1551262165737235
    - type: nauc_map_at_10_std
      value: -20.88326292319754
    - type: nauc_map_at_1_diff1
      value: 36.12732122790642
    - type: nauc_map_at_1_max
      value: 1.8197550109156913
    - type: nauc_map_at_1_std
      value: -17.205625720792167
    - type: nauc_map_at_20_diff1
      value: 32.83333177195551
    - type: nauc_map_at_20_max
      value: 1.0937431645506202
    - type: nauc_map_at_20_std
      value: -20.503956514646145
    - type: nauc_map_at_3_diff1
      value: 32.76264193805814
    - type: nauc_map_at_3_max
      value: 0.8560962042500389
    - type: nauc_map_at_3_std
      value: -20.608930717315577
    - type: nauc_map_at_5_diff1
      value: 32.78673238978775
    - type: nauc_map_at_5_max
      value: 1.0511863039329437
    - type: nauc_map_at_5_std
      value: -21.02164728626011
    - type: nauc_mrr_at_1000_diff1
      value: 32.610323934702286
    - type: nauc_mrr_at_1000_max
      value: 1.276669121901405
    - type: nauc_mrr_at_1000_std
      value: -19.908120615285043
    - type: nauc_mrr_at_100_diff1
      value: 32.601373758102795
    - type: nauc_mrr_at_100_max
      value: 1.2752735149992132
    - type: nauc_mrr_at_100_std
      value: -19.87937042610101
    - type: nauc_mrr_at_10_diff1
      value: 32.55795432078168
    - type: nauc_mrr_at_10_max
      value: 1.2881786969258637
    - type: nauc_mrr_at_10_std
      value: -20.54564519015977
    - type: nauc_mrr_at_1_diff1
      value: 35.596301376443726
    - type: nauc_mrr_at_1_max
      value: 1.7633238037306902
    - type: nauc_mrr_at_1_std
      value: -17.1999420019887
    - type: nauc_mrr_at_20_diff1
      value: 32.57185739111023
    - type: nauc_mrr_at_20_max
      value: 1.2212620853201877
    - type: nauc_mrr_at_20_std
      value: -20.179517281041264
    - type: nauc_mrr_at_3_diff1
      value: 32.42681377099514
    - type: nauc_mrr_at_3_max
      value: 0.8745921708861145
    - type: nauc_mrr_at_3_std
      value: -20.41017687790572
    - type: nauc_mrr_at_5_diff1
      value: 32.499107129648266
    - type: nauc_mrr_at_5_max
      value: 1.1159673851851573
    - type: nauc_mrr_at_5_std
      value: -20.695143502133824
    - type: nauc_ndcg_at_1000_diff1
      value: 32.16957965806702
    - type: nauc_ndcg_at_1000_max
      value: 1.6763998947980905
    - type: nauc_ndcg_at_1000_std
      value: -18.970592350332893
    - type: nauc_ndcg_at_100_diff1
      value: 31.977550102558872
    - type: nauc_ndcg_at_100_max
      value: 1.5625858650110014
    - type: nauc_ndcg_at_100_std
      value: -17.990456766123835
    - type: nauc_ndcg_at_10_diff1
      value: 31.82738932481356
    - type: nauc_ndcg_at_10_max
      value: 1.1661362042692103
    - type: nauc_ndcg_at_10_std
      value: -21.872680193994217
    - type: nauc_ndcg_at_1_diff1
      value: 35.596301376443726
    - type: nauc_ndcg_at_1_max
      value: 1.7633238037306902
    - type: nauc_ndcg_at_1_std
      value: -17.1999420019887
    - type: nauc_ndcg_at_20_diff1
      value: 31.749656399266264
    - type: nauc_ndcg_at_20_max
      value: 0.9629024493088691
    - type: nauc_ndcg_at_20_std
      value: -20.4379403899277
    - type: nauc_ndcg_at_3_diff1
      value: 31.731361436850836
    - type: nauc_ndcg_at_3_max
      value: 0.531749791578849
    - type: nauc_ndcg_at_3_std
      value: -21.551112910698674
    - type: nauc_ndcg_at_5_diff1
      value: 31.785373941157303
    - type: nauc_ndcg_at_5_max
      value: 0.86207769368333
    - type: nauc_ndcg_at_5_std
      value: -22.24923399160171
    - type: nauc_precision_at_1000_diff1
      value: -3.841288331986519
    - type: nauc_precision_at_1000_max
      value: 13.558041371634976
    - type: nauc_precision_at_1000_std
      value: 15.181510484512827
    - type: nauc_precision_at_100_diff1
      value: 12.441154582709053
    - type: nauc_precision_at_100_max
      value: 8.428136255841935
    - type: nauc_precision_at_100_std
      value: 14.710391839731656
    - type: nauc_precision_at_10_diff1
      value: 26.185854813986705
    - type: nauc_precision_at_10_max
      value: 1.6348387310504464
    - type: nauc_precision_at_10_std
      value: -23.448927004357298
    - type: nauc_precision_at_1_diff1
      value: 35.596301376443726
    - type: nauc_precision_at_1_max
      value: 1.7633238037306902
    - type: nauc_precision_at_1_std
      value: -17.1999420019887
    - type: nauc_precision_at_20_diff1
      value: 22.69194179544158
    - type: nauc_precision_at_20_max
      value: 1.2972015009169306
    - type: nauc_precision_at_20_std
      value: -15.751482380060269
    - type: nauc_precision_at_3_diff1
      value: 28.255531512125188
    - type: nauc_precision_at_3_max
      value: -0.3715575458464333
    - type: nauc_precision_at_3_std
      value: -24.227970454057697
    - type: nauc_precision_at_5_diff1
      value: 27.65497951098847
    - type: nauc_precision_at_5_max
      value: 0.449773375292472
    - type: nauc_precision_at_5_std
      value: -25.37445450938601
    - type: nauc_recall_at_1000_diff1
      value: 15.243948516763819
    - type: nauc_recall_at_1000_max
      value: 41.821227805251375
    - type: nauc_recall_at_1000_std
      value: 61.66297794838101
    - type: nauc_recall_at_100_diff1
      value: 24.516543685029994
    - type: nauc_recall_at_100_max
      value: 7.093972966253228
    - type: nauc_recall_at_100_std
      value: 17.244452321212282
    - type: nauc_recall_at_10_diff1
      value: 28.404243095182828
    - type: nauc_recall_at_10_max
      value: 1.0805210480930945
    - type: nauc_recall_at_10_std
      value: -24.885018657039527
    - type: nauc_recall_at_1_diff1
      value: 36.12732122790642
    - type: nauc_recall_at_1_max
      value: 1.8197550109156913
    - type: nauc_recall_at_1_std
      value: -17.205625720792167
    - type: nauc_recall_at_20_diff1
      value: 26.956250169438512
    - type: nauc_recall_at_20_max
      value: 0.023973408161285917
    - type: nauc_recall_at_20_std
      value: -18.32944444428131
    - type: nauc_recall_at_3_diff1
      value: 28.9894205130054
    - type: nauc_recall_at_3_max
      value: -0.36140658021466865
    - type: nauc_recall_at_3_std
      value: -24.022505107768364
    - type: nauc_recall_at_5_diff1
      value: 28.907023434955104
    - type: nauc_recall_at_5_max
      value: 0.2501037567297729
    - type: nauc_recall_at_5_std
      value: -25.719919602271496
    - type: ndcg_at_1
      value: 22.794
    - type: ndcg_at_10
      value: 42.027
    - type: ndcg_at_100
      value: 47.601
    - type: ndcg_at_1000
      value: 48.713
    - type: ndcg_at_20
      value: 44.623000000000005
    - type: ndcg_at_3
      value: 33.772999999999996
    - type: ndcg_at_5
      value: 37.991
    - type: precision_at_1
      value: 22.794
    - type: precision_at_10
      value: 6.711
    - type: precision_at_100
      value: 0.9490000000000001
    - type: precision_at_1000
      value: 0.105
    - type: precision_at_20
      value: 3.8920000000000003
    - type: precision_at_3
      value: 14.46
    - type: precision_at_5
      value: 10.822
    - type: recall_at_1
      value: 22.118
    - type: recall_at_10
      value: 64.201
    - type: recall_at_100
      value: 89.878
    - type: recall_at_1000
      value: 98.259
    - type: recall_at_20
      value: 74.34100000000001
    - type: recall_at_3
      value: 41.8
    - type: recall_at_5
      value: 51.959
  - task:
      type: Retrieval
    dataset:
      name: MTEB NFCorpus
      type: mteb/nfcorpus
      config: default
      split: test
      revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
    metrics:
    - type: main_score
      value: 36.201
    - type: map_at_1
      value: 5.654
    - type: map_at_10
      value: 13.402
    - type: map_at_100
      value: 16.849
    - type: map_at_1000
      value: 18.264
    - type: map_at_20
      value: 14.832
    - type: map_at_3
      value: 9.619
    - type: map_at_5
      value: 11.483
    - type: mrr_at_1
      value: 47.6780185758514
    - type: mrr_at_10
      value: 56.47906531033466
    - type: mrr_at_100
      value: 57.04539749991402
    - type: mrr_at_1000
      value: 57.08810157607369
    - type: mrr_at_20
      value: 56.88003170105462
    - type: mrr_at_3
      value: 54.43756449948401
    - type: mrr_at_5
      value: 55.660474716202266
    - type: nauc_map_at_1000_diff1
      value: 31.134615238698192
    - type: nauc_map_at_1000_max
      value: 36.09522002487132
    - type: nauc_map_at_1000_std
      value: 14.72627666649002
    - type: nauc_map_at_100_diff1
      value: 32.777473351864444
    - type: nauc_map_at_100_max
      value: 35.25391471621035
    - type: nauc_map_at_100_std
      value: 12.024428973861083
    - type: nauc_map_at_10_diff1
      value: 36.46466466148528
    - type: nauc_map_at_10_max
      value: 29.707805406826722
    - type: nauc_map_at_10_std
      value: 2.0678757794226335
    - type: nauc_map_at_1_diff1
      value: 54.30208426149679
    - type: nauc_map_at_1_max
      value: 18.69125148481608
    - type: nauc_map_at_1_std
      value: -8.970955660291802
    -
type: nauc_map_at_20_diff1 value: 34.76513311600623 - type: nauc_map_at_20_max value: 32.20666003570514 - type: nauc_map_at_20_std value: 5.924889441518581 - type: nauc_map_at_3_diff1 value: 45.73465176835491 - type: nauc_map_at_3_max value: 23.492291524989106 - type: nauc_map_at_3_std value: -5.0123536561688855 - type: nauc_map_at_5_diff1 value: 39.7128319374107 - type: nauc_map_at_5_max value: 25.84231729559691 - type: nauc_map_at_5_std value: -2.0861428981140344 - type: nauc_mrr_at_1000_diff1 value: 33.0997881703397 - type: nauc_mrr_at_1000_max value: 52.7089709923531 - type: nauc_mrr_at_1000_std value: 28.8517952674151 - type: nauc_mrr_at_100_diff1 value: 33.1094984027438 - type: nauc_mrr_at_100_max value: 52.74301398138847 - type: nauc_mrr_at_100_std value: 28.897997840300892 - type: nauc_mrr_at_10_diff1 value: 33.300713655464925 - type: nauc_mrr_at_10_max value: 52.572139698742184 - type: nauc_mrr_at_10_std value: 28.66875615527188 - type: nauc_mrr_at_1_diff1 value: 32.57632582147155 - type: nauc_mrr_at_1_max value: 46.020072246328816 - type: nauc_mrr_at_1_std value: 20.99097889820076 - type: nauc_mrr_at_20_diff1 value: 33.04083904518949 - type: nauc_mrr_at_20_max value: 52.597451362456994 - type: nauc_mrr_at_20_std value: 28.681527293587898 - type: nauc_mrr_at_3_diff1 value: 33.64864656322754 - type: nauc_mrr_at_3_max value: 51.82256412011279 - type: nauc_mrr_at_3_std value: 27.241260746740686 - type: nauc_mrr_at_5_diff1 value: 33.53201325467246 - type: nauc_mrr_at_5_max value: 52.79440885773516 - type: nauc_mrr_at_5_std value: 28.663081392086028 - type: nauc_ndcg_at_1000_diff1 value: 28.632650542040714 - type: nauc_ndcg_at_1000_max value: 51.24103069835822 - type: nauc_ndcg_at_1000_std value: 35.05503784757999 - type: nauc_ndcg_at_100_diff1 value: 29.082177715298503 - type: nauc_ndcg_at_100_max value: 45.24750203464315 - type: nauc_ndcg_at_100_std value: 27.146548925680914 - type: nauc_ndcg_at_10_diff1 value: 25.123554466093594 - type: nauc_ndcg_at_10_max value: 42.74355537806512 - type: nauc_ndcg_at_10_std value: 22.234407997803935 - type: nauc_ndcg_at_1_diff1 value: 33.75083940012058 - type: nauc_ndcg_at_1_max value: 44.44319402133161 - type: nauc_ndcg_at_1_std value: 19.146499358406487 - type: nauc_ndcg_at_20_diff1 value: 24.954207968331872 - type: nauc_ndcg_at_20_max value: 41.25991844405748 - type: nauc_ndcg_at_20_std value: 22.169009285868864 - type: nauc_ndcg_at_3_diff1 value: 28.186539942033516 - type: nauc_ndcg_at_3_max value: 44.40790009754965 - type: nauc_ndcg_at_3_std value: 20.99226576085115 - type: nauc_ndcg_at_5_diff1 value: 25.498387899376706 - type: nauc_ndcg_at_5_max value: 43.174709766261316 - type: nauc_ndcg_at_5_std value: 21.88111962672031 - type: nauc_precision_at_1000_diff1 value: -16.22321012507648 - type: nauc_precision_at_1000_max value: 5.808852256649677 - type: nauc_precision_at_1000_std value: 19.875641776698824 - type: nauc_precision_at_100_diff1 value: -10.248089374355486 - type: nauc_precision_at_100_max value: 19.29065415127588 - type: nauc_precision_at_100_std value: 31.75019665627339 - type: nauc_precision_at_10_diff1 value: 3.6783257583955056 - type: nauc_precision_at_10_max value: 39.22286010695767 - type: nauc_precision_at_10_std value: 31.225485732801022 - type: nauc_precision_at_1_diff1 value: 32.57632582147155 - type: nauc_precision_at_1_max value: 46.020072246328816 - type: nauc_precision_at_1_std value: 20.99097889820076 - type: nauc_precision_at_20_diff1 value: -3.1632510833242784 - type: nauc_precision_at_20_max value: 
31.575496762405734 - type: nauc_precision_at_20_std value: 31.576283324468115 - type: nauc_precision_at_3_diff1 value: 17.78864585545647 - type: nauc_precision_at_3_max value: 44.201289661125585 - type: nauc_precision_at_3_std value: 25.447840649726693 - type: nauc_precision_at_5_diff1 value: 9.986748662091358 - type: nauc_precision_at_5_max value: 41.214164860776755 - type: nauc_precision_at_5_std value: 28.22551704127726 - type: nauc_recall_at_1000_diff1 value: 10.984331766850506 - type: nauc_recall_at_1000_max value: 24.641216018034104 - type: nauc_recall_at_1000_std value: 26.91064221008446 - type: nauc_recall_at_100_diff1 value: 23.7009352078473 - type: nauc_recall_at_100_max value: 30.176031609451297 - type: nauc_recall_at_100_std value: 20.360365243211564 - type: nauc_recall_at_10_diff1 value: 28.11831737650638 - type: nauc_recall_at_10_max value: 24.21539670487414 - type: nauc_recall_at_10_std value: 2.245504974150148 - type: nauc_recall_at_1_diff1 value: 54.30208426149679 - type: nauc_recall_at_1_max value: 18.69125148481608 - type: nauc_recall_at_1_std value: -8.970955660291802 - type: nauc_recall_at_20_diff1 value: 26.199425305139908 - type: nauc_recall_at_20_max value: 24.66704097503736 - type: nauc_recall_at_20_std value: 5.86052107206246 - type: nauc_recall_at_3_diff1 value: 42.88348677575622 - type: nauc_recall_at_3_max value: 21.189371077603308 - type: nauc_recall_at_3_std value: -4.537510127238226 - type: nauc_recall_at_5_diff1 value: 30.7936756722569 - type: nauc_recall_at_5_max value: 21.06136406164962 - type: nauc_recall_at_5_std value: -1.4113804735229794 - type: ndcg_at_1 value: 45.975 - type: ndcg_at_10 value: 36.201 - type: ndcg_at_100 value: 32.736 - type: ndcg_at_1000 value: 41.099000000000004 - type: ndcg_at_20 value: 33.724 - type: ndcg_at_3 value: 42.242000000000004 - type: ndcg_at_5 value: 40.137 - type: precision_at_1 value: 47.678 - type: precision_at_10 value: 26.904 - type: precision_at_100 value: 8.368 - type: precision_at_1000 value: 2.078 - type: precision_at_20 value: 19.845 - type: precision_at_3 value: 40.351 - type: precision_at_5 value: 35.108 - type: recall_at_1 value: 5.654 - type: recall_at_10 value: 17.793 - type: recall_at_100 value: 32.483000000000004 - type: recall_at_1000 value: 63.294 - type: recall_at_20 value: 21.754 - type: recall_at_3 value: 10.771 - type: recall_at_5 value: 14.084 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: main_score value: 62.464 - type: map_at_1 value: 38.0 - type: map_at_10 value: 54.806 - type: map_at_100 value: 55.599 - type: map_at_1000 value: 55.617000000000004 - type: map_at_20 value: 55.336 - type: map_at_3 value: 50.58200000000001 - type: map_at_5 value: 53.181 - type: mrr_at_1 value: 42.46813441483198 - type: mrr_at_10 value: 57.060710147326446 - type: mrr_at_100 value: 57.60978373431328 - type: mrr_at_1000 value: 57.62192762809547 - type: mrr_at_20 value: 57.43431796174232 - type: mrr_at_3 value: 53.78041714947835 - type: mrr_at_5 value: 55.81257242178437 - type: nauc_map_at_1000_diff1 value: 38.337572188308194 - type: nauc_map_at_1000_max value: 27.550035254787197 - type: nauc_map_at_1000_std value: -7.5513729587308145 - type: nauc_map_at_100_diff1 value: 38.335337794455015 - type: nauc_map_at_100_max value: 27.56919614414171 - type: nauc_map_at_100_std value: -7.526017855405723 - type: nauc_map_at_10_diff1 value: 38.308131361353816 - type: nauc_map_at_10_max value: 27.691849580929933 - type: 
nauc_map_at_10_std value: -7.971461731555123 - type: nauc_map_at_1_diff1 value: 42.721072690634884 - type: nauc_map_at_1_max value: 21.750451486885332 - type: nauc_map_at_1_std value: -9.99540950522643 - type: nauc_map_at_20_diff1 value: 38.25792874982169 - type: nauc_map_at_20_max value: 27.68877906159661 - type: nauc_map_at_20_std value: -7.560753583212102 - type: nauc_map_at_3_diff1 value: 37.950570055936254 - type: nauc_map_at_3_max value: 26.257969511794858 - type: nauc_map_at_3_std value: -9.236868658300553 - type: nauc_map_at_5_diff1 value: 37.99893219450212 - type: nauc_map_at_5_max value: 27.293454259158057 - type: nauc_map_at_5_std value: -8.734089449603806 - type: nauc_mrr_at_1000_diff1 value: 37.777767467474774 - type: nauc_mrr_at_1000_max value: 27.39507603748298 - type: nauc_mrr_at_1000_std value: -5.554754076870114 - type: nauc_mrr_at_100_diff1 value: 37.77981674583538 - type: nauc_mrr_at_100_max value: 27.411100989441557 - type: nauc_mrr_at_100_std value: -5.539061231412731 - type: nauc_mrr_at_10_diff1 value: 37.72399003363479 - type: nauc_mrr_at_10_max value: 27.618142546685416 - type: nauc_mrr_at_10_std value: -5.6819843907448195 - type: nauc_mrr_at_1_diff1 value: 41.17596078958236 - type: nauc_mrr_at_1_max value: 23.32588591818617 - type: nauc_mrr_at_1_std value: -7.126628034623689 - type: nauc_mrr_at_20_diff1 value: 37.695136721588 - type: nauc_mrr_at_20_max value: 27.52850676467322 - type: nauc_mrr_at_20_std value: -5.50667995515647 - type: nauc_mrr_at_3_diff1 value: 37.23845700908964 - type: nauc_mrr_at_3_max value: 26.69389772971012 - type: nauc_mrr_at_3_std value: -6.31868405989011 - type: nauc_mrr_at_5_diff1 value: 37.33757394192838 - type: nauc_mrr_at_5_max value: 27.42091593836207 - type: nauc_mrr_at_5_std value: -5.993243330132065 - type: nauc_ndcg_at_1000_diff1 value: 37.74836061640332 - type: nauc_ndcg_at_1000_max value: 29.03148916289089 - type: nauc_ndcg_at_1000_std value: -5.543065770074502 - type: nauc_ndcg_at_100_diff1 value: 37.75593955089626 - type: nauc_ndcg_at_100_max value: 29.67109480272493 - type: nauc_ndcg_at_100_std value: -4.773697596687493 - type: nauc_ndcg_at_10_diff1 value: 37.41701174824348 - type: nauc_ndcg_at_10_max value: 30.448703434043445 - type: nauc_ndcg_at_10_std value: -6.306202666419071 - type: nauc_ndcg_at_1_diff1 value: 41.17596078958236 - type: nauc_ndcg_at_1_max value: 23.32588591818617 - type: nauc_ndcg_at_1_std value: -7.126628034623689 - type: nauc_ndcg_at_20_diff1 value: 37.17445197824622 - type: nauc_ndcg_at_20_max value: 30.47378561555209 - type: nauc_ndcg_at_20_std value: -4.921584853993488 - type: nauc_ndcg_at_3_diff1 value: 36.5261976812068 - type: nauc_ndcg_at_3_max value: 27.560538820208926 - type: nauc_ndcg_at_3_std value: -8.556686332882931 - type: nauc_ndcg_at_5_diff1 value: 36.571462759614526 - type: nauc_ndcg_at_5_max value: 29.363401730752585 - type: nauc_ndcg_at_5_std value: -7.825739170420347 - type: nauc_precision_at_1000_diff1 value: -12.588899483401223 - type: nauc_precision_at_1000_max value: 2.641097890578701 - type: nauc_precision_at_1000_std value: 17.643107625788748 - type: nauc_precision_at_100_diff1 value: -8.40579874206785 - type: nauc_precision_at_100_max value: 9.725496771040037 - type: nauc_precision_at_100_std value: 21.558582760191243 - type: nauc_precision_at_10_diff1 value: 6.619157191854486 - type: nauc_precision_at_10_max value: 23.767406373688402 - type: nauc_precision_at_10_std value: 10.428535003478808 - type: nauc_precision_at_1_diff1 value: 41.17596078958236 - type: 
nauc_precision_at_1_max value: 23.32588591818617 - type: nauc_precision_at_1_std value: -7.126628034623689 - type: nauc_precision_at_20_diff1 value: -0.6449974218292859 - type: nauc_precision_at_20_max value: 20.211503851418783 - type: nauc_precision_at_20_std value: 17.922745410142575 - type: nauc_precision_at_3_diff1 value: 19.710276097428657 - type: nauc_precision_at_3_max value: 26.768918044758706 - type: nauc_precision_at_3_std value: -1.0636448912049246 - type: nauc_precision_at_5_diff1 value: 13.073181337982613 - type: nauc_precision_at_5_max value: 26.418340338971024 - type: nauc_precision_at_5_std value: 2.9842078949528688 - type: nauc_recall_at_1000_diff1 value: 30.52411148739828 - type: nauc_recall_at_1000_max value: 90.96409807536762 - type: nauc_recall_at_1000_std value: 83.94857830921949 - type: nauc_recall_at_100_diff1 value: 36.936303690592155 - type: nauc_recall_at_100_max value: 71.91515014325869 - type: nauc_recall_at_100_std value: 48.93061263403371 - type: nauc_recall_at_10_diff1 value: 32.84292362076269 - type: nauc_recall_at_10_max value: 44.27252783122478 - type: nauc_recall_at_10_std value: -1.5981198975612385 - type: nauc_recall_at_1_diff1 value: 42.721072690634884 - type: nauc_recall_at_1_max value: 21.750451486885332 - type: nauc_recall_at_1_std value: -9.99540950522643 - type: nauc_recall_at_20_diff1 value: 29.36724417081702 - type: nauc_recall_at_20_max value: 52.035846390214715 - type: nauc_recall_at_20_std value: 11.967264191332818 - type: nauc_recall_at_3_diff1 value: 31.634923771936098 - type: nauc_recall_at_3_max value: 30.225743369869473 - type: nauc_recall_at_3_std value: -9.253665347118615 - type: nauc_recall_at_5_diff1 value: 30.66271853090737 - type: nauc_recall_at_5_max value: 35.70815715994996 - type: nauc_recall_at_5_std value: -7.836012956078996 - type: ndcg_at_1 value: 42.468 - type: ndcg_at_10 value: 62.464 - type: ndcg_at_100 value: 65.618 - type: ndcg_at_1000 value: 66.014 - type: ndcg_at_20 value: 64.12 - type: ndcg_at_3 value: 54.790000000000006 - type: ndcg_at_5 value: 58.992 - type: precision_at_1 value: 42.468 - type: precision_at_10 value: 9.959 - type: precision_at_100 value: 1.174 - type: precision_at_1000 value: 0.121 - type: precision_at_20 value: 5.380999999999999 - type: precision_at_3 value: 24.73 - type: precision_at_5 value: 17.299999999999997 - type: recall_at_1 value: 38.0 - type: recall_at_10 value: 83.22699999999999 - type: recall_at_100 value: 96.584 - type: recall_at_1000 value: 99.512 - type: recall_at_20 value: 89.291 - type: recall_at_3 value: 63.666 - type: recall_at_5 value: 73.27900000000001 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: e4e08e0b7dbe3c8700f0daef558ff32256715259 metrics: - type: main_score value: 87.366 - type: map_at_1 value: 69.95700000000001 - type: map_at_10 value: 83.55 - type: map_at_100 value: 84.196 - type: map_at_1000 value: 84.21600000000001 - type: map_at_20 value: 83.982 - type: map_at_3 value: 80.647 - type: map_at_5 value: 82.443 - type: mrr_at_1 value: 80.39 - type: mrr_at_10 value: 86.65646031746004 - type: mrr_at_100 value: 86.7852113210373 - type: mrr_at_1000 value: 86.78651118354796 - type: mrr_at_20 value: 86.75772838878498 - type: mrr_at_3 value: 85.67499999999971 - type: mrr_at_5 value: 86.33749999999962 - type: nauc_map_at_1000_diff1 value: 76.68189702770007 - type: nauc_map_at_1000_max value: 36.19988239025682 - type: nauc_map_at_1000_std value: -26.231691135645736 - type: nauc_map_at_100_diff1 value: 
76.68832712120171 - type: nauc_map_at_100_max value: 36.18627717337547 - type: nauc_map_at_100_std value: -26.28243886166 - type: nauc_map_at_10_diff1 value: 76.88888516032657 - type: nauc_map_at_10_max value: 35.69809861085124 - type: nauc_map_at_10_std value: -27.859425473864224 - type: nauc_map_at_1_diff1 value: 79.5243725217315 - type: nauc_map_at_1_max value: 27.092773841207002 - type: nauc_map_at_1_std value: -26.223200911204543 - type: nauc_map_at_20_diff1 value: 76.74938996155176 - type: nauc_map_at_20_max value: 36.07373781351406 - type: nauc_map_at_20_std value: -26.891400098628015 - type: nauc_map_at_3_diff1 value: 77.29604745045076 - type: nauc_map_at_3_max value: 33.11431059356283 - type: nauc_map_at_3_std value: -29.555237195931085 - type: nauc_map_at_5_diff1 value: 77.14069217901078 - type: nauc_map_at_5_max value: 34.68656073526487 - type: nauc_map_at_5_std value: -28.945053669861508 - type: nauc_mrr_at_1000_diff1 value: 76.66087451567746 - type: nauc_mrr_at_1000_max value: 38.78133177265328 - type: nauc_mrr_at_1000_std value: -23.75726541774991 - type: nauc_mrr_at_100_diff1 value: 76.66117078261013 - type: nauc_mrr_at_100_max value: 38.782533036423885 - type: nauc_mrr_at_100_std value: -23.752587601473568 - type: nauc_mrr_at_10_diff1 value: 76.65866401411019 - type: nauc_mrr_at_10_max value: 38.87950311049704 - type: nauc_mrr_at_10_std value: -23.873660706680578 - type: nauc_mrr_at_1_diff1 value: 77.42633506487041 - type: nauc_mrr_at_1_max value: 37.93973722217786 - type: nauc_mrr_at_1_std value: -23.3984130771317 - type: nauc_mrr_at_20_diff1 value: 76.66210684923414 - type: nauc_mrr_at_20_max value: 38.81293033048911 - type: nauc_mrr_at_20_std value: -23.736590746133736 - type: nauc_mrr_at_3_diff1 value: 76.33711764736019 - type: nauc_mrr_at_3_max value: 38.5659231830368 - type: nauc_mrr_at_3_std value: -23.99588149124865 - type: nauc_mrr_at_5_diff1 value: 76.57123830226054 - type: nauc_mrr_at_5_max value: 38.97947097392977 - type: nauc_mrr_at_5_std value: -23.943668957974246 - type: nauc_ndcg_at_1000_diff1 value: 76.38447339050585 - type: nauc_ndcg_at_1000_max value: 37.756822792877934 - type: nauc_ndcg_at_1000_std value: -24.046995734357164 - type: nauc_ndcg_at_100_diff1 value: 76.44058018066822 - type: nauc_ndcg_at_100_max value: 37.72948294169218 - type: nauc_ndcg_at_100_std value: -24.083432140741795 - type: nauc_ndcg_at_10_diff1 value: 76.56246287923074 - type: nauc_ndcg_at_10_max value: 37.0329253490553 - type: nauc_ndcg_at_10_std value: -26.6495163705961 - type: nauc_ndcg_at_1_diff1 value: 77.4085129990432 - type: nauc_ndcg_at_1_max value: 38.06139172214421 - type: nauc_ndcg_at_1_std value: -23.656477126977386 - type: nauc_ndcg_at_20_diff1 value: 76.50192496743098 - type: nauc_ndcg_at_20_max value: 37.51759311013985 - type: nauc_ndcg_at_20_std value: -25.45517058360004 - type: nauc_ndcg_at_3_diff1 value: 75.94398494081794 - type: nauc_ndcg_at_3_max value: 35.7666711547279 - type: nauc_ndcg_at_3_std value: -26.866022682361578 - type: nauc_ndcg_at_5_diff1 value: 76.47334274088344 - type: nauc_ndcg_at_5_max value: 36.40830331490731 - type: nauc_ndcg_at_5_std value: -27.170121189572765 - type: nauc_precision_at_1000_diff1 value: -43.33672630765437 - type: nauc_precision_at_1000_max value: -5.089751329149161 - type: nauc_precision_at_1000_std value: 30.6241447847051 - type: nauc_precision_at_100_diff1 value: -42.736833035629864 - type: nauc_precision_at_100_max value: -4.060198408346224 - type: nauc_precision_at_100_std value: 29.807050266205344 - type: 
nauc_precision_at_10_diff1 value: -35.90810562245906 - type: nauc_precision_at_10_max value: 1.1633204529249133 - type: nauc_precision_at_10_std value: 20.129691203276018 - type: nauc_precision_at_1_diff1 value: 77.4085129990432 - type: nauc_precision_at_1_max value: 38.06139172214421 - type: nauc_precision_at_1_std value: -23.656477126977386 - type: nauc_precision_at_20_diff1 value: -40.2132286912738 - type: nauc_precision_at_20_max value: -1.3004735030734194 - type: nauc_precision_at_20_std value: 25.15612293757488 - type: nauc_precision_at_3_diff1 value: -13.873825299883904 - type: nauc_precision_at_3_max value: 11.038689278907233 - type: nauc_precision_at_3_std value: 5.4276449621706 - type: nauc_precision_at_5_diff1 value: -27.151668633894737 - type: nauc_precision_at_5_max value: 5.795130010163115 - type: nauc_precision_at_5_std value: 13.220722167587375 - type: nauc_recall_at_1000_diff1 value: 83.903950427863 - type: nauc_recall_at_1000_max value: 37.82919000897223 - type: nauc_recall_at_1000_std value: 70.65670846771707 - type: nauc_recall_at_100_diff1 value: 75.23306095335836 - type: nauc_recall_at_100_max value: 37.54281648247423 - type: nauc_recall_at_100_std value: 8.434289114377373 - type: nauc_recall_at_10_diff1 value: 72.7872912723047 - type: nauc_recall_at_10_max value: 34.261519652104184 - type: nauc_recall_at_10_std value: -34.60101950810808 - type: nauc_recall_at_1_diff1 value: 79.5243725217315 - type: nauc_recall_at_1_max value: 27.092773841207002 - type: nauc_recall_at_1_std value: -26.223200911204543 - type: nauc_recall_at_20_diff1 value: 72.8297963091964 - type: nauc_recall_at_20_max value: 36.070220569670916 - type: nauc_recall_at_20_std value: -27.20897179168245 - type: nauc_recall_at_3_diff1 value: 73.47456374650459 - type: nauc_recall_at_3_max value: 29.901663407294816 - type: nauc_recall_at_3_std value: -32.83329537040381 - type: nauc_recall_at_5_diff1 value: 73.05025750827126 - type: nauc_recall_at_5_max value: 32.35733470860963 - type: nauc_recall_at_5_std value: -34.32357558493091 - type: ndcg_at_1 value: 80.4 - type: ndcg_at_10 value: 87.366 - type: ndcg_at_100 value: 88.7 - type: ndcg_at_1000 value: 88.842 - type: ndcg_at_20 value: 88.11 - type: ndcg_at_3 value: 84.52499999999999 - type: ndcg_at_5 value: 86.047 - type: precision_at_1 value: 80.4 - type: precision_at_10 value: 13.235 - type: precision_at_100 value: 1.516 - type: precision_at_1000 value: 0.156 - type: precision_at_20 value: 7.037 - type: precision_at_3 value: 36.9 - type: precision_at_5 value: 24.236 - type: recall_at_1 value: 69.95700000000001 - type: recall_at_10 value: 94.535 - type: recall_at_100 value: 99.164 - type: recall_at_1000 value: 99.855 - type: recall_at_20 value: 96.974 - type: recall_at_3 value: 86.33800000000001 - type: recall_at_5 value: 90.69 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88 metrics: - type: main_score value: 21.492 - type: map_at_1 value: 5.192 - type: map_at_10 value: 12.959000000000001 - type: map_at_100 value: 14.963999999999999 - type: map_at_1000 value: 15.261 - type: map_at_20 value: 13.988999999999999 - type: map_at_3 value: 9.235 - type: map_at_5 value: 11.042 - type: mrr_at_1 value: 25.5 - type: mrr_at_10 value: 36.37313492063491 - type: mrr_at_100 value: 37.36517957347626 - type: mrr_at_1000 value: 37.42538601073437 - type: mrr_at_20 value: 36.987896404421136 - type: mrr_at_3 value: 32.966666666666654 - type: mrr_at_5 value: 34.95166666666664 - 
type: nauc_map_at_1000_diff1 value: 13.635120934154395 - type: nauc_map_at_1000_max value: 28.03542983005195 - type: nauc_map_at_1000_std value: 17.07156940311778 - type: nauc_map_at_100_diff1 value: 13.59237295184475 - type: nauc_map_at_100_max value: 27.992291365051237 - type: nauc_map_at_100_std value: 16.926533467400464 - type: nauc_map_at_10_diff1 value: 14.149193235999993 - type: nauc_map_at_10_max value: 26.520643811139305 - type: nauc_map_at_10_std value: 13.168673602548925 - type: nauc_map_at_1_diff1 value: 20.096094508148465 - type: nauc_map_at_1_max value: 17.41582245576302 - type: nauc_map_at_1_std value: 5.771729007558897 - type: nauc_map_at_20_diff1 value: 13.977726400526427 - type: nauc_map_at_20_max value: 27.2322235491895 - type: nauc_map_at_20_std value: 14.972781677750435 - type: nauc_map_at_3_diff1 value: 17.371153027460355 - type: nauc_map_at_3_max value: 24.457758503208254 - type: nauc_map_at_3_std value: 7.719726821179824 - type: nauc_map_at_5_diff1 value: 14.600442843442574 - type: nauc_map_at_5_max value: 25.899736370856296 - type: nauc_map_at_5_std value: 10.125349354853359 - type: nauc_mrr_at_1000_diff1 value: 18.70342821390236 - type: nauc_mrr_at_1000_max value: 23.365194520549114 - type: nauc_mrr_at_1000_std value: 12.185114294903236 - type: nauc_mrr_at_100_diff1 value: 18.677858738015907 - type: nauc_mrr_at_100_max value: 23.372641996726742 - type: nauc_mrr_at_100_std value: 12.216130561991909 - type: nauc_mrr_at_10_diff1 value: 18.79094453090232 - type: nauc_mrr_at_10_max value: 23.511686337006466 - type: nauc_mrr_at_10_std value: 11.879716687008134 - type: nauc_mrr_at_1_diff1 value: 20.10455171810408 - type: nauc_mrr_at_1_max value: 17.741566234315428 - type: nauc_mrr_at_1_std value: 6.1676764583652215 - type: nauc_mrr_at_20_diff1 value: 18.70143648544655 - type: nauc_mrr_at_20_max value: 23.45603239095019 - type: nauc_mrr_at_20_std value: 12.244613576686202 - type: nauc_mrr_at_3_diff1 value: 18.894662528857374 - type: nauc_mrr_at_3_max value: 23.3739038101588 - type: nauc_mrr_at_3_std value: 10.4709044796543 - type: nauc_mrr_at_5_diff1 value: 18.877786065095563 - type: nauc_mrr_at_5_max value: 23.78061081203872 - type: nauc_mrr_at_5_std value: 11.847882917869622 - type: nauc_ndcg_at_1000_diff1 value: 13.99159027398115 - type: nauc_ndcg_at_1000_max value: 29.44766808611483 - type: nauc_ndcg_at_1000_std value: 24.289749574699915 - type: nauc_ndcg_at_100_diff1 value: 13.164020363258746 - type: nauc_ndcg_at_100_max value: 29.642442997167723 - type: nauc_ndcg_at_100_std value: 23.761764515453866 - type: nauc_ndcg_at_10_diff1 value: 14.839883268638546 - type: nauc_ndcg_at_10_max value: 27.21043708455449 - type: nauc_ndcg_at_10_std value: 15.56110419291775 - type: nauc_ndcg_at_1_diff1 value: 20.10455171810408 - type: nauc_ndcg_at_1_max value: 17.741566234315428 - type: nauc_ndcg_at_1_std value: 6.1676764583652215 - type: nauc_ndcg_at_20_diff1 value: 14.27998110295395 - type: nauc_ndcg_at_20_max value: 28.2492026337839 - type: nauc_ndcg_at_20_std value: 18.822356982979105 - type: nauc_ndcg_at_3_diff1 value: 17.659263157535445 - type: nauc_ndcg_at_3_max value: 25.416706421591396 - type: nauc_ndcg_at_3_std value: 9.650689638152636 - type: nauc_ndcg_at_5_diff1 value: 15.38459833918123 - type: nauc_ndcg_at_5_max value: 26.92495519416969 - type: nauc_ndcg_at_5_std value: 12.71017696809276 - type: nauc_precision_at_1000_diff1 value: 6.128490135458364 - type: nauc_precision_at_1000_max value: 23.52693893261883 - type: nauc_precision_at_1000_std value: 36.280432732819925 - 
type: nauc_precision_at_100_diff1 value: 5.306163791220436 - type: nauc_precision_at_100_max value: 27.67851033239246 - type: nauc_precision_at_100_std value: 34.29821573752515 - type: nauc_precision_at_10_diff1 value: 10.829686435425472 - type: nauc_precision_at_10_max value: 27.201648684015318 - type: nauc_precision_at_10_std value: 19.376999508233254 - type: nauc_precision_at_1_diff1 value: 20.10455171810408 - type: nauc_precision_at_1_max value: 17.741566234315428 - type: nauc_precision_at_1_std value: 6.1676764583652215 - type: nauc_precision_at_20_diff1 value: 9.416169626702048 - type: nauc_precision_at_20_max value: 27.65257998670333 - type: nauc_precision_at_20_std value: 24.761868509805826 - type: nauc_precision_at_3_diff1 value: 16.666456902017348 - type: nauc_precision_at_3_max value: 27.9969730961105 - type: nauc_precision_at_3_std value: 10.991562741393231 - type: nauc_precision_at_5_diff1 value: 12.26205064462843 - type: nauc_precision_at_5_max value: 29.083848730874095 - type: nauc_precision_at_5_std value: 15.66630836555747 - type: nauc_recall_at_1000_diff1 value: 5.600277836894063 - type: nauc_recall_at_1000_max value: 23.228705161815526 - type: nauc_recall_at_1000_std value: 36.822431061799485 - type: nauc_recall_at_100_diff1 value: 4.991781244867178 - type: nauc_recall_at_100_max value: 27.70095625483475 - type: nauc_recall_at_100_std value: 34.67168431597854 - type: nauc_recall_at_10_diff1 value: 10.580860425931972 - type: nauc_recall_at_10_max value: 27.145829414223666 - type: nauc_recall_at_10_std value: 19.330630157067382 - type: nauc_recall_at_1_diff1 value: 20.096094508148465 - type: nauc_recall_at_1_max value: 17.41582245576302 - type: nauc_recall_at_1_std value: 5.771729007558897 - type: nauc_recall_at_20_diff1 value: 9.06945331260344 - type: nauc_recall_at_20_max value: 27.56725251066482 - type: nauc_recall_at_20_std value: 24.77644509886098 - type: nauc_recall_at_3_diff1 value: 16.660507676429322 - type: nauc_recall_at_3_max value: 27.816546386536434 - type: nauc_recall_at_3_std value: 10.687824478247007 - type: nauc_recall_at_5_diff1 value: 11.992514446369388 - type: nauc_recall_at_5_max value: 28.789031176671948 - type: nauc_recall_at_5_std value: 15.422118990090805 - type: ndcg_at_1 value: 25.5 - type: ndcg_at_10 value: 21.492 - type: ndcg_at_100 value: 29.022 - type: ndcg_at_1000 value: 34.298 - type: ndcg_at_20 value: 24.237000000000002 - type: ndcg_at_3 value: 20.392 - type: ndcg_at_5 value: 17.801000000000002 - type: precision_at_1 value: 25.5 - type: precision_at_10 value: 11.09 - type: precision_at_100 value: 2.1919999999999997 - type: precision_at_1000 value: 0.346 - type: precision_at_20 value: 7.135 - type: precision_at_3 value: 18.933 - type: precision_at_5 value: 15.52 - type: recall_at_1 value: 5.192 - type: recall_at_10 value: 22.512999999999998 - type: recall_at_100 value: 44.505 - type: recall_at_1000 value: 70.267 - type: recall_at_20 value: 28.965000000000003 - type: recall_at_3 value: 11.522 - type: recall_at_5 value: 15.751999999999999 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: main_score value: 71.586 - type: map_at_1 value: 56.760999999999996 - type: map_at_10 value: 66.893 - type: map_at_100 value: 67.42 - type: map_at_1000 value: 67.44200000000001 - type: map_at_20 value: 67.232 - type: map_at_3 value: 64.193 - type: map_at_5 value: 65.73400000000001 - type: mrr_at_1 value: 60.0 - type: mrr_at_10 value: 
68.20383597883595 - type: mrr_at_100 value: 68.58867453733343 - type: mrr_at_1000 value: 68.61117469977329 - type: mrr_at_20 value: 68.43973740684265 - type: mrr_at_3 value: 66.11111111111111 - type: mrr_at_5 value: 67.44444444444446 - type: nauc_map_at_1000_diff1 value: 72.66688261123035 - type: nauc_map_at_1000_max value: 61.02926282006283 - type: nauc_map_at_1000_std value: 11.084549829740526 - type: nauc_map_at_100_diff1 value: 72.66226192320828 - type: nauc_map_at_100_max value: 61.04393223108811 - type: nauc_map_at_100_std value: 11.101529343291695 - type: nauc_map_at_10_diff1 value: 72.66732266693091 - type: nauc_map_at_10_max value: 61.24124296311832 - type: nauc_map_at_10_std value: 10.91179451961794 - type: nauc_map_at_1_diff1 value: 74.2356464256346 - type: nauc_map_at_1_max value: 54.06962758957632 - type: nauc_map_at_1_std value: 0.8037891907963532 - type: nauc_map_at_20_diff1 value: 72.65198594061253 - type: nauc_map_at_20_max value: 61.130159351448185 - type: nauc_map_at_20_std value: 11.2246899245522 - type: nauc_map_at_3_diff1 value: 72.78578673303954 - type: nauc_map_at_3_max value: 59.19073262936321 - type: nauc_map_at_3_std value: 8.460301560522968 - type: nauc_map_at_5_diff1 value: 72.55004168261968 - type: nauc_map_at_5_max value: 59.75181935082357 - type: nauc_map_at_5_std value: 9.440299527201889 - type: nauc_mrr_at_1000_diff1 value: 72.82720348470325 - type: nauc_mrr_at_1000_max value: 62.344231223741446 - type: nauc_mrr_at_1000_std value: 12.60196558488974 - type: nauc_mrr_at_100_diff1 value: 72.82236849255094 - type: nauc_mrr_at_100_max value: 62.35799491393125 - type: nauc_mrr_at_100_std value: 12.617900773655673 - type: nauc_mrr_at_10_diff1 value: 72.7722847495086 - type: nauc_mrr_at_10_max value: 62.66642401155435 - type: nauc_mrr_at_10_std value: 12.906381237738746 - type: nauc_mrr_at_1_diff1 value: 74.71208073612343 - type: nauc_mrr_at_1_max value: 59.50430394775893 - type: nauc_mrr_at_1_std value: 8.129514198080512 - type: nauc_mrr_at_20_diff1 value: 72.78312367361772 - type: nauc_mrr_at_20_max value: 62.421122493761885 - type: nauc_mrr_at_20_std value: 12.693437522498588 - type: nauc_mrr_at_3_diff1 value: 73.50670156385345 - type: nauc_mrr_at_3_max value: 62.01717537699209 - type: nauc_mrr_at_3_std value: 11.926548252191182 - type: nauc_mrr_at_5_diff1 value: 72.62204028549876 - type: nauc_mrr_at_5_max value: 62.319358766312085 - type: nauc_mrr_at_5_std value: 13.081257923284342 - type: nauc_ndcg_at_1000_diff1 value: 72.29960539074736 - type: nauc_ndcg_at_1000_max value: 62.75096959221402 - type: nauc_ndcg_at_1000_std value: 13.81528462505362 - type: nauc_ndcg_at_100_diff1 value: 72.19985782073529 - type: nauc_ndcg_at_100_max value: 63.18837705326287 - type: nauc_ndcg_at_100_std value: 14.506479655117138 - type: nauc_ndcg_at_10_diff1 value: 71.85759847832983 - type: nauc_ndcg_at_10_max value: 64.150996056865 - type: nauc_ndcg_at_10_std value: 14.580606901634278 - type: nauc_ndcg_at_1_diff1 value: 74.71208073612343 - type: nauc_ndcg_at_1_max value: 59.50430394775893 - type: nauc_ndcg_at_1_std value: 8.129514198080512 - type: nauc_ndcg_at_20_diff1 value: 71.80987178228351 - type: nauc_ndcg_at_20_max value: 63.56269460865743 - type: nauc_ndcg_at_20_std value: 15.024978004625922 - type: nauc_ndcg_at_3_diff1 value: 72.35095651602592 - type: nauc_ndcg_at_3_max value: 61.60548011855679 - type: nauc_ndcg_at_3_std value: 12.048248788835263 - type: nauc_ndcg_at_5_diff1 value: 71.48615621881864 - type: nauc_ndcg_at_5_max value: 61.72870035979784 - type: 
nauc_ndcg_at_5_std value: 12.83048357446691 - type: nauc_precision_at_1000_diff1 value: -14.743011420972 - type: nauc_precision_at_1000_max value: 19.281995763080158 - type: nauc_precision_at_1000_std value: 49.6140660398164 - type: nauc_precision_at_100_diff1 value: 0.11278174806205563 - type: nauc_precision_at_100_max value: 29.704511820077332 - type: nauc_precision_at_100_std value: 47.84916954122579 - type: nauc_precision_at_10_diff1 value: 20.498227967235728 - type: nauc_precision_at_10_max value: 47.883119365891595 - type: nauc_precision_at_10_std value: 45.182178693450595 - type: nauc_precision_at_1_diff1 value: 74.71208073612343 - type: nauc_precision_at_1_max value: 59.50430394775893 - type: nauc_precision_at_1_std value: 8.129514198080512 - type: nauc_precision_at_20_diff1 value: 12.551737222341455 - type: nauc_precision_at_20_max value: 40.618899501225634 - type: nauc_precision_at_20_std value: 48.5598454249067 - type: nauc_precision_at_3_diff1 value: 47.67720764601145 - type: nauc_precision_at_3_max value: 56.50632017305064 - type: nauc_precision_at_3_std value: 31.14175140162157 - type: nauc_precision_at_5_diff1 value: 35.10058622792819 - type: nauc_precision_at_5_max value: 51.88948872657981 - type: nauc_precision_at_5_std value: 37.62796957461928 - type: nauc_recall_at_1000_diff1 value: 79.57516339869238 - type: nauc_recall_at_1000_max value: 86.11111111111035 - type: nauc_recall_at_1000_std value: 79.57516339869238 - type: nauc_recall_at_100_diff1 value: 70.50859559510081 - type: nauc_recall_at_100_max value: 79.17009941231396 - type: nauc_recall_at_100_std value: 44.32910419069595 - type: nauc_recall_at_10_diff1 value: 66.16118569361245 - type: nauc_recall_at_10_max value: 74.73542948302286 - type: nauc_recall_at_10_std value: 27.680330939810037 - type: nauc_recall_at_1_diff1 value: 74.2356464256346 - type: nauc_recall_at_1_max value: 54.06962758957632 - type: nauc_recall_at_1_std value: 0.8037891907963532 - type: nauc_recall_at_20_diff1 value: 65.4748436545527 - type: nauc_recall_at_20_max value: 73.81532199081235 - type: nauc_recall_at_20_std value: 33.59324708196253 - type: nauc_recall_at_3_diff1 value: 68.83194804473622 - type: nauc_recall_at_3_max value: 61.77722610439669 - type: nauc_recall_at_3_std value: 13.984923756556714 - type: nauc_recall_at_5_diff1 value: 65.51467417209523 - type: nauc_recall_at_5_max value: 64.08276291427661 - type: nauc_recall_at_5_std value: 19.976472037847167 - type: ndcg_at_1 value: 60.0 - type: ndcg_at_10 value: 71.586 - type: ndcg_at_100 value: 73.76899999999999 - type: ndcg_at_1000 value: 74.386 - type: ndcg_at_20 value: 72.612 - type: ndcg_at_3 value: 66.944 - type: ndcg_at_5 value: 69.333 - type: precision_at_1 value: 60.0 - type: precision_at_10 value: 9.6 - type: precision_at_100 value: 1.073 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_20 value: 5.033 - type: precision_at_3 value: 26.333000000000002 - type: precision_at_5 value: 17.4 - type: recall_at_1 value: 56.760999999999996 - type: recall_at_10 value: 84.589 - type: recall_at_100 value: 94.333 - type: recall_at_1000 value: 99.333 - type: recall_at_20 value: 88.43299999999999 - type: recall_at_3 value: 72.10600000000001 - type: recall_at_5 value: 78.194 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: bb9466bac8153a0349341eb1b22e06409e78ef4e metrics: - type: main_score value: 84.60600000000001 - type: map_at_1 value: 0.257 - type: map_at_10 value: 2.196 - type: map_at_100 value: 
13.252 - type: map_at_1000 value: 31.473000000000003 - type: map_at_20 value: 4.023000000000001 - type: map_at_3 value: 0.722 - type: map_at_5 value: 1.146 - type: mrr_at_1 value: 94.0 - type: mrr_at_10 value: 97.0 - type: mrr_at_100 value: 97.0 - type: mrr_at_1000 value: 97.0 - type: mrr_at_20 value: 97.0 - type: mrr_at_3 value: 97.0 - type: mrr_at_5 value: 97.0 - type: nauc_map_at_1000_diff1 value: -30.674816554207062 - type: nauc_map_at_1000_max value: 53.18598689657068 - type: nauc_map_at_1000_std value: 78.88325309469121 - type: nauc_map_at_100_diff1 value: -17.6877824653978 - type: nauc_map_at_100_max value: 19.584159765315658 - type: nauc_map_at_100_std value: 48.051154190992726 - type: nauc_map_at_10_diff1 value: 20.076631089898626 - type: nauc_map_at_10_max value: -8.642556160185636 - type: nauc_map_at_10_std value: -5.768698617334298 - type: nauc_map_at_1_diff1 value: 27.342260509653798 - type: nauc_map_at_1_max value: -23.400451210297994 - type: nauc_map_at_1_std value: -21.152006353733853 - type: nauc_map_at_20_diff1 value: 8.019321726240506 - type: nauc_map_at_20_max value: -1.4826378210544222 - type: nauc_map_at_20_std value: 5.698208117745366 - type: nauc_map_at_3_diff1 value: 32.073377946749446 - type: nauc_map_at_3_max value: -13.099353983204654 - type: nauc_map_at_3_std value: -15.36319127398037 - type: nauc_map_at_5_diff1 value: 22.500045815797876 - type: nauc_map_at_5_max value: -8.548135411428023 - type: nauc_map_at_5_std value: -8.547850460331334 - type: nauc_mrr_at_1000_diff1 value: -6.022408963585526 - type: nauc_mrr_at_1000_max value: 4.481792717087155 - type: nauc_mrr_at_1000_std value: 51.6962340491753 - type: nauc_mrr_at_100_diff1 value: -6.022408963585526 - type: nauc_mrr_at_100_max value: 4.481792717087155 - type: nauc_mrr_at_100_std value: 51.6962340491753 - type: nauc_mrr_at_10_diff1 value: -6.022408963585526 - type: nauc_mrr_at_10_max value: 4.481792717087155 - type: nauc_mrr_at_10_std value: 51.6962340491753 - type: nauc_mrr_at_1_diff1 value: -6.022408963585076 - type: nauc_mrr_at_1_max value: 4.481792717087146 - type: nauc_mrr_at_1_std value: 51.69623404917518 - type: nauc_mrr_at_20_diff1 value: -6.022408963585526 - type: nauc_mrr_at_20_max value: 4.481792717087155 - type: nauc_mrr_at_20_std value: 51.6962340491753 - type: nauc_mrr_at_3_diff1 value: -6.022408963585526 - type: nauc_mrr_at_3_max value: 4.481792717087155 - type: nauc_mrr_at_3_std value: 51.6962340491753 - type: nauc_mrr_at_5_diff1 value: -6.022408963585526 - type: nauc_mrr_at_5_max value: 4.481792717087155 - type: nauc_mrr_at_5_std value: 51.6962340491753 - type: nauc_ndcg_at_1000_diff1 value: -20.79697283984295 - type: nauc_ndcg_at_1000_max value: 52.97671908009218 - type: nauc_ndcg_at_1000_std value: 75.43907707019758 - type: nauc_ndcg_at_100_diff1 value: -38.620752706946455 - type: nauc_ndcg_at_100_max value: 49.41307462381511 - type: nauc_ndcg_at_100_std value: 81.33299379244252 - type: nauc_ndcg_at_10_diff1 value: -18.611906363037356 - type: nauc_ndcg_at_10_max value: 44.20544651664479 - type: nauc_ndcg_at_10_std value: 61.322552829935816 - type: nauc_ndcg_at_1_diff1 value: 18.625935567849073 - type: nauc_ndcg_at_1_max value: -10.104132769280879 - type: nauc_ndcg_at_1_std value: 22.449560689879743 - type: nauc_ndcg_at_20_diff1 value: -30.61130208138771 - type: nauc_ndcg_at_20_max value: 52.68851710375231 - type: nauc_ndcg_at_20_std value: 69.72357683382992 - type: nauc_ndcg_at_3_diff1 value: 5.695394821691213 - type: nauc_ndcg_at_3_max value: 37.909122367102135 - type: 
nauc_ndcg_at_3_std value: 46.2366603255159 - type: nauc_ndcg_at_5_diff1 value: -15.273067832464731 - type: nauc_ndcg_at_5_max value: 49.7054639475091 - type: nauc_ndcg_at_5_std value: 58.83754007826166 - type: nauc_precision_at_1000_diff1 value: -31.565302588492035 - type: nauc_precision_at_1000_max value: 52.56214379514724 - type: nauc_precision_at_1000_std value: 53.40618234326055 - type: nauc_precision_at_100_diff1 value: -44.67273120709088 - type: nauc_precision_at_100_max value: 48.30381155522576 - type: nauc_precision_at_100_std value: 82.1984661602578 - type: nauc_precision_at_10_diff1 value: -24.737383556860145 - type: nauc_precision_at_10_max value: 52.816815002878556 - type: nauc_precision_at_10_std value: 67.99052410030845 - type: nauc_precision_at_1_diff1 value: -6.022408963585076 - type: nauc_precision_at_1_max value: 4.481792717087146 - type: nauc_precision_at_1_std value: 51.69623404917518 - type: nauc_precision_at_20_diff1 value: -40.23628054967093 - type: nauc_precision_at_20_max value: 56.980056980057014 - type: nauc_precision_at_20_std value: 76.60976777785895 - type: nauc_precision_at_3_diff1 value: -4.661784068466279 - type: nauc_precision_at_3_max value: 59.052007899934125 - type: nauc_precision_at_3_std value: 58.187952600394986 - type: nauc_precision_at_5_diff1 value: -38.11848143512736 - type: nauc_precision_at_5_max value: 68.6149353358365 - type: nauc_precision_at_5_std value: 73.55652899457661 - type: nauc_recall_at_1000_diff1 value: -14.886527444436345 - type: nauc_recall_at_1000_max value: 48.07492302795808 - type: nauc_recall_at_1000_std value: 65.05623212485906 - type: nauc_recall_at_100_diff1 value: -8.148385729388195 - type: nauc_recall_at_100_max value: 8.041615364614533 - type: nauc_recall_at_100_std value: 33.77187914574611 - type: nauc_recall_at_10_diff1 value: 24.333628413035942 - type: nauc_recall_at_10_max value: -14.577877145192078 - type: nauc_recall_at_10_std value: -12.131819145098557 - type: nauc_recall_at_1_diff1 value: 27.342260509653798 - type: nauc_recall_at_1_max value: -23.400451210297994 - type: nauc_recall_at_1_std value: -21.152006353733853 - type: nauc_recall_at_20_diff1 value: 13.695556376785564 - type: nauc_recall_at_20_max value: -8.872009346408264 - type: nauc_recall_at_20_std value: -3.163199444247112 - type: nauc_recall_at_3_diff1 value: 32.00442538217753 - type: nauc_recall_at_3_max value: -15.159737942664552 - type: nauc_recall_at_3_std value: -17.530833132440645 - type: nauc_recall_at_5_diff1 value: 22.64740552912405 - type: nauc_recall_at_5_max value: -12.947090597010414 - type: nauc_recall_at_5_std value: -12.914478822476807 - type: ndcg_at_1 value: 88.0 - type: ndcg_at_10 value: 84.60600000000001 - type: ndcg_at_100 value: 64.31700000000001 - type: ndcg_at_1000 value: 56.40500000000001 - type: ndcg_at_20 value: 80.561 - type: ndcg_at_3 value: 87.87700000000001 - type: ndcg_at_5 value: 86.641 - type: precision_at_1 value: 94.0 - type: precision_at_10 value: 88.2 - type: precision_at_100 value: 65.9 - type: precision_at_1000 value: 25.019999999999996 - type: precision_at_20 value: 84.7 - type: precision_at_3 value: 92.0 - type: precision_at_5 value: 90.0 - type: recall_at_1 value: 0.257 - type: recall_at_10 value: 2.338 - type: recall_at_100 value: 15.831999999999999 - type: recall_at_1000 value: 52.519000000000005 - type: recall_at_20 value: 4.367 - type: recall_at_3 value: 0.74 - type: recall_at_5 value: 1.196 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: 
a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: main_score value: 31.426 - type: map_at_1 value: 3.4709999999999996 - type: map_at_10 value: 13.236999999999998 - type: map_at_100 value: 19.521 - type: map_at_1000 value: 21.224 - type: map_at_20 value: 15.626000000000001 - type: map_at_3 value: 7.152 - type: map_at_5 value: 9.914000000000001 - type: mrr_at_1 value: 44.89795918367347 - type: mrr_at_10 value: 57.54373177842565 - type: mrr_at_100 value: 57.855267710139536 - type: mrr_at_1000 value: 57.855267710139536 - type: mrr_at_20 value: 57.70071764969724 - type: mrr_at_3 value: 52.72108843537414 - type: mrr_at_5 value: 55.06802721088435 - type: nauc_map_at_1000_diff1 value: 21.148857552115558 - type: nauc_map_at_1000_max value: 2.0837572569021323 - type: nauc_map_at_1000_std value: 3.203419709665347 - type: nauc_map_at_100_diff1 value: 21.383778167597878 - type: nauc_map_at_100_max value: 0.965767943155967 - type: nauc_map_at_100_std value: 0.3949924961020957 - type: nauc_map_at_10_diff1 value: 27.178555638086394 - type: nauc_map_at_10_max value: 4.480675175857958 - type: nauc_map_at_10_std value: -13.69553539513878 - type: nauc_map_at_1_diff1 value: 27.63901823865334 - type: nauc_map_at_1_max value: -18.6387233237763 - type: nauc_map_at_1_std value: -27.02164241863646 - type: nauc_map_at_20_diff1 value: 23.892104752374888 - type: nauc_map_at_20_max value: 3.5343136621362348 - type: nauc_map_at_20_std value: -8.765101188860816 - type: nauc_map_at_3_diff1 value: 22.065793929837493 - type: nauc_map_at_3_max value: 0.8063396680860568 - type: nauc_map_at_3_std value: -20.404849396621824 - type: nauc_map_at_5_diff1 value: 22.66626080580714 - type: nauc_map_at_5_max value: 5.423340658352383 - type: nauc_map_at_5_std value: -18.31523779843455 - type: nauc_mrr_at_1000_diff1 value: 30.520722269282665 - type: nauc_mrr_at_1000_max value: -16.644959497742267 - type: nauc_mrr_at_1000_std value: -16.3824126273053 - type: nauc_mrr_at_100_diff1 value: 30.520722269282665 - type: nauc_mrr_at_100_max value: -16.644959497742267 - type: nauc_mrr_at_100_std value: -16.3824126273053 - type: nauc_mrr_at_10_diff1 value: 30.428248939332974 - type: nauc_mrr_at_10_max value: -16.300183919261585 - type: nauc_mrr_at_10_std value: -15.404823235836309 - type: nauc_mrr_at_1_diff1 value: 27.041346572613474 - type: nauc_mrr_at_1_max value: -23.181309312755804 - type: nauc_mrr_at_1_std value: -24.33076726484014 - type: nauc_mrr_at_20_diff1 value: 30.676558567379303 - type: nauc_mrr_at_20_max value: -16.914268763031416 - type: nauc_mrr_at_20_std value: -15.77742854976336 - type: nauc_mrr_at_3_diff1 value: 31.718457109787096 - type: nauc_mrr_at_3_max value: -15.508391132202235 - type: nauc_mrr_at_3_std value: -20.33229438349494 - type: nauc_mrr_at_5_diff1 value: 28.73798376227693 - type: nauc_mrr_at_5_max value: -16.086295031060196 - type: nauc_mrr_at_5_std value: -15.644604635769321 - type: nauc_ndcg_at_1000_diff1 value: 22.158724660189606 - type: nauc_ndcg_at_1000_max value: -3.1755686809941475 - type: nauc_ndcg_at_1000_std value: 19.258386224159075 - type: nauc_ndcg_at_100_diff1 value: 21.83846748649288 - type: nauc_ndcg_at_100_max value: -10.939957598756036 - type: nauc_ndcg_at_100_std value: 14.729678880436623 - type: nauc_ndcg_at_10_diff1 value: 26.944882726098424 - type: nauc_ndcg_at_10_max value: -3.5176483833346617 - type: nauc_ndcg_at_10_std value: -5.400606773697211 - type: nauc_ndcg_at_1_diff1 value: 26.649410985172985 - type: nauc_ndcg_at_1_max value: -18.806716526067493 - type: nauc_ndcg_at_1_std 
value: -25.100244999343506 - type: nauc_ndcg_at_20_diff1 value: 24.860266153648315 - type: nauc_ndcg_at_20_max value: -7.521401821712892 - type: nauc_ndcg_at_20_std value: -3.3696577425983003 - type: nauc_ndcg_at_3_diff1 value: 23.9933326962406 - type: nauc_ndcg_at_3_max value: -0.4609479344284664 - type: nauc_ndcg_at_3_std value: -15.176459166869897 - type: nauc_ndcg_at_5_diff1 value: 22.50595978713142 - type: nauc_ndcg_at_5_max value: -2.1093870656000857 - type: nauc_ndcg_at_5_std value: -12.732197425528257 - type: nauc_precision_at_1000_diff1 value: -20.335120385950024 - type: nauc_precision_at_1000_max value: 26.95109729939765 - type: nauc_precision_at_1000_std value: 29.981685890622117 - type: nauc_precision_at_100_diff1 value: -2.782114329320704 - type: nauc_precision_at_100_max value: 2.9489322002048604 - type: nauc_precision_at_100_std value: 67.3074073674319 - type: nauc_precision_at_10_diff1 value: 21.385177180383383 - type: nauc_precision_at_10_max value: -2.4696365259422817 - type: nauc_precision_at_10_std value: 14.469784299536673 - type: nauc_precision_at_1_diff1 value: 27.041346572613474 - type: nauc_precision_at_1_max value: -23.181309312755804 - type: nauc_precision_at_1_std value: -24.33076726484014 - type: nauc_precision_at_20_diff1 value: 11.993846579997673 - type: nauc_precision_at_20_max value: -2.4792189693296227 - type: nauc_precision_at_20_std value: 28.581394687807745 - type: nauc_precision_at_3_diff1 value: 20.70568446328836 - type: nauc_precision_at_3_max value: 0.37326398699875984 - type: nauc_precision_at_3_std value: -12.983918676694389 - type: nauc_precision_at_5_diff1 value: 19.47466335828124 - type: nauc_precision_at_5_max value: -1.8921617684385994 - type: nauc_precision_at_5_std value: -6.533875294402164 - type: nauc_recall_at_1000_diff1 value: 7.611201305723156 - type: nauc_recall_at_1000_max value: 5.6416194035820055 - type: nauc_recall_at_1000_std value: 61.695208644278 - type: nauc_recall_at_100_diff1 value: 10.0183258158735 - type: nauc_recall_at_100_max value: -10.950612455698973 - type: nauc_recall_at_100_std value: 33.06069987640471 - type: nauc_recall_at_10_diff1 value: 24.738210305731535 - type: nauc_recall_at_10_max value: -2.6592454032071546 - type: nauc_recall_at_10_std value: -4.83987517793115 - type: nauc_recall_at_1_diff1 value: 27.63901823865334 - type: nauc_recall_at_1_max value: -18.6387233237763 - type: nauc_recall_at_1_std value: -27.02164241863646 - type: nauc_recall_at_20_diff1 value: 17.79601177409034 - type: nauc_recall_at_20_max value: -6.681637093148051 - type: nauc_recall_at_20_std value: 3.369193919932238 - type: nauc_recall_at_3_diff1 value: 24.9589431081204 - type: nauc_recall_at_3_max value: 2.4783640980500232 - type: nauc_recall_at_3_std value: -19.567415651090702 - type: nauc_recall_at_5_diff1 value: 23.71803410135437 - type: nauc_recall_at_5_max value: 1.6294309357641652 - type: nauc_recall_at_5_std value: -15.365511906408983 - type: ndcg_at_1 value: 40.816 - type: ndcg_at_10 value: 31.426 - type: ndcg_at_100 value: 41.558 - type: ndcg_at_1000 value: 53.042 - type: ndcg_at_20 value: 31.108999999999998 - type: ndcg_at_3 value: 35.518 - type: ndcg_at_5 value: 33.235 - type: precision_at_1 value: 44.897999999999996 - type: precision_at_10 value: 27.551 - type: precision_at_100 value: 8.204 - type: precision_at_1000 value: 1.582 - type: precision_at_20 value: 19.796 - type: precision_at_3 value: 36.735 - type: precision_at_5 value: 33.061 - type: recall_at_1 value: 3.4709999999999996 - type: recall_at_10 value: 19.563 - 
type: recall_at_100 value: 50.3 - type: recall_at_1000 value: 85.13199999999999 - type: recall_at_20 value: 26.738 - type: recall_at_3 value: 7.8420000000000005 - type: recall_at_5 value: 11.994 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 68.29850746268657 - type: ap value: 30.109785890841966 - type: ap_weighted value: 30.109785890841966 - type: f1 value: 61.76875915202924 - type: f1_weighted value: 71.32073190458556 - type: main_score value: 68.29850746268657 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification (default) type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 90.3068 - type: ap value: 86.17914339624038 - type: ap_weighted value: 86.17914339624038 - type: f1 value: 90.29716826358077 - type: f1_weighted value: 90.29716826358077 - type: main_score value: 90.3068 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 46.272000000000006 - type: f1 value: 45.57042543386915 - type: f1_weighted value: 45.57042543386915 - type: main_score value: 46.272000000000006 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P (default) type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: main_score value: 44.9469238081379 - type: v_measure value: 44.9469238081379 - type: v_measure_std value: 13.26811262671461 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S (default) type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: main_score value: 34.12071448053325 - type: v_measure value: 34.12071448053325 - type: v_measure_std value: 13.7019879046405 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions (default) type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: main_score value: 61.597667288657846 - type: map value: 61.597667288657846 - type: mrr value: 75.57940904893813 - type: nAUC_map_diff1 value: 8.745172077340095 - type: nAUC_map_max value: 20.114863024035493 - type: nAUC_map_std value: 15.991351189572192 - type: nAUC_mrr_diff1 value: 20.781369244159983 - type: nAUC_mrr_max value: 30.78542570228559 - type: nAUC_mrr_std value: 19.861484857303676 - task: type: STS dataset: name: MTEB BIOSSES (default) type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cosine_pearson value: 88.55587996301419 - type: cosine_spearman value: 86.40317357420093 - type: euclidean_pearson value: 86.93771958250231 - type: euclidean_spearman value: 86.40317357420093 - type: main_score value: 86.40317357420093 - type: manhattan_pearson value: 86.92196577117366 - type: manhattan_spearman value: 85.79834051556095 - type: pearson value: 88.55587996301419 - type: spearman value: 86.40317357420093 - task: type: Classification dataset: name: MTEB Banking77Classification (default) type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 80.0064935064935 - type: f1 
value: 79.29524254086299 - type: f1_weighted value: 79.295242540863 - type: main_score value: 80.0064935064935 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P (default) type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: main_score value: 35.27186813341181 - type: v_measure value: 35.27186813341181 - type: v_measure_std value: 0.8621482145872432 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S (default) type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: main_score value: 28.411805064852295 - type: v_measure value: 28.411805064852295 - type: v_measure_std value: 0.7194290078011281 - task: type: Classification dataset: name: MTEB EmotionClassification (default) type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 43.675 - type: f1 value: 40.15061931375577 - type: f1_weighted value: 45.714186572727066 - type: main_score value: 43.675 - task: type: Classification dataset: name: MTEB ImdbClassification (default) type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 84.35640000000001 - type: ap value: 79.07507736685174 - type: ap_weighted value: 79.07507736685174 - type: f1 value: 84.32288494833531 - type: f1_weighted value: 84.32288494833531 - type: main_score value: 84.35640000000001 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 91.35658914728684 - type: f1 value: 90.86877537911086 - type: f1_weighted value: 91.3282092774443 - type: main_score value: 91.35658914728684 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 60.63611491108071 - type: f1 value: 42.78886482112741 - type: f1_weighted value: 63.44208631840539 - type: main_score value: 60.63611491108071 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 4672e20407010da34463acc759c162ca9734bca6 metrics: - type: accuracy value: 66.68796234028245 - type: f1 value: 64.44940791000278 - type: f1_weighted value: 65.77554417406792 - type: main_score value: 66.68796234028245 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8 metrics: - type: accuracy value: 73.0598520511096 - type: f1 value: 72.14267273884774 - type: f1_weighted value: 72.93345180137516 - type: main_score value: 73.0598520511096 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P (default) type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: main_score value: 31.143081341699606 - type: v_measure value: 31.143081341699606 - type: v_measure_std value: 1.5578716347076906 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S (default) type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: main_score value: 
27.010818869829556 - type: v_measure value: 27.010818869829556 - type: v_measure_std value: 1.1771554540819378 - task: type: Reranking dataset: name: MTEB MindSmallReranking (default) type: mteb/mind_small config: default split: test revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7 metrics: - type: main_score value: 30.20503776754942 - type: map value: 30.20503776754942 - type: mrr value: 31.076636002733437 - type: nAUC_map_diff1 value: 7.290568655287842 - type: nAUC_map_max value: -21.381599355932945 - type: nAUC_map_std value: -7.709920607543168 - type: nAUC_mrr_diff1 value: 7.558397329284913 - type: nAUC_mrr_max value: -15.981397186427607 - type: nAUC_mrr_std value: -4.870495243168834 - task: type: Clustering dataset: name: MTEB RedditClustering (default) type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: main_score value: 51.85893476633338 - type: v_measure value: 51.85893476633338 - type: v_measure_std value: 4.704770139385852 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P (default) type: mteb/reddit-clustering-p2p config: default split: test revision: 385e3cb46b4cfa89021f56c4380204149d0efe33 metrics: - type: main_score value: 61.8124222918822 - type: v_measure value: 61.8124222918822 - type: v_measure_std value: 11.994472578100165 - task: type: STS dataset: name: MTEB SICK-R (default) type: mteb/sickr-sts config: default split: test revision: 20a6d6f312dd54037fe07a32d58e5e168867909d metrics: - type: cosine_pearson value: 77.63310776935984 - type: cosine_spearman value: 69.86468291111039 - type: euclidean_pearson value: 73.91537077798837 - type: euclidean_spearman value: 69.86468376650203 - type: main_score value: 69.86468291111039 - type: manhattan_pearson value: 73.68616048370464 - type: manhattan_spearman value: 69.76232036206659 - type: pearson value: 77.63310776935984 - type: spearman value: 69.86468291111039 - task: type: STS dataset: name: MTEB STS12 (default) type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cosine_pearson value: 57.71716838245049 - type: cosine_spearman value: 61.797855543446424 - type: euclidean_pearson value: 58.22958675325848 - type: euclidean_spearman value: 61.797855543446424 - type: main_score value: 61.797855543446424 - type: manhattan_pearson value: 57.63117544997929 - type: manhattan_spearman value: 61.3629404350085 - type: pearson value: 57.71716838245049 - type: spearman value: 61.797855543446424 - task: type: STS dataset: name: MTEB STS13 (default) type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cosine_pearson value: 82.30260026790903 - type: cosine_spearman value: 82.66959813070869 - type: euclidean_pearson value: 82.08383017580783 - type: euclidean_spearman value: 82.66959813070869 - type: main_score value: 82.66959813070869 - type: manhattan_pearson value: 81.77991451392153 - type: manhattan_spearman value: 82.3652534745606 - type: pearson value: 82.30260026790903 - type: spearman value: 82.66959813070869 - task: type: STS dataset: name: MTEB STS14 (default) type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cosine_pearson value: 71.50608384084478 - type: cosine_spearman value: 68.94968064977785 - type: euclidean_pearson value: 70.73381299949564 - type: euclidean_spearman value: 68.94968064977785 - type: main_score value: 68.94968064977785 - type: 
manhattan_pearson value: 70.5385486953787 - type: manhattan_spearman value: 68.82132770672365 - type: pearson value: 71.50608384084478 - type: spearman value: 68.94968064977785 - task: type: STS dataset: name: MTEB STS15 (default) type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cosine_pearson value: 73.66969825874907 - type: cosine_spearman value: 75.55374982088381 - type: euclidean_pearson value: 75.9339313749594 - type: euclidean_spearman value: 75.55374982088381 - type: main_score value: 75.55374982088381 - type: manhattan_pearson value: 75.88287553383817 - type: manhattan_spearman value: 75.50729812977688 - type: pearson value: 73.66969825874907 - type: spearman value: 75.55374982088381 - task: type: STS dataset: name: MTEB STS16 (default) type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cosine_pearson value: 74.5954724414016 - type: cosine_spearman value: 77.2688820850505 - type: euclidean_pearson value: 77.19866353971555 - type: euclidean_spearman value: 77.2688820850505 - type: main_score value: 77.2688820850505 - type: manhattan_pearson value: 77.27072603680978 - type: manhattan_spearman value: 77.29408453673607 - type: pearson value: 74.5954724414016 - type: spearman value: 77.2688820850505 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: faeb762787bd10488a50c8b5be4a3b82e411949c metrics: - type: cosine_pearson value: 71.52588722654055 - type: cosine_spearman value: 74.97235736456061 - type: euclidean_pearson value: 74.51952528854038 - type: euclidean_spearman value: 74.97235736456061 - type: main_score value: 74.97235736456061 - type: manhattan_pearson value: 74.48272300884209 - type: manhattan_spearman value: 74.80633649415176 - type: pearson value: 71.52588722654055 - type: spearman value: 74.97235736456061 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3 metrics: - type: cosine_pearson value: 68.80031120401976 - type: cosine_spearman value: 69.07945196478491 - type: euclidean_pearson value: 68.99674496430792 - type: euclidean_spearman value: 69.07945196478491 - type: main_score value: 69.07945196478491 - type: manhattan_pearson value: 69.00236107775687 - type: manhattan_spearman value: 68.98064879049272 - type: pearson value: 68.80031120401976 - type: spearman value: 69.07945196478491 - task: type: STS dataset: name: MTEB STSBenchmark (default) type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cosine_pearson value: 65.6898007230089 - type: cosine_spearman value: 69.72386211803668 - type: euclidean_pearson value: 69.04523003701475 - type: euclidean_spearman value: 69.72386211803668 - type: main_score value: 69.72386211803668 - type: manhattan_pearson value: 68.80479743770702 - type: manhattan_spearman value: 69.43264575177459 - type: pearson value: 65.6898007230089 - type: spearman value: 69.72386211803668 - task: type: Reranking dataset: name: MTEB SciDocsRR (default) type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: main_score value: 79.74088066874383 - type: map value: 79.74088066874383 - type: mrr value: 94.47697455050397 - type: nAUC_map_diff1 value: 8.036086256905502 - type: nAUC_map_max value: 54.88199803816819 - 
type: nAUC_map_std value: 69.16267942176574 - type: nAUC_mrr_diff1 value: 50.020738477678115 - type: nAUC_mrr_max value: 83.28922770326483 - type: nAUC_mrr_std value: 83.63973501802224 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions (default) type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cosine_accuracy value: 99.83861386138614 - type: cosine_accuracy_threshold value: 74.75666999816895 - type: cosine_ap value: 96.15132792066652 - type: cosine_f1 value: 91.84890656063618 - type: cosine_f1_threshold value: 71.70594930648804 - type: cosine_precision value: 91.30434782608695 - type: cosine_recall value: 92.4 - type: dot_accuracy value: 99.83861386138614 - type: dot_accuracy_threshold value: 74.75666999816895 - type: dot_ap value: 96.15132792066653 - type: dot_f1 value: 91.84890656063618 - type: dot_f1_threshold value: 71.70596122741699 - type: dot_precision value: 91.30434782608695 - type: dot_recall value: 92.4 - type: euclidean_accuracy value: 99.83861386138614 - type: euclidean_accuracy_threshold value: 71.05395793914795 - type: euclidean_ap value: 96.15132792066652 - type: euclidean_f1 value: 91.84890656063618 - type: euclidean_f1_threshold value: 75.22505521774292 - type: euclidean_precision value: 91.30434782608695 - type: euclidean_recall value: 92.4 - type: main_score value: 96.15132792066653 - type: manhattan_accuracy value: 99.83564356435643 - type: manhattan_accuracy_threshold value: 1547.6950645446777 - type: manhattan_ap value: 96.06151211452136 - type: manhattan_f1 value: 91.61676646706587 - type: manhattan_f1_threshold value: 1626.3608932495117 - type: manhattan_precision value: 91.43426294820716 - type: manhattan_recall value: 91.8 - type: max_ap value: 96.15132792066653 - type: max_f1 value: 91.84890656063618 - type: max_precision value: 91.43426294820716 - type: max_recall value: 92.4 - type: similarity_accuracy value: 99.83861386138614 - type: similarity_accuracy_threshold value: 74.75666999816895 - type: similarity_ap value: 96.15132792066652 - type: similarity_f1 value: 91.84890656063618 - type: similarity_f1_threshold value: 71.70594930648804 - type: similarity_precision value: 91.30434782608695 - type: similarity_recall value: 92.4 - task: type: Clustering dataset: name: MTEB StackExchangeClustering (default) type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: main_score value: 61.24120328328453 - type: v_measure value: 61.24120328328453 - type: v_measure_std value: 3.9946560691100372 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P (default) type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: main_score value: 33.808268374864745 - type: v_measure value: 33.808268374864745 - type: v_measure_std value: 1.2212188701887239 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions (default) type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: main_score value: 52.19806018468037 - type: map value: 52.19806018468037 - type: mrr value: 52.98921462524404 - type: nAUC_map_diff1 value: 37.41443156995912 - type: nAUC_map_max value: 9.410262727675603 - type: nAUC_map_std value: 8.7094185014992 - type: nAUC_mrr_diff1 value: 37.78202772392581 - type: nAUC_mrr_max 
value: 10.517635536565816 - type: nAUC_mrr_std value: 8.509423813772491 - task: type: Summarization dataset: name: MTEB SummEval (default) type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cosine_pearson value: 30.48413700430812 - type: cosine_spearman value: 30.357162200875816 - type: dot_pearson value: 30.484140144824938 - type: dot_spearman value: 30.357162200875816 - type: main_score value: 30.357162200875816 - type: pearson value: 30.48413700430812 - type: spearman value: 30.357162200875816 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification (default) type: mteb/toxic_conversations_50k config: default split: test revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de metrics: - type: accuracy value: 66.8359375 - type: ap value: 12.482653786025985 - type: ap_weighted value: 12.482653786025985 - type: f1 value: 51.328608527332385 - type: f1_weighted value: 74.07974463955398 - type: main_score value: 66.8359375 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification (default) type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 53.907753254103 - type: f1 value: 54.22707647269581 - type: f1_weighted value: 53.611822984407695 - type: main_score value: 53.907753254103 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering (default) type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: main_score value: 38.1364789307295 - type: v_measure value: 38.1364789307295 - type: v_measure_std value: 2.0731634966352077 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 (default) type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cosine_accuracy value: 82.66674614054956 - type: cosine_accuracy_threshold value: 79.80123162269592 - type: cosine_ap value: 63.28209719072804 - type: cosine_f1 value: 60.16389710903711 - type: cosine_f1_threshold value: 72.22893834114075 - type: cosine_precision value: 52.90232185748599 - type: cosine_recall value: 69.73614775725594 - type: dot_accuracy value: 82.66674614054956 - type: dot_accuracy_threshold value: 79.8012375831604 - type: dot_ap value: 63.282103870645166 - type: dot_f1 value: 60.16389710903711 - type: dot_f1_threshold value: 72.22894430160522 - type: dot_precision value: 52.90232185748599 - type: dot_recall value: 69.73614775725594 - type: euclidean_accuracy value: 82.66674614054956 - type: euclidean_accuracy_threshold value: 63.55905532836914 - type: euclidean_ap value: 63.282095399953164 - type: euclidean_f1 value: 60.16389710903711 - type: euclidean_f1_threshold value: 74.5265781879425 - type: euclidean_precision value: 52.90232185748599 - type: euclidean_recall value: 69.73614775725594 - type: main_score value: 63.282103870645166 - type: manhattan_accuracy value: 82.74423317637242 - type: manhattan_accuracy_threshold value: 1415.380859375 - type: manhattan_ap value: 63.26931757839598 - type: manhattan_f1 value: 60.11014948859166 - type: manhattan_f1_threshold value: 1632.522201538086 - type: manhattan_precision value: 52.359506559624045 - type: manhattan_recall value: 70.55408970976254 - type: max_ap value: 63.282103870645166 - type: max_f1 value: 60.16389710903711 - type: max_precision value: 52.90232185748599 - type: max_recall 
value: 70.55408970976254 - type: similarity_accuracy value: 82.66674614054956 - type: similarity_accuracy_threshold value: 79.80123162269592 - type: similarity_ap value: 63.28209719072804 - type: similarity_f1 value: 60.16389710903711 - type: similarity_f1_threshold value: 72.22893834114075 - type: similarity_precision value: 52.90232185748599 - type: similarity_recall value: 69.73614775725594 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus (default) type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cosine_accuracy value: 88.10105949470253 - type: cosine_accuracy_threshold value: 68.95147562026978 - type: cosine_ap value: 84.65516103854583 - type: cosine_f1 value: 76.54581123301605 - type: cosine_f1_threshold value: 63.92929553985596 - type: cosine_precision value: 72.46526344751685 - type: cosine_recall value: 81.11333538651063 - type: dot_accuracy value: 88.10105949470253 - type: dot_accuracy_threshold value: 68.95147562026978 - type: dot_ap value: 84.65516301437592 - type: dot_f1 value: 76.54581123301605 - type: dot_f1_threshold value: 63.92928957939148 - type: dot_precision value: 72.46526344751685 - type: dot_recall value: 81.11333538651063 - type: euclidean_accuracy value: 88.10105949470253 - type: euclidean_accuracy_threshold value: 78.80169153213501 - type: euclidean_ap value: 84.65517268264233 - type: euclidean_f1 value: 76.54581123301605 - type: euclidean_f1_threshold value: 84.93610620498657 - type: euclidean_precision value: 72.46526344751685 - type: euclidean_recall value: 81.11333538651063 - type: main_score value: 84.65517268264233 - type: manhattan_accuracy value: 88.08941669577366 - type: manhattan_accuracy_threshold value: 1739.3169403076172 - type: manhattan_ap value: 84.64592398855694 - type: manhattan_f1 value: 76.62890540443034 - type: manhattan_f1_threshold value: 1861.344337463379 - type: manhattan_precision value: 72.09775967413442 - type: manhattan_recall value: 81.76778564829073 - type: max_ap value: 84.65517268264233 - type: max_f1 value: 76.62890540443034 - type: max_precision value: 72.46526344751685 - type: max_recall value: 81.76778564829073 - type: similarity_accuracy value: 88.10105949470253 - type: similarity_accuracy_threshold value: 68.95147562026978 - type: similarity_ap value: 84.65516103854583 - type: similarity_f1 value: 76.54581123301605 - type: similarity_f1_threshold value: 63.92929553985596 - type: similarity_precision value: 72.46526344751685 - type: similarity_recall value: 81.11333538651063 ---

# aimlresearch2023/snowflake-arctic-embed-m-v1.5-Q8_0-GGUF
This model was converted to GGUF format from [`Snowflake/snowflake-arctic-embed-m-v1.5`](https://huggingface.co/Snowflake/snowflake-arctic-embed-m-v1.5) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Snowflake/snowflake-arctic-embed-m-v1.5) for more details on the model.

## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux):

```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo aimlresearch2023/snowflake-arctic-embed-m-v1.5-Q8_0-GGUF --hf-file snowflake-arctic-embed-m-v1.5-q8_0.gguf -p "The meaning to life and the universe is"
```

### Server:
```bash
llama-server --hf-repo aimlresearch2023/snowflake-arctic-embed-m-v1.5-Q8_0-GGUF --hf-file snowflake-arctic-embed-m-v1.5-q8_0.gguf -c 2048
```

Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.

Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```

Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with any other hardware-specific flags (e.g. `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```

Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo aimlresearch2023/snowflake-arctic-embed-m-v1.5-Q8_0-GGUF --hf-file snowflake-arctic-embed-m-v1.5-q8_0.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo aimlresearch2023/snowflake-arctic-embed-m-v1.5-Q8_0-GGUF --hf-file snowflake-arctic-embed-m-v1.5-q8_0.gguf -c 2048
```
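Since `snowflake-arctic-embed-m-v1.5` is an embedding model rather than a chat model, the `llama-embedding` tool bundled with llama.cpp is usually a more natural entry point than text generation. A minimal sketch, assuming a recent llama.cpp build in which `llama-embedding` accepts the same `--hf-repo`/`--hf-file` flags as the binaries above:

```bash
# Compute an embedding vector for a single piece of text with the quantized checkpoint.
./llama-embedding --hf-repo aimlresearch2023/snowflake-arctic-embed-m-v1.5-Q8_0-GGUF \
    --hf-file snowflake-arctic-embed-m-v1.5-q8_0.gguf \
    -p "what is snowflake arctic embed?"
```

The printed vectors can then be compared with cosine similarity for retrieval or semantic search.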
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
LoneStriker/OpenBioLLM-Llama3-70B-4.0bpw-h6-exl2
LoneStriker
text-generation
[ "transformers", "pytorch", "llama", "text-generation", "llama-3", "Mixtral", "instruct", "finetune", "chatml", "DPO", "RLHF", "gpt4", "distillation", "conversational", "en", "arxiv:2305.18290", "arxiv:2303.13375", "arxiv:2212.13138", "arxiv:2305.09617", "arxiv:2402.07023", "base_model:meta-llama/Meta-Llama-3-70B-Instruct", "base_model:quantized:meta-llama/Meta-Llama-3-70B-Instruct", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "exl2", "region:us" ]
1,714
1,714
8
0
--- base_model: meta-llama/Meta-Llama-3-70B-Instruct language: - en license: llama3 tags: - llama-3 - llama - Mixtral - instruct - finetune - chatml - DPO - RLHF - gpt4 - distillation widget: - example_title: OpenBioLLM-70B messages: - role: system content: You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. - role: user content: How long does it take for newborn jaundice to go away? output: text: 'Newborn jaundice, also known as neonatal jaundice, is a common condition in newborns where the yellowing of the skin and eyes occurs due to an elevated level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when red blood cells break down. In most cases, newborn jaundice resolves on its own without any specific treatment. The duration of newborn jaundice can vary depending on several factors such as the underlying cause, gestational age at birth, and individual variations in bilirubin metabolism. Here are some general guidelines: 1. Physiological jaundice: This is the most common type of newborn jaundice and usually appears within 24-72 hours after birth. It tends to peak between the second and fifth day of life and gradually improves over the next week or two. By the time the baby is one week old, the jaundice should have mostly resolved. 2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and may appear later than physiological jaundice, typically between the fifth and fourteenth day of life. It tends to persist for a longer duration but usually resolves within six weeks after birth. 3. Pathological jaundice: This type of jaundice is less common and occurs due to an underlying medical condition that affects bilirubin metabolism or liver function. The duration of pathological jaundice depends on the specific cause and may require treatment. It''s important for parents to monitor their newborn''s jaundice closely and seek medical advice if the jaundice progresses rapidly, becomes severe, or is accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness. In these cases, further evaluation and management may be necessary. Remember that each baby is unique, and the timing of jaundice resolution can vary. If you have concerns about your newborn''s jaundice, it''s always best to consult with a healthcare professional for personalized advice and guidance.' 
model-index:
- name: OpenBioLLM-70B
  results: []
---

<div align="center">
<img width="260px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/BrQCb95lmEIFz79QAmoNA.png"></div>

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/fJIOPJnY6Ff6fUiSIuMEt.png)

<div align="center">
<h1>Advancing Open-source Large Language Models in the Medical Domain</h1>
</div>

<p align="center" style="margin-top: 0px;">
  <a href="https://colab.research.google.com/drive/1F5oV20InEYeAJGmBwYF9NM_QhLmjBkKJ?usp=sharing">
    <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="OpenChat Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text" style=" margin-right: 5px;">Online Demo</span>
  </a> |
  <a href="https://github.com/openlifescience-ai">
    <img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" alt="GitHub Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text" style=" margin-right: 5px;">GitHub</span>
  </a> |
  <a href="#">
    <img src="https://github.com/alpayariyak/openchat/blob/master/assets/arxiv-logomark-small-square-border.png?raw=true" alt="ArXiv Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text" style="margin-right: 5px;">Paper</span>
  </a> |
  <a href="https://discord.gg/A5Fjf5zC69">
    <img src="https://cloud.githubusercontent.com/assets/6291467/26705903/96c2d66e-477c-11e7-9f4e-f3c0efe96c9a.png" alt="Discord Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text">Discord</span>
  </a>
</p>

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/KGmRE5w2sepNtwsEu8t7K.jpeg)

Introducing OpenBioLLM-70B: A State-of-the-Art Open-Source Biomedical Large Language Model

OpenBioLLM-70B is an advanced open-source language model designed specifically for the biomedical domain. Developed by Saama AI Labs, this model leverages cutting-edge techniques to achieve state-of-the-art performance on a wide range of biomedical tasks.

🏥 **Biomedical Specialization**: OpenBioLLM-70B is tailored for the unique language and knowledge requirements of the medical and life sciences fields. It was fine-tuned on a vast corpus of high-quality biomedical data, enabling it to understand and generate text with domain-specific accuracy and fluency.

🎓 **Superior Performance**: With 70 billion parameters, OpenBioLLM-70B outperforms other open-source biomedical language models of similar scale. It has also demonstrated better results than larger proprietary and open-source models, such as GPT-4, Gemini, Meditron-70B, Med-PaLM-1, and Med-PaLM-2, on biomedical benchmarks.

🧠 **Advanced Training Techniques**: OpenBioLLM-70B builds upon the powerful foundation of the [Meta-Llama-3-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) model. It incorporates the DPO dataset and fine-tuning recipe along with a custom, diverse medical instruction dataset.
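For intuition, DPO removes the separate reward model used in classic RLHF and instead trains the policy directly on pairs of chosen and rejected responses, scored against a frozen reference model. Below is a minimal sketch of the objective; it is illustrative only, not the actual training code, and the `beta` value and pre-computed per-response log-probabilities are assumptions:

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """DPO objective (Rafailov et al., 2023) over a batch of preference pairs.

    Each tensor holds the summed log-probability of a full response under
    either the trainable policy or the frozen reference model.
    """
    policy_logratios = policy_chosen_logps - policy_rejected_logps
    ref_logratios = ref_chosen_logps - ref_rejected_logps
    # Reward the policy for preferring the chosen response more strongly
    # than the reference model does.
    return -F.logsigmoid(beta * (policy_logratios - ref_logratios)).mean()
```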
Key components of the training pipeline include:

<div align="center">
<img width="1200px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/oPchsJsEpQoGcGXVbh7YS.png">
</div>

- **Policy Optimization**: [Direct Preference Optimization: Your Language Model is Secretly a Reward Model (DPO)](https://arxiv.org/abs/2305.18290)
- **Fine-tuning dataset**: Custom Medical Instruct dataset (we plan to release a sample training dataset with our upcoming paper; please stay updated)

This combination of cutting-edge techniques enables OpenBioLLM-70B to align with key capabilities and preferences for biomedical applications.

⚙️ **Release Details**:

- **Model Size**: 70 billion parameters
- **Quantization**: Optimized quantized versions available [here](https://huggingface.co/aaditya/OpenBioLLM-70B-GGUF)
- **Language(s) (NLP):** en
- **Developed By**: [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) from Saama AI Labs
- **License:** Meta-Llama License
- **Fine-tuned from model:** [Meta-Llama-3-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct)
- **Resources for more information:**
    - Paper: Coming soon

The model can be fine-tuned for more specialized tasks and datasets as needed.

OpenBioLLM-70B represents an important step forward in democratizing advanced language AI for the biomedical community. By leveraging state-of-the-art architectures and training techniques from leading open-source efforts like Llama-3, we have created a powerful tool to accelerate innovation and discovery in healthcare and the life sciences.

We are excited to share OpenBioLLM-70B with researchers and developers around the world.

### Use with transformers

**Important: Please use the exact chat template provided by the Llama-3 instruct version; otherwise performance will degrade. The model output can occasionally be verbose; consider setting temperature to 0 (greedy decoding) to reduce this.**

See the snippet below for usage with Transformers:

```python
import transformers
import torch

model_id = "aaditya/OpenBioLLM-Llama3-70B"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",  # automatically place the model across available devices
)

messages = [
    {"role": "system", "content": "You are an expert and experienced professional from the healthcare and biomedical domain with extensive medical knowledge and practical experience. Your name is OpenBioLLM, and you were developed by Saama AI Labs. You are willing to help answer the user's query with an explanation. In your explanation, leverage your deep medical expertise such as relevant anatomical structures, physiological processes, diagnostic criteria, treatment guidelines, or other pertinent medical concepts. Use precise medical terminology while still aiming to make the explanation clear and accessible to a general audience."},
    {"role": "user", "content": "How can I split a 3mg or 4mg warfarin pill so I can get a 2.5mg pill?"},
]

prompt = pipeline.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    eos_token_id=terminators,
    do_sample=False,  # greedy decoding, equivalent to temperature 0 as recommended above
)
print(outputs[0]["generated_text"][len(prompt):])
```

## **Training procedure**

### **Training hyperparameters**

<details>
  <summary>Click to see details</summary>

- learning_rate: 0.0002
- lr_scheduler: cosine
- train_batch_size: 12
- eval_batch_size: 8
- GPU: H100 80GB SXM5
- num_devices: 8
- optimizer: adamw_bnb_8bit
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
</details>

### **Peft hyperparameters**

<details>
  <summary>Click to see details</summary>

- adapter: qlora
- lora_r: 128
- lora_alpha: 256
- lora_dropout: 0.05
- lora_target_linear: true
- lora_target_modules:
  - q_proj
  - v_proj
  - k_proj
  - o_proj
  - gate_proj
  - down_proj
  - up_proj
</details>

### **Training results**

### **Framework versions**

- Transformers 4.39.3
- PyTorch 2.1.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1
- Axolotl
- LM Evaluation Harness (for evaluation)

# Benchmark Results

🔥 OpenBioLLM-70B demonstrates superior performance compared to larger models, such as GPT-4, Gemini, Meditron-70B, Med-PaLM-1, and Med-PaLM-2, across 9 diverse biomedical datasets, achieving state-of-the-art results with an average score of 86.06% despite a significantly smaller parameter count. The model's strong performance in domain-specific tasks, such as Clinical KG, Medical Genetics, and PubMedQA, highlights its ability to effectively capture and apply biomedical knowledge.

🚨 The GPT-4, Med-PaLM-1, and Med-PaLM-2 results are taken from their official papers. All results presented are in the zero-shot setting, except for Med-PaLM-1 and Med-PaLM-2, which are reported with 5-shot accuracy because Med-PaLM does not provide zero-shot numbers.
| | Clinical KG | Medical Genetics | Anatomy | Pro Medicine | College Biology | College Medicine | MedQA 4 opts | PubMedQA | MedMCQA | Avg |
|--------------------|-------------|------------------|---------|--------------|-----------------|------------------|--------------|----------|---------|-------|
| **OpenBioLLM-70B** | **92.93** | **93.197** | **83.904** | 93.75 | 93.827 | **85.749** | 78.162 | 78.97 | **74.014** | **86.05588** |
| Med-PaLM-2 (5-shot) | 88.3 | 90 | 77.8 | **95.2** | 94.4 | 80.9 | **79.7** | **79.2** | 71.3 | 84.08 |
| **GPT-4** | 86.04 | 91 | 80 | 93.01 | **95.14** | 76.88 | 78.87 | 75.2 | 69.52 | 82.85 |
| Med-PaLM-1 (Flan-PaLM, 5-shot) | 80.4 | 75 | 63.7 | 83.8 | 88.9 | 76.3 | 67.6 | 79 | 57.6 | 74.7 |
| **OpenBioLLM-8B** | 76.101 | 86.1 | 69.829 | 78.21 | 84.213 | 68.042 | 58.993 | 74.12 | 56.913 | 72.502 |
| Gemini-1.0 | 76.7 | 75.8 | 66.7 | 77.7 | 88 | 69.2 | 58 | 70.7 | 54.3 | 70.79 |
| GPT-3.5 Turbo 1106 | 74.71 | 74 | 72.79 | 72.79 | 72.91 | 64.73 | 57.71 | 72.66 | 53.79 | 66 |
| Meditron-70B | 66.79 | 69 | 53.33 | 71.69 | 76.38 | 63 | 57.1 | 76.6 | 46.85 | 64.52 |
| gemma-7b | 69.81 | 70 | 59.26 | 66.18 | 79.86 | 60.12 | 47.21 | 76.2 | 48.96 | 64.18 |
| Mistral-7B-v0.1 | 68.68 | 71 | 55.56 | 68.38 | 68.06 | 59.54 | 50.82 | 75.4 | 48.2 | 62.85 |
| Apollo-7B | 62.26 | 72 | 61.48 | 69.12 | 70.83 | 55.49 | 55.22 | 39.8 | 53.77 | 60 |
| MedAlpaca-7b | 57.36 | 69 | 57.04 | 67.28 | 65.28 | 54.34 | 41.71 | 72.8 | 37.51 | 58.03 |
| BioMistral-7B | 59.9 | 64 | 56.5 | 60.4 | 59 | 54.7 | 50.6 | 77.5 | 48.1 | 57.3 |
| AlpaCare-llama2-7b | 49.81 | 49 | 45.92 | 33.82 | 50 | 43.35 | 29.77 | 72.2 | 34.42 | 45.36 |
| ClinicalGPT | 30.56 | 27 | 30.37 | 19.48 | 25 | 24.27 | 26.08 | 63.8 | 28.18 | 30.52 |

<div align="center">
<img width="1600px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_SzdcJSBjZyo8RS1bTEkP.png">
</div>

## Detailed Medical Subject-wise Accuracy

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/UXF-V0col0Z0sS6BGPBkE.png)

# Use Cases & Examples

🚨 **The results below are from the quantized version of OpenBioLLM-70B.**

# Summarize Clinical Notes

OpenBioLLM-70B can efficiently analyze and summarize complex clinical notes, EHR data, and discharge summaries, extracting key information and generating concise, structured summaries.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/xdwdBgOxNi_TfML0hKlI8.png)

# Answer Medical Questions

OpenBioLLM-70B can provide answers to a wide range of medical questions.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/zO95GlwOQEZqCKQF69mE6.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/OKBczKw7gWeW5xsuDpc27.png)

<details>
  <summary>Click to see details</summary>

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/eJGHT5khppYvJb8fQ-YW4.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Cnbwrqa_-ORHRuNRC2P6Y.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/J9DhdcvukAc9mnnW9fj2C.png)
</details>

# Clinical Entity Recognition

OpenBioLLM-70B can perform advanced clinical entity recognition by identifying and extracting key medical concepts, such as diseases, symptoms, medications, procedures, and anatomical structures, from unstructured clinical text.
By leveraging its deep understanding of medical terminology and context, the model can accurately annotate and categorize clinical entities, enabling more efficient information retrieval, data analysis, and knowledge discovery from electronic health records, research articles, and other biomedical text sources. This capability can support various downstream applications, such as clinical decision support, pharmacovigilance, and medical research.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_69BW4k9LVABFwtxixL45.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/DKy5wYCoPhoPPUc1-x8_J.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/7WD9zCCBZT4-4XlfnIQjl.png)

# Biomarkers Extraction

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/ZttoM4AiteT7gFYVhjIpN.png)

# Classification

OpenBioLLM-70B can perform various biomedical classification tasks, such as disease prediction, sentiment analysis, and medical document categorization.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Bf5MW1d75qT-1F_TR_hC0.png)

# De-Identification

OpenBioLLM-70B can detect and remove personally identifiable information (PII) from medical records, ensuring patient privacy and compliance with data protection regulations like HIPAA.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/hKX4kzm--Tw5bj6K78msy.png)

**Advisory Notice!**

While OpenBioLLM-70B leverages high-quality data sources, its outputs may still contain inaccuracies, biases, or misalignments that could pose risks if relied upon for medical decision-making without further testing and refinement. The model's performance has not yet been rigorously evaluated in randomized controlled trials or real-world healthcare environments.

Therefore, we strongly advise against using OpenBioLLM-70B for any direct patient care, clinical decision support, or other professional medical purposes at this time. Its use should be limited to research, development, and exploratory applications by qualified individuals who understand its limitations.

OpenBioLLM-70B is intended solely as a research tool to assist healthcare professionals and should never be considered a replacement for the professional judgment and expertise of a qualified medical doctor.

Appropriately adapting and validating OpenBioLLM-70B for specific medical use cases would require significant additional work, potentially including:

- Thorough testing and evaluation in relevant clinical scenarios
- Alignment with evidence-based guidelines and best practices
- Mitigation of potential biases and failure modes
- Integration with human oversight and interpretation
- Compliance with regulatory and ethical standards

Always consult a qualified healthcare provider for personal medical needs.

# Citation

If you find OpenBioLLM-70B & 8B useful in your work, please cite the model as follows:

```
@misc{OpenBioLLMs,
  author = {Ankit Pal and Malaikannan Sankarasubbu},
  title = {OpenBioLLMs: Advancing Open-Source Large Language Models for Healthcare and Life Sciences},
  year = {2024},
  publisher = {Hugging Face},
  journal = {Hugging Face repository},
  howpublished = {\url{https://huggingface.co/aaditya/OpenBioLLM-Llama3-70B}}
}
```

The accompanying paper is currently in progress and will be released soon.
<div align="center">
<h2> 💌 Contact </h2>
</div>

We look forward to hearing from you and collaborating on this exciting project!

**Contributors:**
- [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) [aadityaura at gmail dot com]
- Saama AI Labs
- Note: I am looking for a funded PhD opportunity, especially if it fits my Responsible Generative AI, Multimodal LLMs, Geometric Deep Learning, and Healthcare AI skillset.

# References

We thank the [Meta Team](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) for their amazing models!

Result sources:

- [1] GPT-4 [Capabilities of GPT-4 on Medical Challenge Problems](https://arxiv.org/abs/2303.13375)
- [2] Med-PaLM-1 [Large Language Models Encode Clinical Knowledge](https://arxiv.org/abs/2212.13138)
- [3] Med-PaLM-2 [Towards Expert-Level Medical Question Answering with Large Language Models](https://arxiv.org/abs/2305.09617)
- [4] Gemini-1.0 [Gemini Goes to Med School](https://arxiv.org/abs/2402.07023)
[ "QUESTION_ANSWERING" ]
[ "MEDQA", "PUBMEDQA" ]
BioNLP
twadada/mpn-wordpca
twadada
null
[ "mteb", "model-index", "region:us" ]
1,726
1,726
0
0
--- tags: - mteb model-index: - name: mpnet_main_wordPCA results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: None config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 66.43283582089553 - type: ap value: 30.29663133704438 - type: f1 value: 60.96548204994961 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: None config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 65.5157 - type: ap value: 60.635968290391354 - type: f1 value: 65.22797046096731 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: None config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 33.018 - type: f1 value: 32.56614947751552 - task: type: Retrieval dataset: name: MTEB ArguAna type: None config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 20.768 - type: map_at_10 value: 33.792 - type: map_at_100 value: 34.961 - type: map_at_1000 value: 35.001 - type: map_at_3 value: 29.942999999999998 - type: map_at_5 value: 31.941999999999997 - type: mrr_at_1 value: 21.124000000000002 - type: mrr_at_10 value: 33.954 - type: mrr_at_100 value: 35.124 - type: mrr_at_1000 value: 35.164 - type: mrr_at_3 value: 30.133 - type: mrr_at_5 value: 32.082 - type: ndcg_at_1 value: 20.768 - type: ndcg_at_10 value: 41.001 - type: ndcg_at_100 value: 46.556 - type: ndcg_at_1000 value: 47.671 - type: ndcg_at_3 value: 32.938 - type: ndcg_at_5 value: 36.571 - type: precision_at_1 value: 20.768 - type: precision_at_10 value: 6.4079999999999995 - type: precision_at_100 value: 0.897 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 13.869000000000002 - type: precision_at_5 value: 10.100000000000001 - type: recall_at_1 value: 20.768 - type: recall_at_10 value: 64.083 - type: recall_at_100 value: 89.687 - type: recall_at_1000 value: 98.578 - type: recall_at_3 value: 41.607 - type: recall_at_5 value: 50.498 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: None config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 35.53339688014046 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: None config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 26.894818446313483 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: None config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 54.07897208491808 - type: mrr value: 68.92175614474783 - task: type: STS dataset: name: MTEB BIOSSES type: None config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 72.13906525048021 - type: cos_sim_spearman value: 71.55223277580176 - type: euclidean_pearson value: 72.00248024168417 - type: euclidean_spearman value: 71.55223277580176 - type: manhattan_pearson value: 72.96802825988968 - type: manhattan_spearman value: 72.66912569484411 - task: type: Classification dataset: name: MTEB Banking77Classification type: None config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 70.05194805194805 - type: f1 value: 68.84496347971877 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: 
None config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 33.41380417466471 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: None config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 24.417788232051212 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: None config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 20.139000000000003 - type: map_at_10 value: 27.015 - type: map_at_100 value: 28.132 - type: map_at_1000 value: 28.301 - type: map_at_3 value: 24.740000000000002 - type: map_at_5 value: 25.672 - type: mrr_at_1 value: 25.607999999999997 - type: mrr_at_10 value: 32.143 - type: mrr_at_100 value: 32.979 - type: mrr_at_1000 value: 33.062999999999995 - type: mrr_at_3 value: 30.067 - type: mrr_at_5 value: 30.875000000000004 - type: ndcg_at_1 value: 25.607999999999997 - type: ndcg_at_10 value: 31.913999999999998 - type: ndcg_at_100 value: 36.853 - type: ndcg_at_1000 value: 40.189 - type: ndcg_at_3 value: 28.214 - type: ndcg_at_5 value: 29.185 - type: precision_at_1 value: 25.607999999999997 - type: precision_at_10 value: 6.252000000000001 - type: precision_at_100 value: 1.11 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 13.591000000000001 - type: precision_at_5 value: 9.585 - type: recall_at_1 value: 20.139000000000003 - type: recall_at_10 value: 41.065000000000005 - type: recall_at_100 value: 62.966 - type: recall_at_1000 value: 85.545 - type: recall_at_3 value: 29.331000000000003 - type: recall_at_5 value: 32.532 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: None config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 14.366000000000001 - type: map_at_10 value: 19.827 - type: map_at_100 value: 20.727999999999998 - type: map_at_1000 value: 20.854 - type: map_at_3 value: 17.957 - type: map_at_5 value: 18.961 - type: mrr_at_1 value: 18.471 - type: mrr_at_10 value: 23.778 - type: mrr_at_100 value: 24.58 - type: mrr_at_1000 value: 24.661 - type: mrr_at_3 value: 21.847 - type: mrr_at_5 value: 22.872999999999998 - type: ndcg_at_1 value: 18.471 - type: ndcg_at_10 value: 23.555 - type: ndcg_at_100 value: 27.755000000000003 - type: ndcg_at_1000 value: 30.739 - type: ndcg_at_3 value: 20.371 - type: ndcg_at_5 value: 21.739 - type: precision_at_1 value: 18.471 - type: precision_at_10 value: 4.516 - type: precision_at_100 value: 0.832 - type: precision_at_1000 value: 0.132 - type: precision_at_3 value: 9.851 - type: precision_at_5 value: 7.172000000000001 - type: recall_at_1 value: 14.366000000000001 - type: recall_at_10 value: 30.819999999999997 - type: recall_at_100 value: 49.303999999999995 - type: recall_at_1000 value: 69.975 - type: recall_at_3 value: 21.397 - type: recall_at_5 value: 25.223000000000003 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: None config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 23.610999999999997 - type: map_at_10 value: 31.745 - type: map_at_100 value: 32.844 - type: map_at_1000 value: 32.940999999999995 - type: map_at_3 value: 29.110000000000003 - type: map_at_5 value: 30.695 - type: mrr_at_1 value: 27.586 - type: mrr_at_10 value: 34.809 - type: mrr_at_100 value: 35.745 - type: mrr_at_1000 value: 35.806 - type: mrr_at_3 value: 32.414 - type: 
mrr_at_5 value: 33.839999999999996 - type: ndcg_at_1 value: 27.586 - type: ndcg_at_10 value: 36.369 - type: ndcg_at_100 value: 41.743 - type: ndcg_at_1000 value: 44.01 - type: ndcg_at_3 value: 31.551000000000002 - type: ndcg_at_5 value: 34.048 - type: precision_at_1 value: 27.586 - type: precision_at_10 value: 5.994 - type: precision_at_100 value: 0.958 - type: precision_at_1000 value: 0.123 - type: precision_at_3 value: 14.023 - type: precision_at_5 value: 10.006 - type: recall_at_1 value: 23.610999999999997 - type: recall_at_10 value: 47.532999999999994 - type: recall_at_100 value: 71.89 - type: recall_at_1000 value: 88.469 - type: recall_at_3 value: 34.624 - type: recall_at_5 value: 40.760000000000005 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: None config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 10.452 - type: map_at_10 value: 14.865 - type: map_at_100 value: 15.709000000000001 - type: map_at_1000 value: 15.818999999999999 - type: map_at_3 value: 13.131 - type: map_at_5 value: 13.935 - type: mrr_at_1 value: 11.299 - type: mrr_at_10 value: 15.856 - type: mrr_at_100 value: 16.727 - type: mrr_at_1000 value: 16.817999999999998 - type: mrr_at_3 value: 14.143 - type: mrr_at_5 value: 14.962 - type: ndcg_at_1 value: 11.299 - type: ndcg_at_10 value: 17.807000000000002 - type: ndcg_at_100 value: 22.542 - type: ndcg_at_1000 value: 25.871 - type: ndcg_at_3 value: 14.277999999999999 - type: ndcg_at_5 value: 15.689 - type: precision_at_1 value: 11.299 - type: precision_at_10 value: 2.994 - type: precision_at_100 value: 0.571 - type: precision_at_1000 value: 0.091 - type: precision_at_3 value: 6.177 - type: precision_at_5 value: 4.542 - type: recall_at_1 value: 10.452 - type: recall_at_10 value: 26.043 - type: recall_at_100 value: 48.955 - type: recall_at_1000 value: 75.03999999999999 - type: recall_at_3 value: 16.384 - type: recall_at_5 value: 19.819 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: None config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 5.7700000000000005 - type: map_at_10 value: 9.113 - type: map_at_100 value: 9.969999999999999 - type: map_at_1000 value: 10.086 - type: map_at_3 value: 7.797 - type: map_at_5 value: 8.512 - type: mrr_at_1 value: 7.463 - type: mrr_at_10 value: 11.491 - type: mrr_at_100 value: 12.354 - type: mrr_at_1000 value: 12.443999999999999 - type: mrr_at_3 value: 9.866999999999999 - type: mrr_at_5 value: 10.738 - type: ndcg_at_1 value: 7.463 - type: ndcg_at_10 value: 11.759 - type: ndcg_at_100 value: 16.364 - type: ndcg_at_1000 value: 19.614 - type: ndcg_at_3 value: 9.043 - type: ndcg_at_5 value: 10.25 - type: precision_at_1 value: 7.463 - type: precision_at_10 value: 2.363 - type: precision_at_100 value: 0.553 - type: precision_at_1000 value: 0.097 - type: precision_at_3 value: 4.436 - type: precision_at_5 value: 3.458 - type: recall_at_1 value: 5.7700000000000005 - type: recall_at_10 value: 17.69 - type: recall_at_100 value: 38.624 - type: recall_at_1000 value: 62.425 - type: recall_at_3 value: 10.281 - type: recall_at_5 value: 13.221 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: None config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 15.049999999999999 - type: map_at_10 value: 20.639 - type: map_at_100 value: 21.806 - type: map_at_1000 value: 21.953 - type: map_at_3 value: 18.549 - 
type: map_at_5 value: 19.686 - type: mrr_at_1 value: 18.864 - type: mrr_at_10 value: 24.948999999999998 - type: mrr_at_100 value: 25.933 - type: mrr_at_1000 value: 26.023000000000003 - type: mrr_at_3 value: 22.794 - type: mrr_at_5 value: 24.07 - type: ndcg_at_1 value: 18.864 - type: ndcg_at_10 value: 24.877 - type: ndcg_at_100 value: 30.705 - type: ndcg_at_1000 value: 34.195 - type: ndcg_at_3 value: 21.112000000000002 - type: ndcg_at_5 value: 22.895 - type: precision_at_1 value: 18.864 - type: precision_at_10 value: 4.793 - type: precision_at_100 value: 0.9390000000000001 - type: precision_at_1000 value: 0.14400000000000002 - type: precision_at_3 value: 10.106 - type: precision_at_5 value: 7.603 - type: recall_at_1 value: 15.049999999999999 - type: recall_at_10 value: 33.466 - type: recall_at_100 value: 59.496 - type: recall_at_1000 value: 84.101 - type: recall_at_3 value: 22.948 - type: recall_at_5 value: 27.389999999999997 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: None config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 10.477 - type: map_at_10 value: 15.299 - type: map_at_100 value: 16.372 - type: map_at_1000 value: 16.521 - type: map_at_3 value: 13.3 - type: map_at_5 value: 14.306 - type: mrr_at_1 value: 13.242 - type: mrr_at_10 value: 18.465 - type: mrr_at_100 value: 19.387 - type: mrr_at_1000 value: 19.483 - type: mrr_at_3 value: 16.362 - type: mrr_at_5 value: 17.452 - type: ndcg_at_1 value: 13.242 - type: ndcg_at_10 value: 19.016 - type: ndcg_at_100 value: 24.556 - type: ndcg_at_1000 value: 28.205999999999996 - type: ndcg_at_3 value: 15.242 - type: ndcg_at_5 value: 16.802 - type: precision_at_1 value: 13.242 - type: precision_at_10 value: 3.8129999999999997 - type: precision_at_100 value: 0.796 - type: precision_at_1000 value: 0.131 - type: precision_at_3 value: 7.42 - type: precision_at_5 value: 5.639 - type: recall_at_1 value: 10.477 - type: recall_at_10 value: 27.250000000000004 - type: recall_at_100 value: 52.459 - type: recall_at_1000 value: 78.224 - type: recall_at_3 value: 16.663 - type: recall_at_5 value: 20.759 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 12.268416666666667 - type: map_at_10 value: 17.1725 - type: map_at_100 value: 18.108 - type: map_at_1000 value: 18.239833333333337 - type: map_at_3 value: 15.385916666666667 - type: map_at_5 value: 16.33808333333333 - type: mrr_at_1 value: 15.043583333333332 - type: mrr_at_10 value: 20.076749999999997 - type: mrr_at_100 value: 20.9225 - type: mrr_at_1000 value: 21.012833333333333 - type: mrr_at_3 value: 18.25883333333333 - type: mrr_at_5 value: 19.230083333333333 - type: ndcg_at_1 value: 15.043583333333332 - type: ndcg_at_10 value: 20.6305 - type: ndcg_at_100 value: 25.39741666666667 - type: ndcg_at_1000 value: 28.71625 - type: ndcg_at_3 value: 17.321416666666668 - type: ndcg_at_5 value: 18.781666666666666 - type: precision_at_1 value: 15.043583333333332 - type: precision_at_10 value: 3.8265000000000007 - type: precision_at_100 value: 0.7464166666666667 - type: precision_at_1000 value: 0.12216666666666667 - type: precision_at_3 value: 8.127416666666665 - type: precision_at_5 value: 5.986083333333333 - type: recall_at_1 value: 12.268416666666667 - type: recall_at_10 value: 28.25583333333333 - type: recall_at_100 value: 50.277833333333334 - type: recall_at_1000 value: 
74.42433333333332 - type: recall_at_3 value: 18.861 - type: recall_at_5 value: 22.663416666666674 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: None config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 9.209 - type: map_at_10 value: 13.633999999999999 - type: map_at_100 value: 14.328 - type: map_at_1000 value: 14.418000000000001 - type: map_at_3 value: 11.828 - type: map_at_5 value: 12.812000000000001 - type: mrr_at_1 value: 11.043 - type: mrr_at_10 value: 15.639 - type: mrr_at_100 value: 16.349 - type: mrr_at_1000 value: 16.426 - type: mrr_at_3 value: 13.88 - type: mrr_at_5 value: 14.838999999999999 - type: ndcg_at_1 value: 11.043 - type: ndcg_at_10 value: 16.738 - type: ndcg_at_100 value: 20.541 - type: ndcg_at_1000 value: 23.237 - type: ndcg_at_3 value: 13.261999999999999 - type: ndcg_at_5 value: 14.89 - type: precision_at_1 value: 11.043 - type: precision_at_10 value: 3.052 - type: precision_at_100 value: 0.549 - type: precision_at_1000 value: 0.08499999999999999 - type: precision_at_3 value: 6.135 - type: precision_at_5 value: 4.693 - type: recall_at_1 value: 9.209 - type: recall_at_10 value: 24.581 - type: recall_at_100 value: 42.473 - type: recall_at_1000 value: 63.19200000000001 - type: recall_at_3 value: 14.901 - type: recall_at_5 value: 19.067 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: None config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 5.831 - type: map_at_10 value: 9.002 - type: map_at_100 value: 9.646 - type: map_at_1000 value: 9.769 - type: map_at_3 value: 7.949000000000001 - type: map_at_5 value: 8.484 - type: mrr_at_1 value: 7.811 - type: mrr_at_10 value: 11.218 - type: mrr_at_100 value: 11.902 - type: mrr_at_1000 value: 12.005 - type: mrr_at_3 value: 10.077 - type: mrr_at_5 value: 10.652000000000001 - type: ndcg_at_1 value: 7.811 - type: ndcg_at_10 value: 11.342 - type: ndcg_at_100 value: 15.045 - type: ndcg_at_1000 value: 18.703 - type: ndcg_at_3 value: 9.293 - type: ndcg_at_5 value: 10.126 - type: precision_at_1 value: 7.811 - type: precision_at_10 value: 2.213 - type: precision_at_100 value: 0.501 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 4.565 - type: precision_at_5 value: 3.4070000000000005 - type: recall_at_1 value: 5.831 - type: recall_at_10 value: 16.456 - type: recall_at_100 value: 33.985 - type: recall_at_1000 value: 61.44500000000001 - type: recall_at_3 value: 10.578 - type: recall_at_5 value: 12.720999999999998 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: None config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 11.285 - type: map_at_10 value: 15.156 - type: map_at_100 value: 16.067999999999998 - type: map_at_1000 value: 16.189999999999998 - type: map_at_3 value: 13.806 - type: map_at_5 value: 14.471 - type: mrr_at_1 value: 13.899000000000001 - type: mrr_at_10 value: 18.291 - type: mrr_at_100 value: 19.141 - type: mrr_at_1000 value: 19.239 - type: mrr_at_3 value: 16.822 - type: mrr_at_5 value: 17.488999999999997 - type: ndcg_at_1 value: 13.899000000000001 - type: ndcg_at_10 value: 18.142 - type: ndcg_at_100 value: 23.116 - type: ndcg_at_1000 value: 26.406000000000002 - type: ndcg_at_3 value: 15.519 - type: ndcg_at_5 value: 16.502 - type: precision_at_1 value: 13.899000000000001 - type: precision_at_10 value: 3.125 - type: precision_at_100 value: 0.633 - type: 
precision_at_1000 value: 0.10300000000000001 - type: precision_at_3 value: 7.152 - type: precision_at_5 value: 4.981 - type: recall_at_1 value: 11.285 - type: recall_at_10 value: 24.481 - type: recall_at_100 value: 47.82 - type: recall_at_1000 value: 71.873 - type: recall_at_3 value: 16.954 - type: recall_at_5 value: 19.625999999999998 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: None config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 12.497 - type: map_at_10 value: 17.415 - type: map_at_100 value: 18.551000000000002 - type: map_at_1000 value: 18.759 - type: map_at_3 value: 15.648000000000001 - type: map_at_5 value: 16.685 - type: mrr_at_1 value: 15.809999999999999 - type: mrr_at_10 value: 20.74 - type: mrr_at_100 value: 21.657 - type: mrr_at_1000 value: 21.764 - type: mrr_at_3 value: 18.972 - type: mrr_at_5 value: 20.01 - type: ndcg_at_1 value: 15.809999999999999 - type: ndcg_at_10 value: 21.061 - type: ndcg_at_100 value: 26.150000000000002 - type: ndcg_at_1000 value: 30.381000000000004 - type: ndcg_at_3 value: 18.167 - type: ndcg_at_5 value: 19.599 - type: precision_at_1 value: 15.809999999999999 - type: precision_at_10 value: 4.289 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.191 - type: precision_at_3 value: 8.959 - type: precision_at_5 value: 6.68 - type: recall_at_1 value: 12.497 - type: recall_at_10 value: 27.615000000000002 - type: recall_at_100 value: 52.063 - type: recall_at_1000 value: 81.321 - type: recall_at_3 value: 18.648 - type: recall_at_5 value: 22.808 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval type: None config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 8.534 - type: map_at_10 value: 12.36 - type: map_at_100 value: 13.142000000000001 - type: map_at_1000 value: 13.267000000000001 - type: map_at_3 value: 10.816 - type: map_at_5 value: 11.838 - type: mrr_at_1 value: 9.427000000000001 - type: mrr_at_10 value: 13.542000000000002 - type: mrr_at_100 value: 14.316 - type: mrr_at_1000 value: 14.421999999999999 - type: mrr_at_3 value: 11.860999999999999 - type: mrr_at_5 value: 12.961 - type: ndcg_at_1 value: 9.427000000000001 - type: ndcg_at_10 value: 14.985999999999999 - type: ndcg_at_100 value: 19.399 - type: ndcg_at_1000 value: 23.044 - type: ndcg_at_3 value: 11.805 - type: ndcg_at_5 value: 13.655000000000001 - type: precision_at_1 value: 9.427000000000001 - type: precision_at_10 value: 2.514 - type: precision_at_100 value: 0.519 - type: precision_at_1000 value: 0.092 - type: precision_at_3 value: 5.114 - type: precision_at_5 value: 4.067 - type: recall_at_1 value: 8.534 - type: recall_at_10 value: 22.07 - type: recall_at_100 value: 43.299 - type: recall_at_1000 value: 71.482 - type: recall_at_3 value: 13.623 - type: recall_at_5 value: 18.035 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: None config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 6.146999999999999 - type: map_at_10 value: 11.126 - type: map_at_100 value: 12.579 - type: map_at_1000 value: 12.791 - type: map_at_3 value: 9.035 - type: map_at_5 value: 10.111 - type: mrr_at_1 value: 13.941 - type: mrr_at_10 value: 22.84 - type: mrr_at_100 value: 24.03 - type: mrr_at_1000 value: 24.093999999999998 - type: mrr_at_3 value: 19.695999999999998 - type: mrr_at_5 value: 21.455 - type: ndcg_at_1 value: 13.941 - type: ndcg_at_10 value: 
16.864 - type: ndcg_at_100 value: 23.701 - type: ndcg_at_1000 value: 27.817999999999998 - type: ndcg_at_3 value: 12.888 - type: ndcg_at_5 value: 14.359 - type: precision_at_1 value: 13.941 - type: precision_at_10 value: 5.6419999999999995 - type: precision_at_100 value: 1.298 - type: precision_at_1000 value: 0.20400000000000001 - type: precision_at_3 value: 10.011000000000001 - type: precision_at_5 value: 8.065 - type: recall_at_1 value: 6.146999999999999 - type: recall_at_10 value: 21.701 - type: recall_at_100 value: 46.117000000000004 - type: recall_at_1000 value: 69.64 - type: recall_at_3 value: 12.052999999999999 - type: recall_at_5 value: 15.956999999999999 - task: type: Retrieval dataset: name: MTEB DBPedia type: None config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 3.2620000000000005 - type: map_at_10 value: 7.564 - type: map_at_100 value: 10.668999999999999 - type: map_at_1000 value: 11.533999999999999 - type: map_at_3 value: 5.322 - type: map_at_5 value: 6.135 - type: mrr_at_1 value: 36.5 - type: mrr_at_10 value: 45.622 - type: mrr_at_100 value: 46.312999999999995 - type: mrr_at_1000 value: 46.342 - type: mrr_at_3 value: 42.458 - type: mrr_at_5 value: 43.796 - type: ndcg_at_1 value: 26.125 - type: ndcg_at_10 value: 20.279 - type: ndcg_at_100 value: 23.253 - type: ndcg_at_1000 value: 29.804000000000002 - type: ndcg_at_3 value: 21.688 - type: ndcg_at_5 value: 20.203 - type: precision_at_1 value: 36.5 - type: precision_at_10 value: 18.45 - type: precision_at_100 value: 5.997 - type: precision_at_1000 value: 1.309 - type: precision_at_3 value: 26.25 - type: precision_at_5 value: 21.95 - type: recall_at_1 value: 3.2620000000000005 - type: recall_at_10 value: 12.374 - type: recall_at_100 value: 30.087000000000003 - type: recall_at_1000 value: 52.72599999999999 - type: recall_at_3 value: 6.393 - type: recall_at_5 value: 8.013 - task: type: Classification dataset: name: MTEB EmotionClassification type: None config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 44.93000000000001 - type: f1 value: 40.84475902511729 - task: type: Retrieval dataset: name: MTEB FEVER type: None config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 12.41 - type: map_at_10 value: 18.831999999999997 - type: map_at_100 value: 19.747 - type: map_at_1000 value: 19.832 - type: map_at_3 value: 16.712 - type: map_at_5 value: 17.876 - type: mrr_at_1 value: 13.141 - type: mrr_at_10 value: 19.917 - type: mrr_at_100 value: 20.846999999999998 - type: mrr_at_1000 value: 20.924 - type: mrr_at_3 value: 17.737 - type: mrr_at_5 value: 18.948 - type: ndcg_at_1 value: 13.141 - type: ndcg_at_10 value: 22.787 - type: ndcg_at_100 value: 27.505000000000003 - type: ndcg_at_1000 value: 29.904999999999998 - type: ndcg_at_3 value: 18.41 - type: ndcg_at_5 value: 20.508000000000003 - type: precision_at_1 value: 13.141 - type: precision_at_10 value: 3.6929999999999996 - type: precision_at_100 value: 0.624 - type: precision_at_1000 value: 0.08499999999999999 - type: precision_at_3 value: 7.991 - type: precision_at_5 value: 5.869 - type: recall_at_1 value: 12.41 - type: recall_at_10 value: 34.211999999999996 - type: recall_at_100 value: 56.301 - type: recall_at_1000 value: 74.936 - type: recall_at_3 value: 22.283 - type: recall_at_5 value: 27.342 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: None config: default split: test revision: 
27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 5.861000000000001 - type: map_at_10 value: 9.26 - type: map_at_100 value: 10.441 - type: map_at_1000 value: 10.642 - type: map_at_3 value: 8.008 - type: map_at_5 value: 8.674999999999999 - type: mrr_at_1 value: 11.574 - type: mrr_at_10 value: 16.729 - type: mrr_at_100 value: 17.842 - type: mrr_at_1000 value: 17.949 - type: mrr_at_3 value: 15.072 - type: mrr_at_5 value: 16.106 - type: ndcg_at_1 value: 11.574 - type: ndcg_at_10 value: 12.805 - type: ndcg_at_100 value: 18.877 - type: ndcg_at_1000 value: 23.662 - type: ndcg_at_3 value: 10.992 - type: ndcg_at_5 value: 11.677 - type: precision_at_1 value: 11.574 - type: precision_at_10 value: 3.673 - type: precision_at_100 value: 0.963 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 7.407 - type: precision_at_5 value: 5.679 - type: recall_at_1 value: 5.861000000000001 - type: recall_at_10 value: 15.911 - type: recall_at_100 value: 40.158 - type: recall_at_1000 value: 70.295 - type: recall_at_3 value: 10.142 - type: recall_at_5 value: 12.509999999999998 - task: type: Retrieval dataset: name: MTEB HotpotQA type: None config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 13.794999999999998 - type: map_at_10 value: 19.573999999999998 - type: map_at_100 value: 20.363 - type: map_at_1000 value: 20.469 - type: map_at_3 value: 17.862000000000002 - type: map_at_5 value: 18.849 - type: mrr_at_1 value: 27.589000000000002 - type: mrr_at_10 value: 33.997 - type: mrr_at_100 value: 34.747 - type: mrr_at_1000 value: 34.812 - type: mrr_at_3 value: 32.163000000000004 - type: mrr_at_5 value: 33.265 - type: ndcg_at_1 value: 27.589000000000002 - type: ndcg_at_10 value: 25.413999999999998 - type: ndcg_at_100 value: 29.336000000000002 - type: ndcg_at_1000 value: 32.012 - type: ndcg_at_3 value: 22.093 - type: ndcg_at_5 value: 23.794999999999998 - type: precision_at_1 value: 27.589000000000002 - type: precision_at_10 value: 5.699 - type: precision_at_100 value: 0.886 - type: precision_at_1000 value: 0.124 - type: precision_at_3 value: 14.027000000000001 - type: precision_at_5 value: 9.766 - type: recall_at_1 value: 13.794999999999998 - type: recall_at_10 value: 28.494000000000003 - type: recall_at_100 value: 44.308 - type: recall_at_1000 value: 62.22800000000001 - type: recall_at_3 value: 21.04 - type: recall_at_5 value: 24.416 - task: type: Classification dataset: name: MTEB ImdbClassification type: None config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 64.60480000000001 - type: ap value: 59.727131414185166 - type: f1 value: 64.1072904179992 - task: type: Retrieval dataset: name: MTEB MSMARCO type: None config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 5.826 - type: map_at_10 value: 9.851 - type: map_at_100 value: 10.697 - type: map_at_1000 value: 10.803 - type: map_at_3 value: 8.238 - type: map_at_5 value: 9.104 - type: mrr_at_1 value: 6.046 - type: mrr_at_10 value: 10.134 - type: mrr_at_100 value: 10.987 - type: mrr_at_1000 value: 11.091 - type: mrr_at_3 value: 8.508000000000001 - type: mrr_at_5 value: 9.371 - type: ndcg_at_1 value: 6.046 - type: ndcg_at_10 value: 12.536 - type: ndcg_at_100 value: 17.187 - type: ndcg_at_1000 value: 20.371 - type: ndcg_at_3 value: 9.164 - type: ndcg_at_5 value: 10.725999999999999 - type: precision_at_1 value: 6.046 - type: precision_at_10 value: 2.175 - type: 
precision_at_100 value: 0.45799999999999996 - type: precision_at_1000 value: 0.073 - type: precision_at_3 value: 4.016 - type: precision_at_5 value: 3.206 - type: recall_at_1 value: 5.826 - type: recall_at_10 value: 20.926000000000002 - type: recall_at_100 value: 43.669000000000004 - type: recall_at_1000 value: 69.247 - type: recall_at_3 value: 11.609 - type: recall_at_5 value: 15.376999999999999 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: None config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 88.16689466484269 - type: f1 value: 87.5981029414324 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: None config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 55.33515731874146 - type: f1 value: 37.77464057733795 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: None config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.044384667115004 - type: f1 value: 60.29405543040358 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: None config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 71.21721587088096 - type: f1 value: 69.8224169927227 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: None config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 29.09923884257758 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: None config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 27.58464252003336 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: None config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 29.685278831397703 - type: mrr value: 30.52917269137772 - task: type: Retrieval dataset: name: MTEB NFCorpus type: None config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 4.274 - type: map_at_10 value: 7.764 - type: map_at_100 value: 10.043000000000001 - type: map_at_1000 value: 11.394 - type: map_at_3 value: 5.887 - type: map_at_5 value: 6.761 - type: mrr_at_1 value: 33.745999999999995 - type: mrr_at_10 value: 42.836 - type: mrr_at_100 value: 43.701 - type: mrr_at_1000 value: 43.749 - type: mrr_at_3 value: 40.248 - type: mrr_at_5 value: 41.78 - type: ndcg_at_1 value: 31.734 - type: ndcg_at_10 value: 24.268 - type: ndcg_at_100 value: 23.801 - type: ndcg_at_1000 value: 33.347 - type: ndcg_at_3 value: 27.394000000000002 - type: ndcg_at_5 value: 26.312 - type: precision_at_1 value: 33.745999999999995 - type: precision_at_10 value: 17.802 - type: precision_at_100 value: 6.616 - type: precision_at_1000 value: 2.008 - type: precision_at_3 value: 25.8 - type: precision_at_5 value: 22.972 - type: recall_at_1 value: 4.274 - type: recall_at_10 value: 12.012 - type: recall_at_100 value: 26.706999999999997 - type: recall_at_1000 value: 60.634 - type: recall_at_3 value: 6.61 - type: recall_at_5 value: 8.545 - task: type: Retrieval dataset: name: MTEB NQ type: None config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 8.519 - type: map_at_10 value: 14.407 - type: map_at_100 value: 15.607 - type: map_at_1000 value: 
15.705 - type: map_at_3 value: 12.241 - type: map_at_5 value: 13.291 - type: mrr_at_1 value: 9.676 - type: mrr_at_10 value: 16.025 - type: mrr_at_100 value: 17.151 - type: mrr_at_1000 value: 17.233 - type: mrr_at_3 value: 13.783999999999999 - type: mrr_at_5 value: 14.875 - type: ndcg_at_1 value: 9.676 - type: ndcg_at_10 value: 18.439 - type: ndcg_at_100 value: 24.375 - type: ndcg_at_1000 value: 27.111 - type: ndcg_at_3 value: 13.862 - type: ndcg_at_5 value: 15.747 - type: precision_at_1 value: 9.676 - type: precision_at_10 value: 3.473 - type: precision_at_100 value: 0.681 - type: precision_at_1000 value: 0.094 - type: precision_at_3 value: 6.615 - type: precision_at_5 value: 5.017 - type: recall_at_1 value: 8.519 - type: recall_at_10 value: 29.442 - type: recall_at_100 value: 56.704 - type: recall_at_1000 value: 77.827 - type: recall_at_3 value: 17.055 - type: recall_at_5 value: 21.454 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: None config: default split: test revision: None metrics: - type: map_at_1 value: 63.114000000000004 - type: map_at_10 value: 75.723 - type: map_at_100 value: 76.50099999999999 - type: map_at_1000 value: 76.536 - type: map_at_3 value: 72.80499999999999 - type: map_at_5 value: 74.512 - type: mrr_at_1 value: 72.59 - type: mrr_at_10 value: 79.794 - type: mrr_at_100 value: 80.05499999999999 - type: mrr_at_1000 value: 80.061 - type: mrr_at_3 value: 78.333 - type: mrr_at_5 value: 79.244 - type: ndcg_at_1 value: 72.66 - type: ndcg_at_10 value: 80.306 - type: ndcg_at_100 value: 82.439 - type: ndcg_at_1000 value: 82.826 - type: ndcg_at_3 value: 76.72099999999999 - type: ndcg_at_5 value: 78.482 - type: precision_at_1 value: 72.66 - type: precision_at_10 value: 12.169 - type: precision_at_100 value: 1.4529999999999998 - type: precision_at_1000 value: 0.154 - type: precision_at_3 value: 33.267 - type: precision_at_5 value: 21.954 - type: recall_at_1 value: 63.114000000000004 - type: recall_at_10 value: 89.17699999999999 - type: recall_at_100 value: 97.208 - type: recall_at_1000 value: 99.39 - type: recall_at_3 value: 78.949 - type: recall_at_5 value: 83.848 - type: map_at_1 value: 2.6229999999999998 - type: map_at_10 value: 6.6290000000000004 - type: map_at_100 value: 8.052 - type: map_at_1000 value: 8.322000000000001 - type: map_at_3 value: 4.749 - type: map_at_5 value: 5.649 - type: mrr_at_1 value: 12.9 - type: mrr_at_10 value: 21.098 - type: mrr_at_100 value: 22.301000000000002 - type: mrr_at_1000 value: 22.391 - type: mrr_at_3 value: 18.099999999999998 - type: mrr_at_5 value: 19.615 - type: ndcg_at_1 value: 12.9 - type: ndcg_at_10 value: 12.031 - type: ndcg_at_100 value: 18.526 - type: ndcg_at_1000 value: 23.993000000000002 - type: ndcg_at_3 value: 10.894 - type: ndcg_at_5 value: 9.638 - type: precision_at_1 value: 12.9 - type: precision_at_10 value: 6.460000000000001 - type: precision_at_100 value: 1.598 - type: precision_at_1000 value: 0.292 - type: precision_at_3 value: 10.233 - type: precision_at_5 value: 8.559999999999999 - type: recall_at_1 value: 2.6229999999999998 - type: recall_at_10 value: 13.111999999999998 - type: recall_at_100 value: 32.418 - type: recall_at_1000 value: 59.24700000000001 - type: recall_at_3 value: 6.2330000000000005 - type: recall_at_5 value: 8.673 - type: map_at_1 value: 0.152 - type: map_at_10 value: 1.0370000000000001 - type: map_at_100 value: 5.169 - type: map_at_1000 value: 11.804 - type: map_at_3 value: 0.367 - type: map_at_5 value: 0.557 - type: mrr_at_1 value: 62.0 - type: mrr_at_10 value: 72.786 - type: mrr_at_100 
value: 73.009 - type: mrr_at_1000 value: 73.009 - type: mrr_at_3 value: 69.0 - type: mrr_at_5 value: 72.0 - type: ndcg_at_1 value: 56.99999999999999 - type: ndcg_at_10 value: 52.013 - type: ndcg_at_100 value: 36.669000000000004 - type: ndcg_at_1000 value: 31.721 - type: ndcg_at_3 value: 53.52 - type: ndcg_at_5 value: 52.54600000000001 - type: precision_at_1 value: 64.0 - type: precision_at_10 value: 57.4 - type: precision_at_100 value: 38.14 - type: precision_at_1000 value: 14.996 - type: precision_at_3 value: 58.667 - type: precision_at_5 value: 57.599999999999994 - type: recall_at_1 value: 0.152 - type: recall_at_10 value: 1.352 - type: recall_at_100 value: 8.392 - type: recall_at_1000 value: 30.470000000000002 - type: recall_at_3 value: 0.409 - type: recall_at_5 value: 0.661 - task: type: Clustering dataset: name: MTEB RedditClustering type: None config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 36.72406468585852 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: None config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 43.472169436837376 - task: type: STS dataset: name: MTEB SICK-R type: None config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 77.29221538383013 - type: cos_sim_spearman value: 65.00237260316423 - type: euclidean_pearson value: 71.97594877548565 - type: euclidean_spearman value: 65.00224511756271 - type: manhattan_pearson value: 70.69044074330846 - type: manhattan_spearman value: 64.56188226107231 - task: type: STS dataset: name: MTEB STS12 type: None config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 77.15455031153215 - type: cos_sim_spearman value: 68.87516936019733 - type: euclidean_pearson value: 74.40173366578063 - type: euclidean_spearman value: 68.87649875376111 - type: manhattan_pearson value: 71.37948083917088 - type: manhattan_spearman value: 66.87593196817534 - task: type: STS dataset: name: MTEB STS13 type: None config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 76.85372381398427 - type: cos_sim_spearman value: 78.70991572075931 - type: euclidean_pearson value: 78.02059239423504 - type: euclidean_spearman value: 78.70995352760224 - type: manhattan_pearson value: 77.29573198105021 - type: manhattan_spearman value: 77.83698304986561 - task: type: STS dataset: name: MTEB STS14 type: None config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 75.62945293081356 - type: cos_sim_spearman value: 72.21402271266491 - type: euclidean_pearson value: 74.8632992511187 - type: euclidean_spearman value: 72.21401302442116 - type: manhattan_pearson value: 74.12692290251161 - type: manhattan_spearman value: 71.68132558277071 - task: type: STS dataset: name: MTEB STS15 type: None config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 78.39263075496403 - type: cos_sim_spearman value: 80.29205232773359 - type: euclidean_pearson value: 79.98077838577872 - type: euclidean_spearman value: 80.29205083786974 - type: manhattan_pearson value: 79.40585067880913 - type: manhattan_spearman value: 79.76402780723464 - task: type: STS dataset: name: MTEB STS16 type: None config: default split: test revision: 
4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 67.87975357523919 - type: cos_sim_spearman value: 71.32759045416529 - type: euclidean_pearson value: 71.38828468907693 - type: euclidean_spearman value: 71.32841999853918 - type: manhattan_pearson value: 71.52115254680312 - type: manhattan_spearman value: 71.36502371568079 - task: type: STS dataset: name: MTEB STS17 (en-en) type: None config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 79.73783641527713 - type: cos_sim_spearman value: 80.83877209500405 - type: euclidean_pearson value: 80.9815305740776 - type: euclidean_spearman value: 80.8396480829947 - type: manhattan_pearson value: 80.20252699531369 - type: manhattan_spearman value: 80.06981696234901 - task: type: STS dataset: name: MTEB STS22 (en) type: None config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 47.00304169592553 - type: cos_sim_spearman value: 56.38544150292195 - type: euclidean_pearson value: 53.01919280358667 - type: euclidean_spearman value: 56.38544150292195 - type: manhattan_pearson value: 51.65539290367504 - type: manhattan_spearman value: 55.124846472764 - task: type: STS dataset: name: MTEB STSBenchmark type: None config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 71.91396118730897 - type: cos_sim_spearman value: 70.00936285125462 - type: euclidean_pearson value: 72.92234042209962 - type: euclidean_spearman value: 70.00938132595871 - type: manhattan_pearson value: 72.83606277528781 - type: manhattan_spearman value: 70.11738672765205 - task: type: Reranking dataset: name: MTEB SciDocsRR type: None config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 74.41530407041719 - type: mrr value: 91.4664184762224 - task: type: Retrieval dataset: name: MTEB SciFact type: None config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 31.667 - type: map_at_10 value: 40.327 - type: map_at_100 value: 41.29 - type: map_at_1000 value: 41.374 - type: map_at_3 value: 37.824000000000005 - type: map_at_5 value: 39.455 - type: mrr_at_1 value: 33.333 - type: mrr_at_10 value: 41.650999999999996 - type: mrr_at_100 value: 42.516 - type: mrr_at_1000 value: 42.586 - type: mrr_at_3 value: 39.333 - type: mrr_at_5 value: 40.967 - type: ndcg_at_1 value: 33.333 - type: ndcg_at_10 value: 44.955 - type: ndcg_at_100 value: 50.176 - type: ndcg_at_1000 value: 52.18000000000001 - type: ndcg_at_3 value: 40.233999999999995 - type: ndcg_at_5 value: 43.04 - type: precision_at_1 value: 33.333 - type: precision_at_10 value: 6.367000000000001 - type: precision_at_100 value: 0.9369999999999999 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 16.111 - type: precision_at_5 value: 11.267000000000001 - type: recall_at_1 value: 31.667 - type: recall_at_10 value: 57.778 - type: recall_at_100 value: 82.989 - type: recall_at_1000 value: 98.26700000000001 - type: recall_at_3 value: 45.639 - type: recall_at_5 value: 52.278000000000006 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: None config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.67029702970297 - type: cos_sim_ap value: 89.07129165961199 - type: cos_sim_f1 value: 82.57537046499746 - type: 
cos_sim_precision value: 84.43051201671892 - type: cos_sim_recall value: 80.80000000000001 - type: dot_accuracy value: 99.67029702970297 - type: dot_ap value: 89.07129165961199 - type: dot_f1 value: 82.57537046499746 - type: dot_precision value: 84.43051201671892 - type: dot_recall value: 80.80000000000001 - type: euclidean_accuracy value: 99.67029702970297 - type: euclidean_ap value: 89.07129165961199 - type: euclidean_f1 value: 82.57537046499746 - type: euclidean_precision value: 84.43051201671892 - type: euclidean_recall value: 80.80000000000001 - type: manhattan_accuracy value: 99.65643564356435 - type: manhattan_ap value: 88.41137288354716 - type: manhattan_f1 value: 81.70347003154575 - type: manhattan_precision value: 86.14190687361419 - type: manhattan_recall value: 77.7 - type: max_accuracy value: 99.67029702970297 - type: max_ap value: 89.07129165961199 - type: max_f1 value: 82.57537046499746 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: None config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 41.60638325431974 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: None config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 29.664294828405303 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: None config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 42.25132223734513 - type: mrr value: 42.64716492657669 - task: type: Summarization dataset: name: MTEB SummEval type: None config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.69873203329748 - type: cos_sim_spearman value: 30.504626278665437 - type: dot_pearson value: 30.69873212465138 - type: dot_spearman value: 30.576343071933813 - task: type: Retrieval dataset: name: MTEB Touche2020 type: None config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 1.165 - type: map_at_10 value: 4.088 - type: map_at_100 value: 7.378 - type: map_at_1000 value: 8.958 - type: map_at_3 value: 2.254 - type: map_at_5 value: 3.524 - type: mrr_at_1 value: 18.367 - type: mrr_at_10 value: 30.646 - type: mrr_at_100 value: 32.625 - type: mrr_at_1000 value: 32.661 - type: mrr_at_3 value: 26.871000000000002 - type: mrr_at_5 value: 30.238 - type: ndcg_at_1 value: 18.367 - type: ndcg_at_10 value: 13.061 - type: ndcg_at_100 value: 23.47 - type: ndcg_at_1000 value: 37.61 - type: ndcg_at_3 value: 15.546 - type: ndcg_at_5 value: 16.355 - type: precision_at_1 value: 18.367 - type: precision_at_10 value: 11.429 - type: precision_at_100 value: 5.653 - type: precision_at_1000 value: 1.451 - type: precision_at_3 value: 15.645999999999999 - type: precision_at_5 value: 17.551 - type: recall_at_1 value: 1.165 - type: recall_at_10 value: 8.018 - type: recall_at_100 value: 35.23 - type: recall_at_1000 value: 79.57600000000001 - type: recall_at_3 value: 3.076 - type: recall_at_5 value: 6.242 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: None config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.9568 - type: ap value: 13.253846581305634 - type: f1 value: 54.324508660685645 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: None config: default split: test revision: 
d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 55.39049235993209 - type: f1 value: 55.53656453466803 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: None config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 35.252056863048416 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: None config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 82.41640340942958 - type: cos_sim_ap value: 61.46818116365959 - type: cos_sim_f1 value: 59.05859279235107 - type: cos_sim_precision value: 55.151098901098905 - type: cos_sim_recall value: 63.562005277044854 - type: dot_accuracy value: 82.41640340942958 - type: dot_ap value: 61.46818116365959 - type: dot_f1 value: 59.05859279235107 - type: dot_precision value: 55.151098901098905 - type: dot_recall value: 63.562005277044854 - type: euclidean_accuracy value: 82.41640340942958 - type: euclidean_ap value: 61.46818116365959 - type: euclidean_f1 value: 59.05859279235107 - type: euclidean_precision value: 55.151098901098905 - type: euclidean_recall value: 63.562005277044854 - type: manhattan_accuracy value: 82.80979912976099 - type: manhattan_ap value: 62.57319706784323 - type: manhattan_f1 value: 59.7376886909181 - type: manhattan_precision value: 56.24417520969245 - type: manhattan_recall value: 63.69393139841689 - type: max_accuracy value: 82.80979912976099 - type: max_ap value: 62.57319706784323 - type: max_f1 value: 59.7376886909181 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: None config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 86.4012108510886 - type: cos_sim_ap value: 80.82375141067818 - type: cos_sim_f1 value: 72.80055190443339 - type: cos_sim_precision value: 68.88614031471174 - type: cos_sim_recall value: 77.18663381582999 - type: dot_accuracy value: 86.4012108510886 - type: dot_ap value: 80.82375193841179 - type: dot_f1 value: 72.80055190443339 - type: dot_precision value: 68.88614031471174 - type: dot_recall value: 77.18663381582999 - type: euclidean_accuracy value: 86.4012108510886 - type: euclidean_ap value: 80.82375158687223 - type: euclidean_f1 value: 72.80055190443339 - type: euclidean_precision value: 68.88614031471174 - type: euclidean_recall value: 77.18663381582999 - type: manhattan_accuracy value: 86.5700314355571 - type: manhattan_ap value: 81.24256901305888 - type: manhattan_f1 value: 73.30839356408673 - type: manhattan_precision value: 69.38466827088347 - type: manhattan_recall value: 77.70249461040962 - type: max_accuracy value: 86.5700314355571 - type: max_ap value: 81.24256901305888 - type: max_f1 value: 73.30839356408673 ---
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
gliner-community/gliner_xxl-v2.5
gliner-community
token-classification
[ "gliner", "pytorch", "token-classification", "multilingual", "dataset:urchade/pile-mistral-v0.1", "arxiv:2311.08526", "license:apache-2.0", "region:us" ]
1,724
1,725
70
3
--- datasets: - urchade/pile-mistral-v0.1 language: - multilingual library_name: gliner license: apache-2.0 pipeline_tag: token-classification --- # About GLiNER is a Named Entity Recognition (NER) model capable of identifying any entity type using a bidirectional transformer encoder (BERT-like). It provides a practical alternative to traditional NER models, which are limited to predefined entities, and to Large Language Models (LLMs) that, despite their flexibility, are costly and too large for resource-constrained scenarios. ## Links * Paper: https://arxiv.org/abs/2311.08526 * Repository: https://github.com/urchade/GLiNER ## Installation To use this model, you must install the GLiNER Python library: ``` !pip install gliner -U ``` ## Usage Once you've installed the GLiNER library, you can import the GLiNER class. You can then load this model using `GLiNER.from_pretrained` and predict entities with `predict_entities`. ```python from gliner import GLiNER model = GLiNER.from_pretrained("gliner-community/gliner_xxl-v2.5", load_tokenizer=True) text = """ Cristiano Ronaldo dos Santos Aveiro (Portuguese pronunciation: [kɾiʃˈtjɐnu ʁɔˈnaldu]; born 5 February 1985) is a Portuguese professional footballer who plays as a forward for and captains both Saudi Pro League club Al Nassr and the Portugal national team. Widely regarded as one of the greatest players of all time, Ronaldo has won five Ballon d'Or awards,[note 3] a record three UEFA Men's Player of the Year Awards, and four European Golden Shoes, the most by a European player. He has won 33 trophies in his career, including seven league titles, five UEFA Champions Leagues, the UEFA European Championship and the UEFA Nations League. Ronaldo holds the records for most appearances (183), goals (140) and assists (42) in the Champions League, goals in the European Championship (14), international goals (128) and international appearances (205). He is one of the few players to have made over 1,200 professional career appearances, the most by an outfield player, and has scored over 850 official senior career goals for club and country, making him the top goalscorer of all time. 
""" labels = ["person", "award", "date", "competitions", "teams"] entities = model.predict_entities(text, labels) for entity in entities: print(entity["text"], "=>", entity["label"]) ``` ``` Cristiano Ronaldo dos Santos Aveiro => person 5 February 1985 => date Al Nassr => teams Portugal national team => teams Ballon d'Or => award UEFA Men's Player of the Year Awards => award European Golden Shoes => award UEFA Champions Leagues => competitions UEFA European Championship => competitions UEFA Nations League => competitions Champions League => competitions European Championship => competitions ``` ## Named Entity Recognition benchmark result Below is a comparison of results between previous versions of the model and the current one: ![Models performance](models_comparison.png) ### Results on other datasets | Model | Dataset | Precision | Recall | F1 Score | |------------------------------------|---------------------|-----------|--------|----------| | gliner-community/gliner_small-v2.5 | ACE 2004 | 35.18% | 22.81% | 27.67% | | | ACE 2005 | 35.89% | 22.39% | 27.58% | | | AnatEM | 49.12% | 31.31% | 38.24% | | | Broad Tweet Corpus | 59.51% | 77.85% | 67.46% | | | CoNLL 2003 | 63.16% | 70.43% | 66.60% | | | FabNER | 23.78% | 22.55% | 23.15% | | | FindVehicle | 37.46% | 40.06% | 38.72% | | | GENIA_NER | 45.90% | 54.11% | 49.67% | | | HarveyNER | 13.20% | 32.58% | 18.78% | | | MultiNERD | 45.87% | 87.01% | 60.07% | | | Ontonotes | 23.05% | 41.16% | 29.55% | | | PolyglotNER | 31.88% | 67.22% | 43.25% | | | TweetNER7 | 40.98% | 39.91% | 40.44% | | | WikiANN en | 55.35% | 60.06% | 57.61% | | | WikiNeural | 64.52% | 86.24% | 73.81% | | | bc2gm | 51.70% | 49.99% | 50.83% | | | bc4chemd | 30.78% | 57.56% | 40.11% | | | bc5cdr | 63.48% | 69.65% | 66.42% | | | ncbi | 63.36% | 66.67% | 64.97% | | | **Average** | | | **46.58%** | |------------------------------------|---------------------|-----------|--------|----------| | urchade/gliner_small-v2.1 | ACE 2004 | 38.89% | 23.53% | 29.32% | | | ACE 2005 | 42.09% | 26.82% | 32.76% | | | AnatEM | 63.71% | 19.45% | 29.80% | | | Broad Tweet Corpus | 57.01% | 70.49% | 63.04% | | | CoNLL 2003 | 57.11% | 62.66% | 59.76% | | | FabNER | 32.41% | 12.33% | 17.87% | | | FindVehicle | 43.47% | 33.02% | 37.53% | | | GENIA_NER | 61.03% | 37.25% | 46.26% | | | HarveyNER | 23.12% | 15.16% | 18.32% | | | MultiNERD | 43.63% | 83.60% | 57.34% | | | Ontonotes | 23.25% | 35.41% | 28.07% | | | PolyglotNER | 29.47% | 64.41% | 40.44% | | | TweetNER7 | 44.78% | 30.83% | 36.52% | | | WikiANN en | 52.58% | 58.31% | 55.30% | | | WikiNeural | 53.38% | 82.19% | 64.72% | | | bc2gm | 66.64% | 30.56% | 41.90% | | | bc4chemd | 42.01% | 56.03% | 48.02% | | | bc5cdr | 72.03% | 58.58% | 64.61% | | | ncbi | 68.88% | 46.71% | 55.67% | | | **Average** | | | **43.54%** | |------------------------------------|---------------------|-----------|--------|----------| | EmergentMethods/gliner_small-v2.1 | ACE 2004 | 39.92% | 17.50% | 24.34% | | | ACE 2005 | 38.53% | 16.58% | 23.18% | | | AnatEM | 55.95% | 25.69% | 35.22% | | | Broad Tweet Corpus | 66.63% | 72.00% | 69.21% | | | CoNLL 2003 | 62.89% | 58.96% | 60.86% | | | FabNER | 32.76% | 13.33% | 18.95% | | | FindVehicle | 42.93% | 43.20% | 43.06% | | | GENIA_NER | 51.28% | 43.75% | 47.22% | | | HarveyNER | 24.82% | 21.52% | 23.05% | | | MultiNERD | 59.27% | 80.69% | 68.34% | | | Ontonotes | 32.97% | 37.59% | 35.13% | | | PolyglotNER | 33.60% | 63.30% | 43.90% | | | TweetNER7 | 46.90% | 28.66% | 35.58% | | | WikiANN en | 51.91% | 55.43% | 53.61% | | | 
WikiNeural | 70.65% | 82.21% | 75.99% | | | bc2gm | 49.95% | 43.13% | 46.29% | | | bc4chemd | 35.88% | 71.64% | 47.81% | | | bc5cdr | 68.41% | 68.90% | 68.65% | | | ncbi | 55.31% | 59.87% | 57.50% | | | **Average** | | | **46.20%** | |-----------------------------------------|---------------------|-----------|--------|----------| | gliner-community/gliner_medium-v2.5 | ACE 2004 | 33.06% | 20.96% | 25.66% | | | ACE 2005 | 33.65% | 19.65% | 24.81% | | | AnatEM | 52.03% | 35.28% | 42.05% | | | Broad Tweet Corpus | 60.57% | 79.09% | 68.60% | | | CoNLL 2003 | 63.80% | 68.31% | 65.98% | | | FabNER | 26.20% | 22.26% | 24.07% | | | FindVehicle | 41.95% | 40.68% | 41.30% | | | GENIA_NER | 51.83% | 62.34% | 56.60% | | | HarveyNER | 14.04% | 32.17% | 19.55% | | | MultiNERD | 47.63% | 88.78% | 62.00% | | | Ontonotes | 21.68% | 38.41% | 27.71% | | | PolyglotNER | 32.73% | 68.27% | 44.24% | | | TweetNER7 | 40.39% | 37.64% | 38.97% | | | WikiANN en | 56.41% | 59.90% | 58.10% | | | WikiNeural | 65.61% | 86.28% | 74.54% | | | bc2gm | 55.20% | 56.71% | 55.95% | | | bc4chemd | 35.94% | 63.67% | 45.94% | | | bc5cdr | 63.50% | 70.09% | 66.63% | | | ncbi | 62.96% | 68.55% | 65.63% | | | **Average** | | | **47.81%** | |-----------------------------------------|---------------------|-----------|--------|----------| | urchade/gliner_medium-v2.1 | ACE 2004 | 36.33% | 22.74% | 27.97% | | | ACE 2005 | 40.49% | 25.46% | 31.27% | | | AnatEM | 59.75% | 16.87% | 26.31% | | | Broad Tweet Corpus | 60.89% | 67.25% | 63.91% | | | CoNLL 2003 | 60.62% | 62.39% | 61.50% | | | FabNER | 27.72% | 12.24% | 16.98% | | | FindVehicle | 41.55% | 31.31% | 35.71% | | | GENIA_NER | 60.86% | 43.93% | 51.03% | | | HarveyNER | 23.20% | 23.16% | 23.18% | | | MultiNERD | 41.25% | 83.74% | 55.27% | | | Ontonotes | 20.58% | 34.11% | 25.67% | | | PolyglotNER | 31.32% | 64.22% | 42.11% | | | TweetNER7 | 44.52% | 33.42% | 38.18% | | | WikiANN en | 54.57% | 56.47% | 55.51% | | | WikiNeural | 57.60% | 81.57% | 67.52% | | | bc2gm | 67.98% | 33.45% | 44.84% | | | bc4chemd | 45.66% | 52.00% | 48.62% | | | bc5cdr | 72.20% | 58.12% | 64.40% | | | ncbi | 73.12% | 49.74% | 59.20% | | | **Average** | | | **44.17%** | |-----------------------------------------|---------------------|-----------|--------|----------| | EmergentMethods/gliner_news_medium-v2.1 | ACE 2004 | 39.21% | 17.24% | 23.95% | | | ACE 2005 | 39.82% | 16.48% | 23.31% | | | AnatEM | 57.67% | 23.57% | 33.46% | | | Broad Tweet Corpus | 69.52% | 65.94% | 67.69% | | | CoNLL 2003 | 68.26% | 58.45% | 62.97% | | | FabNER | 30.74% | 15.51% | 20.62% | | | FindVehicle | 40.33% | 37.37% | 38.79% | | | GENIA_NER | 53.70% | 47.73% | 50.54% | | | HarveyNER | 26.29% | 27.05% | 26.67% | | | MultiNERD | 56.78% | 81.96% | 67.08% | | | Ontonotes | 30.90% | 35.86% | 33.19% | | | PolyglotNER | 35.98% | 60.96% | 45.25% | | | TweetNER7 | 52.37% | 30.50% | 38.55% | | | WikiANN en | 53.81% | 52.29% | 53.04% | | | WikiNeural | 76.84% | 78.92% | 77.86% | | | bc2gm | 62.97% | 44.24% | 51.96% | | | bc4chemd | 44.90% | 65.56% | 53.30% | | | bc5cdr | 73.93% | 67.03% | 70.31% | | | ncbi | 69.53% | 60.82% | 64.88% | | | **Average** | | | **47.55%** | |-----------------------------------------|---------------------|-----------|--------|----------| | gliner-community/gliner_large-v2.5 | ACE 2004 | 31.64% | 22.81% | 26.51% | | | ACE 2005 | 32.10% | 22.56% | 26.49% | | | AnatEM | 53.64% | 27.82% | 36.64% | | | Broad Tweet Corpus | 61.93% | 76.85% | 68.59% | | | CoNLL 2003 | 62.83% | 67.71% | 65.18% | | | FabNER | 24.54% | 27.03% 
| 25.73% | | | FindVehicle | 40.71% | 56.24% | 47.23% | | | GENIA_NER | 43.56% | 52.56% | 47.64% | | | HarveyNER | 14.85% | 27.05% | 19.17% | | | MultiNERD | 38.04% | 89.17% | 53.33% | | | Ontonotes | 17.28% | 40.16% | 24.16% | | | PolyglotNER | 32.88% | 63.31% | 43.28% | | | TweetNER7 | 38.03% | 41.43% | 39.66% | | | WikiANN en | 57.80% | 60.54% | 59.14% | | | WikiNeural | 67.72% | 83.94% | 74.96% | | | bc2gm | 54.74% | 48.54% | 51.45% | | | bc4chemd | 40.20% | 58.66% | 47.71% | | | bc5cdr | 66.27% | 71.95% | 69.00% | | | ncbi | 68.09% | 61.55% | 64.65% | | | **Average** | | | **46.87%** | |-----------------------------------------|---------------------|-----------|--------|----------| | urchade/gliner_large-v2.1 | ACE 2004 | 37.52% | 25.38% | 30.28% | | | ACE 2005 | 39.02% | 29.00% | 33.27% | | | AnatEM | 52.86% | 13.64% | 21.68% | | | Broad Tweet Corpus | 51.44% | 71.73% | 59.91% | | | CoNLL 2003 | 54.86% | 64.98% | 59.49% | | | FabNER | 23.98% | 16.00% | 19.19% | | | FindVehicle | 47.04% | 57.53% | 51.76% | | | GENIA_NER | 58.10% | 49.98% | 53.74% | | | HarveyNER | 16.29% | 21.93% | 18.69% | | | MultiNERD | 34.09% | 85.43% | 48.74% | | | Ontonotes | 14.02% | 32.01% | 19.50% | | | PolyglotNER | 28.53% | 64.92% | 39.64% | | | TweetNER7 | 38.00% | 34.34% | 36.08% | | | WikiANN en | 51.69% | 59.92% | 55.50% | | | WikiNeural | 50.94% | 82.08% | 62.87% | | | bc2gm | 64.48% | 32.47% | 43.19% | | | bc4chemd | 48.66% | 57.52% | 52.72% | | | bc5cdr | 72.19% | 64.27% | 68.00% | | | ncbi | 69.54% | 52.25% | 59.67% | | | **Average** | | | **43.89%** | |-----------------------------------------|---------------------|-----------|--------|----------| | EmergentMethods/gliner_news_large-v2.1 | ACE 2004 | 43.19% | 18.39% | 25.80% | | | ACE 2005 | 45.24% | 21.20% | 28.87% | | | AnatEM | 61.51% | 21.66% | 32.04% | | | Broad Tweet Corpus | 69.38% | 68.99% | 69.18% | | | CoNLL 2003 | 61.47% | 52.18% | 56.45% | | | FabNER | 27.42% | 19.11% | 22.52% | | | FindVehicle | 46.30% | 62.48% | 53.19% | | | GENIA_NER | 54.13% | 54.02% | 54.07% | | | HarveyNER | 15.91% | 15.78% | 15.84% | | | MultiNERD | 53.73% | 79.07% | 63.98% | | | Ontonotes | 26.78% | 39.77% | 32.01% | | | PolyglotNER | 34.28% | 55.87% | 42.49% | | | TweetNER7 | 48.06% | 28.18% | 35.53% | | | WikiANN en | 53.66% | 51.34% | 52.47% | | | WikiNeural | 69.81% | 70.75% | 70.28% | | | bc2gm | 59.83% | 37.62% | 46.20% | | | bc4chemd | 46.24% | 69.15% | 55.42% | | | bc5cdr | 71.94% | 70.37% | 71.15% | | | ncbi | 70.17% | 61.44% | 65.52% | | | **Average** | | | **47.00%** | |-----------------|---------------------|-----------|--------|----------| | numind/NuNER_Zero-span | ACE 2004 | 37.15% | 20.01% | 26.01% | | | ACE 2005 | 34.93% | 17.87% | 23.64% | | | AnatEM | 62.78% | 20.19% | 30.55% | | | Broad Tweet Corpus | 51.75% | 71.76% | 60.13% | | | CoNLL 2003 | 58.11% | 70.34% | 63.64% | | | FabNER | 35.56% | 18.17% | 24.05% | | | FindVehicle | 51.19% | 38.75% | 44.11% | | | GENIA_NER | 59.98% | 48.49% | 53.63% | | | HarveyNER | 26.57% | 23.36% | 24.86% | | | MultiNERD | 50.47% | 87.06% | 63.90% | | | Ontonotes | 26.65% | 38.68% | 31.56% | | | PolyglotNER | 31.19% | 68.13% | 42.79% | | | TweetNER7 | 47.40% | 34.45% | 39.90% | | | WikiANN en | 55.81% | 60.65% | 58.13% | | | WikiNeural | 61.93% | 86.89% | 72.31% | | | bc2gm | 63.75% | 44.22% | 52.22% | | | bc4chemd | 43.21% | 63.35% | 51.37% | | | bc5cdr | 66.99% | 72.00% | 69.40% | | | ncbi | 70.20% | 53.92% | 60.99% | | | **Average** | | | **47.00%** | 
|-----------------------------------------|---------------------|-----------|--------|----------| | gliner-community/gliner-community-v2.5 | ACE 2004 | | | 29.80% | | | ACE 2005 | | | 31.90% | | | AnatEM | | | 26.40% | | | Broad Tweet Corpus | | | 71.60% | | | CoNLL 2003 | | | 66.70% | | | FabNER | | | 27.40% | | | FindVehicle | | | 49.90% | | | GENIA_NER | | | 57.20% | | | HarveyNER | | | 29.40% | | | MultiNERD | | | 61.40% | | | Ontonotes | | | 28.10% | | | PolyglotNER | | | 43.50% | | | TweetNER7 | | | 42.00% | | | WikiANN en | | | 56.40% | | | WikiNeural | | | 75.70% | | | bc2gm | | | 47.80% | | | bc4chemd | | | 50.00% | | | bc5cdr | | | 71.10% | | | ncbi | | | 65.00% | | | **Average** | | | **49.00%** | |-----------------------------------------|---------------------|-----------|--------|----------| ## Other available models | Release | Model Name | # of Parameters | Language | License | | - | - | - | - | - | | v0 | [urchade/gliner_base](https://huggingface.co/urchade/gliner_base)<br>[urchade/gliner_multi](https://huggingface.co/urchade/gliner_multi) | 209M<br>209M | English<br>Multilingual | cc-by-nc-4.0 | | v1 | [urchade/gliner_small-v1](https://huggingface.co/urchade/gliner_small-v1)<br>[urchade/gliner_medium-v1](https://huggingface.co/urchade/gliner_medium-v1)<br>[urchade/gliner_large-v1](https://huggingface.co/urchade/gliner_large-v1) | 166M<br>209M<br>459M | English <br> English <br> English | cc-by-nc-4.0 | | v2 | [urchade/gliner_small-v2](https://huggingface.co/urchade/gliner_small-v2)<br>[urchade/gliner_medium-v2](https://huggingface.co/urchade/gliner_medium-v2)<br>[urchade/gliner_large-v2](https://huggingface.co/urchade/gliner_large-v2) | 166M<br>209M<br>459M | English <br> English <br> English | apache-2.0 | | v2.1 | [urchade/gliner_small-v2.1](https://huggingface.co/urchade/gliner_small-v2.1)<br>[urchade/gliner_medium-v2.1](https://huggingface.co/urchade/gliner_medium-v2.1)<br>[urchade/gliner_large-v2.1](https://huggingface.co/urchade/gliner_large-v2.1) <br>[urchade/gliner_multi-v2.1](https://huggingface.co/urchade/gliner_multi-v2.1) | 166M<br>209M<br>459M<br>209M | English <br> English <br> English <br> Multilingual | apache-2.0 | ## Model Authors The model authors are: * [Urchade Zaratiana](https://huggingface.co/urchade) * [Ihor Stepanov](https://huggingface.co/Ihor) * Nadi Tomeh * Pierre Holat * Thierry Charnois ## Citation ```bibtex @misc{zaratiana2023gliner, title={GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer}, author={Urchade Zaratiana and Nadi Tomeh and Pierre Holat and Thierry Charnois}, year={2023}, eprint={2311.08526}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
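As a small addendum to the usage example in the card above, precision and recall can also be traded off at inference time. This is a minimal sketch assuming the installed GLiNER version exposes a `threshold` keyword on `predict_entities` (recent releases do, defaulting to 0.5); check your installed version if the call fails.

```python
from gliner import GLiNER

# Assumption: `threshold` is accepted by predict_entities in the installed
# GLiNER version (default 0.5). Lowering it favours recall; raising it
# favours precision.
model = GLiNER.from_pretrained("gliner-community/gliner_xxl-v2.5", load_tokenizer=True)
labels = ["person", "award", "date", "competitions", "teams"]

for threshold in (0.3, 0.5, 0.7):
    entities = model.predict_entities(
        "Ronaldo has won five Ballon d'Or awards.", labels, threshold=threshold
    )
    print(threshold, [(e["text"], e["label"]) for e in entities])
```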
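For reading the benchmark tables above: the F1 column is the harmonic mean of the precision and recall columns. A quick sanity check, with the values taken from the gliner_small-v2.5 / CoNLL 2003 row of the first table:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall, as reported in the tables."""
    return 2 * precision * recall / (precision + recall)

# gliner_small-v2.5 on CoNLL 2003: P = 63.16%, R = 70.43%
print(round(f1_score(63.16, 70.43), 2))  # -> 66.6, matching the table's 66.60%
```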
[ "NAMED_ENTITY_RECOGNITION" ]
[ "ANATEM", "BC5CDR" ]
Non_BioNLP
twadada/nmc-300-w50k-b30k
twadada
null
[ "mteb", "model-index", "region:us" ]
1,726
1,726
0
0
--- tags: - mteb model-index: - name: nomic_classification_307_w50k_b30k results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: None config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 72.04477611940298 - type: ap value: 34.133147681736574 - type: f1 value: 65.73569090089603 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: None config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 64.6129 - type: ap value: 59.73474867106533 - type: f1 value: 64.37745361254407 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: None config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 33.86000000000001 - type: f1 value: 33.4167439582646 - task: type: Retrieval dataset: name: MTEB ArguAna type: None config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 20.555 - type: map_at_10 value: 34.29 - type: map_at_100 value: 35.473 - type: map_at_1000 value: 35.498000000000005 - type: map_at_3 value: 29.753 - type: map_at_5 value: 32.257000000000005 - type: mrr_at_1 value: 21.266 - type: mrr_at_10 value: 34.571000000000005 - type: mrr_at_100 value: 35.732 - type: mrr_at_1000 value: 35.758 - type: mrr_at_3 value: 30.037999999999997 - type: mrr_at_5 value: 32.492 - type: ndcg_at_1 value: 20.555 - type: ndcg_at_10 value: 42.283 - type: ndcg_at_100 value: 47.904 - type: ndcg_at_1000 value: 48.518 - type: ndcg_at_3 value: 32.845 - type: ndcg_at_5 value: 37.372 - type: precision_at_1 value: 20.555 - type: precision_at_10 value: 6.7989999999999995 - type: precision_at_100 value: 0.9400000000000001 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 13.94 - type: precision_at_5 value: 10.569 - type: recall_at_1 value: 20.555 - type: recall_at_10 value: 67.994 - type: recall_at_100 value: 93.95400000000001 - type: recall_at_1000 value: 98.649 - type: recall_at_3 value: 41.821000000000005 - type: recall_at_5 value: 52.845 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: None config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 31.705177661382283 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: None config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 21.93065120477086 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: None config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 52.52224532394495 - type: mrr value: 66.04625599085433 - task: type: STS dataset: name: MTEB BIOSSES type: None config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 78.75516031950443 - type: cos_sim_spearman value: 76.92645161656596 - type: euclidean_pearson value: 78.07410163583403 - type: euclidean_spearman value: 76.92645161656596 - type: manhattan_pearson value: 77.99272531232194 - type: manhattan_spearman value: 76.85596808284 - task: type: Classification dataset: name: MTEB Banking77Classification type: None config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 72.29220779220779 - type: f1 value: 71.45293655648962 - task: type: Clustering 
dataset: name: MTEB BiorxivClusteringP2P type: None config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 29.683083608126136 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: None config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 20.115223677732263 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: None config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 21.508 - type: map_at_10 value: 28.613 - type: map_at_100 value: 29.694 - type: map_at_1000 value: 29.848000000000003 - type: map_at_3 value: 26.444000000000003 - type: map_at_5 value: 27.705000000000002 - type: mrr_at_1 value: 27.039 - type: mrr_at_10 value: 34.28 - type: mrr_at_100 value: 35.156 - type: mrr_at_1000 value: 35.229 - type: mrr_at_3 value: 32.546 - type: mrr_at_5 value: 33.541 - type: ndcg_at_1 value: 27.039 - type: ndcg_at_10 value: 33.157 - type: ndcg_at_100 value: 38.172 - type: ndcg_at_1000 value: 41.407 - type: ndcg_at_3 value: 30.293999999999997 - type: ndcg_at_5 value: 31.540000000000003 - type: precision_at_1 value: 27.039 - type: precision_at_10 value: 6.223 - type: precision_at_100 value: 1.087 - type: precision_at_1000 value: 0.166 - type: precision_at_3 value: 14.688 - type: precision_at_5 value: 10.501000000000001 - type: recall_at_1 value: 21.508 - type: recall_at_10 value: 40.608 - type: recall_at_100 value: 63.131 - type: recall_at_1000 value: 85.292 - type: recall_at_3 value: 31.283 - type: recall_at_5 value: 35.237 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: None config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 16.338 - type: map_at_10 value: 21.65 - type: map_at_100 value: 22.549 - type: map_at_1000 value: 22.675 - type: map_at_3 value: 19.888 - type: map_at_5 value: 20.848 - type: mrr_at_1 value: 20.764 - type: mrr_at_10 value: 25.8 - type: mrr_at_100 value: 26.590000000000003 - type: mrr_at_1000 value: 26.663999999999998 - type: mrr_at_3 value: 24.066000000000003 - type: mrr_at_5 value: 24.907 - type: ndcg_at_1 value: 20.764 - type: ndcg_at_10 value: 25.229000000000003 - type: ndcg_at_100 value: 29.604000000000003 - type: ndcg_at_1000 value: 32.535 - type: ndcg_at_3 value: 22.294 - type: ndcg_at_5 value: 23.52 - type: precision_at_1 value: 20.764 - type: precision_at_10 value: 4.675 - type: precision_at_100 value: 0.859 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 10.616 - type: precision_at_5 value: 7.567 - type: recall_at_1 value: 16.338 - type: recall_at_10 value: 31.825 - type: recall_at_100 value: 51.400999999999996 - type: recall_at_1000 value: 71.50800000000001 - type: recall_at_3 value: 23.372 - type: recall_at_5 value: 26.662000000000003 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: None config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 25.196 - type: map_at_10 value: 33.175 - type: map_at_100 value: 34.239000000000004 - type: map_at_1000 value: 34.339 - type: map_at_3 value: 30.887999999999998 - type: map_at_5 value: 32.090999999999994 - type: mrr_at_1 value: 28.965999999999998 - type: mrr_at_10 value: 36.148 - type: mrr_at_100 value: 37.015 - type: mrr_at_1000 value: 37.078 - type: mrr_at_3 value: 33.992 - type: mrr_at_5 value: 35.117 - type: ndcg_at_1 
value: 28.965999999999998 - type: ndcg_at_10 value: 37.687 - type: ndcg_at_100 value: 42.768 - type: ndcg_at_1000 value: 45.07 - type: ndcg_at_3 value: 33.341 - type: ndcg_at_5 value: 35.237 - type: precision_at_1 value: 28.965999999999998 - type: precision_at_10 value: 6.138 - type: precision_at_100 value: 0.95 - type: precision_at_1000 value: 0.123 - type: precision_at_3 value: 14.796000000000001 - type: precision_at_5 value: 10.194 - type: recall_at_1 value: 25.196 - type: recall_at_10 value: 48.443000000000005 - type: recall_at_100 value: 71.414 - type: recall_at_1000 value: 88.108 - type: recall_at_3 value: 36.647999999999996 - type: recall_at_5 value: 41.384 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: None config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 11.036 - type: map_at_10 value: 15.011 - type: map_at_100 value: 15.751999999999999 - type: map_at_1000 value: 15.864 - type: map_at_3 value: 13.675 - type: map_at_5 value: 14.414 - type: mrr_at_1 value: 12.203 - type: mrr_at_10 value: 16.232 - type: mrr_at_100 value: 16.965 - type: mrr_at_1000 value: 17.069000000000003 - type: mrr_at_3 value: 14.84 - type: mrr_at_5 value: 15.591 - type: ndcg_at_1 value: 12.203 - type: ndcg_at_10 value: 17.59 - type: ndcg_at_100 value: 21.718 - type: ndcg_at_1000 value: 25.108000000000004 - type: ndcg_at_3 value: 14.857999999999999 - type: ndcg_at_5 value: 16.161 - type: precision_at_1 value: 12.203 - type: precision_at_10 value: 2.757 - type: precision_at_100 value: 0.518 - type: precision_at_1000 value: 0.08499999999999999 - type: precision_at_3 value: 6.328 - type: precision_at_5 value: 4.497 - type: recall_at_1 value: 11.036 - type: recall_at_10 value: 24.511 - type: recall_at_100 value: 44.396 - type: recall_at_1000 value: 70.916 - type: recall_at_3 value: 17.1 - type: recall_at_5 value: 20.244999999999997 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: None config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 6.329999999999999 - type: map_at_10 value: 9.338000000000001 - type: map_at_100 value: 10.047 - type: map_at_1000 value: 10.164 - type: map_at_3 value: 8.072 - type: map_at_5 value: 8.863 - type: mrr_at_1 value: 8.333 - type: mrr_at_10 value: 11.55 - type: mrr_at_100 value: 12.298 - type: mrr_at_1000 value: 12.393 - type: mrr_at_3 value: 10.095 - type: mrr_at_5 value: 10.947 - type: ndcg_at_1 value: 8.333 - type: ndcg_at_10 value: 11.676 - type: ndcg_at_100 value: 15.468000000000002 - type: ndcg_at_1000 value: 19.057 - type: ndcg_at_3 value: 9.17 - type: ndcg_at_5 value: 10.484 - type: precision_at_1 value: 8.333 - type: precision_at_10 value: 2.2640000000000002 - type: precision_at_100 value: 0.484 - type: precision_at_1000 value: 0.093 - type: precision_at_3 value: 4.353 - type: precision_at_5 value: 3.458 - type: recall_at_1 value: 6.329999999999999 - type: recall_at_10 value: 16.825000000000003 - type: recall_at_100 value: 33.986 - type: recall_at_1000 value: 60.80799999999999 - type: recall_at_3 value: 9.98 - type: recall_at_5 value: 13.251 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: None config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 17.105999999999998 - type: map_at_10 value: 22.99 - type: map_at_100 value: 24.108999999999998 - type: map_at_1000 value: 24.239 - type: map_at_3 value: 21.096999999999998 - 
type: map_at_5 value: 22.076 - type: mrr_at_1 value: 21.752 - type: mrr_at_10 value: 27.793 - type: mrr_at_100 value: 28.676000000000002 - type: mrr_at_1000 value: 28.754999999999995 - type: mrr_at_3 value: 25.874000000000002 - type: mrr_at_5 value: 26.88 - type: ndcg_at_1 value: 21.752 - type: ndcg_at_10 value: 27.151999999999997 - type: ndcg_at_100 value: 32.492 - type: ndcg_at_1000 value: 35.563 - type: ndcg_at_3 value: 23.839 - type: ndcg_at_5 value: 25.188 - type: precision_at_1 value: 21.752 - type: precision_at_10 value: 4.986 - type: precision_at_100 value: 0.9249999999999999 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 11.197 - type: precision_at_5 value: 7.968999999999999 - type: recall_at_1 value: 17.105999999999998 - type: recall_at_10 value: 35.231 - type: recall_at_100 value: 58.634 - type: recall_at_1000 value: 80.077 - type: recall_at_3 value: 25.534000000000002 - type: recall_at_5 value: 29.175 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: None config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 11.303 - type: map_at_10 value: 16.778000000000002 - type: map_at_100 value: 17.802 - type: map_at_1000 value: 17.944 - type: map_at_3 value: 14.906 - type: map_at_5 value: 16.038 - type: mrr_at_1 value: 14.269000000000002 - type: mrr_at_10 value: 19.967 - type: mrr_at_100 value: 20.921 - type: mrr_at_1000 value: 21.009 - type: mrr_at_3 value: 18.075 - type: mrr_at_5 value: 19.198999999999998 - type: ndcg_at_1 value: 14.269000000000002 - type: ndcg_at_10 value: 20.294999999999998 - type: ndcg_at_100 value: 25.346999999999998 - type: ndcg_at_1000 value: 28.860999999999997 - type: ndcg_at_3 value: 16.858 - type: ndcg_at_5 value: 18.647 - type: precision_at_1 value: 14.269000000000002 - type: precision_at_10 value: 3.904 - type: precision_at_100 value: 0.757 - type: precision_at_1000 value: 0.124 - type: precision_at_3 value: 8.029 - type: precision_at_5 value: 6.164 - type: recall_at_1 value: 11.303 - type: recall_at_10 value: 27.661 - type: recall_at_100 value: 49.976 - type: recall_at_1000 value: 75.01400000000001 - type: recall_at_3 value: 18.719 - type: recall_at_5 value: 23.061999999999998 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 13.457416666666669 - type: map_at_10 value: 18.469333333333335 - type: map_at_100 value: 19.32433333333334 - type: map_at_1000 value: 19.44766666666667 - type: map_at_3 value: 16.88808333333333 - type: map_at_5 value: 17.75958333333333 - type: mrr_at_1 value: 16.38458333333333 - type: mrr_at_10 value: 21.479583333333334 - type: mrr_at_100 value: 22.253666666666664 - type: mrr_at_1000 value: 22.340916666666665 - type: mrr_at_3 value: 19.879083333333334 - type: mrr_at_5 value: 20.749750000000002 - type: ndcg_at_1 value: 16.38458333333333 - type: ndcg_at_10 value: 21.7645 - type: ndcg_at_100 value: 26.100583333333333 - type: ndcg_at_1000 value: 29.30358333333334 - type: ndcg_at_3 value: 18.90866666666667 - type: ndcg_at_5 value: 20.19475 - type: precision_at_1 value: 16.38458333333333 - type: precision_at_10 value: 3.896916666666667 - type: precision_at_100 value: 0.7189999999999999 - type: precision_at_1000 value: 0.11716666666666666 - type: precision_at_3 value: 8.805583333333333 - type: precision_at_5 value: 6.311999999999999 - type: recall_at_1 value: 
13.457416666666669 - type: recall_at_10 value: 28.731250000000003 - type: recall_at_100 value: 48.64233333333334 - type: recall_at_1000 value: 72.04116666666665 - type: recall_at_3 value: 20.665249999999997 - type: recall_at_5 value: 23.995583333333332 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: None config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 10.05 - type: map_at_10 value: 14.036999999999999 - type: map_at_100 value: 14.64 - type: map_at_1000 value: 14.722 - type: map_at_3 value: 12.85 - type: map_at_5 value: 13.517000000000001 - type: mrr_at_1 value: 12.117 - type: mrr_at_10 value: 16.145 - type: mrr_at_100 value: 16.732 - type: mrr_at_1000 value: 16.811 - type: mrr_at_3 value: 14.877 - type: mrr_at_5 value: 15.583 - type: ndcg_at_1 value: 12.117 - type: ndcg_at_10 value: 16.668 - type: ndcg_at_100 value: 19.971 - type: ndcg_at_1000 value: 22.527 - type: ndcg_at_3 value: 14.393 - type: ndcg_at_5 value: 15.436 - type: precision_at_1 value: 12.117 - type: precision_at_10 value: 2.822 - type: precision_at_100 value: 0.49500000000000005 - type: precision_at_1000 value: 0.078 - type: precision_at_3 value: 6.646000000000001 - type: precision_at_5 value: 4.662999999999999 - type: recall_at_1 value: 10.05 - type: recall_at_10 value: 22.899 - type: recall_at_100 value: 38.489000000000004 - type: recall_at_1000 value: 58.275 - type: recall_at_3 value: 16.166 - type: recall_at_5 value: 18.976000000000003 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: None config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 7.185999999999999 - type: map_at_10 value: 10.324 - type: map_at_100 value: 10.921 - type: map_at_1000 value: 11.038 - type: map_at_3 value: 9.35 - type: map_at_5 value: 9.918000000000001 - type: mrr_at_1 value: 8.878 - type: mrr_at_10 value: 12.497 - type: mrr_at_100 value: 13.103000000000002 - type: mrr_at_1000 value: 13.203000000000001 - type: mrr_at_3 value: 11.373 - type: mrr_at_5 value: 12.061 - type: ndcg_at_1 value: 8.878 - type: ndcg_at_10 value: 12.543000000000001 - type: ndcg_at_100 value: 15.943999999999999 - type: ndcg_at_1000 value: 19.407 - type: ndcg_at_3 value: 10.687000000000001 - type: ndcg_at_5 value: 11.619 - type: precision_at_1 value: 8.878 - type: precision_at_10 value: 2.35 - type: precision_at_100 value: 0.488 - type: precision_at_1000 value: 0.094 - type: precision_at_3 value: 5.127000000000001 - type: precision_at_5 value: 3.82 - type: recall_at_1 value: 7.185999999999999 - type: recall_at_10 value: 17.138 - type: recall_at_100 value: 33.194 - type: recall_at_1000 value: 59.14 - type: recall_at_3 value: 12.058 - type: recall_at_5 value: 14.329 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: None config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 11.498 - type: map_at_10 value: 15.725 - type: map_at_100 value: 16.515 - type: map_at_1000 value: 16.628 - type: map_at_3 value: 14.335999999999999 - type: map_at_5 value: 15.027 - type: mrr_at_1 value: 14.086000000000002 - type: mrr_at_10 value: 18.592 - type: mrr_at_100 value: 19.405 - type: mrr_at_1000 value: 19.500999999999998 - type: mrr_at_3 value: 17.086000000000002 - type: mrr_at_5 value: 17.855999999999998 - type: ndcg_at_1 value: 14.086000000000002 - type: ndcg_at_10 value: 18.686 - type: ndcg_at_100 value: 22.925 - type: ndcg_at_1000 value: 
26.279999999999998 - type: ndcg_at_3 value: 15.955 - type: ndcg_at_5 value: 17.05 - type: precision_at_1 value: 14.086000000000002 - type: precision_at_10 value: 3.209 - type: precision_at_100 value: 0.587 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 7.338 - type: precision_at_5 value: 5.131 - type: recall_at_1 value: 11.498 - type: recall_at_10 value: 25.135 - type: recall_at_100 value: 44.751000000000005 - type: recall_at_1000 value: 69.75 - type: recall_at_3 value: 17.471999999999998 - type: recall_at_5 value: 20.273 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: None config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 13.111999999999998 - type: map_at_10 value: 19.224 - type: map_at_100 value: 20.183999999999997 - type: map_at_1000 value: 20.354 - type: map_at_3 value: 17.738 - type: map_at_5 value: 18.518 - type: mrr_at_1 value: 16.008 - type: mrr_at_10 value: 22.421 - type: mrr_at_100 value: 23.174 - type: mrr_at_1000 value: 23.261000000000003 - type: mrr_at_3 value: 20.784 - type: mrr_at_5 value: 21.634 - type: ndcg_at_1 value: 16.008 - type: ndcg_at_10 value: 23.082 - type: ndcg_at_100 value: 27.563 - type: ndcg_at_1000 value: 31.135 - type: ndcg_at_3 value: 20.517 - type: ndcg_at_5 value: 21.595 - type: precision_at_1 value: 16.008 - type: precision_at_10 value: 4.625 - type: precision_at_100 value: 0.966 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 10.079 - type: precision_at_5 value: 7.2330000000000005 - type: recall_at_1 value: 13.111999999999998 - type: recall_at_10 value: 30.297 - type: recall_at_100 value: 51.549 - type: recall_at_1000 value: 76.255 - type: recall_at_3 value: 22.817999999999998 - type: recall_at_5 value: 25.741000000000003 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval type: None config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 10.825999999999999 - type: map_at_10 value: 14.767 - type: map_at_100 value: 15.440000000000001 - type: map_at_1000 value: 15.557000000000002 - type: map_at_3 value: 13.413 - type: map_at_5 value: 14.099999999999998 - type: mrr_at_1 value: 12.2 - type: mrr_at_10 value: 16.33 - type: mrr_at_100 value: 17.009 - type: mrr_at_1000 value: 17.118 - type: mrr_at_3 value: 14.940999999999999 - type: mrr_at_5 value: 15.681000000000001 - type: ndcg_at_1 value: 12.2 - type: ndcg_at_10 value: 17.409 - type: ndcg_at_100 value: 21.235 - type: ndcg_at_1000 value: 24.693 - type: ndcg_at_3 value: 14.698 - type: ndcg_at_5 value: 15.86 - type: precision_at_1 value: 12.2 - type: precision_at_10 value: 2.81 - type: precision_at_100 value: 0.512 - type: precision_at_1000 value: 0.09 - type: precision_at_3 value: 6.47 - type: precision_at_5 value: 4.547 - type: recall_at_1 value: 10.825999999999999 - type: recall_at_10 value: 24.202 - type: recall_at_100 value: 42.787 - type: recall_at_1000 value: 69.351 - type: recall_at_3 value: 16.833000000000002 - type: recall_at_5 value: 19.612 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: None config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 5.436 - type: map_at_10 value: 9.59 - type: map_at_100 value: 10.847 - type: map_at_1000 value: 11.043 - type: map_at_3 value: 7.797 - type: map_at_5 value: 8.627 - type: mrr_at_1 value: 12.182 - type: mrr_at_10 value: 19.584 - type: mrr_at_100 value: 20.772 - type: 
mrr_at_1000 value: 20.855999999999998 - type: mrr_at_3 value: 16.938 - type: mrr_at_5 value: 18.358 - type: ndcg_at_1 value: 12.182 - type: ndcg_at_10 value: 14.514 - type: ndcg_at_100 value: 20.495 - type: ndcg_at_1000 value: 24.664 - type: ndcg_at_3 value: 11.024000000000001 - type: ndcg_at_5 value: 12.200999999999999 - type: precision_at_1 value: 12.182 - type: precision_at_10 value: 4.853 - type: precision_at_100 value: 1.124 - type: precision_at_1000 value: 0.187 - type: precision_at_3 value: 8.317 - type: precision_at_5 value: 6.697 - type: recall_at_1 value: 5.436 - type: recall_at_10 value: 18.615000000000002 - type: recall_at_100 value: 39.621 - type: recall_at_1000 value: 63.852 - type: recall_at_3 value: 10.474 - type: recall_at_5 value: 13.370999999999999 - task: type: Retrieval dataset: name: MTEB DBPedia type: None config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 3.8309999999999995 - type: map_at_10 value: 7.7490000000000006 - type: map_at_100 value: 10.421 - type: map_at_1000 value: 11.161 - type: map_at_3 value: 5.762 - type: map_at_5 value: 6.645 - type: mrr_at_1 value: 35.25 - type: mrr_at_10 value: 43.685 - type: mrr_at_100 value: 44.567 - type: mrr_at_1000 value: 44.618 - type: mrr_at_3 value: 41.375 - type: mrr_at_5 value: 42.85 - type: ndcg_at_1 value: 25.624999999999996 - type: ndcg_at_10 value: 19.837 - type: ndcg_at_100 value: 21.92 - type: ndcg_at_1000 value: 28.116000000000003 - type: ndcg_at_3 value: 22.561 - type: ndcg_at_5 value: 21.073 - type: precision_at_1 value: 35.25 - type: precision_at_10 value: 17.125 - type: precision_at_100 value: 5.35 - type: precision_at_1000 value: 1.142 - type: precision_at_3 value: 25.833000000000002 - type: precision_at_5 value: 22.0 - type: recall_at_1 value: 3.8309999999999995 - type: recall_at_10 value: 11.393 - type: recall_at_100 value: 26.519 - type: recall_at_1000 value: 47.249 - type: recall_at_3 value: 6.708 - type: recall_at_5 value: 8.584 - task: type: Classification dataset: name: MTEB EmotionClassification type: None config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 43.69 - type: f1 value: 39.825457200782445 - task: type: Retrieval dataset: name: MTEB FEVER type: None config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 10.94 - type: map_at_10 value: 16.482 - type: map_at_100 value: 17.321 - type: map_at_1000 value: 17.403 - type: map_at_3 value: 14.679 - type: map_at_5 value: 15.637 - type: mrr_at_1 value: 11.701 - type: mrr_at_10 value: 17.583 - type: mrr_at_100 value: 18.444 - type: mrr_at_1000 value: 18.519 - type: mrr_at_3 value: 15.709000000000001 - type: mrr_at_5 value: 16.707 - type: ndcg_at_1 value: 11.701 - type: ndcg_at_10 value: 19.946 - type: ndcg_at_100 value: 24.462 - type: ndcg_at_1000 value: 26.863 - type: ndcg_at_3 value: 16.155 - type: ndcg_at_5 value: 17.888 - type: precision_at_1 value: 11.701 - type: precision_at_10 value: 3.2399999999999998 - type: precision_at_100 value: 0.5680000000000001 - type: precision_at_1000 value: 0.079 - type: precision_at_3 value: 6.991 - type: precision_at_5 value: 5.101 - type: recall_at_1 value: 10.94 - type: recall_at_10 value: 29.848000000000003 - type: recall_at_100 value: 51.451 - type: recall_at_1000 value: 70.316 - type: recall_at_3 value: 19.39 - type: recall_at_5 value: 23.583000000000002 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: None config: default split: 
test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 5.825 - type: map_at_10 value: 9.669 - type: map_at_100 value: 10.657 - type: map_at_1000 value: 10.850999999999999 - type: map_at_3 value: 8.174 - type: map_at_5 value: 8.931000000000001 - type: mrr_at_1 value: 11.574 - type: mrr_at_10 value: 17.153 - type: mrr_at_100 value: 18.027 - type: mrr_at_1000 value: 18.138 - type: mrr_at_3 value: 15.201 - type: mrr_at_5 value: 16.119 - type: ndcg_at_1 value: 11.574 - type: ndcg_at_10 value: 13.599 - type: ndcg_at_100 value: 18.422 - type: ndcg_at_1000 value: 23.124 - type: ndcg_at_3 value: 11.219999999999999 - type: ndcg_at_5 value: 11.984 - type: precision_at_1 value: 11.574 - type: precision_at_10 value: 4.043 - type: precision_at_100 value: 0.8869999999999999 - type: precision_at_1000 value: 0.168 - type: precision_at_3 value: 7.716000000000001 - type: precision_at_5 value: 5.926 - type: recall_at_1 value: 5.825 - type: recall_at_10 value: 17.837 - type: recall_at_100 value: 36.771 - type: recall_at_1000 value: 66.81 - type: recall_at_3 value: 10.181999999999999 - type: recall_at_5 value: 12.909 - task: type: Retrieval dataset: name: MTEB HotpotQA type: None config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 13.558 - type: map_at_10 value: 18.987000000000002 - type: map_at_100 value: 19.768 - type: map_at_1000 value: 19.878999999999998 - type: map_at_3 value: 17.399 - type: map_at_5 value: 18.231 - type: mrr_at_1 value: 27.117 - type: mrr_at_10 value: 33.32 - type: mrr_at_100 value: 34.050000000000004 - type: mrr_at_1000 value: 34.122 - type: mrr_at_3 value: 31.570999999999998 - type: mrr_at_5 value: 32.519999999999996 - type: ndcg_at_1 value: 27.117 - type: ndcg_at_10 value: 24.708 - type: ndcg_at_100 value: 28.566000000000003 - type: ndcg_at_1000 value: 31.418000000000003 - type: ndcg_at_3 value: 21.549 - type: ndcg_at_5 value: 22.997 - type: precision_at_1 value: 27.117 - type: precision_at_10 value: 5.5169999999999995 - type: precision_at_100 value: 0.8630000000000001 - type: precision_at_1000 value: 0.125 - type: precision_at_3 value: 13.603000000000002 - type: precision_at_5 value: 9.313 - type: recall_at_1 value: 13.558 - type: recall_at_10 value: 27.583000000000002 - type: recall_at_100 value: 43.153000000000006 - type: recall_at_1000 value: 62.255 - type: recall_at_3 value: 20.405 - type: recall_at_5 value: 23.282 - task: type: Classification dataset: name: MTEB ImdbClassification type: None config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 64.06880000000001 - type: ap value: 59.27294551535466 - type: f1 value: 63.91536827569369 - task: type: Retrieval dataset: name: MTEB MSMARCO type: None config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 5.241 - type: map_at_10 value: 9.158 - type: map_at_100 value: 9.907 - type: map_at_1000 value: 10.001 - type: map_at_3 value: 7.7 - type: map_at_5 value: 8.466999999999999 - type: mrr_at_1 value: 5.43 - type: mrr_at_10 value: 9.42 - type: mrr_at_100 value: 10.174999999999999 - type: mrr_at_1000 value: 10.267999999999999 - type: mrr_at_3 value: 7.932 - type: mrr_at_5 value: 8.718 - type: ndcg_at_1 value: 5.415 - type: ndcg_at_10 value: 11.655 - type: ndcg_at_100 value: 15.894 - type: ndcg_at_1000 value: 18.837 - type: ndcg_at_3 value: 8.59 - type: ndcg_at_5 value: 9.982000000000001 - type: precision_at_1 value: 5.415 - type: 
precision_at_10 value: 2.023 - type: precision_at_100 value: 0.42500000000000004 - type: precision_at_1000 value: 0.068 - type: precision_at_3 value: 3.7920000000000003 - type: precision_at_5 value: 2.971 - type: recall_at_1 value: 5.241 - type: recall_at_10 value: 19.429 - type: recall_at_100 value: 40.422999999999995 - type: recall_at_1000 value: 64.191 - type: recall_at_3 value: 10.95 - type: recall_at_5 value: 14.319 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: None config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 89.40264477884178 - type: f1 value: 88.48890314653876 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: None config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 59.86320109439125 - type: f1 value: 40.351741507970175 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: None config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.17552118359112 - type: f1 value: 59.92291942649051 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: None config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 68.89710827168797 - type: f1 value: 67.48141199477321 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: None config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 27.003683715948068 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: None config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 23.655067999182727 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: None config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 28.614207594880682 - type: mrr value: 29.46624248673221 - task: type: Retrieval dataset: name: MTEB NFCorpus type: None config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 3.047 - type: map_at_10 value: 6.144 - type: map_at_100 value: 7.806 - type: map_at_1000 value: 8.998000000000001 - type: map_at_3 value: 4.566 - type: map_at_5 value: 5.321 - type: mrr_at_1 value: 30.65 - type: mrr_at_10 value: 39.213 - type: mrr_at_100 value: 39.973 - type: mrr_at_1000 value: 40.048 - type: mrr_at_3 value: 36.687 - type: mrr_at_5 value: 38.421 - type: ndcg_at_1 value: 28.638 - type: ndcg_at_10 value: 21.032 - type: ndcg_at_100 value: 19.85 - type: ndcg_at_1000 value: 29.416999999999998 - type: ndcg_at_3 value: 24.403 - type: ndcg_at_5 value: 22.956 - type: precision_at_1 value: 30.65 - type: precision_at_10 value: 15.479999999999999 - type: precision_at_100 value: 5.83 - type: precision_at_1000 value: 1.8800000000000001 - type: precision_at_3 value: 23.22 - type: precision_at_5 value: 19.628 - type: recall_at_1 value: 3.047 - type: recall_at_10 value: 9.871 - type: recall_at_100 value: 21.556 - type: recall_at_1000 value: 56.15 - type: recall_at_3 value: 5.476 - type: recall_at_5 value: 7.359 - task: type: Retrieval dataset: name: MTEB NQ type: None config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 7.79 - type: map_at_10 value: 13.5 - type: map_at_100 value: 14.541 - type: map_at_1000 
value: 14.643 - type: map_at_3 value: 11.446000000000002 - type: map_at_5 value: 12.437 - type: mrr_at_1 value: 8.863999999999999 - type: mrr_at_10 value: 14.985000000000001 - type: mrr_at_100 value: 15.989999999999998 - type: mrr_at_1000 value: 16.073 - type: mrr_at_3 value: 12.799 - type: mrr_at_5 value: 13.902999999999999 - type: ndcg_at_1 value: 8.863999999999999 - type: ndcg_at_10 value: 17.335 - type: ndcg_at_100 value: 22.884 - type: ndcg_at_1000 value: 25.747999999999998 - type: ndcg_at_3 value: 12.97 - type: ndcg_at_5 value: 14.799000000000001 - type: precision_at_1 value: 8.863999999999999 - type: precision_at_10 value: 3.2620000000000005 - type: precision_at_100 value: 0.6459999999999999 - type: precision_at_1000 value: 0.092 - type: precision_at_3 value: 6.161 - type: precision_at_5 value: 4.733 - type: recall_at_1 value: 7.79 - type: recall_at_10 value: 27.868 - type: recall_at_100 value: 54.096999999999994 - type: recall_at_1000 value: 76.27199999999999 - type: recall_at_3 value: 16.063 - type: recall_at_5 value: 20.355 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: None config: default split: test revision: None metrics: - type: map_at_1 value: 60.968 - type: map_at_10 value: 73.346 - type: map_at_100 value: 74.144 - type: map_at_1000 value: 74.181 - type: map_at_3 value: 70.462 - type: map_at_5 value: 72.167 - type: mrr_at_1 value: 70.24000000000001 - type: mrr_at_10 value: 77.794 - type: mrr_at_100 value: 78.079 - type: mrr_at_1000 value: 78.086 - type: mrr_at_3 value: 76.265 - type: mrr_at_5 value: 77.242 - type: ndcg_at_1 value: 70.28999999999999 - type: ndcg_at_10 value: 78.16199999999999 - type: ndcg_at_100 value: 80.515 - type: ndcg_at_1000 value: 80.987 - type: ndcg_at_3 value: 74.49 - type: ndcg_at_5 value: 76.334 - type: precision_at_1 value: 70.28999999999999 - type: precision_at_10 value: 11.827 - type: precision_at_100 value: 1.435 - type: precision_at_1000 value: 0.154 - type: precision_at_3 value: 32.25 - type: precision_at_5 value: 21.342 - type: recall_at_1 value: 60.968 - type: recall_at_10 value: 87.449 - type: recall_at_100 value: 96.557 - type: recall_at_1000 value: 99.328 - type: recall_at_3 value: 76.91799999999999 - type: recall_at_5 value: 82.048 - type: map_at_1 value: 2.635 - type: map_at_10 value: 6.061 - type: map_at_100 value: 7.287000000000001 - type: map_at_1000 value: 7.528 - type: map_at_3 value: 4.424 - type: map_at_5 value: 5.226 - type: mrr_at_1 value: 13.0 - type: mrr_at_10 value: 20.066 - type: mrr_at_100 value: 21.352 - type: mrr_at_1000 value: 21.447 - type: mrr_at_3 value: 17.666999999999998 - type: mrr_at_5 value: 19.002 - type: ndcg_at_1 value: 13.0 - type: ndcg_at_10 value: 10.943 - type: ndcg_at_100 value: 16.822 - type: ndcg_at_1000 value: 21.905 - type: ndcg_at_3 value: 10.291 - type: ndcg_at_5 value: 9.028 - type: precision_at_1 value: 13.0 - type: precision_at_10 value: 5.680000000000001 - type: precision_at_100 value: 1.4200000000000002 - type: precision_at_1000 value: 0.265 - type: precision_at_3 value: 9.5 - type: precision_at_5 value: 7.86 - type: recall_at_1 value: 2.635 - type: recall_at_10 value: 11.501999999999999 - type: recall_at_100 value: 28.854999999999997 - type: recall_at_1000 value: 53.818 - type: recall_at_3 value: 5.798 - type: recall_at_5 value: 7.963000000000001 - type: map_at_1 value: 0.11299999999999999 - type: map_at_10 value: 0.652 - type: map_at_100 value: 3.51 - type: map_at_1000 value: 8.445 - type: map_at_3 value: 0.259 - type: map_at_5 value: 0.392 - type: mrr_at_1 value: 46.0 - 
type: mrr_at_10 value: 58.263 - type: mrr_at_100 value: 58.935 - type: mrr_at_1000 value: 58.935 - type: mrr_at_3 value: 55.00000000000001 - type: mrr_at_5 value: 56.39999999999999 - type: ndcg_at_1 value: 41.0 - type: ndcg_at_10 value: 34.724 - type: ndcg_at_100 value: 27.108999999999998 - type: ndcg_at_1000 value: 24.773999999999997 - type: ndcg_at_3 value: 38.48 - type: ndcg_at_5 value: 37.399 - type: precision_at_1 value: 48.0 - type: precision_at_10 value: 37.2 - type: precision_at_100 value: 28.4 - type: precision_at_1000 value: 12.21 - type: precision_at_3 value: 42.667 - type: precision_at_5 value: 40.8 - type: recall_at_1 value: 0.11299999999999999 - type: recall_at_10 value: 0.8370000000000001 - type: recall_at_100 value: 5.992 - type: recall_at_1000 value: 24.051000000000002 - type: recall_at_3 value: 0.28800000000000003 - type: recall_at_5 value: 0.469 - task: type: Clustering dataset: name: MTEB RedditClustering type: None config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 28.013469586101447 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: None config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 43.29182795065702 - task: type: STS dataset: name: MTEB SICK-R type: None config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 76.50555278582816 - type: cos_sim_spearman value: 67.67214548084267 - type: euclidean_pearson value: 72.80672647042718 - type: euclidean_spearman value: 67.67205019006775 - type: manhattan_pearson value: 71.39917324420631 - type: manhattan_spearman value: 66.3359934752193 - task: type: STS dataset: name: MTEB STS12 type: None config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 73.46335143910176 - type: cos_sim_spearman value: 66.48232205374549 - type: euclidean_pearson value: 69.85631416436473 - type: euclidean_spearman value: 66.48363705410418 - type: manhattan_pearson value: 70.72055367256513 - type: manhattan_spearman value: 67.85751320875836 - task: type: STS dataset: name: MTEB STS13 type: None config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 77.64955371743831 - type: cos_sim_spearman value: 78.9381622922565 - type: euclidean_pearson value: 78.55659975799884 - type: euclidean_spearman value: 78.93820009727344 - type: manhattan_pearson value: 79.0225046950142 - type: manhattan_spearman value: 79.48472901118284 - task: type: STS dataset: name: MTEB STS14 type: None config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 77.25183335845733 - type: cos_sim_spearman value: 74.24842723953067 - type: euclidean_pearson value: 76.4686232324461 - type: euclidean_spearman value: 74.24841754885648 - type: manhattan_pearson value: 76.58863832312952 - type: manhattan_spearman value: 74.65445574230469 - task: type: STS dataset: name: MTEB STS15 type: None config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 79.89380076521006 - type: cos_sim_spearman value: 80.97474081393298 - type: euclidean_pearson value: 80.98258525159558 - type: euclidean_spearman value: 80.97473932039865 - type: manhattan_pearson value: 81.6739145030644 - type: manhattan_spearman value: 81.84024193133868 - task: type: STS dataset: 
name: MTEB STS16 type: None config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 75.47297982552274 - type: cos_sim_spearman value: 76.78688389457616 - type: euclidean_pearson value: 76.42350164312737 - type: euclidean_spearman value: 76.78743419029857 - type: manhattan_pearson value: 77.05352545216272 - type: manhattan_spearman value: 77.51886896774369 - task: type: STS dataset: name: MTEB STS17 (en-en) type: None config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 82.91509212460161 - type: cos_sim_spearman value: 84.0998411802368 - type: euclidean_pearson value: 84.18246980817624 - type: euclidean_spearman value: 84.10071528982036 - type: manhattan_pearson value: 84.4128363010489 - type: manhattan_spearman value: 84.43725453490214 - task: type: STS dataset: name: MTEB STS22 (en) type: None config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 62.86109059526856 - type: cos_sim_spearman value: 60.51897891571115 - type: euclidean_pearson value: 62.803664785666925 - type: euclidean_spearman value: 60.51897891571115 - type: manhattan_pearson value: 63.29647652965783 - type: manhattan_spearman value: 61.57692163615942 - task: type: STS dataset: name: MTEB STSBenchmark type: None config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 77.23785194416263 - type: cos_sim_spearman value: 75.86058891622369 - type: euclidean_pearson value: 77.38393317729829 - type: euclidean_spearman value: 75.86063730582144 - type: manhattan_pearson value: 77.4860432550345 - type: manhattan_spearman value: 76.0756045460265 - task: type: Reranking dataset: name: MTEB SciDocsRR type: None config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 70.65732608186795 - type: mrr value: 90.0984182846928 - task: type: Retrieval dataset: name: MTEB SciFact type: None config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 30.721999999999998 - type: map_at_10 value: 37.522 - type: map_at_100 value: 38.699 - type: map_at_1000 value: 38.769 - type: map_at_3 value: 34.995 - type: map_at_5 value: 36.701 - type: mrr_at_1 value: 32.667 - type: mrr_at_10 value: 39.298 - type: mrr_at_100 value: 40.297 - type: mrr_at_1000 value: 40.354 - type: mrr_at_3 value: 37.167 - type: mrr_at_5 value: 38.533 - type: ndcg_at_1 value: 32.667 - type: ndcg_at_10 value: 41.634 - type: ndcg_at_100 value: 47.49 - type: ndcg_at_1000 value: 49.419000000000004 - type: ndcg_at_3 value: 36.925000000000004 - type: ndcg_at_5 value: 39.739000000000004 - type: precision_at_1 value: 32.667 - type: precision_at_10 value: 5.833 - type: precision_at_100 value: 0.91 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 14.556 - type: precision_at_5 value: 10.333 - type: recall_at_1 value: 30.721999999999998 - type: recall_at_10 value: 52.722 - type: recall_at_100 value: 80.417 - type: recall_at_1000 value: 95.8 - type: recall_at_3 value: 40.306 - type: recall_at_5 value: 47.083000000000006 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: None config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.63564356435643 - type: cos_sim_ap value: 87.3180475245355 - type: cos_sim_f1 value: 80.3125 - type: 
cos_sim_precision value: 83.80434782608695 - type: cos_sim_recall value: 77.10000000000001 - type: dot_accuracy value: 99.63564356435643 - type: dot_ap value: 87.3180475245355 - type: dot_f1 value: 80.3125 - type: dot_precision value: 83.80434782608695 - type: dot_recall value: 77.10000000000001 - type: euclidean_accuracy value: 99.63564356435643 - type: euclidean_ap value: 87.3180475245355 - type: euclidean_f1 value: 80.3125 - type: euclidean_precision value: 83.80434782608695 - type: euclidean_recall value: 77.10000000000001 - type: manhattan_accuracy value: 99.69504950495049 - type: manhattan_ap value: 90.59998550104231 - type: manhattan_f1 value: 83.91462779802187 - type: manhattan_precision value: 87.51357220412595 - type: manhattan_recall value: 80.60000000000001 - type: max_accuracy value: 99.69504950495049 - type: max_ap value: 90.59998550104231 - type: max_f1 value: 83.91462779802187 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: None config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 32.31610507058257 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: None config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 29.125242344875502 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: None config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 43.09273403724595 - type: mrr value: 43.54354999575587 - task: type: Summarization dataset: name: MTEB SummEval type: None config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.14375732010076 - type: cos_sim_spearman value: 29.565975101655656 - type: dot_pearson value: 30.14375735419279 - type: dot_spearman value: 29.497714274949065 - task: type: Retrieval dataset: name: MTEB Touche2020 type: None config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 2.173 - type: map_at_10 value: 8.079 - type: map_at_100 value: 13.44 - type: map_at_1000 value: 14.985000000000001 - type: map_at_3 value: 4.756 - type: map_at_5 value: 5.995 - type: mrr_at_1 value: 30.612000000000002 - type: mrr_at_10 value: 43.341 - type: mrr_at_100 value: 44.192 - type: mrr_at_1000 value: 44.192 - type: mrr_at_3 value: 40.136 - type: mrr_at_5 value: 41.463 - type: ndcg_at_1 value: 27.551 - type: ndcg_at_10 value: 20.093 - type: ndcg_at_100 value: 33.133 - type: ndcg_at_1000 value: 44.344 - type: ndcg_at_3 value: 25.82 - type: ndcg_at_5 value: 22.216 - type: precision_at_1 value: 30.612000000000002 - type: precision_at_10 value: 17.755000000000003 - type: precision_at_100 value: 7.469 - type: precision_at_1000 value: 1.465 - type: precision_at_3 value: 27.891 - type: precision_at_5 value: 22.448999999999998 - type: recall_at_1 value: 2.173 - type: recall_at_10 value: 12.662999999999998 - type: recall_at_100 value: 44.589 - type: recall_at_1000 value: 78.997 - type: recall_at_3 value: 5.955 - type: recall_at_5 value: 7.89 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: None config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 69.899 - type: ap value: 13.50183607213333 - type: f1 value: 53.589735425592465 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: None config: 
default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 53.74080362195812 - type: f1 value: 53.940488490088434 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: None config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 33.38930659623301 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: None config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 83.71580139476664 - type: cos_sim_ap value: 65.75325189939345 - type: cos_sim_f1 value: 62.39284383153186 - type: cos_sim_precision value: 58.95750176097676 - type: cos_sim_recall value: 66.25329815303431 - type: dot_accuracy value: 83.71580139476664 - type: dot_ap value: 65.75325189939345 - type: dot_f1 value: 62.39284383153186 - type: dot_precision value: 58.95750176097676 - type: dot_recall value: 66.25329815303431 - type: euclidean_accuracy value: 83.71580139476664 - type: euclidean_ap value: 65.75325189939345 - type: euclidean_f1 value: 62.39284383153186 - type: euclidean_precision value: 58.95750176097676 - type: euclidean_recall value: 66.25329815303431 - type: manhattan_accuracy value: 83.4714192048638 - type: manhattan_ap value: 64.42014741278865 - type: manhattan_f1 value: 60.886814469078175 - type: manhattan_precision value: 54.58158995815899 - type: manhattan_recall value: 68.83905013192611 - type: max_accuracy value: 83.71580139476664 - type: max_ap value: 65.75325189939345 - type: max_f1 value: 62.39284383153186 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: None config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 87.46846741956766 - type: cos_sim_ap value: 82.74561195728911 - type: cos_sim_f1 value: 74.91745501762196 - type: cos_sim_precision value: 72.29183074389633 - type: cos_sim_recall value: 77.74099168463196 - type: dot_accuracy value: 87.46846741956766 - type: dot_ap value: 82.74561216064899 - type: dot_f1 value: 74.91745501762196 - type: dot_precision value: 72.29183074389633 - type: dot_recall value: 77.74099168463196 - type: euclidean_accuracy value: 87.46846741956766 - type: euclidean_ap value: 82.74561227188258 - type: euclidean_f1 value: 74.91745501762196 - type: euclidean_precision value: 72.29183074389633 - type: euclidean_recall value: 77.74099168463196 - type: manhattan_accuracy value: 87.4607055536151 - type: manhattan_ap value: 82.82100726504085 - type: manhattan_f1 value: 74.95324413753418 - type: manhattan_precision value: 70.329373650108 - type: manhattan_recall value: 80.22790267939637 - type: max_accuracy value: 87.46846741956766 - type: max_ap value: 82.82100726504085 - type: max_f1 value: 74.95324413753418 ---
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
PlanTL-GOB-ES/bsc-bio-ehr-es
PlanTL-GOB-ES
fill-mask
[ "transformers", "pytorch", "roberta", "fill-mask", "biomedical", "clinical", "ehr", "spanish", "es", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,649
1,668
665
12
---
language:
- es
license: apache-2.0
metrics:
- ppl
tags:
- biomedical
- clinical
- ehr
- spanish
widget:
- text: El único antecedente personal a reseñar era la <mask> arterial.
- text: Las radiologías óseas de cuerpo entero no detectan alteraciones <mask>, ni alteraciones vertebrales.
- text: En el <mask> toraco-abdómino-pélvico no se encontraron hallazgos patológicos de interés.
---

# Biomedical-clinical language model for Spanish

## Table of contents

<details>
<summary>Click to expand</summary>

- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Contact information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing information](#licensing-information)
  - [Funding](#funding)
  - [Citing information](#citing-information)
  - [Disclaimer](#disclaimer)

</details>

## Model description

Biomedical pretrained language model for Spanish. For more details about the corpus, the pretraining and the evaluation, check the official [repository](https://github.com/PlanTL-GOB-ES/lm-biomedical-clinical-es).

## Intended uses and limitations

The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification.

## How to use
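A minimal sketch of the fill-mask usage described above, assuming only the standard `transformers` pipeline API; the input sentence is one of the widget examples from the card header.

```python
from transformers import pipeline

# Load the model through the standard fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="PlanTL-GOB-ES/bsc-bio-ehr-es")

# One of the widget examples above; the mask token for this RoBERTa tokenizer
# is <mask>. A clinically plausible completion here would be "hipertensión".
for prediction in fill_mask("El único antecedente personal a reseñar era la <mask> arterial."):
    print(prediction["token_str"], round(prediction["score"], 4))
```

For the downstream tasks mentioned above, the same checkpoint can be loaded with a task-specific head; a token-classification sketch is given in the Evaluation section.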
## Limitations and bias

At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.

## Training

### Tokenization and model pretraining

This model is a [RoBERTa-based](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model trained on a **biomedical-clinical** corpus in Spanish collected from several sources (see next section). The training corpus has been tokenized using a byte version of [Byte-Pair Encoding (BPE)](https://github.com/openai/gpt-2), as used in the original [RoBERTa](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model, with a vocabulary size of 52,000 tokens. The pretraining consists of masked language model training at the subword level, following the approach employed for the RoBERTa base model, with the same hyperparameters as in the original work. The training lasted a total of 48 hours on 16 NVIDIA V100 GPUs with 16GB of GPU memory, using the Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences.

### Training corpora and preprocessing

The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers, and a real-world clinical corpus collected from more than 278K clinical documents and notes. To obtain a high-quality training corpus while retaining the idiosyncrasies of the clinical language, a cleaning pipeline has been applied only to the biomedical corpora, keeping the clinical corpus uncleaned. Essentially, the cleaning operations used are:

- data parsing in different formats
- sentence splitting
- language detection
- filtering of ill-formed sentences
- deduplication of repetitive content
- preservation of the original document boundaries
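As a purely illustrative companion to the list above (the actual pipeline lives in the official repository), a minimal sketch of such a cleaning pass might look as follows; the `langdetect` package and every heuristic here are assumptions, not the project's actual tooling.

```python
import re

from langdetect import detect  # assumed language detector; illustrative choice only


def is_well_formed(sentence: str) -> bool:
    # Hypothetical heuristic for the "ill-formed" filter: mostly-alphabetic, non-trivial length.
    letters = sum(ch.isalpha() for ch in sentence)
    return len(sentence) > 10 and letters / len(sentence) > 0.5


def clean_corpus(documents: list[str]) -> list[list[str]]:
    """Apply the operations listed above while preserving document boundaries."""
    seen: set[str] = set()
    cleaned: list[list[str]] = []
    for doc in documents:  # parsing into plain text is assumed to happen upstream
        sentences = []
        for sent in re.split(r"(?<=[.!?])\s+", doc):  # naive sentence splitting
            sent = sent.strip()
            if not sent or not is_well_formed(sent):  # filter ill-formed sentences
                continue
            try:
                if detect(sent) != "es":              # keep Spanish only
                    continue
            except Exception:                         # detector fails on odd inputs
                continue
            if sent in seen:                          # deduplicate repetitive content
                continue
            seen.add(sent)
            sentences.append(sent)
        cleaned.append(sentences)  # one entry per document: boundaries preserved
    return cleaned
```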
Then, the biomedical corpora are concatenated and further global deduplication among the biomedical corpora has been applied. Eventually, the clinical corpus is concatenated to the cleaned biomedical corpus, resulting in a medium-size biomedical-clinical corpus for Spanish composed of more than 1B tokens. The table below shows some basic statistics of the individual cleaned corpora:

| Name | No. tokens | Description |
|------|------------|-------------|
| [Medical crawler](https://zenodo.org/record/4561970) | 903,558,13 | Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains. |
| Clinical cases misc. | 102,855,267 | A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases, and it is different from a clinical note or document. |
| EHR documents | 95,267,20 | Collection of more than 278K clinical documents, including discharge reports, clinical course notes and X-ray reports, for a total of 91M tokens. |
| [Scielo](https://zenodo.org/record/2541681#.YlP1DshBwio) | 60,007,289 | Publications written in Spanish crawled from the Spanish SciELO server in 2017. |
| [BARR2_background](https://temu.bsc.es/BARR2/downloads/background_set.raw_text.tar.bz2) | 24,516,442 | Biomedical Abbreviation Recognition and Resolution (BARR2) set containing Spanish clinical case study sections from a variety of clinical disciplines. |
| Wikipedia_life_sciences | 13,890,501 | Wikipedia articles crawled on 04/01/2021 with the [Wikipedia API python library](https://pypi.org/project/Wikipedia-API/), starting from the "Ciencias\_de\_la\_vida" category and descending up to a maximum of 5 subcategories. Multiple links to the same article are then discarded to avoid repeated content. |
| Patents | 13,463,387 | Google patents in the medical domain for Spain (Spanish). The accepted medical-domain codes for the patents' JSON files are: "A61B", "A61C", "A61F", "A61H", "A61K", "A61L", "A61M", "A61B", "A61P". |
| [EMEA](http://opus.nlpl.eu/download.php?f=EMEA/v3/moses/en-es.txt.zip) | 5,377,448 | Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency. |
| [mespen_Medline](https://zenodo.org/record/3562536#.YTt1fH2xXbR) | 4,166,077 | Spanish-side articles extracted from a collection of Spanish-English parallel corpora of biomedical scientific literature. The collection of parallel resources is aggregated from the MedlinePlus source. |
| PubMed | 1,858,966 | Open-access articles from the PubMed repository crawled in 2017. |

## Evaluation

The model has been fine-tuned on three Named Entity Recognition (NER) tasks using three clinical NER datasets:

- [PharmaCoNER](https://zenodo.org/record/4270158): a track on chemical and drug mention recognition from Spanish medical texts (for more info see: https://temu.bsc.es/pharmaconer/).
- [CANTEMIST](https://zenodo.org/record/3978041#.YTt5qH2xXbQ): a shared task specifically focusing on named entity recognition of tumor morphology in Spanish (for more info see: https://zenodo.org/record/3978041#.YTt5qH2xXbQ).
- ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables.

We addressed the NER task as a token classification problem, using a standard linear layer along with the BIO tagging schema. We compared our models with the general-domain Spanish [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne), the general-domain multilingual model that supports Spanish [mBERT](https://huggingface.co/bert-base-multilingual-cased), the domain-specific English model [BioBERT](https://huggingface.co/dmis-lab/biobert-base-cased-v1.2), and three domain-specific models based on continual pre-training: [mBERT-Galén](https://ieeexplore.ieee.org/document/9430499), [XLM-R-Galén](https://ieeexplore.ieee.org/document/9430499) and [BETO-Galén](https://ieeexplore.ieee.org/document/9430499).

The table below shows the F1 scores obtained:

| Tasks/Models | bsc-bio-ehr-es | XLM-R-Galén | BETO-Galén | mBERT-Galén | mBERT | BioBERT | roberta-base-bne |
|--------------|----------------|-------------|------------|-------------|-------|---------|------------------|
| PharmaCoNER | **0.8913** | 0.8754 | 0.8537 | 0.8594 | 0.8671 | 0.8545 | 0.8474 |
| CANTEMIST | **0.8340** | 0.8078 | 0.8153 | 0.8168 | 0.8116 | 0.8070 | 0.7875 |
| ICTUSnet | **0.8756** | 0.8716 | 0.8498 | 0.8509 | 0.8631 | 0.8521 | 0.8677 |

The fine-tuning scripts can be found in the official GitHub [repository](https://github.com/PlanTL-GOB-ES/lm-biomedical-clinical-es).
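A minimal, hedged sketch of the token-classification setup described above (encoder plus a linear layer over BIO tags), using the standard `transformers` classes; the label set below is illustrative, not the one used for the actual datasets, and the classification head is randomly initialised until fine-tuned.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Illustrative BIO label set; the real datasets define their own entity types.
labels = ["O", "B-ENT", "I-ENT"]

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/bsc-bio-ehr-es")
model = AutoModelForTokenClassification.from_pretrained(
    "PlanTL-GOB-ES/bsc-bio-ehr-es",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# AutoModelForTokenClassification adds the standard linear layer on top of the
# encoder; it must be fine-tuned (e.g. with the Trainer API) on a BIO-tagged
# NER dataset before its predictions are meaningful.
inputs = tokenizer("Paciente con hipertensión arterial.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, sequence_length, len(labels))
```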
## Additional information

### Author

Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])

### Contact information

For further information, send an email to <[email protected]>

### Copyright

Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)

### Licensing information

[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

### Funding

This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.

### Citing information

If you use these models, please cite our work:

```bibtex
@inproceedings{carrino-etal-2022-pretrained,
    title = "Pretrained Biomedical Language Models for Clinical {NLP} in {S}panish",
    author = "Carrino, Casimiro Pio and Llop, Joan and P{\`a}mies, Marc and Guti{\'e}rrez-Fandi{\~n}o, Asier and Armengol-Estap{\'e}, Jordi and Silveira-Ocampo, Joaqu{\'\i}n and Valencia, Alfonso and Gonzalez-Agirre, Aitor and Villegas, Marta",
    booktitle = "Proceedings of the 21st Workshop on Biomedical Language Processing",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.bionlp-1.19",
    doi = "10.18653/v1/2022.bionlp-1.19",
    pages = "193--199",
    abstract = "This work presents the first large-scale biomedical Spanish language models trained from scratch, using large biomedical corpora consisting of a total of 1.1B tokens and an EHR corpus of 95M tokens. We compared them against general-domain and other domain-specific models for Spanish on three clinical NER tasks. As main results, our models are superior across the NER tasks, rendering them more convenient for clinical NLP applications. Furthermore, our findings indicate that when enough data is available, pre-training from scratch is better than continual pre-training when tested on clinical tasks, raising an exciting research question about which approach is optimal. Our models and fine-tuning scripts are publicly available at HuggingFace and GitHub.",
}
```

### Disclaimer

<details>
<summary>Click to expand</summary>

The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.

When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.

In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.

Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.

Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.

En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.

</details>
[ "NAMED_ENTITY_RECOGNITION", "TEXT_CLASSIFICATION" ]
[ "CANTEMIST", "PHARMACONER", "SCIELO" ]
BioNLP
barisaydin/bge-small-en
barisaydin
feature-extraction
[ "transformers", "pytorch", "safetensors", "bert", "feature-extraction", "mteb", "sentence transformers", "en", "arxiv:2309.07597", "license:mit", "model-index", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,695
1,695
9
0
--- language: - en license: mit tags: - mteb - sentence transformers model-index: - name: bge-small-en results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 74.34328358208955 - type: ap value: 37.59947775195661 - type: f1 value: 68.548415491933 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 93.04527499999999 - type: ap value: 89.60696356772135 - type: f1 value: 93.03361469382438 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 46.08 - type: f1 value: 45.66249835363254 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 35.205999999999996 - type: map_at_10 value: 50.782000000000004 - type: map_at_100 value: 51.547 - type: map_at_1000 value: 51.554 - type: map_at_3 value: 46.515 - type: map_at_5 value: 49.296 - type: mrr_at_1 value: 35.632999999999996 - type: mrr_at_10 value: 50.958999999999996 - type: mrr_at_100 value: 51.724000000000004 - type: mrr_at_1000 value: 51.731 - type: mrr_at_3 value: 46.669 - type: mrr_at_5 value: 49.439 - type: ndcg_at_1 value: 35.205999999999996 - type: ndcg_at_10 value: 58.835 - type: ndcg_at_100 value: 62.095 - type: ndcg_at_1000 value: 62.255 - type: ndcg_at_3 value: 50.255 - type: ndcg_at_5 value: 55.296 - type: precision_at_1 value: 35.205999999999996 - type: precision_at_10 value: 8.421 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 20.365 - type: precision_at_5 value: 14.680000000000001 - type: recall_at_1 value: 35.205999999999996 - type: recall_at_10 value: 84.211 - type: recall_at_100 value: 98.43499999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 61.095 - type: recall_at_5 value: 73.4 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 47.52644476278646 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 39.973045724188964 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 62.28285314871488 - type: mrr value: 74.52743701358659 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 80.09041909160327 - type: cos_sim_spearman value: 79.96266537706944 - type: euclidean_pearson value: 79.50774978162241 - type: euclidean_spearman value: 79.9144715078551 - type: manhattan_pearson value: 79.2062139879302 - type: manhattan_spearman value: 79.35000081468212 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: 
test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 85.31493506493506 - type: f1 value: 85.2704557977762 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 39.6837242810816 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 35.38881249555897 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 27.884999999999998 - type: map_at_10 value: 39.574 - type: map_at_100 value: 40.993 - type: map_at_1000 value: 41.129 - type: map_at_3 value: 36.089 - type: map_at_5 value: 38.191 - type: mrr_at_1 value: 34.477999999999994 - type: mrr_at_10 value: 45.411 - type: mrr_at_100 value: 46.089999999999996 - type: mrr_at_1000 value: 46.147 - type: mrr_at_3 value: 42.346000000000004 - type: mrr_at_5 value: 44.292 - type: ndcg_at_1 value: 34.477999999999994 - type: ndcg_at_10 value: 46.123999999999995 - type: ndcg_at_100 value: 51.349999999999994 - type: ndcg_at_1000 value: 53.578 - type: ndcg_at_3 value: 40.824 - type: ndcg_at_5 value: 43.571 - type: precision_at_1 value: 34.477999999999994 - type: precision_at_10 value: 8.841000000000001 - type: precision_at_100 value: 1.4460000000000002 - type: precision_at_1000 value: 0.192 - type: precision_at_3 value: 19.742 - type: precision_at_5 value: 14.421000000000001 - type: recall_at_1 value: 27.884999999999998 - type: recall_at_10 value: 59.087 - type: recall_at_100 value: 80.609 - type: recall_at_1000 value: 95.054 - type: recall_at_3 value: 44.082 - type: recall_at_5 value: 51.593999999999994 - type: map_at_1 value: 30.639 - type: map_at_10 value: 40.047 - type: map_at_100 value: 41.302 - type: map_at_1000 value: 41.425 - type: map_at_3 value: 37.406 - type: map_at_5 value: 38.934000000000005 - type: mrr_at_1 value: 37.707 - type: mrr_at_10 value: 46.082 - type: mrr_at_100 value: 46.745 - type: mrr_at_1000 value: 46.786 - type: mrr_at_3 value: 43.980999999999995 - type: mrr_at_5 value: 45.287 - type: ndcg_at_1 value: 37.707 - type: ndcg_at_10 value: 45.525 - type: ndcg_at_100 value: 49.976 - type: ndcg_at_1000 value: 51.94499999999999 - type: ndcg_at_3 value: 41.704 - type: ndcg_at_5 value: 43.596000000000004 - type: precision_at_1 value: 37.707 - type: precision_at_10 value: 8.465 - type: precision_at_100 value: 1.375 - type: precision_at_1000 value: 0.183 - type: precision_at_3 value: 19.979 - type: precision_at_5 value: 14.115 - type: recall_at_1 value: 30.639 - type: recall_at_10 value: 54.775 - type: recall_at_100 value: 73.678 - type: recall_at_1000 value: 86.142 - type: recall_at_3 value: 43.230000000000004 - type: recall_at_5 value: 48.622 - type: map_at_1 value: 38.038 - type: map_at_10 value: 49.922 - type: map_at_100 value: 51.032 - type: map_at_1000 value: 51.085 - type: map_at_3 value: 46.664 - type: map_at_5 value: 48.588 - type: mrr_at_1 value: 43.95 - type: mrr_at_10 value: 53.566 - type: mrr_at_100 value: 54.318999999999996 - type: mrr_at_1000 value: 54.348 - type: mrr_at_3 value: 51.066 - type: mrr_at_5 value: 52.649 - type: ndcg_at_1 value: 43.95 - type: ndcg_at_10 value: 55.676 - type: ndcg_at_100 value: 60.126000000000005 - type: ndcg_at_1000 value: 61.208 - type: 
ndcg_at_3 value: 50.20400000000001 - type: ndcg_at_5 value: 53.038 - type: precision_at_1 value: 43.95 - type: precision_at_10 value: 8.953 - type: precision_at_100 value: 1.2109999999999999 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 22.256999999999998 - type: precision_at_5 value: 15.524 - type: recall_at_1 value: 38.038 - type: recall_at_10 value: 69.15 - type: recall_at_100 value: 88.31599999999999 - type: recall_at_1000 value: 95.993 - type: recall_at_3 value: 54.663 - type: recall_at_5 value: 61.373 - type: map_at_1 value: 24.872 - type: map_at_10 value: 32.912 - type: map_at_100 value: 33.972 - type: map_at_1000 value: 34.046 - type: map_at_3 value: 30.361 - type: map_at_5 value: 31.704 - type: mrr_at_1 value: 26.779999999999998 - type: mrr_at_10 value: 34.812 - type: mrr_at_100 value: 35.754999999999995 - type: mrr_at_1000 value: 35.809000000000005 - type: mrr_at_3 value: 32.335 - type: mrr_at_5 value: 33.64 - type: ndcg_at_1 value: 26.779999999999998 - type: ndcg_at_10 value: 37.623 - type: ndcg_at_100 value: 42.924 - type: ndcg_at_1000 value: 44.856 - type: ndcg_at_3 value: 32.574 - type: ndcg_at_5 value: 34.842 - type: precision_at_1 value: 26.779999999999998 - type: precision_at_10 value: 5.729 - type: precision_at_100 value: 0.886 - type: precision_at_1000 value: 0.109 - type: precision_at_3 value: 13.559 - type: precision_at_5 value: 9.469 - type: recall_at_1 value: 24.872 - type: recall_at_10 value: 50.400999999999996 - type: recall_at_100 value: 74.954 - type: recall_at_1000 value: 89.56 - type: recall_at_3 value: 36.726 - type: recall_at_5 value: 42.138999999999996 - type: map_at_1 value: 16.803 - type: map_at_10 value: 24.348 - type: map_at_100 value: 25.56 - type: map_at_1000 value: 25.668000000000003 - type: map_at_3 value: 21.811 - type: map_at_5 value: 23.287 - type: mrr_at_1 value: 20.771 - type: mrr_at_10 value: 28.961 - type: mrr_at_100 value: 29.979 - type: mrr_at_1000 value: 30.046 - type: mrr_at_3 value: 26.555 - type: mrr_at_5 value: 28.060000000000002 - type: ndcg_at_1 value: 20.771 - type: ndcg_at_10 value: 29.335 - type: ndcg_at_100 value: 35.188 - type: ndcg_at_1000 value: 37.812 - type: ndcg_at_3 value: 24.83 - type: ndcg_at_5 value: 27.119 - type: precision_at_1 value: 20.771 - type: precision_at_10 value: 5.4350000000000005 - type: precision_at_100 value: 0.9480000000000001 - type: precision_at_1000 value: 0.13 - type: precision_at_3 value: 11.982 - type: precision_at_5 value: 8.831 - type: recall_at_1 value: 16.803 - type: recall_at_10 value: 40.039 - type: recall_at_100 value: 65.83200000000001 - type: recall_at_1000 value: 84.478 - type: recall_at_3 value: 27.682000000000002 - type: recall_at_5 value: 33.535 - type: map_at_1 value: 28.345 - type: map_at_10 value: 37.757000000000005 - type: map_at_100 value: 39.141 - type: map_at_1000 value: 39.262 - type: map_at_3 value: 35.183 - type: map_at_5 value: 36.592 - type: mrr_at_1 value: 34.649 - type: mrr_at_10 value: 43.586999999999996 - type: mrr_at_100 value: 44.481 - type: mrr_at_1000 value: 44.542 - type: mrr_at_3 value: 41.29 - type: mrr_at_5 value: 42.642 - type: ndcg_at_1 value: 34.649 - type: ndcg_at_10 value: 43.161 - type: ndcg_at_100 value: 48.734 - type: ndcg_at_1000 value: 51.046 - type: ndcg_at_3 value: 39.118 - type: ndcg_at_5 value: 41.022 - type: precision_at_1 value: 34.649 - type: precision_at_10 value: 7.603 - type: precision_at_100 value: 1.209 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 18.319 - type: precision_at_5 value: 12.839 - 
type: recall_at_1 value: 28.345 - type: recall_at_10 value: 53.367 - type: recall_at_100 value: 76.453 - type: recall_at_1000 value: 91.82000000000001 - type: recall_at_3 value: 41.636 - type: recall_at_5 value: 46.760000000000005 - type: map_at_1 value: 22.419 - type: map_at_10 value: 31.716 - type: map_at_100 value: 33.152 - type: map_at_1000 value: 33.267 - type: map_at_3 value: 28.74 - type: map_at_5 value: 30.48 - type: mrr_at_1 value: 28.310999999999996 - type: mrr_at_10 value: 37.039 - type: mrr_at_100 value: 38.09 - type: mrr_at_1000 value: 38.145 - type: mrr_at_3 value: 34.437 - type: mrr_at_5 value: 36.024 - type: ndcg_at_1 value: 28.310999999999996 - type: ndcg_at_10 value: 37.41 - type: ndcg_at_100 value: 43.647999999999996 - type: ndcg_at_1000 value: 46.007 - type: ndcg_at_3 value: 32.509 - type: ndcg_at_5 value: 34.943999999999996 - type: precision_at_1 value: 28.310999999999996 - type: precision_at_10 value: 6.963 - type: precision_at_100 value: 1.1860000000000002 - type: precision_at_1000 value: 0.154 - type: precision_at_3 value: 15.867999999999999 - type: precision_at_5 value: 11.507000000000001 - type: recall_at_1 value: 22.419 - type: recall_at_10 value: 49.28 - type: recall_at_100 value: 75.802 - type: recall_at_1000 value: 92.032 - type: recall_at_3 value: 35.399 - type: recall_at_5 value: 42.027 - type: map_at_1 value: 24.669249999999998 - type: map_at_10 value: 33.332583333333325 - type: map_at_100 value: 34.557833333333335 - type: map_at_1000 value: 34.67141666666666 - type: map_at_3 value: 30.663166666666662 - type: map_at_5 value: 32.14883333333333 - type: mrr_at_1 value: 29.193833333333334 - type: mrr_at_10 value: 37.47625 - type: mrr_at_100 value: 38.3545 - type: mrr_at_1000 value: 38.413166666666676 - type: mrr_at_3 value: 35.06741666666667 - type: mrr_at_5 value: 36.450666666666656 - type: ndcg_at_1 value: 29.193833333333334 - type: ndcg_at_10 value: 38.505416666666676 - type: ndcg_at_100 value: 43.81125 - type: ndcg_at_1000 value: 46.09558333333333 - type: ndcg_at_3 value: 33.90916666666667 - type: ndcg_at_5 value: 36.07666666666666 - type: precision_at_1 value: 29.193833333333334 - type: precision_at_10 value: 6.7251666666666665 - type: precision_at_100 value: 1.1058333333333332 - type: precision_at_1000 value: 0.14833333333333332 - type: precision_at_3 value: 15.554166666666665 - type: precision_at_5 value: 11.079250000000002 - type: recall_at_1 value: 24.669249999999998 - type: recall_at_10 value: 49.75583333333332 - type: recall_at_100 value: 73.06908333333332 - type: recall_at_1000 value: 88.91316666666667 - type: recall_at_3 value: 36.913250000000005 - type: recall_at_5 value: 42.48641666666666 - type: map_at_1 value: 24.044999999999998 - type: map_at_10 value: 30.349999999999998 - type: map_at_100 value: 31.273 - type: map_at_1000 value: 31.362000000000002 - type: map_at_3 value: 28.508 - type: map_at_5 value: 29.369 - type: mrr_at_1 value: 26.994 - type: mrr_at_10 value: 33.12 - type: mrr_at_100 value: 33.904 - type: mrr_at_1000 value: 33.967000000000006 - type: mrr_at_3 value: 31.365 - type: mrr_at_5 value: 32.124 - type: ndcg_at_1 value: 26.994 - type: ndcg_at_10 value: 34.214 - type: ndcg_at_100 value: 38.681 - type: ndcg_at_1000 value: 40.926 - type: ndcg_at_3 value: 30.725 - type: ndcg_at_5 value: 31.967000000000002 - type: precision_at_1 value: 26.994 - type: precision_at_10 value: 5.215 - type: precision_at_100 value: 0.807 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 12.986 - type: precision_at_5 value: 8.712 - type: 
recall_at_1 value: 24.044999999999998 - type: recall_at_10 value: 43.456 - type: recall_at_100 value: 63.675000000000004 - type: recall_at_1000 value: 80.05499999999999 - type: recall_at_3 value: 33.561 - type: recall_at_5 value: 36.767 - type: map_at_1 value: 15.672 - type: map_at_10 value: 22.641 - type: map_at_100 value: 23.75 - type: map_at_1000 value: 23.877000000000002 - type: map_at_3 value: 20.219 - type: map_at_5 value: 21.648 - type: mrr_at_1 value: 18.823 - type: mrr_at_10 value: 26.101999999999997 - type: mrr_at_100 value: 27.038 - type: mrr_at_1000 value: 27.118 - type: mrr_at_3 value: 23.669 - type: mrr_at_5 value: 25.173000000000002 - type: ndcg_at_1 value: 18.823 - type: ndcg_at_10 value: 27.176000000000002 - type: ndcg_at_100 value: 32.42 - type: ndcg_at_1000 value: 35.413 - type: ndcg_at_3 value: 22.756999999999998 - type: ndcg_at_5 value: 25.032 - type: precision_at_1 value: 18.823 - type: precision_at_10 value: 5.034000000000001 - type: precision_at_100 value: 0.895 - type: precision_at_1000 value: 0.132 - type: precision_at_3 value: 10.771 - type: precision_at_5 value: 8.1 - type: recall_at_1 value: 15.672 - type: recall_at_10 value: 37.296 - type: recall_at_100 value: 60.863 - type: recall_at_1000 value: 82.234 - type: recall_at_3 value: 25.330000000000002 - type: recall_at_5 value: 30.964000000000002 - type: map_at_1 value: 24.633 - type: map_at_10 value: 32.858 - type: map_at_100 value: 34.038000000000004 - type: map_at_1000 value: 34.141 - type: map_at_3 value: 30.209000000000003 - type: map_at_5 value: 31.567 - type: mrr_at_1 value: 28.358 - type: mrr_at_10 value: 36.433 - type: mrr_at_100 value: 37.352000000000004 - type: mrr_at_1000 value: 37.41 - type: mrr_at_3 value: 34.033 - type: mrr_at_5 value: 35.246 - type: ndcg_at_1 value: 28.358 - type: ndcg_at_10 value: 37.973 - type: ndcg_at_100 value: 43.411 - type: ndcg_at_1000 value: 45.747 - type: ndcg_at_3 value: 32.934999999999995 - type: ndcg_at_5 value: 35.013 - type: precision_at_1 value: 28.358 - type: precision_at_10 value: 6.418 - type: precision_at_100 value: 1.02 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 14.677000000000001 - type: precision_at_5 value: 10.335999999999999 - type: recall_at_1 value: 24.633 - type: recall_at_10 value: 50.048 - type: recall_at_100 value: 73.821 - type: recall_at_1000 value: 90.046 - type: recall_at_3 value: 36.284 - type: recall_at_5 value: 41.370000000000005 - type: map_at_1 value: 23.133 - type: map_at_10 value: 31.491999999999997 - type: map_at_100 value: 33.062000000000005 - type: map_at_1000 value: 33.256 - type: map_at_3 value: 28.886 - type: map_at_5 value: 30.262 - type: mrr_at_1 value: 28.063 - type: mrr_at_10 value: 36.144 - type: mrr_at_100 value: 37.14 - type: mrr_at_1000 value: 37.191 - type: mrr_at_3 value: 33.762 - type: mrr_at_5 value: 34.997 - type: ndcg_at_1 value: 28.063 - type: ndcg_at_10 value: 36.951 - type: ndcg_at_100 value: 43.287 - type: ndcg_at_1000 value: 45.777 - type: ndcg_at_3 value: 32.786 - type: ndcg_at_5 value: 34.65 - type: precision_at_1 value: 28.063 - type: precision_at_10 value: 7.055 - type: precision_at_100 value: 1.476 - type: precision_at_1000 value: 0.22899999999999998 - type: precision_at_3 value: 15.481 - type: precision_at_5 value: 11.186 - type: recall_at_1 value: 23.133 - type: recall_at_10 value: 47.285 - type: recall_at_100 value: 76.176 - type: recall_at_1000 value: 92.176 - type: recall_at_3 value: 35.223 - type: recall_at_5 value: 40.142 - type: map_at_1 value: 19.547 - type: map_at_10 value: 
26.374 - type: map_at_100 value: 27.419 - type: map_at_1000 value: 27.539 - type: map_at_3 value: 23.882 - type: map_at_5 value: 25.163999999999998 - type: mrr_at_1 value: 21.442 - type: mrr_at_10 value: 28.458 - type: mrr_at_100 value: 29.360999999999997 - type: mrr_at_1000 value: 29.448999999999998 - type: mrr_at_3 value: 25.97 - type: mrr_at_5 value: 27.273999999999997 - type: ndcg_at_1 value: 21.442 - type: ndcg_at_10 value: 30.897000000000002 - type: ndcg_at_100 value: 35.99 - type: ndcg_at_1000 value: 38.832 - type: ndcg_at_3 value: 25.944 - type: ndcg_at_5 value: 28.126 - type: precision_at_1 value: 21.442 - type: precision_at_10 value: 4.9910000000000005 - type: precision_at_100 value: 0.8109999999999999 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 11.029 - type: precision_at_5 value: 7.911 - type: recall_at_1 value: 19.547 - type: recall_at_10 value: 42.886 - type: recall_at_100 value: 66.64999999999999 - type: recall_at_1000 value: 87.368 - type: recall_at_3 value: 29.143 - type: recall_at_5 value: 34.544000000000004 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 15.572 - type: map_at_10 value: 25.312 - type: map_at_100 value: 27.062 - type: map_at_1000 value: 27.253 - type: map_at_3 value: 21.601 - type: map_at_5 value: 23.473 - type: mrr_at_1 value: 34.984 - type: mrr_at_10 value: 46.406 - type: mrr_at_100 value: 47.179 - type: mrr_at_1000 value: 47.21 - type: mrr_at_3 value: 43.485 - type: mrr_at_5 value: 45.322 - type: ndcg_at_1 value: 34.984 - type: ndcg_at_10 value: 34.344 - type: ndcg_at_100 value: 41.015 - type: ndcg_at_1000 value: 44.366 - type: ndcg_at_3 value: 29.119 - type: ndcg_at_5 value: 30.825999999999997 - type: precision_at_1 value: 34.984 - type: precision_at_10 value: 10.358 - type: precision_at_100 value: 1.762 - type: precision_at_1000 value: 0.23900000000000002 - type: precision_at_3 value: 21.368000000000002 - type: precision_at_5 value: 15.948 - type: recall_at_1 value: 15.572 - type: recall_at_10 value: 39.367999999999995 - type: recall_at_100 value: 62.183 - type: recall_at_1000 value: 80.92200000000001 - type: recall_at_3 value: 26.131999999999998 - type: recall_at_5 value: 31.635999999999996 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.848 - type: map_at_10 value: 19.25 - type: map_at_100 value: 27.193 - type: map_at_1000 value: 28.721999999999998 - type: map_at_3 value: 13.968 - type: map_at_5 value: 16.283 - type: mrr_at_1 value: 68.75 - type: mrr_at_10 value: 76.25 - type: mrr_at_100 value: 76.534 - type: mrr_at_1000 value: 76.53999999999999 - type: mrr_at_3 value: 74.667 - type: mrr_at_5 value: 75.86699999999999 - type: ndcg_at_1 value: 56.00000000000001 - type: ndcg_at_10 value: 41.426 - type: ndcg_at_100 value: 45.660000000000004 - type: ndcg_at_1000 value: 53.02 - type: ndcg_at_3 value: 46.581 - type: ndcg_at_5 value: 43.836999999999996 - type: precision_at_1 value: 68.75 - type: precision_at_10 value: 32.800000000000004 - type: precision_at_100 value: 10.440000000000001 - type: precision_at_1000 value: 1.9980000000000002 - type: precision_at_3 value: 49.667 - type: precision_at_5 value: 42.25 - type: recall_at_1 value: 8.848 - type: recall_at_10 value: 24.467 - type: recall_at_100 value: 51.344 - type: recall_at_1000 value: 75.235 - type: recall_at_3 value: 15.329 - type: recall_at_5 value: 
18.892999999999997 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 48.95 - type: f1 value: 43.44563593360779 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 78.036 - type: map_at_10 value: 85.639 - type: map_at_100 value: 85.815 - type: map_at_1000 value: 85.829 - type: map_at_3 value: 84.795 - type: map_at_5 value: 85.336 - type: mrr_at_1 value: 84.353 - type: mrr_at_10 value: 90.582 - type: mrr_at_100 value: 90.617 - type: mrr_at_1000 value: 90.617 - type: mrr_at_3 value: 90.132 - type: mrr_at_5 value: 90.447 - type: ndcg_at_1 value: 84.353 - type: ndcg_at_10 value: 89.003 - type: ndcg_at_100 value: 89.60000000000001 - type: ndcg_at_1000 value: 89.836 - type: ndcg_at_3 value: 87.81400000000001 - type: ndcg_at_5 value: 88.478 - type: precision_at_1 value: 84.353 - type: precision_at_10 value: 10.482 - type: precision_at_100 value: 1.099 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_3 value: 33.257999999999996 - type: precision_at_5 value: 20.465 - type: recall_at_1 value: 78.036 - type: recall_at_10 value: 94.517 - type: recall_at_100 value: 96.828 - type: recall_at_1000 value: 98.261 - type: recall_at_3 value: 91.12 - type: recall_at_5 value: 92.946 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 20.191 - type: map_at_10 value: 32.369 - type: map_at_100 value: 34.123999999999995 - type: map_at_1000 value: 34.317 - type: map_at_3 value: 28.71 - type: map_at_5 value: 30.607 - type: mrr_at_1 value: 40.894999999999996 - type: mrr_at_10 value: 48.842 - type: mrr_at_100 value: 49.599 - type: mrr_at_1000 value: 49.647000000000006 - type: mrr_at_3 value: 46.785 - type: mrr_at_5 value: 47.672 - type: ndcg_at_1 value: 40.894999999999996 - type: ndcg_at_10 value: 39.872 - type: ndcg_at_100 value: 46.126 - type: ndcg_at_1000 value: 49.476 - type: ndcg_at_3 value: 37.153000000000006 - type: ndcg_at_5 value: 37.433 - type: precision_at_1 value: 40.894999999999996 - type: precision_at_10 value: 10.818 - type: precision_at_100 value: 1.73 - type: precision_at_1000 value: 0.231 - type: precision_at_3 value: 25.051000000000002 - type: precision_at_5 value: 17.531 - type: recall_at_1 value: 20.191 - type: recall_at_10 value: 45.768 - type: recall_at_100 value: 68.82000000000001 - type: recall_at_1000 value: 89.133 - type: recall_at_3 value: 33.296 - type: recall_at_5 value: 38.022 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 39.257 - type: map_at_10 value: 61.467000000000006 - type: map_at_100 value: 62.364 - type: map_at_1000 value: 62.424 - type: map_at_3 value: 58.228 - type: map_at_5 value: 60.283 - type: mrr_at_1 value: 78.515 - type: mrr_at_10 value: 84.191 - type: mrr_at_100 value: 84.378 - type: mrr_at_1000 value: 84.385 - type: mrr_at_3 value: 83.284 - type: mrr_at_5 value: 83.856 - type: ndcg_at_1 value: 78.515 - type: ndcg_at_10 value: 69.78999999999999 - type: ndcg_at_100 value: 72.886 - type: ndcg_at_1000 value: 74.015 - type: ndcg_at_3 value: 65.23 - type: ndcg_at_5 value: 67.80199999999999 - type: precision_at_1 value: 78.515 - type: precision_at_10 value: 14.519000000000002 - type: precision_at_100 value: 1.694 - type: precision_at_1000 
value: 0.184 - type: precision_at_3 value: 41.702 - type: precision_at_5 value: 27.046999999999997 - type: recall_at_1 value: 39.257 - type: recall_at_10 value: 72.59299999999999 - type: recall_at_100 value: 84.679 - type: recall_at_1000 value: 92.12 - type: recall_at_3 value: 62.552 - type: recall_at_5 value: 67.616 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 91.5152 - type: ap value: 87.64584669595709 - type: f1 value: 91.50605576428437 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 21.926000000000002 - type: map_at_10 value: 34.049 - type: map_at_100 value: 35.213 - type: map_at_1000 value: 35.265 - type: map_at_3 value: 30.309 - type: map_at_5 value: 32.407000000000004 - type: mrr_at_1 value: 22.55 - type: mrr_at_10 value: 34.657 - type: mrr_at_100 value: 35.760999999999996 - type: mrr_at_1000 value: 35.807 - type: mrr_at_3 value: 30.989 - type: mrr_at_5 value: 33.039 - type: ndcg_at_1 value: 22.55 - type: ndcg_at_10 value: 40.842 - type: ndcg_at_100 value: 46.436 - type: ndcg_at_1000 value: 47.721999999999994 - type: ndcg_at_3 value: 33.209 - type: ndcg_at_5 value: 36.943 - type: precision_at_1 value: 22.55 - type: precision_at_10 value: 6.447 - type: precision_at_100 value: 0.9249999999999999 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 14.136000000000001 - type: precision_at_5 value: 10.381 - type: recall_at_1 value: 21.926000000000002 - type: recall_at_10 value: 61.724999999999994 - type: recall_at_100 value: 87.604 - type: recall_at_1000 value: 97.421 - type: recall_at_3 value: 40.944 - type: recall_at_5 value: 49.915 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.54765161878704 - type: f1 value: 93.3298945415573 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 75.71591427268582 - type: f1 value: 59.32113870474471 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 75.83053127101547 - type: f1 value: 73.60757944876475 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 78.72562205783457 - type: f1 value: 78.63761662505502 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 33.37935633767996 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 31.55270546130387 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map 
value: 30.462692753143834 - type: mrr value: 31.497569753511563 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.646 - type: map_at_10 value: 12.498 - type: map_at_100 value: 15.486 - type: map_at_1000 value: 16.805999999999997 - type: map_at_3 value: 9.325 - type: map_at_5 value: 10.751 - type: mrr_at_1 value: 43.034 - type: mrr_at_10 value: 52.662 - type: mrr_at_100 value: 53.189 - type: mrr_at_1000 value: 53.25 - type: mrr_at_3 value: 50.929 - type: mrr_at_5 value: 51.92 - type: ndcg_at_1 value: 41.796 - type: ndcg_at_10 value: 33.477000000000004 - type: ndcg_at_100 value: 29.996000000000002 - type: ndcg_at_1000 value: 38.864 - type: ndcg_at_3 value: 38.940000000000005 - type: ndcg_at_5 value: 36.689 - type: precision_at_1 value: 43.034 - type: precision_at_10 value: 24.799 - type: precision_at_100 value: 7.432999999999999 - type: precision_at_1000 value: 1.9929999999999999 - type: precision_at_3 value: 36.842000000000006 - type: precision_at_5 value: 32.135999999999996 - type: recall_at_1 value: 5.646 - type: recall_at_10 value: 15.963 - type: recall_at_100 value: 29.492 - type: recall_at_1000 value: 61.711000000000006 - type: recall_at_3 value: 10.585 - type: recall_at_5 value: 12.753999999999998 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 27.602 - type: map_at_10 value: 41.545 - type: map_at_100 value: 42.644999999999996 - type: map_at_1000 value: 42.685 - type: map_at_3 value: 37.261 - type: map_at_5 value: 39.706 - type: mrr_at_1 value: 31.141000000000002 - type: mrr_at_10 value: 44.139 - type: mrr_at_100 value: 44.997 - type: mrr_at_1000 value: 45.025999999999996 - type: mrr_at_3 value: 40.503 - type: mrr_at_5 value: 42.64 - type: ndcg_at_1 value: 31.141000000000002 - type: ndcg_at_10 value: 48.995 - type: ndcg_at_100 value: 53.788000000000004 - type: ndcg_at_1000 value: 54.730000000000004 - type: ndcg_at_3 value: 40.844 - type: ndcg_at_5 value: 44.955 - type: precision_at_1 value: 31.141000000000002 - type: precision_at_10 value: 8.233 - type: precision_at_100 value: 1.093 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 18.579 - type: precision_at_5 value: 13.533999999999999 - type: recall_at_1 value: 27.602 - type: recall_at_10 value: 69.216 - type: recall_at_100 value: 90.252 - type: recall_at_1000 value: 97.27 - type: recall_at_3 value: 47.987 - type: recall_at_5 value: 57.438 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.949 - type: map_at_10 value: 84.89999999999999 - type: map_at_100 value: 85.531 - type: map_at_1000 value: 85.548 - type: map_at_3 value: 82.027 - type: map_at_5 value: 83.853 - type: mrr_at_1 value: 81.69999999999999 - type: mrr_at_10 value: 87.813 - type: mrr_at_100 value: 87.917 - type: mrr_at_1000 value: 87.91799999999999 - type: mrr_at_3 value: 86.938 - type: mrr_at_5 value: 87.53999999999999 - type: ndcg_at_1 value: 81.75 - type: ndcg_at_10 value: 88.55499999999999 - type: ndcg_at_100 value: 89.765 - type: ndcg_at_1000 value: 89.871 - type: ndcg_at_3 value: 85.905 - type: ndcg_at_5 value: 87.41 - type: precision_at_1 value: 81.75 - type: precision_at_10 value: 13.403 - type: precision_at_100 value: 1.528 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.597 - type: precision_at_5 value: 24.69 - type: recall_at_1 
value: 70.949 - type: recall_at_10 value: 95.423 - type: recall_at_100 value: 99.509 - type: recall_at_1000 value: 99.982 - type: recall_at_3 value: 87.717 - type: recall_at_5 value: 92.032 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 51.76962893449579 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 62.32897690686379 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.478 - type: map_at_10 value: 11.994 - type: map_at_100 value: 13.977 - type: map_at_1000 value: 14.295 - type: map_at_3 value: 8.408999999999999 - type: map_at_5 value: 10.024 - type: mrr_at_1 value: 22.1 - type: mrr_at_10 value: 33.526 - type: mrr_at_100 value: 34.577000000000005 - type: mrr_at_1000 value: 34.632000000000005 - type: mrr_at_3 value: 30.217 - type: mrr_at_5 value: 31.962000000000003 - type: ndcg_at_1 value: 22.1 - type: ndcg_at_10 value: 20.191 - type: ndcg_at_100 value: 27.954 - type: ndcg_at_1000 value: 33.491 - type: ndcg_at_3 value: 18.787000000000003 - type: ndcg_at_5 value: 16.378999999999998 - type: precision_at_1 value: 22.1 - type: precision_at_10 value: 10.69 - type: precision_at_100 value: 2.1919999999999997 - type: precision_at_1000 value: 0.35200000000000004 - type: precision_at_3 value: 17.732999999999997 - type: precision_at_5 value: 14.499999999999998 - type: recall_at_1 value: 4.478 - type: recall_at_10 value: 21.657 - type: recall_at_100 value: 44.54 - type: recall_at_1000 value: 71.542 - type: recall_at_3 value: 10.778 - type: recall_at_5 value: 14.687 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.82325259156718 - type: cos_sim_spearman value: 79.2463589100662 - type: euclidean_pearson value: 80.48318380496771 - type: euclidean_spearman value: 79.34451935199979 - type: manhattan_pearson value: 80.39041824178759 - type: manhattan_spearman value: 79.23002892700211 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 85.74130231431258 - type: cos_sim_spearman value: 78.36856568042397 - type: euclidean_pearson value: 82.48301631890303 - type: euclidean_spearman value: 78.28376980722732 - type: manhattan_pearson value: 82.43552075450525 - type: manhattan_spearman value: 78.22702443947126 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 79.96138619461459 - type: cos_sim_spearman value: 81.85436343502379 - type: euclidean_pearson value: 81.82895226665367 - type: euclidean_spearman value: 82.22707349602916 - type: manhattan_pearson value: 81.66303369445873 - type: manhattan_spearman value: 82.05030197179455 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 80.05481244198648 - type: cos_sim_spearman value: 80.85052504637808 - type: euclidean_pearson value: 
80.86728419744497 - type: euclidean_spearman value: 81.033786401512 - type: manhattan_pearson value: 80.90107531061103 - type: manhattan_spearman value: 81.11374116827795 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 84.615220756399 - type: cos_sim_spearman value: 86.46858500002092 - type: euclidean_pearson value: 86.08307800247586 - type: euclidean_spearman value: 86.72691443870013 - type: manhattan_pearson value: 85.96155594487269 - type: manhattan_spearman value: 86.605909505275 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 82.14363913634436 - type: cos_sim_spearman value: 84.48430226487102 - type: euclidean_pearson value: 83.75303424801902 - type: euclidean_spearman value: 84.56762380734538 - type: manhattan_pearson value: 83.6135447165928 - type: manhattan_spearman value: 84.39898212616731 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 85.09909252554525 - type: cos_sim_spearman value: 85.70951402743276 - type: euclidean_pearson value: 87.1991936239908 - type: euclidean_spearman value: 86.07745840612071 - type: manhattan_pearson value: 87.25039137549952 - type: manhattan_spearman value: 85.99938746659761 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 63.529332093413615 - type: cos_sim_spearman value: 65.38177340147439 - type: euclidean_pearson value: 66.35278011412136 - type: euclidean_spearman value: 65.47147267032997 - type: manhattan_pearson value: 66.71804682408693 - type: manhattan_spearman value: 65.67406521423597 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 82.45802942885662 - type: cos_sim_spearman value: 84.8853341842566 - type: euclidean_pearson value: 84.60915021096707 - type: euclidean_spearman value: 85.11181242913666 - type: manhattan_pearson value: 84.38600521210364 - type: manhattan_spearman value: 84.89045417981723 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 85.92793380635129 - type: mrr value: 95.85834191226348 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 55.74400000000001 - type: map_at_10 value: 65.455 - type: map_at_100 value: 66.106 - type: map_at_1000 value: 66.129 - type: map_at_3 value: 62.719 - type: map_at_5 value: 64.441 - type: mrr_at_1 value: 58.667 - type: mrr_at_10 value: 66.776 - type: mrr_at_100 value: 67.363 - type: mrr_at_1000 value: 67.384 - type: mrr_at_3 value: 64.889 - type: mrr_at_5 value: 66.122 - type: ndcg_at_1 value: 58.667 - type: ndcg_at_10 value: 69.904 - type: ndcg_at_100 value: 72.807 - type: ndcg_at_1000 value: 73.423 - type: ndcg_at_3 value: 65.405 - type: ndcg_at_5 value: 67.86999999999999 - type: precision_at_1 value: 58.667 - type: precision_at_10 value: 9.3 - type: 
precision_at_100 value: 1.08 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 25.444 - type: precision_at_5 value: 17 - type: recall_at_1 value: 55.74400000000001 - type: recall_at_10 value: 82.122 - type: recall_at_100 value: 95.167 - type: recall_at_1000 value: 100 - type: recall_at_3 value: 70.14399999999999 - type: recall_at_5 value: 76.417 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.86534653465347 - type: cos_sim_ap value: 96.54142419791388 - type: cos_sim_f1 value: 93.07535641547861 - type: cos_sim_precision value: 94.81327800829875 - type: cos_sim_recall value: 91.4 - type: dot_accuracy value: 99.86435643564356 - type: dot_ap value: 96.53682260449868 - type: dot_f1 value: 92.98515104966718 - type: dot_precision value: 95.27806925498426 - type: dot_recall value: 90.8 - type: euclidean_accuracy value: 99.86336633663366 - type: euclidean_ap value: 96.5228676185697 - type: euclidean_f1 value: 92.9735234215886 - type: euclidean_precision value: 94.70954356846472 - type: euclidean_recall value: 91.3 - type: manhattan_accuracy value: 99.85841584158416 - type: manhattan_ap value: 96.50392760934032 - type: manhattan_f1 value: 92.84642321160581 - type: manhattan_precision value: 92.8928928928929 - type: manhattan_recall value: 92.80000000000001 - type: max_accuracy value: 99.86534653465347 - type: max_ap value: 96.54142419791388 - type: max_f1 value: 93.07535641547861 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 61.08285408766616 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.640675309010604 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 53.20333913710715 - type: mrr value: 54.088813555725324 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.79465221925075 - type: cos_sim_spearman value: 30.530816059163634 - type: dot_pearson value: 31.364837244718043 - type: dot_spearman value: 30.79726823684003 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.22599999999999998 - type: map_at_10 value: 1.735 - type: map_at_100 value: 8.978 - type: map_at_1000 value: 20.851 - type: map_at_3 value: 0.613 - type: map_at_5 value: 0.964 - type: mrr_at_1 value: 88 - type: mrr_at_10 value: 92.867 - type: mrr_at_100 value: 92.867 - type: mrr_at_1000 value: 92.867 - type: mrr_at_3 value: 92.667 - type: mrr_at_5 value: 92.667 - type: ndcg_at_1 value: 82 - type: ndcg_at_10 value: 73.164 - type: ndcg_at_100 value: 51.878 - type: ndcg_at_1000 value: 44.864 - type: ndcg_at_3 value: 79.184 - type: ndcg_at_5 value: 76.39 - type: precision_at_1 value: 88 - type: precision_at_10 value: 76.2 - type: precision_at_100 
value: 52.459999999999994 - type: precision_at_1000 value: 19.692 - type: precision_at_3 value: 82.667 - type: precision_at_5 value: 80 - type: recall_at_1 value: 0.22599999999999998 - type: recall_at_10 value: 1.942 - type: recall_at_100 value: 12.342 - type: recall_at_1000 value: 41.42 - type: recall_at_3 value: 0.637 - type: recall_at_5 value: 1.034 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 3.567 - type: map_at_10 value: 13.116 - type: map_at_100 value: 19.39 - type: map_at_1000 value: 20.988 - type: map_at_3 value: 7.109 - type: map_at_5 value: 9.950000000000001 - type: mrr_at_1 value: 42.857 - type: mrr_at_10 value: 57.404999999999994 - type: mrr_at_100 value: 58.021 - type: mrr_at_1000 value: 58.021 - type: mrr_at_3 value: 54.762 - type: mrr_at_5 value: 56.19 - type: ndcg_at_1 value: 38.775999999999996 - type: ndcg_at_10 value: 30.359 - type: ndcg_at_100 value: 41.284 - type: ndcg_at_1000 value: 52.30200000000001 - type: ndcg_at_3 value: 36.744 - type: ndcg_at_5 value: 34.326 - type: precision_at_1 value: 42.857 - type: precision_at_10 value: 26.122 - type: precision_at_100 value: 8.082 - type: precision_at_1000 value: 1.559 - type: precision_at_3 value: 40.136 - type: precision_at_5 value: 35.510000000000005 - type: recall_at_1 value: 3.567 - type: recall_at_10 value: 19.045 - type: recall_at_100 value: 49.979 - type: recall_at_1000 value: 84.206 - type: recall_at_3 value: 8.52 - type: recall_at_5 value: 13.103000000000002 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 68.8394 - type: ap value: 13.454399712443099 - type: f1 value: 53.04963076364322 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 60.546123372948514 - type: f1 value: 60.86952793277713 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 49.10042955060234 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 85.03308100375514 - type: cos_sim_ap value: 71.08284605869684 - type: cos_sim_f1 value: 65.42539436255494 - type: cos_sim_precision value: 64.14807302231237 - type: cos_sim_recall value: 66.75461741424802 - type: dot_accuracy value: 84.68736961316088 - type: dot_ap value: 69.20524036530992 - type: dot_f1 value: 63.54893953365829 - type: dot_precision value: 63.45698500394633 - type: dot_recall value: 63.641160949868066 - type: euclidean_accuracy value: 85.07480479227513 - type: euclidean_ap value: 71.14592761009864 - type: euclidean_f1 value: 65.43814432989691 - type: euclidean_precision value: 63.95465994962216 - type: euclidean_recall value: 66.99208443271768 - type: manhattan_accuracy value: 85.06288370984085 - type: manhattan_ap value: 71.07289742593868 - type: manhattan_f1 value: 65.37585421412301 - type: manhattan_precision value: 62.816147859922175 - type: manhattan_recall value: 
68.15303430079156 - type: max_accuracy value: 85.07480479227513 - type: max_ap value: 71.14592761009864 - type: max_f1 value: 65.43814432989691
---

**We recommend switching to the newest [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5), which has a more reasonable similarity distribution and the same usage method.**

<h1 align="center">FlagEmbedding</h1>

<h4 align="center">
    <p>
        <a href="#model-list">Model List</a> |
        <a href="#frequently-asked-questions">FAQ</a> |
        <a href="#usage">Usage</a> |
        <a href="#evaluation">Evaluation</a> |
        <a href="#train">Train</a> |
        <a href="#contact">Contact</a> |
        <a href="#citation">Citation</a> |
        <a href="#license">License</a>
    </p>
</h4>

For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, or semantic search. It can also be used in vector databases for LLMs.

************* 🌟**Updates**🌟 *************
- 09/15/2023: Release of [paper](https://arxiv.org/pdf/2309.07597.pdf) and [dataset](https://data.baai.ac.cn/details/BAAI-MTP).
- 09/12/2023: New release:
  - **New reranker models**: release the cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using or fine-tuning them to re-rank the top-k documents returned by embedding models.
  - **Updated embedding models**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance their retrieval ability without instructions.
- 09/07/2023: Update of the [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding instructions during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release of base-scale and small-scale models with the **best performance among models of the same size** 🤗
- 08/02/2023: Release of the `bge-large-*` models (short for BAAI General Embedding), which **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.

## Model List

`bge` is short for `BAAI general embedding`.

| Model | Language | Usage | Description | query instruction for retrieval\* |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient \** | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient \** | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: ranks **1st** on the [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model with ability similar to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: ranks **1st** on the [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model with ability similar to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model with competitive performance | `为这个句子生成表示以用于检索相关文章:` |

\*: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed: just use the original query directly. In all cases, **no instruction** needs to be added to passages.

\**: Different from the embedding models, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by simpler models. For example, use a bge embedding model to retrieve the top 100 relevant documents, and then use a bge reranker to re-rank those 100 documents and keep the final top 3; a minimal sketch of this pipeline is shown below.
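As an illustration of this retrieve-then-rerank pipeline, here is a minimal sketch combining `FlagModel` and `FlagReranker` from the FlagEmbedding package; the corpus, query, and `top_k` value are placeholders, and in practice `top_k` would be on the order of 100:

```python
import numpy as np
from FlagEmbedding import FlagModel, FlagReranker

corpus = ["passage A ...", "passage B ...", "passage C ..."]  # your document collection
query = "what is panda?"

# Stage 1: retrieve top-k candidates with the bi-encoder (fast)
model = FlagModel('BAAI/bge-small-en',
                  query_instruction_for_retrieval="Represent this sentence for searching relevant passages: ")
q_emb = model.encode_queries([query])
p_emb = model.encode(corpus)
scores = (q_emb @ p_emb.T)[0]
top_k = 2  # placeholder; e.g. 100 in practice
candidate_ids = np.argsort(scores)[::-1][:top_k]

# Stage 2: re-rank the candidates with the cross-encoder (accurate but slower)
reranker = FlagReranker('BAAI/bge-reranker-base')
rerank_scores = reranker.compute_score([[query, corpus[i]] for i in candidate_ids])
final = sorted(zip(candidate_ids, rerank_scores), key=lambda x: x[1], reverse=True)
print(final)  # (passage index, reranker score), highest score first
```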
## Frequently asked questions

<details>
  <summary>1. How to fine-tune the bge embedding model?</summary>

  <!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve retrieval performance.
- If you pre-train bge on your own data, the pre-trained model cannot be used to calculate similarity directly; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high enough, it is recommended to use or fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.

</details>

<details>
  <summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>

  <!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**

Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity distribution of the current BGE models lies roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar.

For downstream tasks, such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not the absolute value.** If you need to filter similar sentences based on a similarity threshold, please select an appropriate threshold based on the similarity distribution on your own data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
  <summary>3. When does the query instruction need to be used</summary>

  <!-- ### When does the query instruction need to be used -->
For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions to these short queries. **The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.** In all cases, no instruction needs to be added to the documents/passages.

</details>

## Usage

### Usage for Embedding Model

Here are some examples for using `bge` models with [FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If that doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more installation methods.

```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True)  # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# For s2p (short query to long passage) retrieval tasks, use encode_queries(), which automatically adds the instruction to each query.
# The corpus in a retrieval task can still use encode() or encode_corpus(), since passages do not need an instruction.
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).

By default, FlagModel uses all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs, or set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable; a short example follows.
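For instance, a minimal sketch of restricting encoding to a single GPU (the device index `"0"` is a placeholder; the variable must be set before CUDA is initialized, i.e. before the model is created):

```python
import os

# Must be set before the model is created so that CUDA picks it up.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # use only GPU 0; "" would disable all GPUs

from FlagEmbedding import FlagModel

model = FlagModel('BAAI/bge-small-en', use_fp16=True)
embeddings = model.encode(["hello world"])
print(embeddings.shape)
```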
#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For s2p (short query to long passage) retrieval tasks, each short query should start with an instruction (for the instructions, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)). But the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction + q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain

You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True}  # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model; then, select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.

```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# For s2p (short query to long passage) retrieval tasks, add an instruction to each query (no instruction for passages):
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Different from the embedding models, the reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. You can get a relevance score by feeding a query and a passage to the reranker. The reranker is optimized with a cross-entropy loss, so the relevance score is not bounded to a specific range.
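If you want to map these unbounded scores into a probability-like range, one option is to apply a sigmoid yourself; this is an optional post-processing step of our own, not something the model does. A minimal sketch:

```python
import math

def sigmoid(x: float) -> float:
    """Map an unbounded reranker logit to a score in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

raw_scores = [-5.6, 5.3]  # placeholder logits, e.g. returned by reranker.compute_score(...)
print([round(sigmoid(s), 4) for s in raw_scores])
# The relative order is preserved, so the ranking itself is unaffected.
```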
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)  # Setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using HuggingFace Transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
    print(scores)
```

## Evaluation

`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
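Numbers like those in the tables below are typically produced with the open-source [mteb](https://github.com/embeddings-benchmark/mteb) harness; a minimal sketch, assuming a sentence-transformers-compatible model (the task selection and output folder here are only placeholders):

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-small-en")
evaluation = MTEB(tasks=["Banking77Classification"])  # pick any subset of MTEB tasks
results = evaluation.run(model, output_folder="results/bge-small-en")
print(results)
```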
[text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 | | [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 | | [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 | | [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 | | [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 | - **C-MTEB**: We create the benchmark C-MTEB for Chinese text embedding which consists of 31 datasets from 6 tasks. Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction. | Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering | |:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:| | [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 | | [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 | | [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 | | [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 | | [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 | | [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 | | [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 | | [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 | | [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 | | [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 | | [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 | | [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 | | [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 | | [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 | | [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 | | [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 | - **Reranking**: See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for 
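For reference, the fine-tuning examples expect training data as a JSONL file where each line holds a query together with its positive and negative passages. The snippet below is an illustrative sketch of that shape; the file name and texts are invented, so consult the linked examples for the authoritative format:

```python
import json

# Illustrative training records in the {"query", "pos", "neg"} shape used by
# the FlagEmbedding fine-tuning examples; the texts here are made up.
records = [
    {
        "query": "what is panda?",
        "pos": ["The giant panda is a bear species endemic to China."],
        "neg": ["pandas is a Python library for data analysis."],
    },
]

# Write one JSON object per line (JSONL), e.g. toy_finetune_data.jsonl.
with open("toy_finetune_data.jsonl", "w", encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec, ensure_ascii=False) + "\n")
```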
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

A cross-encoder performs full attention over the input pair, which is more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming. Therefore, it can be used to re-rank the top-k documents returned by the embedding model.
We train the cross-encoder on multilingual pair data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).

## Contact

If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao ([email protected]) and Zheng Liu ([email protected]).

## Citation

If you find our work helpful, please cite us:
```
@misc{bge_embedding,
      title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
      author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
      year={2023},
      eprint={2309.07597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
[ "SEMANTIC_SIMILARITY", "SUMMARIZATION" ]
[ "BEAR", "BIOSSES", "SCIFACT" ]
Non_BioNLP
bnightning/gte-Qwen2-7B-instruct-Q4_K_M-GGUF
bnightning
sentence-similarity
[ "sentence-transformers", "gguf", "mteb", "transformers", "Qwen2", "sentence-similarity", "llama-cpp", "gguf-my-repo", "base_model:Alibaba-NLP/gte-Qwen2-7B-instruct", "base_model:quantized:Alibaba-NLP/gte-Qwen2-7B-instruct", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us", "conversational" ]
1,741
1,741
12
0
--- base_model: Alibaba-NLP/gte-Qwen2-7B-instruct license: apache-2.0 tags: - mteb - sentence-transformers - transformers - Qwen2 - sentence-similarity - llama-cpp - gguf-my-repo model-index: - name: gte-qwen2-7B-instruct results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 91.31343283582089 - type: ap value: 67.64251402604096 - type: f1 value: 87.53372530755692 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 97.497825 - type: ap value: 96.30329547047529 - type: f1 value: 97.49769793778039 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 62.564 - type: f1 value: 60.975777935041066 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 36.486000000000004 - type: map_at_10 value: 54.842 - type: map_at_100 value: 55.206999999999994 - type: map_at_1000 value: 55.206999999999994 - type: map_at_3 value: 49.893 - type: map_at_5 value: 53.105000000000004 - type: mrr_at_1 value: 37.34 - type: mrr_at_10 value: 55.143 - type: mrr_at_100 value: 55.509 - type: mrr_at_1000 value: 55.509 - type: mrr_at_3 value: 50.212999999999994 - type: mrr_at_5 value: 53.432 - type: ndcg_at_1 value: 36.486000000000004 - type: ndcg_at_10 value: 64.273 - type: ndcg_at_100 value: 65.66199999999999 - type: ndcg_at_1000 value: 65.66199999999999 - type: ndcg_at_3 value: 54.352999999999994 - type: ndcg_at_5 value: 60.131 - type: precision_at_1 value: 36.486000000000004 - type: precision_at_10 value: 9.395000000000001 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 22.428 - type: precision_at_5 value: 16.259 - type: recall_at_1 value: 36.486000000000004 - type: recall_at_10 value: 93.95400000000001 - type: recall_at_100 value: 99.644 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 67.283 - type: recall_at_5 value: 81.294 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 56.461169803700564 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 51.73600434466286 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 67.57827065898053 - type: mrr value: 79.08136569493911 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 83.53324575999243 - type: cos_sim_spearman value: 81.37173362822374 - type: euclidean_pearson value: 82.19243335103444 - type: euclidean_spearman value: 81.33679307304334 - type: manhattan_pearson 
value: 82.38752665975699 - type: manhattan_spearman value: 81.31510583189689 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 87.56818181818181 - type: f1 value: 87.25826722019875 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 50.09239610327673 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 46.64733054606282 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 33.997 - type: map_at_10 value: 48.176 - type: map_at_100 value: 49.82 - type: map_at_1000 value: 49.924 - type: map_at_3 value: 43.626 - type: map_at_5 value: 46.275 - type: mrr_at_1 value: 42.059999999999995 - type: mrr_at_10 value: 53.726 - type: mrr_at_100 value: 54.398 - type: mrr_at_1000 value: 54.416 - type: mrr_at_3 value: 50.714999999999996 - type: mrr_at_5 value: 52.639 - type: ndcg_at_1 value: 42.059999999999995 - type: ndcg_at_10 value: 55.574999999999996 - type: ndcg_at_100 value: 60.744 - type: ndcg_at_1000 value: 61.85699999999999 - type: ndcg_at_3 value: 49.363 - type: ndcg_at_5 value: 52.44 - type: precision_at_1 value: 42.059999999999995 - type: precision_at_10 value: 11.101999999999999 - type: precision_at_100 value: 1.73 - type: precision_at_1000 value: 0.218 - type: precision_at_3 value: 24.464 - type: precision_at_5 value: 18.026 - type: recall_at_1 value: 33.997 - type: recall_at_10 value: 70.35900000000001 - type: recall_at_100 value: 91.642 - type: recall_at_1000 value: 97.977 - type: recall_at_3 value: 52.76 - type: recall_at_5 value: 61.148 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: BeIR/cqadupstack config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 35.884 - type: map_at_10 value: 48.14 - type: map_at_100 value: 49.5 - type: map_at_1000 value: 49.63 - type: map_at_3 value: 44.646 - type: map_at_5 value: 46.617999999999995 - type: mrr_at_1 value: 44.458999999999996 - type: mrr_at_10 value: 53.751000000000005 - type: mrr_at_100 value: 54.37800000000001 - type: mrr_at_1000 value: 54.415 - type: mrr_at_3 value: 51.815 - type: mrr_at_5 value: 52.882 - type: ndcg_at_1 value: 44.458999999999996 - type: ndcg_at_10 value: 54.157 - type: ndcg_at_100 value: 58.362 - type: ndcg_at_1000 value: 60.178 - type: ndcg_at_3 value: 49.661 - type: ndcg_at_5 value: 51.74999999999999 - type: precision_at_1 value: 44.458999999999996 - type: precision_at_10 value: 10.248 - type: precision_at_100 value: 1.5890000000000002 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 23.928 - type: precision_at_5 value: 16.878999999999998 - type: recall_at_1 value: 35.884 - type: recall_at_10 value: 64.798 - type: recall_at_100 value: 82.345 - type: recall_at_1000 value: 93.267 - type: recall_at_3 value: 51.847 - type: recall_at_5 value: 57.601 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: BeIR/cqadupstack config: default split: test revision: 
4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 39.383 - type: map_at_10 value: 53.714 - type: map_at_100 value: 54.838 - type: map_at_1000 value: 54.87800000000001 - type: map_at_3 value: 50.114999999999995 - type: map_at_5 value: 52.153000000000006 - type: mrr_at_1 value: 45.016 - type: mrr_at_10 value: 56.732000000000006 - type: mrr_at_100 value: 57.411 - type: mrr_at_1000 value: 57.431 - type: mrr_at_3 value: 54.044000000000004 - type: mrr_at_5 value: 55.639 - type: ndcg_at_1 value: 45.016 - type: ndcg_at_10 value: 60.228 - type: ndcg_at_100 value: 64.277 - type: ndcg_at_1000 value: 65.07 - type: ndcg_at_3 value: 54.124 - type: ndcg_at_5 value: 57.147000000000006 - type: precision_at_1 value: 45.016 - type: precision_at_10 value: 9.937 - type: precision_at_100 value: 1.288 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 24.471999999999998 - type: precision_at_5 value: 16.991 - type: recall_at_1 value: 39.383 - type: recall_at_10 value: 76.175 - type: recall_at_100 value: 93.02 - type: recall_at_1000 value: 98.60900000000001 - type: recall_at_3 value: 60.265 - type: recall_at_5 value: 67.46600000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: BeIR/cqadupstack config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 27.426000000000002 - type: map_at_10 value: 37.397000000000006 - type: map_at_100 value: 38.61 - type: map_at_1000 value: 38.678000000000004 - type: map_at_3 value: 34.150999999999996 - type: map_at_5 value: 36.137 - type: mrr_at_1 value: 29.944 - type: mrr_at_10 value: 39.654 - type: mrr_at_100 value: 40.638000000000005 - type: mrr_at_1000 value: 40.691 - type: mrr_at_3 value: 36.817 - type: mrr_at_5 value: 38.524 - type: ndcg_at_1 value: 29.944 - type: ndcg_at_10 value: 43.094 - type: ndcg_at_100 value: 48.789 - type: ndcg_at_1000 value: 50.339999999999996 - type: ndcg_at_3 value: 36.984 - type: ndcg_at_5 value: 40.248 - type: precision_at_1 value: 29.944 - type: precision_at_10 value: 6.78 - type: precision_at_100 value: 1.024 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 15.895000000000001 - type: precision_at_5 value: 11.39 - type: recall_at_1 value: 27.426000000000002 - type: recall_at_10 value: 58.464000000000006 - type: recall_at_100 value: 84.193 - type: recall_at_1000 value: 95.52000000000001 - type: recall_at_3 value: 42.172 - type: recall_at_5 value: 50.101 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: BeIR/cqadupstack config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 19.721 - type: map_at_10 value: 31.604 - type: map_at_100 value: 32.972 - type: map_at_1000 value: 33.077 - type: map_at_3 value: 27.218999999999998 - type: map_at_5 value: 29.53 - type: mrr_at_1 value: 25.0 - type: mrr_at_10 value: 35.843 - type: mrr_at_100 value: 36.785000000000004 - type: mrr_at_1000 value: 36.842000000000006 - type: mrr_at_3 value: 32.193 - type: mrr_at_5 value: 34.264 - type: ndcg_at_1 value: 25.0 - type: ndcg_at_10 value: 38.606 - type: ndcg_at_100 value: 44.272 - type: ndcg_at_1000 value: 46.527 - type: ndcg_at_3 value: 30.985000000000003 - type: ndcg_at_5 value: 34.43 - type: precision_at_1 value: 25.0 - type: precision_at_10 value: 7.811 - type: precision_at_100 value: 1.203 - type: precision_at_1000 value: 0.15 - type: precision_at_3 value: 15.423 - type: precision_at_5 value: 11.791 - type: 
recall_at_1 value: 19.721 - type: recall_at_10 value: 55.625 - type: recall_at_100 value: 79.34400000000001 - type: recall_at_1000 value: 95.208 - type: recall_at_3 value: 35.19 - type: recall_at_5 value: 43.626 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: BeIR/cqadupstack config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 33.784 - type: map_at_10 value: 47.522 - type: map_at_100 value: 48.949999999999996 - type: map_at_1000 value: 49.038 - type: map_at_3 value: 43.284 - type: map_at_5 value: 45.629 - type: mrr_at_1 value: 41.482 - type: mrr_at_10 value: 52.830999999999996 - type: mrr_at_100 value: 53.559999999999995 - type: mrr_at_1000 value: 53.588 - type: mrr_at_3 value: 50.016000000000005 - type: mrr_at_5 value: 51.614000000000004 - type: ndcg_at_1 value: 41.482 - type: ndcg_at_10 value: 54.569 - type: ndcg_at_100 value: 59.675999999999995 - type: ndcg_at_1000 value: 60.989000000000004 - type: ndcg_at_3 value: 48.187000000000005 - type: ndcg_at_5 value: 51.183 - type: precision_at_1 value: 41.482 - type: precision_at_10 value: 10.221 - type: precision_at_100 value: 1.486 - type: precision_at_1000 value: 0.17500000000000002 - type: precision_at_3 value: 23.548 - type: precision_at_5 value: 16.805 - type: recall_at_1 value: 33.784 - type: recall_at_10 value: 69.798 - type: recall_at_100 value: 90.098 - type: recall_at_1000 value: 98.176 - type: recall_at_3 value: 52.127 - type: recall_at_5 value: 59.861 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: BeIR/cqadupstack config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 28.038999999999998 - type: map_at_10 value: 41.904 - type: map_at_100 value: 43.36 - type: map_at_1000 value: 43.453 - type: map_at_3 value: 37.785999999999994 - type: map_at_5 value: 40.105000000000004 - type: mrr_at_1 value: 35.046 - type: mrr_at_10 value: 46.926 - type: mrr_at_100 value: 47.815000000000005 - type: mrr_at_1000 value: 47.849000000000004 - type: mrr_at_3 value: 44.273 - type: mrr_at_5 value: 45.774 - type: ndcg_at_1 value: 35.046 - type: ndcg_at_10 value: 48.937000000000005 - type: ndcg_at_100 value: 54.544000000000004 - type: ndcg_at_1000 value: 56.069 - type: ndcg_at_3 value: 42.858000000000004 - type: ndcg_at_5 value: 45.644 - type: precision_at_1 value: 35.046 - type: precision_at_10 value: 9.452 - type: precision_at_100 value: 1.429 - type: precision_at_1000 value: 0.173 - type: precision_at_3 value: 21.346999999999998 - type: precision_at_5 value: 15.342 - type: recall_at_1 value: 28.038999999999998 - type: recall_at_10 value: 64.59700000000001 - type: recall_at_100 value: 87.735 - type: recall_at_1000 value: 97.41300000000001 - type: recall_at_3 value: 47.368 - type: recall_at_5 value: 54.93900000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: BeIR/cqadupstack config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 28.17291666666667 - type: map_at_10 value: 40.025749999999995 - type: map_at_100 value: 41.39208333333333 - type: map_at_1000 value: 41.499249999999996 - type: map_at_3 value: 36.347 - type: map_at_5 value: 38.41391666666667 - type: mrr_at_1 value: 33.65925 - type: mrr_at_10 value: 44.085499999999996 - type: mrr_at_100 value: 44.94116666666667 - type: mrr_at_1000 value: 44.9855 - type: mrr_at_3 value: 41.2815 - type: mrr_at_5 value: 42.91491666666666 - type: ndcg_at_1 
value: 33.65925 - type: ndcg_at_10 value: 46.430833333333325 - type: ndcg_at_100 value: 51.761 - type: ndcg_at_1000 value: 53.50899999999999 - type: ndcg_at_3 value: 40.45133333333333 - type: ndcg_at_5 value: 43.31483333333334 - type: precision_at_1 value: 33.65925 - type: precision_at_10 value: 8.4995 - type: precision_at_100 value: 1.3210000000000004 - type: precision_at_1000 value: 0.16591666666666666 - type: precision_at_3 value: 19.165083333333335 - type: precision_at_5 value: 13.81816666666667 - type: recall_at_1 value: 28.17291666666667 - type: recall_at_10 value: 61.12624999999999 - type: recall_at_100 value: 83.97266666666667 - type: recall_at_1000 value: 95.66550000000001 - type: recall_at_3 value: 44.661249999999995 - type: recall_at_5 value: 51.983333333333334 - type: map_at_1 value: 17.936 - type: map_at_10 value: 27.399 - type: map_at_100 value: 28.632 - type: map_at_1000 value: 28.738000000000003 - type: map_at_3 value: 24.456 - type: map_at_5 value: 26.06 - type: mrr_at_1 value: 19.224 - type: mrr_at_10 value: 28.998 - type: mrr_at_100 value: 30.11 - type: mrr_at_1000 value: 30.177 - type: mrr_at_3 value: 26.247999999999998 - type: mrr_at_5 value: 27.708 - type: ndcg_at_1 value: 19.224 - type: ndcg_at_10 value: 32.911 - type: ndcg_at_100 value: 38.873999999999995 - type: ndcg_at_1000 value: 41.277 - type: ndcg_at_3 value: 27.142 - type: ndcg_at_5 value: 29.755 - type: precision_at_1 value: 19.224 - type: precision_at_10 value: 5.6930000000000005 - type: precision_at_100 value: 0.9259999999999999 - type: precision_at_1000 value: 0.126 - type: precision_at_3 value: 12.138 - type: precision_at_5 value: 8.909 - type: recall_at_1 value: 17.936 - type: recall_at_10 value: 48.096 - type: recall_at_100 value: 75.389 - type: recall_at_1000 value: 92.803 - type: recall_at_3 value: 32.812999999999995 - type: recall_at_5 value: 38.851 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: BeIR/cqadupstack config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 24.681 - type: map_at_10 value: 34.892 - type: map_at_100 value: 35.996 - type: map_at_1000 value: 36.083 - type: map_at_3 value: 31.491999999999997 - type: map_at_5 value: 33.632 - type: mrr_at_1 value: 28.528 - type: mrr_at_10 value: 37.694 - type: mrr_at_100 value: 38.613 - type: mrr_at_1000 value: 38.668 - type: mrr_at_3 value: 34.714 - type: mrr_at_5 value: 36.616 - type: ndcg_at_1 value: 28.528 - type: ndcg_at_10 value: 40.703 - type: ndcg_at_100 value: 45.993 - type: ndcg_at_1000 value: 47.847 - type: ndcg_at_3 value: 34.622 - type: ndcg_at_5 value: 38.035999999999994 - type: precision_at_1 value: 28.528 - type: precision_at_10 value: 6.902 - type: precision_at_100 value: 1.0370000000000001 - type: precision_at_1000 value: 0.126 - type: precision_at_3 value: 15.798000000000002 - type: precision_at_5 value: 11.655999999999999 - type: recall_at_1 value: 24.681 - type: recall_at_10 value: 55.81 - type: recall_at_100 value: 79.785 - type: recall_at_1000 value: 92.959 - type: recall_at_3 value: 39.074 - type: recall_at_5 value: 47.568 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: BeIR/cqadupstack config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 18.627 - type: map_at_10 value: 27.872000000000003 - type: map_at_100 value: 29.237999999999996 - type: map_at_1000 value: 29.363 - type: map_at_3 value: 24.751 - type: map_at_5 value: 26.521 - type: mrr_at_1 value: 23.021 
- type: mrr_at_10 value: 31.924000000000003 - type: mrr_at_100 value: 32.922000000000004 - type: mrr_at_1000 value: 32.988 - type: mrr_at_3 value: 29.192 - type: mrr_at_5 value: 30.798 - type: ndcg_at_1 value: 23.021 - type: ndcg_at_10 value: 33.535 - type: ndcg_at_100 value: 39.732 - type: ndcg_at_1000 value: 42.201 - type: ndcg_at_3 value: 28.153 - type: ndcg_at_5 value: 30.746000000000002 - type: precision_at_1 value: 23.021 - type: precision_at_10 value: 6.459 - type: precision_at_100 value: 1.1320000000000001 - type: precision_at_1000 value: 0.153 - type: precision_at_3 value: 13.719000000000001 - type: precision_at_5 value: 10.193000000000001 - type: recall_at_1 value: 18.627 - type: recall_at_10 value: 46.463 - type: recall_at_100 value: 74.226 - type: recall_at_1000 value: 91.28500000000001 - type: recall_at_3 value: 31.357000000000003 - type: recall_at_5 value: 38.067 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: BeIR/cqadupstack config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 31.457 - type: map_at_10 value: 42.888 - type: map_at_100 value: 44.24 - type: map_at_1000 value: 44.327 - type: map_at_3 value: 39.588 - type: map_at_5 value: 41.423 - type: mrr_at_1 value: 37.126999999999995 - type: mrr_at_10 value: 47.083000000000006 - type: mrr_at_100 value: 47.997 - type: mrr_at_1000 value: 48.044 - type: mrr_at_3 value: 44.574000000000005 - type: mrr_at_5 value: 46.202 - type: ndcg_at_1 value: 37.126999999999995 - type: ndcg_at_10 value: 48.833 - type: ndcg_at_100 value: 54.327000000000005 - type: ndcg_at_1000 value: 56.011 - type: ndcg_at_3 value: 43.541999999999994 - type: ndcg_at_5 value: 46.127 - type: precision_at_1 value: 37.126999999999995 - type: precision_at_10 value: 8.376999999999999 - type: precision_at_100 value: 1.2309999999999999 - type: precision_at_1000 value: 0.146 - type: precision_at_3 value: 20.211000000000002 - type: precision_at_5 value: 14.16 - type: recall_at_1 value: 31.457 - type: recall_at_10 value: 62.369 - type: recall_at_100 value: 85.444 - type: recall_at_1000 value: 96.65599999999999 - type: recall_at_3 value: 47.961 - type: recall_at_5 value: 54.676 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: BeIR/cqadupstack config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 27.139999999999997 - type: map_at_10 value: 38.801 - type: map_at_100 value: 40.549 - type: map_at_1000 value: 40.802 - type: map_at_3 value: 35.05 - type: map_at_5 value: 36.884 - type: mrr_at_1 value: 33.004 - type: mrr_at_10 value: 43.864 - type: mrr_at_100 value: 44.667 - type: mrr_at_1000 value: 44.717 - type: mrr_at_3 value: 40.777 - type: mrr_at_5 value: 42.319 - type: ndcg_at_1 value: 33.004 - type: ndcg_at_10 value: 46.022 - type: ndcg_at_100 value: 51.542 - type: ndcg_at_1000 value: 53.742000000000004 - type: ndcg_at_3 value: 39.795 - type: ndcg_at_5 value: 42.272 - type: precision_at_1 value: 33.004 - type: precision_at_10 value: 9.012 - type: precision_at_100 value: 1.7770000000000001 - type: precision_at_1000 value: 0.26 - type: precision_at_3 value: 19.038 - type: precision_at_5 value: 13.675999999999998 - type: recall_at_1 value: 27.139999999999997 - type: recall_at_10 value: 60.961 - type: recall_at_100 value: 84.451 - type: recall_at_1000 value: 98.113 - type: recall_at_3 value: 43.001 - type: recall_at_5 value: 49.896 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: 
mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 22.076999999999998 - type: map_at_10 value: 35.44 - type: map_at_100 value: 37.651 - type: map_at_1000 value: 37.824999999999996 - type: map_at_3 value: 30.764999999999997 - type: map_at_5 value: 33.26 - type: mrr_at_1 value: 50.163000000000004 - type: mrr_at_10 value: 61.207 - type: mrr_at_100 value: 61.675000000000004 - type: mrr_at_1000 value: 61.692 - type: mrr_at_3 value: 58.60999999999999 - type: mrr_at_5 value: 60.307 - type: ndcg_at_1 value: 50.163000000000004 - type: ndcg_at_10 value: 45.882 - type: ndcg_at_100 value: 53.239999999999995 - type: ndcg_at_1000 value: 55.852000000000004 - type: ndcg_at_3 value: 40.514 - type: ndcg_at_5 value: 42.038 - type: precision_at_1 value: 50.163000000000004 - type: precision_at_10 value: 13.466000000000001 - type: precision_at_100 value: 2.164 - type: precision_at_1000 value: 0.266 - type: precision_at_3 value: 29.707 - type: precision_at_5 value: 21.694 - type: recall_at_1 value: 22.076999999999998 - type: recall_at_10 value: 50.193 - type: recall_at_100 value: 74.993 - type: recall_at_1000 value: 89.131 - type: recall_at_3 value: 35.472 - type: recall_at_5 value: 41.814 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 9.953 - type: map_at_10 value: 24.515 - type: map_at_100 value: 36.173 - type: map_at_1000 value: 38.351 - type: map_at_3 value: 16.592000000000002 - type: map_at_5 value: 20.036 - type: mrr_at_1 value: 74.25 - type: mrr_at_10 value: 81.813 - type: mrr_at_100 value: 82.006 - type: mrr_at_1000 value: 82.011 - type: mrr_at_3 value: 80.875 - type: mrr_at_5 value: 81.362 - type: ndcg_at_1 value: 62.5 - type: ndcg_at_10 value: 52.42 - type: ndcg_at_100 value: 56.808 - type: ndcg_at_1000 value: 63.532999999999994 - type: ndcg_at_3 value: 56.654 - type: ndcg_at_5 value: 54.18300000000001 - type: precision_at_1 value: 74.25 - type: precision_at_10 value: 42.699999999999996 - type: precision_at_100 value: 13.675 - type: precision_at_1000 value: 2.664 - type: precision_at_3 value: 60.5 - type: precision_at_5 value: 52.800000000000004 - type: recall_at_1 value: 9.953 - type: recall_at_10 value: 30.253999999999998 - type: recall_at_100 value: 62.516000000000005 - type: recall_at_1000 value: 84.163 - type: recall_at_3 value: 18.13 - type: recall_at_5 value: 22.771 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 79.455 - type: f1 value: 74.16798697647569 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 87.531 - type: map_at_10 value: 93.16799999999999 - type: map_at_100 value: 93.341 - type: map_at_1000 value: 93.349 - type: map_at_3 value: 92.444 - type: map_at_5 value: 92.865 - type: mrr_at_1 value: 94.014 - type: mrr_at_10 value: 96.761 - type: mrr_at_100 value: 96.762 - type: mrr_at_1000 value: 96.762 - type: mrr_at_3 value: 96.672 - type: mrr_at_5 value: 96.736 - type: ndcg_at_1 value: 94.014 - type: ndcg_at_10 value: 95.112 - type: ndcg_at_100 value: 95.578 - type: ndcg_at_1000 value: 95.68900000000001 - type: ndcg_at_3 value: 94.392 - type: ndcg_at_5 value: 94.72500000000001 - type: precision_at_1 
value: 94.014 - type: precision_at_10 value: 11.065 - type: precision_at_100 value: 1.157 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 35.259 - type: precision_at_5 value: 21.599 - type: recall_at_1 value: 87.531 - type: recall_at_10 value: 97.356 - type: recall_at_100 value: 98.965 - type: recall_at_1000 value: 99.607 - type: recall_at_3 value: 95.312 - type: recall_at_5 value: 96.295 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 32.055 - type: map_at_10 value: 53.114 - type: map_at_100 value: 55.235 - type: map_at_1000 value: 55.345 - type: map_at_3 value: 45.854 - type: map_at_5 value: 50.025 - type: mrr_at_1 value: 60.34 - type: mrr_at_10 value: 68.804 - type: mrr_at_100 value: 69.309 - type: mrr_at_1000 value: 69.32199999999999 - type: mrr_at_3 value: 66.40899999999999 - type: mrr_at_5 value: 67.976 - type: ndcg_at_1 value: 60.34 - type: ndcg_at_10 value: 62.031000000000006 - type: ndcg_at_100 value: 68.00500000000001 - type: ndcg_at_1000 value: 69.286 - type: ndcg_at_3 value: 56.355999999999995 - type: ndcg_at_5 value: 58.687 - type: precision_at_1 value: 60.34 - type: precision_at_10 value: 17.176 - type: precision_at_100 value: 2.36 - type: precision_at_1000 value: 0.259 - type: precision_at_3 value: 37.14 - type: precision_at_5 value: 27.809 - type: recall_at_1 value: 32.055 - type: recall_at_10 value: 70.91 - type: recall_at_100 value: 91.83 - type: recall_at_1000 value: 98.871 - type: recall_at_3 value: 51.202999999999996 - type: recall_at_5 value: 60.563 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 43.68 - type: map_at_10 value: 64.389 - type: map_at_100 value: 65.24 - type: map_at_1000 value: 65.303 - type: map_at_3 value: 61.309000000000005 - type: map_at_5 value: 63.275999999999996 - type: mrr_at_1 value: 87.36 - type: mrr_at_10 value: 91.12 - type: mrr_at_100 value: 91.227 - type: mrr_at_1000 value: 91.229 - type: mrr_at_3 value: 90.57600000000001 - type: mrr_at_5 value: 90.912 - type: ndcg_at_1 value: 87.36 - type: ndcg_at_10 value: 73.076 - type: ndcg_at_100 value: 75.895 - type: ndcg_at_1000 value: 77.049 - type: ndcg_at_3 value: 68.929 - type: ndcg_at_5 value: 71.28 - type: precision_at_1 value: 87.36 - type: precision_at_10 value: 14.741000000000001 - type: precision_at_100 value: 1.694 - type: precision_at_1000 value: 0.185 - type: precision_at_3 value: 43.043 - type: precision_at_5 value: 27.681 - type: recall_at_1 value: 43.68 - type: recall_at_10 value: 73.707 - type: recall_at_100 value: 84.7 - type: recall_at_1000 value: 92.309 - type: recall_at_3 value: 64.564 - type: recall_at_5 value: 69.203 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 96.75399999999999 - type: ap value: 95.29389839242187 - type: f1 value: 96.75348377433475 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 25.176 - type: map_at_10 value: 38.598 - type: map_at_100 value: 39.707 - type: map_at_1000 value: 39.744 - type: map_at_3 value: 34.566 - type: map_at_5 value: 36.863 - type: mrr_at_1 value: 
25.874000000000002 - type: mrr_at_10 value: 39.214 - type: mrr_at_100 value: 40.251 - type: mrr_at_1000 value: 40.281 - type: mrr_at_3 value: 35.291 - type: mrr_at_5 value: 37.545 - type: ndcg_at_1 value: 25.874000000000002 - type: ndcg_at_10 value: 45.98 - type: ndcg_at_100 value: 51.197 - type: ndcg_at_1000 value: 52.073 - type: ndcg_at_3 value: 37.785999999999994 - type: ndcg_at_5 value: 41.870000000000005 - type: precision_at_1 value: 25.874000000000002 - type: precision_at_10 value: 7.181 - type: precision_at_100 value: 0.979 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 16.051000000000002 - type: precision_at_5 value: 11.713 - type: recall_at_1 value: 25.176 - type: recall_at_10 value: 68.67699999999999 - type: recall_at_100 value: 92.55 - type: recall_at_1000 value: 99.164 - type: recall_at_3 value: 46.372 - type: recall_at_5 value: 56.16 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 99.03784769721841 - type: f1 value: 98.97791641821495 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 91.88326493388054 - type: f1 value: 73.74809928034335 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 85.41358439811701 - type: f1 value: 83.503679460639 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 89.77135171486215 - type: f1 value: 88.89843747468366 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 46.22695362087359 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 44.132372165849425 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 33.35680810650402 - type: mrr value: 34.72625715637218 - task: type: Retrieval dataset: name: MTEB NFCorpus type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 7.165000000000001 - type: map_at_10 value: 15.424 - type: map_at_100 value: 20.28 - type: map_at_1000 value: 22.065 - type: map_at_3 value: 11.236 - type: map_at_5 value: 13.025999999999998 - type: mrr_at_1 value: 51.702999999999996 - type: mrr_at_10 value: 59.965 - type: mrr_at_100 value: 60.667 - type: mrr_at_1000 value: 60.702999999999996 - type: mrr_at_3 value: 58.772000000000006 - type: mrr_at_5 value: 59.267 - type: ndcg_at_1 value: 49.536 - type: ndcg_at_10 value: 40.6 - type: ndcg_at_100 value: 37.848 - type: ndcg_at_1000 value: 46.657 - type: ndcg_at_3 value: 46.117999999999995 - type: ndcg_at_5 value: 43.619 - type: precision_at_1 value: 51.393 - type: precision_at_10 value: 
30.31 - type: precision_at_100 value: 9.972 - type: precision_at_1000 value: 2.329 - type: precision_at_3 value: 43.137 - type: precision_at_5 value: 37.585 - type: recall_at_1 value: 7.165000000000001 - type: recall_at_10 value: 19.689999999999998 - type: recall_at_100 value: 39.237 - type: recall_at_1000 value: 71.417 - type: recall_at_3 value: 12.247 - type: recall_at_5 value: 14.902999999999999 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 42.653999999999996 - type: map_at_10 value: 59.611999999999995 - type: map_at_100 value: 60.32300000000001 - type: map_at_1000 value: 60.336 - type: map_at_3 value: 55.584999999999994 - type: map_at_5 value: 58.19 - type: mrr_at_1 value: 47.683 - type: mrr_at_10 value: 62.06700000000001 - type: mrr_at_100 value: 62.537 - type: mrr_at_1000 value: 62.544999999999995 - type: mrr_at_3 value: 59.178 - type: mrr_at_5 value: 61.034 - type: ndcg_at_1 value: 47.654 - type: ndcg_at_10 value: 67.001 - type: ndcg_at_100 value: 69.73899999999999 - type: ndcg_at_1000 value: 69.986 - type: ndcg_at_3 value: 59.95700000000001 - type: ndcg_at_5 value: 64.025 - type: precision_at_1 value: 47.654 - type: precision_at_10 value: 10.367999999999999 - type: precision_at_100 value: 1.192 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 26.651000000000003 - type: precision_at_5 value: 18.459 - type: recall_at_1 value: 42.653999999999996 - type: recall_at_10 value: 86.619 - type: recall_at_100 value: 98.04899999999999 - type: recall_at_1000 value: 99.812 - type: recall_at_3 value: 68.987 - type: recall_at_5 value: 78.158 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: None metrics: - type: map_at_1 value: 72.538 - type: map_at_10 value: 86.702 - type: map_at_100 value: 87.31 - type: map_at_1000 value: 87.323 - type: map_at_3 value: 83.87 - type: map_at_5 value: 85.682 - type: mrr_at_1 value: 83.31 - type: mrr_at_10 value: 89.225 - type: mrr_at_100 value: 89.30399999999999 - type: mrr_at_1000 value: 89.30399999999999 - type: mrr_at_3 value: 88.44300000000001 - type: mrr_at_5 value: 89.005 - type: ndcg_at_1 value: 83.32000000000001 - type: ndcg_at_10 value: 90.095 - type: ndcg_at_100 value: 91.12 - type: ndcg_at_1000 value: 91.179 - type: ndcg_at_3 value: 87.606 - type: ndcg_at_5 value: 89.031 - type: precision_at_1 value: 83.32000000000001 - type: precision_at_10 value: 13.641 - type: precision_at_100 value: 1.541 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 38.377 - type: precision_at_5 value: 25.162000000000003 - type: recall_at_1 value: 72.538 - type: recall_at_10 value: 96.47200000000001 - type: recall_at_100 value: 99.785 - type: recall_at_1000 value: 99.99900000000001 - type: recall_at_3 value: 89.278 - type: recall_at_5 value: 93.367 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 73.55219145406065 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 74.13437105242755 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 6.873 - type: 
map_at_10 value: 17.944 - type: map_at_100 value: 21.171 - type: map_at_1000 value: 21.528 - type: map_at_3 value: 12.415 - type: map_at_5 value: 15.187999999999999 - type: mrr_at_1 value: 33.800000000000004 - type: mrr_at_10 value: 46.455 - type: mrr_at_100 value: 47.378 - type: mrr_at_1000 value: 47.394999999999996 - type: mrr_at_3 value: 42.367 - type: mrr_at_5 value: 44.972 - type: ndcg_at_1 value: 33.800000000000004 - type: ndcg_at_10 value: 28.907 - type: ndcg_at_100 value: 39.695 - type: ndcg_at_1000 value: 44.582 - type: ndcg_at_3 value: 26.949 - type: ndcg_at_5 value: 23.988 - type: precision_at_1 value: 33.800000000000004 - type: precision_at_10 value: 15.079999999999998 - type: precision_at_100 value: 3.056 - type: precision_at_1000 value: 0.42100000000000004 - type: precision_at_3 value: 25.167 - type: precision_at_5 value: 21.26 - type: recall_at_1 value: 6.873 - type: recall_at_10 value: 30.568 - type: recall_at_100 value: 62.062 - type: recall_at_1000 value: 85.37700000000001 - type: recall_at_3 value: 15.312999999999999 - type: recall_at_5 value: 21.575 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.37009118256057 - type: cos_sim_spearman value: 79.27986395671529 - type: euclidean_pearson value: 79.18037715442115 - type: euclidean_spearman value: 79.28004791561621 - type: manhattan_pearson value: 79.34062972800541 - type: manhattan_spearman value: 79.43106695543402 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 87.48474767383833 - type: cos_sim_spearman value: 79.54505388752513 - type: euclidean_pearson value: 83.43282704179565 - type: euclidean_spearman value: 79.54579919925405 - type: manhattan_pearson value: 83.77564492427952 - type: manhattan_spearman value: 79.84558396989286 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 88.803698035802 - type: cos_sim_spearman value: 88.83451367754881 - type: euclidean_pearson value: 88.28939285711628 - type: euclidean_spearman value: 88.83528996073112 - type: manhattan_pearson value: 88.28017412671795 - type: manhattan_spearman value: 88.9228828016344 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 85.27469288153428 - type: cos_sim_spearman value: 83.87477064876288 - type: euclidean_pearson value: 84.2601737035379 - type: euclidean_spearman value: 83.87431082479074 - type: manhattan_pearson value: 84.3621547772745 - type: manhattan_spearman value: 84.12094375000423 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 88.12749863201587 - type: cos_sim_spearman value: 88.54287568368565 - type: euclidean_pearson value: 87.90429700607999 - type: euclidean_spearman value: 88.5437689576261 - type: manhattan_pearson value: 88.19276653356833 - type: manhattan_spearman value: 88.99995393814679 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 
85.68398747560902 - type: cos_sim_spearman value: 86.48815303460574 - type: euclidean_pearson value: 85.52356631237954 - type: euclidean_spearman value: 86.486391949551 - type: manhattan_pearson value: 85.67267981761788 - type: manhattan_spearman value: 86.7073696332485 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 88.9057107443124 - type: cos_sim_spearman value: 88.7312168757697 - type: euclidean_pearson value: 88.72810439714794 - type: euclidean_spearman value: 88.71976185854771 - type: manhattan_pearson value: 88.50433745949111 - type: manhattan_spearman value: 88.51726175544195 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 67.59391795109886 - type: cos_sim_spearman value: 66.87613008631367 - type: euclidean_pearson value: 69.23198488262217 - type: euclidean_spearman value: 66.85427723013692 - type: manhattan_pearson value: 69.50730124841084 - type: manhattan_spearman value: 67.10404669820792 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 87.0820605344619 - type: cos_sim_spearman value: 86.8518089863434 - type: euclidean_pearson value: 86.31087134689284 - type: euclidean_spearman value: 86.8518520517941 - type: manhattan_pearson value: 86.47203796160612 - type: manhattan_spearman value: 87.1080149734421 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 89.09255369305481 - type: mrr value: 97.10323445617563 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 61.260999999999996 - type: map_at_10 value: 74.043 - type: map_at_100 value: 74.37700000000001 - type: map_at_1000 value: 74.384 - type: map_at_3 value: 71.222 - type: map_at_5 value: 72.875 - type: mrr_at_1 value: 64.333 - type: mrr_at_10 value: 74.984 - type: mrr_at_100 value: 75.247 - type: mrr_at_1000 value: 75.25500000000001 - type: mrr_at_3 value: 73.167 - type: mrr_at_5 value: 74.35000000000001 - type: ndcg_at_1 value: 64.333 - type: ndcg_at_10 value: 79.06 - type: ndcg_at_100 value: 80.416 - type: ndcg_at_1000 value: 80.55600000000001 - type: ndcg_at_3 value: 74.753 - type: ndcg_at_5 value: 76.97500000000001 - type: precision_at_1 value: 64.333 - type: precision_at_10 value: 10.567 - type: precision_at_100 value: 1.1199999999999999 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 29.889 - type: precision_at_5 value: 19.533 - type: recall_at_1 value: 61.260999999999996 - type: recall_at_10 value: 93.167 - type: recall_at_100 value: 99.0 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 81.667 - type: recall_at_5 value: 87.394 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.71980198019801 - type: cos_sim_ap value: 92.81616007802704 - type: cos_sim_f1 value: 
85.17548454688318 - type: cos_sim_precision value: 89.43894389438944 - type: cos_sim_recall value: 81.3 - type: dot_accuracy value: 99.71980198019801 - type: dot_ap value: 92.81398760591358 - type: dot_f1 value: 85.17548454688318 - type: dot_precision value: 89.43894389438944 - type: dot_recall value: 81.3 - type: euclidean_accuracy value: 99.71980198019801 - type: euclidean_ap value: 92.81560637245072 - type: euclidean_f1 value: 85.17548454688318 - type: euclidean_precision value: 89.43894389438944 - type: euclidean_recall value: 81.3 - type: manhattan_accuracy value: 99.73069306930694 - type: manhattan_ap value: 93.14005487480794 - type: manhattan_f1 value: 85.56263269639068 - type: manhattan_precision value: 91.17647058823529 - type: manhattan_recall value: 80.60000000000001 - type: max_accuracy value: 99.73069306930694 - type: max_ap value: 93.14005487480794 - type: max_f1 value: 85.56263269639068 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 79.86443362395185 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 49.40897096662564 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 55.66040806627947 - type: mrr value: 56.58670475766064 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 31.51015090598575 - type: cos_sim_spearman value: 31.35016454939226 - type: dot_pearson value: 31.5150068731 - type: dot_spearman value: 31.34790869023487 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.254 - type: map_at_10 value: 2.064 - type: map_at_100 value: 12.909 - type: map_at_1000 value: 31.761 - type: map_at_3 value: 0.738 - type: map_at_5 value: 1.155 - type: mrr_at_1 value: 96.0 - type: mrr_at_10 value: 98.0 - type: mrr_at_100 value: 98.0 - type: mrr_at_1000 value: 98.0 - type: mrr_at_3 value: 98.0 - type: mrr_at_5 value: 98.0 - type: ndcg_at_1 value: 93.0 - type: ndcg_at_10 value: 82.258 - type: ndcg_at_100 value: 64.34 - type: ndcg_at_1000 value: 57.912 - type: ndcg_at_3 value: 90.827 - type: ndcg_at_5 value: 86.79 - type: precision_at_1 value: 96.0 - type: precision_at_10 value: 84.8 - type: precision_at_100 value: 66.0 - type: precision_at_1000 value: 25.356 - type: precision_at_3 value: 94.667 - type: precision_at_5 value: 90.4 - type: recall_at_1 value: 0.254 - type: recall_at_10 value: 2.1950000000000003 - type: recall_at_100 value: 16.088 - type: recall_at_1000 value: 54.559000000000005 - type: recall_at_3 value: 0.75 - type: recall_at_5 value: 1.191 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 2.976 - type: map_at_10 value: 11.389000000000001 - type: map_at_100 value: 18.429000000000002 - type: map_at_1000 value: 20.113 - type: map_at_3 value: 6.483 - type: map_at_5 value: 8.770999999999999 
- type: mrr_at_1 value: 40.816 - type: mrr_at_10 value: 58.118 - type: mrr_at_100 value: 58.489999999999995 - type: mrr_at_1000 value: 58.489999999999995 - type: mrr_at_3 value: 53.061 - type: mrr_at_5 value: 57.041 - type: ndcg_at_1 value: 40.816 - type: ndcg_at_10 value: 30.567 - type: ndcg_at_100 value: 42.44 - type: ndcg_at_1000 value: 53.480000000000004 - type: ndcg_at_3 value: 36.016 - type: ndcg_at_5 value: 34.257 - type: precision_at_1 value: 42.857 - type: precision_at_10 value: 25.714 - type: precision_at_100 value: 8.429 - type: precision_at_1000 value: 1.5939999999999999 - type: precision_at_3 value: 36.735 - type: precision_at_5 value: 33.878 - type: recall_at_1 value: 2.976 - type: recall_at_10 value: 17.854999999999997 - type: recall_at_100 value: 51.833 - type: recall_at_1000 value: 86.223 - type: recall_at_3 value: 7.887 - type: recall_at_5 value: 12.026 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 85.1174 - type: ap value: 30.169441069345748 - type: f1 value: 69.79254701873245 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 72.58347481607245 - type: f1 value: 72.74877295564937 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 53.90586138221305 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 87.35769207844072 - type: cos_sim_ap value: 77.9645072410354 - type: cos_sim_f1 value: 71.32352941176471 - type: cos_sim_precision value: 66.5903890160183 - type: cos_sim_recall value: 76.78100263852242 - type: dot_accuracy value: 87.37557370209214 - type: dot_ap value: 77.96250046429908 - type: dot_f1 value: 71.28932757557064 - type: dot_precision value: 66.95249130938586 - type: dot_recall value: 76.22691292875989 - type: euclidean_accuracy value: 87.35173153722357 - type: euclidean_ap value: 77.96520460741593 - type: euclidean_f1 value: 71.32470733210104 - type: euclidean_precision value: 66.91329479768785 - type: euclidean_recall value: 76.35883905013192 - type: manhattan_accuracy value: 87.25636287774931 - type: manhattan_ap value: 77.77752485611796 - type: manhattan_f1 value: 71.18148599269183 - type: manhattan_precision value: 66.10859728506787 - type: manhattan_recall value: 77.0976253298153 - type: max_accuracy value: 87.37557370209214 - type: max_ap value: 77.96520460741593 - type: max_f1 value: 71.32470733210104 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.38176737687739 - type: cos_sim_ap value: 86.58811861657401 - type: cos_sim_f1 value: 79.09430644097604 - type: cos_sim_precision value: 75.45085977911366 - type: cos_sim_recall value: 83.10748383122882 - type: dot_accuracy value: 89.38370784336554 - type: dot_ap value: 86.58840606004333 - type: dot_f1 
value: 79.10179860068133 - type: dot_precision value: 75.44546153308643 - type: dot_recall value: 83.13058207576223 - type: euclidean_accuracy value: 89.38564830985369 - type: euclidean_ap value: 86.58820721061164 - type: euclidean_f1 value: 79.09070942235888 - type: euclidean_precision value: 75.38729937194697 - type: euclidean_recall value: 83.17677856482906 - type: manhattan_accuracy value: 89.40699344122326 - type: manhattan_ap value: 86.60631843011362 - type: manhattan_f1 value: 79.14949970570925 - type: manhattan_precision value: 75.78191039729502 - type: manhattan_recall value: 82.83030489682784 - type: max_accuracy value: 89.40699344122326 - type: max_ap value: 86.60631843011362 - type: max_f1 value: 79.14949970570925 - task: type: STS dataset: name: MTEB AFQMC type: C-MTEB/AFQMC config: default split: validation revision: b44c3b011063adb25877c13823db83bb193913c4 metrics: - type: cos_sim_pearson value: 65.58442135663871 - type: cos_sim_spearman value: 72.2538631361313 - type: euclidean_pearson value: 70.97255486607429 - type: euclidean_spearman value: 72.25374250228647 - type: manhattan_pearson value: 70.83250199989911 - type: manhattan_spearman value: 72.14819496536272 - task: type: STS dataset: name: MTEB ATEC type: C-MTEB/ATEC config: default split: test revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865 metrics: - type: cos_sim_pearson value: 59.99478404929932 - type: cos_sim_spearman value: 62.61836216999812 - type: euclidean_pearson value: 66.86429811933593 - type: euclidean_spearman value: 62.6183520374191 - type: manhattan_pearson value: 66.8063778911633 - type: manhattan_spearman value: 62.569607573241115 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 53.98400000000001 - type: f1 value: 51.21447361350723 - task: type: STS dataset: name: MTEB BQ type: C-MTEB/BQ config: default split: test revision: e3dda5e115e487b39ec7e618c0c6a29137052a55 metrics: - type: cos_sim_pearson value: 79.11941660686553 - type: cos_sim_spearman value: 81.25029594540435 - type: euclidean_pearson value: 82.06973504238826 - type: euclidean_spearman value: 81.2501989488524 - type: manhattan_pearson value: 82.10094630392753 - type: manhattan_spearman value: 81.27987244392389 - task: type: Clustering dataset: name: MTEB CLSClusteringP2P type: C-MTEB/CLSClusteringP2P config: default split: test revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476 metrics: - type: v_measure value: 47.07270168705156 - task: type: Clustering dataset: name: MTEB CLSClusteringS2S type: C-MTEB/CLSClusteringS2S config: default split: test revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f metrics: - type: v_measure value: 45.98511703185043 - task: type: Reranking dataset: name: MTEB CMedQAv1 type: C-MTEB/CMedQAv1-reranking config: default split: test revision: 8d7f1e942507dac42dc58017c1a001c3717da7df metrics: - type: map value: 88.19895157194931 - type: mrr value: 90.21424603174603 - task: type: Reranking dataset: name: MTEB CMedQAv2 type: C-MTEB/CMedQAv2-reranking config: default split: test revision: 23d186750531a14a0357ca22cd92d712fd512ea0 metrics: - type: map value: 88.03317320980119 - type: mrr value: 89.9461507936508 - task: type: Retrieval dataset: name: MTEB CmedqaRetrieval type: C-MTEB/CmedqaRetrieval config: default split: dev revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301 metrics: - type: map_at_1 value: 29.037000000000003 - type: map_at_10 
value: 42.001 - type: map_at_100 value: 43.773 - type: map_at_1000 value: 43.878 - type: map_at_3 value: 37.637 - type: map_at_5 value: 40.034 - type: mrr_at_1 value: 43.136 - type: mrr_at_10 value: 51.158 - type: mrr_at_100 value: 52.083 - type: mrr_at_1000 value: 52.12 - type: mrr_at_3 value: 48.733 - type: mrr_at_5 value: 50.025 - type: ndcg_at_1 value: 43.136 - type: ndcg_at_10 value: 48.685 - type: ndcg_at_100 value: 55.513 - type: ndcg_at_1000 value: 57.242000000000004 - type: ndcg_at_3 value: 43.329 - type: ndcg_at_5 value: 45.438 - type: precision_at_1 value: 43.136 - type: precision_at_10 value: 10.56 - type: precision_at_100 value: 1.6129999999999998 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 24.064 - type: precision_at_5 value: 17.269000000000002 - type: recall_at_1 value: 29.037000000000003 - type: recall_at_10 value: 59.245000000000005 - type: recall_at_100 value: 87.355 - type: recall_at_1000 value: 98.74000000000001 - type: recall_at_3 value: 42.99 - type: recall_at_5 value: 49.681999999999995 - task: type: PairClassification dataset: name: MTEB Cmnli type: C-MTEB/CMNLI config: default split: validation revision: 41bc36f332156f7adc9e38f53777c959b2ae9766 metrics: - type: cos_sim_accuracy value: 82.68190018039687 - type: cos_sim_ap value: 90.18017125327886 - type: cos_sim_f1 value: 83.64080906868193 - type: cos_sim_precision value: 79.7076890489303 - type: cos_sim_recall value: 87.98223053542202 - type: dot_accuracy value: 82.68190018039687 - type: dot_ap value: 90.18782350103646 - type: dot_f1 value: 83.64242087729039 - type: dot_precision value: 79.65313028764805 - type: dot_recall value: 88.05237315875614 - type: euclidean_accuracy value: 82.68190018039687 - type: euclidean_ap value: 90.1801957900632 - type: euclidean_f1 value: 83.63636363636364 - type: euclidean_precision value: 79.52772506852203 - type: euclidean_recall value: 88.19265840542437 - type: manhattan_accuracy value: 82.14070956103427 - type: manhattan_ap value: 89.96178420101427 - type: manhattan_f1 value: 83.21087838578791 - type: manhattan_precision value: 78.35605121850475 - type: manhattan_recall value: 88.70703764320785 - type: max_accuracy value: 82.68190018039687 - type: max_ap value: 90.18782350103646 - type: max_f1 value: 83.64242087729039 - task: type: Retrieval dataset: name: MTEB CovidRetrieval type: C-MTEB/CovidRetrieval config: default split: dev revision: 1271c7809071a13532e05f25fb53511ffce77117 metrics: - type: map_at_1 value: 72.234 - type: map_at_10 value: 80.10000000000001 - type: map_at_100 value: 80.36 - type: map_at_1000 value: 80.363 - type: map_at_3 value: 78.315 - type: map_at_5 value: 79.607 - type: mrr_at_1 value: 72.392 - type: mrr_at_10 value: 80.117 - type: mrr_at_100 value: 80.36999999999999 - type: mrr_at_1000 value: 80.373 - type: mrr_at_3 value: 78.469 - type: mrr_at_5 value: 79.633 - type: ndcg_at_1 value: 72.392 - type: ndcg_at_10 value: 83.651 - type: ndcg_at_100 value: 84.749 - type: ndcg_at_1000 value: 84.83000000000001 - type: ndcg_at_3 value: 80.253 - type: ndcg_at_5 value: 82.485 - type: precision_at_1 value: 72.392 - type: precision_at_10 value: 9.557 - type: precision_at_100 value: 1.004 - type: precision_at_1000 value: 0.101 - type: precision_at_3 value: 28.732000000000003 - type: precision_at_5 value: 18.377 - type: recall_at_1 value: 72.234 - type: recall_at_10 value: 94.573 - type: recall_at_100 value: 99.368 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 85.669 - type: recall_at_5 value: 91.01700000000001 - task: type: 
Retrieval dataset: name: MTEB DuRetrieval type: C-MTEB/DuRetrieval config: default split: dev revision: a1a333e290fe30b10f3f56498e3a0d911a693ced metrics: - type: map_at_1 value: 26.173999999999996 - type: map_at_10 value: 80.04 - type: map_at_100 value: 82.94500000000001 - type: map_at_1000 value: 82.98100000000001 - type: map_at_3 value: 55.562999999999995 - type: map_at_5 value: 69.89800000000001 - type: mrr_at_1 value: 89.5 - type: mrr_at_10 value: 92.996 - type: mrr_at_100 value: 93.06400000000001 - type: mrr_at_1000 value: 93.065 - type: mrr_at_3 value: 92.658 - type: mrr_at_5 value: 92.84599999999999 - type: ndcg_at_1 value: 89.5 - type: ndcg_at_10 value: 87.443 - type: ndcg_at_100 value: 90.253 - type: ndcg_at_1000 value: 90.549 - type: ndcg_at_3 value: 85.874 - type: ndcg_at_5 value: 84.842 - type: precision_at_1 value: 89.5 - type: precision_at_10 value: 41.805 - type: precision_at_100 value: 4.827 - type: precision_at_1000 value: 0.49 - type: precision_at_3 value: 76.85 - type: precision_at_5 value: 64.8 - type: recall_at_1 value: 26.173999999999996 - type: recall_at_10 value: 89.101 - type: recall_at_100 value: 98.08099999999999 - type: recall_at_1000 value: 99.529 - type: recall_at_3 value: 57.902 - type: recall_at_5 value: 74.602 - task: type: Retrieval dataset: name: MTEB EcomRetrieval type: C-MTEB/EcomRetrieval config: default split: dev revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9 metrics: - type: map_at_1 value: 56.10000000000001 - type: map_at_10 value: 66.15299999999999 - type: map_at_100 value: 66.625 - type: map_at_1000 value: 66.636 - type: map_at_3 value: 63.632999999999996 - type: map_at_5 value: 65.293 - type: mrr_at_1 value: 56.10000000000001 - type: mrr_at_10 value: 66.15299999999999 - type: mrr_at_100 value: 66.625 - type: mrr_at_1000 value: 66.636 - type: mrr_at_3 value: 63.632999999999996 - type: mrr_at_5 value: 65.293 - type: ndcg_at_1 value: 56.10000000000001 - type: ndcg_at_10 value: 71.146 - type: ndcg_at_100 value: 73.27799999999999 - type: ndcg_at_1000 value: 73.529 - type: ndcg_at_3 value: 66.09 - type: ndcg_at_5 value: 69.08999999999999 - type: precision_at_1 value: 56.10000000000001 - type: precision_at_10 value: 8.68 - type: precision_at_100 value: 0.964 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 24.4 - type: precision_at_5 value: 16.1 - type: recall_at_1 value: 56.10000000000001 - type: recall_at_10 value: 86.8 - type: recall_at_100 value: 96.39999999999999 - type: recall_at_1000 value: 98.3 - type: recall_at_3 value: 73.2 - type: recall_at_5 value: 80.5 - task: type: Classification dataset: name: MTEB IFlyTek type: C-MTEB/IFlyTek-classification config: default split: validation revision: 421605374b29664c5fc098418fe20ada9bd55f8a metrics: - type: accuracy value: 54.52096960369373 - type: f1 value: 40.930845295808695 - task: type: Classification dataset: name: MTEB JDReview type: C-MTEB/JDReview-classification config: default split: test revision: b7c64bd89eb87f8ded463478346f76731f07bf8b metrics: - type: accuracy value: 86.51031894934334 - type: ap value: 55.9516014323483 - type: f1 value: 81.54813679326381 - task: type: STS dataset: name: MTEB LCQMC type: C-MTEB/LCQMC config: default split: test revision: 17f9b096f80380fce5ed12a9be8be7784b337daf metrics: - type: cos_sim_pearson value: 69.67437838574276 - type: cos_sim_spearman value: 73.81314174653045 - type: euclidean_pearson value: 72.63430276680275 - type: euclidean_spearman value: 73.81358736777001 - type: manhattan_pearson value: 72.58743833842829 - type: 
manhattan_spearman value: 73.7590419009179 - task: type: Reranking dataset: name: MTEB MMarcoReranking type: C-MTEB/Mmarco-reranking config: default split: dev revision: None metrics: - type: map value: 31.648613483640254 - type: mrr value: 30.37420634920635 - task: type: Retrieval dataset: name: MTEB MMarcoRetrieval type: C-MTEB/MMarcoRetrieval config: default split: dev revision: 539bbde593d947e2a124ba72651aafc09eb33fc2 metrics: - type: map_at_1 value: 73.28099999999999 - type: map_at_10 value: 81.977 - type: map_at_100 value: 82.222 - type: map_at_1000 value: 82.22699999999999 - type: map_at_3 value: 80.441 - type: map_at_5 value: 81.46600000000001 - type: mrr_at_1 value: 75.673 - type: mrr_at_10 value: 82.41000000000001 - type: mrr_at_100 value: 82.616 - type: mrr_at_1000 value: 82.621 - type: mrr_at_3 value: 81.094 - type: mrr_at_5 value: 81.962 - type: ndcg_at_1 value: 75.673 - type: ndcg_at_10 value: 85.15599999999999 - type: ndcg_at_100 value: 86.151 - type: ndcg_at_1000 value: 86.26899999999999 - type: ndcg_at_3 value: 82.304 - type: ndcg_at_5 value: 84.009 - type: precision_at_1 value: 75.673 - type: precision_at_10 value: 10.042 - type: precision_at_100 value: 1.052 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 30.673000000000002 - type: precision_at_5 value: 19.326999999999998 - type: recall_at_1 value: 73.28099999999999 - type: recall_at_10 value: 94.446 - type: recall_at_100 value: 98.737 - type: recall_at_1000 value: 99.649 - type: recall_at_3 value: 86.984 - type: recall_at_5 value: 91.024 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 81.08607935440484 - type: f1 value: 78.24879986066307 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 86.05917955615332 - type: f1 value: 85.05279279434997 - task: type: Retrieval dataset: name: MTEB MedicalRetrieval type: C-MTEB/MedicalRetrieval config: default split: dev revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6 metrics: - type: map_at_1 value: 56.2 - type: map_at_10 value: 62.57899999999999 - type: map_at_100 value: 63.154999999999994 - type: map_at_1000 value: 63.193 - type: map_at_3 value: 61.217 - type: map_at_5 value: 62.012 - type: mrr_at_1 value: 56.3 - type: mrr_at_10 value: 62.629000000000005 - type: mrr_at_100 value: 63.205999999999996 - type: mrr_at_1000 value: 63.244 - type: mrr_at_3 value: 61.267 - type: mrr_at_5 value: 62.062 - type: ndcg_at_1 value: 56.2 - type: ndcg_at_10 value: 65.592 - type: ndcg_at_100 value: 68.657 - type: ndcg_at_1000 value: 69.671 - type: ndcg_at_3 value: 62.808 - type: ndcg_at_5 value: 64.24499999999999 - type: precision_at_1 value: 56.2 - type: precision_at_10 value: 7.5 - type: precision_at_100 value: 0.899 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 22.467000000000002 - type: precision_at_5 value: 14.180000000000001 - type: recall_at_1 value: 56.2 - type: recall_at_10 value: 75.0 - type: recall_at_100 value: 89.9 - type: recall_at_1000 value: 97.89999999999999 - type: recall_at_3 value: 67.4 - type: recall_at_5 value: 70.89999999999999 - task: type: Classification dataset: name: MTEB MultilingualSentiment type: C-MTEB/MultilingualSentiment-classification config: default split: 
validation revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a metrics: - type: accuracy value: 76.87666666666667 - type: f1 value: 76.7317686219665 - task: type: PairClassification dataset: name: MTEB Ocnli type: C-MTEB/OCNLI config: default split: validation revision: 66e76a618a34d6d565d5538088562851e6daa7ec metrics: - type: cos_sim_accuracy value: 79.64266377910124 - type: cos_sim_ap value: 84.78274442344829 - type: cos_sim_f1 value: 81.16947472745292 - type: cos_sim_precision value: 76.47058823529412 - type: cos_sim_recall value: 86.48363252375924 - type: dot_accuracy value: 79.64266377910124 - type: dot_ap value: 84.7851404063692 - type: dot_f1 value: 81.16947472745292 - type: dot_precision value: 76.47058823529412 - type: dot_recall value: 86.48363252375924 - type: euclidean_accuracy value: 79.64266377910124 - type: euclidean_ap value: 84.78068373762378 - type: euclidean_f1 value: 81.14794656110837 - type: euclidean_precision value: 76.35009310986965 - type: euclidean_recall value: 86.58922914466737 - type: manhattan_accuracy value: 79.48023822414727 - type: manhattan_ap value: 84.72928897427576 - type: manhattan_f1 value: 81.32084770823064 - type: manhattan_precision value: 76.24768946395564 - type: manhattan_recall value: 87.11721224920802 - type: max_accuracy value: 79.64266377910124 - type: max_ap value: 84.7851404063692 - type: max_f1 value: 81.32084770823064 - task: type: Classification dataset: name: MTEB OnlineShopping type: C-MTEB/OnlineShopping-classification config: default split: test revision: e610f2ebd179a8fda30ae534c3878750a96db120 metrics: - type: accuracy value: 94.3 - type: ap value: 92.8664032274438 - type: f1 value: 94.29311102997727 - task: type: STS dataset: name: MTEB PAWSX type: C-MTEB/PAWSX config: default split: test revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1 metrics: - type: cos_sim_pearson value: 48.51392279882909 - type: cos_sim_spearman value: 54.06338895994974 - type: euclidean_pearson value: 52.58480559573412 - type: euclidean_spearman value: 54.06417276612201 - type: manhattan_pearson value: 52.69525121721343 - type: manhattan_spearman value: 54.048147455389675 - task: type: STS dataset: name: MTEB QBQTC type: C-MTEB/QBQTC config: default split: test revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7 metrics: - type: cos_sim_pearson value: 29.728387290757325 - type: cos_sim_spearman value: 31.366121633635284 - type: euclidean_pearson value: 29.14588368552961 - type: euclidean_spearman value: 31.36764411112844 - type: manhattan_pearson value: 29.63517350523121 - type: manhattan_spearman value: 31.94157020583762 - task: type: STS dataset: name: MTEB STS22 (zh) type: mteb/sts22-crosslingual-sts config: zh split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 63.64868296271406 - type: cos_sim_spearman value: 66.12800618164744 - type: euclidean_pearson value: 63.21405767340238 - type: euclidean_spearman value: 66.12786567790748 - type: manhattan_pearson value: 64.04300276525848 - type: manhattan_spearman value: 66.5066857145652 - task: type: STS dataset: name: MTEB STSB type: C-MTEB/STSB config: default split: test revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0 metrics: - type: cos_sim_pearson value: 81.2302623912794 - type: cos_sim_spearman value: 81.16833673266562 - type: euclidean_pearson value: 79.47647843876024 - type: euclidean_spearman value: 81.16944349524972 - type: manhattan_pearson value: 79.84947238492208 - type: manhattan_spearman value: 81.64626599410026 - task: type: Reranking 
dataset: name: MTEB T2Reranking type: C-MTEB/T2Reranking config: default split: dev revision: 76631901a18387f85eaa53e5450019b87ad58ef9 metrics: - type: map value: 67.80129586475687 - type: mrr value: 77.77402311635554 - task: type: Retrieval dataset: name: MTEB T2Retrieval type: C-MTEB/T2Retrieval config: default split: dev revision: 8731a845f1bf500a4f111cf1070785c793d10e64 metrics: - type: map_at_1 value: 28.666999999999998 - type: map_at_10 value: 81.063 - type: map_at_100 value: 84.504 - type: map_at_1000 value: 84.552 - type: map_at_3 value: 56.897 - type: map_at_5 value: 70.073 - type: mrr_at_1 value: 92.087 - type: mrr_at_10 value: 94.132 - type: mrr_at_100 value: 94.19800000000001 - type: mrr_at_1000 value: 94.19999999999999 - type: mrr_at_3 value: 93.78999999999999 - type: mrr_at_5 value: 94.002 - type: ndcg_at_1 value: 92.087 - type: ndcg_at_10 value: 87.734 - type: ndcg_at_100 value: 90.736 - type: ndcg_at_1000 value: 91.184 - type: ndcg_at_3 value: 88.78 - type: ndcg_at_5 value: 87.676 - type: precision_at_1 value: 92.087 - type: precision_at_10 value: 43.46 - type: precision_at_100 value: 5.07 - type: precision_at_1000 value: 0.518 - type: precision_at_3 value: 77.49000000000001 - type: precision_at_5 value: 65.194 - type: recall_at_1 value: 28.666999999999998 - type: recall_at_10 value: 86.632 - type: recall_at_100 value: 96.646 - type: recall_at_1000 value: 98.917 - type: recall_at_3 value: 58.333999999999996 - type: recall_at_5 value: 72.974 - task: type: Classification dataset: name: MTEB TNews type: C-MTEB/TNews-classification config: default split: validation revision: 317f262bf1e6126357bbe89e875451e4b0938fe4 metrics: - type: accuracy value: 52.971999999999994 - type: f1 value: 50.2898280984929 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringP2P type: C-MTEB/ThuNewsClusteringP2P config: default split: test revision: 5798586b105c0434e4f0fe5e767abe619442cf93 metrics: - type: v_measure value: 86.0797948663824 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringS2S type: C-MTEB/ThuNewsClusteringS2S config: default split: test revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d metrics: - type: v_measure value: 85.10759092255017 - task: type: Retrieval dataset: name: MTEB VideoRetrieval type: C-MTEB/VideoRetrieval config: default split: dev revision: 58c2597a5943a2ba48f4668c3b90d796283c5639 metrics: - type: map_at_1 value: 65.60000000000001 - type: map_at_10 value: 74.773 - type: map_at_100 value: 75.128 - type: map_at_1000 value: 75.136 - type: map_at_3 value: 73.05 - type: map_at_5 value: 74.13499999999999 - type: mrr_at_1 value: 65.60000000000001 - type: mrr_at_10 value: 74.773 - type: mrr_at_100 value: 75.128 - type: mrr_at_1000 value: 75.136 - type: mrr_at_3 value: 73.05 - type: mrr_at_5 value: 74.13499999999999 - type: ndcg_at_1 value: 65.60000000000001 - type: ndcg_at_10 value: 78.84299999999999 - type: ndcg_at_100 value: 80.40899999999999 - type: ndcg_at_1000 value: 80.57 - type: ndcg_at_3 value: 75.40599999999999 - type: ndcg_at_5 value: 77.351 - type: precision_at_1 value: 65.60000000000001 - type: precision_at_10 value: 9.139999999999999 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 27.400000000000002 - type: precision_at_5 value: 17.380000000000003 - type: recall_at_1 value: 65.60000000000001 - type: recall_at_10 value: 91.4 - type: recall_at_100 value: 98.4 - type: recall_at_1000 value: 99.6 - type: recall_at_3 value: 82.19999999999999 - type: recall_at_5 value: 86.9 - task: type: 
Classification dataset: name: MTEB Waimai type: C-MTEB/waimai-classification config: default split: test revision: 339287def212450dcaa9df8c22bf93e9980c7023 metrics: - type: accuracy value: 89.47 - type: ap value: 75.59561751845389 - type: f1 value: 87.95207751382563 - task: type: Clustering dataset: name: MTEB AlloProfClusteringP2P type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: v_measure value: 76.05592323841036 - type: v_measure value: 64.51718058866508 - task: type: Reranking dataset: name: MTEB AlloprofReranking type: lyon-nlp/mteb-fr-reranking-alloprof-s2p config: default split: test revision: 666fdacebe0291776e86f29345663dfaf80a0db9 metrics: - type: map value: 73.08278490943373 - type: mrr value: 74.66561454570449 - task: type: Retrieval dataset: name: MTEB AlloprofRetrieval type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: map_at_1 value: 38.912 - type: map_at_10 value: 52.437999999999995 - type: map_at_100 value: 53.38 - type: map_at_1000 value: 53.427 - type: map_at_3 value: 48.879 - type: map_at_5 value: 50.934000000000005 - type: mrr_at_1 value: 44.085 - type: mrr_at_10 value: 55.337 - type: mrr_at_100 value: 56.016999999999996 - type: mrr_at_1000 value: 56.043 - type: mrr_at_3 value: 52.55499999999999 - type: mrr_at_5 value: 54.20399999999999 - type: ndcg_at_1 value: 44.085 - type: ndcg_at_10 value: 58.876 - type: ndcg_at_100 value: 62.714000000000006 - type: ndcg_at_1000 value: 63.721000000000004 - type: ndcg_at_3 value: 52.444 - type: ndcg_at_5 value: 55.692 - type: precision_at_1 value: 44.085 - type: precision_at_10 value: 9.21 - type: precision_at_100 value: 1.164 - type: precision_at_1000 value: 0.128 - type: precision_at_3 value: 23.043 - type: precision_at_5 value: 15.898000000000001 - type: recall_at_1 value: 38.912 - type: recall_at_10 value: 75.577 - type: recall_at_100 value: 92.038 - type: recall_at_1000 value: 99.325 - type: recall_at_3 value: 58.592 - type: recall_at_5 value: 66.235 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 55.532000000000004 - type: f1 value: 52.5783943471605 - task: type: Retrieval dataset: name: MTEB BSARDRetrieval type: maastrichtlawtech/bsard config: default split: test revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59 metrics: - type: map_at_1 value: 8.108 - type: map_at_10 value: 14.710999999999999 - type: map_at_100 value: 15.891 - type: map_at_1000 value: 15.983 - type: map_at_3 value: 12.237 - type: map_at_5 value: 13.679 - type: mrr_at_1 value: 8.108 - type: mrr_at_10 value: 14.710999999999999 - type: mrr_at_100 value: 15.891 - type: mrr_at_1000 value: 15.983 - type: mrr_at_3 value: 12.237 - type: mrr_at_5 value: 13.679 - type: ndcg_at_1 value: 8.108 - type: ndcg_at_10 value: 18.796 - type: ndcg_at_100 value: 25.098 - type: ndcg_at_1000 value: 27.951999999999998 - type: ndcg_at_3 value: 13.712 - type: ndcg_at_5 value: 16.309 - type: precision_at_1 value: 8.108 - type: precision_at_10 value: 3.198 - type: precision_at_100 value: 0.626 - type: precision_at_1000 value: 0.086 - type: precision_at_3 value: 6.006 - type: precision_at_5 value: 4.865 - type: recall_at_1 value: 8.108 - type: recall_at_10 value: 31.982 - type: recall_at_100 value: 62.613 - type: recall_at_1000 value: 86.036 - type: recall_at_3 value: 18.018 - 
type: recall_at_5 value: 24.324 - task: type: Clustering dataset: name: MTEB HALClusteringS2S type: lyon-nlp/clustering-hal-s2s config: default split: test revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915 metrics: - type: v_measure value: 30.833269778867116 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P type: mlsum config: default split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: v_measure value: 50.0281928004713 - type: v_measure value: 43.699961510636534 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 96.68963357344191 - type: f1 value: 96.45175170820961 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 87.46946445349202 - type: f1 value: 65.79860440988624 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: accuracy value: 82.60663507109005 - type: f1 value: 77.20462646604777 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: v_measure value: 60.19311264967803 - type: v_measure value: 63.6235764409785 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 81.65097511768661 - type: f1 value: 78.77796091490924 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fr) type: mteb/amazon_massive_scenario config: fr split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 86.64425016812373 - type: f1 value: 85.4912728670017 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (fr) type: jinaai/mintakaqa config: fr split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: map_at_1 value: 35.913000000000004 - type: map_at_10 value: 48.147 - type: map_at_100 value: 48.91 - type: map_at_1000 value: 48.949 - type: map_at_3 value: 45.269999999999996 - type: map_at_5 value: 47.115 - type: mrr_at_1 value: 35.913000000000004 - type: mrr_at_10 value: 48.147 - type: mrr_at_100 value: 48.91 - type: mrr_at_1000 value: 48.949 - type: mrr_at_3 value: 45.269999999999996 - type: mrr_at_5 value: 47.115 - type: ndcg_at_1 value: 35.913000000000004 - type: ndcg_at_10 value: 54.03 - type: ndcg_at_100 value: 57.839 - type: ndcg_at_1000 value: 58.925000000000004 - type: ndcg_at_3 value: 48.217999999999996 - type: ndcg_at_5 value: 51.56699999999999 - type: precision_at_1 value: 35.913000000000004 - type: precision_at_10 value: 7.244000000000001 - type: precision_at_100 value: 0.9039999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 18.905 - type: precision_at_5 value: 12.981000000000002 - type: recall_at_1 value: 35.913000000000004 - type: recall_at_10 value: 72.441 - type: recall_at_100 value: 90.41799999999999 - type: recall_at_1000 value: 99.099 - type: recall_at_3 value: 56.716 - type: recall_at_5 value: 64.90599999999999 - task: type: PairClassification dataset: name: MTEB 
OpusparcusPC (fr) type: GEM/opusparcus config: fr split: test revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cos_sim_accuracy value: 99.90069513406156 - type: cos_sim_ap value: 100.0 - type: cos_sim_f1 value: 99.95032290114257 - type: cos_sim_precision value: 100.0 - type: cos_sim_recall value: 99.90069513406156 - type: dot_accuracy value: 99.90069513406156 - type: dot_ap value: 100.0 - type: dot_f1 value: 99.95032290114257 - type: dot_precision value: 100.0 - type: dot_recall value: 99.90069513406156 - type: euclidean_accuracy value: 99.90069513406156 - type: euclidean_ap value: 100.0 - type: euclidean_f1 value: 99.95032290114257 - type: euclidean_precision value: 100.0 - type: euclidean_recall value: 99.90069513406156 - type: manhattan_accuracy value: 99.90069513406156 - type: manhattan_ap value: 100.0 - type: manhattan_f1 value: 99.95032290114257 - type: manhattan_precision value: 100.0 - type: manhattan_recall value: 99.90069513406156 - type: max_accuracy value: 99.90069513406156 - type: max_ap value: 100.0 - type: max_f1 value: 99.95032290114257 - task: type: PairClassification dataset: name: MTEB PawsX (fr) type: paws-x config: fr split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cos_sim_accuracy value: 75.25 - type: cos_sim_ap value: 80.86376001270014 - type: cos_sim_f1 value: 73.65945437441204 - type: cos_sim_precision value: 64.02289452166802 - type: cos_sim_recall value: 86.71096345514951 - type: dot_accuracy value: 75.25 - type: dot_ap value: 80.93686107633002 - type: dot_f1 value: 73.65945437441204 - type: dot_precision value: 64.02289452166802 - type: dot_recall value: 86.71096345514951 - type: euclidean_accuracy value: 75.25 - type: euclidean_ap value: 80.86379136218862 - type: euclidean_f1 value: 73.65945437441204 - type: euclidean_precision value: 64.02289452166802 - type: euclidean_recall value: 86.71096345514951 - type: manhattan_accuracy value: 75.3 - type: manhattan_ap value: 80.87826606097734 - type: manhattan_f1 value: 73.68421052631581 - type: manhattan_precision value: 64.0 - type: manhattan_recall value: 86.82170542635659 - type: max_accuracy value: 75.3 - type: max_ap value: 80.93686107633002 - type: max_f1 value: 73.68421052631581 - task: type: STS dataset: name: MTEB SICKFr type: Lajavaness/SICK-fr config: default split: test revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a metrics: - type: cos_sim_pearson value: 81.42349425981143 - type: cos_sim_spearman value: 78.90454327031226 - type: euclidean_pearson value: 78.39086497435166 - type: euclidean_spearman value: 78.9046133980509 - type: manhattan_pearson value: 78.63743094286502 - type: manhattan_spearman value: 79.12136348449269 - task: type: STS dataset: name: MTEB STS22 (fr) type: mteb/sts22-crosslingual-sts config: fr split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 81.452697919749 - type: cos_sim_spearman value: 82.58116836039301 - type: euclidean_pearson value: 81.04038478932786 - type: euclidean_spearman value: 82.58116836039301 - type: manhattan_pearson value: 81.37075396187771 - type: manhattan_spearman value: 82.73678231355368 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (fr) type: stsb_multi_mt config: fr split: test revision: 93d57ef91790589e3ce9c365164337a8a78b7632 metrics: - type: cos_sim_pearson value: 85.7419764013806 - type: cos_sim_spearman value: 85.46085808849622 - type: euclidean_pearson value: 83.70449639870063 - type: euclidean_spearman value: 85.46159013076233 - 
type: manhattan_pearson value: 83.95259510313929 - type: manhattan_spearman value: 85.8029724659458 - task: type: Summarization dataset: name: MTEB SummEvalFr type: lyon-nlp/summarization-summeval-fr-p2p config: default split: test revision: b385812de6a9577b6f4d0f88c6a6e35395a94054 metrics: - type: cos_sim_pearson value: 32.61063271753325 - type: cos_sim_spearman value: 31.454589417353603 - type: dot_pearson value: 32.6106288643431 - type: dot_spearman value: 31.454589417353603 - task: type: Reranking dataset: name: MTEB SyntecReranking type: lyon-nlp/mteb-fr-reranking-syntec-s2p config: default split: test revision: b205c5084a0934ce8af14338bf03feb19499c84d metrics: - type: map value: 84.31666666666666 - type: mrr value: 84.31666666666666 - task: type: Retrieval dataset: name: MTEB SyntecRetrieval type: lyon-nlp/mteb-fr-retrieval-syntec-s2p config: default split: test revision: 77f7e271bf4a92b24fce5119f3486b583ca016ff metrics: - type: map_at_1 value: 63.0 - type: map_at_10 value: 73.471 - type: map_at_100 value: 73.87 - type: map_at_1000 value: 73.87 - type: map_at_3 value: 70.5 - type: map_at_5 value: 73.05 - type: mrr_at_1 value: 63.0 - type: mrr_at_10 value: 73.471 - type: mrr_at_100 value: 73.87 - type: mrr_at_1000 value: 73.87 - type: mrr_at_3 value: 70.5 - type: mrr_at_5 value: 73.05 - type: ndcg_at_1 value: 63.0 - type: ndcg_at_10 value: 78.255 - type: ndcg_at_100 value: 79.88 - type: ndcg_at_1000 value: 79.88 - type: ndcg_at_3 value: 72.702 - type: ndcg_at_5 value: 77.264 - type: precision_at_1 value: 63.0 - type: precision_at_10 value: 9.3 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 26.333000000000002 - type: precision_at_5 value: 18.0 - type: recall_at_1 value: 63.0 - type: recall_at_10 value: 93.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 79.0 - type: recall_at_5 value: 90.0 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fr) type: jinaai/xpqa config: fr split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: map_at_1 value: 40.338 - type: map_at_10 value: 61.927 - type: map_at_100 value: 63.361999999999995 - type: map_at_1000 value: 63.405 - type: map_at_3 value: 55.479 - type: map_at_5 value: 59.732 - type: mrr_at_1 value: 63.551 - type: mrr_at_10 value: 71.006 - type: mrr_at_100 value: 71.501 - type: mrr_at_1000 value: 71.509 - type: mrr_at_3 value: 69.07 - type: mrr_at_5 value: 70.165 - type: ndcg_at_1 value: 63.551 - type: ndcg_at_10 value: 68.297 - type: ndcg_at_100 value: 73.13199999999999 - type: ndcg_at_1000 value: 73.751 - type: ndcg_at_3 value: 62.999 - type: ndcg_at_5 value: 64.89 - type: precision_at_1 value: 63.551 - type: precision_at_10 value: 15.661 - type: precision_at_100 value: 1.9789999999999999 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 38.273 - type: precision_at_5 value: 27.61 - type: recall_at_1 value: 40.338 - type: recall_at_10 value: 77.267 - type: recall_at_100 value: 95.892 - type: recall_at_1000 value: 99.75500000000001 - type: recall_at_3 value: 60.36 - type: recall_at_5 value: 68.825 - task: type: Clustering dataset: name: MTEB 8TagsClustering type: PL-MTEB/8tags-clustering config: default split: test revision: None metrics: - type: v_measure value: 51.36126303874126 - task: type: Classification dataset: name: MTEB AllegroReviews type: PL-MTEB/allegro-reviews config: default split: test revision: None metrics: - type: accuracy value: 67.13717693836979 - type: f1 value: 
57.27609848003782 - task: type: Retrieval dataset: name: MTEB ArguAna-PL type: clarin-knext/arguana-pl config: default split: test revision: 63fc86750af76253e8c760fc9e534bbf24d260a2 metrics: - type: map_at_1 value: 35.276999999999994 - type: map_at_10 value: 51.086 - type: map_at_100 value: 51.788000000000004 - type: map_at_1000 value: 51.791 - type: map_at_3 value: 46.147 - type: map_at_5 value: 49.078 - type: mrr_at_1 value: 35.917 - type: mrr_at_10 value: 51.315999999999995 - type: mrr_at_100 value: 52.018 - type: mrr_at_1000 value: 52.022 - type: mrr_at_3 value: 46.349000000000004 - type: mrr_at_5 value: 49.297000000000004 - type: ndcg_at_1 value: 35.276999999999994 - type: ndcg_at_10 value: 59.870999999999995 - type: ndcg_at_100 value: 62.590999999999994 - type: ndcg_at_1000 value: 62.661 - type: ndcg_at_3 value: 49.745 - type: ndcg_at_5 value: 55.067 - type: precision_at_1 value: 35.276999999999994 - type: precision_at_10 value: 8.791 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 20.057 - type: precision_at_5 value: 14.637 - type: recall_at_1 value: 35.276999999999994 - type: recall_at_10 value: 87.909 - type: recall_at_100 value: 99.14699999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 60.171 - type: recall_at_5 value: 73.18599999999999 - task: type: Classification dataset: name: MTEB CBD type: PL-MTEB/cbd config: default split: test revision: None metrics: - type: accuracy value: 78.03000000000002 - type: ap value: 29.12548553897622 - type: f1 value: 66.54857118886073 - task: type: PairClassification dataset: name: MTEB CDSC-E type: PL-MTEB/cdsce-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 89.0 - type: cos_sim_ap value: 76.75437826834582 - type: cos_sim_f1 value: 66.4850136239782 - type: cos_sim_precision value: 68.92655367231639 - type: cos_sim_recall value: 64.21052631578948 - type: dot_accuracy value: 89.0 - type: dot_ap value: 76.75437826834582 - type: dot_f1 value: 66.4850136239782 - type: dot_precision value: 68.92655367231639 - type: dot_recall value: 64.21052631578948 - type: euclidean_accuracy value: 89.0 - type: euclidean_ap value: 76.75437826834582 - type: euclidean_f1 value: 66.4850136239782 - type: euclidean_precision value: 68.92655367231639 - type: euclidean_recall value: 64.21052631578948 - type: manhattan_accuracy value: 89.0 - type: manhattan_ap value: 76.66074220647083 - type: manhattan_f1 value: 66.47058823529412 - type: manhattan_precision value: 75.33333333333333 - type: manhattan_recall value: 59.473684210526315 - type: max_accuracy value: 89.0 - type: max_ap value: 76.75437826834582 - type: max_f1 value: 66.4850136239782 - task: type: STS dataset: name: MTEB CDSC-R type: PL-MTEB/cdscr-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 93.12903172428328 - type: cos_sim_spearman value: 92.66381487060741 - type: euclidean_pearson value: 90.37278396708922 - type: euclidean_spearman value: 92.66381487060741 - type: manhattan_pearson value: 90.32503296540962 - type: manhattan_spearman value: 92.6902938354313 - task: type: Retrieval dataset: name: MTEB DBPedia-PL type: clarin-knext/dbpedia-pl config: default split: test revision: 76afe41d9af165cc40999fcaa92312b8b012064a metrics: - type: map_at_1 value: 8.83 - type: map_at_10 value: 18.326 - type: map_at_100 value: 26.496 - type: map_at_1000 value: 28.455000000000002 - type: map_at_3 value: 12.933 - type: map_at_5 value: 15.168000000000001 
- type: mrr_at_1 value: 66.0 - type: mrr_at_10 value: 72.76700000000001 - type: mrr_at_100 value: 73.203 - type: mrr_at_1000 value: 73.219 - type: mrr_at_3 value: 71.458 - type: mrr_at_5 value: 72.246 - type: ndcg_at_1 value: 55.375 - type: ndcg_at_10 value: 41.3 - type: ndcg_at_100 value: 45.891 - type: ndcg_at_1000 value: 52.905 - type: ndcg_at_3 value: 46.472 - type: ndcg_at_5 value: 43.734 - type: precision_at_1 value: 66.0 - type: precision_at_10 value: 33.074999999999996 - type: precision_at_100 value: 11.094999999999999 - type: precision_at_1000 value: 2.374 - type: precision_at_3 value: 48.583 - type: precision_at_5 value: 42.0 - type: recall_at_1 value: 8.83 - type: recall_at_10 value: 22.587 - type: recall_at_100 value: 50.61600000000001 - type: recall_at_1000 value: 73.559 - type: recall_at_3 value: 13.688 - type: recall_at_5 value: 16.855 - task: type: Retrieval dataset: name: MTEB FiQA-PL type: clarin-knext/fiqa-pl config: default split: test revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e metrics: - type: map_at_1 value: 20.587 - type: map_at_10 value: 33.095 - type: map_at_100 value: 35.24 - type: map_at_1000 value: 35.429 - type: map_at_3 value: 28.626 - type: map_at_5 value: 31.136999999999997 - type: mrr_at_1 value: 40.586 - type: mrr_at_10 value: 49.033 - type: mrr_at_100 value: 49.952999999999996 - type: mrr_at_1000 value: 49.992 - type: mrr_at_3 value: 46.553 - type: mrr_at_5 value: 48.035 - type: ndcg_at_1 value: 40.586 - type: ndcg_at_10 value: 41.046 - type: ndcg_at_100 value: 48.586 - type: ndcg_at_1000 value: 51.634 - type: ndcg_at_3 value: 36.773 - type: ndcg_at_5 value: 38.389 - type: precision_at_1 value: 40.586 - type: precision_at_10 value: 11.466 - type: precision_at_100 value: 1.909 - type: precision_at_1000 value: 0.245 - type: precision_at_3 value: 24.434 - type: precision_at_5 value: 18.426000000000002 - type: recall_at_1 value: 20.587 - type: recall_at_10 value: 47.986000000000004 - type: recall_at_100 value: 75.761 - type: recall_at_1000 value: 94.065 - type: recall_at_3 value: 33.339 - type: recall_at_5 value: 39.765 - task: type: Retrieval dataset: name: MTEB HotpotQA-PL type: clarin-knext/hotpotqa-pl config: default split: test revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907 metrics: - type: map_at_1 value: 40.878 - type: map_at_10 value: 58.775999999999996 - type: map_at_100 value: 59.632 - type: map_at_1000 value: 59.707 - type: map_at_3 value: 56.074 - type: map_at_5 value: 57.629 - type: mrr_at_1 value: 81.756 - type: mrr_at_10 value: 86.117 - type: mrr_at_100 value: 86.299 - type: mrr_at_1000 value: 86.30600000000001 - type: mrr_at_3 value: 85.345 - type: mrr_at_5 value: 85.832 - type: ndcg_at_1 value: 81.756 - type: ndcg_at_10 value: 67.608 - type: ndcg_at_100 value: 70.575 - type: ndcg_at_1000 value: 71.99600000000001 - type: ndcg_at_3 value: 63.723 - type: ndcg_at_5 value: 65.70700000000001 - type: precision_at_1 value: 81.756 - type: precision_at_10 value: 13.619 - type: precision_at_100 value: 1.5939999999999999 - type: precision_at_1000 value: 0.178 - type: precision_at_3 value: 39.604 - type: precision_at_5 value: 25.332 - type: recall_at_1 value: 40.878 - type: recall_at_10 value: 68.096 - type: recall_at_100 value: 79.696 - type: recall_at_1000 value: 89.082 - type: recall_at_3 value: 59.406000000000006 - type: recall_at_5 value: 63.329 - task: type: Retrieval dataset: name: MTEB MSMARCO-PL type: clarin-knext/msmarco-pl config: default split: test revision: 8634c07806d5cce3a6138e260e59b81760a0a640 metrics: - type: map_at_1 value: 
2.1839999999999997 - type: map_at_10 value: 11.346 - type: map_at_100 value: 30.325000000000003 - type: map_at_1000 value: 37.806 - type: map_at_3 value: 4.842 - type: map_at_5 value: 6.891 - type: mrr_at_1 value: 86.047 - type: mrr_at_10 value: 89.14699999999999 - type: mrr_at_100 value: 89.46600000000001 - type: mrr_at_1000 value: 89.46600000000001 - type: mrr_at_3 value: 89.14699999999999 - type: mrr_at_5 value: 89.14699999999999 - type: ndcg_at_1 value: 67.829 - type: ndcg_at_10 value: 62.222 - type: ndcg_at_100 value: 55.337 - type: ndcg_at_1000 value: 64.076 - type: ndcg_at_3 value: 68.12700000000001 - type: ndcg_at_5 value: 64.987 - type: precision_at_1 value: 86.047 - type: precision_at_10 value: 69.535 - type: precision_at_100 value: 32.93 - type: precision_at_1000 value: 6.6049999999999995 - type: precision_at_3 value: 79.845 - type: precision_at_5 value: 75.349 - type: recall_at_1 value: 2.1839999999999997 - type: recall_at_10 value: 12.866 - type: recall_at_100 value: 43.505 - type: recall_at_1000 value: 72.366 - type: recall_at_3 value: 4.947 - type: recall_at_5 value: 7.192 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 80.75319435104238 - type: f1 value: 77.58961444860606 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 85.54472091459313 - type: f1 value: 84.29498563572106 - task: type: Retrieval dataset: name: MTEB NFCorpus-PL type: clarin-knext/nfcorpus-pl config: default split: test revision: 9a6f9567fda928260afed2de480d79c98bf0bec0 metrics: - type: map_at_1 value: 4.367 - type: map_at_10 value: 10.38 - type: map_at_100 value: 13.516 - type: map_at_1000 value: 14.982000000000001 - type: map_at_3 value: 7.367 - type: map_at_5 value: 8.59 - type: mrr_at_1 value: 41.486000000000004 - type: mrr_at_10 value: 48.886 - type: mrr_at_100 value: 49.657000000000004 - type: mrr_at_1000 value: 49.713 - type: mrr_at_3 value: 46.904 - type: mrr_at_5 value: 48.065000000000005 - type: ndcg_at_1 value: 40.402 - type: ndcg_at_10 value: 30.885 - type: ndcg_at_100 value: 28.393 - type: ndcg_at_1000 value: 37.428 - type: ndcg_at_3 value: 35.394999999999996 - type: ndcg_at_5 value: 33.391999999999996 - type: precision_at_1 value: 41.486000000000004 - type: precision_at_10 value: 23.437 - type: precision_at_100 value: 7.638 - type: precision_at_1000 value: 2.0389999999999997 - type: precision_at_3 value: 32.817 - type: precision_at_5 value: 28.915999999999997 - type: recall_at_1 value: 4.367 - type: recall_at_10 value: 14.655000000000001 - type: recall_at_100 value: 29.665999999999997 - type: recall_at_1000 value: 62.073 - type: recall_at_3 value: 8.51 - type: recall_at_5 value: 10.689 - task: type: Retrieval dataset: name: MTEB NQ-PL type: clarin-knext/nq-pl config: default split: test revision: f171245712cf85dd4700b06bef18001578d0ca8d metrics: - type: map_at_1 value: 28.616000000000003 - type: map_at_10 value: 41.626000000000005 - type: map_at_100 value: 42.689 - type: map_at_1000 value: 42.733 - type: map_at_3 value: 37.729 - type: map_at_5 value: 39.879999999999995 - type: mrr_at_1 value: 32.068000000000005 - type: mrr_at_10 value: 44.029 - type: mrr_at_100 value: 44.87 - type: mrr_at_1000 value: 44.901 - type: mrr_at_3 value: 40.687 - type: 
mrr_at_5 value: 42.625 - type: ndcg_at_1 value: 32.068000000000005 - type: ndcg_at_10 value: 48.449999999999996 - type: ndcg_at_100 value: 53.13 - type: ndcg_at_1000 value: 54.186 - type: ndcg_at_3 value: 40.983999999999995 - type: ndcg_at_5 value: 44.628 - type: precision_at_1 value: 32.068000000000005 - type: precision_at_10 value: 7.9750000000000005 - type: precision_at_100 value: 1.061 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 18.404999999999998 - type: precision_at_5 value: 13.111 - type: recall_at_1 value: 28.616000000000003 - type: recall_at_10 value: 66.956 - type: recall_at_100 value: 87.657 - type: recall_at_1000 value: 95.548 - type: recall_at_3 value: 47.453 - type: recall_at_5 value: 55.87800000000001 - task: type: Classification dataset: name: MTEB PAC type: laugustyniak/abusive-clauses-pl config: default split: test revision: None metrics: - type: accuracy value: 69.04141326382856 - type: ap value: 77.47589122111044 - type: f1 value: 66.6332277374775 - task: type: PairClassification dataset: name: MTEB PPC type: PL-MTEB/ppc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 86.4 - type: cos_sim_ap value: 94.1044939667201 - type: cos_sim_f1 value: 88.78048780487805 - type: cos_sim_precision value: 87.22044728434504 - type: cos_sim_recall value: 90.39735099337747 - type: dot_accuracy value: 86.4 - type: dot_ap value: 94.1044939667201 - type: dot_f1 value: 88.78048780487805 - type: dot_precision value: 87.22044728434504 - type: dot_recall value: 90.39735099337747 - type: euclidean_accuracy value: 86.4 - type: euclidean_ap value: 94.1044939667201 - type: euclidean_f1 value: 88.78048780487805 - type: euclidean_precision value: 87.22044728434504 - type: euclidean_recall value: 90.39735099337747 - type: manhattan_accuracy value: 86.4 - type: manhattan_ap value: 94.11438365697387 - type: manhattan_f1 value: 88.77968877968877 - type: manhattan_precision value: 87.84440842787681 - type: manhattan_recall value: 89.73509933774835 - type: max_accuracy value: 86.4 - type: max_ap value: 94.11438365697387 - type: max_f1 value: 88.78048780487805 - task: type: PairClassification dataset: name: MTEB PSC type: PL-MTEB/psc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 97.86641929499072 - type: cos_sim_ap value: 99.36904211868182 - type: cos_sim_f1 value: 96.56203288490283 - type: cos_sim_precision value: 94.72140762463343 - type: cos_sim_recall value: 98.47560975609755 - type: dot_accuracy value: 97.86641929499072 - type: dot_ap value: 99.36904211868183 - type: dot_f1 value: 96.56203288490283 - type: dot_precision value: 94.72140762463343 - type: dot_recall value: 98.47560975609755 - type: euclidean_accuracy value: 97.86641929499072 - type: euclidean_ap value: 99.36904211868183 - type: euclidean_f1 value: 96.56203288490283 - type: euclidean_precision value: 94.72140762463343 - type: euclidean_recall value: 98.47560975609755 - type: manhattan_accuracy value: 98.14471243042672 - type: manhattan_ap value: 99.43359540492416 - type: manhattan_f1 value: 96.98795180722892 - type: manhattan_precision value: 95.83333333333334 - type: manhattan_recall value: 98.17073170731707 - type: max_accuracy value: 98.14471243042672 - type: max_ap value: 99.43359540492416 - type: max_f1 value: 96.98795180722892 - task: type: Classification dataset: name: MTEB PolEmo2.0-IN type: PL-MTEB/polemo2_in config: default split: test revision: None metrics: - type: accuracy value: 
89.39058171745152 - type: f1 value: 86.8552093529568 - task: type: Classification dataset: name: MTEB PolEmo2.0-OUT type: PL-MTEB/polemo2_out config: default split: test revision: None metrics: - type: accuracy value: 74.97975708502024 - type: f1 value: 58.73081628832407 - task: type: Retrieval dataset: name: MTEB Quora-PL type: clarin-knext/quora-pl config: default split: test revision: 0be27e93455051e531182b85e85e425aba12e9d4 metrics: - type: map_at_1 value: 64.917 - type: map_at_10 value: 78.74600000000001 - type: map_at_100 value: 79.501 - type: map_at_1000 value: 79.524 - type: map_at_3 value: 75.549 - type: map_at_5 value: 77.495 - type: mrr_at_1 value: 74.9 - type: mrr_at_10 value: 82.112 - type: mrr_at_100 value: 82.314 - type: mrr_at_1000 value: 82.317 - type: mrr_at_3 value: 80.745 - type: mrr_at_5 value: 81.607 - type: ndcg_at_1 value: 74.83999999999999 - type: ndcg_at_10 value: 83.214 - type: ndcg_at_100 value: 84.997 - type: ndcg_at_1000 value: 85.207 - type: ndcg_at_3 value: 79.547 - type: ndcg_at_5 value: 81.46600000000001 - type: precision_at_1 value: 74.83999999999999 - type: precision_at_10 value: 12.822 - type: precision_at_100 value: 1.506 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 34.903 - type: precision_at_5 value: 23.16 - type: recall_at_1 value: 64.917 - type: recall_at_10 value: 92.27199999999999 - type: recall_at_100 value: 98.715 - type: recall_at_1000 value: 99.854 - type: recall_at_3 value: 82.04599999999999 - type: recall_at_5 value: 87.2 - task: type: Retrieval dataset: name: MTEB SCIDOCS-PL type: clarin-knext/scidocs-pl config: default split: test revision: 45452b03f05560207ef19149545f168e596c9337 metrics: - type: map_at_1 value: 3.51 - type: map_at_10 value: 9.046999999999999 - type: map_at_100 value: 10.823 - type: map_at_1000 value: 11.144 - type: map_at_3 value: 6.257 - type: map_at_5 value: 7.648000000000001 - type: mrr_at_1 value: 17.299999999999997 - type: mrr_at_10 value: 27.419 - type: mrr_at_100 value: 28.618 - type: mrr_at_1000 value: 28.685 - type: mrr_at_3 value: 23.817 - type: mrr_at_5 value: 25.927 - type: ndcg_at_1 value: 17.299999999999997 - type: ndcg_at_10 value: 16.084 - type: ndcg_at_100 value: 23.729 - type: ndcg_at_1000 value: 29.476999999999997 - type: ndcg_at_3 value: 14.327000000000002 - type: ndcg_at_5 value: 13.017999999999999 - type: precision_at_1 value: 17.299999999999997 - type: precision_at_10 value: 8.63 - type: precision_at_100 value: 1.981 - type: precision_at_1000 value: 0.336 - type: precision_at_3 value: 13.4 - type: precision_at_5 value: 11.700000000000001 - type: recall_at_1 value: 3.51 - type: recall_at_10 value: 17.518 - type: recall_at_100 value: 40.275 - type: recall_at_1000 value: 68.203 - type: recall_at_3 value: 8.155 - type: recall_at_5 value: 11.875 - task: type: PairClassification dataset: name: MTEB SICK-E-PL type: PL-MTEB/sicke-pl-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 86.30248675091724 - type: cos_sim_ap value: 83.6756734006714 - type: cos_sim_f1 value: 74.97367497367497 - type: cos_sim_precision value: 73.91003460207612 - type: cos_sim_recall value: 76.06837606837607 - type: dot_accuracy value: 86.30248675091724 - type: dot_ap value: 83.6756734006714 - type: dot_f1 value: 74.97367497367497 - type: dot_precision value: 73.91003460207612 - type: dot_recall value: 76.06837606837607 - type: euclidean_accuracy value: 86.30248675091724 - type: euclidean_ap value: 83.67566984333091 - type: euclidean_f1 value: 
74.97367497367497 - type: euclidean_precision value: 73.91003460207612 - type: euclidean_recall value: 76.06837606837607 - type: manhattan_accuracy value: 86.28210354667753 - type: manhattan_ap value: 83.64216119130171 - type: manhattan_f1 value: 74.92152075340078 - type: manhattan_precision value: 73.4107997265892 - type: manhattan_recall value: 76.49572649572649 - type: max_accuracy value: 86.30248675091724 - type: max_ap value: 83.6756734006714 - type: max_f1 value: 74.97367497367497 - task: type: STS dataset: name: MTEB SICK-R-PL type: PL-MTEB/sickr-pl-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 82.23295940859121 - type: cos_sim_spearman value: 78.89329160768719 - type: euclidean_pearson value: 79.56019107076818 - type: euclidean_spearman value: 78.89330209904084 - type: manhattan_pearson value: 79.76098513973719 - type: manhattan_spearman value: 79.05490162570123 - task: type: STS dataset: name: MTEB STS22 (pl) type: mteb/sts22-crosslingual-sts config: pl split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 37.732606308062486 - type: cos_sim_spearman value: 41.01645667030284 - type: euclidean_pearson value: 26.61722556367085 - type: euclidean_spearman value: 41.01645667030284 - type: manhattan_pearson value: 26.60917378970807 - type: manhattan_spearman value: 41.51335727617614 - task: type: Retrieval dataset: name: MTEB SciFact-PL type: clarin-knext/scifact-pl config: default split: test revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e metrics: - type: map_at_1 value: 54.31700000000001 - type: map_at_10 value: 65.564 - type: map_at_100 value: 66.062 - type: map_at_1000 value: 66.08699999999999 - type: map_at_3 value: 62.592999999999996 - type: map_at_5 value: 63.888 - type: mrr_at_1 value: 56.99999999999999 - type: mrr_at_10 value: 66.412 - type: mrr_at_100 value: 66.85900000000001 - type: mrr_at_1000 value: 66.88 - type: mrr_at_3 value: 64.22200000000001 - type: mrr_at_5 value: 65.206 - type: ndcg_at_1 value: 56.99999999999999 - type: ndcg_at_10 value: 70.577 - type: ndcg_at_100 value: 72.879 - type: ndcg_at_1000 value: 73.45 - type: ndcg_at_3 value: 65.5 - type: ndcg_at_5 value: 67.278 - type: precision_at_1 value: 56.99999999999999 - type: precision_at_10 value: 9.667 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 26.0 - type: precision_at_5 value: 16.933 - type: recall_at_1 value: 54.31700000000001 - type: recall_at_10 value: 85.056 - type: recall_at_100 value: 95.667 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 71.0 - type: recall_at_5 value: 75.672 - task: type: Retrieval dataset: name: MTEB TRECCOVID-PL type: clarin-knext/trec-covid-pl config: default split: test revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd metrics: - type: map_at_1 value: 0.245 - type: map_at_10 value: 2.051 - type: map_at_100 value: 12.009 - type: map_at_1000 value: 27.448 - type: map_at_3 value: 0.721 - type: map_at_5 value: 1.13 - type: mrr_at_1 value: 88.0 - type: mrr_at_10 value: 93.0 - type: mrr_at_100 value: 93.0 - type: mrr_at_1000 value: 93.0 - type: mrr_at_3 value: 93.0 - type: mrr_at_5 value: 93.0 - type: ndcg_at_1 value: 85.0 - type: ndcg_at_10 value: 80.303 - type: ndcg_at_100 value: 61.23499999999999 - type: ndcg_at_1000 value: 52.978 - type: ndcg_at_3 value: 84.419 - type: ndcg_at_5 value: 82.976 - type: precision_at_1 value: 88.0 - type: precision_at_10 value: 83.39999999999999 - type: precision_at_100 
value: 61.96 - type: precision_at_1000 value: 22.648 - type: precision_at_3 value: 89.333 - type: precision_at_5 value: 87.2 - type: recall_at_1 value: 0.245 - type: recall_at_10 value: 2.193 - type: recall_at_100 value: 14.938 - type: recall_at_1000 value: 48.563 - type: recall_at_3 value: 0.738 - type: recall_at_5 value: 1.173
---

# bnightning/gte-Qwen2-7B-instruct-Q4_K_M-GGUF

This model was converted to GGUF format from [`Alibaba-NLP/gte-Qwen2-7B-instruct`](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) for more details on the model.

## Use with llama.cpp

Install llama.cpp through brew (works on Mac and Linux):

```bash
brew install llama.cpp
```

Invoke the llama.cpp server or the CLI.

### CLI:
```bash
llama-cli --hf-repo bnightning/gte-Qwen2-7B-instruct-Q4_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q4_k_m.gguf -p "The meaning to life and the universe is"
```

### Server:
```bash
llama-server --hf-repo bnightning/gte-Qwen2-7B-instruct-Q4_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q4_k_m.gguf -c 2048
```

Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.

Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```

Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```

Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo bnightning/gte-Qwen2-7B-instruct-Q4_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo bnightning/gte-Qwen2-7B-instruct-Q4_K_M-GGUF --hf-file gte-qwen2-7b-instruct-q4_k_m.gguf -c 2048
```
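Keep in mind that gte-Qwen2-7B-instruct is primarily a text-embedding model (as the MTEB results above suggest), so in most cases you will want embedding vectors rather than generated text. Below is a minimal sketch of extracting embeddings with llama.cpp; it assumes your build ships the `llama-embedding` tool and that it accepts the same `--hf-repo`/`--hf-file` convenience flags as `llama-cli` above.

```bash
# Sketch only: assumes the llama-embedding binary from your llama.cpp build
# and that it supports the --hf-repo/--hf-file flags shown above.
# --embd-normalize 2 requests L2-normalized vectors, which is the usual
# choice for cosine-similarity retrieval.
./llama-embedding --hf-repo bnightning/gte-Qwen2-7B-instruct-Q4_K_M-GGUF \
  --hf-file gte-qwen2-7b-instruct-q4_k_m.gguf \
  -p "What is the capital of China?" \
  --embd-normalize 2
```

Alternatively, recent llama-server builds can serve embeddings over HTTP when started with an embeddings flag; check your build's `--help` output for the exact flag and endpoint names.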
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
twadada/nmc-nignore15
twadada
null
[ "mteb", "model-index", "region:us" ]
1,726
1,726
0
0
--- tags: - mteb model-index: - name: nomic_classification_nignore15 results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: None config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 73.01492537313432 - type: ap value: 35.50250124630455 - type: f1 value: 66.89959317702703 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: None config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 64.84830000000001 - type: ap value: 59.73270245283254 - type: f1 value: 64.76353235413379 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: None config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 34.105999999999995 - type: f1 value: 33.54422658625557 - task: type: Retrieval dataset: name: MTEB ArguAna type: None config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 21.622 - type: map_at_10 value: 35.826 - type: map_at_100 value: 37.053000000000004 - type: map_at_1000 value: 37.074 - type: map_at_3 value: 31.211 - type: map_at_5 value: 33.921 - type: mrr_at_1 value: 22.262 - type: mrr_at_10 value: 36.036 - type: mrr_at_100 value: 37.263000000000005 - type: mrr_at_1000 value: 37.284 - type: mrr_at_3 value: 31.484 - type: mrr_at_5 value: 34.144000000000005 - type: ndcg_at_1 value: 21.622 - type: ndcg_at_10 value: 43.922 - type: ndcg_at_100 value: 49.506 - type: ndcg_at_1000 value: 50.009 - type: ndcg_at_3 value: 34.372 - type: ndcg_at_5 value: 39.275 - type: precision_at_1 value: 21.622 - type: precision_at_10 value: 6.991 - type: precision_at_100 value: 0.9520000000000001 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 14.509 - type: precision_at_5 value: 11.094999999999999 - type: recall_at_1 value: 21.622 - type: recall_at_10 value: 69.915 - type: recall_at_100 value: 95.235 - type: recall_at_1000 value: 99.075 - type: recall_at_3 value: 43.528 - type: recall_at_5 value: 55.477 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: None config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 34.911008985215766 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: None config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 24.405265668502622 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: None config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 54.19334227044753 - type: mrr value: 69.16600712307083 - task: type: STS dataset: name: MTEB BIOSSES type: None config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 82.93657061818256 - type: cos_sim_spearman value: 80.1584529707527 - type: euclidean_pearson value: 81.85076602737361 - type: euclidean_spearman value: 80.1584529707527 - type: manhattan_pearson value: 81.32446368945303 - type: manhattan_spearman value: 80.35183087097523 - task: type: Classification dataset: name: MTEB Banking77Classification type: None config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 72.75974025974025 - type: f1 value: 72.01384314530861 - task: type: Clustering dataset: name: MTEB 
BiorxivClusteringP2P type: None config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 32.205913552227386 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: None config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 23.518894282858902 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: None config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 23.200000000000003 - type: map_at_10 value: 30.767 - type: map_at_100 value: 31.909 - type: map_at_1000 value: 32.054 - type: map_at_3 value: 28.048000000000002 - type: map_at_5 value: 29.668 - type: mrr_at_1 value: 29.185 - type: mrr_at_10 value: 36.361 - type: mrr_at_100 value: 37.212 - type: mrr_at_1000 value: 37.275999999999996 - type: mrr_at_3 value: 34.168 - type: mrr_at_5 value: 35.463 - type: ndcg_at_1 value: 29.185 - type: ndcg_at_10 value: 35.735 - type: ndcg_at_100 value: 40.884 - type: ndcg_at_1000 value: 43.887 - type: ndcg_at_3 value: 31.839000000000002 - type: ndcg_at_5 value: 33.759 - type: precision_at_1 value: 29.185 - type: precision_at_10 value: 6.723999999999999 - type: precision_at_100 value: 1.163 - type: precision_at_1000 value: 0.173 - type: precision_at_3 value: 15.165000000000001 - type: precision_at_5 value: 11.101999999999999 - type: recall_at_1 value: 23.200000000000003 - type: recall_at_10 value: 44.68 - type: recall_at_100 value: 67.47999999999999 - type: recall_at_1000 value: 88.152 - type: recall_at_3 value: 33.055 - type: recall_at_5 value: 38.481 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: None config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 17.410999999999998 - type: map_at_10 value: 23.605999999999998 - type: map_at_100 value: 24.532 - type: map_at_1000 value: 24.645 - type: map_at_3 value: 21.628 - type: map_at_5 value: 22.849 - type: mrr_at_1 value: 22.038 - type: mrr_at_10 value: 27.947 - type: mrr_at_100 value: 28.701 - type: mrr_at_1000 value: 28.772 - type: mrr_at_3 value: 26.008 - type: mrr_at_5 value: 27.187 - type: ndcg_at_1 value: 22.038 - type: ndcg_at_10 value: 27.632 - type: ndcg_at_100 value: 31.852000000000004 - type: ndcg_at_1000 value: 34.587 - type: ndcg_at_3 value: 24.274 - type: ndcg_at_5 value: 26.005 - type: precision_at_1 value: 22.038 - type: precision_at_10 value: 5.108 - type: precision_at_100 value: 0.911 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 11.591999999999999 - type: precision_at_5 value: 8.42 - type: recall_at_1 value: 17.410999999999998 - type: recall_at_10 value: 35.346 - type: recall_at_100 value: 53.849000000000004 - type: recall_at_1000 value: 72.56700000000001 - type: recall_at_3 value: 25.647 - type: recall_at_5 value: 30.288999999999998 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: None config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 27.1 - type: map_at_10 value: 35.973 - type: map_at_100 value: 37.1 - type: map_at_1000 value: 37.191 - type: map_at_3 value: 33.409 - type: map_at_5 value: 34.867 - type: mrr_at_1 value: 31.473000000000003 - type: mrr_at_10 value: 39.35 - type: mrr_at_100 value: 40.258 - type: mrr_at_1000 value: 40.314 - type: mrr_at_3 value: 37.106 - type: mrr_at_5 value: 38.316 - type: ndcg_at_1 value: 
31.473000000000003 - type: ndcg_at_10 value: 40.831 - type: ndcg_at_100 value: 46.094 - type: ndcg_at_1000 value: 48.147 - type: ndcg_at_3 value: 36.187000000000005 - type: ndcg_at_5 value: 38.389 - type: precision_at_1 value: 31.473000000000003 - type: precision_at_10 value: 6.627 - type: precision_at_100 value: 1.013 - type: precision_at_1000 value: 0.127 - type: precision_at_3 value: 16.092000000000002 - type: precision_at_5 value: 11.197 - type: recall_at_1 value: 27.1 - type: recall_at_10 value: 52.208 - type: recall_at_100 value: 75.913 - type: recall_at_1000 value: 90.623 - type: recall_at_3 value: 39.696999999999996 - type: recall_at_5 value: 45.068999999999996 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: None config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 13.164000000000001 - type: map_at_10 value: 17.209 - type: map_at_100 value: 18.078 - type: map_at_1000 value: 18.196 - type: map_at_3 value: 15.723999999999998 - type: map_at_5 value: 16.53 - type: mrr_at_1 value: 14.35 - type: mrr_at_10 value: 18.549 - type: mrr_at_100 value: 19.378999999999998 - type: mrr_at_1000 value: 19.485 - type: mrr_at_3 value: 17.024 - type: mrr_at_5 value: 17.832 - type: ndcg_at_1 value: 14.35 - type: ndcg_at_10 value: 19.949 - type: ndcg_at_100 value: 24.59 - type: ndcg_at_1000 value: 28.102 - type: ndcg_at_3 value: 16.894000000000002 - type: ndcg_at_5 value: 18.322 - type: precision_at_1 value: 14.35 - type: precision_at_10 value: 3.0620000000000003 - type: precision_at_100 value: 0.573 - type: precision_at_1000 value: 0.092 - type: precision_at_3 value: 7.005999999999999 - type: precision_at_5 value: 5.017 - type: recall_at_1 value: 13.164000000000001 - type: recall_at_10 value: 27.282 - type: recall_at_100 value: 49.352000000000004 - type: recall_at_1000 value: 76.7 - type: recall_at_3 value: 18.93 - type: recall_at_5 value: 22.395 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: None config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 7.343 - type: map_at_10 value: 10.732999999999999 - type: map_at_100 value: 11.524 - type: map_at_1000 value: 11.658 - type: map_at_3 value: 9.501999999999999 - type: map_at_5 value: 10.118 - type: mrr_at_1 value: 9.577 - type: mrr_at_10 value: 13.293 - type: mrr_at_100 value: 14.126 - type: mrr_at_1000 value: 14.234 - type: mrr_at_3 value: 11.816 - type: mrr_at_5 value: 12.519 - type: ndcg_at_1 value: 9.577 - type: ndcg_at_10 value: 13.303999999999998 - type: ndcg_at_100 value: 17.596999999999998 - type: ndcg_at_1000 value: 21.406 - type: ndcg_at_3 value: 10.788 - type: ndcg_at_5 value: 11.815000000000001 - type: precision_at_1 value: 9.577 - type: precision_at_10 value: 2.512 - type: precision_at_100 value: 0.545 - type: precision_at_1000 value: 0.10200000000000001 - type: precision_at_3 value: 5.1 - type: precision_at_5 value: 3.781 - type: recall_at_1 value: 7.343 - type: recall_at_10 value: 18.912000000000003 - type: recall_at_100 value: 38.389 - type: recall_at_1000 value: 66.424 - type: recall_at_3 value: 11.851 - type: recall_at_5 value: 14.424000000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: None config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 18.055 - type: map_at_10 value: 24.271 - type: map_at_100 value: 25.418000000000003 - type: map_at_1000 value: 25.554 - type: 
map_at_3 value: 22.067 - type: map_at_5 value: 23.274 - type: mrr_at_1 value: 22.137 - type: mrr_at_10 value: 28.671000000000003 - type: mrr_at_100 value: 29.604000000000003 - type: mrr_at_1000 value: 29.68 - type: mrr_at_3 value: 26.372 - type: mrr_at_5 value: 27.766999999999996 - type: ndcg_at_1 value: 22.137 - type: ndcg_at_10 value: 28.687 - type: ndcg_at_100 value: 34.309 - type: ndcg_at_1000 value: 37.316 - type: ndcg_at_3 value: 24.761 - type: ndcg_at_5 value: 26.598 - type: precision_at_1 value: 22.137 - type: precision_at_10 value: 5.3420000000000005 - type: precision_at_100 value: 0.9740000000000001 - type: precision_at_1000 value: 0.14200000000000002 - type: precision_at_3 value: 11.677999999999999 - type: precision_at_5 value: 8.527 - type: recall_at_1 value: 18.055 - type: recall_at_10 value: 37.701 - type: recall_at_100 value: 62.661 - type: recall_at_1000 value: 83.37299999999999 - type: recall_at_3 value: 26.491999999999997 - type: recall_at_5 value: 31.366 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: None config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 13.889999999999999 - type: map_at_10 value: 19.337 - type: map_at_100 value: 20.389 - type: map_at_1000 value: 20.53 - type: map_at_3 value: 17.404 - type: map_at_5 value: 18.356 - type: mrr_at_1 value: 17.122999999999998 - type: mrr_at_10 value: 22.951 - type: mrr_at_100 value: 23.919999999999998 - type: mrr_at_1000 value: 24.01 - type: mrr_at_3 value: 21.081 - type: mrr_at_5 value: 22.102 - type: ndcg_at_1 value: 17.122999999999998 - type: ndcg_at_10 value: 23.16 - type: ndcg_at_100 value: 28.337 - type: ndcg_at_1000 value: 31.808999999999997 - type: ndcg_at_3 value: 19.649 - type: ndcg_at_5 value: 21.047 - type: precision_at_1 value: 17.122999999999998 - type: precision_at_10 value: 4.406000000000001 - type: precision_at_100 value: 0.831 - type: precision_at_1000 value: 0.13 - type: precision_at_3 value: 9.361 - type: precision_at_5 value: 6.804 - type: recall_at_1 value: 13.889999999999999 - type: recall_at_10 value: 31.162 - type: recall_at_100 value: 53.862 - type: recall_at_1000 value: 78.668 - type: recall_at_3 value: 21.276999999999997 - type: recall_at_5 value: 24.945999999999998 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: mteb/cqadupstack config: default split: test revision: 4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 14.876916666666668 - type: map_at_10 value: 20.335916666666666 - type: map_at_100 value: 21.264833333333335 - type: map_at_1000 value: 21.392083333333332 - type: map_at_3 value: 18.571416666666668 - type: map_at_5 value: 19.552416666666666 - type: mrr_at_1 value: 17.9935 - type: mrr_at_10 value: 23.497000000000003 - type: mrr_at_100 value: 24.318666666666665 - type: mrr_at_1000 value: 24.404583333333328 - type: mrr_at_3 value: 21.74641666666667 - type: mrr_at_5 value: 22.727416666666667 - type: ndcg_at_1 value: 17.9935 - type: ndcg_at_10 value: 23.92941666666667 - type: ndcg_at_100 value: 28.531999999999996 - type: ndcg_at_1000 value: 31.72616666666667 - type: ndcg_at_3 value: 20.738083333333332 - type: ndcg_at_5 value: 22.215416666666666 - type: precision_at_1 value: 17.9935 - type: precision_at_10 value: 4.256916666666667 - type: precision_at_100 value: 0.7820833333333335 - type: precision_at_1000 value: 0.12375000000000003 - type: precision_at_3 value: 9.594916666666666 - type: precision_at_5 value: 6.911333333333333 - type: recall_at_1 
value: 14.876916666666668 - type: recall_at_10 value: 31.664250000000006 - type: recall_at_100 value: 52.60891666666667 - type: recall_at_1000 value: 75.82383333333334 - type: recall_at_3 value: 22.649833333333333 - type: recall_at_5 value: 26.4515 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: None config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 10.74 - type: map_at_10 value: 15.364 - type: map_at_100 value: 16.066 - type: map_at_1000 value: 16.147 - type: map_at_3 value: 13.871 - type: map_at_5 value: 14.724 - type: mrr_at_1 value: 12.883 - type: mrr_at_10 value: 17.657 - type: mrr_at_100 value: 18.299000000000003 - type: mrr_at_1000 value: 18.369 - type: mrr_at_3 value: 16.104 - type: mrr_at_5 value: 16.986 - type: ndcg_at_1 value: 12.883 - type: ndcg_at_10 value: 18.429000000000002 - type: ndcg_at_100 value: 22.144 - type: ndcg_at_1000 value: 24.647 - type: ndcg_at_3 value: 15.542 - type: ndcg_at_5 value: 16.929 - type: precision_at_1 value: 12.883 - type: precision_at_10 value: 3.19 - type: precision_at_100 value: 0.5579999999999999 - type: precision_at_1000 value: 0.08499999999999999 - type: precision_at_3 value: 7.156999999999999 - type: precision_at_5 value: 5.215 - type: recall_at_1 value: 10.74 - type: recall_at_10 value: 25.762 - type: recall_at_100 value: 43.132999999999996 - type: recall_at_1000 value: 62.26199999999999 - type: recall_at_3 value: 17.629 - type: recall_at_5 value: 21.125 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: None config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 7.707 - type: map_at_10 value: 11.26 - type: map_at_100 value: 11.918 - type: map_at_1000 value: 12.043 - type: map_at_3 value: 10.169 - type: map_at_5 value: 10.817 - type: mrr_at_1 value: 9.429 - type: mrr_at_10 value: 13.5 - type: mrr_at_100 value: 14.177999999999999 - type: mrr_at_1000 value: 14.280000000000001 - type: mrr_at_3 value: 12.273 - type: mrr_at_5 value: 13.048000000000002 - type: ndcg_at_1 value: 9.429 - type: ndcg_at_10 value: 13.697999999999999 - type: ndcg_at_100 value: 17.427 - type: ndcg_at_1000 value: 21.013 - type: ndcg_at_3 value: 11.639 - type: ndcg_at_5 value: 12.705 - type: precision_at_1 value: 9.429 - type: precision_at_10 value: 2.546 - type: precision_at_100 value: 0.534 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 5.666 - type: precision_at_5 value: 4.212 - type: recall_at_1 value: 7.707 - type: recall_at_10 value: 18.901 - type: recall_at_100 value: 36.38 - type: recall_at_1000 value: 63.017999999999994 - type: recall_at_3 value: 13.123999999999999 - type: recall_at_5 value: 15.834000000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: None config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 13.034 - type: map_at_10 value: 17.943 - type: map_at_100 value: 18.787000000000003 - type: map_at_1000 value: 18.907 - type: map_at_3 value: 16.508 - type: map_at_5 value: 17.267 - type: mrr_at_1 value: 15.485 - type: mrr_at_10 value: 20.801 - type: mrr_at_100 value: 21.632 - type: mrr_at_1000 value: 21.731 - type: mrr_at_3 value: 19.279 - type: mrr_at_5 value: 20.09 - type: ndcg_at_1 value: 15.485 - type: ndcg_at_10 value: 21.301000000000002 - type: ndcg_at_100 value: 25.606 - type: ndcg_at_1000 value: 29.109 - type: ndcg_at_3 value: 18.451999999999998 - type: ndcg_at_5 value: 
19.685 - type: precision_at_1 value: 15.485 - type: precision_at_10 value: 3.6470000000000002 - type: precision_at_100 value: 0.639 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 8.551 - type: precision_at_5 value: 5.989 - type: recall_at_1 value: 13.034 - type: recall_at_10 value: 28.909000000000002 - type: recall_at_100 value: 48.28 - type: recall_at_1000 value: 74.375 - type: recall_at_3 value: 20.871000000000002 - type: recall_at_5 value: 24.066000000000003 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: None config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 14.667 - type: map_at_10 value: 21.229 - type: map_at_100 value: 22.303 - type: map_at_1000 value: 22.512 - type: map_at_3 value: 19.527 - type: map_at_5 value: 20.415 - type: mrr_at_1 value: 18.379 - type: mrr_at_10 value: 24.829 - type: mrr_at_100 value: 25.623 - type: mrr_at_1000 value: 25.712000000000003 - type: mrr_at_3 value: 23.09 - type: mrr_at_5 value: 23.979 - type: ndcg_at_1 value: 18.379 - type: ndcg_at_10 value: 25.462 - type: ndcg_at_100 value: 30.255 - type: ndcg_at_1000 value: 34.019 - type: ndcg_at_3 value: 22.567999999999998 - type: ndcg_at_5 value: 23.79 - type: precision_at_1 value: 18.379 - type: precision_at_10 value: 4.9799999999999995 - type: precision_at_100 value: 1.099 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 10.870000000000001 - type: precision_at_5 value: 7.866 - type: recall_at_1 value: 14.667 - type: recall_at_10 value: 33.550000000000004 - type: recall_at_100 value: 56.123999999999995 - type: recall_at_1000 value: 81.883 - type: recall_at_3 value: 24.944 - type: recall_at_5 value: 28.055000000000003 - task: type: Retrieval dataset: name: MTEB CQADupstackWordpressRetrieval type: None config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 12.212 - type: map_at_10 value: 16.339000000000002 - type: map_at_100 value: 17.154 - type: map_at_1000 value: 17.268 - type: map_at_3 value: 15.0 - type: map_at_5 value: 15.744 - type: mrr_at_1 value: 13.863 - type: mrr_at_10 value: 18.055 - type: mrr_at_100 value: 18.892 - type: mrr_at_1000 value: 18.992 - type: mrr_at_3 value: 16.636 - type: mrr_at_5 value: 17.44 - type: ndcg_at_1 value: 13.863 - type: ndcg_at_10 value: 18.965 - type: ndcg_at_100 value: 23.289 - type: ndcg_at_1000 value: 26.672 - type: ndcg_at_3 value: 16.264 - type: ndcg_at_5 value: 17.541 - type: precision_at_1 value: 13.863 - type: precision_at_10 value: 2.939 - type: precision_at_100 value: 0.545 - type: precision_at_1000 value: 0.091 - type: precision_at_3 value: 6.901 - type: precision_at_5 value: 4.806 - type: recall_at_1 value: 12.212 - type: recall_at_10 value: 25.557999999999996 - type: recall_at_100 value: 45.884 - type: recall_at_1000 value: 71.841 - type: recall_at_3 value: 18.281 - type: recall_at_5 value: 21.368000000000002 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: None config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 5.991 - type: map_at_10 value: 11.031 - type: map_at_100 value: 12.415 - type: map_at_1000 value: 12.601999999999999 - type: map_at_3 value: 8.752 - type: map_at_5 value: 9.873999999999999 - type: mrr_at_1 value: 13.355 - type: mrr_at_10 value: 22.002 - type: mrr_at_100 value: 23.146 - type: mrr_at_1000 value: 23.218 - type: mrr_at_3 value: 18.719 - type: mrr_at_5 value: 20.543 - type: 
ndcg_at_1 value: 13.355 - type: ndcg_at_10 value: 16.711000000000002 - type: ndcg_at_100 value: 23.073 - type: ndcg_at_1000 value: 27.108999999999998 - type: ndcg_at_3 value: 12.289 - type: ndcg_at_5 value: 13.943 - type: precision_at_1 value: 13.355 - type: precision_at_10 value: 5.648000000000001 - type: precision_at_100 value: 1.248 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 9.359 - type: precision_at_5 value: 7.739 - type: recall_at_1 value: 5.991 - type: recall_at_10 value: 21.898 - type: recall_at_100 value: 44.324000000000005 - type: recall_at_1000 value: 67.777 - type: recall_at_3 value: 11.527 - type: recall_at_5 value: 15.61 - task: type: Retrieval dataset: name: MTEB DBPedia type: None config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 4.125 - type: map_at_10 value: 9.629999999999999 - type: map_at_100 value: 13.306000000000001 - type: map_at_1000 value: 14.26 - type: map_at_3 value: 6.894 - type: map_at_5 value: 8.19 - type: mrr_at_1 value: 39.25 - type: mrr_at_10 value: 49.495 - type: mrr_at_100 value: 50.139 - type: mrr_at_1000 value: 50.169 - type: mrr_at_3 value: 46.333 - type: mrr_at_5 value: 48.008 - type: ndcg_at_1 value: 28.499999999999996 - type: ndcg_at_10 value: 23.794 - type: ndcg_at_100 value: 26.632 - type: ndcg_at_1000 value: 33.382 - type: ndcg_at_3 value: 26.282 - type: ndcg_at_5 value: 25.113000000000003 - type: precision_at_1 value: 39.25 - type: precision_at_10 value: 21.075 - type: precision_at_100 value: 6.607 - type: precision_at_1000 value: 1.366 - type: precision_at_3 value: 31.667 - type: precision_at_5 value: 27.150000000000002 - type: recall_at_1 value: 4.125 - type: recall_at_10 value: 14.603 - type: recall_at_100 value: 32.888 - type: recall_at_1000 value: 55.901 - type: recall_at_3 value: 8.396 - type: recall_at_5 value: 10.902000000000001 - task: type: Classification dataset: name: MTEB EmotionClassification type: None config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 44.39999999999999 - type: f1 value: 41.19798638036962 - task: type: Retrieval dataset: name: MTEB FEVER type: None config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 15.998999999999999 - type: map_at_10 value: 24.037 - type: map_at_100 value: 24.979000000000003 - type: map_at_1000 value: 25.052000000000003 - type: map_at_3 value: 21.537 - type: map_at_5 value: 22.926 - type: mrr_at_1 value: 17.072000000000003 - type: mrr_at_10 value: 25.526 - type: mrr_at_100 value: 26.464 - type: mrr_at_1000 value: 26.528000000000002 - type: mrr_at_3 value: 22.922 - type: mrr_at_5 value: 24.391 - type: ndcg_at_1 value: 17.072000000000003 - type: ndcg_at_10 value: 28.933999999999997 - type: ndcg_at_100 value: 33.812999999999995 - type: ndcg_at_1000 value: 35.874 - type: ndcg_at_3 value: 23.746000000000002 - type: ndcg_at_5 value: 26.264 - type: precision_at_1 value: 17.072000000000003 - type: precision_at_10 value: 4.6739999999999995 - type: precision_at_100 value: 0.732 - type: precision_at_1000 value: 0.093 - type: precision_at_3 value: 10.326 - type: precision_at_5 value: 7.531000000000001 - type: recall_at_1 value: 15.998999999999999 - type: recall_at_10 value: 42.888999999999996 - type: recall_at_100 value: 65.864 - type: recall_at_1000 value: 81.872 - type: recall_at_3 value: 28.735 - type: recall_at_5 value: 34.817 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: None config: 
default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 7.0889999999999995 - type: map_at_10 value: 11.533 - type: map_at_100 value: 12.626999999999999 - type: map_at_1000 value: 12.831000000000001 - type: map_at_3 value: 9.524000000000001 - type: map_at_5 value: 10.484 - type: mrr_at_1 value: 13.889000000000001 - type: mrr_at_10 value: 20.035 - type: mrr_at_100 value: 21.041999999999998 - type: mrr_at_1000 value: 21.142 - type: mrr_at_3 value: 17.695 - type: mrr_at_5 value: 18.83 - type: ndcg_at_1 value: 13.889000000000001 - type: ndcg_at_10 value: 16.122 - type: ndcg_at_100 value: 21.485000000000003 - type: ndcg_at_1000 value: 26.101999999999997 - type: ndcg_at_3 value: 12.967999999999998 - type: ndcg_at_5 value: 13.975000000000001 - type: precision_at_1 value: 13.889000000000001 - type: precision_at_10 value: 4.769 - type: precision_at_100 value: 1.012 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 8.693 - type: precision_at_5 value: 6.79 - type: recall_at_1 value: 7.0889999999999995 - type: recall_at_10 value: 21.163999999999998 - type: recall_at_100 value: 42.247 - type: recall_at_1000 value: 71.395 - type: recall_at_3 value: 11.694 - type: recall_at_5 value: 15.051 - task: type: Retrieval dataset: name: MTEB HotpotQA type: None config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 17.752000000000002 - type: map_at_10 value: 25.064999999999998 - type: map_at_100 value: 26.003999999999998 - type: map_at_1000 value: 26.116 - type: map_at_3 value: 23.064 - type: map_at_5 value: 24.149 - type: mrr_at_1 value: 35.503 - type: mrr_at_10 value: 42.649 - type: mrr_at_100 value: 43.389 - type: mrr_at_1000 value: 43.445 - type: mrr_at_3 value: 40.699999999999996 - type: mrr_at_5 value: 41.817 - type: ndcg_at_1 value: 35.503 - type: ndcg_at_10 value: 31.968000000000004 - type: ndcg_at_100 value: 36.257 - type: ndcg_at_1000 value: 38.928000000000004 - type: ndcg_at_3 value: 28.176000000000002 - type: ndcg_at_5 value: 29.994 - type: precision_at_1 value: 35.503 - type: precision_at_10 value: 7.0440000000000005 - type: precision_at_100 value: 1.046 - type: precision_at_1000 value: 0.13999999999999999 - type: precision_at_3 value: 17.763 - type: precision_at_5 value: 12.1 - type: recall_at_1 value: 17.752000000000002 - type: recall_at_10 value: 35.219 - type: recall_at_100 value: 52.309000000000005 - type: recall_at_1000 value: 70.162 - type: recall_at_3 value: 26.644000000000002 - type: recall_at_5 value: 30.25 - task: type: Classification dataset: name: MTEB ImdbClassification type: None config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 63.9016 - type: ap value: 59.11956601013746 - type: f1 value: 63.68974251662037 - task: type: Retrieval dataset: name: MTEB MSMARCO type: None config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 6.561999999999999 - type: map_at_10 value: 11.378 - type: map_at_100 value: 12.258 - type: map_at_1000 value: 12.361 - type: map_at_3 value: 9.577 - type: map_at_5 value: 10.525 - type: mrr_at_1 value: 6.762 - type: mrr_at_10 value: 11.674 - type: mrr_at_100 value: 12.554000000000002 - type: mrr_at_1000 value: 12.654000000000002 - type: mrr_at_3 value: 9.833 - type: mrr_at_5 value: 10.806000000000001 - type: ndcg_at_1 value: 6.734 - type: ndcg_at_10 value: 14.459 - type: ndcg_at_100 value: 19.317999999999998 - type: ndcg_at_1000 
value: 22.407 - type: ndcg_at_3 value: 10.666 - type: ndcg_at_5 value: 12.393 - type: precision_at_1 value: 6.734 - type: precision_at_10 value: 2.5069999999999997 - type: precision_at_100 value: 0.505 - type: precision_at_1000 value: 0.077 - type: precision_at_3 value: 4.6850000000000005 - type: precision_at_5 value: 3.682 - type: recall_at_1 value: 6.561999999999999 - type: recall_at_10 value: 24.07 - type: recall_at_100 value: 47.856 - type: recall_at_1000 value: 72.654 - type: recall_at_3 value: 13.584 - type: recall_at_5 value: 17.76 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: None config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 88.86000911992703 - type: f1 value: 87.98975696911226 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: None config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 58.38349293205655 - type: f1 value: 40.01779036138312 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: None config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 61.852723604572965 - type: f1 value: 60.13917532573332 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: None config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 68.14727639542704 - type: f1 value: 66.7653952309667 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: None config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 29.504423289782554 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: None config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 25.331311574764182 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: None config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 29.79405962065405 - type: mrr value: 30.80981570313803 - task: type: Retrieval dataset: name: MTEB NFCorpus type: None config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 3.8309999999999995 - type: map_at_10 value: 8.15 - type: map_at_100 value: 10.295 - type: map_at_1000 value: 11.649 - type: map_at_3 value: 6.088 - type: map_at_5 value: 7.138 - type: mrr_at_1 value: 33.745999999999995 - type: mrr_at_10 value: 43.422 - type: mrr_at_100 value: 44.193 - type: mrr_at_1000 value: 44.261 - type: mrr_at_3 value: 40.506 - type: mrr_at_5 value: 42.812 - type: ndcg_at_1 value: 31.579 - type: ndcg_at_10 value: 25.357000000000003 - type: ndcg_at_100 value: 23.597 - type: ndcg_at_1000 value: 33.143 - type: ndcg_at_3 value: 28.778 - type: ndcg_at_5 value: 27.92 - type: precision_at_1 value: 33.745999999999995 - type: precision_at_10 value: 18.854000000000003 - type: precision_at_100 value: 6.464 - type: precision_at_1000 value: 1.9900000000000002 - type: precision_at_3 value: 27.450999999999997 - type: precision_at_5 value: 24.52 - type: recall_at_1 value: 3.8309999999999995 - type: recall_at_10 value: 12.18 - type: recall_at_100 value: 25.258999999999997 - type: recall_at_1000 value: 59.059 - type: recall_at_3 value: 7.353 - type: recall_at_5 value: 9.777 - task: type: Retrieval dataset: name: MTEB NQ type: None config: default 
split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 9.132 - type: map_at_10 value: 15.934999999999999 - type: map_at_100 value: 17.168 - type: map_at_1000 value: 17.266000000000002 - type: map_at_3 value: 13.296 - type: map_at_5 value: 14.713000000000001 - type: mrr_at_1 value: 10.342 - type: mrr_at_10 value: 17.535 - type: mrr_at_100 value: 18.689 - type: mrr_at_1000 value: 18.77 - type: mrr_at_3 value: 14.846 - type: mrr_at_5 value: 16.337 - type: ndcg_at_1 value: 10.342 - type: ndcg_at_10 value: 20.409 - type: ndcg_at_100 value: 26.672 - type: ndcg_at_1000 value: 29.321 - type: ndcg_at_3 value: 14.982000000000001 - type: ndcg_at_5 value: 17.522 - type: precision_at_1 value: 10.342 - type: precision_at_10 value: 3.8440000000000003 - type: precision_at_100 value: 0.741 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 7.0489999999999995 - type: precision_at_5 value: 5.632000000000001 - type: recall_at_1 value: 9.132 - type: recall_at_10 value: 32.800000000000004 - type: recall_at_100 value: 61.895999999999994 - type: recall_at_1000 value: 82.146 - type: recall_at_3 value: 18.342 - type: recall_at_5 value: 24.242 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: None config: default split: test revision: None metrics: - type: map_at_1 value: 64.005 - type: map_at_10 value: 76.709 - type: map_at_100 value: 77.464 - type: map_at_1000 value: 77.498 - type: map_at_3 value: 73.75699999999999 - type: map_at_5 value: 75.553 - type: mrr_at_1 value: 73.81 - type: mrr_at_10 value: 81.006 - type: mrr_at_100 value: 81.23599999999999 - type: mrr_at_1000 value: 81.241 - type: mrr_at_3 value: 79.56800000000001 - type: mrr_at_5 value: 80.50399999999999 - type: ndcg_at_1 value: 73.85000000000001 - type: ndcg_at_10 value: 81.33399999999999 - type: ndcg_at_100 value: 83.378 - type: ndcg_at_1000 value: 83.726 - type: ndcg_at_3 value: 77.791 - type: ndcg_at_5 value: 79.636 - type: precision_at_1 value: 73.85000000000001 - type: precision_at_10 value: 12.262 - type: precision_at_100 value: 1.461 - type: precision_at_1000 value: 0.155 - type: precision_at_3 value: 33.672999999999995 - type: precision_at_5 value: 22.253999999999998 - type: recall_at_1 value: 64.005 - type: recall_at_10 value: 90.137 - type: recall_at_100 value: 97.77799999999999 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 79.972 - type: recall_at_5 value: 85.10799999999999 - type: map_at_1 value: 3.008 - type: map_at_10 value: 7.086 - type: map_at_100 value: 8.498 - type: map_at_1000 value: 8.744 - type: map_at_3 value: 5.2330000000000005 - type: map_at_5 value: 6.188 - type: mrr_at_1 value: 14.799999999999999 - type: mrr_at_10 value: 22.731 - type: mrr_at_100 value: 23.963 - type: mrr_at_1000 value: 24.046 - type: mrr_at_3 value: 19.950000000000003 - type: mrr_at_5 value: 21.38 - type: ndcg_at_1 value: 14.799999999999999 - type: ndcg_at_10 value: 12.581999999999999 - type: ndcg_at_100 value: 19.024 - type: ndcg_at_1000 value: 24.075 - type: ndcg_at_3 value: 11.937000000000001 - type: ndcg_at_5 value: 10.427 - type: precision_at_1 value: 14.799999999999999 - type: precision_at_10 value: 6.5 - type: precision_at_100 value: 1.591 - type: precision_at_1000 value: 0.281 - type: precision_at_3 value: 11.1 - type: precision_at_5 value: 9.120000000000001 - type: recall_at_1 value: 3.008 - type: recall_at_10 value: 13.197000000000001 - type: recall_at_100 value: 32.323 - type: recall_at_1000 value: 57.172999999999995 - type: recall_at_3 value: 
6.753000000000001 - type: recall_at_5 value: 9.248000000000001 - type: map_at_1 value: 0.129 - type: map_at_10 value: 0.717 - type: map_at_100 value: 3.6580000000000004 - type: map_at_1000 value: 9.374 - type: map_at_3 value: 0.302 - type: map_at_5 value: 0.422 - type: mrr_at_1 value: 56.00000000000001 - type: mrr_at_10 value: 67.475 - type: mrr_at_100 value: 68.018 - type: mrr_at_1000 value: 68.035 - type: mrr_at_3 value: 65.667 - type: mrr_at_5 value: 66.567 - type: ndcg_at_1 value: 50.0 - type: ndcg_at_10 value: 40.910999999999994 - type: ndcg_at_100 value: 30.386999999999997 - type: ndcg_at_1000 value: 27.009 - type: ndcg_at_3 value: 46.776 - type: ndcg_at_5 value: 42.504 - type: precision_at_1 value: 56.00000000000001 - type: precision_at_10 value: 43.6 - type: precision_at_100 value: 31.64 - type: precision_at_1000 value: 13.214 - type: precision_at_3 value: 50.0 - type: precision_at_5 value: 44.800000000000004 - type: recall_at_1 value: 0.129 - type: recall_at_10 value: 0.95 - type: recall_at_100 value: 6.526999999999999 - type: recall_at_1000 value: 25.894000000000002 - type: recall_at_3 value: 0.34299999999999997 - type: recall_at_5 value: 0.49899999999999994 - task: type: Clustering dataset: name: MTEB RedditClustering type: None config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 39.423003368001304 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: None config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 47.05364643191673 - task: type: STS dataset: name: MTEB SICK-R type: None config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 75.82474380833001 - type: cos_sim_spearman value: 66.06084939474535 - type: euclidean_pearson value: 71.1524270169037 - type: euclidean_spearman value: 66.06095698159474 - type: manhattan_pearson value: 68.79401530108056 - type: manhattan_spearman value: 64.55149376982865 - task: type: STS dataset: name: MTEB STS12 type: None config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 74.40557358335994 - type: cos_sim_spearman value: 67.61862451971042 - type: euclidean_pearson value: 70.41404072782692 - type: euclidean_spearman value: 67.6198686410611 - type: manhattan_pearson value: 68.97879579551457 - type: manhattan_spearman value: 67.2295683767691 - task: type: STS dataset: name: MTEB STS13 type: None config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 78.0503994579303 - type: cos_sim_spearman value: 79.15391592614903 - type: euclidean_pearson value: 78.84085468610613 - type: euclidean_spearman value: 79.15395372943995 - type: manhattan_pearson value: 78.64185478146945 - type: manhattan_spearman value: 79.14714263528944 - task: type: STS dataset: name: MTEB STS14 type: None config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 78.3615928485999 - type: cos_sim_spearman value: 74.97382110132605 - type: euclidean_pearson value: 77.23854527060215 - type: euclidean_spearman value: 74.97381140978526 - type: manhattan_pearson value: 76.07931709400336 - type: manhattan_spearman value: 74.24120638811475 - task: type: STS dataset: name: MTEB STS15 type: None config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson 
value: 81.20257664300357 - type: cos_sim_spearman value: 82.00047913551732 - type: euclidean_pearson value: 81.9647778954467 - type: euclidean_spearman value: 82.0004776230638 - type: manhattan_pearson value: 81.71577106207948 - type: manhattan_spearman value: 81.99682550355493 - task: type: STS dataset: name: MTEB STS16 type: None config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 76.25724348036597 - type: cos_sim_spearman value: 77.44730180182509 - type: euclidean_pearson value: 77.07257885830403 - type: euclidean_spearman value: 77.44757329765216 - type: manhattan_pearson value: 77.56516524638705 - type: manhattan_spearman value: 77.96306993156203 - task: type: STS dataset: name: MTEB STS17 (en-en) type: None config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 84.12218819562516 - type: cos_sim_spearman value: 85.1670421446071 - type: euclidean_pearson value: 84.780712654921 - type: euclidean_spearman value: 85.16791563947264 - type: manhattan_pearson value: 84.55044493571614 - type: manhattan_spearman value: 85.27489322017652 - task: type: STS dataset: name: MTEB STS22 (en) type: None config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 63.36228072576271 - type: cos_sim_spearman value: 60.8804162279283 - type: euclidean_pearson value: 63.45076147869696 - type: euclidean_spearman value: 60.8804162279283 - type: manhattan_pearson value: 62.75925080903245 - type: manhattan_spearman value: 60.42286149492114 - task: type: STS dataset: name: MTEB STSBenchmark type: None config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 78.42867098481462 - type: cos_sim_spearman value: 77.27498110374914 - type: euclidean_pearson value: 78.31189930366395 - type: euclidean_spearman value: 77.27499957014132 - type: manhattan_pearson value: 77.5492342797482 - type: manhattan_spearman value: 76.70759132287284 - task: type: Reranking dataset: name: MTEB SciDocsRR type: None config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 72.88088283802969 - type: mrr value: 91.2407327603406 - task: type: Retrieval dataset: name: MTEB SciFact type: None config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 34.75 - type: map_at_10 value: 43.795 - type: map_at_100 value: 44.897999999999996 - type: map_at_1000 value: 44.964999999999996 - type: map_at_3 value: 41.235 - type: map_at_5 value: 42.785000000000004 - type: mrr_at_1 value: 37.0 - type: mrr_at_10 value: 45.608 - type: mrr_at_100 value: 46.556999999999995 - type: mrr_at_1000 value: 46.62 - type: mrr_at_3 value: 43.333 - type: mrr_at_5 value: 44.733000000000004 - type: ndcg_at_1 value: 37.0 - type: ndcg_at_10 value: 48.620000000000005 - type: ndcg_at_100 value: 53.772 - type: ndcg_at_1000 value: 55.403999999999996 - type: ndcg_at_3 value: 43.741 - type: ndcg_at_5 value: 46.358 - type: precision_at_1 value: 37.0 - type: precision_at_10 value: 6.800000000000001 - type: precision_at_100 value: 0.967 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 17.666999999999998 - type: precision_at_5 value: 12.0 - type: recall_at_1 value: 34.75 - type: recall_at_10 value: 61.87200000000001 - type: recall_at_100 value: 85.317 - type: recall_at_1000 value: 97.8 - type: recall_at_3 value: 48.567 - type: 
recall_at_5 value: 55.233 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: None config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.71485148514851 - type: cos_sim_ap value: 91.89197853928673 - type: cos_sim_f1 value: 85.48387096774194 - type: cos_sim_precision value: 86.1788617886179 - type: cos_sim_recall value: 84.8 - type: dot_accuracy value: 99.71485148514851 - type: dot_ap value: 91.89197853928673 - type: dot_f1 value: 85.48387096774194 - type: dot_precision value: 86.1788617886179 - type: dot_recall value: 84.8 - type: euclidean_accuracy value: 99.71485148514851 - type: euclidean_ap value: 91.89197853928673 - type: euclidean_f1 value: 85.48387096774194 - type: euclidean_precision value: 86.1788617886179 - type: euclidean_recall value: 84.8 - type: manhattan_accuracy value: 99.76633663366337 - type: manhattan_ap value: 93.70793412033116 - type: manhattan_f1 value: 87.77050830397583 - type: manhattan_precision value: 88.34853090172238 - type: manhattan_recall value: 87.2 - type: max_accuracy value: 99.76633663366337 - type: max_ap value: 93.70793412033116 - type: max_f1 value: 87.77050830397583 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: None config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 42.040101504017464 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: None config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 29.736735784987 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: None config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 43.30708664346592 - type: mrr value: 43.91143578643579 - task: type: Summarization dataset: name: MTEB SummEval type: None config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.57246866942087 - type: cos_sim_spearman value: 30.69719029010722 - type: dot_pearson value: 30.572468627823802 - type: dot_spearman value: 30.675070232612644 - task: type: Retrieval dataset: name: MTEB Touche2020 type: None config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 1.8429999999999997 - type: map_at_10 value: 8.476 - type: map_at_100 value: 14.779 - type: map_at_1000 value: 16.381 - type: map_at_3 value: 4.361000000000001 - type: map_at_5 value: 6.064 - type: mrr_at_1 value: 28.571 - type: mrr_at_10 value: 44.039 - type: mrr_at_100 value: 45.024 - type: mrr_at_1000 value: 45.024 - type: mrr_at_3 value: 41.156 - type: mrr_at_5 value: 42.075 - type: ndcg_at_1 value: 27.551 - type: ndcg_at_10 value: 22.672 - type: ndcg_at_100 value: 35.961999999999996 - type: ndcg_at_1000 value: 47.365 - type: ndcg_at_3 value: 26.016000000000002 - type: ndcg_at_5 value: 23.794999999999998 - type: precision_at_1 value: 28.571 - type: precision_at_10 value: 20.816000000000003 - type: precision_at_100 value: 8.245 - type: precision_at_1000 value: 1.555 - type: precision_at_3 value: 27.891 - type: precision_at_5 value: 24.082 - type: recall_at_1 value: 1.8429999999999997 - type: recall_at_10 value: 14.37 - type: recall_at_100 value: 49.120999999999995 - type: recall_at_1000 value: 83.98400000000001 - type: recall_at_3 value: 5.641 - type: recall_at_5 value: 8.321000000000002 - task: type: Classification 
dataset: name: MTEB ToxicConversationsClassification type: None config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.1968 - type: ap value: 14.44161573930493 - type: f1 value: 54.83573235336061 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: None config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 50.94510469722694 - type: f1 value: 51.12605321866063 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: None config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 37.53693681679847 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: None config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 83.60851165285807 - type: cos_sim_ap value: 65.18156857786212 - type: cos_sim_f1 value: 62.61223512510475 - type: cos_sim_precision value: 57.30878807801885 - type: cos_sim_recall value: 68.99736147757255 - type: dot_accuracy value: 83.60851165285807 - type: dot_ap value: 65.18156857786212 - type: dot_f1 value: 62.61223512510475 - type: dot_precision value: 57.30878807801885 - type: dot_recall value: 68.99736147757255 - type: euclidean_accuracy value: 83.60851165285807 - type: euclidean_ap value: 65.18156857786212 - type: euclidean_f1 value: 62.61223512510475 - type: euclidean_precision value: 57.30878807801885 - type: euclidean_recall value: 68.99736147757255 - type: manhattan_accuracy value: 82.49389044525243 - type: manhattan_ap value: 62.03288965473206 - type: manhattan_f1 value: 59.25074695472304 - type: manhattan_precision value: 52.483713355048856 - type: manhattan_recall value: 68.02110817941951 - type: max_accuracy value: 83.60851165285807 - type: max_ap value: 65.18156857786212 - type: max_f1 value: 62.61223512510475 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: None config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 87.60235960724958 - type: cos_sim_ap value: 83.31122009321201 - type: cos_sim_f1 value: 75.5953714072415 - type: cos_sim_precision value: 73.36617881466454 - type: cos_sim_recall value: 77.96427471512165 - type: dot_accuracy value: 87.60235960724958 - type: dot_ap value: 83.31122039383348 - type: dot_f1 value: 75.5953714072415 - type: dot_precision value: 73.36617881466454 - type: dot_recall value: 77.96427471512165 - type: euclidean_accuracy value: 87.60235960724958 - type: euclidean_ap value: 83.31123735759617 - type: euclidean_f1 value: 75.5953714072415 - type: euclidean_precision value: 73.36617881466454 - type: euclidean_recall value: 77.96427471512165 - type: manhattan_accuracy value: 87.58295494236815 - type: manhattan_ap value: 83.15022211312501 - type: manhattan_f1 value: 75.28497215681878 - type: manhattan_precision value: 73.14982932674849 - type: manhattan_recall value: 77.54850631352016 - type: max_accuracy value: 87.60235960724958 - type: max_ap value: 83.31123735759617 - type: max_f1 value: 75.5953714072415 ---
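The model-index metrics above are the kind of per-task numbers the MTEB harness emits. A minimal sketch of how such results can be reproduced, assuming the classic `MTEB` Python API and that the checkpoint loads as a `sentence-transformers` model (both assumptions; the exact API varies across `mteb` versions):

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Hypothetical reproduction run: evaluate on two of the tasks reported
# above and write per-task JSON results to an output folder.
model = SentenceTransformer("twadada/nmc-nignore15")
evaluation = MTEB(tasks=["Banking77Classification", "SciFact"])
evaluation.run(model, output_folder="results/nmc-nignore15")
```

The JSON files in the output folder can then be converted into the `model-index` block of a model card.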
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
tensorblock/gte-Qwen2-7B-instruct-GGUF
tensorblock
sentence-similarity
[ "sentence-transformers", "gguf", "mteb", "transformers", "Qwen2", "sentence-similarity", "TensorBlock", "GGUF", "base_model:Alibaba-NLP/gte-Qwen2-7B-instruct", "base_model:quantized:Alibaba-NLP/gte-Qwen2-7B-instruct", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us", "conversational" ]
1,731
1,731
3,236
8
--- base_model: Alibaba-NLP/gte-Qwen2-7B-instruct license: apache-2.0 tags: - mteb - sentence-transformers - transformers - Qwen2 - sentence-similarity - TensorBlock - GGUF model-index: - name: gte-qwen2-7B-instruct results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 91.31343283582089 - type: ap value: 67.64251402604096 - type: f1 value: 87.53372530755692 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 97.497825 - type: ap value: 96.30329547047529 - type: f1 value: 97.49769793778039 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 62.564 - type: f1 value: 60.975777935041066 - task: type: Retrieval dataset: name: MTEB ArguAna type: mteb/arguana config: default split: test revision: c22ab2a51041ffd869aaddef7af8d8215647e41a metrics: - type: map_at_1 value: 36.486000000000004 - type: map_at_10 value: 54.842 - type: map_at_100 value: 55.206999999999994 - type: map_at_1000 value: 55.206999999999994 - type: map_at_3 value: 49.893 - type: map_at_5 value: 53.105000000000004 - type: mrr_at_1 value: 37.34 - type: mrr_at_10 value: 55.143 - type: mrr_at_100 value: 55.509 - type: mrr_at_1000 value: 55.509 - type: mrr_at_3 value: 50.212999999999994 - type: mrr_at_5 value: 53.432 - type: ndcg_at_1 value: 36.486000000000004 - type: ndcg_at_10 value: 64.273 - type: ndcg_at_100 value: 65.66199999999999 - type: ndcg_at_1000 value: 65.66199999999999 - type: ndcg_at_3 value: 54.352999999999994 - type: ndcg_at_5 value: 60.131 - type: precision_at_1 value: 36.486000000000004 - type: precision_at_10 value: 9.395000000000001 - type: precision_at_100 value: 0.996 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 22.428 - type: precision_at_5 value: 16.259 - type: recall_at_1 value: 36.486000000000004 - type: recall_at_10 value: 93.95400000000001 - type: recall_at_100 value: 99.644 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 67.283 - type: recall_at_5 value: 81.294 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 56.461169803700564 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 51.73600434466286 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 67.57827065898053 - type: mrr value: 79.08136569493911 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 83.53324575999243 - type: cos_sim_spearman value: 81.37173362822374 - type: euclidean_pearson value: 82.19243335103444 - type: euclidean_spearman value: 81.33679307304334 - type: manhattan_pearson value: 
82.38752665975699 - type: manhattan_spearman value: 81.31510583189689 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 87.56818181818181 - type: f1 value: 87.25826722019875 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 50.09239610327673 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 46.64733054606282 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: f46a197baaae43b4f621051089b82a364682dfeb metrics: - type: map_at_1 value: 33.997 - type: map_at_10 value: 48.176 - type: map_at_100 value: 49.82 - type: map_at_1000 value: 49.924 - type: map_at_3 value: 43.626 - type: map_at_5 value: 46.275 - type: mrr_at_1 value: 42.059999999999995 - type: mrr_at_10 value: 53.726 - type: mrr_at_100 value: 54.398 - type: mrr_at_1000 value: 54.416 - type: mrr_at_3 value: 50.714999999999996 - type: mrr_at_5 value: 52.639 - type: ndcg_at_1 value: 42.059999999999995 - type: ndcg_at_10 value: 55.574999999999996 - type: ndcg_at_100 value: 60.744 - type: ndcg_at_1000 value: 61.85699999999999 - type: ndcg_at_3 value: 49.363 - type: ndcg_at_5 value: 52.44 - type: precision_at_1 value: 42.059999999999995 - type: precision_at_10 value: 11.101999999999999 - type: precision_at_100 value: 1.73 - type: precision_at_1000 value: 0.218 - type: precision_at_3 value: 24.464 - type: precision_at_5 value: 18.026 - type: recall_at_1 value: 33.997 - type: recall_at_10 value: 70.35900000000001 - type: recall_at_100 value: 91.642 - type: recall_at_1000 value: 97.977 - type: recall_at_3 value: 52.76 - type: recall_at_5 value: 61.148 - task: type: Retrieval dataset: name: MTEB CQADupstackEnglishRetrieval type: BeIR/cqadupstack config: default split: test revision: ad9991cb51e31e31e430383c75ffb2885547b5f0 metrics: - type: map_at_1 value: 35.884 - type: map_at_10 value: 48.14 - type: map_at_100 value: 49.5 - type: map_at_1000 value: 49.63 - type: map_at_3 value: 44.646 - type: map_at_5 value: 46.617999999999995 - type: mrr_at_1 value: 44.458999999999996 - type: mrr_at_10 value: 53.751000000000005 - type: mrr_at_100 value: 54.37800000000001 - type: mrr_at_1000 value: 54.415 - type: mrr_at_3 value: 51.815 - type: mrr_at_5 value: 52.882 - type: ndcg_at_1 value: 44.458999999999996 - type: ndcg_at_10 value: 54.157 - type: ndcg_at_100 value: 58.362 - type: ndcg_at_1000 value: 60.178 - type: ndcg_at_3 value: 49.661 - type: ndcg_at_5 value: 51.74999999999999 - type: precision_at_1 value: 44.458999999999996 - type: precision_at_10 value: 10.248 - type: precision_at_100 value: 1.5890000000000002 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 23.928 - type: precision_at_5 value: 16.878999999999998 - type: recall_at_1 value: 35.884 - type: recall_at_10 value: 64.798 - type: recall_at_100 value: 82.345 - type: recall_at_1000 value: 93.267 - type: recall_at_3 value: 51.847 - type: recall_at_5 value: 57.601 - task: type: Retrieval dataset: name: MTEB CQADupstackGamingRetrieval type: BeIR/cqadupstack config: default split: test revision: 
4885aa143210c98657558c04aaf3dc47cfb54340 metrics: - type: map_at_1 value: 39.383 - type: map_at_10 value: 53.714 - type: map_at_100 value: 54.838 - type: map_at_1000 value: 54.87800000000001 - type: map_at_3 value: 50.114999999999995 - type: map_at_5 value: 52.153000000000006 - type: mrr_at_1 value: 45.016 - type: mrr_at_10 value: 56.732000000000006 - type: mrr_at_100 value: 57.411 - type: mrr_at_1000 value: 57.431 - type: mrr_at_3 value: 54.044000000000004 - type: mrr_at_5 value: 55.639 - type: ndcg_at_1 value: 45.016 - type: ndcg_at_10 value: 60.228 - type: ndcg_at_100 value: 64.277 - type: ndcg_at_1000 value: 65.07 - type: ndcg_at_3 value: 54.124 - type: ndcg_at_5 value: 57.147000000000006 - type: precision_at_1 value: 45.016 - type: precision_at_10 value: 9.937 - type: precision_at_100 value: 1.288 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 24.471999999999998 - type: precision_at_5 value: 16.991 - type: recall_at_1 value: 39.383 - type: recall_at_10 value: 76.175 - type: recall_at_100 value: 93.02 - type: recall_at_1000 value: 98.60900000000001 - type: recall_at_3 value: 60.265 - type: recall_at_5 value: 67.46600000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackGisRetrieval type: BeIR/cqadupstack config: default split: test revision: 5003b3064772da1887988e05400cf3806fe491f2 metrics: - type: map_at_1 value: 27.426000000000002 - type: map_at_10 value: 37.397000000000006 - type: map_at_100 value: 38.61 - type: map_at_1000 value: 38.678000000000004 - type: map_at_3 value: 34.150999999999996 - type: map_at_5 value: 36.137 - type: mrr_at_1 value: 29.944 - type: mrr_at_10 value: 39.654 - type: mrr_at_100 value: 40.638000000000005 - type: mrr_at_1000 value: 40.691 - type: mrr_at_3 value: 36.817 - type: mrr_at_5 value: 38.524 - type: ndcg_at_1 value: 29.944 - type: ndcg_at_10 value: 43.094 - type: ndcg_at_100 value: 48.789 - type: ndcg_at_1000 value: 50.339999999999996 - type: ndcg_at_3 value: 36.984 - type: ndcg_at_5 value: 40.248 - type: precision_at_1 value: 29.944 - type: precision_at_10 value: 6.78 - type: precision_at_100 value: 1.024 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 15.895000000000001 - type: precision_at_5 value: 11.39 - type: recall_at_1 value: 27.426000000000002 - type: recall_at_10 value: 58.464000000000006 - type: recall_at_100 value: 84.193 - type: recall_at_1000 value: 95.52000000000001 - type: recall_at_3 value: 42.172 - type: recall_at_5 value: 50.101 - task: type: Retrieval dataset: name: MTEB CQADupstackMathematicaRetrieval type: BeIR/cqadupstack config: default split: test revision: 90fceea13679c63fe563ded68f3b6f06e50061de metrics: - type: map_at_1 value: 19.721 - type: map_at_10 value: 31.604 - type: map_at_100 value: 32.972 - type: map_at_1000 value: 33.077 - type: map_at_3 value: 27.218999999999998 - type: map_at_5 value: 29.53 - type: mrr_at_1 value: 25.0 - type: mrr_at_10 value: 35.843 - type: mrr_at_100 value: 36.785000000000004 - type: mrr_at_1000 value: 36.842000000000006 - type: mrr_at_3 value: 32.193 - type: mrr_at_5 value: 34.264 - type: ndcg_at_1 value: 25.0 - type: ndcg_at_10 value: 38.606 - type: ndcg_at_100 value: 44.272 - type: ndcg_at_1000 value: 46.527 - type: ndcg_at_3 value: 30.985000000000003 - type: ndcg_at_5 value: 34.43 - type: precision_at_1 value: 25.0 - type: precision_at_10 value: 7.811 - type: precision_at_100 value: 1.203 - type: precision_at_1000 value: 0.15 - type: precision_at_3 value: 15.423 - type: precision_at_5 value: 11.791 - type: 
recall_at_1 value: 19.721 - type: recall_at_10 value: 55.625 - type: recall_at_100 value: 79.34400000000001 - type: recall_at_1000 value: 95.208 - type: recall_at_3 value: 35.19 - type: recall_at_5 value: 43.626 - task: type: Retrieval dataset: name: MTEB CQADupstackPhysicsRetrieval type: BeIR/cqadupstack config: default split: test revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4 metrics: - type: map_at_1 value: 33.784 - type: map_at_10 value: 47.522 - type: map_at_100 value: 48.949999999999996 - type: map_at_1000 value: 49.038 - type: map_at_3 value: 43.284 - type: map_at_5 value: 45.629 - type: mrr_at_1 value: 41.482 - type: mrr_at_10 value: 52.830999999999996 - type: mrr_at_100 value: 53.559999999999995 - type: mrr_at_1000 value: 53.588 - type: mrr_at_3 value: 50.016000000000005 - type: mrr_at_5 value: 51.614000000000004 - type: ndcg_at_1 value: 41.482 - type: ndcg_at_10 value: 54.569 - type: ndcg_at_100 value: 59.675999999999995 - type: ndcg_at_1000 value: 60.989000000000004 - type: ndcg_at_3 value: 48.187000000000005 - type: ndcg_at_5 value: 51.183 - type: precision_at_1 value: 41.482 - type: precision_at_10 value: 10.221 - type: precision_at_100 value: 1.486 - type: precision_at_1000 value: 0.17500000000000002 - type: precision_at_3 value: 23.548 - type: precision_at_5 value: 16.805 - type: recall_at_1 value: 33.784 - type: recall_at_10 value: 69.798 - type: recall_at_100 value: 90.098 - type: recall_at_1000 value: 98.176 - type: recall_at_3 value: 52.127 - type: recall_at_5 value: 59.861 - task: type: Retrieval dataset: name: MTEB CQADupstackProgrammersRetrieval type: BeIR/cqadupstack config: default split: test revision: 6184bc1440d2dbc7612be22b50686b8826d22b32 metrics: - type: map_at_1 value: 28.038999999999998 - type: map_at_10 value: 41.904 - type: map_at_100 value: 43.36 - type: map_at_1000 value: 43.453 - type: map_at_3 value: 37.785999999999994 - type: map_at_5 value: 40.105000000000004 - type: mrr_at_1 value: 35.046 - type: mrr_at_10 value: 46.926 - type: mrr_at_100 value: 47.815000000000005 - type: mrr_at_1000 value: 47.849000000000004 - type: mrr_at_3 value: 44.273 - type: mrr_at_5 value: 45.774 - type: ndcg_at_1 value: 35.046 - type: ndcg_at_10 value: 48.937000000000005 - type: ndcg_at_100 value: 54.544000000000004 - type: ndcg_at_1000 value: 56.069 - type: ndcg_at_3 value: 42.858000000000004 - type: ndcg_at_5 value: 45.644 - type: precision_at_1 value: 35.046 - type: precision_at_10 value: 9.452 - type: precision_at_100 value: 1.429 - type: precision_at_1000 value: 0.173 - type: precision_at_3 value: 21.346999999999998 - type: precision_at_5 value: 15.342 - type: recall_at_1 value: 28.038999999999998 - type: recall_at_10 value: 64.59700000000001 - type: recall_at_100 value: 87.735 - type: recall_at_1000 value: 97.41300000000001 - type: recall_at_3 value: 47.368 - type: recall_at_5 value: 54.93900000000001 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: BeIR/cqadupstack config: default split: test revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4 metrics: - type: map_at_1 value: 28.17291666666667 - type: map_at_10 value: 40.025749999999995 - type: map_at_100 value: 41.39208333333333 - type: map_at_1000 value: 41.499249999999996 - type: map_at_3 value: 36.347 - type: map_at_5 value: 38.41391666666667 - type: mrr_at_1 value: 33.65925 - type: mrr_at_10 value: 44.085499999999996 - type: mrr_at_100 value: 44.94116666666667 - type: mrr_at_1000 value: 44.9855 - type: mrr_at_3 value: 41.2815 - type: mrr_at_5 value: 42.91491666666666 - type: ndcg_at_1 
value: 33.65925 - type: ndcg_at_10 value: 46.430833333333325 - type: ndcg_at_100 value: 51.761 - type: ndcg_at_1000 value: 53.50899999999999 - type: ndcg_at_3 value: 40.45133333333333 - type: ndcg_at_5 value: 43.31483333333334 - type: precision_at_1 value: 33.65925 - type: precision_at_10 value: 8.4995 - type: precision_at_100 value: 1.3210000000000004 - type: precision_at_1000 value: 0.16591666666666666 - type: precision_at_3 value: 19.165083333333335 - type: precision_at_5 value: 13.81816666666667 - type: recall_at_1 value: 28.17291666666667 - type: recall_at_10 value: 61.12624999999999 - type: recall_at_100 value: 83.97266666666667 - type: recall_at_1000 value: 95.66550000000001 - type: recall_at_3 value: 44.661249999999995 - type: recall_at_5 value: 51.983333333333334 - type: map_at_1 value: 17.936 - type: map_at_10 value: 27.399 - type: map_at_100 value: 28.632 - type: map_at_1000 value: 28.738000000000003 - type: map_at_3 value: 24.456 - type: map_at_5 value: 26.06 - type: mrr_at_1 value: 19.224 - type: mrr_at_10 value: 28.998 - type: mrr_at_100 value: 30.11 - type: mrr_at_1000 value: 30.177 - type: mrr_at_3 value: 26.247999999999998 - type: mrr_at_5 value: 27.708 - type: ndcg_at_1 value: 19.224 - type: ndcg_at_10 value: 32.911 - type: ndcg_at_100 value: 38.873999999999995 - type: ndcg_at_1000 value: 41.277 - type: ndcg_at_3 value: 27.142 - type: ndcg_at_5 value: 29.755 - type: precision_at_1 value: 19.224 - type: precision_at_10 value: 5.6930000000000005 - type: precision_at_100 value: 0.9259999999999999 - type: precision_at_1000 value: 0.126 - type: precision_at_3 value: 12.138 - type: precision_at_5 value: 8.909 - type: recall_at_1 value: 17.936 - type: recall_at_10 value: 48.096 - type: recall_at_100 value: 75.389 - type: recall_at_1000 value: 92.803 - type: recall_at_3 value: 32.812999999999995 - type: recall_at_5 value: 38.851 - task: type: Retrieval dataset: name: MTEB CQADupstackStatsRetrieval type: BeIR/cqadupstack config: default split: test revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a metrics: - type: map_at_1 value: 24.681 - type: map_at_10 value: 34.892 - type: map_at_100 value: 35.996 - type: map_at_1000 value: 36.083 - type: map_at_3 value: 31.491999999999997 - type: map_at_5 value: 33.632 - type: mrr_at_1 value: 28.528 - type: mrr_at_10 value: 37.694 - type: mrr_at_100 value: 38.613 - type: mrr_at_1000 value: 38.668 - type: mrr_at_3 value: 34.714 - type: mrr_at_5 value: 36.616 - type: ndcg_at_1 value: 28.528 - type: ndcg_at_10 value: 40.703 - type: ndcg_at_100 value: 45.993 - type: ndcg_at_1000 value: 47.847 - type: ndcg_at_3 value: 34.622 - type: ndcg_at_5 value: 38.035999999999994 - type: precision_at_1 value: 28.528 - type: precision_at_10 value: 6.902 - type: precision_at_100 value: 1.0370000000000001 - type: precision_at_1000 value: 0.126 - type: precision_at_3 value: 15.798000000000002 - type: precision_at_5 value: 11.655999999999999 - type: recall_at_1 value: 24.681 - type: recall_at_10 value: 55.81 - type: recall_at_100 value: 79.785 - type: recall_at_1000 value: 92.959 - type: recall_at_3 value: 39.074 - type: recall_at_5 value: 47.568 - task: type: Retrieval dataset: name: MTEB CQADupstackTexRetrieval type: BeIR/cqadupstack config: default split: test revision: 46989137a86843e03a6195de44b09deda022eec7 metrics: - type: map_at_1 value: 18.627 - type: map_at_10 value: 27.872000000000003 - type: map_at_100 value: 29.237999999999996 - type: map_at_1000 value: 29.363 - type: map_at_3 value: 24.751 - type: map_at_5 value: 26.521 - type: mrr_at_1 value: 23.021 
- type: mrr_at_10 value: 31.924000000000003 - type: mrr_at_100 value: 32.922000000000004 - type: mrr_at_1000 value: 32.988 - type: mrr_at_3 value: 29.192 - type: mrr_at_5 value: 30.798 - type: ndcg_at_1 value: 23.021 - type: ndcg_at_10 value: 33.535 - type: ndcg_at_100 value: 39.732 - type: ndcg_at_1000 value: 42.201 - type: ndcg_at_3 value: 28.153 - type: ndcg_at_5 value: 30.746000000000002 - type: precision_at_1 value: 23.021 - type: precision_at_10 value: 6.459 - type: precision_at_100 value: 1.1320000000000001 - type: precision_at_1000 value: 0.153 - type: precision_at_3 value: 13.719000000000001 - type: precision_at_5 value: 10.193000000000001 - type: recall_at_1 value: 18.627 - type: recall_at_10 value: 46.463 - type: recall_at_100 value: 74.226 - type: recall_at_1000 value: 91.28500000000001 - type: recall_at_3 value: 31.357000000000003 - type: recall_at_5 value: 38.067 - task: type: Retrieval dataset: name: MTEB CQADupstackUnixRetrieval type: BeIR/cqadupstack config: default split: test revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53 metrics: - type: map_at_1 value: 31.457 - type: map_at_10 value: 42.888 - type: map_at_100 value: 44.24 - type: map_at_1000 value: 44.327 - type: map_at_3 value: 39.588 - type: map_at_5 value: 41.423 - type: mrr_at_1 value: 37.126999999999995 - type: mrr_at_10 value: 47.083000000000006 - type: mrr_at_100 value: 47.997 - type: mrr_at_1000 value: 48.044 - type: mrr_at_3 value: 44.574000000000005 - type: mrr_at_5 value: 46.202 - type: ndcg_at_1 value: 37.126999999999995 - type: ndcg_at_10 value: 48.833 - type: ndcg_at_100 value: 54.327000000000005 - type: ndcg_at_1000 value: 56.011 - type: ndcg_at_3 value: 43.541999999999994 - type: ndcg_at_5 value: 46.127 - type: precision_at_1 value: 37.126999999999995 - type: precision_at_10 value: 8.376999999999999 - type: precision_at_100 value: 1.2309999999999999 - type: precision_at_1000 value: 0.146 - type: precision_at_3 value: 20.211000000000002 - type: precision_at_5 value: 14.16 - type: recall_at_1 value: 31.457 - type: recall_at_10 value: 62.369 - type: recall_at_100 value: 85.444 - type: recall_at_1000 value: 96.65599999999999 - type: recall_at_3 value: 47.961 - type: recall_at_5 value: 54.676 - task: type: Retrieval dataset: name: MTEB CQADupstackWebmastersRetrieval type: BeIR/cqadupstack config: default split: test revision: 160c094312a0e1facb97e55eeddb698c0abe3571 metrics: - type: map_at_1 value: 27.139999999999997 - type: map_at_10 value: 38.801 - type: map_at_100 value: 40.549 - type: map_at_1000 value: 40.802 - type: map_at_3 value: 35.05 - type: map_at_5 value: 36.884 - type: mrr_at_1 value: 33.004 - type: mrr_at_10 value: 43.864 - type: mrr_at_100 value: 44.667 - type: mrr_at_1000 value: 44.717 - type: mrr_at_3 value: 40.777 - type: mrr_at_5 value: 42.319 - type: ndcg_at_1 value: 33.004 - type: ndcg_at_10 value: 46.022 - type: ndcg_at_100 value: 51.542 - type: ndcg_at_1000 value: 53.742000000000004 - type: ndcg_at_3 value: 39.795 - type: ndcg_at_5 value: 42.272 - type: precision_at_1 value: 33.004 - type: precision_at_10 value: 9.012 - type: precision_at_100 value: 1.7770000000000001 - type: precision_at_1000 value: 0.26 - type: precision_at_3 value: 19.038 - type: precision_at_5 value: 13.675999999999998 - type: recall_at_1 value: 27.139999999999997 - type: recall_at_10 value: 60.961 - type: recall_at_100 value: 84.451 - type: recall_at_1000 value: 98.113 - type: recall_at_3 value: 43.001 - type: recall_at_5 value: 49.896 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: 
mteb/climate-fever config: default split: test revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380 metrics: - type: map_at_1 value: 22.076999999999998 - type: map_at_10 value: 35.44 - type: map_at_100 value: 37.651 - type: map_at_1000 value: 37.824999999999996 - type: map_at_3 value: 30.764999999999997 - type: map_at_5 value: 33.26 - type: mrr_at_1 value: 50.163000000000004 - type: mrr_at_10 value: 61.207 - type: mrr_at_100 value: 61.675000000000004 - type: mrr_at_1000 value: 61.692 - type: mrr_at_3 value: 58.60999999999999 - type: mrr_at_5 value: 60.307 - type: ndcg_at_1 value: 50.163000000000004 - type: ndcg_at_10 value: 45.882 - type: ndcg_at_100 value: 53.239999999999995 - type: ndcg_at_1000 value: 55.852000000000004 - type: ndcg_at_3 value: 40.514 - type: ndcg_at_5 value: 42.038 - type: precision_at_1 value: 50.163000000000004 - type: precision_at_10 value: 13.466000000000001 - type: precision_at_100 value: 2.164 - type: precision_at_1000 value: 0.266 - type: precision_at_3 value: 29.707 - type: precision_at_5 value: 21.694 - type: recall_at_1 value: 22.076999999999998 - type: recall_at_10 value: 50.193 - type: recall_at_100 value: 74.993 - type: recall_at_1000 value: 89.131 - type: recall_at_3 value: 35.472 - type: recall_at_5 value: 41.814 - task: type: Retrieval dataset: name: MTEB DBPedia type: mteb/dbpedia config: default split: test revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659 metrics: - type: map_at_1 value: 9.953 - type: map_at_10 value: 24.515 - type: map_at_100 value: 36.173 - type: map_at_1000 value: 38.351 - type: map_at_3 value: 16.592000000000002 - type: map_at_5 value: 20.036 - type: mrr_at_1 value: 74.25 - type: mrr_at_10 value: 81.813 - type: mrr_at_100 value: 82.006 - type: mrr_at_1000 value: 82.011 - type: mrr_at_3 value: 80.875 - type: mrr_at_5 value: 81.362 - type: ndcg_at_1 value: 62.5 - type: ndcg_at_10 value: 52.42 - type: ndcg_at_100 value: 56.808 - type: ndcg_at_1000 value: 63.532999999999994 - type: ndcg_at_3 value: 56.654 - type: ndcg_at_5 value: 54.18300000000001 - type: precision_at_1 value: 74.25 - type: precision_at_10 value: 42.699999999999996 - type: precision_at_100 value: 13.675 - type: precision_at_1000 value: 2.664 - type: precision_at_3 value: 60.5 - type: precision_at_5 value: 52.800000000000004 - type: recall_at_1 value: 9.953 - type: recall_at_10 value: 30.253999999999998 - type: recall_at_100 value: 62.516000000000005 - type: recall_at_1000 value: 84.163 - type: recall_at_3 value: 18.13 - type: recall_at_5 value: 22.771 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 79.455 - type: f1 value: 74.16798697647569 - task: type: Retrieval dataset: name: MTEB FEVER type: mteb/fever config: default split: test revision: bea83ef9e8fb933d90a2f1d5515737465d613e12 metrics: - type: map_at_1 value: 87.531 - type: map_at_10 value: 93.16799999999999 - type: map_at_100 value: 93.341 - type: map_at_1000 value: 93.349 - type: map_at_3 value: 92.444 - type: map_at_5 value: 92.865 - type: mrr_at_1 value: 94.014 - type: mrr_at_10 value: 96.761 - type: mrr_at_100 value: 96.762 - type: mrr_at_1000 value: 96.762 - type: mrr_at_3 value: 96.672 - type: mrr_at_5 value: 96.736 - type: ndcg_at_1 value: 94.014 - type: ndcg_at_10 value: 95.112 - type: ndcg_at_100 value: 95.578 - type: ndcg_at_1000 value: 95.68900000000001 - type: ndcg_at_3 value: 94.392 - type: ndcg_at_5 value: 94.72500000000001 - type: precision_at_1 
value: 94.014 - type: precision_at_10 value: 11.065 - type: precision_at_100 value: 1.157 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 35.259 - type: precision_at_5 value: 21.599 - type: recall_at_1 value: 87.531 - type: recall_at_10 value: 97.356 - type: recall_at_100 value: 98.965 - type: recall_at_1000 value: 99.607 - type: recall_at_3 value: 95.312 - type: recall_at_5 value: 96.295 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: mteb/fiqa config: default split: test revision: 27a168819829fe9bcd655c2df245fb19452e8e06 metrics: - type: map_at_1 value: 32.055 - type: map_at_10 value: 53.114 - type: map_at_100 value: 55.235 - type: map_at_1000 value: 55.345 - type: map_at_3 value: 45.854 - type: map_at_5 value: 50.025 - type: mrr_at_1 value: 60.34 - type: mrr_at_10 value: 68.804 - type: mrr_at_100 value: 69.309 - type: mrr_at_1000 value: 69.32199999999999 - type: mrr_at_3 value: 66.40899999999999 - type: mrr_at_5 value: 67.976 - type: ndcg_at_1 value: 60.34 - type: ndcg_at_10 value: 62.031000000000006 - type: ndcg_at_100 value: 68.00500000000001 - type: ndcg_at_1000 value: 69.286 - type: ndcg_at_3 value: 56.355999999999995 - type: ndcg_at_5 value: 58.687 - type: precision_at_1 value: 60.34 - type: precision_at_10 value: 17.176 - type: precision_at_100 value: 2.36 - type: precision_at_1000 value: 0.259 - type: precision_at_3 value: 37.14 - type: precision_at_5 value: 27.809 - type: recall_at_1 value: 32.055 - type: recall_at_10 value: 70.91 - type: recall_at_100 value: 91.83 - type: recall_at_1000 value: 98.871 - type: recall_at_3 value: 51.202999999999996 - type: recall_at_5 value: 60.563 - task: type: Retrieval dataset: name: MTEB HotpotQA type: mteb/hotpotqa config: default split: test revision: ab518f4d6fcca38d87c25209f94beba119d02014 metrics: - type: map_at_1 value: 43.68 - type: map_at_10 value: 64.389 - type: map_at_100 value: 65.24 - type: map_at_1000 value: 65.303 - type: map_at_3 value: 61.309000000000005 - type: map_at_5 value: 63.275999999999996 - type: mrr_at_1 value: 87.36 - type: mrr_at_10 value: 91.12 - type: mrr_at_100 value: 91.227 - type: mrr_at_1000 value: 91.229 - type: mrr_at_3 value: 90.57600000000001 - type: mrr_at_5 value: 90.912 - type: ndcg_at_1 value: 87.36 - type: ndcg_at_10 value: 73.076 - type: ndcg_at_100 value: 75.895 - type: ndcg_at_1000 value: 77.049 - type: ndcg_at_3 value: 68.929 - type: ndcg_at_5 value: 71.28 - type: precision_at_1 value: 87.36 - type: precision_at_10 value: 14.741000000000001 - type: precision_at_100 value: 1.694 - type: precision_at_1000 value: 0.185 - type: precision_at_3 value: 43.043 - type: precision_at_5 value: 27.681 - type: recall_at_1 value: 43.68 - type: recall_at_10 value: 73.707 - type: recall_at_100 value: 84.7 - type: recall_at_1000 value: 92.309 - type: recall_at_3 value: 64.564 - type: recall_at_5 value: 69.203 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 96.75399999999999 - type: ap value: 95.29389839242187 - type: f1 value: 96.75348377433475 - task: type: Retrieval dataset: name: MTEB MSMARCO type: mteb/msmarco config: default split: dev revision: c5a29a104738b98a9e76336939199e264163d4a0 metrics: - type: map_at_1 value: 25.176 - type: map_at_10 value: 38.598 - type: map_at_100 value: 39.707 - type: map_at_1000 value: 39.744 - type: map_at_3 value: 34.566 - type: map_at_5 value: 36.863 - type: mrr_at_1 value: 
25.874000000000002 - type: mrr_at_10 value: 39.214 - type: mrr_at_100 value: 40.251 - type: mrr_at_1000 value: 40.281 - type: mrr_at_3 value: 35.291 - type: mrr_at_5 value: 37.545 - type: ndcg_at_1 value: 25.874000000000002 - type: ndcg_at_10 value: 45.98 - type: ndcg_at_100 value: 51.197 - type: ndcg_at_1000 value: 52.073 - type: ndcg_at_3 value: 37.785999999999994 - type: ndcg_at_5 value: 41.870000000000005 - type: precision_at_1 value: 25.874000000000002 - type: precision_at_10 value: 7.181 - type: precision_at_100 value: 0.979 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 16.051000000000002 - type: precision_at_5 value: 11.713 - type: recall_at_1 value: 25.176 - type: recall_at_10 value: 68.67699999999999 - type: recall_at_100 value: 92.55 - type: recall_at_1000 value: 99.164 - type: recall_at_3 value: 46.372 - type: recall_at_5 value: 56.16 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 99.03784769721841 - type: f1 value: 98.97791641821495 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 91.88326493388054 - type: f1 value: 73.74809928034335 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 85.41358439811701 - type: f1 value: 83.503679460639 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 89.77135171486215 - type: f1 value: 88.89843747468366 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 46.22695362087359 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 44.132372165849425 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 33.35680810650402 - type: mrr value: 34.72625715637218 - task: type: Retrieval dataset: name: MTEB NFCorpus type: mteb/nfcorpus config: default split: test revision: ec0fa4fe99da2ff19ca1214b7966684033a58814 metrics: - type: map_at_1 value: 7.165000000000001 - type: map_at_10 value: 15.424 - type: map_at_100 value: 20.28 - type: map_at_1000 value: 22.065 - type: map_at_3 value: 11.236 - type: map_at_5 value: 13.025999999999998 - type: mrr_at_1 value: 51.702999999999996 - type: mrr_at_10 value: 59.965 - type: mrr_at_100 value: 60.667 - type: mrr_at_1000 value: 60.702999999999996 - type: mrr_at_3 value: 58.772000000000006 - type: mrr_at_5 value: 59.267 - type: ndcg_at_1 value: 49.536 - type: ndcg_at_10 value: 40.6 - type: ndcg_at_100 value: 37.848 - type: ndcg_at_1000 value: 46.657 - type: ndcg_at_3 value: 46.117999999999995 - type: ndcg_at_5 value: 43.619 - type: precision_at_1 value: 51.393 - type: precision_at_10 value: 
30.31 - type: precision_at_100 value: 9.972 - type: precision_at_1000 value: 2.329 - type: precision_at_3 value: 43.137 - type: precision_at_5 value: 37.585 - type: recall_at_1 value: 7.165000000000001 - type: recall_at_10 value: 19.689999999999998 - type: recall_at_100 value: 39.237 - type: recall_at_1000 value: 71.417 - type: recall_at_3 value: 12.247 - type: recall_at_5 value: 14.902999999999999 - task: type: Retrieval dataset: name: MTEB NQ type: mteb/nq config: default split: test revision: b774495ed302d8c44a3a7ea25c90dbce03968f31 metrics: - type: map_at_1 value: 42.653999999999996 - type: map_at_10 value: 59.611999999999995 - type: map_at_100 value: 60.32300000000001 - type: map_at_1000 value: 60.336 - type: map_at_3 value: 55.584999999999994 - type: map_at_5 value: 58.19 - type: mrr_at_1 value: 47.683 - type: mrr_at_10 value: 62.06700000000001 - type: mrr_at_100 value: 62.537 - type: mrr_at_1000 value: 62.544999999999995 - type: mrr_at_3 value: 59.178 - type: mrr_at_5 value: 61.034 - type: ndcg_at_1 value: 47.654 - type: ndcg_at_10 value: 67.001 - type: ndcg_at_100 value: 69.73899999999999 - type: ndcg_at_1000 value: 69.986 - type: ndcg_at_3 value: 59.95700000000001 - type: ndcg_at_5 value: 64.025 - type: precision_at_1 value: 47.654 - type: precision_at_10 value: 10.367999999999999 - type: precision_at_100 value: 1.192 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 26.651000000000003 - type: precision_at_5 value: 18.459 - type: recall_at_1 value: 42.653999999999996 - type: recall_at_10 value: 86.619 - type: recall_at_100 value: 98.04899999999999 - type: recall_at_1000 value: 99.812 - type: recall_at_3 value: 68.987 - type: recall_at_5 value: 78.158 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: mteb/quora config: default split: test revision: None metrics: - type: map_at_1 value: 72.538 - type: map_at_10 value: 86.702 - type: map_at_100 value: 87.31 - type: map_at_1000 value: 87.323 - type: map_at_3 value: 83.87 - type: map_at_5 value: 85.682 - type: mrr_at_1 value: 83.31 - type: mrr_at_10 value: 89.225 - type: mrr_at_100 value: 89.30399999999999 - type: mrr_at_1000 value: 89.30399999999999 - type: mrr_at_3 value: 88.44300000000001 - type: mrr_at_5 value: 89.005 - type: ndcg_at_1 value: 83.32000000000001 - type: ndcg_at_10 value: 90.095 - type: ndcg_at_100 value: 91.12 - type: ndcg_at_1000 value: 91.179 - type: ndcg_at_3 value: 87.606 - type: ndcg_at_5 value: 89.031 - type: precision_at_1 value: 83.32000000000001 - type: precision_at_10 value: 13.641 - type: precision_at_100 value: 1.541 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 38.377 - type: precision_at_5 value: 25.162000000000003 - type: recall_at_1 value: 72.538 - type: recall_at_10 value: 96.47200000000001 - type: recall_at_100 value: 99.785 - type: recall_at_1000 value: 99.99900000000001 - type: recall_at_3 value: 89.278 - type: recall_at_5 value: 93.367 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 73.55219145406065 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 74.13437105242755 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: mteb/scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 6.873 - type: 
map_at_10 value: 17.944 - type: map_at_100 value: 21.171 - type: map_at_1000 value: 21.528 - type: map_at_3 value: 12.415 - type: map_at_5 value: 15.187999999999999 - type: mrr_at_1 value: 33.800000000000004 - type: mrr_at_10 value: 46.455 - type: mrr_at_100 value: 47.378 - type: mrr_at_1000 value: 47.394999999999996 - type: mrr_at_3 value: 42.367 - type: mrr_at_5 value: 44.972 - type: ndcg_at_1 value: 33.800000000000004 - type: ndcg_at_10 value: 28.907 - type: ndcg_at_100 value: 39.695 - type: ndcg_at_1000 value: 44.582 - type: ndcg_at_3 value: 26.949 - type: ndcg_at_5 value: 23.988 - type: precision_at_1 value: 33.800000000000004 - type: precision_at_10 value: 15.079999999999998 - type: precision_at_100 value: 3.056 - type: precision_at_1000 value: 0.42100000000000004 - type: precision_at_3 value: 25.167 - type: precision_at_5 value: 21.26 - type: recall_at_1 value: 6.873 - type: recall_at_10 value: 30.568 - type: recall_at_100 value: 62.062 - type: recall_at_1000 value: 85.37700000000001 - type: recall_at_3 value: 15.312999999999999 - type: recall_at_5 value: 21.575 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 82.37009118256057 - type: cos_sim_spearman value: 79.27986395671529 - type: euclidean_pearson value: 79.18037715442115 - type: euclidean_spearman value: 79.28004791561621 - type: manhattan_pearson value: 79.34062972800541 - type: manhattan_spearman value: 79.43106695543402 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 87.48474767383833 - type: cos_sim_spearman value: 79.54505388752513 - type: euclidean_pearson value: 83.43282704179565 - type: euclidean_spearman value: 79.54579919925405 - type: manhattan_pearson value: 83.77564492427952 - type: manhattan_spearman value: 79.84558396989286 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 88.803698035802 - type: cos_sim_spearman value: 88.83451367754881 - type: euclidean_pearson value: 88.28939285711628 - type: euclidean_spearman value: 88.83528996073112 - type: manhattan_pearson value: 88.28017412671795 - type: manhattan_spearman value: 88.9228828016344 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 85.27469288153428 - type: cos_sim_spearman value: 83.87477064876288 - type: euclidean_pearson value: 84.2601737035379 - type: euclidean_spearman value: 83.87431082479074 - type: manhattan_pearson value: 84.3621547772745 - type: manhattan_spearman value: 84.12094375000423 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 88.12749863201587 - type: cos_sim_spearman value: 88.54287568368565 - type: euclidean_pearson value: 87.90429700607999 - type: euclidean_spearman value: 88.5437689576261 - type: manhattan_pearson value: 88.19276653356833 - type: manhattan_spearman value: 88.99995393814679 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 
85.68398747560902 - type: cos_sim_spearman value: 86.48815303460574 - type: euclidean_pearson value: 85.52356631237954 - type: euclidean_spearman value: 86.486391949551 - type: manhattan_pearson value: 85.67267981761788 - type: manhattan_spearman value: 86.7073696332485 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 88.9057107443124 - type: cos_sim_spearman value: 88.7312168757697 - type: euclidean_pearson value: 88.72810439714794 - type: euclidean_spearman value: 88.71976185854771 - type: manhattan_pearson value: 88.50433745949111 - type: manhattan_spearman value: 88.51726175544195 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 67.59391795109886 - type: cos_sim_spearman value: 66.87613008631367 - type: euclidean_pearson value: 69.23198488262217 - type: euclidean_spearman value: 66.85427723013692 - type: manhattan_pearson value: 69.50730124841084 - type: manhattan_spearman value: 67.10404669820792 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 87.0820605344619 - type: cos_sim_spearman value: 86.8518089863434 - type: euclidean_pearson value: 86.31087134689284 - type: euclidean_spearman value: 86.8518520517941 - type: manhattan_pearson value: 86.47203796160612 - type: manhattan_spearman value: 87.1080149734421 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 89.09255369305481 - type: mrr value: 97.10323445617563 - task: type: Retrieval dataset: name: MTEB SciFact type: mteb/scifact config: default split: test revision: 0228b52cf27578f30900b9e5271d331663a030d7 metrics: - type: map_at_1 value: 61.260999999999996 - type: map_at_10 value: 74.043 - type: map_at_100 value: 74.37700000000001 - type: map_at_1000 value: 74.384 - type: map_at_3 value: 71.222 - type: map_at_5 value: 72.875 - type: mrr_at_1 value: 64.333 - type: mrr_at_10 value: 74.984 - type: mrr_at_100 value: 75.247 - type: mrr_at_1000 value: 75.25500000000001 - type: mrr_at_3 value: 73.167 - type: mrr_at_5 value: 74.35000000000001 - type: ndcg_at_1 value: 64.333 - type: ndcg_at_10 value: 79.06 - type: ndcg_at_100 value: 80.416 - type: ndcg_at_1000 value: 80.55600000000001 - type: ndcg_at_3 value: 74.753 - type: ndcg_at_5 value: 76.97500000000001 - type: precision_at_1 value: 64.333 - type: precision_at_10 value: 10.567 - type: precision_at_100 value: 1.1199999999999999 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 29.889 - type: precision_at_5 value: 19.533 - type: recall_at_1 value: 61.260999999999996 - type: recall_at_10 value: 93.167 - type: recall_at_100 value: 99.0 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 81.667 - type: recall_at_5 value: 87.394 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.71980198019801 - type: cos_sim_ap value: 92.81616007802704 - type: cos_sim_f1 value: 
85.17548454688318 - type: cos_sim_precision value: 89.43894389438944 - type: cos_sim_recall value: 81.3 - type: dot_accuracy value: 99.71980198019801 - type: dot_ap value: 92.81398760591358 - type: dot_f1 value: 85.17548454688318 - type: dot_precision value: 89.43894389438944 - type: dot_recall value: 81.3 - type: euclidean_accuracy value: 99.71980198019801 - type: euclidean_ap value: 92.81560637245072 - type: euclidean_f1 value: 85.17548454688318 - type: euclidean_precision value: 89.43894389438944 - type: euclidean_recall value: 81.3 - type: manhattan_accuracy value: 99.73069306930694 - type: manhattan_ap value: 93.14005487480794 - type: manhattan_f1 value: 85.56263269639068 - type: manhattan_precision value: 91.17647058823529 - type: manhattan_recall value: 80.60000000000001 - type: max_accuracy value: 99.73069306930694 - type: max_ap value: 93.14005487480794 - type: max_f1 value: 85.56263269639068 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 79.86443362395185 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 49.40897096662564 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 55.66040806627947 - type: mrr value: 56.58670475766064 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 31.51015090598575 - type: cos_sim_spearman value: 31.35016454939226 - type: dot_pearson value: 31.5150068731 - type: dot_spearman value: 31.34790869023487 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: mteb/trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.254 - type: map_at_10 value: 2.064 - type: map_at_100 value: 12.909 - type: map_at_1000 value: 31.761 - type: map_at_3 value: 0.738 - type: map_at_5 value: 1.155 - type: mrr_at_1 value: 96.0 - type: mrr_at_10 value: 98.0 - type: mrr_at_100 value: 98.0 - type: mrr_at_1000 value: 98.0 - type: mrr_at_3 value: 98.0 - type: mrr_at_5 value: 98.0 - type: ndcg_at_1 value: 93.0 - type: ndcg_at_10 value: 82.258 - type: ndcg_at_100 value: 64.34 - type: ndcg_at_1000 value: 57.912 - type: ndcg_at_3 value: 90.827 - type: ndcg_at_5 value: 86.79 - type: precision_at_1 value: 96.0 - type: precision_at_10 value: 84.8 - type: precision_at_100 value: 66.0 - type: precision_at_1000 value: 25.356 - type: precision_at_3 value: 94.667 - type: precision_at_5 value: 90.4 - type: recall_at_1 value: 0.254 - type: recall_at_10 value: 2.1950000000000003 - type: recall_at_100 value: 16.088 - type: recall_at_1000 value: 54.559000000000005 - type: recall_at_3 value: 0.75 - type: recall_at_5 value: 1.191 - task: type: Retrieval dataset: name: MTEB Touche2020 type: mteb/touche2020 config: default split: test revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f metrics: - type: map_at_1 value: 2.976 - type: map_at_10 value: 11.389000000000001 - type: map_at_100 value: 18.429000000000002 - type: map_at_1000 value: 20.113 - type: map_at_3 value: 6.483 - type: map_at_5 value: 8.770999999999999 
- type: mrr_at_1 value: 40.816 - type: mrr_at_10 value: 58.118 - type: mrr_at_100 value: 58.489999999999995 - type: mrr_at_1000 value: 58.489999999999995 - type: mrr_at_3 value: 53.061 - type: mrr_at_5 value: 57.041 - type: ndcg_at_1 value: 40.816 - type: ndcg_at_10 value: 30.567 - type: ndcg_at_100 value: 42.44 - type: ndcg_at_1000 value: 53.480000000000004 - type: ndcg_at_3 value: 36.016 - type: ndcg_at_5 value: 34.257 - type: precision_at_1 value: 42.857 - type: precision_at_10 value: 25.714 - type: precision_at_100 value: 8.429 - type: precision_at_1000 value: 1.5939999999999999 - type: precision_at_3 value: 36.735 - type: precision_at_5 value: 33.878 - type: recall_at_1 value: 2.976 - type: recall_at_10 value: 17.854999999999997 - type: recall_at_100 value: 51.833 - type: recall_at_1000 value: 86.223 - type: recall_at_3 value: 7.887 - type: recall_at_5 value: 12.026 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 85.1174 - type: ap value: 30.169441069345748 - type: f1 value: 69.79254701873245 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 72.58347481607245 - type: f1 value: 72.74877295564937 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 53.90586138221305 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 87.35769207844072 - type: cos_sim_ap value: 77.9645072410354 - type: cos_sim_f1 value: 71.32352941176471 - type: cos_sim_precision value: 66.5903890160183 - type: cos_sim_recall value: 76.78100263852242 - type: dot_accuracy value: 87.37557370209214 - type: dot_ap value: 77.96250046429908 - type: dot_f1 value: 71.28932757557064 - type: dot_precision value: 66.95249130938586 - type: dot_recall value: 76.22691292875989 - type: euclidean_accuracy value: 87.35173153722357 - type: euclidean_ap value: 77.96520460741593 - type: euclidean_f1 value: 71.32470733210104 - type: euclidean_precision value: 66.91329479768785 - type: euclidean_recall value: 76.35883905013192 - type: manhattan_accuracy value: 87.25636287774931 - type: manhattan_ap value: 77.77752485611796 - type: manhattan_f1 value: 71.18148599269183 - type: manhattan_precision value: 66.10859728506787 - type: manhattan_recall value: 77.0976253298153 - type: max_accuracy value: 87.37557370209214 - type: max_ap value: 77.96520460741593 - type: max_f1 value: 71.32470733210104 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.38176737687739 - type: cos_sim_ap value: 86.58811861657401 - type: cos_sim_f1 value: 79.09430644097604 - type: cos_sim_precision value: 75.45085977911366 - type: cos_sim_recall value: 83.10748383122882 - type: dot_accuracy value: 89.38370784336554 - type: dot_ap value: 86.58840606004333 - type: dot_f1 
value: 79.10179860068133 - type: dot_precision value: 75.44546153308643 - type: dot_recall value: 83.13058207576223 - type: euclidean_accuracy value: 89.38564830985369 - type: euclidean_ap value: 86.58820721061164 - type: euclidean_f1 value: 79.09070942235888 - type: euclidean_precision value: 75.38729937194697 - type: euclidean_recall value: 83.17677856482906 - type: manhattan_accuracy value: 89.40699344122326 - type: manhattan_ap value: 86.60631843011362 - type: manhattan_f1 value: 79.14949970570925 - type: manhattan_precision value: 75.78191039729502 - type: manhattan_recall value: 82.83030489682784 - type: max_accuracy value: 89.40699344122326 - type: max_ap value: 86.60631843011362 - type: max_f1 value: 79.14949970570925 - task: type: STS dataset: name: MTEB AFQMC type: C-MTEB/AFQMC config: default split: validation revision: b44c3b011063adb25877c13823db83bb193913c4 metrics: - type: cos_sim_pearson value: 65.58442135663871 - type: cos_sim_spearman value: 72.2538631361313 - type: euclidean_pearson value: 70.97255486607429 - type: euclidean_spearman value: 72.25374250228647 - type: manhattan_pearson value: 70.83250199989911 - type: manhattan_spearman value: 72.14819496536272 - task: type: STS dataset: name: MTEB ATEC type: C-MTEB/ATEC config: default split: test revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865 metrics: - type: cos_sim_pearson value: 59.99478404929932 - type: cos_sim_spearman value: 62.61836216999812 - type: euclidean_pearson value: 66.86429811933593 - type: euclidean_spearman value: 62.6183520374191 - type: manhattan_pearson value: 66.8063778911633 - type: manhattan_spearman value: 62.569607573241115 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 53.98400000000001 - type: f1 value: 51.21447361350723 - task: type: STS dataset: name: MTEB BQ type: C-MTEB/BQ config: default split: test revision: e3dda5e115e487b39ec7e618c0c6a29137052a55 metrics: - type: cos_sim_pearson value: 79.11941660686553 - type: cos_sim_spearman value: 81.25029594540435 - type: euclidean_pearson value: 82.06973504238826 - type: euclidean_spearman value: 81.2501989488524 - type: manhattan_pearson value: 82.10094630392753 - type: manhattan_spearman value: 81.27987244392389 - task: type: Clustering dataset: name: MTEB CLSClusteringP2P type: C-MTEB/CLSClusteringP2P config: default split: test revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476 metrics: - type: v_measure value: 47.07270168705156 - task: type: Clustering dataset: name: MTEB CLSClusteringS2S type: C-MTEB/CLSClusteringS2S config: default split: test revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f metrics: - type: v_measure value: 45.98511703185043 - task: type: Reranking dataset: name: MTEB CMedQAv1 type: C-MTEB/CMedQAv1-reranking config: default split: test revision: 8d7f1e942507dac42dc58017c1a001c3717da7df metrics: - type: map value: 88.19895157194931 - type: mrr value: 90.21424603174603 - task: type: Reranking dataset: name: MTEB CMedQAv2 type: C-MTEB/CMedQAv2-reranking config: default split: test revision: 23d186750531a14a0357ca22cd92d712fd512ea0 metrics: - type: map value: 88.03317320980119 - type: mrr value: 89.9461507936508 - task: type: Retrieval dataset: name: MTEB CmedqaRetrieval type: C-MTEB/CmedqaRetrieval config: default split: dev revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301 metrics: - type: map_at_1 value: 29.037000000000003 - type: map_at_10 
value: 42.001 - type: map_at_100 value: 43.773 - type: map_at_1000 value: 43.878 - type: map_at_3 value: 37.637 - type: map_at_5 value: 40.034 - type: mrr_at_1 value: 43.136 - type: mrr_at_10 value: 51.158 - type: mrr_at_100 value: 52.083 - type: mrr_at_1000 value: 52.12 - type: mrr_at_3 value: 48.733 - type: mrr_at_5 value: 50.025 - type: ndcg_at_1 value: 43.136 - type: ndcg_at_10 value: 48.685 - type: ndcg_at_100 value: 55.513 - type: ndcg_at_1000 value: 57.242000000000004 - type: ndcg_at_3 value: 43.329 - type: ndcg_at_5 value: 45.438 - type: precision_at_1 value: 43.136 - type: precision_at_10 value: 10.56 - type: precision_at_100 value: 1.6129999999999998 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 24.064 - type: precision_at_5 value: 17.269000000000002 - type: recall_at_1 value: 29.037000000000003 - type: recall_at_10 value: 59.245000000000005 - type: recall_at_100 value: 87.355 - type: recall_at_1000 value: 98.74000000000001 - type: recall_at_3 value: 42.99 - type: recall_at_5 value: 49.681999999999995 - task: type: PairClassification dataset: name: MTEB Cmnli type: C-MTEB/CMNLI config: default split: validation revision: 41bc36f332156f7adc9e38f53777c959b2ae9766 metrics: - type: cos_sim_accuracy value: 82.68190018039687 - type: cos_sim_ap value: 90.18017125327886 - type: cos_sim_f1 value: 83.64080906868193 - type: cos_sim_precision value: 79.7076890489303 - type: cos_sim_recall value: 87.98223053542202 - type: dot_accuracy value: 82.68190018039687 - type: dot_ap value: 90.18782350103646 - type: dot_f1 value: 83.64242087729039 - type: dot_precision value: 79.65313028764805 - type: dot_recall value: 88.05237315875614 - type: euclidean_accuracy value: 82.68190018039687 - type: euclidean_ap value: 90.1801957900632 - type: euclidean_f1 value: 83.63636363636364 - type: euclidean_precision value: 79.52772506852203 - type: euclidean_recall value: 88.19265840542437 - type: manhattan_accuracy value: 82.14070956103427 - type: manhattan_ap value: 89.96178420101427 - type: manhattan_f1 value: 83.21087838578791 - type: manhattan_precision value: 78.35605121850475 - type: manhattan_recall value: 88.70703764320785 - type: max_accuracy value: 82.68190018039687 - type: max_ap value: 90.18782350103646 - type: max_f1 value: 83.64242087729039 - task: type: Retrieval dataset: name: MTEB CovidRetrieval type: C-MTEB/CovidRetrieval config: default split: dev revision: 1271c7809071a13532e05f25fb53511ffce77117 metrics: - type: map_at_1 value: 72.234 - type: map_at_10 value: 80.10000000000001 - type: map_at_100 value: 80.36 - type: map_at_1000 value: 80.363 - type: map_at_3 value: 78.315 - type: map_at_5 value: 79.607 - type: mrr_at_1 value: 72.392 - type: mrr_at_10 value: 80.117 - type: mrr_at_100 value: 80.36999999999999 - type: mrr_at_1000 value: 80.373 - type: mrr_at_3 value: 78.469 - type: mrr_at_5 value: 79.633 - type: ndcg_at_1 value: 72.392 - type: ndcg_at_10 value: 83.651 - type: ndcg_at_100 value: 84.749 - type: ndcg_at_1000 value: 84.83000000000001 - type: ndcg_at_3 value: 80.253 - type: ndcg_at_5 value: 82.485 - type: precision_at_1 value: 72.392 - type: precision_at_10 value: 9.557 - type: precision_at_100 value: 1.004 - type: precision_at_1000 value: 0.101 - type: precision_at_3 value: 28.732000000000003 - type: precision_at_5 value: 18.377 - type: recall_at_1 value: 72.234 - type: recall_at_10 value: 94.573 - type: recall_at_100 value: 99.368 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 85.669 - type: recall_at_5 value: 91.01700000000001 - task: type: 
Retrieval dataset: name: MTEB DuRetrieval type: C-MTEB/DuRetrieval config: default split: dev revision: a1a333e290fe30b10f3f56498e3a0d911a693ced metrics: - type: map_at_1 value: 26.173999999999996 - type: map_at_10 value: 80.04 - type: map_at_100 value: 82.94500000000001 - type: map_at_1000 value: 82.98100000000001 - type: map_at_3 value: 55.562999999999995 - type: map_at_5 value: 69.89800000000001 - type: mrr_at_1 value: 89.5 - type: mrr_at_10 value: 92.996 - type: mrr_at_100 value: 93.06400000000001 - type: mrr_at_1000 value: 93.065 - type: mrr_at_3 value: 92.658 - type: mrr_at_5 value: 92.84599999999999 - type: ndcg_at_1 value: 89.5 - type: ndcg_at_10 value: 87.443 - type: ndcg_at_100 value: 90.253 - type: ndcg_at_1000 value: 90.549 - type: ndcg_at_3 value: 85.874 - type: ndcg_at_5 value: 84.842 - type: precision_at_1 value: 89.5 - type: precision_at_10 value: 41.805 - type: precision_at_100 value: 4.827 - type: precision_at_1000 value: 0.49 - type: precision_at_3 value: 76.85 - type: precision_at_5 value: 64.8 - type: recall_at_1 value: 26.173999999999996 - type: recall_at_10 value: 89.101 - type: recall_at_100 value: 98.08099999999999 - type: recall_at_1000 value: 99.529 - type: recall_at_3 value: 57.902 - type: recall_at_5 value: 74.602 - task: type: Retrieval dataset: name: MTEB EcomRetrieval type: C-MTEB/EcomRetrieval config: default split: dev revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9 metrics: - type: map_at_1 value: 56.10000000000001 - type: map_at_10 value: 66.15299999999999 - type: map_at_100 value: 66.625 - type: map_at_1000 value: 66.636 - type: map_at_3 value: 63.632999999999996 - type: map_at_5 value: 65.293 - type: mrr_at_1 value: 56.10000000000001 - type: mrr_at_10 value: 66.15299999999999 - type: mrr_at_100 value: 66.625 - type: mrr_at_1000 value: 66.636 - type: mrr_at_3 value: 63.632999999999996 - type: mrr_at_5 value: 65.293 - type: ndcg_at_1 value: 56.10000000000001 - type: ndcg_at_10 value: 71.146 - type: ndcg_at_100 value: 73.27799999999999 - type: ndcg_at_1000 value: 73.529 - type: ndcg_at_3 value: 66.09 - type: ndcg_at_5 value: 69.08999999999999 - type: precision_at_1 value: 56.10000000000001 - type: precision_at_10 value: 8.68 - type: precision_at_100 value: 0.964 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 24.4 - type: precision_at_5 value: 16.1 - type: recall_at_1 value: 56.10000000000001 - type: recall_at_10 value: 86.8 - type: recall_at_100 value: 96.39999999999999 - type: recall_at_1000 value: 98.3 - type: recall_at_3 value: 73.2 - type: recall_at_5 value: 80.5 - task: type: Classification dataset: name: MTEB IFlyTek type: C-MTEB/IFlyTek-classification config: default split: validation revision: 421605374b29664c5fc098418fe20ada9bd55f8a metrics: - type: accuracy value: 54.52096960369373 - type: f1 value: 40.930845295808695 - task: type: Classification dataset: name: MTEB JDReview type: C-MTEB/JDReview-classification config: default split: test revision: b7c64bd89eb87f8ded463478346f76731f07bf8b metrics: - type: accuracy value: 86.51031894934334 - type: ap value: 55.9516014323483 - type: f1 value: 81.54813679326381 - task: type: STS dataset: name: MTEB LCQMC type: C-MTEB/LCQMC config: default split: test revision: 17f9b096f80380fce5ed12a9be8be7784b337daf metrics: - type: cos_sim_pearson value: 69.67437838574276 - type: cos_sim_spearman value: 73.81314174653045 - type: euclidean_pearson value: 72.63430276680275 - type: euclidean_spearman value: 73.81358736777001 - type: manhattan_pearson value: 72.58743833842829 - type: 
manhattan_spearman value: 73.7590419009179 - task: type: Reranking dataset: name: MTEB MMarcoReranking type: C-MTEB/Mmarco-reranking config: default split: dev revision: None metrics: - type: map value: 31.648613483640254 - type: mrr value: 30.37420634920635 - task: type: Retrieval dataset: name: MTEB MMarcoRetrieval type: C-MTEB/MMarcoRetrieval config: default split: dev revision: 539bbde593d947e2a124ba72651aafc09eb33fc2 metrics: - type: map_at_1 value: 73.28099999999999 - type: map_at_10 value: 81.977 - type: map_at_100 value: 82.222 - type: map_at_1000 value: 82.22699999999999 - type: map_at_3 value: 80.441 - type: map_at_5 value: 81.46600000000001 - type: mrr_at_1 value: 75.673 - type: mrr_at_10 value: 82.41000000000001 - type: mrr_at_100 value: 82.616 - type: mrr_at_1000 value: 82.621 - type: mrr_at_3 value: 81.094 - type: mrr_at_5 value: 81.962 - type: ndcg_at_1 value: 75.673 - type: ndcg_at_10 value: 85.15599999999999 - type: ndcg_at_100 value: 86.151 - type: ndcg_at_1000 value: 86.26899999999999 - type: ndcg_at_3 value: 82.304 - type: ndcg_at_5 value: 84.009 - type: precision_at_1 value: 75.673 - type: precision_at_10 value: 10.042 - type: precision_at_100 value: 1.052 - type: precision_at_1000 value: 0.106 - type: precision_at_3 value: 30.673000000000002 - type: precision_at_5 value: 19.326999999999998 - type: recall_at_1 value: 73.28099999999999 - type: recall_at_10 value: 94.446 - type: recall_at_100 value: 98.737 - type: recall_at_1000 value: 99.649 - type: recall_at_3 value: 86.984 - type: recall_at_5 value: 91.024 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (zh-CN) type: mteb/amazon_massive_intent config: zh-CN split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 81.08607935440484 - type: f1 value: 78.24879986066307 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (zh-CN) type: mteb/amazon_massive_scenario config: zh-CN split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 86.05917955615332 - type: f1 value: 85.05279279434997 - task: type: Retrieval dataset: name: MTEB MedicalRetrieval type: C-MTEB/MedicalRetrieval config: default split: dev revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6 metrics: - type: map_at_1 value: 56.2 - type: map_at_10 value: 62.57899999999999 - type: map_at_100 value: 63.154999999999994 - type: map_at_1000 value: 63.193 - type: map_at_3 value: 61.217 - type: map_at_5 value: 62.012 - type: mrr_at_1 value: 56.3 - type: mrr_at_10 value: 62.629000000000005 - type: mrr_at_100 value: 63.205999999999996 - type: mrr_at_1000 value: 63.244 - type: mrr_at_3 value: 61.267 - type: mrr_at_5 value: 62.062 - type: ndcg_at_1 value: 56.2 - type: ndcg_at_10 value: 65.592 - type: ndcg_at_100 value: 68.657 - type: ndcg_at_1000 value: 69.671 - type: ndcg_at_3 value: 62.808 - type: ndcg_at_5 value: 64.24499999999999 - type: precision_at_1 value: 56.2 - type: precision_at_10 value: 7.5 - type: precision_at_100 value: 0.899 - type: precision_at_1000 value: 0.098 - type: precision_at_3 value: 22.467000000000002 - type: precision_at_5 value: 14.180000000000001 - type: recall_at_1 value: 56.2 - type: recall_at_10 value: 75.0 - type: recall_at_100 value: 89.9 - type: recall_at_1000 value: 97.89999999999999 - type: recall_at_3 value: 67.4 - type: recall_at_5 value: 70.89999999999999 - task: type: Classification dataset: name: MTEB MultilingualSentiment type: C-MTEB/MultilingualSentiment-classification config: default split: 
validation revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a metrics: - type: accuracy value: 76.87666666666667 - type: f1 value: 76.7317686219665 - task: type: PairClassification dataset: name: MTEB Ocnli type: C-MTEB/OCNLI config: default split: validation revision: 66e76a618a34d6d565d5538088562851e6daa7ec metrics: - type: cos_sim_accuracy value: 79.64266377910124 - type: cos_sim_ap value: 84.78274442344829 - type: cos_sim_f1 value: 81.16947472745292 - type: cos_sim_precision value: 76.47058823529412 - type: cos_sim_recall value: 86.48363252375924 - type: dot_accuracy value: 79.64266377910124 - type: dot_ap value: 84.7851404063692 - type: dot_f1 value: 81.16947472745292 - type: dot_precision value: 76.47058823529412 - type: dot_recall value: 86.48363252375924 - type: euclidean_accuracy value: 79.64266377910124 - type: euclidean_ap value: 84.78068373762378 - type: euclidean_f1 value: 81.14794656110837 - type: euclidean_precision value: 76.35009310986965 - type: euclidean_recall value: 86.58922914466737 - type: manhattan_accuracy value: 79.48023822414727 - type: manhattan_ap value: 84.72928897427576 - type: manhattan_f1 value: 81.32084770823064 - type: manhattan_precision value: 76.24768946395564 - type: manhattan_recall value: 87.11721224920802 - type: max_accuracy value: 79.64266377910124 - type: max_ap value: 84.7851404063692 - type: max_f1 value: 81.32084770823064 - task: type: Classification dataset: name: MTEB OnlineShopping type: C-MTEB/OnlineShopping-classification config: default split: test revision: e610f2ebd179a8fda30ae534c3878750a96db120 metrics: - type: accuracy value: 94.3 - type: ap value: 92.8664032274438 - type: f1 value: 94.29311102997727 - task: type: STS dataset: name: MTEB PAWSX type: C-MTEB/PAWSX config: default split: test revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1 metrics: - type: cos_sim_pearson value: 48.51392279882909 - type: cos_sim_spearman value: 54.06338895994974 - type: euclidean_pearson value: 52.58480559573412 - type: euclidean_spearman value: 54.06417276612201 - type: manhattan_pearson value: 52.69525121721343 - type: manhattan_spearman value: 54.048147455389675 - task: type: STS dataset: name: MTEB QBQTC type: C-MTEB/QBQTC config: default split: test revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7 metrics: - type: cos_sim_pearson value: 29.728387290757325 - type: cos_sim_spearman value: 31.366121633635284 - type: euclidean_pearson value: 29.14588368552961 - type: euclidean_spearman value: 31.36764411112844 - type: manhattan_pearson value: 29.63517350523121 - type: manhattan_spearman value: 31.94157020583762 - task: type: STS dataset: name: MTEB STS22 (zh) type: mteb/sts22-crosslingual-sts config: zh split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 63.64868296271406 - type: cos_sim_spearman value: 66.12800618164744 - type: euclidean_pearson value: 63.21405767340238 - type: euclidean_spearman value: 66.12786567790748 - type: manhattan_pearson value: 64.04300276525848 - type: manhattan_spearman value: 66.5066857145652 - task: type: STS dataset: name: MTEB STSB type: C-MTEB/STSB config: default split: test revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0 metrics: - type: cos_sim_pearson value: 81.2302623912794 - type: cos_sim_spearman value: 81.16833673266562 - type: euclidean_pearson value: 79.47647843876024 - type: euclidean_spearman value: 81.16944349524972 - type: manhattan_pearson value: 79.84947238492208 - type: manhattan_spearman value: 81.64626599410026 - task: type: Reranking 
dataset: name: MTEB T2Reranking type: C-MTEB/T2Reranking config: default split: dev revision: 76631901a18387f85eaa53e5450019b87ad58ef9 metrics: - type: map value: 67.80129586475687 - type: mrr value: 77.77402311635554 - task: type: Retrieval dataset: name: MTEB T2Retrieval type: C-MTEB/T2Retrieval config: default split: dev revision: 8731a845f1bf500a4f111cf1070785c793d10e64 metrics: - type: map_at_1 value: 28.666999999999998 - type: map_at_10 value: 81.063 - type: map_at_100 value: 84.504 - type: map_at_1000 value: 84.552 - type: map_at_3 value: 56.897 - type: map_at_5 value: 70.073 - type: mrr_at_1 value: 92.087 - type: mrr_at_10 value: 94.132 - type: mrr_at_100 value: 94.19800000000001 - type: mrr_at_1000 value: 94.19999999999999 - type: mrr_at_3 value: 93.78999999999999 - type: mrr_at_5 value: 94.002 - type: ndcg_at_1 value: 92.087 - type: ndcg_at_10 value: 87.734 - type: ndcg_at_100 value: 90.736 - type: ndcg_at_1000 value: 91.184 - type: ndcg_at_3 value: 88.78 - type: ndcg_at_5 value: 87.676 - type: precision_at_1 value: 92.087 - type: precision_at_10 value: 43.46 - type: precision_at_100 value: 5.07 - type: precision_at_1000 value: 0.518 - type: precision_at_3 value: 77.49000000000001 - type: precision_at_5 value: 65.194 - type: recall_at_1 value: 28.666999999999998 - type: recall_at_10 value: 86.632 - type: recall_at_100 value: 96.646 - type: recall_at_1000 value: 98.917 - type: recall_at_3 value: 58.333999999999996 - type: recall_at_5 value: 72.974 - task: type: Classification dataset: name: MTEB TNews type: C-MTEB/TNews-classification config: default split: validation revision: 317f262bf1e6126357bbe89e875451e4b0938fe4 metrics: - type: accuracy value: 52.971999999999994 - type: f1 value: 50.2898280984929 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringP2P type: C-MTEB/ThuNewsClusteringP2P config: default split: test revision: 5798586b105c0434e4f0fe5e767abe619442cf93 metrics: - type: v_measure value: 86.0797948663824 - task: type: Clustering dataset: name: MTEB ThuNewsClusteringS2S type: C-MTEB/ThuNewsClusteringS2S config: default split: test revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d metrics: - type: v_measure value: 85.10759092255017 - task: type: Retrieval dataset: name: MTEB VideoRetrieval type: C-MTEB/VideoRetrieval config: default split: dev revision: 58c2597a5943a2ba48f4668c3b90d796283c5639 metrics: - type: map_at_1 value: 65.60000000000001 - type: map_at_10 value: 74.773 - type: map_at_100 value: 75.128 - type: map_at_1000 value: 75.136 - type: map_at_3 value: 73.05 - type: map_at_5 value: 74.13499999999999 - type: mrr_at_1 value: 65.60000000000001 - type: mrr_at_10 value: 74.773 - type: mrr_at_100 value: 75.128 - type: mrr_at_1000 value: 75.136 - type: mrr_at_3 value: 73.05 - type: mrr_at_5 value: 74.13499999999999 - type: ndcg_at_1 value: 65.60000000000001 - type: ndcg_at_10 value: 78.84299999999999 - type: ndcg_at_100 value: 80.40899999999999 - type: ndcg_at_1000 value: 80.57 - type: ndcg_at_3 value: 75.40599999999999 - type: ndcg_at_5 value: 77.351 - type: precision_at_1 value: 65.60000000000001 - type: precision_at_10 value: 9.139999999999999 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 27.400000000000002 - type: precision_at_5 value: 17.380000000000003 - type: recall_at_1 value: 65.60000000000001 - type: recall_at_10 value: 91.4 - type: recall_at_100 value: 98.4 - type: recall_at_1000 value: 99.6 - type: recall_at_3 value: 82.19999999999999 - type: recall_at_5 value: 86.9 - task: type: 
Classification dataset: name: MTEB Waimai type: C-MTEB/waimai-classification config: default split: test revision: 339287def212450dcaa9df8c22bf93e9980c7023 metrics: - type: accuracy value: 89.47 - type: ap value: 75.59561751845389 - type: f1 value: 87.95207751382563 - task: type: Clustering dataset: name: MTEB AlloProfClusteringP2P type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: v_measure value: 76.05592323841036 - type: v_measure value: 64.51718058866508 - task: type: Reranking dataset: name: MTEB AlloprofReranking type: lyon-nlp/mteb-fr-reranking-alloprof-s2p config: default split: test revision: 666fdacebe0291776e86f29345663dfaf80a0db9 metrics: - type: map value: 73.08278490943373 - type: mrr value: 74.66561454570449 - task: type: Retrieval dataset: name: MTEB AlloprofRetrieval type: lyon-nlp/alloprof config: default split: test revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b metrics: - type: map_at_1 value: 38.912 - type: map_at_10 value: 52.437999999999995 - type: map_at_100 value: 53.38 - type: map_at_1000 value: 53.427 - type: map_at_3 value: 48.879 - type: map_at_5 value: 50.934000000000005 - type: mrr_at_1 value: 44.085 - type: mrr_at_10 value: 55.337 - type: mrr_at_100 value: 56.016999999999996 - type: mrr_at_1000 value: 56.043 - type: mrr_at_3 value: 52.55499999999999 - type: mrr_at_5 value: 54.20399999999999 - type: ndcg_at_1 value: 44.085 - type: ndcg_at_10 value: 58.876 - type: ndcg_at_100 value: 62.714000000000006 - type: ndcg_at_1000 value: 63.721000000000004 - type: ndcg_at_3 value: 52.444 - type: ndcg_at_5 value: 55.692 - type: precision_at_1 value: 44.085 - type: precision_at_10 value: 9.21 - type: precision_at_100 value: 1.164 - type: precision_at_1000 value: 0.128 - type: precision_at_3 value: 23.043 - type: precision_at_5 value: 15.898000000000001 - type: recall_at_1 value: 38.912 - type: recall_at_10 value: 75.577 - type: recall_at_100 value: 92.038 - type: recall_at_1000 value: 99.325 - type: recall_at_3 value: 58.592 - type: recall_at_5 value: 66.235 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 55.532000000000004 - type: f1 value: 52.5783943471605 - task: type: Retrieval dataset: name: MTEB BSARDRetrieval type: maastrichtlawtech/bsard config: default split: test revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59 metrics: - type: map_at_1 value: 8.108 - type: map_at_10 value: 14.710999999999999 - type: map_at_100 value: 15.891 - type: map_at_1000 value: 15.983 - type: map_at_3 value: 12.237 - type: map_at_5 value: 13.679 - type: mrr_at_1 value: 8.108 - type: mrr_at_10 value: 14.710999999999999 - type: mrr_at_100 value: 15.891 - type: mrr_at_1000 value: 15.983 - type: mrr_at_3 value: 12.237 - type: mrr_at_5 value: 13.679 - type: ndcg_at_1 value: 8.108 - type: ndcg_at_10 value: 18.796 - type: ndcg_at_100 value: 25.098 - type: ndcg_at_1000 value: 27.951999999999998 - type: ndcg_at_3 value: 13.712 - type: ndcg_at_5 value: 16.309 - type: precision_at_1 value: 8.108 - type: precision_at_10 value: 3.198 - type: precision_at_100 value: 0.626 - type: precision_at_1000 value: 0.086 - type: precision_at_3 value: 6.006 - type: precision_at_5 value: 4.865 - type: recall_at_1 value: 8.108 - type: recall_at_10 value: 31.982 - type: recall_at_100 value: 62.613 - type: recall_at_1000 value: 86.036 - type: recall_at_3 value: 18.018 - 
type: recall_at_5 value: 24.324 - task: type: Clustering dataset: name: MTEB HALClusteringS2S type: lyon-nlp/clustering-hal-s2s config: default split: test revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915 metrics: - type: v_measure value: 30.833269778867116 - task: type: Clustering dataset: name: MTEB MLSUMClusteringP2P type: mlsum config: default split: test revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7 metrics: - type: v_measure value: 50.0281928004713 - type: v_measure value: 43.699961510636534 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 96.68963357344191 - type: f1 value: 96.45175170820961 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 87.46946445349202 - type: f1 value: 65.79860440988624 - task: type: Classification dataset: name: MTEB MasakhaNEWSClassification (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: accuracy value: 82.60663507109005 - type: f1 value: 77.20462646604777 - task: type: Clustering dataset: name: MTEB MasakhaNEWSClusteringP2P (fra) type: masakhane/masakhanews config: fra split: test revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60 metrics: - type: v_measure value: 60.19311264967803 - type: v_measure value: 63.6235764409785 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 81.65097511768661 - type: f1 value: 78.77796091490924 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (fr) type: mteb/amazon_massive_scenario config: fr split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 86.64425016812373 - type: f1 value: 85.4912728670017 - task: type: Retrieval dataset: name: MTEB MintakaRetrieval (fr) type: jinaai/mintakaqa config: fr split: test revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e metrics: - type: map_at_1 value: 35.913000000000004 - type: map_at_10 value: 48.147 - type: map_at_100 value: 48.91 - type: map_at_1000 value: 48.949 - type: map_at_3 value: 45.269999999999996 - type: map_at_5 value: 47.115 - type: mrr_at_1 value: 35.913000000000004 - type: mrr_at_10 value: 48.147 - type: mrr_at_100 value: 48.91 - type: mrr_at_1000 value: 48.949 - type: mrr_at_3 value: 45.269999999999996 - type: mrr_at_5 value: 47.115 - type: ndcg_at_1 value: 35.913000000000004 - type: ndcg_at_10 value: 54.03 - type: ndcg_at_100 value: 57.839 - type: ndcg_at_1000 value: 58.925000000000004 - type: ndcg_at_3 value: 48.217999999999996 - type: ndcg_at_5 value: 51.56699999999999 - type: precision_at_1 value: 35.913000000000004 - type: precision_at_10 value: 7.244000000000001 - type: precision_at_100 value: 0.9039999999999999 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 18.905 - type: precision_at_5 value: 12.981000000000002 - type: recall_at_1 value: 35.913000000000004 - type: recall_at_10 value: 72.441 - type: recall_at_100 value: 90.41799999999999 - type: recall_at_1000 value: 99.099 - type: recall_at_3 value: 56.716 - type: recall_at_5 value: 64.90599999999999 - task: type: PairClassification dataset: name: MTEB 
OpusparcusPC (fr) type: GEM/opusparcus config: fr split: test revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a metrics: - type: cos_sim_accuracy value: 99.90069513406156 - type: cos_sim_ap value: 100.0 - type: cos_sim_f1 value: 99.95032290114257 - type: cos_sim_precision value: 100.0 - type: cos_sim_recall value: 99.90069513406156 - type: dot_accuracy value: 99.90069513406156 - type: dot_ap value: 100.0 - type: dot_f1 value: 99.95032290114257 - type: dot_precision value: 100.0 - type: dot_recall value: 99.90069513406156 - type: euclidean_accuracy value: 99.90069513406156 - type: euclidean_ap value: 100.0 - type: euclidean_f1 value: 99.95032290114257 - type: euclidean_precision value: 100.0 - type: euclidean_recall value: 99.90069513406156 - type: manhattan_accuracy value: 99.90069513406156 - type: manhattan_ap value: 100.0 - type: manhattan_f1 value: 99.95032290114257 - type: manhattan_precision value: 100.0 - type: manhattan_recall value: 99.90069513406156 - type: max_accuracy value: 99.90069513406156 - type: max_ap value: 100.0 - type: max_f1 value: 99.95032290114257 - task: type: PairClassification dataset: name: MTEB PawsX (fr) type: paws-x config: fr split: test revision: 8a04d940a42cd40658986fdd8e3da561533a3646 metrics: - type: cos_sim_accuracy value: 75.25 - type: cos_sim_ap value: 80.86376001270014 - type: cos_sim_f1 value: 73.65945437441204 - type: cos_sim_precision value: 64.02289452166802 - type: cos_sim_recall value: 86.71096345514951 - type: dot_accuracy value: 75.25 - type: dot_ap value: 80.93686107633002 - type: dot_f1 value: 73.65945437441204 - type: dot_precision value: 64.02289452166802 - type: dot_recall value: 86.71096345514951 - type: euclidean_accuracy value: 75.25 - type: euclidean_ap value: 80.86379136218862 - type: euclidean_f1 value: 73.65945437441204 - type: euclidean_precision value: 64.02289452166802 - type: euclidean_recall value: 86.71096345514951 - type: manhattan_accuracy value: 75.3 - type: manhattan_ap value: 80.87826606097734 - type: manhattan_f1 value: 73.68421052631581 - type: manhattan_precision value: 64.0 - type: manhattan_recall value: 86.82170542635659 - type: max_accuracy value: 75.3 - type: max_ap value: 80.93686107633002 - type: max_f1 value: 73.68421052631581 - task: type: STS dataset: name: MTEB SICKFr type: Lajavaness/SICK-fr config: default split: test revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a metrics: - type: cos_sim_pearson value: 81.42349425981143 - type: cos_sim_spearman value: 78.90454327031226 - type: euclidean_pearson value: 78.39086497435166 - type: euclidean_spearman value: 78.9046133980509 - type: manhattan_pearson value: 78.63743094286502 - type: manhattan_spearman value: 79.12136348449269 - task: type: STS dataset: name: MTEB STS22 (fr) type: mteb/sts22-crosslingual-sts config: fr split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 81.452697919749 - type: cos_sim_spearman value: 82.58116836039301 - type: euclidean_pearson value: 81.04038478932786 - type: euclidean_spearman value: 82.58116836039301 - type: manhattan_pearson value: 81.37075396187771 - type: manhattan_spearman value: 82.73678231355368 - task: type: STS dataset: name: MTEB STSBenchmarkMultilingualSTS (fr) type: stsb_multi_mt config: fr split: test revision: 93d57ef91790589e3ce9c365164337a8a78b7632 metrics: - type: cos_sim_pearson value: 85.7419764013806 - type: cos_sim_spearman value: 85.46085808849622 - type: euclidean_pearson value: 83.70449639870063 - type: euclidean_spearman value: 85.46159013076233 - 
type: manhattan_pearson value: 83.95259510313929 - type: manhattan_spearman value: 85.8029724659458 - task: type: Summarization dataset: name: MTEB SummEvalFr type: lyon-nlp/summarization-summeval-fr-p2p config: default split: test revision: b385812de6a9577b6f4d0f88c6a6e35395a94054 metrics: - type: cos_sim_pearson value: 32.61063271753325 - type: cos_sim_spearman value: 31.454589417353603 - type: dot_pearson value: 32.6106288643431 - type: dot_spearman value: 31.454589417353603 - task: type: Reranking dataset: name: MTEB SyntecReranking type: lyon-nlp/mteb-fr-reranking-syntec-s2p config: default split: test revision: b205c5084a0934ce8af14338bf03feb19499c84d metrics: - type: map value: 84.31666666666666 - type: mrr value: 84.31666666666666 - task: type: Retrieval dataset: name: MTEB SyntecRetrieval type: lyon-nlp/mteb-fr-retrieval-syntec-s2p config: default split: test revision: 77f7e271bf4a92b24fce5119f3486b583ca016ff metrics: - type: map_at_1 value: 63.0 - type: map_at_10 value: 73.471 - type: map_at_100 value: 73.87 - type: map_at_1000 value: 73.87 - type: map_at_3 value: 70.5 - type: map_at_5 value: 73.05 - type: mrr_at_1 value: 63.0 - type: mrr_at_10 value: 73.471 - type: mrr_at_100 value: 73.87 - type: mrr_at_1000 value: 73.87 - type: mrr_at_3 value: 70.5 - type: mrr_at_5 value: 73.05 - type: ndcg_at_1 value: 63.0 - type: ndcg_at_10 value: 78.255 - type: ndcg_at_100 value: 79.88 - type: ndcg_at_1000 value: 79.88 - type: ndcg_at_3 value: 72.702 - type: ndcg_at_5 value: 77.264 - type: precision_at_1 value: 63.0 - type: precision_at_10 value: 9.3 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 26.333000000000002 - type: precision_at_5 value: 18.0 - type: recall_at_1 value: 63.0 - type: recall_at_10 value: 93.0 - type: recall_at_100 value: 100.0 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 79.0 - type: recall_at_5 value: 90.0 - task: type: Retrieval dataset: name: MTEB XPQARetrieval (fr) type: jinaai/xpqa config: fr split: test revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f metrics: - type: map_at_1 value: 40.338 - type: map_at_10 value: 61.927 - type: map_at_100 value: 63.361999999999995 - type: map_at_1000 value: 63.405 - type: map_at_3 value: 55.479 - type: map_at_5 value: 59.732 - type: mrr_at_1 value: 63.551 - type: mrr_at_10 value: 71.006 - type: mrr_at_100 value: 71.501 - type: mrr_at_1000 value: 71.509 - type: mrr_at_3 value: 69.07 - type: mrr_at_5 value: 70.165 - type: ndcg_at_1 value: 63.551 - type: ndcg_at_10 value: 68.297 - type: ndcg_at_100 value: 73.13199999999999 - type: ndcg_at_1000 value: 73.751 - type: ndcg_at_3 value: 62.999 - type: ndcg_at_5 value: 64.89 - type: precision_at_1 value: 63.551 - type: precision_at_10 value: 15.661 - type: precision_at_100 value: 1.9789999999999999 - type: precision_at_1000 value: 0.207 - type: precision_at_3 value: 38.273 - type: precision_at_5 value: 27.61 - type: recall_at_1 value: 40.338 - type: recall_at_10 value: 77.267 - type: recall_at_100 value: 95.892 - type: recall_at_1000 value: 99.75500000000001 - type: recall_at_3 value: 60.36 - type: recall_at_5 value: 68.825 - task: type: Clustering dataset: name: MTEB 8TagsClustering type: PL-MTEB/8tags-clustering config: default split: test revision: None metrics: - type: v_measure value: 51.36126303874126 - task: type: Classification dataset: name: MTEB AllegroReviews type: PL-MTEB/allegro-reviews config: default split: test revision: None metrics: - type: accuracy value: 67.13717693836979 - type: f1 value: 
57.27609848003782 - task: type: Retrieval dataset: name: MTEB ArguAna-PL type: clarin-knext/arguana-pl config: default split: test revision: 63fc86750af76253e8c760fc9e534bbf24d260a2 metrics: - type: map_at_1 value: 35.276999999999994 - type: map_at_10 value: 51.086 - type: map_at_100 value: 51.788000000000004 - type: map_at_1000 value: 51.791 - type: map_at_3 value: 46.147 - type: map_at_5 value: 49.078 - type: mrr_at_1 value: 35.917 - type: mrr_at_10 value: 51.315999999999995 - type: mrr_at_100 value: 52.018 - type: mrr_at_1000 value: 52.022 - type: mrr_at_3 value: 46.349000000000004 - type: mrr_at_5 value: 49.297000000000004 - type: ndcg_at_1 value: 35.276999999999994 - type: ndcg_at_10 value: 59.870999999999995 - type: ndcg_at_100 value: 62.590999999999994 - type: ndcg_at_1000 value: 62.661 - type: ndcg_at_3 value: 49.745 - type: ndcg_at_5 value: 55.067 - type: precision_at_1 value: 35.276999999999994 - type: precision_at_10 value: 8.791 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 20.057 - type: precision_at_5 value: 14.637 - type: recall_at_1 value: 35.276999999999994 - type: recall_at_10 value: 87.909 - type: recall_at_100 value: 99.14699999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 60.171 - type: recall_at_5 value: 73.18599999999999 - task: type: Classification dataset: name: MTEB CBD type: PL-MTEB/cbd config: default split: test revision: None metrics: - type: accuracy value: 78.03000000000002 - type: ap value: 29.12548553897622 - type: f1 value: 66.54857118886073 - task: type: PairClassification dataset: name: MTEB CDSC-E type: PL-MTEB/cdsce-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 89.0 - type: cos_sim_ap value: 76.75437826834582 - type: cos_sim_f1 value: 66.4850136239782 - type: cos_sim_precision value: 68.92655367231639 - type: cos_sim_recall value: 64.21052631578948 - type: dot_accuracy value: 89.0 - type: dot_ap value: 76.75437826834582 - type: dot_f1 value: 66.4850136239782 - type: dot_precision value: 68.92655367231639 - type: dot_recall value: 64.21052631578948 - type: euclidean_accuracy value: 89.0 - type: euclidean_ap value: 76.75437826834582 - type: euclidean_f1 value: 66.4850136239782 - type: euclidean_precision value: 68.92655367231639 - type: euclidean_recall value: 64.21052631578948 - type: manhattan_accuracy value: 89.0 - type: manhattan_ap value: 76.66074220647083 - type: manhattan_f1 value: 66.47058823529412 - type: manhattan_precision value: 75.33333333333333 - type: manhattan_recall value: 59.473684210526315 - type: max_accuracy value: 89.0 - type: max_ap value: 76.75437826834582 - type: max_f1 value: 66.4850136239782 - task: type: STS dataset: name: MTEB CDSC-R type: PL-MTEB/cdscr-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 93.12903172428328 - type: cos_sim_spearman value: 92.66381487060741 - type: euclidean_pearson value: 90.37278396708922 - type: euclidean_spearman value: 92.66381487060741 - type: manhattan_pearson value: 90.32503296540962 - type: manhattan_spearman value: 92.6902938354313 - task: type: Retrieval dataset: name: MTEB DBPedia-PL type: clarin-knext/dbpedia-pl config: default split: test revision: 76afe41d9af165cc40999fcaa92312b8b012064a metrics: - type: map_at_1 value: 8.83 - type: map_at_10 value: 18.326 - type: map_at_100 value: 26.496 - type: map_at_1000 value: 28.455000000000002 - type: map_at_3 value: 12.933 - type: map_at_5 value: 15.168000000000001 
- type: mrr_at_1 value: 66.0 - type: mrr_at_10 value: 72.76700000000001 - type: mrr_at_100 value: 73.203 - type: mrr_at_1000 value: 73.219 - type: mrr_at_3 value: 71.458 - type: mrr_at_5 value: 72.246 - type: ndcg_at_1 value: 55.375 - type: ndcg_at_10 value: 41.3 - type: ndcg_at_100 value: 45.891 - type: ndcg_at_1000 value: 52.905 - type: ndcg_at_3 value: 46.472 - type: ndcg_at_5 value: 43.734 - type: precision_at_1 value: 66.0 - type: precision_at_10 value: 33.074999999999996 - type: precision_at_100 value: 11.094999999999999 - type: precision_at_1000 value: 2.374 - type: precision_at_3 value: 48.583 - type: precision_at_5 value: 42.0 - type: recall_at_1 value: 8.83 - type: recall_at_10 value: 22.587 - type: recall_at_100 value: 50.61600000000001 - type: recall_at_1000 value: 73.559 - type: recall_at_3 value: 13.688 - type: recall_at_5 value: 16.855 - task: type: Retrieval dataset: name: MTEB FiQA-PL type: clarin-knext/fiqa-pl config: default split: test revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e metrics: - type: map_at_1 value: 20.587 - type: map_at_10 value: 33.095 - type: map_at_100 value: 35.24 - type: map_at_1000 value: 35.429 - type: map_at_3 value: 28.626 - type: map_at_5 value: 31.136999999999997 - type: mrr_at_1 value: 40.586 - type: mrr_at_10 value: 49.033 - type: mrr_at_100 value: 49.952999999999996 - type: mrr_at_1000 value: 49.992 - type: mrr_at_3 value: 46.553 - type: mrr_at_5 value: 48.035 - type: ndcg_at_1 value: 40.586 - type: ndcg_at_10 value: 41.046 - type: ndcg_at_100 value: 48.586 - type: ndcg_at_1000 value: 51.634 - type: ndcg_at_3 value: 36.773 - type: ndcg_at_5 value: 38.389 - type: precision_at_1 value: 40.586 - type: precision_at_10 value: 11.466 - type: precision_at_100 value: 1.909 - type: precision_at_1000 value: 0.245 - type: precision_at_3 value: 24.434 - type: precision_at_5 value: 18.426000000000002 - type: recall_at_1 value: 20.587 - type: recall_at_10 value: 47.986000000000004 - type: recall_at_100 value: 75.761 - type: recall_at_1000 value: 94.065 - type: recall_at_3 value: 33.339 - type: recall_at_5 value: 39.765 - task: type: Retrieval dataset: name: MTEB HotpotQA-PL type: clarin-knext/hotpotqa-pl config: default split: test revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907 metrics: - type: map_at_1 value: 40.878 - type: map_at_10 value: 58.775999999999996 - type: map_at_100 value: 59.632 - type: map_at_1000 value: 59.707 - type: map_at_3 value: 56.074 - type: map_at_5 value: 57.629 - type: mrr_at_1 value: 81.756 - type: mrr_at_10 value: 86.117 - type: mrr_at_100 value: 86.299 - type: mrr_at_1000 value: 86.30600000000001 - type: mrr_at_3 value: 85.345 - type: mrr_at_5 value: 85.832 - type: ndcg_at_1 value: 81.756 - type: ndcg_at_10 value: 67.608 - type: ndcg_at_100 value: 70.575 - type: ndcg_at_1000 value: 71.99600000000001 - type: ndcg_at_3 value: 63.723 - type: ndcg_at_5 value: 65.70700000000001 - type: precision_at_1 value: 81.756 - type: precision_at_10 value: 13.619 - type: precision_at_100 value: 1.5939999999999999 - type: precision_at_1000 value: 0.178 - type: precision_at_3 value: 39.604 - type: precision_at_5 value: 25.332 - type: recall_at_1 value: 40.878 - type: recall_at_10 value: 68.096 - type: recall_at_100 value: 79.696 - type: recall_at_1000 value: 89.082 - type: recall_at_3 value: 59.406000000000006 - type: recall_at_5 value: 63.329 - task: type: Retrieval dataset: name: MTEB MSMARCO-PL type: clarin-knext/msmarco-pl config: default split: test revision: 8634c07806d5cce3a6138e260e59b81760a0a640 metrics: - type: map_at_1 value: 
2.1839999999999997 - type: map_at_10 value: 11.346 - type: map_at_100 value: 30.325000000000003 - type: map_at_1000 value: 37.806 - type: map_at_3 value: 4.842 - type: map_at_5 value: 6.891 - type: mrr_at_1 value: 86.047 - type: mrr_at_10 value: 89.14699999999999 - type: mrr_at_100 value: 89.46600000000001 - type: mrr_at_1000 value: 89.46600000000001 - type: mrr_at_3 value: 89.14699999999999 - type: mrr_at_5 value: 89.14699999999999 - type: ndcg_at_1 value: 67.829 - type: ndcg_at_10 value: 62.222 - type: ndcg_at_100 value: 55.337 - type: ndcg_at_1000 value: 64.076 - type: ndcg_at_3 value: 68.12700000000001 - type: ndcg_at_5 value: 64.987 - type: precision_at_1 value: 86.047 - type: precision_at_10 value: 69.535 - type: precision_at_100 value: 32.93 - type: precision_at_1000 value: 6.6049999999999995 - type: precision_at_3 value: 79.845 - type: precision_at_5 value: 75.349 - type: recall_at_1 value: 2.1839999999999997 - type: recall_at_10 value: 12.866 - type: recall_at_100 value: 43.505 - type: recall_at_1000 value: 72.366 - type: recall_at_3 value: 4.947 - type: recall_at_5 value: 7.192 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (pl) type: mteb/amazon_massive_intent config: pl split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 80.75319435104238 - type: f1 value: 77.58961444860606 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (pl) type: mteb/amazon_massive_scenario config: pl split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 85.54472091459313 - type: f1 value: 84.29498563572106 - task: type: Retrieval dataset: name: MTEB NFCorpus-PL type: clarin-knext/nfcorpus-pl config: default split: test revision: 9a6f9567fda928260afed2de480d79c98bf0bec0 metrics: - type: map_at_1 value: 4.367 - type: map_at_10 value: 10.38 - type: map_at_100 value: 13.516 - type: map_at_1000 value: 14.982000000000001 - type: map_at_3 value: 7.367 - type: map_at_5 value: 8.59 - type: mrr_at_1 value: 41.486000000000004 - type: mrr_at_10 value: 48.886 - type: mrr_at_100 value: 49.657000000000004 - type: mrr_at_1000 value: 49.713 - type: mrr_at_3 value: 46.904 - type: mrr_at_5 value: 48.065000000000005 - type: ndcg_at_1 value: 40.402 - type: ndcg_at_10 value: 30.885 - type: ndcg_at_100 value: 28.393 - type: ndcg_at_1000 value: 37.428 - type: ndcg_at_3 value: 35.394999999999996 - type: ndcg_at_5 value: 33.391999999999996 - type: precision_at_1 value: 41.486000000000004 - type: precision_at_10 value: 23.437 - type: precision_at_100 value: 7.638 - type: precision_at_1000 value: 2.0389999999999997 - type: precision_at_3 value: 32.817 - type: precision_at_5 value: 28.915999999999997 - type: recall_at_1 value: 4.367 - type: recall_at_10 value: 14.655000000000001 - type: recall_at_100 value: 29.665999999999997 - type: recall_at_1000 value: 62.073 - type: recall_at_3 value: 8.51 - type: recall_at_5 value: 10.689 - task: type: Retrieval dataset: name: MTEB NQ-PL type: clarin-knext/nq-pl config: default split: test revision: f171245712cf85dd4700b06bef18001578d0ca8d metrics: - type: map_at_1 value: 28.616000000000003 - type: map_at_10 value: 41.626000000000005 - type: map_at_100 value: 42.689 - type: map_at_1000 value: 42.733 - type: map_at_3 value: 37.729 - type: map_at_5 value: 39.879999999999995 - type: mrr_at_1 value: 32.068000000000005 - type: mrr_at_10 value: 44.029 - type: mrr_at_100 value: 44.87 - type: mrr_at_1000 value: 44.901 - type: mrr_at_3 value: 40.687 - type: 
mrr_at_5 value: 42.625 - type: ndcg_at_1 value: 32.068000000000005 - type: ndcg_at_10 value: 48.449999999999996 - type: ndcg_at_100 value: 53.13 - type: ndcg_at_1000 value: 54.186 - type: ndcg_at_3 value: 40.983999999999995 - type: ndcg_at_5 value: 44.628 - type: precision_at_1 value: 32.068000000000005 - type: precision_at_10 value: 7.9750000000000005 - type: precision_at_100 value: 1.061 - type: precision_at_1000 value: 0.116 - type: precision_at_3 value: 18.404999999999998 - type: precision_at_5 value: 13.111 - type: recall_at_1 value: 28.616000000000003 - type: recall_at_10 value: 66.956 - type: recall_at_100 value: 87.657 - type: recall_at_1000 value: 95.548 - type: recall_at_3 value: 47.453 - type: recall_at_5 value: 55.87800000000001 - task: type: Classification dataset: name: MTEB PAC type: laugustyniak/abusive-clauses-pl config: default split: test revision: None metrics: - type: accuracy value: 69.04141326382856 - type: ap value: 77.47589122111044 - type: f1 value: 66.6332277374775 - task: type: PairClassification dataset: name: MTEB PPC type: PL-MTEB/ppc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 86.4 - type: cos_sim_ap value: 94.1044939667201 - type: cos_sim_f1 value: 88.78048780487805 - type: cos_sim_precision value: 87.22044728434504 - type: cos_sim_recall value: 90.39735099337747 - type: dot_accuracy value: 86.4 - type: dot_ap value: 94.1044939667201 - type: dot_f1 value: 88.78048780487805 - type: dot_precision value: 87.22044728434504 - type: dot_recall value: 90.39735099337747 - type: euclidean_accuracy value: 86.4 - type: euclidean_ap value: 94.1044939667201 - type: euclidean_f1 value: 88.78048780487805 - type: euclidean_precision value: 87.22044728434504 - type: euclidean_recall value: 90.39735099337747 - type: manhattan_accuracy value: 86.4 - type: manhattan_ap value: 94.11438365697387 - type: manhattan_f1 value: 88.77968877968877 - type: manhattan_precision value: 87.84440842787681 - type: manhattan_recall value: 89.73509933774835 - type: max_accuracy value: 86.4 - type: max_ap value: 94.11438365697387 - type: max_f1 value: 88.78048780487805 - task: type: PairClassification dataset: name: MTEB PSC type: PL-MTEB/psc-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 97.86641929499072 - type: cos_sim_ap value: 99.36904211868182 - type: cos_sim_f1 value: 96.56203288490283 - type: cos_sim_precision value: 94.72140762463343 - type: cos_sim_recall value: 98.47560975609755 - type: dot_accuracy value: 97.86641929499072 - type: dot_ap value: 99.36904211868183 - type: dot_f1 value: 96.56203288490283 - type: dot_precision value: 94.72140762463343 - type: dot_recall value: 98.47560975609755 - type: euclidean_accuracy value: 97.86641929499072 - type: euclidean_ap value: 99.36904211868183 - type: euclidean_f1 value: 96.56203288490283 - type: euclidean_precision value: 94.72140762463343 - type: euclidean_recall value: 98.47560975609755 - type: manhattan_accuracy value: 98.14471243042672 - type: manhattan_ap value: 99.43359540492416 - type: manhattan_f1 value: 96.98795180722892 - type: manhattan_precision value: 95.83333333333334 - type: manhattan_recall value: 98.17073170731707 - type: max_accuracy value: 98.14471243042672 - type: max_ap value: 99.43359540492416 - type: max_f1 value: 96.98795180722892 - task: type: Classification dataset: name: MTEB PolEmo2.0-IN type: PL-MTEB/polemo2_in config: default split: test revision: None metrics: - type: accuracy value: 
89.39058171745152 - type: f1 value: 86.8552093529568 - task: type: Classification dataset: name: MTEB PolEmo2.0-OUT type: PL-MTEB/polemo2_out config: default split: test revision: None metrics: - type: accuracy value: 74.97975708502024 - type: f1 value: 58.73081628832407 - task: type: Retrieval dataset: name: MTEB Quora-PL type: clarin-knext/quora-pl config: default split: test revision: 0be27e93455051e531182b85e85e425aba12e9d4 metrics: - type: map_at_1 value: 64.917 - type: map_at_10 value: 78.74600000000001 - type: map_at_100 value: 79.501 - type: map_at_1000 value: 79.524 - type: map_at_3 value: 75.549 - type: map_at_5 value: 77.495 - type: mrr_at_1 value: 74.9 - type: mrr_at_10 value: 82.112 - type: mrr_at_100 value: 82.314 - type: mrr_at_1000 value: 82.317 - type: mrr_at_3 value: 80.745 - type: mrr_at_5 value: 81.607 - type: ndcg_at_1 value: 74.83999999999999 - type: ndcg_at_10 value: 83.214 - type: ndcg_at_100 value: 84.997 - type: ndcg_at_1000 value: 85.207 - type: ndcg_at_3 value: 79.547 - type: ndcg_at_5 value: 81.46600000000001 - type: precision_at_1 value: 74.83999999999999 - type: precision_at_10 value: 12.822 - type: precision_at_100 value: 1.506 - type: precision_at_1000 value: 0.156 - type: precision_at_3 value: 34.903 - type: precision_at_5 value: 23.16 - type: recall_at_1 value: 64.917 - type: recall_at_10 value: 92.27199999999999 - type: recall_at_100 value: 98.715 - type: recall_at_1000 value: 99.854 - type: recall_at_3 value: 82.04599999999999 - type: recall_at_5 value: 87.2 - task: type: Retrieval dataset: name: MTEB SCIDOCS-PL type: clarin-knext/scidocs-pl config: default split: test revision: 45452b03f05560207ef19149545f168e596c9337 metrics: - type: map_at_1 value: 3.51 - type: map_at_10 value: 9.046999999999999 - type: map_at_100 value: 10.823 - type: map_at_1000 value: 11.144 - type: map_at_3 value: 6.257 - type: map_at_5 value: 7.648000000000001 - type: mrr_at_1 value: 17.299999999999997 - type: mrr_at_10 value: 27.419 - type: mrr_at_100 value: 28.618 - type: mrr_at_1000 value: 28.685 - type: mrr_at_3 value: 23.817 - type: mrr_at_5 value: 25.927 - type: ndcg_at_1 value: 17.299999999999997 - type: ndcg_at_10 value: 16.084 - type: ndcg_at_100 value: 23.729 - type: ndcg_at_1000 value: 29.476999999999997 - type: ndcg_at_3 value: 14.327000000000002 - type: ndcg_at_5 value: 13.017999999999999 - type: precision_at_1 value: 17.299999999999997 - type: precision_at_10 value: 8.63 - type: precision_at_100 value: 1.981 - type: precision_at_1000 value: 0.336 - type: precision_at_3 value: 13.4 - type: precision_at_5 value: 11.700000000000001 - type: recall_at_1 value: 3.51 - type: recall_at_10 value: 17.518 - type: recall_at_100 value: 40.275 - type: recall_at_1000 value: 68.203 - type: recall_at_3 value: 8.155 - type: recall_at_5 value: 11.875 - task: type: PairClassification dataset: name: MTEB SICK-E-PL type: PL-MTEB/sicke-pl-pairclassification config: default split: test revision: None metrics: - type: cos_sim_accuracy value: 86.30248675091724 - type: cos_sim_ap value: 83.6756734006714 - type: cos_sim_f1 value: 74.97367497367497 - type: cos_sim_precision value: 73.91003460207612 - type: cos_sim_recall value: 76.06837606837607 - type: dot_accuracy value: 86.30248675091724 - type: dot_ap value: 83.6756734006714 - type: dot_f1 value: 74.97367497367497 - type: dot_precision value: 73.91003460207612 - type: dot_recall value: 76.06837606837607 - type: euclidean_accuracy value: 86.30248675091724 - type: euclidean_ap value: 83.67566984333091 - type: euclidean_f1 value: 
74.97367497367497 - type: euclidean_precision value: 73.91003460207612 - type: euclidean_recall value: 76.06837606837607 - type: manhattan_accuracy value: 86.28210354667753 - type: manhattan_ap value: 83.64216119130171 - type: manhattan_f1 value: 74.92152075340078 - type: manhattan_precision value: 73.4107997265892 - type: manhattan_recall value: 76.49572649572649 - type: max_accuracy value: 86.30248675091724 - type: max_ap value: 83.6756734006714 - type: max_f1 value: 74.97367497367497 - task: type: STS dataset: name: MTEB SICK-R-PL type: PL-MTEB/sickr-pl-sts config: default split: test revision: None metrics: - type: cos_sim_pearson value: 82.23295940859121 - type: cos_sim_spearman value: 78.89329160768719 - type: euclidean_pearson value: 79.56019107076818 - type: euclidean_spearman value: 78.89330209904084 - type: manhattan_pearson value: 79.76098513973719 - type: manhattan_spearman value: 79.05490162570123 - task: type: STS dataset: name: MTEB STS22 (pl) type: mteb/sts22-crosslingual-sts config: pl split: test revision: eea2b4fe26a775864c896887d910b76a8098ad3f metrics: - type: cos_sim_pearson value: 37.732606308062486 - type: cos_sim_spearman value: 41.01645667030284 - type: euclidean_pearson value: 26.61722556367085 - type: euclidean_spearman value: 41.01645667030284 - type: manhattan_pearson value: 26.60917378970807 - type: manhattan_spearman value: 41.51335727617614 - task: type: Retrieval dataset: name: MTEB SciFact-PL type: clarin-knext/scifact-pl config: default split: test revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e metrics: - type: map_at_1 value: 54.31700000000001 - type: map_at_10 value: 65.564 - type: map_at_100 value: 66.062 - type: map_at_1000 value: 66.08699999999999 - type: map_at_3 value: 62.592999999999996 - type: map_at_5 value: 63.888 - type: mrr_at_1 value: 56.99999999999999 - type: mrr_at_10 value: 66.412 - type: mrr_at_100 value: 66.85900000000001 - type: mrr_at_1000 value: 66.88 - type: mrr_at_3 value: 64.22200000000001 - type: mrr_at_5 value: 65.206 - type: ndcg_at_1 value: 56.99999999999999 - type: ndcg_at_10 value: 70.577 - type: ndcg_at_100 value: 72.879 - type: ndcg_at_1000 value: 73.45 - type: ndcg_at_3 value: 65.5 - type: ndcg_at_5 value: 67.278 - type: precision_at_1 value: 56.99999999999999 - type: precision_at_10 value: 9.667 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 26.0 - type: precision_at_5 value: 16.933 - type: recall_at_1 value: 54.31700000000001 - type: recall_at_10 value: 85.056 - type: recall_at_100 value: 95.667 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 71.0 - type: recall_at_5 value: 75.672 - task: type: Retrieval dataset: name: MTEB TRECCOVID-PL type: clarin-knext/trec-covid-pl config: default split: test revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd metrics: - type: map_at_1 value: 0.245 - type: map_at_10 value: 2.051 - type: map_at_100 value: 12.009 - type: map_at_1000 value: 27.448 - type: map_at_3 value: 0.721 - type: map_at_5 value: 1.13 - type: mrr_at_1 value: 88.0 - type: mrr_at_10 value: 93.0 - type: mrr_at_100 value: 93.0 - type: mrr_at_1000 value: 93.0 - type: mrr_at_3 value: 93.0 - type: mrr_at_5 value: 93.0 - type: ndcg_at_1 value: 85.0 - type: ndcg_at_10 value: 80.303 - type: ndcg_at_100 value: 61.23499999999999 - type: ndcg_at_1000 value: 52.978 - type: ndcg_at_3 value: 84.419 - type: ndcg_at_5 value: 82.976 - type: precision_at_1 value: 88.0 - type: precision_at_10 value: 83.39999999999999 - type: precision_at_100 
value: 61.96 - type: precision_at_1000 value: 22.648 - type: precision_at_3 value: 89.333 - type: precision_at_5 value: 87.2 - type: recall_at_1 value: 0.245 - type: recall_at_10 value: 2.193 - type: recall_at_100 value: 14.938 - type: recall_at_1000 value: 48.563 - type: recall_at_3 value: 0.738 - type: recall_at_5 value: 1.173
---

<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
  <div style="display: flex; flex-direction: column; align-items: flex-start;">
    <p style="margin-top: 0.5em; margin-bottom: 0em;">
      Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
    </p>
  </div>
</div>

## Alibaba-NLP/gte-Qwen2-7B-instruct - GGUF

This repo contains GGUF format model files for [Alibaba-NLP/gte-Qwen2-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct). The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).

<div style="text-align: left; margin: 20px 0;">
  <a href="https://tensorblock.co/waitlist/client" style="display: inline-block; padding: 10px 20px; background-color: #007bff; color: white; text-decoration: none; border-radius: 5px; font-weight: bold;">
    Run them on the TensorBlock client using your local machine ↗
  </a>
</div>

## Prompt template

```
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```

## Model file specification

| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [gte-Qwen2-7B-instruct-Q2_K.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q2_K.gguf) | Q2_K | 2.807 GB | smallest, significant quality loss - not recommended for most purposes |
| [gte-Qwen2-7B-instruct-Q3_K_S.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q3_K_S.gguf) | Q3_K_S | 3.251 GB | very small, high quality loss |
| [gte-Qwen2-7B-instruct-Q3_K_M.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q3_K_M.gguf) | Q3_K_M | 3.545 GB | very small, high quality loss |
| [gte-Qwen2-7B-instruct-Q3_K_L.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q3_K_L.gguf) | Q3_K_L | 3.806 GB | small, substantial quality loss |
| [gte-Qwen2-7B-instruct-Q4_0.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q4_0.gguf) | Q4_0 | 4.125 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [gte-Qwen2-7B-instruct-Q4_K_S.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q4_K_S.gguf) | Q4_K_S | 4.150 GB | small, greater quality loss |
| [gte-Qwen2-7B-instruct-Q4_K_M.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q4_K_M.gguf) | Q4_K_M | 4.360 GB | medium, balanced quality - recommended |
| [gte-Qwen2-7B-instruct-Q5_0.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q5_0.gguf) | Q5_0 | 4.948 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [gte-Qwen2-7B-instruct-Q5_K_S.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q5_K_S.gguf) | Q5_K_S | 4.948 GB | large, low quality loss - recommended |
| [gte-Qwen2-7B-instruct-Q5_K_M.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q5_K_M.gguf) | Q5_K_M | 5.069 GB | large, very low quality loss - recommended |
| [gte-Qwen2-7B-instruct-Q6_K.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q6_K.gguf) | Q6_K | 5.822 GB | very large, extremely low quality loss |
| [gte-Qwen2-7B-instruct-Q8_0.gguf](https://huggingface.co/tensorblock/gte-Qwen2-7B-instruct-GGUF/blob/main/gte-Qwen2-7B-instruct-Q8_0.gguf) | Q8_0 | 7.539 GB | very large, extremely low quality loss - not recommended |

## Downloading instructions

### Command line

First, install the Hugging Face CLI:

```shell
pip install -U "huggingface_hub[cli]"
```

Then, download an individual model file to a local directory:

```shell
huggingface-cli download tensorblock/gte-Qwen2-7B-instruct-GGUF --include "gte-Qwen2-7B-instruct-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```

If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can run:

```shell
huggingface-cli download tensorblock/gte-Qwen2-7B-instruct-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
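Once a file is downloaded, you can load and query it from Python. The following is a minimal illustrative sketch, not part of the official TensorBlock tooling: it assumes the `llama-cpp-python` bindings and `huggingface_hub` are installed (`pip install llama-cpp-python huggingface_hub`), picks the Q4_K_M file recommended in the table above, and fills the prompt template from this card with a hypothetical system prompt and question.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quantized file from the repo (cached locally by huggingface_hub)
model_path = hf_hub_download(
    repo_id="tensorblock/gte-Qwen2-7B-instruct-GGUF",
    filename="gte-Qwen2-7B-instruct-Q4_K_M.gguf",
)

# Load the GGUF file with the llama.cpp bindings
llm = Llama(model_path=model_path, n_ctx=2048)

# Fill in the ChatML-style prompt template shown above
# (system prompt and question are placeholders for illustration)
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "What is the GGUF file format?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

# Generate, stopping at the end-of-turn marker
output = llm(prompt, max_tokens=128, stop=["<|im_end|>"])
print(output["choices"][0]["text"])
```

Note that the upstream gte-Qwen2-7B-instruct model is primarily an embedding model, so for embedding workloads the bindings' embedding mode (`Llama(model_path=..., embedding=True)` followed by `llm.embed(...)`) is likely the more appropriate entry point.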
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
ymelka/camembert-cosmetic-similarity-v2
ymelka
sentence-similarity
[ "sentence-transformers", "safetensors", "camembert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:5000", "loss:CoSENTLoss", "arxiv:1908.10084", "base_model:ymelka/camembert-cosmetic-finetuned", "base_model:finetune:ymelka/camembert-cosmetic-finetuned", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,718
1,718
10
0
--- base_model: ymelka/camembert-cosmetic-finetuned datasets: [] language: [] library_name: sentence-transformers metrics: - pearson_cosine - spearman_cosine - pearson_manhattan - spearman_manhattan - pearson_euclidean - spearman_euclidean - pearson_dot - spearman_dot - pearson_max - spearman_max pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:5000 - loss:CoSENTLoss widget: - source_sentence: Un soin régulateur de pores hautement efficace, conçu pour réduire visiblement l'apparence des pores dilatés. Sa formule ciblée aide à affiner le grain de peau et à réguler la production de sébum, pour une peau plus lisse et uniforme. Idéal pour les peaux matures en quête de perfection. sentences: - La Crème Confort 1ères Rides de Coup D’Eclat est un soin hydratant apaisant qui procure une hydratation optimale à la peau tout en la régénérant. En 28 jours, la peau devient moins sensible et réactive, tandis que les premiers signes de l'âge sont corrigés et prévenus. Grâce à des ingrédients tels que l'huile de pépins de raisin, l'huile de macadamia et la vitamine E, cette crème redonne à la peau son éclat et sa vitalité, tout en lissant les ridules et affinant le grain de peau. Pour une utilisation externe sur le visage et le cou, cette formule non-comédogène convient à tous les types de peaux, y compris les peaux sensibles et réactives. Il est recommandé d'appliquer la crème matin et soir par un léger massage sur une peau démaquillée. Il est important de suivre les instructions d'utilisation et de ne pas dépasser la posologie recommandée. - La Source Micellaire Enchantée Rose D'Antan de Garancia est un produit de parapharmacie multifonctionnel qui nettoie, démaquille, hydrate, apaise et illumine la peau du visage, des yeux et des lèvres. Grâce à sa formule enrichie en actifs brevetés hydratants et apaisants, cette eau micellaire réduit de manière significative les sensations d'irritation et de picotement. Composée à 99,5% d'ingrédients d'origine naturelle, elle contient notamment de l'extrait de racine de chicorée, un prébiotique nourrissant pour le microbiote cutané. Pour l'utiliser, il suffit de tourner la pompe vers la gauche, d'imbiber un coton d'eau micellaire et de le passer sur le visage, les yeux et les lèvres sans rinçage. Laissez poser 5 secondes sur les yeux avant de démaquiller. Ce produit convient à tous les types de peau et est présenté dans un flacon pompe de 400 ml. - Le tonique hydratant Cosrx Hydrium est un produit de parapharmacie qui rend la peau plus fraîche et hydratée grâce à sa formule contenant de la vitamine B5 et de l'acide hyaluronique. Ce tonique hydratant agit comme une base essentielle pour la santé de la peau, en formant une barrière d'hydratation et en optimisant l'équilibre des peaux abîmées. Il convient à tous les types de peau, y compris les peaux sèches et à tendance acnéique. Les principaux ingrédients actifs incluent des acides hyaluroniques de type 6, du D-panthénol et de l'allantoïne pour une hydratation en profondeur et un effet apaisant sur les peaux sensibles. Pour une utilisation optimale, appliquez le tonique après le nettoyage du visage, en massant pour une meilleure absorption. Il peut également être utilisé comme masque en feuille, brume ou mélange nettoyant. Présenté en flacon de 150 ml, ce tonique hydratant est un allié idéal pour une peau fraîche et hydratée au quotidien. 
- source_sentence: Un soin hydratant et revitalisant qui apporte un éclat naturel à la peau. Enrichi en ingrédients nourrissants et anti-âge, ce soin aide à réduire les signes de fatigue et à améliorer la texture de la peau. Parfait pour revitaliser la peau mature et lui redonner toute sa jeunesse. sentences: - L'Alphanova Solide Exfoliant Visage est un produit naturel et végan qui purifie la peau et revitalise le teint. Composé d'huiles bio d'amande douce et de jojoba, de feuilles de verveine et de poudre de coques de noix, il convient aux peaux normales. Sans huile de palme ni sulfate, ce duo moussant doux et végétal offre une mousse généreuse et onctueuse au parfum frais de verveine. Avec 99,9% d'ingrédients d'origine naturelle, dont 73,4% issus de l'agriculture biologique, cet exfoliant visage Alphanova permet jusqu'à 100 utilisations. Pour l'utiliser, il suffit d'appliquer le nettoyant sur le visage humidifié, de masser délicatement en évitant le contour des yeux, puis de rincer abondamment. Présenté en deux formats de 75g, cet exfoliant offre une expérience de soin agréable et respectueuse de l'environnement. - Le Clarins Doux Nettoyant Moussant Apaisant est spécialement conçu pour les femmes ayant une peau très sèche ou sensible. Grâce à sa formule aux herbes des Alpes, ce nettoyant apaise et adoucit la peau tout en la protégeant des agressions extérieures. Enrichi en extraits de saponaire, de reine des près, d'aloé vera, de camomille bio et de beurre de karité, il nettoie en douceur, purifie, hydrate et apaise la peau. Sa texture mousse fine et onctueuse laisse la peau parfaitement nettoyée, douce et protégée. Ce nettoyant peut être utilisé matin et/ou soir en massant délicatement sur le visage et le cou, en évitant le contour des yeux. Il est recommandé de rincer abondamment après utilisation. Évitez le contour des yeux lors de l'application. Disponible en tube de 125 ml, ce nettoyant est idéal pour un nettoyage en douceur des peaux très sèches ou sensibles. - L'Eau Parfumée Bienfaisante Shiso de Roger&Gallet est un parfum unique aux notes fraîches et raffinées, mêlant le shiso, le petitgrain et la mandarine pour une sensation de fraîcheur naturelle. Enrichi en pivoine et en santal, ce parfum vert fusant apporte une énergie vibrante et permet de s'ouvrir à de nouveaux horizons. Idéal pour le corps, ce produit peut être utilisé en vaporisation pour accentuer son effet énergisant. Les principaux ingrédients actifs incluent l'extrait de feuille de Perilla ocymoides, connu pour ses propriétés revitalisantes. Il est recommandé de vaporiser un nuage de parfum devant soi et de le traverser pour profiter pleinement de ses bienfaits. Il est conseillé de ne pas utiliser ce produit sur une peau irritée ou lésée. Profitez de cette fragrance unique pour vous sentir revitalisé et plein d'énergie au quotidien. - source_sentence: Un nettoyant doux et hydratant, spécialement formulé pour éliminer les impuretés tout en apportant de l'éclat à la peau. Sa formule adaptée aux peaux matures aide à lutter contre les taches et les imperfections, tout en respectant la sensibilité de la peau. sentences: - Le nettoyant visage naturel solide Respire est spécialement conçu pour les peaux sensibles, offrant une formule douce et naturelle enrichie en huile de lin Bio, huile de tournesol Bio et beurre de karité Bio. Ces ingrédients apaisent, hydratent et protègent la peau, la laissant douce et saine. Sa formule sans ingrédients controversés convient parfaitement aux peaux sensibles. 
Facile à utiliser, il suffit de frotter doucement le nettoyant sur le visage humidifié, de masser légèrement la peau et de rincer. Vegan et non-testé sur les animaux, ce nettoyant est testé dermatologiquement et fabriqué en France. Il est idéal pour une peau apaisée et saine, et convient aux peaux sensibles. Il est recommandé de rincer immédiatement en cas de contact avec les yeux. - Le Gamarde Lait Nettoyant Douceur Peaux Délicates Bio est un nettoyant et démaquillant doux spécialement conçu pour les peaux délicates, sèches ou mixtes. Sa formule à base d'ingrédients naturels et biologiques, tels que l'eau de Gamarde les Bains, l'huile d'argan et l'huile de noisette, permet d'éliminer en douceur les impuretés et le maquillage tenace tout en respectant l'équilibre de la peau. Enrichi en huiles essentielles de Palmarosa et d'orange douce, ce lait nettoyant laisse la peau propre, douce et apaisée. Pour l'utiliser, il suffit d'appliquer une petite quantité sur le visage et le cou, puis de retirer avec un coton sec avant de tonifier la peau avec la Lotion Apaisante Douceur. Ce produit convient parfaitement pour un usage quotidien et ne présente aucune contre-indication particulière. - La serviette à cheveux Les Tendances D'Emma en couleur marron est un accessoire pratique et efficace pour sécher les cheveux en douceur. Fabriquée à partir de 90% de viscose de bambou et 10% de polyester, elle absorbe 4 fois mieux qu'une serviette classique. Son attache astucieuse permet de la maintenir en place sur la tête, évitant ainsi de traumatiser les cheveux lors du séchage. Idéale pour tous, y compris ceux qui ont opté pour des colorations naturelles, cette serviette simplifie la vie au quotidien. Facile à utiliser, il suffit de la placer sur la tête, de tourner et de glisser dans l'attache prévue. Lavable en machine, elle est pratique et écologique. Cette serviette à cheveux est conçue, fabriquée et imprimée en France dans une démarche éco-responsable. Un produit incontournable pour prendre soin de ses cheveux en toute simplicité. - source_sentence: Un soin anti-rides et régulateur de sébum, spécialement conçu pour traiter les rides et ridules tout en régulant l'excès de sébum. Sa formule hydratante et apaisante convient parfaitement aux peaux sensibles. sentences: - Le Phyt's Men Soin Anti-Rides est un fluide frais et non gras conçu pour atténuer les premiers signes de l'âge chez les hommes. Certifié Bio et d'origine naturelle, ce soin hydrate, raffermit et illumine la peau masculine. Sa formule contient des huiles végétales de sésame, noisette, chanvre, nigelle et beurre de karité, ainsi que des huiles essentielles de petit grain et géranium, pour leurs propriétés protectrices, apaisantes et tonifiantes. L'extrait de ginseng contribue à revitaliser la peau. Il est recommandé d'appliquer ce produit quotidiennement sur l'ensemble du visage. Ce soin est destiné à lutter contre les premiers signes de l'âge et est à usage externe uniquement. Il convient de noter que ce produit est déconseillé en cas d'allergie à l'un de ses composants. - L'Eau Thermale Spray Brumisateur Apaisant d'Avène est un soin essentiel pour les peaux sensibles, hypersensibles, allergiques et irritées. Grâce à sa composition unique en eau thermale d'Avène, ce spray apaise et sublime toutes les peaux, même les plus sensibles, en leur procurant une sensation d'apaisement, de confort et de bien-être. 
Les propriétés apaisantes et anti-irritantes de l'eau thermale d'Avène ont été démontrées par de nombreux travaux scientifiques, en faisant un véritable principe actif pour le traitement des affections cutanées. Il est recommandé pour les peaux atopiques, sébo-squameuses, couperosiques et sujettes aux photo-allergies. Les principaux ingrédients actifs de ce spray sont l'eau thermale Avène et le gaz (nitrogène), qui contribuent à apaiser la peau et à la protéger. Pour l'utiliser, il suffit de pulvériser une fine brume sur le visage. Ce soin a été testé par 100 utilisateurs qui ont tous apprécié ses bienfaits. Il est important de noter que ce produit est contre-indiqué en cas d'allergie à l'un de ses composants. - Le soin raffermissant corps et buste Copaïba Demain L'Empire 200ml est un produit de parapharmacie de haute qualité, formulé avec des ingrédients naturels et actifs pour offrir à la peau une hydratation, une protection et une fermeté optimales. Grâce à sa composition riche en huile de macadamia, beurre de babassu et autres actifs puissants, ce soin aide à améliorer l'élasticité de la peau, à réduire les rides, à prévenir les vergetures et à protéger contre les agressions extérieures. En utilisant ce produit quotidiennement, la peau retrouve sa jeunesse et sa vitalité, avec une texture douce et un parfum frais et vivifiant. Les principaux ingrédients actifs tels que la chitine, l'extrait végétal tropical et l'acide hyaluronique agissent en synergie pour rajeunir la peau et lui apporter une hydratation optimale. Il est recommandé d'appliquer ce soin sur tout le corps, en massant délicatement jusqu'à absorption complète. Il est conseillé de l'utiliser régulièrement pour des résultats visibles en seulement quelques mois. Il est important de noter que ce produit est destiné à un usage externe uniquement et qu'il est préférable de consulter un professionnel de la santé en cas de réaction allergique. - source_sentence: En complément du nettoyant et du soin, il est recommandé d'utiliser un masque purifiant et matifiant une à deux fois par semaine. Ce masque aidera à resserrer les pores, purifier la peau en profondeur et réguler l'excès de sébum pour un teint plus éclatant et uniforme. sentences: - La Crème Moussante Nettoyante Hydratante CeraVe est un produit développé en collaboration avec des dermatologues pour nettoyer, démaquiller et hydrater en douceur les peaux normales à sèches. Enrichie en céramides essentiels, acide hyaluronique et acides aminés, sa formule élimine efficacement les impuretés, la pollution et le maquillage longue tenue tout en restaurant la barrière cutanée. Grâce à la Technologie MVE, les actifs sont diffusés en continu pour une hydratation prolongée. Cette crème convient pour le visage et les yeux, est hypoallergénique et non-comédogène. Utilisez-la matin et soir sur une peau humide, faites mousser et rincez abondamment. Présentée en flacon pompe de 236 ml, elle laisse la peau douce, hydratée et propre sans laisser de résidus. - Le Fond de Teint Correcteur Fluide Avène en teinte miel est spécialement conçu pour corriger les imperfections cutanées modérées et unifier le teint de manière naturelle. Sa formule résistante à l'eau et à la sueur offre une haute tenue tout en protégeant la peau des rayons UV grâce à son indice de protection 20. Enrichi en pré-tocophéryl, il prévient le vieillissement photo-induit. Ce fond de teint contient un complexe pigmentaire photo-correcteur pour un teint homogène et lumineux. 
Il convient à tous les types de peaux sensibles, claires ou mates, et permet de camoufler efficacement les imperfections modérées. Pour une application optimale, il est recommandé de l'appliquer avec les doigts en unifiant sur l'ensemble du visage et du cou. Ce produit de parapharmacie est testé en centre de recherche dermatologique et utilisé à l'Atelier de Maquillage Médical de la Station thermale d'Avène. - Le Masque Purifiant Aromatique à l'Argile Darphin Skin Mat est un soin visage qui absorbe l'excès de sébum et purifie en profondeur l'épiderme. Grâce à sa formule, ce masque nettoie, clarifie et purifie la peau, la laissant plus fraîche et plus claire. Adapté à tous les types de peaux, il s'applique en fine couche sur le visage et le cou, en évitant le contour des yeux, et se laisse poser pendant 10 à 15 minutes avant de rincer à l'eau tiède. Ce masque contient de l'argile, connue pour ses propriétés absorbantes et purifiantes, ainsi que des ingrédients aromatiques pour une expérience sensorielle agréable. Il est recommandé de l'utiliser une à deux fois par semaine pour des résultats optimaux. Il est conseillé de ne pas l'utiliser sur une peau irritée ou lésée, et de faire un test préalable sur une petite zone de la peau pour éviter toute réaction allergique. Profitez des bienfaits de ce masque pour retrouver une peau nette et éclatante. model-index: - name: SentenceTransformer based on ymelka/camembert-cosmetic-finetuned results: - task: type: semantic-similarity name: Semantic Similarity dataset: name: stsb fr dev type: stsb-fr-dev metrics: - type: pearson_cosine value: 0.88932660143579 name: Pearson Cosine - type: spearman_cosine value: 0.9396146138613104 name: Spearman Cosine - type: pearson_manhattan value: 0.881603142317095 name: Pearson Manhattan - type: spearman_manhattan value: 0.9348905557483792 name: Spearman Manhattan - type: pearson_euclidean value: 0.8813221547202659 name: Pearson Euclidean - type: spearman_euclidean value: 0.936064706459536 name: Spearman Euclidean - type: pearson_dot value: 0.7698727846138187 name: Pearson Dot - type: spearman_dot value: 0.881900008532911 name: Spearman Dot - type: pearson_max value: 0.88932660143579 name: Pearson Max - type: spearman_max value: 0.9396146138613104 name: Spearman Max --- # SentenceTransformer based on ymelka/camembert-cosmetic-finetuned This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [ymelka/camembert-cosmetic-finetuned](https://huggingface.co/ymelka/camembert-cosmetic-finetuned). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
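Beyond pairwise scoring, the same embeddings can drive a small semantic search index over product descriptions, as in the sketch below. It uses the library's `util.semantic_search` helper; the corpus strings and the query are hypothetical placeholders, not a real catalogue.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("ymelka/camembert-cosmetic-similarity-v2")

# Hypothetical mini-corpus of product descriptions (placeholders only)
corpus = [
    "Un masque purifiant à l'argile pour peaux mixtes à grasses.",
    "Un sérum anti-rides enrichi en acide hyaluronique.",
    "Une eau micellaire douce pour peaux sensibles.",
]
query = "Je cherche un soin qui resserre les pores et matifie la peau."

# Encode once, then rank the corpus against the query by cosine similarity
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```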
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [ymelka/camembert-cosmetic-finetuned](https://huggingface.co/ymelka/camembert-cosmetic-finetuned) <!-- at revision cd4cb90f9388340c5f02740130efd30336c08905 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: CamembertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("ymelka/camembert-cosmetic-similarity-v2") # Run inference sentences = [ "En complément du nettoyant et du soin, il est recommandé d'utiliser un masque purifiant et matifiant une à deux fois par semaine. Ce masque aidera à resserrer les pores, purifier la peau en profondeur et réguler l'excès de sébum pour un teint plus éclatant et uniforme.", "Le Masque Purifiant Aromatique à l'Argile Darphin Skin Mat est un soin visage qui absorbe l'excès de sébum et purifie en profondeur l'épiderme. Grâce à sa formule, ce masque nettoie, clarifie et purifie la peau, la laissant plus fraîche et plus claire. Adapté à tous les types de peaux, il s'applique en fine couche sur le visage et le cou, en évitant le contour des yeux, et se laisse poser pendant 10 à 15 minutes avant de rincer à l'eau tiède. Ce masque contient de l'argile, connue pour ses propriétés absorbantes et purifiantes, ainsi que des ingrédients aromatiques pour une expérience sensorielle agréable. Il est recommandé de l'utiliser une à deux fois par semaine pour des résultats optimaux. Il est conseillé de ne pas l'utiliser sur une peau irritée ou lésée, et de faire un test préalable sur une petite zone de la peau pour éviter toute réaction allergique. Profitez des bienfaits de ce masque pour retrouver une peau nette et éclatante.", "Le Fond de Teint Correcteur Fluide Avène en teinte miel est spécialement conçu pour corriger les imperfections cutanées modérées et unifier le teint de manière naturelle. Sa formule résistante à l'eau et à la sueur offre une haute tenue tout en protégeant la peau des rayons UV grâce à son indice de protection 20. Enrichi en pré-tocophéryl, il prévient le vieillissement photo-induit. Ce fond de teint contient un complexe pigmentaire photo-correcteur pour un teint homogène et lumineux. Il convient à tous les types de peaux sensibles, claires ou mates, et permet de camoufler efficacement les imperfections modérées.
Pour une application optimale, il est recommandé de l'appliquer avec les doigts en unifiant sur l'ensemble du visage et du cou. Ce produit de parapharmacie est testé en centre de recherche dermatologique et utilisé à l'Atelier de Maquillage Médical de la Station thermale d'Avène.", ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Semantic Similarity * Dataset: `stsb-fr-dev` * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) | Metric | Value | |:--------------------|:-----------| | pearson_cosine | 0.8893 | | **spearman_cosine** | **0.9396** | | pearson_manhattan | 0.8816 | | spearman_manhattan | 0.9349 | | pearson_euclidean | 0.8813 | | spearman_euclidean | 0.9361 | | pearson_dot | 0.7699 | | spearman_dot | 0.8819 | | pearson_max | 0.8893 | | spearman_max | 0.9396 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 5,000 training samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | |:--------|:-----------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:---------------------------------------------------------------| | type | string | string | float | | details | <ul><li>min: 30 tokens</li><li>mean: 55.51 tokens</li><li>max: 90 tokens</li></ul> | <ul><li>min: 124 tokens</li><li>mean: 199.72 tokens</li><li>max: 503 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.47</li><li>max: 1.0</li></ul> | * Samples: | sentence1 | sentence2 | score | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| | <code>En complément, un sérum anti-imperfections peut être utilisé pour cibler spécifiquement les problèmes de peau tels que les imperfections et les rougeurs. Ce sérum aidera à purifier la peau et à réduire l'apparence des boutons.</code> | <code>Le sérum anti-imperfections Endro à base d'huile végétale de noisette et d'huiles essentielles bio est un concentré d'actifs naturels et antibactériens qui laisse la peau saine et les pores resserrés. Grâce à son action ciblée et hyper concentrée, il lutte efficacement contre les imperfections cutanées, réduisant ainsi les boutons et les rougeurs. Ce sérum convient aux adultes et aux adolescents à partir de 10 ans, et 73,35% des utilisateurs ont constaté une amélioration de leur peau en une semaine seulement. Les principaux ingrédients actifs tels que l'huile de noisette, l'huile essentielle de palmarosa et l'huile essentielle d'arbre à thé agissent en synergie pour purifier la peau et réguler l'excès de sébum. 
Pour une utilisation optimale, il est recommandé d'appliquer une petite goutte du sérum sur les zones à traiter le soir avant de dormir, en évitant le contour des yeux. Cependant, ce produit n'est pas adapté aux femmes enceintes ou allaitantes. En cas de contact avec les yeux, il est important de rincer abondamment et d'éviter toute exposition au soleil après application. Avec Endro Sérum Anti-Imperfections, retrouvez une peau nette et éclatante en toute simplicité.</code> | <code>0.9809522032737732</code> | | <code>Un soin régulateur et matifiant, idéal pour traiter les imperfections et les pores dilatés. Sa formule spécifique permettra de réduire l'apparence des imperfections tout en resserrant les pores pour une peau plus lisse et uniforme.</code> | <code>Le La Roche-Posay Effaclar MAT Soin Hydratant Sébo-Régulateur Visage Peaux Grasses est un soin spécialement conçu pour les peaux grasses sensibles sujettes à la brillance. Sa formule anti-brillance et anti-pores dilatés, grâce à l'association de Sebulyse, de microsphères absorbantes et de perlite, régule la production de sébum et matifie la peau immédiatement. Ce soin hydratant offre un effet matifiant et hydratant longue durée, tout en étant une excellente base de maquillage. Il convient aux adultes et aux adolescents, et est idéal pour les peaux à imperfections, à tendance acnéique et sujettes à la brillance. Pour une utilisation optimale, il est recommandé d'appliquer le produit matin et/ou soir sur l'ensemble du visage. Il est important de noter que ce produit est testé sous contrôle dermatologique, non comédogène et hypoallergénique.</code> | <code>0.9946829676628112</code> | | <code>Un complément de traitement anti-taches, conçu pour cibler spécifiquement les taches pigmentaires. Ce complément concentré en actifs éclaircissants aidera à atténuer les taches existantes et à prévenir l'apparition de nouvelles taches. Il est recommandé de l'utiliser en complément des autres soins pour une action ciblée et efficace.</code> | <code>Le Lierac Lumilogie Anti-Taches est un traitement ciblé pour les 3 types de taches cutanées : naissantes, visibles et incrustées. Grâce à sa formule innovante inspirée des techniques esthétiques combinées, ce produit agit sur les taches à tous les stades de leur développement. Enrichi en Hexyl R., Lys de mer et Extrait de plantain, il freine la production de mélanine, diminue les taches visibles et lutte contre l'incrustation de la mélanine en profondeur. De plus, les concentrés de vitamines E et B3 ainsi que les 7 hydroxy acides activent le renouvellement cellulaire pour éliminer la mélanine en surface. En résulte un teint unifié et plus uniforme dès la première utilisation, avec une efficacité prouvée dès 7 jours et une correction visible des taches dès 28 jours. Pour une utilisation optimale, appliquez 2 pressions du concentré jour le matin et du concentré nuit le soir sur l'ensemble du visage, en évitant le contour des yeux. 
Veillez à éviter le contour des yeux et à utiliser une protection solaire avec IP en cas d'exposition au soleil.</code> | <code>0.9939286708831788</code> | * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` ### Evaluation Dataset #### Unnamed Dataset * Size: 1,000 evaluation samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | |:--------|:-----------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:---------------------------------------------------------------| | type | string | string | float | | details | <ul><li>min: 30 tokens</li><li>mean: 54.83 tokens</li><li>max: 88 tokens</li></ul> | <ul><li>min: 120 tokens</li><li>mean: 197.93 tokens</li><li>max: 491 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.46</li><li>max: 1.0</li></ul> | * Samples: | sentence1 | sentence2 | score | |:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| | <code>Un complément hydratant et correcteur, idéal pour les peaux sensibles et sujettes aux taches. Ce complément aidera à hydrater en profondeur, à atténuer les rides et à réduire l'apparence des pores pour une peau plus lisse et uniforme.</code> | <code>L'Huile Végétale de Karité Bio de Puressentiel est un produit nutritif, réparateur et apaisant, idéal pour nourrir et réparer en profondeur la peau et les cheveux. Cette huile 100% pure et naturelle est recommandée pour une utilisation externe sur la peau et les cheveux. Enrichie en beurre de karité issu de l'agriculture biologique, elle offre des propriétés nourrissantes et réparatrices. Pour une utilisation sur le visage et le corps, il est conseillé de chauffer une noisette de beurre de karité dans la paume de la main et de l'appliquer sur les zones sèches et craquelées. 
Pour les cheveux secs et abîmés, il suffit de chauffer une petite noisette de beurre de karité entre les mains et de l'appliquer sur les pointes et les longueurs. Il est important d'éviter le contact avec les yeux et les muqueuses, et de se laver les mains après application. Il est recommandé de conserver le produit à l'abri de la lumière, de l'air et de la chaleur. Disponible en pot de 100 ml, cette huile de karité bio est un allié naturel pour prendre soin de sa peau et de ses cheveux.</code> | <code>0.0544042661786079</code> | | <code>Un soin anti-âge global, conçu pour traiter les rides, les taches pigmentaires et les imperfections. Sa formule régulatrice et éclatante aidera à lisser la peau, à atténuer les taches et à réduire les imperfections pour un teint plus uniforme et lumineux.</code> | <code>Le sérum contour des yeux anti-rides Maison Éole Elle Et Lui Émerveillé est un produit de parapharmacie haut de gamme qui agit efficacement contre les rides, les ridules, les cernes et les poches. Sa formule complète enrichie en Bakuchiol, alternative naturelle au Rétinol A, nourrit la peau en profondeur et réduit les tâches cutanées. Grâce à ses actifs tels que l'huile de pépin de raisin, le Bisabolol et le Resvératrol, ce sérum hydrate intensément, lisse la peau et prévient le vieillissement cutané. Son utilisation matin et soir sur une peau propre permet d'obtenir un regard éclatant et reposé. Le flacon-pipette de 15ml facilite son application. Ce produit convient à tous les types de peau et ne contient ni parabène, ni silicone, ni ingrédients d'origine animale. Il est recommandé de suivre les instructions d'utilisation pour des résultats optimaux.</code> | <code>0.0781720206141471</code> | | <code>Un soin anti-rides et éclat, enrichi en actifs régénérants et illuminants. Ce soin aidera à lisser les rides, à uniformiser le teint et à redonner de l'éclat à la peau fatiguée.</code> | <code>L'Eau Micellaire Sebiaclear de SVR est un produit de parapharmacie qui purifie, nettoie et démaquille la peau en un seul geste. Adaptée aux peaux sensibles mixtes à grasses, cette eau micellaire aide à éliminer les impuretés, les boutons, les points noirs et l'excès de sébum sans dessécher la peau. Grâce à sa formule innovante contenant de la gluconolactone et de la niacinamide, elle offre une haute efficacité tout en respectant la peau. Les micelles présentes dans le produit nettoient et démaquillent en douceur, laissant la peau nette et fraîche. Pour l'utiliser, il suffit d'appliquer l'eau micellaire matin et/ou soir à l'aide d'un coton sur le visage et les yeux, sans rinçage. Avec une présentation en flacon de 400 ml, ce produit convient aux peaux sensibles à tendance acnéique et offre des résultats visibles dès 7 jours d'utilisation. 
Il est recommandé de ne pas l'utiliser en cas d'allergie à l'un des ingrédients et de consulter un professionnel de santé en cas de doute.</code> | <code>0.0607918016612529</code> | * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `learning_rate`: 2e-05 - `weight_decay`: 0.01 - `num_train_epochs`: 4 - `warmup_ratio`: 0.1 - `bf16`: True - `load_best_model_at_end`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.01 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 4 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: False - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - 
`full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | loss | stsb-fr-dev_spearman_cosine | |:----------:|:-------:|:-------------:|:----------:|:---------------------------:| | 0 | 0 | - | - | 0.4986 | | 0.3195 | 100 | 4.6554 | 4.3185 | 0.8719 | | 0.6390 | 200 | 4.2773 | 4.1772 | 0.8984 | | 0.9585 | 300 | 4.1015 | 4.0808 | 0.9128 | | 1.2748 | 400 | 4.0285 | 4.0244 | 0.9215 | | 1.5942 | 500 | 3.9269 | 4.0512 | 0.9317 | | 1.9137 | 600 | 3.8057 | 3.9970 | 0.9348 | | 2.2300 | 700 | 3.7665 | 4.0250 | 0.9350 | | **2.5495** | **800** | **3.7541** | **3.9587** | **0.9396** | | 2.8690 | 900 | 3.6029 | 4.0481 | 0.9407 | | 3.1853 | 1000 | 3.6183 | 3.9964 | 0.9416 | | 3.5048 | 1100 | 3.5848 | 3.9711 | 0.9454 | | 3.8243 | 1200 | 3.5029 | 3.9985 | 0.9452 | | 3.9904 | 1252 | - | - | 0.9396 | * The bold row denotes the saved checkpoint. ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.0.1 - Transformers: 4.41.2 - PyTorch: 2.3.0+cu121 - Accelerate: 0.31.0 - Datasets: 2.20.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### CoSENTLoss ```bibtex @online{kexuefm-8847, title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT}, author={Su Jianlin}, year={2022}, month={Jan}, url={https://kexue.fm/archives/8847}, } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
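To continue finetuning on new sentence pairs, the training recipe reported above can be reproduced with the Sentence Transformers v3 trainer API. A minimal sketch, assuming a toy dataset with the same `sentence1`/`sentence2`/`score` columns (the two pairs below are hypothetical placeholders) and reusing the reported CoSENTLoss scale and key hyperparameters:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("ymelka/camembert-cosmetic-similarity-v2")

# Hypothetical toy pairs with graded similarity labels in [0, 1]
train_dataset = Dataset.from_dict({
    "sentence1": [
        "Un soin matifiant pour peaux grasses.",
        "Un lait démaquillant très doux pour peaux sèches.",
    ],
    "sentence2": [
        "Une crème qui régule le sébum et resserre les pores.",
        "Un shampoing antipelliculaire pour cuir chevelu irrité.",
    ],
    "score": [0.9, 0.1],
})

# CoSENTLoss with the scale reported in Training Details
# (pairwise cosine similarity is the default similarity_fct)
loss = CoSENTLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="camembert-cosmetic-similarity-ft",  # hypothetical output path
    num_train_epochs=4,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    weight_decay=0.01,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # mirrors `batch_sampler: no_duplicates` above
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```

With a real dataset of a few thousand scored pairs (as in the 5,000-sample training set described above), the same configuration applies unchanged; only `train_dataset` needs to be swapped out.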
[ "TEXT_CLASSIFICATION", "SEMANTIC_SIMILARITY" ]
[ "CAS" ]
Non_BioNLP
CyberOptic/multilingual-e5-large-Q8_0-GGUF
CyberOptic
feature-extraction
[ "sentence-transformers", "gguf", "mteb", "Sentence Transformers", "sentence-similarity", "feature-extraction", "llama-cpp", "gguf-my-repo", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "base_model:intfloat/multilingual-e5-large", "base_model:quantized:intfloat/multilingual-e5-large", "license:mit", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,741
1,741
8
0
--- base_model: intfloat/multilingual-e5-large language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - 'no' - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh license: mit tags: - mteb - Sentence Transformers - sentence-similarity - feature-extraction - sentence-transformers - llama-cpp - gguf-my-repo model-index: - name: multilingual-e5-large results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 79.05970149253731 - type: ap value: 43.486574390835635 - type: f1 value: 73.32700092140148 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (de) type: mteb/amazon_counterfactual config: de split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 71.22055674518201 - type: ap value: 81.55756710830498 - type: f1 value: 69.28271787752661 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en-ext) type: mteb/amazon_counterfactual config: en-ext split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 80.41979010494754 - type: ap value: 29.34879922376344 - type: f1 value: 67.62475449011278 - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (ja) type: mteb/amazon_counterfactual config: ja split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 77.8372591006424 - type: ap value: 26.557560591210738 - type: f1 value: 64.96619417368707 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 93.489875 - type: ap value: 90.98758636917603 - type: f1 value: 93.48554819717332 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.564 - type: f1 value: 46.75122173518047 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (de) type: mteb/amazon_reviews_multi config: de split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 45.400000000000006 - type: f1 value: 44.17195682400632 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (es) type: mteb/amazon_reviews_multi config: es split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 43.068 - type: f1 value: 42.38155696855596 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (fr) type: mteb/amazon_reviews_multi config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 41.89 - type: f1 value: 40.84407321682663 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (ja) type: mteb/amazon_reviews_multi config: ja split: test revision: 
1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 40.120000000000005 - type: f1 value: 39.522976223819114 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (zh) type: mteb/amazon_reviews_multi config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 38.832 - type: f1 value: 38.0392533394713 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 30.725 - type: map_at_10 value: 46.055 - type: map_at_100 value: 46.900999999999996 - type: map_at_1000 value: 46.911 - type: map_at_3 value: 41.548 - type: map_at_5 value: 44.297 - type: mrr_at_1 value: 31.152 - type: mrr_at_10 value: 46.231 - type: mrr_at_100 value: 47.07 - type: mrr_at_1000 value: 47.08 - type: mrr_at_3 value: 41.738 - type: mrr_at_5 value: 44.468999999999994 - type: ndcg_at_1 value: 30.725 - type: ndcg_at_10 value: 54.379999999999995 - type: ndcg_at_100 value: 58.138 - type: ndcg_at_1000 value: 58.389 - type: ndcg_at_3 value: 45.156 - type: ndcg_at_5 value: 50.123 - type: precision_at_1 value: 30.725 - type: precision_at_10 value: 8.087 - type: precision_at_100 value: 0.9769999999999999 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 18.54 - type: precision_at_5 value: 13.542000000000002 - type: recall_at_1 value: 30.725 - type: recall_at_10 value: 80.868 - type: recall_at_100 value: 97.653 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 55.619 - type: recall_at_5 value: 67.71000000000001 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 44.30960650674069 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 38.427074197498996 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 60.28270056031872 - type: mrr value: 74.38332673789738 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 84.05942144105269 - type: cos_sim_spearman value: 82.51212105850809 - type: euclidean_pearson value: 81.95639829909122 - type: euclidean_spearman value: 82.3717564144213 - type: manhattan_pearson value: 81.79273425468256 - type: manhattan_spearman value: 82.20066817871039 - task: type: BitextMining dataset: name: MTEB BUCC (de-en) type: mteb/bucc-bitext-mining config: de-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 99.46764091858039 - type: f1 value: 99.37717466945023 - type: precision value: 99.33194154488518 - type: recall value: 99.46764091858039 - task: type: BitextMining dataset: name: MTEB BUCC (fr-en) type: mteb/bucc-bitext-mining config: fr-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 98.29407880255337 - type: f1 value: 98.11248073959938 - type: precision value: 98.02443319392472 - type: recall value: 98.29407880255337 - task: type: BitextMining dataset: name: MTEB BUCC (ru-en) type: 
mteb/bucc-bitext-mining config: ru-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 97.79009352268791 - type: f1 value: 97.5176076665512 - type: precision value: 97.38136473848286 - type: recall value: 97.79009352268791 - task: type: BitextMining dataset: name: MTEB BUCC (zh-en) type: mteb/bucc-bitext-mining config: zh-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 99.26276987888363 - type: f1 value: 99.20133403545726 - type: precision value: 99.17500438827453 - type: recall value: 99.26276987888363 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.72727272727273 - type: f1 value: 84.67672206031433 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 35.34220182511161 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 33.4987096128766 - task: type: Retrieval dataset: name: MTEB CQADupstackRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 25.558249999999997 - type: map_at_10 value: 34.44425000000001 - type: map_at_100 value: 35.59833333333333 - type: map_at_1000 value: 35.706916666666665 - type: map_at_3 value: 31.691749999999995 - type: map_at_5 value: 33.252916666666664 - type: mrr_at_1 value: 30.252666666666666 - type: mrr_at_10 value: 38.60675 - type: mrr_at_100 value: 39.42666666666666 - type: mrr_at_1000 value: 39.48408333333334 - type: mrr_at_3 value: 36.17441666666665 - type: mrr_at_5 value: 37.56275 - type: ndcg_at_1 value: 30.252666666666666 - type: ndcg_at_10 value: 39.683 - type: ndcg_at_100 value: 44.68541666666667 - type: ndcg_at_1000 value: 46.94316666666668 - type: ndcg_at_3 value: 34.961749999999995 - type: ndcg_at_5 value: 37.215666666666664 - type: precision_at_1 value: 30.252666666666666 - type: precision_at_10 value: 6.904166666666667 - type: precision_at_100 value: 1.0989999999999995 - type: precision_at_1000 value: 0.14733333333333334 - type: precision_at_3 value: 16.037666666666667 - type: precision_at_5 value: 11.413583333333333 - type: recall_at_1 value: 25.558249999999997 - type: recall_at_10 value: 51.13341666666666 - type: recall_at_100 value: 73.08366666666667 - type: recall_at_1000 value: 88.79483333333334 - type: recall_at_3 value: 37.989083333333326 - type: recall_at_5 value: 43.787833333333325 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 10.338 - type: map_at_10 value: 18.360000000000003 - type: map_at_100 value: 19.942 - type: map_at_1000 value: 20.134 - type: map_at_3 value: 15.174000000000001 - type: map_at_5 value: 16.830000000000002 - type: mrr_at_1 value: 23.257 - type: mrr_at_10 value: 33.768 - type: mrr_at_100 value: 34.707 - type: mrr_at_1000 value: 34.766000000000005 - type: mrr_at_3 value: 30.977 - type: mrr_at_5 value: 32.528 - type: ndcg_at_1 value: 23.257 - type: ndcg_at_10 value: 25.733 - type: ndcg_at_100 value: 32.288 - type: ndcg_at_1000 value: 35.992000000000004 - type: ndcg_at_3 value: 20.866 - type: 
ndcg_at_5 value: 22.612 - type: precision_at_1 value: 23.257 - type: precision_at_10 value: 8.124 - type: precision_at_100 value: 1.518 - type: precision_at_1000 value: 0.219 - type: precision_at_3 value: 15.679000000000002 - type: precision_at_5 value: 12.117 - type: recall_at_1 value: 10.338 - type: recall_at_10 value: 31.154 - type: recall_at_100 value: 54.161 - type: recall_at_1000 value: 75.21900000000001 - type: recall_at_3 value: 19.427 - type: recall_at_5 value: 24.214 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 8.498 - type: map_at_10 value: 19.103 - type: map_at_100 value: 27.375 - type: map_at_1000 value: 28.981 - type: map_at_3 value: 13.764999999999999 - type: map_at_5 value: 15.950000000000001 - type: mrr_at_1 value: 65.5 - type: mrr_at_10 value: 74.53800000000001 - type: mrr_at_100 value: 74.71799999999999 - type: mrr_at_1000 value: 74.725 - type: mrr_at_3 value: 72.792 - type: mrr_at_5 value: 73.554 - type: ndcg_at_1 value: 53.37499999999999 - type: ndcg_at_10 value: 41.286 - type: ndcg_at_100 value: 45.972 - type: ndcg_at_1000 value: 53.123 - type: ndcg_at_3 value: 46.172999999999995 - type: ndcg_at_5 value: 43.033 - type: precision_at_1 value: 65.5 - type: precision_at_10 value: 32.725 - type: precision_at_100 value: 10.683 - type: precision_at_1000 value: 1.978 - type: precision_at_3 value: 50 - type: precision_at_5 value: 41.349999999999994 - type: recall_at_1 value: 8.498 - type: recall_at_10 value: 25.070999999999998 - type: recall_at_100 value: 52.383 - type: recall_at_1000 value: 74.91499999999999 - type: recall_at_3 value: 15.207999999999998 - type: recall_at_5 value: 18.563 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 46.5 - type: f1 value: 41.93833713984145 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 67.914 - type: map_at_10 value: 78.10000000000001 - type: map_at_100 value: 78.333 - type: map_at_1000 value: 78.346 - type: map_at_3 value: 76.626 - type: map_at_5 value: 77.627 - type: mrr_at_1 value: 72.74199999999999 - type: mrr_at_10 value: 82.414 - type: mrr_at_100 value: 82.511 - type: mrr_at_1000 value: 82.513 - type: mrr_at_3 value: 81.231 - type: mrr_at_5 value: 82.065 - type: ndcg_at_1 value: 72.74199999999999 - type: ndcg_at_10 value: 82.806 - type: ndcg_at_100 value: 83.677 - type: ndcg_at_1000 value: 83.917 - type: ndcg_at_3 value: 80.305 - type: ndcg_at_5 value: 81.843 - type: precision_at_1 value: 72.74199999999999 - type: precision_at_10 value: 10.24 - type: precision_at_100 value: 1.089 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 31.268 - type: precision_at_5 value: 19.706000000000003 - type: recall_at_1 value: 67.914 - type: recall_at_10 value: 92.889 - type: recall_at_100 value: 96.42699999999999 - type: recall_at_1000 value: 97.92 - type: recall_at_3 value: 86.21 - type: recall_at_5 value: 90.036 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 22.166 - type: map_at_10 value: 35.57 - type: map_at_100 value: 37.405 - type: map_at_1000 value: 37.564 - type: map_at_3 value: 30.379 - type: map_at_5 value: 33.324 - type: mrr_at_1 value: 43.519000000000005 - type: mrr_at_10 
value: 51.556000000000004 - type: mrr_at_100 value: 52.344 - type: mrr_at_1000 value: 52.373999999999995 - type: mrr_at_3 value: 48.868 - type: mrr_at_5 value: 50.319 - type: ndcg_at_1 value: 43.519000000000005 - type: ndcg_at_10 value: 43.803 - type: ndcg_at_100 value: 50.468999999999994 - type: ndcg_at_1000 value: 53.111 - type: ndcg_at_3 value: 38.893 - type: ndcg_at_5 value: 40.653 - type: precision_at_1 value: 43.519000000000005 - type: precision_at_10 value: 12.253 - type: precision_at_100 value: 1.931 - type: precision_at_1000 value: 0.242 - type: precision_at_3 value: 25.617 - type: precision_at_5 value: 19.383 - type: recall_at_1 value: 22.166 - type: recall_at_10 value: 51.6 - type: recall_at_100 value: 76.574 - type: recall_at_1000 value: 92.192 - type: recall_at_3 value: 34.477999999999994 - type: recall_at_5 value: 41.835 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 39.041 - type: map_at_10 value: 62.961999999999996 - type: map_at_100 value: 63.79899999999999 - type: map_at_1000 value: 63.854 - type: map_at_3 value: 59.399 - type: map_at_5 value: 61.669 - type: mrr_at_1 value: 78.082 - type: mrr_at_10 value: 84.321 - type: mrr_at_100 value: 84.49600000000001 - type: mrr_at_1000 value: 84.502 - type: mrr_at_3 value: 83.421 - type: mrr_at_5 value: 83.977 - type: ndcg_at_1 value: 78.082 - type: ndcg_at_10 value: 71.229 - type: ndcg_at_100 value: 74.10900000000001 - type: ndcg_at_1000 value: 75.169 - type: ndcg_at_3 value: 66.28699999999999 - type: ndcg_at_5 value: 69.084 - type: precision_at_1 value: 78.082 - type: precision_at_10 value: 14.993 - type: precision_at_100 value: 1.7239999999999998 - type: precision_at_1000 value: 0.186 - type: precision_at_3 value: 42.737 - type: precision_at_5 value: 27.843 - type: recall_at_1 value: 39.041 - type: recall_at_10 value: 74.96300000000001 - type: recall_at_100 value: 86.199 - type: recall_at_1000 value: 93.228 - type: recall_at_3 value: 64.105 - type: recall_at_5 value: 69.608 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 90.23160000000001 - type: ap value: 85.5674856808308 - type: f1 value: 90.18033354786317 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 24.091 - type: map_at_10 value: 36.753 - type: map_at_100 value: 37.913000000000004 - type: map_at_1000 value: 37.958999999999996 - type: map_at_3 value: 32.818999999999996 - type: map_at_5 value: 35.171 - type: mrr_at_1 value: 24.742 - type: mrr_at_10 value: 37.285000000000004 - type: mrr_at_100 value: 38.391999999999996 - type: mrr_at_1000 value: 38.431 - type: mrr_at_3 value: 33.440999999999995 - type: mrr_at_5 value: 35.75 - type: ndcg_at_1 value: 24.742 - type: ndcg_at_10 value: 43.698 - type: ndcg_at_100 value: 49.145 - type: ndcg_at_1000 value: 50.23800000000001 - type: ndcg_at_3 value: 35.769 - type: ndcg_at_5 value: 39.961999999999996 - type: precision_at_1 value: 24.742 - type: precision_at_10 value: 6.7989999999999995 - type: precision_at_100 value: 0.95 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 15.096000000000002 - type: precision_at_5 value: 11.183 - type: recall_at_1 value: 24.091 - type: recall_at_10 value: 65.068 - type: recall_at_100 value: 89.899 - type: recall_at_1000 value: 98.16 - type: recall_at_3 
value: 43.68 - type: recall_at_5 value: 53.754999999999995 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.66621067031465 - type: f1 value: 93.49622853272142 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (de) type: mteb/mtop_domain config: de split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 91.94702733164272 - type: f1 value: 91.17043441745282 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (es) type: mteb/mtop_domain config: es split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 92.20146764509674 - type: f1 value: 91.98359080555608 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (fr) type: mteb/mtop_domain config: fr split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 88.99780770435328 - type: f1 value: 89.19746342724068 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (hi) type: mteb/mtop_domain config: hi split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 89.78486912871998 - type: f1 value: 89.24578823628642 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (th) type: mteb/mtop_domain config: th split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 88.74502712477394 - type: f1 value: 89.00297573881542 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 77.9046967624259 - type: f1 value: 59.36787125785957 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (de) type: mteb/mtop_intent config: de split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 74.5280360664976 - type: f1 value: 57.17723440888718 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (es) type: mteb/mtop_intent config: es split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 75.44029352901934 - type: f1 value: 54.052855531072964 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (fr) type: mteb/mtop_intent config: fr split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 70.5606013153774 - type: f1 value: 52.62215934386531 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (hi) type: mteb/mtop_intent config: hi split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 73.11581211903908 - type: f1 value: 52.341291845645465 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (th) type: mteb/mtop_intent config: th split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 74.28933092224233 - type: f1 value: 57.07918745504911 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (af) type: mteb/amazon_massive_intent config: af split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.38063214525892 - type: f1 value: 59.46463723443009 - task: type: Classification dataset: name: MTEB 
MassiveIntentClassification (am) type: mteb/amazon_massive_intent config: am split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 56.06926698049766 - type: f1 value: 52.49084283283562 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ar) type: mteb/amazon_massive_intent config: ar split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 60.74983187626093 - type: f1 value: 56.960640620165904 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (az) type: mteb/amazon_massive_intent config: az split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 64.86550100874243 - type: f1 value: 62.47370548140688 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (bn) type: mteb/amazon_massive_intent config: bn split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 63.971082716879636 - type: f1 value: 61.03812421957381 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (cy) type: mteb/amazon_massive_intent config: cy split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 54.98318762609282 - type: f1 value: 51.51207916008392 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (da) type: mteb/amazon_massive_intent config: da split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.45527908540686 - type: f1 value: 66.16631905400318 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (de) type: mteb/amazon_massive_intent config: de split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.32750504371216 - type: f1 value: 66.16755288646591 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (el) type: mteb/amazon_massive_intent config: el split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.09213180901143 - type: f1 value: 66.95654394661507 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 73.75588433086752 - type: f1 value: 71.79973779656923 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (es) type: mteb/amazon_massive_intent config: es split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.49428379287154 - type: f1 value: 68.37494379215734 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fa) type: mteb/amazon_massive_intent config: fa split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.90921318090115 - type: f1 value: 66.79517376481645 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fi) type: mteb/amazon_massive_intent config: fi split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.12104909213181 - type: f1 value: 67.29448842879584 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (fr) type: mteb/amazon_massive_intent config: fr split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.34095494283793 - type: f1 value: 67.01134288992947 - task: type: 
Classification dataset: name: MTEB MassiveIntentClassification (he) type: mteb/amazon_massive_intent config: he split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.61264290517822 - type: f1 value: 64.68730512660757 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (hi) type: mteb/amazon_massive_intent config: hi split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.79757901815738 - type: f1 value: 65.24938539425598 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (hu) type: mteb/amazon_massive_intent config: hu split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.68728984532616 - type: f1 value: 67.0487169762553 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (hy) type: mteb/amazon_massive_intent config: hy split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.07464694014795 - type: f1 value: 59.183532276789286 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (id) type: mteb/amazon_massive_intent config: id split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.04707464694015 - type: f1 value: 67.66829629003848 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (is) type: mteb/amazon_massive_intent config: is split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.42434431741762 - type: f1 value: 59.01617226544757 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (it) type: mteb/amazon_massive_intent config: it split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.53127101546738 - type: f1 value: 68.10033760906255 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ja) type: mteb/amazon_massive_intent config: ja split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 72.50504371217215 - type: f1 value: 69.74931103158923 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (jv) type: mteb/amazon_massive_intent config: jv split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 57.91190316072628 - type: f1 value: 54.05551136648796 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ka) type: mteb/amazon_massive_intent config: ka split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 51.78211163416275 - type: f1 value: 49.874888544058535 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (km) type: mteb/amazon_massive_intent config: km split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 47.017484868863484 - type: f1 value: 44.53364263352014 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (kn) type: mteb/amazon_massive_intent config: kn split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 62.16207128446537 - type: f1 value: 59.01185692320829 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (ko) type: mteb/amazon_massive_intent config: ko split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 69.42501681237391 - type: f1 
      value: 67.13169450166086
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (lv)
      type: mteb/amazon_massive_intent
      config: lv
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 67.0780094149294
    - type: f1
      value: 64.41720167850707
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (ml)
      type: mteb/amazon_massive_intent
      config: ml
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 65.57162071284466
    - type: f1
      value: 62.414138683804424
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (mn)
      type: mteb/amazon_massive_intent
      config: mn
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 61.71149966375252
    - type: f1
      value: 58.594805125087234
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (ms)
      type: mteb/amazon_massive_intent
      config: ms
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 66.03900470746471
    - type: f1
      value: 63.87937257883887
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (my)
      type: mteb/amazon_massive_intent
      config: my
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 60.8776059179556
    - type: f1
      value: 57.48587618059131
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (nb)
      type: mteb/amazon_massive_intent
      config: nb
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.87895090786819
    - type: f1
      value: 66.8141299430347
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (nl)
      type: mteb/amazon_massive_intent
      config: nl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.45057162071285
    - type: f1
      value: 67.46444039673516
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (pl)
      type: mteb/amazon_massive_intent
      config: pl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.546738399462
    - type: f1
      value: 68.63640876702655
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (pt)
      type: mteb/amazon_massive_intent
      config: pt
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.72965702757229
    - type: f1
      value: 68.54119560379115
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (ro)
      type: mteb/amazon_massive_intent
      config: ro
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 68.35574983187625
    - type: f1
      value: 65.88844917691927
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (ru)
      type: mteb/amazon_massive_intent
      config: ru
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.70477471418964
    - type: f1
      value: 69.19665697061978
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (sl)
      type: mteb/amazon_massive_intent
      config: sl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 67.0880968392737
    - type: f1
      value: 64.76962317666086
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (sq)
      type: mteb/amazon_massive_intent
      config: sq
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 65.18493611297916
    - type: f1
      value: 62.49984559035371
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (sv)
      type: mteb/amazon_massive_intent
      config: sv
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.75857431069265
    - type: f1
      value: 69.20053687623418
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (sw)
      type: mteb/amazon_massive_intent
      config: sw
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 58.500336247478145
    - type: f1
      value: 55.2972398687929
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (ta)
      type: mteb/amazon_massive_intent
      config: ta
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 62.68997982515132
    - type: f1
      value: 59.36848202755348
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (te)
      type: mteb/amazon_massive_intent
      config: te
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 63.01950235373235
    - type: f1
      value: 60.09351954625423
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (th)
      type: mteb/amazon_massive_intent
      config: th
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 68.29186281102892
    - type: f1
      value: 67.57860496703447
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (tl)
      type: mteb/amazon_massive_intent
      config: tl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 64.77471418964357
    - type: f1
      value: 61.913983147713836
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (tr)
      type: mteb/amazon_massive_intent
      config: tr
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.87222595830532
    - type: f1
      value: 66.03679033708141
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (ur)
      type: mteb/amazon_massive_intent
      config: ur
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 64.04505716207127
    - type: f1
      value: 61.28569169817908
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (vi)
      type: mteb/amazon_massive_intent
      config: vi
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.38466711499663
    - type: f1
      value: 67.20532357036844
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (zh-CN)
      type: mteb/amazon_massive_intent
      config: zh-CN
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.12306657700067
    - type: f1
      value: 68.91251226588182
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveIntentClassification (zh-TW)
      type: mteb/amazon_massive_intent
      config: zh-TW
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 66.20040349697378
    - type: f1
      value: 66.02657347714175
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (af)
      type: mteb/amazon_massive_scenario
      config: af
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 68.73907195696032
    - type: f1
      value: 66.98484521791418
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (am)
      type: mteb/amazon_massive_scenario
      config: am
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 60.58843308675185
    - type: f1
      value: 58.95591723092005
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ar)
      type: mteb/amazon_massive_scenario
      config: ar
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.22730329522528
    - type: f1
      value: 66.0894499712115
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (az)
      type: mteb/amazon_massive_scenario
      config: az
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.48285137861465
    - type: f1
      value: 65.21963176785157
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (bn)
      type: mteb/amazon_massive_scenario
      config: bn
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 67.74714189643578
    - type: f1
      value: 66.8212192745412
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (cy)
      type: mteb/amazon_massive_scenario
      config: cy
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 59.09213180901143
    - type: f1
      value: 56.70735546356339
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (da)
      type: mteb/amazon_massive_scenario
      config: da
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 75.05716207128448
    - type: f1
      value: 74.8413712365364
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (de)
      type: mteb/amazon_massive_scenario
      config: de
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.69737726967047
    - type: f1
      value: 74.7664341963
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (el)
      type: mteb/amazon_massive_scenario
      config: el
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.90383322125084
    - type: f1
      value: 73.59201554448323
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (en)
      type: mteb/amazon_massive_scenario
      config: en
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 77.51176866173503
    - type: f1
      value: 77.46104434577758
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (es)
      type: mteb/amazon_massive_scenario
      config: es
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.31069266980496
    - type: f1
      value: 74.61048660675635
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (fa)
      type: mteb/amazon_massive_scenario
      config: fa
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 72.95225285810356
    - type: f1
      value: 72.33160006574627
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (fi)
      type: mteb/amazon_massive_scenario
      config: fi
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.12373907195696
    - type: f1
      value: 73.20921012557481
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (fr)
      type: mteb/amazon_massive_scenario
      config: fr
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.86684599865501
    - type: f1
      value: 73.82348774610831
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (he)
      type: mteb/amazon_massive_scenario
      config: he
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.40215198386012
    - type: f1
      value: 71.11945183971858
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (hi)
      type: mteb/amazon_massive_scenario
      config: hi
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 72.12844653665098
    - type: f1
      value: 71.34450495911766
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (hu)
      type: mteb/amazon_massive_scenario
      config: hu
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.52252858103566
    - type: f1
      value: 73.98878711342999
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (hy)
      type: mteb/amazon_massive_scenario
      config: hy
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 64.93611297915265
    - type: f1
      value: 63.723200467653385
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (id)
      type: mteb/amazon_massive_scenario
      config: id
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.11903160726295
    - type: f1
      value: 73.82138439467096
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (is)
      type: mteb/amazon_massive_scenario
      config: is
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 67.15198386012105
    - type: f1
      value: 66.02172193802167
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (it)
      type: mteb/amazon_massive_scenario
      config: it
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.32414256893072
    - type: f1
      value: 74.30943421170574
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ja)
      type: mteb/amazon_massive_scenario
      config: ja
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 77.46805648957633
    - type: f1
      value: 77.62808409298209
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (jv)
      type: mteb/amazon_massive_scenario
      config: jv
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 63.318762609280434
    - type: f1
      value: 62.094284066075076
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ka)
      type: mteb/amazon_massive_scenario
      config: ka
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 58.34902488231338
    - type: f1
      value: 57.12893860987984
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (km)
      type: mteb/amazon_massive_scenario
      config: km
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 50.88433086751849
    - type: f1
      value: 48.2272350802058
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (kn)
      type: mteb/amazon_massive_scenario
      config: kn
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.4425016812374
    - type: f1
      value: 64.61463095996173
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ko)
      type: mteb/amazon_massive_scenario
      config: ko
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 75.04707464694015
    - type: f1
      value: 75.05099199098998
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (lv)
      type: mteb/amazon_massive_scenario
      config: lv
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 70.50437121721586
    - type: f1
      value: 69.83397721096314
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ml)
      type: mteb/amazon_massive_scenario
      config: ml
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 69.94283792871553
    - type: f1
      value: 68.8704663703913
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (mn)
      type: mteb/amazon_massive_scenario
      config: mn
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 64.79488903833222
    - type: f1
      value: 63.615424063345436
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ms)
      type: mteb/amazon_massive_scenario
      config: ms
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 69.88231338264963
    - type: f1
      value: 68.57892302593237
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (my)
      type: mteb/amazon_massive_scenario
      config: my
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 63.248150638870214
    - type: f1
      value: 61.06680605338809
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (nb)
      type: mteb/amazon_massive_scenario
      config: nb
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.84196368527236
    - type: f1
      value: 74.52566464968763
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (nl)
      type: mteb/amazon_massive_scenario
      config: nl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.8285137861466
    - type: f1
      value: 74.8853197608802
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (pl)
      type: mteb/amazon_massive_scenario
      config: pl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.13248150638869
    - type: f1
      value: 74.3982040999179
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (pt)
      type: mteb/amazon_massive_scenario
      config: pt
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.49024882313383
    - type: f1
      value: 73.82153848368573
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ro)
      type: mteb/amazon_massive_scenario
      config: ro
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.72158708809684
    - type: f1
      value: 71.85049433180541
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ru)
      type: mteb/amazon_massive_scenario
      config: ru
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 75.137861466039
    - type: f1
      value: 75.37628348188467
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (sl)
      type: mteb/amazon_massive_scenario
      config: sl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.86953597848016
    - type: f1
      value: 71.87537624521661
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (sq)
      type: mteb/amazon_massive_scenario
      config: sq
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 70.27572293207801
    - type: f1
      value: 68.80017302344231
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (sv)
      type: mteb/amazon_massive_scenario
      config: sv
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 76.09952925353059
    - type: f1
      value: 76.07992707688408
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (sw)
      type: mteb/amazon_massive_scenario
      config: sw
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 63.140551445864155
    - type: f1
      value: 61.73855010331415
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ta)
      type: mteb/amazon_massive_scenario
      config: ta
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.27774041694687
    - type: f1
      value: 64.83664868894539
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (te)
      type: mteb/amazon_massive_scenario
      config: te
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.69468728984533
    - type: f1
      value: 64.76239666920868
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (th)
      type: mteb/amazon_massive_scenario
      config: th
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.44653665097512
    - type: f1
      value: 73.14646052013873
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (tl)
      type: mteb/amazon_massive_scenario
      config: tl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 67.71351714862139
    - type: f1
      value: 66.67212180163382
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (tr)
      type: mteb/amazon_massive_scenario
      config: tr
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.9946200403497
    - type: f1
      value: 73.87348793725525
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (ur)
      type: mteb/amazon_massive_scenario
      config: ur
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 68.15400134498992
    - type: f1
      value: 67.09433241421094
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (vi)
      type: mteb/amazon_massive_scenario
      config: vi
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.11365164761264
    - type: f1
      value: 73.59502539433753
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (zh-CN)
      type: mteb/amazon_massive_scenario
      config: zh-CN
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 76.82582380632145
    - type: f1
      value: 76.89992945316313
  - task:
      type: Classification
    dataset:
      name: MTEB MassiveScenarioClassification (zh-TW)
      type: mteb/amazon_massive_scenario
      config: zh-TW
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.81237390719569
    - type: f1
      value: 72.36499770986265
  - task:
      type: Clustering
    dataset:
      name: MTEB MedrxivClusteringP2P
      type: mteb/medrxiv-clustering-p2p
      config: default
      split: test
      revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
    metrics:
    - type: v_measure
      value: 31.480506569594695
  - task:
      type: Clustering
    dataset:
      name: MTEB MedrxivClusteringS2S
      type: mteb/medrxiv-clustering-s2s
      config: default
      split: test
      revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
    metrics:
    - type: v_measure
      value: 29.71252128004552
  - task:
      type: Reranking
    dataset:
      name: MTEB MindSmallReranking
      type: mteb/mind_small
      config: default
      split: test
      revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
    metrics:
    - type: map
      value: 31.421396787056548
    - type: mrr
      value: 32.48155274872267
  - task:
      type: Retrieval
    dataset:
      name: MTEB NFCorpus
      type: nfcorpus
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 5.595
    - type: map_at_10
      value: 12.642000000000001
    - type: map_at_100
      value: 15.726
    - type: map_at_1000
      value: 17.061999999999998
    - type: map_at_3
      value: 9.125
    - type: map_at_5
      value: 10.866000000000001
    - type: mrr_at_1
      value: 43.344
    - type: mrr_at_10
      value: 52.227999999999994
    - type: mrr_at_100
      value: 52.898999999999994
    - type: mrr_at_1000
      value: 52.944
    - type: mrr_at_3
      value: 49.845
    - type: mrr_at_5
      value: 51.115
    - type: ndcg_at_1
      value: 41.949999999999996
    - type: ndcg_at_10
      value: 33.995
    - type: ndcg_at_100
      value: 30.869999999999997
    - type: ndcg_at_1000
      value: 39.487
    - type: ndcg_at_3
      value: 38.903999999999996
    - type: ndcg_at_5
      value: 37.236999999999995
    - type: precision_at_1
      value: 43.344
    - type: precision_at_10
      value: 25.480000000000004
    - type: precision_at_100
      value: 7.672
    - type: precision_at_1000
      value: 2.028
    - type: precision_at_3
      value: 36.636
    - type: precision_at_5
      value: 32.632
    - type: recall_at_1
      value: 5.595
    - type: recall_at_10
      value: 16.466
    - type: recall_at_100
      value: 31.226
    - type: recall_at_1000
      value: 62.778999999999996
    - type: recall_at_3
      value: 9.931
    - type: recall_at_5
      value: 12.884
  - task:
      type: Retrieval
    dataset:
      name: MTEB NQ
      type: nq
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 40.414
    - type: map_at_10
      value: 56.754000000000005
    - type: map_at_100
      value: 57.457
    - type: map_at_1000
      value: 57.477999999999994
    - type: map_at_3
      value: 52.873999999999995
    - type: map_at_5
      value: 55.175
    - type: mrr_at_1
      value: 45.278
    - type: mrr_at_10
      value: 59.192
    - type: mrr_at_100
      value: 59.650000000000006
    - type: mrr_at_1000
      value: 59.665
    - type: mrr_at_3
      value: 56.141
    - type: mrr_at_5
      value: 57.998000000000005
    - type: ndcg_at_1
      value: 45.278
    - type: ndcg_at_10
      value: 64.056
    - type: ndcg_at_100
      value: 66.89
    - type: ndcg_at_1000
      value: 67.364
    - type: ndcg_at_3
      value: 56.97
    - type: ndcg_at_5
      value: 60.719
    - type: precision_at_1
      value: 45.278
    - type: precision_at_10
      value: 9.994
    - type: precision_at_100
      value: 1.165
    - type: precision_at_1000
      value: 0.121
    - type: precision_at_3
      value: 25.512
    - type: precision_at_5
      value: 17.509
    - type: recall_at_1
      value: 40.414
    - type: recall_at_10
      value: 83.596
    - type: recall_at_100
      value: 95.72
    - type: recall_at_1000
      value: 99.24
    - type: recall_at_3
      value: 65.472
    - type: recall_at_5
      value: 74.039
  - task:
      type: Retrieval
    dataset:
      name: MTEB QuoraRetrieval
      type: quora
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 70.352
    - type: map_at_10
      value: 84.369
    - type: map_at_100
      value: 85.02499999999999
    - type: map_at_1000
      value: 85.04
    - type: map_at_3
      value: 81.42399999999999
    - type: map_at_5
      value: 83.279
    - type: mrr_at_1
      value: 81.05
    - type: mrr_at_10
      value: 87.401
    - type: mrr_at_100
      value: 87.504
    - type: mrr_at_1000
      value: 87.505
    - type: mrr_at_3
      value: 86.443
    - type: mrr_at_5
      value: 87.10799999999999
    - type: ndcg_at_1
      value: 81.04
    - type: ndcg_at_10
      value: 88.181
    - type: ndcg_at_100
      value: 89.411
    - type: ndcg_at_1000
      value: 89.507
    - type: ndcg_at_3
      value: 85.28099999999999
    - type: ndcg_at_5
      value: 86.888
    - type: precision_at_1
      value: 81.04
    - type: precision_at_10
      value: 13.406
    - type: precision_at_100
      value: 1.5350000000000001
    - type: precision_at_1000
      value: 0.157
    - type: precision_at_3
      value: 37.31
    - type: precision_at_5
      value: 24.54
    - type: recall_at_1
      value: 70.352
    - type: recall_at_10
      value: 95.358
    - type: recall_at_100
      value: 99.541
    - type: recall_at_1000
      value: 99.984
    - type: recall_at_3
      value: 87.111
    - type: recall_at_5
      value: 91.643
  - task:
      type: Clustering
    dataset:
      name: MTEB RedditClustering
      type: mteb/reddit-clustering
      config: default
      split: test
      revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
    metrics:
    - type: v_measure
      value: 46.54068723291946
  - task:
      type: Clustering
    dataset:
      name: MTEB RedditClusteringP2P
      type: mteb/reddit-clustering-p2p
      config: default
      split: test
      revision: 282350215ef01743dc01b456c7f5241fa8937f16
    metrics:
    - type: v_measure
      value: 63.216287629895994
  - task:
      type: Retrieval
    dataset:
      name: MTEB SCIDOCS
      type: scidocs
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 4.023000000000001
    - type: map_at_10
      value: 10.071
    - type: map_at_100
      value: 11.892
    - type: map_at_1000
      value: 12.196
    - type: map_at_3
      value: 7.234
    - type: map_at_5
      value: 8.613999999999999
    - type: mrr_at_1
      value: 19.900000000000002
    - type: mrr_at_10
      value: 30.516
    - type: mrr_at_100
      value: 31.656000000000002
    - type: mrr_at_1000
      value: 31.723000000000003
    - type: mrr_at_3
      value: 27.400000000000002
    - type: mrr_at_5
      value: 29.270000000000003
    - type: ndcg_at_1
      value: 19.900000000000002
    - type: ndcg_at_10
      value: 17.474
    - type: ndcg_at_100
      value: 25.020999999999997
    - type: ndcg_at_1000
      value: 30.728
    - type: ndcg_at_3
      value: 16.588
    - type: ndcg_at_5
      value: 14.498
    - type: precision_at_1
      value: 19.900000000000002
    - type: precision_at_10
      value: 9.139999999999999
    - type: precision_at_100
      value: 2.011
    - type: precision_at_1000
      value: 0.33899999999999997
    - type: precision_at_3
      value: 15.667
    - type: precision_at_5
      value: 12.839999999999998
    - type: recall_at_1
      value: 4.023000000000001
    - type: recall_at_10
      value: 18.497
    - type: recall_at_100
      value: 40.8
    - type: recall_at_1000
      value: 68.812
    - type: recall_at_3
      value: 9.508
    - type: recall_at_5
      value: 12.983
  - task:
      type: STS
    dataset:
      name: MTEB SICK-R
      type: mteb/sickr-sts
      config: default
      split: test
      revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
    metrics:
    - type: cos_sim_pearson
      value: 83.967008785134
    - type: cos_sim_spearman
      value: 80.23142141101837
    - type: euclidean_pearson
      value: 81.20166064704539
    - type: euclidean_spearman
      value: 80.18961335654585
    - type: manhattan_pearson
      value: 81.13925443187625
    - type: manhattan_spearman
      value: 80.07948723044424
  - task:
      type: STS
    dataset:
      name: MTEB STS12
      type: mteb/sts12-sts
      config: default
      split: test
      revision: a0d554a64d88156834ff5ae9920b964011b16384
    metrics:
    - type: cos_sim_pearson
      value: 86.94262461316023
    - type: cos_sim_spearman
      value: 80.01596278563865
    - type: euclidean_pearson
      value: 83.80799622922581
    - type: euclidean_spearman
      value: 79.94984954947103
    - type: manhattan_pearson
      value: 83.68473841756281
    - type: manhattan_spearman
      value: 79.84990707951822
  - task:
      type: STS
    dataset:
      name: MTEB STS13
      type: mteb/sts13-sts
      config: default
      split: test
      revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
    metrics:
    - type: cos_sim_pearson
      value: 80.57346443146068
    - type: cos_sim_spearman
      value: 81.54689837570866
    - type: euclidean_pearson
      value: 81.10909881516007
    - type: euclidean_spearman
      value: 81.56746243261762
    - type: manhattan_pearson
      value: 80.87076036186582
    - type: manhattan_spearman
      value: 81.33074987964402
  - task:
      type: STS
    dataset:
      name: MTEB STS14
      type: mteb/sts14-sts
      config: default
      split: test
      revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
    metrics:
    - type: cos_sim_pearson
      value: 79.54733787179849
    - type: cos_sim_spearman
      value: 77.72202105610411
    - type: euclidean_pearson
      value: 78.9043595478849
    - type: euclidean_spearman
      value: 77.93422804309435
    - type: manhattan_pearson
      value: 78.58115121621368
    - type: manhattan_spearman
      value: 77.62508135122033
  - task:
      type: STS
    dataset:
      name: MTEB STS15
      type: mteb/sts15-sts
      config: default
      split: test
      revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
    metrics:
    - type: cos_sim_pearson
      value: 88.59880017237558
    - type: cos_sim_spearman
      value: 89.31088630824758
    - type: euclidean_pearson
      value: 88.47069261564656
    - type: euclidean_spearman
      value: 89.33581971465233
    - type: manhattan_pearson
      value: 88.40774264100956
    - type: manhattan_spearman
      value: 89.28657485627835
  - task:
      type: STS
    dataset:
      name: MTEB STS16
      type: mteb/sts16-sts
      config: default
      split: test
      revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
    metrics:
    - type: cos_sim_pearson
      value: 84.08055117917084
    - type: cos_sim_spearman
      value: 85.78491813080304
    - type: euclidean_pearson
      value: 84.99329155500392
    - type: euclidean_spearman
      value: 85.76728064677287
    - type: manhattan_pearson
      value: 84.87947428989587
    - type: manhattan_spearman
      value: 85.62429454917464
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (ko-ko)
      type: mteb/sts17-crosslingual-sts
      config: ko-ko
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 82.14190939287384
    - type: cos_sim_spearman
      value: 82.27331573306041
    - type: euclidean_pearson
      value: 81.891896953716
    - type: euclidean_spearman
      value: 82.37695542955998
    - type: manhattan_pearson
      value: 81.73123869460504
    - type: manhattan_spearman
      value: 82.19989168441421
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (ar-ar)
      type: mteb/sts17-crosslingual-sts
      config: ar-ar
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 76.84695301843362
    - type: cos_sim_spearman
      value: 77.87790986014461
    - type: euclidean_pearson
      value: 76.91981583106315
    - type: euclidean_spearman
      value: 77.88154772749589
    - type: manhattan_pearson
      value: 76.94953277451093
    - type: manhattan_spearman
      value: 77.80499230728604
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (en-ar)
      type: mteb/sts17-crosslingual-sts
      config: en-ar
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 75.44657840482016
    - type: cos_sim_spearman
      value: 75.05531095119674
    - type: euclidean_pearson
      value: 75.88161755829299
    - type: euclidean_spearman
      value: 74.73176238219332
    - type: manhattan_pearson
      value: 75.63984765635362
    - type: manhattan_spearman
      value: 74.86476440770737
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (en-de)
      type: mteb/sts17-crosslingual-sts
      config: en-de
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 85.64700140524133
    - type: cos_sim_spearman
      value: 86.16014210425672
    - type: euclidean_pearson
      value: 86.49086860843221
    - type: euclidean_spearman
      value: 86.09729326815614
    - type: manhattan_pearson
      value: 86.43406265125513
    - type: manhattan_spearman
      value: 86.17740150939994
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (en-en)
      type: mteb/sts17-crosslingual-sts
      config: en-en
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 87.91170098764921
    - type: cos_sim_spearman
      value: 88.12437004058931
    - type: euclidean_pearson
      value: 88.81828254494437
    - type: euclidean_spearman
      value: 88.14831794572122
    - type: manhattan_pearson
      value: 88.93442183448961
    - type: manhattan_spearman
      value: 88.15254630778304
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (en-tr)
      type: mteb/sts17-crosslingual-sts
      config: en-tr
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 72.91390577997292
    - type: cos_sim_spearman
      value: 71.22979457536074
    - type: euclidean_pearson
      value: 74.40314008106749
    - type: euclidean_spearman
      value: 72.54972136083246
    - type: manhattan_pearson
      value: 73.85687539530218
    - type: manhattan_spearman
      value: 72.09500771742637
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (es-en)
      type: mteb/sts17-crosslingual-sts
      config: es-en
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 80.9301067983089
    - type: cos_sim_spearman
      value: 80.74989828346473
    - type: euclidean_pearson
      value: 81.36781301814257
    - type: euclidean_spearman
      value: 80.9448819964426
    - type: manhattan_pearson
      value: 81.0351322685609
    - type: manhattan_spearman
      value: 80.70192121844177
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (es-es)
      type: mteb/sts17-crosslingual-sts
      config: es-es
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 87.13820465980005
    - type: cos_sim_spearman
      value: 86.73532498758757
    - type: euclidean_pearson
      value: 87.21329451846637
    - type: euclidean_spearman
      value: 86.57863198601002
    - type: manhattan_pearson
      value: 87.06973713818554
    - type: manhattan_spearman
      value: 86.47534918791499
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (fr-en)
      type: mteb/sts17-crosslingual-sts
      config: fr-en
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 85.48720108904415
    - type: cos_sim_spearman
      value: 85.62221757068387
    - type: euclidean_pearson
      value: 86.1010129512749
    - type: euclidean_spearman
      value: 85.86580966509942
    - type: manhattan_pearson
      value: 86.26800938808971
    - type: manhattan_spearman
      value: 85.88902721678429
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (it-en)
      type: mteb/sts17-crosslingual-sts
      config: it-en
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 83.98021347333516
    - type: cos_sim_spearman
      value: 84.53806553803501
    - type: euclidean_pearson
      value: 84.61483347248364
    - type: euclidean_spearman
      value: 85.14191408011702
    - type: manhattan_pearson
      value: 84.75297588825967
    - type: manhattan_spearman
      value: 85.33176753669242
  - task:
      type: STS
    dataset:
      name: MTEB STS17 (nl-en)
      type: mteb/sts17-crosslingual-sts
      config: nl-en
      split: test
      revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
    metrics:
    - type: cos_sim_pearson
      value: 84.51856644893233
    - type: cos_sim_spearman
      value: 85.27510748506413
    - type: euclidean_pearson
      value: 85.09886861540977
    - type: euclidean_spearman
      value: 85.62579245860887
    - type: manhattan_pearson
      value: 84.93017860464607
    - type: manhattan_spearman
      value: 85.5063988898453
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (en)
      type: mteb/sts22-crosslingual-sts
      config: en
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 62.581573200584195
    - type: cos_sim_spearman
      value: 63.05503590247928
    - type: euclidean_pearson
      value: 63.652564812602094
    - type: euclidean_spearman
      value: 62.64811520876156
    - type: manhattan_pearson
      value: 63.506842893061076
    - type: manhattan_spearman
      value: 62.51289573046917
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de)
      type: mteb/sts22-crosslingual-sts
      config: de
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 48.2248801729127
    - type: cos_sim_spearman
      value: 56.5936604678561
    - type: euclidean_pearson
      value: 43.98149464089
    - type: euclidean_spearman
      value: 56.108561882423615
    - type: manhattan_pearson
      value: 43.86880305903564
    - type: manhattan_spearman
      value: 56.04671150510166
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (es)
      type: mteb/sts22-crosslingual-sts
      config: es
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 55.17564527009831
    - type: cos_sim_spearman
      value: 64.57978560979488
    - type: euclidean_pearson
      value: 58.8818330154583
    - type: euclidean_spearman
      value: 64.99214839071281
    - type: manhattan_pearson
      value: 58.72671436121381
    - type: manhattan_spearman
      value: 65.10713416616109
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (pl)
      type: mteb/sts22-crosslingual-sts
      config: pl
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 26.772131864023297
    - type: cos_sim_spearman
      value: 34.68200792408681
    - type: euclidean_pearson
      value: 16.68082419005441
    - type: euclidean_spearman
      value: 34.83099932652166
    - type: manhattan_pearson
      value: 16.52605949659529
    - type: manhattan_spearman
      value: 34.82075801399475
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (tr)
      type: mteb/sts22-crosslingual-sts
      config: tr
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 54.42415189043831
    - type: cos_sim_spearman
      value: 63.54594264576758
    - type: euclidean_pearson
      value: 57.36577498297745
    - type: euclidean_spearman
      value: 63.111466379158074
    - type: manhattan_pearson
      value: 57.584543715873885
    - type: manhattan_spearman
      value: 63.22361054139183
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (ar)
      type: mteb/sts22-crosslingual-sts
      config: ar
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 47.55216762405518
    - type: cos_sim_spearman
      value: 56.98670142896412
    - type: euclidean_pearson
      value: 50.15318757562699
    - type: euclidean_spearman
      value: 56.524941926541906
    - type: manhattan_pearson
      value: 49.955618528674904
    - type: manhattan_spearman
      value: 56.37102209240117
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (ru)
      type: mteb/sts22-crosslingual-sts
      config: ru
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 49.20540980338571
    - type: cos_sim_spearman
      value: 59.9009453504406
    - type: euclidean_pearson
      value: 49.557749853620535
    - type: euclidean_spearman
      value: 59.76631621172456
    - type: manhattan_pearson
      value: 49.62340591181147
    - type: manhattan_spearman
      value: 59.94224880322436
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (zh)
      type: mteb/sts22-crosslingual-sts
      config: zh
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 51.508169956576985
    - type: cos_sim_spearman
      value: 66.82461565306046
    - type: euclidean_pearson
      value: 56.2274426480083
    - type: euclidean_spearman
      value: 66.6775323848333
    - type: manhattan_pearson
      value: 55.98277796300661
    - type: manhattan_spearman
      value: 66.63669848497175
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (fr)
      type: mteb/sts22-crosslingual-sts
      config: fr
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 72.86478788045507
    - type: cos_sim_spearman
      value: 76.7946552053193
    - type: euclidean_pearson
      value: 75.01598530490269
    - type: euclidean_spearman
      value: 76.83618917858281
    - type: manhattan_pearson
      value: 74.68337628304332
    - type: manhattan_spearman
      value: 76.57480204017773
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de-en)
      type: mteb/sts22-crosslingual-sts
      config: de-en
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 55.922619099401984
    - type: cos_sim_spearman
      value: 56.599362477240774
    - type: euclidean_pearson
      value: 56.68307052369783
    - type: euclidean_spearman
      value: 54.28760436777401
    - type: manhattan_pearson
      value: 56.67763566500681
    - type: manhattan_spearman
      value: 53.94619541711359
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (es-en)
      type: mteb/sts22-crosslingual-sts
      config: es-en
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 66.74357206710913
    - type: cos_sim_spearman
      value: 72.5208244925311
    - type: euclidean_pearson
      value: 67.49254562186032
    - type: euclidean_spearman
      value: 72.02469076238683
    - type: manhattan_pearson
      value: 67.45251772238085
    - type: manhattan_spearman
      value: 72.05538819984538
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (it)
      type: mteb/sts22-crosslingual-sts
      config: it
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 71.25734330033191
    - type: cos_sim_spearman
      value: 76.98349083946823
    - type: euclidean_pearson
      value: 73.71642838667736
    - type: euclidean_spearman
      value: 77.01715504651384
    - type: manhattan_pearson
      value: 73.61712711868105
    - type: manhattan_spearman
      value: 77.01392571153896
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (pl-en)
      type: mteb/sts22-crosslingual-sts
      config: pl-en
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 63.18215462781212
    - type: cos_sim_spearman
      value: 65.54373266117607
    - type: euclidean_pearson
      value: 64.54126095439005
    - type: euclidean_spearman
      value: 65.30410369102711
    - type: manhattan_pearson
      value: 63.50332221148234
    - type: manhattan_spearman
      value: 64.3455878104313
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (zh-en)
      type: mteb/sts22-crosslingual-sts
      config: zh-en
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 62.30509221440029
    - type: cos_sim_spearman
      value: 65.99582704642478
    - type: euclidean_pearson
      value: 63.43818859884195
    - type: euclidean_spearman
      value: 66.83172582815764
    - type: manhattan_pearson
      value: 63.055779168508764
    - type: manhattan_spearman
      value: 65.49585020501449
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (es-it)
      type: mteb/sts22-crosslingual-sts
      config: es-it
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 59.587830825340404
    - type: cos_sim_spearman
      value: 68.93467614588089
    - type: euclidean_pearson
      value: 62.3073527367404
    - type: euclidean_spearman
      value: 69.69758171553175
    - type: manhattan_pearson
      value: 61.9074580815789
    - type: manhattan_spearman
      value: 69.57696375597865
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de-fr)
      type: mteb/sts22-crosslingual-sts
      config: de-fr
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 57.143220125577066
    - type: cos_sim_spearman
      value: 67.78857859159226
    - type: euclidean_pearson
      value: 55.58225107923733
    - type: euclidean_spearman
      value: 67.80662907184563
    - type: manhattan_pearson
      value: 56.24953502726514
    - type: manhattan_spearman
      value: 67.98262125431616
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (de-pl)
      type: mteb/sts22-crosslingual-sts
      config: de-pl
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 21.826928900322066
    - type: cos_sim_spearman
      value: 49.578506634400405
    - type: euclidean_pearson
      value: 27.939890138843214
    - type: euclidean_spearman
      value: 52.71950519136242
    - type: manhattan_pearson
      value: 26.39878683847546
    - type: manhattan_spearman
      value: 47.54609580342499
  - task:
      type: STS
    dataset:
      name: MTEB STS22 (fr-pl)
      type: mteb/sts22-crosslingual-sts
      config: fr-pl
      split: test
      revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
    metrics:
    - type: cos_sim_pearson
      value: 57.27603854632001
    - type: cos_sim_spearman
      value: 50.709255283710995
    - type: euclidean_pearson
      value: 59.5419024445929
    - type: euclidean_spearman
      value: 50.709255283710995
    - type: manhattan_pearson
      value: 59.03256832438492
    - type: manhattan_spearman
      value: 61.97797868009122
  - task:
      type: STS
    dataset:
      name: MTEB STSBenchmark
      type: mteb/stsbenchmark-sts
      config: default
      split: test
      revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
    metrics:
    - type: cos_sim_pearson
      value: 85.00757054859712
    - type: cos_sim_spearman
      value: 87.29283629622222
    - type: euclidean_pearson
      value: 86.54824171775536
    - type: euclidean_spearman
      value: 87.24364730491402
    - type: manhattan_pearson
      value: 86.5062156915074
    - type: manhattan_spearman
      value: 87.15052170378574
  - task:
      type: Reranking
    dataset:
      name: MTEB SciDocsRR
      type: mteb/scidocs-reranking
      config: default
      split: test
      revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
    metrics:
    - type: map
      value: 82.03549357197389
    - type: mrr
      value: 95.05437645143527
  - task:
      type: Retrieval
    dataset:
      name: MTEB SciFact
      type: scifact
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 57.260999999999996
    - type: map_at_10
      value: 66.259
    - type: map_at_100
      value: 66.884
    - type: map_at_1000
      value: 66.912
    - type: map_at_3
      value: 63.685
    - type: map_at_5
      value: 65.35499999999999
    - type: mrr_at_1
      value: 60.333000000000006
    - type: mrr_at_10
      value: 67.5
    - type: mrr_at_100
      value: 68.013
    - type: mrr_at_1000
      value: 68.038
    - type: mrr_at_3
      value: 65.61099999999999
    - type: mrr_at_5
      value: 66.861
    - type: ndcg_at_1
      value: 60.333000000000006
    - type: ndcg_at_10
      value: 70.41
    - type: ndcg_at_100
      value: 73.10600000000001
    - type: ndcg_at_1000
      value: 73.846
    - type: ndcg_at_3
      value: 66.133
    - type: ndcg_at_5
      value: 68.499
    - type: precision_at_1
      value: 60.333000000000006
    - type: precision_at_10
      value: 9.232999999999999
    - type: precision_at_100
      value: 1.0630000000000002
    - type: precision_at_1000
      value: 0.11299999999999999
    - type: precision_at_3
      value: 25.667
    - type: precision_at_5
      value: 17.067
    - type: recall_at_1
      value: 57.260999999999996
    - type: recall_at_10
      value: 81.94399999999999
    - type: recall_at_100
      value: 93.867
    - type: recall_at_1000
      value: 99.667
    - type: recall_at_3
      value: 70.339
    - type: recall_at_5
      value: 76.25
  - task:
      type: PairClassification
    dataset:
      name: MTEB SprintDuplicateQuestions
      type: mteb/sprintduplicatequestions-pairclassification
      config: default
      split: test
      revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
    metrics:
    - type: cos_sim_accuracy
      value: 99.74356435643564
    - type: cos_sim_ap
      value: 93.13411948212683
    - type: cos_sim_f1
      value: 86.80521991300147
    - type: cos_sim_precision
      value: 84.00374181478017
    - type: cos_sim_recall
      value: 89.8
    - type: dot_accuracy
      value: 99.67920792079208
    - type: dot_ap
      value: 89.27277565444479
    - type: dot_f1
      value: 83.9276990718124
    - type: dot_precision
      value: 82.04393505253104
    - type: dot_recall
      value: 85.9
    - type: euclidean_accuracy
      value: 99.74257425742574
    - type: euclidean_ap
      value: 93.17993008259062
    - type: euclidean_f1
      value: 86.69396110542476
    - type: euclidean_precision
      value: 88.78406708595388
    - type: euclidean_recall
      value: 84.7
    - type: manhattan_accuracy
      value: 99.74257425742574
    - type: manhattan_ap
      value: 93.14413755550099
    - type: manhattan_f1
      value: 86.82483594144371
    - type: manhattan_precision
      value: 87.66564729867483
    - type: manhattan_recall
      value: 86
    - type: max_accuracy
      value: 99.74356435643564
    - type: max_ap
      value: 93.17993008259062
    - type: max_f1
      value: 86.82483594144371
  - task:
      type: Clustering
    dataset:
      name: MTEB StackExchangeClustering
      type: mteb/stackexchange-clustering
      config: default
      split: test
      revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
    metrics:
    - type: v_measure
      value: 57.525863806168566
  - task:
      type: Clustering
    dataset:
      name: MTEB StackExchangeClusteringP2P
      type: mteb/stackexchange-clustering-p2p
      config: default
      split: test
      revision: 815ca46b2622cec33ccafc3735d572c266efdb44
    metrics:
    - type: v_measure
      value: 32.68850574423839
  - task:
      type: Reranking
    dataset:
      name: MTEB StackOverflowDupQuestions
      type: mteb/stackoverflowdupquestions-reranking
      config: default
      split: test
      revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
    metrics:
    - type: map
      value: 49.71580650644033
    - type: mrr
      value: 50.50971903913081
  - task:
      type: Summarization
    dataset:
      name: MTEB SummEval
      type: mteb/summeval
      config: default
      split: test
      revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
    metrics:
    - type: cos_sim_pearson
      value: 29.152190498799484
    - type: cos_sim_spearman
      value: 29.686180371952727
    - type: dot_pearson
      value: 27.248664793816342
    - type: dot_spearman
      value: 28.37748983721745
  - task:
      type: Retrieval
    dataset:
      name: MTEB TRECCOVID
      type: trec-covid
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 0.20400000000000001
    - type: map_at_10
      value: 1.6209999999999998
    - type: map_at_100
      value: 9.690999999999999
    - type: map_at_1000
      value: 23.733
    - type: map_at_3
      value: 0.575
    - type: map_at_5
      value: 0.885
    - type: mrr_at_1
      value: 78
    - type: mrr_at_10
      value: 86.56700000000001
    - type: mrr_at_100
      value: 86.56700000000001
    - type: mrr_at_1000
      value: 86.56700000000001
    - type: mrr_at_3
      value: 85.667
    - type: mrr_at_5
      value: 86.56700000000001
    - type: ndcg_at_1
      value: 76
    - type: ndcg_at_10
      value: 71.326
    - type: ndcg_at_100
      value: 54.208999999999996
    - type: ndcg_at_1000
      value: 49.252
    - type: ndcg_at_3
      value: 74.235
    - type: ndcg_at_5
      value: 73.833
    - type: precision_at_1
      value: 78
    - type: precision_at_10
      value: 74.8
    - type: precision_at_100
      value: 55.50000000000001
    - type: precision_at_1000
      value: 21.836
    - type: precision_at_3
      value: 78
    - type: precision_at_5
      value: 78
    - type: recall_at_1
      value: 0.20400000000000001
    - type: recall_at_10
      value: 1.894
    - type: recall_at_100
      value: 13.245999999999999
    - type: recall_at_1000
      value: 46.373
    - type: recall_at_3
      value: 0.613
    - type: recall_at_5
      value: 0.991
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (sqi-eng)
      type: mteb/tatoeba-bitext-mining
      config: sqi-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 95.89999999999999
    - type: f1
      value: 94.69999999999999
    - type: precision
      value: 94.11666666666667
    - type: recall
      value: 95.89999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (fry-eng)
      type: mteb/tatoeba-bitext-mining
      config: fry-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 68.20809248554913
    - type: f1
      value: 63.431048720066066
    - type: precision
      value: 61.69143958161298
    - type: recall
      value: 68.20809248554913
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (kur-eng)
      type: mteb/tatoeba-bitext-mining
      config: kur-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 71.21951219512195
    - type: f1
      value: 66.82926829268293
    - type: precision
      value: 65.1260162601626
    - type: recall
      value: 71.21951219512195
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (tur-eng)
      type: mteb/tatoeba-bitext-mining
      config: tur-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 97.2
    - type: f1
      value: 96.26666666666667
    - type: precision
      value: 95.8
    - type: recall
      value: 97.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (deu-eng)
      type: mteb/tatoeba-bitext-mining
      config: deu-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 99.3
    - type: f1
      value: 99.06666666666666
    - type: precision
      value: 98.95
    - type: recall
      value: 99.3
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (nld-eng)
      type: mteb/tatoeba-bitext-mining
      config: nld-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 97.39999999999999
    - type: f1
      value: 96.63333333333333
    - type: precision
      value: 96.26666666666668
    - type: recall
      value: 97.39999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ron-eng)
      type: mteb/tatoeba-bitext-mining
      config: ron-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 96
    - type: f1
      value: 94.86666666666666
    - type: precision
      value: 94.31666666666668
    - type: recall
      value: 96
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ang-eng)
      type: mteb/tatoeba-bitext-mining
      config: ang-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 47.01492537313433
    - type: f1
      value: 40.178867566927266
    - type: precision
      value: 38.179295828549556
    - type: recall
      value: 47.01492537313433
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ido-eng)
      type: mteb/tatoeba-bitext-mining
      config: ido-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 86.5
    - type: f1
      value: 83.62537480063796
    - type: precision
      value: 82.44555555555554
    - type: recall
      value: 86.5
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (jav-eng)
      type: mteb/tatoeba-bitext-mining
      config: jav-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 80.48780487804879
    - type: f1
      value: 75.45644599303138
    - type: precision
      value: 73.37398373983739
    - type: recall
      value: 80.48780487804879
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (isl-eng)
      type: mteb/tatoeba-bitext-mining
      config: isl-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 93.7
    - type: f1
      value: 91.95666666666666
    - type: precision
      value: 91.125
    - type: recall
      value: 93.7
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (slv-eng)
      type: mteb/tatoeba-bitext-mining
      config: slv-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 91.73754556500607
    - type: f1
      value: 89.65168084244632
    - type: precision
      value: 88.73025516403402
    - type: recall
      value: 91.73754556500607
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (cym-eng)
      type: mteb/tatoeba-bitext-mining
      config: cym-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 81.04347826086956
    - type: f1
      value: 76.2128364389234
    - type: precision
      value: 74.2
    - type: recall
      value: 81.04347826086956
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (kaz-eng)
      type: mteb/tatoeba-bitext-mining
      config: kaz-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 83.65217391304348
    - type: f1
      value: 79.4376811594203
    - type: precision
      value: 77.65797101449274
    - type: recall
      value: 83.65217391304348
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (est-eng)
      type: mteb/tatoeba-bitext-mining
      config: est-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 87.5
    - type: f1
      value: 85.02690476190476
    - type: precision
      value: 83.96261904761904
    - type: recall
      value: 87.5
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (heb-eng)
      type: mteb/tatoeba-bitext-mining
      config: heb-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 89.3
    - type: f1
      value: 86.52333333333333
    - type: precision
      value: 85.22833333333332
    - type: recall
      value: 89.3
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (gla-eng)
      type: mteb/tatoeba-bitext-mining
      config: gla-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 65.01809408926418
    - type: f1
      value: 59.00594446432805
    - type: precision
      value: 56.827215807915444
    - type: recall
      value: 65.01809408926418
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (mar-eng)
      type: mteb/tatoeba-bitext-mining
      config: mar-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 91.2
    - type: f1
      value: 88.58
    - type: precision
      value: 87.33333333333334
    - type: recall
      value: 91.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (lat-eng)
      type: mteb/tatoeba-bitext-mining
      config: lat-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 59.199999999999996
    - type: f1
      value: 53.299166276284915
    - type: precision
      value: 51.3383908045977
    - type: recall
      value: 59.199999999999996
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (bel-eng)
      type: mteb/tatoeba-bitext-mining
      config: bel-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 93.2
    - type: f1
      value: 91.2
    - type: precision
      value: 90.25
    - type: recall
      value: 93.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (pms-eng)
      type: mteb/tatoeba-bitext-mining
      config: pms-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 64.76190476190476
    - type: f1
      value: 59.867110667110666
    - type: precision
      value: 58.07390192653351
    - type: recall
      value: 64.76190476190476
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (gle-eng)
      type: mteb/tatoeba-bitext-mining
      config: gle-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 76.2
    - type: f1
      value: 71.48147546897547
    - type: precision
      value: 69.65409090909091
    - type: recall
      value: 76.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (pes-eng)
      type: mteb/tatoeba-bitext-mining
      config: pes-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 93.8
    - type: f1
      value: 92.14
    - type: precision
      value: 91.35833333333333
    - type: recall
      value: 93.8
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (nob-eng)
      type: mteb/tatoeba-bitext-mining
      config: nob-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 97.89999999999999
    - type: f1
      value: 97.2
    - type: precision
      value: 96.85000000000001
    - type: recall
      value: 97.89999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (bul-eng)
      type: mteb/tatoeba-bitext-mining
      config: bul-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 94.6
    - type: f1
      value: 92.93333333333334
    - type: precision
      value: 92.13333333333333
    - type: recall
      value: 94.6
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (cbk-eng)
      type: mteb/tatoeba-bitext-mining
      config: cbk-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 74.1
    - type: f1
      value: 69.14817460317461
    - type: precision
      value: 67.2515873015873
    - type: recall
      value: 74.1
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (hun-eng)
      type: mteb/tatoeba-bitext-mining
      config: hun-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 95.19999999999999
    - type: f1
      value: 94.01333333333335
    - type: precision
      value: 93.46666666666667
    - type: recall
      value: 95.19999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (uig-eng)
      type: mteb/tatoeba-bitext-mining
      config: uig-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 76.9
    - type: f1
      value: 72.07523809523809
    - type: precision
      value: 70.19777777777779
    - type: recall
      value: 76.9
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (rus-eng)
      type: mteb/tatoeba-bitext-mining
      config: rus-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 94.1
    - type: f1
      value: 92.31666666666666
    - type: precision
      value: 91.43333333333332
    - type: recall
      value: 94.1
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (spa-eng)
      type: mteb/tatoeba-bitext-mining
      config: spa-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 97.8
    - type: f1
      value: 97.1
    - type: precision
      value: 96.76666666666668
    - type: recall
      value: 97.8
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (hye-eng)
      type: mteb/tatoeba-bitext-mining
      config: hye-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 92.85714285714286
    - type: f1
      value: 90.92093441150045
    - type: precision
      value: 90.00449236298293
    - type: recall
      value: 92.85714285714286
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (tel-eng)
      type: mteb/tatoeba-bitext-mining
      config: tel-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 93.16239316239316
    - type: f1
      value: 91.33903133903132
    - type: precision
      value: 90.56267806267806
    - type: recall
      value: 93.16239316239316
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (afr-eng)
      type: mteb/tatoeba-bitext-mining
      config: afr-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 92.4
    - type: f1
      value: 90.25666666666666
    - type: precision
      value: 89.25833333333334
    - type: recall
      value: 92.4
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (mon-eng)
      type: mteb/tatoeba-bitext-mining
      config: mon-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 90.22727272727272
    - type: f1
      value: 87.53030303030303
    - type: precision
      value: 86.37121212121211
    - type: recall
      value: 90.22727272727272
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (arz-eng)
      type: mteb/tatoeba-bitext-mining
      config: arz-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 79.03563941299791
    - type: f1
      value: 74.7349505840072
    - type: precision
      value: 72.9035639412998
    - type: recall
      value: 79.03563941299791
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (hrv-eng)
      type: mteb/tatoeba-bitext-mining
      config: hrv-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 97
    - type: f1
      value: 96.15
    - type: precision
      value: 95.76666666666668
    - type: recall
      value: 97
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (nov-eng)
      type: mteb/tatoeba-bitext-mining
      config: nov-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 76.26459143968872
    - type: f1
      value: 71.55642023346303
    - type: precision
      value: 69.7544932369835
    - type: recall
      value: 76.26459143968872
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (gsw-eng)
      type: mteb/tatoeba-bitext-mining
      config: gsw-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 58.119658119658126
    - type: f1
      value: 51.65242165242165
    - type: precision
      value: 49.41768108434775
    - type: recall
      value: 58.119658119658126
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (nds-eng)
      type: mteb/tatoeba-bitext-mining
      config: nds-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 74.3
    - type: f1
      value: 69.52055555555555
    - type: precision
      value: 67.7574938949939
    - type: recall
      value: 74.3
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ukr-eng)
      type: mteb/tatoeba-bitext-mining
      config: ukr-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 94.8
    - type: f1
      value: 93.31666666666666
    - type: precision
      value: 92.60000000000001
    - type: recall
      value: 94.8
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (uzb-eng)
      type: mteb/tatoeba-bitext-mining
      config: uzb-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 76.63551401869158
    - type: f1
      value: 72.35202492211837
    - type: precision
      value: 70.60358255451713
    - type: recall
      value: 76.63551401869158
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (lit-eng)
      type: mteb/tatoeba-bitext-mining
      config: lit-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 90.4
    - type: f1
      value: 88.4811111111111
    - type: precision
      value: 87.7452380952381
    - type: recall
      value: 90.4
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ina-eng)
      type: mteb/tatoeba-bitext-mining
      config: ina-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 95
    - type: f1
      value: 93.60666666666667
    - type: precision
      value: 92.975
    - type: recall
      value: 95
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (lfn-eng)
      type: mteb/tatoeba-bitext-mining
      config: lfn-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 67.2
    - type: f1
      value: 63.01595782872099
    - type: precision
      value: 61.596587301587306
    - type: recall
      value: 67.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (zsm-eng)
      type: mteb/tatoeba-bitext-mining
      config: zsm-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 95.7
    - type: f1
      value: 94.52999999999999
    - type: precision
      value: 94
    - type: recall
      value: 95.7
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ita-eng)
      type: mteb/tatoeba-bitext-mining
      config: ita-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 94.6
    - type: f1
      value: 93.28999999999999
    - type: precision
      value: 92.675
    - type: recall
      value: 94.6
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (cmn-eng)
      type: mteb/tatoeba-bitext-mining
      config: cmn-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 96.39999999999999
    - type: f1
      value: 95.28333333333333
    - type: precision
      value: 94.75
    - type: recall
      value: 96.39999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (lvs-eng)
      type: mteb/tatoeba-bitext-mining
      config: lvs-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 91.9
    - type: f1
      value: 89.83
    - type: precision
      value: 88.92
    - type: recall
      value: 91.9
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (glg-eng)
      type: mteb/tatoeba-bitext-mining
      config: glg-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 94.69999999999999
    - type: f1
      value: 93.34222222222223
    - type: precision
      value: 92.75416666666668
    - type: recall
      value: 94.69999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ceb-eng)
      type: mteb/tatoeba-bitext-mining
      config: ceb-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 60.333333333333336
    - type: f1
      value: 55.31203703703703
    - type: precision
      value: 53.39971108326371
    - type: recall
      value: 60.333333333333336
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (bre-eng)
      type: mteb/tatoeba-bitext-mining
      config: bre-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 12.9
    - type: f1
      value: 11.099861903031458
    - type: precision
      value: 10.589187932631877
    - type: recall
      value: 12.9
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (ben-eng)
      type: mteb/tatoeba-bitext-mining
      config: ben-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 86.7
    - type: f1
      value: 83.0152380952381
    - type: precision
      value: 81.37833333333333
    - type: recall
      value: 86.7
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (swg-eng)
      type: mteb/tatoeba-bitext-mining
      config: swg-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 63.39285714285714
    - type: f1
      value: 56.832482993197274
    - type: precision
      value: 54.56845238095237
    - type: recall
      value: 63.39285714285714
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (arq-eng)
      type: mteb/tatoeba-bitext-mining
      config: arq-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 48.73765093304062
    - type: f1
      value: 41.555736920720456
    - type: precision
      value: 39.06874531737319
    - type: recall
      value: 48.73765093304062
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (kab-eng)
      type: mteb/tatoeba-bitext-mining
      config: kab-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 41.099999999999994
    - type: f1
      value: 36.540165945165946
    - type: precision
      value: 35.05175685425686
    - type: recall
      value: 41.099999999999994
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (fra-eng)
      type: mteb/tatoeba-bitext-mining
      config: fra-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 94.89999999999999
    - type: f1
      value: 93.42333333333333
    - type: precision
      value: 92.75833333333333
    - type: recall
      value: 94.89999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (por-eng)
      type: mteb/tatoeba-bitext-mining
      config: por-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 94.89999999999999
    - type: f1
      value: 93.63333333333334
    - type: precision
      value: 93.01666666666665
    - type: recall
      value: 94.89999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (tat-eng)
      type: mteb/tatoeba-bitext-mining
      config: tat-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 77.9
    - type: f1
      value: 73.64833333333334
    - type: precision
      value: 71.90282106782105
    - type: recall
      value: 77.9
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (oci-eng)
      type: mteb/tatoeba-bitext-mining
      config: oci-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 59.4
    - type: f1
      value: 54.90521367521367
    - type: precision
      value: 53.432840025471606
    - type: recall
      value: 59.4
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (pol-eng)
      type: mteb/tatoeba-bitext-mining
      config: pol-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 97.39999999999999
    - type: f1
      value: 96.6
    - type: precision
      value: 96.2
    - type: recall
      value: 97.39999999999999
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (war-eng)
      type: mteb/tatoeba-bitext-mining
      config: war-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 67.2
    - type: f1
      value: 62.25926129426129
    - type: precision
      value: 60.408376623376626
    - type: recall
      value: 67.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (aze-eng)
      type: mteb/tatoeba-bitext-mining
      config: aze-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 90.2
    - type: f1
      value: 87.60666666666667
    - type: precision
      value: 86.45277777777778
    - type: recall
      value: 90.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (vie-eng)
      type: mteb/tatoeba-bitext-mining
      config: vie-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 97.7
    - type: f1
      value: 97
    - type: precision
      value: 96.65
    - type: recall
      value: 97.7
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (nno-eng)
      type: mteb/tatoeba-bitext-mining
      config: nno-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 93.2
    - type: f1
      value: 91.39746031746031
    - type: precision
      value: 90.6125
    - type: recall
      value: 93.2
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (cha-eng)
      type: mteb/tatoeba-bitext-mining
      config: cha-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 32.11678832116788
    - type: f1
      value: 27.210415386260234
    - type: precision
      value: 26.20408990846947
    - type: recall
      value: 32.11678832116788
  - task:
      type: BitextMining
    dataset:
      name: MTEB Tatoeba (mhr-eng)
      type: mteb/tatoeba-bitext-mining
      config: mhr-eng
      split: test
      revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
    metrics:
    - type: accuracy
      value: 8.5
    - type: f1
      value: 6.787319277832475
    - type: precision
      value: 6.3452094433344435
    - type: recall
      value: 8.5
  - task:
      type: BitextMining
    dataset:
name: MTEB Tatoeba (dan-eng) type: mteb/tatoeba-bitext-mining config: dan-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.1 - type: f1 value: 95.08 - type: precision value: 94.61666666666667 - type: recall value: 96.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (ell-eng) type: mteb/tatoeba-bitext-mining config: ell-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.3 - type: f1 value: 93.88333333333333 - type: precision value: 93.18333333333332 - type: recall value: 95.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (amh-eng) type: mteb/tatoeba-bitext-mining config: amh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.11904761904762 - type: f1 value: 80.69444444444444 - type: precision value: 78.72023809523809 - type: recall value: 85.11904761904762 - task: type: BitextMining dataset: name: MTEB Tatoeba (pam-eng) type: mteb/tatoeba-bitext-mining config: pam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 11.1 - type: f1 value: 9.276381801735853 - type: precision value: 8.798174603174601 - type: recall value: 11.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (hsb-eng) type: mteb/tatoeba-bitext-mining config: hsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 63.56107660455487 - type: f1 value: 58.70433569191332 - type: precision value: 56.896926581464015 - type: recall value: 63.56107660455487 - task: type: BitextMining dataset: name: MTEB Tatoeba (srp-eng) type: mteb/tatoeba-bitext-mining config: srp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.69999999999999 - type: f1 value: 93.10000000000001 - type: precision value: 92.35 - type: recall value: 94.69999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (epo-eng) type: mteb/tatoeba-bitext-mining config: epo-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.8 - type: f1 value: 96.01222222222222 - type: precision value: 95.67083333333332 - type: recall value: 96.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (kzj-eng) type: mteb/tatoeba-bitext-mining config: kzj-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 9.2 - type: f1 value: 7.911555250305249 - type: precision value: 7.631246556216846 - type: recall value: 9.2 - task: type: BitextMining dataset: name: MTEB Tatoeba (awa-eng) type: mteb/tatoeba-bitext-mining config: awa-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.48917748917748 - type: f1 value: 72.27375798804371 - type: precision value: 70.14430014430013 - type: recall value: 77.48917748917748 - task: type: BitextMining dataset: name: MTEB Tatoeba (fao-eng) type: mteb/tatoeba-bitext-mining config: fao-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.09923664122137 - type: f1 value: 72.61541257724463 - type: precision value: 70.8998380754106 - type: recall value: 77.09923664122137 - task: type: BitextMining dataset: name: MTEB Tatoeba (mal-eng) type: mteb/tatoeba-bitext-mining config: mal-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 98.2532751091703 - type: f1 value: 97.69529354682193 - type: 
precision value: 97.42843279961184 - type: recall value: 98.2532751091703 - task: type: BitextMining dataset: name: MTEB Tatoeba (ile-eng) type: mteb/tatoeba-bitext-mining config: ile-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 82.8 - type: f1 value: 79.14672619047619 - type: precision value: 77.59489247311828 - type: recall value: 82.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (bos-eng) type: mteb/tatoeba-bitext-mining config: bos-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.35028248587571 - type: f1 value: 92.86252354048965 - type: precision value: 92.2080979284369 - type: recall value: 94.35028248587571 - task: type: BitextMining dataset: name: MTEB Tatoeba (cor-eng) type: mteb/tatoeba-bitext-mining config: cor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.5 - type: f1 value: 6.282429263935621 - type: precision value: 5.783274240739785 - type: recall value: 8.5 - task: type: BitextMining dataset: name: MTEB Tatoeba (cat-eng) type: mteb/tatoeba-bitext-mining config: cat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.7 - type: f1 value: 91.025 - type: precision value: 90.30428571428571 - type: recall value: 92.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (eus-eng) type: mteb/tatoeba-bitext-mining config: eus-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 81 - type: f1 value: 77.8232380952381 - type: precision value: 76.60194444444444 - type: recall value: 81 - task: type: BitextMining dataset: name: MTEB Tatoeba (yue-eng) type: mteb/tatoeba-bitext-mining config: yue-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91 - type: f1 value: 88.70857142857142 - type: precision value: 87.7 - type: recall value: 91 - task: type: BitextMining dataset: name: MTEB Tatoeba (swe-eng) type: mteb/tatoeba-bitext-mining config: swe-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.3 - type: precision value: 94.76666666666667 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (dtp-eng) type: mteb/tatoeba-bitext-mining config: dtp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.1 - type: f1 value: 7.001008218834307 - type: precision value: 6.708329562594269 - type: recall value: 8.1 - task: type: BitextMining dataset: name: MTEB Tatoeba (kat-eng) type: mteb/tatoeba-bitext-mining config: kat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.1313672922252 - type: f1 value: 84.09070598748882 - type: precision value: 82.79171454104429 - type: recall value: 87.1313672922252 - task: type: BitextMining dataset: name: MTEB Tatoeba (jpn-eng) type: mteb/tatoeba-bitext-mining config: jpn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.28333333333333 - type: precision value: 94.73333333333332 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (csb-eng) type: mteb/tatoeba-bitext-mining config: csb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 
42.29249011857708 - type: f1 value: 36.981018542283365 - type: precision value: 35.415877813576024 - type: recall value: 42.29249011857708 - task: type: BitextMining dataset: name: MTEB Tatoeba (xho-eng) type: mteb/tatoeba-bitext-mining config: xho-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.80281690140845 - type: f1 value: 80.86854460093896 - type: precision value: 79.60093896713614 - type: recall value: 83.80281690140845 - task: type: BitextMining dataset: name: MTEB Tatoeba (orv-eng) type: mteb/tatoeba-bitext-mining config: orv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 45.26946107784431 - type: f1 value: 39.80235464678088 - type: precision value: 38.14342660001342 - type: recall value: 45.26946107784431 - task: type: BitextMining dataset: name: MTEB Tatoeba (ind-eng) type: mteb/tatoeba-bitext-mining config: ind-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.3 - type: f1 value: 92.9 - type: precision value: 92.26666666666668 - type: recall value: 94.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (tuk-eng) type: mteb/tatoeba-bitext-mining config: tuk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 37.93103448275862 - type: f1 value: 33.15192743764172 - type: precision value: 31.57456528146183 - type: recall value: 37.93103448275862 - task: type: BitextMining dataset: name: MTEB Tatoeba (max-eng) type: mteb/tatoeba-bitext-mining config: max-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 69.01408450704226 - type: f1 value: 63.41549295774648 - type: precision value: 61.342778895595806 - type: recall value: 69.01408450704226 - task: type: BitextMining dataset: name: MTEB Tatoeba (swh-eng) type: mteb/tatoeba-bitext-mining config: swh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.66666666666667 - type: f1 value: 71.60705960705961 - type: precision value: 69.60683760683762 - type: recall value: 76.66666666666667 - task: type: BitextMining dataset: name: MTEB Tatoeba (hin-eng) type: mteb/tatoeba-bitext-mining config: hin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.8 - type: f1 value: 94.48333333333333 - type: precision value: 93.83333333333333 - type: recall value: 95.8 - task: type: BitextMining dataset: name: MTEB Tatoeba (dsb-eng) type: mteb/tatoeba-bitext-mining config: dsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 52.81837160751566 - type: f1 value: 48.435977731384824 - type: precision value: 47.11291973845539 - type: recall value: 52.81837160751566 - task: type: BitextMining dataset: name: MTEB Tatoeba (ber-eng) type: mteb/tatoeba-bitext-mining config: ber-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 44.9 - type: f1 value: 38.88962621607783 - type: precision value: 36.95936507936508 - type: recall value: 44.9 - task: type: BitextMining dataset: name: MTEB Tatoeba (tam-eng) type: mteb/tatoeba-bitext-mining config: tam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.55374592833876 - type: f1 value: 88.22553125484721 - type: precision value: 87.26927252985884 - type: recall value: 90.55374592833876 - task: type: BitextMining 
dataset: name: MTEB Tatoeba (slk-eng) type: mteb/tatoeba-bitext-mining config: slk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 93.13333333333333 - type: precision value: 92.45333333333333 - type: recall value: 94.6 - task: type: BitextMining dataset: name: MTEB Tatoeba (tgl-eng) type: mteb/tatoeba-bitext-mining config: tgl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.7 - type: f1 value: 91.99666666666667 - type: precision value: 91.26666666666668 - type: recall value: 93.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (ast-eng) type: mteb/tatoeba-bitext-mining config: ast-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.03937007874016 - type: f1 value: 81.75853018372703 - type: precision value: 80.34120734908137 - type: recall value: 85.03937007874016 - task: type: BitextMining dataset: name: MTEB Tatoeba (mkd-eng) type: mteb/tatoeba-bitext-mining config: mkd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.3 - type: f1 value: 85.5 - type: precision value: 84.25833333333334 - type: recall value: 88.3 - task: type: BitextMining dataset: name: MTEB Tatoeba (khm-eng) type: mteb/tatoeba-bitext-mining config: khm-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.51246537396122 - type: f1 value: 60.02297410192148 - type: precision value: 58.133467727289236 - type: recall value: 65.51246537396122 - task: type: BitextMining dataset: name: MTEB Tatoeba (ces-eng) type: mteb/tatoeba-bitext-mining config: ces-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96 - type: f1 value: 94.89 - type: precision value: 94.39166666666667 - type: recall value: 96 - task: type: BitextMining dataset: name: MTEB Tatoeba (tzl-eng) type: mteb/tatoeba-bitext-mining config: tzl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 57.692307692307686 - type: f1 value: 53.162393162393165 - type: precision value: 51.70673076923077 - type: recall value: 57.692307692307686 - task: type: BitextMining dataset: name: MTEB Tatoeba (urd-eng) type: mteb/tatoeba-bitext-mining config: urd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.60000000000001 - type: f1 value: 89.21190476190475 - type: precision value: 88.08666666666667 - type: recall value: 91.60000000000001 - task: type: BitextMining dataset: name: MTEB Tatoeba (ara-eng) type: mteb/tatoeba-bitext-mining config: ara-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88 - type: f1 value: 85.47 - type: precision value: 84.43266233766234 - type: recall value: 88 - task: type: BitextMining dataset: name: MTEB Tatoeba (kor-eng) type: mteb/tatoeba-bitext-mining config: kor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.7 - type: f1 value: 90.64999999999999 - type: precision value: 89.68333333333332 - type: recall value: 92.7 - task: type: BitextMining dataset: name: MTEB Tatoeba (yid-eng) type: mteb/tatoeba-bitext-mining config: yid-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 80.30660377358491 - type: f1 value: 76.33044137466307 - type: precision value: 74.78970125786164 - 
type: recall value: 80.30660377358491 - task: type: BitextMining dataset: name: MTEB Tatoeba (fin-eng) type: mteb/tatoeba-bitext-mining config: fin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.44 - type: precision value: 94.99166666666666 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: name: MTEB Tatoeba (tha-eng) type: mteb/tatoeba-bitext-mining config: tha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.53284671532847 - type: f1 value: 95.37712895377129 - type: precision value: 94.7992700729927 - type: recall value: 96.53284671532847 - task: type: BitextMining dataset: name: MTEB Tatoeba (wuu-eng) type: mteb/tatoeba-bitext-mining config: wuu-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 89 - type: f1 value: 86.23190476190476 - type: precision value: 85.035 - type: recall value: 89 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.585 - type: map_at_10 value: 9.012 - type: map_at_100 value: 14.027000000000001 - type: map_at_1000 value: 15.565000000000001 - type: map_at_3 value: 5.032 - type: map_at_5 value: 6.657 - type: mrr_at_1 value: 28.571 - type: mrr_at_10 value: 45.377 - type: mrr_at_100 value: 46.119 - type: mrr_at_1000 value: 46.127 - type: mrr_at_3 value: 41.156 - type: mrr_at_5 value: 42.585 - type: ndcg_at_1 value: 27.551 - type: ndcg_at_10 value: 23.395 - type: ndcg_at_100 value: 33.342 - type: ndcg_at_1000 value: 45.523 - type: ndcg_at_3 value: 25.158 - type: ndcg_at_5 value: 23.427 - type: precision_at_1 value: 28.571 - type: precision_at_10 value: 21.429000000000002 - type: precision_at_100 value: 6.714 - type: precision_at_1000 value: 1.473 - type: precision_at_3 value: 27.211000000000002 - type: precision_at_5 value: 24.490000000000002 - type: recall_at_1 value: 2.585 - type: recall_at_10 value: 15.418999999999999 - type: recall_at_100 value: 42.485 - type: recall_at_1000 value: 79.536 - type: recall_at_3 value: 6.239999999999999 - type: recall_at_5 value: 8.996 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.3234 - type: ap value: 14.361688653847423 - type: f1 value: 54.819068624319044 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 61.97792869269949 - type: f1 value: 62.28965628513728 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 38.90540145385218 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.53513739047506 - type: cos_sim_ap value: 75.27741586677557 - type: cos_sim_f1 value: 69.18792902473774 - type: cos_sim_precision value: 67.94708725515136 - type: cos_sim_recall value: 70.47493403693932 - type: 
dot_accuracy value: 84.7052512368123 - type: dot_ap value: 69.36075482849378 - type: dot_f1 value: 64.44688376631296 - type: dot_precision value: 59.92288500793831 - type: dot_recall value: 69.70976253298153 - type: euclidean_accuracy value: 86.60666388508076 - type: euclidean_ap value: 75.47512772621097 - type: euclidean_f1 value: 69.413872536473 - type: euclidean_precision value: 67.39562624254472 - type: euclidean_recall value: 71.55672823218997 - type: manhattan_accuracy value: 86.52917684925792 - type: manhattan_ap value: 75.34000110496703 - type: manhattan_f1 value: 69.28489190226429 - type: manhattan_precision value: 67.24608889992551 - type: manhattan_recall value: 71.45118733509234 - type: max_accuracy value: 86.60666388508076 - type: max_ap value: 75.47512772621097 - type: max_f1 value: 69.413872536473 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.01695967710637 - type: cos_sim_ap value: 85.8298270742901 - type: cos_sim_f1 value: 78.46988128389272 - type: cos_sim_precision value: 74.86017897091722 - type: cos_sim_recall value: 82.44533415460425 - type: dot_accuracy value: 88.19420188613343 - type: dot_ap value: 83.82679165901324 - type: dot_f1 value: 76.55833777304208 - type: dot_precision value: 75.6884875846501 - type: dot_recall value: 77.44841392054204 - type: euclidean_accuracy value: 89.03054294252338 - type: euclidean_ap value: 85.89089555185325 - type: euclidean_f1 value: 78.62997658079624 - type: euclidean_precision value: 74.92329149232914 - type: euclidean_recall value: 82.72251308900523 - type: manhattan_accuracy value: 89.0266620095471 - type: manhattan_ap value: 85.86458997929147 - type: manhattan_f1 value: 78.50685331000291 - type: manhattan_precision value: 74.5499861534201 - type: manhattan_recall value: 82.90729904527257 - type: max_accuracy value: 89.03054294252338 - type: max_ap value: 85.89089555185325 - type: max_f1 value: 78.62997658079624 ---

# CyberOptic/multilingual-e5-large-Q8_0-GGUF

This model was converted to GGUF format from [`intfloat/multilingual-e5-large`](https://huggingface.co/intfloat/multilingual-e5-large) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/intfloat/multilingual-e5-large) for more details on the model.

## Use with llama.cpp

Install llama.cpp through brew (works on Mac and Linux):

```bash
brew install llama.cpp
```

Invoke the llama.cpp server or the CLI.

### CLI:

```bash
llama-cli --hf-repo CyberOptic/multilingual-e5-large-Q8_0-GGUF --hf-file multilingual-e5-large-q8_0.gguf -p "The meaning to life and the universe is"
```

### Server:

```bash
llama-server --hf-repo CyberOptic/multilingual-e5-large-Q8_0-GGUF --hf-file multilingual-e5-large-q8_0.gguf -c 2048
```

Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.

Step 1: Clone llama.cpp from GitHub.

```
git clone https://github.com/ggerganov/llama.cpp
```

Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).

```
cd llama.cpp && LLAMA_CURL=1 make
```

Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo CyberOptic/multilingual-e5-large-Q8_0-GGUF --hf-file multilingual-e5-large-q8_0.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo CyberOptic/multilingual-e5-large-Q8_0-GGUF --hf-file multilingual-e5-large-q8_0.gguf -c 2048
```
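Since `multilingual-e5-large` is an embedding model rather than a chat model, the server route is usually the practical one. Below is a hedged sketch, not part of the original card, of fetching embeddings from a running `llama-server`; it assumes your llama.cpp build supports the `--embeddings` flag (and possibly `--pooling mean`, since E5 uses average pooling) and exposes the OpenAI-compatible `/v1/embeddings` endpoint, so verify against `llama-server --help`. Note that the upstream E5 model card requires every input to be prefixed with `query: ` or `passage: `.

```python
# Hedged sketch: embeddings via a local llama-server instance.
# Assumed server launch (verify these flags against your build):
#   llama-server --hf-repo CyberOptic/multilingual-e5-large-Q8_0-GGUF \
#       --hf-file multilingual-e5-large-q8_0.gguf --embeddings --pooling mean
import requests

# E5 models expect a task prefix on every input: "query: " for search
# queries and "passage: " for documents (per the original model card).
texts = [
    "query: how much protein should a female eat",
    "passage: Protein needs vary with body weight and activity level.",
]

resp = requests.post(
    "http://localhost:8080/v1/embeddings",  # llama-server's default port
    json={"input": texts, "model": "multilingual-e5-large-q8_0"},
)
resp.raise_for_status()
vectors = [item["embedding"] for item in resp.json()["data"]]
print(f"{len(vectors)} embeddings of dimension {len(vectors[0])}")
```

Cosine similarity between such vectors is the intended scoring function for E5-style retrieval.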
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
LoneStriker/OpenBioLLM-Llama3-8B-5.0bpw-h6-exl2
LoneStriker
text-generation
[ "transformers", "pytorch", "llama", "text-generation", "llama-3", "Mixtral", "instruct", "finetune", "chatml", "DPO", "RLHF", "gpt4", "distillation", "en", "arxiv:2305.18290", "arxiv:2303.13375", "arxiv:2212.13138", "arxiv:2305.09617", "arxiv:2402.07023", "base_model:meta-llama/Meta-Llama-3-8B", "base_model:quantized:meta-llama/Meta-Llama-3-8B", "license:llama3", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "5-bit", "exl2", "region:us" ]
1,714
1,714
8
0
--- base_model: meta-llama/Meta-Llama-3-8B language: - en license: llama3 tags: - llama-3 - llama - Mixtral - instruct - finetune - chatml - DPO - RLHF - gpt4 - distillation widget: - example_title: OpenBioLLM-8B messages: - role: system content: You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience. - role: user content: How long does it take for newborn jaundice to go away? output: text: 'Newborn jaundice, also known as neonatal jaundice, is a common condition in newborns where the yellowing of the skin and eyes occurs due to an elevated level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when red blood cells break down. In most cases, newborn jaundice resolves on its own without any specific treatment. The duration of newborn jaundice can vary depending on several factors such as the underlying cause, gestational age at birth, and individual variations in bilirubin metabolism. Here are some general guidelines: 1. Physiological jaundice: This is the most common type of newborn jaundice and usually appears within 24-72 hours after birth. It tends to peak between the second and fifth day of life and gradually improves over the next week or two. By the time the baby is one week old, the jaundice should have mostly resolved. 2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and may appear later than physiological jaundice, typically between the fifth and fourteenth day of life. It tends to persist for a longer duration but usually resolves within six weeks after birth. 3. Pathological jaundice: This type of jaundice is less common and occurs due to an underlying medical condition that affects bilirubin metabolism or liver function. The duration of pathological jaundice depends on the specific cause and may require treatment. It''s important for parents to monitor their newborn''s jaundice closely and seek medical advice if the jaundice progresses rapidly, becomes severe, or is accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness. In these cases, further evaluation and management may be necessary. Remember that each baby is unique, and the timing of jaundice resolution can vary. If you have concerns about your newborn''s jaundice, it''s always best to consult with a healthcare professional for personalized advice and guidance.' 
model-index: - name: OpenBioLLM-8B results: [] ---

<div align="center">
<img width="260px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/BrQCb95lmEIFz79QAmoNA.png"></div>

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/2FhDh8NDvMl7iSxbQz9BP.png)

<div align="center">
<h1>Advancing Open-source Large Language Models in the Medical Domain</h1>
</div>

<p align="center" style="margin-top: 0px;">
  <a href="https://colab.research.google.com/drive/1F5oV20InEYeAJGmBwYF9NM_QhLmjBkKJ?usp=sharing">
    <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 10px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text" style="margin-right: 5px;">Online Demo</span>
  </a> |
  <a href="https://github.com/openlifescience-ai">
    <img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" alt="GitHub Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text" style="margin-right: 5px;">GitHub</span>
  </a> |
  <a href="#">
    <img src="https://github.com/alpayariyak/openchat/blob/master/assets/arxiv-logomark-small-square-border.png?raw=true" alt="ArXiv Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text" style="margin-right: 5px;">Paper</span>
  </a> |
  <a href="https://discord.gg/A5Fjf5zC69">
    <img src="https://cloud.githubusercontent.com/assets/6291467/26705903/96c2d66e-477c-11e7-9f4e-f3c0efe96c9a.png" alt="Discord Logo" style="width:20px; vertical-align: middle; display: inline-block; margin-right: 5px; margin-left: 5px; margin-top: 0px; margin-bottom: 0px;"/>
    <span class="link-text">Discord</span>
  </a>
</p>

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/KGmRE5w2sepNtwsEu8t7K.jpeg)

Introducing OpenBioLLM-8B: A State-of-the-Art Open Source Biomedical Large Language Model

OpenBioLLM-8B is an advanced open source language model designed specifically for the biomedical domain. Developed by Saama AI Labs, this model leverages cutting-edge techniques to achieve state-of-the-art performance on a wide range of biomedical tasks.

🏥 **Biomedical Specialization**: OpenBioLLM-8B is tailored for the unique language and knowledge requirements of the medical and life sciences fields. It was fine-tuned on a vast corpus of high-quality biomedical data, enabling it to understand and generate text with domain-specific accuracy and fluency.

🎓 **Superior Performance**: With 8 billion parameters, OpenBioLLM-8B outperforms other open source biomedical language models of similar scale. It has also demonstrated better results than larger proprietary & open-source models like GPT-3.5 and Meditron-70B on biomedical benchmarks.

🧠 **Advanced Training Techniques**: OpenBioLLM-8B builds upon the powerful foundation of the [Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) base model. It incorporates the DPO dataset and fine-tuning recipe along with a custom diverse medical instruction dataset.
Key components of the training pipeline include:

<div align="center">
<img width="1200px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/oPchsJsEpQoGcGXVbh7YS.png">
</div>

- **Policy Optimization**: [Direct Preference Optimization: Your Language Model is Secretly a Reward Model (DPO)](https://arxiv.org/abs/2305.18290)
- **Ranking Dataset**: [berkeley-nest/Nectar](https://huggingface.co/datasets/berkeley-nest/Nectar)
- **Fine-tuning dataset**: Custom Medical Instruct dataset (we plan to release a sample training dataset in our upcoming paper; please stay updated)

This combination of cutting-edge techniques enables OpenBioLLM-8B to align with key capabilities and preferences for biomedical applications.

⚙️ **Release Details**:

- **Model Size**: 8 billion parameters
- **Quantization**: Optimized quantized versions available [here](https://huggingface.co/aaditya/OpenBioLLM-8B-GGUF)
- **Language(s) (NLP):** en
- **Developed By**: [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) from Saama AI Labs
- **License:** Meta-Llama License
- **Fine-tuned from model:** [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)
- **Resources for more information:**
    - Paper: Coming soon

The model can be fine-tuned for more specialized tasks and datasets as needed.

OpenBioLLM-8B represents an important step forward in democratizing advanced language AI for the biomedical community. By leveraging state-of-the-art architectures and training techniques from leading open source efforts like Llama-3, we have created a powerful tool to accelerate innovation and discovery in healthcare and the life sciences.

We are excited to share OpenBioLLM-8B with researchers and developers around the world.

### Use with transformers

**Important: Please use the exact chat template provided by the Llama-3 instruct version; otherwise there will be a degradation in performance. The model output can be verbose in rare cases; consider setting temperature = 0 (greedy decoding) to reduce this.**

See the snippet below for usage with Transformers:

```python
import transformers
import torch

model_id = "aaditya/OpenBioLLM-Llama3-8B"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",  # fixed from `device="auto"`, which transformers does not accept
)

messages = [
    {"role": "system", "content": "You are an expert and experienced professional from the healthcare and biomedical domain with extensive medical knowledge and practical experience. Your name is OpenBioLLM, and you were developed by Saama AI Labs, and you are willing to help answer the user's query with explanation. In your explanation, leverage your deep medical expertise such as relevant anatomical structures, physiological processes, diagnostic criteria, treatment guidelines, or other pertinent medical concepts. Use precise medical terminology while still aiming to make the explanation clear and accessible to a general audience."},
    {"role": "user", "content": "How can I split a 3mg or 4mg warfarin pill so I can get a 2.5mg pill?"},
]

prompt = pipeline.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)

terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    eos_token_id=terminators,
    do_sample=False,  # greedy decoding, the "temperature = 0" setting recommended above
)
print(outputs[0]["generated_text"][len(prompt):])
```

## **Training procedure**

### **Training hyperparameters**

<details>
<summary>Click to see details</summary>

- learning_rate: 0.0002
- lr_scheduler: cosine
- train_batch_size: 12
- eval_batch_size: 8
- GPU: H100 80GB SXM5
- num_devices: 1
- optimizer: adamw_bnb_8bit
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
</details>

### **Peft hyperparameters**

<details>
<summary>Click to see details</summary>

- adapter: qlora
- lora_r: 128
- lora_alpha: 256
- lora_dropout: 0.05
- lora_target_linear: true
- lora_target_modules:
  - q_proj
  - v_proj
  - k_proj
  - o_proj
  - gate_proj
  - down_proj
  - up_proj
</details>

### **Training results**

### **Framework versions**

- Transformers 4.39.3
- Pytorch 2.1.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1
- Axolotl
- LM Evaluation Harness (for evaluation)

# Benchmark Results

🔥 OpenBioLLM-8B demonstrates superior performance compared to larger models such as GPT-3.5 and Meditron-70B across 9 diverse biomedical datasets, achieving state-of-the-art results with an average score of 72.50% despite having a significantly smaller parameter count. The model's strong performance in domain-specific tasks, such as Clinical KG, Medical Genetics, and PubMedQA, highlights its ability to effectively capture and apply biomedical knowledge.

🚨 The GPT-4, Med-PaLM-1, and Med-PaLM-2 results are taken from their official papers. Since Med-PaLM doesn't provide zero-shot accuracy, we use its 5-shot accuracy for comparison. All results presented are in the zero-shot setting, except for Med-PaLM-2 and Med-PaLM-1, which use 5-shot accuracy.
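The framework list above mentions the LM Evaluation Harness. For readers who want to reproduce zero-shot numbers of this kind, here is a hedged sketch, not from the original card: the `simple_evaluate` API and the task names `pubmedqa`, `medmcqa`, and `medqa_4options` follow lm-evaluation-harness v0.4.x conventions and should be treated as assumptions to verify against your installed version.

```python
# Hedged sketch: zero-shot evaluation with EleutherAI's lm-evaluation-harness.
# Install first: pip install lm-eval
# API and task names follow v0.4.x conventions; verify against your install.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=aaditya/OpenBioLLM-Llama3-8B,dtype=bfloat16",
    tasks=["pubmedqa", "medmcqa", "medqa_4options"],  # assumed task names
    num_fewshot=0,  # zero-shot, matching the reporting above
    batch_size=8,
)

# Per-task metrics live under results["results"]
for task, metrics in results["results"].items():
    print(task, metrics)
```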
| | Clinical KG | Medical Genetics | Anatomy | Pro Medicine | College Biology | College Medicine | MedQA 4 opts | PubMedQA | MedMCQA | Avg |
|--------------------|-------------|------------------|---------|--------------|-----------------|------------------|--------------|----------|---------|-------|
| **OpenBioLLM-70B** | **92.93** | **93.197** | **83.904** | 93.75 | 93.827 | **85.749** | 78.162 | 78.97 | **74.014** | **86.05588** |
| Med-PaLM-2 (5-shot) | 88.3 | 90 | 77.8 | **95.2** | 94.4 | 80.9 | **79.7** | **79.2** | 71.3 | 84.08 |
| **GPT-4** | 86.04 | 91 | 80 | 93.01 | **95.14** | 76.88 | 78.87 | 75.2 | 69.52 | 82.85 |
| Med-PaLM-1 (Flan-PaLM, 5-shot) | 80.4 | 75 | 63.7 | 83.8 | 88.9 | 76.3 | 67.6 | 79 | 57.6 | 74.7 |
| **OpenBioLLM-8B** | 76.101 | 86.1 | 69.829 | 78.21 | 84.213 | 68.042 | 58.993 | 74.12 | 56.913 | 72.502 |
| Gemini-1.0 | 76.7 | 75.8 | 66.7 | 77.7 | 88 | 69.2 | 58 | 70.7 | 54.3 | 70.79 |
| GPT-3.5 Turbo 1106 | 74.71 | 74 | 72.79 | 72.79 | 72.91 | 64.73 | 57.71 | 72.66 | 53.79 | 66 |
| Meditron-70B | 66.79 | 69 | 53.33 | 71.69 | 76.38 | 63 | 57.1 | 76.6 | 46.85 | 64.52 |
| gemma-7b | 69.81 | 70 | 59.26 | 66.18 | 79.86 | 60.12 | 47.21 | 76.2 | 48.96 | 64.18 |
| Mistral-7B-v0.1 | 68.68 | 71 | 55.56 | 68.38 | 68.06 | 59.54 | 50.82 | 75.4 | 48.2 | 62.85 |
| Apollo-7B | 62.26 | 72 | 61.48 | 69.12 | 70.83 | 55.49 | 55.22 | 39.8 | 53.77 | 60 |
| MedAlpaca-7b | 57.36 | 69 | 57.04 | 67.28 | 65.28 | 54.34 | 41.71 | 72.8 | 37.51 | 58.03 |
| BioMistral-7B | 59.9 | 64 | 56.5 | 60.4 | 59 | 54.7 | 50.6 | 77.5 | 48.1 | 57.3 |
| AlpaCare-llama2-7b | 49.81 | 49 | 45.92 | 33.82 | 50 | 43.35 | 29.77 | 72.2 | 34.42 | 45.36 |
| ClinicalGPT | 30.56 | 27 | 30.37 | 19.48 | 25 | 24.27 | 26.08 | 63.8 | 28.18 | 30.52 |

<div align="center">
<img width="1600px" src="https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_SzdcJSBjZyo8RS1bTEkP.png">
</div>

## Detailed Medical Subject-wise Accuracy

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/UXF-V0col0Z0sS6BGPBkE.png)

# Use Cases & Examples

🚨 **The results below are from the quantized version of OpenBioLLM-70B.**

# Summarize Clinical Notes

OpenBioLLM-70B can efficiently analyze and summarize complex clinical notes, EHR data, and discharge summaries, extracting key information and generating concise, structured summaries.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/xdwdBgOxNi_TfML0hKlI8.png)

# Answer Medical Questions

OpenBioLLM-70B can provide answers to a wide range of medical questions.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/zO95GlwOQEZqCKQF69mE6.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/OKBczKw7gWeW5xsuDpc27.png)

<details>
<summary>Click to see details</summary>

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/eJGHT5khppYvJb8fQ-YW4.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Cnbwrqa_-ORHRuNRC2P6Y.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/J9DhdcvukAc9mnnW9fj2C.png)
</details>

# Clinical Entity Recognition

OpenBioLLM-70B can perform advanced clinical entity recognition by identifying and extracting key medical concepts, such as diseases, symptoms, medications, procedures, and anatomical structures, from unstructured clinical text.
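To make this concrete, the sketch below shows one hypothetical way to prompt for entity extraction; it is not from the original card. It reuses the `pipeline` and `terminators` objects built in the "Use with transformers" section above (the same pattern applies to the 8B checkpoint shown there), and the JSON output instruction and the sample note are illustrative assumptions.

```python
# Hypothetical entity-extraction prompt; reuses `pipeline` and `terminators`
# from the "Use with transformers" section above.
clinical_note = (
    "Patient presents with chest pain radiating to the left arm. "
    "Started on aspirin 325 mg and scheduled for a coronary angiogram."
)

messages = [
    {"role": "system", "content": (
        "You are a clinical NLP assistant. Extract all diseases, symptoms, "
        "medications, and procedures from the note, and return a JSON object "
        "with exactly those four keys."
    )},
    {"role": "user", "content": clinical_note},
]

prompt = pipeline.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
outputs = pipeline(
    prompt, max_new_tokens=256, eos_token_id=terminators, do_sample=False
)
print(outputs[0]["generated_text"][len(prompt):])
```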
By leveraging its deep understanding of medical terminology and context, the model can accurately annotate and categorize clinical entities, enabling more efficient information retrieval, data analysis, and knowledge discovery from electronic health records, research articles, and other biomedical text sources. This capability can support various downstream applications, such as clinical decision support, pharmacovigilance, and medical research.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/_69BW4k9LVABFwtxixL45.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/DKy5wYCoPhoPPUc1-x8_J.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/7WD9zCCBZT4-4XlfnIQjl.png)

# Biomarkers Extraction

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/ZttoM4AiteT7gFYVhjIpN.png)

# Classification

OpenBioLLM-70B can perform various biomedical classification tasks, such as disease prediction, sentiment analysis, and medical document categorization.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/Bf5MW1d75qT-1F_TR_hC0.png)

# De-Identification

OpenBioLLM-70B can detect and remove personally identifiable information (PII) from medical records, helping ensure patient privacy and compliance with data protection regulations like HIPAA.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5f3fe13d79c1ba4c353d0c19/hKX4kzm--Tw5bj6K78msy.png)

**Advisory Notice!**

While OpenBioLLM-70B & 8B leverage high-quality data sources, their outputs may still contain inaccuracies, biases, or misalignments that could pose risks if relied upon for medical decision-making without further testing and refinement. The models' performance has not yet been rigorously evaluated in randomized controlled trials or real-world healthcare environments.

Therefore, we strongly advise against using OpenBioLLM-70B & 8B for any direct patient care, clinical decision support, or other professional medical purposes at this time. Their use should be limited to research, development, and exploratory applications by qualified individuals who understand their limitations.

OpenBioLLM-70B & 8B are intended solely as research tools to assist healthcare professionals and should never be considered a replacement for the professional judgment and expertise of a qualified medical doctor.

Appropriately adapting and validating OpenBioLLM-70B & 8B for specific medical use cases would require significant additional work, potentially including:

- Thorough testing and evaluation in relevant clinical scenarios
- Alignment with evidence-based guidelines and best practices
- Mitigation of potential biases and failure modes
- Integration with human oversight and interpretation
- Compliance with regulatory and ethical standards

Always consult a qualified healthcare provider for personal medical needs.

# Citation

If you find OpenBioLLM-70B & 8B useful in your work, please cite the model as follows:

```
@misc{OpenBioLLMs,
  author = {Ankit Pal and Malaikannan Sankarasubbu},
  title = {OpenBioLLMs: Advancing Open-Source Large Language Models for Healthcare and Life Sciences},
  year = {2024},
  publisher = {Hugging Face},
  journal = {Hugging Face repository},
  howpublished = {\url{https://huggingface.co/aaditya/OpenBioLLM-Llama3-70B}}
}
```

The accompanying paper is currently in progress and will be released soon.
<div align="center"> <h2> 💌 Contact </h2> </div> We look forward to hearing you and collaborating on this exciting project! **Contributors:** - [Ankit Pal (Aaditya Ura)](https://aadityaura.github.io/) [aadityaura at gmail dot com] - Saama AI Labs - Note: I am looking for a funded PhD opportunity, especially if it fits my Responsible Generative AI, Multimodal LLMs, Geometric Deep Learning, and Healthcare AI skillset. # References We thank the [Meta Team](meta-llama/Meta-Llama-3-70B-Instruct) for their amazing models! Result sources - [1] GPT-4 [Capabilities of GPT-4 on Medical Challenge Problems] (https://arxiv.org/abs/2303.13375) - [2] Med-PaLM-1 [Large Language Models Encode Clinical Knowledge](https://arxiv.org/abs/2212.13138) - [3] Med-PaLM-2 [Towards Expert-Level Medical Question Answering with Large Language Models](https://arxiv.org/abs/2305.09617) - [4] Gemini-1.0 [Gemini Goes to Med School](https://arxiv.org/abs/2402.07023)
[ "QUESTION_ANSWERING" ]
[ "MEDQA", "PUBMEDQA" ]
BioNLP
hkunlp/instructor-base
hkunlp
sentence-similarity
[ "sentence-transformers", "pytorch", "t5", "text-embedding", "embeddings", "information-retrieval", "beir", "text-classification", "language-model", "text-clustering", "text-semantic-similarity", "text-evaluation", "prompt-retrieval", "text-reranking", "feature-extraction", "sentence-similarity", "transformers", "English", "Sentence Similarity", "natural_questions", "ms_marco", "fever", "hotpot_qa", "mteb", "en", "arxiv:2212.09741", "license:apache-2.0", "model-index", "autotrain_compatible", "text-generation-inference", "region:us" ]
1,671
1,674
11,563
116
--- language: en license: apache-2.0 pipeline_tag: sentence-similarity tags: - text-embedding - embeddings - information-retrieval - beir - text-classification - language-model - text-clustering - text-semantic-similarity - text-evaluation - prompt-retrieval - text-reranking - sentence-transformers - feature-extraction - sentence-similarity - transformers - t5 - English - Sentence Similarity - natural_questions - ms_marco - fever - hotpot_qa - mteb inference: false model-index: - name: final_base_results results: - task: type: Classification dataset: name: MTEB AmazonCounterfactualClassification (en) type: mteb/amazon_counterfactual config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 86.2089552238806 - type: ap value: 55.76273850794966 - type: f1 value: 81.26104211414781 - task: type: Classification dataset: name: MTEB AmazonPolarityClassification type: mteb/amazon_polarity config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 88.35995000000001 - type: ap value: 84.18839957309655 - type: f1 value: 88.317619250081 - task: type: Classification dataset: name: MTEB AmazonReviewsClassification (en) type: mteb/amazon_reviews_multi config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 44.64 - type: f1 value: 42.48663956478136 - task: type: Retrieval dataset: name: MTEB ArguAna type: arguana config: default split: test revision: None metrics: - type: map_at_1 value: 27.383000000000003 - type: map_at_10 value: 43.024 - type: map_at_100 value: 44.023 - type: map_at_1000 value: 44.025999999999996 - type: map_at_3 value: 37.684 - type: map_at_5 value: 40.884 - type: mrr_at_1 value: 28.094 - type: mrr_at_10 value: 43.315 - type: mrr_at_100 value: 44.313 - type: mrr_at_1000 value: 44.317 - type: mrr_at_3 value: 37.862 - type: mrr_at_5 value: 41.155 - type: ndcg_at_1 value: 27.383000000000003 - type: ndcg_at_10 value: 52.032000000000004 - type: ndcg_at_100 value: 56.19499999999999 - type: ndcg_at_1000 value: 56.272 - type: ndcg_at_3 value: 41.166000000000004 - type: ndcg_at_5 value: 46.92 - type: precision_at_1 value: 27.383000000000003 - type: precision_at_10 value: 8.087 - type: precision_at_100 value: 0.989 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 17.093 - type: precision_at_5 value: 13.044 - type: recall_at_1 value: 27.383000000000003 - type: recall_at_10 value: 80.868 - type: recall_at_100 value: 98.86200000000001 - type: recall_at_1000 value: 99.431 - type: recall_at_3 value: 51.28 - type: recall_at_5 value: 65.22 - task: type: Clustering dataset: name: MTEB ArxivClusteringP2P type: mteb/arxiv-clustering-p2p config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 39.68441054431849 - task: type: Clustering dataset: name: MTEB ArxivClusteringS2S type: mteb/arxiv-clustering-s2s config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 29.188539728343844 - task: type: Reranking dataset: name: MTEB AskUbuntuDupQuestions type: mteb/askubuntudupquestions-reranking config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 63.173362687519784 - type: mrr value: 76.18860748362133 - task: type: STS dataset: name: MTEB BIOSSES type: mteb/biosses-sts config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_spearman 
value: 82.30789953771232 - task: type: Classification dataset: name: MTEB Banking77Classification type: mteb/banking77 config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 77.03571428571428 - type: f1 value: 75.87384305045917 - task: type: Clustering dataset: name: MTEB BiorxivClusteringP2P type: mteb/biorxiv-clustering-p2p config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 32.98041170516364 - task: type: Clustering dataset: name: MTEB BiorxivClusteringS2S type: mteb/biorxiv-clustering-s2s config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 25.71652988451154 - task: type: Retrieval dataset: name: MTEB CQADupstackAndroidRetrieval type: BeIR/cqadupstack config: default split: test revision: None metrics: - type: map_at_1 value: 33.739999999999995 - type: map_at_10 value: 46.197 - type: map_at_100 value: 47.814 - type: map_at_1000 value: 47.934 - type: map_at_3 value: 43.091 - type: map_at_5 value: 44.81 - type: mrr_at_1 value: 41.059 - type: mrr_at_10 value: 52.292 - type: mrr_at_100 value: 52.978 - type: mrr_at_1000 value: 53.015 - type: mrr_at_3 value: 49.976 - type: mrr_at_5 value: 51.449999999999996 - type: ndcg_at_1 value: 41.059 - type: ndcg_at_10 value: 52.608 - type: ndcg_at_100 value: 57.965 - type: ndcg_at_1000 value: 59.775999999999996 - type: ndcg_at_3 value: 48.473 - type: ndcg_at_5 value: 50.407999999999994 - type: precision_at_1 value: 41.059 - type: precision_at_10 value: 9.943 - type: precision_at_100 value: 1.6070000000000002 - type: precision_at_1000 value: 0.20500000000000002 - type: precision_at_3 value: 23.413999999999998 - type: precision_at_5 value: 16.481 - type: recall_at_1 value: 33.739999999999995 - type: recall_at_10 value: 63.888999999999996 - type: recall_at_100 value: 85.832 - type: recall_at_1000 value: 97.475 - type: recall_at_3 value: 51.953 - type: recall_at_5 value: 57.498000000000005 - type: map_at_1 value: 31.169999999999998 - type: map_at_10 value: 41.455 - type: map_at_100 value: 42.716 - type: map_at_1000 value: 42.847 - type: map_at_3 value: 38.568999999999996 - type: map_at_5 value: 40.099000000000004 - type: mrr_at_1 value: 39.427 - type: mrr_at_10 value: 47.818 - type: mrr_at_100 value: 48.519 - type: mrr_at_1000 value: 48.558 - type: mrr_at_3 value: 45.86 - type: mrr_at_5 value: 46.936 - type: ndcg_at_1 value: 39.427 - type: ndcg_at_10 value: 47.181 - type: ndcg_at_100 value: 51.737 - type: ndcg_at_1000 value: 53.74 - type: ndcg_at_3 value: 43.261 - type: ndcg_at_5 value: 44.891 - type: precision_at_1 value: 39.427 - type: precision_at_10 value: 8.847 - type: precision_at_100 value: 1.425 - type: precision_at_1000 value: 0.189 - type: precision_at_3 value: 20.785999999999998 - type: precision_at_5 value: 14.560999999999998 - type: recall_at_1 value: 31.169999999999998 - type: recall_at_10 value: 56.971000000000004 - type: recall_at_100 value: 76.31400000000001 - type: recall_at_1000 value: 88.93900000000001 - type: recall_at_3 value: 45.208 - type: recall_at_5 value: 49.923 - type: map_at_1 value: 39.682 - type: map_at_10 value: 52.766000000000005 - type: map_at_100 value: 53.84100000000001 - type: map_at_1000 value: 53.898 - type: map_at_3 value: 49.291000000000004 - type: map_at_5 value: 51.365 - type: mrr_at_1 value: 45.266 - type: mrr_at_10 value: 56.093 - type: mrr_at_100 value: 56.763 - type: mrr_at_1000 value: 56.793000000000006 - type: mrr_at_3 value: 
53.668000000000006 - type: mrr_at_5 value: 55.1 - type: ndcg_at_1 value: 45.266 - type: ndcg_at_10 value: 58.836 - type: ndcg_at_100 value: 62.863 - type: ndcg_at_1000 value: 63.912 - type: ndcg_at_3 value: 53.19199999999999 - type: ndcg_at_5 value: 56.125 - type: precision_at_1 value: 45.266 - type: precision_at_10 value: 9.492 - type: precision_at_100 value: 1.236 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 23.762 - type: precision_at_5 value: 16.414 - type: recall_at_1 value: 39.682 - type: recall_at_10 value: 73.233 - type: recall_at_100 value: 90.335 - type: recall_at_1000 value: 97.452 - type: recall_at_3 value: 58.562000000000005 - type: recall_at_5 value: 65.569 - type: map_at_1 value: 26.743 - type: map_at_10 value: 34.016000000000005 - type: map_at_100 value: 35.028999999999996 - type: map_at_1000 value: 35.113 - type: map_at_3 value: 31.763 - type: map_at_5 value: 33.013999999999996 - type: mrr_at_1 value: 28.927000000000003 - type: mrr_at_10 value: 36.32 - type: mrr_at_100 value: 37.221 - type: mrr_at_1000 value: 37.281 - type: mrr_at_3 value: 34.105000000000004 - type: mrr_at_5 value: 35.371 - type: ndcg_at_1 value: 28.927000000000003 - type: ndcg_at_10 value: 38.474000000000004 - type: ndcg_at_100 value: 43.580000000000005 - type: ndcg_at_1000 value: 45.64 - type: ndcg_at_3 value: 34.035 - type: ndcg_at_5 value: 36.186 - type: precision_at_1 value: 28.927000000000003 - type: precision_at_10 value: 5.74 - type: precision_at_100 value: 0.8710000000000001 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 14.124 - type: precision_at_5 value: 9.74 - type: recall_at_1 value: 26.743 - type: recall_at_10 value: 49.955 - type: recall_at_100 value: 73.904 - type: recall_at_1000 value: 89.133 - type: recall_at_3 value: 38.072 - type: recall_at_5 value: 43.266 - type: map_at_1 value: 16.928 - type: map_at_10 value: 23.549 - type: map_at_100 value: 24.887 - type: map_at_1000 value: 25.018 - type: map_at_3 value: 21.002000000000002 - type: map_at_5 value: 22.256 - type: mrr_at_1 value: 21.02 - type: mrr_at_10 value: 27.898 - type: mrr_at_100 value: 29.018 - type: mrr_at_1000 value: 29.099999999999998 - type: mrr_at_3 value: 25.456 - type: mrr_at_5 value: 26.625 - type: ndcg_at_1 value: 21.02 - type: ndcg_at_10 value: 28.277 - type: ndcg_at_100 value: 34.54 - type: ndcg_at_1000 value: 37.719 - type: ndcg_at_3 value: 23.707 - type: ndcg_at_5 value: 25.482 - type: precision_at_1 value: 21.02 - type: precision_at_10 value: 5.361 - type: precision_at_100 value: 0.9809999999999999 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 11.401 - type: precision_at_5 value: 8.209 - type: recall_at_1 value: 16.928 - type: recall_at_10 value: 38.601 - type: recall_at_100 value: 65.759 - type: recall_at_1000 value: 88.543 - type: recall_at_3 value: 25.556 - type: recall_at_5 value: 30.447000000000003 - type: map_at_1 value: 28.549000000000003 - type: map_at_10 value: 38.426 - type: map_at_100 value: 39.845000000000006 - type: map_at_1000 value: 39.956 - type: map_at_3 value: 35.372 - type: map_at_5 value: 37.204 - type: mrr_at_1 value: 35.034 - type: mrr_at_10 value: 44.041000000000004 - type: mrr_at_100 value: 44.95 - type: mrr_at_1000 value: 44.997 - type: mrr_at_3 value: 41.498000000000005 - type: mrr_at_5 value: 43.077 - type: ndcg_at_1 value: 35.034 - type: ndcg_at_10 value: 44.218 - type: ndcg_at_100 value: 49.958000000000006 - type: ndcg_at_1000 value: 52.019000000000005 - type: ndcg_at_3 value: 39.34 - type: 
ndcg_at_5 value: 41.892 - type: precision_at_1 value: 35.034 - type: precision_at_10 value: 7.911 - type: precision_at_100 value: 1.26 - type: precision_at_1000 value: 0.16 - type: precision_at_3 value: 18.511 - type: precision_at_5 value: 13.205 - type: recall_at_1 value: 28.549000000000003 - type: recall_at_10 value: 56.035999999999994 - type: recall_at_100 value: 79.701 - type: recall_at_1000 value: 93.149 - type: recall_at_3 value: 42.275 - type: recall_at_5 value: 49.097 - type: map_at_1 value: 29.391000000000002 - type: map_at_10 value: 39.48 - type: map_at_100 value: 40.727000000000004 - type: map_at_1000 value: 40.835 - type: map_at_3 value: 36.234 - type: map_at_5 value: 37.877 - type: mrr_at_1 value: 35.959 - type: mrr_at_10 value: 44.726 - type: mrr_at_100 value: 45.531 - type: mrr_at_1000 value: 45.582 - type: mrr_at_3 value: 42.047000000000004 - type: mrr_at_5 value: 43.611 - type: ndcg_at_1 value: 35.959 - type: ndcg_at_10 value: 45.303 - type: ndcg_at_100 value: 50.683 - type: ndcg_at_1000 value: 52.818 - type: ndcg_at_3 value: 39.987 - type: ndcg_at_5 value: 42.243 - type: precision_at_1 value: 35.959 - type: precision_at_10 value: 8.241999999999999 - type: precision_at_100 value: 1.274 - type: precision_at_1000 value: 0.163 - type: precision_at_3 value: 18.836 - type: precision_at_5 value: 13.196 - type: recall_at_1 value: 29.391000000000002 - type: recall_at_10 value: 57.364000000000004 - type: recall_at_100 value: 80.683 - type: recall_at_1000 value: 94.918 - type: recall_at_3 value: 42.263 - type: recall_at_5 value: 48.634 - type: map_at_1 value: 26.791749999999997 - type: map_at_10 value: 35.75541666666667 - type: map_at_100 value: 37.00791666666667 - type: map_at_1000 value: 37.12408333333333 - type: map_at_3 value: 33.02966666666667 - type: map_at_5 value: 34.56866666666667 - type: mrr_at_1 value: 31.744333333333337 - type: mrr_at_10 value: 39.9925 - type: mrr_at_100 value: 40.86458333333333 - type: mrr_at_1000 value: 40.92175000000001 - type: mrr_at_3 value: 37.68183333333334 - type: mrr_at_5 value: 39.028499999999994 - type: ndcg_at_1 value: 31.744333333333337 - type: ndcg_at_10 value: 40.95008333333334 - type: ndcg_at_100 value: 46.25966666666667 - type: ndcg_at_1000 value: 48.535333333333334 - type: ndcg_at_3 value: 36.43333333333333 - type: ndcg_at_5 value: 38.602333333333334 - type: precision_at_1 value: 31.744333333333337 - type: precision_at_10 value: 7.135166666666666 - type: precision_at_100 value: 1.1535833333333334 - type: precision_at_1000 value: 0.15391666666666665 - type: precision_at_3 value: 16.713 - type: precision_at_5 value: 11.828416666666666 - type: recall_at_1 value: 26.791749999999997 - type: recall_at_10 value: 51.98625 - type: recall_at_100 value: 75.30358333333334 - type: recall_at_1000 value: 91.05433333333333 - type: recall_at_3 value: 39.39583333333333 - type: recall_at_5 value: 45.05925 - type: map_at_1 value: 22.219 - type: map_at_10 value: 29.162 - type: map_at_100 value: 30.049999999999997 - type: map_at_1000 value: 30.144 - type: map_at_3 value: 27.204 - type: map_at_5 value: 28.351 - type: mrr_at_1 value: 25.153 - type: mrr_at_10 value: 31.814999999999998 - type: mrr_at_100 value: 32.573 - type: mrr_at_1000 value: 32.645 - type: mrr_at_3 value: 29.934 - type: mrr_at_5 value: 30.946 - type: ndcg_at_1 value: 25.153 - type: ndcg_at_10 value: 33.099000000000004 - type: ndcg_at_100 value: 37.768 - type: ndcg_at_1000 value: 40.331 - type: ndcg_at_3 value: 29.473 - type: ndcg_at_5 value: 31.206 - type: precision_at_1 value: 25.153 - type: 
precision_at_10 value: 5.183999999999999 - type: precision_at_100 value: 0.8170000000000001 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 12.831999999999999 - type: precision_at_5 value: 8.895999999999999 - type: recall_at_1 value: 22.219 - type: recall_at_10 value: 42.637 - type: recall_at_100 value: 64.704 - type: recall_at_1000 value: 83.963 - type: recall_at_3 value: 32.444 - type: recall_at_5 value: 36.802 - type: map_at_1 value: 17.427999999999997 - type: map_at_10 value: 24.029 - type: map_at_100 value: 25.119999999999997 - type: map_at_1000 value: 25.257 - type: map_at_3 value: 22.016 - type: map_at_5 value: 23.143 - type: mrr_at_1 value: 21.129 - type: mrr_at_10 value: 27.750000000000004 - type: mrr_at_100 value: 28.666999999999998 - type: mrr_at_1000 value: 28.754999999999995 - type: mrr_at_3 value: 25.849 - type: mrr_at_5 value: 26.939999999999998 - type: ndcg_at_1 value: 21.129 - type: ndcg_at_10 value: 28.203 - type: ndcg_at_100 value: 33.44 - type: ndcg_at_1000 value: 36.61 - type: ndcg_at_3 value: 24.648999999999997 - type: ndcg_at_5 value: 26.316 - type: precision_at_1 value: 21.129 - type: precision_at_10 value: 5.055 - type: precision_at_100 value: 0.909 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 11.666 - type: precision_at_5 value: 8.3 - type: recall_at_1 value: 17.427999999999997 - type: recall_at_10 value: 36.923 - type: recall_at_100 value: 60.606 - type: recall_at_1000 value: 83.19 - type: recall_at_3 value: 26.845000000000002 - type: recall_at_5 value: 31.247000000000003 - type: map_at_1 value: 26.457000000000004 - type: map_at_10 value: 35.228 - type: map_at_100 value: 36.475 - type: map_at_1000 value: 36.585 - type: map_at_3 value: 32.444 - type: map_at_5 value: 34.046 - type: mrr_at_1 value: 30.784 - type: mrr_at_10 value: 39.133 - type: mrr_at_100 value: 40.11 - type: mrr_at_1000 value: 40.169 - type: mrr_at_3 value: 36.692 - type: mrr_at_5 value: 38.17 - type: ndcg_at_1 value: 30.784 - type: ndcg_at_10 value: 40.358 - type: ndcg_at_100 value: 46.119 - type: ndcg_at_1000 value: 48.428 - type: ndcg_at_3 value: 35.504000000000005 - type: ndcg_at_5 value: 37.864 - type: precision_at_1 value: 30.784 - type: precision_at_10 value: 6.800000000000001 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 15.920000000000002 - type: precision_at_5 value: 11.437 - type: recall_at_1 value: 26.457000000000004 - type: recall_at_10 value: 51.845 - type: recall_at_100 value: 77.046 - type: recall_at_1000 value: 92.892 - type: recall_at_3 value: 38.89 - type: recall_at_5 value: 44.688 - type: map_at_1 value: 29.378999999999998 - type: map_at_10 value: 37.373 - type: map_at_100 value: 39.107 - type: map_at_1000 value: 39.317 - type: map_at_3 value: 34.563 - type: map_at_5 value: 36.173 - type: mrr_at_1 value: 35.178 - type: mrr_at_10 value: 42.44 - type: mrr_at_100 value: 43.434 - type: mrr_at_1000 value: 43.482 - type: mrr_at_3 value: 39.987 - type: mrr_at_5 value: 41.370000000000005 - type: ndcg_at_1 value: 35.178 - type: ndcg_at_10 value: 42.82 - type: ndcg_at_100 value: 48.935 - type: ndcg_at_1000 value: 51.28 - type: ndcg_at_3 value: 38.562999999999995 - type: ndcg_at_5 value: 40.687 - type: precision_at_1 value: 35.178 - type: precision_at_10 value: 7.945 - type: precision_at_100 value: 1.524 - type: precision_at_1000 value: 0.242 - type: precision_at_3 value: 17.721 - type: precision_at_5 value: 12.925 - type: recall_at_1 value: 
29.378999999999998 - type: recall_at_10 value: 52.141999999999996 - type: recall_at_100 value: 79.49000000000001 - type: recall_at_1000 value: 93.782 - type: recall_at_3 value: 39.579 - type: recall_at_5 value: 45.462 - type: map_at_1 value: 19.814999999999998 - type: map_at_10 value: 27.383999999999997 - type: map_at_100 value: 28.483999999999998 - type: map_at_1000 value: 28.585 - type: map_at_3 value: 24.807000000000002 - type: map_at_5 value: 26.485999999999997 - type: mrr_at_1 value: 21.996 - type: mrr_at_10 value: 29.584 - type: mrr_at_100 value: 30.611 - type: mrr_at_1000 value: 30.684 - type: mrr_at_3 value: 27.11 - type: mrr_at_5 value: 28.746 - type: ndcg_at_1 value: 21.996 - type: ndcg_at_10 value: 32.024 - type: ndcg_at_100 value: 37.528 - type: ndcg_at_1000 value: 40.150999999999996 - type: ndcg_at_3 value: 27.016000000000002 - type: ndcg_at_5 value: 29.927999999999997 - type: precision_at_1 value: 21.996 - type: precision_at_10 value: 5.102 - type: precision_at_100 value: 0.856 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 11.583 - type: precision_at_5 value: 8.577 - type: recall_at_1 value: 19.814999999999998 - type: recall_at_10 value: 44.239 - type: recall_at_100 value: 69.269 - type: recall_at_1000 value: 89.216 - type: recall_at_3 value: 31.102999999999998 - type: recall_at_5 value: 38.078 - task: type: Retrieval dataset: name: MTEB ClimateFEVER type: climate-fever config: default split: test revision: None metrics: - type: map_at_1 value: 11.349 - type: map_at_10 value: 19.436 - type: map_at_100 value: 21.282999999999998 - type: map_at_1000 value: 21.479 - type: map_at_3 value: 15.841 - type: map_at_5 value: 17.558 - type: mrr_at_1 value: 25.863000000000003 - type: mrr_at_10 value: 37.218 - type: mrr_at_100 value: 38.198 - type: mrr_at_1000 value: 38.236 - type: mrr_at_3 value: 33.409 - type: mrr_at_5 value: 35.602000000000004 - type: ndcg_at_1 value: 25.863000000000003 - type: ndcg_at_10 value: 27.953 - type: ndcg_at_100 value: 35.327 - type: ndcg_at_1000 value: 38.708999999999996 - type: ndcg_at_3 value: 21.985 - type: ndcg_at_5 value: 23.957 - type: precision_at_1 value: 25.863000000000003 - type: precision_at_10 value: 8.99 - type: precision_at_100 value: 1.6889999999999998 - type: precision_at_1000 value: 0.232 - type: precision_at_3 value: 16.308 - type: precision_at_5 value: 12.912 - type: recall_at_1 value: 11.349 - type: recall_at_10 value: 34.581 - type: recall_at_100 value: 60.178 - type: recall_at_1000 value: 78.88199999999999 - type: recall_at_3 value: 20.041999999999998 - type: recall_at_5 value: 25.458 - task: type: Retrieval dataset: name: MTEB DBPedia type: dbpedia-entity config: default split: test revision: None metrics: - type: map_at_1 value: 7.893 - type: map_at_10 value: 15.457 - type: map_at_100 value: 20.905 - type: map_at_1000 value: 22.116 - type: map_at_3 value: 11.593 - type: map_at_5 value: 13.134 - type: mrr_at_1 value: 57.49999999999999 - type: mrr_at_10 value: 65.467 - type: mrr_at_100 value: 66.022 - type: mrr_at_1000 value: 66.039 - type: mrr_at_3 value: 63.458000000000006 - type: mrr_at_5 value: 64.546 - type: ndcg_at_1 value: 45.875 - type: ndcg_at_10 value: 33.344 - type: ndcg_at_100 value: 36.849 - type: ndcg_at_1000 value: 44.03 - type: ndcg_at_3 value: 37.504 - type: ndcg_at_5 value: 34.892 - type: precision_at_1 value: 57.49999999999999 - type: precision_at_10 value: 25.95 - type: precision_at_100 value: 7.89 - type: precision_at_1000 value: 1.669 - type: precision_at_3 value: 40.333000000000006 - type: 
precision_at_5 value: 33.050000000000004 - type: recall_at_1 value: 7.893 - type: recall_at_10 value: 20.724999999999998 - type: recall_at_100 value: 42.516 - type: recall_at_1000 value: 65.822 - type: recall_at_3 value: 12.615000000000002 - type: recall_at_5 value: 15.482000000000001 - task: type: Classification dataset: name: MTEB EmotionClassification type: mteb/emotion config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 51.760000000000005 - type: f1 value: 45.51690565701713 - task: type: Retrieval dataset: name: MTEB FEVER type: fever config: default split: test revision: None metrics: - type: map_at_1 value: 53.882 - type: map_at_10 value: 65.902 - type: map_at_100 value: 66.33 - type: map_at_1000 value: 66.348 - type: map_at_3 value: 63.75999999999999 - type: map_at_5 value: 65.181 - type: mrr_at_1 value: 58.041 - type: mrr_at_10 value: 70.133 - type: mrr_at_100 value: 70.463 - type: mrr_at_1000 value: 70.47 - type: mrr_at_3 value: 68.164 - type: mrr_at_5 value: 69.465 - type: ndcg_at_1 value: 58.041 - type: ndcg_at_10 value: 71.84700000000001 - type: ndcg_at_100 value: 73.699 - type: ndcg_at_1000 value: 74.06700000000001 - type: ndcg_at_3 value: 67.855 - type: ndcg_at_5 value: 70.203 - type: precision_at_1 value: 58.041 - type: precision_at_10 value: 9.427000000000001 - type: precision_at_100 value: 1.049 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 27.278000000000002 - type: precision_at_5 value: 17.693 - type: recall_at_1 value: 53.882 - type: recall_at_10 value: 85.99 - type: recall_at_100 value: 94.09100000000001 - type: recall_at_1000 value: 96.612 - type: recall_at_3 value: 75.25 - type: recall_at_5 value: 80.997 - task: type: Retrieval dataset: name: MTEB FiQA2018 type: fiqa config: default split: test revision: None metrics: - type: map_at_1 value: 19.165 - type: map_at_10 value: 31.845000000000002 - type: map_at_100 value: 33.678999999999995 - type: map_at_1000 value: 33.878 - type: map_at_3 value: 27.881 - type: map_at_5 value: 30.049999999999997 - type: mrr_at_1 value: 38.272 - type: mrr_at_10 value: 47.04 - type: mrr_at_100 value: 47.923 - type: mrr_at_1000 value: 47.973 - type: mrr_at_3 value: 44.985 - type: mrr_at_5 value: 46.150000000000006 - type: ndcg_at_1 value: 38.272 - type: ndcg_at_10 value: 39.177 - type: ndcg_at_100 value: 45.995000000000005 - type: ndcg_at_1000 value: 49.312 - type: ndcg_at_3 value: 36.135 - type: ndcg_at_5 value: 36.936 - type: precision_at_1 value: 38.272 - type: precision_at_10 value: 10.926 - type: precision_at_100 value: 1.809 - type: precision_at_1000 value: 0.23700000000000002 - type: precision_at_3 value: 24.331 - type: precision_at_5 value: 17.747 - type: recall_at_1 value: 19.165 - type: recall_at_10 value: 45.103 - type: recall_at_100 value: 70.295 - type: recall_at_1000 value: 90.592 - type: recall_at_3 value: 32.832 - type: recall_at_5 value: 37.905 - task: type: Retrieval dataset: name: MTEB HotpotQA type: hotpotqa config: default split: test revision: None metrics: - type: map_at_1 value: 32.397 - type: map_at_10 value: 44.83 - type: map_at_100 value: 45.716 - type: map_at_1000 value: 45.797 - type: map_at_3 value: 41.955999999999996 - type: map_at_5 value: 43.736999999999995 - type: mrr_at_1 value: 64.794 - type: mrr_at_10 value: 71.866 - type: mrr_at_100 value: 72.22 - type: mrr_at_1000 value: 72.238 - type: mrr_at_3 value: 70.416 - type: mrr_at_5 value: 71.304 - type: ndcg_at_1 value: 64.794 - type: ndcg_at_10 value: 54.186 - type: ndcg_at_100 
value: 57.623000000000005 - type: ndcg_at_1000 value: 59.302 - type: ndcg_at_3 value: 49.703 - type: ndcg_at_5 value: 52.154999999999994 - type: precision_at_1 value: 64.794 - type: precision_at_10 value: 11.219 - type: precision_at_100 value: 1.394 - type: precision_at_1000 value: 0.16199999999999998 - type: precision_at_3 value: 30.767 - type: precision_at_5 value: 20.397000000000002 - type: recall_at_1 value: 32.397 - type: recall_at_10 value: 56.096999999999994 - type: recall_at_100 value: 69.696 - type: recall_at_1000 value: 80.88499999999999 - type: recall_at_3 value: 46.150999999999996 - type: recall_at_5 value: 50.993 - task: type: Classification dataset: name: MTEB ImdbClassification type: mteb/imdb config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 81.1744 - type: ap value: 75.44973697032414 - type: f1 value: 81.09901117955782 - task: type: Retrieval dataset: name: MTEB MSMARCO type: msmarco config: default split: dev revision: None metrics: - type: map_at_1 value: 19.519000000000002 - type: map_at_10 value: 31.025000000000002 - type: map_at_100 value: 32.275999999999996 - type: map_at_1000 value: 32.329 - type: map_at_3 value: 27.132 - type: map_at_5 value: 29.415999999999997 - type: mrr_at_1 value: 20.115 - type: mrr_at_10 value: 31.569000000000003 - type: mrr_at_100 value: 32.768 - type: mrr_at_1000 value: 32.816 - type: mrr_at_3 value: 27.748 - type: mrr_at_5 value: 29.956 - type: ndcg_at_1 value: 20.115 - type: ndcg_at_10 value: 37.756 - type: ndcg_at_100 value: 43.858000000000004 - type: ndcg_at_1000 value: 45.199 - type: ndcg_at_3 value: 29.818 - type: ndcg_at_5 value: 33.875 - type: precision_at_1 value: 20.115 - type: precision_at_10 value: 6.122 - type: precision_at_100 value: 0.919 - type: precision_at_1000 value: 0.10300000000000001 - type: precision_at_3 value: 12.794 - type: precision_at_5 value: 9.731 - type: recall_at_1 value: 19.519000000000002 - type: recall_at_10 value: 58.62500000000001 - type: recall_at_100 value: 86.99 - type: recall_at_1000 value: 97.268 - type: recall_at_3 value: 37.002 - type: recall_at_5 value: 46.778 - task: type: Classification dataset: name: MTEB MTOPDomainClassification (en) type: mteb/mtop_domain config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.71865025079799 - type: f1 value: 93.38906173610519 - task: type: Classification dataset: name: MTEB MTOPIntentClassification (en) type: mteb/mtop_intent config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 70.2576379388965 - type: f1 value: 49.20405830249464 - task: type: Classification dataset: name: MTEB MassiveIntentClassification (en) type: mteb/amazon_massive_intent config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.48486886348351 - type: f1 value: 64.92199176095157 - task: type: Classification dataset: name: MTEB MassiveScenarioClassification (en) type: mteb/amazon_massive_scenario config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 72.59246805648958 - type: f1 value: 72.1222026389164 - task: type: Clustering dataset: name: MTEB MedrxivClusteringP2P type: mteb/medrxiv-clustering-p2p config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 30.887642595096825 - task: type: Clustering dataset: name: MTEB MedrxivClusteringS2S type: 
mteb/medrxiv-clustering-s2s config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 28.3764418784054 - task: type: Reranking dataset: name: MTEB MindSmallReranking type: mteb/mind_small config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.81544126336991 - type: mrr value: 32.82666576268031 - task: type: Retrieval dataset: name: MTEB NFCorpus type: nfcorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.185 - type: map_at_10 value: 11.158 - type: map_at_100 value: 14.041 - type: map_at_1000 value: 15.360999999999999 - type: map_at_3 value: 8.417 - type: map_at_5 value: 9.378 - type: mrr_at_1 value: 44.582 - type: mrr_at_10 value: 53.083999999999996 - type: mrr_at_100 value: 53.787 - type: mrr_at_1000 value: 53.824000000000005 - type: mrr_at_3 value: 51.187000000000005 - type: mrr_at_5 value: 52.379 - type: ndcg_at_1 value: 42.57 - type: ndcg_at_10 value: 31.593 - type: ndcg_at_100 value: 29.093999999999998 - type: ndcg_at_1000 value: 37.909 - type: ndcg_at_3 value: 37.083 - type: ndcg_at_5 value: 34.397 - type: precision_at_1 value: 43.963 - type: precision_at_10 value: 23.498 - type: precision_at_100 value: 7.6160000000000005 - type: precision_at_1000 value: 2.032 - type: precision_at_3 value: 34.572 - type: precision_at_5 value: 29.412 - type: recall_at_1 value: 5.185 - type: recall_at_10 value: 15.234 - type: recall_at_100 value: 29.49 - type: recall_at_1000 value: 62.273999999999994 - type: recall_at_3 value: 9.55 - type: recall_at_5 value: 11.103 - task: type: Retrieval dataset: name: MTEB NQ type: nq config: default split: test revision: None metrics: - type: map_at_1 value: 23.803 - type: map_at_10 value: 38.183 - type: map_at_100 value: 39.421 - type: map_at_1000 value: 39.464 - type: map_at_3 value: 33.835 - type: map_at_5 value: 36.327 - type: mrr_at_1 value: 26.68 - type: mrr_at_10 value: 40.439 - type: mrr_at_100 value: 41.415 - type: mrr_at_1000 value: 41.443999999999996 - type: mrr_at_3 value: 36.612 - type: mrr_at_5 value: 38.877 - type: ndcg_at_1 value: 26.68 - type: ndcg_at_10 value: 45.882 - type: ndcg_at_100 value: 51.227999999999994 - type: ndcg_at_1000 value: 52.207 - type: ndcg_at_3 value: 37.511 - type: ndcg_at_5 value: 41.749 - type: precision_at_1 value: 26.68 - type: precision_at_10 value: 7.9750000000000005 - type: precision_at_100 value: 1.0959999999999999 - type: precision_at_1000 value: 0.11900000000000001 - type: precision_at_3 value: 17.449 - type: precision_at_5 value: 12.897 - type: recall_at_1 value: 23.803 - type: recall_at_10 value: 67.152 - type: recall_at_100 value: 90.522 - type: recall_at_1000 value: 97.743 - type: recall_at_3 value: 45.338 - type: recall_at_5 value: 55.106 - task: type: Retrieval dataset: name: MTEB QuoraRetrieval type: quora config: default split: test revision: None metrics: - type: map_at_1 value: 70.473 - type: map_at_10 value: 84.452 - type: map_at_100 value: 85.101 - type: map_at_1000 value: 85.115 - type: map_at_3 value: 81.435 - type: map_at_5 value: 83.338 - type: mrr_at_1 value: 81.19 - type: mrr_at_10 value: 87.324 - type: mrr_at_100 value: 87.434 - type: mrr_at_1000 value: 87.435 - type: mrr_at_3 value: 86.31 - type: mrr_at_5 value: 87.002 - type: ndcg_at_1 value: 81.21000000000001 - type: ndcg_at_10 value: 88.19 - type: ndcg_at_100 value: 89.44 - type: ndcg_at_1000 value: 89.526 - type: ndcg_at_3 value: 85.237 - type: ndcg_at_5 value: 86.892 - type: precision_at_1 value: 
81.21000000000001 - type: precision_at_10 value: 13.417000000000002 - type: precision_at_100 value: 1.537 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.31 - type: precision_at_5 value: 24.59 - type: recall_at_1 value: 70.473 - type: recall_at_10 value: 95.367 - type: recall_at_100 value: 99.616 - type: recall_at_1000 value: 99.996 - type: recall_at_3 value: 86.936 - type: recall_at_5 value: 91.557 - task: type: Clustering dataset: name: MTEB RedditClustering type: mteb/reddit-clustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 59.25776525253911 - task: type: Clustering dataset: name: MTEB RedditClusteringP2P type: mteb/reddit-clustering-p2p config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 63.22135271663078 - task: type: Retrieval dataset: name: MTEB SCIDOCS type: scidocs config: default split: test revision: None metrics: - type: map_at_1 value: 4.003 - type: map_at_10 value: 10.062999999999999 - type: map_at_100 value: 11.854000000000001 - type: map_at_1000 value: 12.145999999999999 - type: map_at_3 value: 7.242 - type: map_at_5 value: 8.652999999999999 - type: mrr_at_1 value: 19.7 - type: mrr_at_10 value: 29.721999999999998 - type: mrr_at_100 value: 30.867 - type: mrr_at_1000 value: 30.944 - type: mrr_at_3 value: 26.683 - type: mrr_at_5 value: 28.498 - type: ndcg_at_1 value: 19.7 - type: ndcg_at_10 value: 17.095 - type: ndcg_at_100 value: 24.375 - type: ndcg_at_1000 value: 29.831000000000003 - type: ndcg_at_3 value: 16.305 - type: ndcg_at_5 value: 14.291 - type: precision_at_1 value: 19.7 - type: precision_at_10 value: 8.799999999999999 - type: precision_at_100 value: 1.9349999999999998 - type: precision_at_1000 value: 0.32399999999999995 - type: precision_at_3 value: 15.2 - type: precision_at_5 value: 12.540000000000001 - type: recall_at_1 value: 4.003 - type: recall_at_10 value: 17.877000000000002 - type: recall_at_100 value: 39.217 - type: recall_at_1000 value: 65.862 - type: recall_at_3 value: 9.242 - type: recall_at_5 value: 12.715000000000002 - task: type: STS dataset: name: MTEB SICK-R type: mteb/sickr-sts config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_spearman value: 80.25888668589654 - task: type: STS dataset: name: MTEB STS12 type: mteb/sts12-sts config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_spearman value: 77.02037527837669 - task: type: STS dataset: name: MTEB STS13 type: mteb/sts13-sts config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_spearman value: 86.58432681008449 - task: type: STS dataset: name: MTEB STS14 type: mteb/sts14-sts config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_spearman value: 81.31697756099051 - task: type: STS dataset: name: MTEB STS15 type: mteb/sts15-sts config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_spearman value: 88.18867599667057 - task: type: STS dataset: name: MTEB STS16 type: mteb/sts16-sts config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_spearman value: 84.87853941747623 - task: type: STS dataset: name: MTEB STS17 (en-en) type: mteb/sts17-crosslingual-sts config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: 
cos_sim_spearman value: 89.46479925383916 - task: type: STS dataset: name: MTEB STS22 (en) type: mteb/sts22-crosslingual-sts config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_spearman value: 66.45272113649146 - task: type: STS dataset: name: MTEB STSBenchmark type: mteb/stsbenchmark-sts config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_spearman value: 86.43357313527851 - task: type: Reranking dataset: name: MTEB SciDocsRR type: mteb/scidocs-reranking config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 78.82761687254882 - type: mrr value: 93.46223674655047 - task: type: Retrieval dataset: name: MTEB SciFact type: scifact config: default split: test revision: None metrics: - type: map_at_1 value: 44.583 - type: map_at_10 value: 52.978 - type: map_at_100 value: 53.803 - type: map_at_1000 value: 53.839999999999996 - type: map_at_3 value: 50.03300000000001 - type: map_at_5 value: 51.939 - type: mrr_at_1 value: 47.0 - type: mrr_at_10 value: 54.730000000000004 - type: mrr_at_100 value: 55.31399999999999 - type: mrr_at_1000 value: 55.346 - type: mrr_at_3 value: 52.0 - type: mrr_at_5 value: 53.783 - type: ndcg_at_1 value: 47.0 - type: ndcg_at_10 value: 57.82899999999999 - type: ndcg_at_100 value: 61.49400000000001 - type: ndcg_at_1000 value: 62.676 - type: ndcg_at_3 value: 52.373000000000005 - type: ndcg_at_5 value: 55.481 - type: precision_at_1 value: 47.0 - type: precision_at_10 value: 7.867 - type: precision_at_100 value: 0.997 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 20.556 - type: precision_at_5 value: 14.066999999999998 - type: recall_at_1 value: 44.583 - type: recall_at_10 value: 71.172 - type: recall_at_100 value: 87.7 - type: recall_at_1000 value: 97.333 - type: recall_at_3 value: 56.511 - type: recall_at_5 value: 64.206 - task: type: PairClassification dataset: name: MTEB SprintDuplicateQuestions type: mteb/sprintduplicatequestions-pairclassification config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.66237623762376 - type: cos_sim_ap value: 90.35465126226322 - type: cos_sim_f1 value: 82.44575936883628 - type: cos_sim_precision value: 81.32295719844358 - type: cos_sim_recall value: 83.6 - type: dot_accuracy value: 99.66237623762376 - type: dot_ap value: 90.35464287920453 - type: dot_f1 value: 82.44575936883628 - type: dot_precision value: 81.32295719844358 - type: dot_recall value: 83.6 - type: euclidean_accuracy value: 99.66237623762376 - type: euclidean_ap value: 90.3546512622632 - type: euclidean_f1 value: 82.44575936883628 - type: euclidean_precision value: 81.32295719844358 - type: euclidean_recall value: 83.6 - type: manhattan_accuracy value: 99.65940594059406 - type: manhattan_ap value: 90.29220174849843 - type: manhattan_f1 value: 82.4987605354487 - type: manhattan_precision value: 81.80924287118977 - type: manhattan_recall value: 83.2 - type: max_accuracy value: 99.66237623762376 - type: max_ap value: 90.35465126226322 - type: max_f1 value: 82.4987605354487 - task: type: Clustering dataset: name: MTEB StackExchangeClustering type: mteb/stackexchange-clustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 65.0394225901397 - task: type: Clustering dataset: name: MTEB StackExchangeClusteringP2P type: mteb/stackexchange-clustering-p2p config: default split: test 
revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.27954189859326 - task: type: Reranking dataset: name: MTEB StackOverflowDupQuestions type: mteb/stackoverflowdupquestions-reranking config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 50.99055979974896 - type: mrr value: 51.82745257193787 - task: type: Summarization dataset: name: MTEB SummEval type: mteb/summeval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.21655465344237 - type: cos_sim_spearman value: 29.853205339630172 - type: dot_pearson value: 30.216540628083564 - type: dot_spearman value: 29.868978894753027 - task: type: Retrieval dataset: name: MTEB TRECCOVID type: trec-covid config: default split: test revision: None metrics: - type: map_at_1 value: 0.2 - type: map_at_10 value: 1.398 - type: map_at_100 value: 7.406 - type: map_at_1000 value: 18.401 - type: map_at_3 value: 0.479 - type: map_at_5 value: 0.772 - type: mrr_at_1 value: 70.0 - type: mrr_at_10 value: 79.25999999999999 - type: mrr_at_100 value: 79.25999999999999 - type: mrr_at_1000 value: 79.25999999999999 - type: mrr_at_3 value: 77.333 - type: mrr_at_5 value: 78.133 - type: ndcg_at_1 value: 63.0 - type: ndcg_at_10 value: 58.548 - type: ndcg_at_100 value: 45.216 - type: ndcg_at_1000 value: 41.149 - type: ndcg_at_3 value: 60.641999999999996 - type: ndcg_at_5 value: 61.135 - type: precision_at_1 value: 70.0 - type: precision_at_10 value: 64.0 - type: precision_at_100 value: 46.92 - type: precision_at_1000 value: 18.642 - type: precision_at_3 value: 64.667 - type: precision_at_5 value: 66.4 - type: recall_at_1 value: 0.2 - type: recall_at_10 value: 1.6729999999999998 - type: recall_at_100 value: 10.856 - type: recall_at_1000 value: 38.964999999999996 - type: recall_at_3 value: 0.504 - type: recall_at_5 value: 0.852 - task: type: Retrieval dataset: name: MTEB Touche2020 type: webis-touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 1.6629999999999998 - type: map_at_10 value: 8.601 - type: map_at_100 value: 14.354 - type: map_at_1000 value: 15.927 - type: map_at_3 value: 4.1930000000000005 - type: map_at_5 value: 5.655 - type: mrr_at_1 value: 18.367 - type: mrr_at_10 value: 34.466 - type: mrr_at_100 value: 35.235 - type: mrr_at_1000 value: 35.27 - type: mrr_at_3 value: 28.571 - type: mrr_at_5 value: 31.531 - type: ndcg_at_1 value: 14.285999999999998 - type: ndcg_at_10 value: 20.374 - type: ndcg_at_100 value: 33.532000000000004 - type: ndcg_at_1000 value: 45.561 - type: ndcg_at_3 value: 18.442 - type: ndcg_at_5 value: 18.076 - type: precision_at_1 value: 18.367 - type: precision_at_10 value: 20.204 - type: precision_at_100 value: 7.489999999999999 - type: precision_at_1000 value: 1.5630000000000002 - type: precision_at_3 value: 21.769 - type: precision_at_5 value: 20.408 - type: recall_at_1 value: 1.6629999999999998 - type: recall_at_10 value: 15.549 - type: recall_at_100 value: 47.497 - type: recall_at_1000 value: 84.524 - type: recall_at_3 value: 5.289 - type: recall_at_5 value: 8.035 - task: type: Classification dataset: name: MTEB ToxicConversationsClassification type: mteb/toxic_conversations_50k config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.8194 - type: ap value: 14.447702451658554 - type: f1 value: 55.13659412856185 - task: type: Classification dataset: name: MTEB TweetSentimentExtractionClassification 
type: mteb/tweet_sentiment_extraction config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 63.310696095076416 - type: f1 value: 63.360434851097814 - task: type: Clustering dataset: name: MTEB TwentyNewsgroupsClustering type: mteb/twentynewsgroups-clustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 51.30677907335145 - task: type: PairClassification dataset: name: MTEB TwitterSemEval2015 type: mteb/twittersemeval2015-pairclassification config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.12386004649221 - type: cos_sim_ap value: 73.99096426215495 - type: cos_sim_f1 value: 68.18416968442834 - type: cos_sim_precision value: 66.86960933536275 - type: cos_sim_recall value: 69.55145118733509 - type: dot_accuracy value: 86.12386004649221 - type: dot_ap value: 73.99096813038672 - type: dot_f1 value: 68.18416968442834 - type: dot_precision value: 66.86960933536275 - type: dot_recall value: 69.55145118733509 - type: euclidean_accuracy value: 86.12386004649221 - type: euclidean_ap value: 73.99095984980165 - type: euclidean_f1 value: 68.18416968442834 - type: euclidean_precision value: 66.86960933536275 - type: euclidean_recall value: 69.55145118733509 - type: manhattan_accuracy value: 86.09405734040651 - type: manhattan_ap value: 73.96825745608601 - type: manhattan_f1 value: 68.13888179729383 - type: manhattan_precision value: 65.99901088031652 - type: manhattan_recall value: 70.42216358839049 - type: max_accuracy value: 86.12386004649221 - type: max_ap value: 73.99096813038672 - type: max_f1 value: 68.18416968442834 - task: type: PairClassification dataset: name: MTEB TwitterURLCorpus type: mteb/twitterurlcorpus-pairclassification config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.99367407924865 - type: cos_sim_ap value: 86.19720829843081 - type: cos_sim_f1 value: 78.39889075384951 - type: cos_sim_precision value: 74.5110278818144 - type: cos_sim_recall value: 82.71481367416075 - type: dot_accuracy value: 88.99367407924865 - type: dot_ap value: 86.19718471454047 - type: dot_f1 value: 78.39889075384951 - type: dot_precision value: 74.5110278818144 - type: dot_recall value: 82.71481367416075 - type: euclidean_accuracy value: 88.99367407924865 - type: euclidean_ap value: 86.1972021422436 - type: euclidean_f1 value: 78.39889075384951 - type: euclidean_precision value: 74.5110278818144 - type: euclidean_recall value: 82.71481367416075 - type: manhattan_accuracy value: 88.95680521597392 - type: manhattan_ap value: 86.16659921351506 - type: manhattan_f1 value: 78.39125971550081 - type: manhattan_precision value: 74.82502799552073 - type: manhattan_recall value: 82.31444410224823 - type: max_accuracy value: 88.99367407924865 - type: max_ap value: 86.19720829843081 - type: max_f1 value: 78.39889075384951 ---

# hkunlp/instructor-base

We introduce **Instructor**👨‍🏫, an instruction-finetuned text embedding model that can generate text embeddings tailored to any task (e.g., classification, retrieval, clustering, text evaluation, etc.) and domain (e.g., science, finance, etc.) ***by simply providing the task instruction, without any finetuning***. Instructor👨‍🏫 achieves state-of-the-art performance on 70 diverse embedding tasks! The model is easy to use with **our customized** `sentence-transformer` library. 
For more details, check out [our paper](https://arxiv.org/abs/2212.09741) and [project page](https://instructor-embedding.github.io/)! **************************** **Updates** **************************** * 01/21: We released a new [checkpoint](https://huggingface.co/hkunlp/instructor-base) trained with hard negatives, which gives better performance. * 12/21: We released our [paper](https://arxiv.org/abs/2212.09741), [code](https://github.com/HKUNLP/instructor-embedding), [checkpoint](https://huggingface.co/hkunlp/instructor-base) and [project page](https://instructor-embedding.github.io/)! Check them out! ## Quick start <hr /> ## Installation ```bash pip install InstructorEmbedding ``` ## Compute your customized embeddings Then you can use the model like this to calculate domain-specific and task-aware embeddings: ```python from InstructorEmbedding import INSTRUCTOR model = INSTRUCTOR('hkunlp/instructor-base') sentence = "3D ActionSLAM: wearable person tracking in multi-floor environments" instruction = "Represent the Science title:" embeddings = model.encode([[instruction,sentence]]) print(embeddings) ``` ## Use cases <hr /> ## Calculate embeddings for your customized texts If you want to calculate customized embeddings for specific sentences, you may follow the unified template to write instructions: &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Represent the `domain` `text_type` for `task_objective`: * `domain` is optional, and it specifies the domain of the text, e.g., science, finance, medicine, etc. * `text_type` is required, and it specifies the encoding unit, e.g., sentence, document, paragraph, etc. * `task_objective` is optional, and it specifies the objective of embedding, e.g., retrieve a document, classify the sentence, etc. ## Calculate Sentence similarities You can further use the model to compute similarities between two groups of sentences, with **customized embeddings**. ```python from sklearn.metrics.pairwise import cosine_similarity sentences_a = [['Represent the Science sentence: ','Parton energy loss in QCD matter'], ['Represent the Financial statement: ','The Federal Reserve on Wednesday raised its benchmark interest rate.']] sentences_b = [['Represent the Science sentence: ','The Chiral Phase Transition in Dissipative Dynamics'], ['Represent the Financial statement: ','The funds rose less than 0.5 per cent on Friday']] embeddings_a = model.encode(sentences_a) embeddings_b = model.encode(sentences_b) similarities = cosine_similarity(embeddings_a,embeddings_b) print(similarities) ``` ## Information Retrieval You can also use **customized embeddings** for information retrieval. ```python import numpy as np from sklearn.metrics.pairwise import cosine_similarity query = [['Represent the Wikipedia question for retrieving supporting documents: ','where is the food stored in a yam plant']] corpus = [['Represent the Wikipedia document for retrieval: ','Capitalism has been dominant in the Western world since the end of feudalism, but most feel[who?] that the term "mixed economies" more precisely describes most contemporary economies, due to their containing both private-owned and state-owned enterprises. In capitalism, prices determine the demand-supply scale. 
For example, higher demand for certain goods and services lead to higher prices and lower demand for certain goods lead to lower prices.'], ['Represent the Wikipedia document for retrieval: ',"The disparate impact theory is especially controversial under the Fair Housing Act because the Act regulates many activities relating to housing, insurance, and mortgage loans—and some scholars have argued that the theory's use under the Fair Housing Act, combined with extensions of the Community Reinvestment Act, contributed to rise of sub-prime lending and the crash of the U.S. housing market and ensuing global economic recession"], ['Represent the Wikipedia document for retrieval: ','Disparate impact in United States labor law refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral. Although the protected classes vary by statute, most federal civil rights laws protect based on race, color, religion, national origin, and sex as protected traits, and some laws include disability status and other traits as well.']] query_embeddings = model.encode(query) corpus_embeddings = model.encode(corpus) similarities = cosine_similarity(query_embeddings,corpus_embeddings) retrieved_doc_id = np.argmax(similarities) print(retrieved_doc_id) ``` ## Clustering Use **customized embeddings** for clustering texts in groups. ```python import sklearn.cluster sentences = [['Represent the Medicine sentence for clustering: ','Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity'], ['Represent the Medicine sentence for clustering: ','Comparison of Atmospheric Neutrino Flux Calculations at Low Energies'], ['Represent the Medicine sentence for clustering: ','Fermion Bags in the Massive Gross-Neveu Model'], ['Represent the Medicine sentence for clustering: ',"QCD corrections to Associated t-tbar-H production at the Tevatron"], ['Represent the Medicine sentence for clustering: ','A New Analysis of the R Measurements: Resonance Parameters of the Higher, Vector States of Charmonium']] embeddings = model.encode(sentences) clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2) clustering_model.fit(embeddings) cluster_assignment = clustering_model.labels_ print(cluster_assignment) ```
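As a quick sanity check of how much the instruction matters, the sketch below (an added illustration, not one of the original examples; it reuses only the `INSTRUCTOR` API shown above) encodes the same sentence under two different task instructions and compares the resulting embeddings:
```python
from InstructorEmbedding import INSTRUCTOR
from sklearn.metrics.pairwise import cosine_similarity

model = INSTRUCTOR('hkunlp/instructor-base')
sentence = "Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity"

# Encode the identical sentence under two different task instructions.
emb_retrieval = model.encode([['Represent the Science document for retrieval: ', sentence]])
emb_clustering = model.encode([['Represent the Science sentence for clustering: ', sentence]])

# The cosine similarity is high but below 1.0: the instruction alone
# shifts the embedding toward the intended downstream task.
print(cosine_similarity(emb_retrieval, emb_clustering))
```
If the embeddings were instruction-agnostic, the similarity would be exactly 1.0; in practice it is slightly lower, which is the intended behavior of task-aware embeddings.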
[ "SUMMARIZATION" ]
[ "BIOSSES", "SCIFACT" ]
Non_BioNLP
BSC-LT/salamandra-7b-instruct
BSC-LT
text-generation
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "bg", "ca", "code", "cs", "cy", "da", "de", "el", "en", "es", "et", "eu", "fi", "fr", "ga", "gl", "hr", "hu", "it", "lt", "lv", "mt", "nl", "nn", "oc", "pl", "pt", "ro", "ru", "sh", "sk", "sl", "sr", "sv", "uk", "dataset:oscar-corpus/colossal-oscar-1.0", "dataset:HuggingFaceFW/fineweb-edu", "dataset:joelniklaus/eurlex_resources", "dataset:joelito/legal-mc4", "dataset:projecte-aina/CATalog", "dataset:UFRGS/brwac", "dataset:community-datasets/hrwac", "dataset:danish-foundation-models/danish-gigaword", "dataset:HiTZ/euscrawl", "dataset:PleIAs/French-PD-Newspapers", "dataset:PleIAs/French-PD-Books", "dataset:AI-team-UoA/greek_legal_code", "dataset:HiTZ/latxa-corpus-v1.1", "dataset:allenai/peS2o", "dataset:pile-of-law/pile-of-law", "dataset:PORTULAN/parlamento-pt", "dataset:hoskinson-center/proof-pile", "dataset:togethercomputer/RedPajama-Data-1T", "dataset:bigcode/starcoderdata", "dataset:bjoernp/tagesschau-2018-2023", "dataset:EleutherAI/the_pile_deduplicated", "arxiv:2502.08489", "arxiv:2403.14009", "arxiv:2403.20266", "arxiv:2101.00027", "arxiv:2207.00220", "arxiv:1810.06694", "arxiv:1911.05507", "arxiv:1906.03741", "arxiv:2406.17557", "arxiv:2402.06619", "arxiv:1803.09010", "base_model:BSC-LT/salamandra-7b", "base_model:finetune:BSC-LT/salamandra-7b", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
1,727
1,740
12,988
54
--- base_model: - BSC-LT/salamandra-7b datasets: - oscar-corpus/colossal-oscar-1.0 - HuggingFaceFW/fineweb-edu - joelniklaus/eurlex_resources - joelito/legal-mc4 - projecte-aina/CATalog - UFRGS/brwac - community-datasets/hrwac - danish-foundation-models/danish-gigaword - HiTZ/euscrawl - PleIAs/French-PD-Newspapers - PleIAs/French-PD-Books - AI-team-UoA/greek_legal_code - HiTZ/latxa-corpus-v1.1 - allenai/peS2o - pile-of-law/pile-of-law - PORTULAN/parlamento-pt - hoskinson-center/proof-pile - togethercomputer/RedPajama-Data-1T - bigcode/starcoderdata - bjoernp/tagesschau-2018-2023 - EleutherAI/the_pile_deduplicated language: - bg - ca - code - cs - cy - da - de - el - en - es - et - eu - fi - fr - ga - gl - hr - hu - it - lt - lv - mt - nl - nn - \no - oc - pl - pt - ro - ru - sh - sk - sl - sr - sv - uk library_name: transformers license: apache-2.0 pipeline_tag: text-generation --- ![](./images/salamandra_header.png) # Salamandra Model Card This repository contains the model described in [Salamandra Technical Report](https://huggingface.co/papers/2502.08489). Salamandra is a highly multilingual model pre-trained from scratch that comes in three different sizes — 2B, 7B and 40B parameters — with their respective base and instruction-tuned variants. This model card corresponds to the 7B instructed version. To visit the model cards of other Salamandra versions, please refer to the [Model Index](#model-index). The entire Salamandra family is released under a permissive [Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0). Along with the open weights, all training scripts and configuration files are made publicly available in [this GitHub repository](https://github.com/langtech-bsc/salamandra). > [!WARNING] > **DISCLAIMER:** This model is a first proof-of-concept designed to demonstrate the instruction-following capabilities of recently released base models. > It has been optimized to engage in conversation but has *NOT* been aligned through RLHF to filter or avoid sensitive topics. > As a result, it may generate harmful or inappropriate content. > The team is actively working to enhance its performance through further instruction and alignment with RL techniques. --- ## Model Details ### Description Transformer-based decoder-only language model that has been pre-trained from scratch on 12.875 trillion tokens of highly curated data. The pre-training corpus contains text in 35 European languages and code. ### Hyperparameters The full list of hyperparameters for each model can be found [here](https://github.com/langtech-bsc/salamandra/blob/main/configs/bsc_7b.yaml). ### Architecture | | | |-------------------------|:--------------| | Total Parameters | 7,768,117,248 | | Embedding Parameters | 1,048,576,000 | | Layers | 32 | | Hidden size | 4,096 | | Attention heads | 32 | | Context length | 8,192 | | Vocabulary size | 256,000 | | Precision | bfloat16 | | Embedding type | RoPE | | Activation Function | SwiGLU | | Layer normalization | RMS Norm | | Flash attention | ✅ | | Grouped Query Attention | ✅ | | Num. query groups | 8 | --- ## Intended Use ### Direct Use The models are intended for both research and commercial use in any of the languages included in the training data. The base models are intended either for language generation or to be further fine-tuned for specific use-cases. The instruction-tuned variants can be used as general-purpose assistants, as long as the user is fully aware of the model’s limitations. 
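For the base checkpoint, plain causal generation (e.g., few-shot prompting) is the expected mode of use. The following is a minimal, illustrative sketch, not taken from this model card, assuming the standard Hugging Face `transformers` API and the `BSC-LT/salamandra-7b` base model listed above:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Base checkpoint (the instructed variant documented here builds on it).
base_id = "BSC-LT/salamandra-7b"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, device_map="auto", torch_dtype=torch.bfloat16
)

# Plain continuation; the instructed variant should instead be queried
# through its chat template (see "How to use" below).
inputs = tokenizer("La capital de Catalunya és", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```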
### Out-of-scope Use

The model is not intended for malicious activities, such as harming others or violating human rights. Any downstream application must comply with current laws and regulations. Irresponsible usage in production environments without proper risk assessment and mitigation is also discouraged.

---

## Hardware and Software

### Training Framework

Pre-training was conducted using NVIDIA's [NeMo Framework](https://docs.nvidia.com/nemo-framework/index.html), which leverages PyTorch Lightning for efficient model training in highly distributed settings. The instruction-tuned versions were produced with [FastChat](https://github.com/lm-sys/FastChat).

### Compute Infrastructure

All models were trained on [MareNostrum 5](https://www.bsc.es/ca/marenostrum/marenostrum-5), a pre-exascale EuroHPC supercomputer hosted and operated by Barcelona Supercomputing Center. The accelerated partition is composed of 1,120 nodes with the following specifications:

- 4x Nvidia Hopper GPUs with 64GB HBM2 memory
- 2x Intel Sapphire Rapids 8460Y+ at 2.3 GHz, 32 cores each (64 cores per node)
- 4x NDR200 (800 Gb/s bandwidth per node)
- 512 GB of main memory (DDR5)
- 460 GB of NVMe storage

|Model|Nodes|GPUs|
|:---:|:---:|:---:|
|2B|64|256|
|7B|128|512|
|40B|256 / 512|1,024 / 2,048|

---

## How to use

The instruction-following models use the commonly adopted ChatML template:

```jinja
{%- if messages[0]['role'] == 'system' %}{%- set system_message = messages[0]['content'] %}{%- set loop_messages = messages[1:] %}{%- else %}{%- set system_message = 'SYSTEM MESSAGE' %}{%- set loop_messages = messages %}{%- endif %}{%- if not date_string is defined %}{%- set date_string = '2024-09-30' %}{%- endif %}{{ '<|im_start|>system\n' + system_message + '<|im_end|>\n' }}{% for message in loop_messages %}{%- if (message['role'] != 'user') and (message['role'] != 'assistant')%}{{ raise_exception('Only user and assistant roles are supported after the initial optional system message.') }}{% endif %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('After the optional system message, conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}
```

Where `system_message` is used to guide the model during generation and `date_string` can be set to allow the model to respond with the current date.

The exact same chat template should be used for an enhanced conversational experience. The easiest way to apply it is by using the tokenizer's built-in functions, as shown in the following snippet.

```python
from datetime import datetime
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "BSC-LT/salamandra-7b-instruct"

text = "At what temperature does water boil?"
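# Illustrative note (added, not from the original card): with the default
# system message, apply_chat_template below renders `prompt` as a ChatML
# string roughly of the form
#   <|im_start|>system
#   SYSTEM MESSAGE<|im_end|>
#   <|im_start|>user
#   At what temperature does water boil?<|im_end|>
#   <|im_start|>assistant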
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.bfloat16
)

message = [ { "role": "user", "content": text } ]
date_string = datetime.today().strftime('%Y-%m-%d')

prompt = tokenizer.apply_chat_template(
    message,
    tokenize=False,
    add_generation_prompt=True,
    date_string=date_string
)

inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt")
outputs = model.generate(input_ids=inputs.to(model.device), max_new_tokens=200)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Using this template, each turn opens with the `<|im_start|>` delimiter followed by the role of the entity (either `user`, for content supplied by the user, or `assistant`, for LLM responses), and closes with the `<|im_end|>` token.

---

## Data

### Pretraining Data

The pre-training corpus comprises data from 35 European languages and 92 programming languages, with detailed data sources provided below. The initial three training epochs used 2.4 trillion tokens, obtained by manually adjusting data proportions to balance the representation and give more importance to Spain's co-official languages (Spanish, Catalan, Galician, and Basque). To this end, code and English data were downsampled to half, data in the co-official languages was oversampled by 2x, and the remaining languages were kept in their original proportions. During the following epochs, the Colossal OSCAR dataset was replaced with the FineWeb-Edu dataset. This adjustment resulted in a total of 2.68 trillion tokens, distributed as outlined below:

![lang distrib](./images/corpus_languages_1.1.png)

The pretraining corpus is predominantly composed of data from Colossal OSCAR, which contributes a significant 53.05% of the total tokens. Following this, Starcoder provides 13.67%, and FineWeb-Edu (350BT subset) adds 10.24%. The next largest sources are HPLT at 4.21% and French-PD at 3.59%. Other notable contributions include MaCoCu, Legal-ES, and EurLex, each contributing between 1.41% and 1.72%. These major sources collectively form the bulk of the corpus, ensuring a rich and diverse dataset for training the language model. The remaining 10% comes from smaller sources in various languages. Feel free to click the expand button below to see the full list of sources. 
<details> <summary>Data Sources</summary> | Dataset | Language | Source | |---|---|---| | Colossal OSCAR 1.0 | bg, ca, cs, cy, da, de, el, en, es, et, eu, fi, fr, ga, gl, hr, hu, it, lt, lv, mt, nl, nn, no, oc, pl, pt, ro, ru, sh, sk, sl, sr, sv, uk | Brack et al., 2024 | | Aya Dataset (w/o Evaluation Suite) | eu, hr, nl, fi, ka, hu, lt, nn, ro, sk, lv, cy, bg, cs, en, fr, de, ga, mt, pl, ru, sl, sv, ca, da, et, gl, el, it, no, pt, sr, es, uk | Singh et al., 2024 | | Wikimedia dumps | bg, ca, cs, da, de, el, en, es, et, eu, fi, fr, ga, gl, hr, hu, it, lt, lv, mt, nl, nn, no, pl, pt, ro, sh, sk, sl, sr, uk | [Link](https://dumps.wikimedia.org/) | | OpenSubtitles v2016 | bg, ca, cs, da, de, el, en, es, et, eu, fi, fr, gl, hr, it, lt, lv, nl, no, pl, pt, ro, sk, sl, sr, sv, uk | Lison & Tiedemann, 2016 | | EurLEX-Resources | bg, cs, da, de, el, en, es, et, fi, fr, ga, hr, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv | [Link](https://huggingface.co/datasets/joelniklaus/eurlex_resources) | | MC4-Legal | bg, cs, da, de, el, en, es, et, fi, fr, ga, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv | [Link](https://huggingface.co/datasets/joelito/legal-mc4) | | Parlamint | at, bg, cz, dk, ee, es, es-ga, fi, fr, gb, gr, hr, hu, it, lv, nl, no, pl, pt, rs, se, si | Erjavec et al., 2021 | | MaCoCu | bg, ca, el, hr, mt, sl, sr, uk | Bañón et al., 2022 | | CURLICAT | bg, hr, hu, pl, ro, sk, sl | Váradi et al., 2022 | | Norwegian Colossal Corpus (NCC) | nn, no | Kummervold et al., 2021 | | Academic Slovene KAS 2.0 | sl | Žagar et al., 2022 | | BIGPATENT | en | Sharma et al., 2019 | | Biomedical-ES | es | Internally generated biomedical dataset: Wikipedia LS, Pubmed, MeSpEn, patents, clinical cases, medical crawler | | Brazilian Portuguese Web as Corpus (BrWaC) | pt | Wagner Filho et al., 2018 | | Bulgarian National Corpus (BulNC) | bg | [Link](http://old.dcl.bas.bg/dataset/BulNC.7z) | | CaBeRnet | fr | Popa-Fabre et al., 2020 | | CATalog 1.0 | ca | Palomar-Giner et al., 2024 | | CorpusNÓS | gl | de-Dios-Flores et al., 2024 | | Croatian Web as Corpus 2.1 (hrWaC) | hr | Ljubešić & Klubička, 2014 | | DaNewsroom | da | Varab & Schluter, 2020 | | Danish GigaWord | da | Strømberg-Derczynski et al., 2021 | | DK-CLARIN Reference Corpus of General Danish | da | [Link](https://korpus.dsl.dk/clarin/) | | Estonian National Corpus 2021 (ENC) | et | Koppel & Kallas, 2022 | | Estonian Reference Corpus (ERC) | et | [Link](https://www.cl.ut.ee/korpused/segakorpus/) | | EusCrawl (w/o Wikipedia or NC-licenses) | eu | Artetxe et al., 2022 | | FineWeb-Edu (350BT subset) | en | Penedo et al., 2024 | | French Public Domain Books (French-PD) | fr | [Link](https://huggingface.co/datasets/PleIAs/French-PD-Books) | | French Public Domain Newspapers (French-PD) | fr | [Link](https://huggingface.co/datasets/PleIAs/French-PD-Newspapers) | | German Web as Corpus (DeWaC) | de | [Link](https://docs.sslmit.unibo.it/doku.php?id=corpora:dewac) | | Greek Legal Code (GLC) | el | Papaloukas et al., 2021 | | Greek Web Corpus (GWC) | el | Outsios et al., 2018 | | HPLT v1 - Spanish | es | de Gibert et al., 2024 | | HPLT v1.1 - Spanish | es | de Gibert et al., 2024 | | Irish Universal Dependencies (Ga-UD) | ga | [Link](https://universaldependencies.org/ga/index.html) | | Italian Web as Corpus (ItWaC) | it | [Link](https://docs.sslmit.unibo.it/doku.php?id=corpora:itwac) | | Korpus Malti | mt | Micallef et al., 2022 | | Korpus slovenských právnych predpisov v1.9 (SK-Laws) | sk | 
[Link](https://www.juls.savba.sk/data/marcell/legal-sk-20220322-1.9.ver.xz) | | Latxa Corpus v1.1 (GAITU) | eu | Etxaniz et al., 2024 [Link](https://huggingface.co/datasets/HiTZ/latxa-corpus-v1.1) | | Laws and legal acts of Ukraine (UK-Laws) | uk | [Link](https://lang.org.ua/en/corpora/#anchor7) | | Legal-ES | es | Internally generated legal dataset: BOE, BORME, Senado, Congreso, Spanish court orders, DOGC | | MARCELL Romanian legislative subcorpus v2 | ro | [Link](https://elrc-share.eu/reposMARCELL%20Romanian%20legislative%20subcorpus%20v2itory/browse/marcell-romanian-legislative-subcorpus-v2/2da548428b9d11eb9c1a00155d026706ce94a6b59ffc4b0e9fb5cd9cebe6889e/) | | Math AMPS | en | Hendrycks et al., 2021 | | NKPJ National Corpus of Polish v1.2 (NKPJ) | pl | Lewandowska-Tomaszczyk et al., 2013 | | Occitan Corpus (IEA-AALO) | oc | Provided by [IEA](https://www.institutestudisaranesi.cat/) | | Open Legal Data - German court decisions and laws | de | Ostendorff et al., 2020 | | ParlamentoPT | pt | Rodrigues et al., 2023 | | peS2o | en | Soldaini & Lo, 2023 | | PG-19 | en | Rae et al., 2019 | | Pile of Law (selected subsets) | en | Henderson* et al., 2022 | | Polish Parliamentary Corpus (PPC) | pl | Ogrodniczuk, 2018 | | Proof Pile | en | [Link](https://huggingface.co/datasets/hoskinson-center/proof-pile) | | RedPajama-Data T1 (StackExchange subset) | en | Computer, 2023 | | Scientific-ES | es | Internally generated scientific dataset: Dialnet, Scielo, CSIC, TDX, BSC, UCM | | SK Court Decisions v2.0 (OD-Justice) | sk | [Link](https://www.juls.savba.sk/data/od-justice/od-justice-2.0.ver.xz) | | Slovene Web as Corpus (slWaC) | sl | Erjavec et al., 2015 | | SoNaR Corpus NC 1.2 | nl | [Link](https://taalmaterialen.ivdnt.org/download/tstc-sonar-corpus/) | | Spanish Legal Domain Corpora (Spanish-Legal) | es | Gutiérrez-Fandiño et al., 2021 | | SrpKorSubset: news, legal, academic, conversation, lit- erary (SrpKor) | sr | [Link](http://www.korpus.matf.bg.ac.rs/) | | Starcoder | code | Li et al., 2023 | | State-related content from the Latvian Web (State-Latvian-Web) | lv | [Link](https://catalog.elra.info/en-us/repository/browse/ELRA-W0169/) | | SYN v9: large corpus of written Czech | cs | Křen et al., 2021 | | Tagesschau Archive Article | de | [Link](https://huggingface.co/datasets/bjoernp/tagesschau-2018-2023) | | The Danish Parliament Corpus 2009 - 2017, v1 | da | Hansen, 2018 | | The Gaois bilingual corpus of English-Irish legislation (Ga-Legislation) | ga | [Link](https://portulanclarin.net/repository/browse/the-gaois-bilingual-corpus-of-english-irish-legislation-processed/daeac17c9e3511ea9b7f02420a000407b83de243dc0b469aab41084386c5b80f/) | | The Pile (PhilPapers) | en | Gao et al., 2021 | | The Swedish Culturomics Gigaword Corpus (Swedish- Gigaword) | sv | Rødven-Eide, 2016 | | Welsh-GOV | cy | Crawling from [Link](https://www.llyw.cymru) | | Yle Finnish News Archive (Yle-News) | fi | [Link](http://urn.fi/urn:nbn:fi:lb-2021050401) | To consult the data summary document with the respective licences, please send an e-mail to [email protected]. <details> <summary>References</summary> - Abadji, J., Suárez, P. J. O., Romary, L., & Sagot, B. (2021). Ungoliant: An optimized pipeline for the generation of a very large-scale multilingual web corpus (H. Lüngen, M. Kupietz, P. Bański, A. Barbaresi, S. Clematide, & I. Pisetta, Eds.; pp. 1–9). Leibniz-Institut für Deutsche Sprache. [Link](https://doi.org/10.14618/ids-pub-10468) - Artetxe, M., Aldabe, I., Agerri, R., Perez-de-Viñaspre, O., & Soroa, A. (2022). 
- Bañón, M., Esplà-Gomis, M., Forcada, M. L., García-Romero, C., Kuzman, T., Ljubešić, N., van Noord, R., Sempere, L. P., Ramírez-Sánchez, G., Rupnik, P., Suchomel, V., Toral, A., van der Werff, T., & Zaragoza, J. (2022). MaCoCu: Massive collection and curation of monolingual and bilingual data: Focus on under-resourced languages. Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, 303–304. [Link](https://aclanthology.org/2022.eamt-1.41)
- Brack, M., Ostendorff, M., Suarez, P. O., Saiz, J. J., Castilla, I. L., Palomar-Giner, J., Shvets, A., Schramowski, P., Rehm, G., Villegas, M., & Kersting, K. (2024). Community OSCAR: A Community Effort for Multilingual Web Data. [Link](https://occiglot.eu/papers/Community_Oscar.pdf)
- Computer, T. (2023). RedPajama: An Open Source Recipe to Reproduce LLaMA training dataset [Computer software]. [Link](https://github.com/togethercomputer/RedPajama-Data)
- de Gibert, O., Nail, G., Arefyev, N., Bañón, M., van der Linde, J., Ji, S., Zaragoza-Bernabeu, J., Aulamo, M., Ramírez-Sánchez, G., Kutuzov, A., Pyysalo, S., Oepen, S., & Tiedemann, J. (2024). A New Massive Multilingual Dataset for High-Performance Language Technologies (arXiv:2403.14009). arXiv. [Link](http://arxiv.org/abs/2403.14009)
- Dodge, J., Sap, M., Marasović, A., Agnew, W., Ilharco, G., Groeneveld, D., Mitchell, M., & Gardner, M. (2021). Documenting Large Webtext Corpora: A Case Study on the Colossal Clean Crawled Corpus. In M.-F. Moens, X. Huang, L. Specia, & S. W. Yih (Eds.), Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (pp. 1286–1305). Association for Computational Linguistics. [Link](https://doi.org/10.18653/v1/2021.emnlp-main.98)
- Erjavec, T., Ljubešić, N., & Logar, N. (2015). The slWaC corpus of the Slovene web. Informatica (Slovenia), 39, 35–42.
- Erjavec, T., Ogrodniczuk, M., Osenova, P., Ljubešić, N., Simov, K., Grigorova, V., Rudolf, M., Pančur, A., Kopp, M., Barkarson, S., Steingrímsson, S., van der Pol, H., Depoorter, G., de Does, J., Jongejan, B., Haltrup Hansen, D., Navarretta, C., Calzada Pérez, M., de Macedo, L. D., … Rayson, P. (2021). Linguistically annotated multilingual comparable corpora of parliamentary debates ParlaMint.ana 2.1. [Link](http://hdl.handle.net/11356/1431)
- Etxaniz, J., Sainz, O., Perez, N., Aldabe, I., Rigau, G., Agirre, E., Ormazabal, A., Artetxe, M., & Soroa, A. (2024). Latxa: An Open Language Model and Evaluation Suite for Basque. [Link](https://arxiv.org/abs/2403.20266)
- Gao, L., Biderman, S., Black, S., Golding, L., Hoppe, T., Foster, C., Phang, J., He, H., Thite, A., Nabeshima, N., Presser, S., & Leahy, C. (2021). The Pile: An 800GB Dataset of Diverse Text for Language Modeling. CoRR, abs/2101.00027. [Link](https://arxiv.org/abs/2101.00027)
- Gutiérrez-Fandiño, A., Armengol-Estapé, J., Gonzalez-Agirre, A., & Villegas, M. (2021). Spanish Legalese Language Model and Corpora.
- Hansen, D. H. (2018). The Danish Parliament Corpus 2009—2017, v1. [Link](http://hdl.handle.net/20.500.12115/8)
- Henderson*, P., Krass*, M. S., Zheng, L., Guha, N., Manning, C. D., Jurafsky, D., & Ho, D. E. (2022). Pile of Law: Learning Responsible Data Filtering from the Law and a 256GB Open-Source Legal Dataset. arXiv. [Link](https://arxiv.org/abs/2207.00220)
- Hendrycks, D., Burns, C., Kadavath, S., Arora, A., Basart, S., Tang, E., Song, D., & Steinhardt, J. (2021). Measuring Mathematical Problem Solving With the MATH Dataset. NeurIPS.
- Jansen, T., Tong, Y., Zevallos, V., & Suarez, P. O. (2022). Perplexed by Quality: A Perplexity-based Method for Adult and Harmful Content Detection in Multilingual Heterogeneous Web Data.
- Koppel, K., & Kallas, J. (2022). Eesti keele ühendkorpuste sari 2013–2021: Mahukaim eestikeelsete digitekstide kogu. Eesti Rakenduslingvistika Ühingu Aastaraamat Estonian Papers in Applied Linguistics, 18, 207–228. [Link](https://doi.org/10.5128/erya18.12)
- Křen, M., Cvrček, V., Henyš, J., Hnátková, M., Jelínek, T., Kocek, J., Kováříková, D., Křivan, J., Milička, J., Petkevič, V., Procházka, P., Skoumalová, H., Šindlerová, J., & Škrabal, M. (2021). SYN v9: Large corpus of written Czech. [Link](http://hdl.handle.net/11234/1-4635)
- Kreutzer, J., Caswell, I., Wang, L., Wahab, A., van Esch, D., Ulzii-Orshikh, N., Tapo, A., Subramani, N., Sokolov, A., Sikasote, C., Setyawan, M., Sarin, S., Samb, S., Sagot, B., Rivera, C., Rios, A., Papadimitriou, I., Osei, S., Suarez, P. O., … Adeyemi, M. (2022). Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets. Transactions of the Association for Computational Linguistics, 10, 50–72. [Link](https://doi.org/10.1162/tacl_a_00447)
- Kummervold, P. E., De la Rosa, J., Wetjen, F., & Brygfjeld, S. A. (2021). Operationalizing a National Digital Library: The Case for a Norwegian Transformer Model. In S. Dobnik & L. Øvrelid (Eds.), Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa) (pp. 20–29). Linköping University Electronic Press, Sweden. [Link](https://aclanthology.org/2021.nodalida-main.3)
- Lewandowska-Tomaszczyk, B., Górski, R., Łaziński, M., & Przepiórkowski, A. (2013). The National Corpus of Polish (NKJP). Language use and data analysis. 309–319.
- Li, R., Allal, L. B., Zi, Y., Muennighoff, N., Kocetkov, D., Mou, C., Marone, M., Akiki, C., Li, J., Chim, J., Liu, Q., Zheltonozhskii, E., Zhuo, T. Y., Wang, T., Dehaene, O., Davaadorj, M., Lamy-Poirier, J., Monteiro, J., Shliazhko, O., … Vries, H. de. (2023). StarCoder: May the source be with you!
- Lison, P., & Tiedemann, J. (2016). OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In N. Calzolari, K. Choukri, T. Declerck, S. Goggi, M. Grobelnik, B. Maegaard, J. Mariani, H. Mazo, A. Moreno, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC’16) (pp. 923–929). European Language Resources Association (ELRA). [Link](https://aclanthology.org/L16-1147)
- Ljubešić, N., & Klubička, F. (2014). Bs,hr,srWaC - Web Corpora of Bosnian, Croatian and Serbian. In F. Bildhauer & R. Schäfer (Eds.), Proceedings of the 9th Web as Corpus Workshop (WaC-9) (pp. 29–35). Association for Computational Linguistics. [Link](https://doi.org/10.3115/v1/W14-0405)
- Micallef, K., Gatt, A., Tanti, M., van der Plas, L., & Borg, C. (2022). Pre-training Data Quality and Quantity for a Low-Resource Language: New Corpus and BERT Models for Maltese. Proceedings of the Third Workshop on Deep Learning for Low-Resource Natural Language Processing, 90–101. [Link](https://doi.org/10.18653/v1/2022.deeplo-1.10)
- Ogrodniczuk, M. (2018). Polish Parliamentary Corpus. [Link](https://api.semanticscholar.org/CorpusID:235134113)
- Ostendorff, M., Blume, T., & Ostendorff, S. (2020). Towards an Open Platform for Legal Information. Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2020, 385–388. [Link](https://doi.org/10.1145/3383583.3398616)
- Ostendorff, M., Suarez, P. O., Lage, L. F., & Rehm, G. (2024). LLM-Datasets: An Open Framework for Pretraining Datasets of Large Language Models. First Conference on Language Modeling. [Link](https://openreview.net/forum?id=5RdIMlGLXL)
- Outsios, S., Skianis, K., Meladianos, P., Xypolopoulos, C., & Vazirgiannis, M. (2018). Word Embeddings from Large-Scale Greek Web content. arXiv Preprint arXiv:1810.06694.
- Palomar-Giner, J., Saiz, J. J., Espuña, F., Mina, M., Da Dalt, S., Llop, J., Ostendorff, M., Ortiz Suarez, P., Rehm, G., Gonzalez-Agirre, A., & Villegas, M. (2024). A CURATEd CATalog: Rethinking the Extraction of Pretraining Corpora for Mid-Resourced Languages. In N. Calzolari, M.-Y. Kan, V. Hoste, A. Lenci, S. Sakti, & N. Xue (Eds.), Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024) (pp. 335–349). ELRA and ICCL. [Link](https://aclanthology.org/2024.lrec-main.31)
- Papaloukas, C., Chalkidis, I., Athinaios, K., Pantazi, D.-A., & Koubarakis, M. (2021). Multi-granular Legal Topic Classification on Greek Legislation. Proceedings of the Natural Legal Language Processing Workshop 2021, 63–75. [Link](https://doi.org/10.48550/arXiv.2109.15298)
- Popa-Fabre, M., Ortiz Suárez, P. J., Sagot, B., & de la Clergerie, É. (2020). French Contextualized Word-Embeddings with a sip of CaBeRnet: A New French Balanced Reference Corpus. Proceedings of the 8th Workshop on Challenges in the Management of Large Corpora, 15–23. [Link](https://aclanthology.org/2020.cmlc-1.3)
- Rae, J. W., Potapenko, A., Jayakumar, S. M., Hillier, C., & Lillicrap, T. P. (2019). Compressive Transformers for Long-Range Sequence Modelling. arXiv Preprint. [Link](https://arxiv.org/abs/1911.05507)
- Rodrigues, J., Gomes, L., Silva, J., Branco, A., Santos, R., Cardoso, H. L., & Osório, T. (2023). Advancing Neural Encoding of Portuguese with Transformer Albertina PT-\*.
- Rødven-Eide, S. (2016). The Swedish Culturomics Gigaword Corpus [Dataset]. Språkbanken Text. [Link](https://doi.org/10.23695/3WMV-1Z09)
- Sharma, E., Li, C., & Wang, L. (2019). BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization. CoRR, abs/1906.03741. [Link](http://arxiv.org/abs/1906.03741)
- Soldaini, L., & Lo, K. (2023). peS2o (Pretraining Efficiently on S2ORC) Dataset. Allen Institute for AI.
- Strømberg-Derczynski, L., Ciosici, M., Baglini, R., Christiansen, M. H., Dalsgaard, J. A., Fusaroli, R., Henrichsen, P. J., Hvingelby, R., Kirkedal, A., Kjeldsen, A. S., Ladefoged, C., Nielsen, F. Å., Madsen, J., Petersen, M. L., Rystrøm, J. H., & Varab, D. (2021). The Danish Gigaword Corpus. Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa), 413–421. [Link](https://aclanthology.org/2021.nodalida-main.46)
- Subramani, N., Luccioni, S., Dodge, J., & Mitchell, M. (2023). Detecting Personal Information in Training Corpora: An Analysis. 208–220. [Link](https://doi.org/10.18653/v1/2023.trustnlp-1.18)
- Varab, D., & Schluter, N. (2020). DaNewsroom: A Large-scale Danish Summarisation Dataset. Proceedings of The 12th Language Resources and Evaluation Conference, 6731–6739. [Link](https://www.aclweb.org/anthology/2020.lrec-1.831)
- Váradi, T., Nyéki, B., Koeva, S., Tadić, M., Štefanec, V., Ogrodniczuk, M., Nitoń, B., Pezik, P., Barbu Mititelu, V., Irimia, E., Mitrofan, M., Tufiș, D., Garabík, R., Krek, S., & Repar, A. (2022). Introducing the CURLICAT Corpora: Seven-language Domain Specific Annotated Corpora from Curated Sources. In N. Calzolari, F. Béchet, P. Blache, K. Choukri, C. Cieri, T. Declerck, S. Goggi, H. Isahara, B. Maegaard, J. Mariani, H. Mazo, J. Odijk, & S. Piperidis (Eds.), Proceedings of the Thirteenth Language Resources and Evaluation Conference (pp. 100–108). European Language Resources Association. [Link](https://aclanthology.org/2022.lrec-1.11)
- Wagner Filho, J. A., Wilkens, R., Idiart, M., & Villavicencio, A. (2018). The brWaC corpus: A new open resource for Brazilian Portuguese. Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018).
- Žagar, A., Kavaš, M., Robnik-Šikonja, M., Erjavec, T., Fišer, D., Ljubešić, N., Ferme, M., Borovič, M., Boškovič, B., Ojsteršek, M., & Hrovat, G. (2022). Corpus of academic Slovene KAS 2.0. [Link](http://hdl.handle.net/11356/1448)
- Parrish, A., Chen, A., Nangia, N., Padmakumar, V., Phang, J., Thompson, J., Htut, P. M., & Bowman, S. (2022). BBQ: A hand-built bias benchmark for question answering. Findings of the Association for Computational Linguistics: ACL 2022, 2086–2105. Association for Computational Linguistics.
- Sheng, E., Chang, K.-W., Natarajan, P., & Peng, N. (2019). The Woman Worked as a Babysitter: On Biases in Language Generation. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3407–3412. Association for Computational Linguistics.
- Clark, P., Cowhey, I., Etzioni, O., Khot, T., Sabharwal, A., Schoenick, C., & Tafjord, O. (2018). Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge. arXiv:1803.05457v1.
- Socher, R., Perelygin, A., Wu, J., Chuang, J., Manning, C. D., Ng, A., & Potts, C. (2013). Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 1631–1642. Association for Computational Linguistics.
- Penedo, G., Kydlíček, H., allal, L. B., Lozhkov, A., Mitchell, M., Raffel, C., Von Werra, L., & Wolf, T. (2024). The FineWeb Datasets: Decanting the Web for the Finest Text Data at Scale (arXiv:2406.17557). arXiv. [Link](http://arxiv.org/abs/2406.17557)
- Singh, S., Vargus, F., Dsouza, D., Karlsson, B. F., Mahendiran, A., Ko, W.-Y., Shandilya, H., Patel, J., Mataciunas, D., OMahony, L., Zhang, M., Hettiarachchi, R., Wilson, J., Machado, M., Moura, L. S., Krzemiński, D., Fadaei, H., Ergün, I., Okoh, I., … Hooker, S. (2024). Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning (arXiv:2402.06619). arXiv. [Link](http://arxiv.org/abs/2402.06619)

</details>

</details>

The model was trained for 3 pre-training epochs with 2.4T tokens per epoch; 2 additional pre-training epochs in which the English part of the Colossal OSCAR dataset was replaced with FineWeb-Edu (350BT subset), resulting in 2.68T tokens per epoch; and 1 final epoch of 0.315T higher-quality tokens. The total number of tokens seen during pre-training is therefore approximately 12.875 trillion (3 × 2.4T + 2 × 2.68T + 0.315T).

We provide an extensive Datasheet section following the best practices defined by [Gebru et al. (2021)](https://arxiv.org/pdf/1803.09010).

<details>
<summary>Datasheet</summary>

#### Motivation

**For what purpose was the dataset created? Was there a specific task in mind? Was there a specific gap that needed to be filled? Please provide a description.**
The purpose of creating this dataset is to pre-train the Salamandra family of multilingual models with high performance in a large number of European languages (35) and programming languages (92). We also want to represent the co-official languages of Spain: Spanish, Catalan, Galician and Basque. For this reason, we oversample these languages by a factor of 2.

There is a great lack of massive multilingual data, especially in minority languages (Ostendorff & Rehm, 2023), so part of our efforts in the creation of this pre-training dataset have resulted in contributions to large projects such as the Community OSCAR (Brack et al., 2024), which includes 151 languages and 40T words, or CATalog (Palomar-Giner et al., 2024), the largest open dataset in Catalan in the world.

**Who created the dataset (e.g., which team, research group) and on behalf of which entity (e.g., company, institution, organization)?**

The dataset has been created by the Language Technologies unit (LangTech) of the Barcelona Supercomputing Center - Centro Nacional de Supercomputación (BSC-CNS), which aims to advance the field of natural language processing through cutting-edge research and development and the use of HPC. In particular, it was created by the unit's data team, the main contributors being José Javier Saiz, Ferran Espuña and Jorge Palomar. However, the creation of the dataset would not have been possible without the help of a large number of collaborators, partners and public institutions, which can be found in detail in the acknowledgements.

**Who funded the creation of the dataset? If there is an associated grant, please provide the name of the grantor and the grant name and number.**

This work has been promoted and financed by the Government of Catalonia through the [Aina project](https://projecteaina.cat/).

This work is funded by the _Ministerio para la Transformación Digital y de la Función Pública_ - Funded by EU – NextGenerationEU within the framework of [ILENIA Project](https://proyectoilenia.es/) with reference 2022/TL22/00215337.

#### Composition

**What do the instances that comprise the dataset represent (e.g., documents, photos, people, countries)? Are there multiple types of instances (e.g., movies, users, and ratings; people and interactions between them; nodes and edges)? Please provide a description.**

The dataset consists entirely of text documents in various languages. Specifically, data was mainly sourced from the following databases and repositories:
- **Common Crawl:** Repository that holds website data and is run by the Common Crawl non-profit organization. It is updated monthly and is distributed under the CC0 1.0 public domain license.
- **GitHub:** Community platform that allows developers to create, store, manage, and share their code. Repositories are crawled and then distributed with their original licenses, which may vary from permissive to non-commercial licenses.
- **Wikimedia:** Collection of databases managed by the Wikimedia Foundation, including Wikipedia, Wikibooks, Wikinews, Wikiquote, Wikisource, and Wikivoyage. It is updated monthly and is distributed under the Creative Commons Attribution-ShareAlike License 4.0.
- **EurLex:** Repository that holds the collection of legal documents from the European Union, available in all of the EU’s 24 official languages and run by the Publications Office of the European Union. It is updated daily and is distributed under the Creative Commons Attribution 4.0 International license.
- **Other repositories:** Specific repositories were crawled under permission for domain-specific corpora, which include academic, legal, and newspaper repositories.

We provide a complete list of dataset sources at the end of this section.

**How many instances are there in total (of each type, if appropriate)?**

The dataset contains a diverse range of instances across multiple languages, with notable adjustments for certain languages. English represents the largest portion, accounting for 39.31% of the total data. Spanish was upsampled by a factor of 2, bringing its share to 16.12%, while Catalan (1.97%), Basque (0.24%), and Galician (0.31%) were also upsampled by 2. On the other hand, code-related data was downsampled by half, making up 5.78% of the total. Other prominent languages include French (6.6%), Russian (5.56%), German (4.79%), and Hungarian (4.59%), with several additional languages contributing between 1% and 2%, and smaller portions represented by a variety of others.

**Does the dataset contain all possible instances or is it a sample (not necessarily random) of instances from a larger set? If the dataset is a sample, then what is the larger set? Is the sample representative of the larger set (e.g., geographic coverage)? If so, please describe how this representativeness was validated/verified. If it is not representative of the larger set, please describe why not (e.g., to cover a more diverse range of instances, because instances were withheld or unavailable).**

The dataset is a sample from multiple sources, with different weights based on the primary language of the content: Spanish, Catalan, Basque, and Galician content was upsampled by a factor of two, while programming languages were downsampled by half. Other sources were sampled in proportion to their occurrence.

**What data does each instance consist of? “Raw” data (e.g., unprocessed text or images) or features? In either case, please provide a description.**

Each instance consists of a text document processed for deduplication, language identification, and source-specific filtering. Some documents required optical character recognition (OCR) to extract text from non-text formats such as PDFs.

**Is there a label or target associated with each instance? If so, please provide a description.**

Each instance is labelled with a unique identifier, the primary language of the content, and the URL for web-sourced instances. Additional labels were automatically assigned to detect specific types of content (harmful or toxic content) and to assign preliminary indicators of undesired qualities (very short documents, high density of symbols, etc.), which were used for filtering instances.

**Is any information missing from individual instances? If so, please provide a description, explaining why this information is missing (e.g., because it was unavailable). This does not include intentionally removed information, but might include, e.g., redacted text.**

No significant information is missing from the instances.

**Are relationships between individual instances made explicit (e.g., users’ movie ratings, social network links)? If so, please describe how these relationships are made explicit.**

Instances are related through shared metadata, such as source and language identifiers.

**Are there recommended data splits (e.g., training, development/validation, testing)? If so, please provide a description of these splits, explaining the rationale behind them.**
The dataset is randomly divided into training, validation and test sets, where the validation and test sets are each 1% of the total corpus.

**Are there any errors, sources of noise, or redundancies in the dataset? If so, please provide a description.**

Despite removing duplicated instances within each source, redundancy remains at the paragraph and sentence levels, particularly in web-sourced instances where search engine optimization techniques and templates contribute to repeated textual patterns. Some instances may also be duplicated across sources due to format variations.

**Is the dataset self-contained, or does it link to or otherwise rely on external resources (e.g., websites, tweets, other datasets)? If it links to or relies on external resources, a) are there guarantees that they will exist, and remain constant, over time; b) are there official archival versions of the complete dataset (i.e., including the external resources as they existed at the time the dataset was created); c) are there any restrictions (e.g., licenses, fees) associated with any of the external resources that might apply to a dataset consumer? Please provide descriptions of all external resources and any restrictions associated with them, as well as links or other access points, as appropriate.**

The dataset is self-contained and does not rely on external resources.

**Does the dataset contain data that might be considered confidential (e.g., data that is protected by legal privilege or by doctor–patient confidentiality, data that includes the content of individuals’ non-public communications)? If so, please provide a description.**

The dataset does not contain confidential data.

**Does the dataset contain data that, if viewed directly, might be offensive, insulting, threatening, or might otherwise cause anxiety? If so, please describe why. If the dataset does not relate to people, you may skip the remaining questions in this section.**

The dataset includes web-crawled content, which may overrepresent pornographic material across languages (Kreutzer et al., 2022). Although pre-processing techniques were applied to mitigate offensive content, the heterogeneity and scale of web-sourced data make exhaustive filtering challenging: it is next to impossible to identify all adult content without resorting to excessive filtering, which may in turn negatively affect certain demographic groups (Dodge et al., 2021).

**Does the dataset identify any subpopulations (e.g., by age, gender)? If so, please describe how these subpopulations are identified and provide a description of their respective distributions within the dataset.**

The dataset does not explicitly identify any subpopulations.

**Is it possible to identify individuals (i.e., one or more natural persons), either directly or indirectly (i.e., in combination with other data) from the dataset? If so, please describe how.**

Web-sourced instances in the dataset may contain personally identifiable information (PII) that is publicly available on the Web, such as names, IP addresses, email addresses, and phone numbers. While it would be possible to indirectly identify individuals through the combination of multiple data points, the nature and scale of web data makes it difficult to parse such information. In any case, efforts are made to filter or anonymize sensitive data (Mina et al., 2024), but some identifiable information may remain in the dataset.
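As a purely illustrative example of what rule-based PII masking of this kind can look like (the patterns and placeholder tags below are hypothetical; the actual anonymization approach is the one described in Mina et al., 2024):

```python
import re

# Hypothetical, minimal illustration of rule-based PII masking.
# The patterns and placeholder tags are our own examples; they are not
# the patterns used in the actual pipeline.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "IP_ADDRESS": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "PHONE": re.compile(r"\+?\d[\d\s.-]{7,}\d"),
}

def mask_pii(text: str) -> str:
    """Replace matches of each PII pattern with a typed placeholder."""
    for tag, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{tag}]", text)
    return text

print(mask_pii("Contact me at [email protected] or +34 600 000 000"))
# Contact me at [EMAIL] or [PHONE]
```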
**Does the dataset contain data that might be considered sensitive in any way? If so, please provide a description.**

Given that the dataset includes web-sourced content and other publicly available documents, instances may inadvertently reveal financial information, health-related details, or forms of government identification, such as social security numbers (Subramani et al., 2023), especially if the content originates from less-regulated sources or user-generated platforms.

#### Collection Process

**How was the data collected?**

This dataset was built by combining several sources, whose acquisition methods can be classified into three groups:
- Web-sourced datasets with some preprocessing available under permissive license.
- Domain-specific or language-specific raw crawls.
- Manually curated data obtained through collaborators, data providers (by means of legal assignment agreements) or open source projects (e.g. CATalog).

**What mechanisms or procedures were used to collect the data? How were these mechanisms or procedures validated?**

The data collection process was carried out using three different mechanisms, each corresponding to one of the groups defined in the previous answer. The specific methods used and their respective validation procedures are outlined below:
- Open Direct Download: Data were obtained directly from publicly accessible sources, such as websites or repositories that provide open data downloads. We validate the data with a data integrity check, which ensures that the downloaded files are complete, uncorrupted and in the expected format and structure.
- Ad hoc scrapers or crawlers: Custom web scraping scripts or crawlers were used to extract data from various online sources where direct downloads were not available. These scripts navigate web pages, extract relevant data and store it in a structured format. We validate this method with software unit tests to evaluate the functionality of individual components of the scraping programs, checking for errors or unexpected behaviour. In addition, data integrity tests were performed to verify that the collected data remained complete throughout the extraction and storage process.
- Direct download via FTP, SFTP, API or S3: Some datasets were acquired using secure transfer protocols such as FTP (File Transfer Protocol), SFTP (Secure File Transfer Protocol), or API (Application Programming Interface) requests from cloud storage services such as Amazon S3. As with the open direct download method, data integrity tests were used to validate the completeness of the files to ensure that the files were not altered or corrupted during the transfer process.

**If the dataset is a sample from a larger set, what was the sampling strategy?**

The sampling strategy was to use the whole dataset resulting from the filtering explained in the 'preprocessing/cleaning/labelling' section, with the particularity that an upsampling of 2 (i.e. twice the probability of sampling a document) was performed for the co-official languages of Spain (Spanish, Catalan, Galician, Basque), and a downsampling of 1/2 was applied for code (half the probability of sampling a code document, evenly distributed among all programming languages).
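As a rough illustration (not the actual sampling code), the weighting scheme just described amounts to the following:

```python
import random

# Simplified sketch of the sampling weights described above: co-official
# languages of Spain are upsampled x2, code is downsampled x0.5, and
# everything else keeps weight 1. Function and variable names are ours.
UPSAMPLED = {"es", "ca", "gl", "eu"}

def sampling_weight(lang: str) -> float:
    if lang in UPSAMPLED:
        return 2.0
    if lang == "code":
        return 0.5
    return 1.0

def sample_documents(docs, k):
    """docs: list of (text, lang) pairs; draws k documents with the weights above."""
    weights = [sampling_weight(lang) for _, lang in docs]
    return random.choices(docs, weights=weights, k=k)
```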
**Who was involved in the data collection process and how were they compensated?**

This data is generally extracted, filtered and sampled by automated processes. The code required to run these processes has been developed entirely by members of the Language Technologies data team, or otherwise obtained from open-source software. Furthermore, there has been no monetary consideration for acquiring data from suppliers.

**Over what timeframe was the data collected? Does this timeframe match the creation timeframe of the data associated with the instances? If not, please describe the timeframe in which the data associated with the instances was created.**

Data were acquired and processed from April 2023 to April 2024. However, as mentioned, much data has been obtained from open projects such as Common Crawl, which contains data from 2014, so it is the end date (04/2024) rather than the start date that is important.

**Were any ethical review processes conducted? If so, please provide a description of these review processes, including the outcomes, as well as a link or other access point to any supporting documentation.**

No particular ethical review process has been carried out as the data is mostly open and not particularly sensitive. However, we have an internal evaluation team and a bias team to monitor ethical issues. In addition, we work closely with ‘Observatori d'Ètica en Intel·ligència Artificial’ (OEIAC) and ‘Agencia Española de Supervisión de la Inteligencia Artificial’ (AESIA) to audit the processes we carry out from an ethical and legal point of view, respectively.

#### Preprocessing

**Was any preprocessing/cleaning/labeling of the data done? If so, please provide a description. If not, you may skip the remaining questions in this section.**

No changes were made to the content of individual text document instances. However, the web-sourced documents underwent a filtering process based on specific criteria along two key dimensions:
- Quality filtering: The text processing pipeline CURATE (Palomar-Giner et al., 2024) calculates a quality score for each document based on a set of filtering criteria that identify undesirable textual characteristics. Any document with a score below the 0.8 threshold was excluded from the dataset.
- Harmful or adult content filtering: To reduce the amount of harmful or inappropriate material in the dataset, documents from Colossal OSCAR were filtered using the Ungoliant pipeline (Abadji et al., 2021), which uses the 'harmful\_pp' field, a perplexity-based score generated by a language model.

**Was the “raw” data saved in addition to the preprocessed/cleaned/labeled data? If so, please provide a link or other access point to the “raw” data.**

The original raw data was not kept.

**Is the software that was used to preprocess/clean/label the data available? If so, please provide a link or other access point.**

Yes, the preprocessing and filtering software is open-sourced. The [CURATE](https://github.com/langtech-bsc/CURATE) pipeline was used for CATalog and other curated datasets, and the [Ungoliant](https://github.com/oscar-project/ungoliant) pipeline was used for the OSCAR project.
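To make the interaction of these two filters concrete, a minimal sketch of a document-level keep/drop decision could look as follows. Only the 0.8 quality threshold and the `harmful_pp` field name come from the description above; the `quality_score` field name and the perplexity cutoff value are illustrative placeholders:

```python
# Illustrative sketch, not the actual CURATE/Ungoliant code.
QUALITY_THRESHOLD = 0.8      # threshold stated in the description above
HARMFUL_PP_CUTOFF = 1000.0   # hypothetical value; the real cutoff is not stated

def keep_document(doc: dict) -> bool:
    if doc["quality_score"] < QUALITY_THRESHOLD:
        return False  # dropped by the quality filter
    # Lower perplexity under a harmful-content LM means "more similar to
    # harmful content", so documents below the cutoff are flagged.
    if doc.get("harmful_pp") is not None and doc["harmful_pp"] < HARMFUL_PP_CUTOFF:
        return False  # flagged as likely harmful/adult content
    return True

corpus = [
    {"text": "...", "quality_score": 0.91, "harmful_pp": 4500.0},
    {"text": "...", "quality_score": 0.55, "harmful_pp": 4500.0},
]
filtered = [doc for doc in corpus if keep_document(doc)]  # keeps only the first
```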
#### Uses

**Has the dataset been used for any tasks already? If so, please provide a description.**

Pre-train the Salamandra model family.

**What (other) tasks could the dataset be used for?**

The data can be used primarily to pre-train other language models, which can then be used for a wide range of use cases. The dataset could also be used for other tasks such as fine-tuning language models, cross-lingual NLP tasks, machine translation, domain-specific text generation, and language-specific data analysis.

**Is there anything about the composition of the dataset or the way it was collected and preprocessed/cleaned/labeled that might impact future uses? Is there anything a dataset consumer could do to mitigate these risks or harms?**

Web-crawled content over-represents standard language varieties, which impacts language model performance for minority languages. Language diversity in the data is crucial to avoid bias, especially in the encoding of non-standard dialects, and to prevent the exclusion of demographic groups. Moreover, despite legal uncertainties in web-scraped data, we prioritize permissive licenses and privacy protection measures, acknowledging the challenges posed by personally identifiable information (PII) within large-scale datasets. Our ongoing efforts aim to address privacy concerns and contribute to a more inclusive linguistic dataset.

**Are there tasks for which the dataset should not be used?**

-

#### Distribution

**Will the dataset be distributed to third parties outside of the entity on behalf of which the dataset was created? If so, please provide a description.**

The dataset will not be released or distributed to third parties. Any question related to distribution is omitted in this section.

#### Maintenance

**Who will be supporting/hosting/maintaining the dataset?**

The dataset will be hosted by the Language Technologies unit (LangTech) of the Barcelona Supercomputing Center (BSC). The team will ensure regular updates and monitor the dataset for any issues related to content integrity, legal compliance, and bias for the sources they are responsible for.

**How can the owner/curator/manager of the dataset be contacted?**

The data owner may be contacted at the email address [email protected].

**Will the dataset be updated?**

The dataset will not be updated.

**If the dataset relates to people, are there applicable limits on the retention of the data associated with the instances? If so, please describe these limits and explain how they will be enforced.**

The dataset does not keep sensitive data that could allow direct identification of individuals, apart from the data that is publicly available in web-sourced content. Due to the sheer volume and diversity of web data, it is not feasible to notify individuals or manage data retention on an individual basis. However, efforts are made to mitigate the risks associated with sensitive information through pre-processing and filtering to remove identifiable or harmful content. Despite these measures, vigilance is maintained to address potential privacy and ethical issues.

**Will older versions of the dataset continue to be supported/hosted/maintained? If so, please describe how. If not, please describe how its obsolescence will be communicated to dataset consumers.**

Since the dataset will not be updated, only the final version will be kept.

**If others want to extend/augment/build on/contribute to the dataset, is there a mechanism for them to do so?**

The dataset does not allow for external contributions.

</details>

### Finetuning Data

This instruction-tuned variant has been fine-tuned on a collection of 273k instructions, focusing on the performance of Catalan, English and Spanish. However, instruction data for other closely related Iberian languages has also been included, since it yielded a positive impact on the languages of interest. That said, the performance in these additional languages is not guaranteed due to the limited amount of available data and the lack of resources for thorough testing.
| **Dataset** | **ca** | **en** | **es** | **eu** | **gl** | **pt** | **Total** |
|----------------------|------------|-------------|------------|-----------|---------|------------|-------------|
| alpaca-cleaned | | 49,950 | | | | | **49,950** |
| aya-dataset | | 3,941 | 3,851 | 939 | | 8,995 | **17,726** |
| coqcat | 4,797 | | | | | | **4,797** |
| databricks-dolly-15k | | 15,011 | | | | | **15,011** |
| dolly-ca | 3,232 | | | | | | **3,232** |
| flores-dev | 986 | 1,037 | 1,964 | 493 | 505 | | **4,985** |
| mentor-ca | 7,119 | | | | | | **7,119** |
| mentor-es | | | 7,122 | | | | **7,122** |
| no-robots | | 9,485 | | | | | **9,485** |
| oasst-ca | 2,517 | | | | | | **2,517** |
| oasst2 | 750 | 31,086 | 15,438 | 190 | 197 | 1,203 | **48,864** |
| open-orca | | 49,996 | | | | | **49,996** |
| rag-multilingual | 16,043 | 14,997 | 11,263 | | | | **42,303** |
| tower-blocks | | 7,762 | 1,000 | | | 1,000 | **9,762** |
| **Total** | **35,444** | **183,265** | **40,638** | **1,622** | **702** | **11,198** | **272,869** |

---

## Evaluation

### Gold-standard benchmarks

WiP

<!--
Evaluation is done using the Language Model Evaluation Harness (Gao et al., 2024). We evaluate on a set of tasks taken from [SpanishBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/spanish_bench), [CatalanBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/catalan_bench), [BasqueBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/basque_bench) and [GalicianBench](https://github.com/EleutherAI/lm-evaluation-harness/tree/main/lm_eval/tasks/galician_bench). These benchmarks include both new and existing tasks and datasets. Given that this is an instructed model, we add LM Evaluation Harness's native feature of `chat-template` to the setup.

In the tables below, we include the results in a selection of evaluation datasets that represent the model's performance across a variety of tasks within these benchmarks.

We only use tasks that are either human generated, human translated, or with a strong human-in-the-loop (i.e., machine translation followed by professional revision or machine generation followed by human revision and annotation). This is the reason behind the variety in number of tasks reported across languages. As more tasks that fulfill these requirements are published, we will update the presented results. We also intend to expand the evaluation to other languages, as long as the datasets meet our quality standards.

During the implementation of the evaluation we observed a series of issues worth considering when replicating and interpreting the results presented. These issues include ≈1.5% variances in performance in some tasks depending on the version of the `transformers` library used, and depending on the use (or lack of use) of tensor parallelism when loading a model. When implementing existing tasks, we carry out a comprehensive quality evaluation of the dataset, the Harness task itself, and what kind of input models see during evaluation. Our implementation (see links above) addresses multiple existing problems such as errors in datasets and prompts, and lack of pre-processing. All this means that results will vary if using other Harness implementations, and may slightly vary depending on the replication setup.

It should be noted that these results are subject to all the drawbacks of every current gold-standard evaluation, and that the figures do not fully represent the model's capabilities and potential.
We thus advise caution when reading and interpreting the results.

A full list of results compared to other baselines, a discussion of the model's performance across tasks and its implications, and details regarding problem-solving with task implementation will soon be available in the technical report.

All results reported below are on a 0-shot setting.

#### Spanish

<table><thead>
<tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead>
<tbody>
<tr> <td>Commonsense Reasoning</td> <td>xstorycloze_es</td> <td>acc</td> <td>73.13</td> </tr>
<tr> <td rowspan="2">NLI</td> <td>wnli_es</td> <td>acc</td> <td>60.56</td> </tr>
<tr> <td>xnli_es</td> <td>acc</td> <td>50.84</td> </tr>
<tr> <td>Paraphrasing</td> <td>paws_es</td> <td>acc</td> <td>60.75</td> </tr>
<tr> <td>QA</td> <td>xquad_es</td> <td>acc</td> <td>63.20</td> </tr>
<tr> <td>Translation</td> <td>flores_es</td> <td>bleu</td> <td>14.95</td> </tr>
</tbody> </table>

#### Catalan

<table><thead>
<tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead>
<tbody>
<tr> <td rowspan="2">Commonsense Reasoning</td> <td>copa_ca</td> <td>acc</td> <td>82.80</td> </tr>
<tr> <td>xstorycloze_ca</td> <td>acc</td> <td>73.73</td> </tr>
<tr> <td rowspan="2">NLI</td> <td>wnli_ca</td> <td>acc</td> <td>64.79</td> </tr>
<tr> <td>xnli_ca</td> <td>acc</td> <td>53.45</td> </tr>
<tr> <td rowspan="2">Paraphrasing</td> <td>parafraseja</td> <td>acc</td> <td>64.15</td> </tr>
<tr> <td>paws_ca</td> <td>acc</td> <td>64.35</td> </tr>
<tr> <td rowspan="5">QA</td> <td>arc_ca_easy</td> <td>acc</td> <td>73.57</td> </tr>
<tr> <td>arc_ca_challenge</td> <td>acc</td> <td>45.90</td> </tr>
<tr> <td>openbookqa_ca</td> <td>acc</td> <td>40.60</td> </tr>
<tr> <td>piqa_ca</td> <td>acc</td> <td>73.39</td> </tr>
<tr> <td>siqa_ca</td> <td>acc</td> <td>51.84</td> </tr>
<tr> <td>Translation</td> <td>flores_ca</td> <td>bleu</td> <td>20.49</td> </tr>
</tbody></table>

#### Basque

<table><thead>
<tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead>
<tbody>
<tr> <td rowspan="2">Commonsense Reasoning</td> <td>xcopa_eu</td> <td>acc</td> <td>67.80</td> </tr>
<tr> <td>xstorycloze_eu</td> <td>acc</td> <td>65.06</td> </tr>
<tr> <td rowspan="2">NLI</td> <td>wnli_eu</td> <td>acc</td> <td>56.34</td> </tr>
<tr> <td>xnli_eu</td> <td>acc</td> <td>47.34</td> </tr>
<tr> <td rowspan="3">QA</td> <td>eus_exams</td> <td>acc</td> <td>45.98</td> </tr>
<tr> <td>eus_proficiency</td> <td>acc</td> <td>43.92</td> </tr>
<tr> <td>eus_trivia</td> <td>acc</td> <td>50.38</td> </tr>
<tr> <td>Reading Comprehension</td> <td>eus_reading</td> <td>acc</td> <td>48.01</td> </tr>
<tr> <td>Translation</td> <td>flores_eu</td> <td>bleu</td> <td>10.99</td> </tr>
</tbody></table>

#### Galician

<table><thead>
<tr> <th>Category</th> <th>Task</th> <th>Metric</th> <th>Result</th> </tr></thead>
<tbody>
<tr> <td rowspan="2">Paraphrasing</td> <td>parafrases_gl</td> <td>acc</td> <td>58.50</td> </tr>
<tr> <td>paws_gl</td> <td>acc</td> <td>62.45</td> </tr>
<tr> <td>QA</td> <td>openbookqa_gl</td> <td>acc</td> <td>37.20</td> </tr>
<tr> <td>Translation</td> <td>flores_gl</td> <td>bleu</td> <td>18.81</td> </tr>
</tbody> </table>
-->

### LLM-as-a-judge

We use [Prometheus-2 8x7B](https://huggingface.co/prometheus-eval/prometheus-8x7b-v2.0) as a judge to evaluate the responses of the model. Tasks are created from existing multilingual evaluation datasets covering the same categories as the ones measured in our gold-standard benchmarks.
We randomly select a subset of 250 instances per language from the `test` set of each source dataset. To evaluate the responses of our model, we use task-specific criteria developed in-house for the _LLM-judge_ to use. Each criterion is measured either as a 5-point Likert scale or as a binary task, depending on the idiosyncrasy of the task and criterion.

Prompts for each task are created in various ways to score the model's robustness in addition to these criteria. This is done by presenting the same source instance within three different prompts. We then calculate the variance between the scores assigned by the _LLM-judge_ to our model's responses to the three prompt styles and average it across all instances. Prompts are human-translated into all languages measured. We do not provide the _LLM-judge_ with a reference answer.
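The robustness score just described can be sketched as follows (a minimal illustration with made-up scores; we show the population variance here, since the exact variance flavor is not specified in this card):

```python
from statistics import mean, pvariance

# Sketch of the robustness score: for each instance we have three judge
# scores (one per prompt variant); the robustness score is the per-instance
# variance, averaged over all instances. Variable names are ours.
judge_scores = [
    [4, 4, 3],  # instance 1: scores for the three prompt styles
    [5, 5, 5],  # instance 2
    [2, 4, 3],  # instance 3
]

robustness = mean(pvariance(scores) for scores in judge_scores)
print(round(robustness, 3))  # closer to 0 = more consistent across prompts
```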
The _judge_ prompt we use during evaluation is the same one used to fine-tune the Prometheus-2 family. We keep the _judge_ prompt and criteria used to present the _LLM-judge_ with the task prompts and model responses in English for evaluation across languages. The _judge_ prompt used is:

```python
"You are a fair judge assistant tasked with providing clear, objective feedback based on specific criteria, ensuring each assessment reflects the absolute standards set for performance.

###Task Description:
An instruction (might include an Input inside it), a response to evaluate, and a score rubric representing a evaluation criteria are given.
1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general.
2. After writing a feedback, write a score that is an integer between {a} and {b}. You should refer to the score rubric.
3. The output format should look as follows: \"Feedback: (write a feedback for criteria) [RESULT] (an integer number between {a} and {b})\"
4. Please do not generate any other opening, closing, and explanations.

###The instruction to evaluate:
{input}

###Response to evaluate:
{prediction}

###Score Rubrics:
{criteria}

###Feedback:"
```

As an example, prompts for the Math task in English are based on instances from [MGSM](https://huggingface.co/datasets/juletxara/mgsm), and each instance is presented within these prompts:

```python
"en": [
    ("I need help with this math problem: \"", "\" Give me the answer step by step and also the final result separately."),
    ("Can you please help me answer this? \"", "\" Explain the answer and give me the final result as well. Thanks."),
    ("Help me with this problem: \"", "\" I need the answer explained and the final result separately.")
]
```

This task is then evaluated by the _LLM-judge_ using two criteria, reasoning capability (5-point Likert) and mathematical correctness (binary):

```python
reasoning_capability_criteria = {
    "reasoning_capability": """
[Does the model's answer demonstrate reasoning capability?]
Score 1: The answer demonstrates poor reasoning, with illogical arguments or conclusions that do not follow from the provided information.
Score 2: The answer shows weak reasoning, with some logical connections but also contains significant flaws or gaps in the argumentation.
Score 3: The answer demonstrates adequate reasoning, with generally logical arguments, but may have minor flaws or a lack of depth in the reasoning process.
Score 4: The answer shows strong reasoning, with well-structured arguments and conclusions that logically follow from the information provided.
Score 5: The answer demonstrates exceptional reasoning, with clear, coherent, and insightful arguments that are logically sound and well-supported by the information provided."""
}

mathematical_correctness_binary_criteria = {
    "mathematical_correctness_binary": """
[Is the model's answer mathematically correct?]
Score 0: The answer contains mathematical errors that render the solution incorrect or unreliable.
Score 1: The answer is mathematically correct, with accurate calculations and appropriate use of mathematical concepts."""
}
```

#### Multilingual results

Here, we present results for seven categories of tasks in Spanish, Catalan, Basque, Galician, and English. Results are presented for each task, criterion and language. Criteria with a `(B)` after their name are binary criteria (i.e., numbers go from 0 to 1, where 1 is best). The rest of the criteria are measured using a 5-point Likert scale, where 5 is best. The first number of the pair of numbers separated by `/` shows the average score for the criterion (and language). The second number of each pair is the robustness score, where numbers closer to 0 mean that the model generates similar responses when comparing the three prompt varieties for a single instance.

Further details on all tasks and criteria, a full list of results compared to other baselines, a discussion of the model's performance across tasks and its implications, and details regarding problem-solving with task implementation will soon be available in the technical report.
<table class="tg"><thead>
<tr>
<th class="tg-0pky"><span style="font-weight:bold">Category</span></th>
<th class="tg-0pky"><span style="font-weight:bold">Dataset</span></th>
<th class="tg-0pky"><span style="font-weight:bold">Criteria</span></th>
<th class="tg-0pky"><span style="font-weight:bold">es</span></th>
<th class="tg-0pky"><span style="font-weight:bold">ca</span></th>
<th class="tg-0pky"><span style="font-weight:bold">gl</span></th>
<th class="tg-0pky"><span style="font-weight:bold">eu</span></th>
<th class="tg-0pky"><span style="font-weight:bold">en</span></th>
</tr></thead>
<tbody>
<tr> <td class="tg-0pky">Commonsense Reasoning</td> <td class="tg-0pky">XStoryCloze</td> <td class="tg-0pky">Ending coherence</td> <td class="tg-0pky">3.24/0.63</td> <td class="tg-0pky">3.12/0.51</td> <td class="tg-0pky">2.87/0.59</td> <td class="tg-0pky">2.16/0.52</td> <td class="tg-0pky">3.71/0.50</td> </tr>
<tr> <td class="tg-0pky" rowspan="3">Paraphrasing</td> <td class="tg-0pky" rowspan="3">PAWS</td> <td class="tg-0pky">Completeness `(B)`</td> <td class="tg-0pky">0.86/0.07</td> <td class="tg-0pky">0.82/0.09</td> <td class="tg-0pky">0.78/0.10</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">0.92/0.05</td> </tr>
<tr> <td class="tg-0pky">Paraphrase generation</td> <td class="tg-0pky">3.81/0.54</td> <td class="tg-0pky">3.67/0.55</td> <td class="tg-0pky">3.56/0.57</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">3.98/0.37</td> </tr>
<tr> <td class="tg-0pky">Grammatical correctness `(B)`</td> <td class="tg-0pky">0.93/0.03</td> <td class="tg-0pky">0.92/0.05</td> <td class="tg-0pky">0.89/0.06</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">0.96/0.03</td> </tr>
<tr> <td class="tg-0pky" rowspan="2">Reading Comprehension</td> <td class="tg-0pky" rowspan="2">Belebele</td> <td class="tg-0pky">Passage comprehension</td> <td class="tg-0pky">3.43/0.43</td> <td class="tg-0pky">3.28/0.50</td> <td class="tg-0pky">3.02/0.56</td> <td class="tg-0pky">2.61/0.43</td> <td class="tg-0pky">3.43/0.58</td> </tr>
<tr> <td class="tg-0pky">Answer relevance `(B)`</td> <td class="tg-0pky">0.86/0.05</td> <td class="tg-0pky">0.84/0.05</td> <td class="tg-0pky">0.75/0.08</td> <td class="tg-0pky">0.65/0.11</td> <td class="tg-0pky">0.83/0.06</td> </tr>
<tr> <td class="tg-0pky" rowspan="2">Extreme Summarization</td> <td class="tg-0pky" rowspan="2">XLSum &amp; caBreu &amp; summarization_gl</td> <td class="tg-0pky">Informativeness</td> <td class="tg-0pky">3.37/0.34</td> <td class="tg-0pky">3.57/0.31</td> <td class="tg-0pky">3.40/0.31</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">3.32/0.26</td> </tr>
<tr> <td class="tg-0pky">Conciseness</td> <td class="tg-0pky">3.06/0.34</td> <td class="tg-0pky">2.88/0.50</td> <td class="tg-0pky">3.09/0.38</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">3.32/0.22</td> </tr>
<tr> <td class="tg-0pky" rowspan="2">Math</td> <td class="tg-0pky" rowspan="2">MGSM</td> <td class="tg-0pky">Reasoning capability</td> <td class="tg-0pky">3.29/0.72</td> <td class="tg-0pky">3.16/0.65</td> <td class="tg-0pky">3.33/0.60</td> <td class="tg-0pky">2.56/0.52</td> <td class="tg-0pky">3.35/0.65</td> </tr>
<tr> <td class="tg-0pky">Mathematical correctness `(B)`</td> <td class="tg-0pky">0.68/0.12</td> <td class="tg-0pky">0.65/0.13</td> <td class="tg-0pky">0.73/0.11</td> <td class="tg-0pky">0.59/0.13</td> <td class="tg-0pky">0.67/0.12</td> </tr>
<tr> <td class="tg-0pky" rowspan="2">Translation from Language</td> <td class="tg-0pky" rowspan="2">FLORES-200</td> <td class="tg-0pky">Fluency</td> <td class="tg-0pky">3.95/0.11</td> <td class="tg-0pky">3.88/0.15</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">3.92/0.14</td> </tr>
<tr> <td class="tg-0pky">Accuracy</td> <td class="tg-0pky">4.22/0.15</td> <td class="tg-0pky">4.25/0.21</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">4.25/0.23</td> </tr>
<tr> <td class="tg-0pky" rowspan="2">Translation to Language</td> <td class="tg-0pky" rowspan="2">FLORES-200</td> <td class="tg-0pky">Fluency</td> <td class="tg-0pky">3.92/0.11</td> <td class="tg-0pky">3.84/0.14</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">4.19/0.14</td> </tr>
<tr> <td class="tg-0pky">Accuracy</td> <td class="tg-0pky">4.31/0.16</td> <td class="tg-0pky">4.18/0.20</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">-- / --</td> <td class="tg-0pky">4.63/0.15</td> </tr>
</tbody></table>

---

## Ethical Considerations and Limitations

We examine the presence of undesired societal and cognitive biases present in this model using different benchmarks. For societal biases, we test performance using the BBQ dataset (Parrish et al., 2022) in the original English and the Regard dataset (Sheng et al., 2019). We report that while performance is high (accuracies around 0.8 depending on the social category) in disambiguated settings, the model performs very poorly in ambiguous settings, which indicates the presence of societal biases that need to be further addressed in post-training phases.

Our cognitive bias analysis focuses on positional effects in 0-shot settings, and majority class bias in few-shot settings. For positional effects, we leverage the ARC Multiple Choice Question dataset (Clark et al., 2018). We observe significant, but relatively weak primacy effects, whereby the model shows a preference for answers towards the beginning of the list of provided answers.
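A minimal sketch of how such a primacy effect can be probed (not our exact evaluation code; `ask_model` is a hypothetical stand-in for querying the model):

```python
# Present the same multiple-choice question with the options rotated, and
# count how often the model picks the first-listed option. A rate well above
# 1/len(options) suggests a primacy bias.
def first_position_rate(question, options, ask_model):
    picks_first = 0
    rotations = [options[i:] + options[:i] for i in range(len(options))]
    for opts in rotations:
        answer = ask_model(question, opts)  # returns the chosen option text
        if answer == opts[0]:
            picks_first += 1
    return picks_first / len(rotations)
```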
We measure majority class effects in few-shot settings using SST-2 (Socher et al., 2013). We again detect significant effects, though with a small effect size. This suggests that the model is relatively robust against the examined cognitive biases.

We highlight that our analyses of these biases are by no means exhaustive and are limited by the relative scarcity of adequate resources in all languages present in the training data. We aim to gradually extend and expand our analyses in future work.

These results can be expected from a model that has undergone only preliminary instruction tuning. These tests are performed in order to show the biases the model may contain. We urge developers to take them into account and perform safety testing and tuning tailored to their specific applications of the model.

---

## Additional information

### Author
The Language Technologies Unit from Barcelona Supercomputing Center.

### Contact
For further information, please send an email to <[email protected]>.

### Copyright
Copyright(c) 2024 by Language Technologies Unit, Barcelona Supercomputing Center.

### Funding
This work has been promoted and financed by the Government of Catalonia through the [Aina Project](https://projecteaina.cat/).

This work is funded by the _Ministerio para la Transformación Digital y de la Función Pública_ - Funded by EU – NextGenerationEU within the framework of [ILENIA Project](https://proyectoilenia.es/) with reference 2022/TL22/00215337.

### Acknowledgements

This project has benefited from the contributions of numerous teams and institutions, mainly through data contributions, knowledge transfer or technical support.

In Catalonia, many institutions have been involved in the project. Our thanks to Òmnium Cultural, Parlament de Catalunya, Institut d'Estudis Aranesos, Racó Català, Vilaweb, ACN, Nació Digital, El món and Aquí Berguedà.

At the national level, we are especially grateful to our ILENIA project partners: CENID, HiTZ and CiTIUS for their participation. We also extend our genuine gratitude to the Spanish Senate and Congress, Fundación Dialnet, and the ‘Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (SIANI)’ of the University of Las Palmas de Gran Canaria.

At the international level, we thank the Welsh government, DFKI, Occiglot project, especially Malte Ostendorff, and The Common Crawl Foundation, especially Pedro Ortiz, for their collaboration.

We would also like to give special thanks to the NVIDIA team, with whom we have met regularly, especially to Ignacio Sarasua, Adam Henryk Grzywaczewski, Oleg Sudakov, Sergio Perez, Miguel Martinez, Felipe Soares and Meriem Bendris. Their constant support has been especially appreciated throughout the entire process. Their valuable efforts have been instrumental in the development of this work.

### Disclaimer
Be aware that the model may contain biases or other unintended distortions. When third parties deploy systems or provide services based on this model, or use the model themselves, they bear the responsibility for mitigating any associated risks and ensuring compliance with applicable regulations, including those governing the use of Artificial Intelligence.

The Barcelona Supercomputing Center, as the owner and creator of the model, shall not be held liable for any outcomes resulting from third-party use.
### Citation

```
@misc{gonzalezagirre2025salamandratechnicalreport,
      title={Salamandra Technical Report},
      author={Aitor Gonzalez-Agirre and Marc Pàmies and Joan Llop and Irene Baucells and Severino Da Dalt and Daniel Tamayo and José Javier Saiz and Ferran Espuña and Jaume Prats and Javier Aula-Blasco and Mario Mina and Adrián Rubio and Alexander Shvets and Anna Sallés and Iñaki Lacunza and Iñigo Pikabea and Jorge Palomar and Júlia Falcão and Lucía Tormo and Luis Vasquez-Reina and Montserrat Marimon and Valle Ruíz-Fernández and Marta Villegas},
      year={2025},
      eprint={2502.08489},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2502.08489},
}
```

### License

[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

## Model Index

|Model|Base|Instruct|
|:---:|:---:|:---:|
|2B| [Link](https://huggingface.co/BSC-LT/salamandra-2b) | [Link](https://huggingface.co/BSC-LT/salamandra-2b-instruct) |
|7B| [Link](https://huggingface.co/BSC-LT/salamandra-7b) | [Link](https://huggingface.co/BSC-LT/salamandra-7b-instruct) |
|40B| [Link](https://huggingface.co/BSC-LT/ALIA-40b) | WiP |
[ "QUESTION_ANSWERING", "TRANSLATION", "SUMMARIZATION", "PARAPHRASING" ]
[ "BEAR", "SCIELO" ]
Non_BioNLP
BSC-NLP4BIA/bsc-bio-ehr-es-meddocan
BSC-NLP4BIA
token-classification
[ "transformers", "pytorch", "roberta", "token-classification", "biomedical", "clinical", "EHR", "spanish", "anonymization", "es", "base_model:PlanTL-GOB-ES/bsc-bio-ehr-es", "base_model:finetune:PlanTL-GOB-ES/bsc-bio-ehr-es", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
1,717
1,722
15
0
--- base_model: - PlanTL-GOB-ES/bsc-bio-ehr-es language: - es license: apache-2.0 metrics: - precision - recall - f1 tags: - biomedical - clinical - EHR - spanish - anonymization widget: - text: El diagnóstico definitivo de nuestro paciente fue de un Adenocarcinoma de pulmón cT2a cN3 cM1a Estadio IV (por una única lesión pulmonar contralateral) PD-L1 90%, EGFR negativo, ALK negativo y ROS-1 negativo. - text: Durante el ingreso se realiza una TC, observándose un nódulo pulmonar en el LII y una masa renal derecha indeterminada. Se realiza punción biopsia del nódulo pulmonar, con hallazgos altamente sospechosos de carcinoma. - text: Trombosis paraneoplásica con sospecha de hepatocarcinoma por imagen, sobre hígado cirrótico, en paciente con índice Child-Pugh B. model-index: - name: BSC-NLP4BIA/bsc-bio-ehr-es-meddocan results: - task: type: token-classification dataset: name: MEDDOCAN type: MEDDOCAN metrics: - type: precision value: 0.95 name: precision (micro) - type: recall value: 0.972 name: recall (micro) - type: f1 value: 0.961 name: f1 (micro) - task: type: token-classification dataset: name: carmen-anonymization type: carmen-anonymization metrics: - type: precision value: 0.804 name: precision (micro) - type: recall value: 0.818 name: recall (micro) - type: f1 value: 0.811 name: f1 (micro) --- # Spanish RoBERTa-base biomedical model finetuned for the Named Entity Recognition (NER) task on the MEDDOCAN dataset. ## Table of contents <details> <summary>Click to expand</summary> - [Model description](#model-description) - [Intended uses and limitations](#intended-uses-and-limitations) - [How to use](#how-to-use) - [Limitations and bias](#limitations-and-bias) - [Training](#training) - [Evaluation](#evaluation) - [Additional information](#additional-information) - [Authors](#authors) - [Contact information](#contact-information) - [Licensing information](#licensing-information) - [Funding](#funding) - [Citing information](#citing-information) - [Disclaimer](#disclaimer) </details> ## Model description A version of the [bsc-bio-ehr-es](https://huggingface.co/PlanTL-GOB-ES/bsc-bio-ehr-es) model fine-tuned on the [MEDDOCAN](https://zenodo.org/records/4279323) corpus. For further information, check the [official website](https://temu.bsc.es/meddocan/). ## Intended uses and limitations TBD ## How to use ⚠ We recommend pre-tokenizing the input text into words instead of providing it directly to the model, as this is how the model was trained. Otherwise, results and performance may be affected. A usage example can be found [here](https://github.com/nlp4bia-bsc/hugging-face-pipeline/blob/main/simple_inference.ipynb); a minimal inference sketch is also included after this card. ## Limitations and bias At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. ## Training The model was trained using the Barcelona Supercomputing Center infrastructure. ## Evaluation Micro-averaged F1 score: 0.961 on MEDDOCAN and 0.811 on CARMEN-I anonymization. ## Additional information ### Authors NLP4BIA team at the Barcelona Supercomputing Center ([email protected]). 
### Contact information jan.rodriguez [at] bsc.es ### Licensing information [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) ### Funding This project was partially funded by the Spanish Plan for the Advancement of Language Technology (Plan TL) in collaboration with the Barcelona Supercomputing Center (BSC) and the Hospital Clinic de Barcelona (HCB). On the BSC's side, we acknowledge additional funding by the Spanish National AI4ProfHealth project (PID2020-119266RA-I00 MICIU/AEI/10.13039/501100011033) and EU Horizon projects (AI4HF 101080430 and DataTools4Heart 101057849). On the HCB's side, the project was also supported by the Instituto de Salud Carlos III (ISCIII). ### Citing information Please cite the following works: ``` @inproceedings{meddocan, title={{Automatic De-identification of Medical Texts in Spanish: the MEDDOCAN Track, Corpus, Guidelines, Methods and Evaluation of Results}}, author={Marimon, Montserrat and Gonzalez-Agirre, Aitor and Intxaurrondo, Ander and Villegas, Marta and Krallinger, Martin}, booktitle={Proceedings of the Iberian Languages Evaluation Forum (IberLEF 2019)}, year={2019} } @misc{carmen_physionet, author = {Farre Maduell, Eulalia and Lima-Lopez, Salvador and Frid, Santiago Andres and Conesa, Artur and Asensio, Elisa and Lopez-Rueda, Antonio and Arino, Helena and Calvo, Elena and Bertran, Maria Jesús and Marcos, Maria Angeles and Nofre Maiz, Montserrat and Tañá Velasco, Laura and Marti, Antonia and Farreres, Ricardo and Pastor, Xavier and Borrat Frigola, Xavier and Krallinger, Martin}, title = {{CARMEN-I: A resource of anonymized electronic health records in Spanish and Catalan for training and testing NLP tools (version 1.0.1)}}, year = {2024}, publisher = {PhysioNet}, url = {https://doi.org/10.13026/x7ed-9r91} } @article{physionet, author = {Ary L. Goldberger and Luis A. N. Amaral and Leon Glass and Jeffrey M. Hausdorff and Plamen Ch. Ivanov and Roger G. Mark and Joseph E. Mietus and George B. Moody and Chung-Kang Peng and H. Eugene Stanley}, title = {PhysioBank, PhysioToolkit, and PhysioNet}, journal = {Circulation}, volume = {101}, number = {23}, pages = {e215-e220}, year = {2000}, doi = {10.1161/01.CIR.101.23.e215}, url = {https://www.ahajournals.org/doi/abs/10.1161/01.CIR.101.23.e215} } ``` ### Disclaimer The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. --- Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
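Since the card above recommends pre-tokenized input but links out for the full example, here is a minimal inference sketch of that workflow using the standard transformers token-classification API. The whitespace word split and the example sentence are illustrative assumptions; the notebook linked in the card remains the authoritative reference.

```python
# Minimal sketch: MEDDOCAN NER inference with pre-tokenized (word-level) input.
# The whitespace split and the example sentence are illustrative; see the
# notebook linked in the card for the authors' full pipeline.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_name = "BSC-NLP4BIA/bsc-bio-ehr-es-meddocan"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

words = "Paciente de 45 años atendido en el Hospital Clinic de Barcelona .".split()

# is_split_into_words=True tells the tokenizer the input is already word-level.
inputs = tokenizer(words, is_split_into_words=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()

# Map sub-token predictions back to words, keeping each word's first sub-token.
seen = set()
for position, word_id in enumerate(inputs.word_ids(batch_index=0)):
    if word_id is None or word_id in seen:
        continue
    seen.add(word_id)
    print(f"{words[word_id]}\t{model.config.id2label[pred_ids[position]]}")
```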
[ "NAMED_ENTITY_RECOGNITION" ]
[ "MEDDOCAN" ]
BioNLP
samchain/econo-sentence-v2
samchain
sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:45044", "loss:CoSENTLoss", "economics", "finance", "en", "dataset:samchain/econo-pairs-v2", "arxiv:1908.10084", "base_model:samchain/EconoBert", "base_model:finetune:samchain/EconoBert", "license:apache-2.0", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
1,739
1,739
64
2
--- base_model: samchain/EconoBert datasets: - samchain/econo-pairs-v2 language: - en library_name: sentence-transformers license: apache-2.0 metrics: - pearson_cosine - spearman_cosine pipeline_tag: sentence-similarity tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:45044 - loss:CoSENTLoss - economics - finance widget: - source_sentence: a consumer protection point of view, including through remuneration arrangements. failure to manage conduct risks can expose a financial institution to a variety of other risks which, if not managed properly, can threaten its solvency and sustainability. the regulatory regime for market conduct therefore provides a framework for the identification and management of conduct risk as a complementary framework to prudential regulation. this is part of the motivation for rigorous coordination and cooperation arrangements between the pa and the fsca envisaged by the financial sector regulation bill aimed at ensuring that all risks are holistically managed within an overarching financial stability policy framework. strengthening conduct in wholesale otc markets another initiative has been the development of a code of conduct for south african wholesale over - the - counter ( otc ) financial markets in a collaborative effort between regulators and key market participants. i will return to this aspect later. against the background of various investigations undertaken in many foreign jurisdictions in relation to foreign exchange market manipulation, the sarb and the fsb launched a review bis central bankers ’ speeches of the foreign exchange trading operations of south african authorised dealers in october 2014. as the rand is a globally traded currency, the aim of the review was to establish whether there may have been any spillover into our markets in relation to any misconduct or malpractice. it is important to note that, unlike in other jurisdictions, the south african review was not informed by whistle - blowing or any allegations – or indeed concrete evidence – of any misconduct. we had no evidence of widespread malpractice in the south african foreign exchange market but felt that, given the broad - based nature of investigations in other jurisdictions ( which also involved trading in emerging market currencies ), it would be prudent to obtain comfort that our foreign exchange trading practices were in line with best practice. it was therefore a proactive step on the part of south african regulators. the foreign exchange review committee established for this purpose – chaired by former senior deputy governor james cross, who is with us today – released its report in october 2015. the committee reported that it had found no evidence of manipulation or serious misconduct in the domestic foreign exchange market during the period covered by the review, but that there was scope for improvement in relation to governance and conduct. the committee also recommended that legislation be enhanced to give market conduct regulators wider powers to strengthen enforcement. south african regulators are in conversation with each other on how best to give effect to the implementation of the recommendations. 
there was also the recommendation sentences: - 'luigi federico signorini : g20 sustainable finance working group private sector roundtable welcome address by mr luigi federico signorini, senior deputy governor of the bank of italy, at the g20 sustainable finance working group private sector roundtable, online event, 17 may 2021. * * * welcome, and a good day to you all. i am happy to open the private sector roundtable, an event promoted by the g20 presidency and by the chinese and american co - chairs of the sustainable finance working group. the roundtable will focus on the role of finance in helping fight climate change and promoting lowcarbon transition. the g20 finance ministers and central bank governors recently recognised the need to ‘ shape the current economic recovery by investing in innovative technologies and promoting just transitions toward more sustainable economies and societies ’. low - carbon transition is urgent and must be accelerated : the later we act, the greater the costs. it requires an unprecedented and unremitting effort. while quantitative estimates vary, the investments needed for transition are certainly huge ; they need to be sustained for a long time. governments have a central role in that they need to point the way by adopting an appropriate policy framework. a clear and credible path for government regulatory and fiscal action is also a prerequisite for efficient choices on the part of private finance. indeed, while many governments will directly invest their own money in many countries and mdbs will play their part, it is likely that the private sector will be called upon to finance most transition investment. there will be no transition without a general awareness of the need for it and a willingness, even a desire, to finance it. there are in fact quite a few encouraging signs. since last year, we have seen an explosion of ‘ net - zero commitments ’ in the private sector — though such commitments ( i am told ) are still confined to one sixth of publicly listed companies globally. at the same time, the appetite of ultimate investors and asset managers for ‘ green ’ investment is growing fast. i am sure many in the audience will have a clear perception of this fact. however, the path is still fraught with difficulties. on the market side, while sustainable finance is increasingly popular, it suffers from a lack of clear definitions and standards. ‘ greenwashing ’ is a danger ; good data, an agreed taxonomy, and adequate company disclosure are necessary. global consistency is important, as fragmentation of standards across jurisdictions is confusing for investors and costly for companies. standards are currently being drafted' - later, by more. so we raised interest rates through the second half of last year - and again in june - trying, as best we could through our tactics, to minimise any further unwanted upward pressure on sterling. but things have now clearly moved on. the outlook for the world economy deteriorated further through the summer under the impact of a series of new shocks. japan, the world ’ s second largest economy, slipped further into recession. russia - which had only weeks earlier embarked on an imf program - saw the collapse of the rouble and default on its debt. and acute nervousness spread through many of the world ’ s financial markets. 
although there has been some improvement in sentiment over the past month or two, and although the us and european economies continue to expand, the likelihood remains that world economic growth will be significantly slower than had been expected earlier in the summer. slower growth of world activity is bound to prolong the restraining external effect on growth and inflation in the uk, even though the exchange rate has now started to weaken. at the same time there are also now clearer signs of overall slowdown in our own economy. the evidence for this is less obvious in the backwards - looking economic and monetary data than it is in the forward - looking surveys, but even so the data suggest that we are beginning to see an easing of pressure, including an easing of pressure in the labour market. and the surveys themselves now point to a slowdown in service sector growth, including retail distribution, as well as a sharper decline in manufacturing output. this prospect is consistent with the reports which we receive directly from the bank ’ s network of regional agents and their 7000 - odd industrial and commercial contacts around the country. of course we pay very careful attention to this forward - looking evidence of developments in the economy alongside the data, and, like others, we have revised down our forecasts for output growth and inflation. and we have eased monetary policy quite sharply in the past two months, in the light of that evidence. our current best guess - published in last week ’ s inflation report is that, after the interest rate cuts, the growth of overall output next year will be around 1 %, picking up through the millennium to around trend in the second half of the year 2000. meanwhile, we expect underlying inflation to remain close to the target rate of 21 / 2 % - though perhaps a little above that rate during the course of next year. now no - one likes to see the - 'daniel mminele : conduct and culture in the banking and financial sectors opening address by mr daniel mminele, deputy governor of the south african reserve bank, at the g - 30 forum on banking conduct and culture, pretoria, 18 february 2016. * * * governor kganyago ( sarb ), governor sithole ( central bank of swaziland ), deputy governor mlambo ( reserve bank of zimbabwe ), deputy governors, groepe and naidoo ( sarb ), second deputy governor sullivan ( central bank of seychelles ), sir david walker ( vice chair of the group of thirty steering committee ), dr stuart mackintosh ( executive director of the g30 ), ms maria ramos ( chief executive officer of barclays africa ), the leadership of banks and other financial institutions, panel members, and esteemed delegates. it is a privilege and an honour for me to welcome you, on behalf of south african reserve bank ( sarb ), to this forum on banking conduct and culture, which we are co - hosting with the g30 and barclays africa. the g - 30 has, over the years, played a significant role in bringing together members of the banking, financial and regulatory community to discuss issues of common concern and examine the choices available to market practitioners and policymakers. given the enormous trust deficit that has built up since the global financial crisis, the topic of conduct and culture is of great importance and highly relevant to the global banking and financial sector. bankers have always had a delicate relationship with the societies they serve. 
it would appear that, at any point in time, it is almost a national sport across the globe to take a swipe at bankers. mark twain famously said : “ a banker is a fellow who lends you his umbrella when the sun is shining and wants it back the minute it begins to rain. ” and j m keynes asked : “ how long will it take to pay city men so entirely out of proportion to what other servants of society commonly received for performing social services not less useful or difficult? ” we would be terribly misguided to treat the current wave of discontent and deep mistrust as just another wave that will eventually subside. the most recent global financial crisis, from which almost nine years later we are still struggling to recover, shook the very foundations of our financial system and almost caused its total meltdown. in a nutshell, the crisis was about failures in conduct, culture, and supervisory practices. the consequences' - source_sentence: of our central bank distribution scheme have grown in number to embrace the physical quality of notes in circulation and the denominational mix. the last of these – denominational mix – poses the biggest challenge, but in the uk i believe we now have evidence to support the business case for atm and retailer dispense of £5s. and, finally, i am sometimes asked why this matters to the bank of england? after all, i can assure you that in the current financial conditions we are not short of difficult challenges. the answer is simple. we should not forget that our job is to ensure that confidence in our currency is maintained – and crucial to that is satisfying the public ’ s demand for our notes. sentences: - '##factory ". but the two clouds are moving towards us. so let us examine them. the outlook for the world economy has deteriorated markedly in recent months as a result of the sudden slowdown in the united states and signs of renewed stagnation in japan. the speed of the deterioration was a surprise, but not the fact of a slowdown. last year saw the fastest growth rate of the world economy for twelve years. growth in the us reached an annual rate of 6 % in the second quarter of last year, well above even optimistic estimates of sustainable growth rates. a slowdown was not only inevitable ; it was desirable. the main surprise in the us was the sharp and sudden break in both business and consumer confidence. us manufacturers''optimism is now almost as low as it was in 1991 - the last time the us economy experienced a recession. consumer confidence also fell sharply in january, driven by marked pessimism over the short - term future. quite why this break in confidence should have been so rapid is not easy to understand. and its origins will largely determine the nature of the us downturn. on the one hand, greater use of information technology to economise on inventories may have led to shorter lags between changes in final demand and changes in output. if so, then it is possible that the speed of the downturn will be matched by the speed of the recovery, leading to a short - lived episode of output growth close to zero as the result of an inventory correction. on the other hand, the slowdown could be much more protracted if the imbalances in the us economy which have built up in recent years start to unwind, leading to a reduction in spending as both households and businesses seek to reduce the amount of outstanding debt on their balance sheets. the key to the nature of the us downturn is what will happen to productivity growth. 
over the past five years there has been accumulating evidence that the application of information technology has raised productivity growth in the us economy - the " new economy ". expectations of higher productivity growth increased demand by more than it raised supply initially, as firms invested in new technology and households anticipated higher future incomes. as a result, spending grew rapidly, outstripping supply and large imbalances emerged. the current account deficit in the us is now close to 5 % of gdp, a post - war record. it is sustainable as long as foreigners are prepared to finance it. so far, the profitability of' - 83 6. 06 27. 05 39. 42 37. 59 kccs ( no. in lakh ) 243. 07 271. 12 302. 35 337. 87 82. 43 gcc ( no. in lakh ) 13. 87 16. 99 21. 08 36. 29 22. 28 bc - ict accounts ( no. in lakh ) 132. 65 316. 30 573. 01 810. 38 677. 73 ict accounts - bc - total transactions ( no. in lakh ) 265. 15 841. 64 1410. 93 2546. 51 4799. 08 bis central bankers ’ speeches - core business and therefore set out to develop the expertise required to finance the smes successfully. fairly often the discussion on sme financing is reduced to two diametrically opposed positions. on one hand, smes are considered by banks as representing a high risk and therefore, should be avoided or only dealt with cautiously and at a premium price. on the other hand, banks are accused of being inflexible and risk averse and consequently irrelevant to the sector. it is important to understand where the truth lies in these two statements in order to advance the cause of sme financing. firstly, it is a fact that smes present higher credit risk than well - structured corporate entities. smes may not have proper accounting records, may have severe governance issues which undermine accountability, have poor access to markets, poor skill levels including financial illiteracy by promoters, lack collateral which the lender can rely on in the event of failure, may not even exist in an appropriate legal form and even the assessment of the viability of a project might be difficult. lending to smes can be a lenders nightmare for bankers. but it is also true that banks which are structured to deal with corporates are risk averse and inflexible when they deal with smes. often when they bring inappropriate risk assessment tools, they may focus too much on collateral rather than project viability. they may even regard sme financing as peripheral to their business. because of their limited knowledge of smes, they experience failure which itself reinforces the notion that smes are risky. what we want are financing institutions that are structured to respond to the unique characteristics of smes. specialised sme lending institutions are more likely to handle the risk problem presented by smes as a challenge to be overcome with appropriate products and credit risk management strategies and not as a basis for inaction or avoiding the sector altogether. in short, lending strategies which ensure success with corporates do not necessarily ensure similar success with smes. appropriate sme financing institutions must at the very least make lending to smes the core business. the second area of reflection that i would urge this meeting to consider is that of building appropriate financing models that have been shown to work in africa or other developing countries so that we all benefit from the best practices available. 
in south africa, for instance, franchising which allows the use of brand names and building capacity have been an bis central bankers ’ speeches important driver of sme financing since it reduces the perceived - source_sentence: and non - financial sectors take hold. these concerns are heightened by the recent upward shift in bond yields owing to the global " higher - for - longer " narrative and the flare - up of tensions in the middle east, which have added to the uncertainty surrounding the outlook. after a period of lower market volatility until august, the rising prospect of higher - forlonger rates has started to weigh on riskier asset valuations in recent months. risk sentiment in markets remains highly sensitive to further surprises in inflation and economic growth. higher than expected inflation or lower growth could trigger a rise in market volatility and risk premia, increasing the likelihood of credit events materialising. this brings me to the vulnerabilities in the non - bank financial sector. as regards credit risk, some non - banks remain heavily exposed to interest rate - sensitive sectors, such as highly indebted corporates and real estate. deteriorating corporate fundamentals and the ongoing correction in real estate markets could expose non - banks that have 2 / 4 bis - central bankers'speeches invested in these sectors to revaluation losses and investor outflows. furthermore, low levels of liquidity could expose investment funds to the potential risk of forced asset sales if macro - financial outcomes deteriorate. corporate profitability in the euro area has held up well, but higher interest rates are weighing on the debt servicing capacity of more vulnerable firms. a weakening economy could prove challenging for firms with high debt levels, subdued earnings and low interest coverage ratios. real estate firms are particularly vulnerable to losses stemming from the ongoing downturn in euro area commercial real estate markets. in an environment of tighter financing conditions and elevated uncertainty, real estate prices have declined markedly. the effects of higher interest rates have been compounded by structurally lower demand for some real estate assets following the pandemic. although banks'exposure to these markets is comparatively low, losses in this segment could act as an amplifying factor in the event of a wider shock. euro area households, especially those with lower incomes and in countries with mainly floatingrate mortgages, are being increasingly squeezed by the higher interest rates. tighter financing conditions have reduced the demand for housing, putting downward pressure on prices. on a more positive note, robust labour markets have so far supported household balance sheets, thereby mitigating the credit risk to banks. spreads in government bond markets have remained contained as many governments managed to secure cheap financing at longer maturities during the period of low interest rates. sentences: - market. and if we are to raise our national economic vantage point and take advantage of the momentum coming into 2013, thrift banks need to consider lending more to this economic segment. i say this because msmes provide great employment opportunities. a healthier msme sector will help ensure our economic growth is broad - based and inclusive. thrift banks, however, must not just lend more in terms of nominal amounts. the challenge to the industry really is to ensure that such lending continuously creates further opportunities. 
in this way, msmes can be assured of viability, regardless of mandated credit programs. final thoughts friends, you want this convention to be a discussion on broadening your horizon, our horizon. in trying to look ahead, i took us through the more scenic route of our recent past. the numbers tell us an encouraging story of growth and consistency of economic and financial stability. however, the paradox of financial stability is that we may just be at our weakest when we believe we are at our strongest position. complacency often sets in when positive news is continuous and this is reinforced by encouraging market parameters. if we fall into this trap of complacency, we risk losing our focus. for the thrift banking industry, the growth trajectory that many are anticipating suggests even better times ahead. it is therefore in our collective interest, if we wish to truly broaden our horizon, to agree on a common vision where thrift banks have a definitive role to play. bis central bankers ’ speeches in my remarks, i have suggested specific action points you may consider in order to accomplish just that, and over time broaden your horizon. i challenge the industry to 1 ) further improve credit underwriting standards for real estate and consumer loans and raise your vantage point by looking at your processes with the lens of financial stability, 2 ) interact with your clients and raise your vantage point by enhancing your practices with the heart of consumer protection and financial education, and 3 ) increase lending to msmes and raise your vantage point by lending with the mind to create greater value. but much more can be done. as the old board takes its place in ctb lore, your new board is now at the helm to guide the industry forward. it is never an easy task to move forward but rest assured that the bangko sentral ng pilipinas will be with you as we take that journey together into broader horizons. thank you very much and good day to all of you. bis central bankers ’ speeches - prices, which may also come under upward pressure owing to adverse weather events and the unfolding climate crisis more broadly. most measures of underlying inflation continue to decline. the eurostat's flash estimate for inflation excluding energy and food points to a further decline to 4. 2 % in october, supported by improving supply conditions, the pass - through of previous declines in energy prices, as well as the impact of tighter monetary policy on demand and corporate pricing power. at the same time, domestic price pressures are still strong and 1 / 4 bis - central bankers'speeches are being increasingly driven by wage pressures and the evolution of profit margins. while most measures of longer - term inflation expectations stand around 2 %, some indicators remain elevated and need to be monitored closely. the resilience of the labour market has been a bright spot for the euro area economy, but there are signs that the labour market is beginning to weaken. fewer new jobs are being created and, according to the latest flash estimate, employment expectations have continued to decline in october for both services and manufacturing. monetary policy based on our assessment of the inflation outlook, the dynamics of underlying inflation and the strength of monetary policy transmission, the governing council decided to keep the three key ecb interest rates unchanged at its october meeting. the incoming information has broadly confirmed our previous assessment of the medium - term inflation outlook. 
our past interest rate increases continue to be transmitted forcefully into financial and monetary conditions. banks'funding costs have continued to rise and are being passed on to businesses and households. the combination of higher borrowing rates and weakening activity led to a further sharp drop in credit demand in the third quarter of this year. and credit standards have tightened again. we are also seeing increasing signs of the impact of our policy decisions on the real economy. further tightening is still in the pipeline from the current policy stance, and it is set to further dampen demand and help push down inflation. we are determined to ensure that inflation returns to our 2 % medium - term target in a timely manner. based on our current assessment, we consider that the key ecb interest rates are at levels that, maintained for a sufficiently long duration, will make a substantial contribution to this goal. we will continue to follow a data - dependent approach to determining the appropriate level and duration of restriction. financial stability let me now turn to financial stability. in our upcoming financial stability review, we highlight that the financial stability outlook remains fragile as the gradual effects of tighter financial conditions on both the financial - the conservation of natural resources. it was only in the fitness of things that he delivered the inaugural address today. sir, we are overwhelmed by your insightful address and i am sure all of us have immensely benefited listening to his valuable views on a subject which is not only very dear to you but also so vital for the country ’ s economic growth. 4. i also note that the seminar includes several eminent speakers with vast experience on the subject. there are sessions on agricultural productivity, role of research and technology in improving agricultural productivity, linkages between productivity and farm income, incentivizing productivity enhancing investments, relation between credit growth and productivity growth and mitigation of risks in agriculture. in my address today, coming as do from the rbi, i would broadly focus on major trends / issues in agricultural productivity and credit, and the role of agricultural credit in improving agricultural productivity. i would also touch upon some of the steps that can yield results in the short - term. importance of agriculture 5. as you are aware, the recent indian growth story has been service - led. services sector has completely replaced agriculture, which was traditionally the largest contributor to india ’ s gdp. however, the fact that agriculture has the smallest share in gdp of only about 14 per cent today from a high of more than 50 per cent, does not belittle its importance for bis central bankers ’ speeches the indian economy. this is because first, as we all know, agriculture remains the largest employer having a share of around 60 per cent. secondly, it holds the key to creation of demand in other sectors and remains by far an important indirect contributor to india ’ s gdp growth. the agriculture sector needs to grow at least by 4 per cent for the economy to grow at 9 per cent. thus, though having a small share, the fluctuations in agricultural production can have large and significant impact on overall gdp growth. 
thirdly, since food is an important component in basket of commodities used for measuring consumer price indices, it is necessary that food prices are maintained at reasonable levels to ensure food security, especially for the deprived sections of our society. in fact, food security is emerging as an important policy concern, and the role of agriculture in ensuring equitable access to food has added a new perspective for policy makers. trends in agricultural productivity 6. i would like to discuss certain trends in agricultural productivity in india. as is wellknown, the year 1968 marked the beginning of a turning point in indian agriculture. the country - source_sentence: having a reserve currency may also be desirable for other reasons. first, with more reserve currencies available, portfolio diversification opportunities are enhanced. this is desirable because, all else equal, it would allow investors to move further out on the risk / return frontier. second, more currencies are likely to be close substitutes, which could dampen currency volatility. with many viable reserve currencies available, no particular one would necessarily have to bear the bulk of any adjustment. the dollar ’ s dominant reserve currency status has sometimes been referred to as the united states ’ “ exorbitant privilege, ” implying that the u. s. benefits extraordinarily from this privileged status. i ’ d argue that the situation is much more nuanced. yes, this status does allow the u. s. to benefit from seigniorage. more than half of all u. s. currency outstanding is held abroad. but, there are also costs of being the dominant reserve currency. for example, this can lead to shifts in the valuation of the dollar that are due primarily to developments abroad that affect risk appetites and international capital flows. in such cases, the dollar ’ s valuation can be pushed to levels inconsistent with u. s. economic fundamentals. for the united states, i believe that the most important goal must be to keep our own house in order. if we do this, then i expect that the u. s. dollar will earn the right to remain the most important reserve currency in the world. the united states has a number of advantages in sustaining the dominant reserve currency status of the u. s. dollar. first, there is a first - mover advantage. as the leading reserve currency in the world, there is no strong incentive for countries to move to other currencies as long as the dollar continues to have the attributes i discussed earlier. the history of reserve currency usage is characterized by considerable inertia. the u. s. dollar emerged as the leading reserve currency quite a bit after it became the world ’ s largest economy. typically, the loss of dominant reserve currency status requires either substantial economic decline or political instability that motivates foreign counterparties to shift to a new reserve currency. second, the u. s. has the deepest and most liquid capital markets in the world. this is important in making u. s. treasuries and agency mortgage - backed securities attractive holdings as part of countries ’ foreign exchange reserve portfolio sentences: - as those on student visas, are assumed to begin arriving early next year, subject to appropriate quarantine restrictions. but tourists and other short - term visitors will not be able to return until later. in the baseline and upside scenarios, we assume that the borders reopen in mid 2021. 
in the downside scenario, where the global spread of the virus does not subside as quickly, we assume the borders are closed for all of 2021. it is always possible to construct other scenarios. the near term would be stronger if there were a major medical breakthrough on treatments soon. an effective vaccine would take a bit longer to be distributed, so it would mainly affect outcomes next year and the year after. but it could also result in a stronger recovery than we have assumed even in the upside scenario presented here. a worse outcome than our downside could be conceivable if the virus cannot be contained and further waves of infection occur around the world for some years yet. there are also plenty of other risks that can be contemplated ; we discuss some of these in the statement. geopolitical tensions were an issue even before the coronavirus outbreak, and could escalate further. the pandemic has also in some places exacerbated domestic political tensions. it is hard to know how these tensions will play out over the next couple of years. if they escalate, it is possible that some countries'recoveries will be derailed. domestically, there are also a number of uncertainties that go beyond the direct effects of the virus and associated activity restrictions. for example, we have assumed that households and businesses are quite cautious in their use of the fiscal and other cash flow support they have been receiving. it is possible that people spend more out of that support than we are assuming. it is even possible that some people do more to make up for the consumption opportunities that were not available during periods of lockdown and other activity restrictions. on the downside, the longer the economy remains weak, the more the recovery will be impeded by scarring effects on workers, the destruction of business supply networks and other lingering damage. what has changed in the past three months the scenarios presented in the past few months. statement incorporate several lessons from the experience of the first, the economic contractions induced by health - related restrictions on activity in the june quarter were very large. however, they were not quite as severe as initially expected. the peak - to - trough declines in output and hours worked - '. norway was hit by oil price shocks in the following years and economic reforms were implemented in several areas. after a period it became clear that the central government budget would be in surplus and that transfers would be made to the fund. norway ’ s experience from its first 30 years as an oil - producing nation led to the introduction of the fiscal rule, which has been a key element of norwegian economic policy for the past decade. report no. 29 to the storting of 2001 laid down guidelines for the phasing - in of petroleum revenues into the norwegian economy, establishing two main principles : economic policy must contribute to stable economic developments and be sustainable over time. by linking petroleum revenue spending to the expected real return on the fund – and not to current petroleum revenues – the fiscal rule provided for a gradual and sustainable phasingin of the revenues. if we can restrict spending to the return, the fund will never shrink. norway will also be less vulnerable to fluctuations in current petroleum revenues. 
( chart 8 : effect on the size of gpfg of change in oil price and return ) in the first ten years after the establishment of the fiscal rule, the pace of the fund ’ s growth was determined by oil prices. prospects in 2001 indicated that a 25 percent higher oil price over a 10 - year period would increase the size of the fund by almost nok 800 billion. uncertainty with regard to the return on the fund was far less important for growth. this situation will change as the fund grows and oil and gas production declines. from 2020 to 2030, the oil price will play a less prominent role for the size of the fund, while the return on the fund will be all the more important. in the initial years, the actual return, adjusted for inflation and costs, was close to 4 percent. in recent years, with the financial crisis and the sovereign debt crisis, the real return has been lower, averaging about 2½ percent since 1998. source : lie, e. and c. venneslan : over evne. finansdepartementet 1965 – 1992 [ beyond our power. ministry of finance 1965 – 1992 ]. pax forlag, 2010. see report no. 25 to the storting ( 1973 - 74 ) : “ petroleumsvirksomhetens plass i det norske samfunn ” [ the role of petroleum activity in norwegian society ]. bis central bankers ’ speeches we should be careful about taking a rear -' - ', increasingly relies on price signals generated by trading activity that takes place daily in these markets. the reliance on secondary market trading for price discovery constitutes the fundamental difference between funds from securities markets and loans from banks. let me be a bit more specific. in securities markets, investment decisions are driven by prices that arise from a trading process that reconciles differential information from a diverse group of investors. in bank loans, investment decisions are based on the bank ’ s private information about specific borrowers. while a bank makes its own investment decisions, securities markets rely on the consensus of a multitude of investors. when securities markets work well, they provide efficient ways of aggregating information and allocating risks among a wide range of investors. in order to function well, however, these markets require a trading infrastructure. this infrastructure may consist of an exchange, a network of brokers and dealers, and a clearing system. these markets also rely on a cadre of relatively well - informed investors, who confidently judge asset prices and take positions on the – 2 – strength of their judgments. if the trading infrastructure fails or investors lose confidence, trading will grind to a halt. the global fixed - income markets are unlike equity markets. in equity markets, everyone knows something about the trading infrastructure, which is centralized in exchanges. thus, there is no question as to the focal point of trading information. but the importance of fixed - income markets, which are multiple - dealer, over - the - counter markets, is sometimes hard to appreciate because they are so decentralized. in the united states, the bond market is where companies have been raising most of their funds in recent years. during the last ten years, for example, u. s. nonfinancial corporations borrowed a net amount of $ 785 billion in the form of bonds, three times the net amount they borrowed from banks. over this same period, these companies as a group spent $ 600 billion more to retire stock – through buybacks and mergers – than they raised in new offerings. 
accompanying these increased levels of debt market activity has been a continuous process of financial innovation. this innovation has served to unbundle different kinds of risk and, thereby, to enlarge the menu of risks that investors may choose to bear. for example, interest - rate swaps, futures and options help reconfigure various interest - rate risks. total return swaps and creditspread options are tools for reallocating the payment risks primarily of emerging market' - source_sentence: education shocks arising from the pandemic could signify the largest reversal in human development on record, equivalent to erasing all the progress in human development of the past six years4. the risk of reversing progress in achieving financial inclusion is accentuating why financial inclusion matters now more than ever. optimising islamic finance for inclusive economic recovery policy and regulatory responses across the globe, and likewise in malaysia, have been focused on safeguarding economic resilience, managing risks to financial stability and minimising repercussions to society. this is done in tandem with the different stages of the pandemic – namely, containment, stabilisation and recovery. at the onset of the crisis, governments along with financial regulators have deployed sizeable stimulus packages and various assistance programmes in order to contain the crisis and stabilise the economy. this includes addressing demand and supply disruptions, maintaining cash flows and keeping workers employed. in malaysia, the total stimulus package amounted to usd 73. 55 billion ( rm3056 billion ) with an additional fiscal injection by the government totalling 1 / 4 bis central bankers'speeches rm45 billion. as at september 2020, a total of 2. 63 million workers and 321, 000 employers had benefitted from the wage subsidy programme, involving an expenditure of rm10. 4 billion. to provide further stimulus to the economy, the bank has reduced the overnight policy rate ( opr ) by a cumulative 125 basis points ( from 3. 00 % to 1. 75 % ) this year, alongside reduction in the statutory reserve requirement by 100 basis points ( from 3. 00 % to 2. 00 % ). the reduction in the opr is intended to provide additional policy stimulus to accelerate the pace of economic recovery. the financial industry, including islamic financial institutions, also lent support to their borrowers and customers. in the first half of 2020, a total of rm120 billion7 was disbursed in lending / financing to smes, with more accounts being approved8 in aggregate in 2020 compared to the same period in previous years. islamic financial institutions and related associations have been actively educating and reaching out to affected borrowers about the financial assistance programmes available in response to the pandemic. the takaful and insurance industry also facilitated affected certificate holders by offering temporary deferment of contribution and premium to promote continuity of takaful protection coverage. more than 1. 1 million9 certificate and policyholders have benefited from this relief measure. while the sentences: - education at an early age. a credit union is similar to a commercial bank, it can provide the same products and services like a commercial bank, but what is different is that the credit union is owned by members who are the shareholders and at the same time are customers to the credit union. credit unions can provide different saving accounts to suit the needs of their members. 
they can create saving products such as junior saving accounts for the children. credit unions provide lending products for their members. and credit unions can provide insurance coverage for their members as well. one of the reasons people explain why they don ’ t want to open an account with a commercial bank is that because banks charge fees to keep their accounts. while paying fees is something that we all cannot avoid because provision of banking services costs money, in a credit union the interests, fees and charges that are paid by members is distributed back to the members as net interest income. i say this because i am also member of our staff credit union called bokolo credit union limited day one when i join the central bank of solomon islands and know the benefit credit unions can give members. as the registrar for credit unions, i take note of the progress in the growth of the assets of both the soltuna employees credit union limited and the tuna trust credit union limited. if the board and management of these credit unions continue to manage these credit unions professionally, they will both be the vehicle to encourage our children and youth here in noro and munda community to learn how to save and earn and become good financial and economic citizens of our country. finally, let me take this opportunity to thank the many people who made this global money week celebration possible. to the board of directors of both tuna trust credit union limited and soltuna employees credit union limited. i salute you for accepting our request to work together to host this global money week. i would like to thank selwyn talasasa, one of the long serving credit union promoters who has been a credit union person since day one of his career. he has been very helpful in organizing our global money week. thank you selwyn. i also thank the global money week committee for their assistance in organising this global money week. and thanks to the management of both national fisheries development limited and soltuna limited for your support in hosting this global money week. thank you nfd and soltuna, you made us all proud as good cooperate citizens. to our noro town clerk for your support. - 'in tandem, the policy rate of this central bank would have to be changed frequently and forcefully in the same direction to offset the shock. but what would happen in the other economy, if it – unlike the first – were more prone to supply shocks? experience suggests that supply shocks yield sharp transitory increases in inflation, possibly followed by smaller, more permanent “ secondround ” effects, though the longer - run impact on inflation is obviously significantly determined by the response of monetary policy. given the transitory nature of the initial inflation bursts, the simple hypothetical rule – which incorporates the reaction to expected inflation – would advise the central bank to “ look through ” the immediate disturbance and change policy only to the extent needed to offset the anticipated more permanent effects of the shock on inflation in subsequent quarters. its policy rate, again, would be observed to be less variable. what is important to note is that the same rule – equally active strategies – would support two different patterns of observed policy behaviour in different economic environments. the third case is perhaps the most interesting of all. here exogenous shocks are identical, but economic structures differ. 
different transmission mechanisms therefore propagate the same shocks with lags that vary between the two economies. the first economy has more rigid adjustment mechanisms : price - setters and wage - negotiators are more sluggish than those in the other economy in processing economic news – including changes in the stance of monetary policy – and bringing them to bear on their decisions. what is the source of those rigidities in the first economy? there can be many reasons for rigidity. perhaps labour practices and contractual institutions – dating from the early post - war decades when the economy was heavily regulated – induce distortions in large a similar interpretation of “ interest rate inertia ” – the tendency of central banks to adjust rates in the same direction and in small steps – can be found in g. rudebusch, “ term structure evidence on interest rate smoothing and monetary policy inertia ”, journal of monetary economics, vol. 49 ( pp. 1161 - 1187 ), 2002. segments of the labour market. this stands in the way of an efficient matching of skills and productive capabilities. perhaps tight regulatory restraint on business and statutory inhibitions discourage innovation and impede a faster response to new shocks and new opportunities. whatever the source of rigidity, the observational result is that prices and wages in the first economy reflect changes in fundamentals with considerable lags. how should' - 'fund ( uncdf ) are also actively developing financial inclusion knowledge, and supporting countries in implementing financial inclusion strategies globally. over the years, we have since witnessed the rapid deployment of innovative financial inclusion solutions that have changed the lives of many. since the introduction of the mpesa mobile banking service in kenya, the number of adults with access to financial services has increased from 42 % in 2011 to 75 % today. agent banking in malaysia has provided consumers with an innovative and alternative channel to access financial services, resulting in the percentage of sub - districts served by financial services access point to increase from 46 % in 2011 to 96 % in 2014. this has enabled 99 % of malaysians, particularly those in rural areas, to conveniently access and benefit from financial services. the microfinance “ global financial development report 2014 ”, world bank, 2014. discussion note “ redistribution, inequality, and growth ”, imf, 2014. bis central bankers ’ speeches institution grameen bank in bangladesh has extended credit through its 2, 500 branches to almost 8 million people, with 60 % of them lifted from poverty. the loan recovery rate is higher than 98 percent. the experience in bangladesh, in particular, dispels the myth that the poor is not bankable. of proportionate regulation and the global standards the proportionate application of global standards for financial regulation is a critical factor in enabling innovative financial inclusion solutions, ensuring its delivery in a safe and sound manner. my remarks today will touch on the following topics : recent developments in global standards and financial inclusion ; focus areas to advance proportionality in practice ; and the importance of the global symposium to galvanise action. recent developments in global standards and financial inclusion global standards developed after the global financial crisis were designed to address the financial stability and integrity issues that came to the fore in developed countries. 
many of these issues involved the activities of global systemically important financial institutions. the effects of these standards on financial inclusion, such as the impact on smaller financial institutions in developing countries most likely were not taken into consideration. fortunately, standard - setting bodies have now recognised the principle of proportionality, emphasising on the balance between objectives of financial stability, integrity and inclusion. however, the real challenge lies in implementing proportionality in practice. if principles are not implemented in a proportionate manner, there could be unintended consequences for financial inclusion. for example, despite the financial action task force ’ s guidance on the implementation of a risk - based approach to anti money laundering /' model-index: - name: SentenceTransformer based on samchain/EconoBert results: - task: type: semantic-similarity name: Semantic Similarity dataset: name: econosentencev2 type: econosentencev2 metrics: - type: pearson_cosine value: 0.8556254615515724 name: Pearson Cosine - type: spearman_cosine value: 0.8619885397301873 name: Spearman Cosine - task: type: semantic-similarity name: Semantic Similarity dataset: name: econosentence prior df type: econosentence-prior_df metrics: - type: pearson_cosine value: 0.8373175950726144 name: Pearson Cosine - type: spearman_cosine value: 0.8294930387036759 name: Spearman Cosine --- # SentenceTransformer based on samchain/EconoBert This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [samchain/EconoBert](https://huggingface.co/samchain/EconoBert). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. This model is the second version, succeeding [samchain/econo-sentence-v1](https://huggingface.co/samchain/econo-sentence-v1). Both use the same base model but were trained on different datasets. The model has been evaluated on both the newest dataset and its previous version: - `econo-pairs-v2` is the newest dataset. Check its card [here](https://huggingface.co/samchain/econo-pairs-v2) - `econo-pairs-v1` is the oldest version. 
Check its card [here](https://huggingface.co/samchain/econo-pairs) ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [samchain/EconoBert](https://huggingface.co/samchain/EconoBert) <!-- at revision 1554ddcdc25e1886cc43e05b9c77a2b8c4888da6 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("samchain/econosentence-v2") # Run inference sentences = [ "education shocks arising from the pandemic could signify the largest reversal in human development on record, equivalent to erasing all the progress in human development of the past six years4. the risk of reversing progress in achieving financial inclusion is accentuating why financial inclusion matters now more than ever. optimising islamic finance for inclusive economic recovery policy and regulatory responses across the globe, and likewise in malaysia, have been focused on safeguarding economic resilience, managing risks to financial stability and minimising repercussions to society. this is done in tandem with the different stages of the pandemic – namely, containment, stabilisation and recovery. at the onset of the crisis, governments along with financial regulators have deployed sizeable stimulus packages and various assistance programmes in order to contain the crisis and stabilise the economy. this includes addressing demand and supply disruptions, maintaining cash flows and keeping workers employed. in malaysia, the total stimulus package amounted to usd 73. 55 billion ( rm3056 billion ) with an additional fiscal injection by the government totalling 1 / 4 bis central bankers'speeches rm45 billion. as at september 2020, a total of 2. 63 million workers and 321, 000 employers had benefitted from the wage subsidy programme, involving an expenditure of rm10. 4 billion. to provide further stimulus to the economy, the bank has reduced the overnight policy rate ( opr ) by a cumulative 125 basis points ( from 3. 00 % to 1. 75 % ) this year, alongside reduction in the statutory reserve requirement by 100 basis points ( from 3. 00 % to 2. 00 % ). the reduction in the opr is intended to provide additional policy stimulus to accelerate the pace of economic recovery. 
the financial industry, including islamic financial institutions, also lent support to their borrowers and customers. in the first half of 2020, a total of rm120 billion7 was disbursed in lending / financing to smes, with more accounts being approved8 in aggregate in 2020 compared to the same period in previous years. islamic financial institutions and related associations have been actively educating and reaching out to affected borrowers about the financial assistance programmes available in response to the pandemic. the takaful and insurance industry also facilitated affected certificate holders by offering temporary deferment of contribution and premium to promote continuity of takaful protection coverage. more than 1. 1 million9 certificate and policyholders have benefited from this relief measure. while the", 'education at an early age. a credit union is similar to a commercial bank, it can provide the same products and services like a commercial bank, but what is different is that the credit union is owned by members who are the shareholders and at the same time are customers to the credit union. credit unions can provide different saving accounts to suit the needs of their members. they can create saving products such as junior saving accounts for the children. credit unions provide lending products for their members. and credit unions can provide insurance coverage for their members as well. one of the reasons people explain why they don ’ t want to open an account with a commercial bank is that because banks charge fees to keep their accounts. while paying fees is something that we all cannot avoid because provision of banking services costs money, in a credit union the interests, fees and charges that are paid by members is distributed back to the members as net interest income. i say this because i am also member of our staff credit union called bokolo credit union limited day one when i join the central bank of solomon islands and know the benefit credit unions can give members. as the registrar for credit unions, i take note of the progress in the growth of the assets of both the soltuna employees credit union limited and the tuna trust credit union limited. if the board and management of these credit unions continue to manage these credit unions professionally, they will both be the vehicle to encourage our children and youth here in noro and munda community to learn how to save and earn and become good financial and economic citizens of our country. finally, let me take this opportunity to thank the many people who made this global money week celebration possible. to the board of directors of both tuna trust credit union limited and soltuna employees credit union limited. i salute you for accepting our request to work together to host this global money week. i would like to thank selwyn talasasa, one of the long serving credit union promoters who has been a credit union person since day one of his career. he has been very helpful in organizing our global money week. thank you selwyn. i also thank the global money week committee for their assistance in organising this global money week. and thanks to the management of both national fisheries development limited and soltuna limited for your support in hosting this global money week. thank you nfd and soltuna, you made us all proud as good cooperate citizens. 
to our noro town clerk for your support.', 'fund ( uncdf ) are also actively developing financial inclusion knowledge, and supporting countries in implementing financial inclusion strategies globally. over the years, we have since witnessed the rapid deployment of innovative financial inclusion solutions that have changed the lives of many. since the introduction of the mpesa mobile banking service in kenya, the number of adults with access to financial services has increased from 42 % in 2011 to 75 % today. agent banking in malaysia has provided consumers with an innovative and alternative channel to access financial services, resulting in the percentage of sub - districts served by financial services access point to increase from 46 % in 2011 to 96 % in 2014. this has enabled 99 % of malaysians, particularly those in rural areas, to conveniently access and benefit from financial services. the microfinance “ global financial development report 2014 ”, world bank, 2014. discussion note “ redistribution, inequality, and growth ”, imf, 2014. bis central bankers ’ speeches institution grameen bank in bangladesh has extended credit through its 2, 500 branches to almost 8 million people, with 60 % of them lifted from poverty. the loan recovery rate is higher than 98 percent. the experience in bangladesh, in particular, dispels the myth that the poor is not bankable. of proportionate regulation and the global standards the proportionate application of global standards for financial regulation is a critical factor in enabling innovative financial inclusion solutions, ensuring its delivery in a safe and sound manner. my remarks today will touch on the following topics : recent developments in global standards and financial inclusion ; focus areas to advance proportionality in practice ; and the importance of the global symposium to galvanise action. recent developments in global standards and financial inclusion global standards developed after the global financial crisis were designed to address the financial stability and integrity issues that came to the fore in developed countries. many of these issues involved the activities of global systemically important financial institutions. the effects of these standards on financial inclusion, such as the impact on smaller financial institutions in developing countries most likely were not taken into consideration. fortunately, standard - setting bodies have now recognised the principle of proportionality, emphasising on the balance between objectives of financial stability, integrity and inclusion. however, the real challenge lies in implementing proportionality in practice. if principles are not implemented in a proportionate manner, there could be unintended consequences for financial inclusion. for example, despite the financial action task force ’ s guidance on the implementation of a risk - based approach to anti money laundering /', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 768] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Semantic Similarity

* Datasets: `econo-pairs-v2` and `econo-pairs-v1`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

Econo-sentence-v2 evaluation results:

| Metric              | econo-pairs-v2 | econo-pairs-v1 |
|:--------------------|:---------------|:---------------|
| pearson_cosine      | 0.8556         | 0.8373         |
| **spearman_cosine** | **0.862**      | **0.8295**     |

Econo-sentence-v1 evaluation results:

| Metric              | econo-pairs-v2 | econo-pairs-v1 |
|:--------------------|:---------------|:---------------|
| pearson_cosine      | 0.7642         | 0.8999         |
| **spearman_cosine** | **0.758**      | **0.8358**     |

v2 and v1 differ in how they were trained: v2 was trained on data that includes intermediate 0.5 similarity labels, whereas v1 was trained only on binary 0 and 1 labels. The results show that v2 still performs on par with its predecessor on the earlier dataset while improving performance by roughly 10 points on the newer one. A short sketch showing how to reproduce this evaluation is included at the end of this card.

<!-- ## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!-- ### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Train set of econo-pairs-v2

* Size: 45,044 training samples
* Columns: <code>text1</code>, <code>text2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | text1 | text2 | label |
  |:--------|:------|:------|:------|
  | type    | string | string | float |
  | details | <ul><li>min: 3 tokens</li><li>mean: 454.6 tokens</li><li>max: 505 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 458.88 tokens</li><li>max: 505 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.49</li><li>max: 1.0</li></ul> |
* Samples:
  | text1 | text2 | label |
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------| | <code>in credit intermediation is not disrupted. i hope the analysis and observations i have shared today can assist with a process of learning, evolving, and improving that will prevent future disruptions. thank you. bis central bankers ’ speeches bis central bankers ’ speeches bis central bankers ’ speeches</code> | <code>richard byles : update and outlook for the jamaican economy monetary policy press statement by mr richard byles, governor of the bank of jamaica, at the quarterly monetary policy report press conference, kingston, 21 february 2024. * * * introduction good morning and welcome to our first quarterly monetary policy report press conference for 2024. the year has started with inflation being above the bank's inflation target. statin reported last week that headline inflation at january 2024 was 7. 4 per cent, which is higher than the outturns for the previous three months, and above the bank's target of 4 to 6 per cent. core inflation, however, which excludes food and fuel prices from the consumer price index, was 5. 9 per cent, which is lower than the 7. 1 per cent recorded in january 2023. the inflation outturn at january is higher than the bank had projected in november 2023. 
as we communicated then, in the wake of the announcement by the minister of finance and the public service of th...</code> | <code>0.0</code> | | <code>european central bank : press conference - introductory statement introductory statement by mr jean - claude trichet, president of the european central bank, frankfurt am main, 4 may 2006. * * * ladies and gentlemen, let me welcome you to our press conference and report on the outcome of today ’ s meeting of the ecb ’ s governing council. the meeting was also attended by commissioner almunia. on the basis of our regular economic and monetary analyses, we have decided to leave the key ecb interest rates unchanged. overall, the information which has become available since our last meeting broadly confirms our earlier assessment of the outlook for price developments and economic activity in the euro area, and that monetary and credit growth remains very dynamic. against this background, the governing council will exercise strong vigilance in order to ensure that risks to price stability over the medium term do not materialise. such vigilance is particularly warranted in a context of ample...</code> | <code>vitor constancio : strengthening european economic governance – surveillance of fiscal and macroeconomic imbalances speech by mr vitor constancio, vice - president of the european central bank, at the brussels economic forum, brussels, 18 may 2011. * * * thank you very much for the invitation to participate in this panel today. the session is intended to focus on “ surveillance of fiscal and macroeconomic imbalances ”, which arguably is the most important strand of the economic governance package. but one should consider that this package – beyond the 6 legislative acts – also contains three other strands : the euro plus pact, the creation of the european stability mechanism and the setting - up of the european system of financial supervision. a fundamental strengthening of economic governance in the euro area requires simultaneous progress in all three areas. the first strand is necessary to prevent and correct imbalances ; the second to ensure the conditions for future growth and com...</code> | <code>0.5</code> | | <code>. innovation is already altering the power source of motor vehicles, and much research is directed at reducing gasoline requirements. at present, gasoline consumption in the united states alone accounts for 11 percent of world oil production. moreover, new technologies to preserve existing conventional oil reserves and to stabilize oil prices will emerge in the years ahead. we will begin the transition to the next major sources of energy perhaps before midcentury as production from conventional oil reservoirs, according to central tendency scenarios of the energy information administration, is projected to peak. in fact, the development and application of new sources of energy, especially nonconventional oil, is already in train. nonetheless, it will take time. we, and the rest of the world, doubtless will have to live with the uncertainties of the oil markets for some time to come.</code> | <code>oil industry to build inventories, demand from investors who have accumulated large net long positions in distant oil futures and options is expanding once again. such speculative positions are claims against future oil holdings of oil firms. currently, strained capacity has limited the ability of oil producers to quickly satisfy this markedly increased demand for inventory. 
adding to the difficulties is the rising consumption of oil, especially in china and india, both of which are expanding economically in ways that are relatively energy intensive. even the recent notable pickup in opec output, by exhausting most of its remaining excess capacity, has only modestly satisfied overall demand. output from producers outside opec has also increased materially, but investment in new producing wells has lagged, limiting growth of production in the near term. crude oil prices are also being distorted by shortages of capacity to upgrade the higher sulphur content and heavier grades of crude oi...</code> | <code>1.0</code> | * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` ### Evaluation Dataset #### Test set of econo-pairs-v2 * Size: 11,261 evaluation samples * Columns: <code>text1</code>, <code>text2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | text1 | text2 | label | |:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:--------------------------------------------------------------| | type | string | string | float | | details | <ul><li>min: 4 tokens</li><li>mean: 451.65 tokens</li><li>max: 505 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 457.79 tokens</li><li>max: 505 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.5</li><li>max: 1.0</li></ul> | * Samples: | text1 | text2 | label | |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------| | <code>the drying up of the unsecured interbank market a price worth paying for lower interest rate volatility and tighter interest rate control in the money markets? at present, it is difficult to envisage how the unsecured interbank money market could be revived in full in the near future, reaching the pre - global financial crisis levels. this is due both to regulatory reasons and to fragmentation. post - crisis, banks appear reluctant to lend each other in the unsecured interbank market, pricing such lending at levels that are sufficient to compensate for counterparty credit risk. at the same time, short - term ( below 6 months ) interbank funding does not contribute towards satisfying the nsfr requirements. as a result, the unsecured segment of the interbank market is unattractive for prospective borrowers and is not expected to regain importance. furthermore, heterogeneity within and across the banking systems of euro area countries impedes the efficient redistribution of reserves acros...</code> | <code>##obalisation, artificial intelligence, as well as the green transition and climate change adaptation policies. 1 / 7 bis - central bankers'speeches given the uncertain overall impact of these effects and the significant unknowns and ambiguity surrounding the computation of natural interest rates, their prospective level and evolution are not easily determined. turning now to the lessons learnt, experience gained during the past crises has been extremely valuable. first, monetary policy needs to remain flexible so that it can accommodate potential future shocks to price stability, of any nature and direction. second, financial stability considerations need to be taken into account in the decision making of monetary policymakers, in their pursuit of price stability. the past decade provided us with new insights into what monetary policy can do. we are now better prepared to implement the appropriate monetary policy mix, with standard and non - standard tools, if we were to face new chal...</code> | <code>1.0</code> | | <code>therefore worries me that greece might go backwards in terms of its reform agenda. suffice to say, it is not enough to correct misalignments solely at the national level. we also need structural reforms at the european level. and here, a lot has been done since the crisis broke out. first, the rules of the stability and growth pact were tightened and a fiscal compact was adopted in order to restore confidence in public finances. second, a procedure for identifying macroeconomic imbalances at an early stage was established. and third, a crisis mechanism was set up to serve as a “ firewall ”, safeguarding the stability of the financial system in the euro area. in addition to these measures, the euro area took a major step toward deeper financial integration. on 4 november 2014, the first pillar of a european banking union was put in place. on that day, the ecb assumed responsibility for supervising the 123 largest banks in the euro area. together, the banks concerned account for more tha...</code> | <code>andreas dombret : the euro area – where do we stand? speech by dr andreas dombret, member of the executive board of the deutsche bundesbank, at the german - turkish chamber of trade and commerce, istanbul, 10 february 2015. * 1. 
* * introduction ladies and gentlemen thank you for inviting me to speak here at the german - turkish chamber of industry and commerce in istanbul. isaac newton once observed that “ we build too many walls and not enough bridges ”. istanbul has for centuries been a bridge between two continents – asia and europe. it seems to me that istanbul is therefore an excellent place to hold this year ’ s g20 meetings. after all, the purpose of these meetings is to build bridges between nations by fostering dialogue and the mutual exchange of information. and this objective is of course also pursued by the german - turkish chamber of trade and commerce. moreover, mutual exchange is essential in a globalised world where countries are no longer isolated islands but part of ...</code> | <code>1.0</code> | | <code>for vital issues regarding the agenda. i desire you all success with the materialization and accomplishment of your present and future projects and i conclude by wishing all the foreign guests at this conference a good stay in romania. i am now giving the floor to mr kurt puchinger, coordinator of the eu strategy for the danube region and chair of the first session, to start the conference. thank you. bis central bankers ’ speeches</code> | <code>national bank of romania, as they capture appropriately the opinions of the real sector on some key issues, such as : ( i ) the most pressing problems that firms are facing in their activity ; ( ii ) the investment needs and priorities ; ( iii ) challenges raised by climate change and energy efficiency. the survey shows that firms in romania have become less optimistic regarding investment conditions for the year ahead. while this is a worrying trend, we are rather positive that romania will be able to overcome the main problem identified, the energy costs, in the near future. i would add here that the government has implemented several measures in this respect, such as electricity and natural gas price capping schemes, the compensation of the price of motor fuel / liter and caps on the firewood price. 
also, we are aware that the war in ukraine and the related sanctions will continue to generate considerable uncertainties and risks to the outlook for economic activity, through possibly...</code> | <code>0.5</code> | * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 2 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 16 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 2 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - 
`auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `dispatch_batches`: None - `split_batches`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | Validation Loss | econosentencev2_spearman_cosine | |:------:|:----:|:-------------:|:---------------:|:-------------------------------:| | 0.0888 | 250 | 4.2245 | - | - | | 0.1776 | 500 | 3.6201 | 3.5284 | 0.7812 | | 0.2663 | 750 | 3.5093 | - | - | | 0.3551 | 1000 | 3.4858 | 3.4938 | 0.8177 | | 0.4439 | 1250 | 3.5671 | - | - | | 0.5327 | 1500 | 3.3165 | 3.2432 | 0.8302 | | 0.6214 | 1750 | 3.2781 | - | - | | 0.7102 | 2000 | 3.2901 | 3.1754 | 0.8355 | | 0.7990 | 2250 | 3.2053 | - | - | | 0.8878 | 2500 | 3.1673 | 3.1109 | 0.8422 | | 0.9766 | 2750 | 3.0844 | - | - | | 1.0653 | 3000 | 2.7478 | 3.1557 | 0.8483 | | 1.1541 | 3250 | 2.7042 | - | - | | 1.2429 | 3500 | 2.6291 | 3.1707 | 0.8541 | | 1.3317 | 3750 | 2.6193 | - | - | | 1.4205 | 4000 | 2.6128 | 3.1136 | 0.8546 | | 1.5092 | 4250 | 2.5762 | - | - | | 1.5980 | 4500 | 2.5992 | 3.0960 | 0.8579 | | 1.6868 | 4750 | 2.4983 | - | - | | 1.7756 | 5000 | 2.3061 | 3.1615 | 0.8608 | | 1.8643 | 5250 | 2.3459 | - | - | | 1.9531 | 5500 | 2.418 | 3.1436 | 0.8617 | ### Framework Versions - Python: 3.10.12 - Sentence Transformers: 3.4.1 - Transformers: 4.49.0 - PyTorch: 2.1.0+cu118 - Accelerate: 1.4.0 - Datasets: 3.3.1 - Tokenizers: 0.21.0 ## Citation Samuel Chaineau ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### CoSENTLoss ```bibtex @online{kexuefm-8847, title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT}, author={Su Jianlin}, year={2022}, month={Jan}, url={https://kexue.fm/archives/8847}, } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
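
## Reproducing the Evaluation

The snippet below is a minimal sketch for re-running the semantic-similarity evaluation reported above with the library's `EmbeddingSimilarityEvaluator`. It assumes that `samchain/econo-pairs-v2` is loadable with `datasets` and exposes a `test` split with the `text1`, `text2`, and `label` columns described in this card; the split name is an assumption, so adjust it if the dataset uses a different one.

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SimilarityFunction
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

# Load the evaluation pairs (the "test" split name is an assumption, see note above).
pairs = load_dataset("samchain/econo-pairs-v2", split="test")

model = SentenceTransformer("samchain/econosentence-v2")

# Correlates cosine similarities of the embeddings with the gold labels,
# reporting Pearson and Spearman coefficients.
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=pairs["text1"],
    sentences2=pairs["text2"],
    scores=pairs["label"],
    main_similarity=SimilarityFunction.COSINE,
    name="econo-pairs-v2",
)
print(evaluator(model))  # dict with pearson_cosine and spearman_cosine metrics
```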
[ "TEXT_CLASSIFICATION", "SEMANTIC_SIMILARITY" ]
[ "BEAR" ]
Non_BioNLP
AventIQ-AI/bert-medical-entity-extraction
AventIQ-AI
null
[ "safetensors", "bert", "region:us" ]
1,740
1,740
72
3
---
{}
---

# Medical Entity Extraction with BERT

## 📌 Overview

This repository hosts the quantized version of the `bert-base-cased` model for Medical Entity Extraction, fine-tuned on the `tner/bc5cdr` dataset. The model is specifically designed to recognize entities related to **Disease, Drug, Symptom, and Treatment**. It has been optimized for efficient deployment while maintaining high accuracy, making it suitable for resource-constrained environments.

## 🏗 Model Details

- **Model Architecture**: BERT Base Cased
- **Task**: Medical Entity Extraction
- **Dataset**: Hugging Face's `tner/bc5cdr`
- **Quantization**: Float16
- **Fine-tuning Framework**: Hugging Face Transformers

---

## 🚀 Usage

### Installation

```bash
pip install transformers torch
```

### Loading the Model

```python
from transformers import BertTokenizerFast, BertForTokenClassification
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

model_name = "AventIQ-AI/bert-medical-entity-extraction"
model = BertForTokenClassification.from_pretrained(model_name).to(device)
tokenizer = BertTokenizerFast.from_pretrained(model_name)
```

### Named Entity Recognition Inference

```python
from transformers import pipeline

# Reuse the model and tokenizer loaded above instead of downloading them again.
ner_pipeline = pipeline("ner", model=model, tokenizer=tokenizer, device=device)

test_sentence = "An overdose of Ibuprofen can lead to severe gastric issues."
ner_results = ner_pipeline(test_sentence)

label_map = {
    "LABEL_0": "O",  # Outside (not an entity)
    "LABEL_1": "Drug",
    "LABEL_2": "Disease",
    "LABEL_3": "Symptom",
    "LABEL_4": "Treatment"
}

def merge_tokens(ner_results):
    """Merge WordPiece subwords back into whole words and average their scores."""
    merged_entities = []
    current_word = ""
    current_label = ""
    current_score = 0
    count = 0

    for entity in ner_results:
        word = entity["word"]
        label = entity["entity"]  # Model's output (e.g., LABEL_1, LABEL_2)
        score = entity["score"]

        # Merge subwords
        if word.startswith("##"):
            current_word += word[2:]  # Remove '##' and append
            current_score += score
            count += 1
        else:
            if current_word:  # Store the previous merged word
                mapped_label = label_map.get(current_label, "Unknown")
                merged_entities.append((current_word, mapped_label, current_score / count))
            current_word = word
            current_label = label
            current_score = score
            count = 1

    # Add the last word
    if current_word:
        mapped_label = label_map.get(current_label, "Unknown")
        merged_entities.append((current_word, mapped_label, current_score / count))

    return merged_entities

print("\n🩺 Medical NER Predictions:")
for word, label, score in merge_tokens(ner_results):
    if label != "O":  # Skip non-entities
        print(f"🔹 Entity: {word} | Category: {label} | Score: {score:.4f}")
```

### **🔹 Labeling Scheme (BIO Format)**

- **B-XYZ (Beginning)**: Indicates the beginning of an entity of type XYZ (e.g., B-PER for the beginning of a person’s name).
- **I-XYZ (Inside)**: Represents subsequent tokens inside an entity (e.g., I-PER for the second part of a person’s name).
- **O (Outside)**: Denotes tokens that are not part of any named entity.
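
As an illustration of this scheme, here is a hypothetical BIO tagging of a sentence with a multi-token disease mention. This is a constructed example, not actual model output; the hosted checkpoint emits `LABEL_0`–`LABEL_4` ids that are mapped to categories via `label_map` above.

```python
# Hypothetical BIO tagging of a medical sentence (illustrative only).
tokens = ["Patient", "shows", "signs", "of", "acute",     "kidney",    "failure",   "."]
tags   = ["O",       "O",     "O",     "O",  "B-Disease", "I-Disease", "I-Disease", "O"]
```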
---

## 📊 Evaluation Results for Quantized Model

### **🔹 Overall Performance**

- **Accuracy**: **93.27%** ✅
- **Precision**: **92.31%**
- **Recall**: **93.27%**
- **F1 Score**: **92.31%**

---

### **🔹 Performance by Entity Type**

| Entity Type | Precision | Recall | F1 Score | Number of Entities |
|-------------|-----------|--------|----------|--------------------|
| **Disease** | **91.46%** | **92.07%** | **91.76%** | 3,000 |
| **Drug** | **71.25%** | **72.83%** | **72.03%** | 1,266 |
| **Symptom** | **89.83%** | **93.02%** | **91.40%** | 3,524 |
| **Treatment** | **88.83%** | **92.02%** | **90.40%** | 3,124 |

---

#### ⏳ **Inference Speed Metrics**

- **Total Evaluation Time**: 15.89 sec
- **Samples Processed per Second**: 217.26
- **Steps per Second**: 27.18
- **Epochs Completed**: 3

---

## Fine-Tuning Details

### Dataset

Hugging Face's `tner/bc5cdr` dataset was used, containing texts and their NER tags.

## 📊 Training Details

- **Number of epochs**: 3
- **Batch size**: 8
- **Evaluation strategy**: epoch
- **Learning Rate**: 2e-5

### ⚡ Quantization

Post-training quantization to float16 was applied with PyTorch to reduce the model size and improve inference efficiency. A minimal sketch of this conversion is shown at the end of this card.

---

## 📂 Repository Structure

```
.
├── model/               # Contains the quantized model files
├── tokenizer_config/    # Tokenizer configuration and vocabulary files
├── model.safetensors    # Quantized model weights
├── README.md            # Model documentation
```

---

## ⚠️ Limitations

- The model may not generalize well to domains outside the fine-tuning dataset.
- Quantization may result in minor accuracy degradation compared to full-precision models.

---

## 🤝 Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request if you have suggestions or improvements.
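
## 🧮 Float16 Conversion Sketch

The snippet below is a minimal sketch of the float16 post-training quantization described above, using generic PyTorch and Transformers calls. It is illustrative rather than the exact script used to produce this checkpoint, and the input path is a placeholder.

```python
import torch
from transformers import BertForTokenClassification

# Load the full-precision fine-tuned model (placeholder path, not a real checkpoint id).
model = BertForTokenClassification.from_pretrained("path/to/full-precision-checkpoint")

# Cast all weights to float16, roughly halving the checkpoint size on disk.
model = model.to(dtype=torch.float16)

# Save the quantized checkpoint for upload (safetensors format by default).
model.save_pretrained("bert-medical-entity-extraction-fp16")
```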
[ "NAMED_ENTITY_RECOGNITION" ]
[ "BC5CDR" ]
BioNLP